Computer-based simulation training in emergency medicine designed in the light of malpractice cases.
Karakuş, Akan; Duran, Latif; Yavuz, Yücel; Altintop, Levent; Calişkan, Fatih
2014-07-27
Using computer-based simulation systems in medical education is becoming more and more common. Although the benefits of practicing with these systems have been demonstrated, the advantages of using computer-based simulation in emergency medicine education are less well validated. The aim of the present study was to assess the success rates of final-year medical students in delivering emergency medical treatment and to evaluate the effectiveness of computer-based simulation training in improving final-year medical students' knowledge. Twenty-four students trained with computer-based simulation, completing at least 4 hours of simulation-based education between Feb 1, 2010 and May 1, 2010. A control group of traditionally trained students (n = 24) was also chosen. At the end of training, students completed an examination on 5 randomized medical simulation cases. Across the 5 cases, students trained with computer-based simulation carried out an average of 3.9 correct medical approaches, versus an average of 2.8 for the traditionally trained group (t = 3.90, p < 0.005). We found that in cases requiring a complicated medical approach, the success of students trained with simulation was statistically higher than that of students without simulation training (p ≤ 0.05). Computer-based simulation training appears significantly effective for learning medical treatment algorithms. These programs may improve the success rate of students, especially in taking an adequate medical approach to complex emergency cases.
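The group comparison reported above (means of 3.9 vs. 2.8 correct approaches, t = 3.90) has the form of an independent two-sample t-test. A minimal sketch with hypothetical per-student scores (the study's raw data are not given here) might look like:

```python
import math
from statistics import mean, variance

def two_sample_t(a, b):
    """Independent two-sample t-test with pooled variance
    (equal-variance assumption)."""
    na, nb = len(a), len(b)
    # Pooled sample variance across both groups
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # t statistic and degrees of freedom

# Hypothetical scores (correct approaches out of 5), for illustration only:
simulation_group = [4, 5, 4, 3, 4, 5, 3, 4]
traditional_group = [3, 2, 3, 3, 2, 4, 2, 3]
t_stat, df = two_sample_t(simulation_group, traditional_group)
```

With real data one would compare the resulting t statistic against the t distribution with the given degrees of freedom to obtain the p value.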
Ten Eyck, Raymond P; Tews, Matthew; Ballester, John M; Hamilton, Glenn C
2010-06-01
To determine the impact of simulation-based instruction on student performance in the role of emergency department resuscitation team leader. A randomized, single-blinded, controlled study using an intention to treat analysis. Eighty-three fourth-year medical students enrolled in an emergency medicine clerkship were randomly allocated to two groups differing only by instructional format. Each student individually completed an initial simulation case, followed by a standardized curriculum of eight cases in either group simulation or case-based group discussion format before a second individual simulation case. A remote coinvestigator measured eight objective performance end points using digital recordings of all individual simulation cases. McNemar chi2, Pearson correlation, repeated measures multivariate analysis of variance, and follow-up analysis of variance were used for statistical evaluation. Sixty-eight students (82%) completed both initial and follow-up individual simulations. Eight students were lost from the simulation group and seven from the discussion group. The mean postintervention case performance was significantly better for the students allocated to simulation instruction compared with the group discussion students for four outcomes including a decrease in mean time to (1) order an intravenous line; (2) initiate cardiac monitoring; (3) order initial laboratory tests; and (4) initiate blood pressure monitoring. Paired comparisons of each student's initial and follow-up simulations demonstrated significant improvement in the same four areas, in mean time to order an abdominal radiograph and in obtaining an allergy history. A single simulation-based teaching session significantly improved student performance as a team leader. Additional simulation sessions provided further improvement compared with instruction provided in case-based group discussion format.
Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.
2011-01-01
Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. 
Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms. Copyright 2011 by the American Geophysical Union.
Use case driven approach to develop simulation model for PCS of APR1400 simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang
2006-07-01
The full-scope simulator is being developed to evaluate specific design features and to support iterative design and validation in the Man-Machine Interface System (MMIS) design of the Advanced Power Reactor (APR) 1400. The simulator consists of a process model, a control logic model, and the MMI for the APR1400, as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for the PCS. In this approach, a system is considered from the point of view of its users: the user's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. The system is then modeled using these use cases as functions, and lower levels expand the functionalities of each of these use cases. Hence, starting from the topmost-level view of the system, we proceed down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper introduces the functionality of the PCS simulation model, including a requirement analysis based on use cases and the validation results of the PCS model development. The PCS simulation model using use cases will be first applied during full-scope simulator development for a nuclear power plant and will be supplied to the Shin-Kori 3 and 4 plants. Use case based simulation model development can be useful for the design and implementation of simulation models. (authors)
Some Results of Weak Anticipative Concept Applied in Simulation Based Decision Support in Enterprise
NASA Astrophysics Data System (ADS)
Kljajić, Miroljub; Kofjač, Davorin; Kljajić Borštnar, Mirjana; Škraba, Andrej
2010-11-01
Simulation models are used for decision support and learning in enterprises and in schools. Three cases of successful applications demonstrate the usefulness of weak anticipative information. Job shop scheduling with a makespan criterion presents a real case of customized flexible furniture production optimization; the genetic algorithm for job shop scheduling optimization is presented. Simulation-based inventory control describes inventory optimization for products with stochastic lead time and demand, where dynamic programming and fuzzy control algorithms reduce the total cost without producing stock-outs in most cases. The value of decision-making information based on simulation is also discussed. All three cases are discussed from the optimization, modeling, and learning points of view.
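The inventory-control case above can be illustrated with a small Monte Carlo sketch of an (s, Q) reorder policy under stochastic demand and lead time; all cost parameters and distributions below are hypothetical placeholders, not those of the study:

```python
import random

def simulate_inventory(reorder_point, order_qty, days=365, seed=1):
    """Simulate an (s, Q) policy: order Q units when stock falls to s or
    below. Demand and lead time are stochastic. Returns (cost, stockout_days).
    All parameters are illustrative, not fitted to any real system."""
    rng = random.Random(seed)
    stock, pipeline = 50, []  # on-hand units; outstanding orders as (arrival_day, qty)
    holding, penalty, order_cost = 0.1, 5.0, 20.0  # hypothetical unit costs
    cost, stockouts = 0.0, 0
    for day in range(days):
        # Receive any orders arriving today
        stock += sum(q for (d, q) in pipeline if d <= day)
        pipeline = [(d, q) for (d, q) in pipeline if d > day]
        demand = rng.randint(0, 10)  # stochastic daily demand
        if demand > stock:
            stockouts += 1
            cost += penalty * (demand - stock)  # penalty on unmet demand
            stock = 0
        else:
            stock -= demand
        if stock <= reorder_point and not pipeline:
            lead = rng.randint(2, 7)  # stochastic lead time in days
            pipeline.append((day + lead, order_qty))
            cost += order_cost
        cost += holding * stock  # daily holding cost
    return cost, stockouts

cost, stockouts = simulate_inventory(reorder_point=40, order_qty=60)
```

Running such a simulation over a grid of (reorder_point, order_qty) pairs is the simplest form of the simulation-based optimization the abstract describes.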
Rajan, Shobana; Khanna, Ashish; Argalious, Maged; Kimatian, Stephen J; Mascha, Edward J; Makarova, Natalya; Nada, Eman M; Elsharkawy, Hesham; Firoozbakhsh, Farhad; Avitsian, Rafi
2016-02-01
Simulation-based learning is emerging as an alternative educational tool in this era of a relative shortfall of teaching anesthesiologists. The objective of the study is to assess whether screen-based (interactive computer simulated) case scenarios are more effective than problem-based learning discussions (PBLDs) in improving test scores 4 and 8 weeks after these interventions in anesthesia residents during their first neuroanesthesia rotation. Prospective, nonblinded quasi-crossover study. Cleveland Clinic. Anesthesiology residents. Two case scenarios were delivered from the Anesoft software as screen-based sessions, and parallel scripts were developed for 2 PBLDs. Each resident underwent both types of training sessions, starting with the PBLD session, and the 2 cases were alternated each month (ie, in 1 month, the screen-based intervention used case 1 and the PBLD used case 2, and vice versa for the next month). Test scores before the rotation (baseline), immediately after the rotation (4 weeks after the start of the rotation), and 8 weeks after the start of rotation were collected on each topic from each resident. The effect of training method on improvement in test scores was assessed using a linear mixed-effects model. Compared to the departmental standard of PBLD, the simulation method did not improve either the 4- or 8-week mean test scores (P = .41 and P = .40 for training method effect on 4- and 8-week scores, respectively). Resident satisfaction with the simulation module on a 5-point Likert scale showed subjective evidence of a positive impact on resident education. Screen-based simulators were not more effective than PBLD for education during the neuroanesthesia rotation in anesthesia residency. Copyright © 2016 Elsevier Inc. All rights reserved.
Warrington, Steven J; Beeson, Michael S; Fire, Frank L
2013-05-01
Emergency medicine residents use simulation training for many reasons, such as gaining experience with critically ill patients and becoming familiar with disease processes. Residents frequently criticize simulation training with current high-fidelity mannequins due to the poor quality of the physical exam findings present, such as auscultatory findings, which may lead them down an alternate diagnostic or therapeutic pathway. Recently, wireless remote-programmed stethoscopes (simulation stethoscopes) have been developed that allow wireless transmission of any sound to a stethoscope receiver, improving the fidelity of the physical examination and the simulation case. Following institutional review committee approval, 14 PGY1-3 emergency medicine residents were assessed during 2 simulation-based cases using pre-defined scoring anchors on multiple actions, such as communication skills and treatment decisions (Appendix 1). Each case involved a patient presenting with dyspnea requiring management based on physical examination findings. One case was a patient with exacerbation of heart failure; the other was a patient with a tension pneumothorax. Each resident was randomized into a case associated with the simulation stethoscope. Following the cases, residents were asked to fill out an evaluation questionnaire. Residents perceived the physical exam findings to be most realistic in the case using the simulation stethoscope (13/14, 93%). Residents also preferred the simulation stethoscope as an adjunct to the case (13/14, 93%), and they rated the simulation stethoscope case as having significantly more realistic auscultatory findings (4.4/5 vs. 3.0/5, difference of means 1.4, p=0.0007). Average resident scores were significantly better in the simulation stethoscope-associated case (2.5/3 vs. 2.3/3, difference of means 0.2, p=0.04). There was no considerable difference in the total time taken per case. 
A simulation stethoscope may be a useful adjunct to current emergency medicine simulation-based training. Residents both preferred the use of the simulation stethoscope and perceived physical exam findings to be more realistic, leading to improved fidelity. Potential sources of bias include the small population, narrow scoring range, and the lack of blinding. Further research, focusing on use for resident assessment and clinical significance with a larger population and blinding of graders, is needed.
ERIC Educational Resources Information Center
Lundeberg, Mary A.; Bergland, Mark; Klyczek, Karen; Mogen, Kim; Johnson, Doug; Harmes, Nina
Software designed to promote the use of open-ended investigations in science education was evaluated in a study of whether using case-based simulation enhances students' understanding of ethical issues and data interpretation in science. The software was a DNA gel electrophoresis simulation that enabled students to conduct simulated genetic tests.…
Acoustic-based proton range verification in heterogeneous tissue: simulation studies
NASA Astrophysics Data System (ADS)
Jones, Kevin C.; Nie, Wei; Chu, James C. H.; Turian, Julius V.; Kassaee, Alireza; Sehgal, Chandra M.; Avery, Stephen
2018-01-01
Acoustic-based proton range verification (protoacoustics) is a potential in vivo technique for determining the Bragg peak position. Previous measurements and simulations have been restricted to homogeneous water tanks. Here, a CT-based simulation method is proposed and applied to a liver and prostate case to model the effects of tissue heterogeneity on the protoacoustic amplitude and time-of-flight range verification accuracy. For the liver case, posterior irradiation with a single proton pencil beam was simulated for detectors placed on the skin. In the prostate case, a transrectal probe measured the protoacoustic pressure generated by irradiation with five separate anterior proton beams. After calculating the proton beam dose deposition, each CT voxel’s material properties were mapped based on Hounsfield Unit values, and thermoacoustically-generated acoustic wave propagation was simulated with the k-Wave MATLAB toolbox. By comparing the simulation results for the original liver CT to homogenized variants, the effects of heterogeneity were assessed. For the liver case, 1.4 cGy of dose at the Bragg peak generated 50 mPa of pressure (13 cm distal), a 2× lower amplitude than simulated in a homogeneous water tank. Protoacoustic triangulation of the Bragg peak based on multiple detector measurements resulted in 0.4 mm accuracy for a δ-function proton pulse irradiation of the liver. For the prostate case, higher amplitudes are simulated (92-1004 mPa) for closer detectors (<8 cm). For four of the prostate beams, the protoacoustic range triangulation was accurate to ⩽1.6 mm (δ-function proton pulse). Based on the results, application of protoacoustic range verification to heterogeneous tissue will result in decreased signal amplitudes relative to homogeneous water tank measurements, but accurate range verification is still expected to be possible.
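The time-of-flight triangulation step can be sketched separately from the k-Wave acoustic simulation itself: given detector positions and measured arrival times, a least-squares search recovers the source (Bragg peak) position. This simplified 2-D illustration assumes a known, uniform speed of sound, unlike the heterogeneous propagation modeled in the paper:

```python
import math

C = 1500.0  # assumed uniform speed of sound in tissue (m/s)

def triangulate(detectors, times, grid=200, span=0.1):
    """Grid-search the 2-D point whose predicted times of flight best match
    the measured arrival times (least squares). span is the half-width of the
    search region in meters."""
    best, best_err = None, float("inf")
    for i in range(grid + 1):
        for j in range(grid + 1):
            x = -span + 2 * span * i / grid
            y = -span + 2 * span * j / grid
            err = sum((math.hypot(x - dx, y - dy) / C - t) ** 2
                      for (dx, dy), t in zip(detectors, times))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Synthetic example: a source at (0.02, -0.01) m and three detectors.
source = (0.02, -0.01)
detectors = [(0.1, 0.0), (-0.1, 0.05), (0.0, -0.1)]
times = [math.hypot(source[0] - dx, source[1] - dy) / C for dx, dy in detectors]
est = triangulate(detectors, times)
```

In heterogeneous tissue the constant C would be replaced by path-dependent travel times, which is precisely the complication the CT-based simulation quantifies.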
An Evaluation of Gender Differences in Computer-Based Case Simulations.
ERIC Educational Resources Information Center
Scheuneman, Janice Dowd; And Others
As part of the research leading to the implementation of computer-based case simulations (CCS) for the licensing examinations of the National Board of Medical Examiners, gender differences in performance were studied for one form consisting of 18 cases. A secondary purpose of the study was to note differences in style or approach that might…
Mejía, Vilma; Gonzalez, Carlos; Delfino, Alejandro E; Altermatt, Fernando R; Corvetto, Marcia A
The primary purpose of this study was to compare the effect of high-fidelity simulation versus computer-based case-solving self-study on skills acquisition regarding malignant hyperthermia in first-year anesthesiology residents. After institutional ethics committee approval, 31 first-year anesthesiology residents were enrolled in this prospective, randomized, single-blinded study. Participants were randomized to either a High Fidelity Simulation scenario or a computer-based Case Study about malignant hyperthermia. After the intervention, all subjects' performance was assessed through a high-fidelity simulation scenario using a previously validated assessment rubric. Additionally, knowledge tests and a satisfaction survey were applied. Finally, a semi-structured interview was done to assess self-perception of the reasoning process and decision-making. Twenty-eight first-year residents successfully finished the study. Residents' management skill scores were globally higher in the High Fidelity Simulation group than in the Case Study group, with significant differences in 4 of the 8 performance rubric elements: recognition of signs and symptoms (p = 0.025), prioritization of initial management actions (p = 0.003), recognition of complications (p = 0.025), and communication (p = 0.025). Average scores from pre- and post-test knowledge questionnaires improved from 74% to 85% in the High Fidelity Simulation group and decreased from 78% to 75% in the Case Study group (p = 0.032). Regarding the qualitative analysis, there was no difference between the two teaching strategies in the factors influencing the students' reasoning and decision-making processes. Simulation-based training with a malignant hyperthermia high-fidelity scenario was superior to computer-based case study, improving knowledge and skills in malignant hyperthermia crisis management, with a very good satisfaction level among anesthesia residents. Copyright © 2018 Sociedade Brasileira de Anestesiologia. Publicado por Elsevier Editora Ltda. All rights reserved.
Sperl-Hillen, JoAnn; O'Connor, Patrick J; Ekstrom, Heidi L; Rush, William A; Asche, Stephen E; Fernandes, Omar D; Appana, Deepika; Amundson, Gerald H; Johnson, Paul E; Curran, Debra M
2014-12-01
To test a virtual case-based Simulated Diabetes Education intervention (SimDE) developed to teach primary care residents how to manage diabetes. Nineteen primary care residency programs, with 341 volunteer residents in all postgraduate years (PGY), were randomly assigned to a SimDE intervention group or control group (CG). The Web-based interactive educational intervention used computerized virtual patients who responded to provider actions through programmed simulation models. Eighteen distinct learning cases (L-cases) were assigned to SimDE residents over six months from 2010 to 2011. Impact was assessed using performance on four virtual assessment cases (A-cases), an objective knowledge test, and pre-post changes in self-assessed diabetes knowledge and confidence. Group comparisons were analyzed using generalized linear mixed models, controlling for clustering of residents within residency programs and differences in baseline knowledge. The percentages of residents appropriately achieving A-case composite clinical goals for glucose, blood pressure, and lipids were as follows: A-case 1: SimDE = 21.2%, CG = 1.8%, P = .002; A-case 2: SimDE = 15.7%, CG = 4.7%, P = .02; A-case 3: SimDE = 48.0%, CG = 10.4%, P < .001; and A-case 4: SimDE = 42.1%, CG = 18.7%, P = .004. The mean knowledge score and pre-post changes in self-assessed knowledge and confidence were significantly better for the SimDE group than for CG participants. A virtual case-based simulated diabetes education intervention improved diabetes management skills, knowledge, and confidence for primary care residents.
Face and construct validity of a computer-based virtual reality simulator for ERCP.
Bittner, James G; Mellinger, John D; Imam, Toufic; Schade, Robert R; Macfadyen, Bruce V
2010-02-01
Currently, little evidence supports computer-based simulation for ERCP training. To determine face and construct validity of a computer-based simulator for ERCP and assess its perceived utility as a training tool. Novice and expert endoscopists completed 2 simulated ERCP cases by using the GI Mentor II. Virtual Education and Surgical Simulation Laboratory, Medical College of Georgia. Outcomes included times to complete the procedure, reach the papilla, and use fluoroscopy; attempts to cannulate the papilla, pancreatic duct, and common bile duct; and number of contrast injections and complications. Subjects assessed simulator graphics, procedural accuracy, difficulty, haptics, overall realism, and training potential. Only when performance data from cases A and B were combined did the GI Mentor II differentiate novices and experts, based on times to complete the procedure, reach the papilla, and use fluoroscopy. Across skill levels, overall opinions were similar regarding graphics (moderately realistic), accuracy (similar to clinical ERCP), difficulty (similar to clinical ERCP), overall realism (moderately realistic), and haptics. Most participants (92%) claimed that the simulator has definite training potential or should be required for training. Limitations included the small sample size and single-institution setting. The GI Mentor II demonstrated construct validity for ERCP based on select metrics. Most subjects thought that the simulated graphics, procedural accuracy, and overall realism exhibit face validity. Subjects deemed it a useful training tool. Study repetition involving more participants and cases may help confirm these results and establish the simulator's ability to differentiate skill levels based on ERCP-specific metrics.
An Investigation of Computer-based Simulations for School Crises Management.
ERIC Educational Resources Information Center
Degnan, Edward; Bozeman, William
2001-01-01
Describes development of a computer-based simulation program for training school personnel in crisis management. Addresses the data collection and analysis involved in developing a simulated event, the systems requirements for simulation, and a case study of application and use of the completed simulation. (Contains 21 references.) (Authors/PKP)
NASA Astrophysics Data System (ADS)
Kukkonen, Jari Ensio; Kärkkäinen, Sirpa; Dillon, Patrick; Keinonen, Tuula
2014-02-01
Research has demonstrated that simulation-based inquiry learning has significant advantages for learning outcomes when properly scaffolded. For successful learning in science with simulation-based inquiry, one needs to ascertain levels of background knowledge so as to support learners in making, evaluating and modifying hypotheses, conducting experiments and interpreting data, and to regulate the learning process. This case study examines the influence of scaffolded simulation-based inquiry learning on fifth-graders' (n = 21) models of the greenhouse effect. The pupils were asked to make annotated drawings about the greenhouse effect both before and after scaffolding through simulation-based instructional interventions. The data were analysed qualitatively to investigate the impact of the interventions on the representations that pupils used in their descriptions of the greenhouse effect. It was found that scaffolded simulation-based inquiry learning noticeably enriched the concepts pupils used in their representations leading to better understanding of the phenomenon. In many cases, the fifth graders produced quite sophisticated representations.
Auditorium acoustics evaluation based on simulated impulse response
NASA Astrophysics Data System (ADS)
Wu, Shuoxian; Wang, Hongwei; Zhao, Yuezhe
2004-05-01
The impulse responses and other acoustical parameters of the Huangpu Teenager Palace in Guangzhou were measured. In parallel, acoustical simulation and auralization were carried out with the ODEON software. A comparison between the parameters obtained from computer simulation and from measurement is given. This case study shows that auralization based on computer simulation can be used to predict the acoustical quality of a hall at its design stage.
Obi, Andrea; Chung, Jennifer; Chen, Ryan; Lin, Wandi; Sun, Siyuan; Pozehl, William; Cohn, Amy M; Daskin, Mark S; Seagull, F Jacob; Reddy, Rishindra M
2015-11-01
Certain operative cases occur unpredictably and/or have long operative times, creating a conflict between Accreditation Council for Graduate Medical Education (ACGME) rules and adequate training experience. A ProModel-based simulation was developed from historical data: probabilistic distributions of operative time were calculated and combined with an ACGME-compliant call schedule. For the advanced surgical cases modeled (cardiothoracic transplants), the 80-hour rule was violated at a rate of 6.07%, and the minimum number of days off was violated at 22.50%. There was a 36% chance of failing to fulfill a minimum case requirement (either heart or lung) despite adequate volume. The variable nature of emergency cases inevitably leads to work-hour violations under ACGME regulations, and unpredictable cases mandate higher operative volume to ensure achievement of adequate caseloads. Publicly available simulation technology provides a valuable avenue for assessing the adequacy of trainee case volumes in both the elective and emergency settings. Copyright © 2015 Elsevier Inc. All rights reserved.
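The kind of compliance estimate reported above can be approximated with a simple Monte Carlo over random case arrivals. The arrival rate and operative-time distribution below are hypothetical placeholders, not the study's ProModel distributions fitted from historical data:

```python
import random

def simulate_week(rng, scheduled=60.0, arrival_rate=0.01, mean_op=6.0, sd_op=2.0):
    """Hours worked in one week: scheduled hours plus unpredictable cases.
    Case arrivals follow a Poisson process (rate per hour); operative times
    are normally distributed, truncated at 1 hour. All parameters here are
    illustrative assumptions."""
    hours, t = scheduled, rng.expovariate(arrival_rate)
    while t < 168.0:  # arrivals within the 168-hour week
        hours += max(1.0, rng.gauss(mean_op, sd_op))
        t += rng.expovariate(arrival_rate)
    return hours

def violation_rate(weeks=10000, seed=7):
    """Fraction of simulated weeks exceeding the ACGME 80-hour limit."""
    rng = random.Random(seed)
    return sum(simulate_week(rng) > 80.0 for _ in range(weeks)) / weeks

rate = violation_rate()
```

Sweeping the arrival rate or scheduled hours in such a model shows how quickly emergency volume pushes a compliant schedule into violation territory.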
Enríquez, Diego; Lamborizio, María J; Firenze, Lorena; Jaureguizar, María de la P; Díaz Pumará, Estanislao; Szyld, Edgardo
2017-08-01
To evaluate the performance of resident physicians in diagnosing and treating a case of anaphylaxis six months after participating in simulation training exercises. Initially, a group of pediatric residents was trained using simulation techniques in the management of critical pediatric cases. Based on their performance in this exercise, participants were assigned to one of 3 groups. Six months post-training, 4 residents were randomly chosen from each group to be re-tested using the same performance measure as before. During the initial training session, 56 of 72 participants (78%) correctly identified and treated the case. Six months after the initial training, all 12 (100%) resident physicians who were re-tested successfully diagnosed and treated the simulated anaphylaxis case. Simulation-based training allowed correction or optimization of the treatment of simulated anaphylaxis cases among resident physicians evaluated 6 months after the initial training.
Simulation-Based Valuation of Transactive Energy Systems
Huang, Qiuhua; McDermott, Tom; Tang, Yingying; ...
2018-05-18
Transactive Energy (TE) has been recognized as a promising technique for integrating responsive loads and distributed energy resources as well as advancing grid modernization. To help the industry better understand the value of TE and compare different TE schemes in a systematic and transparent manner, a comprehensive simulation-based TE valuation method is developed. The method has the following salient features: 1) it formally defines the valuation scenarios, use cases, baseline and valuation metrics; 2) an open-source simulation platform for transactive energy systems has been developed by integrating transmission, distribution and building simulators, and plugin TE and non-TE agents through the Framework for Network Co-Simulation (FNCS); 3) transparency and flexibility of the valuation is enhanced through separation of simulation and valuation, base valuation metrics and final valuation metrics. In conclusion, a valuation example based on the Smart Grid Interoperability Panel (SGIP) Use Case 1 is provided to demonstrate the developed TE simulation program and the valuation method.
Anonymity and Historical-Anonymity in Location-Based Services
NASA Astrophysics Data System (ADS)
Bettini, Claudio; Mascetti, Sergio; Wang, X. Sean; Freni, Dario; Jajodia, Sushil
The problem of protecting user’s privacy in Location-Based Services (LBS) has been extensively studied recently, and several defense techniques have been proposed. In this contribution, we first present a categorization of privacy attacks and related defenses. Then, we consider the class of defense techniques that aim at providing privacy through anonymity, and in particular algorithms achieving “historical k-anonymity” in the case of an adversary obtaining a trace of requests recognized as being issued by the same (anonymous) user. Finally, we investigate the issues involved in the experimental evaluation of anonymity-based defense techniques; we show that user movement simulations based on mostly random movements can lead to overestimating the privacy protection in some cases and to overprotective techniques in others. The above results are obtained by comparison to a more realistic simulation with an agent-based simulator, considering a specific deployment scenario.
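As a rough illustration of the historical k-anonymity idea (a hedged sketch, not the authors' algorithm): a trace of generalized requests preserves anonymity only if, at every request, at least k users' movement histories remain consistent with the generalized area of that request.

```python
# Illustrative sketch: a trace satisfies historical k-anonymity if at least
# k users' histories are consistent with every generalized request in it.
# Data model (cells, per-time-step histories) is an assumption for this demo.

def historical_k_anonymity(trace, histories, k):
    """trace: list of generalized areas (sets of cells), one per request.
    histories: {user: list of cells occupied at the corresponding times}."""
    candidates = set(histories)
    for t, area in enumerate(trace):
        # keep only users whose position at time t falls inside the area
        candidates = {u for u in candidates if histories[u][t] in area}
    return len(candidates) >= k

histories = {"u1": ["a", "b"], "u2": ["a", "b"], "u3": ["a", "c"]}
print(historical_k_anonymity([{"a"}, {"b"}], histories, 2))  # → True
```

Note how the candidate set can only shrink over the trace, which is why a sequence of requests is harder to anonymize than a single one.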
Couto, Thomaz Bittencourt; Farhat, Sylvia C L; Geis, Gary L; Olsen, Orjan; Schvartsman, Claudio
2015-06-01
To compare high-fidelity simulation with case-based discussion for teaching medical students about pediatric emergencies, as assessed by a knowledge post-test, a knowledge retention test and a survey of satisfaction with the method. This was a non-randomized controlled study using a crossover design for the methods, as well as multiple-choice questionnaire tests and a satisfaction survey. Final-year medical students were allocated into two groups: group 1 participated in an anaphylaxis simulation and a discussion of a supraventricular tachycardia case, and conversely, group 2 participated in a discussion of an anaphylaxis case and a supraventricular tachycardia simulation. Students were tested on each theme at the end of their rotation (post-test) and 4-6 months later (retention test). Most students (108, or 66.3%) completed all of the tests. The mean scores for simulation versus case-based discussion were respectively 43.6% versus 46.6% for the anaphylaxis pre-test (p=0.42), 63.5% versus 67.8% for the post-test (p=0.13) and 61.5% versus 65.5% for the retention test (p=0.19). Additionally, the mean scores were respectively 33.9% versus 31.6% for the supraventricular tachycardia pre-test (p=0.44), 42.5% versus 47.7% for the post-test (p=0.09) and 41.5% versus 39.5% for the retention test (p=0.47). For both themes, there was improvement between the pre-test and the post-test (p<0.05), and no significant difference was observed between the post-test and the retention test (p>0.05). Moreover, the satisfaction survey revealed a preference for simulation (p<0.001). As a single intervention, simulation is not significantly different from case-based discussion in terms of acquisition and retention of knowledge but is superior in terms of student satisfaction.
Decision Manifold Approximation for Physics-Based Simulations
NASA Technical Reports Server (NTRS)
Wong, Jay Ming; Samareh, Jamshid A.
2016-01-01
With the recent surge of success in big-data driven deep learning problems, many of these frameworks focus on the notion of architecture design and utilizing massive databases. However, in some scenarios massive sets of data may be difficult, and in some cases infeasible, to acquire. In this paper we discuss a trajectory-based framework that quickly learns the underlying decision manifold of binary simulation classifications while judiciously selecting exploratory target states to minimize the number of required simulations. Furthermore, we draw particular attention to the simulation prediction application idealized to the case where failures in simulations can be predicted and avoided, providing machine intelligence to novice analysts. We demonstrate this framework in various forms of simulations and discuss its efficacy.
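The idea of judiciously selecting exploratory target states to minimize the number of simulations can be sketched under strong simplifying assumptions: a 1-D state with a single success/failure threshold, which is far simpler than a general decision manifold, but shows the query-efficiency principle.

```python
# Minimal active-learning sketch in the spirit of the framework: query the
# simulation only where the success/failure boundary is most uncertain
# (bisection on a 1-D state) instead of simulating a dense grid of states.

def learn_boundary(simulate, lo, hi, budget):
    """simulate(x) -> True (success) / False (failure). Assumes a single
    threshold between lo (success side) and hi (failure side); returns
    its estimate after `budget` simulations."""
    for _ in range(budget):
        mid = 0.5 * (lo + hi)
        if simulate(mid):       # judiciously chosen exploratory state
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# toy "simulation": the run fails above an (invented) threshold of 0.62
est = learn_boundary(lambda x: x < 0.62, 0.0, 1.0, budget=20)
print(round(est, 3))  # → 0.62
```

Twenty simulations localize the boundary to about one part in a million, whereas a grid of the same resolution would need on the order of a million runs.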
Fung, Lillia; Boet, Sylvain; Bould, M Dylan; Qosa, Haytham; Perrier, Laure; Tricco, Andrea; Tavares, Walter; Reeves, Scott
2015-01-01
Crisis resource management (CRM) abilities are important for different healthcare providers to effectively manage critical clinical events. This study aims to review the effectiveness of simulation-based CRM training for interprofessional and interdisciplinary teams compared to other instructional methods (e.g., didactics). Interprofessional teams are composed of several professions (e.g., nurse, physician, midwife) while interdisciplinary teams are composed of several disciplines from the same profession (e.g., cardiologist, anaesthesiologist, orthopaedist). Medline, EMBASE, CINAHL, the Cochrane Central Register of Controlled Trials, and ERIC were searched using terms related to CRM, crisis management, crew resource management, teamwork, and simulation. Trials comparing simulation-based CRM team training versus any other method of education were included. The educational interventions involved interprofessional or interdisciplinary healthcare teams. The initial search identified 7456 publications; 12 studies were included. Simulation-based CRM team training was associated with significant improvements in CRM skill acquisition in all but two studies when compared to didactic case-based CRM training or simulation without CRM training. Of the 12 included studies, one showed significant improvements in team behaviours in the workplace, while two studies demonstrated sustained reductions in adverse patient outcomes after a single simulation-based CRM team intervention. In conclusion, CRM simulation-based training for interprofessional and interdisciplinary teams shows promise in teaching CRM in the simulator when compared to didactic case-based CRM education or simulation without CRM teaching. More research, however, is required to demonstrate transfer of learning to workplaces and potential impact on patient outcomes.
Implementation and evaluation of a dilation and evacuation simulation training curriculum.
York, Sloane L; McGaghie, William C; Kiley, Jessica; Hammond, Cassing
2016-06-01
To evaluate obstetrics and gynecology resident physicians' performance following a simulation curriculum on dilation and evacuation (D&E) procedures. This study included two phases: simulation curriculum development and evaluation of resident physician performance following training on a D&E simulator. Trainees participated in two evaluations. Simulation training evaluated participants performing six cases on a D&E simulator, measuring procedural time and a 26-step checklist of D&E steps. The operative training portion evaluated residents' performance after training on the simulator using mastery learning techniques. Intra-operative evaluation was based on a 21-step checklist score, the Objective Structured Assessment of Technical Skills (OSATS), and the percentage of cases completed. Twenty-two residents participated in simulation training, demonstrating improved performance from cases one and two to cases five and six, as measured by checklist score and procedural time (p<.001 and p=.001, respectively). Of 10 participants in the operative training, all performed at least three D&Es, while seven performed at least six cases. While checklist scores did not change significantly from the first to the sixth case (mean for first case: 18.3; for sixth case: 19.6; p=.593), OSATS ratings improved from case one (19.7) to case three (23.5; p=.001) and to case six (26.8; p=.005). Trainees completed approximately 71.6% of their first case (range: 21.4-100%). By case six, the six participants performed 81.2% of the case (range: 14.3-100%). D&E simulation using a newly developed uterine model and simulation curriculum improves resident technical skills. Simulation training with mastery learning techniques transferred to a high level of performance in the operating room, as measured by the checklist, and the OSATS showed improvement in performance with subsequent cases. Implementation of a D&E simulation curriculum offers potential for improved surgical training and abortion provision.
Copyright © 2016 Elsevier Inc. All rights reserved.
Petkewich, Matthew D.; Campbell, Bruce G.
2009-01-01
The effect of injecting reclaimed water into the Middendorf aquifer beneath Mount Pleasant, South Carolina, was simulated using a groundwater-flow model of the Coastal Plain Physiographic Province of South Carolina and parts of Georgia and North Carolina. Reclaimed water, also known as recycled water, is wastewater or stormwater that has been treated to an appropriate level so that the water can be reused. The scenarios were simulated to evaluate potential changes in groundwater flow and groundwater-level conditions caused by injecting reclaimed water into the Middendorf aquifer. Simulations included a Base Case and two injection scenarios. Maximum pumping rates were simulated as 6.65, 8.50, and 10.5 million gallons per day for the Base Case, Scenario 1, and Scenario 2, respectively. The Base Case simulation represents a non-injection estimate of the year 2050 groundwater levels for comparison purposes for the two injection scenarios. For Scenarios 1 and 2, the simulated injection of reclaimed water at 3 million gallons per day begins in 2012 and continues through 2050. The flow paths and time of travel for the injected reclaimed water were simulated using particle-tracking analysis. The simulations indicated a general decline of groundwater altitudes in the Middendorf aquifer in the Mount Pleasant, South Carolina, area between 2004 and 2050 for the Base Case and two injection scenarios. For the Base Case, groundwater altitudes generally declined about 90 feet from the 2004 groundwater levels. For Scenarios 1 and 2, although groundwater altitudes initially increased in the Mount Pleasant area because of the simulated injection, these higher groundwater levels declined as Mount Pleasant Waterworks pumping increased over time. 
When compared to the Base Case simulation, 2050 groundwater altitudes for Scenario 1 are between 15 feet lower and 23 feet higher for production wells, between 41 and 77 feet higher for the injection wells, and between 9 and 23 feet higher for observation wells in the Mount Pleasant area. When compared to the Base Case simulation, 2050 groundwater altitudes for Scenario 2 are between 2 and 106 feet lower for production wells and observation wells and between 11 and 27 feet higher for the injection wells in the Mount Pleasant area. Water budgets for the model area immediately surrounding the Mount Pleasant area were calculated for 2011 and for 2050. The largest flow component for the 2050 water budget in the Mount Pleasant area is discharge through wells at rates between 7.1 and 10.9 million gallons of water per day. This groundwater is replaced predominantly by between 6.0 and 7.8 million gallons per day of lateral groundwater flow within the Middendorf aquifer for the Base Case and two scenarios and through reclaimed-water injection of 3 million gallons per day for Scenarios 1 and 2. In addition, between 175,000 and 319,000 gallons of groundwater are removed from this area per day because of the regional hydraulic gradient. Additional sources of water to this area are groundwater storage releases at rates between 86,800 and 116,000 gallons per day and vertical flow from over- and underlying confining units at rates between 69,100 and 150,000 gallons per day. Reclaimed water injected into the Middendorf aquifer at three hypothetical injection wells moved to the Mount Pleasant Waterworks production wells in 18 to 256 years, as indicated by particle-tracking simulations. Time of travel varied from 18 to 179 years for simulated conditions of 20 percent uniform aquifer porosity and between 25 and 256 years for 30 percent uniform aquifer porosity.
ERIC Educational Resources Information Center
Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.
2011-01-01
An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…
[Comparison of Flu Outbreak Reporting Standards Based on Transmission Dynamics Model].
Yang, Guo-jing; Yi, Qing-jie; Li, Qin; Zeng, Qing
2016-05-01
To compare the current two flu outbreak reporting standards for the purpose of better prevention and control of flu outbreaks. A susceptible-exposed-infectious/asymptomatic-removed (SEIAR) model without interventions was set up first, followed by a model with interventions based on the real situation. Simulated interventions were developed based on the two reporting standards and evaluated by estimated duration of outbreaks, cumulative new cases, cumulative morbidity rates, decline in percentage of morbidity rates, and cumulative secondary cases. The basic reproductive number of the outbreak was estimated as 8.2. The simulation produced results similar to the real situation. The effect of interventions based on reporting standard one (10 accumulated new cases in a week) was better than that of interventions based on reporting standard two (30 accumulated new cases in a week). Reporting standard one (10 accumulated new cases in a week) is therefore more effective for the prevention and control of flu outbreaks.
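A minimal discrete-time SEIAR sketch, with compartments as named in the abstract, can make the model structure concrete. All parameter values below are illustrative placeholders, not the study's fitted values.

```python
# Hedged SEIAR sketch (Susceptible, Exposed, symptomatic Infectious,
# Asymptomatic, Removed). Rates and the 0.5 asymptomatic-infectiousness
# factor are invented for illustration.

def seiar_step(s, e, i, a, r, beta=0.6, kappa=0.4, p=0.7, gamma=0.3, n=1000.0):
    new_inf = beta * s * (i + 0.5 * a) / n   # new exposures this step
    s2 = s - new_inf
    e2 = e + new_inf - kappa * e             # incubation ends at rate kappa
    i2 = i + p * kappa * e - gamma * i       # fraction p become symptomatic
    a2 = a + (1 - p) * kappa * e - gamma * a
    r2 = r + gamma * (i + a)                 # recovery/removal
    return s2, e2, i2, a2, r2

state = (999.0, 0.0, 1.0, 0.0, 0.0)          # one initial symptomatic case
for _ in range(60):
    state = seiar_step(*state)
print(round(sum(state), 3))  # → 1000.0 (population is conserved)
```

Interventions triggered by a reporting standard would be modeled by reducing beta once cumulative new cases cross the standard's threshold, which is how the two standards can be compared in simulation.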
2013-01-01
Background The validity of studies describing clinicians’ judgements based on their responses to paper cases is questionable, because - commonly used - paper case simulations only partly reflect real clinical environments. In this study we test whether paper case simulations evoke similar risk assessment judgements to the more realistic simulated patients used in high fidelity physical simulations. Methods 97 nurses (34 experienced nurses and 63 student nurses) made dichotomous assessments of risk of acute deterioration on the same 25 simulated scenarios in both paper case and physical simulation settings. Scenarios were generated from real patient cases. Measures of judgement ‘ecology’ were derived from the same case records. The relationship between nurses’ judgements, actual patient outcomes (i.e. ecological criteria), and patient characteristics were described using the methodology of judgement analysis. Logistic regression models were constructed to calculate Lens Model Equation parameters. Parameters were then compared between the modeled paper-case and physical-simulation judgements. Results Participants had significantly lower achievement (ra) when judging physical simulations than when judging paper cases. They used less modelable knowledge (G) with physical simulations than with paper cases, while retaining similar cognitive control and consistency on repeated patients. Respiration rate, the most important cue for predicting patient risk in the ecological model, was weighted most heavily by participants. Conclusions To the extent that accuracy in judgement analysis studies is a function of task representativeness, improving task representativeness via high fidelity physical simulations resulted in lower judgement performance in risk assessments amongst nurses when compared to paper case simulations. Lens Model statistics could prove useful when comparing different options for the design of simulations used in clinical judgement analysis.
The approach outlined may be of value to those designing and evaluating clinical simulations as part of education and training strategies aimed at improving clinical judgement and reasoning. PMID:23718556
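For reference, the Lens Model Equation decomposes achievement ra into knowledge (G), environmental predictability (Re), and consistency (Rs): ra = G·Re·Rs + C·sqrt(1-Re²)·sqrt(1-Rs²). The sketch below computes the correlational components with plain Pearson correlations on invented data; the study itself fitted logistic regression models for its dichotomous judgements, so this is a deliberate simplification.

```python
# Sketch of Lens Model Equation components via plain Pearson correlations.
# The study used logistic regression models; data here are invented.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def lens_model(judgements, outcomes, pred_judge, pred_env):
    """Return (ra, G, Re, Rs): achievement, modelable knowledge,
    environmental predictability, and cognitive control."""
    ra = pearson(judgements, outcomes)      # achievement
    Rs = pearson(judgements, pred_judge)    # cognitive control / consistency
    Re = pearson(outcomes, pred_env)        # task predictability
    G = pearson(pred_judge, pred_env)       # knowledge captured by the models
    return ra, G, Re, Rs

# invented numbers: judgements, outcomes, and model predictions for 4 cases
ra, G, Re, Rs = lens_model(
    judgements=[0.2, 0.9, 0.4, 0.8],
    outcomes=[0.0, 1.0, 0.0, 1.0],
    pred_judge=[0.3, 0.8, 0.4, 0.9],
    pred_env=[0.1, 0.9, 0.2, 0.8],
)
print(round(ra, 2))  # → 0.96
```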
A Novel Approach to Medical Student Peer-assisted Learning Through Case-based Simulations
Jauregui, Joshua; Bright, Steven; Strote, Jared; Shandro, Jamie
2018-01-01
Introduction Peer-assisted learning (PAL) is the development of new knowledge and skills through active learning support from peers. Benefits of PAL include introduction of teaching skills for students, creation of a safe learning environment, and efficient use of faculty time. We present a novel approach to PAL in an emergency medicine (EM) clerkship curriculum using an inexpensive, tablet-based app for students to cooperatively present and perform low-fidelity, case-based simulations that promotes accountability for student learning, fosters teaching skills, and economizes faculty presence. Methods We developed five clinical cases in the style of EM oral boards. Fourth-year medical students were each assigned a unique case one week in advance. Students also received an instructional document and a video example detailing how to lead a case. During the 90-minute session, students were placed in small groups of 3–5 students and rotated between facilitating their assigned cases and participating as a team for the cases presented by their fellow students. Cases were supplemented with a half-mannequin that can be intubated, airway supplies, and a tablet-based app (SimMon, $22.99) to remotely display and update vital signs. One faculty member rotated among groups to provide additional assistance and clarification. Three EM faculty members iteratively developed a survey, based on the literature and pilot tested it with fourth-year medical students, to evaluate the course. Results 135 medical students completed the course and course evaluation survey. Learner satisfaction was high with an overall score of 4.6 on a 5-point Likert scale. In written comments, students reported that small groups with minimal faculty involvement provided a safe learning environment and a unique opportunity to lead a group of peers. They felt that PAL was more effective than traditional simulations for learning. Faculty reported that students remained engaged and required minimal oversight. 
Conclusion Unlike other simulations, our combination of brief, student-assisted cases using low-fidelity simulation provides a cost-, resource- and time-effective way to implement a medical student clerkship educational experience. PMID:29383080
Residents' perceptions of simulation as a clinical learning approach.
Walsh, Catharine M; Garg, Ankit; Ng, Stella L; Goyal, Fenny; Grover, Samir C
2017-02-01
Simulation is increasingly being integrated into medical education; however, there is little research into trainees' perceptions of this learning modality. We elicited trainees' perceptions of simulation-based learning, to inform how simulation is developed and applied to support training. We conducted an instrumental qualitative case study entailing 36 semi-structured one-hour interviews with 12 residents enrolled in an introductory simulation-based course. Trainees were interviewed at three time points: pre-course, post-course, and 4-6 weeks later. Interview transcripts were analyzed using a qualitative descriptive analytic approach. Residents' perceptions of simulation included: 1) simulation serves pragmatic purposes; 2) simulation provides a safe space; 3) simulation presents perils and pitfalls; and 4) optimal design for simulation: integration and tension. Key findings included residents' markedly narrow perception of simulation's capacity to support non-technical skills development or its use beyond introductory learning. Trainees' learning expectations of simulation were restricted. Educators should critically attend to the way they present simulation to learners as, based on theories of problem-framing, trainees' a priori perceptions may delimit the focus of their learning experiences. If they view simulation as merely a replica of real cases for the purpose of practicing basic skills, they may fail to benefit from the full scope of learning opportunities afforded by simulation.
Lang, Alon; Melzer, Ehud; Bar-Meir, Simon; Eliakim, Rami; Ziv, Amitai
2006-11-01
The continuing development of computer-based medical simulators provides an ideal platform for simulator-assisted training programs for medical trainees. Computer-based endoscopic simulators provide a virtual reality environment for training in endoscopic procedures. This study illustrates the use of a comprehensive training model combining the use of endoscopic simulators with simulated (actor) patients (SP). To evaluate the effectiveness of a comprehensive simulation workshop from the trainee perspective. Four case studies were developed with an emphasis on communication skills. Three workshops with 10 fellows each were conducted. During each workshop the trainees spent half of the time in SP case studies and the remaining half working with computerized endoscopic simulators with continuous guidance by an expert endoscopist. Questionnaires were completed by the fellows at the end of the workshop. Seventy percent of the fellows felt that the endoscopic simulator was close or very close to reality for gastroscopy, and 63% for colonoscopy. Eighty-eight percent thought the close guidance was important for the learning process with the simulator. Eighty percent felt that the case studies were an important learning experience for risk management. Further evaluation of multi-modality simulation workshops in gastroenterologist training is needed to identify how best to incorporate this form of instruction into training for gastroenterologists.
ERIC Educational Resources Information Center
Loke, Swee-Kin; Tordoff, June; Winikoff, Michael; McDonald, Jenny; Vlugter, Peter; Duffull, Stephen
2011-01-01
Several scholars contend that learning with computer games and simulations results in students thinking more like professionals. Bearing this goal in mind, we investigated how a group of pharmacy students learnt with an in-house developed computer simulation, SimPharm. Adopting situated cognition as our theoretical lens, we conducted a case study…
Curtin, Lindsay B; Finn, Laura A; Czosnowski, Quinn A; Whitman, Craig B; Cawley, Michael J
2011-08-10
To assess the impact of computer-based simulation on the achievement of student learning outcomes during mannequin-based simulation. Participants were randomly assigned to rapid response teams of 5-6 students, and teams were then randomly assigned to complete either the computer-based or the mannequin-based simulation cases first. In both simulations, students used their critical thinking skills and selected interventions independent of facilitator input. A predetermined rubric was used to record and assess students' performance in the mannequin-based simulations. Feedback and student performance scores were generated by the software in the computer-based simulations. More of the teams in the group that completed the computer-based simulation before the mannequin-based simulation achieved the primary outcome for the exercise, which was survival of the simulated patient (41.2% vs. 5.6%). The majority of students (>90%) recommended the continuation of simulation exercises in the course. Students in both groups felt the computer-based simulation should be completed prior to the mannequin-based simulation. The use of computer-based simulation prior to mannequin-based simulation improved the achievement of learning goals and outcomes. In addition to improving participants' skills, completing the computer-based simulation first may improve participants' confidence during the more real-life setting achieved in the mannequin-based simulation.
Use of clinical simulations for patient education: targeting an untapped audience.
Siwe, Karin; Berterö, Carina; Pugh, Carla; Wijma, Barbro
2009-01-01
In most cases, the health professional has been the target for simulation-based learning curricula. We have developed a simulation-based curriculum for patient education. In our curriculum, lay-women learn how to perform the clinical female pelvic examination using a manikin-based trainer. Learner assessments show that prior negative expectations turned into positive expectations regarding future pelvic examinations.
Case-based tutoring from a medical knowledge base.
Chin, H L; Cooper, G F
1989-01-01
The past decade has seen the emergence of programs that make use of large knowledge bases to assist physicians in diagnosis within the general field of internal medicine. One such program, Internist-I, contains knowledge about over 600 diseases, covering a significant proportion of internal medicine. This paper describes the process of converting a subset of this knowledge base--in the area of cardiovascular diseases--into a probabilistic format, and the use of this resulting knowledge base to teach medical diagnostic knowledge. The system (called KBSimulator--for Knowledge-Based patient Simulator) generates simulated patient cases and uses these cases as a focal point from which to teach medical knowledge. This project demonstrates the feasibility of building an intelligent, flexible instructional system that uses a knowledge base constructed primarily for medical diagnosis.
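A toy version of generating a simulated patient case from a probabilistic knowledge base might look like the following. The disease, findings, and conditional probabilities are invented for illustration and are not taken from Internist-I or KBSimulator, and findings are sampled independently, which is a strong simplification.

```python
# Illustrative sketch of case generation from a probabilistic knowledge base:
# given a disease, each finding appears with its conditional probability.
# Disease, findings, and probabilities are invented, not from Internist-I.
import random

KB = {
    "aortic stenosis": {"syncope": 0.4, "systolic murmur": 0.9, "dyspnea": 0.6},
}

def simulate_case(disease, kb, rng):
    """Return the sorted list of findings present in one simulated patient."""
    return sorted(f for f, p in kb[disease].items() if rng.random() < p)

rng = random.Random(7)   # fixed seed for reproducible teaching cases
print(simulate_case("aortic stenosis", KB, rng))
```

A tutoring system can then present the sampled findings to the learner and score a proposed diagnosis against the disease that generated them.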
NASA Technical Reports Server (NTRS)
Daigle, Matthew John; Goebel, Kai Frank
2010-01-01
Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
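The unscented transform at the core of the method can be sketched for a scalar state (a minimal illustration, not the authors' implementation): propagate 2n+1 sigma points through a nonlinear map standing in for the EOL simulation, then recover the output mean and variance from weighted sums.

```python
# Scalar unscented-transform sketch (Julier-style weights). The nonlinear
# map f stands in for the EOL simulation; inputs are illustrative.
import math

def unscented_transform(mean, var, f, kappa=2.0):
    """Approximate the mean/variance of f(x) for x ~ (mean, var)
    from 2n+1 = 3 sigma points (n = 1 here)."""
    n = 1
    spread = math.sqrt((n + kappa) * var)
    sigma = [mean, mean + spread, mean - spread]
    weights = [kappa / (n + kappa), 0.5 / (n + kappa), 0.5 / (n + kappa)]
    ys = [f(x) for x in sigma]                 # one "EOL simulation" per point
    y_mean = sum(w * y for w, y in zip(weights, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(weights, ys))
    return y_mean, y_var

# simple nonlinear map as a stand-in for the EOL simulation
m, v = unscented_transform(2.0, 0.25, lambda x: x ** 2)
print(round(m, 3), round(v, 3))  # → 4.25 4.125
```

Three simulations recover the exact mean and variance for this quadratic map, which is the computational-efficiency argument: a handful of sigma-point runs replaces a Monte Carlo sweep.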
Empirical Analysis and Refinement of Expert System Knowledge Bases
1988-08-31
refinement. Both a simulated case generation program and a random rule basher were developed to enhance rule refinement experimentation. ...the second fiscal year 88 objective was fully met. ...Cases are generated until the rule is satisfied; cases may be randomly generated for a given rule or hypothesis. ...
Pneumafil casing blower through moving reference frame (MRF) - A CFD simulation
NASA Astrophysics Data System (ADS)
Manivel, R.; Vijayanandh, R.; Babin, T.; Sriram, G.
2018-05-01
In this analysis work, the ring frame Pneumafil casing blower of textile mills, with a power rating of 5 kW, has been simulated using a Computational Fluid Dynamics (CFD) code. The CFD analysis of the blower is carried out in Ansys Workbench 16.2 with Fluent, using MRF solver settings. The simulation settings and boundary conditions are based on a literature study and field data acquired. The main objective of this work is to reduce the energy consumption of the blower. The flow analysis indicated that power consumption is influenced by the deflector plate orientation and the deflector plate strip situated at the outlet casing of the blower. The energy losses in the blower are due to recirculation zones formed around the deflector plate strip. The deflector plate orientation was changed and optimized to reduce energy consumption. The proposed optimized model, based on the simulation results, has lower power consumption than the existing and other cases. The energy losses in the Pneumafil casing blower are thus reduced through CFD analysis.
Clinical results of computerized tomography-based simulation with laser patient marking.
Ragan, D P; Forman, J D; He, T; Mesina, C F
1996-02-01
The accuracy of a patient treatment portal marking device and computerized tomography (CT) simulation has been clinically tested. A CT-based simulator was assembled around a commercial CT scanner, including visualization software and a computer-controlled laser drawing device. The laser drawing device is used to transfer the setup, central axis, and/or radiation portals from the CT simulator to the patient for appropriate skin marking. A protocol for clinical testing is reported. Twenty-five prospectively, sequentially accessioned patients were analyzed. The simulation process can be completed in an average time of 62 min. In many cases, the treatment portals can be designed and the patient marked in one session. Mechanical accuracy of the system was found to be within +/- 1 mm. The portal projection accuracy in clinical cases was observed to be better than +/- 1.2 mm. Operating costs are equivalent to those of the conventional simulation process it replaces. CT simulation is a clinically accurate substitute for conventional simulation when used with an appropriate patient marking system and digitally reconstructed radiographs. Personnel time spent in CT simulation is equivalent to time in conventional simulation.
Web-Based Simulation Games for the Integration of Engineering and Business Fundamentals
ERIC Educational Resources Information Center
Calfa, Bruno; Banholzer, William; Alger, Monty; Doherty, Michael
2017-01-01
This paper describes a web-based suite of simulation games that have the purpose to enhance the chemical engineering curriculum with business-oriented decisions. Two simulation cases are discussed whose teaching topics include closing material and energy balances, importance of recycle streams, price-volume relationship in a dynamic market, impact…
Research on burnout fault of moulded case circuit breaker based on finite element simulation
NASA Astrophysics Data System (ADS)
Xue, Yang; Chang, Shuai; Zhang, Penghe; Xu, Yinghui; Peng, Chuning; Shi, Erwei
2017-09-01
Among failure events of molded case circuit breakers, overheating of the molded case near the wiring terminal accounts for a very large proportion, and burnout faults have become an important factor restricting the development of molded case circuit breakers. This paper uses finite element simulation software to establish a multi-physics coupled model of a molded case circuit breaker. The model can simulate operation and reveal the temperature distribution. The simulation results show that the temperature near the wiring terminal, especially on the incoming side of the live wire, is much higher than in other areas. Steady-state and transient simulations show that the temperature at the wiring terminals rises abnormally when the contact resistance of the terminals is increased, which is consistent with the frequent burnout of the molded case in this area. This paper therefore concludes that burnout failure of the molded case circuit breaker is mainly caused by an abnormal increase in the contact resistance of the wiring terminal.
System-Level Reuse of Space Systems Simulations
NASA Technical Reports Server (NTRS)
Hazen, Michael R.; Williams, Joseph C.
2004-01-01
One of the best ways to enhance space systems simulation fidelity is to leverage off of (reuse) existing high-fidelity simulations. But what happens when the model you would like to reuse is in a different coding language or other barriers arise that make one want to just start over with a clean sheet of paper? Three diverse system-level simulation reuse case studies are described based on experience to date in the development of NASA's Space Station Training Facility (SSTF) at the Johnson Space Center in Houston, Texas. Case studies include (a) the Boeing/Rocketdyne-provided Electrical Power Simulation (EPSIM), (b) the NASA Automation and Robotics Division-provided TRICK robotics systems model, and (c) the Russian Space Agency- provided Russian Segment Trainer. In each case, there was an initial tendency to dismiss simulation reuse candidates based on an apparent lack of suitability. A more careful examination based on a more structured assessment of architectural and requirements-oriented representations of the reuse candidates revealed significant reuse potential. Specific steps used to conduct the detailed assessments are discussed. The steps include the following: 1) Identifying reuse candidates; 2) Requirements compatibility assessment; 3) Maturity assessment; 4) Life-cycle cost determination; and 5) Risk assessment. Observations and conclusions are presented related to the real cost of system-level simulation component reuse. Finally, lessons learned that relate to maximizing the benefits of space systems simulation reuse are shared. These concepts should be directly applicable for use in the development of space systems simulations in the future.
Chiêm, Jean-Christophe; Van Durme, Thérèse; Vandendorpe, Florence; Schmitz, Olivier; Speybroeck, Niko; Cès, Sophie; Macq, Jean
2014-08-01
Various elderly case management projects have been implemented in Belgium. This type of long-term health care intervention involves contextual factors and human interactions. These underlying complex mechanisms can be usefully informed by field experts' knowledge, which is hard to make explicit. However, computer simulation has been suggested as one possible method of overcoming the difficulty of articulating such elicited qualitative views. A simulation model of case management was designed using an agent-based methodology, based on the initial qualitative research material. Variables and rules of interaction were formulated into a simple conceptual framework. This model was implemented and used as a support for a structured discussion with experts in case management. The rigorous formulation provided by the agent-based methodology clarified the descriptions of the interventions and of the problems encountered regarding: the diverse network topologies of health care actors in the project; the adaptation time required by the intervention; the communication between the health care actors; the institutional context; the organization of the care; and the role of the case manager and his or her personal ability to interpret the informal demands of the frail older person. The simulation model should be seen primarily as a tool for thinking and learning. A number of insights were gained as part of a valuable cognitive process. Computer simulation supporting field experts' elicitation can lead to better-informed decisions in the organization of complex health care interventions. © 2013 John Wiley & Sons, Ltd.
A qualitative and quantitative assessment for a bone marrow harvest simulator.
Machado, Liliane S; Moraes, Ronei M
2009-01-01
Several approaches to perform assessment in training simulators based on virtual reality have been proposed. There are two kinds of assessment methods: offline and online. The main requirements for online training assessment methodologies applied to virtual reality systems are low computational complexity and high accuracy. Several approaches for general cases which satisfy these requirements can be found in the literature. A drawback of those approaches is that they handle specific cases unsatisfactorily, as in some medical procedures where both quantitative and qualitative information are available to perform the assessment. In this paper, we present an approach to online training assessment based on a Modified Naive Bayes method which can handle qualitative and quantitative variables simultaneously. A special medical case was simulated in a bone marrow harvest simulator. The results obtained were satisfactory and evidenced the applicability of the method.
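As a sketch of how a naive Bayes classifier can combine the two variable types, consider the following: a Gaussian likelihood for a quantitative measure and Laplace-smoothed frequency counts for a qualitative one. The feature names and training data are invented for illustration; this is not the paper's Modified Naive Bayes.

```python
import math
from collections import Counter

class MixedNB:
    """Naive Bayes over one quantitative and one qualitative feature:
    Gaussian likelihood for the number, smoothed counts for the category."""
    def fit(self, data):
        # data: {class_label: [(numeric_value, categorical_value), ...]}
        total = sum(len(rows) for rows in data.values())
        self.model = {}
        for label, rows in data.items():
            nums = [n for n, _ in rows]
            mu = sum(nums) / len(nums)
            var = max(sum((n - mu) ** 2 for n in nums) / len(nums), 1e-6)
            cats = Counter(c for _, c in rows)
            self.model[label] = (len(rows) / total, mu, var, cats, len(rows))
        return self

    def predict(self, num, cat):
        def log_posterior(label):
            prior, mu, var, cats, n = self.model[label]
            lp = math.log(prior)
            lp -= 0.5 * math.log(2 * math.pi * var) + (num - mu) ** 2 / (2 * var)
            lp += math.log((cats[cat] + 1) / (n + 2))  # Laplace smoothing
            return lp
        return max(self.model, key=log_posterior)

# Hypothetical assessment data: needle force (quantitative) and puncture
# site choice (qualitative) from simulated bone marrow harvest attempts.
nb = MixedNB().fit({
    "acceptable":     [(1.0, "correct"), (1.2, "correct"), (0.9, "correct")],
    "needs_training": [(4.0, "wrong"), (5.0, "correct"), (4.5, "wrong")],
})
```

Because each prediction is a handful of log-density evaluations, the classifier meets the low-computational-complexity requirement for online assessment.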
Evaluation of a grid based molecular dynamics approach for polypeptide simulations.
Merelli, Ivan; Morra, Giulia; Milanesi, Luciano
2007-09-01
Molecular dynamics is very important for biomedical research because it makes it possible to simulate the behavior of a biological macromolecule in silico. However, molecular dynamics is computationally rather expensive: simulating a few nanoseconds of dynamics for a large macromolecule such as a protein takes a very long time, owing to the large number of operations needed to solve Newton's equations for a system of thousands of atoms. To obtain biologically significant data, it is desirable to use high-performance computing resources for these simulations. Recently, a distributed computing approach based on replacing a single long simulation with many independent short trajectories has been introduced, which in many cases provides valuable results. This study concerns the development of an infrastructure to run molecular dynamics simulations on a grid platform in a distributed way. The implemented software allows the parallel submission of different simulations that are individually short but together yield important biological information. Moreover, each simulation is divided into a chain of jobs to avoid data loss in case of system failure and to limit the size of each data transfer from the grid. The results confirm that the distributed approach on grid computing is particularly suitable for molecular dynamics simulations, thanks to its high scalability.
The Application of Simulation Method in Isothermal Elastic Natural Gas Pipeline
NASA Astrophysics Data System (ADS)
Xing, Chunlei; Guan, Shiming; Zhao, Yue; Cao, Jinggang; Chu, Yanji
2018-02-01
The elastic pipeline mathematical model is of crucial importance in natural gas pipeline simulation because of its agreement with practical industrial cases. The numerical model of an elastic pipeline introduces non-linear complexity into the discretized equations, so the Newton-Raphson method cannot achieve fast convergence on this kind of problem. A new Newton-based method with the Powell-Wolfe condition is therefore presented to simulate isothermal elastic pipeline flow. Results obtained by the new method are given for the defined boundary conditions. The method is shown to converge in all cases and to reduce computational cost significantly.
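A minimal sketch of such a globalized Newton iteration is shown below. It enforces the sufficient-decrease half of the Powell-Wolfe conditions by backtracking on the merit function 0.5*||F||^2 (a full Wolfe search would also enforce a curvature condition via bracketing, omitted here), and the test equation is a toy stand-in, not the pipeline model.

```python
import numpy as np

def newton_armijo(F, J, x0, c1=1e-4, tol=1e-10, max_iter=50):
    """Newton iteration for F(x) = 0, globalized with a backtracking
    line search on the merit function phi(t) = 0.5*||F(x + t*d)||^2."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = np.linalg.solve(J(x), -Fx)     # Newton direction
        phi0 = 0.5 * Fx @ Fx
        dphi0 = -2.0 * phi0                # directional derivative, since d = -J^{-1} F
        t = 1.0
        while 0.5 * F(x + t * d) @ F(x + t * d) > phi0 + c1 * t * dphi0:
            t *= 0.5                       # backtrack until sufficient decrease
            if t < 1e-12:
                break
        x = x + t * d
    return x

# Toy nonlinear system standing in for the discretized pipeline equations:
root = newton_armijo(lambda x: np.array([x[0] ** 3 - 1.0]),
                     lambda x: np.array([[3.0 * x[0] ** 2]]),
                     [2.0])
```

Far from the solution the damped steps keep the merit function decreasing; near the solution t = 1 is accepted and the usual quadratic Newton convergence is recovered.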
Boundary Avoidance Tracking for Instigating Pilot Induced Oscillations
NASA Technical Reports Server (NTRS)
Craun, Robert W.; Acosta, Diana M.; Beard, Steven D.; Hardy, Gordon H.; Leonard, Michael W.; Weinstein, Michael
2013-01-01
In order to advance research in the area of pilot induced oscillations, a reliable method to create PIOs in a simulated environment is necessary. Using a boundary avoidance tracking task, researchers performing an evaluation of control systems were able to create PIO events in 42% of cases using a nominal aircraft, and 91% of cases using an aircraft with reduced actuator rate limits. The simulator evaluation took place in the NASA Ames Vertical Motion Simulator, a high-fidelity motion-based simulation facility.
The Role of Simulation Case Studies in Enterprise Education
ERIC Educational Resources Information Center
Tunstall, Richard; Lynch, Martin
2010-01-01
Purpose: This paper aims to explore the role of electronic simulation case studies in enterprise education, their effectiveness, and their relationship to traditional forms of classroom-based approaches to experiential learning. The paper seeks to build on previous work within the field of enterprise and management education, specifically in…
A Study of Fan Stage/Casing Interaction Models
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Carney, Kelly; Gallardo, Vicente
2003-01-01
The purpose of the present study is to investigate the performance of several existing and new blade-case interaction modeling capabilities that are compatible with the large system simulations used to capture structural response during blade-out events. Three contact models are examined for simulating the interactions between a rotor bladed disk and a case: a radial gap element, a linear gap element, and a new element based on a hydrodynamic formulation. The first two models are currently available in commercial finite element codes such as NASTRAN and have been shown to perform adequately for simulating rotor-case interactions. The hydrodynamic model, although not readily available in commercial codes, may prove better able to characterize rotor-case interactions.
Guidance Provided by Teacher and Simulation for Inquiry-Based Learning: A Case Study
ERIC Educational Resources Information Center
Lehtinen, Antti; Viiri, Jouni
2017-01-01
Current research indicates that inquiry-based learning should be guided in order to achieve optimal learning outcomes. The need for guidance is even greater when simulations are used because of their high information content and the difficulty of extracting information from them. Previous research on guidance for learning with simulations has…
The transesophageal echocardiography simulator based on computed tomography images.
Piórkowski, Adam; Kempny, Aleksander
2013-02-01
Simulators are a new tool in education in many fields, including medicine, where they greatly improve familiarity with medical procedures, reduce costs and, importantly, cause no harm to patients. This is so in the case of transesophageal echocardiography (TEE), in which the use of a simulator facilitates spatial orientation and helps in case studies. The aim of the project described in this paper is to simulate an examination by TEE. The research makes use of available computed tomography data to simulate the corresponding echocardiographic view. This paper describes the essential characteristics that distinguish these two modalities and the key principles of the wave phenomena that should be considered in the simulation process, taking into account the conditions specific to echocardiography. The construction of CT2TEE (a Web-based TEE simulator) is also presented. The considerations include ray-tracing and ray-casting techniques in the context of ultrasound beam and artifact simulation. Interaction with the user is also an important aspect of the design.
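The ray-casting idea can be illustrated with a toy version: march along a ray through a CT volume and emit an echo wherever the sampled intensity (a crude impedance proxy) changes, attenuating the remaining energy with depth. The sampling scheme and reflection model here are simplistic assumptions, not CT2TEE's actual algorithm.

```python
import numpy as np

def cast_ray(volume, origin, direction, step=0.5, attenuation=0.01):
    """Sample a CT volume along one ultrasound ray and return a simulated
    echo line: reflections proportional to intensity change, attenuated."""
    direction = np.asarray(direction, dtype=float)
    direction = direction / np.linalg.norm(direction)
    pos = np.asarray(origin, dtype=float)
    prev, energy, echoes = None, 1.0, []
    while all(0.0 <= pos[i] <= volume.shape[i] - 1 for i in range(3)):
        val = float(volume[tuple(pos.astype(int))])   # nearest-voxel sample
        if prev is not None:
            reflect = abs(val - prev) / 255.0         # fraction reflected back
            echoes.append(energy * reflect)
            energy *= (1.0 - reflect) * (1.0 - attenuation * step)
        prev = val
        pos = pos + step * direction
    return np.array(echoes)

# A dark volume with one bright slab: expect a strong echo at its near face.
ct = np.zeros((32, 32, 32), dtype=np.uint8)
ct[:, :, 10:15] = 200
line = cast_ray(ct, origin=(16, 16, 0), direction=(0, 0, 1))
```

Repeating this for a fan of directions yields one scan line per ray, which is how a sector image is assembled from a volume in this kind of simulator.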
Understanding casing flow in Pelton turbines by numerical simulation
NASA Astrophysics Data System (ADS)
Rentschler, M.; Neuhauser, M.; Marongiu, J. C.; Parkinson, E.
2016-11-01
For rehabilitation projects of Pelton turbines, the flow in the casing may have an important influence on the overall performance of the machine. Water sheets returning on the jets or on the runner significantly reduce efficiency, and run-away speed depends on the flow in the casing. CFD simulations can provide a detailed insight into this type of flow, but these simulations are computationally intensive. As in general the volume of water in a Pelton turbine is small compared to the complete volume of the turbine housing, a single-phase simulation greatly reduces the complexity of the simulation. In the present work a numerical tool based on the SPH-ALE meshless method is used to simulate the casing flow in a Pelton turbine. Using higher-order schemes reduces the numerical viscosity. This is necessary to resolve the flow in the jet and on the casing wall, where the velocity differs by two orders of magnitude. The results are compared to flow visualizations and measurements in a hydraulic laboratory. Several rehabilitation projects proved the added value of understanding the flow in the Pelton casing. The flow simulation helps in designing casing inserts, not only to see their influence on the flow, but also to calculate the stress in the inserts. In some projects, the casing simulation leads to an understanding of unexpected flow behavior. One such example is presented in which the backsplash of a deflector hits the runner, causing the runner to rotate in reverse.
Raque, Jessica; Goble, Adam; Jones, Veronica M; Waldman, Lindsey E; Sutton, Erica
2015-07-01
With the introduction of Fundamentals of Endoscopic Surgery, training methods in flexible endoscopy are being augmented with simulation-based curricula. The investment required for virtual reality simulators warrants further research into their training advantage. Trainees were randomized into bedside or simulator training groups (BED vs SIM). SIM participated in a proficiency-based virtual reality curriculum. Trainees' endoscopic skills were rated using the Global Assessment of Gastrointestinal Endoscopic Skills (GAGES) in the patient care setting. The number of cases needed to reach 90 per cent of the maximum GAGES score and the calculated costs of training were compared. Nineteen residents participated in the study. There was no difference in the average number of cases required to achieve 90 per cent of the maximum GAGES score for esophagogastroduodenoscopy, 13 (SIM) versus 11 (BED) (P = 0.63), or for colonoscopy, 21 (SIM) versus 4 (BED) (P = 0.34). The average per-case cost of training for esophagogastroduodenoscopy was $35.98 (SIM) versus $39.71 (BED) (P = 0.50), not including the depreciation costs associated with the simulator ($715.00 per resident over six years). Use of a simulator appeared to increase the cost of training without accelerating the learning curve or decreasing faculty time spent in instruction. The importance of simulation in endoscopy training will be predicated on more cost-effective simulators.
Flowfield analysis of helicopter rotor in hover and forward flight based on CFD
NASA Astrophysics Data System (ADS)
Zhao, Qinghe; Li, Xiaodong
2018-05-01
The helicopter rotor flowfield is simulated in hover and forward flight based on Computational Fluid Dynamics (CFD). In the hover case, only one rotor is simulated, with a periodic boundary condition in the rotating coordinate system and a fixed grid. In the non-lifting forward flight case, the full rotor is simulated in an inertial coordinate system and the whole grid moves rigidly. A dual-time implicit scheme is applied to simulate the unsteady flowfield on the moving grids. The k-ω turbulence model is employed to capture the effects of turbulence. To verify the solver, the flowfield around the Caradonna-Tung rotor is computed; the comparison shows good agreement between the numerical results and the experimental data.
Case-Based Learning for Orofacial Pain and Temporomandibular Disorders.
ERIC Educational Resources Information Center
Clark, Glenn T.; And Others
1993-01-01
The use of interactive computer-based simulation of cases of chronic orofacial pain and temporomandibular joint disfunction patients for clinical dental education is described. Its application as a voluntary study aid in a third-year dental course is evaluated for effectiveness and for time factors in case completion. (MSE)
Jian Yang; Hong S. He; Brian R. Sturtevant; Brian R. Miranda; Eric J. Gustafson
2008-01-01
We compared four fire spread simulation methods (completely random, dynamic percolation, size-based minimum travel time algorithm, and duration-based minimum travel time algorithm) and two fire occurrence simulation methods (Poisson fire frequency model and hierarchical fire frequency model) using a two-way factorial design. We examined these treatment effects on...
Monte Carlo simulations within avalanche rescue
NASA Astrophysics Data System (ADS)
Reiweger, Ingrid; Genswein, Manuel; Schweizer, Jürg
2016-04-01
Refining concepts for avalanche rescue involves calculating suitable settings for rescue strategies such as an adequate probing depth for probe line searches or an optimal time for performing resuscitation for a recovered avalanche victim in case of additional burials. In the latter case, treatment decisions have to be made in the context of triage. However, given the low number of incidents it is rarely possible to derive quantitative criteria based on historical statistics in the context of evidence-based medicine. For these rare, but complex rescue scenarios, most of the associated concepts, theories, and processes involve a number of unknown "random" parameters which have to be estimated in order to calculate anything quantitatively. An obvious approach for incorporating a number of random variables and their distributions into a calculation is to perform a Monte Carlo (MC) simulation. We here present Monte Carlo simulations for calculating the most suitable probing depth for probe line searches depending on search area and an optimal resuscitation time in case of multiple avalanche burials. The MC approach reveals, e.g., new optimized values for the duration of resuscitation that differ from previous, mainly case-based assumptions.
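The probing-depth calculation can be sketched as a Monte Carlo estimate: draw burial depths from an assumed distribution and find the smallest probe depth that would detect the target fraction of victims. The lognormal distribution and its parameters below are illustrative assumptions, not values from the study.

```python
import random

def optimal_probing_depth(n=100_000, target=0.90, seed=1):
    """Monte Carlo estimate of the probe depth (in metres) reaching
    `target` of buried victims, under an assumed lognormal
    burial-depth distribution (parameters are illustrative)."""
    rng = random.Random(seed)
    depths = sorted(rng.lognormvariate(0.0, 0.5) for _ in range(n))
    return depths[int(target * n)]   # empirical `target` quantile

depth_90 = optimal_probing_depth()
```

The same pattern (sample the unknowns, read a quantile or expectation off the empirical distribution) carries over to the resuscitation-time question, with survival-time distributions in place of burial depths.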
Simulation in Occupational Therapy Curricula: A literature review.
Bennett, Sally; Rodger, Sylvia; Fitzgerald, Cate; Gibson, Libby
2017-08-01
Simulated learning experiences are increasingly being used in health-care education to enhance student engagement and provide experiences that reflect clinical practice; however, simulation has not been widely investigated in occupational therapy curricula. The aim of this paper was to: (i) describe the existing research about the use and evaluation of simulation over the last three decades in occupational therapy curricula and (ii) consider how simulation has been used to develop competence in students. A literature review was undertaken with searches of MEDLINE, CINAHL and ERIC to locate articles that described or evaluated the use of simulation in occupational therapy curricula. Fifty-seven papers were identified. Occupational therapy educators have used the full scope of simulation modalities, including written case studies (22), standardised patients (13), video case studies (15), computer-based and virtual reality cases (7), role-play (8) and mannequins and part-task trainers (4). Ten studies used combinations of these modalities and two papers compared modalities. Most papers described the use of simulation for foundational courses, as preparation for fieldwork, and to address competencies necessary for newly graduating therapists. The majority of studies were descriptive, used pre-post designs, or reported students' perceptions of the value of simulation. Simulation-based education has been used for a wide range of purposes in occupational therapy curricula and appears to be well received. Randomised controlled trials are needed to more accurately understand the effects of simulation, not just for occupational therapy students but for longer term outcomes in clinical practice. © 2017 Occupational Therapy Australia.
Tschannen, Dana; Aebersold, Michelle; Sauter, Cecilia; Funnell, Martha M
2013-06-01
Nurses who provide case management can improve care practice and outcomes among patients who have type 2 diabetes through appropriate training and systems of care. This study was conducted to improve ambulatory care nurses' perceptions of competency in empowerment-based skills required for diabetes self-management education after participation in a multifaceted educational session that included problem-based learning and simulation. After participation in the multifaceted educational session, nurses (n = 21) perceived that the education provided an excellent opportunity for knowledge uptake and applicability to their respective work settings. The learning strategies provided opportunities for engagement in a safe and relaxed atmosphere. The simulation experience allowed participants to deliberately practice the competencies. These nurses considered this a very effective learning activity. Through the use of problem-based learning and simulation, nurses may be able to more efficiently and effectively develop the necessary skills to provide effective case management of chronic disease. Copyright 2013, SLACK Incorporated.
Endo, Satoshi; Fridlind, Ann M.; Lin, Wuyin; ...
2015-06-19
A 60-hour case study of continental boundary layer cumulus clouds is examined using two large-eddy simulation (LES) models. The case is based on observations obtained during the RACORO Campaign (Routine Atmospheric Radiation Measurement [ARM] Aerial Facility [AAF] Clouds with Low Optical Water Depths [CLOWD] Optical Radiative Observations) at the ARM Climate Research Facility's Southern Great Plains site. The LES models are driven by continuous large-scale and surface forcings, and are constrained by multi-modal and temporally varying aerosol number size distribution profiles derived from aircraft observations. We compare simulated cloud macrophysical and microphysical properties with ground-based remote sensing and aircraft observations. The LES simulations capture the observed transitions of the evolving cumulus-topped boundary layers during the three daytime periods, and generally reproduce variations of droplet number concentration with liquid water content (LWC), corresponding to the gradient between the cloud centers and cloud edges at given heights. The observed LWC values fall within the range of simulated values; the observed droplet number concentrations are commonly higher than simulated, but differences remain on par with potential estimation errors in the aircraft measurements. Sensitivity studies examine the influences of bin microphysics versus bulk microphysics, aerosol advection, supersaturation treatment, and aerosol hygroscopicity. Simulated macrophysical cloud properties are found to be insensitive in this non-precipitating case, but microphysical properties are especially sensitive to bulk microphysics supersaturation treatment and aerosol hygroscopicity.
NASA Technical Reports Server (NTRS)
Fridlind, Ann M.; Li, Xiaowen; Wu, Di; van Lier-Walqui, Marcus; Ackerman, Andrew S.; Tao, Wei-Kuo; McFarquhar, Greg M.; Wu, Wei; Dong, Xiquan; Wang, Jingyu;
2017-01-01
Advancing understanding of deep convection microphysics via mesoscale modeling studies of well-observed case studies requires observation-based aerosol inputs. Here, we derive hygroscopic aerosol size distribution input profiles from ground-based and airborne measurements for six convection case studies observed during the Midlatitude Continental Convective Cloud Experiment (MC3E) over Oklahoma. We demonstrate use of an input profile in simulations of the only well-observed case study that produced extensive stratiform outflow on 20 May 2011. At well-sampled elevations between -11 and -23 C over widespread stratiform rain, ice crystal number concentrations are consistently dominated by a single mode near approx. 400 microm in randomly oriented maximum dimension (Dmax). The ice mass at -23 C is primarily in a closely collocated mode, whereas a mass mode near Dmax approx. 1000 microns becomes dominant with decreasing elevation to the -11 C level, consistent with possible aggregation during sedimentation. However, simulations with and without observation-based aerosol inputs systematically overpredict mass peak Dmax by a factor of 3-5 and underpredict ice number concentration by a factor of 4-10. Previously reported simulations with both two-moment and size-resolved microphysics have shown biases of a similar nature. The observed ice properties are notably similar to those reported from recent tropical measurements. Based on several lines of evidence, we speculate that updraft microphysical pathways determining outflow properties in the 20 May case are similar to a tropical regime, likely associated with warm-temperature ice multiplication that is not well understood or well represented in models.
ERIC Educational Resources Information Center
Helms, Samuel Arthur
2010-01-01
This single subject case study followed a high school student and his use of a simulation of marine ecosystems. The study examined his metaworld, motivation, and learning before, during and after using the simulation. A briefing was conceptualized based on the literature on pre-instructional activities, advance organizers, and performance…
Sperling, Jeremy D.; Clark, Sunday; Kang, Yoon
2013-01-01
Introduction Simulation-based medical education (SBME) is increasingly being utilized for teaching clinical skills in undergraduate medical education. Studies have evaluated the impact of adding SBME to third- and fourth-year curriculum; however, very little research has assessed its efficacy for teaching clinical skills in pre-clerkship coursework. To measure the impact of a simulation exercise during a pre-clinical curriculum, a simulation session was added to a pre-clerkship course at our medical school where the clinical approach to altered mental status (AMS) is traditionally taught using a lecture and an interactive case-based session in a small group format. The objective was to measure simulation's impact on students’ knowledge acquisition, comfort, and perceived competence with regards to the AMS patient. Methods AMS simulation exercises were added to the lecture and small group case sessions in June 2010 and 2011. Simulation sessions consisted of two clinical cases using a high-fidelity full-body simulator followed by a faculty debriefing after each case. Student participation in a simulation session was voluntary. Students who did and did not participate in a simulation session completed a post-test to assess knowledge and a survey to understand comfort and perceived competence in their approach to AMS. Results A total of 154 students completed the post-test and survey and 65 (42%) attended a simulation session. Post-test scores were higher in students who attended a simulation session compared to those who did not (p<0.001). Students who participated in a simulation session were more comfortable in their overall approach to treating AMS patients (p=0.05). They were also more likely to state that they could articulate a differential diagnosis (p=0.03), know what initial diagnostic tests are needed (p=0.01), and understand what interventions are useful in the first few minutes (p=0.003). 
Students who participated in a simulation session were more likely to find the overall AMS curriculum useful (p<0.001). Conclusion Students who participated in a simulation exercise performed better on a knowledge-based test and reported increased comfort and perceived competence in their clinical approach to AMS. SBME shows significant promise for teaching clinical skills to medical students during pre-clinical curriculum. PMID:23561054
Post Pareto optimization-A case
NASA Astrophysics Data System (ADS)
Popov, Stoyan; Baeva, Silvia; Marinova, Daniela
2017-12-01
Simulation performance may be evaluated according to multiple quality measures that are in competition and their simultaneous consideration poses a conflict. In the current study we propose a practical framework for investigating such simulation performance criteria, exploring the inherent conflicts amongst them and identifying the best available tradeoffs, based upon multi-objective Pareto optimization. This approach necessitates the rigorous derivation of performance criteria to serve as objective functions and undergo vector optimization. We demonstrate the effectiveness of our proposed approach by applying it with multiple stochastic quality measures. We formulate performance criteria of this use-case, pose an optimization problem, and solve it by means of a simulation-based Pareto approach. Upon attainment of the underlying Pareto Frontier, we analyze it and prescribe preference-dependent configurations for the optimal simulation training.
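The non-dominated filtering at the heart of the Pareto approach described above can be sketched generically. This is a minimal illustration, not the authors' implementation; the candidate values are invented, and all criteria are assumed to be minimized:

```python
import numpy as np

def pareto_front(points):
    """Return the non-dominated subset of `points` (rows = candidate
    configurations, columns = performance criteria, all minimized)."""
    points = np.asarray(points, dtype=float)
    keep = np.ones(len(points), dtype=bool)
    for i, p in enumerate(points):
        if not keep[i]:
            continue
        # A candidate dominates p if it is <= in every criterion
        # and strictly < in at least one.
        dominators = np.all(points <= p, axis=1) & np.any(points < p, axis=1)
        dominators[i] = False
        if dominators.any():
            keep[i] = False
    return points[keep]

# Illustrative trade-off: (simulation error, runtime in minutes)
candidates = [(0.2, 30.0), (0.1, 60.0), (0.3, 25.0), (0.25, 40.0)]
front = pareto_front(candidates)  # (0.25, 40.0) is dominated by (0.2, 30.0)
```

Preference-dependent configurations are then selected from `front` according to which criterion the trainer weights most heavily.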
Dependability analysis of parallel systems using a simulation-based approach. M.S. Thesis
NASA Technical Reports Server (NTRS)
Sawyer, Darren Charles
1994-01-01
The analysis of dependability in large, complex, parallel systems executing real applications or workloads is examined in this thesis. To effectively demonstrate the wide range of dependability problems that can be analyzed through simulation, the analysis of three case studies is presented. For each case, the organization of the simulation model used is outlined, and the results from simulated fault injection experiments are explained, showing the usefulness of this method in dependability modeling of large parallel systems. The simulation models are constructed using DEPEND and C++. Where possible, methods to increase dependability are derived from the experimental results. Another interesting facet of all three cases is the presence of some kind of workload or application executing in the simulation while faults are injected. This adds a completely new dimension to this type of study, one not possible to model accurately with analytical approaches.
Point-of-care ultrasound education: the increasing role of simulation and multimedia resources.
Lewiss, Resa E; Hoffmann, Beatrice; Beaulieu, Yanick; Phelan, Mary Beth
2014-01-01
This article reviews the current technology, literature, teaching models, and methods associated with simulation-based point-of-care ultrasound training. Patient simulation appears particularly well suited for learning point-of-care ultrasound, which is a required core competency for emergency medicine and other specialties. Work hour limitations have reduced the opportunities for clinical practice, and simulation enables practicing a skill multiple times before it may be used on patients. Ultrasound simulators can be categorized into 2 groups: low and high fidelity. Low-fidelity simulators are usually static simulators, meaning that they have nonchanging anatomic examples for sonographic practice. Advantages are that the model may be reused over time, and some simulators can be homemade. High-fidelity simulators are usually high-tech and frequently consist of many computer-generated cases of virtual sonographic anatomy that can be scanned with a mock probe. This type of equipment is produced commercially and is more expensive. High-fidelity simulators provide students with an active and safe learning environment and make a reproducible standardized assessment of many different ultrasound cases possible. The advantages and disadvantages of using low- versus high-fidelity simulators are reviewed. An additional concept used in simulation-based ultrasound training is blended learning. Blended learning may include face-to-face or online learning often in combination with a learning management system. Increasingly, with simulation and Web-based learning technologies, tools are now available to medical educators for the standardization of both ultrasound skills training and competency assessment.
Nonlinear Reduced-Order Simulation Using An Experimentally Guided Modal Basis
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Przekop, Adam
2012-01-01
A procedure is developed for using nonlinear experimental response data to guide the modal basis selection in a nonlinear reduced-order simulation. The procedure entails using nonlinear acceleration response data to first identify proper orthogonal modes. Special consideration is given to cases in which some of the desired response data is unavailable. Bases consisting of linear normal modes are then selected to best represent the experimentally determined transverse proper orthogonal modes and either experimentally determined inplane proper orthogonal modes or the special case of numerically computed in-plane companions. The bases are subsequently used in nonlinear modal reduction and dynamic response simulations. The experimental data used in this work is simulated to allow some practical considerations, such as the availability of in-plane response data and non-idealized test conditions, to be explored. Comparisons of the nonlinear reduced-order simulations are made with the surrogate experimental data to demonstrate the effectiveness of the approach.
Simulation-Based Evaluation of Light Posts and Street Signs as 3-D Geolocation Targets in SAR Images
NASA Astrophysics Data System (ADS)
Auer, S.; Balss, U.
2017-05-01
The assignment of phase center positions (in 2D or 3D) derived from SAR data to physical objects is challenging for many man-made structures such as buildings or bridges. In contrast, light poles and traffic signs are promising targets for tasks based on 3-D geolocation, as they often show a prominent and spatially isolated appearance. For a detailed understanding of the nature of both targets, this paper presents results of a dedicated simulation case study based on ray tracing methods (simulator RaySAR). For the first time, the appearance of the targets is analyzed in 2D (image plane) and 3D space (world coordinates of the scene model), and reflecting surfaces are identified for the related dominant image pixels. The case study confirms the crucial impact of spatial resolution in the context of light poles and traffic signs, and the appropriateness of light poles as targets for 3-D geolocation in the case of horizontal ground surfaces beneath.
NASA Technical Reports Server (NTRS)
Grantham, William D.; Williams, Robert H.
1987-01-01
For the case of an approach-and-landing piloting task emphasizing response to the landing flare, pilot opinion and performance parameters derived from jet transport aircraft six-degree-of-freedom ground-based and in-flight simulators were compared in order to derive data for the flight-controls/flying-qualities engineers. The data thus obtained indicate that ground simulation results tend to be conservative, and that the effect of control sensitivity is more pronounced for ground simulation. The pilot also has a greater tendency to generate pilot-induced oscillation in ground-based simulation than in flight.
Zary, Nabil; Johnson, Gunilla; Boberg, Jonas; Fors, Uno GH
2006-01-01
Background The Web-based Simulation of Patients (Web-SP) project was initiated in order to facilitate the use of realistic and interactive virtual patients (VP) in medicine and healthcare education. Web-SP focuses on moving beyond technology-savvy teachers when integrating simulation-based education into health sciences curricula, by making the creation and use of virtual patients easier. The project strives to provide a common generic platform for design/creation, management, evaluation and sharing of web-based virtual patients. The aim of this study was to evaluate whether it was possible to develop a web-based virtual patient case simulation environment in which the entire case authoring process might be handled by teachers and which would be flexible enough to be used in different healthcare disciplines. Results The Web-SP system was constructed to support easy authoring, management and presentation of virtual patient cases. The case authoring environment was found to enable teachers to create full-fledged patient cases without the assistance of computer specialists. Web-SP was successfully implemented at several universities by taking into account key factors such as cost, access, security, scalability and flexibility. Pilot evaluations in medical, dentistry and pharmacy courses show that students regarded Web-SP as easy to use, engaging and of educational value. Cases adapted for all three disciplines were judged to be of significant educational value by the course leaders. Conclusion The Web-SP system seems to fulfil the aim of providing a common generic platform for creation, management and evaluation of web-based virtual patient cases. The responses regarding the authoring environment indicated that the system might be user-friendly enough to appeal to a majority of the academic staff. In terms of implementation strengths, Web-SP seems to fulfil most needs of course directors and teachers from various educational institutions and disciplines.
The system is currently in use or under implementation in several healthcare disciplines at more than ten universities worldwide. Future aims include structuring the exchange of cases between teachers and academic institutions by building a VP library function. We intend to follow up the positive results presented in this paper with other studies looking at learning outcomes, critical thinking and patient management. The potential of Web-SP as an assessment tool will also be studied. More information about Web-SP: http://websp.lime.ki.se. PMID:16504041
Al-Dahir, Sara; Bryant, Kendrea; Kennedy, Kathleen B; Robinson, Donna S
2014-05-15
To evaluate the efficacy of faculty-led problem-based learning (PBL) vs online simulated-patient case in fourth-year (P4) pharmacy students. Fourth-year pharmacy students were randomly assigned to participate in either online branched-case learning using a virtual simulation platform or a small-group discussion. Preexperience and postexperience student assessments and a survey instrument were completed. While there were no significant differences in the preexperience test scores between the groups, there was a significant increase in scores in both the virtual-patient group and the PBL group between the preexperience and postexperience tests. The PBL group had higher postexperience test scores (74.8±11.7) than did the virtual-patient group (66.5±13.6) (p=0.001). The PBL method demonstrated significantly greater improvement in postexperience test scores than did the virtual-patient method. Both were successful learning methods, suggesting that a diverse approach to simulated patient cases may reach more student learning styles.
Willaert, Willem I M; Cheshire, Nicholas J; Aggarwal, Rajesh; Van Herzeele, Isabelle; Stansby, Gerard; Macdonald, Sumaira; Vermassen, Frank E
2012-12-01
Carotid artery stenting (CAS) is a technically demanding procedure with a risk of periprocedural stroke. A scoring system based on anatomic criteria has been developed to facilitate patient selection for CAS. Advancements in simulation science also enable case evaluation through patient-specific virtual reality (VR) rehearsal on an endovascular simulator. This study aimed to validate the anatomic scoring system for CAS using the patient-specific VR technology. Three patients were selected and graded according to the CAS scoring system (maximum score, 9): one easy (score, <4.9), one intermediate (score, 5.0-5.9), and one difficult (score, >7.0). The three cases were performed on the simulator in random order by 20 novice interventionalists pretrained in CAS. Technical performances were assessed using simulator-based metrics and expert-based ratings. The interventionalists took significantly longer to perform the difficult CAS case (median, 31.6 vs 19.7 vs 14.6 minutes; P<.0001) compared with the intermediate and easy cases; similarly, more fluoroscopy time (20.7 vs 12.1 vs 8.2 minutes; P<.0001), contrast volume (56.5 vs 51.5 vs 50.0 mL; P=.0060), and roadmaps (10 vs 9 vs 9; P=.0040) were used. The quality of performance declined significantly as the cases became more challenging (score, 24 vs 22 vs 19; P<.0001). The anatomic scoring system for CAS can predict the difficulty of a CAS procedure as measured by patient-specific VR. This scoring system, with or without the additional use of patient-specific VR, can guide novice interventionalists in selecting appropriate patients for CAS. This may reduce the perioperative stroke risk and enhance patient safety. Copyright © 2012 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
RTDS-Based Design and Simulation of Distributed P-Q Power Resources in Smart Grid
NASA Astrophysics Data System (ADS)
Taylor, Zachariah David
In this thesis, we propose to utilize a battery system together with its power electronics interfaces and bidirectional charger as a distributed P-Q resource in power distribution networks. First, we present an optimization-based approach to operating such distributed P-Q resources based on the characteristics of the battery and charger system as well as the features and needs of the power distribution network. Then, we use the RTDS Simulator, an industry-standard simulation tool for power systems, to develop two RTDS-based design approaches. The first design is based on an ideal four-quadrant distributed P-Q power resource. The second design is based on a detailed four-quadrant distributed P-Q power resource developed from power electronics components; the hardware and power electronics circuitry as well as the control units are explained for this design. Given the two RTDS designs, we conduct extensive RTDS simulations to assess the performance of the designed distributed P-Q power resource in an IEEE 13-bus test system. We observe that the proposed design can noticeably improve the operational performance of the power distribution grid in at least four key aspects: reducing power loss, active power peak load shaving at the substation, reactive power peak load shaving at the substation, and voltage regulation. We examine these performance measures across three design cases. Case 1: there is no P-Q power resource available on the power distribution network. Case 2: the installed P-Q power resource only supports active power, i.e., it only utilizes its battery component. Case 3: the installed P-Q power resource supports both active and reactive power, i.e., it utilizes both its battery component and its power electronics charger component. Finally, we present interpretations of the simulation results and suggest directions for future work.
A Homogenization Approach for Design and Simulation of Blast Resistant Composites
NASA Astrophysics Data System (ADS)
Sheyka, Michael
Structural composites have been used in aerospace and structural engineering due to their high strength to weight ratio. Composite laminates have been successfully and extensively used in blast mitigation. This dissertation examines the use of the homogenization approach to design and simulate blast resistant composites. Three case studies are performed to examine the usefulness of different methods that may be used in designing and optimizing composite plates for blast resistance. The first case study utilizes a single degree of freedom system to simulate the blast and a reliability based approach. The first case study examines homogeneous plates and the optimal stacking sequence and plate thicknesses are determined. The second and third case studies use the homogenization method to calculate the properties of composite unit cell made of two different materials. The methods are integrated with dynamic simulation environments and advanced optimization algorithms. The second case study is 2-D and uses an implicit blast simulation, while the third case study is 3-D and simulates blast using the explicit blast method. Both case studies 2 and 3 rely on multi-objective genetic algorithms for the optimization process. Pareto optimal solutions are determined in case studies 2 and 3. Case study 3 is an integrative method for determining optimal stacking sequence, microstructure and plate thicknesses. The validity of the different methods such as homogenization, reliability, explicit blast modeling and multi-objective genetic algorithms are discussed. Possible extension of the methods to include strain rate effects and parallel computation is also examined.
Promotion of self-directed learning using virtual patient cases.
Benedict, Neal; Schonder, Kristine; McGee, James
2013-09-12
To assess the effectiveness of virtual patient cases to promote self-directed learning (SDL) in a required advanced therapeutics course. Virtual patient software based on a branched-narrative decision-making model was used to create complex patient case simulations to replace lecture-based instruction. Within each simulation, students used SDL principles to learn course objectives, apply their knowledge through clinical recommendations, and assess their progress through patient outcomes and faculty feedback linked to their individual decisions. Group discussions followed each virtual patient case to provide further interpretation, clarification, and clinical perspective. Students found the simulated patient cases to be organized (90%), enjoyable (82%), intellectually challenging (97%), and valuable to their understanding of course content (91%). Students further indicated that completion of the virtual patient cases prior to class permitted better use of class time (78%) and promoted SDL (84%). When assessment questions regarding material on postoperative nausea and vomiting were compared, no difference in scores was found between the students who attended the lecture on the material in 2011 (control group) and those who completed the virtual patient case on the material in 2012 (intervention group). Completion of virtual patient cases, designed to replace lectures and promote SDL, was overwhelmingly supported by students and proved to be as effective as traditional teaching methods.
Evaluation of Littoral Combat Ships for Open-Ocean Anti-Submarine Warfare
2016-03-01
Source: R. R. Hill, R. G. Carl, and L. E. Champagne, "Using Agent-Based Simulation to Empirically Examine Search Theory Using a Historical Case…"
Willaert, Willem I M; Aggarwal, Rajesh; Daruwalla, Farhad; Van Herzeele, Isabelle; Darzi, Ara W; Vermassen, Frank E; Cheshire, Nicholas J
2012-06-01
Patient-specific simulated rehearsal (PsR) of a carotid artery stenting procedure (CAS) enables the interventionalist to rehearse the case before performing the procedure on the actual patient by incorporating patient-specific computed tomographic data into the simulation software. This study aimed to evaluate whether PsR of a CAS procedure can enhance the operative performance versus a virtual reality (VR) generic CAS warm-up procedure or no preparation at all. During a 10-session cognitive/technical VR course, medical residents were trained in CAS. Thereafter, in a randomized crossover study, each participant performed a patient-specific CAS case 3 times on the simulator, preceded by 3 different tasks: a PsR, a generic case, or no preparation. Technical performances were assessed using simulator-based metrics and expert-based ratings. Twenty medical residents (surgery, cardiology, radiology) were recruited. Training plateaus were observed after 10 sessions for all participants. Performances were significantly better after PsR than after a generic warm-up or no warm-up for total procedure time (16.3 ± 0.6 vs 19.7 ± 1.0 vs 20.9 ± 1.1 minutes, P = 0.001) and fluoroscopy time (9.3 ± 0.1 vs 11.2 ± 0.6 vs 11.2 ± 0.5 minutes, P = 0.022) but did not influence contrast volume or number of roadmaps used during the "real" case. PsR significantly improved the quality of performance as measured by the expert-based ratings (scores 28 vs 25 vs 25, P = 0.020). Patient-specific simulated rehearsal of a CAS procedure significantly improves operative performance, compared to a generic VR warm-up or no warm-up. This technology requires further investigation with respect to improved outcomes on patients in the clinical setting.
ERIC Educational Resources Information Center
Shifflet, Mark; Brown, Jane
2006-01-01
The purpose of this study was to investigate how exposure to classroom instruction affected the use of a computer simulation that was designed to provide students an opportunity to apply material presented in class. The study involved an analysis of a computer-based crisis communication case study designed for a college-level public relations…
GAPD: a GPU-accelerated atom-based polychromatic diffraction simulation code.
E, J C; Wang, L; Chen, S; Zhang, Y Y; Luo, S N
2018-03-01
GAPD, a graphics-processing-unit (GPU)-accelerated atom-based polychromatic diffraction simulation code for direct, kinematics-based simulations of X-ray/electron diffraction of large-scale atomic systems with mono-/polychromatic beams and arbitrary plane detector geometries, is presented. This code implements GPU parallel computation via both real- and reciprocal-space decompositions. With GAPD, direct simulations are performed of the reciprocal lattice nodes of ultralarge systems (∼5 billion atoms) and of the diffraction patterns of single-crystal and polycrystalline configurations with mono- and polychromatic X-ray beams (including synchrotron undulator sources); validation, benchmark and application cases are presented.
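The kinematic (single-scattering) sum that underlies such direct diffraction simulations can be illustrated on a toy lattice. This is a generic sketch of I(q) = |Σ_j f_j exp(iq·r_j)|², not GAPD's actual implementation; the lattice and form factors are invented:

```python
import numpy as np

def kinematic_intensity(positions, form_factors, q_vectors):
    """Kinematic diffraction intensity I(q) = |sum_j f_j exp(i q.r_j)|^2
    for atoms at `positions` (N x 3), probed at `q_vectors` (M x 3)."""
    phases = np.exp(1j * q_vectors @ positions.T)   # (M, N) per-atom phases
    amplitude = phases @ form_factors               # (M,) structure factor
    return np.abs(amplitude) ** 2

# Toy system: a 4x4x4 simple-cubic lattice, unit form factors
a = 1.0
grid = np.arange(4) * a
positions = np.array([[x, y, z] for x in grid for y in grid for z in grid])
f = np.ones(len(positions))

q_bragg = np.array([[2 * np.pi / a, 0.0, 0.0]])  # satisfies the Bragg condition
q_off = np.array([[np.pi / a, 0.0, 0.0]])        # off-Bragg
I_bragg = kinematic_intensity(positions, f, q_bragg)[0]  # -> N^2 = 64^2
I_off = kinematic_intensity(positions, f, q_off)[0]      # -> ~0
```

At the Bragg condition all 64 atoms scatter in phase, so the intensity reaches N²; off-Bragg the contributions cancel. A GPU code like GAPD parallelizes exactly this sum over atoms and detector pixels.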
MODFLOW/MT3DMS-based simulation of variable-density ground water flow and transport
Langevin, C.D.; Guo, W.
2006-01-01
This paper presents an approach for coupling MODFLOW and MT3DMS for the simulation of variable-density ground water flow. MODFLOW routines were modified to solve a variable-density form of the ground water flow equation in which the density terms are calculated using an equation of state and the simulated MT3DMS solute concentrations. Changes to the MODFLOW and MT3DMS input files were kept to a minimum, and thus existing data files and data files created with most pre- and postprocessors can be used directly with the SEAWAT code. The approach was tested by simulating the Henry problem and two of the saltpool laboratory experiments (low- and high-density cases). For the Henry problem, the simulated results compared well with the steady-state semianalytic solution and also the transient isochlor movement as simulated by a finite-element model. For the saltpool problem, the simulated breakthrough curves compared better with the laboratory measurements for the low-density case than for the high-density case but showed good agreement with the measured salinity isosurfaces for both cases. Results from the test cases presented here indicate that the MODFLOW/MT3DMS approach provides accurate solutions for problems involving variable-density ground water flow and solute transport. © 2006 National Ground Water Association.
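The density coupling described above is commonly a linear equation of state relating simulated solute concentration to fluid density. A minimal sketch, assuming the widely used linear form and a seawater-like slope of 0.7143 (illustrative defaults, not values taken from this paper):

```python
def fluid_density(conc, rho_ref=1000.0, drho_dc=0.7143):
    """Linear equation of state for variable-density flow codes:
    rho = rho_ref + (d rho / dC) * C, with density and concentration
    both in kg/m^3. Slope and reference density are assumptions here."""
    return rho_ref + drho_dc * conc

# Freshwater (C = 0) vs. seawater-like water (C ~ 35 kg/m^3 TDS)
rho_fresh = fluid_density(0.0)   # 1000.0 kg/m^3
rho_sea = fluid_density(35.0)    # ~1025 kg/m^3
```

In the coupled scheme, each transport step's concentration field feeds this relation, and the resulting density field enters the buoyancy terms of the flow equation on the next flow step.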
Distributed environmental control
NASA Technical Reports Server (NTRS)
Cleveland, Gary A.
1992-01-01
We present an architecture of distributed, independent control agents designed to work with the Computer Aided System Engineering and Analysis (CASE/A) simulation tool. CASE/A simulates behavior of Environmental Control and Life Support Systems (ECLSS). We describe a lattice of agents capable of distributed sensing and overcoming certain sensor and effector failures. We address how the architecture can achieve the coordinating functions of a hierarchical command structure while maintaining the robustness and flexibility of independent agents. These agents work between the time steps of the CASE/A simulation tool to arrive at command decisions based on the state variables maintained by CASE/A. Control is evaluated according to both effectiveness (e.g., how well temperature was maintained) and resource utilization (the amount of power and materials used).
[The characteristics of computer simulation of traffic accidents].
Zou, Dong-Hua; Liu, Ning-Guo; Chen, Jian-Guo; Jin, Xian-Long; Zhang, Xiao-Yun; Zhang, Jian-Hua; Chen, Yi-Jiu
2008-12-01
To reconstruct the collision process of a traffic accident and the injury mode of the victim by computer simulation technology in the forensic assessment of traffic accidents. Forty actual accidents were reconstructed with simulation software and a high-performance computer based on analysis of the trace evidence at the scene, damage to the vehicles and injuries of the victims, with 2 cases discussed in detail. The reconstruction correlated very well in 28 cases, well in 9 cases, and suboptimally in 3 cases with the above parameters. Accurate reconstruction of the accident would be helpful for assessment of the injury mechanism of the victims. Reconstruction of the collision process of a traffic accident and the injury mechanism of the victim by computer simulation is useful in traffic accident assessment.
Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0
NASA Technical Reports Server (NTRS)
Knox, J. C.
1996-01-01
The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A modeling provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connect the components via flow streams and define their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users have the capability to control the simulation and extract information at various times in the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 software runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.
Construction schedule simulation of a diversion tunnel based on the optimized ventilation time.
Wang, Xiaoling; Liu, Xuepeng; Sun, Yuefeng; An, Juan; Zhang, Jing; Chen, Hongchao
2009-06-15
In former studies, the methods for estimating ventilation time in construction schedule simulation were all empirical. In many real construction schedules, however, many factors affect the ventilation time. Therefore, in this paper 3D unsteady quasi-single-phase models are proposed to optimize the ventilation time for different tunneling lengths. The effect of buoyancy is considered in the momentum equation of the CO transport model, while the effects of inter-phase drag, lift force, and virtual mass force are taken into account in the momentum source of the dust transport model. The prediction by the present model for airflow in a diversion tunnel is confirmed by the experimental values reported by Nakayama [Nakayama, In-situ measurement and simulation by CFD of methane gas distribution at heading faces, Shigen-to-Sozai 114 (11) (1998) 769-775]. The construction ventilation of the diversion tunnel of the XinTangfang power station in China is used as a case study. The distributions of airflow, CO and dust in the diversion tunnel are analyzed. A theoretical method for GIS-based dynamic visual simulation of the construction processes of underground structure groups is presented that combines cyclic operation network simulation, system simulation, network plan optimization, and GIS-based 3D visualization of construction processes. Based on the optimized ventilation time, the construction schedule of the diversion tunnel is simulated by the above method.
Gaewsky, James P; Weaver, Ashley A; Koya, Bharath; Stitzel, Joel D
2015-01-01
A 3-phase real-world motor vehicle crash (MVC) reconstruction method was developed to analyze injury variability as a function of precrash occupant position for 2 full-frontal Crash Injury Research and Engineering Network (CIREN) cases. Phase I: A finite element (FE) simplified vehicle model (SVM) was developed and tuned to mimic the frontal crash characteristics of the CIREN case vehicle (Camry or Cobalt) using frontal New Car Assessment Program (NCAP) crash test data. Phase II: The Toyota HUman Model for Safety (THUMS) v4.01 was positioned in 120 precrash configurations per case within the SVM. Five occupant positioning variables were varied using a Latin hypercube design of experiments: seat track position, seat back angle, D-ring height, steering column angle, and steering column telescoping position. An additional baseline simulation was performed that aimed to match the precrash occupant position documented in CIREN for each case. Phase III: FE simulations were then performed using kinematic boundary conditions from each vehicle's event data recorder (EDR). HIC15, combined thoracic index (CTI), femur forces, and strain-based injury metrics in the lung and lumbar vertebrae were evaluated to predict injury. Tuning the SVM to specific vehicle models resulted in close matches between simulated and test injury metric data, allowing the tuned SVM to be used in each case reconstruction with EDR-derived boundary conditions. Simulations with the most rearward seats and reclined seat backs had the greatest HIC15, head injury risk, CTI, and chest injury risk. Calculated injury risks for the head, chest, and femur closely correlated to the CIREN occupant injury patterns. CTI in the Camry case yielded a 54% probability of Abbreviated Injury Scale (AIS) 2+ chest injury in the baseline case simulation and ranged from 34 to 88% (mean = 61%) risk in the least and most dangerous occupant positions. 
The greater than 50% probability was consistent with the case occupant's AIS 2 hemomediastinum. Stress-based metrics were used to predict injury to the lower leg of the Camry case occupant. The regional-level injury metrics evaluated for the Cobalt case occupant indicated a low risk of injury; however, strain-based injury metrics better predicted pulmonary contusion. Approximately 49% of the Cobalt occupant's left lung was contused, though the baseline simulation predicted 40.5% of the lung to be injured. A method to compute injury metrics and risks as functions of precrash occupant position was developed and applied to 2 CIREN MVC FE reconstructions. The reconstruction process allows for quantification of the sensitivity and uncertainty of the injury risk predictions based on occupant position to further understand important factors that lead to more severe MVC injuries.
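The Phase II position sampling described above can be sketched with an off-the-shelf Latin hypercube sampler. A minimal sketch follows; the five variable bounds are illustrative placeholders, not the ranges used in the CIREN reconstructions.

```python
# Sketch of a Latin hypercube design over the five occupant-positioning
# variables (120 configurations, as in the study). Bounds are hypothetical.
from scipy.stats import qmc

# [seat track (mm), seat back angle (deg), D-ring height (mm),
#  steering column angle (deg), column telescoping (mm)]
lower = [0.0, 15.0, 0.0, 20.0, 0.0]
upper = [240.0, 35.0, 80.0, 35.0, 50.0]

sampler = qmc.LatinHypercube(d=5, seed=42)
unit_samples = sampler.random(n=120)           # 120 points in [0, 1)^5
configs = qmc.scale(unit_samples, lower, upper)  # map to physical ranges

print(configs.shape)  # (120, 5)
```

The appeal of a Latin hypercube over plain random sampling is that each variable's range is stratified, so 120 runs cover every marginal range evenly.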
NASA Astrophysics Data System (ADS)
Mukhtar, Maseeh; Thiel, Bradley
2018-03-01
In fabrication, overlay measurements of semiconductor device patterns have conventionally been performed using optical methods, ranging from image-based techniques using box-in-box targets to the more recent diffraction-based overlay (DBO). Alternatively, SEM-based overlay is under consideration for in-device overlay. The two main application spaces are measuring features from multiple mask levels on the same surface and measuring buried features. Modern CD-SEMs are adept at measuring overlay for cases where all features are on the surface. To measure overlay of buried features, high-voltage SEM (HV-SEM) is needed. Gate-to-fin and BEOL overlay are important use cases for this technique. A JMONSEL simulation exercise was performed for these two cases using 10 nm line/space gratings with graduated depth of burial. The backscattered energy loss results of these simulations were used to calculate the measurement sensitivity for buried features versus electron dose for an array of electron beam voltages.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fridlind, Ann M.; Li, Xiaowen; Wu, Di
Advancing understanding of deep convection microphysics via mesoscale modeling studies of well-observed case studies requires observation-based aerosol inputs. Here, we derive hygroscopic aerosol size distribution input profiles from ground-based and airborne measurements for six convection case studies observed during the Midlatitude Continental Convective Cloud Experiment (MC3E) over Oklahoma. We demonstrate use of an input profile in simulations of the only well-observed case study that produced extensive stratiform outflow on 20 May 2011. At well-sampled elevations between –11 and –23 °C over widespread stratiform rain, ice crystal number concentrations are consistently dominated by a single mode near ~400 µm in randomly oriented maximum dimension (Dmax). The ice mass at –23 °C is primarily in a closely collocated mode, whereas a mass mode near Dmax ~1000 µm becomes dominant with decreasing elevation to the –11 °C level, consistent with possible aggregation during sedimentation. However, simulations with and without observation-based aerosol inputs systematically overpredict mass peak Dmax by a factor of 3–5 and underpredict ice number concentration by a factor of 4–10. Previously reported simulations with both two-moment and size-resolved microphysics have shown biases of a similar nature. Furthermore, the observed ice properties are notably similar to those reported from recent tropical measurements. Based on several lines of evidence, we speculate that updraft microphysical pathways determining outflow properties in the 20 May case are similar to a tropical regime, likely associated with warm-temperature ice multiplication that is not well understood or well represented in models.
NASA Astrophysics Data System (ADS)
Fridlind, Ann M.; Li, Xiaowen; Wu, Di; van Lier-Walqui, Marcus; Ackerman, Andrew S.; Tao, Wei-Kuo; McFarquhar, Greg M.; Wu, Wei; Dong, Xiquan; Wang, Jingyu; Ryzhkov, Alexander; Zhang, Pengfei; Poellot, Michael R.; Neumann, Andrea; Tomlinson, Jason M.
2017-05-01
Advancing understanding of deep convection microphysics via mesoscale modeling studies of well-observed case studies requires observation-based aerosol inputs. Here, we derive hygroscopic aerosol size distribution input profiles from ground-based and airborne measurements for six convection case studies observed during the Midlatitude Continental Convective Cloud Experiment (MC3E) over Oklahoma. We demonstrate use of an input profile in simulations of the only well-observed case study that produced extensive stratiform outflow on 20 May 2011. At well-sampled elevations between -11 and -23 °C over widespread stratiform rain, ice crystal number concentrations are consistently dominated by a single mode near ˜ 400 µm in randomly oriented maximum dimension (Dmax). The ice mass at -23 °C is primarily in a closely collocated mode, whereas a mass mode near Dmax ˜ 1000 µm becomes dominant with decreasing elevation to the -11 °C level, consistent with possible aggregation during sedimentation. However, simulations with and without observation-based aerosol inputs systematically overpredict mass peak Dmax by a factor of 3-5 and underpredict ice number concentration by a factor of 4-10. Previously reported simulations with both two-moment and size-resolved microphysics have shown biases of a similar nature. The observed ice properties are notably similar to those reported from recent tropical measurements. Based on several lines of evidence, we speculate that updraft microphysical pathways determining outflow properties in the 20 May case are similar to a tropical regime, likely associated with warm-temperature ice multiplication that is not well understood or well represented in models.
Fridlind, Ann M.; Li, Xiaowen; Wu, Di; ...
2017-05-15
Advancing understanding of deep convection microphysics via mesoscale modeling studies of well-observed case studies requires observation-based aerosol inputs. Here, we derive hygroscopic aerosol size distribution input profiles from ground-based and airborne measurements for six convection case studies observed during the Midlatitude Continental Convective Cloud Experiment (MC3E) over Oklahoma. We demonstrate use of an input profile in simulations of the only well-observed case study that produced extensive stratiform outflow on 20 May 2011. At well-sampled elevations between –11 and –23 °C over widespread stratiform rain, ice crystal number concentrations are consistently dominated by a single mode near ~400 µm in randomly oriented maximum dimension (Dmax). The ice mass at –23 °C is primarily in a closely collocated mode, whereas a mass mode near Dmax ~1000 µm becomes dominant with decreasing elevation to the –11 °C level, consistent with possible aggregation during sedimentation. However, simulations with and without observation-based aerosol inputs systematically overpredict mass peak Dmax by a factor of 3–5 and underpredict ice number concentration by a factor of 4–10. Previously reported simulations with both two-moment and size-resolved microphysics have shown biases of a similar nature. Furthermore, the observed ice properties are notably similar to those reported from recent tropical measurements. Based on several lines of evidence, we speculate that updraft microphysical pathways determining outflow properties in the 20 May case are similar to a tropical regime, likely associated with warm-temperature ice multiplication that is not well understood or well represented in models.
NASA Technical Reports Server (NTRS)
Fridlind, Ann M.; Xiaowen, Li; Wu, Di; Van Lier-Walqui, Marcus; Ackerman, Andrew S.; Tao, Wei-Kuo; McFarquhar, Greg M.; Wu, Wei; Dong, Xiquan; Wang, Jingyu;
2017-01-01
Advancing understanding of deep convection microphysics via mesoscale modeling studies of well-observed case studies requires observation-based aerosol inputs. Here, we derive hygroscopic aerosol size distribution input profiles from ground-based and airborne measurements for six convection case studies observed during the Midlatitude Continental Convective Cloud Experiment (MC3E) over Oklahoma. We demonstrate use of an input profile in simulations of the only well-observed case study that produced extensive stratiform outflow on 20 May 2011. At well-sampled elevations between -11 and -23 °C over widespread stratiform rain, ice crystal number concentrations are consistently dominated by a single mode near ~400 µm in randomly oriented maximum dimension (Dmax). The ice mass at -23 °C is primarily in a closely collocated mode, whereas a mass mode near Dmax ~1000 µm becomes dominant with decreasing elevation to the -11 °C level, consistent with possible aggregation during sedimentation. However, simulations with and without observation-based aerosol inputs systematically overpredict mass peak Dmax by a factor of 3-5 and underpredict ice number concentration by a factor of 4-10. Previously reported simulations with both two-moment and size-resolved microphysics have shown biases of a similar nature. The observed ice properties are notably similar to those reported from recent tropical measurements. Based on several lines of evidence, we speculate that updraft microphysical pathways determining outflow properties in the 20 May case are similar to a tropical regime, likely associated with warm-temperature ice multiplication that is not well understood or well represented in models.
Simulating the impact of case-mix adjusted hospice rates.
Mor, V; Laliberte, L
1986-01-01
The Medicare hospice benefit prospectively reimburses hospices based on the inpatient status of the patient, whether or not the patient is at home, and whether the patient is receiving round-the-clock nursing. Using National Hospice Study data, two case-mix adjusters based on patient functioning and living arrangement were found to be significantly related to per diem cost. These were tested by simulating their impact on hospice revenues. Increasing per diem reimbursements 35 percent for nonambulatory patients living alone only increases hospice revenues by 4 percent; hospices with sicker patients benefit the most.
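The headline figure here follows from simple arithmetic: if a fraction p of reimbursed patient-days falls in the adjusted group, a 35 percent per diem raise multiplies total revenue by 1 + 0.35p. A toy sketch; the 11.4 percent share below is a hypothetical value chosen to reproduce the 4 percent result, not a figure from the National Hospice Study.

```python
# Toy case-mix rate simulation: raise the per diem 35% for one patient group
# and compute the effect on total hospice revenue. The share of patient-days
# in the adjusted group (nonambulatory, living alone) is a hypothetical value.
def revenue_increase(share, per_diem_raise):
    """Fractional increase in total revenue when `share` of patient-days
    receives a per diem increase of `per_diem_raise`."""
    return share * per_diem_raise

share = 0.114                                  # hypothetical share of patient-days
print(f"{revenue_increase(share, 0.35):.1%}")  # prints "4.0%"
```

The point the abstract makes falls out directly: a large targeted raise moves total revenue only a little when the targeted group is a small share of patient-days.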
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salko, Robert K; Sung, Yixing; Kucukboyaci, Vefa
The Virtual Environment for Reactor Applications core simulator (VERA-CS) being developed by the Consortium for Advanced Simulation of Light Water Reactors (CASL) includes coupled neutronics, thermal-hydraulics, and fuel temperature components with an isotopic depletion capability. The neutronics capability employed is based on MPACT, a three-dimensional (3-D) whole core transport code. The thermal-hydraulics and fuel temperature models are provided by the COBRA-TF (CTF) subchannel code. As part of the CASL development program, the VERA-CS (MPACT/CTF) code system was applied to model and simulate reactor core response with respect to departure from nucleate boiling ratio (DNBR) at the limiting time step of a postulated pressurized water reactor (PWR) main steamline break (MSLB) event initiated at hot zero power (HZP), either with offsite power available and the reactor coolant pumps in operation (high-flow case) or without offsite power, where the reactor core is cooled through natural circulation (low-flow case). The VERA-CS simulation was based on core boundary conditions from RETRAN-02 system transient calculations and STAR-CCM+ computational fluid dynamics (CFD) core inlet distribution calculations. The evaluation indicated that the VERA-CS code system is capable of modeling and simulating quasi-steady-state reactor core response under the steamline break (SLB) accident condition, that the results are insensitive to uncertainties in the inlet flow distributions from the CFD simulations, and that the high-flow case is more DNB limiting than the low-flow case.
ERIC Educational Resources Information Center
Xiang, Lin
2011-01-01
This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th grade students' model-based inquiry (MBI) by examining students' agent-based programmable modeling (ABPM) processes and the learning outcomes. The context of the present study was a biology unit on…
The Validity of Computer Audits of Simulated Cases Records.
ERIC Educational Resources Information Center
Rippey, Robert M.; And Others
This paper describes the implementation of a computer-based approach to scoring open-ended problem lists constructed to evaluate student and practitioner clinical judgment from real or simulated records. Based on 62 previously administered and scored problem lists, the program was written in BASIC for a Heathkit H11A computer (equivalent to DEC…
The Case for Forensic Toxicology
ERIC Educational Resources Information Center
Baker, William P.; DeBeus, Elizabeth; Jones, Carleton
2006-01-01
Understanding natural and human-induced hazards is an important part of the standards-based science curriculum. Experience, however, indicates that the topic is a difficult one for many students. We have developed an exciting investigative laboratory exercise that uses simulated food-based case studies to promote critical thinking and improve…
ERIC Educational Resources Information Center
Douglas-Lenders, Rachel Claire; Holland, Peter Jeffrey; Allen, Belinda
2017-01-01
Purpose: The purpose of this paper is to examine the impact of experiential simulation-based learning of employee self-efficacy. Design/Methodology/Approach: The research approach is an exploratory case study of a group of trainees from the same organisation. Using a quasi-experiment, one group, pre-test-post-test design (Tharenou et al., 2007), a…
Carlson, Jim; Min, Elana; Bridges, Diane
2009-01-01
Methodology for training team behavior during simulation has received increased attention, but standard performance measures are lacking, especially at the undergraduate level. Our purposes were to develop a reliable team behavior measurement tool and to explore the relationship between team behavior and the delivery of an appropriate standard of care specific to the simulated case. The authors developed a unique team measurement tool based on previous work. Trainees participated in a simulated event involving the presentation of acute dyspnea. Performance was rated by separate raters using the team behavior measurement tool. Interrater reliability was assessed. The relationship between team behavior and the standard of care delivered was explored. The instrument proved to be reliable for this case and group of raters. Team behaviors had a positive relationship with the standard of medical care delivered specific to the simulated case. The methods used provide a possible approach for training and assessing team performance during simulation.
ERIC Educational Resources Information Center
Evagorou, Maria; Korfiatis, Kostas; Nicolaou, Christiana; Constantinou, Costas
2009-01-01
The purpose of this study was to investigate the impact of a simulation-based learning environment on elementary school students' (11-12 years old) development of system thinking skills. The learning environment included interactive simulations using the Stagecast Creator software to simulate the ecosystem of a marsh. Simulations are an important…
Application of State Quantization-Based Methods in HEP Particle Transport Simulation
NASA Astrophysics Data System (ADS)
Santi, Lucio; Ponieman, Nicolás; Jun, Soon Yung; Genser, Krzysztof; Elvira, Daniel; Castro, Rodrigo
2017-10-01
Simulation of particle-matter interactions in complex geometries is one of the main tasks in high energy physics (HEP) research. An essential aspect of it is accurate and efficient particle transportation in a non-uniform magnetic field, which includes the handling of volume crossings within a predefined 3D geometry. Quantized State Systems (QSS) is a family of numerical methods that provides attractive features for particle transportation processes, such as dense output (sequences of polynomial segments changing only according to accuracy-driven discrete events) and lightweight detection and handling of volume crossings (based on simple root-finding of polynomial functions). In this work we present a proof-of-concept performance comparison between a QSS-based standalone numerical solver and an application based on the Geant4 simulation toolkit with its default Runge-Kutta based adaptive step method. In a case study with a charged particle circulating in a vacuum (with interactions with matter turned off), in a uniform magnetic field, and crossing up to 200 volume boundaries twice per turn, simulation results showed speedups of up to 6 times in favor of QSS, while QSS was 10 times slower in the case with zero volume boundaries.
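The two QSS features named above can be illustrated with a minimal first-order QSS (QSS1) integrator for a scalar ODE. This is a sketch, not the solver compared against Geant4: each output segment is linear in time, so a boundary crossing reduces to the root of a degree-1 polynomial.

```python
# Minimal QSS1 sketch for x' = f(x): dense piecewise-polynomial output plus
# cheap boundary-crossing detection on each (linear) segment.
def qss1(f, x0, quantum, t_end, boundary=None):
    t, x = 0.0, x0
    q = x0                       # quantized state that drives the derivative
    crossings = []
    while t < t_end:
        dx = f(q)
        if dx == 0.0:
            break                # state locally constant; no further events
        # next re-quantization event: |x - q| grows to one quantum
        t_next = min(t + quantum / abs(dx), t_end)
        # segment is linear in t, so x(t) = boundary is a degree-1 root
        if boundary is not None:
            tc = t + (boundary - x) / dx
            if t < tc <= t_next:
                crossings.append(tc)
        x += dx * (t_next - t)   # advance along the linear segment
        t = t_next
        q = x                    # re-quantize
    return x, crossings

# demo: x' = -x with x(0) = 1 (analytic solution exp(-t));
# detect when the trajectory crosses x = 0.555
xf, crossings = qss1(lambda x: -x, 1.0, 0.01, 1.0, boundary=0.555)
```

Note how the crossing is found without any iterative root search or step rejection, which is the "lightweight detection" advantage the abstract refers to; production QSS solvers use higher-order polynomial segments.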
Simulations of Ground and Space-Based Oxygen Atom Experiments
NASA Technical Reports Server (NTRS)
Finchum, A. (Technical Monitor); Cline, J. A.; Minton, T. K.; Braunstein, M.
2003-01-01
A low-earth orbit (LEO) materials erosion scenario and the ground-based experiment designed to simulate it are compared using the direct-simulation Monte Carlo (DSMC) method. The DSMC model provides a detailed description of the interactions between the hyperthermal gas flow and a normally oriented flat plate for each case. We find that while the general characteristics of the LEO exposure are represented in the ground-based experiment, multi-collision effects can potentially alter the impact energy and directionality of the impinging molecules in the ground-based experiment. Multi-collision phenomena also affect downstream flux measurements.
NASA Astrophysics Data System (ADS)
O'Connor, J. Michael; Pretorius, P. Hendrik; Gifford, Howard C.; Licho, Robert; Joffe, Samuel; McGuiness, Matthew; Mehurg, Shannon; Zacharias, Michael; Brankov, Jovan G.
2012-02-01
Our previous Single Photon Emission Computed Tomography (SPECT) myocardial perfusion imaging (MPI) research explored the utility of numerical observers. We recently created two hundred and eighty simulated SPECT cardiac cases using Dynamic MCAT (DMCAT) and SIMIND Monte Carlo tools. All simulated cases were then processed with two reconstruction methods: iterative ordered subset expectation maximization (OSEM) and filtered back-projection (FBP). Observer study sets were assembled for both OSEM and FBP methods. Five physicians performed an observer study on one hundred and seventy-nine images from the simulated cases. The observer task was to indicate detection of any myocardial perfusion defect using the American Society of Nuclear Cardiology (ASNC) 17-segment cardiac model and the ASNC five-scale rating guidelines. Human observer Receiver Operating Characteristic (ROC) studies established the guidelines for the subsequent evaluation of numerical model observer (NO) performance. Several NOs were formulated and their performance was compared with the human observer performance. One type of NO was based on evaluation of a cardiac polar map that had been pre-processed using a gradient-magnitude watershed segmentation algorithm. The second type of NO was also based on analysis of a cardiac polar map but with use of a priori calculated average image derived from an ensemble of normal cases.
Model-Based Verification and Validation of the SMAP Uplink Processes
NASA Technical Reports Server (NTRS)
Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun
2013-01-01
This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.
Dawe, Susan R; Windsor, John A; Broeders, Joris A J L; Cregan, Patrick C; Hewett, Peter J; Maddern, Guy J
2014-02-01
A systematic review to determine whether skills acquired through simulation-based training transfer to the operating room for the procedures of laparoscopic cholecystectomy and endoscopy. Simulation-based training assumes that skills are directly transferable to the operating room, but only a few studies have investigated the effect of simulation-based training on surgical performance. A systematic search strategy that was used in 2006 was updated to retrieve relevant studies. Inclusion of articles was determined using a predetermined protocol, independent assessment by 2 reviewers, and a final consensus decision. Seventeen randomized controlled trials and 3 nonrandomized comparative studies were included in this review. In most cases, simulation-based training was provided in addition to patient-based training programs. Only 2 studies directly compared simulation-based training in isolation with patient-based training. For laparoscopic cholecystectomy (n = 10 studies) and endoscopy (n = 10 studies), participants who reached simulation-based skills proficiency before undergoing patient-based assessment performed with higher global assessment scores and fewer errors in the operating room than their counterparts who did not receive simulation training. Not all parameters measured were improved. Two of the endoscopic studies compared simulation-based training in isolation with patient-based training, with different results: for sigmoidoscopy, patient-based training was more effective, whereas for colonoscopy, simulation-based training was equally effective. Skills acquired through simulation-based training appear to be transferable to the operative setting for laparoscopic cholecystectomy and endoscopy. Future research will strengthen these conclusions by evaluating predetermined competency levels on the same simulators and using objective validated global rating scales to measure operative performance.
Availability Simulation of AGT Systems
DOT National Transportation Integrated Search
1975-02-01
The report discusses the analytical and simulation procedures that were used to evaluate the effects of failure in a complex dual-mode transportation system based on a worst-case steady-state condition. The computed results are an availability figure ...
The control effect in a detached laminar boundary layer of an array of normal synthetic jets
NASA Astrophysics Data System (ADS)
Valenzuela Calva, Fernando; Avila Rodriguez, Ruben
2016-11-01
In this work, 3D numerical simulations of an array of three normal circular synthetic jets embedded in an attached laminar boundary layer that separates under the influence of an inclined flap are performed for flow separation control. At the beginning of the present study, three cases are used to validate the numerical simulation against data obtained from experiments. The experimental data are chosen from the cases that presented the highest repeatability and reliability. The simulations showed reasonable agreement when compared with the experiments. The simulations are undertaken at three synthetic jet operating conditions, i.e., Case A: L = 2, VR = 0.32; Case B: L = 4, VR = 0.64; and Case C: L = 6, VR = 0.96. The vortical structures produced are hairpin vortices for Case A and tilted vortices for Cases B and C, respectively. By examining the spatial wall shear stress variations, the effect on the boundary layer prior to separation of the middle synthetic jet is evaluated. The findings from this study suggest that for effective flow control, hairpin vortical structures, which are produced at relatively low jet operating conditions, are the more desirable structures. Universidad Nacional Autonoma de Mexico.
Vibration modelling and verifications for whole aero-engine
NASA Astrophysics Data System (ADS)
Chen, G.
2015-08-01
In this study, a new rotor-ball-bearing-casing coupling dynamic model for a practical aero-engine is established. In the coupling system, the rotor and casing systems are modelled using the finite element method, support systems are modelled as lumped parameter models, nonlinear factors of ball bearings and faults are included, and four types of supports and connection models are defined to model the complex rotor-support-casing coupling system of the aero-engine. A new numerical integral method that combines the Newmark-β method and the improved Newmark-β method (Zhai method) is used to obtain the system responses. Finally, the new model is verified in three ways: (1) modal experiment based on rotor-ball bearing rig, (2) modal experiment based on rotor-ball-bearing-casing rig, and (3) fault simulations for a certain type of missile turbofan aero-engine vibration. The results show that the proposed model can not only simulate the natural vibration characteristics of the whole aero-engine but also effectively perform nonlinear dynamic simulations of a whole aero-engine with faults.
Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y K
2016-05-01
We present an efficient and scalable scheme for implementing agent-based modeling (ABM) simulation with In Situ visualization of large complex systems on heterogeneous computing platforms. The scheme is designed to make optimal use of the resources available on a heterogeneous platform consisting of a multicore CPU and a GPU, resulting in minimal to no resource idle time. Furthermore, the scheme was implemented under a client-server paradigm that enables remote users to visualize and analyze simulation data as it is being generated at each time step of the model. Performance of a simulation case study of vocal fold inflammation and wound healing with 3.8 million agents shows 35× and 7× speedup in execution time over single-core and multi-core CPU respectively. Each iteration of the model took less than 200 ms to simulate, visualize and send the results to the client. This enables users to monitor the simulation in real-time and modify its course as needed.
Modeling Sediment Transport to the Ganga-Brahmaputra-Meghna Delta
NASA Astrophysics Data System (ADS)
Silvestre, J.; Higgins, S.; Jennings, K. S.
2016-12-01
India's National River Linking Project (NRLP) will transfer approximately 174 billion m3/yr of water from the mountainous, water-rich north to the water-scarce south and west. Although there are many short-term benefits of the NRLP, such as decreased flooding during the monsoon season and increased water resources for irrigation, long-term consequences may include decreased sedimentation to the Ganga-Brahmaputra-Meghna Delta (GBM). Currently the GBM has a vertical aggradation rate of approximately 1-2 cm/yr and is able to compensate for a global mean sea level rise of 3.3 ± 0.4 mm/yr. However, Bangladesh and the GBM stand to be geomorphically impacted should the aggradation rate fall below the rate of sea level rise. This study better constrains the influence of anthropogenic activities on sediment transport to the GBM. We employ HydroTrend, a climate-driven hydrological and sediment transport model, to simulate daily sediment and water fluxes for the period 1982-2012. Simulations are calibrated and validated against water discharge data from the Farakka Barrage, and different ways of delineating the Ganga Basin into sub-catchments are explored. Preliminary results show a 47% difference between simulated and observed mean annual water discharge when using basin-averaged input values and only a 1% difference for the base-case scenario, where the proposed dams and canals are not included. Comparisons between the canals simulation (proposed NRLP included) and validation data suggest a 60% reduction in sediment load. However, comparison between the base-case simulation and the canals simulation suggests that India's water transfer project could decrease sediment delivery to the GBM by 9%. Further work should investigate improvements in the agreement between the base-case simulation and validation data.
ERIC Educational Resources Information Center
Eränpalo, Tommi
2014-01-01
This article is based on a case study where groups of Finnish, Swedish and Norwegian young people played a simulation game that stimulated collective deliberation on social issues. The game has been designed to provoke students to deliberate and to reflect on social problems relating to issues of citizenship and democracy. The analysis of the…
NASA Technical Reports Server (NTRS)
Stevens, N. J.
1979-01-01
Cases where the charged-particle environment acts on the spacecraft (e.g., spacecraft charging phenomena) and cases where a system on the spacecraft causes the interaction (e.g., high-voltage space power systems) are considered. Both categories were studied in ground simulation facilities to understand the processes involved and to measure the pertinent parameters. Computer simulations are based on the NASA Charging Analyzer Program (NASCAP) code. Analytical models are developed in this code and verified against the experimental data. Extrapolations from the small test samples to space conditions are made with this code. Typical results from laboratory and computer simulations are presented for both types of interactions. Extrapolations from these simulations to performance in space environments are discussed.
Simulating the impact of case-mix adjusted hospice rates
Mor, Vincent; Laliberte, Linda
1986-01-01
The Medicare hospice benefit prospectively reimburses hospices based on the inpatient status of the patient, whether or not the patient is at home, and whether the patient is receiving round-the-clock nursing. Using National Hospice Study data, two case-mix adjusters based on patient functioning and living arrangement were found to be significantly related to per diem cost. These were tested by simulating their impact on hospice revenues. Increasing per diem reimbursements by 35 percent for nonambulatory patients living alone increases hospice revenues by only 4 percent; hospices with sicker patients benefit the most. PMID:10312012
Studying distributed cognition of simulation-based team training with DiCoT.
Rybing, Jonas; Nilsson, Heléne; Jonson, Carl-Oscar; Bang, Magnus
2016-03-01
Health care organizations employ simulation-based team training (SBTT) to improve skill, communication and coordination in a broad range of critical care contexts. Quantitative approaches, such as team performance measurements, are predominantly used to measure SBTT effectiveness. However, a practical evaluation method that examines how this approach supports cognition and teamwork is missing. We have applied Distributed Cognition for Teamwork (DiCoT), a method for analysing cognition and collaboration aspects of work settings, with the purpose of assessing the methodology's usefulness for evaluating SBTTs. In a case study, we observed and analysed four Emergo Train System® simulation exercises where medical professionals trained emergency response routines. The study suggests that DiCoT is an applicable and learnable tool for determining key distributed cognition attributes of SBTTs that are of importance for the simulation validity of training environments. Moreover, we discuss and exemplify how DiCoT supports design of SBTTs with a focus on transfer and validity characteristics. Practitioner Summary: In this study, we have evaluated a method to assess simulation-based team training environments from a cognitive ergonomics perspective. Using a case study, we analysed Distributed Cognition for Teamwork (DiCoT) by applying it to the Emergo Train System®. We conclude that DiCoT is useful for SBTT evaluation and simulator (re)design.
Trentham-Dietz, Amy; Ergun, Mehmet Ali; Alagoz, Oguzhan; Stout, Natasha K; Gangnon, Ronald E; Hampton, John M; Dittus, Kim; James, Ted A; Vacek, Pamela M; Herschorn, Sally D; Burnside, Elizabeth S; Tosteson, Anna N A; Weaver, Donald L; Sprague, Brian L
2018-02-01
Due to limitations in the ability to identify non-progressive disease, ductal carcinoma in situ (DCIS) is usually managed similarly to localized invasive breast cancer. We used simulation modeling to evaluate the potential impact of a hypothetical test that identifies non-progressive DCIS. A discrete-event model simulated a cohort of U.S. women undergoing digital screening mammography. All women diagnosed with DCIS underwent the hypothetical DCIS prognostic test. Women with test results indicating progressive DCIS received standard breast cancer treatment and a decrement to quality of life corresponding to the treatment. If the DCIS test indicated non-progressive DCIS, no treatment was received and women continued routine annual surveillance mammography. A range of test performance characteristics and prevalence of non-progressive disease were simulated. Analysis compared discounted quality-adjusted life years (QALYs) and costs for test scenarios to base-case scenarios without the test. Compared to the base case, a perfect prognostic test resulted in a 40% decrease in treatment costs, from $13,321 to $8,005 per DCIS case. A perfect test produced 0.04 additional QALYs (16 days) for women diagnosed with DCIS, added to the base case of 5.88 QALYs per DCIS case. The results were sensitive to the performance characteristics of the prognostic test, the proportion of DCIS cases that were non-progressive in the model, and the frequency of mammography screening in the population. A prognostic test that identifies non-progressive DCIS would substantially reduce treatment costs but result in only modest improvements in quality of life when averaged over all DCIS cases.
Participation in Science Practices while Working in a Multimedia Case-Based Environment
ERIC Educational Resources Information Center
Kang, Hosun; Lundeberg, Mary A.
2010-01-01
The purpose of this study was to investigate how two female students participated in science practices as they worked in a multimedia case-based environment: interpreting simulated results, reading and writing multiple texts, role-playing, and Internet conferencing. Using discourse analysis, the following data were analyzed: students' published…
Computer Modeling to Evaluate the Impact of Technology Changes on Resident Procedural Volume.
Grenda, Tyler R; Ballard, Tiffany N S; Obi, Andrea T; Pozehl, William; Seagull, F Jacob; Chen, Ryan; Cohn, Amy M; Daskin, Mark S; Reddy, Rishindra M
2016-12-01
As resident "index" procedures change in volume due to advances in technology or reliance on simulation, it may be difficult to ensure trainees meet case requirements. Training programs are in need of metrics to determine how many residents their institutional volume can support. As a case study of how such metrics can be applied, we evaluated a case distribution simulation model to examine program-level mediastinoscopy and endobronchial ultrasound (EBUS) volumes needed to train thoracic surgery residents. A computer model was created to simulate case distribution based on annual case volume, number of trainees, and rotation length. Single institutional case volume data (2011-2013) were applied, and 10,000 simulation years were run to predict the likelihood (95% confidence interval) of all residents (4 trainees) achieving board requirements for operative volume during a 2-year program. The mean annual mediastinoscopy volume was 43. In a simulation of pre-2012 board requirements (thoracic pathway, 25; cardiac pathway, 10), there was a 6% probability of all 4 residents meeting requirements. Under post-2012 requirements (thoracic, 15; cardiac, 10), however, the likelihood increased to 88%. When EBUS volume (mean 19 cases per year) was concurrently evaluated in the post-2012 era (thoracic, 10; cardiac, 0), the likelihood of all 4 residents meeting case requirements was only 23%. This model provides a metric to predict the probability of residents meeting case requirements in an era of changing volume by accounting for unpredictable and inequitable case distribution. It could be applied across operations, procedures, or disease diagnoses and may be particularly useful in developing resident curricula and schedules.
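The abstract does not publish the model's internals; the core idea, though, can be sketched as a Monte Carlo simulation in which a fixed pool of institutional cases is assigned uniformly at random to residents (the assignment rule and fixed pool size are assumptions for illustration):

```python
import random

def prob_all_meet(total_cases=86, n_residents=4, requirement=15,
                  n_trials=10_000, seed=0):
    """Estimate the probability that every resident reaches the board
    requirement when cases are assigned uniformly at random. Defaults
    mirror the abstract: ~43 mediastinoscopies/year over a 2-year
    program, 4 trainees, post-2012 thoracic requirement of 15."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_trials):
        counts = [0] * n_residents
        for _ in range(total_cases):
            counts[rng.randrange(n_residents)] += 1  # random case assignment
        if min(counts) >= requirement:  # all residents meet the requirement
            successes += 1
    return successes / n_trials
```

Even with 86 cases for 4 residents (21.5 each on average), random inequity alone leaves a nontrivial chance that someone falls below 15, which is the effect the paper's metric quantifies.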
2017-06-01
cases have the most significant impact on reducing the number of lethal shots fired in the simulation. Table 10 shows the reduction in the average... Figure ES-2 was developed to show the results of the focused study on maximum effective range. After analyzing the results of the 1,700 simulated... toward other agents based on whose side they are on at that time. This attribute is critical to this study as the sidedness of the local population is
Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment
NASA Astrophysics Data System (ADS)
Zeigler, Bernard P.; Lee, J. S.
1998-08-01
In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
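Quantization, as described here, emits a state update only when the trajectory crosses a quantum level, rather than at every time step. A minimal illustrative sketch (not the DEVS/HLA implementation) of that crossing rule:

```python
def quantized_updates(states, quantum=0.5):
    """Return (index, quantized_value) pairs only where the sampled
    trajectory crosses into a new quantum level, mimicking
    quantization-based state-update reduction."""
    updates = []
    last_level = None
    for i, x in enumerate(states):
        level = round(x / quantum)  # nearest quantum level
        if level != last_level:     # a quantum level crossing occurred
            updates.append((i, level * quantum))
            last_level = level
    return updates
```

A slowly varying trajectory then transmits far fewer updates than it has samples, which is exactly the message-traffic reduction the abstract targets.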
ERIC Educational Resources Information Center
Craig, Shelley L.; McInroy, Lauren B.; Bogo, Marion; Thompson, Michelle
2017-01-01
Simulation-based learning (SBL) is a powerful tool for social work education, preparing students to practice in integrated health care settings. In an educational environment addressing patient health using an integrated care model, there is growing emphasis on students developing clinical competencies prior to entering clinical placements or…
Circuit-based versus full-wave modelling of active microwave circuits
NASA Astrophysics Data System (ADS)
Bukvić, Branko; Ilić, Andjelija Ž.; Ilić, Milan M.
2018-03-01
Modern full-wave computational tools enable rigorous simulations of linear parts of complex microwave circuits within minutes, taking into account all physical electromagnetic (EM) phenomena. Non-linear components and other discrete elements of the hybrid microwave circuit are then easily added within the circuit simulator. This combined full-wave and circuit-based analysis is a must in the final stages of the circuit design, although initial designs and optimisations are still faster and more comfortably done completely in the circuit-based environment, which offers real-time solutions at the expense of accuracy. However, due to insufficient information and a general lack of specific case studies, practitioners still struggle when choosing an appropriate analysis method, or a component model, because different choices lead to different solutions, often with uncertain accuracy and unexplained discrepancies arising between the simulations and measurements. We here design a reconfigurable power amplifier, as a case study, using both a circuit-based solver and a full-wave EM solver. We compare numerical simulations with measurements on the manufactured prototypes, discussing the obtained differences, pointing out the importance of de-embedding measured parameters and of appropriate modelling of discrete components, and giving specific recipes for good modelling practices.
An Analysis of the Verdicts and Decision-Making Variables of Simulated Juries.
ERIC Educational Resources Information Center
Anapol, Malthon M.
In order to examine jury deliberations, researchers simulated and videotaped court proceedings and jury deliberations based upon an actual civil court case. Special care was taken to make the simulated trial as authentic as the original trial. College students and the general public provided the jurors, who were then divided into twelve separate…
Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo
2018-01-01
It is likely that computer simulations will assume a greater role in the near future to investigate and understand reality (Rand & Rust, 2011). In particular, agent-based models (ABMs) represent a method of investigation of social phenomena that blends the knowledge of social sciences with the advantages of virtual simulations. Within this context, the development of algorithms able to recreate the reasoning engine of autonomous virtual agents represents one of the most fragile aspects, and it is crucial to base such models on well-supported psychological theoretical frameworks. For this reason, the present work discusses the application of the theory of planned behavior (TPB; Ajzen, 1991) in the context of agent-based modeling: it is argued that this framework might be more helpful than others for developing a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to the application of the TPB model inside computer simulations and suggests potential solutions, in the hope of shortening the distance between the fields of psychology and computer science.
Retrievals of aerosol microphysics from simulations of spaceborne multiwavelength lidar measurements
NASA Astrophysics Data System (ADS)
Whiteman, David N.; Pérez-Ramírez, Daniel; Veselovskii, Igor; Colarco, Peter; Buchard, Virginie
2018-01-01
In support of the Aerosol, Clouds, Ecosystems mission, simulations of a spaceborne multiwavelength lidar are performed based on global model simulations of the atmosphere along a satellite orbit track. The yield for aerosol microphysical inversions is quantified and comparisons are made between the aerosol microphysics inherent in the global model and those inverted from both the model's optical data and the simulated three backscatter and two extinction lidar measurements, which are based on the model's optical data. We find that yield can be significantly increased if inversions based on a reduced optical dataset of three backscatter and one extinction are acceptable. In general, retrieval performance is better for cases where the aerosol fine mode dominates although a lack of sensitivity to particles with sizes less than 0.1 μm is found. Lack of sensitivity to coarse mode cases is also found, in agreement with earlier studies. Surface area is generally the most robustly retrieved quantity. The work here points toward the need for ancillary data to aid in the constraints of the lidar inversions and also for joint inversions involving lidar and polarimeter measurements.
Retrievals of Aerosol Microphysics from Simulations of Spaceborne Multiwavelength Lidar Measurements
NASA Technical Reports Server (NTRS)
Whiteman, David N.; Perez-Ramírez, Daniel; Veselovskii, Igor; Colarco, Peter; Buchard, Virginie
2017-01-01
In support of the Aerosol, Clouds, Ecosystems mission, simulations of a spaceborne multiwavelength lidar are performed based on global model simulations of the atmosphere along a satellite orbit track. The yield for aerosol microphysical inversions is quantified and comparisons are made between the aerosol microphysics inherent in the global model and those inverted from both the model's optical data and the simulated three backscatter and two extinction lidar measurements, which are based on the model's optical data. We find that yield can be significantly increased if inversions based on a reduced optical dataset of three backscatter and one extinction are acceptable. In general, retrieval performance is better for cases where the aerosol fine mode dominates although a lack of sensitivity to particles with sizes less than 0.1 microns is found. Lack of sensitivity to coarse mode cases is also found, in agreement with earlier studies. Surface area is generally the most robustly retrieved quantity. The work here points toward the need for ancillary data to aid in the constraints of the lidar inversions and also for joint inversions involving lidar and polarimeter measurements.
NASA Astrophysics Data System (ADS)
Manessa, Masita Dwi Mandini; Kanno, Ariyo; Sagawa, Tatsuyuki; Sekine, Masahiko; Nurdin, Nurjannah
2018-01-01
Lyzenga's multispectral bathymetry formula has attracted considerable interest due to its simplicity. However, there has been little discussion of the effect that variation in optical conditions and bottom types, which commonly appears in coral reef environments, has on this formula's results. The present paper evaluates Lyzenga's multispectral bathymetry formula for a variety of optical conditions and bottom types. A noiseless dataset of above-water remote sensing reflectance from WorldView-2 images over Case-1 shallow coral reef water is simulated using a radiative transfer model. The simulation-based assessment shows that Lyzenga's formula performs robustly, with adequate generality and good accuracy, under a range of conditions. As expected, the influence of bottom type on depth estimation accuracy is far greater than the influence of other optical parameters, i.e., chlorophyll-a concentration and solar zenith angle. Further, based on the simulation dataset, Lyzenga's formula estimates depth when the bottom type is unknown almost as accurately as when the bottom type is known. This study provides a better understanding of Lyzenga's multispectral bathymetry formula under various optical conditions and bottom types.
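Lyzenga's formula models depth as a linear function of the log of bottom-reflected signal in each band: h = a0 + Σj aj ln(Rj − R∞,j), where R∞,j is the deep-water reflectance. A minimal sketch of the prediction step (the coefficient and reflectance values below are hypothetical; in practice the aj are regressed against known depths):

```python
import math

def lyzenga_depth(refl, deep_refl, coeffs):
    """Depth from Lyzenga's log-linear bathymetry formula:
    h = a0 + sum_j a_j * ln(R_j - R_inf_j).
    refl: per-band reflectance; deep_refl: per-band deep-water
    reflectance R_inf; coeffs: (a0, a1, ..., aN), normally fit to
    in-situ depths (values used here are illustrative only)."""
    a0, a = coeffs[0], coeffs[1:]
    x = [math.log(r - rd) for r, rd in zip(refl, deep_refl)]
    return a0 + sum(ai * xi for ai, xi in zip(a, x))
```

With negative band coefficients, weaker bottom-reflected signal (stronger attenuation) maps to greater estimated depth, as expected physically.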
Computer simulation of surface and film processes
NASA Technical Reports Server (NTRS)
Tiller, W. A.
1981-01-01
A molecular dynamics technique based upon Lennard-Jones type pair interactions is used to investigate time-dependent as well as equilibrium properties. The case study deals with systems containing Si and O atoms. In this case a more involved potential energy function (PEF) is employed and the system is simulated via a Monte-Carlo procedure. This furnishes the equilibrium properties of the system at its interfaces and surfaces as well as in the bulk.
2015-01-01
Procedure. The simulated annealing (SA) algorithm is a well-known local search metaheuristic used to address discrete, continuous, and multiobjective... design of experiments (DOE) to tune the parameters of the optimization algorithm. Section 5 shows the results of the case study. Finally, concluding... metaheuristic. The proposed method is broken down into two phases. Phase I consists of a Monte Carlo simulation to obtain the simulated percentage of failure
Xu, Yuan; Bai, Ti; Yan, Hao; Ouyang, Luo; Pompos, Arnold; Wang, Jing; Zhou, Linghong; Jiang, Steve B.; Jia, Xun
2015-01-01
Cone-beam CT (CBCT) has become the standard image guidance tool for patient setup in image-guided radiation therapy. However, due to its large illumination field, scattered photons severely degrade its image quality. While kernel-based scatter correction methods have been used routinely in the clinic, it is still desirable to develop Monte Carlo (MC) simulation-based methods due to their accuracy. However, the high computational burden of the MC method has prevented routine clinical application. This paper reports our recent development of a practical method of MC-based scatter estimation and removal for CBCT. In contrast with conventional MC approaches that estimate scatter signals using a scatter-contaminated CBCT image, our method used a planning CT image for MC simulation, which has the advantages of accurate image intensity and absence of image truncation. In our method, the planning CT was first rigidly registered with the CBCT. Scatter signals were then estimated via MC simulation. After scatter signals were removed from the raw CBCT projections, a corrected CBCT image was reconstructed. The entire workflow was implemented on a GPU platform for high computational efficiency. Strategies such as projection denoising, CT image downsampling, and interpolation along the angular direction were employed to further enhance the calculation speed. We studied the impact of key parameters in the workflow on the resulting accuracy and efficiency, based on which the optimal parameter values were determined. Our method was evaluated in numerical simulation, phantom, and real patient cases. In the simulation cases, our method reduced mean HU errors from 44 HU to 3 HU and from 78 HU to 9 HU in the full-fan and the half-fan cases, respectively. In both the phantom and the patient cases, image artifacts caused by scatter, such as ring artifacts around the bowtie area, were reduced. 
With all the techniques employed, we achieved a computation time of less than 30 sec, including the time for both the scatter estimation and CBCT reconstruction steps. The efficacy of our method and its high computational efficiency make it attractive for clinical use. PMID:25860299
Falcione, Bonnie A; Meyer, Susan M
2014-10-15
To design an elective for pharmacy students that facilitates antimicrobial stewardship awareness, knowledge, and skill development by solving clinical cases, using human patient simulation technology. The elective was designed for PharmD students to describe principles and functions of stewardship programs, select, evaluate, refine, or redesign patient-specific plans for infectious diseases in the context of antimicrobial stewardship, and propose criteria and stewardship management strategies for an antimicrobial class at a health care institution. Teaching methods included active learning and lectures. Cases of bacterial endocarditis and cryptococcal meningitis were developed that incorporated human patient simulation technology. Forty-five pharmacy students completed an antimicrobial stewardship elective between 2010 and 2013. Outcomes were assessed using student perceptions of and performance on rubric-graded assignments. A PharmD elective using active learning, including novel cases conducted with human patient simulation technology, enabled outcomes consistent with those desired of pharmacists assisting in antimicrobial stewardship programs.
Wasson, Katherine; Parsi, Kayhan; McCarthy, Michael; Siddall, Viva Jo; Kuczewski, Mark
2016-06-01
The American Society for Bioethics and Humanities has created a quality attestation (QA) process for clinical ethics consultants; the pilot phase of reviewing portfolios has begun. One aspect of the QA process which is particularly challenging is assessing the interpersonal skills of individual clinical ethics consultants. We propose that using case simulation to evaluate clinical ethics consultants is an approach that can meet this need provided clear standards for assessment are identified. To this end, we developed the Assessing Clinical Ethics Skills (ACES) tool, which identifies and specifies specific behaviors that a clinical ethics consultant should demonstrate in an ethics case simulation. The aim is for the clinical ethics consultant or student to use a videotaped case simulation, along with the ACES tool scored by a trained rater, to demonstrate their competence as part of their QA portfolio. The development and piloting of the tool is described.
Applying Service-Oriented Architecture on The Development of Groundwater Modeling Support System
NASA Astrophysics Data System (ADS)
Li, C. Y.; WANG, Y.; Chang, L. C.; Tsai, J. P.; Hsiao, C. T.
2016-12-01
Groundwater simulation has become an essential step in groundwater resources management and assessment. There are many stand-alone pre- and post-processing software packages to alleviate the model simulation loading, but such stand-alone software provides neither centralized management of data and simulation results nor network sharing functions. Hence, it is difficult to share and reuse the data and knowledge (simulation cases) systematically within or across companies. Therefore, this study develops a centralized, network-based groundwater modeling support system to assist model construction. The system is based on service-oriented architecture and allows remote users to develop their modeling cases on the internet. The data and cases (knowledge) are thus easily managed centrally. MODFLOW, the most popular groundwater model in the world, is the modeling engine of the system. The system provides a data warehouse to store groundwater observations, as well as a MODFLOW Support Service, MODFLOW Input File & Shapefile Convert Service, MODFLOW Service, and Expert System Service to assist researchers in building models. Since the system architecture is service-oriented, it is scalable and flexible. The system can be easily extended to include scenario analysis and knowledge management to facilitate the reuse of groundwater modeling knowledge.
Clinical decision-making by midwives: managing case complexity.
Cioffi, J; Markham, R
1997-02-01
In making clinical judgements, it is argued that midwives use 'shortcuts' or heuristics based on estimated probabilities to simplify the decision-making task. Midwives (n = 30) were given simulated patient assessment situations of high and low complexity and were required to think aloud. Analysis of verbal protocols showed that subjective probability judgements (heuristics) were used more frequently in the high than low complexity case and predominated in the last quarter of the assessment period for the high complexity case. 'Representativeness' was identified more frequently in the high than in the low case, but was the dominant heuristic in both. Reports completed after each simulation suggest that heuristics based on memory for particular conditions affect decisions. It is concluded that midwives use heuristics, derived mainly from their clinical experiences, in an attempt to save cognitive effort and to facilitate reasonably accurate decisions in the decision-making process.
Variance reduction for Fokker–Planck based particle Monte Carlo schemes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorji, M. Hossein, E-mail: gorjih@ifd.mavt.ethz.ch; Andric, Nemanja; Jenny, Patrick
Recently, Fokker–Planck based particle Monte Carlo schemes have been proposed and evaluated for simulations of rarefied gas flows [1–3]. In this paper, the variance reduction for particle Monte Carlo simulations based on the Fokker–Planck model is considered. First, deviational based schemes were derived and reviewed, and it is shown that these deviational methods are not appropriate for practical Fokker–Planck based rarefied gas flow simulations. This is due to the fact that the deviational schemes considered in this study lead either to instabilities in the case of two-weight methods or to large statistical errors if the direct sampling method is applied. Motivated by this conclusion, we developed a novel scheme based on correlated stochastic processes. The main idea here is to synthesize an additional stochastic process with a known solution, which is simultaneously solved together with the main one. By correlating the two processes, the statistical errors can dramatically be reduced; especially for low Mach numbers. To assess the methods, homogeneous relaxation, planar Couette and lid-driven cavity flows were considered. For these test cases, it could be demonstrated that variance reduction based on parallel processes is very robust and effective.
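The scheme's key idea, an auxiliary process with a known solution solved alongside the main one so that correlating the two cancels statistical error, is the classical control-variate principle. A generic sketch of that principle (not the Fokker–Planck scheme itself; the toy target E[exp(U)] and auxiliary U are chosen purely for illustration):

```python
import math
import random
import statistics

def control_variate_estimate(n=20_000, seed=1):
    """Estimate E[exp(U)], U ~ Uniform(0,1), using U itself as a
    correlated auxiliary process whose mean (1/2) is known exactly.
    Returns (plain, variance_reduced) estimates; true value is e - 1."""
    rng = random.Random(seed)
    ys, cs = [], []
    for _ in range(n):
        u = rng.random()
        ys.append(math.exp(u))  # main process sample
        cs.append(u)            # auxiliary process sample, known mean 0.5
    y_bar = statistics.fmean(ys)
    c_bar = statistics.fmean(cs)
    # Optimal coefficient b = Cov(Y, C) / Var(C), estimated from samples.
    cov = sum((y - y_bar) * (c - c_bar) for y, c in zip(ys, cs)) / n
    var = sum((c - c_bar) ** 2 for c in cs) / n
    b = cov / var
    # Subtract the correlated error: the auxiliary's deviation from its
    # known mean corrects the main estimate.
    return y_bar, y_bar - b * (c_bar - 0.5)
```

Because the two sample streams share the same randomness, their statistical errors are strongly correlated and largely cancel, mirroring the paper's parallel-process construction.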
Maljovec, D.; Liu, S.; Wang, B.; ...
2015-07-14
Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment, for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.
CaseWorld™: Interactive, media rich, multidisciplinary case based learning.
Gillham, David; Tucker, Katie; Parker, Steve; Wright, Victoria; Kargillis, Christina
2015-11-01
Nurse educators are challenged to keep up with highly specialised clinical practice, emerging research evidence, regulation requirements and rapidly changing information technology while teaching very large numbers of diverse students in a resource constrained environment. This complex setting provides the context for the CaseWorld project, which aims to simulate those aspects of clinical practice that can be represented by e-learning. This paper describes the development, implementation and evaluation of CaseWorld, a simulated learning environment that supports case based learning. CaseWorld provides nursing students with the opportunity to view unfolding authentic cases presented in a rich multimedia context. The first round of comprehensive summative evaluation of CaseWorld is discussed in the context of earlier formative evaluation, reference group input and strategies for integration of CaseWorld with subject content. This discussion highlights the unique approach taken in this project that involved simultaneous prototype development and large scale implementation, thereby necessitating strong emphasis on staff development, uptake and engagement. The lessons learned provide an interesting basis for further discussion of broad content sharing across disciplines and universities, and the contribution that local innovations can make to global education advancement.
Simulation of shoreline development in a groyne system, with a case study Sanur Bali beach
NASA Astrophysics Data System (ADS)
Gunawan, P. H.; Pudjaprasetya, S. R.
2018-03-01
The process of shoreline change due to transport of sediment by littoral drift is studied in this paper. The Pelnard-Considère equation is the commonly adopted model; it is based on the principle of sediment conservation, without diffraction. In this research, we adopt the Pelnard-Considère equation with diffraction, and a numerical scheme based on the finite volume method is implemented. Shoreline development in a groyne system is then simulated. As a case study, Sanur beach in Bali, Indonesia is considered; Google Earth photos show that the beach experiences coastline changes caused by sediment trapped in a groyne system.
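In its simplest form the Pelnard-Considère one-line model reduces to a diffusion equation for the shoreline position y(x, t), ∂y/∂t = ε ∂²y/∂x². A minimal explicit finite-volume step is sketched below, with zero-flux boundaries standing in for sediment-trapping groynes; diffraction and all coefficient values are omitted/hypothetical:

```python
def step_shoreline(y, eps, dx, dt):
    """One explicit finite-volume step of the one-line model
    dy/dt = eps * d2y/dx2. Zero-flux boundary faces represent groynes
    that block littoral drift, so total sediment is conserved."""
    n = len(y)
    flux = [0.0] * (n + 1)  # longshore sediment flux at cell faces
    for i in range(1, n):
        flux[i] = -eps * (y[i] - y[i - 1]) / dx
    # flux[0] and flux[n] stay 0: groynes at both ends trap sediment.
    return [yi - dt / dx * (flux[i + 1] - flux[i]) for i, yi in enumerate(y)]
```

The explicit step is stable for eps * dt / dx**2 <= 0.5; repeated steps smooth a perturbed shoreline toward the trapped-sediment equilibrium inside the groyne compartment.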
Cabaraban, Maria Theresa I; Kroll, Charles N; Hirabayashi, Satoshi; Nowak, David J
2013-05-01
A distributed adaptation of i-Tree Eco was used to simulate dry deposition in an urban area. This investigation focused on the effects of varying temperature, LAI, and NO2 concentration inputs on estimated NO2 dry deposition to trees in Baltimore, MD. A coupled modeling system is described, wherein WRF provided temperature and LAI fields, and CMAQ provided NO2 concentrations. A base case simulation was conducted using built-in distributed i-Tree Eco tools, and simulations using different inputs were compared against this base case. Differences in land cover classification and tree cover between the distributed i-Tree Eco and WRF resulted in changes in estimated LAI, which in turn resulted in variations in simulated NO2 dry deposition. Estimated NO2 removal decreased when CMAQ-derived concentration was applied to the distributed i-Tree Eco simulation. Discrepancies in temperature inputs did little to affect estimates of NO2 removal by dry deposition to trees in Baltimore.
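Dry-deposition models of this family rest on the standard flux relation F = Vd · C (deposition velocity times pollutant concentration), which is why both the concentration input (CMAQ) and the LAI-dependent canopy affect the removal estimate. A schematic sketch of that relation, with hypothetical values, not i-Tree Eco's exact parameterization:

```python
def dry_deposition_flux(vd, conc):
    """Pollutant flux to the canopy, F = Vd * C.
    vd: deposition velocity (m/s); conc: concentration (ug/m^3);
    returns flux in ug per m^2 of canopy per second."""
    return vd * conc

def total_removal(vd, conc, canopy_area_m2, seconds):
    """Total mass removed (ug) over a period, assuming constant inputs.
    Canopy area would come from the LAI / tree-cover estimates."""
    return dry_deposition_flux(vd, conc) * canopy_area_m2 * seconds
```

The sketch makes the abstract's sensitivity result concrete: removal scales linearly with concentration and with canopy area, so errors in either input propagate directly into the estimate.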
ERIC Educational Resources Information Center
Chaparro-Pelaez, Julian; Iglesias-Pradas, Santiago; Pascual-Miguel, Felix J.; Hernandez-Garcia, Angel
2013-01-01
Although literature about problem based learning (PBL) is not scarce, there is little research on experiences about learning methodologies that combine PBL and the use of simulation tools. This lack of studies is even more notable in the case of engineering courses. The motivation for this study is to show how such a combination of PBL and…
Henke, Lauren; Kashani, Rojano; Yang, Deshan; Zhao, Tianyu; Green, Olga; Olsen, Lindsey; Rodriguez, Vivian; Wooten, H. Omar; Li, H. Harold; Hu, Yanle; Bradley, Jeffrey; Robinson, Clifford; Parikh, Parag; Michalski, Jeff; Mutic, Sasa; Olsen, Jeffrey
2017-01-01
Purpose/Objectives Stereotactic body radiotherapy (SBRT) is increasingly used to treat oligometastatic or unresectable primary malignancy, although proximity of organs-at-risk (OAR) may limit delivery of sufficiently ablative dose. Magnetic resonance (MR)-based online-adaptive radiotherapy (ART) has potential to improve SBRT’s therapeutic ratio. This study characterizes potential advantages of online-adaptive MR-guided SBRT to treat oligometastatic disease of the non-liver abdomen and central thorax. Materials/Methods Ten patients treated with RT for unresectable primary or oligometastatic disease of the non-liver abdomen (n=5) or central thorax (n=5) underwent imaging throughout treatment on a clinical MR-IGRT system. SBRT plans were created based on tumor/OAR anatomy at initial CT simulation (PI) and simulated adaptive plans were created based on observed MR-image set tumor/OAR “anatomy-of-the-day” (PA). Each PA was planned under workflow constraints to simulate online-ART. Prescribed dose was 50 Gy in 5 fractions with goal coverage of 95% of the PTV by 95% of the prescription, subject to hard OAR constraints. PI was applied to each MR dataset and compared to PA to evaluate changes in dose delivered to tumor/OARs, with dose escalation when possible. Results Hard OAR constraints were met for all PI based on anatomy from initial CT simulation, and all PA based on anatomy from each daily MR-image set. Application of the PI to anatomy-of-the-day caused OAR constraint violation in 19/30 cases. Adaptive planning increased PTV coverage in 21/30 cases, including 14 cases where hard OAR constraints were violated by the non-adaptive plan. For 9 PA cases, decreased PTV coverage was required to meet hard OAR constraints that would have been violated in a non-adaptive setting. Conclusions Online-adaptive MRI-guided SBRT may allow PTV dose escalation and/or simultaneous OAR sparing compared to non-adaptive SBRT.
A prospective clinical trial is underway at our institution to evaluate clinical outcomes of this technique. PMID:27742541
High fidelity case-based simulation debriefing: everything you need to know.
Hart, Danielle; McNeil, Mary Ann; Griswold-Theodorson, Sharon; Bhatia, Kriti; Joing, Scott
2012-09-01
In this 30-minute talk, the authors take an in-depth look at how to debrief high-fidelity case-based simulation sessions, including discussion on debriefing theory, goals, approaches, and structure, as well as ways to create a supportive and safe learning environment, resulting in successful small group learning and self-reflection. Emphasis is placed on the "debriefing with good judgment" approach. Video clips of sample debriefing attempts, highlighting the "dos and don'ts" of simulation debriefing, are included. The goal of this talk is to provide you with the necessary tools and information to develop a successful and effective debriefing approach. There is a bibliography and a quick reference guide in Data Supplements S1 and S2 (available as supporting information in the online version of this paper). © 2012 by the Society for Academic Emergency Medicine.
Convergence of methods for coupling of microscopic and mesoscopic reaction-diffusion simulations
NASA Astrophysics Data System (ADS)
Flegg, Mark B.; Hellander, Stefan; Erban, Radek
2015-05-01
In this paper, three multiscale methods for coupling mesoscopic (compartment-based) and microscopic (molecular-based) stochastic reaction-diffusion simulations are investigated. Two of the three methods discussed in detail have been previously reported in the literature: the two-regime method (TRM) and the compartment-placement method (CPM). The third method, introduced and analysed in this paper, is called the ghost cell method (GCM), since it works by constructing a "ghost cell" in which molecules can disappear and jump into the compartment-based simulation. A comparison of the sources of error is presented, and the convergence properties of this error are studied as the time step Δt (for updating the molecular-based part of the model) approaches zero. It is found that the error behaviour depends on another fundamental computational parameter h, the compartment size in the mesoscopic part of the model. Two important limiting cases, which appear in applications, are considered: (i) Δt → 0 with h fixed; and (ii) Δt → 0 and h → 0 such that √Δt/h is fixed. The error of the previously developed approaches (the TRM and CPM) converges to zero only in limiting case (ii), but not in case (i). It is shown that the error of the GCM converges in limiting case (i). Thus the GCM is superior to previous coupling techniques if the mesoscopic description is much coarser than the microscopic part of the model.
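To make the mesoscopic half of such couplings concrete, the following Python sketch simulates compartment-based diffusion with the Gillespie algorithm, where each molecule jumps to a neighbouring compartment at rate D/h². It is a generic illustration of the compartment description, not an implementation of the TRM, CPM, or GCM.

```python
import random

def simulate_compartment_diffusion(counts, d_over_h2, t_end, seed=1):
    """Gillespie simulation of diffusion on a 1D row of compartments:
    each molecule jumps left or right with rate D/h^2 per direction,
    with reflective ends. `counts` holds molecules per compartment."""
    rng = random.Random(seed)
    t, n = 0.0, len(counts)
    while True:
        # per-compartment jump propensity (end compartments jump inward only)
        rates = [counts[i] * d_over_h2 * (1 if i in (0, n - 1) else 2)
                 for i in range(n)]
        total = sum(rates)
        if total == 0:
            return counts
        t += rng.expovariate(total)     # time to next jump event
        if t > t_end:
            return counts
        # choose the jumping compartment proportionally to its propensity
        r, acc = rng.uniform(0, total), 0.0
        for i, rate in enumerate(rates):
            acc += rate
            if r <= acc:
                break
        # choose a direction, respecting the reflective boundaries
        if i == 0:
            j = 1
        elif i == n - 1:
            j = n - 2
        else:
            j = i + rng.choice((-1, 1))
        counts[i] -= 1
        counts[j] += 1
```

A coupling method such as the GCM would replace one end of this row with an interface to a molecular-based (Brownian dynamics) region.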
Computer Simulations Improve University Instructional Laboratories
2004-01-01
Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the ability to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In others, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599
NASA Astrophysics Data System (ADS)
Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin
2012-08-01
Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. The research is driven by issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework for a virtualisation-based simulation platform (VSIM) is first proposed. The article then investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping, and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and increased flexibility for simulation environment construction, and (3) achieve fault-tolerant simulation.
Parker, Daniel M; Landier, Jordi; von Seidlein, Lorenz; Dondorp, Arjen; White, Lisa; Hanboonkunupakarn, Borimas; Maude, Richard J; Nosten, François H
2016-11-25
Reactive case detection is an approach that has been proposed as a tool for malaria elimination in low-transmission settings. It is an intuitively justified approach based on the concept of space-time clustering of malaria cases. When an index malaria clinical case is detected, it triggers reactive screening and treatment in the index house and neighbouring houses. However, the efficacy of this approach at varying screening radii and malaria prevalence remains ill defined. Data were obtained from a detailed demographic and geographic surveillance study in four villages on the Myanmar-Thailand border. Clinical cases were recorded at village malaria clinics and were linked back to patients' residencies. These data were used to simulate the efficacy of reactive case detection for clinical cases using rapid diagnostic tests (RDT). Simulations took clinical cases in a given month and tabulated the number of cases that would have been detected in the following month at varying screening radii around the index houses. Simulations were run independently for both falciparum and vivax malaria. Each simulation of a reactive case detection effort was run in comparison with a strategy using random selection of houses for screening. In approximately half of the screenings for falciparum and 10% for vivax it would have been impossible to detect any malaria cases regardless of the screening strategy because the screening would have occurred during times when there were no cases. When geographically linked cases were present in the simulation, reactive case detection would have only been successful at detecting most malaria cases using larger screening radii (150-m radius and above). At this screening radius and above, reactive case detection does not perform better than random screening of an equal number of houses in the village. 
Screening within very small radii detects only a very small proportion of cases, though even this low performance is better than random screening with the same sample size. The results of these simulations indicate that reactive case detection for clinical cases using RDTs has limited ability to halt transmission in regions of low and unstable transmission. This is linked to the high spatial heterogeneity of cases, the acquisition of malaria infections outside the village, and the missing of asymptomatic infections. When cases are few and sporadic, reactive case detection would result in major time and budgetary losses.
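The core tabulation in these simulations, counting the next month's cases that fall within a given screening radius of any index house, can be sketched as follows. The coordinates and radii are illustrative; the real study linked cases to surveyed house locations.

```python
import math

def cases_detected(index_houses, next_month_cases, radius_m):
    """Count next-month cases whose house lies within radius_m of any
    index-case house (planar coordinates in metres). Simplified sketch
    of the screening-radius tabulation; ignores screening coverage,
    RDT sensitivity, and asymptomatic infections."""
    detected = 0
    for cx, cy in next_month_cases:
        if any(math.hypot(cx - ix, cy - iy) <= radius_m
               for ix, iy in index_houses):
            detected += 1
    return detected

# hypothetical layout: one index house, two later cases at 50 m and 200 m
n_150 = cases_detected([(0, 0)], [(50, 0), (200, 0)], radius_m=150)
```

Sweeping `radius_m` over a range and comparing against a random-house baseline reproduces the kind of radius-versus-yield curve the study evaluates.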
Human swallowing simulation based on videofluorography images using Hamiltonian MPS method
NASA Astrophysics Data System (ADS)
Kikuchi, Takahiro; Michiwaki, Yukihiro; Kamiya, Tetsu; Toyama, Yoshio; Tamai, Tasuku; Koshizuka, Seiichi
2015-09-01
In developed nations, swallowing disorders and aspiration pneumonia have become serious problems. We developed a method to simulate the behavior of the organs involved in swallowing to clarify the mechanisms of swallowing and aspiration. The shape model is based on anatomically realistic geometry, and the motion model utilizes forced displacements based on realistic dynamic images to reflect the mechanisms of human swallowing. The soft tissue organs are modeled as nonlinear elastic material using the Hamiltonian MPS method. This method allows for stable simulation of the complex swallowing movement. A penalty method using metaballs is employed to simulate contact between organ walls and smooth sliding along the walls. We performed four numerical simulations under different analysis conditions to represent four cases of swallowing, including a healthy volunteer and a patient with a swallowing disorder. The simulation results were compared to examine the epiglottic downfolding mechanism, which strongly influences the risk of aspiration.
A qualitative analysis of bus simulator training on transit incidents : a case study in Florida.
DOT National Transportation Integrated Search
2013-06-01
The purpose of this research was to track and observe three Florida public transit agencies as they incorporated and integrated computer-based transit bus simulators into their existing bus operator training programs. In addition to the three Florida...
spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains
NASA Astrophysics Data System (ADS)
Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo
2016-09-01
The paper presents the spatial Markov Chains (spMC) R-package and a case study of subsoil simulation/prediction at a plain site in Northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection; in addition, it implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Simulation methods based on the best-known prediction methods (such as indicator Kriging and CoKriging) are also implemented in the spMC package, along with more advanced simulation methods, e.g. path methods and Bayesian procedures that exploit the maximum entropy principle. Since the spMC package was developed for intensive geostatistical computations, part of the code is implemented for parallel computation via OpenMP constructs. A final analysis compares the computational efficiency of the simulation/prediction algorithms using different numbers of CPU cores, considering the example data set of the case study included in the package.
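The basic quantity spMC estimates, one-step transition probabilities between lithological categories along a direction, can be illustrated with a short sketch (the package itself is written in R and does far more, including multidirectional transition-rate estimation):

```python
from collections import Counter, defaultdict

def transition_probabilities(sequence):
    """Empirical one-step transition probabilities of a categorical
    (e.g. lithological) sequence sampled along a borehole.
    Illustrative only; not the spMC estimation algorithm."""
    counts = defaultdict(Counter)
    for a, b in zip(sequence, sequence[1:]):
        counts[a][b] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

# hypothetical facies log: A = sand, B = clay
p = transition_probabilities("AABAB")
```

Such empirical matrices are the raw input from which spatial Markov chain models build simulations and predictions of subsoil structure.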
Teaching End-of-Life Care Using Interprofessional Simulation.
Gannon, Jane; Motycka, Carol; Egelund, Eric; Kraemer, Dale F; Smith, W Thomas; Solomon, Kathleen
2017-04-01
Competency in end-of-life (EOL) care is a growing expectation for health professions students. This study assessed the impact of four EOL care scenarios, using high-fidelity simulation, on the perceived learning needs and attitudes of pharmacy and nursing students. On three campuses, pharmacy students (N = 158) were exposed to standard paper EOL case scenarios, while a fourth campus exposed eight graduate nursing and 37 graduate pharmacy students to simulated versions of the same cases. The paper-based groups produced similar pre-post changes on the End of Life Professional Caregiver Survey. Results were pooled and compared with the simulation-only group, revealing significantly higher changes in pre-post scores for the simulation group. Students participating in the simulation group showed some significant differences in attitudes toward EOL care, compared with students in the classroom setting. [J Nurs Educ. 2017;56(4):205-210.]. Copyright 2017, SLACK Incorporated.
The effect of misclassification errors on case mix measurement.
Sutherland, Jason M; Botz, Chas K
2006-12-01
Case mix systems have been implemented for hospital reimbursement and performance measurement across Europe and North America. Case mix categorizes patients into discrete groups based on clinical information obtained from patient charts, in an attempt to identify clinical or cost differences among these groups. The diagnosis related group (DRG) case mix system is the most common methodology, with variants adopted in many countries. External validation studies of coding quality have confirmed that widespread variability exists between originally recorded diagnoses and re-abstracted clinical information. DRG assignment errors in hospitals that share patient-level cost data for the purpose of establishing cost weights affect cost weight accuracy. The purpose of this study is to estimate bias in cost weights due to measurement error in reported clinical information. DRG assignment error rates are simulated based on recent clinical re-abstraction study results. Our simulation study estimates that 47% of cost weights representing the least severe cases are over weight by 10%, while 32% of cost weights representing the most severe cases are under weight by 10%. Applying the simulated weights to a cross-section of hospitals, we find that teaching hospitals tend to be under weight. Since inaccurate cost weights challenge the ability of case mix systems to accurately reflect patient mix and may lead to potential distortions in hospital funding, bias in hospital case mix measurement highlights the role clinical data quality plays in hospital funding in countries that use DRG-type case mix systems. The quality of clinical information should be carefully considered for hospitals that contribute financial data for establishing cost weights.
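A toy Monte Carlo of the mechanism studied here, misclassification shifting cases between severity groups before group-mean cost weights are computed, can be sketched as follows. The two-group structure, costs, and error model are invented for illustration.

```python
import random

def simulated_cost_weights(true_costs, error_rate, seed=0):
    """Compute group-mean 'cost weights' after simulated misclassification:
    with probability error_rate a case is assigned to the other of two
    severity groups. true_costs is a list of (group, cost) pairs.
    Illustrative sketch, not the study's DRG error model."""
    rng = random.Random(seed)
    groups = {0: [], 1: []}
    for group, cost in true_costs:
        assigned = 1 - group if rng.random() < error_rate else group
        groups[assigned].append(cost)
    return {g: sum(c) / len(c) for g, c in groups.items() if c}
```

Because misclassification mixes cheap cases into the expensive group and vice versa, it pulls the group means together, which is exactly the over-weighting of mild cases and under-weighting of severe cases the study quantifies.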
Managing simulation-based training: A framework for optimizing learning, cost, and time
NASA Astrophysics Data System (ADS)
Richmond, Noah Joseph
This study provides a management framework for optimizing training programs for learning, cost, and time when using simulation-based training (SBT) and reality-based training (RBT) as resources. Simulation is shown to be an effective means for implementing activity substitution as a way to reduce risk. The risk profiles of 22 US Air Force vehicles are calculated, and the potential risk reduction is calculated under the assumption of perfect substitutability of RBT and SBT. Methods are subsequently developed to relax the assumption of perfect substitutability. The transfer effectiveness ratio (TER) concept is defined and modeled as a function of the quality of the simulator used and the requirements of the activity trained. The Navy F/A-18 is then analyzed in a case study illustrating how learning can be maximized subject to constraints in cost and time, and also subject to the decision maker's preferences for the proportional and absolute use of simulation. Solution methods for optimizing multiple activities across shared resources are next provided. Finally, a simulation strategy including an operations planning program (OPP), an implementation program (IP), an acquisition program (AP), and a pedagogical research program (PRP) is detailed. The study provides the theoretical tools to understand how to leverage SBT, a case study demonstrating these tools' efficacy, and a set of policy recommendations to enable the US military to better utilize SBT in the future.
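The role of the transfer effectiveness ratio in such an optimization can be shown with a deliberately simplified sketch: given a budget, spend it on whichever modality yields more training effect per dollar. The study's actual framework handles time constraints and decision-maker preferences that this toy ignores, and the costs below are invented.

```python
def effective_hours(budget, sim_cost_per_h, real_cost_per_h, ter):
    """Effective training hours attainable under a budget when one
    simulator hour transfers as `ter` hours of reality-based training.
    All-or-nothing allocation; a toy, not the study's solution method."""
    per_dollar_sim = ter / sim_cost_per_h
    per_dollar_real = 1.0 / real_cost_per_h
    if per_dollar_sim >= per_dollar_real:
        return ter * budget / sim_cost_per_h   # simulator wins per dollar
    return budget / real_cost_per_h            # flying wins per dollar
```

The comparison makes the framework's central trade-off visible: even a modest TER favors simulation when simulator hours are much cheaper than flight hours.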
Everglades Landscape Model: Integrated Assessment of Hydrology, Biogeochemistry, and Biology
NASA Astrophysics Data System (ADS)
Fitz, H. C.; Wang, N.; Sklar, F. H.
2002-05-01
Water management infrastructure and operations have fragmented the greater Everglades into separate, impounded basins, altering flows and hydropatterns. A significant area of this managed system has experienced anthropogenic eutrophication. This combination of altered hydrology and water quality has interacted to degrade vegetative habitats and other ecological characteristics of the Everglades. One of the modeling tools to be used in developing restoration alternatives is the Everglades Landscape Model (ELM), a process-based, spatially explicit simulation of ecosystem dynamics across a heterogeneous, 10,000 km2 region. The model has been calibrated to capture hydrologic and surface water quality dynamics across most of the Everglades landscape over decadal time scales. We evaluated phosphorus loading throughout the Everglades system under two base scenarios. The 1995 base case assumed current management operations, with phosphorus inflow concentrations fixed at their long term, historical average. The 2050 base case assumed future modifications in water and nutrient management, with all managed inflows to the Everglades having reduced phosphorus concentrations. In an example indicator subregion that currently is highly eutrophic, the 31-yr simulations predicted that desirable periphyton and macrophyte communities were maintained under the 2050 base case, whereas in the 1995 base case, periphyton biomass and production decreased to negligible levels and macrophytes became extremely dense. The negative periphyton response in the 1995 base case was due to high phosphorus loads and rapid macrophyte growth that shaded this algal community. Along an existing 11 km eutrophication gradient, the model indicated that the 2050 base case had ecologically significant reductions in phosphorus accumulation compared to the 1995 base case. 
Indicator regions (in Everglades National Park) distant from phosphorus inflow points also exhibited reductions in phosphorus accumulation under the 2050 base case, albeit to a lesser extent due to its distance from phosphorus inflows. The ELM fills a critical information need in Everglades management, and has become an accepted tool in evaluating scenarios of potential restoration of the natural system.
Gao, Yuan; Zhang, Chuanrong; He, Qingsong; Liu, Yaolin
2017-06-15
Ecological security is an important research topic, especially urban ecological security. As densely populated ecosystems, cities tend to have fragile ecological environments. However, most of the research on urban ecological security in the literature has focused on evaluating the current or past status of the ecological environment; very little has carried out simulation or prediction of future ecological security, and even less has explored the urban ecological environment at a fine scale. To fill in this gap, in this study we simulated and predicted urban ecological security at a fine (district) scale using an improved cellular automata (CA) approach. First we used the pressure-state-response (PSR) method based on grid-scale data to evaluate urban ecological security. Then, based on the evaluation results, we imported the geographically weighted regression (GWR) concept into the CA model to simulate and predict urban ecological security. We applied the improved CA approach in a case study, simulating and predicting urban ecological security for the city of Wuhan in Central China. Comparing the simulated ecological security values for 2010 from the improved CA model with the actual 2010 values yielded a relatively high kappa coefficient, indicating that the CA model can simulate and predict future development of ecological security in Wuhan well. Based on the prediction results for 2020, we made policy recommendations for each district in Wuhan.
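A generic CA transition step, with each cell's security score relaxing toward its neighbourhood mean, is sketched below for concreteness. The paper's actual transition rules are derived from GWR and are not reproduced here; the blending weight is an invented parameter.

```python
import numpy as np

def ca_step(grid, alpha=0.6):
    """One cellular-automaton step on a grid of ecological-security scores:
    each cell moves toward a blend of its own value and its 4-neighbour
    mean (reflective edges). Generic sketch, not the paper's GWR-CA rule."""
    padded = np.pad(grid, 1, mode="edge")
    neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
             padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    return alpha * grid + (1 - alpha) * neigh
```

Iterating such a step from a PSR-scored base year, and comparing the result against observed scores with a kappa statistic, mirrors the validation loop described in the abstract.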
Rupture of DNA aptamer: New insights from simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, Rakesh Kumar; Nath, Shesh; Kumar, Sanjay
2015-10-28
Base-pockets (non-complementary base-pairs) in double-stranded DNA play a crucial role in biological processes. Because of thermal fluctuations, they can lower the stability of DNA, whereas in the case of DNA aptamers, small molecules, e.g., adenosine monophosphate and adenosine triphosphate, form additional hydrogen bonds with base-pockets, termed "binding-pockets," which enhance the stability. Using Langevin dynamics simulations of a coarse-grained model of DNA followed by atomistic simulations, we investigated the influence of the base-pocket and binding-pocket on the stability of the DNA aptamer. Striking differences are reported here for separation induced by temperature and by force, which require further investigation by single-molecule experiments.
Temporal bone dissection simulator for training pediatric otolaryngology surgeons
NASA Astrophysics Data System (ADS)
Tabrizi, Pooneh R.; Sang, Hongqiang; Talari, Hadi F.; Preciado, Diego; Monfaredi, Reza; Reilly, Brian; Arikatla, Sreekanth; Enquobahrie, Andinet; Cleary, Kevin
2017-03-01
Cochlear implantation is the standard of care for infants born with severe hearing loss. Current guidelines approve the surgical placement of implants as early as 12 months of age. Implantation at a younger age poses a greater surgical challenge, since the underdeveloped mastoid tip, along with thin calvarial bone, creates less room for surgical navigation and can result in increased surgical risk. We have been developing a temporal bone dissection simulator based on actual clinical cases for training otolaryngology fellows in this delicate procedure. The simulator system is based on pre-procedure CT (Computed Tomography) images from pediatric infant cases (<12 months old) at our hospital. The simulator includes: (1) a simulation engine providing a virtual-reality temporal bone surgery environment, (2) a newly developed haptic interface for holding the surgical drill, (3) an Oculus Rift providing a microscope-like view of the temporal bone surgery, and (4) a user interface for interacting with the simulator through the Oculus Rift and the haptic device. To evaluate the system, we have collected 10 representative CT data sets and segmented the key structures: cochlea, round window, facial nerve, and ossicles. The simulator will present these key structures to the user and warn the user if needed by continuously calculating the distances between the tip of the surgical drill and the key structures.
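The proximity check at the heart of the warning mechanism, the distance from the drill tip to a segmented structure, can be sketched as a nearest-point query. Representing each structure as a point cloud of segmented voxel centres is a simplifying assumption; a production simulator would use spatial indexing or signed distance fields for speed.

```python
import math

def min_distance(tip, structure_points):
    """Smallest Euclidean distance from the drill tip to a key structure
    represented as a 3D point cloud (e.g. segmented facial-nerve voxels).
    Brute-force sketch of the quantity monitored to trigger warnings."""
    return min(math.dist(tip, p) for p in structure_points)

# hypothetical coordinates in mm
d = min_distance((0.0, 0.0, 0.0), [(3.0, 4.0, 0.0), (10.0, 0.0, 0.0)])
```

Comparing this distance against a per-structure safety margin each frame is enough to drive a simple visual or haptic warning.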
DAMS: A Model to Assess Domino Effects by Using Agent-Based Modeling and Simulation.
Zhang, Laobing; Landucci, Gabriele; Reniers, Genserik; Khakzad, Nima; Zhou, Jianfeng
2017-12-19
Historical data analysis shows that escalation accidents, so-called domino effects, have an important role in disastrous accidents in the chemical and process industries. In this study, an agent-based modeling and simulation approach is proposed to study the propagation of domino effects in the chemical and process industries. Different from the analytical or Monte Carlo simulation approaches, which normally study the domino effect at probabilistic network levels, the agent-based modeling technique explains the domino effects from a bottom-up perspective. In this approach, the installations involved in a domino effect are modeled as agents whereas the interactions among the installations (e.g., by means of heat radiation) are modeled via the basic rules of the agents. Application of the developed model to several case studies demonstrates the ability of the model not only in modeling higher-level domino effects and synergistic effects but also in accounting for temporal dependencies. The model can readily be applied to large-scale complicated cases. © 2017 Society for Risk Analysis.
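The bottom-up propagation logic can be illustrated with a deterministic toy: each intact installation fails once the summed heat radiation it receives from already-failed units exceeds its threshold, which also captures the synergistic effect of multiple burning units. The matrix and thresholds below are invented, and the real model's probabilistic and temporal behaviour is omitted.

```python
def propagate_domino(heat_matrix, thresholds, initial_failed):
    """Deterministic sketch of domino propagation among installations.
    heat_matrix[i][j]: radiation unit j receives if unit i has failed.
    A unit fails once its total received radiation exceeds its threshold.
    Iterates until no further escalation occurs."""
    failed = set(initial_failed)
    changed = True
    while changed:
        changed = False
        for j, limit in enumerate(thresholds):
            if j in failed:
                continue
            received = sum(heat_matrix[i][j] for i in failed)
            if received > limit:
                failed.add(j)
                changed = True
    return failed
```

In the agent-based framing, each installation would be an agent holding its own threshold and state, with the matrix entries arising from the agents' interaction rules rather than being given up front.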
Dafflon, B.; Barrash, W.
2012-05-30
Fragment: a comparison of simulated annealing-based and Bayesian sequential simulation approaches for porosity estimation (received 13 May 2011; revised 12 March 2012; accepted 17 April 2012). For both cases, the withheld porosity log is used for validation at two wells with locally variable stratigraphy, with comparison against the contacts between Units 1 to 4 at the BHRS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, H; Manning, M; Sintay, B
Purpose: Tumor motion in lung SBRT is typically managed by creating an internal target volume (ITV) based on 4D-CT information. Another option, which may reduce lung dose and imaging artifact, is to use a breath hold (BH) during simulation and delivery. Here we evaluate the reproducibility of tumor position at repeated BH using a newly released spirometry system. Methods: Three patients underwent multiple BH CTs at simulation. All patients underwent a BH cone beam CT (CBCT) prior to each treatment. All image sets were registered to a patient’s first simulation CT based on local bony anatomy. The gross tumor volume (GTV), and the diaphragm or the apex of the lung, were contoured on the first image set and expanded in 1 mm increments until the GTVs and diaphragms on all image sets were included inside an expanded structure. The GTV and diaphragm margins necessary to encompass the structures were recorded. Results: The first patient underwent 2 BH CTs and fluoroscopy at simulation; the remaining patients underwent 3 BH CTs at simulation. In all cases the GTVs remained within 1 mm expansions and the diaphragms remained within 2 mm expansions on repeat scans. Each patient underwent 3 daily BH CBCTs. In all cases the GTVs remained within 2 mm expansions, and the diaphragms (or lung apex in one case) remained within 2 mm expansions at daily BH imaging. Conclusions: These case studies demonstrate spirometry as an effective tool for limiting tumor motion (and imaging artifact) and facilitating reproducible tumor positioning over multiple set-ups and BHs. This work was partially supported by Qfix.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rizzi, Silvio; Hereld, Mark; Insley, Joseph
In this work we perform in-situ visualization of molecular dynamics simulations, which can help scientists visualize simulation output on-the-fly, without incurring storage overheads. We present a case study coupling LAMMPS, the large-scale molecular dynamics simulation code, with vl3, our parallel framework for large-scale visualization and analysis. Our motivation is to identify effective approaches for covisualization and exploration of large-scale atomistic simulations at interactive frame rates. We propose a system of coupled libraries and describe its architecture, with an implementation that runs on GPU-based clusters. We present the results of strong and weak scalability experiments, as well as future research avenues based on our results.
Accelerated Monte Carlo Simulation on the Chemical Stage in Water Radiolysis using GPU
Tian, Zhen; Jiang, Steve B.; Jia, Xun
2018-01-01
The accurate simulation of water radiolysis is an important step to understand the mechanisms of radiobiology and quantitatively test some hypotheses regarding radiobiological effects. However, the simulation of water radiolysis is highly time consuming, taking hours or even days to be completed by a conventional CPU processor. This time limitation hinders cell-level simulations for a number of research studies. We recently initiated efforts to develop gMicroMC, a GPU-based fast microscopic MC simulation package for water radiolysis. The first step of this project focused on accelerating the simulation of the chemical stage, the most time consuming stage in the entire water radiolysis process. A GPU-friendly parallelization strategy was designed to address the highly correlated many-body simulation problem caused by the mutual competitive chemical reactions between the radiolytic molecules. Two cases were tested, using a 750 keV electron and a 5 MeV proton incident in pure water, respectively. The time-dependent yields of all the radiolytic species during the chemical stage were used to evaluate the accuracy of the simulation. The relative differences between our simulation and the Geant4-DNA simulation were on average 5.3% and 4.4% for the two cases. Our package, executed on an Nvidia Titan black GPU card, successfully completed the chemical stage simulation of the two cases within 599.2 s and 489.0 s. As compared with Geant4-DNA that was executed on an Intel i7-5500U CPU processor and needed 28.6 h and 26.8 h for the two cases using a single CPU core, our package achieved a speed-up factor of 171.1-197.2. PMID:28323637
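The chemical-stage logic being accelerated, namely diffusing each radiolytic species and then letting sufficiently close pairs react, can be sketched serially in a few lines. This is a toy of the general diffusion-reaction scheme, not gMicroMC's GPU algorithm, and the step size and reaction radius are illustrative.

```python
import math
import random

def diffuse_and_react(positions, diff_sigma, reaction_radius, rng):
    """One time step of a toy chemical-stage model: each species takes an
    isotropic Gaussian Brownian step, then any pair closer than the
    reaction radius is removed (treated as having reacted). The pairwise
    check is the mutually competitive part that is hard to parallelize."""
    moved = [tuple(c + rng.gauss(0.0, diff_sigma) for c in p)
             for p in positions]
    reacted = set()
    for i in range(len(moved)):
        for j in range(i + 1, len(moved)):
            if i in reacted or j in reacted:
                continue
            if math.dist(moved[i], moved[j]) < reaction_radius:
                reacted.update((i, j))
    return [p for k, p in enumerate(moved) if k not in reacted]
```

The serial pair loop is O(n²); the GPU strategy described in the paper exists precisely because this correlated many-body step dominates the runtime.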
Accelerated Monte Carlo simulation on the chemical stage in water radiolysis using GPU
NASA Astrophysics Data System (ADS)
Tian, Zhen; Jiang, Steve B.; Jia, Xun
2017-04-01
The accurate simulation of water radiolysis is an important step to understand the mechanisms of radiobiology and quantitatively test some hypotheses regarding radiobiological effects. However, the simulation of water radiolysis is highly time consuming, taking hours or even days to be completed by a conventional CPU processor. This time limitation hinders cell-level simulations for a number of research studies. We recently initiated efforts to develop gMicroMC, a GPU-based fast microscopic MC simulation package for water radiolysis. The first step of this project focused on accelerating the simulation of the chemical stage, the most time consuming stage in the entire water radiolysis process. A GPU-friendly parallelization strategy was designed to address the highly correlated many-body simulation problem caused by the mutual competitive chemical reactions between the radiolytic molecules. Two cases were tested, using a 750 keV electron and a 5 MeV proton incident in pure water, respectively. The time-dependent yields of all the radiolytic species during the chemical stage were used to evaluate the accuracy of the simulation. The relative differences between our simulation and the Geant4-DNA simulation were on average 5.3% and 4.4% for the two cases. Our package, executed on an Nvidia Titan black GPU card, successfully completed the chemical stage simulation of the two cases within 599.2 s and 489.0 s. As compared with Geant4-DNA that was executed on an Intel i7-5500U CPU processor and needed 28.6 h and 26.8 h for the two cases using a single CPU core, our package achieved a speed-up factor of 171.1-197.2.
Nonlattice simulation for supersymmetric gauge theories in one dimension.
Hanada, Masanori; Nishimura, Jun; Takeuchi, Shingo
2007-10-19
Lattice simulation of supersymmetric gauge theories is not straightforward. In some cases the lack of manifest supersymmetry just necessitates cumbersome fine-tuning, but in the worst cases the chiral and/or Majorana nature of fermions makes it difficult to even formulate an appropriate lattice theory. We propose circumventing all these problems inherent in the lattice approach by adopting a nonlattice approach for one-dimensional supersymmetric gauge theories, which are important in the string or M theory context. In particular, our method can be used to investigate the gauge-gravity duality from first principles, and to simulate M theory based on the matrix theory conjecture.
Schwenke, Michael; Georgii, Joachim; Preusser, Tobias
2017-07-01
Focused ultrasound (FUS) is rapidly gaining clinical acceptance for several target tissues in the human body. Yet, treating liver targets is not clinically applied due to the high complexity of the procedure (noninvasiveness, target motion, complex anatomy, blood cooling effects, shielding by ribs, and limited image-based monitoring). To reduce the complexity, numerical FUS simulations can be utilized for both treatment planning and execution. These use-cases demand highly accurate and computationally efficient simulations. We propose a numerical method for the simulation of abdominal FUS treatments during respiratory motion of the organs and target. In particular, a novel approach is proposed to simulate the heating during motion by solving Pennes' bioheat equation in a computational reference space, i.e., the equation is mathematically transformed to the reference. The approach allows for motion discontinuities, e.g., the sliding of the liver along the abdominal wall. Implementing the solver completely on the graphics processing unit and combining it with an atlas-based ultrasound simulation approach yields a simulation performance faster than real time (less than 50-s computing time for 100 s of treatment time) on a modern off-the-shelf laptop. The simulation method is incorporated into a treatment planning demonstration application that allows simulation of real patient cases including respiratory motion. The high performance of the presented simulation method opens the door to clinical applications. The methods bear the potential to enable the application of FUS for moving organs.
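To illustrate the bioheat model underlying such simulators, the following is a minimal 1-D explicit finite-difference sketch of Pennes' equation. All material constants, the source strength, and the grid are generic placeholders for illustration, not the paper's model or its GPU implementation:

```python
import numpy as np

# Pennes' bioheat equation:  rho*c*dT/dt = k*d2T/dx2 + w*(Ta - T) + Q
# solved with an explicit scheme on a 1-D tissue slab (placeholder values).
rho_c = 3.6e6      # volumetric heat capacity rho*c  [J/(m^3 K)]
k     = 0.5        # thermal conductivity            [W/(m K)]
w     = 2e3        # perfusion term w_b*rho_b*c_b    [W/(m^3 K)]
Ta    = 37.0       # arterial blood temperature      [degC]

nx, dx = 101, 1e-3                   # 10 cm of tissue
dt = 0.4 * rho_c * dx**2 / (2 * k)   # inside the explicit stability limit
T = np.full(nx, 37.0)
Q = np.zeros(nx)
Q[45:56] = 2.5e4                     # focal heat source [W/m^3]

def step(T):
    lap = np.zeros_like(T)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    Tn = T + dt / rho_c * (k * lap + w * (Ta - T) + Q)
    Tn[0] = Tn[-1] = 37.0            # body-temperature boundaries
    return Tn

for _ in range(2000):
    T = step(T)
print(f"peak temperature after {2000*dt:.0f} s: {T.max():.2f} degC")
```

Perfusion bounds the steady-state rise at roughly Q/w above arterial temperature, which is why the peak stays in a hyperthermia-like range here.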
Convective Systems over the South China Sea: Cloud-Resolving Model Simulations.
NASA Astrophysics Data System (ADS)
Tao, W.-K.; Shie, C.-L.; Simpson, J.; Braun, S.; Johnson, R. H.; Ciesielski, P. E.
2003-12-01
The two-dimensional version of the Goddard Cumulus Ensemble (GCE) model is used to simulate two South China Sea Monsoon Experiment (SCSMEX) convective periods [18–26 May (prior to and during the monsoon onset) and 2–11 June (after the onset of the monsoon) 1998]. Observed large-scale advective tendencies for potential temperature, water vapor mixing ratio, and horizontal momentum are used as the main forcing in governing the GCE model in a semiprognostic manner. The June SCSMEX case has stronger forcing in both temperature and water vapor, stronger low-level vertical shear of the horizontal wind, and larger convective available potential energy (CAPE). The temporal variation of the model-simulated rainfall, time- and domain-averaged heating, and moisture budgets compares well to those diagnostically determined from soundings. However, the model results have a higher temporal variability. The model underestimates the rainfall by 17% to 20% compared to that based on soundings. The GCE model-simulated rainfall for June is in very good agreement with the Tropical Rainfall Measuring Mission (TRMM) precipitation radar (PR) and the Global Precipitation Climatology Project (GPCP). Overall, the model agrees better with observations for the June case than for the May case. The model-simulated energy budgets indicate that the two largest terms for both cases are net condensation (heating/drying) and imposed large-scale forcing (cooling/moistening); these two terms are opposite in sign, however. The model results also show larger latent heat fluxes for the May case, yet more rainfall is simulated for the June case. Net radiation (solar heating and longwave cooling) amounts to about 34% and 25% of the net condensation (condensation minus evaporation) for the May and June cases, respectively. Sensible heat fluxes do not contribute to rainfall in either of the SCSMEX cases.
Two types of organized convective systems, unicell (May case) and multicell (June case), are simulated by the model. They are determined by the observed mean U wind shear (unidirectional versus reverse shear profiles above midlevels). Several sensitivity tests are performed to examine the impact of the radiation, microphysics, and large-scale mean horizontal wind on the organization and intensity of the SCSMEX convective systems.
NASA Astrophysics Data System (ADS)
Deng, Shuang; Xiang, Wenting; Tian, Yangge
2009-10-01
Map coloring is a difficult task even for experienced map experts. In GIS projects, maps usually need to be colored according to customer requirements, which makes the work more complex. With the development of GIS, more and more programmers join project teams; lacking training in cartography, they find it harder to produce colored maps that meet customer requirements. Experience shows that customers with similar backgrounds usually have similar tastes in map coloring. We therefore developed a GIS color-scheme decision-making system that can select color schemes of similar customers from a case base for customers to review and adjust. The system is a mixed B/S-C/S system; the client side uses JSP, which makes it possible for the system developers to remotely call the color-scheme cases in the database server and communicate with customers. Unlike general case-based reasoning, even very similar customers may make different selections, so it is hard to provide a single "best" option. We therefore use a Simulated Annealing Algorithm (SAA) to arrange the order in which different color schemes are presented. Customers can also dynamically adjust the colors of certain features based on an existing case. The results show that the system facilitates communication between the designers and the customers and improves the quality and efficiency of map coloring.
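A minimal sketch of the kind of simulated-annealing ordering described above; the scheme vectors, the customer-profile vector, and the position-weighted cost function are all invented for illustration, since the paper's case-base details are not given here:

```python
import math, random

random.seed(1)

# Hypothetical setup: present colour schemes so that those closest to a
# customer-profile vector tend to surface first; anneal over permutations.
profile = (0.8, 0.2, 0.5)
schemes = [tuple(random.random() for _ in range(3)) for _ in range(8)]

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def cost(order):
    # earlier presentation positions carry larger weights
    n = len(order)
    return sum((n - i) * dist(schemes[j], profile) for i, j in enumerate(order))

cur = list(range(len(schemes)))
cur_cost = cost(cur)
best, best_cost = cur[:], cur_cost
T = 1.0
for _ in range(5000):
    i, j = random.sample(range(len(cur)), 2)
    cand = cur[:]
    cand[i], cand[j] = cand[j], cand[i]          # propose a swap
    c = cost(cand)
    if c < cur_cost or random.random() < math.exp((cur_cost - c) / T):
        cur, cur_cost = cand, c                  # Metropolis acceptance
        if c < best_cost:
            best, best_cost = cand[:], c
    T *= 0.999                                   # geometric cooling
print("presentation order:", best)
```

For this weighted cost the global optimum is simply the schemes sorted by distance to the profile (rearrangement inequality), which makes the sketch easy to verify.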
NASA Technical Reports Server (NTRS)
Cheng, Anning; Xu, Kuan-Man
2006-01-01
The abilities of cloud-resolving models (CRMs) with the double-Gaussian based and the single-Gaussian based third-order closures (TOCs) to simulate the shallow cumuli and their transition to deep convective clouds are compared in this study. The single-Gaussian based TOC is fully prognostic (FP), while the double-Gaussian based TOC is partially prognostic (PP). The latter only predicts three important third-order moments while the former predicts all the third-order moments. A shallow cumulus case is simulated by single-column versions of the FP and PP TOC models. The PP TOC improves the simulation of shallow cumulus greatly over the FP TOC by producing more realistic cloud structures. Large differences between the FP and PP TOC simulations appear in the cloud layer of the second- and third-order moments, which are related mainly to the underestimate of the cloud height in the FP TOC simulation. Sensitivity experiments and analysis of probability density functions (PDFs) used in the TOCs show that both the turbulence-scale condensation and higher-order moments are important to realistic simulations of the boundary-layer shallow cumuli. A shallow to deep convective cloud transition case is also simulated by the 2-D versions of the FP and PP TOC models. Both CRMs can capture the transition from the shallow cumuli to deep convective clouds. The PP simulations produce more and deeper shallow cumuli than the FP simulations, but the FP simulations produce larger and wider convective clouds than the PP simulations. The temporal evolutions of cloud and precipitation are closely related to the turbulent transport, the cold pool and the cloud-scale circulation. The large amount of turbulent mixing associated with the shallow cumuli slows down the increase of the convective available potential energy and inhibits the early transition to deep convective clouds in the PP simulation.
When the deep convective clouds fully develop and the precipitation is produced, the cold pools produced by the evaporation of the precipitation are not favorable to the formation of shallow cumuli.
Operational framework for quantum measurement simulability
NASA Astrophysics Data System (ADS)
Guerini, Leonardo; Bavaresco, Jessica; Terra Cunha, Marcelo; Acín, Antonio
2017-09-01
We introduce a framework for simulating quantum measurements based on classical processing of a set of accessible measurements. Well-known concepts such as joint measurability and projective simulability naturally emerge as particular cases of our framework, but our study also leads to novel results and questions. First, a generalisation of joint measurability is derived, which yields a hierarchy for the incompatibility of sets of measurements. A similar hierarchy is defined based on the number of outcomes necessary to perform a simulation of a given measurement. This general approach also allows us to identify connections between different kinds of simulability and, in particular, we characterise the qubit measurements that are projective-simulable in terms of joint measurability. Finally, we discuss how our framework can be interpreted in the context of resource theories.
Evaluation of Inventory Reduction Strategies: Balad Air Base Case Study
2012-03-01
produced by conducting individual simulations using a unique random seed generated by the default AnyLogic® random number generator. The...develops an agent-based simulation model of the sustainment supply chain supporting Balad AB during its closure using the software AnyLogic®. The...research. The goal of USAF Stockage Policy is to maximize customer support while minimizing inventory costs (DAF, 2011:1). USAF stocking decisions
NASA Astrophysics Data System (ADS)
Jalali, Mohammad; Ramazi, Hamidreza
2018-04-01
This article is devoted to the application of a simulation algorithm based on geostatistical methods to compile and update seismotectonic provinces, with Iran chosen as a case study. Traditionally, tectonic maps together with seismological data and information (e.g., earthquake catalogues, earthquake mechanisms, and microseismic data) have been used to update seismotectonic provinces. In many cases, incomplete earthquake catalogues are one of the important challenges in this procedure. To overcome this problem, a geostatistical simulation algorithm, turning band simulation (TBSIM), was applied to generate synthetic data to improve incomplete earthquake catalogues. The synthetic data were then added to the traditional information to study the seismicity homogeneity and classify the areas according to tectonic and seismic properties to update seismotectonic provinces. In this paper, (i) different magnitude types in the studied catalogues have been homogenized to moment magnitude (Mw), and earthquake declustering was then carried out to remove aftershocks and foreshocks; (ii) a time normalization method was introduced to decrease the uncertainty in the temporal domain prior to starting the simulation procedure; (iii) variography has been carried out in each subregion to study spatial regressions (e.g., the west-southwestern area showed a spatial regression from 0.4 to 1.4 decimal degrees; the maximum range was identified in the azimuth of 135 ± 10); (iv) the TBSIM algorithm was then applied, generating 68,800 synthetic events according to the spatial regression found in several directions; (v) simulated events (i.e., magnitudes) were classified based on their intensity in ArcGIS packages, and homogeneous seismic zones have been determined.
Finally, according to the synthetic data, tectonic features, and actual earthquake catalogues, 17 seismotectonic provinces were introduced in four major classes introduced as very high, high, moderate, and low seismic potential provinces. Seismotectonic properties of very high seismic potential provinces have been also presented.
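The variography step can be illustrated with a toy empirical semivariogram; the synthetic 1-D coordinates, values, and lag bins below are invented for illustration and have nothing to do with the Iranian catalogue:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic spatially correlated values at random 1-D coordinates.
x = rng.uniform(0, 10, 300)                       # coordinates (degrees)
z = np.sin(x) + 0.3 * rng.standard_normal(300)    # correlated signal + nugget

def semivariogram(x, z, lags):
    """Empirical semivariogram: gamma(h) = mean of 0.5*(z_i - z_j)^2
    over point pairs whose separation falls in each lag bin."""
    d = np.abs(x[:, None] - x[None, :])           # pairwise distances
    sq = (z[:, None] - z[None, :]) ** 2
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        m = (d > lo) & (d <= hi)                  # excludes self-pairs
        gamma.append(0.5 * sq[m].mean())
    return np.array(gamma)

lags = np.linspace(0, 3, 7)
g = semivariogram(x, z, lags)
print(np.round(g, 3))
```

For correlated data the semivariogram rises with lag toward the sill, which is the structure a range/azimuth analysis like the paper's builds on.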
Cotton, Cary C; Erim, Daniel; Eluri, Swathi; Palmer, Sarah H; Green, Daniel J; Wolf, W Asher; Runge, Thomas M; Wheeler, Stephanie; Shaheen, Nicholas J; Dellon, Evan S
2017-06-01
Topical corticosteroids or dietary elimination are recommended as first-line therapies for eosinophilic esophagitis, but data to directly compare these therapies are scant. We performed a cost utility comparison of topical corticosteroids and the 6-food elimination diet (SFED) in treatment of eosinophilic esophagitis, from the payer perspective. We used a modified Markov model based on current clinical guidelines, in which transition between states depended on histologic response simulated at the individual cohort-member level. Simulation parameters were defined by systematic review and meta-analysis to determine the base-case estimates and bounds of uncertainty for sensitivity analysis. Meta-regression models included adjustment for differences in study and cohort characteristics. In the base-case scenario, topical fluticasone was about as effective as SFED but more expensive at a 5-year time horizon ($9261.58 vs $5719.72 per person). SFED was more effective and less expensive than topical fluticasone and topical budesonide in the base-case scenario. Probabilistic sensitivity analysis revealed little uncertainty in relative treatment effectiveness. There was somewhat greater uncertainty in the relative cost of treatments; most simulations found SFED to be less expensive. In a cost utility analysis comparing topical corticosteroids and SFED for first-line treatment of eosinophilic esophagitis, the therapies were similar in effectiveness. SFED was on average less expensive, and more cost effective in most simulations, than topical budesonide and topical fluticasone, from a payer perspective and not accounting for patient-level costs or quality of life. Copyright © 2017 AGA Institute. Published by Elsevier Inc. All rights reserved.
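The individual-level Markov structure can be sketched as follows. All transition probabilities and per-cycle costs here are invented placeholders for illustration, not the meta-analytic estimates or the quoted base-case totals from the study:

```python
import random

random.seed(0)

# Toy individual-level Markov cohort: patients cycle between histologic
# remission and active disease; each strategy has a per-cycle payer cost.
def simulate(p_respond, p_relapse, cost_per_cycle, cycles=20, n=10_000):
    total_cost = total_remission = 0
    for _ in range(n):
        in_remission = False                     # start with active disease
        for _ in range(cycles):
            if in_remission:
                in_remission = random.random() > p_relapse
            else:
                in_remission = random.random() < p_respond
            total_cost += cost_per_cycle
            total_remission += in_remission
    return total_cost / n, total_remission / n   # mean cost, remission cycles

cost_a, eff_a = simulate(0.6, 0.2, 120.0)   # "steroid-like" strategy
cost_b, eff_b = simulate(0.6, 0.2, 75.0)    # "diet-like" strategy
print(f"A: ${cost_a:,.0f} for {eff_a:.1f} remission cycles")
print(f"B: ${cost_b:,.0f} for {eff_b:.1f} remission cycles")
```

With equal effectiveness and a lower per-cycle cost, the second strategy dominates, mirroring the qualitative shape of the base-case result.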
Sechopoulos, Ioannis; Ali, Elsayed S M; Badal, Andreu; Badano, Aldo; Boone, John M; Kyprianou, Iacovos S; Mainegra-Hing, Ernesto; McMillan, Kyle L; McNitt-Gray, Michael F; Rogers, D W O; Samei, Ehsan; Turner, Adam C
2015-10-01
The use of Monte Carlo simulations in diagnostic medical imaging research is widespread due to its flexibility and ability to estimate quantities that are challenging to measure empirically. However, any new Monte Carlo simulation code needs to be validated before it can be used reliably. The type and degree of validation required depends on the goals of the research project, but, typically, such validation involves either comparison of simulation results to physical measurements or to previously published results obtained with established Monte Carlo codes. The former is complicated due to nuances of experimental conditions and uncertainty, while the latter is challenging due to typical graphical presentation and lack of simulation details in previous publications. In addition, entering the field of Monte Carlo simulations in general involves a steep learning curve. It is not a simple task to learn how to program and interpret a Monte Carlo simulation, even when using one of the publicly available code packages. This Task Group report provides a common reference for benchmarking Monte Carlo simulations across a range of Monte Carlo codes and simulation scenarios. In the report, all simulation conditions are provided for six different Monte Carlo simulation cases that involve common x-ray based imaging research areas. The results obtained for the six cases using four publicly available Monte Carlo software packages are included in tabular form. In addition to a full description of all simulation conditions and results, a discussion and comparison of results among the Monte Carlo packages and the lessons learned during the compilation of these results are included. This abridged version of the report includes only an introductory description of the six cases and a brief example of the results of one of the cases. 
This work provides an investigator the necessary information to benchmark his/her Monte Carlo simulation software against the reference cases included here before performing his/her own novel research. In addition, an investigator entering the field of Monte Carlo simulations can use these descriptions and results as a self-teaching tool to ensure that he/she is able to perform a specific simulation correctly. Finally, educators can assign these cases as learning projects as part of course objectives or training programs.
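The validation pattern the report describes (comparing Monte Carlo output against a trusted reference) can be illustrated with a toy narrow-beam transmission benchmark; the attenuation coefficient and slab thickness are arbitrary and this is not one of the report's six cases:

```python
import math, random

random.seed(42)

# Toy benchmark: Monte Carlo estimate of narrow-beam photon transmission
# through a slab, checked against the analytic Beer-Lambert value.
mu, t = 0.2, 3.0        # attenuation coefficient [1/cm], thickness [cm]
n = 200_000

# A photon's free path is exponentially distributed with rate mu;
# it is transmitted if the sampled path exceeds the slab thickness.
transmitted = sum(random.expovariate(mu) > t for _ in range(n))
mc = transmitted / n
analytic = math.exp(-mu * t)
rel_diff = abs(mc - analytic) / analytic
print(f"MC {mc:.4f} vs analytic {analytic:.4f} ({100*rel_diff:.2f}% diff)")
```

The statistical uncertainty here is binomial, so agreement to well under a percent is expected at this sample size; disagreement beyond a few standard errors would flag a coding or physics error, which is exactly the logic of benchmarking against reference cases.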
Dankbaar, Mary E W; Alsma, Jelmer; Jansen, Els E H; van Merrienboer, Jeroen J G; van Saase, Jan L C M; Schuit, Stephanie C E
2016-08-01
Simulation games are becoming increasingly popular in education, but more insight in their critical design features is needed. This study investigated the effects of fidelity of open patient cases in adjunct to an instructional e-module on students' cognitive skills and motivation. We set up a three-group randomized post-test-only design: a control group working on an e-module; a cases group, combining the e-module with low-fidelity text-based patient cases, and a game group, combining the e-module with a high-fidelity simulation game with the same cases. Participants completed questionnaires on cognitive load and motivation. After a 4-week study period, blinded assessors rated students' cognitive emergency care skills in two mannequin-based scenarios. In total 61 students participated and were assessed; 16 control group students, 20 cases students and 25 game students. Learning time was 2 h longer for the cases and game groups than for the control group. Acquired cognitive skills did not differ between groups. The game group experienced higher intrinsic and germane cognitive load than the cases group (p = 0.03 and 0.01) and felt more engaged (p < 0.001). Students did not profit from working on open cases (in adjunct to an e-module), which nonetheless challenged them to study longer. The e-module appeared to be very effective, while the high-fidelity game, although engaging, probably distracted students and impeded learning. Medical educators designing motivating and effective skills training for novices should align case complexity and fidelity with students' proficiency level. The relation between case-fidelity, motivation and skills development is an important field for further study.
Zierler, R Eugene; Leotta, Daniel F; Sansom, Kurt; Aliseda, Alberto; Anderson, Mark D; Sheehan, Florence H
2016-07-01
Duplex ultrasound scanning with B-mode imaging and both color Doppler and Doppler spectral waveforms is relied upon for diagnosis of vascular pathology and selection of patients for further evaluation and treatment. In most duplex ultrasound applications, classification of disease severity is based primarily on alterations in blood flow velocities, particularly the peak systolic velocity (PSV) obtained from Doppler spectral waveforms. We developed a duplex ultrasound simulator for training and assessment of scanning skills. Duplex ultrasound cases were prepared from 2-dimensional (2D) images of normal and stenotic carotid arteries by reconstructing the common carotid, internal carotid, and external carotid arteries in 3 dimensions and computationally simulating blood flow velocity fields within the lumen. The simulator displays a 2D B-mode image corresponding to transducer position on a mannequin, overlaid by color coding of velocity data. A spectral waveform is generated according to examiner-defined settings (depth and size of the Doppler sample volume, beam steering, Doppler beam angle, and pulse repetition frequency or scale). The accuracy of the simulator was assessed by comparing the PSV measured from the spectral waveforms with the true PSV which was derived from the computational flow model based on the size and location of the sample volume within the artery. Three expert examiners made a total of 36 carotid artery PSV measurements based on the simulated cases. The PSV measured by the examiners deviated from true PSV by 8% ± 5% (N = 36). The deviation in PSV did not differ significantly between artery segments, normal and stenotic arteries, or examiners. To our knowledge, this is the first simulation of duplex ultrasound that can create and display real-time color Doppler images and Doppler spectral waveforms. 
The results demonstrate that an examiner can measure PSV from the spectral waveforms using the settings on the simulator with a mean absolute error in the velocity measurement of less than 10%. With the addition of cases with a range of pathologies, this duplex ultrasound simulator will be a useful tool for training health-care providers in vascular ultrasound applications and for assessing their skills in an objective and quantitative manner. © The Author(s) 2016.
Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Muravyov, Alexander A.
2000-01-01
A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite element based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.
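The equivalent-linearization idea can be illustrated on a single-degree-of-freedom Duffing oscillator rather than the paper's finite-element beam model; all parameter values are arbitrary:

```python
import math

# Stochastic equivalent linearization for  m*x'' + c*x' + k*x + eps*x^3 = w(t)
# under white noise with two-sided spectral density S0.  Under the Gaussian
# response assumption the equivalent stiffness is k_eq = k + 3*eps*E[x^2],
# and a linear SDOF under white noise gives E[x^2] = pi*S0/(c*k_eq), so the
# response variance follows from a fixed-point iteration.
c, k, eps, S0 = 0.5, 1.0, 2.0, 0.1

var = math.pi * S0 / (c * k)          # linear (eps = 0) starting guess
for _ in range(100):
    k_eq = k + 3.0 * eps * var        # update equivalent stiffness
    var = math.pi * S0 / (c * k_eq)   # variance of the linearized system
print(f"k_eq = {k_eq:.4f}, E[x^2] = {var:.4f}")
```

The converged variance solves the quadratic 3*eps*var**2 + k*var = pi*S0/c, which provides a closed-form check of the iteration.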
Quantitative risk management in gas injection project: a case study from Oman oil and gas industry
NASA Astrophysics Data System (ADS)
Khadem, Mohammad Miftaur Rahman Khan; Piya, Sujan; Shamsuzzoha, Ahm
2017-09-01
The purpose of this research was to study the recognition, application and quantification of the risks associated with managing projects. In this research, the management of risks in an oil and gas project is studied and implemented within a case company in Oman. In this study, at first, the qualitative data related to risks in the project were identified through field visits and extensive interviews. These data were then translated into numerical values based on experts' opinions. Further, the numerical data were used as an input to Monte Carlo simulation. RiskyProject Professional™ software was used to simulate the system based on the identified risks. The simulation result predicted a delay of about 2 years in the worst case, with no chance of meeting the project's on-stream date. It also predicted an 8% chance of exceeding the total estimated budget. The result of the numerical analysis from the proposed model was validated by comparing it with the result of the qualitative analysis, which was obtained through discussion with various project managers of the company.
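A schedule-risk Monte Carlo of the general kind run in tools like RiskyProject can be sketched with triangular task-duration distributions; the tasks, durations, and planned date below are invented for illustration, not the Oman project's data:

```python
import random

random.seed(7)

# Invented sequential tasks: (optimistic, most likely, pessimistic) months.
tasks = [(4, 6, 12), (3, 5, 9), (6, 8, 16)]
plan = 21.0                    # planned total duration [months]

n = 50_000
late = 0
for _ in range(n):
    # sample each task duration from a triangular distribution and sum
    total = sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)
    late += total > plan
print(f"P(delay beyond plan) ~ {late / n:.2f}")
```

Outputs of this shape (probability of exceeding a planned date or budget) are the quantities the study reports from its simulation.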
NASA Astrophysics Data System (ADS)
Maghami, Mahsa; Sukthankar, Gita
In this paper, we introduce an agent-based simulation for investigating the impact of social factors on the formation and evolution of task-oriented groups. Task-oriented groups are created explicitly to perform a task, and all members derive benefits from task completion. However, even in cases when all group members act in a way that is locally optimal for task completion, social forces that have mild effects on choice of associates can have a measurable impact on task completion performance. In this paper, we show how our simulation can be used to model the impact of stereotypes on group formation. In our simulation, stereotypes are based on observable features, learned from prior experience, and only affect an agent's link formation preferences. Even without assuming stereotypes affect the agents' willingness or ability to complete tasks, the long-term modifications that stereotypes have on the agents' social network impair the agents' ability to form groups with sufficient diversity of skills, as compared to agents who form links randomly. An interesting finding is that this effect holds even in cases where stereotype preference and skill existence are completely uncorrelated.
Stiegler, Marjorie; Hobbs, Gene; Martinelli, Susan M; Zvara, David; Arora, Harendra; Chen, Fei
2018-01-01
Background: Simulation is an effective method for creating objective summative assessments of resident trainees. Real-time assessment (RTA) in simulated patient care environments is logistically challenging, especially when evaluating a large group of residents in multiple simulation scenarios. To date, there is very little data comparing RTA with delayed (hours, days, or weeks later) video-based assessment (DA) for simulation-based assessments of Accreditation Council for Graduate Medical Education (ACGME) sub-competency milestones. We hypothesized that sub-competency milestone evaluation scores obtained from DA, via audio-video recordings, are equivalent to the scores obtained from RTA. Methods: Forty-one anesthesiology residents were evaluated in three separate simulated scenarios, representing different ACGME sub-competency milestones. All scenarios had one faculty member perform RTA and two additional faculty members perform DA. Subsequently, the scores generated by RTA were compared with the average scores generated by DA. Variance component analysis was conducted to assess the amount of variation in scores attributable to residents and raters. Results: Paired t-tests showed no significant difference in scores between RTA and averaged DA for all cases. Cases 1, 2, and 3 showed an intraclass correlation coefficient (ICC) of 0.67, 0.85, and 0.50 for agreement between RTA scores and averaged DA scores, respectively. Analysis of variance of the scores assigned by the three raters showed a small proportion of variance attributable to raters (4% to 15%). Conclusions: The results demonstrate that video-based delayed assessment is as reliable as real-time assessment, as both assessment methods yielded comparable scores. Based on a department’s needs or logistical constraints, our findings support the use of either real-time or delayed video evaluation for assessing milestones in a simulated patient care environment. PMID:29736352
Wisp, the Windows Interface for Simulating Plumes, is designed to be an easy-to-use windows platform program for aquatic modeling. Wisp inherits many of its capabilities from its predecessor, the DOS-based PLUMES (Baumgartner, Frick, Roberts, 1994). These capabilities have been ...
Three Success Factors for Simulation Based Construction Education.
ERIC Educational Resources Information Center
Park, Moonseo; Chan, Swee Lean; Ingawale-Verma, Yashada
2003-01-01
Factors in successful implementation of simulation in construction education are as follows: (1) considering human factors and feedback effects; (2) focusing on tradeoffs between managerial decisions and construction policies; and (3) developing a standalone tool that runs on any platform. Case studies demonstrated the effectiveness of these…
Predicting agricultural impacts of large-scale drought: 2012 and the case for better modeling
USDA-ARS?s Scientific Manuscript database
We present an example of a simulation-based forecast for the 2012 U.S. maize growing season produced as part of a high-resolution, multi-scale, predictive mechanistic modeling study designed for decision support, risk management, and counterfactual analysis. The simulations undertaken for this analy...
Simulation of the Beating Heart Based on Physically Modeling aDeformable Balloon
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohmer, Damien; Sitek, Arkadiusz; Gullberg, Grant T.
2006-07-18
The motion of the beating heart is complex and creates artifacts in SPECT and x-ray CT images. Phantoms such as the Jaszczak Dynamic Cardiac Phantom are used to simulate cardiac motion for evaluation of acquisition and data processing protocols used for cardiac imaging. Two concentric elastic membranes filled with water are connected to tubing and a pump apparatus for creating fluid flow in and out of the inner volume to simulate motion of the heart. In the present report, the movement of two concentric balloons is solved numerically in order to create a computer simulation of the motion of the moving membranes in the Jaszczak Dynamic Cardiac Phantom. A system of differential equations, based on the physical properties, determines the motion. Two methods are tested for solving the system of differential equations. The results of both methods are similar, providing a final shape that does not converge to a trivial circular profile. Finally, a tomographic imaging simulation is performed by acquiring static projections of the moving shape and reconstructing the result to observe motion artifacts. Two cases are taken into account: in one case each projection angle is sampled for a short time interval, and in the other case for a longer time interval. The longer sampling acquisition shows a clear improvement in decreasing the tomographic streaking artifacts.
Fernández-Ávila, Daniel G; Ruiz, Álvaro J; Gil, Fabián; Mora, Sergio A; Tobar, Carlos; Gutiérrez, Juan M; Rosselli, Diego
2018-03-01
The aim of the present study was to evaluate the effectiveness of an educational tool for general physicians, based on rheumatological clinical simulation, for the diagnosis of rheumatoid arthritis and osteoarthritis. A randomized clinical study was carried out, in which the physician research subjects were assigned to one of two groups: the experimental group (educational intervention for rheumatoid arthritis with clinical simulation) or the control group (educational intervention for the basic aspects of the diagnosis and treatment of osteoporosis). Four weeks after the educational intervention, the members of both groups completed an examination that included four clinical cases with real patients, two clinical cases with two clinical simulation models and six virtual clinical cases. In this examination, the participants noted clinical findings, established a diagnosis and defined the complementary tests they would request, if necessary, to corroborate their diagnosis. A total of 160 doctors participated (80 in the active educational intervention for rheumatoid arthritis and 80 in the control group), of whom 89 were women (56%). The mean age was 35 (standard deviation 7.7) years. Success was defined as a physician correctly diagnosing at least 10 of the 12 cases presented. A significant difference of 81.3% (95% confidence interval 72-90%; p < 0.001) in success was found in favour of the active group (88.8% versus 7.5%). A greater number of correct answers was found in the active group compared with the control group in the detection of clinical findings and in the number of complementary tests requested (p < 0.001). The study showed the effectiveness of an educational intervention based on clinical simulation to improve the diagnostic approach to rheumatoid arthritis and osteoarthritis. The results open a new horizon in the teaching of rheumatology. Copyright © 2017 John Wiley & Sons, Ltd.
Hall, S; Poller, B; Bailey, C; Gregory, S; Clark, R; Roberts, P; Tunbridge, A; Poran, V; Evans, C; Crook, B
2018-06-01
Variations currently exist across the UK in the choice of personal protective equipment (PPE) used by healthcare workers when caring for patients with suspected high-consequence infectious diseases (HCIDs). To test the protection afforded to healthcare workers by current PPE ensembles during assessment of a suspected HCID case, and to provide an evidence base to justify proposal of a unified PPE ensemble for healthcare workers across the UK. One 'basic level' (enhanced precautions) PPE ensemble and five 'suspected case' PPE ensembles were evaluated in volunteer trials using 'Violet', an ultraviolet-fluorescence-based simulation exercise to visualize exposure/contamination events. Contamination was photographed and mapped. There were 147 post-simulation and 31 post-doffing contamination events, from a maximum of 980, when evaluating the basic level of PPE. This PPE ensemble therefore did not afford adequate protection, primarily owing to direct contamination of exposed areas of the skin. For the five suspected case ensembles, 1584 post-simulation contamination events were recorded, from a maximum of 5110. Twelve post-doffing contamination events were also observed (face, two events; neck, one event; forearm, one event; lower legs, eight events). All suspected case PPE ensembles either had post-doffing contamination events or other significant disadvantages to their use. This identified the need to design a unified PPE ensemble and doffing procedure, incorporating the most protective PPE considered for each body area. This work has been presented to, and reviewed by, key stakeholders to decide on a proposed unified ensemble, subject to further evaluation. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.
Meyer, Susan M.
2014-01-01
Objective. To design an elective for pharmacy students that facilitates antimicrobial stewardship awareness, knowledge, and skill development by solving clinical cases, using human patient simulation technology. Design. The elective was designed for PharmD students to describe principles and functions of stewardship programs, select, evaluate, refine, or redesign patient-specific plans for infectious diseases in the context of antimicrobial stewardship, and propose criteria and stewardship management strategies for an antimicrobial class at a health care institution. Teaching methods included active learning and lectures. Cases of bacterial endocarditis and cryptococcal meningitis were developed that incorporated human patient simulation technology. Assessment. Forty-five pharmacy students completed an antimicrobial stewardship elective between 2010 and 2013. Outcomes were assessed using student perceptions of and performance on rubric-graded assignments. Conclusion. A PharmD elective using active learning, including novel cases conducted with human patient simulation technology, enabled outcomes consistent with those desired of pharmacists assisting in antimicrobial stewardship programs. PMID:25386016
Multidisciplinary tailoring of hot composite structures
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.; Chamis, Christos C.
1993-01-01
A computational simulation procedure is described for multidisciplinary analysis and tailoring of layered multi-material hot composite engine structural components subjected to simultaneous multiple discipline-specific thermal, structural, vibration, and acoustic loads. The effect of aggressive environments is also simulated. The simulation is based on a three-dimensional finite element analysis technique in conjunction with structural mechanics codes, thermal/acoustic analysis methods, and tailoring procedures. The integrated multidisciplinary simulation procedure is general-purpose, including the coupled effects of nonlinearities in structure geometry, material, loading, and environmental complexities. The composite material behavior is assessed at all composite scales, i.e., laminate/ply/constituents (fiber/matrix), via a nonlinear material characterization hygro-thermo-mechanical model. Sample tailoring cases exhibiting nonlinear material/loading/environmental behavior of aircraft engine fan blades are presented. The various multidisciplinary loads lead to different tailored designs, even ones competing with each other, as in the case of minimum material cost versus minimum structure weight and in the case of minimum vibration frequency versus minimum acoustic noise.
Hybrid neuro-heuristic methodology for simulation and control of dynamic systems over time interval.
Woźniak, Marcin; Połap, Dawid
2017-09-01
Simulation and positioning are very important aspects of computer-aided engineering. To perform these two tasks, we can apply traditional methods or intelligent techniques; the difference between them lies in the way they process information. In the first case, to simulate an object in a particular state of action, we need to execute the entire process to read the parameter values. This is inconvenient for objects whose simulation takes a long time, i.e. when the mathematical calculations are complicated. In the second case, an intelligent solution can efficiently support a dedicated mode of simulation, which enables us to simulate the object only in the situations necessary for the development process. We present research results on a developed intelligent simulation and control model of an electric-drive vehicle engine. For a dedicated simulation method based on intelligent computation, in which an evolutionary strategy simulates the states of the dynamic model, an intelligent system based on a dedicated neural network is introduced to control co-working modules while motion occurs over a time interval. The presented experimental results show the implemented solution in a situation in which a vehicle transports goods over an area with many obstacles, which provokes sudden changes in stability that may lead to destruction of the load. The applied neural network controller therefore prevents the load from destruction by adjusting characteristics such as pressure, acceleration, and stiffness voltage to absorb adverse changes of the ground. Copyright © 2017 Elsevier Ltd. All rights reserved.
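The evolutionary-strategy component mentioned here can be illustrated with the simplest member of that family, a (1+1)-ES that mutates one candidate and keeps the better of parent and offspring. This is a generic sketch of the principle under an assumed toy objective, not the authors' simulation model:

```python
import random

def one_plus_one_es(f, x0, sigma=0.5, iters=300, seed=1):
    """(1+1) evolution strategy: Gaussian mutation, elitist selection."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(iters):
        y = x + rng.gauss(0.0, sigma)   # mutate the current candidate
        fy = f(y)
        if fy <= fx:                    # keep the offspring only if no worse
            x, fx = y, fy
    return x, fx

# Toy objective: find the parameter value minimizing a quadratic error
best, err = one_plus_one_es(lambda x: (x - 3.0) ** 2, 0.0)
```

In a simulation setting such as the one described, `f` would be replaced by a costly model evaluation, so the elitist search only ever evaluates states that are candidates for improvement.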
Kerr, Brendan; Hawkins, Trisha Lee-Ann; Herman, Robert; Barnes, Sue; Kaufmann, Stephanie; Fraser, Kristin; Ma, Irene W Y
2013-07-18
Although simulation-based training is increasingly used for medical education, its benefits in continuing medical education (CME) are less established. This study seeks to evaluate the feasibility of incorporating simulation-based training into a CME conference and to compare its effectiveness with a traditional workshop in improving knowledge and self-reported confidence. Participants (N=27) were group randomized to either a simulation-based workshop or a traditional case-based workshop. Post-training, the knowledge assessment score neither increased significantly in the traditional group (d=0.13; p=0.76) nor decreased significantly in the simulation group (d=−0.44; p=0.19). Self-reported comfort in patient assessment parameters increased in both groups (p<0.05 in all). However, only the simulation group reported an increase in comfort in patient management (d=1.1, p=0.051 for the traditional group and d=1.3, p=0.0003 for the simulation group). At 1 month, comfort measures in the traditional group had increased consistently over time, while these measures in the simulation group increased post-workshop but decreased by 1 month, suggesting that some of the effects of training with simulation may be short-lived. The use of simulation-based training was not associated with benefits in knowledge acquisition, knowledge retention, or comfort in patient assessment. It was associated with superior outcomes in comfort in patient management, but this benefit may be short-lived. Further studies are required to better define the conditions under which simulation-based training is beneficial.
Instructional Simulation Integrates Research, Education, and Practice.
Teasdale, Thomas A; Mapes, Sheryl A; Henley, Omolara; Lindsey, Jeanene; Dillard, Della
2016-01-01
Instructional simulation is widely used in clinical education. Examples include the use of inanimate models meant to imitate humans, standardized patients who are actors portraying patients with certain conditions, and role-play where learners experience the disease through props and circumstances. These modalities are briefly described, and then case examples are provided of simulation curricula in use that integrate research findings and clinical practice expertise to guide development and implementation steps. The cases illustrate how formative and summative feedback from two legs of the "three-legged stool" can be potent integrating forces in development of simulation curricula. In these examples, the educational outputs benefit from purposeful inclusion of research and practice inputs. Costs are outlined for instructor and learner time commitments, space considerations, and expendables. The authors' data and experience suggest that instructional simulation that is supported by a solid scientific base and clinical expertise is appreciated by teachers and learners.
Shoemaker, Michael J; Platko, Christina M; Cleghorn, Susan M; Booth, Andrew
2014-07-01
The purpose of this retrospective qualitative case report is to describe how a case-based, virtual patient interprofessional education (IPE) simulation activity was utilized to achieve physician assistant (PA), physical therapy (PT) and occupational therapy (OT) student IPE learning outcomes. Following completion of a virtual patient case, 30 PA, 46 PT and 24 OT students were required to develop a comprehensive, written treatment plan and respond to reflective questions. A qualitative analysis of the submitted written assignment was used to determine whether IPE learning objectives were met. Student responses revealed three themes that supported the learning objectives of the IPE experience: benefits of collaborative care, role clarification and relevance of the IPE experience for future practice. A case-based, IPE simulation activity for physician assistant and rehabilitation students using a computerized virtual patient software program effectively facilitated achievement of the IPE learning objectives, including development of greater student awareness of other professions and ways in which collaborative patient care can be provided.
Kondo, Kosuke; Harada, Naoyuki; Masuda, Hiroyuki; Sugo, Nobuo; Terazono, Sayaka; Okonogi, Shinichi; Sakaeyama, Yuki; Fuchinoue, Yutaka; Ando, Syunpei; Fukushima, Daisuke; Nomoto, Jun; Nemoto, Masaaki
2016-06-01
Deep regions are not visible in three-dimensional (3D) printed rapid prototyping (RP) models prepared from opaque materials, which is not the case with translucent images. The objectives of this study were to develop an RP model in which a skull base tumor was simulated using mesh, and to investigate its usefulness for surgical simulations by evaluating the visibility of its deep regions. A 3D printer that employs binder jetting and is mainly used to prepare plaster models was used. RP models containing a solid tumor, no tumor, and a mesh tumor were prepared based on computed tomography, magnetic resonance imaging, and angiographic data for four cases of petroclival tumor. Twelve neurosurgeons graded the three types of RP model into the following four categories: 'clearly visible,' 'visible,' 'difficult to see,' and 'invisible,' based on the visibility of the internal carotid artery, basilar artery, and brain stem through a craniotomy performed via the combined transpetrosal approach. In addition, the 3D positional relationships between these structures and the tumor were assessed. The internal carotid artery, basilar artery, and brain stem and the positional relationships of these structures with the tumor were significantly more visible in the RP models with mesh tumors than in the RP models with solid or no tumors. The deep regions of RP models containing mesh skull base tumors were easy to visualize. This 3D printing-based method might be applicable to various surgical simulations.
NASA Astrophysics Data System (ADS)
Cai, Congbo; Dong, Jiyang; Cai, Shuhui; Cheng, En; Chen, Zhong
2006-11-01
Intermolecular multiple quantum coherences (iMQCs) have many potential applications, since they can provide interaction information between different molecules within the range of the dipolar correlation distance and can provide new contrast in magnetic resonance imaging (MRI). Because of the non-localized property of the dipolar field and the non-linear property of the Bloch equations incorporating the dipolar field term, the evolution behavior of iMQCs is difficult to deduce strictly in many cases. In such cases, simulation studies are very important: simulation results can not only guide the optimization of experimental conditions, but also help analyze unexpected experimental results. Based on our product operator matrix and the K-space method for dipolar field calculation, MRI simulation software was constructed, running on the Windows operating system. The non-linear Bloch equations are solved by a fifth-order Cash-Karp Runge-Kutta formulism. Computational time can be efficiently reduced by separating the effects of chemical shifts and the strong gradient field. Using this software, simulations of different kinds of complex MRI sequences can be done conveniently and quickly on general personal computers. Some examples are given and the results discussed.
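The numerical core described here, adaptive Runge-Kutta integration of the nonlinear Bloch equations, can be sketched in a simplified single-spin form. The snippet below omits the dipolar-field term entirely and uses SciPy's adaptive RK45 (Dormand-Prince) in place of the paper's Cash-Karp formulism; the field strength and relaxation times are illustrative assumptions only:

```python
import numpy as np
from scipy.integrate import solve_ivp

def bloch(t, M, gamma, B, T1, T2, M0):
    """Bloch equations for one spin: precession about B plus T1/T2 relaxation."""
    dM = gamma * np.cross(M, B)
    dM[0] -= M[0] / T2            # transverse relaxation (x)
    dM[1] -= M[1] / T2            # transverse relaxation (y)
    dM[2] -= (M[2] - M0) / T1     # longitudinal recovery towards M0
    return dM

gamma = 2.675e8                       # 1H gyromagnetic ratio, rad s^-1 T^-1
B = np.array([0.0, 0.0, 1.0e-6])      # small residual field in the rotating frame (T)
sol = solve_ivp(bloch, (0.0, 1.0), [1.0, 0.0, 0.0],
                args=(gamma, B, 0.5, 0.1, 1.0),   # T1 = 0.5 s, T2 = 0.1 s
                method="RK45", rtol=1e-8, atol=1e-10)
# After 1 s the transverse magnetization has decayed (t >> T2) and
# Mz has recovered towards M0 * (1 - exp(-t/T1)).
```

Adding the dipolar field makes the right-hand side nonlinear in M, which is why the software described above must re-evaluate the dipolar term at each integration step.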
NASA Astrophysics Data System (ADS)
Krebs, Isabel; Jardin, Stephen C.; Guenter, Sibylle; Lackner, Karl; Hoelzl, Matthias; Strumberger, Erika; Ferraro, Nate
2017-10-01
3D nonlinear MHD simulations of tokamak plasmas have been performed in toroidal geometry by means of the high-order finite element code M3D-C1. The simulations are set up such that the safety factor on axis (q0) is driven towards values below unity. As reported previously, the resulting asymptotic states either exhibit sawtooth-like reconnection cycling or they are sawtooth-free. In the latter cases, a self-regulating magnetic flux pumping mechanism, mainly provided by a saturated quasi-interchange instability via a dynamo effect, redistributes the central current density so that the central safety factor profile is flat and q0 ≈ 1. Sawtoothing is prevented if β is sufficiently high to allow for the necessary amount of flux pumping to counterbalance the tendency of the current density profile to peak centrally. We present the results of 3D nonlinear simulations based on specific types of experimental discharges and analyze their asymptotic behavior. A set of cases is presented in which aspects of the current ramp-up phase of hybrid ASDEX Upgrade discharges are mimicked. Another set of simulations is based on low-qedge discharges in DIII-D.
NASA Astrophysics Data System (ADS)
Liu, Dongdong; She, Dongli
2018-06-01
Current physically based erosion models do not carefully consider the dynamic variations of soil properties during rainfall and are unable to simulate saline-sodic soil slope erosion processes. The aim of this work was to build a complete model framework, SSEM, to simulate runoff and erosion processes for saline-sodic soils by coupling dynamic saturated hydraulic conductivity Ks and soil erodibility Kτ. Sixty simulated rainfall experiments (2 soil textures × 5 sodicity levels × 2 slope gradients × 3 duplicates) provided data for model calibration and validation. SSEM worked very well for simulating the runoff and erosion processes of saline-sodic silty clay. The runoff and erosion processes of saline-sodic silt loam were more complex than those of non-saline soils or soils with higher clay contents; thus, SSEM did not perform very well for some validation events. We further examined the model performances of four concepts: dynamic Ks and Kτ (Case 1, SSEM), dynamic Ks and constant Kτ (Case 2), constant Ks and dynamic Kτ (Case 3), and constant Ks and constant Kτ (Case 4). The results demonstrated that the model which considers dynamic variations in soil saturated hydraulic conductivity and soil erodibility can provide more reasonable runoff and erosion predictions for saline-sodic soils.
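The four model concepts compared in the validation (Cases 1-4) are simply the 2 × 2 factorial combinations of dynamic versus constant Ks and Kτ. As an illustration of that experimental design (not of SSEM itself), they can be enumerated as:

```python
from itertools import product

# Case 1: dynamic Ks, dynamic Ktau (full SSEM)   Case 2: dynamic Ks, constant Ktau
# Case 3: constant Ks, dynamic Ktau              Case 4: constant Ks, constant Ktau
cases = {
    f"Case {i + 1}": {"Ks": ks, "Ktau": kt}
    for i, (ks, kt) in enumerate(product(["dynamic", "constant"], repeat=2))
}
for name, cfg in cases.items():
    print(name, cfg)
```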
BIPV: a real-time building performance study for a roof-integrated facility
NASA Astrophysics Data System (ADS)
Aaditya, Gayathri; Mani, Monto
2018-03-01
A building integrated photovoltaic system (BIPV) is a photovoltaic (PV) integration that generates energy and serves as part of the building envelope. A building element (e.g. a roof or wall) is selected based on its functional performance, which could include structure, durability, maintenance, weathering, thermal insulation, acoustics, and so on. The present paper discusses the suitability of PV as a building element in terms of thermal performance, based on a case study of a 5.25 kWp roof-integrated BIPV system in a tropical region. The performance of PV has been compared with conventional construction materials, and various scenarios have been simulated to understand the impact on occupant comfort levels. In the current case study, PV as a roofing material has been shown to cause significant thermal discomfort to the occupants. The study is based on real-time data monitoring supported by a computer-based building simulation model.
Cohen, Elaine R; Feinglass, Joe; Barsuk, Jeffrey H; Barnard, Cynthia; O'Donnell, Anna; McGaghie, William C; Wayne, Diane B
2010-04-01
Interventions to reduce preventable complications such as catheter-related bloodstream infections (CRBSI) can also decrease hospital costs. However, little is known about the cost-effectiveness of simulation-based education. The aim of this study was to estimate hospital cost savings related to a reduction in CRBSI after simulation training for residents. This was an intervention evaluation study estimating cost savings related to a simulation-based intervention in central venous catheter (CVC) insertion in the Medical Intensive Care Unit (MICU) at an urban teaching hospital. After residents completed a simulation-based mastery learning program in CVC insertion, CRBSI rates declined sharply. Case-control and regression analysis methods were used to estimate savings by comparing CRBSI rates in the year before and after the intervention. Annual savings from reduced CRBSIs were compared with the annual cost of simulation training. Approximately 9.95 CRBSIs were prevented among MICU patients with CVCs in the year after the intervention. Incremental costs attributed to each CRBSI were approximately $82,000 in 2008 dollars and 14 additional hospital days (including 12 MICU days). The annual cost of the simulation-based education was approximately $112,000. Net annual savings were thus greater than $700,000, a 7 to 1 rate of return on the simulation training intervention. A simulation-based educational intervention in CVC insertion was highly cost-effective. These results suggest that investment in simulation training can produce significant medical care cost savings.
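The headline figures in this cost-effectiveness abstract follow from simple arithmetic on the values it reports, which can be reproduced directly:

```python
# Figures reported in the abstract (2008 dollars)
crbsi_prevented = 9.95      # CRBSIs prevented in the year after training
cost_per_crbsi = 82_000     # incremental cost attributed to each CRBSI
training_cost = 112_000     # annual cost of the simulation-based education

gross_savings = crbsi_prevented * cost_per_crbsi   # dollars saved by prevention
net_savings = gross_savings - training_cost        # savings after training costs
roi = gross_savings / training_cost                # return per dollar invested

print(f"net savings: ${net_savings:,.0f}, ROI: {roi:.1f} to 1")
```

This reproduces the abstract's conclusion of net annual savings above $700,000 and roughly a 7-to-1 return on the training investment.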
Gao, Yuan; Zhang, Chuanrong; He, Qingsong; Liu, Yaolin
2017-01-01
Ecological security is an important research topic, especially urban ecological security. As highly populated ecosystems, cities tend to have more fragile ecological environments. However, most of the research on urban ecological security in the literature has focused on evaluating the current or past status of the ecological environment; very little has carried out simulation or prediction of future ecological security, and even less has explored the urban ecological environment at a fine scale. To fill in this gap, in this study we simulated and predicted urban ecological security at a fine scale (district level) using an improved Cellular Automata (CA) approach. First we used the pressure-state-response (PSR) method based on grid-scale data to evaluate urban ecological security. Then, based on the evaluation results, we imported the geographically weighted regression (GWR) concept into the CA model to simulate and predict urban ecological security. We applied the improved CA approach in a case study: simulating and predicting urban ecological security for the city of Wuhan in Central China. By comparing the simulated ecological security values for 2010 from the improved CA model with the actual ecological security values of 2010, we obtained a relatively high value of the kappa coefficient, which indicates that this CA model can simulate or predict well the future development of ecological security in Wuhan. Based on the prediction results for 2020, we made some policy recommendations for each district in Wuhan. PMID:28617348
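The kappa coefficient used to compare the simulated 2010 map with the actual 2010 ecological-security values can be computed from a confusion matrix of security classes. The sketch below implements standard Cohen's kappa; the three-class matrix values are hypothetical, since the abstract does not report them:

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    p_observed = np.trace(confusion) / n
    p_expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n**2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical agreement counts between simulated (rows) and actual
# (columns) security classes: low / medium / high
cm = [[50, 5, 2],
      [4, 60, 6],
      [1, 7, 65]]
print(cohens_kappa(cm))  # a value near 0.8 would count as "relatively high"
```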
Tait, Lauren; Lee, Kenneth; Rasiah, Rohan; Cooper, Joyce M; Ling, Tristan; Geelan, Benjamin; Bindoff, Ivan
2018-05-03
Background. There are numerous approaches to simulating a patient encounter in pharmacy education. However, little direct comparison between these approaches has been undertaken. Our objective was to investigate student experiences, satisfaction, and feedback preferences between three scenario simulation modalities (paper-, actor-, and computer-based). Methods. We conducted a mixed-methods study with randomized cross-over of simulation modalities on final-year Australian graduate-entry Master of Pharmacy students. Participants completed case-based scenarios within each of three simulation modalities, with feedback provided at the completion of each scenario in a format corresponding to each simulation modality. A post-simulation questionnaire collected qualitative and quantitative responses pertaining to participant satisfaction, experiences, and feedback preferences. Results. Participants reported similar levels of satisfaction across all three modalities. However, each modality resulted in unique positive and negative experiences, such as student disengagement with paper-based scenarios. Conclusion. The themes of guidance and opportunity for peer discussion underlie the best forms of feedback for students. The provision of feedback following simulation should be carefully considered and delivered, with all three simulation modalities producing both positive and negative experiences in regard to their feedback format.
dsmcFoam+: An OpenFOAM based direct simulation Monte Carlo solver
NASA Astrophysics Data System (ADS)
White, C.; Borg, M. K.; Scanlon, T. J.; Longshaw, S. M.; John, B.; Emerson, D. R.; Reese, J. M.
2018-03-01
dsmcFoam+ is a direct simulation Monte Carlo (DSMC) solver for rarefied gas dynamics, implemented within the OpenFOAM software framework, and parallelised with MPI. It is open-source and released under the GNU General Public License in a publicly available software repository that includes detailed documentation and tutorial DSMC gas flow cases. This release of the code includes many features not found in standard dsmcFoam, such as molecular vibrational and electronic energy modes, chemical reactions, and subsonic pressure boundary conditions. Since dsmcFoam+ is designed entirely within OpenFOAM's C++ object-oriented framework, it benefits from a number of key features: the code emphasises extensibility and flexibility so it is aimed first and foremost as a research tool for DSMC, allowing new models and test cases to be developed and tested rapidly. All DSMC cases are as straightforward as setting up any standard OpenFOAM case, as dsmcFoam+ relies upon the standard OpenFOAM dictionary based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of a DSMC simulation is not typical of most OpenFOAM applications. We show that dsmcFoam+ compares well to other well-known DSMC codes and to analytical solutions in terms of benchmark results.
dos Santos, Mateus Casanova; Leite, Maria Cecília Lorea; Heck, Rita Maria
2010-12-01
This is an investigative case study with a descriptive and participative character, based on an educational experience with the simulation learning trigger in nursing. It was carried out during the second semester of the first cycle of the Faculdade de Enfermagem (FEN), Universidade Federal de Pelotas (UFPel). The aim is to study the recontextualization of the simulation-based pedagogic practice in light of the theories developed by Basil Bernstein, a sociologist of education, and to contribute to the improvement of education planning and especially the evaluation of the learning trigger. The research shows that Bernstein's theory is a powerful semiotic tool for analyzing pedagogical practices, which contributes to the planning and analysis of the curricular educational device.
Base-Case 1% Yield Increase (BC1), All Energy Crops scenario of the 2016 Billion Ton Report
Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinkel, Chad [University of Tennessee] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Myers, Aaron (ORCID:0000000320373827)
2016-07-13
Scientific reason for data generation: to serve as the base-case scenario for the BT16 volume 1 agricultural scenarios, comparing these projections of potential biomass supplies against a reference case (agricultural baseline 10.11578/1337885). The simulation runs from 2015 through 2040; a starting year of 2014 is used but not reported. How each parameter was produced (methods), format, and relationship to other data in the data set: these exogenous price simulations (also referred to as “specified-price” simulations) introduce a farmgate price, and POLYSYS solves for the biomass supplies that may be brought to market in response to these prices. In specified-price scenarios, a specified farmgate price is offered constantly in all counties over all years of the simulation. This simulation begins in 2015 with an offered farmgate price for primary crop residues only between 2015 and 2018, and long-term contracts for dedicated crops beginning in 2019. Expected mature energy crop yield grows at a compounding rate of 1% beginning in 2016. The yield growth assumptions are fixed after crops are planted, such that yield gains do not apply to crops already planted, but new plantings do take advantage of the gains in expected yield growth. Instruments used: Policy Analysis System (POLYSYS, version POLYS2015_V10_alt_JAN22B), an agricultural policy modeling system of U.S. agriculture (crops and livestock), supplied by the University of Tennessee Institute of Agriculture, Agricultural Policy Analysis Center.
NASA Astrophysics Data System (ADS)
Pahar, Gourabananda; Dhar, Anirban
2017-04-01
A coupled solenoidal Incompressible Smoothed Particle Hydrodynamics (ISPH) model is presented for the simulation of sediment displacement in an erodible bed. The coupled framework consists of two separate incompressible modules: (a) a granular module and (b) a fluid module. The granular module considers a friction-based rheology model to calculate deviatoric stress components from pressure; it is validated for the Bagnold flow profile and two standardized test cases of sediment avalanching. The fluid module resolves fluid flow inside and outside the porous domain. An interaction force pair containing fluid pressure, a viscous term, and a drag force acts as a bridge between the two flow modules. The coupled model is validated against three dam-break flow cases with different initial conditions of the movable bed. The simulated results are in good agreement with experimental data. A demonstrative case considering the effect of granular column failure under full or partial submergence highlights the capability of the coupled model for application in generalized scenarios.
A clustering method of Chinese medicine prescriptions based on modified firefly algorithm.
Yuan, Feng; Liu, Hong; Chen, Shou-Qiang; Xu, Liang
2016-12-01
This paper studies a clustering method for Chinese medicine (CM) medical cases. The traditional K-means clustering algorithm has shortcomings, such as the dependence of its results on the selection of initial values and trapping in local optima, when processing prescriptions from CM medical cases. Therefore, a new clustering method based on the collaboration of the firefly algorithm and the simulated annealing algorithm was proposed. This algorithm dynamically determines the iterations of the firefly algorithm and the sampling of the simulated annealing algorithm according to fitness changes, and increases the diversity of the swarm by expanding the scope of the sudden jump, thereby effectively avoiding premature convergence. The results of confirmatory experiments on CM medical cases suggest that, compared with the traditional K-means clustering algorithm, this method greatly improves individual diversity and the obtained clustering results; the computed results have a certain reference value for cluster analysis of CM prescriptions.
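The simulated-annealing half of the proposed hybrid rests on the Metropolis acceptance rule, which occasionally accepts a worse clustering solution so the search can escape the local optima that trap K-means. A minimal, generic sketch of that rule (not the authors' exact algorithm):

```python
import math
import random

def accept(delta, temperature, rng=random.random):
    """Metropolis criterion: always accept an improvement (delta <= 0);
    accept a worsening move with probability exp(-delta / T)."""
    if delta <= 0:
        return True
    return rng() < math.exp(-delta / temperature)

# At high temperature, worsening moves are often accepted (diversification);
# as the temperature cools, the search becomes increasingly greedy.
```

In the hybrid described above, an acceptance step of this kind would be applied to candidate cluster assignments produced by the firefly moves.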
ERIC Educational Resources Information Center
Meyer, Kimberly E.
2010-01-01
The purpose of this dissertation was to evaluate learning transfer achieved by physician assistant students comparing two instructional methods, human patient simulation and electronic clinical case studies. This prospective, randomized, mixed-methods study utilized first- and second-year physician assistant student volunteers taking a pretest and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamm, L.L.
1998-10-07
This report is one of a series of reports documenting accident scenario simulations for the Accelerator Production of Tritium (APT) blanket heat removal systems. The simulations were performed in support of the Preliminary Safety Analysis Report (PSAR) for the APT.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamm, L.L.
1998-10-07
This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal (HR) system. These simulations were performed for the Preliminary Safety Analysis Report.
Numerical simulations in the development of propellant management devices
NASA Astrophysics Data System (ADS)
Gaulke, Diana; Winkelmann, Yvonne; Dreyer, Michael
Propellant management devices (PMDs) are used for positioning the propellant at the propellant port. It is important to provide propellant without gas bubbles, as gas bubbles can induce cavitation and, in the worst case, lead to system failures. Therefore, the reliable operation of such devices must be guaranteed. Testing these complex systems is a very intricate process, and in most cases only tests with downscaled geometries are possible. Numerical simulations are used here as an aid to optimize the tests and to predict certain results. Based on these simulations, parameters can be determined in advance and parts of the equipment can be adjusted in order to minimize the number of experiments. In return, the simulations are validated against the test results. Furthermore, if the accuracy of the numerical prediction is verified, numerical simulations can be used for validating the scaling of the experiments. This presentation demonstrates selected numerical simulations for the development of PMDs at ZARM.
Tennant, Marc; Kruger, Estie
2013-02-01
This study developed a Monte Carlo simulation approach to examining the prevalence and incidence of dental decay, using Australian children as a test environment. Monte Carlo simulation has been used for half a century in particle physics (and elsewhere); put simply, population-level outcome probabilities, seeded randomly, drive the production of individual-level data. A total of five runs of the simulation model for all 275,000 12-year-olds in Australia were completed based on 2005-2006 data. Measured on average decayed/missing/filled teeth (DMFT) and the DMFT of the highest 10% of the sample (SiC10), the runs did not differ from each other by more than 2%, and the outcome was within 5% of the reported sampled population data. The simulations rested on the population probabilities that are known to be strongly linked to dental decay, namely socio-economic status and Indigenous heritage. Testing the simulated population found a DMFT of 2.3 for all cases with DMFT > 0 (n = 128,609) and a DMFT of 1.9 for Indigenous cases only (n = 13,749). In the simulated population the SiC25 was 3.3 (n = 68,750). Monte Carlo simulations were created in particle physics as a computational mathematical approach to unknown individual-level effects, resting a simulation on known population-level probabilities. In this study a Monte Carlo simulation approach to childhood dental decay was built, tested and validated. © 2013 FDI World Dental Federation.
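The seeding idea, known population-level probabilities driving individual-level draws, can be sketched as follows. Every proportion and Poisson rate below is invented for illustration and is not one of the study's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_cohort(n, p_indigenous=0.05, p_low_ses=0.3,
                    base_mean=0.9, ses_mult=1.6, indigenous_mult=1.8):
    """Seed individual DMFT counts from population-level risk probabilities.
    All proportions and rate multipliers here are hypothetical."""
    indigenous = rng.random(n) < p_indigenous
    low_ses = rng.random(n) < p_low_ses
    # individual expected DMFT scaled by the two population risk factors
    mean_dmft = (base_mean
                 * np.where(low_ses, ses_mult, 1.0)
                 * np.where(indigenous, indigenous_mult, 1.0))
    return rng.poisson(mean_dmft), indigenous

dmft, indigenous = simulate_cohort(275_000)
sic10 = np.sort(dmft)[-dmft.size // 10:].mean()  # mean DMFT of the worst 10%
```

Repeating `simulate_cohort` gives independent "runs" whose summary statistics can be compared with each other and with survey data, mirroring the study's five-run consistency check.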
Virtual Cerebral Aneurysm Clipping with Real-Time Haptic Force Feedback in Neurosurgical Education.
Gmeiner, Matthias; Dirnberger, Johannes; Fenz, Wolfgang; Gollwitzer, Maria; Wurm, Gabriele; Trenkler, Johannes; Gruber, Andreas
2018-04-01
Realistic, safe, and efficient modalities for simulation-based training are highly warranted to enhance the quality of surgical education, and they should be incorporated in resident training. The aim of this study was to develop a patient-specific virtual cerebral aneurysm-clipping simulator with haptic force feedback and real-time deformation of the aneurysm and vessels. A prototype simulator was developed from 2012 to 2016. Evaluation of virtual clipping by blood flow simulation was integrated in this software, and the prototype was evaluated by 18 neurosurgeons. In 4 patients with different middle cerebral artery aneurysms, virtual clipping was performed after real-life surgery, and surgical results were compared regarding clip application, surgical trajectory, and blood flow. After head positioning and craniotomy, bimanual virtual aneurysm clipping with an original forceps was performed. Blood flow simulation demonstrated residual aneurysm filling or branch stenosis. The simulator improved anatomic understanding for 89% of neurosurgeons. Simulation of head positioning and craniotomy was considered realistic by 89% and 94% of users, respectively. Most participants (94%) agreed that this simulator should be integrated into neurosurgical education. Our illustrative cases demonstrated that virtual aneurysm surgery was possible using the same trajectory as in real-life cases. Both virtual clipping and blood flow simulation were realistic in broad-based but not calcified aneurysms. Virtual clipping of a calcified aneurysm could be performed using the same surgical trajectory, but not the same clip type. We have successfully developed a virtual aneurysm-clipping simulator. Next, we will prospectively evaluate this device for surgical procedure planning and education. Copyright © 2018 Elsevier Inc. All rights reserved.
2014-01-01
Background We aimed to observe the preparedness level of final year medical students in approaching emergencies through computer-based simulation training and to evaluate the efficacy of the program. Methods A computer-based prototype simulation program (Lsim), designed by researchers from the medical education and computer science departments, was used to present virtual cases for medical learning. Fifty-four final year medical students from Ondokuz Mayis University School of Medicine attended an education program on June 20, 2012 and were trained with Lsim. Volunteer attendants completed a pre-test and a post-test exam at the beginning and end of the course, respectively, on the same day. Results Twenty-nine of the 54 students who attended the course agreed to take the pre-test and post-test exams; 58.6% (n = 17) were female. In 10 emergency medical cases, an average of 3.9 correct medical approaches was performed in the pre-test and an average of 9.6 correct medical approaches was performed in the post-test (t = 17.18, P = 0.006). Conclusions This study’s results showed that the readiness level of students for an adequate medical approach to emergency cases was very low. Computer-based training could help students approach various emergency cases adequately. PMID:24386919
NASA Astrophysics Data System (ADS)
Samadi, Reza
Technical textiles are increasingly being engineered and used in challenging applications, in areas such as safety, biomedical devices and architecture, where they must meet stringent demands including excellent and predictable load-bearing capabilities. They also form the basis of one of the most widespread groups of composite materials, fibre-reinforced polymer-matrix composites (PMCs), which comprise materials made of stiff and strong fibres generally available in textile form and selected for their structural potential, combined with a polymer matrix that gives parts their shape. Manufacturing processes for PMCs and technical textiles, as well as parts and advanced textile structures, must be engineered, ideally through simulation, and therefore diverse properties of the textiles, textile reinforcements and PMC materials must be available for predictive simulation. Knowing the detailed geometry of technical textiles is essential to predicting accurately the processing and performance properties of textiles and PMC parts. In turn, the geometry taken by a textile or a reinforcement textile is linked in an intricate manner to its constitutive behaviour. This thesis proposes, investigates and validates a general numerical tool for the integrated and comprehensive analysis of textile geometry and constitutive behaviour as required for engineering applications featuring technical textiles and textile reinforcements. The tool is general with regard to the textiles modelled and the loading cases applied.
Specifically, the work aims at fulfilling the following objectives: 1) developing and implementing dedicated simulation software for modelling textiles subjected to various load cases; 2) providing, through simulation, geometric descriptions for different textiles subjected to different load cases, namely compaction, relaxation and shear; 3) predicting the constitutive behaviour of the textiles undergoing said load cases; 4) identifying parameters affecting the textile geometry and constitutive behaviour under evolving loading; 5) validating simulation results with experimental trials; and 6) demonstrating the applicability of the simulation procedure to textile reinforcements featuring large numbers of small fibres as used in PMCs. As a starting point, the effects of reinforcement configuration on the in-plane permeability of textile reinforcements, the through-thickness thermal conductivity of PMCs and the in-plane stiffness of unidirectional and bidirectional PMCs were quantified systematically and correlated with specific geometric parameters. Variability was quantified for each property at a constant fibre volume fraction. It was observed that variability differed strongly between properties; as such, the simulated behaviour can be related to variability levels seen in experimental measurements. The effects of the geometry of textile reinforcements on the aforementioned processing and performance properties of the textiles and of the PMCs made from these textiles were demonstrated and validated, but only for simple cases, as thorough and credible geometric models were not available at the onset of this work. Outcomes of this work were published in a peer-reviewed journal [101]. Through this thesis it was demonstrated that predicting changes in textile geometry prior to and during loading is feasible using the proposed particle-based modelling method.
The particle-based modelling method relies on discrete mechanics and offers an alternative to more traditional methods based on continuum mechanics. Specifically, it alleviates issues caused by large strains and by the management of the intricate, evolving contact present in finite element simulations. The particle-based modelling method enables credible, intricate modelling of the geometry of textiles at the mesoscopic scale as well as faithful mechanical modelling under load. Changes to textile geometry and configuration due to normal compaction pressure, stress relaxation, in-plane shear and other types of loads were successfully predicted.
NASA Astrophysics Data System (ADS)
Fan, Y. Z.; Zuo, Z. G.; Liu, S. H.; Wu, Y. L.; Sha, Y. J.
2012-11-01
Primary formulation derivation indicates that the dimensions of one existing centrifugal boiler circulation pump casing are too large. As great manufacturing cost can be saved by a dimension decrease, a numerical simulation study on dimension decrease is presented in this paper for the annular casing of this pump, which has a specific speed of 189; the study aims at finding an appropriately smaller casing dimension while hydraulic performance and strength performance are hardly changed, in accordance with the requirements of the cooperative company. The research object is an existing centrifugal pump with a diffuser and a semi-spherical annular casing, working as the boiler circulation pump for (ultra-)supercritical units in power plants. Dimension decrease, the modification method, is achieved by decreasing the existing casing's internal radius (marked as "Ri0") while keeping the wall thickness. The analysis is based on primary formulation derivation, CFD (Computational Fluid Dynamics) simulation and FEM (Finite Element Method) simulation. The formulation derivation estimates that the design casing's internal radius should be less than 0.75 Ri0. CFD analysis indicates that the smaller casing with 0.75 Ri0 has worse hydraulic performance when working at large flow rates and better hydraulic performance when working at small flow rates. In consideration of hydraulic performance and dimension decrease, an appropriate casing internal radius equal to 0.875 Ri0 is determined. FEM analysis then confirms that the modified pump casing has nearly the same strength performance as the existing pump casing. It is concluded that dimension decrease can be an economical as well as practical method for large pumps in engineering fields.
Simulation of tracer dispersion from elevated and surface releases in complex terrain
NASA Astrophysics Data System (ADS)
Hernández, J. F.; Cremades, L.; Baldasano, J. M.
A new version of an advanced mesoscale dispersion modeling system for simulating passive air pollutant dispersion in the real atmospheric planetary boundary layer (PBL) is presented. The system comprises a diagnostic mass-consistent meteorological model and a Lagrangian particle dispersion model (LADISMO). The former version of LADISMO, developed according to Zannetti (Air Pollution Modelling, 1990), was based on the Monte Carlo technique and included calculation of higher-order moments of the vertical random forcing for convective conditions. Its ability to simulate complex-flow dispersion was demonstrated in a previous paper (Hernández et al., 1995, Atmospheric Environment, 29A, 1331-1341). The new version follows Thomson's scheme (1984, Q. J. Roy. Met. Soc., 110, 1107-1120). It is also based on the Langevin equation and follows the ideas given by Brusasca et al. (1992, Atmospheric Environment, 26A, 707-723) and Anfossi et al. (1992, Nuovo Cimento, 15C, 139-158). The model is used to simulate the dispersion and predict the ground-level concentration (g.l.c.) of a tracer (SF6) released from both an elevated source (case a) and a ground-level source (case b) in highly complex mountainous terrain, during neutral and synoptically dominated conditions (case a) and light and apparently stable conditions (case b). The last case is considered an especially difficult task to simulate; in fact, few works have reported situations with valley drainage flows in complex terrain and real stable atmospheric conditions with weak winds. The model assumes that nearly calm situations, associated with strong stability and air stagnation, make the lowest layers of the PBL poorly diffusive (Brusasca et al., 1992, Atmospheric Environment, 26A, 707-723).
Model results are verified against experimental data from the Guardo-90 tracer experiments, an intensive field campaign conducted in the Carrion river valley (Northern Spain) to study atmospheric diffusion within a steep-walled valley in mountainous terrain (Ibarra, 1992, Energia, No. 1, 74-85).
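The Langevin approach underlying such particle models can be illustrated with a schematic one-dimensional random-flight sketch for homogeneous, neutral turbulence with perfect ground reflection. The turbulence parameters (sigma_w, the Lagrangian time scale) are assumed values, and none of LADISMO's convective or stable-layer refinements are included.

```python
import numpy as np

rng = np.random.default_rng(1)

def langevin_plume(n_particles=5000, dt=1.0, steps=600,
                   u_mean=2.0, sigma_w=0.5, t_l=100.0, z_src=50.0):
    """Release particles from an elevated source and advance the vertical
    velocity fluctuation w' with a discretized Langevin equation."""
    x = np.zeros(n_particles)
    z = np.full(n_particles, z_src)
    w = rng.normal(0.0, sigma_w, n_particles)
    for _ in range(steps):
        # Langevin equation: memory (drift) term + Gaussian random forcing
        w = (w * (1.0 - dt / t_l)
             + np.sqrt(2.0 * sigma_w**2 * dt / t_l) * rng.normal(size=n_particles))
        x += u_mean * dt   # advection by the mean wind
        z += w * dt        # vertical random flight
        z = np.abs(z)      # perfect reflection at the ground
    return x, z

x, z = langevin_plume()
```

Ground-level concentration would be estimated by counting particles in boxes near z = 0; in the full model the drift and forcing terms vary with stability and height.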
Fusion of Scores in a Detection Context Based on Alpha Integration.
Soriano, Antonio; Vergara, Luis; Ahmed, Bouziane; Salazar, Addisson
2015-09-01
We present a new method for fusing the scores of different detectors (the two-hypothesis case). It is based on alpha integration, which we have adapted to the detection context. Three optimization methods are presented: least mean square error, maximization of the area under the ROC curve, and minimization of the probability of error. Gradient algorithms are proposed for all three methods. Different experiments with simulated and real data are included. The simulated data consider the two-detector case, to illustrate the factors influencing alpha integration and to demonstrate the improvements obtained by score fusion with respect to individual detector performance. Two real data cases have been considered. In the first, multimodal biometric data were processed; this case is representative of scenarios in which the probability of detection is to be maximized for a given probability of false alarm. The second case is the automatic analysis of electroencephalogram and electrocardiogram records with the aim of reproducing a medical expert's detections of arousals during sleep; this case is representative of scenarios in which the probability of error is to be minimized. The generally superior performance of alpha integration confirms the value of optimizing the fusion parameters.
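Alpha integration generalizes the familiar means of positive scores through a single parameter alpha. The paper's contribution is learning alpha and the weights by gradient methods, which this minimal sketch of the alpha-mean itself omits.

```python
import numpy as np

def alpha_integrate(scores, weights=None, alpha=0.0):
    """Alpha-integrated mean of positive detector scores:
    alpha = -1 gives the arithmetic mean, alpha -> 1 the geometric mean,
    alpha = 3 the harmonic mean."""
    s = np.asarray(scores, dtype=float)
    w = (np.full(s.shape, 1.0 / s.size) if weights is None
         else np.asarray(weights, dtype=float))
    if np.isclose(alpha, 1.0):
        # limiting case: weighted geometric mean
        return float(np.exp(np.sum(w * np.log(s))))
    p = (1.0 - alpha) / 2.0
    return float(np.sum(w * s ** p) ** (1.0 / p))
```

In a detection context the fused score would then be thresholded; sweeping alpha (and the weights) trades off how strongly low or high individual scores dominate the fusion.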
A stochastic approach for automatic generation of urban drainage systems.
Möderl, M; Butler, D; Rauch, W
2009-01-01
Typically, performance evaluation of newly developed methodologies is based on one or more case studies. The investigation of multiple real-world case studies is tedious and time consuming, and extrapolating conclusions from individual investigations to a general basis is arguable and sometimes even wrong. In this article a stochastic approach is presented to evaluate newly developed methodologies on a broader basis. For this approach the Matlab tool "Case Study Generator" was developed, which automatically generates a variety of different virtual urban drainage systems using boundary conditions (e.g. length of the urban drainage system, slope of the catchment surface) as input. The layout of the sewer system is based on an adapted Galton-Watson branching process. The subcatchments are allocated considering a digital terrain model. Sewer system components are designed according to standard values. In total, 10,000 different virtual case studies of urban drainage systems are generated and simulated, and the simulation results are evaluated using a performance indicator for surface flooding. Comparison between the results of the virtual case studies and two real-world case studies indicates the promise of the method. The novelty of the approach is that more general conclusions are possible, in contrast to traditional evaluations with few case studies.
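The layout-generation step can be illustrated with a plain Galton-Watson branching process growing a pipe tree upstream from the outfall. The offspring probabilities and the depth cap below are assumed values, and the tool's terrain-aware subcatchment allocation and design standards are not reproduced.

```python
import random

random.seed(3)

def galton_watson_layout(offspring_p=(0.3, 0.4, 0.3), max_depth=8):
    """Grow a tree-shaped sewer layout from the outfall (node 0):
    each node spawns 0, 1 or 2 upstream pipes with the given probabilities."""
    edges = []            # (upstream_node, downstream_node) pairs
    frontier = [(0, 0)]   # (node id, depth)
    next_id = 1
    while frontier:
        node, depth = frontier.pop()
        if depth >= max_depth:
            continue      # cap the network depth
        k = random.choices([0, 1, 2], weights=offspring_p)[0]
        for _ in range(k):
            edges.append((next_id, node))
            frontier.append((next_id, depth + 1))
            next_id += 1
    return edges

edges = galton_watson_layout()
```

Drawing many such trees, then attaching catchment areas and pipe diameters, yields an ensemble of virtual systems over which a methodology can be stress-tested.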
Aydin, Abdullatif; Muir, Gordon H; Graziano, Manuela E; Khan, Muhammad Shamim; Dasgupta, Prokar; Ahmed, Kamran
2015-06-01
To assess the face, content and construct validity, and the feasibility and acceptability, of the GreenLight™ Simulator as a training tool for photoselective vaporisation of the prostate (PVP), and to establish learning curves and develop an evidence-based training curriculum. This prospective, observational and comparative study recruited novice (25 participants), intermediate (14) and expert-level (seven) urologists from the UK and Europe at the 28th European Association of Urological Surgeons Annual Meeting 2013. A group of novices (12 participants) performed 10 sessions of subtask training modules followed by a long operative case, whereas a second group (13) performed five sessions of a given case module. The intermediate and expert groups performed all training modules once, followed by one operative case. The outcome measures for learning curves and construct validity were time to task, coagulation time, vaporisation time, average sweep speed, average laser distance, blood loss, operative errors, and instrument cost. Face and content validity, feasibility and acceptability were addressed through a quantitative survey. Construct validity was demonstrated in two of five training modules (P = 0.038; P = 0.018) and in a considerable number of case metrics (P = 0.034). Learning curves were seen in all five training modules (P < 0.001), and significant reductions in case operative time (P < 0.001) and error (P = 0.017) were seen. An evidence-based training curriculum, to help trainees acquire transferable skills, was produced using the results. This study has shown the GreenLight Simulator to be a valid and useful training tool for PVP. It is hoped that, by using the training curriculum for the GreenLight Simulator, novice trainees can acquire skills and knowledge to a predetermined level of proficiency. © 2014 The Authors. BJU International © 2014 BJU International.
Implementation of unsteady sampling procedures for the parallel direct simulation Monte Carlo method
NASA Astrophysics Data System (ADS)
Cave, H. M.; Tseng, K.-C.; Wu, J.-S.; Jermy, M. C.; Huang, J.-C.; Krumdieck, S. P.
2008-06-01
An unsteady sampling routine for a general parallel direct simulation Monte Carlo method called PDSC is introduced, allowing the simulation of time-dependent flow problems in the near-continuum range. A post-processing procedure called the DSMC rapid ensemble averaging method (DREAM) is developed to reduce the statistical scatter in the results while minimising both memory and simulation time. This method builds an ensemble average of repeated runs over a small number of sampling intervals prior to the sampling point of interest by restarting the flow using either a Maxwellian distribution based on macroscopic properties for near-equilibrium flows (DREAM-I) or the instantaneous particle data output by the original unsteady sampling of PDSC for strongly non-equilibrium flows (DREAM-II). The method is validated by simulating shock tube flow and the development of simple Couette flow. Unsteady PDSC is found to accurately predict the flow field in both cases with significantly reduced run-times over single-processor code, and DREAM greatly reduces the statistical scatter in the results while maintaining accurate particle velocity distributions. Simulations are then conducted of two applications involving the interaction of shocks over wedges. The results of these simulations are compared to experimental data and simulations from the literature where these are available. In general, it was found that 10 ensemble runs of DREAM processing could reduce the statistical uncertainty in the raw PDSC data by a factor of 2.5-3.3, based on the limited number of cases in the present study.
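The scatter-reduction principle behind DREAM can be shown generically: ensemble-averaging R repeated runs over a few sampling intervals reduces scatter by roughly sqrt(R), which for R = 10 gives about 3.2, consistent with the reported 2.5-3.3 range. The sampler below is a toy stand-in, not DSMC; all its parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

def noisy_sample(n_particles=200):
    # stand-in for one sampling interval: a scatter-limited mean over
    # a finite number of simulated particles
    return rng.normal(1.0, 0.2, n_particles).mean()

def dream_average(n_runs=10, intervals=5):
    # ensemble-average repeated restarts over a few sampling intervals
    return np.mean([[noisy_sample() for _ in range(intervals)]
                    for _ in range(n_runs)])

raw = np.array([noisy_sample() for _ in range(500)])
dreamed = np.array([dream_average() for _ in range(100)])
```

Comparing `raw.std()` with `dreamed.std()` shows the reduction; the true mean (1.0 in this toy) is recovered either way, only the scatter changes.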
An Ellipsoidal Particle-Finite Element Method for Hypervelocity Impact Simulation. Chapter 1
NASA Technical Reports Server (NTRS)
Shivarama, Ravishankar; Fahrenthold, Eric P.
2004-01-01
A number of coupled particle-element and hybrid particle-element methods have been developed for the simulation of hypervelocity impact problems, to avoid certain disadvantages associated with the use of pure continuum-based or pure particle-based methods. To date these methods have employed spherical particles. In recent work a hybrid formulation has been extended to the ellipsoidal particle case. A model formulation approach based on Lagrange's equations, with particle entropies serving as generalized coordinates, avoids the angular momentum conservation problems that have been reported with ellipsoidal smoothed particle hydrodynamics models.
Development of Hydro-Informatic Modelling System and its Application
NASA Astrophysics Data System (ADS)
Wang, Z.; Liu, C.; Zheng, H.; Zhang, L.; Wu, X.
2009-12-01
The understanding of the hydrological cycle is the core of hydrology and the scientific basis of water resources management, and simulation of the hydrological cycle has long been regarded as an important tool for the assessment, utilization and protection of water resources. In this paper, a new tool named the Hydro-Informatic Modelling System (HIMS) is developed and introduced with case studies in the Yellow River Basin in China and 331 catchments in Australia. The case studies showed that HIMS can be employed as an integrated platform for hydrological simulation in different regions. HIMS is a module-based framework of hydrological models designed for different uses such as flood forecasting, water resources planning and evaluating the hydrological impacts of climate change and human activities. What is unique about HIMS is its flexibility in providing alternative modules for the simulation of the hydrological cycle, which successfully overcomes difficulties in the availability of input data, the uncertainty of parameters, and differences in rainfall-runoff processes. The module-based structure of HIMS makes it possible for users to develop new hydrological models.
Proceedings of the 2007 Antenna Applications Symposium, Volume 1
2007-12-01
[Garbled excerpt from the proceedings; recoverable fragments mention the paper "Gyrator-Based Biquad Filters and Negative Impedance Converters for Microwaves" (International Journal of RF and Microwave Computer-Aided…), half-power-gain bandwidth curves versus an "ideal" case, and measured versus simulated active input impedance of a TEM horn using a 2-port HP PNA network analyzer.]
HuPSON: the human physiology simulation ontology.
Gündel, Michaela; Younesi, Erfan; Malhotra, Ashutosh; Wang, Jiali; Li, Hui; Zhang, Bijun; de Bono, Bernard; Mevissen, Heinz-Theodor; Hofmann-Apitius, Martin
2013-11-22
Large biomedical simulation initiatives, such as the Virtual Physiological Human (VPH), are substantially dependent on controlled vocabularies to facilitate the exchange of information, of data and of models. Hindering these initiatives is the lack of a comprehensive ontology that covers the essential concepts of the simulation domain. We propose a first version of a newly constructed ontology, HuPSON, as a basis for shared semantics and interoperability of simulations, of models, of algorithms and of other resources in this domain. The ontology is based on the Basic Formal Ontology and adheres to the MIREOT principles; it has been evaluated via structural features, competency questions and use case scenarios. The ontology is freely available at: http://www.scai.fraunhofer.de/en/business-research-areas/bioinformatics/downloads.html (owl files) and http://bishop.scai.fraunhofer.de/scaiview/ (browser). HuPSON provides a framework for a) annotating simulation experiments, b) retrieving relevant information that is required for modelling, c) enabling interoperability of algorithmic approaches used in biomedical simulation, d) comparing simulation results and e) linking knowledge-based approaches to simulation-based approaches. It is meant to foster a more rapid uptake of semantic technologies in the modelling and simulation domain, with particular focus on the VPH domain.
Iannaccone, Francesco; Degroote, Joris; Vierendeels, Jan; Segers, Patrick
2016-01-01
In recent years the role of FSI (fluid-structure interaction) simulations in the analysis of the fluid mechanics of heart valves has become more and more important, as they are able to capture the interaction between the blood and both the surrounding biological tissues and the valve itself. When setting up an FSI simulation, several choices have to be made to select the most suitable approach for the case of interest: in particular, to simulate flexible-leaflet cardiac valves, the type of discretization of the fluid domain is crucial; it can be described with an ALE (Arbitrary Lagrangian-Eulerian) or an Eulerian formulation. The majority of the reported 3D heart valve FSI simulations are performed with the Eulerian formulation, allowing for large deformations of the domains without compromising the quality of the fluid grid. Nevertheless, it is known that the ALE-FSI approach guarantees more accurate results at the interface between the solid and the fluid. The goal of this paper is to describe the same aortic valve model in the two cases, comparing the performance of an ALE-based FSI solution and an Eulerian-based FSI approach. After a first simplified 2D case, the aortic geometry was considered in a full 3D set-up. The model was kept as similar as possible in the two settings, to better compare the simulations’ outcomes. Although for the 2D case the differences were insubstantial, in our experience the performance of a full 3D ALE-FSI simulation was significantly limited by the technical problems and requirements inherent to the ALE formulation, mainly related to the mesh motion and deformation of the fluid domain. As a secondary outcome of this work, it is important to point out that the choice of the solver also influenced the reliability of the final results. PMID:27128798
A verification library for multibody simulation software
NASA Technical Reports Server (NTRS)
Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.
1989-01-01
A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.
A case study on dune response to infragravity waves
NASA Astrophysics Data System (ADS)
Li, Wenshan; Wang, Hui; Li, Huan; Wu, Shuangquan; Li, Cheng
2017-08-01
A series of numerical simulations were conducted using the process-based model XBeach to investigate dune response, with and without infragravity waves, for different dune slopes. The erosion volume above the dune toe and the dune top recession are used as indicators of dune vulnerability as well as of the defence capacity for the beach in front of the dune. Results show that both the dune erosion volume and the dune top recession decrease with gentler dune slopes. Of all the simulation cases, the dune with a face slope of 1/1 lost the most sand and supplied the most sand to the lower bed. The presence of infragravity waves is shown to be crucial to dune vulnerability: when infragravity motion is not taken into account in the simulations, the dune erosion volume decreases by 44.5%∼61.5% and the dune top recession decreases by 0%∼45.5%, correspondingly, for the different dune slopes.
A software-based sensor for combined sewer overflows.
Leonhardt, G; Fach, S; Engelhard, C; Kinzel, H; Rauch, W
2012-01-01
A new methodology for the online estimation of excess flow from combined sewer overflow (CSO) structures, based on simulation models, is presented. If sufficient flow and water level data from the sewer system are available, no rainfall data are needed to run the model. An inverse rainfall-runoff model was developed to simulate net rainfall from flow and water level data. Excess flow at all CSO structures in a catchment can then be simulated with a rainfall-runoff model. The method is applied to a case study, and the results show that the inverse rainfall-runoff model can be used in place of missing rain gauges. Online operation is ensured by software providing an interface to the operator's SCADA system and controlling the model. A water quality model could be included to also simulate pollutant concentrations in the excess flow.
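For the simplest possible case, a single linear reservoir, the inversion is explicit: the net rainfall that must have produced an observed outflow series can be recovered step by step. This is only a schematic of the inverse idea, far simpler than the paper's model; the rate constant and time step are arbitrary.

```python
def forward(net_rain, k=0.1, dt=1.0):
    """Linear reservoir: storage S with Q = k*S, dS/dt = P - Q (explicit Euler)."""
    s, q = 0.0, []
    for p in net_rain:
        q.append(k * s)          # observed outflow at this step
        s += dt * (p - k * s)    # storage update
    return q

def inverse(q, k=0.1, dt=1.0):
    """Recover the net rainfall series from the observed outflow series:
    P_t = (Q_{t+1} - Q_t) / (k*dt) + Q_t, by inverting the storage update."""
    return [(q[t + 1] - q[t]) / (k * dt) + q[t] for t in range(len(q) - 1)]

rain = [0.0, 2.0, 5.0, 1.0, 0.0, 0.0]
recovered = inverse(forward(rain))
```

In practice the inverted net rainfall would then drive a forward rainfall-runoff model of the whole catchment to estimate excess flow at every CSO structure.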
NASA Technical Reports Server (NTRS)
Chapman, David K.
1989-01-01
The use of clinostats and centrifuges to explore the hypogravity range between zero and 1 g is described. Different types of clinostat configurations and clinostat-centrifuge combinations are compared. Some examples selected from the literature and from current research in gravitational physiology are presented to show plant responses in the simulated hypogravity region of the g-parameter (0 < g < 1). The validation of clinostat simulation is discussed, and examples in which flight data can be compared to clinostat data are presented. The data from 3 different laboratories using 3 different plant species indicate that clinostat simulations were in some cases qualitatively similar to flight data, but in all cases quantitatively different. The need to conduct additional tests in weightlessness is emphasized.
A Multi-agent Simulation Tool for Micro-scale Contagion Spread Studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, Daniel B
2016-01-01
Within the disaster preparedness and emergency response community, there is interest in how contagions spread person-to-person at large gatherings and in whether mitigation strategies can be employed to reduce new infections. A contagion spread simulation module was developed for the Incident Management Preparedness and Coordination Toolkit that allows a user to see how a geographically accurate layout of the gathering space helps or hinders the spread of a contagion. The results can inform mitigation strategies based on changing the physical layout of an event space. A case study was conducted for a particular event to calibrate the underlying simulation model. This paper presents implementation details of the simulation code that incorporates agent movement and disease propagation. Elements of the case study are presented to show how the tool can be used.
Towards real-time photon Monte Carlo dose calculation in the cloud
NASA Astrophysics Data System (ADS)
Ziegenhein, Peter; Kozin, Igor N.; Kamerling, Cornelis Ph; Oelfke, Uwe
2017-06-01
Near real-time application of Monte Carlo (MC) dose calculation in clinic and research is hindered by the long computational runtimes of established software. Fast MC software solutions are currently available that utilise accelerators such as graphical processing units (GPUs) or clusters based on central processing units (CPUs). Both platforms are expensive in terms of purchase and maintenance costs and, in the case of GPUs, provide only limited scalability. In this work we propose a cloud-based MC solution, which offers highly scalable, accurate photon dose calculation. The MC simulations run on a private virtual supercomputer that is formed in the cloud. Computational resources can be provisioned dynamically at low cost without upfront investment in expensive hardware. A client-server software solution has been developed which controls the simulations and transports data to and from the cloud efficiently and securely. The client application integrates seamlessly into a treatment planning system: it runs the MC simulation workflow automatically and securely exchanges simulation data with the server-side application that controls the virtual supercomputer. Advanced encryption standards add a further security layer, encrypting and decrypting patient data on the fly at the processor register level. We show that our cloud-based MC framework enables near real-time dose computation. It delivers excellent linear scaling for high-resolution datasets, with absolute runtimes of 1.1 to 10.9 seconds for simulating clinical prostate and liver cases to 1% statistical uncertainty. These runtimes include the transport of data to and from the cloud as well as process scheduling and synchronisation overhead. Cloud-based MC simulations thus offer a fast, affordable and easily accessible alternative to currently used GPU or cluster solutions for near real-time, accurate dose calculations.
Generalizing Gillespie’s Direct Method to Enable Network-Free Simulations
Suderman, Ryan T.; Mitra, Eshan David; Lin, Yen Ting; ...
2018-03-28
Gillespie’s direct method for stochastic simulation of chemical kinetics is a staple of computational systems biology research. However, the algorithm requires explicit enumeration of all reactions and all chemical species that may arise in the system. In many cases, this is not feasible due to the combinatorial explosion of reactions and species in biological networks. Rule-based modeling frameworks provide a way to exactly represent networks containing such combinatorial complexity, and generalizations of Gillespie’s direct method have been developed as simulation engines for rule-based modeling languages. Here, we provide both a high-level description of the algorithms underlying the simulation engines, termed network-free simulation algorithms, and how they have been applied in systems biology research. We also define a generic rule-based modeling framework and describe a number of technical details required for adapting Gillespie’s direct method for network-free simulation. Lastly, we briefly discuss potential avenues for advancing network-free simulation and the role they continue to play in modeling dynamical systems in biology.
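The direct method at the core of these engines can be stated compactly. The sketch below is a generic, network-based implementation over a fixed reaction list; the dict-based state and (propensity, stoichiometry) representation are our illustrative choices, and a network-free engine would instead generate applicable reactions on the fly from rules.

```python
import math
import random

def gillespie_direct(x, reactions, t_end, seed=0):
    """Gillespie's direct method (stochastic simulation algorithm).

    x         : dict mapping species name -> copy number (mutated in place)
    reactions : list of (propensity_fn, stoichiometry) pairs, where
                propensity_fn(x) returns the current propensity and
                stoichiometry maps species -> count change when it fires
    Returns the trajectory as a list of (time, state) tuples.
    """
    rng = random.Random(seed)
    t, history = 0.0, [(0.0, dict(x))]
    while t < t_end:
        props = [a(x) for a, _ in reactions]
        a0 = sum(props)
        if a0 == 0.0:
            break                                   # no reaction can fire
        t += -math.log(1.0 - rng.random()) / a0     # exponential waiting time
        u, acc = rng.random() * a0, 0.0
        for j, a in enumerate(props):               # choose reaction j with
            acc += a                                # probability props[j] / a0
            if u <= acc:
                break
        for species, change in reactions[j][1].items():
            x[species] += change
        history.append((t, dict(x)))
    return history
```

For example, pure decay A → ∅ with propensity 0.5·A runs the population to extinction, recording one state per firing.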
Cost-Effectiveness of Mass Dog Vaccination Campaigns against Rabies in Flores Island, Indonesia.
Wera, E; Mourits, M C M; Siko, M M; Hogeveen, H
2017-12-01
A dynamic deterministic simulation model was developed to determine the cost-effectiveness of different mass dog vaccination strategies against rabies in a dog population representative of a typical village on Flores Island. Cost-effectiveness was measured as public cost per averted dog-rabies case. Simulations started with the introduction of one infectious dog into a susceptible dog population of 399 dogs and subsequently ran for a period of 10 years. The base scenario represented a situation without any control intervention. Evaluated vaccination strategies were as follows: annual vaccination campaigns with short-acting vaccine (immunity duration of 52 weeks) (AV_52), annual campaigns with long-acting vaccine (immunity duration of 156 weeks) (AV_156), biannual campaigns with short-acting vaccine (BV_52) and once-in-2-years campaigns with long-acting vaccine (O2V_156). The effectiveness of the vaccination strategies was simulated for vaccination coverages of 50% and 70%. Cumulative results were reported for the 10-year simulation period. The base scenario resulted in three epidemic waves, with a total of 1274 dog-rabies cases. The public cost of applying AV_52 at a coverage of 50% was US$5342 for a village. This strategy was unfavourable compared to other strategies, as it was costly and ineffective in controlling the epidemic. The costs of AV_52 at a coverage of 70% and AV_156 at a coverage of 70% were, respectively, US$3646 and US$3716, equivalent to US$3.00 and US$3.17 per averted dog-rabies case. Increasing the coverage of AV_156 from 50% to 70% reduced the number of cases by 7% and reduced the cost by US$1452, resulting in a cost-effectiveness ratio of US$1.81 per averted dog-rabies case. This simulation model provides an effective tool to explore the public cost-effectiveness of mass dog vaccination strategies in Flores Island. 
Insights obtained from the simulation results are useful for animal health authorities to support decision-making in rabies-endemic areas, such as Flores Island. © 2016 Blackwell Verlag GmbH.
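The abstract describes the model only at a high level. As a loose illustration of the general shape of such a simulation, the toy weekly S/E/I/V model below applies a vaccination campaign at a fixed interval and lets vaccine-induced immunity wane; the compartment structure and every parameter value are our assumptions, not those of Wera et al.

```python
def simulate_rabies(weeks=520, n=399, coverage=0.7, campaign_interval=52,
                    waning_weeks=156, beta=0.8, incubation_weeks=3):
    """Toy deterministic weekly rabies model with susceptible (s), exposed (e),
    infectious (i) and vaccinated (v) dogs. Returns cumulative rabies cases
    over the simulation horizon. All parameter values are illustrative."""
    s, e, i, v = n - 1.0, 0.0, 1.0, 0.0   # one infectious dog introduced
    cases = 1.0
    for week in range(weeks):
        if week % campaign_interval == 0:
            moved = coverage * s           # vaccination campaign
            s, v = s - moved, v + moved
        new_e = beta * s * i / max(s + e + i + v, 1.0)  # new infections
        new_i = e / incubation_weeks       # dogs ending incubation
        waned = v / waning_weeks           # loss of vaccine immunity
        s += waned - new_e
        e += new_e - new_i
        i = new_i                          # rabid dogs die within ~1 week
        v -= waned
        cases += new_i
    return cases
```

Comparing runs with and without vaccination shows the qualitative effect the paper quantifies: campaigns sharply reduce the cumulative case count.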
ERIC Educational Resources Information Center
Owens, Norma J.; Padula, Cynthia A.; Hume, Anne L.
2002-01-01
Interdisciplinary clinical case studies in geriatrics were developed using active and problem-based learning approaches that simulate clinical environments. Feedback from their use in continuing education indicated that facilitators need interdisciplinary group skills, well-written discussion questions enhanced learning, and the presence of all…
High-Performance Agent-Based Modeling Applied to Vocal Fold Inflammation and Repair.
Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y K
2018-01-01
Fast and accurate computational biology models offer the prospect of accelerating the development of personalized medicine. A tool capable of estimating treatment success can help prevent unnecessary and costly treatments and potentially harmful side effects. A novel high-performance Agent-Based Model (ABM) was adopted to simulate and visualize multi-scale complex biological processes arising in vocal fold inflammation and repair. The computational scheme was designed to organize the 3D ABM sub-tasks to fully utilize the resources available on current heterogeneous platforms consisting of multi-core CPUs and many-core GPUs. Sub-tasks are further parallelized, and convolution-based diffusion is used to enhance the performance of the ABM simulation. The scheme was implemented using a client-server protocol allowing the results of each iteration to be analyzed and visualized on the server (i.e., in situ) while the simulation is running on the same server. The resulting simulation and visualization software enables users to interact with and steer the course of the simulation in real time as needed. This high-resolution 3D ABM framework was used for a case study of surgical vocal fold injury and repair. The new framework is capable of completing the simulation, visualization and remote result delivery in under 7 s per iteration, where each iteration of the simulation represents 30 min in the real world. The case study model was simulated at the physiological scale of a human vocal fold. The simulation tracks 17 million biological cells as well as a total of 1.7 billion signaling chemical and structural protein data points. The visualization component processes and renders all simulated biological cells and 154 million signaling chemical data points. The proposed high-performance 3D ABM was verified through comparisons with empirical vocal fold data. Representative trends of biomarker predictions in surgically injured vocal folds were observed.
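The convolution-based diffusion used to speed up the chemical fields can be illustrated with a minimal 2-D sketch: each time step applies the 5-point Laplacian stencil to the concentration field, which is exactly a convolution. The paper's implementation is 3-D and GPU-accelerated; the periodic boundaries and rate below are our assumptions for illustration only.

```python
import numpy as np

def diffuse(field, rate, steps=1):
    """Explicit diffusion steps on a 2-D concentration field, written as a
    convolution with the 5-point Laplacian stencil (periodic boundaries via
    np.roll). The explicit scheme is stable for rate <= 0.25."""
    f = field.astype(float)
    for _ in range(steps):
        lap = (np.roll(f, 1, axis=0) + np.roll(f, -1, axis=0) +
               np.roll(f, 1, axis=1) + np.roll(f, -1, axis=1) - 4.0 * f)
        f += rate * lap                    # spread mass toward neighbours
    return f
```

Because the stencil only redistributes mass between neighbouring cells, the total amount of chemical is conserved while local peaks flatten out.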
Building the Case for SNAP: Creation of Multi-Band, Simulated Images With Shapelets
NASA Technical Reports Server (NTRS)
Ferry, Matthew A.
2005-01-01
Dark energy has been at once the most elusive and the most important phenomenon shaping the universe. A case is being built for a proposed space telescope called SNAP (SuperNova Acceleration Probe), a crucial component of which is image simulation. One method for this is "shapelets," developed at Caltech. Shapelets form an orthonormal basis and are uniquely able to represent realistic space images and to create new images based on real ones. Previously, simulations were created using the Hubble Deep Field (HDF) as a basis set in one band. In this project, image simulations are created using the four bands of the Hubble Ultra Deep Field (UDF) as a basis set. This provides a better basis for simulations because (1) the survey is deeper, (2) the images have higher resolution, and (3) it is a step closer to simulating the nine bands of SNAP. Image simulations are achieved by detecting sources in the UDF, decomposing them into shapelets, tweaking their parameters in realistic ways, and recomposing them into new images. Morphological tests were also run to verify the realism of the simulations. The simulations have a wide variety of uses, including the creation of weak gravitational lensing simulations.
Simulation of networks of spiking neurons: A review of tools and strategies
Brette, Romain; Rudolph, Michelle; Carnevale, Ted; Hines, Michael; Beeman, David; Bower, James M.; Diesmann, Markus; Morrison, Abigail; Goodman, Philip H.; Harris, Frederick C.; Zirpe, Milind; Natschläger, Thomas; Pecevski, Dejan; Ermentrout, Bard; Djurfeldt, Mikael; Lansner, Anders; Rochel, Olivier; Vieville, Thierry; Muller, Eilif; Davison, Andrew P.; El Boustani, Sami
2009-01-01
We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented, then review the precision of those strategies, in particular in cases where plasticity depends on the exact timing of spikes. We give an overview of the simulators and simulation environments presently available (restricted to those that are freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with the aim of allowing the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin–Huxley-type and integrate-and-fire models, interacting through current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks.
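One benchmark class the review runs on every simulator, current-based integrate-and-fire neurons with clock-driven integration, can be sketched in a few lines. The forward-Euler scheme, parameter values and data layout below are illustrative choices of ours, not taken from the paper's benchmark code.

```python
import numpy as np

def simulate_lif(weights, i_ext, t_end=100.0, dt=0.1, tau=10.0,
                 v_thresh=1.0, v_reset=0.0):
    """Clock-driven simulation of a current-based leaky integrate-and-fire
    network. weights[i, j] is the voltage kick neuron j's spike gives
    neuron i; i_ext is a constant external drive per neuron.
    Returns a list of (time, neuron_index) spike events."""
    n = weights.shape[0]
    v = np.zeros(n)
    spikes = []
    for step in range(int(t_end / dt)):
        v += dt * (-v + i_ext) / tau            # forward-Euler membrane update
        fired = v >= v_thresh
        if fired.any():
            spikes.extend((step * dt, int(j)) for j in np.where(fired)[0])
            v += weights[:, fired].sum(axis=1)  # instantaneous synaptic kicks
            v[fired] = v_reset                  # reset neurons that spiked
    return spikes
```

A single neuron driven above threshold (i_ext > v_thresh) fires periodically, while one driven below threshold never spikes, which makes a quick sanity check.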
NASA Astrophysics Data System (ADS)
Sadeghi-Goughari, M.; Mojra, A.; Sadeghi, S.
2016-02-01
Intraoperative Thermal Imaging (ITI) is a new minimally invasive diagnostic technique that can potentially locate the margins of a brain tumor in order to achieve maximum tumor resection with least morbidity. This study introduces a new approach to ITI based on artificial tactile sensing (ATS) technology in conjunction with artificial neural networks (ANN), and the feasibility and applicability of this method in the diagnosis and localization of brain tumors are investigated. To analyze the validity and reliability of the proposed method, two simulations were performed: (i) an in vitro experimental setup was designed and fabricated using a resistance heater embedded in an agar tissue phantom to simulate heat generation by a tumor in brain tissue; and (ii) a case report of a patient with parafalcine meningioma was used to simulate ITI in a neurosurgical procedure. In the case report, both brain and tumor geometries were constructed from MRI data, and the tumor temperature and depth were estimated. For the experimental tests, a novel assisted-surgery robot was developed to palpate the tissue phantom surface and measure temperature variations, and an ANN was trained to estimate the simulated tumor's power and depth. The results affirm that ITI based on ATS is a non-invasive method that can be useful for detecting, localizing and characterizing brain tumors.
Promoting Simulation Globally: Networking with Nursing Colleagues Across Five Continents.
Alfes, Celeste M; Madigan, Elizabeth A
Simulation education is gaining momentum internationally and may provide the opportunity to enhance clinical education while disseminating evidence-based practice standards for clinical simulation and learning. There is a need to develop a cohesive leadership group that fosters support, networking, and sharing of simulation resources globally. The Frances Payne Bolton School of Nursing at Case Western Reserve University has had the unique opportunity to establish academic exchange programs with schools of nursing across five continents. Although the joint and mutual simulation activities have been extensive, each international collaboration has also provided insight into the innovations developed by global partners.
Quantification of uncertainties for application in detonation simulation
NASA Astrophysics Data System (ADS)
Zheng, Miao; Ma, Zhibo
2016-06-01
Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying uncertainty, the most important tasks are to analyze how the uncertainties arise and propagate, and how the simulations evolve from benchmark models to new models. Based on practical engineering needs and on verification and validation technology, a framework for quantification of uncertainty (QU) is put forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.
Simulator for concurrent processing data flow architectures
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.; Stoughton, John W.; Mielke, Roland R.
1992-01-01
A software simulator capable of simulating the execution of an algorithm graph on a given system under the Algorithm to Architecture Mapping Model (ATAMM) rules is presented. ATAMM is capable of modeling the execution of large-grained algorithms on distributed data flow architectures. Investigating the behavior and determining the performance of an ATAMM-based system requires the aid of software tools. The ATAMM Simulator presented here can determine the performance of a system without the need to build a hardware prototype. Case studies are performed on four algorithms to demonstrate the capabilities of the ATAMM Simulator. Simulated results are shown to be comparable to the experimental results of the Advanced Development Model System.
NASA Astrophysics Data System (ADS)
Guérin, Charles-Antoine; Grilli, Stéphan T.; Moran, Patrick; Grilli, Annette R.; Insua, Tania L.
2018-05-01
The authors recently proposed a new method for detecting tsunamis using high-frequency (HF) radar observations, referred to as "time-correlation algorithm" (TCA; Grilli et al. Pure Appl Geophys 173(12):3895-3934, 2016a, 174(1): 3003-3028, 2017). Unlike standard algorithms that detect surface current patterns, the TCA is based on analyzing space-time correlations of radar signal time series in pairs of radar cells, which does not require inverting radial surface currents. This was done by calculating a contrast function, which quantifies the change in pattern of the mean correlation between pairs of neighboring cells upon tsunami arrival, with respect to a reference correlation computed in the recent past. In earlier work, the TCA was successfully validated based on realistic numerical simulations of both the radar signal and tsunami wave trains. Here, this algorithm is adapted to apply to actual data from a HF radar installed in Tofino, BC, for three test cases: (1) a simulated far-field tsunami generated in the Semidi Subduction Zone in the Aleutian Arc; (2) a simulated near-field tsunami from a submarine mass failure on the continental slope off of Tofino; and (3) an event believed to be a meteotsunami, which occurred on October 14th, 2016, off of the Pacific West Coast and was measured by the radar. In the first two cases, the synthetic tsunami signal is superimposed onto the radar signal by way of a current memory term; in the third case, the tsunami signature is present within the radar data. In light of these test cases, we develop a detection methodology based on the TCA, using a correlation contrast function, and show that in all three cases the algorithm is able to trigger a timely early warning.
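The correlation-contrast computation at the heart of the TCA can be sketched as follows, assuming the radar signal time series for each cell are available as rows of an array. The difference form of the contrast and the window choices here are our simplifications for illustration; the published TCA uses a more elaborate statistic over pairs of neighbouring cells.

```python
import numpy as np

def mean_pair_correlation(signals, window):
    """Mean correlation coefficient over all pairs of cell time series,
    computed on the most recent `window` samples."""
    c = np.corrcoef(signals[:, -window:])
    iu = np.triu_indices_from(c, k=1)      # upper triangle: distinct pairs
    return c[iu].mean()

def correlation_contrast(signals, window, lag):
    """Departure of the present mean pair correlation from a reference value
    computed `lag` samples in the past. A sustained positive contrast flags
    the coherent change in signal pattern expected on tsunami arrival."""
    now = mean_pair_correlation(signals, window)
    ref = mean_pair_correlation(signals[:, :-lag], window)
    return now - ref
```

When a common oscillation (a tsunami-induced current signature) suddenly appears in otherwise uncorrelated cell signals, the contrast jumps toward one.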
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Morin, Cory
2015-01-01
Dengue fever (DF) is caused by a virus transmitted between humans and Aedes genus mosquitoes through blood feeding. In recent decades the incidence of the disease has drastically increased in the tropical Americas, culminating in the Pan American outbreak of 2010, which resulted in 1.7 million reported cases. In Puerto Rico dengue is endemic; however, there is significant inter-annual, intra-annual, and spatial variability in case loads. Variability in climate and the environment, herd immunity and virus genetics, and demographic characteristics may all contribute to differing patterns of transmission both spatially and temporally. Knowledge of climate influences on dengue incidence could facilitate the development of early warning systems, allowing public health workers to implement appropriate transmission intervention strategies. In this study, we simulate dengue incidence in several municipalities in Puerto Rico using population and meteorological data derived from ground-based stations and remote sensing instruments. These data were used to drive a process-based model of vector population development and virus transmission. Model parameter values for container composition, vector characteristics, and incubation period were chosen using a Monte Carlo approach. Multiple simulations were performed for each municipality and the results were compared with reported dengue cases. The best-performing simulations were retained and their parameter values and meteorological input were compared between years and municipalities. Parameter values varied by municipality and year, illustrating the complexity and sensitivity of the disease system. Local characteristics, including the natural and built environment, impact transmission dynamics and produce varying responses to meteorological conditions.
Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations
NASA Astrophysics Data System (ADS)
Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.
2010-11-01
We present the design and implementation of a component-based simulation framework for the execution of coupled time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible, lightweight component model that streamlines the integration of standalone codes into coupled simulations. Standalone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation; configuration, task, and data management; asynchronous event management; simulation monitoring; and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massively Parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model will be presented, along with an overview of several use cases.
Rupture Dynamics Simulation for Non-Planar fault by a Curved Grid Finite Difference Method
NASA Astrophysics Data System (ADS)
Zhang, Z.; Zhu, G.; Chen, X.
2011-12-01
We first implement a non-staggered finite difference method with split nodes to solve the dynamic rupture problem for non-planar faults. The split-node method has been widely used in dynamic simulation because it represents the fault plane more precisely than alternatives such as the thick-fault or stress-glut approaches. The finite difference method is also a popular numerical method for solving kinematic and dynamic problems in seismology. However, previous work has focused mostly on the staggered-grid method because of its simplicity and computational efficiency, despite its disadvantages relative to non-staggered schemes in describing boundary conditions, especially irregular boundaries such as non-planar faults. Zhang and Chen (2006) proposed a high-order non-staggered MacCormack finite difference method based on curved grids to solve irregular boundary problems precisely. Building on this non-staggered grid method, we successfully simulate the spontaneous rupture process. The fault plane is a boundary condition, and it may of course be irregular, so this approach should allow us to simulate the rupture process for any kind of bending fault plane. We first verify the method in Cartesian coordinates; for bending faults, curvilinear grids will be used.
Steiner, Malte; Claes, Lutz; Ignatius, Anita; Niemeyer, Frank; Simon, Ulrich; Wehner, Tim
2013-09-06
Numerical models of secondary fracture healing are based on mechanoregulatory algorithms that use distortional strain alone or in combination with either dilatational strain or fluid velocity as determining stimuli for tissue differentiation and development. Comparison of these algorithms has previously suggested that healing processes under torsional rotational loading can only be properly simulated by considering fluid velocity and deviatoric strain as the regulatory stimuli. We hypothesize that sufficient calibration on uncertain input parameters will enhance our existing model, which uses distortional and dilatational strains as determining stimuli, to properly simulate fracture healing under various loading conditions including also torsional rotation. Therefore, we minimized the difference between numerically simulated and experimentally measured courses of interfragmentary movements of two axial compressive cases and two shear load cases (torsional and translational) by varying several input parameter values within their predefined bounds. The calibrated model was then qualitatively evaluated on the ability to predict physiological changes of spatial and temporal tissue distributions, based on respective in vivo data. Finally, we corroborated the model on five additional axial compressive and one asymmetrical bending load case. We conclude that our model, using distortional and dilatational strains as determining stimuli, is able to simulate fracture-healing processes not only under axial compression and torsional rotation but also under translational shear and asymmetrical bending loading conditions.
A platform for dynamic simulation and control of movement based on OpenSim and MATLAB.
Mansouri, Misagh; Reinbolt, Jeffrey A
2012-05-11
Numerical simulations play an important role in solving complex engineering problems and have the potential to revolutionize medical decision making and treatment strategies. In this paper, we combine the rapid model-based design, control systems and powerful numerical method strengths of MATLAB/Simulink with the simulation and human movement dynamics strengths of OpenSim by developing a new interface between the two software tools. OpenSim is integrated with Simulink using the MATLAB S-function mechanism, and the interface is demonstrated using both open-loop and closed-loop control systems. While the open-loop system uses MATLAB/Simulink to separately reproduce the OpenSim Forward Dynamics Tool, the closed-loop system adds the unique feature of feedback control to OpenSim, which is necessary for most human movement simulations. An arm model example was successfully used in both open-loop and closed-loop cases. For the open-loop case, the simulation reproduced results from the OpenSim Forward Dynamics Tool with root mean square (RMS) differences of 0.03° for the shoulder elevation angle and 0.06° for the elbow flexion angle. MATLAB's variable step-size integrator reduced the time required to generate the forward dynamic simulation from 7.1s (OpenSim) to 2.9s (MATLAB). For the closed-loop case, a proportional-integral-derivative controller was used to successfully balance a pole on the model's hand despite random force disturbances on the pole. The new interface presented here not only integrates the OpenSim and MATLAB/Simulink software tools, but will also allow neuroscientists, physiologists, biomechanists, and physical therapists to adapt and generate new solutions as treatments for musculoskeletal conditions. Copyright © 2012 Elsevier Ltd. All rights reserved.
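The pole-balancing demonstration relies on a standard proportional-integral-derivative controller. A textbook discrete PID, independent of OpenSim and Simulink, can be written as below; the gains and the toy integrator plant in the usage example are illustrative, not the paper's arm model.

```python
class PID:
    """Discrete proportional-integral-derivative controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt            # accumulate error over time
        deriv = (err - self.prev_err) / self.dt   # finite-difference derivative
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

Driving a simple integrator plant (x' = u) toward a setpoint shows the closed loop settling, which is the same feedback structure that keeps the simulated pole upright.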
NASA Astrophysics Data System (ADS)
Carcellar, B. G., III
2017-10-01
Museum exhibit management is one of the usual undertakings of museum facilitators. Artworks must be strategically placed to achieve maximum viewing by visitors, and the positioning of the artworks also strongly influences the quality of the visitors' experience. One solution to such problems is to utilize GIS and Agent-Based Modelling (ABM). In ABM, persistent interacting objects are modelled as agents, which are given attributes and behaviors that describe their properties as well as their motion. In this study, an ABM approach that incorporates GIS is utilized to perform an analytical assessment of the placement of the artworks in the Vargas Museum. GIS serves as the backbone for the spatial aspect of the simulation, such as the placement of the artwork exhibits, as well as possible obstructions to perception such as the columns, walls, and panel boards. Visibility Analysis is also done on the model in GIS to assess the overall visibility of the artworks. The ABM is done using the initial GIS outputs and GAMA, an open-source ABM software. Visitors are modelled as agents moving inside the museum following a specific decision tree. The simulation is run for three use cases: a 10%, 20%, and 30% chance of a visitor arriving in the next minute. For this museum, the 10% case is determined to be the closest to actual visitor flow, and the recommended minimum time to achieve maximum artwork perception is 1 hour and 40 minutes. Initial assessment of the results shows that even after 3 hours of simulation, some parts of the exhibit lack viewers because of their distance from the entrance. A more detailed decision tree for the visitor agents could be incorporated to produce a more realistic simulation.
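A stripped-down sketch of the arrival mechanism described above (a fixed per-minute chance of a new visitor, each viewing a handful of artworks) might look as follows. The artwork count and viewing behaviour are invented; the study itself used GAMA agents on a GIS floor plan with visibility analysis.

```python
import random

# Toy visitor-arrival model: each minute a visitor enters with probability
# p_arrival and views a random subset of artworks. All numbers are
# illustrative assumptions, not the Vargas Museum configuration.

def simulate(p_arrival=0.10, minutes=180, n_artworks=12, views_per_visit=8,
             seed=1):
    random.seed(seed)
    views = [0] * n_artworks
    visitors = 0
    for _ in range(minutes):
        if random.random() < p_arrival:      # e.g., the 10% use case
            visitors += 1
            for art in random.sample(range(n_artworks), views_per_visit):
                views[art] += 1
    return visitors, views

visitors, views = simulate()
underviewed = [i for i, v in enumerate(views) if v == 0]  # artworks lacking viewers
```

Even this toy version reproduces the qualitative finding that, over a finite run, some artworks can end up with few or no viewers purely by chance; the GIS layer in the real model adds the spatial explanation (distance from the entrance, obstructions).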
ERIC Educational Resources Information Center
Luo, Wei; Pelletier, Jon; Duffin, Kirk; Ormand, Carol; Hung, Wei-chen; Shernoff, David J.; Zhai, Xiaoming; Iverson, Ellen; Whalley, Kyle; Gallaher, Courtney; Furness, Walter
2016-01-01
The long geological time needed for landform development and evolution poses a challenge for understanding and appreciating the processes involved. The Web-based Interactive Landform Simulation Model--Grand Canyon (WILSIM-GC, http://serc.carleton.edu/landform/) is an educational tool designed to help students better understand such processes,…
How To Create Complex Measurement Models: A Case Study of Principled Assessment Design.
ERIC Educational Resources Information Center
Bauer, Malcolm; Williamson, David M.; Steinberg, Linda S.; Mislevy, Robert J.; Behrens, John T.
In computer-based simulations, students must bring a wide range of relevant knowledge, skills, and abilities to bear jointly as they solve meaningful problems in a learning domain. To function effectively as an assessment, a simulation system must additionally be able to evoke and interpret observable evidence about targeted knowledge in a manner…
ERIC Educational Resources Information Center
Matrundola, Deborah La Torre; Chang, Sandy; Herman, Joan
2012-01-01
The purpose of these case studies was to examine the ways technology and professional development supported the use of the SimScientists assessment systems. Qualitative research methodology was used to provide narrative descriptions of six classes implementing simulation-based assessments for either the topic of Ecosystems or Atoms and Molecules.…
ERIC Educational Resources Information Center
Demissie, Tesfaye; Ochonogor, Chukunoye E.; Engida, Temechegn
2011-01-01
Many students have difficulty in learning abstract and complex lessons of chemistry. This study investigated how students develop their understandings of abstract and complex lessons in chemistry with the aid of visualizing tools: animation, simulation and video that allow them to build clear concepts. Animation, simulation and video enable…
ERIC Educational Resources Information Center
Wigton, Robert S.; And Others
1990-01-01
An educational intervention was effective in improving the judgment of experienced student-health physicians (N=11) in predicting positive culture in simulated patients with pharyngitis. The intervention had three parts: an initial one-hour lecture; three sessions with computer-based cognitive feedback; and monthly reports of the percentage of…
Preoperative simulation for the planning of microsurgical clipping of intracranial aneurysms.
Marinho, Paulo; Vermandel, Maximilien; Bourgeois, Philippe; Lejeune, Jean-Paul; Mordon, Serge; Thines, Laurent
2014-12-01
The safety and success of intracranial aneurysm (IA) surgery could be improved through the dedicated application of simulation covering the procedure from the 3-dimensional (3D) description of the surgical scene to the visual representation of the clip application. We aimed in this study to validate the technical feasibility and clinical relevance of such a protocol. All patients preoperatively underwent 3D magnetic resonance imaging and 3D computed tomography angiography to build 3D reconstructions of the brain, cerebral arteries, and surrounding cranial bone. These 3D models were segmented and merged using Osirix, a DICOM image processing application. This provided the surgical scene that was subsequently imported into Blender, a modeling platform for 3D animation. Digitized clips and appliers could then be manipulated in the virtual operative environment, allowing the visual simulation of clipping. This simulation protocol was assessed in a series of 10 IAs by 2 neurosurgeons. The protocol was feasible in all patients. The visual similarity between the surgical scene and the operative view was excellent in 100% of the cases, and the identification of the vascular structures was accurate in 90% of the cases. The neurosurgeons found the simulation helpful for planning the surgical approach (ie, the bone flap, cisternal opening, and arterial tree exposure) in 100% of the cases. The correct number of final clip(s) needed was predicted from the simulation in 90% of the cases. The preoperatively expected characteristics of the optimal clip(s) (ie, their number, shape, size, and orientation) were validated during surgery in 80% of the cases. This study confirmed that visual simulation of IA clipping based on the processing of high-resolution 3D imaging can be effective. This is a new and important step toward the development of a more sophisticated integrated simulation platform dedicated to cerebrovascular surgery.
Meng, Xiangyin; Li, Yan
2015-01-01
Natural heat convection of water-based alumina (Al2O3/water) nanofluids (with volume fractions of 1% and 4%) in a horizontal cylinder is numerically investigated. The whole three-dimensional computational fluid dynamics (CFD) procedure is performed in a completely open-source way: Blender, enGrid, OpenFOAM and ParaView are employed for geometry creation, mesh generation, case simulation and post-processing, respectively. The original solver 'buoyantBoussinesqSimpleFoam' is selected for the present study, and a temperature-dependent solver 'buoyantBoussinesqSimpleTDFoam' is developed to make the simulation more realistic. The two solvers are applied to the same cases and compared to the corresponding experimental results. The flow regime in these cases is laminar (the Reynolds number is 150) and the Rayleigh number range is 0.7 × 10⁷ to 5 × 10⁷. By comparison, the average natural-convection Nusselt numbers of water and Al2O3/water nanofluids are found to increase with the Rayleigh number. At the same Rayleigh number, the Nusselt number is found to decrease with nanofluid volume fraction. The temperature-dependent solver is found to be better for the water and 1% Al2O3/water nanofluid cases, while the original solver is better for the 4% Al2O3/water nanofluid cases. Furthermore, due to strong three-dimensional flow features in the horizontal cylinder, three-dimensional CFD simulation is recommended instead of two-dimensional simplifications.
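The Rayleigh-number range quoted above follows from the standard definition Ra = gβΔT L³/(να). A small sketch with rough room-temperature water properties and an assumed cylinder size (illustrative values, not the paper's exact setup) lands in the same order of magnitude.

```python
def rayleigh(beta, delta_t, length, nu, alpha, g=9.81):
    """Rayleigh number: Ra = g * beta * dT * L**3 / (nu * alpha)."""
    return g * beta * delta_t * length ** 3 / (nu * alpha)

# rough water properties near room temperature; the 4 cm length scale and
# 5 K temperature difference are assumptions chosen for the order of magnitude
ra = rayleigh(beta=2.1e-4,    # thermal expansion coefficient, 1/K
              delta_t=5.0,    # wall-to-fluid temperature difference, K
              length=0.04,    # characteristic length, m
              nu=1.0e-6,      # kinematic viscosity, m^2/s
              alpha=1.43e-7)  # thermal diffusivity, m^2/s
```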
NASA Astrophysics Data System (ADS)
Setlur Nagesh, S. V.; Russ, M.; Ionita, C. N.; Bednarek, D.; Rudin, S.
2017-03-01
Modern 3D printing technology can fabricate vascular phantoms based on an actual human patient with a high degree of precision, facilitating a realistic simulation environment for an intervention. We present two experimental setups using 3D printed patient-specific neurovasculature to simulate different disease anatomies. To simulate the human neurovasculature in the Circle of Willis, patient-based phantoms with aneurysms were 3D printed using an Objet Eden 260V printer. Anthropomorphic head phantoms and a human skull combined with acrylic plates simulated human head bone anatomy and x-ray attenuation. For dynamic studies the 3D printed phantom was connected to a pulsatile flow loop with the anthropomorphic phantom underneath. By combining different 3D printed phantoms and the anthropomorphic phantoms, different patient pathologies can be simulated. For static studies a 3D printed neurovascular phantom was embedded inside a human skull and used as a positional reference for treatment devices such as stents. To simulate tissue attenuation, acrylic layers were added. Different combinations can simulate different patient treatment procedures. The Complementary-Metal-Oxide-Semiconductor (CMOS) based High Resolution Fluoroscope (HRF) with 75 µm pixels offers an advantage over the state-of-the-art 200 µm pixel Flat Panel Detector (FPD) due to higher Nyquist frequency and better DQE performance. Whether this advantage is clinically useful during an actual clinical neurovascular intervention can be addressed by qualitatively evaluating images from a cohort of various cases performed using both detectors. The above-mentioned method can offer a realistic substitute for an actual clinical procedure. Also, a large cohort of cases can be generated and used for a study of HRF clinical utility.
Analytical study of effect of casing treatment on performance of a multistage compressor
NASA Technical Reports Server (NTRS)
Snyder, R. W.; Blade, R. J.
1972-01-01
The simulation was based on individual stage pressure and efficiency maps. These maps were modified to account for casing treatment effects on the individual stage characteristics, and the effects of the modified stage maps on overall compressor performance were observed. The results show that, to improve the performance of the compressor in its normal operating range, casing treatment of the rear stages is required.
TH-A-18C-04: Ultrafast Cone-Beam CT Scatter Correction with GPU-Based Monte Carlo Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Y; Southern Medical University, Guangzhou; Bai, T
2014-06-15
Purpose: Scatter artifacts severely degrade the image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework using GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to finish the whole process, including both scatter correction and reconstruction, automatically within 30 seconds. Methods: The method consists of six steps: 1) FDK reconstruction using raw projection data; 2) rigid registration of the planning CT to the FDK results; 3) MC scatter calculation at sparse view angles using the planning CT; 4) interpolation of the calculated scatter signals to other angles; 5) removal of scatter from the raw projections; 6) FDK reconstruction using the scatter-corrected projections. In addition to using a GPU to accelerate the MC photon simulations, we also use a small number of photons and a down-sampled CT image in simulation to further reduce computation time. A novel denoising algorithm is used to eliminate MC scatter noise caused by low photon numbers. The method is validated on head-and-neck cases with simulated and clinical data. Results: We have studied the impacts of photon histories and volume down-sampling factors on the accuracy of scatter estimation. A Fourier analysis was conducted to show that scatter images calculated at 31 angles are sufficient to restore those at all angles with <0.1% error. For the simulated case with a resolution of 512×512×100, we simulated 10M photons per angle. The total computation time is 23.77 seconds on an Nvidia GTX Titan GPU. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region-of-interest is reduced from 75.9 to 19.0 HU. Similar results were found for a real patient case. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed. The whole process of scatter correction and reconstruction is accomplished within 30 seconds.
This study is supported in part by NIH (1R01CA154747-01) and The Core Technology Research in Strategic Emerging Industry, Guangdong, China (2011A081402003).
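Step 4 of the pipeline above, spreading sparse-angle MC scatter estimates to all projection angles, can be sketched with simple linear interpolation. The smooth synthetic "scatter vs. angle" curve below stands in for real Monte Carlo output; the 31-angle spacing mirrors the abstract's Fourier-analysis result.

```python
import math

def scatter_mc(angle_deg):
    """Stand-in for a GPU Monte Carlo scatter estimate at one gantry angle
    (real scatter varies smoothly with angle, hence the cosine)."""
    return 100.0 + 20.0 * math.cos(math.radians(angle_deg))

def interpolate_scatter(sparse_angles, query_angles):
    """Linearly interpolate sparse-angle scatter estimates to all angles."""
    pts = sorted((a, scatter_mc(a)) for a in sparse_angles)
    result = []
    for q in query_angles:
        if q <= pts[0][0]:
            result.append(pts[0][1])          # clamp below the sampled range
        elif q >= pts[-1][0]:
            result.append(pts[-1][1])         # clamp above the sampled range
        else:
            for (a0, s0), (a1, s1) in zip(pts, pts[1:]):
                if a0 <= q <= a1:
                    t = (q - a0) / (a1 - a0)
                    result.append(s0 + t * (s1 - s0))
                    break
    return result

sparse = list(range(0, 361, 12))              # 31 view angles, as in the abstract
estimates = interpolate_scatter(sparse, list(range(361)))
```

Because scatter is a low-frequency signal, even this plain linear scheme recovers the full angular curve to a small fraction of its amplitude, which is why sparse-angle MC is viable.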
Numerical Simulation of Earth Pressure on Head Chamber of Shield Machine with FEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Shouju; Kang Chengang; Sun, Wei
2010-05-21
Model parameters of conditioned soils in the head chamber of a shield machine are determined based on tri-axial compression tests in the laboratory. The loads acting on the tunneling face are estimated according to the static earth pressure principle. Based on the Duncan-Chang nonlinear elastic constitutive model, the earth pressures on the head chamber of the shield machine are simulated for different aperture-ratio cases of the rotating cutterhead. A relationship between the pressure transportation factor and the aperture ratio of the shield machine is proposed using regression analysis.
NASA Astrophysics Data System (ADS)
Nazzal, M. A.
2018-04-01
It is established that some superplastic materials undergo significant cavitation during deformation. In this work, a stability analysis for the superplastic copper-based alloy Coronze-638 at 550 °C, based on Hart's definition of stable plastic deformation, and finite element simulations for the balanced biaxial loading case are carried out to study the effects of hydrostatic pressure on cavitation evolution during superplastic forming. The finite element results show that imposing hydrostatic pressure leads to a reduction in cavitation growth.
Noury, N; Hadidi, T
2012-12-01
We propose a simulator of the human activity data collected with presence sensors in our experimental Health Smart Home "Habitat Intelligent pour la Sante (HIS)". We recorded 1492 days of data on several experimental HIS during the French national project "AILISA". On these real data, we built a mathematical model of the behavior of the data series, based on Hidden Markov Models (HMM). The model is then played on a computer to produce simulated data series, with the added flexibility to adjust the parameters for various scenarios. We also tested several methods to measure the similarity between our real and simulated data. Our simulator can produce large databases that can be further used to evaluate algorithms that raise an alarm in case of a loss of autonomy. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
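The generative use of such a model, playing it forward to produce synthetic activity series, can be sketched with a plain Markov chain over rooms; a fitted HMM would add an emission layer of sensor observations on top of the hidden states. The states and transition probabilities below are invented, not the AILISA-fitted values.

```python
import random

# Sample one synthetic day of activity (one state per minute) from a
# hand-made Markov chain over rooms. Illustrative sketch only.

STATES = ("sleep", "kitchen", "living", "bathroom")
TRANS = {                      # rows: current state; columns: next-state weights
    "sleep":    (0.90, 0.05, 0.03, 0.02),
    "kitchen":  (0.05, 0.70, 0.20, 0.05),
    "living":   (0.05, 0.15, 0.75, 0.05),
    "bathroom": (0.20, 0.10, 0.10, 0.60),
}

def simulate_day(n_minutes=1440, seed=42):
    random.seed(seed)
    state, series = "sleep", []
    for _ in range(n_minutes):
        series.append(state)
        state = random.choices(STATES, weights=TRANS[state])[0]
    return series

day = simulate_day()
```

Scenario flexibility comes from editing the transition weights, e.g. lowering the probability of leaving "sleep" to mimic reduced autonomy, then checking whether an alarm algorithm reacts.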
Heuristic for Critical Machine Based a Lot Streaming for Two-Stage Hybrid Production Environment
NASA Astrophysics Data System (ADS)
Vivek, P.; Saravanan, R.; Chandrasekaran, M.; Pugazhenthi, R.
2017-03-01
Lot streaming in a hybrid flowshop (HFS) is encountered in many real-world problems. This paper presents a heuristic approach to lot streaming, based on critical-machine considerations, for a two-stage hybrid flowshop. The first stage has two identical parallel machines and the second stage has only one machine; because the second-stage machine is critical, problems of this kind are known to be NP-hard. A mathematical model was developed for the selected problem, and simulation modelling and analysis were carried out in Extend V6 software. The heuristic was developed to obtain an optimal lot streaming schedule, and eleven cases of lot streaming were considered. The proposed heuristic was verified and validated by real-time simulation experiments: all possible lot streaming strategies, and all possible sequences under each strategy, were simulated and examined. The heuristic yielded the optimal schedule in all eleven cases. A procedure for identifying the best lot streaming strategy is also suggested.
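Evaluating one candidate lot-streaming schedule for the two-stage layout described above reduces to a small makespan computation: stage 1 has two identical parallel machines, and the single (critical) stage-2 machine processes sublots in the order they finish stage 1. Unit processing times and sublot sizes below are illustrative assumptions.

```python
import heapq

# Toy makespan evaluation for a lot split into sublots in a two-stage
# hybrid flowshop (two parallel machines, then one critical machine).

def makespan(sublots, t1=2.0, t2=3.0):
    """sublots: sizes of the sublots a lot is split into;
    t1, t2: per-unit processing times on stage 1 and stage 2."""
    stage1_free = [0.0, 0.0]              # next-free times of the two machines
    stage2_free = 0.0
    for size in sublots:
        start = heapq.heappop(stage1_free)       # earliest-available machine
        done1 = start + size * t1
        heapq.heappush(stage1_free, done1)
        # the critical machine starts when both it and the sublot are ready
        stage2_free = max(stage2_free, done1) + size * t2
    return stage2_free
```

Splitting a 10-unit lot into two sublots of 5 shortens the schedule (40 vs. 50 time units with these rates), which is the basic overlap payoff that lot streaming exploits and that a heuristic searches over systematically.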
The Simplified Aircraft-Based Paired Approach With the ALAS Alerting Algorithm
NASA Technical Reports Server (NTRS)
Perry, Raleigh B.; Madden, Michael M.; Torres-Pomales, Wilfredo; Butler, Ricky W.
2013-01-01
This paper presents the results of an investigation of a proposed concept for closely spaced parallel runways called the Simplified Aircraft-based Paired Approach (SAPA). This procedure depends upon a new alerting algorithm called the Adjacent Landing Alerting System (ALAS). This study used both low fidelity and high fidelity simulations to validate the SAPA procedure and test the performance of the new alerting algorithm. The low fidelity simulation enabled a determination of minimum approach distance for the worst case over millions of scenarios. The high fidelity simulation enabled an accurate determination of timings and minimum approach distance in the presence of realistic trajectories, communication latencies, and total system error for 108 test cases. The SAPA procedure and the ALAS alerting algorithm were applied to the 750-ft parallel spacing (e.g., SFO 28L/28R) approach problem. With the SAPA procedure as defined in this paper, this study concludes that a 750-ft application does not appear to be feasible, but preliminary results for 1000-ft parallel runways look promising.
Autonomous Motion Learning for Intra-Vehicular Activity Space Robot
NASA Astrophysics Data System (ADS)
Watanabe, Yutaka; Yairi, Takehisa; Machida, Kazuo
Space robots will be needed in future space missions. So far, many types of space robots have been developed; in particular, Intra-Vehicular Activity (IVA) space robots that support human activities should be developed to reduce human risk in space. In this paper, we study a motion-learning method for an IVA space robot with a multi-link mechanism. The advantage is that this space robot moves using the reaction force of the multi-link mechanism and contact forces from the wall, like an astronaut space-walking, rather than using propulsion. The control approach is based on reinforcement learning with the actor-critic algorithm. We demonstrate the effectiveness of this approach on a 5-link space robot model by simulation. First, we simulate the robot learning motion control, including a contact phase, in the two-dimensional case. Next, we simulate the robot learning motion control while changing base attitude in the three-dimensional case.
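The actor-critic scheme named above can be shown in its simplest tabular form on a toy task. The real 5-link robot has continuous contact dynamics; here a 1-D "corridor" (reach the rightmost cell) stands in, and all states, rewards, and learning rates are invented for illustration. Only the learning rule, a critic's TD error driving both value and policy updates, is the point.

```python
import math
import random

# Tabular actor-critic on a 5-cell corridor: move left/right, reach cell 4.

N_STATES = 5                                   # cells 0..4; cell 4 is the goal
MOVES = (-1, +1)                               # action 0: left, action 1: right
theta = [[0.0, 0.0] for _ in range(N_STATES)]  # actor: action preferences
value = [0.0] * N_STATES                       # critic: state-value estimates

def policy(s):
    """Softmax over the actor's preferences for state s."""
    exps = [math.exp(p) for p in theta[s]]
    z = sum(exps)
    return [e / z for e in exps]

random.seed(0)
alpha_actor, alpha_critic, gamma = 0.1, 0.2, 0.95
for _ in range(2000):                          # training episodes
    s = 0
    for _ in range(50):
        a = 0 if random.random() < policy(s)[0] else 1
        s2 = min(max(s + MOVES[a], 0), N_STATES - 1)
        done = s2 == N_STATES - 1
        r = 1.0 if done else -0.01             # small step cost, goal bonus
        td = r + (0.0 if done else gamma * value[s2]) - value[s]
        value[s] += alpha_critic * td          # critic update (TD(0))
        probs = policy(s)
        for i in range(2):                     # actor update (policy gradient)
            theta[s][i] += alpha_actor * td * ((1.0 if i == a else 0.0) - probs[i])
        s = s2
        if done:
            break
```

After training, the policy strongly prefers moving toward the goal in every state; scaling this idea to the multi-link robot replaces the table with function approximators over joint states and contact forces.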
Li, Zhengdong; Zou, Donghua; Liu, Ningguo; Zhong, Liangwei; Shao, Yu; Wan, Lei; Huang, Ping; Chen, Yijiu
2013-06-10
The elucidation and prediction of the biomechanics of lower limb fractures could serve as a useful tool in forensic practices. Finite element (FE) analysis could potentially help in the understanding of the fracture mechanisms of lower limb fractures frequently caused by car-pedestrian accidents. Our aim was (1) to develop and validate a FE model of the human lower limb, (2) to assess the biomechanics of specific injuries concerning run-over and impact loading conditions, and (3) to reconstruct one real car-pedestrian collision case using the model created in this study. We developed a novel lower limb FE model and simulated three different loading scenarios. The geometry of the model was reconstructed using Mimics 13.0 based on computed tomography (CT) scans from an actual traffic accident. The material properties were based upon a synthesis of data found in published literature. The FE model validation and injury reconstruction were conducted using the LS-DYNA code. The FE model was validated by a comparison of the simulation results of three-point bending, overall lateral impact tests and published postmortem human surrogate (PMHS) results. Simulated loading scenarios of running-over the thigh with a wheel, the impact on the upper leg, and impact on the lower thigh were conducted with velocities of 10 m/s, 20 m/s, and 40 m/s, respectively. We compared the injuries resulting from one actual case with the simulated results in order to explore the possible fracture bio-mechanism. The peak fracture forces, maximum bending moments, and energy lost ratio exhibited no significant differences between the FE simulations and the literature data. Under simulated run-over conditions, the segmental fracture pattern was formed and the femur fracture patterns and mechanisms were consistent with the actual injury features of the case. 
Our study demonstrated that this simulation method could potentially be effective in identifying forensic cases and exploring the injury mechanisms of lower limb fractures encountered due to inflicted lesions. This model can also help to distinguish between possible and impossible scenarios. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Data-driven non-linear elasticity: constitutive manifold construction and problem discretization
NASA Astrophysics Data System (ADS)
Ibañez, Ruben; Borzacchiello, Domenico; Aguado, Jose Vicente; Abisset-Chavanne, Emmanuelle; Cueto, Elias; Ladeveze, Pierre; Chinesta, Francisco
2017-11-01
The use of constitutive equations calibrated from data has been implemented into standard numerical solvers for successfully addressing a variety of problems encountered in simulation-based engineering sciences (SBES). However, the complexity remains constantly increasing due to the need for increasingly detailed models as well as the use of engineered materials. Data-driven simulation constitutes a potential change of paradigm in SBES. Standard simulation in computational mechanics is based on the use of two very different types of equations. The first one, of axiomatic character, is related to balance laws (momentum, mass, energy, …), whereas the second one consists of models that scientists have extracted from collected, either natural or synthetic, data. Data-driven (or data-intensive) simulation consists of directly linking experimental data to computers in order to perform numerical simulations. These simulations will employ laws, universally recognized as epistemic, while minimizing the need for explicit, often phenomenological, models. The main drawback of such an approach is the large amount of required data, some of them inaccessible from present-day testing facilities. Such difficulty can be circumvented in many cases, and in any case alleviated, by considering complex tests, collecting as many data as possible and then using a data-driven inverse approach in order to generate the whole constitutive manifold from a few complex experimental tests, as discussed in the present work.
NASA Astrophysics Data System (ADS)
Bellos, Vasilis; Tsakiris, George
2016-09-01
The study presents a new hybrid method for the simulation of flood events in small catchments. It combines a physically-based two-dimensional hydrodynamic model and the hydrological unit hydrograph theory. Unit hydrographs are derived using the FLOW-R2D model, which is based on the full form of the two-dimensional Shallow Water Equations, solved by a modified McCormack numerical scheme. The method is tested on a small catchment in a suburb of Athens, Greece, for a storm event which occurred in February 2013. The catchment is divided into three friction zones and unit hydrographs of 15 and 30 min are produced. The infiltration process is simulated by the empirical Kostiakov equation and the Green-Ampt model. The results from the implementation of the proposed hybrid method are compared with recorded data at the hydrometric station at the outlet of the catchment and with the results derived from the fully hydrodynamic model FLOW-R2D. It is concluded that, for the case studied, the proposed hybrid method produces results close to those of the fully hydrodynamic simulation at substantially shorter computational time. This finding, if further verified in a variety of case studies, can be useful in devising effective hybrid tools for two-dimensional flood simulations that lead to accurate and considerably faster results than those achieved by fully hydrodynamic simulations.
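The two infiltration models named above can be written down in miniature. The parameter values (Kostiakov k and a; Green-Ampt conductivity K, suction head ψ, and moisture deficit Δθ) are illustrative assumptions, not the study's calibrated values.

```python
import math

def kostiakov_cumulative(t_hr, k=10.0, a=0.5):
    """Empirical Kostiakov law: cumulative infiltration F = k * t**a (mm)."""
    return k * t_hr ** a

def green_ampt_cumulative(t_hr, K=5.0, psi=100.0, dtheta=0.3, iters=50):
    """Green-Ampt cumulative infiltration F (mm) from the implicit relation
    K*t = F - S*ln(1 + F/S), with S = psi*dtheta, solved by the contraction
    mapping F <- K*t + S*ln(1 + F/S)."""
    S = psi * dtheta
    F = K * t_hr                    # starting guess: no-suction infiltration
    for _ in range(iters):
        F = K * t_hr + S * math.log(1.0 + F / S)
    return F
```

The fixed-point iteration converges because the update's derivative with respect to F is 1/(1 + F/S) < 1; in a hydrograph computation these functions supply the loss term subtracted from rainfall before routing.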
On the spreading and instability of gravity current fronts of arbitrary shape
NASA Astrophysics Data System (ADS)
Zgheib, N.; Bonometti, T.; Balachandar, S.
2012-11-01
Experiments, simulations and theoretical analysis were carried out to study the influence of geometry on the spreading of gravity currents. The horizontal spreading of three different initial planforms of release was investigated: an extended ellipse, a cross, and a circle. The experiments used a pulley system for a swift, nearly instantaneous release. The case of the axisymmetric cylinder compared favorably with earlier simulations. We ran experiments at multiple aspect ratios for all three configurations. Perhaps the most intriguing of the three cases is the "ellipse," which within a short period after release flipped its major and minor axes. This behavior cannot be captured by current theoretical methods (such as the Box Model). These cases have also been investigated using shallow water and direct numerical simulations. In addition, we investigate the possibility of a Rayleigh-Taylor (RT) instability of the radially moving, but decelerating, front. We present a simple theoretical framework based on the inviscid Shallow Water Equations. The theoretical results are supplemented and compared to highly resolved three-dimensional simulations with the Boussinesq approximation. Chateaubriand Fellowship - NSF PIRE grant OISE-0968313.
Optical properties of mineral dust aerosol in the thermal infrared
NASA Astrophysics Data System (ADS)
Köhler, Claas H.
2017-02-01
The optical properties of mineral dust and biomass burning aerosol in the thermal infrared (TIR) are examined by means of Fourier Transform Infrared Spectrometer (FTIR) measurements and radiative transfer (RT) simulations. The measurements were conducted within the scope of the Saharan Mineral Dust Experiment 2 (SAMUM-2) at Praia (Cape Verde) in January and February 2008. The aerosol radiative effect in the TIR atmospheric window region 800–1200 cm⁻¹ (8–12 µm) is discussed in two case studies. The first case study employs a combination of IASI measurements and RT simulations to investigate a lofted optically thin biomass burning layer with emphasis on its potential influence on sea surface temperature (SST) retrieval. The second case study uses ground based measurements to establish the importance of particle shape and refractive index for benchmark RT simulations of dust optical properties in the TIR domain. Our research confirms earlier studies suggesting that spheroidal model particles lead to a significantly improved agreement between RT simulations and measurements compared to spheres. However, room for improvement remains, as the uncertainty originating from the refractive index data for many aerosol constituents prohibits more conclusive results.
Goh, Yang Miang; Askar Ali, Mohamed Jawad
2016-08-01
One of the key challenges in improving construction safety and health is the management of safety behavior. From a systems point of view, workers work unsafely due to system-level issues such as poor safety culture, excessive production pressure, inadequate allocation of resources and time, and lack of training. These systemic issues should be eradicated or minimized during planning. However, there is a lack of detailed planning tools to help managers assess the impact of their upstream decisions on worker safety behavior. Even though simulation has been used in construction planning, the review conducted in this study showed that construction safety management research has not been exploiting the potential of simulation techniques. Thus, a hybrid simulation framework is proposed to facilitate the integration of safety management considerations into construction activity simulation. The hybrid framework consists of discrete event simulation (DES) as the core, but heterogeneous, interactive and intelligent (able to make decisions) agents replace traditional entities and resources. In addition, some of the cognitive processes and physiological aspects of agents are captured using a system dynamics (SD) approach. The combination of DES, agent-based simulation (ABS) and SD allows a more "natural" representation of the complex dynamics in construction activities. The proposed hybrid framework was demonstrated using a hypothetical case study. In addition, due to the lack of application of the factorial experiment approach in safety management simulation, the case study demonstrated sensitivity analysis and factorial experiments to guide future research. Copyright © 2015 Elsevier Ltd. All rights reserved.
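The feedback the hybrid framework captures, production pressure raising the chance of unsafe acts, which in turn claw back schedule time, can be collapsed into a one-worker sketch. All numbers (task rates, probabilities) are invented; the actual framework couples a full DES engine with decision-making agents and a system-dynamics layer.

```python
import random

# Single-worker sketch of the pressure -> unsafe-behaviour feedback loop:
# the further behind plan the worker falls, the likelier an unsafe shortcut.

def run(n_tasks=100, planned_rate=1.0, seed=3):
    random.seed(seed)
    t, unsafe_acts = 0.0, 0
    for i in range(n_tasks):
        pressure = max(0.0, t - i * planned_rate)    # time behind schedule
        p_unsafe = min(0.9, 0.05 + 0.1 * pressure)   # agent's decision rule
        if random.random() < p_unsafe:
            unsafe_acts += 1
            t += 0.8                                 # shortcut: quicker task
        else:
            t += random.uniform(0.9, 1.4)            # safe but slower work
    return t, unsafe_acts

total_time, unsafe = run()
```

Even this toy loop shows the self-regulating dynamic planners should worry about: tight plans do not just slip, they convert schedule pressure into unsafe acts, which is exactly the upstream-decision effect the proposed framework is meant to expose at planning time.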
Ören, Ünal; Hiller, Mauritius; Andersson, M
2017-04-28
A Monte Carlo-based stand-alone program, IDACstar (Internal Dose Assessment by Computer), was developed, dedicated to perform radiation dose calculations using complex voxel simulations. To test the program, two irradiation situations were simulated, one hypothetical contamination case with 600 MBq of 99mTc and one extravasation case involving 370 MBq of 18F-FDG. The effective dose was estimated to be 0.042 mSv for the contamination case and 4.5 mSv for the extravasation case. IDACstar has demonstrated that dosimetry results from contamination or extravasation cases can be acquired with great ease. An effective tool for radiation protection applications is provided with IDACstar allowing physicists at nuclear medicine departments to easily quantify the radiation risk of stochastic effects when a radiation accident has occurred. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
An interior-point method-based solver for simulation of aircraft parts riveting
NASA Astrophysics Data System (ADS)
Stefanova, Maria; Yakunin, Sergey; Petukhova, Margarita; Lupuleac, Sergey; Kokkolaras, Michael
2018-05-01
The particularities of the aircraft parts riveting process simulation necessitate the solution of a large amount of contact problems. A primal-dual interior-point method-based solver is proposed for solving such problems efficiently. The proposed method features a worst-case polynomial complexity bound of O(√n log(1/ε)) on the number of iterations, where n is the dimension of the problem and ε is a threshold related to the desired accuracy. In practice, the convergence is often faster than this worst case bound, which makes the method applicable to large-scale problems. The computational challenge is solving the system of linear equations because the associated matrix is ill conditioned. To that end, the authors introduce a preconditioner and a strategy for determining effective initial guesses based on the physics of the problem. Numerical results are compared with ones obtained using the Goldfarb-Idnani algorithm. The results demonstrate the efficiency of the proposed method.
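The central-path idea behind such barrier/interior-point methods can be illustrated with a toy one-dimensional log-barrier solver. This sketch is purely illustrative: it shares only the "shrink the barrier parameter, take Newton steps" structure with the paper's method, and none of its contact formulation, preconditioner, or initial-guess strategy.

```python
# Toy log-barrier method: minimize (x - 2)^2 subject to 0 <= x <= 1.
# The constrained optimum is x = 1; the iterates follow the central
# path as the barrier parameter mu is driven toward zero.
def barrier_solve(mu0=1.0, shrink=0.1, tol=1e-10):
    x, mu = 0.5, mu0
    while mu > tol:
        # Newton steps on phi(x) = (x-2)^2 - mu*(log x + log(1-x))
        for _ in range(50):
            g = 2 * (x - 2) - mu * (1 / x - 1 / (1 - x))      # phi'
            h = 2 + mu * (1 / x ** 2 + 1 / (1 - x) ** 2)       # phi''
            step = g / h
            while not (0 < x - step < 1):
                step *= 0.5  # backtrack to stay strictly feasible
            x -= step
            if abs(step) < 1e-14:
                break
        mu *= shrink  # tighten the barrier: follow the central path
    return x

x_star = barrier_solve()
print(x_star)  # approaches the constrained optimum x = 1
```

Real primal-dual solvers keep explicit dual variables and solve a (here, ill-conditioned) linear system at each step, which is where the paper's preconditioner enters.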
Kocic, Ivana; Homsek, Irena; Dacevic, Mirjana; Grbic, Sandra; Parojcic, Jelena; Vucicevic, Katarina; Prostran, Milica; Miljkovic, Branislava
2012-04-01
The aim of this case study was to develop a drug-specific absorption model for levothyroxine (LT4) using mechanistic gastrointestinal simulation technology (GIST) implemented in the GastroPlus™ software package. The required input parameters were determined experimentally, predicted in silico and/or taken from the literature. The simulated plasma profile was similar to and in good agreement with the data observed in the in vivo bioequivalence study, indicating that the GIST model gave an accurate prediction of LT4 oral absorption. Additionally, plasma concentration-time profiles were simulated based on a set of experimental and virtual in vitro dissolution data in order to estimate the influence of different in vitro drug dissolution kinetics on the simulated plasma profiles and to identify a biorelevant dissolution specification for LT4 immediate-release (IR) tablets. A set of experimental and virtual in vitro data was also used for correlation purposes. An in vitro-in vivo correlation model based on the convolution approach was applied in order to assess the relationship between the in vitro and in vivo data. The obtained results suggest that more than 85% LT4 dissolved in 60 min might be considered as a biorelevant dissolution specification criterion for LT4 IR tablets. Copyright © 2012 John Wiley & Sons, Ltd.
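The convolution approach mentioned above can be sketched numerically: a plasma profile is approximated by convolving an in vitro dissolution input rate with a unit-impulse disposition function. The one-compartment disposition, the first-order dissolution curve, and the rate constants below are hypothetical stand-ins, not the study's fitted model.

```python
import math

DT = 0.25    # time step, h
K_EL = 0.12  # elimination rate constant, 1/h (illustrative)

def dissolved_fraction(t, t85=1.0):
    # First-order dissolution reaching ~85% dissolved around t85 hours
    return 1.0 - math.exp(-1.9 * t / t85)

def plasma_profile(n_steps=48):
    times = [i * DT for i in range(n_steps)]
    # Input rate = finite-difference derivative of the dissolution curve
    rate = [(dissolved_fraction(t + DT) - dissolved_fraction(t)) / DT
            for t in times]
    # Discrete convolution with an exp(-k_el * t) impulse response
    conc = []
    for i, t in enumerate(times):
        c = sum(rate[j] * math.exp(-K_EL * (t - times[j])) * DT
                for j in range(i + 1))
        conc.append(c)
    return times, conc

times, conc = plasma_profile()
t_max = times[conc.index(max(conc))]
print(round(t_max, 2), round(max(conc), 3))
```

Swapping in a slower virtual dissolution curve and re-running the convolution is how such a model probes the effect of dissolution kinetics on the simulated plasma profile.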
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grinstein, F. F.; Saenz, J. A.; Dolence, J. C.
In this paper, transition and turbulence decay with the Taylor–Green vortex have been effectively used to demonstrate emulation of high Reynolds-number ( R e ) physical dissipation through numerical convective effects of various non-oscillatory finite-volume algorithms for implicit large eddy simulation (ILES), e.g. using the Godunov-based Eulerian adaptive mesh refinement code xRAGE. The inverse-chevron shock tube experiment simulations have also been used to assess xRAGE based ILES for shock driven turbulent mixing, compared with available simulation and laboratory data. The previous assessments are extended to evaluate new directionally-unsplit high-order algorithms in xRAGE, including a correction to address the well-known issue of excessive numerical diffusion of shock-capturing (e.g., Godunov-type) schemes for low Mach numbers. The unsplit options for hydrodynamics in xRAGE are discussed in detail, followed by fundamental tests with representative shock problems. Basic issues of transition to turbulence and turbulent mixing are discussed, and results of simulations of high- R e turbulent flow and mixing in canonical test cases are reported. Finally, compared to the directional-split cases, and for each grid resolution considered, unsplit results exhibit transition to turbulence with much higher effective R e, and significantly more so with the low Mach number correction.
1D kinetic simulations of a short glow discharge in helium
NASA Astrophysics Data System (ADS)
Yuan, Chengxun; Bogdanov, E. A.; Eliseev, S. I.; Kudryavtsev, A. A.
2017-07-01
This paper presents a 1D model of a direct current glow discharge based on the solution of the kinetic Boltzmann equation in the two-term approximation. The model takes into account electron-electron Coulomb collisions; the corresponding collision integral is written in both detailed and simplified forms. The Boltzmann equation for electrons is coupled with continuity equations for ions and metastable atoms and the Poisson equation for the electric potential. Simulations are carried out self-consistently over the whole length of the discharge in helium (from cathode to anode) for the cases p = 1 Torr, L = 3.6 cm and p = 20 Torr, L = 1.8 mm, so that pL = 3.6 cm·Torr in both cases. It is shown that simulations based on the kinetic approach give lower values of electron temperature in the plasma than fluid simulations. Peaks in the spatial differential flux corresponding to electrons originating from superelastic collisions and Penning ionization were observed in the simulations. Different approaches to taking Coulomb collisions into account give significantly different values of electron density and electron temperature in the plasma. Analysis showed that using the simplified approach gives a non-zero contribution to the electron energy balance, which is comparable to the energy losses from elastic and inelastic collisions, leads to significant errors and thus is not recommended.
Collaborative simulation method with spatiotemporal synchronization process control
NASA Astrophysics Data System (ADS)
Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian
2016-10-01
When designing a complex mechatronic system, such as a high-speed train, it is relatively difficult to effectively simulate the entire system's dynamic behavior because it involves multi-disciplinary subsystems. Currently, the most practical approach to multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal unsynchronization among the multi-directional coupling simulations of subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupling-simulating a given complex mechatronic system across multiple subsystems on different platforms. The method consists of 1) a coupler-based coupling mechanism to define the interfacing and interaction mechanisms among subsystems, and 2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can certainly be used to simulate subsystem interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high-speed train design and development processes, demonstrating that it can be applied to a wide range of engineering system design and simulation tasks with improved efficiency and effectiveness.
Three-Dimensional Imaging in Rhinoplasty: A Comparison of the Simulated versus Actual Result.
Persing, Sarah; Timberlake, Andrew; Madari, Sarika; Steinbacher, Derek
2018-05-22
Computer imaging has become increasingly popular for rhinoplasty. Three-dimensional (3D) analysis permits a more comprehensive view from multiple vantage points. However, the predictability and concordance between the simulated and actual result have not been morphometrically studied. The purpose of this study was to aesthetically and quantitatively compare the simulated to actual rhinoplasty result. A retrospective review of 3D images (VECTRA, Canfield) for rhinoplasty patients was performed. Images (preop, simulated, and actual) were randomized. A blinded panel of physicians rated the images (1 = poor, 5 = excellent). The image series considered "best" was also recorded. A quantitative assessment of nasolabial angle and tip projection was compared. Paired and two-sample t tests were performed for statistical analysis (P < 0.05 as significant). Forty patients were included. 67.5% of preoperative images were rated as poor (mean = 1.7). The simulation received a mean score of 2.9 (good in 60% of cases). 82.5% of actual cases were rated good to excellent (mean 3.4) (P < 0.001). Overall, the panel significantly preferred the actual postoperative result in 77.5% of cases compared to the simulation in 22.5% of cases (P < 0.001). The actual nasal tip was more projected compared to the simulations for both males and females. There was no significant difference in nasal tip rotation between simulated and postoperative groups. 3D simulation is a powerful communication and planning tool in rhinoplasty. In this study, the actual result was deemed more aesthetic than the simulated image. Surgeon experience is important to translate the plan and achieve favorable postoperative results.
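The paired t test used for such rater comparisons is easy to reproduce from first principles. The rating vectors below are invented purely to show the computation; they are not the study's data.

```python
import math

def paired_t(a, b):
    """Paired t statistic for two matched samples (n-1 dof)."""
    d = [x - y for x, y in zip(a, b)]          # per-subject differences
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                    # standard error of mean diff
    return mean / se

# Hypothetical panel ratings (1 = poor .. 5 = excellent) for the same
# eight patients, actual result vs. 3D simulation.
actual    = [3.5, 3.2, 3.8, 3.4, 3.6, 3.1, 3.7, 3.3]
simulated = [2.9, 3.0, 3.1, 2.8, 3.2, 2.7, 3.0, 2.9]
t_stat = paired_t(actual, simulated)
print(round(t_stat, 2))
```

The statistic would then be compared against the t distribution with n - 1 degrees of freedom to obtain the P value reported in the abstract.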
NASA Astrophysics Data System (ADS)
Mooers, Christopher N. K.; Bang, Inkweon; Sandoval, Francisco J.
2005-06-01
The Princeton Ocean Model (POM), as implemented for the Japan (East) Sea (JES) with mesoscale-admitting resolution, is driven by seasonal throughflow and synoptic atmospheric forcing for 1999 through 2001. Temperature and salinity profiles from shipborne and PALACE float CTDs, and horizontal velocities at 800 m from PALACE float trajectories, plus horizontal velocities at 15 m from WOCE surface drifters for 1988 through 2001, are used to assess the performance of the numerical simulations for a base case. General agreement exists in the circulation at 15 and 800 m and the horizontal and vertical structure of the upper ocean temperature and salinity fields. The mean observed flow at 15 m defines the two branches of the Tsushima Warm Current and hints at the existence of a large cyclonic gyre over the Japan Basin, which the simulations also produce. The mean observed flow at 800 m defines a large cyclonic recirculation gyre over the Japan Basin that validates the simulated flow pattern. Variances of the observed and simulated flows at 15 and 800 m have similar patterns. The main discrepancies are associated with the strength of the seasonal thermocline and halocline and the location of the Subpolar Front. When smoother topography and smaller lateral friction are used in other cases, the thermocline and halocline strengthen, agreeing better with the observed values, and when 80% of total outflow transport is forced to exit through Soya Strait, the Subpolar Front extends along the coast to the north of Tsugaru Strait, which is an observed feature absent in the base case.
NASA Astrophysics Data System (ADS)
Petsev, Nikolai D.; Leal, L. Gary; Shell, M. Scott
2017-12-01
Hybrid molecular-continuum simulation techniques afford a number of advantages for problems in the rapidly burgeoning area of nanoscale engineering and technology, though they are typically quite complex to implement and limited to single-component fluid systems. We describe an approach for modeling multicomponent hydrodynamic problems spanning multiple length scales when using particle-based descriptions for both the finely resolved (e.g., molecular dynamics) and coarse-grained (e.g., continuum) subregions within an overall simulation domain. This technique is based on the multiscale methodology previously developed for mesoscale binary fluids [N. D. Petsev, L. G. Leal, and M. S. Shell, J. Chem. Phys. 144, 084115 (2016)], simulated using a particle-based continuum method known as smoothed dissipative particle dynamics. An important application of this approach is the ability to perform coupled molecular dynamics (MD) and continuum modeling of molecularly miscible binary mixtures. In order to validate this technique, we investigate multicomponent hybrid MD-continuum simulations at equilibrium, as well as non-equilibrium cases featuring concentration gradients.
NASA Astrophysics Data System (ADS)
Schilling, Oleg
2016-11-01
Two-, three- and four-equation, single-velocity, multicomponent Reynolds-averaged Navier-Stokes (RANS) models, based on the turbulent kinetic energy dissipation rate or lengthscale, are used to simulate At = 0.5 Rayleigh-Taylor turbulent mixing with constant and complex accelerations. The constant acceleration case is inspired by the Cabot and Cook (2006) DNS, and the complex acceleration cases are inspired by the unstable/stable and unstable/neutral cases simulated using DNS (Livescu, Wei & Petersen 2011) and the unstable/stable/unstable case simulated using ILES (Ramaprabhu, Karkhanis & Lawrie 2013). The four-equation models couple equations for the mass flux a and negative density-specific volume correlation b to the K-ε or K-L equations, while the three-equation models use a two-fluid algebraic closure for b. The lengthscale-based models are also applied with no buoyancy production in the L equation to explore the consequences of neglecting this term. Predicted mixing widths, turbulence statistics, fields, and turbulent transport equation budgets are compared among these models to identify similarities and differences in the turbulence production, dissipation and diffusion physics represented by the closures used in these models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
The many-body Wigner Monte Carlo method for time-dependent ab-initio quantum simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sellier, J.M., E-mail: jeanmichel.sellier@parallel.bas.bg; Dimov, I.
2014-09-15
The aim of ab-initio approaches is the simulation of many-body quantum systems from the first principles of quantum mechanics. These methods are traditionally based on the many-body Schrödinger equation which represents an incredible mathematical challenge. In this paper, we introduce the many-body Wigner Monte Carlo method in the context of distinguishable particles and in the absence of spin-dependent effects. Despite these restrictions, the method has several advantages. First of all, the Wigner formalism is intuitive, as it is based on the concept of a quasi-distribution function. Secondly, the Monte Carlo numerical approach allows scalability on parallel machines that is practically unachievable by means of other techniques based on finite difference or finite element methods. Finally, this method allows time-dependent ab-initio simulations of strongly correlated quantum systems. In order to validate our many-body Wigner Monte Carlo method, as a case study we simulate a relatively simple system consisting of two particles in several different situations. We first start from two non-interacting free Gaussian wave packets. We, then, proceed with the inclusion of an external potential barrier, and we conclude by simulating two entangled (i.e. correlated) particles. The results show how, in the case of negligible spin-dependent effects, the many-body Wigner Monte Carlo method provides an efficient and reliable tool to study the time-dependent evolution of quantum systems composed of distinguishable particles.
NASA Astrophysics Data System (ADS)
Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi
2015-11-01
Large-scale regional evacuation is an important part of national security emergency response plans. The emergency evacuation of large commercial shopping areas, as typical service systems, is one of the hot research topics. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation within a commercial shopping mall. Pedestrian walking is based on Cellular Automata and the event-driven model. In this paper, the event-driven model is adopted to simulate pedestrian movement patterns, and the simulation process is divided into a normal situation and emergency evacuation. The model is composed of four layers: an environment layer, customer layer, clerk layer and trajectory layer. For the simulation of pedestrian movement routes, the model takes into account the purchase intention of customers and the density of pedestrians. Based on the evacuation model of Cellular Automata with a Dynamic Floor Field and the event-driven model, we can reflect the behavior characteristics of customers and clerks in normal and emergency evacuation situations. The distribution of individual evacuation times as a function of initial position and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model combining Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of shopping malls.
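The static part of a floor-field CA can be sketched compactly: a BFS distance-to-exit field guides pedestrians one cell per time step. The grid size, exit position and occupant placement below are hypothetical; the paper's model additionally includes a dynamic field, agent roles (customers/clerks) and event scheduling.

```python
from collections import deque

W, H = 8, 6        # grid dimensions (cells)
EXIT = (0, 3)      # single exit cell

def floor_field():
    """Static floor field: BFS distance from every cell to the exit."""
    dist = {EXIT: 0}
    q = deque([EXIT])
    while q:
        x, y = q.popleft()
        for nx, ny in ((x+1, y), (x-1, y), (x, y+1), (x, y-1)):
            if 0 <= nx < W and 0 <= ny < H and (nx, ny) not in dist:
                dist[(nx, ny)] = dist[(x, y)] + 1
                q.append((nx, ny))
    return dist

def evacuate(peds):
    """Advance pedestrians downhill on the field; return time steps."""
    field = floor_field()
    t = 0
    while peds:
        nxt = set()
        for x, y in peds:
            # candidate cells: stay, or any in-grid von Neumann neighbor
            options = [(x, y)] + [(nx, ny) for nx, ny in
                       ((x+1, y), (x-1, y), (x, y+1), (x, y-1))
                       if (nx, ny) in field]
            best = min(options, key=lambda c: field[c])
            if best == EXIT:
                continue          # pedestrian leaves the building
            if best in nxt:
                best = (x, y)     # cell conflict: stay put this step
            nxt.add(best)
        peds = nxt
        t += 1
    return t

print(evacuate({(7, 0), (7, 5), (4, 3)}))
```

A dynamic floor field would add a second, pedestrian-deposited "trace" term to the `min` key, which is what lets the full model capture herding behavior.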
Schaumberg, A
2015-04-01
Simulation often relies on a case-based learning approach and is used as a teaching tool for a variety of audiences. The knowledge transfer goes beyond the mere exchange of soft skills and practical abilities and also includes practical knowledge and decision-making behavior; however, verification of knowledge or practical skills seldom unfolds during simulations. Simulation-based learning seems to affect many learning domains and can, therefore, be considered to be multifactorial in nature. At present, studies examining the effects of learning environments with varying levels of reality on the cognitive long-term retention of students are lacking. The present study focused on the question whether case scenarios with varying levels of reality produce differences in the cognitive long-term retention of students, in particular with regard to the learning dimensions knowledge, understanding and transfer. The study was conducted on 153 students in the first clinical semester at the Justus-Liebig University of Giessen. Students were randomly selected and subsequently assigned, also in a random fashion, to two practice groups, i.e. realistic and unrealistic. In both groups the students were presented with standardized case scenarios consisting of three case studies, which were accurately defined with a case report containing a detailed description of each scenario and all relevant values so as to ensure identical conditions for both groups. The unrealistic group sat in an unfurnished practice room as a learning environment. The realistic group sat in a furnished learning environment with various background pictures and ambient noise. Students received examination questions before, immediately following and 14 days after the practice. Examination questions were identical at each of the three time points, classified into three learning dimensions following Bloom's taxonomy and evaluated. 
Furthermore, examination questions were supplemented by a questionnaire concerning the individual perception of reality and own learning success, to be filled in by students immediately after the practice. Examination questions and questionnaires were anonymous but associated with each other. Even with less experienced participants, a realistic simulation design led to a significant increase in knowledge immediately after the end of the simulation. This effect, however, did not impact the cognitive long-term retention of students. While the realistic group showed higher initial knowledge after the simulation, this "knowledge delta" was forgotten within 14 days, putting them back on par with the unrealistic comparison group. It was demonstrated that 2 weeks after the practice, comprehension questions were answered significantly better than those on pure knowledge. Therefore, it can be concluded that even vaguely realistic simulation scenarios affect the learning dimension of understanding. For simulation-based learning the outcome depends not only on knowledge, practical skills and motivational variables but also on the onset of negative emotions, perception of own ability and personality profile. Simulation training alone does not appear to guarantee learning success; rather, it seems to be necessary to establish a simulation setting suitable for the education level, needs and personality characteristics of the students.
A Simulation for Teaching the Basic and Clinical Science of Fluid Therapy
ERIC Educational Resources Information Center
Rawson, Richard E.; Dispensa, Marilyn E.; Goldstein, Richard E.; Nicholson, Kimberley W.; Vidal, Noni Korf
2009-01-01
The course "Management of Fluid and Electrolyte Disorders" is an applied physiology course taught using lectures and paper-based cases. The course approaches fluid therapy from both basic science and clinical perspectives. While paper cases provide a basis for application of basic science concepts, they lack key components of genuine clinical…
NASA Astrophysics Data System (ADS)
Wang, XiaoLiang; Li, JiaChun
2017-12-01
A new solver based on the high-resolution scheme with novel treatments of source terms and interface capture for the Savage-Hutter model is developed to simulate granular avalanche flows. The capability to simulate flow spread and deposit processes is verified through indoor experiments of a two-dimensional granular avalanche. Parameter studies show that reduction in bed friction enhances runout efficiency, and that lower earth pressure restraints enlarge the deposit spread. The April 9, 2000, Yigong avalanche in Tibet, China, is simulated as a case study by this new solver. The predicted results, including evolution process, deposit spread, and hazard impacts, generally agree with site observations. It is concluded that the new solver for the Savage-Hutter equation provides a comprehensive software platform for granular avalanche simulation at both experimental and field scales. In particular, the solver can be a valuable tool for providing necessary information for hazard forecasts, disaster mitigation, and countermeasure decisions in mountainous areas.
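The reported trend that lower bed friction enhances runout can be checked with a back-of-the-envelope "energy line" model: a point mass dropping a height H against Coulomb bed friction with coefficient mu travels a horizontal distance L = H / mu before its potential energy is consumed. The values below are illustrative and not calibrated to the Yigong event or the Savage-Hutter solver.

```python
def runout_length(drop_height_m, bed_friction_mu):
    """Energy-line runout: horizontal travel before frictional arrest."""
    return drop_height_m / bed_friction_mu

H = 1000.0  # metres of vertical drop (illustrative)
for mu in (0.6, 0.4, 0.2):
    print(mu, runout_length(H, mu))
```

Halving the friction coefficient doubles the predicted runout, which is the qualitative sensitivity the parameter study describes; the full depth-averaged model adds earth-pressure and momentum effects on top of this balance.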
He, Xueqin; Han, Lujia; Ge, Jinyi; Huang, Guangqun
2018-04-01
This study establishes a mathematical model to rationally describe the dynamic changes and spatial distribution of temperature and oxygen concentration in the aerobic composting process, using coupled mass-heat-momentum transfer based on the microbial mechanism. Two composting experiments under different conditions, namely continuous aeration and intermittent aeration, were performed to verify the proposed model. The results show that the model accurately predicted the dynamic changes in temperature (case I: R² = 0.93, RMSE = 1.95 K; case II: R² = 0.86, RMSE = 4.69 K) and oxygen concentration (case I: R² = 0.90, RMSE = 1.26%; case II: R² = 0.75, RMSE = 2.93%) at the central point of the compost substrate. It also systematically simulated fluctuations in oxygen concentration caused by boundary conditions and the spatial distribution of the actual temperature and oxygen concentration. The proposed model exhibits good applicability in simulating the actual working conditions of the aerobic composting process. Copyright © 2018 Elsevier Ltd. All rights reserved.
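The R² and RMSE goodness-of-fit metrics quoted above are straightforward to compute. The observed/predicted temperature series here are invented purely to show the computation, not the study's data.

```python
import math

def r2_rmse(obs, pred):
    """Coefficient of determination and root-mean-square error."""
    n = len(obs)
    mean = sum(obs) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))  # residual SS
    ss_tot = sum((o - mean) ** 2 for o in obs)             # total SS
    return 1 - ss_res / ss_tot, math.sqrt(ss_res / n)

obs  = [300.0, 310.0, 325.0, 333.0, 330.0, 320.0]   # K, hypothetical
pred = [302.0, 308.0, 322.0, 330.0, 331.0, 318.0]
r2, rmse = r2_rmse(obs, pred)
print(round(r2, 3), round(rmse, 2))
```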
A parabolic model of drag coefficient for storm surge simulation in the South China Sea
Peng, Shiqiu; Li, Yineng
2015-01-01
Drag coefficient (Cd) is an essential metric in the calculation of momentum exchange over the air-sea interface and thus has large impacts on the simulation or forecast of the upper ocean state associated with sea surface winds such as storm surges. Generally, Cd is a function of wind speed. However, the exact relationship between Cd and wind speed is still in dispute, and the widely-used formula that is a linear function of wind speed in an ocean model could lead to large bias at high wind speed. Here we establish a parabolic model of Cd based on storm surge observations and simulation in the South China Sea (SCS) through a number of tropical cyclone cases. Simulation of storm surges for independent tropical cyclone (TC) cases indicates that the new parabolic model of Cd outperforms traditional linear models. PMID:26499262
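Why the functional form of Cd matters follows from the standard bulk formula for wind stress, tau = rho_air · Cd(U) · U². The linear and parabolic coefficient values below are hypothetical stand-ins chosen only to show the divergence at high wind speed; they are not the fitted SCS model.

```python
RHO_AIR = 1.2  # air density, kg m^-3

def cd_linear(u):
    # Common linear-in-wind-speed form (coefficients illustrative)
    return (0.5 + 0.065 * u) * 1e-3

def cd_parabolic(u, u_peak=30.0):
    # Hypothetical parabola that peaks near u_peak and decays beyond it,
    # mimicking the observed levelling-off of Cd at high winds
    return max(1e-4, (2.0 - 1.2e-3 * (u - u_peak) ** 2) * 1e-3)

for u in (10.0, 30.0, 50.0):           # wind speed, m/s
    tau_lin = RHO_AIR * cd_linear(u) * u * u
    tau_par = RHO_AIR * cd_parabolic(u) * u * u
    print(u, round(tau_lin, 2), round(tau_par, 2))
```

At typhoon-strength winds the linear form keeps growing while the parabolic form saturates and falls, so the two produce very different surge forcing, which is the bias the abstract describes.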
Blended learning in surgery using the Inmedea Simulator.
Funke, Katrin; Bonrath, Esther; Mardin, Wolf Arif; Becker, Jan Carl; Haier, Joerg; Senninger, Norbert; Vowinkel, Thorsten; Hoelzen, Jens Peter; Mees, Soeren Torge
2013-02-01
Recently, medical education in surgery has experienced several modifications. We have implemented a blended learning module in our teaching curriculum to evaluate its effectiveness, applicability, and acceptance in surgical education. In this prospective study, the traditional face-to-face learning of our teaching curriculum for fourth-year medical students (n = 116) was augmented by the Inmedea Simulator, a web-based E-learning system, with six virtual patient cases. Student results were documented by the system and learning success was determined by comparing patient cases with comparable diseases (second and sixth case). The acceptance among the students was evaluated with a questionnaire. After using the Inmedea Simulator, correct diagnoses were found significantly (P < 0.05) more often, while an incomplete diagnosis was seen significantly (P < 0.05) less often. A significant overall improvement (P < 0.05) was seen in the sixth case (62.3 ± 5.6%) vs. the second case (53.9 ± 5.6%). The questionnaire revealed that our students enjoyed the surgical seminar (score 2.1 ± 1.5) and preferred blended learning (score 2.5 ± 1.2) to conventional teaching. The blended learning approach using the Inmedea Simulator was highly appreciated by our medical students and resulted in a significant learning success. Blended learning appears to be a suitable tool to complement traditional teaching in surgery.
Model-based verification and validation of the SMAP uplink processes
NASA Astrophysics Data System (ADS)
Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.
Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.
NASA Astrophysics Data System (ADS)
Sanchez, K.; Roberts, G.; Calmer, R.; Nicoll, K.; Hashimshoni, E.; Rosenfeld, D.; Ovadnevaite, J.; Preissler, J.; Ceburnis, D.; O'Dowd, C. D. D.; Russell, L. M.
2017-12-01
Top-down and bottom-up aerosol-cloud shortwave radiative flux closures were conducted at the Mace Head atmospheric research station in Galway, Ireland in August 2015. Instrument platforms include ground-based, unmanned aerial vehicles (UAV), and satellite measurements of aerosols, clouds and meteorological variables. The ground-based and airborne measurements of aerosol size distributions and cloud condensation nuclei (CCN) concentration were used to initiate a 1D microphysical aerosol-cloud parcel model (ACPM). UAVs were equipped for a specific science mission, with an optical particle counter for aerosol distribution profiles, a cloud sensor to measure cloud extinction, or a 5-hole probe for 3D wind vectors. These are the first UAV measurements at Mace Head. ACPM simulations are compared to in-situ cloud extinction measurements from UAVs to quantify closure in terms of cloud shortwave radiative flux. Two out of seven cases exhibit sub-adiabatic vertical temperature profiles within the cloud, which suggests that entrainment processes affect cloud microphysical properties and lead to an overestimate of simulated cloud shortwave radiative flux. Including an entrainment parameterization and explicitly calculating the entrainment fraction in the ACPM simulations both improved cloud-top radiative closure. Entrainment reduced the difference between simulated and observation-derived cloud-top shortwave radiative flux (δRF) by between 25 W m-2 and 60 W m-2. After accounting for entrainment, satellite-derived cloud droplet number concentrations (CDNC) were within 30% of simulated CDNC. In cases with a well-mixed boundary layer, δRF is no greater than 20 W m-2 after accounting for cloud-top entrainment, and up to 50 W m-2 when entrainment is not taken into account. 
In cases with a decoupled boundary layer, cloud microphysical properties are inconsistent with ground-based aerosol measurements, as expected, and δRF is as high as 88 W m-2, remaining high (> 30 W m-2) even after accounting for cloud-top entrainment. This work demonstrates the need to take in-situ measurements of aerosol properties in cases where the boundary layer is decoupled, as well as to account for cloud-top entrainment, to accurately model stratocumulus cloud radiative flux.
Generalized Sheet Transition Condition FDTD Simulation of Metasurface
NASA Astrophysics Data System (ADS)
Vahabzadeh, Yousef; Chamanara, Nima; Caloz, Christophe
2018-01-01
We propose an FDTD scheme based on Generalized Sheet Transition Conditions (GSTCs) for the simulation of polychromatic, nonlinear and space-time varying metasurfaces. This scheme consists of placing the metasurface at a virtual nodal plane introduced between regular nodes of the staggered Yee grid and inserting the fields determined by the GSTCs in this plane into the standard FDTD algorithm. The resulting update equations are an elegant generalization of the standard FDTD equations. Indeed, in the limiting case of a null surface susceptibility ($\chi_\text{surf}=0$), they reduce to the latter, while in the next limiting case of a time-invariant metasurface $[\chi_\text{surf}$
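For context, the standard FDTD update equations that the GSTC scheme generalizes can be sketched in one dimension; this toy vacuum simulation uses normalized units (c = 1) and does not model the metasurface itself:

```python
import numpy as np

# Minimal 1D Yee-grid FDTD sketch (vacuum, normalized units, PEC ends).
# This is the standard leapfrog update that the GSTC scheme generalizes;
# the metasurface itself is NOT modeled here.
nz, nt = 200, 400
dz = 1.0
dt = 0.5 * dz                 # CFL-stable time step (Courant number 0.5)
Ex = np.zeros(nz)             # E field on integer nodes
Hy = np.zeros(nz - 1)         # H field on staggered half nodes

for n in range(nt):
    # H update at the half time step
    Hy += (dt / dz) * (Ex[1:] - Ex[:-1])
    # E update on interior nodes (boundaries held at zero)
    Ex[1:-1] += (dt / dz) * (Hy[1:] - Hy[:-1])
    # soft Gaussian source near the left boundary
    Ex[10] += np.exp(-((n - 60) / 20.0) ** 2)
```

With a Courant number of 0.5 the scheme is stable, so the injected pulse propagates and reflects without blowing up.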
Ahmed, Aqeel; Rippmann, Friedrich; Barnickel, Gerhard; Gohlke, Holger
2011-07-25
A three-step approach for multiscale modeling of protein conformational changes is presented that incorporates information about preferred directions of protein motions into a geometric simulation algorithm. The first two steps are based on a rigid cluster normal-mode analysis (RCNMA). Low-frequency normal modes are used in the third step (NMSim) to extend the recently introduced idea of constrained geometric simulations of diffusive motions in proteins by biasing backbone motions of the protein, whereas side-chain motions are biased toward favorable rotamer states. The generated structures are iteratively corrected regarding steric clashes and stereochemical constraint violations. The approach allows performing three simulation types: unbiased exploration of conformational space; pathway generation by a targeted simulation; and radius of gyration-guided simulation. When applied to a data set of proteins with experimentally observed conformational changes, conformational variabilities are reproduced very well for 4 out of 5 proteins that show domain motions, with correlation coefficients r > 0.70 and as high as r = 0.92 in the case of adenylate kinase. In 7 out of 8 cases, NMSim simulations starting from unbound structures are able to sample conformations that are similar (root-mean-square deviation = 1.0-3.1 Å) to ligand bound conformations. An NMSim generated pathway of conformational change of adenylate kinase correctly describes the sequence of domain closing. The NMSim approach is a computationally efficient alternative to molecular dynamics simulations for conformational sampling of proteins. The generated conformations and pathways of conformational transitions can serve as input to docking approaches or as starting points for more sophisticated sampling techniques.
Effects of a System Thinking-Based Simulation Program for Congestive Heart Failure.
Kim, Hyeon-Young; Yun, Eun Kyoung
2018-03-01
This study evaluated a system thinking-based simulation program for the care of patients with congestive heart failure. Participants were 67 undergraduate nursing students from a nursing college in Seoul, South Korea. The experimental group was given a 4-hour system-thinking program and a 2-hour simulation program, whereas the control group had a 4-hour case study and a 2-hour simulation program. There were significant improvements in critical thinking in both groups, but no significant group differences between educational methods (F = 3.26, P = .076). Problem-solving ability in the experimental group was significantly higher than in the control group (F = 5.04, P = .028). Clinical competency skills in the experimental group were higher than in the control group (t = 2.12, P = .038). A system thinking-based simulation program is a more effective learning method in terms of problem-solving ability and clinical competency skills compared to the existing simulation program. Further research using a longitudinal study is needed to test the long-term effect of the intervention and apply it to the nursing curriculum.
Cow-specific treatment of clinical mastitis: an economic approach.
Steeneveld, W; van Werven, T; Barkema, H W; Hogeveen, H
2011-01-01
Under Dutch circumstances, most clinical mastitis (CM) cases of cows on dairy farms are treated with a standard intramammary antimicrobial treatment. Several antimicrobial treatments are available for CM, differing in antimicrobial compound, route of application, duration, and cost. Because cow factors (e.g., parity, stage of lactation, and somatic cell count history) and the causal pathogen influence the probability of cure, cow-specific treatment of CM is often recommended. The objective of this study was to determine if cow-specific treatment of CM is economically beneficial. Using a stochastic Monte Carlo simulation model, 20,000 CM cases were simulated. These CM cases were caused by Streptococcus uberis and Streptococcus dysgalactiae (40%), Staphylococcus aureus (30%), or Escherichia coli (30%). For each simulated CM case, the consequences of using different antimicrobial treatment regimens (standard 3-d intramammary, extended 5-d intramammary, combination 3-d intramammary+systemic, combination 3-d intramammary+systemic+1-d nonsteroidal antiinflammatory drugs, and combination extended 5-d intramammary+systemic) were simulated simultaneously. Finally, total costs of the 5 antimicrobial treatment regimens were compared. Some model inputs were based on information from the literature; assumptions made by the authors were used where no information was available. Bacteriological cure for each individual cow depended on the antimicrobial treatment regimen, the causal pathogen, and the cow factors parity, stage of lactation, somatic cell count history, CM history, and whether the cow was systemically ill. Total costs for each case depended on treatment costs for the initial CM case (including costs for antibiotics, milk withdrawal, and labor), treatment costs for follow-up CM cases, costs for milk production losses, and costs for culling. Average total costs for CM using the 5 treatments were (US) $224, $247, $253, $260, and $275, respectively.
Average probabilities of bacteriological cure for the 5 treatments were 0.53, 0.65, 0.65, 0.68, and 0.75, respectively. For all different simulated CM cases, the standard 3-d intramammary antimicrobial treatment had the lowest total costs. The benefits of lower costs for milk production losses and culling for cases treated with the intensive treatments did not outweigh the higher treatment costs. The stochastic model was developed using information from the literature and assumptions made by the authors. Using these information sources resulted in a difference in effectiveness of different antimicrobial treatments for CM. Based on our assumptions, cow-specific treatment of CM was not economically beneficial. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
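At its core, a cost comparison of this kind is a Monte Carlo loop over simulated cases. In the sketch below the cure probabilities (0.53 and 0.65) come from the reported averages, but the treatment and follow-up cost figures are invented placeholders, not the study's inputs:

```python
import random

random.seed(1)

# Illustrative Monte Carlo sketch comparing treatment regimens by mean
# total cost. Cost figures are made-up placeholders; only the cure
# probabilities are taken from the reported averages.
regimens = {
    "standard_3d": {"treat_cost": 60.0, "p_cure": 0.53},
    "extended_5d": {"treat_cost": 95.0, "p_cure": 0.65},
}
FOLLOWUP_COST = 350.0   # assumed cost incurred when the case fails to cure

def mean_total_cost(reg, n_cases=20000):
    total = 0.0
    for _ in range(n_cases):
        cost = reg["treat_cost"]
        if random.random() > reg["p_cure"]:   # no bacteriological cure
            cost += FOLLOWUP_COST
        total += cost
    return total / n_cases

costs = {name: mean_total_cost(r) for name, r in regimens.items()}
```

Under these placeholder numbers, the cheaper regimen with the lower cure rate still wins on average, mirroring the study's qualitative finding that higher cure rates did not outweigh higher treatment costs.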
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.; Stueber, Thomas J.
2013-01-01
A dual flow-path inlet system is being tested to evaluate methodologies for a Turbine Based Combined Cycle (TBCC) propulsion system to perform a controlled inlet mode transition. Prior to experimental testing, simulation models are used to test, debug, and validate potential control algorithms. One simulation package being used for testing is the High Mach Transient Engine Cycle Code simulation, known as HiTECC. This paper discusses the closed loop control system, which utilizes a shock location sensor to improve inlet performance and operability. Even though the shock location feedback has a coarse resolution, the feedback allows for a reduction in steady state error and, in some cases, better performance than with previously proposed pressure-ratio based methods. This paper demonstrates the design and benefits of implementing a proportional-integral controller, an H-Infinity based controller, and a disturbance observer based controller.
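A minimal discrete proportional-integral loop of the kind described can be sketched as follows; the gains, setpoint, and the toy first-order plant standing in for the shock-position dynamics are illustrative, not HiTECC values:

```python
# Minimal discrete PI controller sketch. The first-order "plant" below is
# a stand-in for shock-position dynamics; gains and setpoint are
# illustrative placeholders, not values from the paper.
def simulate_pi(kp=0.8, ki=0.5, setpoint=1.0, dt=0.01, steps=2000):
    y, integ = 0.0, 0.0
    for _ in range(steps):
        err = setpoint - y
        integ += err * dt              # accumulate integral of error
        u = kp * err + ki * integ      # PI control law
        y += dt * (-y + u)             # toy first-order plant, Euler step
    return y

final = simulate_pi()
```

The integral term drives the steady-state error to zero, which is exactly the benefit the shock-location feedback is credited with over pure pressure-ratio schemes.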
Simulation of biochemical reactions with time-dependent rates by the rejection-based algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thanh, Vo Hong, E-mail: vo@cosbi.eu; Priami, Corrado, E-mail: priami@cosbi.eu; Department of Mathematics, University of Trento, Trento
We address the problem of simulating biochemical reaction networks with time-dependent rates and propose a new algorithm based on our rejection-based stochastic simulation algorithm (RSSA) [Thanh et al., J. Chem. Phys. 141(13), 134116 (2014)]. The computation for selecting next reaction firings by our time-dependent RSSA (tRSSA) is computationally efficient. Furthermore, by exploiting the rejection-based mechanism, the generated trajectory is exact. We benchmark tRSSA on different biological systems with varying forms of reaction rates to demonstrate its applicability and efficiency. We reveal that for nontrivial cases, the selection of reaction firings in existing algorithms introduces approximations because the integration of reaction rates is very computationally demanding and simplifying assumptions are introduced. The selection of the next reaction firing by our approach is easier while preserving the exactness.
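The rejection idea is essentially thinning of a nonhomogeneous Poisson process: propose candidate firing times from a constant upper-bound rate and accept each with probability rate(t)/bound, which avoids integrating rate(t) exactly. A minimal single-reaction sketch (toy rate function, not from the paper):

```python
import math
import random

random.seed(7)

# Thinning sketch in the spirit of tRSSA: candidates are drawn from an
# upper-bound rate and accepted with probability rate(t)/bound, so the
# time-dependent rate is never integrated.
def rate(t):
    return 2.0 + math.sin(t)       # toy time-dependent propensity

RATE_MAX = 3.0                      # global upper bound on rate(t)

def firing_times(t_end=100.0):
    t, times = 0.0, []
    while True:
        t += random.expovariate(RATE_MAX)        # candidate from bound
        if t > t_end:
            return times
        if random.random() < rate(t) / RATE_MAX:
            times.append(t)                      # accepted firing

times = firing_times()
```

Over [0, 100] the expected number of firings is the integral of rate(t), about 200, and the accepted times form an exact sample of the nonhomogeneous process.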
A multimedia patient simulation for teaching and assessing endodontic diagnosis.
Littlefield, John H; Demps, Elaine L; Keiser, Karl; Chatterjee, Lipika; Yuan, Cheng H; Hargreaves, Kenneth M
2003-06-01
Teaching and assessing diagnostic skills are difficult due to relatively small numbers of total clinical experiences and a shortage of clinical faculty. Patient simulations could help teach and assess diagnosis by displaying a well-defined diagnostic task, then providing informative feedback and opportunities for repetition and correction of errors. This report describes the development and initial evaluation of SimEndo I, a multimedia patient simulation program that could be used for teaching or assessing endodontic diagnosis. Students interact with a graphical interface that has four pull-down menus and related submenus. In response to student requests, the program presents patient information. Scoring is based on diagnosis of each case by endodontists. Pilot testing with seventy-four junior dental students identified numerous needed improvements to the user interface program. A multi-school field test of the interface program using three patient cases addressed three research questions: 1) How did the field test students evaluate SimEndo I? Overall mean evaluation was 8.1 on a 0 to 10 scale; 2) How many cases are needed to generate a reproducible diagnostic proficiency score for an individual student using the Rimoldi scoring procedure? Mean diagnostic proficiency scores by case ranged from .27 to .40 on a 0 to 1 scale; five cases would produce a score with a 0.80 reliability coefficient; and 3) Did students accurately diagnose each case? Mean correct diagnosis scores by case ranged from .54 to .78 on a 0 to 1 scale. We conclude that multimedia patient simulations offer a promising alternative for teaching and assessing student diagnostic skills.
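The step from per-case reliability to "five cases would produce a score with a 0.80 reliability coefficient" is consistent with the classical Spearman-Brown prophecy formula; a sketch under that assumption (the formula is a standard psychometric result, not stated in the abstract):

```python
import math

# Spearman-Brown prophecy sketch: reliability of an n-case score given
# the single-case reliability r1, and the inverse (cases needed for a
# target reliability). Assumes the abstract's five-case 0.80 figure
# follows this formula.
def reliability(n_cases, r1):
    return n_cases * r1 / (1 + (n_cases - 1) * r1)

def cases_needed(target, r1):
    # invert the formula for test length; epsilon guards float round-off
    return math.ceil(target * (1 - r1) / (r1 * (1 - target)) - 1e-9)

r1 = 0.8 / (5 - 4 * 0.8)   # single-case reliability implied by r5 = 0.80
```

With r1 ≈ 0.44, pushing the target to 0.90 would require roughly a dozen cases, which illustrates why the authors settled on five.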
2015-01-01
Computational simulations are currently used to identify epidemic dynamics, to test potential prevention and intervention strategies, and to study the effects of social behaviors on HIV transmission. The author describes an agent-based epidemic simulation model of a network of individuals who participate in high-risk sexual practices, using number of partners, condom usage, and relationship length to distinguish between high- and low-risk populations. Two new concepts—free links and fixed links—are used to indicate tendencies among individuals who either have large numbers of short-term partners or stay in long-term monogamous relationships. An attempt was made to reproduce epidemic curves of reported HIV cases among male homosexuals in Taiwan prior to using the agent-based model to determine the effects of various policies on epidemic dynamics. Results suggest that when suitable adjustments are made based on available social survey statistics, the model accurately simulates real-world behaviors on a large scale. PMID:25815047
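A stripped-down version of the fixed-link/free-link idea can be sketched as follows; population size, transmission probability, and pairing rules are invented for illustration and are far simpler than the model described:

```python
import random

random.seed(3)

# Toy agent-based sketch loosely mirroring the fixed/free-link idea:
# "fixed" pairs stay monogamous, "free" agents re-pair every step.
# All parameters are illustrative placeholders.
N, STEPS, P_TRANSMIT = 200, 50, 0.1
infected = [False] * N
infected[150] = True                                   # seed a free agent
fixed_pairs = [(2 * i, 2 * i + 1) for i in range(N // 4)]  # agents 0..99
free_agents = list(range(N // 2, N))                       # agents 100..199

for _ in range(STEPS):
    random.shuffle(free_agents)                        # new partners
    free_pairs = list(zip(free_agents[::2], free_agents[1::2]))
    for a, b in fixed_pairs + free_pairs:
        if infected[a] != infected[b] and random.random() < P_TRANSMIT:
            infected[a] = infected[b] = True           # transmission

prevalence = sum(infected) / N
```

Because the fixed pairs are closed, infection seeded among the free agents stays in the high-turnover half of the population, a crude illustration of why partner-turnover structure shapes the epidemic curve.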
Ikebe, Jinzen; Umezawa, Koji; Higo, Junichi
2016-03-01
Molecular dynamics (MD) simulations using all-atom and explicit solvent models provide valuable information on the detailed behavior of protein-partner substrate binding at the atomic level. As the power of computational resources increase, MD simulations are being used more widely and easily. However, it is still difficult to investigate the thermodynamic properties of protein-partner substrate binding and protein folding with conventional MD simulations. Enhanced sampling methods have been developed to sample conformations that reflect equilibrium conditions in a more efficient manner than conventional MD simulations, thereby allowing the construction of accurate free-energy landscapes. In this review, we discuss these enhanced sampling methods using a series of case-by-case examples. In particular, we review enhanced sampling methods conforming to trivial trajectory parallelization, virtual-system coupled multicanonical MD, and adaptive lambda square dynamics. These methods have been recently developed based on the existing method of multicanonical MD simulation. Their applications are reviewed with an emphasis on describing their practical implementation. In our concluding remarks we explore extensions of the enhanced sampling methods that may allow for even more efficient sampling.
Large-eddy simulation of turbulent flow with a surface-mounted two-dimensional obstacle
NASA Technical Reports Server (NTRS)
Yang, Kyung-Soo; Ferziger, Joel H.
1993-01-01
In this paper, we perform a large eddy simulation (LES) of turbulent flow in a channel containing a two-dimensional obstacle on one wall using a dynamic subgrid-scale model (DSGSM) at Re = 3210, based on bulk velocity above the obstacle and obstacle height; the wall layers are fully resolved. The low Re enables us to perform a DNS (Case 1) against which to validate the LES results. The LES with the DSGSM is designated Case 2. In addition, an LES with the conventional fixed model constant (Case 3) is conducted to allow identification of improvements due to the DSGSM. We also include LES at Re = 82,000 (Case 4) using conventional Smagorinsky subgrid-scale model and a wall-layer model. The results will be compared with the experiment of Dimaczek et al.
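The fixed-constant Smagorinsky closure used in Case 3 computes an eddy viscosity from the resolved strain rate, nu_t = (Cs Δ)² |S| with |S| = sqrt(2 S_ij S_ij); a minimal sketch on a random velocity field (constants and field are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of the fixed-constant Smagorinsky eddy viscosity:
#   nu_t = (Cs * Delta)^2 * |S|,  |S| = sqrt(2 S_ij S_ij).
# The velocity field is random noise purely to exercise the formula.
Cs, delta, dx = 0.1, 1.0, 1.0
u = rng.standard_normal((3, 8, 8, 8))      # (component, x, y, z)

# velocity-gradient tensor grad[i, j] = d u_i / d x_j
grad = np.stack([np.stack([np.gradient(u[i], dx, axis=j)
                           for j in range(3)]) for i in range(3)])
S = 0.5 * (grad + grad.transpose(1, 0, 2, 3, 4))   # strain-rate tensor
S_mag = np.sqrt(2.0 * np.einsum('ij...,ij...->...', S, S))
nu_t = (Cs * delta) ** 2 * S_mag
```

The dynamic model of Cases 1-2 replaces the fixed Cs with a locally computed coefficient, but the eddy-viscosity formula itself is unchanged.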
Scenario management and automated scenario generation
NASA Astrophysics Data System (ADS)
McKeever, William; Gilmour, Duane; Lehman, Lynn; Stirtzinger, Anthony; Krause, Lee
2006-05-01
The military planning process utilizes simulation to determine the appropriate course of action (COA) that will achieve a campaign end state. However, due to the difficulty in developing and generating simulation-level COAs, only a few COAs are simulated. This may have been appropriate for traditional conflicts, but the evolution of warfare from attrition-based to effects-based strategies, as well as the complexities of 4th generation warfare and asymmetric adversaries, have placed additional demands on military planners and simulation. To keep pace with this dynamic, changing environment, planners must be able to perform continuous, multiple, "what-if" COA analysis. Scenario management and generation are critical elements to achieving this goal. An effects-based scenario generation research project demonstrated the feasibility of automated scenario generation techniques that support multiple stove-pipe and emerging broad-scope simulations. This paper will discuss a case study in which the scenario generation capability was employed to support COA simulations to identify plan effectiveness. The study demonstrated the effectiveness of using multiple simulation runs to evaluate how well alternate COAs achieve the overall campaign (metrics-based) objectives. The paper will discuss how scenario generation technology can be employed to allow military commanders and mission planning staff to understand the impact of command decisions on the battlespace of tomorrow.
Case-mix reimbursement for nursing home services: Simulation approach
Adams, E. Kathleen; Schlenker, Robert E.
1986-01-01
Nursing home reimbursement based on case mix is a matter of growing interest. Several States either use or are considering this reimbursement method. In this article, we present a method for evaluating key outcomes of such a change for Connecticut nursing homes. A simulation model is used to replicate payments under the case-mix systems used in Maryland, Ohio, and West Virginia. The findings indicate that, compared with the system presently used in Connecticut, these systems would better relate dollar payments to measured patient need, and for-profit homes would benefit relative to nonprofit homes. The Ohio methodology would impose the most additional costs, whereas the West Virginia system would actually be somewhat less expensive in terms of direct patient care payments. PMID:10311776
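A payment replication of this kind reduces, at its core, to weighting a per-diem base rate by resident acuity; a toy sketch with invented weights and rates (not the Maryland, Ohio, or West Virginia parameters):

```python
# Toy sketch of replicating payments under a case-mix system: each
# resident-day is paid a base rate times a case-mix weight tied to an
# assessed acuity group. Weights and rate are invented placeholders.
BASE_RATE = 80.0                          # assumed flat per-diem, $
WEIGHTS = {"light": 0.8, "moderate": 1.0, "heavy": 1.4}

def case_mix_payment(census):
    """census: dict mapping acuity group -> number of resident-days."""
    return sum(BASE_RATE * WEIGHTS[g] * days for g, days in census.items())

census = {"light": 50, "moderate": 30, "heavy": 20}
flat = BASE_RATE * sum(census.values())   # flat-rate comparator
adjusted = case_mix_payment(census)
```

Comparing `adjusted` with `flat` for a given census shows how a case-mix system shifts dollars toward homes with heavier-care residents, which is the outcome the simulation model measures at the state level.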
Light extraction efficiency of GaN-based LED with pyramid texture by using ray path analysis.
Pan, Jui-Wen; Wang, Chia-Shen
2012-09-10
We study three different gallium-nitride (GaN) based light emitting diode (LED) cases based on the different locations of the pyramid textures. In case 1, the pyramid texture is located on the sapphire top surface; in case 2, the pyramid texture is located on the P-GaN top surface; while in case 3, the pyramid texture is located on both the sapphire and P-GaN top surfaces. We study the relationship between the light extraction efficiency (LEE) and the angle of slant of the pyramid texture. Of the three cases, the optimized total LEE was highest for case 3. Moreover, the seven escape paths along which most of the escaped photon flux propagated were selected in a simulation of the LEDs. The seven escape paths were used to estimate the slant angle for the optimization of LEE and to precisely analyze the photon escape paths.
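The escape-path picture rests on total internal reflection at a flat GaN/air interface; a back-of-envelope sketch, assuming a textbook-style refractive index of about 2.4 for GaN (an assumed value, not taken from the paper):

```python
import math

# Back-of-envelope sketch of why texturing helps: at a flat GaN/air
# interface only photons inside the escape cone leave the chip.
# n = 2.4 for GaN is an assumed textbook-style value.
n_gan, n_air = 2.4, 1.0
theta_c = math.asin(n_air / n_gan)                 # critical angle, rad
# fraction of isotropic emission inside one top-surface escape cone
escape_fraction = (1.0 - math.cos(theta_c)) / 2.0
```

With these numbers only a few percent of isotropically emitted photons escape through a flat top surface on the first pass, which is why pyramid texturing (randomizing ray directions into the escape cone) raises the LEE.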
Multi-Scale Characterization of Orthotropic Microstructures
2008-04-01
Journal article submitted to Modeling and Simulation in Materials Science and Engineering. PAO Case Number: WPAFB 08-3362. The work draws on D. Valiveti, S. J. Harris, J. Boileau, "A domain partitioning based pre-processor for multi-scale modelling of cast aluminium alloys," and notes that a representative element must be chosen for characterization or simulation to avoid misleading predictions of macroscopic deformation, fracture, or transport behavior.
NASA Astrophysics Data System (ADS)
Anzai, Yosuke; Fukagata, Koji; Meliga, Philippe; Boujo, Edouard; Gallaire, François
2017-04-01
Flow around a square cylinder controlled using plasma actuators (PAs) is numerically investigated by direct numerical simulation in order to clarify the most effective location of actuator installation and to elucidate the mechanism of control effect. The Reynolds number based on the cylinder diameter and the free-stream velocity is set to be 100 to study the fundamental effect of PAs on two-dimensional vortex shedding, and three different locations of PAs are considered. The mean drag and the root-mean-square of lift fluctuations are found to be reduced by 51% and 99% in the case where two opposing PAs are aligned vertically on the rear surface. In that case, a jet flow similar to a base jet is generated by the collision of the streaming flows induced by the two opposing PAs, and the vortex shedding is completely suppressed. The simulation results are ultimately revisited in the frame of linear sensitivity analysis, whose computational cost is much lower than that of performing the full simulation. A good agreement is reported for low control amplitudes, which allows further discussion of the linear optimal arrangement for any number of PAs.
Richings, Gareth W; Habershon, Scott
2017-09-12
We describe a method for performing nuclear quantum dynamics calculations using standard, grid-based algorithms, including the multiconfiguration time-dependent Hartree (MCTDH) method, where the potential energy surface (PES) is calculated "on-the-fly". The method of Gaussian process regression (GPR) is used to construct a global representation of the PES using values of the energy at points distributed in molecular configuration space during the course of the wavepacket propagation. We demonstrate this direct dynamics approach for both an analytical PES function describing 3-dimensional proton transfer dynamics in malonaldehyde and for 2- and 6-dimensional quantum dynamics simulations of proton transfer in salicylaldimine. In the case of salicylaldimine we also perform calculations in which the PES is constructed using Hartree-Fock calculations through an interface to an ab initio electronic structure code. In all cases, the results of the quantum dynamics simulations are in excellent agreement with previous simulations of both systems yet do not require prior fitting of a PES at any stage. Our approach (implemented in a development version of the Quantics package) opens a route to performing accurate quantum dynamics simulations via wave function propagation of many-dimensional molecular systems in a direct and efficient manner.
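The on-the-fly PES idea can be illustrated with a minimal NumPy-only Gaussian process regression fit to a toy one-dimensional potential; the kernel, length scale, and data are illustrative, and no wavepacket propagation or MCTDH interface is attempted:

```python
import numpy as np

# Minimal GPR sketch of a 1D "potential": fit an RBF-kernel GP to a few
# energy samples and predict between them. Illustrates the interpolation
# idea only; all parameters are placeholders.
def rbf(a, b, length=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

x_train = np.linspace(-1.0, 1.0, 9)
y_train = x_train ** 2                     # toy PES: harmonic well

K = rbf(x_train, x_train) + 1e-6 * np.eye(x_train.size)  # jitter
alpha = np.linalg.solve(K, y_train)

x_test = np.array([0.3])
y_pred = rbf(x_test, x_train) @ alpha      # GP posterior mean (zero prior)
```

In the actual direct-dynamics scheme, new energy points sampled during propagation would be appended to `x_train`/`y_train` and the fit refreshed, so the surface sharpens exactly where the wavepacket travels.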
Lin, Lawrence; Pan, Yi; Hedayat, A S; Barnhart, Huiman X; Haber, Michael
2016-01-01
Total deviation index (TDI) captures a prespecified quantile of the absolute deviation of paired observations from raters, observers, methods, assays, instruments, etc. We compare the performance of TDI using nonparametric quantile regression to the TDI assuming normality (Lin, 2000). This simulation study considers three distributions: normal, Poisson, and uniform at quantile levels of 0.8 and 0.9 for cases with and without contamination. Study endpoints include the bias of TDI estimates (compared with their respective theoretical values), standard error of TDI estimates (compared with their true simulated standard errors), and test size (compared with 0.05), and power. Nonparametric TDI using quantile regression, although it slightly underestimates and delivers slightly less power for data without contamination, works satisfactorily under all simulated cases even for moderate (say, ≥40) sample sizes. The performance of the TDI based on a quantile of 0.8 is in general superior to that of 0.9. The performances of nonparametric and parametric TDI methods are compared with a real data example. Nonparametric TDI can be very useful when the underlying distribution on the difference is not normal, especially when it has a heavy tail.
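The two estimators compared above can be sketched on simulated normal differences; with mean 0 and unit variance, both should approach the theoretical 0.8-level value z_0.9 ≈ 1.28 (the parametric form below uses Lin's normal-theory approximation, which is the standard form but is my gloss on the abstract):

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(42)

# TDI sketch on simulated paired differences. Nonparametric TDI at level
# p is the sample p-quantile of |difference|; the parametric version uses
# Lin's normal-theory approximation z_{(1+p)/2} * sqrt(mean^2 + var).
p = 0.8
d = rng.normal(0.0, 1.0, 5000)             # simulated paired differences

tdi_nonpar = np.quantile(np.abs(d), p)
z = NormalDist().inv_cdf((1 + p) / 2)
tdi_normal = z * np.sqrt(np.mean(d) ** 2 + np.var(d))
```

When the differences are truly normal the two estimates agree closely; the nonparametric version's advantage shows up only under heavy-tailed or contaminated data, as the abstract reports.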
Elliott, Lydia; DeCristofaro, Claire; Carpenter, Alesia
2012-09-01
This article describes the development and implementation of integrated use of personal handheld devices (personal digital assistants, PDAs) and high-fidelity simulation in an advanced health assessment course in a graduate family nurse practitioner (NP) program. A teaching tool was developed that can be utilized as a template for clinical case scenarios blending these separate technologies. Review of the evidence-based literature, including peer-reviewed articles and reviews. Blending the technologies of high-fidelity simulation and handheld devices (PDAs) provided a positive learning experience for graduate NP students in a teaching laboratory setting. Combining both technologies in clinical case scenarios offered a more real-world learning experience, with a focus on point-of-care service and integration of interview and physical assessment skills with existing standards of care and external clinical resources. Faculty modeling and advance training with PDA technology was crucial to success. Faculty developed a general template tool and systems-based clinical scenarios integrating PDA and high-fidelity simulation. Faculty observations, the general template tool, and one scenario example are included in this article. ©2012 The Author(s) Journal compilation ©2012 American Academy of Nurse Practitioners.
HuPSON: the human physiology simulation ontology
2013-01-01
Background Large biomedical simulation initiatives, such as the Virtual Physiological Human (VPH), are substantially dependent on controlled vocabularies to facilitate the exchange of information, of data and of models. Hindering these initiatives is a lack of a comprehensive ontology that covers the essential concepts of the simulation domain. Results We propose a first version of a newly constructed ontology, HuPSON, as a basis for shared semantics and interoperability of simulations, of models, of algorithms and of other resources in this domain. The ontology is based on the Basic Formal Ontology, and adheres to the MIREOT principles; the constructed ontology has been evaluated via structural features, competency questions and use case scenarios. The ontology is freely available at: http://www.scai.fraunhofer.de/en/business-research-areas/bioinformatics/downloads.html (owl files) and http://bishop.scai.fraunhofer.de/scaiview/ (browser). Conclusions HuPSON provides a framework for a) annotating simulation experiments, b) retrieving relevant information that are required for modelling, c) enabling interoperability of algorithmic approaches used in biomedical simulation, d) comparing simulation results and e) linking knowledge-based approaches to simulation-based approaches. It is meant to foster a more rapid uptake of semantic technologies in the modelling and simulation domain, with particular focus on the VPH domain. PMID:24267822
Canestrari, Niccolo; Chubar, Oleg; Reininger, Ruben
2014-09-01
X-ray beamlines in modern synchrotron radiation sources make extensive use of grazing-incidence reflective optics, in particular Kirkpatrick-Baez elliptical mirror systems. These systems can focus the incoming X-rays down to nanometer-scale spot sizes while maintaining relatively large acceptance apertures and high flux in the focused radiation spots. In low-emittance storage rings and in free-electron lasers such systems are used with partially or even nearly fully coherent X-ray beams and often target diffraction-limited resolution. Therefore, their accurate simulation and modeling has to be performed within the framework of wave optics. Here the implementation and benchmarking of a wave-optics method for the simulation of grazing-incidence mirrors based on the local stationary-phase approximation or, in other words, the local propagation of the radiation electric field along geometrical rays, are described. The proposed method is CPU-efficient and fully compatible with the numerical methods of Fourier optics. It has been implemented in the Synchrotron Radiation Workshop (SRW) computer code and extensively tested against the geometrical ray-tracing code SHADOW. The test simulations have been performed for cases without and with diffraction at mirror apertures, including cases where the grazing-incidence mirrors can hardly be approximated by ideal lenses. Good agreement between the SRW and SHADOW simulation results is observed in the cases without diffraction. The differences between the simulation results obtained by the two codes in diffraction-dominated cases for illumination with fully or partially coherent radiation are analyzed and interpreted. The application of the new method for the simulation of wavefront propagation through a high-resolution X-ray microspectroscopy beamline at the National Synchrotron Light Source II (Brookhaven National Laboratory, USA) is demonstrated.
NASA Astrophysics Data System (ADS)
Nguyen, S. T.; Vu, M.-H.; Vu, M. N.; Tang, A. M.
2017-05-01
The present work aims to model the thermal conductivity of fractured materials using homogenization-based analytical and pattern-based numerical methods. These materials are considered as a network of cracks distributed inside a solid matrix. Heat flow through such media is perturbed by the crack system. The problem of heat flow across a single crack is first investigated. The classical Eshelby solution, extended to the thermal conduction problem of an ellipsoidal inclusion embedded in an infinite homogeneous matrix, gives an analytical solution for the temperature discontinuity across a non-conducting penny-shaped crack. This solution is then validated by numerical simulation based on the finite element method. The numerical simulation allows analyzing the effect of crack conductivity. The problem of a single crack is then extended to a medium containing multiple cracks. Analytical estimates of the effective thermal conductivity, which take into account the interaction between cracks and their spatial distribution, are developed for the case of non-conducting cracks. A pattern-based numerical method is then employed for both non-conducting and conducting cracks. In the case of non-conducting cracks, the numerical and analytical methods, both of which account for the spatial distribution of the cracks, agree perfectly. In the case of conducting cracks, numerical analysis of the crack-conductivity effect shows that highly conducting cracks weakly affect heat flow and the effective thermal conductivity of fractured media.
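For the non-conducting case, a commonly quoted first-order (dilute, non-interacting) estimate from the homogenization literature, not the paper's own expression, is k_eff/k_0 ≈ 1 − (8/9)ρ for randomly oriented penny-shaped cracks of density ρ = Na³/V:

```python
# Dilute (non-interacting) estimate sketch for the effective conductivity
# of a solid with randomly oriented, non-conducting penny-shaped cracks:
#   k_eff / k0 ≈ 1 - (8/9) * rho,   rho = N * a**3 / V  (crack density).
# This first-order form is quoted from the homogenization literature,
# not from the paper's own estimates, and is valid only for small rho.
def k_eff_dilute(k0, rho):
    return k0 * (1.0 - 8.0 * rho / 9.0)

ratio = k_eff_dilute(1.0, 0.1)     # ~9% conductivity drop at rho = 0.1
```

Interaction-aware estimates of the kind developed in the paper correct this linear form at higher crack densities, where neighboring cracks shield or amplify each other's perturbation of the heat flow.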
Pre-operative simulation of pediatric mastoid surgery with 3D-printed temporal bone models.
Rose, Austin S; Webster, Caroline E; Harrysson, Ola L A; Formeister, Eric J; Rawal, Rounak B; Iseli, Claire E
2015-05-01
As the process of additive manufacturing, or three-dimensional (3D) printing, has become more practical and affordable, a number of applications for the technology in the field of pediatric otolaryngology have been considered. One area of promise is temporal bone surgical simulation. Having previously developed a model for temporal bone surgical training using 3D printing, we sought to produce a patient-specific model for pre-operative simulation in pediatric otologic surgery. Our hypothesis was that the creation and pre-operative dissection of such a model was possible, and would demonstrate potential benefits in cases of abnormal temporal bone anatomy. In the case presented, an 11-year-old boy underwent a planned canal-wall-down (CWD) tympano-mastoidectomy for recurrent cholesteatoma preceded by a pre-operative surgical simulation using 3D-printed models of the temporal bone. The models were based on the child's pre-operative clinical CT scan and printed using multiple materials to simulate both bone and soft tissue structures. To help confirm the models as accurate representations of the child's anatomy, distances between various anatomic landmarks were measured and compared to the temporal bone CT scan and the 3D model. The simulation allowed the surgical team to appreciate the child's unusual temporal bone anatomy as well as any challenges that might arise in the safety of the temporal bone laboratory, prior to actual surgery in the operating room (OR). There was minimal variability, in terms of absolute distance (mm) and relative distance (%), in measurements between anatomic landmarks obtained from the patient intra-operatively, the pre-operative CT scan and the 3D-printed models. Accurate 3D temporal bone models can be rapidly produced based on clinical CT scans for pre-operative simulation of specific challenging otologic cases in children, potentially reducing medical errors and improving patient safety.
NASA Astrophysics Data System (ADS)
Bellos, V.; Mahmoodian, M.; Leopold, U.; Torres-Matallana, J. A.; Schutz, G.; Clemens, F.
2017-12-01
Surrogate models help to decrease the run-time of computationally expensive, detailed models. Recent studies show that Gaussian Process Emulators (GPE) are promising techniques in the field of urban drainage modelling. This study focuses on developing a GPE-based surrogate model for later application in Real Time Control (RTC), using input and output time series of a complex simulator. The case study is an urban drainage catchment in Luxembourg. A detailed simulator, implemented in InfoWorks ICM, is used to generate 120 input-output ensembles, of which 100 are used for training the emulator and 20 for validation of the results. An ensemble of historical rainfall events with 2-hour duration and 10-minute time steps is used as the input data. Two example outputs are selected: wastewater volume and total COD concentration in a storage tank in the network. The results of the emulator are tested with unseen random rainfall events from the ensemble dataset. The emulator is approximately 1000 times faster than the original simulator for this small case study. Whereas the overall patterns of the simulator are matched by the emulator, in some cases the emulator deviates from the simulator. To quantify the accuracy of the emulator relative to the original simulator, the Nash-Sutcliffe efficiency (NSE) between emulator and simulator is calculated for unseen rainfall scenarios. The NSE for tank volume ranges from 0.88 to 0.99 with a mean of 0.95, whereas for COD it ranges from 0.71 to 0.99 with a mean of 0.92. The emulator predicts the tank volume with higher accuracy because the relationship between rainfall intensity and tank volume is linear. For COD, which behaves non-linearly, the predictions are less accurate and more uncertain, in particular when rainfall intensity increases. These predictions were improved by including a larger amount of training data for the higher rainfall intensities.
It was observed that the accuracy of the emulator predictions depends on the design of the ensemble training dataset and the amount of data provided. Finally, more investigation is required to test the possibility of applying this type of fast emulator for model-based RTC applications, in which a limited number of inputs and outputs are considered over a short prediction horizon.
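The accuracy metric quoted above can be sketched in a few lines; this is the standard Nash-Sutcliffe efficiency, with the detailed simulator output taken as the reference series (a generic sketch, not code from the study):

```python
import numpy as np

def nash_sutcliffe(emulated, simulated):
    """Nash-Sutcliffe efficiency (NSE): 1 means a perfect match,
    0 means the emulator does no better than the mean of the
    reference (simulator) series; negative values are worse still."""
    e = np.asarray(emulated, dtype=float)
    s = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((s - e) ** 2) / np.sum((s - s.mean()) ** 2)
```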
Inversion of oceanic constituents in case I and II waters with genetic programming algorithms.
Chami, Malik; Robilliard, Denis
2002-10-20
A stochastic inverse technique based on a genetic programming (GP) algorithm was developed to invert oceanic constituents from simulated data for case I and case II water applications. The simulations were carried out with the Ordre Successifs Ocean Atmosphere (OSOA) radiative transfer model. They include the effects of oceanic substances such as algal-related chlorophyll, nonchlorophyllous suspended matter, and dissolved organic matter. The synthetic data set also takes into account the directional effects of particles through a variation of their phase function that makes the simulated data realistic. It is shown that GP can be successfully applied to the inverse problem with acceptable stability in the presence of realistic noise in the data. GP is compared with neural network methodology for case I waters; GP exhibits similar retrieval accuracy, which is greater than for traditional techniques such as band ratio algorithms. The application of GP to real satellite data [a Sea-viewing Wide Field-of-view Sensor (SeaWiFS)] was also carried out for case I waters as a validation. Good agreement was obtained when GP results were compared with the SeaWiFS empirical algorithm. For case II waters the accuracy of GP is less than 33%, which remains satisfactory, at the present time, for remote-sensing purposes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, Srikanta; Jin, Larry; He, Jincong
2015-06-30
Reduced-order models provide a means for greatly accelerating the detailed simulations that will be required to manage CO2 storage operations. In this work, we investigate the use of one such method, POD-TPWL, which has previously been shown to be effective in oil reservoir simulation problems. This method combines trajectory piecewise linearization (TPWL), in which the solution to a new (test) problem is represented through a linearization around the solution to a previously-simulated (training) problem, with proper orthogonal decomposition (POD), which enables solution states to be expressed in terms of a relatively small number of parameters. We describe the application of POD-TPWL for CO2-water systems simulated using a compositional procedure. Stanford's Automatic Differentiation-based General Purpose Research Simulator (AD-GPRS) performs the full-order training simulations and provides the output (derivative matrices and system states) required by the POD-TPWL method. A new POD-TPWL capability introduced in this work is the use of horizontal injection wells that operate under rate (rather than bottom-hole pressure) control. Simulation results are presented for CO2 injection into a synthetic aquifer and into a simplified model of the Mount Simon formation. Test cases involve the use of time-varying well controls that differ from those used in training runs. Results of reasonable accuracy are consistently achieved for relevant well quantities. Runtime speedups of around a factor of 370 relative to full-order AD-GPRS simulations are achieved, though the preprocessing needed for POD-TPWL model construction corresponds to the computational requirements for about 2.3 full-order simulation runs. A preliminary treatment for POD-TPWL modeling in which test cases differ from training runs in terms of geological parameters (rather than well controls) is also presented.
Results in this case involve only small differences between training and test runs, though they do demonstrate that the approach is able to capture basic solution trends. The impact of some of the detailed numerical treatments within the POD-TPWL formulation is considered in an Appendix.
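The POD half of POD-TPWL can be illustrated independently of any reservoir simulator: collect full-order states as the columns of a snapshot matrix, take an SVD, and keep the leading left singular vectors. A generic sketch (not tied to AD-GPRS):

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """Build a POD basis from a snapshot matrix of shape
    (n_dof, n_snapshots). Keeps the smallest number of left singular
    vectors capturing the requested fraction of snapshot 'energy'
    (sum of squared singular values)."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    frac = np.cumsum(s ** 2) / np.sum(s ** 2)
    r = int(np.searchsorted(frac, energy)) + 1
    return U[:, :r]

# a full-order state x is reduced to Phi.T @ x and
# reconstructed (approximately) as Phi @ (Phi.T @ x)
```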
NASA Astrophysics Data System (ADS)
van der Plas, Peter; Guerriero, Suzanne; Cristiano, Leorato; Rugina, Ana
2012-08-01
Modelling and simulation can support a number of use cases across the spacecraft development life-cycle. Given the increasing complexity of space missions, the observed general trend is for a more extensive usage of simulation already in the early phases. A major perceived advantage is that modelling and simulation can enable the validation of critical aspects of the spacecraft design before the actual development is started, as such reducing the risk in later phases. Failure Detection, Isolation, and Recovery (FDIR) is one of the areas with a high potential to benefit from early modelling and simulation. With the increasing level of required spacecraft autonomy, FDIR specifications can grow in such a way that the traditional document-based review process soon becomes inadequate. This paper shows that FDIR modelling and simulation in a system context can provide a powerful tool to support the FDIR verification process. It is highlighted that FDIR modelling at this early stage requires heterogeneous modelling tools and languages, in order to provide an adequate functional description of the different components (i.e. FDIR functions, environment, equipment, etc.) to be modelled. For this reason, an FDIR simulation framework is proposed in this paper. This framework is based on a number of tools already available in the Avionics Systems Laboratory at ESTEC, which are the Avionics Test Bench Functional Engineering Simulator (ATB FES), Matlab/Simulink, TASTE, and Real Time Developer Studio (RTDS). The paper then discusses the application of the proposed simulation framework to a real case study, i.e. the FDIR modelling of a satellite in support of an actual ESA mission. Challenges and benefits of the approach are described. Finally, lessons learned and the generality of the proposed approach are discussed.
Optical simulations of laser focusing for optimization of laser betatron
NASA Astrophysics Data System (ADS)
Stanke, L.; Thakur, A.; Šmíd, M.; Gu, Y. J.; Falk, K.
2017-05-01
This work presents optical simulations that are used to design a betatron driven by a short-pulse laser based on the Laser Wakefield Acceleration (LWFA) concept. These simulations explore how the optical setup and its components influence the performance of the betatron, and the impact of phase irregularities induced by optical elements is investigated. In order to obtain a good estimate of the future performance of this design, a combination of two distinct techniques is used: Field Tracing for the optical simulations, employing a combination of the Zemax and VirtualLab computational platforms for laser beam propagation and focusing with the given optical system, and particle-in-cell (PIC) simulation of the short-pulse laser interaction with a gas target. The result of the optical simulations serves as input for the PIC simulations. Applying Field Tracing in combination with PIC for a high-power laser facility constitutes a new application of VirtualLab Fusion. Based on the results of these simulations, an alternative design with a hole in the final folding mirror coupled with a spherical focusing mirror is considered in favour of the more commonly used off-axis parabola focusing setup. The results demonstrate that the decrease of the irradiance due to the presence of the central hole in the folding mirror is negligible (9.69 × 10^19 W/cm^2 for the case without the hole vs. 9.73 × 10^19 W/cm^2 for the case with the hole). However, the decrease caused by surface irregularities (surface RMS λ/4, λ/20 and λ/40) is more significant and leads to poor particle-production performance.
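For a uniform (flat-top) beam, the energy lost through a central hole in a folding mirror is simply the ratio of areas, which makes the small quoted irradiance difference plausible. A purely geometric sketch with illustrative numbers (not the paper's actual hole and beam sizes, which are not given in the abstract):

```python
def blocked_fraction(hole_diameter, beam_diameter):
    """Fraction of a uniform (flat-top) beam's energy lost through a
    central circular hole of the given diameter: the ratio of the hole
    area to the beam area. Real beams are not flat-top, so this is only
    a rough geometric estimate."""
    return (hole_diameter / beam_diameter) ** 2

# illustrative: a 10 mm hole in a 200 mm beam loses 0.25 % of the energy
loss = blocked_fraction(10.0, 200.0)
```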
NASA Astrophysics Data System (ADS)
Koo, J.; Wood, S.; Cenacchi, N.; Fisher, M.; Cox, C.
2012-12-01
HarvestChoice (harvestchoice.org) generates knowledge products to guide strategic investments to improve the productivity and profitability of smallholder farming systems in sub-Saharan Africa (SSA). A key component of the HarvestChoice analytical framework is a grid-based overlay of SSA: a cropping simulation platform powered by process-based crop models. Calibrated around the best available representation of cropping production systems in SSA, the simulation platform couples the DSSAT Crop Systems Model with the CENTURY Soil Organic Matter model (DSSAT-CENTURY) and provides a virtual experimentation module with which to explore the impact of a range of technological, managerial and environmental factors on future crop productivity and profitability, as well as input use. Each 5 (or 30) arc-minute grid cell in SSA is underlain by a stack of model inputs: datasets covering soil properties and fertility, historic and future climate scenarios, and farmers' management practices, all compiled from analyses of existing global and regional databases and consultations with other CGIAR centers. Running a simulation model is not always straightforward, especially when certain cropping systems or management practices are not yet practiced by resource-poor farmers (e.g., precision agriculture) or were never included in the existing simulation framework (e.g., water harvesting). In such cases, we used DSSAT-CENTURY as a function to iteratively estimate the relative responses of cropping systems to technology-driven changes in water and nutrient balances compared to zero adoption by farmers, while adjusting model input parameters to best mimic farmers' implementation of technologies in the field. We then fed the results of the simulation into the economic and food trade model framework, IMPACT, to assess the potential implications for future food security.
The outputs of the overall simulation analyses are packaged as a web-accessible database and published on the web with an interface that allows users to explore the simulation results in each country with user-defined baseline and what-if scenarios. The results are dynamically presented on maps, charts, and tables. This paper discusses the development of the simulation platform and its underlying data layers, a case study that assessed the role of potential crop management technology development, and the development of a web-based application that visualizes the simulation results.
Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J
2016-08-01
Building modelers need simulation tools capable of simultaneously considering building energy use, airflow and indoor air quality (IAQ) to design and evaluate the ability of buildings and their systems to meet today's demanding energy efficiency and IAQ performance requirements. CONTAM is a widely-used multizone building airflow and contaminant transport simulation tool that requires indoor temperatures as input values. EnergyPlus is a prominent whole-building energy simulation program capable of performing heat transfer calculations that require interzone and infiltration airflows as input values. On their own, each tool is limited in its ability to account for thermal processes upon which building airflow may be significantly dependent and vice versa. This paper describes the initial phase of coupling of CONTAM with EnergyPlus to capture the interdependencies between airflow and heat transfer using co-simulation that allows for sharing of data between independently executing simulation tools. The coupling is accomplished based on the Functional Mock-up Interface (FMI) for Co-simulation specification that provides for integration between independently developed tools. A three-zone combined heat transfer/airflow analytical BESTEST case was simulated to verify the co-simulation is functioning as expected, and an investigation of a two-zone, natural ventilation case designed to challenge the coupled thermal/airflow solution methods was performed.
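The ping-pong data exchange at the heart of such a coupling can be sketched with two toy stand-ins for the airflow and thermal tools. All model parameters below are invented for illustration, and the real CONTAM/EnergyPlus coupling exchanges data through the FMI for Co-simulation interface rather than direct function calls:

```python
def airflow_model(T_in, T_out):
    """Tool A stand-in: stack-driven infiltration mass flow (kg/s)
    from a toy buoyancy relation (illustrative coefficient)."""
    return 0.02 * abs(T_in - T_out) ** 0.5

def thermal_model(T_in, T_out, m_dot, dt):
    """Tool B stand-in: one explicit step of a single-zone energy
    balance with infiltration heat loss (illustrative parameters)."""
    C = 5.0e5        # zone heat capacity, J/K
    cp = 1005.0      # specific heat of air, J/(kg K)
    Q_heat = 800.0   # constant heating input, W
    dT = (Q_heat - m_dot * cp * (T_in - T_out)) / C
    return T_in + dT * dt

# fixed-step co-simulation: exchange data at each communication point
T_in, T_out, dt = 20.0, 0.0, 60.0
for step in range(60):                  # one hour, 1-minute coupling steps
    m_dot = airflow_model(T_in, T_out)  # airflow tool uses temperatures
    T_in = thermal_model(T_in, T_out, m_dot, dt)  # thermal tool uses airflow
```

The zone temperature relaxes toward the balance point where infiltration heat loss matches the heating input, illustrating the interdependency the co-simulation is meant to capture.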
Building test data from real outbreaks for evaluating detection algorithms.
Texier, Gaetan; Jackson, Michael L; Siwe, Leonel; Meynard, Jean-Baptiste; Deparis, Xavier; Chaudet, Herve
2017-01-01
Benchmarking surveillance systems requires realistic simulations of disease outbreaks. However, obtaining these data in sufficient quantity, with a realistic shape and covering a sufficient range of agents, size and duration, is known to be very difficult. The dataset of outbreak signals generated should reflect the likely distribution of authentic situations faced by the surveillance system, including very unlikely outbreak signals. We propose and evaluate a new approach based on the use of historical outbreak data to simulate tailored outbreak signals. The method relies on a homothetic transformation of the historical distribution followed by resampling processes (Binomial, Inverse Transform Sampling Method-ITSM, Metropolis-Hasting Random Walk, Metropolis-Hasting Independent, Gibbs Sampler, Hybrid Gibbs Sampler). We carried out an analysis to identify the most important input parameters for simulation quality and to evaluate performance for each of the resampling algorithms. Our analysis confirms the influence of the type of algorithm used and simulation parameters (i.e. days, number of cases, outbreak shape, overall scale factor) on the results. We show that, regardless of the outbreaks, algorithms and metrics chosen for the evaluation, simulation quality decreased with the increase in the number of days simulated and increased with the number of cases simulated. Simulating outbreaks with fewer cases than days of duration (i.e. overall scale factor less than 1) resulted in an important loss of information during the simulation. We found that Gibbs sampling with a shrinkage procedure provides a good balance between accuracy and data dependency. If dependency is of little importance, binomial and ITSM methods are accurate. Given the constraint of keeping the simulation within a range of plausible epidemiological curves faced by the surveillance system, our study confirms that our approach can be used to generate a large spectrum of outbreak signals.
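A minimal sketch of the rescale-then-resample idea described above, using only the inverse transform sampling variant among the listed samplers (plain Python; the published method's other algorithms and the shrinkage step are omitted):

```python
import bisect
import random

def simulate_outbreak(historical_curve, n_days, n_cases, seed=0):
    """Homothetic rescaling of a historical daily-case curve onto a new
    duration, followed by inverse transform sampling of case days.
    A minimal illustration of the approach; the paper evaluates several
    other resampling algorithms as well."""
    rng = random.Random(seed)
    m = len(historical_curve)
    # stretch/compress the historical shape onto the new duration
    shape = [historical_curve[min(d * m // n_days, m - 1)]
             for d in range(n_days)]
    total = float(sum(shape))
    cdf, acc = [], 0.0
    for v in shape:
        acc += v / total
        cdf.append(acc)
    # inverse transform sampling: push uniforms through the empirical CDF
    counts = [0] * n_days
    for _ in range(n_cases):
        day = min(bisect.bisect_left(cdf, rng.random()), n_days - 1)
        counts[day] += 1
    return counts

curve = simulate_outbreak([1, 3, 8, 5, 2], n_days=10, n_cases=200)
```

With n_cases well above n_days (overall scale factor above 1), the sampled curve follows the rescaled historical shape; with fewer cases than days the shape information degrades, as the study reports.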
NASA Astrophysics Data System (ADS)
Vandermeulen, J.; Nasseri, S. A.; Van de Wiele, B.; Durin, G.; Van Waeyenberge, B.; Dupré, L.
2018-03-01
Lagrangian-based collective coordinate models for magnetic domain wall (DW) motion rely on an ansatz for the DW profile and a Lagrangian approach to describe the DW motion in terms of a set of time-dependent collective coordinates: the DW position, the DW magnetization angle, the DW width and the DW tilting angle. Another approach was recently used to derive similar equations of motion by averaging the Landau-Lifshitz-Gilbert equation without any ansatz, and identifying the relevant collective coordinates afterwards. In this paper, we use an updated version of the semi-analytical equations to compare the Lagrangian-based collective coordinate models with micromagnetic simulations for field- and STT-driven (spin-transfer torque-driven) DW motion in Pt/CoFe/MgO and Pt/Co/AlOx nanostrips. Through this comparison, we assess the accuracy of the different models, and provide insight into the deviations of the models from simulations. It is found that the lack of terms related to DW asymmetry in the Lagrangian-based collective coordinate models significantly contributes to the discrepancy between the predictions of the most accurate Lagrangian-based model and the micromagnetic simulations in the field-driven case. This is in contrast to the STT-driven case where the DW remains symmetric.
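The simplest ansatz-based model of the kind discussed above, the rigid one-dimensional (q, φ) model for field-driven wall motion, can be integrated in a few lines. These are the textbook equations with illustrative parameters, not the updated semi-analytical equations of the paper:

```python
import math

def walker_qphi(H, alpha, gamma, HK, Delta, dt=1e-3, t_end=50.0):
    """Forward-Euler integration of the rigid 1D (q, phi) model:
        (1+alpha^2) * dphi/dt      = gamma*(H - alpha*(HK/2)*sin(2*phi))
        (1+alpha^2) * dq/dt / Delta = gamma*(alpha*H + (HK/2)*sin(2*phi))
    q: wall position, phi: wall magnetization angle, Delta: wall width,
    H: applied field, HK: transverse anisotropy field (all in reduced,
    illustrative units)."""
    q, phi = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        s = math.sin(2.0 * phi)
        phi += dt * gamma * (H - alpha * (HK / 2.0) * s) / (1.0 + alpha ** 2)
        q += dt * Delta * gamma * (alpha * H + (HK / 2.0) * s) / (1.0 + alpha ** 2)
    return q, phi

# below Walker breakdown (H < alpha*HK/2) the angle locks and the wall
# reaches the steady velocity v = gamma*Delta*H/alpha
q, phi = walker_qphi(H=10.0, alpha=0.5, gamma=1.0, HK=50.0, Delta=1.0)
```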
Lumped-parameters equivalent circuit for condenser microphones modeling.
Esteves, Josué; Rufer, Libor; Ekeom, Didace; Basrour, Skandar
2017-10-01
This work presents a lumped parameters equivalent model of condenser microphone based on analogies between acoustic, mechanical, fluidic, and electrical domains. Parameters of the model were determined mainly through analytical relations and/or finite element method (FEM) simulations. Special attention was paid to the air gap modeling and to the use of proper boundary condition. Corresponding lumped-parameters were obtained as results of FEM simulations. Because of its simplicity, the model allows a fast simulation and is readily usable for microphone design. This work shows the validation of the equivalent circuit on three real cases of capacitive microphones, including both traditional and Micro-Electro-Mechanical Systems structures. In all cases, it has been demonstrated that the sensitivity and other related data obtained from the equivalent circuit are in very good agreement with available measurement data.
NASA Astrophysics Data System (ADS)
Grilli, Stéphan T.; Guérin, Charles-Antoine; Shelby, Michael; Grilli, Annette R.; Moran, Patrick; Grosdidier, Samuel; Insua, Tania L.
2017-08-01
In past work, tsunami detection algorithms (TDAs) have been proposed, and successfully applied to offline tsunami detection, based on analyzing tsunami currents inverted from high-frequency (HF) radar Doppler spectra. With this method, however, the detection of small and short-lived tsunami currents in the most distant radar ranges is challenging due to conflicting requirements on the Doppler spectra integration time and resolution. To circumvent this issue, in Part I of this work, we proposed an alternative TDA, referred to as time correlation (TC) TDA, that does not require inverting currents, but instead detects changes in patterns of correlations of radar signal time series measured in pairs of cells located along the main directions of tsunami propagation (predicted by geometric optics theory); such correlations can be maximized when one signal is time-shifted by the pre-computed long wave propagation time. We initially validated the TC-TDA based on numerical simulations of idealized tsunamis in a simplified geometry. Here, we further develop, extend, and apply the TC algorithm to more realistic tsunami case studies. These are performed in the area west of Vancouver Island, BC, where Ocean Networks Canada recently deployed a HF radar (in Tofino, BC), to detect tsunamis from far- and near-field sources, up to a 110 km range. Two case studies are considered, both simulated using long wave models: (1) a far-field seismic tsunami and (2) a near-field landslide tsunami. Pending the availability of radar data, a radar signal simulator is parameterized for the Tofino HF radar characteristics, in particular its signal-to-noise ratio with range, and combined with the simulated tsunami currents to produce realistic time series of backscattered radar signal from a dense grid of cells.
Numerical experiments show that the arrival of a tsunami causes a clear change in radar signal correlation patterns, even at the most distant ranges beyond the continental shelf, thus making an early tsunami detection possible with the TC-TDA. Based on these results, we discuss how the new algorithm could be combined with standard methods proposed earlier, based on a Doppler analysis, to develop a new tsunami detection system based on HF radar data, that could increase warning time. This will be the object of future work, which will be based on actual, rather than simulated, radar data.
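The core operation of the TC algorithm, correlating one cell's signal with another cell's signal shifted by the pre-computed long-wave travel time, can be sketched directly (a generic numpy sketch, not the authors' code):

```python
import numpy as np

def lagged_correlation(s1, s2, lag):
    """Pearson correlation of s1 with s2 shifted by `lag` samples.
    In the TC algorithm, `lag` would be the pre-computed long-wave
    propagation time between a pair of radar cells."""
    a, b = np.asarray(s1, dtype=float), np.asarray(s2, dtype=float)
    if lag > 0:
        a, b = a[:-lag], b[lag:]
    return np.corrcoef(a, b)[0, 1]

# toy check: a wave arriving at cell 2 five samples after cell 1
t = np.arange(200)
cell1 = np.sin(0.3 * t)
cell2 = np.roll(cell1, 5)
```

A tsunami arrival shows up as a jump in this lagged correlation along cell pairs aligned with the propagation direction, which is the pattern change the TC-TDA monitors.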
Problem-Based Learning in Accounting
ERIC Educational Resources Information Center
Dockter, DuWayne L.
2012-01-01
Seasoned educators use an assortment of student-centered methods and tools to enhance their student's learning environment. In respects to methodologies used in accounting, educators have utilized and created new forms of problem-based learning exercises, including case studies, simulations, and other projects, to help students become more active…
ERIC Educational Resources Information Center
Hamaker, Ellen L.; Dolan, Conor V.; Molenaar, Peter C. M.
2003-01-01
Demonstrated, through simulation, that stationary autoregressive moving average (ARMA) models may be fitted readily when T>N, using normal theory raw maximum likelihood structural equation modeling. Also provides some illustrations based on real data. (SLD)
NASA Astrophysics Data System (ADS)
Connell, Rasheen M.
At the Howard University Atmospheric Observatory in Beltsville, MD, a Raman Lidar System was developed to provide both daytime and nighttime measurements of water vapor, aerosols, and cirrus clouds with 60 s temporal and 7.5 m spatial resolution in the lower and upper troposphere. This system analyzes signals at three wavelengths associated with Rayleigh/Mie scattering for aerosols and cirrus clouds at 354.7 nm, Raman scattering for nitrogen at 386.7 nm, and water vapor at 407.5 nm. The transmitter is a triple harmonic Nd: YAG solid state laser. The receiver is a 40 cm Cassegrain telescope. The detector system consists of a multi-channel wavelength separator unit and data acquisition system. This thesis develops a numerical model to provide a realistic representation of the system behavior. The variants of the lidar equation in the model use system parameters to solve and determine the return signals for the lidar system. This dissertation describes four case studies being investigated: clear sky, polluted, wet, and cirrus cloud atmospheric conditions. The first simulations are based on a standard atmosphere, which assumes an unpolluted (aerosol-free) dry-air atmosphere. The second and third sets of simulations are based on polluted and cirrus cloud atmospheric conditions, where aerosols and cirrus clouds are added to Case Study I. The last set of simulations is based on a wet atmosphere, where the troposphere is comprised of the same mixture of gases in Case Study II, with the addition of atmospheric water vapor. Lidar signals are simulated over the altitude range covered by our measurements (up to 14 km). Results of our simulations show that the measured and modeled signals agree within 10% over an extended period of time when the system (i.e., such as alignment, filter tuning, etc.) has not changed.
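The variants of the lidar equation in such a model reduce, in the single-scattering elastic case, to a form that is easy to evaluate numerically. The sketch below uses toy exponential profiles and folds all system constants into K (assumptions for illustration, not the system's calibrated values):

```python
import numpy as np

def lidar_return(z, beta, alpha, K=1.0):
    """Single-scattering elastic lidar equation:
        P(z) = K * beta(z) / z^2 * exp(-2 * integral_0^z alpha dz')
    z: range grid (m), beta: backscatter coefficient profile,
    alpha: extinction profile on the same grid, K: lumped system constant."""
    dz = np.diff(z, prepend=0.0)
    tau = np.cumsum(alpha * dz)          # one-way optical depth
    return K * beta / z ** 2 * np.exp(-2.0 * tau)

z = np.arange(7.5, 14000.0, 7.5)         # 7.5 m bins up to 14 km
beta = 1e-6 * np.exp(-z / 8000.0)        # toy molecular-like profile
alpha = 1e-5 * np.exp(-z / 8000.0)
P = lidar_return(z, beta, alpha)
```

Adding aerosol or cirrus layers to beta and alpha at selected altitudes reproduces the polluted and cirrus case studies in spirit.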
Dredging for dilution: A simulation based case study in a Tidal River.
Bilgili, Ata; Proehl, Jeffrey A; Swift, M Robinson
2016-02-01
A 2-D hydrodynamic finite element model with a Lagrangian particle module is used to investigate the effects of dredging on the hydrodynamics and the horizontal dilution of pollutant particles originating from a wastewater treatment facility (WWTF) in tidal Oyster River in New Hampshire, USA. The model is driven by the semi-diurnal (M2) tidal component and includes the effect of flooding and drying of riverine mud flats. The particle tracking method consists of tidal advection plus a horizontal random walk model of sub-grid scale turbulent processes. Our approach is to perform continuous pollutant particle releases from the outfall, simulating three different scenarios: a base case representing the present conditions and two different dredged channel/outfall location configurations. Hydrodynamics are investigated in an Eulerian framework and Lagrangian particle dilution improvement ratios are calculated for all cases. Results show that the simulated hydrodynamics are consistent with observed conditions. Eulerian and Lagrangian residuals predict an outward path suggesting flushing of pollutants on longer (>M2) time scales. Simulated dilution maps show that, in addition to dredging, the relocation of the WWTF outfall into the dredged main channel is required for increased dilution performance. The methodology presented here can be applied to similar managerial problems in all similar systems worldwide with relatively little effort, with the combination of Lagrangian and Eulerian methods working together towards a better solution. The statistical significance brought into the methodology by using a large number of particles (16000 in this case) is to be emphasized, especially with the growing number of networked parallel computer clusters worldwide. This paper improves on the study presented in Bilgili et al., 2006b, by adding an Eulerian analysis.
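The particle tracking described above, tidal advection plus a horizontal random walk for sub-grid turbulence, amounts to the following per-particle update (a generic sketch; the velocity and diffusivity values are illustrative, not from the study):

```python
import math
import random

def step_particle(x, y, u, v, D, dt, rng):
    """One particle update: deterministic advection by the local (u, v)
    tidal velocity plus an isotropic Gaussian random-walk displacement
    with standard deviation sqrt(2*D*dt) per horizontal component,
    representing sub-grid-scale turbulent diffusion D (m^2/s)."""
    sigma = math.sqrt(2.0 * D * dt)
    return (x + u * dt + sigma * rng.gauss(0.0, 1.0),
            y + v * dt + sigma * rng.gauss(0.0, 1.0))

# march a cloud of particles released at the outfall for one time step
rng = random.Random(1)
particles = [step_particle(0.0, 0.0, u=0.1, v=0.0, D=1.0, dt=60.0, rng=rng)
             for _ in range(16000)]
```

With 16000 particles the ensemble mean displacement converges to the advective displacement u*dt, which is what makes the dilution statistics in the study robust.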
A simulation of orientation dependent, global changes in camera sensitivity in ECT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bieszk, J.A.; Hawman, E.G.; Malmin, R.E.
1984-01-01
ECT promises the abilities to: 1) observe radioisotope distributions in a patient without the summation of overlying activity to reduce contrast, and 2) measure quantitatively these distributions to further and more accurately assess organ function. Ideally, camera-based ECT systems should have a performance that is independent of camera orientation or gantry angle. This study is concerned with ECT quantitation errors that can arise from angle-dependent variations of camera sensitivity. Using simulated phantoms representative of heart and liver sections, the effects of sensitivity changes on reconstructed images were assessed both visually and quantitatively based on ROI sums. The sinogram for each test image was simulated with 128 linear digitization and 180 angular views. The global orientation-dependent sensitivity was modelled by applying an angular sensitivity dependence to the sinograms of the test images. Four sensitivity variations were studied. Amplitudes of 0% (as a reference), 5%, 10%, and 25% with a cos θ dependence were studied as well as a cos 2θ dependence with a 5% amplitude. Simulations were done with and without Poisson noise to: 1) determine trends in the quantitative effects as a function of the magnitude of the variation, and 2) see how these effects are manifested in studies having statistics comparable to clinical cases. For the most realistic sensitivity variation (cos θ, 5% ampl.), the ROIs chosen in the present work indicated changes of <0.5% in the noiseless case and <5% for the case with Poisson noise. The effects of statistics appear to dominate any effects due to global, sinusoidal, orientation-dependent sensitivity changes in the cases studied.
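The modelled sensitivity variation amounts to scaling each angular view of the sinogram by a gantry-angle-dependent gain. A generic numpy sketch (the study's dimensions, 180 views by 128 bins, are used for shape only):

```python
import numpy as np

def apply_sensitivity(sinogram, amplitude, harmonic=1):
    """Multiply each angular view of a (n_angles x n_bins) sinogram by a
    global gain 1 + A*cos(k*theta), mimicking the cos(theta) (k=1) and
    cos(2*theta) (k=2) sensitivity variations studied."""
    n_angles = sinogram.shape[0]
    theta = np.linspace(0.0, np.pi, n_angles, endpoint=False)  # 180 deg of views
    gain = 1.0 + amplitude * np.cos(harmonic * theta)
    return sinogram * gain[:, None]

sino = np.ones((180, 128))                  # 180 views, 128 linear bins
modulated = apply_sensitivity(sino, 0.05)   # 5 % cos(theta) variation
```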
Vogelmann, Andrew M.; Fridlind, Ann M.; Toto, Tami; ...
2015-06-19
Observation-based modeling case studies of continental boundary layer clouds have been developed to study cloudy boundary layers, aerosol influences upon them, and their representation in cloud- and global-scale models. Three 60-hour case study periods span the temporal evolution of cumulus, stratiform, and drizzling boundary layer cloud systems, representing mixed and transitional states rather than idealized or canonical cases. Based on in-situ measurements from the RACORO field campaign and remote-sensing observations, the cases are designed with a modular configuration to simplify use in large-eddy simulations (LES) and single-column models. Aircraft measurements of aerosol number size distribution are fit to lognormal functions for concise representation in models. Values of the aerosol hygroscopicity parameter, κ, are derived from observations to be ~0.10, which are lower than the 0.3 typical over continents and suggestive of a large aerosol organic fraction. Ensemble large-scale forcing datasets are derived from the ARM variational analysis, ECMWF forecasts, and a multi-scale data assimilation system. The forcings are assessed through comparison of measured bulk atmospheric and cloud properties to those computed in 'trial' large-eddy simulations, where more efficient run times are enabled through modest reductions in grid resolution and domain size compared to the full-sized LES grid. Simulations capture many of the general features observed, but the state-of-the-art forcings were limited at representing details of cloud onset, and tight gradients and high-resolution transients of importance. Methods for improving the initial conditions and forcings are discussed. The cases developed are available to the general modeling community for studying continental boundary clouds.
NASA Technical Reports Server (NTRS)
Vogelmann, Andrew M.; Fridlind, Ann M.; Toto, Tami; Endo, Satoshi; Lin, Wuyin; Wang, Jian; Feng, Sha; Zhang, Yunyan; Turner, David D.; Liu, Yangang;
2015-01-01
Observation-based modeling case studies of continental boundary layer clouds have been developed to study cloudy boundary layers, aerosol influences upon them, and their representation in cloud- and global-scale models. Three 60 h case study periods span the temporal evolution of cumulus, stratiform, and drizzling boundary layer cloud systems, representing mixed and transitional states rather than idealized or canonical cases. Based on in situ measurements from the Routine AAF (Atmospheric Radiation Measurement (ARM) Aerial Facility) CLOWD (Clouds with Low Optical Water Depth) Optical Radiative Observations (RACORO) field campaign and remote sensing observations, the cases are designed with a modular configuration to simplify use in large-eddy simulations (LES) and single-column models. Aircraft measurements of aerosol number size distribution are fit to lognormal functions for concise representation in models. Values of the aerosol hygroscopicity parameter, kappa, are derived from observations to be approximately 0.10, which are lower than the 0.3 typical over continents and suggestive of a large aerosol organic fraction. Ensemble large-scale forcing data sets are derived from the ARM variational analysis, European Centre for Medium-Range Weather Forecasts, and a multiscale data assimilation system. The forcings are assessed through comparison of measured bulk atmospheric and cloud properties to those computed in "trial" large-eddy simulations, where more efficient run times are enabled through modest reductions in grid resolution and domain size compared to the full-sized LES grid. Simulations capture many of the general features observed, but the state-of-the-art forcings were limited at representing details of cloud onset, and tight gradients and high-resolution transients of importance. Methods for improving the initial conditions and forcings are discussed. 
The cases developed are available to the general modeling community for studying continental boundary clouds.
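Fitting a lognormal mode to aircraft aerosol size-distribution data, as described above, can be sketched with a standard least-squares fit; the mode parameters and noise level are assumed, not RACORO values:

```python
import numpy as np
from scipy.optimize import curve_fit

# Lognormal number size distribution dN/dlnD for a single aerosol mode.
def lognormal_mode(lnD, N, lnDg, lnsg):
    return (N / (np.sqrt(2 * np.pi) * lnsg)
            * np.exp(-(lnD - lnDg) ** 2 / (2 * lnsg ** 2)))

# Synthetic "aircraft" data (assumed values, not RACORO measurements):
# N = 800 cm^-3, geometric mean diameter 0.1 um, geometric std dev 1.6.
rng = np.random.default_rng(2)
D = np.logspace(-2, 0, 40)                       # diameter [um]
true = (800.0, np.log(0.1), np.log(1.6))
obs = lognormal_mode(np.log(D), *true) * (1 + 0.05 * rng.standard_normal(D.size))

popt, _ = curve_fit(lognormal_mode, np.log(D), obs,
                    p0=(500.0, np.log(0.2), np.log(2.0)))
N_fit, Dg_fit, sg_fit = popt[0], np.exp(popt[1]), np.exp(popt[2])
```

Fitting in ln D space with ln-transformed parameters, as here, keeps the geometric mean and geometric standard deviation positive without explicit bounds.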
NASA Astrophysics Data System (ADS)
Vogelmann, Andrew M.; Fridlind, Ann M.; Toto, Tami; Endo, Satoshi; Lin, Wuyin; Wang, Jian; Feng, Sha; Zhang, Yunyan; Turner, David D.; Liu, Yangang; Li, Zhijin; Xie, Shaocheng; Ackerman, Andrew S.; Zhang, Minghua; Khairoutdinov, Marat
2015-06-01
Observation-based modeling case studies of continental boundary layer clouds have been developed to study cloudy boundary layers, aerosol influences upon them, and their representation in cloud- and global-scale models. Three 60 h case study periods span the temporal evolution of cumulus, stratiform, and drizzling boundary layer cloud systems, representing mixed and transitional states rather than idealized or canonical cases. Based on in situ measurements from the Routine AAF (Atmospheric Radiation Measurement (ARM) Aerial Facility) CLOWD (Clouds with Low Optical Water Depth) Optical Radiative Observations (RACORO) field campaign and remote sensing observations, the cases are designed with a modular configuration to simplify use in large-eddy simulations (LES) and single-column models. Aircraft measurements of aerosol number size distribution are fit to lognormal functions for concise representation in models. Values of the aerosol hygroscopicity parameter, κ, are derived from observations to be 0.10, which are lower than the 0.3 typical over continents and suggestive of a large aerosol organic fraction. Ensemble large-scale forcing data sets are derived from the ARM variational analysis, European Centre for Medium-Range Weather Forecasts, and a multiscale data assimilation system. The forcings are assessed through comparison of measured bulk atmospheric and cloud properties to those computed in "trial" large-eddy simulations, where more efficient run times are enabled through modest reductions in grid resolution and domain size compared to the full-sized LES grid. Simulations capture many of the general features observed, but the state-of-the-art forcings were limited at representing details of cloud onset, and tight gradients and high-resolution transients of importance. Methods for improving the initial conditions and forcings are discussed. The cases developed are available to the general modeling community for studying continental boundary clouds.
Stochastic model search with binary outcomes for genome-wide association studies.
Russu, Alberto; Malovini, Alberto; Puca, Annibale A; Bellazzi, Riccardo
2012-06-01
The spread of case-control genome-wide association studies (GWASs) has stimulated the development of new variable selection methods and predictive models. We introduce a novel Bayesian model search algorithm, Binary Outcome Stochastic Search (BOSS), which addresses the model selection problem when the number of predictors far exceeds the number of binary responses. Our method is based on a latent variable model that links the observed outcomes to the underlying genetic variables. A Markov Chain Monte Carlo approach is used for model search and to evaluate the posterior probability of each predictor. BOSS is compared with three established methods (stepwise regression, logistic lasso, and elastic net) in a simulated benchmark. Two real case studies are also investigated: a GWAS on the genetic bases of longevity, and the type 2 diabetes study from the Wellcome Trust Case Control Consortium. Simulations show that BOSS achieves higher precisions than the reference methods while preserving good recall rates. In both experimental studies, BOSS successfully detects genetic polymorphisms previously reported to be associated with the analyzed phenotypes. BOSS outperforms the other methods in terms of F-measure on simulated data. In the two real studies, BOSS successfully detects biologically relevant features, some of which are missed by univariate analysis and the three reference techniques. The proposed algorithm is an advance in the methodology for model selection with a large number of features. Our simulated and experimental results showed that BOSS proves effective in detecting relevant markers while providing a parsimonious model.
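A stripped-down stochastic search over predictor-inclusion indicators can illustrate the idea behind BOSS. Note this toy scores candidate models with a Gaussian (linear-probability) BIC rather than BOSS's latent-variable model for binary outcomes, and all data are synthetic:

```python
import numpy as np

# Synthetic "case-control" data: 15 predictors, of which columns 0 and 1
# truly drive the binary outcome through a latent threshold.
rng = np.random.default_rng(3)
n, p = 200, 15
X = rng.standard_normal((n, p))
y = (1.5 * X[:, 0] - 1.5 * X[:, 1] + rng.standard_normal(n) > 0).astype(float)

def bic(mask):
    """BIC of a linear-probability fit using the included predictors."""
    Xs = np.column_stack([np.ones(n), X[:, mask]])
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = ((y - Xs @ beta) ** 2).sum()
    return n * np.log(rss / n) + (mask.sum() + 1) * np.log(n)

# Metropolis-style search: propose flipping one inclusion bit per iteration.
mask = np.zeros(p, dtype=bool)
visits = np.zeros(p)
current = bic(mask)
n_iter = 3000
for _ in range(n_iter):
    j = rng.integers(p)
    cand = mask.copy()
    cand[j] = ~cand[j]
    score = bic(cand)
    if rng.random() < np.exp(min(0.0, (current - score) / 2)):
        mask, current = cand, score
    visits += mask

inclusion_freq = visits / n_iter   # crude posterior inclusion probabilities
```

The visit frequencies play the role of the posterior inclusion probabilities that BOSS evaluates via MCMC; the truly associated predictors should dominate them.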
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Xiao; Gao, Wenzhong; Wang, Jianhui
The frequency regulation capability of a wind power plant plays an important role in enhancing frequency reliability, especially in an isolated power system with high wind power penetration levels. A comparison of two types of inertial control methods, namely frequency-based inertial control (FBIC) and stepwise inertial control (SIC), is presented in this paper. Comprehensive case studies are carried out to reveal features of the different inertial control methods, simulated in a modified Western Systems Coordinating Council (WSCC) nine-bus power grid using a real-time digital simulator (RTDS) platform. The simulation results provide an insight into the inertial control methods under various scenarios.
Kuselman, Ilya; Pennecchi, Francesca; Epstein, Malka; Fajgelj, Ales; Ellison, Stephen L R
2014-12-01
Monte Carlo simulation of expert judgments on human errors in a chemical analysis was used to determine distributions of the error quantification scores (scores of likelihood and severity, and scores of the effectiveness of a laboratory quality system in preventing the errors). The simulation was based on modeling of expert behavior: confident, reasonably doubting, and irresolute expert judgments were taken into account by means of different probability mass functions (pmfs). As a case study, 36 scenarios of human errors which may occur in elemental analysis of geological samples by ICP-MS were examined. Characteristics of the score distributions for the three pmfs of expert behavior were compared. Variability of the scores, defined as the standard deviation of the simulated score values about the distribution mean, was used to assess score robustness. The range of score values, calculated directly from elicited data and simulated by a Monte Carlo method for different pmfs, was also discussed from the robustness point of view. It was shown that the robustness of the scores obtained in the case study can be assessed as satisfactory for quality risk management and improvement of a laboratory quality system against human errors. Copyright © 2014 Elsevier B.V. All rights reserved.
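Sampling expert scores from different probability mass functions, as in the study's behavior modeling, can be sketched as follows; the 1-5 score scale and the pmf shapes are illustrative assumptions, not those of the paper:

```python
import numpy as np

# Illustrative pmfs over a 1-5 likelihood score, centered on an elicited
# score of 3. A confident expert concentrates mass on the elicited score;
# a doubting or irresolute expert spreads it to neighboring scores.
scores = np.arange(1, 6)
pmfs = {
    "confident":  np.array([0.00, 0.05, 0.90, 0.05, 0.00]),
    "doubting":   np.array([0.05, 0.20, 0.50, 0.20, 0.05]),
    "irresolute": np.array([0.10, 0.25, 0.30, 0.25, 0.10]),
}

rng = np.random.default_rng(4)
n_draws = 20000
variability = {}
for name, pmf in pmfs.items():
    draws = rng.choice(scores, size=n_draws, p=pmf)
    variability[name] = draws.std()   # the robustness proxy used in the abstract
```

As expected, the simulated score variability grows as the modeled expert becomes less confident, which is the quantity the study uses to judge score robustness.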
Environmental Education Technologies in a Social Void: The Case of "Greendrive"
ERIC Educational Resources Information Center
Delicado, Ana
2012-01-01
This article is based on a case study that follows the trajectory of a technological device aimed at environmental education from the engineering laboratory in which it was designed into the contexts in which it is used. "Greendrive" is a driving simulator that accurately reproduces the performance of a vehicle in terms of fuel…
Fan broadband interaction noise modeling using a low-order method
NASA Astrophysics Data System (ADS)
Grace, S. M.
2015-06-01
A low-order method for simulating broadband interaction noise downstream of the fan stage in a turbofan engine is explored in this paper. The particular noise source of interest is due to the interaction of the fan rotor wake with the fan exit guide vanes (FEGVs). The vanes are modeled as flat plates and the method utilizes strip theory relying on unsteady aerodynamic cascade theory at each strip. This paper shows predictions for 6 of the 9 cases from NASA's Source Diagnostic Test (SDT) and all 4 cases from the 2014 Fan Broadband Workshop Fundamental Case 2 (FC2). The turbulence in the rotor wake is taken from hot-wire data for the low speed SDT cases and the FC2 cases. Additionally, four different computational simulations of the rotor wake flow for all of the SDT rotor speeds have been used to determine the rotor wake turbulence parameters. Comparisons between predictions based on the different inputs highlight the possibility of a potential effect present in the hot-wire data for the SDT as well as the importance of accurately describing the turbulence length scale when using this model. The method produces accurate predictions of the spectral shape for all of the cases. It also predicts reasonably well all of the trends that can be considered based on the included cases such as vane geometry, vane count, turbulence level, and rotor speed.
Simulation environment and graphical visualization environment: a COPD use-case
2014-01-01
Background Today, many different tools are developed to execute and visualize physiological models that represent human physiology. Most of these tools run models written in very specific programming languages, which in turn simplifies communication among models. Nevertheless, not all of these tools are able to run models written in different programming languages. In addition, interoperability between such models remains an unresolved issue. Results In this paper we present a simulation environment that allows, first, the execution of models developed in different programming languages and, second, the communication of parameters to interconnect these models. This simulation environment, developed within the Synergy-COPD project, aims at helping bio-researchers and medical students understand the internal mechanisms of the human body through the use of physiological models. The tool is composed of a graphical visualization environment, a web interface through which the user can interact with the models, and a simulation workflow management system comprising a control module and a data warehouse manager. The control module monitors the correct functioning of the whole system. The data warehouse manager is responsible for managing the stored information and supporting its flow among the different modules. The simulation environment has been validated with the integration of three models: two deterministic, i.e., based on linear and differential equations, and one probabilistic, i.e., based on probability theory. These models were selected based on the disease under study in this project, chronic obstructive pulmonary disease. Conclusion It has been shown that the simulation environment presented here allows the user to research and study the internal mechanisms of human physiology through models via a graphical visualization environment.
A new tool for bio-researchers is ready for deployment in various use-case scenarios. PMID:25471327
Graceful Failure and Societal Resilience Analysis Via Agent-Based Modeling and Simulation
NASA Astrophysics Data System (ADS)
Schopf, P. S.; Cioffi-Revilla, C.; Rogers, J. D.; Bassett, J.; Hailegiorgis, A. B.
2014-12-01
Agent-based social modeling is opening up new methodologies for the study of societal response to weather and climate hazards, and providing measures of resiliency that can be studied in many contexts, particularly in coupled human and natural-technological systems (CHANTS). Since CHANTS are complex adaptive systems, societal resiliency may or may not occur, depending on dynamics that lack closed-form solutions. Agent-based modeling has been shown to provide a viable theoretical and methodological approach for analyzing and understanding disasters and societal resiliency in CHANTS. Our approach advances the science of societal resilience through computational modeling and simulation methods that complement earlier statistical and mathematical approaches. We present three case studies of social dynamics modeling that demonstrate the use of these agent-based models. In Central Asia, we examine multiple ensemble simulations with varying climate statistics to see how droughts and zuds (severe winter disasters) affect populations, transmission of wealth across generations, and the overall structure of the social system. In Eastern Africa, we explore how successive episodes of drought affect the adaptive capacity of rural households. Human displacement (mainly rural-to-urban migration) and livelihood transitions (particularly from pastoral to farming) are observed as rural households interact dynamically with the biophysical environment and continually adjust their behavior to accommodate changes in climate. In the far-north case we demonstrate one of the first successful attempts to model the complete climate-permafrost-infrastructure-societal interaction network as a complex adaptive system/CHANTS, implemented as a "federated" agent-based model using evolutionary computation. Analysis of population changes resulting from extreme weather across these and other cases provides evidence for the emergence of new steady states and shifting patterns of resilience.
Log file-based patient dose calculations of double-arc VMAT for head-and-neck radiotherapy.
Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Majima, Kazuhiro; Matsushita, Haruo; Takeda, Ken; Jingu, Keiichi
2018-04-01
The log file-based method cannot display dosimetric changes due to linac component miscalibration because log files are insensitive to such miscalibration. The purpose of this study was to quantify dosimetric changes in log file-based patient dose calculations for double-arc volumetric-modulated arc therapy (VMAT) in head-and-neck cases. Fifteen head-and-neck cases were included in this study. For each case, treatment planning system (TPS) doses were produced by double-arc and single-arc VMAT. Miscalibration-simulated log files were generated by inducing a leaf miscalibration of ±0.5 mm into the log files that were acquired during VMAT irradiation. Subsequently, patient doses were estimated using the miscalibration-simulated log files. For double-arc VMAT, regarding the planning target volume (PTV), the change from the TPS dose to the miscalibration-simulated log file dose in Dmean was 0.9 Gy, and that for tumor control probability was 1.4%. As for organs-at-risk (OARs), the change in Dmean was <0.7 Gy and in normal tissue complication probability was <1.8%. A comparison between double-arc and single-arc VMAT for the PTV showed statistically significant differences in the changes evaluated by Dmean and radiobiological metrics (P < 0.01), even though the magnitude of these differences was small. Similarly, for OARs, the magnitude of these changes was found to be small. For the PTV and OARs, the log file-based estimate of patient dose using double-arc VMAT has accuracy comparable to that obtained using single-arc VMAT. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
A Comparison of Educational Interventions to Enhance Cultural Competency in Pharmacy Students
Jonkman, Lauren; Connor, Sharon; Hall, Deanne
2013-01-01
Objective. To determine the degree to which 3 different educational interventions enhance cultural competency in pharmacy students. Methods. Students were equally divided among a simulated-patient activity group, a written case-scenarios group, and a formal lecture group. Mean scores for pre- and post-intervention cultural self-assessment surveys were compared. Results. In the simulation group, there were significant positive changes in the cultural skills and cultural desire components; in the case-scenario group, there was a significant positive change in the cultural awareness component; and in the lecture group, there were significant positive changes in the cultural skills and cultural empathy components. With respect to the cultural skills component, there was greater post-intervention improvement in the simulation and lecture groups than in the case-scenario group. Conclusions. There were significant positive changes within each group, indicating that ideologies and behaviors may be altered based on the educational intervention received. However, a 1-hour practicum may not be sufficient to enhance cultural competency. PMID:23716744
An analytical study of reduced-gravity propellant settling
NASA Technical Reports Server (NTRS)
Bradshaw, R. D.; Kramer, J. L.; Masica, W. J.
1974-01-01
Full-scale propellant reorientation flow dynamics for the Centaur D-1T fuel tank were analyzed. A computer code using the simplified marker and cell technique was modified to include the capability for a variable-grid mesh configuration. Use of smaller cells near the boundary, near baffles, and in corners provides improved flow resolution. Two drop tower model cases were simulated to verify program validity: one case without baffles, the other with baffles and geometry identical to Centaur D-1T. Flow phenomena using the new code successfully modeled drop tower data. Baffles are a positive factor in the settling flow. Two full-scale Centaur D-1T cases were simulated using parameters based on the Titan/Centaur proof flight. These flow simulations indicated the time to clear the vent area and an indication of time to orient and collect the propellant. The results further indicated the complexity of the reorientation flow and the long time period required for settling.
One-shot estimate of MRMC variance: AUC.
Gallas, Brandon D
2006-03-01
One popular study design for estimating the area under the receiver operating characteristic curve (AUC) is the one in which a set of readers reads a set of cases: a fully crossed design in which every reader reads every case. The variability of the subsequent reader-averaged AUC has two sources: the multiple readers and the multiple cases (MRMC). In this article, we present a nonparametric estimate for the variance of the reader-averaged AUC that is unbiased and does not use resampling tools. The one-shot estimate is based on the MRMC variance derived by the mechanistic approach of Barrett et al. (2005), as well as the nonparametric variance of a single-reader AUC derived in the literature on U statistics. We investigate the bias and variance properties of the one-shot estimate through a set of Monte Carlo simulations with simulated model observers and images. The different simulation configurations vary numbers of readers and cases, amounts of image noise and internal noise, as well as how the readers are constructed. We compare the one-shot estimate to a method that uses the jackknife resampling technique with an analysis of variance model at its foundation (Dorfman et al. 1992). The name one-shot highlights that resampling is not used. The one-shot and jackknife estimators behave similarly, with the one-shot being marginally more efficient when the number of cases is small. We have derived a one-shot estimate of the MRMC variance of AUC that is based on a probabilistic foundation with limited assumptions, is unbiased, and compares favorably to an established estimate.
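The building block of the one-shot estimate, the nonparametric (U-statistic) AUC averaged over readers in a fully crossed design, can be sketched as follows; the reader/case model below is an assumed toy, and the one-shot variance formula itself is not reproduced here:

```python
import numpy as np

# Fully crossed MRMC design: every reader scores every case. Scores are
# synthetic (latent case effects shared across readers plus reader noise),
# not the paper's model-observer simulations.
rng = np.random.default_rng(5)
n_readers, n_neg, n_pos = 5, 40, 40
case_neg = rng.standard_normal(n_neg)
case_pos = rng.standard_normal(n_pos) + 1.5
neg = case_neg + 0.5 * rng.standard_normal((n_readers, n_neg))
pos = case_pos + 0.5 * rng.standard_normal((n_readers, n_pos))

def auc(x0, x1):
    """Mann-Whitney (U-statistic) estimate of AUC, with tie correction."""
    diff = x1[:, None] - x0[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

reader_aucs = np.array([auc(neg[r], pos[r]) for r in range(n_readers)])
mean_auc = reader_aucs.mean()
```

The one-shot estimator then combines moments of these pairwise comparison kernels across readers and cases to get an unbiased variance of `mean_auc` without resampling.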
Hamel, J F; Sebille, V; Le Neel, T; Kubis, G; Boyer, F C; Hardouin, J B
2017-12-01
Subjective health measurements using Patient Reported Outcomes (PRO) are increasingly used in randomized trials, particularly for patient group comparisons. Two main types of analytical strategies can be used for such data: Classical Test Theory (CTT) and Item Response Theory (IRT) models. These two strategies display very similar characteristics when data are complete, but in the common case when data are missing, whether IRT or CTT would be the most appropriate remains unknown and was investigated using simulations. We simulated PRO data such as quality-of-life data. Missing responses to items were simulated as being completely random, depending on an observable covariate, or depending on an unobserved latent trait. The CTT-based methods considered allowed comparison of scores using complete-case analysis, personal mean imputation, or multiple imputation based on a two-way procedure. The IRT-based method was the Wald test on a Rasch model including a group covariate. The IRT-based method and the multiple-imputation-based CTT method displayed the highest observed power and were the only unbiased methods whatever the kind of missing data. Online software and Stata® modules compatible with the built-in mi impute suite are provided for performing such analyses. Traditional procedures (listwise deletion and personal mean imputation) should be avoided, due to inevitable problems of bias and lack of power.
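The contrast between complete-case analysis and personal mean imputation, two of the CTT strategies compared above, can be sketched on synthetic item-response data; the score scale and missingness rate are assumptions:

```python
import numpy as np

# Toy PRO dataset: 300 patients x 10 items scored 0-4, with completely-at-
# random missingness on items 1-9 (item 0 is kept observed so that every
# patient has at least one observed item).
rng = np.random.default_rng(6)
n, items = 300, 10
data = rng.integers(0, 5, size=(n, items)).astype(float)
miss = rng.random((n, items)) < 0.2            # 20% MCAR missingness
miss[:, 0] = False
data_miss = np.where(miss, np.nan, data)

# Complete-case analysis: drop any patient with a missing item.
complete = ~np.isnan(data_miss).any(axis=1)
cc_scores = data_miss[complete].mean(axis=1)

# Personal mean imputation: replace a patient's missing items by the mean
# of that patient's observed items.
pm = np.where(np.isnan(data_miss),
              np.nanmean(data_miss, axis=1, keepdims=True),
              data_miss)
pm_scores = pm.mean(axis=1)
```

With ten items and 20% missingness, complete-case analysis discards most patients, which illustrates the loss of power the abstract warns about; personal mean imputation keeps everyone but, as the abstract concludes, is biased when missingness depends on the latent trait.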
A simulation for teaching the basic and clinical science of fluid therapy.
Rawson, Richard E; Dispensa, Marilyn E; Goldstein, Richard E; Nicholson, Kimberley W; Vidal, Noni Korf
2009-09-01
The course "Management of Fluid and Electrolyte Disorders" is an applied physiology course taught using lectures and paper-based cases. The course approaches fluid therapy from both basic science and clinical perspectives. While paper cases provide a basis for application of basic science concepts, they lack key components of genuine clinical cases that, by nature, are diverse, change over time, and respond in unique ways to therapeutic interventions. We developed a dynamic model using STELLA software that simulates normal and abnormal fluid and electrolyte balance in the dog. Students interact, not with the underlying model, but with a user interface that provides sufficient data (skin turgor, chemistry panel, etc.) for the clinical assessment of patients and an opportunity for treatment. Students administer fluids and supplements, and the model responds in "real time," requiring regular reassessment and, potentially, adaptation of the treatment strategy. The level of success is determined by clinical outcome, including improvement, deterioration, or death. We expected that the simulated cases could be used to teach both the clinical and basic science of fluid therapy. The simulation provides exposure to a realistic clinical environment, and students tend to focus on this aspect of the simulation while, for the most part, ignoring an exploration of the underlying physiological basis for patient responses. We discuss how the instructor's expertise can provide sufficient support, feedback, and scaffolding so that students can extract maximum understanding of the basic science in the context of assessing and treating at the clinical level.
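A dynamic fluid-balance simulation of the kind described, where treatment changes the patient state over time, can be sketched with a single-compartment model; all volumes and rates are illustrative assumptions, not the course's STELLA model:

```python
import numpy as np

# Minimal sketch: one extracellular fluid (ECF) compartment for a
# dehydrated dog, with ongoing losses, maintenance needs, and an IV
# crystalloid infusion. Values are assumed for illustration only.
ecf_normal = 2.4      # target ECF volume [L] (assumed for a ~12 kg dog)
ecf = 1.9             # dehydrated starting volume [L]
loss_rate = 0.010     # ongoing losses [L/h] (assumed)
maint_rate = 0.020    # maintenance requirement [L/h] (assumed)
deficit = ecf_normal - ecf

# Classic replacement plan: deficit over 24 h + maintenance + ongoing losses.
infusion_rate = deficit / 24.0 + maint_rate + loss_rate

dt = 0.1              # simulation step [h]
history = []
for step in range(int(24 / dt)):
    ecf += (infusion_rate - maint_rate - loss_rate) * dt   # net fluid balance
    history.append(ecf)

final_ecf = history[-1]
```

Extending this with electrolyte pools and clinical outputs (skin turgor, chemistry panel) driven by the compartment state gives the kind of "real time" reassessment loop the course simulation provides.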
Docherty, Paul D; Schranz, Christoph; Chase, J Geoffrey; Chiew, Yeong Shiong; Möller, Knut
2014-05-01
Accurate model parameter identification relies on accurate forward model simulations to guide convergence. However, some forward simulation methodologies lack the precision required to properly define the local objective surface and can cause failed parameter identification. The role of objective surface smoothness in identification of a pulmonary mechanics model was assessed using forward simulation from a novel error-stepping method and a proprietary Runge-Kutta method. The objective surfaces were compared via the identified parameter discrepancy generated in a Monte Carlo simulation and the local smoothness of the objective surfaces they generate. The error-stepping method generated significantly smoother error surfaces in each of the cases tested (p<0.0001) and more accurate model parameter estimates than the Runge-Kutta method in three of the four cases tested (p<0.0001), despite a 75% reduction in computational cost. Of note, parameter discrepancy in most cases was limited to a particular oblique plane, indicating that a non-intuitive multi-parameter trade-off was occurring. The error-stepping method consistently improved or equalled the outcomes of the Runge-Kutta time-integration method for forward simulations of the pulmonary mechanics model. This study indicates that accurate parameter identification relies on accurate definition of the local objective function, and that parameter trade-off can occur on oblique planes, resulting in prematurely halted parameter convergence. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Khanduja, P Kristina; Bould, M Dylan; Naik, Viren N; Hladkowicz, Emily; Boet, Sylvain
2015-01-01
We systematically reviewed the effectiveness of simulation-based education targeting independently practicing qualified physicians in acute care specialties. We also describe how simulation is used for performance assessment in this population. Data sources included MEDLINE, Embase, the Cochrane Database of Systematic Reviews, the Cochrane Central Register of Controlled Trials (CENTRAL), and the National Health Service Economic Evaluation Database. The last date of search was January 31, 2013. All original research describing simulation-based education for independently practicing physicians in anesthesiology, critical care, and emergency medicine was reviewed. Data analysis was performed in duplicate, with further review by a third author in cases of disagreement until consensus was reached. Data extraction was focused on effectiveness according to Kirkpatrick's model. For simulation-based performance assessment, tool characteristics and sources of validity evidence were also collated. Of 39 studies identified, 30 focused on the effectiveness of simulation-based education and nine evaluated the validity of simulation-based assessment. Thirteen studies (30%) targeted the lower levels of Kirkpatrick's hierarchy with reliance on self-reporting. Simulation was unanimously described as a positive learning experience with perceived impact on clinical practice. Of the 17 remaining studies, 10 used a single group or "no intervention comparison group" design. The majority (n = 17; 44%) were able to demonstrate both immediate and sustained improvements in educational outcomes. Nine studies reported the psychometric properties of simulation-based performance assessment as their sole objective. These predominantly recruited independent practitioners as a convenience sample to establish whether the tool could discriminate between experienced and inexperienced operators, and concentrated on a single aspect of validity evidence.
Simulation is perceived as a positive learning experience with limited evidence to support improved learning. Future research should focus on the optimal modality and frequency of exposure, quality of assessment tools and on the impact of simulation-based education beyond the individuals toward improved patient care.
Embedding Research in a Field-Based Module through Peer Review and Assessment for Learning
ERIC Educational Resources Information Center
Nicholson, Dawn T.
2011-01-01
A case study is presented of embedding research in a final year undergraduate, field-based, physical geography module. The approach is holistic, whereby research-based learning activities simulate the full life cycle of research from inception through to peer review and publication. The learning, teaching and assessment strategy emphasizes the…
Virtual worlds and team training.
Dev, Parvati; Youngblood, Patricia; Heinrichs, W Leroy; Kusumoto, Laura
2007-06-01
An important component of all emergency medicine residency programs is managing trauma effectively as a member of an emergency medicine team, but practice on live patients is often impractical and mannequin-based simulators are expensive and require all trainees to be physically present at the same location. This article describes a project to develop and evaluate a computer-based simulator (the Virtual Emergency Department) for distance training in teamwork and leadership in trauma management. The virtual environment provides repeated practice opportunities with life-threatening trauma cases in a safe and reproducible setting.
Automated Simulation Updates based on Flight Data
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; Ward, David G.
2007-01-01
A statistically-based method for using flight data to update aerodynamic data tables used in flight simulators is explained and demonstrated. A simplified wind-tunnel aerodynamic database for the F/A-18 aircraft is used as a starting point. Flight data from the NASA F-18 High Alpha Research Vehicle (HARV) is then used to update the data tables so that the resulting aerodynamic model characterizes the aerodynamics of the F-18 HARV. Prediction cases are used to show the effectiveness of the automated method, which requires no ad hoc adjustments by the analyst.
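A minimal version of updating an aerodynamic lookup table from flight data can be sketched as a regularized least-squares fit of the table's node values; the table, "flight" data, and regularization weight are all assumed for illustration, not the F-18 HARV database:

```python
import numpy as np

# Prior lookup table: lift-coefficient-like values at angle-of-attack nodes.
rng = np.random.default_rng(7)
alpha_nodes = np.linspace(0, 40, 9)                       # breakpoints [deg]
cl_table = 0.08 * alpha_nodes / (1 + alpha_nodes / 30)    # "wind-tunnel" values

# Synthetic flight data from a slightly different "true" aerodynamic model.
alpha_fl = rng.uniform(0, 40, 200)
cl_fl = 0.09 * alpha_fl / (1 + alpha_fl / 25) + 0.01 * rng.standard_normal(200)

# Linear-interpolation weight matrix W, so that cl_pred = W @ node_values.
W = np.zeros((alpha_fl.size, alpha_nodes.size))
idx = np.clip(np.searchsorted(alpha_nodes, alpha_fl) - 1, 0, alpha_nodes.size - 2)
t = (alpha_fl - alpha_nodes[idx]) / (alpha_nodes[idx + 1] - alpha_nodes[idx])
W[np.arange(alpha_fl.size), idx] = 1 - t
W[np.arange(alpha_fl.size), idx + 1] = t

# Regularized least squares: fit the flight data, stay near the prior table.
lam = 0.1
A = np.vstack([W, np.sqrt(lam) * np.eye(alpha_nodes.size)])
b = np.concatenate([cl_fl, np.sqrt(lam) * cl_table])
cl_updated, *_ = np.linalg.lstsq(A, b, rcond=None)

rmse_prior = np.sqrt(np.mean((W @ cl_table - cl_fl) ** 2))
rmse_updated = np.sqrt(np.mean((W @ cl_updated - cl_fl) ** 2))
```

The regularization term keeps poorly observed table nodes anchored to the prior wind-tunnel values, a simple stand-in for the statistical weighting an automated update method would apply.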
Mesoscale acid deposition modeling studies
NASA Technical Reports Server (NTRS)
Kaplan, Michael L.; Proctor, F. H.; Zack, John W.; Karyampudi, V. Mohan; Price, P. E.; Bousquet, M. D.; Coats, G. D.
1989-01-01
The work performed in support of the EPA/DOE MADS (Mesoscale Acid Deposition) Project included the development of meteorological data bases for the initialization of chemistry models, the testing and implementation of new planetary boundary layer parameterization schemes in the MASS model, the simulation of transport and precipitation for MADS case studies employing the MASS model, and the use of the TASS model in the simulation of cloud statistics and the complex transport of conservative tracers within simulated cumuloform clouds. The work performed in support of the NASA/FAA Wind Shear Program included the use of the TASS model in the simulation of the dynamical processes within convective cloud systems, the analyses of the sensitivity of microburst intensity and general characteristics as a function of the atmospheric environment within which they are formed, comparisons of TASS model microburst simulation results to observed data sets, and the generation of simulated wind shear data bases for use by the aviation meteorological community in the evaluation of flight hazards caused by microbursts.
A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON
King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix
2008-01-01
As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only one part of a chain of tools ranging from setup, simulation, and interaction with virtual environments to analysis and visualization. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which may in part be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. PMID:19430597
Wind Turbine Control Design to Reduce Capital Costs: 7 January 2009 - 31 August 2009
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darrow, P. J.
2010-01-01
This report first discusses and identifies which wind turbine components can benefit from advanced control algorithms and also presents results from a preliminary loads case analysis using a baseline controller. Next, it describes the design, implementation, and simulation-based testing of an advanced controller to reduce loads on those components. The case-by-case loads analysis and advanced controller design will help guide future control research.
ERIC Educational Resources Information Center
Rector-Aranda, Amy; Raider-Roth, Miriam; Glaser, Noah; Behrman, Matthew
2017-01-01
This study explores the relationship between character selection and student engagement in the Jewish Court of All Time (JCAT), an online and classroom-based role-playing simulation of a current events court case with Jewish historical roots. Analyzing students' responses to three questions posed in an out-of-character JCAT discussion forum, we…
ERIC Educational Resources Information Center
Eisenhardt, Alyson; Ninassi, Susanne Bruno
2016-01-01
Many pedagogy experts suggest the use of real world scenarios and simulations as a means of teaching students to apply decision analysis concepts to their field of study. These methods allow students an opportunity to synthesize knowledge, skills, and abilities by presenting a field-based dilemma. The use of real world scenarios and simulations…
ERIC Educational Resources Information Center
Ledford, Jennifer R.; Ayres, Kevin M.; Lane, Justin D.; Lam, Man Fung
2015-01-01
Momentary time sampling (MTS), whole interval recording (WIR), and partial interval recording (PIR) are commonly used in applied research. We discuss potential difficulties with analyzing data when these systems are used and present results from a pilot simulation study designed to determine the extent to which these issues are likely to be…
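The three interval-recording systems named above can be compared directly on simulated data. The sketch below is a minimal illustration (the bout-switching probability and session length are arbitrary, not the authors' pilot design); it shows the structural direction of bias each system carries: PIR can only overestimate the true proportion of time the behavior occurs, and WIR can only underestimate it.

```python
import random

def score_interval(stream, start, length, method):
    """Score one observation interval of a boolean behavior stream."""
    window = stream[start:start + length]
    if method == "MTS":   # momentary time sampling: only the final moment counts
        return window[-1]
    if method == "WIR":   # whole-interval: behavior must fill the entire interval
        return all(window)
    if method == "PIR":   # partial-interval: any occurrence counts
        return any(window)
    raise ValueError(method)

def estimate_duration(stream, interval, method):
    """Proportion of intervals scored positive: the estimate of behavior duration."""
    scores = [score_interval(stream, i, interval, method)
              for i in range(0, len(stream) - interval + 1, interval)]
    return sum(scores) / len(scores)

# Simulate one 900-s session at 1-s resolution with the behavior toggling in bouts.
random.seed(1)
stream, on = [], False
for _ in range(900):
    if random.random() < 0.05:   # illustrative bout-switching probability
        on = not on
    stream.append(on)

true_prop = sum(stream) / len(stream)
for m in ("MTS", "WIR", "PIR"):
    print(m, round(estimate_duration(stream, 10, m), 2), "true:", round(true_prop, 2))
```

Because a PIR interval scores 1 whenever the per-interval occurrence fraction is nonzero, and a WIR interval scores 1 only when that fraction is exactly 1, the averaged PIR estimate always bounds the true proportion from above and WIR bounds it from below.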
Weller, Jennifer; Henderson, Robert; Webster, Craig S; Shulruf, Boaz; Torrie, Jane; Davies, Elaine; Henderson, Kaylene; Frampton, Chris; Merry, Alan F
2014-01-01
Effective teamwork is important for patient safety, and verbal communication underpins many dimensions of teamwork. The validity of the simulated environment would be supported if it elicited similar verbal communications to the real setting. The authors hypothesized that anesthesiologists would exhibit similar verbal communication patterns in routine operating room (OR) cases and routine simulated cases. The authors further hypothesized that anesthesiologists would exhibit different communication patterns in routine cases (real or simulated) and simulated cases involving a crisis. Key communications relevant to teamwork were coded from video recordings of anesthesiologists in the OR, in routine simulation, and in crisis simulation, and percentages were compared. The authors recorded comparable videos of 20 anesthesiologists in the two simulations, and 17 of these anesthesiologists in the OR, generating 400 coded events in the OR, 683 in the routine simulation, and 1,419 in the crisis simulation. The authors found no significant differences in communication patterns in the OR and the routine simulations. The authors did find significant differences in communication patterns between the crisis simulation and both the OR and the routine simulations. Participants rated team communication as realistic and considered that their communications occurred with a similar frequency in the simulations as in comparable cases in the OR. The similarity of teamwork-related communications elicited from anesthesiologists in simulated cases and the real setting lends support for the ecological validity of the simulation environment and its value in teamwork training. Different communication patterns and frequencies under the challenge of a crisis support the use of simulation to assess crisis management skills.
Vane Pump Casing Machining of Dumpling Machine Based on CAD/CAM
NASA Astrophysics Data System (ADS)
Huang, Yusen; Li, Shilong; Li, Chengcheng; Yang, Zhen
The automatic dumpling forming machine, also called a dumpling machine, makes dumplings through mechanical motions. This paper adopts a stuffing delivery mechanism featuring an improved, specially-designed vane pump casing, which contributes to the formation of dumplings. Its 3D modeling in Pro/E software, machining process planning, milling path optimization, simulation in UG, and post-processor program generation were introduced and verified. The results indicated that adoption of CAD/CAM offers firms the potential to pursue new innovative strategies.
2014-01-01
Seven different types of gasification-based coal conversion processes for producing mainly electricity and in some cases hydrogen (H2), with and without carbon dioxide (CO2) capture, were compared on a consistent basis through simulation studies. The flowsheet for each process was developed in a chemical process simulation tool “Aspen Plus”. The pressure swing adsorption (PSA), physical absorption (Selexol), and chemical looping combustion (CLC) technologies were separately analyzed for processes with CO2 capture. The performances of the above three capture technologies were compared with respect to energetic and exergetic efficiencies, and the level of CO2 emission. The effect of air separation unit (ASU) and gas turbine (GT) integration on the power output of all the CO2 capture cases is assessed. Sensitivity analysis was carried out for the CLC process (electricity-only case) to examine the effect of temperature and water-cooling of the air reactor on the overall efficiency of the process. The results show that, when only electricity production is considered, the case using CLC technology has an electrical efficiency 1.3% and 2.3% higher than the PSA and Selexol based cases, respectively. The CLC based process achieves an overall CO2 capture efficiency of 99.9% in contrast to 89.9% for PSA and 93.5% for Selexol based processes. The overall efficiency of the CLC case for combined electricity and H2 production is marginally higher (by 0.3%) than Selexol and lower (by 0.6%) than PSA cases. The integration between the ASU and GT units benefits all three technologies in terms of electrical efficiency. Furthermore, our results suggest that it is favorable to operate the air reactor of the CLC process at higher temperatures with excess air supply in order to achieve higher power efficiency. PMID:24578590
Mukherjee, Sanjay; Kumar, Prashant; Hosseini, Ali; Yang, Aidong; Fennell, Paul
2014-02-20
Empirical validation of an agent-based model of wood markets in Switzerland
Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver
2018-01-01
We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Our own surveys closed gaps where data were not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300
NASA Astrophysics Data System (ADS)
Klein, Andreas; Gerlach, Gerald
1998-09-01
This paper deals with the simulation of fluid-structure interaction phenomena in micropumps. The proposed solution approach is based on the external coupling of two different solvers, which are considered here as 'black boxes'. Therefore, no specific intervention into the program code is necessary, and solvers can be exchanged arbitrarily. For the realization of the external iteration loop, two algorithms are considered: the relaxation-based Gauss-Seidel method and the computationally more expensive Newton method. It is demonstrated on a simplified test case that, for rather weak coupling, the Gauss-Seidel method is sufficient. However, by simply changing the considered fluid from air to water, the two physical domains become strongly coupled, and the Gauss-Seidel method fails to converge in this case. The Newton iteration scheme must be used instead.
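The external Gauss-Seidel coupling loop described above can be sketched with scalar stand-ins for the two black-box solvers. This is a minimal illustration under assumed toy "physics" (the lambdas below are not micropump models); it also reproduces the qualitative behavior from the abstract: the relaxed fixed-point sweep converges for weak coupling and diverges when the composed interaction is strong.

```python
def coupled_solve(fluid_solve, struct_solve, x0, relax=0.5, tol=1e-12, max_iter=500):
    """Gauss-Seidel (fixed-point) coupling of two black-box solvers with
    under-relaxation; raises if the partitioned iteration fails to converge."""
    x = x0
    for k in range(max_iter):
        x_new = struct_solve(fluid_solve(x))    # one sweep through both solvers
        if abs(x_new - x) < tol:
            return x_new, k
        x = (1.0 - relax) * x + relax * x_new   # relaxation step
    raise RuntimeError("coupling iteration did not converge (strong coupling?)")

# Weak coupling: the composed map is a contraction, so the sweep converges.
fluid = lambda disp: 2.0 - 0.3 * disp   # pressure load from displacement (toy)
struct = lambda load: 0.5 * load        # displacement from pressure load (toy)
x, iters = coupled_solve(fluid, struct, 0.0)
print(round(x, 6), iters)               # fixed point of x = 0.5*(2 - 0.3*x)

# Strong coupling: the same loop diverges, mimicking the air-to-water case;
# a Newton scheme on the coupled residual would be needed instead.
try:
    coupled_solve(lambda d: 2.0 - 8.0 * d, lambda p: 0.5 * p, 0.0)
except RuntimeError as e:
    print("strong coupling:", e)
```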
Vrijheid, Martine; Deltour, Isabelle; Krewski, Daniel; Sanchez, Marie; Cardis, Elisabeth
2006-07-01
This paper examines the effects of systematic and random errors in recall and of selection bias in case-control studies of mobile phone use and cancer. These sensitivity analyses are based on Monte-Carlo computer simulations and were carried out within the INTERPHONE Study, an international collaborative case-control study in 13 countries. Recall error scenarios simulated plausible values of random and systematic, non-differential and differential recall errors in amount of mobile phone use reported by study subjects. Plausible values for the recall error were obtained from validation studies. Selection bias scenarios assumed varying selection probabilities for cases and controls, mobile phone users, and non-users. Where possible these selection probabilities were based on existing information from non-respondents in INTERPHONE. Simulations used exposure distributions based on existing INTERPHONE data and assumed varying levels of the true risk of brain cancer related to mobile phone use. Results suggest that random recall errors of plausible levels can lead to a large underestimation in the risk of brain cancer associated with mobile phone use. Random errors were found to have larger impact than plausible systematic errors. Differential errors in recall had very little additional impact in the presence of large random errors. Selection bias resulting from underselection of unexposed controls led to J-shaped exposure-response patterns, with risk apparently decreasing at low to moderate exposure levels. The present results, in conjunction with those of the validation studies conducted within the INTERPHONE study, will play an important role in the interpretation of existing and future case-control studies of mobile phone use and cancer risk, including the INTERPHONE study.
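The effect of non-differential random recall error can be illustrated with a much cruder simulation than INTERPHONE's: the sketch below (illustrative shift, noise, and sample-size parameters; a dichotomized 2x2-table odds ratio instead of the study's exposure-response models) shows the attenuation toward the null that the abstract reports.

```python
import random

def simulate_or(n=20000, shift=0.5, recall_sd=0.0, seed=42):
    """Crude case-control simulation: cases' true exposure is shifted upward
    relative to controls; non-differential random recall error (recall_sd) is
    added to everyone's *reported* exposure before dichotomizing at zero.
    Returns the odds ratio from the resulting 2x2 table."""
    rng = random.Random(seed)
    def reported(true_mean):
        return rng.gauss(true_mean, 1.0) + rng.gauss(0.0, recall_sd)
    a = sum(reported(shift) > 0.0 for _ in range(n))   # exposed cases
    c = sum(reported(0.0) > 0.0 for _ in range(n))     # exposed controls
    b, d = n - a, n - c                                # unexposed cases/controls
    return (a * d) / (b * c)

or_clean = simulate_or(recall_sd=0.0)
or_noisy = simulate_or(recall_sd=1.0)   # plausible random recall error
print(round(or_clean, 2), round(or_noisy, 2))
```

Adding random noise to the reported exposure blurs the case-control difference, so the observed odds ratio falls between the error-free estimate and 1.0: an underestimate of the true association, as the abstract describes.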
Unsteady Computational Tests of a Non-Equilibrium
NASA Astrophysics Data System (ADS)
Jirasek, Adam; Hamlington, Peter; Lofthouse, Andrew; Usafa Collaboration; Cu Boulder Collaboration
2017-11-01
A non-equilibrium turbulence model is assessed on simulations of three practically relevant unsteady test cases: oscillating channel flow, transonic flow around an oscillating airfoil, and transonic flow around the Benchmark Super-Critical Wing. The first case is related to piston-driven flows while the remaining cases are relevant to unsteady aerodynamics at high angles of attack and transonic speeds. Non-equilibrium turbulence effects arise in each of these cases in the form of a lag between the mean strain rate and Reynolds stresses, resulting in reduced kinetic energy production compared to classical equilibrium turbulence models that are based on the gradient transport (or Boussinesq) hypothesis. As a result of the improved representation of unsteady flow effects, the non-equilibrium model provides substantially better agreement with available experimental data than do classical equilibrium turbulence models. This suggests that the non-equilibrium model may be ideally suited for simulations of modern high-speed, high angle of attack aerodynamics problems.
Description of waste pretreatment and interfacing systems dynamic simulation model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garbrick, D.J.; Zimmerman, B.D.
1995-05-01
The Waste Pretreatment and Interfacing Systems Dynamic Simulation Model was created to investigate the required pretreatment facility processing rates for both high level and low level waste so that the vitrification of tank waste can be completed according to the milestones defined in the Tri-Party Agreement (TPA). In order to achieve this objective, the processes upstream and downstream of the pretreatment facilities must also be included. The simulation model starts with retrieval of tank waste and ends with vitrification for both low level and high level wastes. This report describes the results of three simulation cases: one based on suggested average facility processing rates, one with facility rates determined so that approximately 6 new DSTs are required, and one with facility rates determined so that approximately no new DSTs are required. It appears, based on the simulation results, that reasonable facility processing rates can be selected so that no new DSTs are required by the TWRS program. However, this conclusion must be viewed with respect to the modeling assumptions, described in detail in the report. Also included in the report, in an appendix, are results of two sensitivity cases: one with glass plant water recycle streams recycled versus not recycled, and one employing the TPA SST retrieval schedule versus a more uniform SST retrieval schedule. Both recycling and retrieval schedule appear to have a significant impact on overall tank usage.
A new algorithm for modeling friction in dynamic mechanical systems
NASA Technical Reports Server (NTRS)
Hill, R. E.
1988-01-01
A method of modeling friction forces that impede the motion of parts of dynamic mechanical systems is described. Conventional methods in which the friction effect is assumed to be a constant force, or torque, in a direction opposite to the relative motion, are applicable only to those cases where applied forces are large in comparison to the friction, and where there is little interest in system behavior close to the times of transitions through zero velocity. An algorithm is described that provides accurate determination of friction forces over a wide range of applied force and velocity conditions. The method avoids the simulation errors resulting from a finite integration interval used in connection with a conventional friction model, as is the case in many digital computer-based simulations. The algorithm incorporates a predictive calculation based on initial conditions of motion, externally applied forces, inertia, and integration step size. The predictive calculation in connection with an external integration process provides an accurate determination of both static and Coulomb friction forces and resulting motions in dynamic simulations. Accuracy of the results is improved over that obtained with conventional methods and a relatively large integration step size is permitted. A function block for incorporation in a specific simulation program is described. The general form of the algorithm facilitates implementation with various programming languages such as FORTRAN or C, as well as with other simulation programs.
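The predictive logic described above can be sketched in one dimension. This is a minimal illustration, not the report's function block, and the masses, step sizes, and friction levels are arbitrary: the point is that predicting the zero crossing and testing the static-friction limit lets the body come to rest exactly, instead of oscillating around zero velocity as a fixed opposing force would at a finite step size.

```python
def friction_step(v, f_applied, m, dt, f_static, f_coulomb):
    """Advance velocity one step with a predictive friction calculation."""
    if v == 0.0:
        if abs(f_applied) <= f_static:
            return 0.0                                    # static friction holds
        f_net = f_applied - f_coulomb * (1.0 if f_applied > 0 else -1.0)
        return f_net / m * dt                             # breakaway
    f_net = f_applied - f_coulomb * (1.0 if v > 0 else -1.0)
    v_new = v + f_net / m * dt
    if v * v_new < 0.0:                                   # would cross zero this step
        return friction_step(0.0, f_applied, m, dt, f_static, f_coulomb)
    return v_new

v = 1.0                                                   # coasting body, no drive force
for _ in range(20):
    v = friction_step(v, 0.0, m=1.0, dt=0.1, f_static=3.0, f_coulomb=2.0)
print(v)   # the body decelerates, then rests exactly at zero
```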
Process simulation and dynamic control for marine oily wastewater treatment using UV irradiation.
Jing, Liang; Chen, Bing; Zhang, Baiyu; Li, Pu
2015-09-15
UV irradiation and advanced oxidation processes have been recently regarded as promising solutions in removing polycyclic aromatic hydrocarbons (PAHs) from marine oily wastewater. However, such treatment methods are generally not sufficiently understood in terms of reaction mechanisms, process simulation and process control. These deficiencies can drastically hinder their application in shipping and offshore petroleum industries which produce bilge/ballast water and produced water as the main streams of marine oily wastewater. In this study, a factorial design of experiment was carried out to investigate the degradation mechanism of a typical PAH, namely naphthalene, under UV irradiation in seawater. Based on the experimental results, a three-layer feed-forward artificial neural network simulation model was developed to simulate the treatment process and to forecast the removal performance. A simulation-based dynamic mixed integer nonlinear programming (SDMINP) approach was then proposed to intelligently control the treatment process by integrating the developed simulation model, genetic algorithm and multi-stage programming. The applicability and effectiveness of the developed approach were further tested through a case study. The experimental results showed that the influences of fluence rate and temperature on the removal of naphthalene were greater than those of salinity and initial concentration. The developed simulation model could well predict the UV-induced removal process under varying conditions. The case study suggested that the SDMINP approach, with the aid of the multi-stage control strategy, was able to significantly reduce treatment cost compared to traditional single-stage process optimization. The developed approach and its concept/framework have high potential for application in other environmental fields where a treatment process is involved and experimentation and modeling are used for process simulation and control.
Copyright © 2015 Elsevier Ltd. All rights reserved.
Chi, Yujie; Tian, Zhen; Jia, Xun
2016-08-07
Monte Carlo (MC) particle transport simulation on a graphics processing unit (GPU) platform has been extensively studied recently due to the efficiency advantage achieved via massive parallelization. Almost all of the existing GPU-based MC packages were developed for voxelized geometry, which limited their application scope. The purpose of this paper is to develop a module to model parametric geometry and integrate it in GPU-based MC simulations. In our module, each continuous region was defined by its bounding surfaces, which were parameterized by quadratic functions. Particle navigation functions in this geometry were developed. The module was incorporated into two previously developed GPU-based MC packages and was tested in two example problems: (1) low-energy photon transport simulation in a brachytherapy case with a shielded cylinder applicator and (2) MeV coupled photon/electron transport simulation in a phantom containing several inserts of different shapes. In both cases, the calculated dose distributions agreed well with those calculated in the corresponding voxelized geometry. The averaged dose differences were 1.03% and 0.29%, respectively. We also used the developed package to perform simulations of a Varian VS 2000 brachytherapy source and generated a phase-space file. The computation time under the parameterized geometry depended on the memory location storing the geometry data; the highest computational speed was achieved when the data was stored in the GPU's shared memory. Incorporation of parameterized geometry yielded a computation time roughly 3 times that of the corresponding voxelized geometry. We also developed a strategy that uses an auxiliary index array to reduce the frequency of geometry calculations and hence improve efficiency. With this strategy, the computational time ranged from 1.75 to 2.03 times that of the voxelized geometry for coupled photon/electron transport, depending on the voxel dimension of the auxiliary index array, and from 0.69 to 1.23 times for photon-only transport.
Desktop Application Program to Simulate Cargo-Air-Drop Tests
NASA Technical Reports Server (NTRS)
Cuthbert, Peter
2009-01-01
The DSS Application is a computer program comprising a Windows version of the UNIX-based Decelerator System Simulation (DSS) coupled with an Excel front end. The DSS is an executable code that simulates the dynamics of airdropped cargo from first motion in an aircraft through landing. The bare DSS is difficult to use; the front end makes it easy to use. All inputs to the DSS, control of execution of the DSS, and postprocessing and plotting of outputs are handled in the front end. The front end is graphics-intensive. The Excel software provides the graphical elements without need for additional programming. Categories of input parameters are divided into separate tabbed windows. Pop-up comments describe each parameter. An error-checking software component evaluates combinations of parameters and alerts the user if an error results. Case files can be created from inputs, making it possible to build cases from previous ones. Simulation output is plotted in 16 charts displayed on a separate worksheet, enabling plotting of multiple DSS cases with flight-test data. Variables assigned to each plot can be changed. Selected input parameters can be edited from the plot sheet for quick sensitivity studies.
Application of particle splitting method for both hydrostatic and hydrodynamic cases in SPH
NASA Astrophysics Data System (ADS)
Liu, W. T.; Sun, P. N.; Ming, F. R.; Zhang, A. M.
2018-01-01
The smoothed particle hydrodynamics (SPH) method with numerical diffusive terms shows satisfactory stability and accuracy in some violent fluid-solid interaction problems. However, in most simulations, uniform particle distributions are used, and multi-resolution, which can obviously improve the local accuracy and the overall computational efficiency, has seldom been applied. In this paper, a dynamic particle splitting method is applied that allows for the simulation of both hydrostatic and hydrodynamic problems. In the splitting algorithm, when a coarse (mother) particle enters the splitting region, it is split into four daughter particles, which inherit the physical parameters of the mother particle. In the particle splitting process, conservation of mass, momentum and energy is ensured. Based on an error analysis, the splitting technique is designed to allow optimal accuracy at the interface between the coarse and refined particles, which is particularly important in the simulation of hydrostatic cases. Finally, the scheme is validated on five basic cases, which demonstrate that the present SPH model with a particle splitting technique is of high accuracy and efficiency and is capable of simulating a wide range of hydrodynamic problems.
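The conservation property of a four-daughter split can be sketched directly. This is a toy illustration, not the paper's scheme: the stencil spacing `eps` and smoothing-length ratio `h_ratio` below are assumed illustrative values, but the bookkeeping shows why equal mass shares and inherited velocities conserve mass, momentum, and the center of mass exactly.

```python
from dataclasses import dataclass

@dataclass
class Particle:
    x: float
    y: float
    m: float
    vx: float
    vy: float
    h: float  # smoothing length

def split(p, eps=0.35, h_ratio=0.6):
    """Split one coarse (mother) particle into four daughters on a square
    stencil around it. Daughters share the mass equally and inherit the
    velocity, so total mass, momentum and center of mass are conserved
    exactly; the smoothing length is reduced for the refined particles."""
    d = eps * p.h
    return [Particle(p.x + sx * d, p.y + sy * d, p.m / 4.0, p.vx, p.vy, p.h * h_ratio)
            for sx, sy in ((1, 1), (1, -1), (-1, 1), (-1, -1))]

mother = Particle(x=0.0, y=0.0, m=4.0, vx=1.0, vy=0.5, h=0.1)
daughters = split(mother)
print(sum(d.m for d in daughters), sum(d.m * d.vx for d in daughters))  # → 4.0 4.0
```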
Intense sea-effect snowfall case on the western coast of Finland
NASA Astrophysics Data System (ADS)
Olsson, Taru; Perttula, Tuuli; Jylhä, Kirsti; Luomaranta, Anna
2017-07-01
A new national daily snowfall record was measured in Finland on 8 January 2016 when it snowed 73 cm (31 mm as liquid water) in less than a day in Merikarvia on the western coast of Finland. The area of the most intense snowfall was very small, which is common in convective precipitation. In this work we used hourly weather radar images to identify the sea-effect snowfall case and to qualitatively estimate the performance of HARMONIE, a non-hydrostatic convection-permitting weather prediction model, in simulating the spatial and temporal evolution of the snowbands. The model simulation, including data assimilation, was run at 2.5 km horizontal resolution with 65 vertical levels. HARMONIE was found to capture the overall sea-effect snowfall situation quite well, as both the timing and the location of the most intense snowstorm were properly simulated. Based on our preliminary analysis, the snowband case was triggered by atmospheric instability above the mostly ice-free sea and a low-level convergence zone almost perpendicular to the coastline. The simulated convective available potential energy (CAPE) reached a value of 87 J kg⁻¹ near the site of the observed snowfall record.
Processing biobased polymers using plasticizers: Numerical simulations versus experiments
NASA Astrophysics Data System (ADS)
Desplentere, Frederik; Cardon, Ludwig; Six, Wim; Erkoç, Mustafa
2016-03-01
In polymer processing, the use of biobased products offers many possibilities. For biobased materials, biodegradability is in most cases the most important issue; in addition, biobased materials aimed at durable applications are gaining interest. Within this research, the influence of plasticizers on the processing of a biobased material is investigated. This work is done for an extrusion grade of PLA, NatureWorks PLA 2003D. Extrusion through a slit die equipped with pressure sensors is used to compare the experimental pressure values to numerical simulation results. Additional experimental data (temperature and pressure along the extrusion screw and die) are recorded on a Dr. Collin lab extruder producing a 25 mm diameter tube. All these experimental data are used to verify the proper functioning of the numerical simulation tool Virtual Extrusion Laboratory 6.7 for the simulation of both the industrially available extrusion-grade PLA and the compound to which 15% plasticizer was added. Adding the plasticizer resulted in a 40% lower pressure drop over the extrusion die. The combination of different experiments allowed the numerical simulation results to be fitted closely to the experimental values. Based on this experience, it is shown that numerical simulations can also be used for modified biobased materials if appropriate material and process data are taken into account.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vencels, Juris; Delzanno, Gian Luca; Johnson, Alec
2015-06-01
A spectral method for kinetic plasma simulations based on the expansion of the velocity distribution function in a variable number of Hermite polynomials is presented. The method is based on a set of non-linear equations that is solved to determine the coefficients of the Hermite expansion satisfying the Vlasov and Poisson equations. In this paper, we first show that this technique combines the fluid and kinetic approaches into one framework. Second, we present an adaptive strategy to increase and decrease the number of Hermite functions dynamically during the simulation. The technique is applied to the Landau damping and two-stream instability test problems. Performance results show 21% and 47% savings of total simulation time in the Landau and two-stream instability test cases, respectively.
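The core ingredients of such a method (projection onto Hermite modes and adaptive truncation) can be sketched for a static distribution function. This is a toy sketch only, assuming probabilists' Hermite functions, Gauss-Hermite quadrature, and an illustrative tolerance; it is not the paper's non-linear Vlasov-Poisson solver or its adaptivity criterion.

```python
import math
import numpy as np

def hermite_coeffs(f, n_terms, n_quad=64):
    """Coefficients c_n of f(v) ~ sum_n c_n He_n(v) exp(-v^2/2), obtained by
    Gauss-Hermite quadrature with the probabilists' weight exp(-v^2/2)."""
    v, w = np.polynomial.hermite_e.hermegauss(n_quad)
    g = f(v) * np.exp(v ** 2 / 2.0)     # strip the weight carried by f itself
    c = np.empty(n_terms)
    for n in range(n_terms):
        he_n = np.polynomial.hermite_e.hermeval(v, [0.0] * n + [1.0])
        # normalization: integral of He_n^2 exp(-v^2/2) dv = n! * sqrt(2*pi)
        c[n] = np.sum(w * g * he_n) / (math.factorial(n) * math.sqrt(2.0 * math.pi))
    return c

def adaptive_order(f, tol=1e-8, n_max=40):
    """Grow the expansion until the highest retained coefficient drops below tol."""
    for n in range(2, n_max):
        c = hermite_coeffs(f, n)
        if abs(c[-1]) < tol:
            return n, c
    return n_max, hermite_coeffs(f, n_max)

# A Maxwellian is exactly the lowest Hermite mode, so the adaptive loop stops early.
maxwellian = lambda v: np.exp(-v ** 2 / 2.0) / math.sqrt(2.0 * math.pi)
n, c = adaptive_order(maxwellian)
print(n, round(float(c[0]), 6))
```

A near-Maxwellian distribution needs very few modes (the fluid limit), while a distorted distribution drives the adaptive loop to higher order (the kinetic limit), which is the sense in which the expansion unifies the two descriptions.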
NASA Astrophysics Data System (ADS)
Gao, Xiatian; Wang, Xiaogang; Jiang, Binhao
2017-10-01
UPSF (Universal Plasma Simulation Framework) is a new plasma simulation code designed for maximum flexibility by using cutting-edge techniques supported by the C++17 standard. Through metaprogramming, UPSF provides arbitrary-dimensional data structures and methods to support various kinds of plasma simulation models, such as Vlasov, particle-in-cell (PIC), fluid, and Fokker-Planck models, and their variants and hybrid methods. A single code can thus be applied to systems of arbitrary dimension with no loss of performance. UPSF can also automatically parallelize the distributed data structure and accelerate matrix and tensor operations through BLAS. A three-dimensional particle-in-cell code has been developed based on UPSF. Two test cases, Landau damping and the Weibel instability for the electrostatic and electromagnetic situations respectively, are presented to show the validity and performance of the UPSF code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Hale, Elaine; Hodge, Bri-Mathias
2016-08-11
This paper discusses the development of, approaches for, experiences with, and some results from a large-scale, high-performance-computer-based (HPC-based) co-simulation of electric power transmission and distribution systems using the Integrated Grid Modeling System (IGMS). IGMS was developed at the National Renewable Energy Laboratory (NREL) as a novel Independent System Operator (ISO)-to-appliance scale electric power system modeling platform that combines off-the-shelf tools to simultaneously model 100s to 1000s of distribution systems in co-simulation with detailed ISO markets, transmission power flows, and AGC-level reserve deployment. Lessons learned from the co-simulation architecture development are shared, along with a case study that explores the reactive power impacts of PV inverter voltage support on the bulk power system.
Berndt, Jodi; Dinndorf-Hogenson, Georgia; Herheim, Rena; Hoover, Carrie; Lanc, Nicole; Neuwirth, Janet; Tollefson, Bethany
2015-01-01
Collaborative Classroom Simulation (CCS) is a pedagogy designed to provide a simulation learning experience for a classroom of students simultaneously through the use of unfolding case scenarios. The purpose of this descriptive study was to explore the effectiveness of CCS based on student perceptions. Baccalaureate nursing students (n = 98) participated in the study by completing a survey after participation in the CCS experience. Opportunities for collaboration, clinical judgment, and participation as both observer and active participant were seen as strengths of the experience. Developed as a method to overcome barriers to simulation, CCS was shown to be an effective active learning technique that may prove to be sustainable.
Thread scheduling for GPU-based OPC simulation on multi-thread
NASA Astrophysics Data System (ADS)
Lee, Heejun; Kim, Sangwook; Hong, Jisuk; Lee, Sooryong; Han, Hwansoo
2018-03-01
As semiconductor product development based on shrinkage continues, the accuracy and difficulty required for model-based optical proximity correction (MBOPC) are increasing. OPC simulation time, the most time-consuming part of MBOPC, is rapidly increasing due to high pattern density in a layout and complex OPC models. To reduce OPC simulation time, we apply graphics processing units (GPUs) to MBOPC, because the OPC process is well suited to parallel execution. We address some issues that typically arise during GPU-based OPC simulation in multi-threaded systems, such as "out of memory" errors and GPU idle time. To overcome these problems, we propose a thread scheduling method that manages OPC jobs in multiple threads such that simulation jobs from multiple threads are executed alternately on the GPU while correction jobs execute concurrently on individual CPU cores. It was observed that peak GPU memory usage decreases by up to 35%, and MBOPC runtime also decreases by 4%. In cases where out-of-memory issues occur in a multi-threaded environment, the thread scheduler improved MBOPC runtime by up to 23%.
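The scheduling idea above, serializing simulation jobs onto the shared GPU while corrections overlap on CPU threads, can be sketched as follows. This is an illustrative sketch only; the job functions are hypothetical placeholders, not the authors' OPC kernels:

```python
import threading

gpu_lock = threading.Lock()   # stand-in for serialized access to the one GPU
results = []                  # list.append is atomic under the CPython GIL

def simulate_on_gpu(tile):
    """Placeholder for the GPU-side OPC simulation kernel (hypothetical)."""
    return tile * 2

def correct_on_cpu(sim):
    """Placeholder for the CPU-side mask correction step (hypothetical)."""
    return sim + 1

def opc_worker(tiles):
    for tile in tiles:
        # Simulation jobs from all threads take turns on the GPU, so at most
        # one job's buffers reside in GPU memory at a time (bounds peak usage).
        with gpu_lock:
            sim = simulate_on_gpu(tile)
        # Correction jobs run concurrently on separate CPU threads/cores.
        results.append(correct_on_cpu(sim))

threads = [threading.Thread(target=opc_worker, args=([i, i + 10],))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The lock plays the role of the paper's scheduler: it alternates GPU simulation jobs across threads while each thread's correction work proceeds in parallel, which is what bounds peak GPU memory.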
Evaluating the benefits of collaboration in simulation games: the case of health care.
Leung, Ricky
2014-01-28
Organizations have used simulation games for health promotion and communication. To evaluate how simulation games can foster collaboration among stakeholders, this paper develops two social network measures. The paper introduces two specific measures that help organizations and researchers evaluate the effectiveness of Web-based simulation games in fostering collaboration: (1) network density and (2) network diversity. They measure the level of connectedness and the evenness of communication within social networks. To illustrate how these measures may be used, a hypothetical game about health policy is outlined. Web-based games can serve as an effective platform for engaging stakeholders because interaction among them is quite convenient. Yet systematic evaluation and planning are necessary to realize the benefits of these games. The paper suggests directions for testing how the social network dimension of Web-based games can augment the individual-level benefits that stakeholders obtain from playing simulation games. While this paper focuses on measuring the structural properties of social networks in Web-based games, further research should focus more attention on the appropriateness of game content. In addition, empirical research should cover different geographical areas, such as East Asian countries where video games are very popular.
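The two measures can be sketched concretely. The density formula below is the standard undirected-graph density; the diversity measure uses normalized Shannon entropy as one plausible operationalization of "communication evenness" (the abstract does not give the paper's exact formula, so treat it as an assumption):

```python
import math

def network_density(n_nodes, edges):
    """Standard undirected-graph density: 2E / (N(N-1))."""
    return 2 * len(edges) / (n_nodes * (n_nodes - 1))

def network_diversity(message_counts):
    """Normalized Shannon entropy of per-tie communication volumes.

    1.0 means perfectly even communication across ties; values near 0 mean
    a few ties dominate. (Illustrative stand-in for the paper's
    'communication evenness' measure.)
    """
    total = sum(message_counts)
    probs = [c / total for c in message_counts if c > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(message_counts)) if len(message_counts) > 1 else 1.0

# Hypothetical health-policy game: 5 stakeholders, 4 collaboration ties
edges = [(0, 1), (0, 2), (1, 2), (3, 4)]
density = network_density(5, edges)            # 0.4 -> sparsely connected
evenness = network_diversity([12, 3, 3, 2])    # < 1 -> one tie dominates
```

An evaluator could compute both measures from game interaction logs before and after an intervention to quantify whether collaboration became broader and more even.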
Detached-Eddy Simulation Based on the V2-F Model
NASA Technical Reports Server (NTRS)
Jee, Sol Keun; Shariff, Karim R.
2012-01-01
Detached-eddy simulation (DES) based on the v(sup 2)-f Reynolds-averaged Navier-Stokes (RANS) model is developed and tested. The v(sup 2)-f model incorporates the anisotropy of near-wall turbulence, which is absent in other RANS models commonly used in the DES community. The v(sup 2)-f RANS model is modified so that the proposed v(sup 2)-f-based DES formulation reduces to a transport equation for the subgrid-scale kinetic energy in isotropic turbulence. First, three coefficients in the elliptic relaxation equation are modified, and the modification is tested in channel flows with friction Reynolds numbers up to 2000. Then, the proposed v(sup 2)-f DES formulation is derived. The constant C(sub DES) required in the DES formulation was calibrated by simulating both decaying and statistically steady isotropic turbulence. After C(sub DES) was calibrated, the v(sup 2)-f DES formulation was tested for flow around a circular cylinder at a Reynolds number of 3900, in which case turbulence develops after separation. Simulations indicate that this model represents the turbulent wake nearly as accurately as the dynamic Smagorinsky model. Spalart-Allmaras-based DES is also included in the cylinder flow simulation for comparison.
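The DES construction referenced above follows the standard recipe, sketched here in generic form (the exact coefficients of the v(sup 2)-f variant are in the paper, not reproduced here): the RANS turbulence length scale is replaced by

```latex
\ell_{\mathrm{DES}} \;=\; \min\!\left(\ell_{\mathrm{RANS}},\; C_{\mathrm{DES}}\,\Delta\right)
```

where \(\Delta\) is the local grid spacing. Near walls, where \(\ell_{\mathrm{RANS}} < C_{\mathrm{DES}}\Delta\), the model runs in RANS mode; far from walls the dissipation term scales as \(\varepsilon \sim k^{3/2}/(C_{\mathrm{DES}}\Delta)\), so the transport equation becomes one for subgrid-scale kinetic energy, as in one-equation LES. This is why \(C_{\mathrm{DES}}\) is calibrated against isotropic turbulence, as described above.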
Numerical simulation and fracture identification of dual laterolog in organic shale
NASA Astrophysics Data System (ADS)
Maojin, Tan; Peng, Wang; Qiong, Liu
2012-09-01
Fractures are one of the important storage spaces in shale oil and shale gas reservoirs, and fracture identification and evaluation are an important part of organic shale interpretation. For fractured shale gas reservoirs, a physical model is set up to study dual laterolog logging responses. First, based on the principle of dual laterolog, the three-dimensional finite element method (FEM) is used to simulate dual laterolog responses in formation models with different fracture widths, fracture numbers, and fracture inclination angles. These results are extremely important for fracture identification and evaluation in shale reservoirs. For different base-rock resistivity models, fracture models are constructed through a series of numerical simulations, and the fracture porosity can be calculated by solving the corresponding formulas. A case study of an organic shale formation is analyzed and discussed, and the fracture porosity is calculated from the dual laterolog. The fracture evaluation results are validated against full-borehole micro-resistivity imaging (FMI). Thus, in the absence of a borehole resistivity imaging log, dual laterolog resistivity can be used to estimate fracture development.
A physical-based gas-surface interaction model for rarefied gas flow simulation
NASA Astrophysics Data System (ADS)
Liang, Tengfei; Li, Qi; Ye, Wenjing
2018-01-01
Empirical gas-surface interaction models, such as the Maxwell model and the Cercignani-Lampis model, are widely used as boundary conditions in rarefied gas flow simulations. The accuracy of these models in predicting the macroscopic behavior of rarefied gas flows is less satisfactory in some cases, especially highly non-equilibrium ones. Molecular dynamics (MD) simulations can accurately resolve the gas-surface interaction process at the atomic scale, and hence can predict macroscopic behavior accurately; they are, however, too computationally expensive to apply to real problems. In this work, a statistical physics-based gas-surface interaction model, which complies with the basic relations of a boundary condition, is developed based on the framework of the washboard model. By virtue of its physical basis, the new model is capable of capturing some important relations and trends that the classic empirical models fail to model correctly. As such, the new model is much more accurate than the classic models while remaining more efficient than MD simulations. Therefore, it can serve as a more accurate and efficient boundary condition for rarefied gas flow simulations.
Reduced order model based on principal component analysis for process simulation and optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lang, Y.; Malacina, A.; Biegler, L.
2009-01-01
It is well-known that distributed-parameter computational fluid dynamics (CFD) models provide more accurate results than the conventional, lumped-parameter unit operation models used in process simulation. Consequently, the use of CFD models in process/equipment co-simulation offers the potential to optimize overall plant performance with respect to complex thermal and fluid flow phenomena. Because solving CFD models is time-consuming compared to the overall process simulation, we consider the development of fast reduced order models (ROMs) based on CFD results to closely approximate the high-fidelity equipment models in the co-simulation. Considering process equipment items with complicated geometries and detailed thermodynamic property models, this study proposes a strategy to develop ROMs based on principal component analysis (PCA). Taking advantage of commercial process simulation and CFD software (for example, Aspen Plus and FLUENT), we are able to develop systematic CFD-based ROMs for equipment models in an efficient manner. In particular, we show that the validity of the ROM is more robust within a well-sampled input domain and that the CPU time is significantly reduced. Typically, it takes at most several CPU seconds to evaluate the ROM, compared to several CPU hours or more to solve the CFD model. Two case studies, involving two power plant equipment examples, are described and demonstrate the benefits of using our proposed ROM methodology for process simulation and optimization.
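The PCA (proper orthogonal decomposition) ROM workflow described above can be sketched as follows: collect CFD snapshots at sampled operating conditions, extract principal components by SVD, and regress from inputs to mode coefficients. This is a minimal generic sketch, not the paper's Aspen Plus/FLUENT implementation; the toy "CFD" at the end is a stand-in:

```python
import numpy as np

def build_pca_rom(inputs, snapshots, n_modes=3):
    """Fit a PCA (POD) reduced-order model from CFD snapshot data.

    inputs    : (n_runs, n_params)  sampled operating conditions
    snapshots : (n_runs, n_outputs) CFD output fields, one row per run
    Returns a fast surrogate: params -> approximate output field.
    """
    mean = snapshots.mean(axis=0)
    U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
    modes = Vt[:n_modes]                    # principal components of the field
    coeffs = (snapshots - mean) @ modes.T   # mode coefficients per run
    # Linear least-squares map from inputs (plus bias) to mode coefficients
    X = np.hstack([inputs, np.ones((len(inputs), 1))])
    W, *_ = np.linalg.lstsq(X, coeffs, rcond=None)

    def rom(params):
        x = np.append(params, 1.0)
        return mean + (x @ W) @ modes       # evaluates in microseconds, not hours
    return rom

# Toy "CFD": the output field depends linearly on one input parameter
rng = np.random.default_rng(0)
p = rng.uniform(0, 1, (20, 1))              # 20 sampled operating points
snaps = p @ rng.normal(size=(1, 50)) + rng.normal(size=50)
rom = build_pca_rom(p, snaps, n_modes=2)
```

The design point is the one the abstract makes: once trained on well-sampled inputs, the surrogate costs seconds to evaluate, but its validity degrades outside the sampled input domain.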
Stochastic simulation by image quilting of process-based geological models
NASA Astrophysics Data System (ADS)
Hoffimann, Júlio; Scheidt, Céline; Barfod, Adrian; Caers, Jef
2017-09-01
Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e. parameter tuning) and at the same time condition them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses traditional ad-hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse, a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.
Impact of a Paper vs Virtual Simulated Patient Case on Student-Perceived Confidence and Engagement
Gallimore, Casey E.; Pitterle, Michael; Morrill, Josh
2016-01-01
Objective. To evaluate online case simulation vs a paper case on student confidence and engagement. Design. Students enrolled in a pharmacotherapy laboratory course completed a patient case scenario as a component of an osteoarthritis laboratory module. Two laboratory sections used a paper case (n=53); three sections used an online virtual case simulation (n=81). Student module performance was assessed through a submitted subjective objective assessment plan (SOAP) note. Students completed pre/post surveys to measure self-perceived confidence in providing medication management. The simulation group completed postmodule questions related to realism and engagement of the online virtual case simulation. Group assessments were performed using chi-square and Mann-Whitney tests. Assessment. A significant increase in all 13 confidence items was seen in both student groups following completion of the laboratory module. The simulation group had an increased change of confidence compared to the paper group in assessing medication efficacy and documenting a thorough assessment. Comparing the online virtual simulation to a paper case, students agreed the learning experience increased interest, enjoyment, relevance, and realism. The simulation group performed better on the subjective SOAP note domain, though no differences in total SOAP note scores were found between the two groups. Conclusion. Virtual case simulations result in increased student engagement and may lead to improved documentation performance in the subjective domain of SOAP notes. However, virtual patient cases may offer limited benefit over paper cases in improving overall student self-confidence to provide medication management. PMID:26941442
Impact of a Paper vs Virtual Simulated Patient Case on Student-Perceived Confidence and Engagement.
Barnett, Susanne G; Gallimore, Casey E; Pitterle, Michael; Morrill, Josh
2016-02-25
To evaluate online case simulation vs a paper case on student confidence and engagement. Students enrolled in a pharmacotherapy laboratory course completed a patient case scenario as a component of an osteoarthritis laboratory module. Two laboratory sections used a paper case (n=53); three sections used an online virtual case simulation (n=81). Student module performance was assessed through a submitted subjective objective assessment plan (SOAP) note. Students completed pre/post surveys to measure self-perceived confidence in providing medication management. The simulation group completed postmodule questions related to realism and engagement of the online virtual case simulation. Group assessments were performed using chi-square and Mann-Whitney tests. A significant increase in all 13 confidence items was seen in both student groups following completion of the laboratory module. The simulation group had an increased change of confidence compared to the paper group in assessing medication efficacy and documenting a thorough assessment. Comparing the online virtual simulation to a paper case, students agreed the learning experience increased interest, enjoyment, relevance, and realism. The simulation group performed better on the subjective SOAP note domain, though no differences in total SOAP note scores were found between the two groups. Virtual case simulations result in increased student engagement and may lead to improved documentation performance in the subjective domain of SOAP notes. However, virtual patient cases may offer limited benefit over paper cases in improving overall student self-confidence to provide medication management.
NASA Astrophysics Data System (ADS)
Nakada, Tomohiro; Takadama, Keiki; Watanabe, Shigeyoshi
This paper proposes a classification method that uses Bayesian analysis to classify time-series data from an agent-based simulation of the international emissions trading market, and compares it with a discrete Fourier transform (DFT) analytical method. The purpose is to demonstrate analytical methods that map time-series data such as market prices. These analytical methods have revealed the following results: (1) the classification methods express time series as distances in a mapping, which makes them easier to understand and draw inferences from than the raw time-series data; (2) the methods can analyze uncertain time-series data, including both stationary and non-stationary processes, using these distances obtained via agent-based simulation; and (3) the Bayesian analytical method can distinguish a 1% difference in the agents' emission-reduction targets.
NASA Astrophysics Data System (ADS)
Lin, Hui; Liu, Tianyu; Su, Lin; Bednarz, Bryan; Caracappa, Peter; Xu, X. George
2017-09-01
Monte Carlo (MC) simulation is well recognized as the most accurate method for radiation dose calculations. For radiotherapy applications, accurate modelling of the source term, i.e. the clinical linear accelerator, is critical to the simulation. The purpose of this paper is to perform source modelling, examine the accuracy and performance of the models on Intel Many Integrated Core coprocessors (aka Xeon Phi) and Nvidia GPUs using ARCHER, and explore potential optimization methods. Phase-space-based source modelling has been implemented. Good agreement was found in a tomotherapy prostate patient case and a TrueBeam breast case. In terms of performance, the whole simulation for the prostate plan and the breast plan took about 173 s and 73 s, respectively, with 1% statistical error.
Miles, Anna; Friary, Philippa; Jackson, Bianca; Sekula, Julia; Braakhuis, Andrea
2016-06-01
This study evaluated hospital readiness and interprofessional clinical reasoning in speech-language pathology and dietetics students following a simulation-based teaching package. Thirty-one students participated in two half-day simulation workshops. The training included orientation to the hospital setting, part-task skill learning and immersive simulated cases. Students completed workshop evaluation forms. They filled in a 10-question survey regarding confidence, knowledge and preparedness for working in a hospital environment before and immediately after the workshops. Students completed written 15-min clinical vignettes at 1 month prior to training, immediately prior to training and immediately after training. A marking rubric was devised to evaluate the responses to the clinical vignettes within a framework of interprofessional education. The simulation workshops were well received by all students. There was a significant increase in students' self-ratings of confidence, preparedness and knowledge following the study day (p < .001). There was a significant increase in student overall scores in clinical vignettes after training with the greatest increase in clinical reasoning (p < .001). Interprofessional simulation-based training has benefits in developing hospital readiness and clinical reasoning in allied health students.
Bassi, Gabriele; Blednykh, Alexei; Smalyuk, Victor
2016-02-24
A novel algorithm for self-consistent simulations of long-range wakefield effects has been developed and applied to the study of both longitudinal and transverse coupled-bunch instabilities at NSLS-II. The algorithm is implemented in the new parallel tracking code SPACE (self-consistent parallel algorithm for collective effects) discussed in the paper. The code is applicable for accurate beam dynamics simulations in cases where both bunch-to-bunch and intrabunch motions need to be taken into account, such as chromatic head-tail effects on the coupled-bunch instability of a beam with a nonuniform filling pattern, or multibunch and single-bunch effects of a passive higher-harmonic cavity. The numerical simulations have been compared with analytical studies. For a beam with an arbitrary filling pattern, intensity-dependent complex frequency shifts have been derived starting from a system of coupled Vlasov equations. The analytical formulas and numerical simulations confirm that the analysis reduces to the formulation of an eigenvalue problem based on the known formulas of the complex frequency shifts for the uniform filling pattern case.
Application of game-like simulations in the Spanish Transplant National Organization.
Borro-Escribano, B; Martínez-Alpuente, I; Blanco, A Del; Torrente, J; Fernández-Manjón, B; Matesanz, R
2013-01-01
Twenty years ago, the Spanish National Transplant Organization (NTO) started a management and organizational system, known as the Spanish Model, that has allowed the NTO to occupy a privileged world position regarding deceased donation rates, which have been 33-35 donors per million population in recent years. One of the key elements of this model is its instructional approach. Two years ago, the NTO started the project "educ@nt" in close collaboration with the e-UCM research group of the University Complutense of Madrid to support and maximize its successful professional training system. As a result, 3 game-like simulations have been developed representing the different procedural steps of the suprahospital level of the transplantation process. These simulations represent the donor and organ evaluation, the allocation of organs applying the corresponding geographic and clinical criteria, and the logistics of transportation. Simulations are based on 10 representative teaching cases that help students become familiar with the most common cases arriving in the NTO. For the 2nd consecutive year, these simulations have been used in different courses around Spain. Copyright © 2013 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Po-Lun; Gattiker, J. R.; Liu, Xiaohong
2013-06-27
A Gaussian process (GP) emulator is applied to quantify the contribution of local and remote emissions of black carbon (BC) to BC concentrations in different regions, using a Latin hypercube sampling strategy for emission perturbations in the offline version of the Community Atmosphere Model Version 5.1 (CAM5) simulations. The source-receptor relationships are computed based on simulations constrained by a standard free-running CAM5 simulation and the ERA-Interim reanalysis product. The analysis demonstrates that the emulator is capable of retrieving the source-receptor relationships from a small number of CAM5 simulations. Most regions are found to be susceptible to their local emissions. The emulator also finds that the source-receptor relationships retrieved from the model-driven and the reanalysis-driven simulations are very similar, suggesting that the simulated circulation in CAM5 resembles the assimilated meteorology in ERA-Interim. The robustness of the results provides confidence for applying the emulator to detect dose-response signals in the climate system.
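The emulator workflow, Latin hypercube sampling of emission perturbations followed by GP regression on simulator outputs, can be sketched generically as below. This is a minimal sketch under stated assumptions: the kernel choice, length scale, and the toy "source-receptor" response are illustrative, not the CAM5 setup:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """One sample per equal-probability stratum along each of d dimensions."""
    cols = [(rng.permutation(n) + rng.uniform(size=n)) / n for _ in range(d)]
    return np.column_stack(cols)

def gp_emulator(X, y, length=0.3, noise=1e-8):
    """Zero-mean GP regression with a squared-exponential kernel."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length**2)
    K = k(X, X) + noise * np.eye(len(X))   # jitter for numerical stability
    alpha = np.linalg.solve(K, y)
    return lambda Xs: k(Xs, X) @ alpha     # posterior mean at query points

# Emulate a toy "source-receptor" response from 8 perturbed-emission runs
rng = np.random.default_rng(1)
X = latin_hypercube(8, 2, rng)        # 8 runs, 2 scaled emission sources
y = X[:, 0] + 0.5 * X[:, 1]           # stand-in for receptor BC concentration
predict = gp_emulator(X, y)
```

The point the abstract makes is the sample efficiency: because the LHS design covers the input space evenly, a handful of expensive simulator runs suffices to train an emulator that interpolates the source-receptor response.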
ERIC Educational Resources Information Center
Hakerem, Gita; And Others
This study reports the efforts of the Water and Molecular Networks Project (WAMNet), a program in which high school chemistry students use computer simulations developed at Boston University (Massachusetts) to model the three-dimensional structure of molecules and the hydrogen bond network that holds water molecules together. This case study…
Lin, Yuting; Lin, Wei-Ching; Fwu, Peter T; Shih, Tzu-Ching; Yeh, Lee-Ren; Su, Min-Ying; Chen, Jeon-Hor
2015-10-01
This study applied a simulation method to map the temperature distribution based on magnetic resonance imaging (MRI) of individual patients, and investigated the influence of different pelvic tissue types as well as the choice of thermal property parameters on the efficiency of an endorectal cooling balloon (ECB). MR images of four subjects with different prostate sizes and pelvic tissue compositions, including fatty tissue and venous plexus, were analyzed. The MR images acquired using an endorectal coil provided a realistic geometry of the deformed prostate that resembled the anatomy in the presence of the ECB. A single slice with the largest two-dimensional (2D) cross-sectional area of the prostate gland was selected for analysis. The rectal wall, prostate gland, peri-rectal fatty tissue, peri-prostatic fatty tissue, peri-prostatic venous plexus, and urinary bladder were manually segmented. Pennes' bioheat thermal model was used to simulate the temperature distribution dynamics, using an in-house finite-element-mesh-based solver written in MATLAB. The results showed that prostate size and peri-prostatic venous plexus were the two major factors affecting ECB cooling efficiency. For cases with a negligible amount of venous plexus and a small prostate, the average temperature in the prostate and neurovascular bundles could be cooled down to 25 °C within 30 min. For cases with abundant venous plexus and a large prostate, the temperature could not reach 25 °C at the end of 3 h of cooling. A large prostate made the cooling difficult to propagate through. The impact of fatty tissue on the cooling effect was small. The filling of the bladder with warm urine during the ECB cooling procedure did not affect the temperature in the prostate or NVB. In addition to the 2D simulation, in one case a 3D pelvic model was constructed for volumetric simulation. 
It was found that the 2D slice with the largest cross-sectional area of prostate had the most abundant venous plexus, and was the most difficult slice to cool, thus it may provide a conservative prediction of the cooling effect. This feasibility study demonstrated that the simulation tool could potentially be used for adjusting the setting of ECB for individual patients during hypothermic radical prostatectomy. Further studies using MR thermometry are required to validate the in silico results obtained using simulation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
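The thermal model named above, Pennes' bioheat equation, balances conduction against perfusion, which pulls tissue back toward arterial temperature; this is why abundant venous plexus limited the cooling. A simplified 1D explicit finite-difference illustration follows (the paper used a 2D/3D finite-element MATLAB solver on segmented MRI; the tissue properties below are typical literature values, not the paper's):

```python
import numpy as np

# Illustrative tissue properties (typical literature values, not the paper's)
rho_c = 3.6e6    # tissue volumetric heat capacity, J/(m^3 K)
k = 0.5          # thermal conductivity, W/(m K)
perf = 2.0e3     # blood perfusion term w_b * rho_b * c_b, W/(m^3 K)
T_art = 37.0     # arterial blood temperature, deg C
T_cool = 10.0    # ECB balloon surface temperature, deg C (hypothetical)

nx, dx = 50, 1e-3                   # 5 cm of tissue, 1 mm grid
dt = 0.4 * rho_c * dx**2 / (2 * k)  # explicit stability limit with margin
T = np.full(nx, 37.0)               # start at body temperature

def step(T):
    """One explicit step of Pennes: rho*c dT/dt = k T'' + perf*(T_art - T)."""
    Tn = T.copy()
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    Tn[1:-1] += dt / rho_c * (k * lap + perf * (T_art - T[1:-1]))
    Tn[0] = T_cool    # cooled boundary (balloon wall)
    Tn[-1] = 37.0     # far boundary held at body temperature
    return Tn

for _ in range(int(1800 / dt)):     # simulate 30 min of cooling
    T = step(T)
```

Raising `perf` in this sketch shortens the cooled region, mimicking the paper's finding that abundant venous plexus (strong perfusion) prevents the target temperature from propagating through a large prostate.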
Lin, Yuting; Lin, Wei-Ching; Fwu, Peter T.; Shih, Tzu-Ching; Yeh, Lee-Ren; Su, Min-Ying; Chen, Jeon-Hor
2015-01-01
This study applied a simulation method to map the temperature distribution based on magnetic resonance imaging (MRI) of individual patients, and investigated the influence of different pelvic tissue types as well as the choice of thermal property parameters on the efficiency of an endorectal cooling balloon (ECB). MR images of four subjects with different prostate sizes and pelvic tissue compositions, including fatty tissue and venous plexus, were analyzed. The MR images acquired using an endorectal coil provided a realistic geometry of the deformed prostate that resembled the anatomy in the presence of the ECB. A single slice with the largest two-dimensional (2D) cross-sectional area of the prostate gland was selected for analysis. The rectal wall, prostate gland, peri-rectal fatty tissue, peri-prostatic fatty tissue, peri-prostatic venous plexus, and urinary bladder were manually segmented. Pennes' bioheat thermal model was used to simulate the temperature distribution dynamics, using an in-house finite-element-mesh-based solver written in MATLAB. The results showed that prostate size and peri-prostatic venous plexus were the two major factors affecting ECB cooling efficiency. For cases with a negligible amount of venous plexus and a small prostate, the average temperature in the prostate and neurovascular bundles could be cooled down to 25°C within 30 minutes. For cases with abundant venous plexus and a large prostate, the temperature could not reach 25°C at the end of 3 hours of cooling. A large prostate made the cooling difficult to propagate through. The impact of fatty tissue on the cooling effect was small. The filling of the bladder with warm urine during the ECB cooling procedure did not affect the temperature in the prostate or NVB. In addition to the 2D simulation, in one case a 3D pelvic model was constructed for volumetric simulation. 
It was found that the 2D slice with the largest cross-sectional area of prostate had the most abundant venous plexus, and was the most difficult slice to cool, thus it may provide a conservative prediction of the cooling effect. This feasibility study demonstrated that the simulation tool could potentially be used for adjusting the setting of ECB for individual patients during hypothermic radical prostatectomy. Further studies using MR thermometry are required to validate the in silico results obtained using simulation. PMID:26198131
Numerical simulation of adverse-pressure-gradient boundary layer with or without roughness
NASA Astrophysics Data System (ADS)
Mottaghian, Pouya; Yuan, Junlin; Piomelli, Ugo
2014-11-01
Large-eddy and direct numerical simulations are carried out on a flat-plate boundary layer over smooth and rough surfaces, with an adverse pressure gradient. The deceleration is achieved by imposing a wall-normal freestream velocity profile, and is strong enough to cause separation at the wall. The Reynolds number based on momentum thickness and freestream velocity at the inlet is 600. Numerical sandgrain roughness is applied based on an immersed boundary method, yielding a flow that is transitionally rough. The turbulence intensity increases before separation, and reaches a higher value for the rough case, indicating stronger mixing. Roughness also causes a higher momentum deficit near the wall, leading to earlier separation. This is consistent with previous observations of rough-wall flow separation over a ramp. In both cases, the turbulent kinetic energy peaks inside the shear layer above the detachment region, with higher values in the rough case; it then decreases approaching the reattachment region. Near the wall inside the separation bubble, the near-zero turbulence intensity indicates that the turbulent structures are lifted away from the wall in the separation region. Compared with the smooth case, the shear layer is farther from the wall and the reattachment length is longer on the rough wall.
Horstmann, M; Renninger, M; Hennenlotter, J; Horstmann, C C; Stenzl, A
2009-08-01
E-learning is a teaching tool used successfully in many medical subspecialties. Experience with its use in urology, however, is scarce. We present our experience using the INMEDEA simulator to teach urological care to medical students. The INMEDEA simulator is an interactive e-learning system built around a virtual hospital, which includes a department of urology. It allows students to solve virtual patient cases online. In this study, students were asked to prepare two urological cases prior to discussion of the cases in small groups. This blended teaching approach was evaluated by students through anonymous questionnaires. Of 70 fourth-year medical students, 76% judged this teaching method as good or very good. Eighty-seven percent felt that it offered a good way to better understand urological diseases, and 72% felt that learning with this method was fun. Nevertheless, 30 of 70 free-text statements revealed that further improvements to the program, including easier and more comfortable navigation and faster access to information, are necessary. Virtual patient cases offer a practicable solution for problem-based teaching in urology, with a high acceptance rate among students.
Kennedy, Joshua L; Jones, Stacie M; Porter, Nicholas; White, Marjorie L; Gephardt, Grace; Hill, Travis; Cantrell, Mary; Nick, Todd G; Melguizo, Maria; Smith, Chris; Boateng, Beatrice A; Perry, Tamara T; Scurlock, Amy M; Thompson, Tonya M
2013-01-01
Simulation models that used high-fidelity mannequins have shown promise in medical education, particularly for cases in which the event is uncommon. Allergy physicians encounter emergencies in their offices, and these can be the source of much trepidation. To determine if case-based simulations with high-fidelity mannequins are effective in teaching and retention of emergency management team skills. Allergy clinics were invited to Arkansas Children's Hospital Pediatric Understanding and Learning through Simulation Education center for a 1-day workshop to evaluate skills concerning the management of allergic emergencies. A Clinical Emergency Preparedness Team Performance Evaluation was developed to evaluate the competence of teams in several areas: leadership and/or role clarity, closed-loop communication, team support, situational awareness, and scenario-specific skills. Four cases, which focus on common allergic emergencies, were simulated by using high-fidelity mannequins and standardized patients. Teams were evaluated by multiple reviewers by using video recording and standardized scoring. Ten to 12 months after initial training, an unannounced in situ case was performed to determine retention of the skills training. Clinics showed significant improvements for role clarity, teamwork, situational awareness, and scenario-specific skills during the 1-day workshop (all P < .003). Follow-up in situ scenarios 10-12 months later demonstrated retention of skills training at both clinics (all P ≤ .004). Clinical Emergency Preparedness Team Performance Evaluation scores demonstrated improved team management skills with simulation training in office emergencies. Significant recall of team emergency management skills was demonstrated months after the initial training. Copyright © 2013 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care
Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis
2017-01-01
This article describes a framework that has been developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to incorporating the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds from three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return-on-investment methodology put forth by Frost and Sullivan and Immersion Medical. All three source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods, explains how they can be integrated, and presents a framework that integrates them. It also describes the concept and application of the developed framework. As a test of its applicability, a real case study has been used to demonstrate the application of the framework. This case study provides real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. It is important to point out that the proposed framework offers the capability to consider a wide range of benefits and values; on the other hand, it has several limitations that have been discussed and need to be taken into consideration. PMID:28133988
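The traditional return-on-investment component of the framework reduces to simple arithmetic; the following is a minimal sketch in which the benefit and cost figures are hypothetical placeholders, not values from the study.

```python
def roi_percent(monetized_benefits: float, program_costs: float) -> float:
    """Classic return on investment: net benefits relative to costs, in percent."""
    if program_costs <= 0:
        raise ValueError("program costs must be positive")
    return 100.0 * (monetized_benefits - program_costs) / program_costs

# Hypothetical figures: tangible savings plus monetized intangible benefits
# (e.g., improved survival outcomes) versus the simulation program's cost.
benefits = 250_000.0   # assumed total monetized benefits per year
costs = 100_000.0      # assumed annual cost of the simulation program
print(f"ROI = {roi_percent(benefits, costs):.0f}%")  # prints "ROI = 150%"
```

The framework's contribution is in how the intangible benefits are monetized before they enter this ratio, not in the ratio itself.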
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamm, L.L.
1998-10-07
This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal system. These simulations were performed for the Preliminary Safety Analysis Report. This report documents the results of simulations of a Loss-of-Flow Accident (LOFA) where power is lost to all of the pumps that circulate water in the blanket region, the accelerator beam is shut off and neither the residual heat removal nor cavity flood systems operate.
Qin, Nan; Shen, Chenyang; Tsai, Min-Yu; Pinto, Marco; Tian, Zhen; Dedes, Georgios; Pompos, Arnold; Jiang, Steve B; Parodi, Katia; Jia, Xun
2018-01-01
One of the major benefits of carbon ion therapy is enhanced biological effectiveness at the Bragg peak region. For intensity modulated carbon ion therapy (IMCT), it is desirable to use Monte Carlo (MC) methods to compute the properties of each pencil beam spot for treatment planning, because of their accuracy in modeling physics processes and estimating biological effects. We previously developed goCMC, a graphics processing unit (GPU)-oriented MC engine for carbon ion therapy. The purpose of the present study was to build a biological treatment plan optimization system using goCMC. The repair-misrepair-fixation model was implemented to compute the spatial distribution of linear-quadratic model parameters for each spot. A treatment plan optimization module was developed to minimize the difference between the prescribed and actual biological effect. We used a gradient-based algorithm to solve the optimization problem. The system was embedded in the Varian Eclipse treatment planning system under a client-server architecture to achieve a user-friendly planning environment. We tested the system with a 1-dimensional homogeneous water case and 3 3-dimensional patient cases. Our system generated treatment plans with biological spread-out Bragg peaks covering the targeted regions and sparing critical structures. Using 4 NVidia GTX 1080 GPUs, the total computation time, including spot simulation, optimization, and final dose calculation, was 0.6 hour for the prostate case (8282 spots), 0.2 hour for the pancreas case (3795 spots), and 0.3 hour for the brain case (6724 spots). The computation time was dominated by MC spot simulation. We built a biological treatment plan optimization system for IMCT that performs simulations using a fast MC engine, goCMC. To the best of our knowledge, this is the first time that full MC-based IMCT inverse planning has been achieved in a clinically viable time frame. Copyright © 2017 Elsevier Inc. All rights reserved.
Numerical tool for SMA material simulation: application to composite structure design
NASA Astrophysics Data System (ADS)
Chemisky, Yves; Duval, Arnaud; Piotrowski, Boris; Ben Zineb, Tarak; Tahiri, Vanessa; Patoor, Etienne
2009-10-01
Composite materials based on shape memory alloys (SMA) have received growing attention over these last few years. In this paper, two particular morphologies of composites are studied. The first one is an SMA/elastomer composite in which a snake-like wire NiTi SMA is embedded into an elastomer ribbon. The second one is a commercial Ni47Ti44Nb9 which presents elastic-plastic inclusions in an NiTi SMA matrix. In both cases, the design of such composites required the development of an SMA design tool, based on a macroscopic 3D constitutive law for NiTi alloys. Two different strategies are then applied to compute these composite behaviors. For the SMA/elastomer composite, the macroscopic behavior law is implemented in commercial FEM software, and for the Ni47Ti44Nb9 a scale transition approach based on the Mori-Tanaka scheme is developed. In both cases, simulations are compared to experimental data.
Using a Web-Based e-Visit Simulation to Educate Nurse Practitioner Students.
Merritt, Lisa Schaeg; Brauch, Allison N; Bender, Annah K; Kochuk, Daria
2018-05-01
The purpose of this pilot study was to develop and implement a Web-based, e-Visit simulation experience for nurse practitioner students and evaluate student satisfaction and perceived learning. The convenience sample consisted of 26 senior-level Master of Science in Nursing students in the Pediatric Nurse Practitioner and Adult-Gerontology Nurse Practitioner programs. A Likert survey was used for evaluation that measured items from 1 (strongly disagree) to 5 (strongly agree). Students reported that the simulation cases closely resembled real-world patients (97%; M = 4.42, SD = 0.69), providing them with a better understanding of complaints commonly addressed via telehealth services (96%; M = 4.46, SD = 0.57). Accuracy of diagnosis and treatment on first attempt was 95%. A Web-based, e-Visit simulation can be a useful learning experience for nurse practitioner students with knowledge gained that is transferable to real clinical situations. [J Nurs Educ. 2018;57(5):304-307.]. Copyright 2018, SLACK Incorporated.
Equation-based languages – A new paradigm for building energy modeling, simulation and optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetter, Michael; Bonvini, Marco; Nouidui, Thierry S.
2016-04-01
Most of the state-of-the-art building simulation programs implement models in imperative programming languages. This complicates modeling and excludes the use of certain efficient methods for simulation and optimization. In contrast, equation-based modeling languages declare relations among variables, thereby allowing the use of computer algebra to enable much simpler schematic modeling and to generate efficient code for simulation and optimization. We contrast the two approaches in this paper. We explain how such manipulations support new use cases. In the first of two examples, we couple models of the electrical grid, multiple buildings, HVAC systems and controllers to test a controller that adjusts building room temperatures and PV inverter reactive power to maintain power quality. In the second example, we contrast the computing time for solving an optimal control problem for a room-level model predictive controller with and without symbolic manipulations. As a result, exploiting the equation-based language led to a 2,200 times faster solution.
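The imperative-versus-equation-based contrast can be illustrated with a small sketch (not the authors' tooling): an imperative function fixes causality, while an equation-based relation can be handed to a generic solver for whichever variable is unknown. The heat-flow relation and all values here are assumptions for illustration.

```python
def newton_solve(residual, x0, tol=1e-10, max_iter=100):
    """Solve residual(x) = 0 by Newton's method with a numerical derivative."""
    x = x0
    for _ in range(max_iter):
        r = residual(x)
        if abs(r) < tol:
            return x
        h = 1e-6 * max(1.0, abs(x))
        drdx = (residual(x + h) - r) / h
        x -= r / drdx
    return x

# Imperative style fixes causality: heat flow is *computed from* the temperatures.
def q_imperative(UA, T_in, T_out):
    return UA * (T_in - T_out)

# Equation-based style declares the relation UA*(T_in - T_out) - q = 0,
# so a tool can solve for whichever variable is unknown -- here T_in.
UA, T_out, q = 50.0, 20.0, 500.0
T_in = newton_solve(lambda T: UA * (T - T_out) - q, x0=T_out)
print(round(T_in, 6))  # prints 30.0
```

An equation-based compiler does this inversion symbolically at compile time rather than numerically at run time, which is one source of the speedups the paper reports.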
Virtual tryout planning in automotive industry based on simulation metamodels
NASA Astrophysics Data System (ADS)
Harsch, D.; Heingärtner, J.; Hortig, D.; Hora, P.
2016-11-01
Deep drawn sheet metal parts are increasingly designed to the feasibility limit, so achieving robust manufacturing is often challenging. Fluctuations in process and material properties often lead to robustness problems. Therefore, numerical simulations are used to detect the critical regions. To enhance agreement with the real process conditions, the material data are acquired through a variety of experiments. Furthermore, the force distribution is taken into account. The simulation metamodel contains the virtual knowledge of a particular forming process, which is determined from a series of finite element simulations with variable input parameters. Based on the metamodels, virtual process windows can be displayed for different configurations. This helps to improve the operating point as well as to adjust process settings in case the process becomes unstable. Furthermore, the time of tool tryout can be shortened by transferring the virtual knowledge contained in the metamodels to the optimisation of the drawbeads. This allows the tool manufacturer to focus on the essentials, to save time, and to recognize complex relationships.
A cognitive task analysis for dental hygiene.
Cameron, C A; Beemsterboer, P L; Johnson, L A; Mislevy, R J; Steinberg, L S; Breyer, F J
2000-05-01
To be an effective assessment tool, a simulation-based examination must be able to evoke and interpret observable evidence about targeted knowledge, strategies, and skills in a manner that is logical and defensible. Dental Interactive Simulations Corporation's first assessment effort is the development of a scoring algorithm for a simulation-based dental hygiene initial licensure examination. The first phase in developing a scoring system is the completion of a cognitive task analysis (CTA) of the dental hygiene domain. In the first step of the CTA, a specifications map was generated to provide a framework of the tasks and knowledge that are important to the practice of dental hygiene. Using this framework, broad classes of behaviors that would tend to distinguish along the dental hygiene expert-novice continuum were identified. Nine paper-based cases were then designed with the expectation that the solutions of expert, competent, and novice dental hygienists would differ. Interviews were conducted with thirty-one dental hygiene students/practitioners to capture solutions to the paper-based cases. Transcripts of the interviews were analyzed to identify performance features that distinguish among the interviewees on the basis of their expertise. These features were more detailed and empirically grounded than the originating broad classes and better serve to ground the design of a scoring system. The resulting performance features were collapsed into nine major categories: 1) gathering and using information, 2) formulating problems and investigating hypotheses, 3) communication and language, 4) scripting behavior, 5) ethics, 6) patient assessment, 7) treatment planning, 8) treatment, and 9) evaluation. The results of the CTA provide critical information for defining the necessary elements of a simulation-based dental hygiene examination.
A robustness test of the braided device foreshortening algorithm
NASA Astrophysics Data System (ADS)
Moyano, Raquel Kale; Fernandez, Hector; Macho, Juan M.; Blasco, Jordi; San Roman, Luis; Narata, Ana Paula; Larrabide, Ignacio
2017-11-01
Different computational methods have been recently proposed to simulate the virtual deployment of a braided stent inside a patient's vasculature. Those methods are primarily based on the segmentation of the region of interest to obtain the local vessel morphology descriptors. The goal of this work is to evaluate the influence of segmentation quality on the method named "Braided Device Foreshortening" (BDF). METHODS: We used the 3DRA images of 10 aneurysmatic patients (cases). The cases were segmented by applying a marching cubes algorithm with a broad range of thresholds in order to generate 10 surface models each. We selected a braided device to apply the BDF algorithm to each surface model. The range of the computed flow diverter lengths for each case was obtained to calculate the variability of the method against the threshold segmentation values. RESULTS: An evaluation study over 10 clinical cases indicates that the final length of the deployed flow diverter in each vessel model is stable, yielding a maximum difference of 11.19% in vessel diameter and a maximum of 9.14% in the simulated stent length across the threshold values. The average coefficient of variation was found to be 4.08%. CONCLUSION: A study evaluating how the segmentation threshold affects the simulated length of the deployed FD was presented. The segmentation algorithm used to segment intracranial aneurysm 3D angiography images presents small variation in the resulting stent simulation.
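The reported variability metric, the coefficient of variation of the simulated stent length across segmentation thresholds, can be computed as follows; the length values below are hypothetical, not taken from the study.

```python
from statistics import mean, stdev

def coefficient_of_variation(values):
    """CV in percent: sample standard deviation relative to the mean."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical simulated flow-diverter lengths (mm) for one case, each from a
# surface model segmented with a different marching-cubes threshold.
lengths_mm = [24.1, 24.8, 23.9, 24.5, 25.0, 24.3, 24.6, 23.8, 24.9, 24.4]
print(f"CV = {coefficient_of_variation(lengths_mm):.2f}%")
```

A small CV across the threshold sweep, as the study found, indicates the simulated length is robust to the segmentation choice.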
NASA Astrophysics Data System (ADS)
Rasmussen, Karsten B.; Juhl, Peter
2004-05-01
Boundary element method (BEM) calculations are used for the purpose of predicting the acoustic influence of the human head in two cases. In the first case the sound source is the mouth and in the second case the sound is plane waves arriving from different directions in the horizontal plane. In both cases the sound field is studied in relation to two positions above the right ear being representative of hearing aid microphone positions. Both cases are relevant for hearing aid development. The calculations are based upon a direct BEM implementation in Matlab. The meshing is based on the original geometrical data files describing the B&K Head and Torso Simulator 4128 combined with a 3D scan of the pinna.
Modelling approaches: the case of schizophrenia.
Heeg, Bart M S; Damen, Joep; Buskens, Erik; Caleo, Sue; de Charro, Frank; van Hout, Ben A
2008-01-01
Schizophrenia is a chronic disease characterized by periods of relative stability interrupted by acute episodes (or relapses). The course of the disease may vary considerably between patients. Patient histories show considerable inter- and even intra-individual variability. We provide a critical assessment of the advantages and disadvantages of three modelling techniques that have been used in schizophrenia: decision trees, (cohort and micro-simulation) Markov models, and discrete event simulation models. These modelling techniques are compared in terms of building time, data requirements, medico-scientific experience, simulation time, clinical representation, and their ability to deal with patient heterogeneity, the timing of events, prior events, patient interaction, interaction between co-variates, and variability (first-order uncertainty). We note that, depending on the research question, the optimal modelling approach should be selected based on the expected differences between the comparators, the number of co-variates, the number of patient subgroups, the interactions between co-variates, and simulation time. Finally, it is argued that when micro-simulation is required for the cost-effectiveness analysis of schizophrenia treatments, a discrete event simulation model is best suited to accurately capture all of the relevant interdependencies in this chronic, highly heterogeneous disease with limited long-term follow-up data.
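Of the three techniques compared, the cohort Markov model is the simplest to sketch. A minimal two-state (stable/relapse) version, with purely illustrative monthly transition probabilities rather than clinical estimates, might look like:

```python
# Hypothetical monthly transition probabilities for a two-state cohort
# Markov model (stable <-> relapse); the numbers are illustrative only.
P = {
    "stable":  {"stable": 0.95, "relapse": 0.05},
    "relapse": {"stable": 0.60, "relapse": 0.40},
}

def run_cohort(initial, transitions, cycles):
    """Propagate state occupancy fractions through repeated Markov cycles."""
    state = dict(initial)
    for _ in range(cycles):
        nxt = {s: 0.0 for s in state}
        for s, frac in state.items():
            for t, p in transitions[s].items():
                nxt[t] += frac * p
        state = nxt
    return state

final = run_cohort({"stable": 1.0, "relapse": 0.0}, P, cycles=24)
print({k: round(v, 3) for k, v in final.items()})  # {'stable': 0.923, 'relapse': 0.077}
```

The cohort approach tracks only occupancy fractions, which is exactly why it cannot represent prior events or patient-level heterogeneity; micro-simulation and discrete event simulation trade simulation time for that expressiveness.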
Time simulation of flutter with large stiffness changes
NASA Technical Reports Server (NTRS)
Karpel, M.; Wieseman, C. D.
1992-01-01
Time simulation of flutter, involving large local structural changes, is formulated with a state-space model that is based on a relatively small number of generalized coordinates. Free-free vibration modes are first calculated for a nominal finite-element model with relatively large fictitious masses located at the area of structural changes. A low-frequency subset of these modes is then transformed into a set of structural modal coordinates with which the entire simulation is performed. These generalized coordinates and the associated oscillatory aerodynamic force coefficient matrices are used to construct an efficient time-domain, state-space model for a basic aeroelastic case. The time simulation can then be performed by simply changing the mass, stiffness, and damping coupling terms when structural changes occur. It is shown that the size of the aeroelastic model required for time simulation with large structural changes at a few a priori known locations is similar to that required for direct analysis of a single structural case. The method is applied to the simulation of an aeroelastic wind-tunnel model. The diverging oscillations are followed by the activation of a tip-ballast decoupling mechanism that stabilizes the system but may cause significant transient overshoots.
Ohira, Yoshiyuki; Uehara, Takanori; Noda, Kazutaka; Suzuki, Shingo; Shikino, Kiyoshi; Kajiwara, Hideki; Kondo, Takeshi; Hirota, Yusuke; Ikusaka, Masatomi
2017-01-01
Objectives We examined whether problem-based learning tutorials using patient-simulated videos showing daily life are more practical for clinical learning, compared with traditional paper-based problem-based learning, for the consideration rate of psychosocial issues and the recall rate for experienced learning. Methods Twenty-two groups with 120 fifth-year students were each assigned paper-based problem-based learning and video-based problem-based learning using patient-simulated videos. We compared target achievement rates in questionnaires using the Wilcoxon signed-rank test and discussion contents diversity using the Mann-Whitney U test. A follow-up survey used a chi-square test to measure students’ recall of cases in three categories: video, paper, and non-experienced. Results Video-based problem-based learning displayed significantly higher achievement rates for imagining authentic patients (p=0.001), incorporating a comprehensive approach including psychosocial aspects (p<0.001), and satisfaction with sessions (p=0.001). No significant differences existed in the discussion contents diversity regarding the International Classification of Primary Care Second Edition codes and chapter types or in the rate of psychological codes. In a follow-up survey comparing video and paper groups to non-experienced groups, the rates were higher for video (χ2=24.319, p<0.001) and paper (χ2=11.134, p=0.001). Although the video rate tended to be higher than the paper rate, no significant difference was found between the two. Conclusions Patient-simulated videos showing daily life facilitate imagining true patients and support a comprehensive approach that fosters better memory. The clinical patient-simulated video method is more practical and clinical problem-based tutorials can be implemented if we create patient-simulated videos for each symptom as teaching materials. PMID:28245193
Ikegami, Akiko; Ohira, Yoshiyuki; Uehara, Takanori; Noda, Kazutaka; Suzuki, Shingo; Shikino, Kiyoshi; Kajiwara, Hideki; Kondo, Takeshi; Hirota, Yusuke; Ikusaka, Masatomi
2017-02-27
We examined whether problem-based learning tutorials using patient-simulated videos showing daily life are more practical for clinical learning, compared with traditional paper-based problem-based learning, for the consideration rate of psychosocial issues and the recall rate for experienced learning. Twenty-two groups with 120 fifth-year students were each assigned paper-based problem-based learning and video-based problem-based learning using patient-simulated videos. We compared target achievement rates in questionnaires using the Wilcoxon signed-rank test and discussion contents diversity using the Mann-Whitney U test. A follow-up survey used a chi-square test to measure students' recall of cases in three categories: video, paper, and non-experienced. Video-based problem-based learning displayed significantly higher achievement rates for imagining authentic patients (p=0.001), incorporating a comprehensive approach including psychosocial aspects (p<0.001), and satisfaction with sessions (p=0.001). No significant differences existed in the discussion contents diversity regarding the International Classification of Primary Care Second Edition codes and chapter types or in the rate of psychological codes. In a follow-up survey comparing video and paper groups to non-experienced groups, the rates were higher for video (χ2=24.319, p<0.001) and paper (χ2=11.134, p=0.001). Although the video rate tended to be higher than the paper rate, no significant difference was found between the two. Patient-simulated videos showing daily life facilitate imagining true patients and support a comprehensive approach that fosters better memory. The clinical patient-simulated video method is more practical and clinical problem-based tutorials can be implemented if we create patient-simulated videos for each symptom as teaching materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bassi, Gabriele; Blednykh, Alexei; Smalyuk, Victor
A novel algorithm for self-consistent simulations of long-range wakefield effects has been developed and applied to the study of both longitudinal and transverse coupled-bunch instabilities at NSLS-II. The algorithm is implemented in the new parallel tracking code space (self-consistent parallel algorithm for collective effects) discussed in the paper. The code is applicable for accurate beam dynamics simulations in cases where both bunch-to-bunch and intrabunch motions need to be taken into account, such as chromatic head-tail effects on the coupled-bunch instability of a beam with a nonuniform filling pattern, or multibunch and single-bunch effects of a passive higher-harmonic cavity. The numerical simulations have been compared with analytical studies. For a beam with an arbitrary filling pattern, intensity-dependent complex frequency shifts have been derived starting from a system of coupled Vlasov equations. The analytical formulas and numerical simulations confirm that the analysis is reduced to the formulation of an eigenvalue problem based on the known formulas of the complex frequency shifts for the uniform filling pattern case.
Noise levels from a model turbofan engine with simulated noise control measures applied
NASA Technical Reports Server (NTRS)
Hall, David G.; Woodward, Richard P.
1993-01-01
A study of estimated full-scale noise levels based on measured levels from the Advanced Ducted Propeller (ADP) sub-scale model is presented. Testing of this model was performed in the NASA Lewis Low Speed Anechoic Wind Tunnel at a simulated takeoff condition of Mach 0.2. Effective Perceived Noise Level (EPNL) estimates for the baseline configuration are documented, and also used as the control case in a study of the potential benefits of two categories of noise control. The effect of active noise control is evaluated by artificially removing various rotor-stator interaction tones. Passive noise control is simulated by applying a notch filter to the wind tunnel data. Cases with both techniques are included to evaluate hybrid active-passive noise control. The results for EPNL values are approximate because the original source data was limited in bandwidth and in sideline angular coverage. The main emphasis is on comparisons between the baseline and configurations with simulated noise control measures.
Yang, Qing-Sheng; Qiao, Ji-Gang; Ai, Bin
2013-09-01
Taking the rapidly urbanizing Dongguan City as a case and selecting landscape ecological security level as the evaluation criterion, the ecological security level of 1 km x 1 km cells was obtained and embedded into the transition rules of cellular automata (CA) as a restraint term to control urban development, establish an ecological-security urban CA, and simulate an ecologically secure urban development pattern. The results showed that the integrated landscape ecological security index of the City decreased from 0.497 in 1998 to 0.395 in 2005, indicating that ecological security at the landscape scale declined. The CA-simulated integrated ecological security index of the City in 2005 increased from the measured 0.395 to 0.479, showing that in the simulation the ecological pressure on the urban landscape from human activity became smaller, ecological security became better, and integrated landscape ecological security became higher. CA could be used as an effective tool in researching urban ecological security.
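The restraint-term idea can be sketched as a toy cellular automaton. All weights, grid sizes, and security scores below are illustrative assumptions, not the study's calibrated model.

```python
import random

def urbanize_probability(neighbor_urban_fraction, eco_security, weight=0.5):
    """Transition-rule sketch: development pressure from urban neighbours,
    damped by the cell's ecological security score (0 = insecure, 1 = secure)."""
    return max(0.0, neighbor_urban_fraction - weight * eco_security)

def step(grid, eco, rng):
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] == 1:          # already urban: stays urban
                continue
            nbrs = [grid[(i + di) % n][(j + dj) % n]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)
                    if (di, dj) != (0, 0)]
            p = urbanize_probability(sum(nbrs) / 8.0, eco[i][j])
            if rng.random() < p:
                new[i][j] = 1
    return new

rng = random.Random(0)
# 10 x 10 toy landscape: one urban seed, higher ecological security in the north half
grid = [[1 if (i, j) == (5, 5) else 0 for j in range(10)] for i in range(10)]
eco = [[0.8 if i < 5 else 0.2 for j in range(10)] for i in range(10)]
for _ in range(10):
    grid = step(grid, eco, rng)
north = sum(grid[i][j] for i in range(5) for j in range(10))
south = sum(grid[i][j] for i in range(5, 10) for j in range(10))
print(north, south)  # development concentrates in the low-security south half
```

Because the restraint term subtracts directly from the development pressure, high-security cells never convert under this rule, which is the mechanism by which the constrained CA raises the simulated ecological security index.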
NASA Astrophysics Data System (ADS)
Striepe, Scott Allen
The objectives of this research were to develop a reconstruction capability using the Program to Optimize Simulated Trajectories II (POST2), apply this capability to reconstruct the Huygens Titan probe entry, descent, and landing (EDL) trajectory, evaluate the newly developed POST2 reconstruction module, analyze the reconstructed trajectory, and assess the pre-flight simulation models used for Huygens EDL simulation. An extended Kalman filter (EKF) module was developed and integrated into POST2 to enable trajectory reconstruction (especially when using POST2-based mission specific simulations). Several validation cases, ranging from a single, constant parameter estimate to multivariable estimation cases similar to an actual mission flight, were executed to test the POST2 reconstruction module. Trajectory reconstruction of the Huygens entry probe at Titan was accomplished using accelerometer measurements taken during flight to adjust an estimated state (e.g., position, velocity, parachute drag, wind velocity, etc.) in a POST2-based simulation developed to support EDL analyses and design prior to entry. Although the main emphasis of the trajectory reconstruction was to evaluate models used in the NASA pre-entry trajectory simulation, the resulting reconstructed trajectory was also assessed to provide an independent evaluation of the ESA result. 
Major findings from this analysis include: altitude profiles from this analysis agree well with other NASA and ESA results but not with the radar data, and a scale factor of about 0.93 would bring the radar measurements into agreement with these results; entry capsule aerodynamics predictions (axial component only) were well within the 3-sigma bounds established pre-flight for most of the entry when compared to reconstructed values; a main parachute drag 9% to 19% above the ESA model was determined from the reconstructed trajectory; and, based on the tilt sensor and accelerometer data, the conclusion from this assessment was that the probe was tilted about 10 degrees during the drogue parachute phase.
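The simplest validation case mentioned above, estimating a single constant parameter from noisy measurements, reduces the EKF to a scalar linear Kalman filter. The following is a minimal sketch under that assumption; the 0.93 scale factor and the noise level are illustrative, not the mission's actual filter.

```python
import random

def estimate_constant(measurements, x0=0.0, p0=1.0, r=0.25):
    """Minimal scalar Kalman filter for a constant parameter:
    state model x_k = x_{k-1}, measurement z_k = x_k + noise (variance r)."""
    x, p = x0, p0
    for z in measurements:
        k = p / (p + r)        # Kalman gain
        x = x + k * (z - x)    # measurement update
        p = (1.0 - k) * p      # covariance update
    return x, p

rng = random.Random(42)
truth = 0.93  # e.g., a hypothetical radar altitude scale factor
zs = [truth + rng.gauss(0.0, 0.05) for _ in range(200)]
est, var = estimate_constant(zs, x0=1.0, p0=1.0, r=0.05 ** 2)
print(round(est, 3))
```

The full reconstruction extends this to a multivariable state (position, velocity, parachute drag, wind velocity, etc.) with a nonlinear dynamics model, which is where the "extended" part of the EKF comes in.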
Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations
NASA Technical Reports Server (NTRS)
Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.
2015-01-01
The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
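The comparison methodology, running the same scenario through independently implemented propagators and difference-plotting the results, can be illustrated with a toy drag-free point-mass check-case (not one of the report's actual scenarios):

```python
def deriv(state, g=9.80665):
    """Point-mass, drag-free, flat-Earth equations of motion (toy check-case)."""
    x, z, vx, vz = state
    return (vx, vz, 0.0, -g)

def step_euler(state, dt):
    d = deriv(state)
    return tuple(s + dt * ds for s, ds in zip(state, d))

def step_rk4(state, dt):
    def add(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = deriv(state)
    k2 = deriv(add(state, k1, dt / 2))
    k3 = deriv(add(state, k2, dt / 2))
    k4 = deriv(add(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Two independently implemented propagators run the same scenario; the
# difference trajectory exposes integration (or modelling) discrepancies.
state_e = state_r = (0.0, 0.0, 100.0, 100.0)  # x, z, vx, vz
dt, steps = 0.01, 1000
max_diff = 0.0
for _ in range(steps):
    state_e, state_r = step_euler(state_e, dt), step_rk4(state_r, dt)
    max_diff = max(max_diff, abs(state_e[1] - state_r[1]))
print(f"max altitude difference over {steps * dt:.0f} s: {max_diff:.4f} m")
```

Here the divergence is the first-order integrator's truncation error (analytically g*dt*t/2 for constant acceleration); in the actual check-cases the same differencing flags disagreements in geodesy, gravitation, and equations-of-motion implementations.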
ERIC Educational Resources Information Center
Ke, Fengfeng; Lee, Sungwoong
2016-01-01
This exploratory case study examined the process and potential impact of collaborative architectural design and construction in an OpenSimulator-based virtual reality (VR) on the social skills development of children with high-functioning autism (HFA). Two children with a formal medical diagnosis of HFA and one typically developing peer, aged…
Creating A Data Base For Design Of An Impeller
NASA Technical Reports Server (NTRS)
Prueger, George H.; Chen, Wei-Chung
1993-01-01
Report describes use of Taguchi method of parametric design to create data base facilitating optimization of design of impeller in centrifugal pump. Data base enables systematic design analysis covering all significant design parameters. Reduces time and cost of parametric optimization of design: for particular impeller considered, one can cover 4,374 designs by computational simulations of performance for only 18 cases.
ERIC Educational Resources Information Center
Chen, Lih-Shyang; Cheng, Yuh-Ming; Weng, Sheng-Feng; Chen, Yong-Guo; Lin, Chyi-Her
2009-01-01
The prevalence of Internet applications nowadays has led many medical schools and centers to incorporate computerized Problem-Based Learning (PBL) methods into their training curricula. However, many of these PBL systems do not truly reflect the situations which practitioners may actually encounter in a real medical environment, and hence their…
Zeng, Guang-Ming; Zhang, Shuo-Fu; Qin, Xiao-Sheng; Huang, Guo-He; Li, Jian-Bing
2003-05-01
This paper establishes the relationship between settling efficiency and the dimensions of the sedimentation tank through numerical simulation, which is taken as one of the constraints in setting up a simple optimal design model for the tank. The feasibility and advantages of this model based on numerical calculation are verified through application to a practical case.
Thompson, R C; Scammon, D L
1994-01-01
A client-responsive strategy was developed based upon input from nutrition clinic personnel, administrators, and clients. Systems simulation identified the strategy most likely to lead to client satisfaction while also meeting the needs of clinic personnel and administration. The strategy was subsequently introduced into the clinic and patient satisfaction and operating revenues were monitored following implementation. Both measures of impact demonstrated significant improvement.
Loading relativistic Maxwell distributions in particle simulations
NASA Astrophysics Data System (ADS)
Zenitani, Seiji
2015-04-01
Numerical algorithms to load relativistic Maxwell distributions in particle-in-cell (PIC) and Monte Carlo simulations are presented. For the stationary relativistic Maxwellian, the inverse transform method and the Sobol algorithm are reviewed. To boost particles to obtain a relativistic shifted Maxwellian, two rejection methods are proposed in a physically transparent manner. Their acceptance efficiencies are ≈50% for generic cases and 100% for symmetric distributions. They can be combined with arbitrary base algorithms.
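As a point of reference for the distribution being loaded, here is a minimal generic rejection sampler for the stationary Maxwell-Jüttner momentum distribution f(p) ∝ p² exp(−(γ−1)/θ), with p in units of mc and θ = kT/mc². It uses a plain uniform envelope and is far less efficient than the inverse-transform, Sobol, or rejection methods treated in the paper; the cutoff choice is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_juttner_p(theta, n, p_max=None):
    """Rejection-sample momentum magnitudes p (units of mc) from the
    stationary Maxwell-Juttner distribution f(p) ~ p^2 exp(-(gamma-1)/theta)
    with a plain uniform envelope (a generic sketch, not the paper's
    algorithms)."""
    if p_max is None:
        p_max = 10.0 * (theta + np.sqrt(theta))   # generous cutoff (assumption)
    def f(p):
        return p**2 * np.exp(-(np.sqrt(1.0 + p**2) - 1.0) / theta)
    grid = np.linspace(0.0, p_max, 2048)
    fmax = f(grid).max() * 1.05                   # envelope height + margin
    out = []
    while len(out) < n:
        p = rng.uniform(0.0, p_max, size=n)       # uniform proposal
        u = rng.uniform(0.0, fmax, size=n)
        out.extend(p[u < f(p)].tolist())          # accept under the curve
    return np.array(out[:n])
```

For θ = 0.1 this reduces to a near-classical Maxwellian with mean p ≈ 2√(2θ/π) ≈ 0.5, a quick sanity check on the sampler.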
Calder, Stefan; O'Grady, Greg; Cheng, Leo K; Du, Peng
2018-04-27
Electrogastrography (EGG) is a non-invasive method for measuring gastric electrical activity. Recent simulation studies have attempted to extend the current clinical utility of the EGG, in particular by providing a theoretical framework for distinguishing specific gastric slow wave dysrhythmias. In this paper we implement an experimental setup called a 'torso-tank' with the aim of expanding and experimentally validating these previous simulations. The torso-tank was developed using an adult male torso phantom with 190 electrodes embedded throughout the torso. The gastric slow waves were reproduced using an artificial current source capable of producing 3D electrical fields. Multiple gastric dysrhythmias were reproduced based on high-resolution mapping data from cases of human gastric dysfunction (gastric re-entry, conduction blocks and ectopic pacemakers) in addition to normal test data. Each case was recorded and compared to the previously presented simulated results. Qualitative and quantitative analyses were performed to define the accuracy, showing [Formula: see text] 1.8% difference, [Formula: see text] 0.99 correlation, and [Formula: see text] 0.04 normalised RMS error between experimental and simulated findings. These results reaffirm the previous findings, and taken together these methods present a promising morphology-based methodology for advancing the understanding and clinical applications of EGG.
Maccarini, Alessandro; Wetter, Michael; Afshari, Alireza; ...
2016-10-31
This paper analyzes the performance of a novel two-pipe system that operates one water loop to simultaneously provide space heating and cooling with a water supply temperature of around 22 °C. To analyze the energy performance of the system, a simulation-based study was conducted. The two-pipe system was modelled using the equation-based Modelica modeling language in Dymola. A typical office building model was considered as the case study. Simulations were run for two construction sets of the building envelope and two conditions related to inter-zone air flows. To calculate energy savings, a conventional four-pipe system was modelled and used for comparison. The conventional system presented two separate water loops for heating and cooling, with supply temperatures of 45 °C and 14 °C, respectively. Simulation results showed that the two-pipe system used less energy than the four-pipe system thanks to three effects: useful heat transfer from warm to cold zones, higher free-cooling potential and higher efficiency of the heat pump. In particular, the two-pipe system used approximately 12% to 18% less total annual primary energy than the four-pipe system, depending on the simulation case considered.
Imamura, Toshihiro; Kokai, Satoshi; Ono, Takashi
2018-01-01
For patients with bimaxillary protrusion, significant retraction and intrusion of the anterior teeth are sometimes essential to improve the facial profile. However, severe root resorption of the maxillary incisors occasionally occurs after treatment because of various factors. For instance, it has been reported that approximation or invasion of the incisive canal by the anterior tooth roots during retraction may cause apical root damage. Thus, determination of the position of the maxillary incisors is key for orthodontic diagnosis and treatment planning in such cases. Cone-beam computed tomography (CBCT) may be useful for simulating the post-treatment position of the maxillary incisors and surrounding structures in order to ensure safe teeth movement. Here, we present a case of Class II malocclusion with bimaxillary protrusion, wherein apical root damage due to treatment was minimized by pretreatment evaluation of the anatomical structures and simulation of the maxillary central incisor movement using CBCT. Considerable retraction and intrusion of the maxillary incisors, which resulted in a significant improvement in the facial profile and smile, were achieved without severe root resorption. Our findings suggest that CBCT-based diagnosis and treatment simulation may facilitate safe and dynamic orthodontic tooth movement, particularly in patients requiring maximum anterior tooth retraction. PMID:29732305
A simulation-based approach for solving assembly line balancing problem
NASA Astrophysics Data System (ADS)
Wu, Xiaoyu
2017-09-01
Assembly line balancing is directly related to production efficiency; the problem has been discussed since the last century, and many researchers are still studying this topic. In this paper, the assembly line problem is studied by establishing a mathematical model and by simulation. First, the model for determining the smallest production beat (cycle time) under a given number of workstations is analyzed. Based on this model, the exponential smoothing approach is applied to improve the algorithm's efficiency. After this groundwork, the balancing of a gas Stirling engine assembly line is discussed as a case study. Both algorithms are implemented in the Lingo programming environment, and the simulation results demonstrate the validity of the new methods.
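The core subproblem, finding the smallest production beat (cycle time) for a fixed number of workstations, can be sketched with a binary search over the cycle time plus a greedy feasibility check. This simplified version assumes tasks are assigned in a fixed sequence (precedence reduced to a chain), which is not necessarily the model used in the paper.

```python
def min_cycle_time(task_times, n_stations):
    """Smallest cycle time when the task sequence is split into at most
    n_stations consecutive blocks (a simplified stand-in for the paper's
    line-balancing model)."""
    def stations_needed(cycle):
        # Greedy: fill each station until the next task would overflow.
        count, load = 1, 0.0
        for t in task_times:
            if t > cycle:
                return float('inf')      # infeasible cycle time
            if load + t <= cycle:
                load += t
            else:
                count, load = count + 1, t
        return count

    lo, hi = max(task_times), float(sum(task_times))
    while hi - lo > 1e-6:                # binary search on the cycle time
        mid = (lo + hi) / 2
        if stations_needed(mid) <= n_stations:
            hi = mid
        else:
            lo = mid
    return hi
```

For tasks [4, 3, 2, 5, 2] on two stations the best contiguous split is [4, 3, 2 | 5, 2], giving a beat of 9.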
Reality Imagined: The Choice to Use a Real-World Case in a Simulation
ERIC Educational Resources Information Center
Langfield, Danielle
2016-01-01
The use of a real-world case in a classroom simulation--in contrast to invented or disguised cases--is not widely recognized as a "combination" of two common active-learning strategies in political science: teaching with a case study and conducting a simulation. I argue that using such a simulation therefore can provide the benefits of…
NASA Astrophysics Data System (ADS)
Tian, Yi; Chen, Mahao; Kong, Jun
2009-02-01
With the online z-axis tube current modulation (OZTCM) technique proposed in this work, fully automatic exposure control (AEC) for CT systems can be realized with online feedback not only for angular tube current modulation (TCM) but also for z-axis TCM, so a localizer radiograph is no longer required for TCM. OZTCM can be implemented with two schemes, attenuation-based and image-noise-level-based: respectively, the maximum attenuation of the projection readings and the standard deviation of the reconstructed images are used to modulate the tube current level along the z-axis adaptively for each half (180 degree) or full (360 degree) rotation. Simulation results showed that OZTCM achieved a better noise level than a constant-tube-current scan using the same total dose in mAs. OZTCM can provide an optimized base tube current level for angular TCM to realize effective automatic exposure control when a localizer radiograph is not available or needs to be skipped for a simplified scan protocol, for example in emergency procedures or pediatric scans.
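The attenuation-based idea can be caricatured in one line: to hold quantum noise roughly constant, the per-rotation tube current is scaled with the measured attenuation. The linear scaling rule and parameter names below are illustrative assumptions, not the modulation law from the paper.

```python
def z_axis_tube_current(max_attenuation, base_mA, ref_attenuation):
    """Illustrative z-axis modulation rule in the spirit of the
    attenuation-based scheme: scale the per-rotation tube current with
    the measured maximum attenuation so image noise stays roughly
    constant.  The linear scaling is an assumption for illustration."""
    return [base_mA * a / ref_attenuation for a in max_attenuation]
```

A rotation seeing twice the reference attenuation gets twice the base current, one seeing half gets half.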
Efficient and Effective Use of Peer Teaching for Medical Student Simulation.
House, Joseph B; Choe, Carol H; Wourman, Heather L; Berg, Kristin M; Fischer, Jonathan P; Santen, Sally A
2017-01-01
Simulation is increasingly used in medical education, promoting active learning and retention; however, increasing use also requires considerable instructor resources. Simulation may provide a safe environment for students to teach each other, which many will need to do when they enter residency. Along with reinforcing learning and increasing retention, peer teaching could decrease instructor demands. Our objective was to determine the effectiveness of peer-taught simulation compared to physician-led simulation. We hypothesized that peer-taught simulation would lead to equivalent knowledge acquisition when compared to physician-taught sessions and would be viewed positively by participants. This was a quasi-experimental study in an emergency medicine clerkship. The control group was faculty taught. In the peer-taught intervention group, students were assigned to teach one of three simulation-based medical emergency cases. Each student was instructed to master their topic and teach it to their peers using the provided objectives and resource materials. The students were assigned to groups of three, with all three cases represented; students took turns leading their case. Three groups ran simultaneously. During the intervention sessions, one physician was present to monitor the accuracy of learning and to answer questions, while three physicians were required for the control groups. Outcomes compared pre-test and post-test knowledge and student reaction between control and intervention groups. Both methods led to equally improved knowledge (the mean post-test score was 75% in both groups, p=0.6) and both were viewed positively. Students in the intervention group agreed that peer-directed learning was an effective way to learn; however, students in the control group scored their simulation experience more favorably. In general, students' response to peer teaching was positive: students learned equally well and found peer-taught sessions to be interactive and beneficial.
Economic Evaluation of Pediatric Telemedicine Consultations to Rural Emergency Departments.
Yang, Nikki H; Dharmar, Madan; Yoo, Byung-Kwang; Leigh, J Paul; Kuppermann, Nathan; Romano, Patrick S; Nesbitt, Thomas S; Marcin, James P
2015-08-01
Comprehensive economic evaluations have not been conducted on telemedicine consultations to children in rural emergency departments (EDs). We conducted an economic evaluation to estimate the cost, effectiveness, and return on investment (ROI) of telemedicine consultations provided to health care providers of acutely ill and injured children in rural EDs compared with telephone consultations, from a health care payer perspective. We built a decision model with parameters from primary programmatic data, national data, and the literature. We performed a base-case cost-effectiveness analysis (CEA), a probabilistic CEA with Monte Carlo simulation, and ROI estimation when the CEA suggested cost-saving. The CEA was based on program effectiveness, derived from transfer decisions following telemedicine and telephone consultations. The average cost for a telemedicine consultation was $3641 per child/ED/year in 2013 US dollars. Telemedicine consultations resulted in 31% fewer patient transfers compared with telephone consultations and a cost reduction of $4662 per child/ED/year. Our probabilistic CEA demonstrated telemedicine consultations were less costly than telephone consultations in 57% of simulation iterations. The ROI was calculated to be 1.28 ($4662/$3641) from the base-case analysis and estimated to be 1.96 from the probabilistic analysis, suggesting a $1.96 return for each dollar invested in telemedicine. Treating 10 acutely ill and injured children at each rural ED with telemedicine resulted in an annual cost-savings of $46,620 per ED. Telephone and telemedicine consultations were not randomly assigned, potentially resulting in biased results. From a health care payer perspective, telemedicine consultations to health care providers of acutely ill and injured children presenting to rural EDs are cost-saving (base-case and more than half of Monte Carlo simulation iterations) or cost-effective compared with telephone consultations. © The Author(s) 2015.
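The probabilistic ROI logic can be sketched with a small Monte Carlo loop. The $3641 cost and $4662 savings point estimates are taken from the abstract; the ±30% uniform uncertainty ranges are purely illustrative stand-ins for the study's actual parameter distributions.

```python
import random

random.seed(42)

def simulate_roi(n_iter=10_000):
    """Toy probabilistic ROI sketch: per-iteration cost of a telemedicine
    consultation vs. savings from avoided transfers.  Point estimates
    ($3641 cost, $4662 savings per child/ED/year) come from the abstract;
    the +/-30% uniform ranges are illustrative assumptions."""
    rois = []
    for _ in range(n_iter):
        cost = random.uniform(3641 * 0.7, 3641 * 1.3)
        savings = random.uniform(4662 * 0.7, 4662 * 1.3)
        rois.append(savings / cost)
    mean_roi = sum(rois) / len(rois)
    frac_cost_saving = sum(r > 1.0 for r in rois) / len(rois)
    return mean_roi, frac_cost_saving
```

With these illustrative ranges the mean ROI lands near the base-case 1.28, and most iterations come out cost-saving (ROI > 1).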
RICH: OPEN-SOURCE HYDRODYNAMIC SIMULATION ON A MOVING VORONOI MESH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yalinewich, Almog; Steinberg, Elad; Sari, Re’em
2015-02-01
We present here RICH, a state-of-the-art two-dimensional hydrodynamic code based on Godunov’s method, on an unstructured moving mesh (the acronym stands for Racah Institute Computational Hydrodynamics). This code is largely based on the code AREPO. It differs from AREPO in the interpolation and time-advancement schemes, as well as in a novel parallelization scheme based on Voronoi tessellation. Using our code, we study the pros and cons of a moving mesh (in comparison to a static mesh). We also compare its accuracy to other codes. Specifically, we show that our implementation of external sources and our time-advancement scheme are more accurate and robust than AREPO's when the mesh is allowed to move. We performed a parameter study of the cell rounding mechanism (Lloyd iterations) and its effects. We find that in most cases a moving mesh gives better results than a static mesh, but this is not universally true. In the case where matter moves one way and a sound wave travels the other way (such that relative to the grid the wave is not moving), a static mesh gives better results than a moving mesh. We perform an analytic analysis for finite-difference schemes which reveals that a Lagrangian simulation is better than an Eulerian simulation in the case of highly supersonic flow. Moreover, we show that Voronoi-based moving-mesh schemes suffer from an error, which is resolution independent, due to inconsistencies between the flux calculation and the change in the area of a cell. Our code is publicly available as open source and designed in an object-oriented, user-friendly way that facilitates incorporation of new algorithms and physical processes.
Oakes, Jessica M; Marsden, Alison L; Grandmont, Celine; Shadden, Shawn C; Darquenne, Chantal; Vignon-Clementel, Irene E
2014-04-01
Image-based in silico modeling tools provide detailed velocity and particle deposition data. However, care must be taken when prescribing boundary conditions to model lung physiology in health or disease, such as in emphysema. In this study, the respiratory resistance and compliance were obtained by solving an inverse problem on a 0D global model, based on healthy and emphysematous rat experimental data. Multi-scale CFD simulations were performed by solving the 3D Navier-Stokes equations in an MRI-derived rat geometry coupled to a 0D model. Particles with 0.95 μm diameter were tracked and their distribution in the lung was assessed. Seven 3D-0D simulations were performed: healthy, homogeneous, and five heterogeneous emphysema cases. Compliance (C) was significantly higher (p = 0.04) in the emphysematous rats (C = 0.37 ± 0.14 cm(3)/cmH2O) compared to the healthy rats (C = 0.25 ± 0.04 cm(3)/cmH2O), while the resistance remained unchanged (p = 0.83). There were increases in airflow, particle deposition in the 3D model, and particle delivery to the diseased regions for the heterogeneous cases compared to the homogeneous cases. The results highlight the importance of multi-scale numerical simulations to study airflow and particle distribution in healthy and diseased lungs. The effect of particle size and gravity were studied. Once available, these in silico predictions may be compared to experimental deposition data.
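The 0D inverse problem for resistance and compliance can be illustrated with the single-compartment equation of motion of the respiratory system, P = R·Q + V/C, which is linear in R and 1/C and therefore solvable by least squares. This is a generic sketch under that standard model, not the authors' fitting procedure.

```python
import numpy as np

def fit_resistance_compliance(P, Q, V):
    """Estimate respiratory resistance R and compliance C from pressure P,
    flow Q and volume V via the single-compartment relation P = R*Q + V/C,
    solved as a linear least-squares problem in the unknowns [R, 1/C]."""
    A = np.column_stack([Q, V])                 # design matrix
    coef, *_ = np.linalg.lstsq(A, P, rcond=None)
    R, inv_C = coef
    return R, 1.0 / inv_C
```

On synthetic noise-free data generated with known R and C, the fit recovers both parameters exactly.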
Aydin, Denis; Feychting, Maria; Schüz, Joachim; Andersen, Tina Veje; Poulsen, Aslak Harbo; Prochazka, Michaela; Klaeboe, Lars; Kuehni, Claudia E; Tynes, Tore; Röösli, Martin
2011-07-01
Whether the use of mobile phones is a risk factor for brain tumors in adolescents is currently being studied. Case-control studies investigating this possible relationship are prone to recall error and selection bias. We assessed the potential impact of random and systematic recall error and selection bias on odds ratios (ORs) by performing simulations based on real data from an ongoing case-control study of mobile phones and brain tumor risk in children and adolescents (CEFALO study). Simulations were conducted for two mobile phone exposure categories: regular and heavy use. Our choice of levels of recall error was guided by a validation study that compared objective network operator data with the self-reported amount of mobile phone use in CEFALO. In our validation study, cases overestimated their number of calls by 9% on average and controls by 34%. Cases also overestimated their duration of calls by 52% on average and controls by 163%. The participation rates in CEFALO were 83% for cases and 71% for controls. In a variety of scenarios, the combined impact of recall error and selection bias on the estimated ORs was complex. These simulations are useful for the interpretation of previous case-control studies on brain tumor and mobile phone use in adults as well as for the interpretation of future studies on adolescents. Copyright © 2011 Wiley-Liss, Inc.
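The effect of differential recall error alone can be reproduced with a toy simulation: with no true effect (true OR = 1), cases over-reporting call duration by 52% and controls by 163% (the averages quoted above) pushes the observed OR for "heavy use" below 1. The lognormal duration distribution and the cutoff below are illustrative assumptions, not the CEFALO data model.

```python
import random

random.seed(1)

def simulated_or(n=20000, cutoff=60.0):
    """Toy recall-error simulation: true call duration has the same
    distribution in cases and controls (true OR = 1); cases over-report
    by 52% and controls by 163% (figures from the abstract).  'Heavy use'
    is reported duration above an arbitrary cutoff."""
    def counts(overreport):
        exposed = 0
        for _ in range(n):
            true_dur = random.lognormvariate(3.5, 1.0)   # assumed distribution
            reported = true_dur * (1.0 + overreport)
            exposed += reported > cutoff
        return exposed, n - exposed
    a, b = counts(0.52)   # cases: exposed, unexposed
    c, d = counts(1.63)   # controls: exposed, unexposed
    return (a * d) / (b * c)
```

Because controls inflate their usage more, they cross the "heavy use" cutoff more often, biasing the observed OR below the true value of 1.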
A Three-Dimensional Target Depth-Resolution Method with a Single-Vector Sensor
Zhao, Anbang; Bi, Xuejie; Hui, Juan; Zeng, Caigao; Ma, Lin
2018-01-01
This paper studies and verifies a target-number and category-resolution method for multi-target cases and a target depth-resolution method for aerial targets. First, target depth resolution is performed using the sign distribution of the reactive component of the vertical complex acoustic intensity; target category and number resolution in multi-target cases are realized in combination with the bearing-time recording information, and the corresponding simulation verification is carried out. The algorithm proposed in this paper can distinguish between the single-target multi-line-spectrum case and the multi-target multi-line-spectrum case. The paper also presents an improved azimuth-estimation method for multi-target cases, which makes the estimation results more accurate. Using Monte Carlo simulation, the feasibility of the proposed target-number and category-resolution algorithm in multi-target cases is verified. In addition, by studying the field characteristics of aerial and surface targets, the simulation results verify that there is only an amplitude difference between the aerial-target field and the surface-target field under the same environmental parameters, so an aerial target can be treated as a special case of a surface target; aerial-target category resolution can then be realized based on the sign distribution of the reactive component of the vertical acoustic intensity, realizing three-dimensional target depth resolution. By processing data from a sea experiment, the feasibility of the proposed aerial-target three-dimensional depth-resolution algorithm is verified. PMID:29649173
Software for Brain Network Simulations: A Comparative Study
Tikidji-Hamburyan, Ruben A.; Narayana, Vikram; Bozkus, Zeki; El-Ghazawi, Tarek A.
2017-01-01
Numerical simulations of brain networks are a critical part of our efforts in understanding brain functions under pathological and normal conditions. For several decades, the community has developed many software packages and simulators to accelerate research in computational neuroscience. In this article, we select the three most popular simulators, as determined by the number of models in the ModelDB database: NEURON, GENESIS, and BRIAN, and perform an independent evaluation of these simulators. In addition, we study NEST, one of the lead simulators of the Human Brain Project. First, we study them based on one of the most important characteristics, the range of supported models. Our investigation reveals that brain network simulators may be biased toward supporting a specific set of models. However, all simulators tend to expand the supported range of models by providing a universal environment for the computational study of individual neurons and brain networks. Next, our investigations of the characteristics of computational architecture and efficiency indicate that all simulators compile the most computationally intensive procedures into binary code, with the aim of maximizing their computational performance. However, not all simulators provide the simplest method for module development and/or guarantee efficient binary code. Third, a study of their amenability to high-performance computing reveals that NEST can almost transparently map an existing model on a cluster or multicore computer, while NEURON requires code modification if a model developed for a single computer has to be mapped on a computational cluster. Interestingly, parallelization is the weakest characteristic of BRIAN, which provides no support for cluster computations and limited support for multicore computers. Fourth, we identify the level of user support and frequency of usage for all simulators.
Finally, we carry out an evaluation using two case studies: a large network with simplified neural and synaptic models and a small network with detailed models. These two case studies allow us to avoid any bias toward a particular software package. The results indicate that BRIAN provides the most concise language for both cases considered. Furthermore, as expected, NEST mostly favors large network models, while NEURON is better suited for detailed models. Overall, the case studies reinforce our general observation that simulators are biased in computational performance toward specific types of brain network models. PMID:28775687
NASA Astrophysics Data System (ADS)
Matha, Denis; Sandner, Frank; Schlipf, David
2014-12-01
Design verification of wind turbines is performed by simulation of design load cases (DLC) defined in the IEC 61400-1 and -3 standards or equivalent guidelines. Due to the resulting large number of necessary load simulations, a method is presented here to reduce the computational effort for DLC simulations significantly by introducing a reduced nonlinear model and simplified hydro- and aerodynamics. The advantage of the formulation is that the nonlinear ODE system contains only basic mathematical operations and no iterations or internal loops, which makes it very computationally efficient. Global turbine extreme and fatigue loads such as rotor thrust, tower base bending moment and mooring line tension, as well as platform motions, are outputs of the model. They can be used to identify critical and less critical load situations to be then analysed with a higher-fidelity tool, thus speeding up the design process. Results from these reduced-model DLC simulations are presented and compared to higher-fidelity models. Results in the frequency and time domains, as well as extreme and fatigue load predictions, demonstrate that good agreement between the reduced and advanced models is achieved, making it possible to efficiently exclude less critical DLC simulations and to identify the most critical subset of cases for a given design. Additionally, the model is applicable to brute-force optimization of floater control system parameters.
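The flavor of such a reduced model, an ODE whose right-hand side uses only basic arithmetic, can be conveyed with a toy single-DOF platform-pitch model under constant rotor thrust and sinusoidal wave forcing. All parameter values below are invented for illustration; the real model tracks many more outputs (tower base moment, mooring tension, etc.).

```python
import math

def simulate_pitch(duration=600.0, dt=0.05):
    """Toy reduced-order floating-turbine model: one pitch DOF driven by
    constant rotor thrust and sinusoidal wave forcing, integrated with
    semi-implicit Euler.  All parameter values are illustrative."""
    I, c, k, h = 1e10, 1e9, 2e9, 90.0        # inertia, damping, stiffness, hub arm
    thrust, wave_amp, wave_T = 8e5, 5e6, 10.0
    theta, omega, peak, t = 0.0, 0.0, 0.0, 0.0
    while t < duration:
        moment = (thrust * h
                  + wave_amp * math.sin(2 * math.pi * t / wave_T)
                  - k * theta - c * omega)
        omega += dt * moment / I             # update rate first (symplectic)
        theta += dt * omega
        peak = max(peak, abs(theta))
        t += dt
    return theta, peak
```

With these numbers the pitch settles near the static value thrust·h/k ≈ 0.036 rad, so extreme-value outputs like the peak pitch fall out of one cheap pass.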
NASA Astrophysics Data System (ADS)
Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.
2011-12-01
In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. 
The second case study considers a characteristic lava-flow aquifer system in Pahute Mesa, Nevada. A 3D training image is developed by using object-based simulation of parametric shapes to represent the key morphologic features of rhyolite lava flows embedded within ash-flow tuffs. In addition to vertical drill-hole data, transient pressure head data from aquifer tests can be used to constrain the stochastic model outcomes. The use of both static and dynamic conditioning data allows the identification of potential geologic structures that control hydraulic response. These case studies demonstrate the flexibility of the multiple-point geostatistics approach for considering multiple types of data and for developing sophisticated models of geologic heterogeneities that can be incorporated into numerical flow simulations.
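The exceedance-probability post-processing step described in the first case study is straightforward once an ensemble of simulated concentration fields is available; the synthetic fields below stand in for the transport-model output.

```python
import numpy as np

rng = np.random.default_rng(7)

def exceedance_probability(realizations, threshold):
    """Cell-wise probability that simulated contaminant concentration
    exceeds a threshold, estimated over an ensemble of equally probable
    realizations (the post-processing step described in the abstract)."""
    stack = np.asarray(realizations)          # shape (n_real, ny, nx)
    return (stack > threshold).mean(axis=0)

# Synthetic ensemble of 100 concentration fields on a 20x20 grid.
fields = rng.gamma(2.0, 1.0, size=(100, 20, 20))
pmap = exceedance_probability(fields, 5.0)
```

Contouring `pmap` at, say, 0.05 and 0.95 then brackets the uncertain plume boundary.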
NASA Astrophysics Data System (ADS)
Li, Y.; Robertson, C.
2018-06-01
The influence of irradiation defect dispersions on plastic strain spreading is investigated by means of three-dimensional dislocation dynamics (DD) simulations, accounting for thermally activated slip and cross-slip mechanisms in Fe-2.5%Cr grains. The defect-induced evolutions of the effective screw dislocation mobility are evaluated by means of statistical comparisons, for various defect number density and defect size cases. Each comparison is systematically associated with a quantitative Defect-Induced Apparent Straining Temperature shift (or «ΔDIAT»), calculated without any adjustable parameters. In the investigated cases, the ΔDIAT level associated with a given defect dispersion closely replicates the measured ductile to brittle transition temperature shift (ΔDBTT) due to the same, actual defect dispersion. The results are further analyzed in terms of dislocation-based plasticity mechanisms and their possible relations with the dose-dependent changes of the ductile to brittle transition temperature.
Osawa, Takeshi; Okawa, Shigenori; Kurokawa, Shunji; Ando, Shinichiro
2016-12-01
In this study, we propose a method for estimating the risk of agricultural damage caused by an invasive species when species-specific information is lacking. We defined the "risk" as the product of the invasion probability and the area of potentially damaged crop for production. As a case study, we estimated the risk imposed by an invasive weed, Sicyos angulatus, based on simple cellular simulations and governmental data on the area of crop that could potentially be damaged in Miyagi Prefecture, Japan. Simulation results revealed that the current distribution range was sufficiently accurate for practical purposes. Using these results and records of crop areas, we present risk maps for S. angulatus in agricultural fields. Managers will be able to use these maps to rapidly establish a management plan with minimal cost. Our approach will be valuable for establishing a management plan before or during the early stages of invasion.
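The risk definition above (invasion probability times the area of potentially damaged crop) reduces to an element-wise product over grid cells, which can then be ranked to prioritise management. A minimal sketch with illustrative numbers, not the Miyagi Prefecture data:

```python
import numpy as np

# Hypothetical inputs: per-cell invasion probability (e.g. from a
# cellular spread simulation) and per-cell area (ha) of crop that
# could potentially be damaged. Values are illustrative only.
invasion_prob = np.array([[0.9, 0.5, 0.0],
                          [0.6, 0.25, 0.0],
                          [0.1, 0.0, 0.0]])
crop_area_ha = np.array([[0.0, 12.0, 30.0],
                         [5.0, 8.0, 0.0],
                         [20.0, 3.0, 1.0]])

# Risk as defined in the study: invasion probability times the area
# of potentially damaged crop, per cell.
risk = invasion_prob * crop_area_ha

# Rank cells so managers can direct effort at minimal cost.
rows, cols = np.unravel_index(np.argsort(risk, axis=None)[::-1], risk.shape)
print(list(zip(rows[:2], cols[:2])))  # the two highest-risk cells
```

Note how a cell with high invasion probability but no crop (bottom-left column) ranks below a moderately threatened cell with a large crop area.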
Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando
2017-01-01
Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) an initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16); (2) training facility design using Distributed Simulation; (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback, including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally.
This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.
Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes.
Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J; Wang, Liliang; Lin, Jianguo
2016-12-13
The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However, these alloys, and in particular their high-strength variants, exhibit limited formability at room temperature, and high-temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain-rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner, thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely the prediction of a material's forming limit under hot stamping conditions, and the prediction of tool life under multi-cycle loading conditions.
NASA Astrophysics Data System (ADS)
Bui, Huy Anh
The multi-particle simulation program, ITSIM version 4.0, takes advantage of the enhanced performance of the Windows 95 and NT operating systems in areas such as memory management, user friendliness, flexibility of graphics and speed, to investigate the motion of ions in the quadrupole ion trap. The objective of this program is to use computer simulations based on mathematical models to improve the performance of the ion trap mass spectrometer. The simulation program can provide assistance in understanding fundamental aspects of ion trap mass spectrometry, precede and help to direct the course of experiments, as well as having didactic value in elucidating and allowing visualization of ion behavior under different experimental conditions. The program uses the improved Euler method to calculate ion trajectories as numerical solutions to the Mathieu differential equation. This Windows version can simultaneously simulate the trajectories of ions with a virtually unlimited number of different mass-to-charge ratios and hence allows realistic mass spectra, ion kinetic energy distributions and other experimentally measurable properties to be simulated. The large number of simulated ions allows examination of (i) the offsetting effects of mutual ion repulsion and collisional cooling in an ion trap and (ii) the effects of higher order fields. Field inhomogeneities arising from exit holes, electrode misalignment, imperfect electrode surfaces or new trap geometries can be simulated with the program. The simulated data are used to obtain mass spectra from mass-selective instability scans as well as by Fourier transformation of image currents induced by coherently moving ion clouds. Complete instruments, from an ion source through the ion trap mass analyzer to a detector, can now be simulated. Applications of the simulation program are presented and discussed. Comparisons are made between the simulations and experimental data. 
Fourier transformed experiments and a novel six-electrode ion trap mass spectrometer illustrate cases in which simulations precede new experiments. Broadband non-destructive ion detection based on induced image current measurements are described in the case of a quadrupole ion trap having cylindrical geometry.
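The integrator described above applies the improved Euler (Heun) method to the Mathieu equation. A minimal one-dimensional sketch of that scheme, with generic trap parameters a and q (this illustrates the numerical method only, not the ITSIM program itself):

```python
import numpy as np

def mathieu_heun(a, q, u0, v0, xi_max, n_steps):
    """Integrate the Mathieu equation u'' + (a - 2 q cos 2xi) u = 0
    with the improved Euler (Heun) method."""
    def acc(xi, u):
        return -(a - 2.0 * q * np.cos(2.0 * xi)) * u

    xi = np.linspace(0.0, xi_max, n_steps + 1)
    h = xi[1] - xi[0]
    u = np.empty(n_steps + 1)
    v = np.empty(n_steps + 1)
    u[0], v[0] = u0, v0
    for i in range(n_steps):
        # Predictor: plain Euler step.
        up = u[i] + h * v[i]
        vp = v[i] + h * acc(xi[i], u[i])
        # Corrector: average the slopes at both ends of the step.
        u[i + 1] = u[i] + 0.5 * h * (v[i] + vp)
        v[i + 1] = v[i] + 0.5 * h * (acc(xi[i], u[i]) + acc(xi[i + 1], up))
    return xi, u

# A stable operating point (a = 0, q = 0.3 lies inside the first
# stability region): the trajectory remains bounded.
xi, u = mathieu_heun(a=0.0, q=0.3, u0=1.0, v0=0.0, xi_max=50.0, n_steps=20000)
print(float(np.abs(u).max()))
```

Choosing (a, q) outside the stability region would instead produce an exponentially growing trajectory, which is exactly the mass-selective instability exploited in the scan modes the abstract mentions.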
GPU accelerated Monte-Carlo simulation of SEM images for metrology
NASA Astrophysics Data System (ADS)
Verduin, T.; Lokhorst, S. R.; Hagen, C. W.
2016-03-01
In this work we address the computation times of numerical studies in dimensional metrology. In particular, full Monte-Carlo simulation programs for scanning electron microscopy (SEM) image acquisition are known to be notoriously slow. Our quest in reducing the computation time of SEM image simulation has led us to investigate the use of graphics processing units (GPUs) for metrology. We have succeeded in creating a full Monte-Carlo simulation program for SEM images, which runs entirely on a GPU. The physical scattering models of this GPU simulator are identical to a previous CPU-based simulator, which includes the dielectric function model for inelastic scattering and also refinements for low-voltage SEM applications. As a case study for the performance, we considered the simulated exposure of a complex feature: an isolated silicon line with rough sidewalls located on a flat silicon substrate. The surface of the rough feature is decomposed into 408 012 triangles. We have used an exposure dose of 6 mC/cm², which corresponds to 6 553 600 primary electrons on average (Poisson distributed). We repeat the simulation for various primary electron energies: 300 eV, 500 eV, 800 eV, 1 keV, 3 keV and 5 keV. At first we run the simulation on a GeForce GTX480 from NVIDIA. The very same simulation is duplicated on our CPU-based program, for which we have used an Intel Xeon X5650. Apart from statistics in the simulation, no difference is found between the CPU and GPU simulated results. The GTX480 generates the images (depending on the primary electron energy) 350 to 425 times faster than a single-threaded Intel X5650 CPU. Although this is a tremendous speedup, we actually have not reached the maximum throughput because of the limited amount of available memory on the GTX480. Nevertheless, the speedup enables the fast acquisition of simulated SEM images for metrology.
We now have the potential to investigate case studies in CD-SEM metrology, which otherwise would take unreasonable amounts of computation time.
Harik, Polina; Cuddy, Monica M; O'Donovan, Seosaimhin; Murray, Constance T; Swanson, David B; Clauser, Brian E
2009-10-01
The 2000 Institute of Medicine report on patient safety brought renewed attention to the issue of preventable medical errors, and subsequently specialty boards and the National Board of Medical Examiners were encouraged to play a role in setting expectations around safety education. This paper examines potentially dangerous actions taken by examinees during the portion of the United States Medical Licensing Examination Step 3 that is particularly well suited to evaluating lapses in physician decision making: the Computer-based Case Simulation (CCS). Descriptive statistics and a general linear modeling approach were used to analyze dangerous actions ordered by 25,283 examinees who completed the CCS for the first time between November 2006 and January 2008. More than 20% of examinees ordered at least one dangerous action with the potential to cause significant patient harm. The propensity to order dangerous actions may vary across clinical cases. The CCS format may provide a means of collecting important information about patient-care situations in which examinees may be more likely to commit dangerous actions, and about the propensity of examinees to order dangerous tests and treatments.
NUMERICAL FLOW AND TRANSPORT SIMULATIONS SUPPORTING THE SALTSTONE FACILITY PERFORMANCE ASSESSMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G.
2009-02-28
The Saltstone Disposal Facility Performance Assessment (PA) is being revised to incorporate requirements of Section 3116 of the Ronald W. Reagan National Defense Authorization Act for Fiscal Year 2005 (NDAA), and updated data and understanding of vault performance since the 1992 PA (Cook and Fowler 1992) and related Special Analyses. A hybrid approach was chosen for modeling contaminant transport from vaults and future disposal cells to exposure points. A higher resolution, largely deterministic, analysis is performed on a best-estimate Base Case scenario using the PORFLOW numerical analysis code. A few additional sensitivity cases are simulated to examine alternative scenarios and parameter settings. Stochastic analysis is performed on a simpler representation of the SDF system using the GoldSim code to estimate uncertainty and sensitivity about the Base Case. This report describes development of PORFLOW models supporting the SDF PA, and presents sample results to illustrate model behaviors and define impacts relative to key facility performance objectives. The SDF PA document, when issued, should be consulted for a comprehensive presentation of results.
An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.
Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph
2010-06-01
We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with the standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and multiple testing of individual single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma, in which SNP/phenotype combinations are identified that reach overall significance and that would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies; cases in which there are multiple phenotypes, such as expression data; and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.
Virtual reality: emerging role of simulation training in vascular access.
Davidson, Ingemar J A; Lok, Charmaine; Dolmatch, Bart; Gallieni, Maurizio; Nolen, Billy; Pittiruti, Mauro; Ross, John; Slakey, Douglas
2012-11-01
Evolving new technologies in vascular access mandate increased attention to patient safety; an often overlooked yet valuable training tool is simulation. For the end-stage renal disease patient, simulation tools are effective for all aspects of creating access for peritoneal dialysis and hemodialysis. Based on aviation principles known as crew resource management, we place equal emphasis on team and individual training to improve interactions between team members and systems, culminating in improved safety. Simulation allows for environmental control and standardized procedures, letting the trainee practice and correct mistakes without harm to patients, compared with traditional patient-based training. Vascular access simulators range from suture devices, to pressurized tunneled conduits for needle cannulation, to computer-based interventional simulators. Simulation training includes simulated case learning, root cause analysis of adverse outcomes, and continual update and refinement of concepts. Implementing effective interaction between humans and complex systems in the care of end-stage renal disease patients involves a change in institutional culture. Three concepts discussed in this article are as follows: (1) the need for user-friendly systems and technology to enhance performance, (2) the necessity for members to both train and work together as a team, and (3) the team assigned to use the system must test and practice it to a proficient level before safely using the system on patients. Copyright © 2012 Elsevier Inc. All rights reserved.
[Relevance of a driving simulator in the assessment of handicapped individuals].
Carroz, A; Comte, P-A; Nicolo, D; Dériaz, O; Vuadens, P
2008-06-01
To evaluate the value of our driving simulator in deciding whether or not to allow patients with physical and/or cognitive deficits to resume driving, and to analyze whether the medical expert's final decision is based more on the results of the driving simulator than on those of the neuropsychological examination. One hundred and twenty-three patients were evaluated with the driving simulator. Thirty-five of those with cognitive deficits also underwent a neuropsychological examination prior to the medical expert's decision on driving aptitude. In cases of uncertainty or disagreement, a driving assessment in real conditions was performed by a driving instructor. In cases of physical handicap, the medical expert's decision concurred with that of the occupational therapist. For brain-injured patients, there was a significant correlation between the neuropsychologist's opinion and that of the occupational therapist (kappa=0.33; P=0.01). However, the sensitivity and specificity were only 55 and 80%, respectively. The correlation between an occupational therapy decision based on the driving simulator and that of the medical expert was very significant (kappa=0.81; P<0.0001) and the sensitivity and specificity were 84 and 100%, respectively. In contrast, these values were lower (63 and 71%, respectively) for the correlation between the neuropsychologist's opinion and that of the medical expert. Our driving simulator enables the danger-free evaluation of driving aptitude. The results mirror an in situ assessment and are more sensitive than neuropsychological examination. In fact, the neuropsychologist's opinion is often more negative or uncertain with respect to the patient's real driving aptitude. When making a decision on a patient's driving aptitude, the medical expert is more inclined to trust the results of the driving simulator.
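The agreement statistics reported here (kappa, sensitivity, specificity) all derive from a 2x2 decision table. A minimal sketch with illustrative counts chosen only to show the computation (they roughly reproduce the reported 84% sensitivity and 100% specificity, but are not the study's data):

```python
# Agreement statistics from a hypothetical 2x2 table of fit/unfit
# decisions by two assessors (e.g. occupational therapist using the
# driving simulator vs. the medical expert). Counts are illustrative.
def agreement_stats(tp, fn, fp, tn):
    n = tp + fn + fp + tn
    po = (tp + tn) / n                                # observed agreement
    # Expected chance agreement, from the marginal totals.
    pe = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / n ** 2
    kappa = (po - pe) / (1 - pe)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return kappa, sensitivity, specificity

kappa, sens, spec = agreement_stats(tp=42, fn=8, fp=0, tn=10)
print(round(kappa, 2), sens, spec)  # prints 0.64 0.84 1.0
```

Kappa corrects the raw agreement for what two raters would agree on by chance alone, which is why it can be modest even when raw agreement is high.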
Assessing Procedural Competence: Validity Considerations.
Pugh, Debra M; Wood, Timothy J; Boulet, John R
2015-10-01
Simulation-based medical education (SBME) offers opportunities for trainees to learn how to perform procedures and to be assessed in a safe environment. However, SBME research studies often lack robust evidence to support the validity of the interpretation of the results obtained from tools used to assess trainees' skills. The purpose of this paper is to describe how a validity framework can be applied when reporting and interpreting the results of a simulation-based assessment of skills related to performing procedures. The authors discuss various sources of validity evidence as they relate to SBME. A case study is presented.
Nelson, Matthew A.; Brown, Michael J.; Halverson, Scot A.; ...
2016-07-28
Here, the Quick Urban & Industrial Complex (QUIC) atmospheric transport and dispersion modelling system was evaluated against the Joint Urban 2003 tracer-gas measurements. This was done using the wind and turbulence fields computed by the Weather Research and Forecasting (WRF) model. We compare the simulated and observed plume transport when using WRF-model-simulated wind fields and local on-site wind measurements. Degradation of the WRF-model-based plume simulations was caused by errors in the simulated wind direction and limitations in reproducing the small-scale wind-field variability. We explore two methods for importing turbulence from the WRF model simulations into the QUIC system. The first method uses parametrized turbulence profiles computed from WRF-model-computed boundary-layer similarity parameters; the second method directly imports turbulent kinetic energy from the WRF model. Using the WRF model's Mellor-Yamada-Janjic boundary-layer scheme, the parametrized turbulence profiles and the direct import of turbulent kinetic energy were found to overpredict and underpredict the observed turbulence quantities, respectively. Near-source building effects were found to propagate several km downwind. These building effects and the temporal/spatial variations in the observed wind field were often found to have a stronger influence over the lateral and vertical plume spread than the intensity of turbulence. Correcting the WRF model wind directions using a single observational location improved the performance of the WRF-model-based simulations, but using the spatially-varying flow fields generated from multiple observation profiles generally provided the best performance.
Reconstruction of Orion Engineering Development Unit (EDU) Parachute Inflation Loads
NASA Technical Reports Server (NTRS)
Ray, Eric S.
2013-01-01
The process of reconstructing inflation loads of Capsule Parachute Assembly System (CPAS) has been updated as the program transitioned to testing Engineering Development Unit (EDU) hardware. The equations used to reduce the test data have been re-derived based on the same physical assumptions made by simulations. Due to instrumentation challenges, individual parachute loads are determined from complementary accelerometer and load cell measurements. Cluster inflations are now simulated by modeling each parachute individually to better represent different inflation times and non-synchronous disreefing. The reconstruction procedure is tailored to either infinite mass or finite mass events based on measurable characteristics from the test data. Inflation parameters are determined from an automated optimization routine to reduce subjectivity. Infinite mass inflation parameters have been re-defined to avoid unrealistic interactions in Monte Carlo simulations. Sample cases demonstrate how best-fit inflation parameters are used to generate simulated drag areas and loads which favorably agree with test data.
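Determining best-fit inflation parameters from test data by an automated routine can be sketched with a generic power-law drag-area growth model; the model form and every parameter value below are illustrative assumptions, not the CPAS reconstruction itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# A generic power-law inflation model (an illustrative assumption):
# drag area grows as (t / t_fill)**n, then holds at full open.
def drag_area(t, cds_full, t_fill, n):
    return cds_full * np.clip(t / t_fill, 0.0, 1.0) ** n

# Synthetic "test data": a known parameter set plus 2% measurement noise.
t = np.linspace(0.05, 2.0, 80)
meas = drag_area(t, 60.0, 1.2, 2.5) * (1.0 + 0.02 * rng.standard_normal(t.size))

# Automated best fit over a parameter grid, standing in for the
# optimization routine the abstract mentions (full-open area held fixed).
best, best_err = None, np.inf
for t_fill in np.linspace(0.8, 1.6, 41):
    for n in np.linspace(1.5, 3.5, 41):
        err = float(np.sum((drag_area(t, 60.0, t_fill, n) - meas) ** 2))
        if err < best_err:
            best, best_err = (t_fill, n), err
print(best)
```

Replacing the exhaustive grid with a gradient-based or simplex optimizer changes the search, not the objective: minimize the misfit between the simulated drag-area history and the measured one.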
Surrogate Modeling of High-Fidelity Fracture Simulations for Real-Time Residual Strength Predictions
NASA Technical Reports Server (NTRS)
Spear, Ashley D.; Priest, Amanda R.; Veilleux, Michael G.; Ingraffea, Anthony R.; Hochhalter, Jacob D.
2011-01-01
A surrogate model methodology is described for predicting in real time the residual strength of flight structures with discrete-source damage. Starting with design of experiment, an artificial neural network is developed that takes as input discrete-source damage parameters and outputs a prediction of the structural residual strength. Target residual strength values used to train the artificial neural network are derived from 3D finite element-based fracture simulations. A residual strength test of a metallic, integrally-stiffened panel is simulated to show that crack growth and residual strength are determined more accurately in discrete-source damage cases by using an elastic-plastic fracture framework rather than a linear-elastic fracture mechanics-based method. Improving accuracy of the residual strength training data would, in turn, improve accuracy of the surrogate model. When combined, the surrogate model methodology and high-fidelity fracture simulation framework provide useful tools for adaptive flight technology.
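A surrogate of this kind is, at its core, a small regression network trained on simulation outputs. A minimal sketch using a synthetic residual-strength surface in place of the finite element fracture data; the damage parameters and target function are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the FE-derived training data: normalized
# residual strength as a smooth function of two hypothetical
# discrete-source damage parameters (crack length and crack angle).
def residual_strength(length, angle):
    return 1.0 - 0.6 * length - 0.2 * length * np.cos(angle)

X = rng.uniform([0.0, 0.0], [1.0, np.pi], size=(200, 2))
y = residual_strength(X[:, 0], X[:, 1])

# One-hidden-layer network trained by full-batch gradient descent.
W1 = 0.5 * rng.standard_normal((2, 16)); b1 = np.zeros(16)
W2 = 0.5 * rng.standard_normal(16); b2 = 0.0
lr = 0.05
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    err = h @ W2 + b2 - y               # prediction error
    # Backpropagated mean-squared-error gradients.
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    gh = np.outer(err, W2) * (1.0 - h ** 2)
    gW1 = X.T @ gh / len(y); gb1 = gh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(mse)
```

Once trained, evaluating the network is microseconds of arithmetic, which is what makes real-time residual-strength prediction feasible even though each training target required an expensive fracture simulation.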
Numerical simulation of flood barriers
NASA Astrophysics Data System (ADS)
Srb, Pavel; Petrů, Michal; Kulhavý, Petr
This paper deals with the testing and numerical simulation of flood barriers. The Czech Republic has been hit by several devastating floods in past years. These floods caused dozens of casualties, and property damage reached billions of Euros. The development of flood-control measures is therefore very important, especially for reducing the number of casualties and the amount of property damage. The aim of flood-control measures is the detention of water outside populated areas and the drainage of water from populated areas as soon as possible. For a new flood barrier design, it is very important to know its behaviour in the event of a real flood. During the development of the barrier, several standardized tests have to be carried out. Based on the results of these tests, a numerical simulation was compiled using the Abaqus software and several analyses were carried out. Based on these numerical simulations it will be possible to predict the behaviour of the barriers and thus improve their design.
Surrogate Modeling of High-Fidelity Fracture Simulations for Real-Time Residual Strength Predictions
NASA Technical Reports Server (NTRS)
Spear, Ashley D.; Priest, Amanda R.; Veilleux, Michael G.; Ingraffea, Anthony R.; Hochhalter, Jacob D.
2011-01-01
A surrogate model methodology is described for predicting, during flight, the residual strength of aircraft structures that sustain discrete-source damage. Starting with design of experiment, an artificial neural network is developed that takes as input discrete-source damage parameters and outputs a prediction of the structural residual strength. Target residual strength values used to train the artificial neural network are derived from 3D finite element-based fracture simulations. Two ductile fracture simulations are presented to show that crack growth and residual strength are determined more accurately in discrete-source damage cases by using an elastic-plastic fracture framework rather than a linear-elastic fracture mechanics-based method. Improving accuracy of the residual strength training data does, in turn, improve accuracy of the surrogate model. When combined, the surrogate model methodology and high fidelity fracture simulation framework provide useful tools for adaptive flight technology.
NASA Astrophysics Data System (ADS)
Addawe, Rizavel C.; Addawe, Joel M.; Magadia, Joselito C.
2016-10-01
Accurate forecasting of dengue cases would significantly improve epidemic prevention and control capabilities. This paper attempts to provide useful models for forecasting dengue epidemics specific to the young and adult population of Baguio City. To capture the seasonal variations in dengue incidence, this paper develops a robust modeling approach to identify and estimate seasonal autoregressive integrated moving average (SARIMA) models in the presence of additive outliers. Since least-squares estimators are not robust in the presence of outliers, we suggest a robust estimation based on winsorized and reweighted least-squares estimators. A hybrid algorithm, Differential Evolution - Simulated Annealing (DESA), is used to identify and estimate the parameters of the optimal SARIMA model. The method is applied to the monthly reported dengue cases in Baguio City, Philippines.
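Winsorization, the robustness device mentioned above, simply clamps extreme values to chosen percentiles before least-squares fitting, so a single additive outlier cannot dominate the estimate. A minimal sketch with a synthetic monthly series, not the Baguio City data:

```python
import numpy as np

def winsorize(x, lower=5.0, upper=95.0):
    """Clamp values outside the given percentiles, so additive
    outliers cannot dominate a subsequent least-squares fit."""
    lo, hi = np.percentile(x, [lower, upper])
    return np.clip(x, lo, hi)

def ar1_slope(x):
    """Crude AR(1) least-squares coefficient (illustrative only; a
    full SARIMA fit would also estimate seasonal terms)."""
    x0, x1 = x[:-1] - x.mean(), x[1:] - x.mean()
    return float(x0 @ x1 / (x0 @ x0))

# Illustrative monthly counts: a seasonal cycle, noise, and one
# additive outlier of the kind the robust estimators guard against.
rng = np.random.default_rng(3)
cases = 50 + 10 * np.sin(2 * np.pi * np.arange(48) / 12) + rng.normal(0, 2, 48)
cases[20] += 80.0  # additive outlier

w = winsorize(cases)
# The winsorized series yields a less outlier-distorted AR estimate.
print(ar1_slope(cases), ar1_slope(w))
```

Reweighted least squares takes the complementary route: instead of clamping the data, it down-weights observations with large residuals during the fit.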
Automatic physical inference with information maximizing neural networks
NASA Astrophysics Data System (ADS)
Charnock, Tom; Lavaux, Guilhem; Wandelt, Benjamin D.
2018-04-01
Compressing large data sets to a manageable number of summaries that are informative about the underlying parameters vastly simplifies both frequentist and Bayesian inference. When only simulations are available, these summaries are typically chosen heuristically, so they may inadvertently miss important information. We introduce a simulation-based machine learning technique that trains artificial neural networks to find nonlinear functionals of data that maximize Fisher information: information maximizing neural networks (IMNNs). In test cases where the posterior can be derived exactly, likelihood-free inference based on automatically derived IMNN summaries produces nearly exact posteriors, showing that these summaries are good approximations to sufficient statistics. In a series of numerical examples of increasing complexity and astrophysical relevance we show that IMNNs are robustly capable of automatically finding optimal, nonlinear summaries of the data even in cases where linear compression fails: inferring the variance of Gaussian signal in the presence of noise, inferring cosmological parameters from mock simulations of the Lyman-α forest in quasar spectra, and inferring frequency-domain parameters from LISA-like detections of gravitational waveforms. In this final case, the IMNN summary outperforms linear data compression by avoiding the introduction of spurious likelihood maxima. We anticipate that the automatic physical inference method described in this paper will be essential to obtain both accurate and precise cosmological parameter estimates from complex and large astronomical data sets, including those from LSST and Euclid.
Youngblood, Patricia; Harter, Phillip M; Srivastava, Sakti; Moffett, Shannon; Heinrichs, Wm LeRoy; Dev, Parvati
2008-01-01
Training interdisciplinary trauma teams to work effectively together using simulation technology has led to a reduction in medical errors in emergency department, operating room, and delivery room contexts. High-fidelity patient simulators (PSs), the predominant method for training healthcare teams, are expensive to develop and implement and require that trainees be present in the same place at the same time. In contrast, online computer-based simulators are more cost effective and allow simultaneous participation by students in different locations and time zones. In this pilot study, the researchers created an online virtual emergency department (Virtual ED) for team training in crisis management, and compared the effectiveness of the Virtual ED with the PS. We hypothesized that there would be no difference in learning outcomes for graduating medical students trained with each method. We used a pretest-posttest control-group experimental design in which 30 subjects were randomly assigned to either the Virtual ED or the PS system. In the Virtual ED each subject logged into the online environment and took the role of a team member. Four-person teams worked together in the Virtual ED, communicating in real time with live voice over Internet protocol, to manage computer-controlled patients who exhibited signs and symptoms of physical trauma. Each subject had the opportunity to be the team leader. The subjects' leadership behavior as demonstrated in both a pretest case and a posttest case was assessed by 3 raters, using a behaviorally anchored scale. In the PS environment, 4-person teams followed the same research protocol, using the same clinical scenarios in a Simulation Center.
Guided by the Emergency Medicine Crisis Resource Management curriculum, both the Virtual ED and the PS groups applied the basic principles of team leadership and trauma management (Advanced Trauma Life Support) to manage 6 trauma cases-a pretest case, 4 training cases, and a posttest case. The subjects in each group were assessed individually with the same simulation method that they used for the training cases. Subjects who used either the Virtual ED or the PS showed significant improvement in performance between pretest and posttest cases (P < 0.05). In addition, there was no significant difference in subjects' performance between the 2 types of simulation, suggesting that the online Virtual ED may be as effective for learning team skills as the PS, the method widely used in Simulation Centers. Data on usability and attitudes toward both simulation methods as learning tools were equally positive. This study shows the potential value of using virtual learning environments for developing medical students' and resident physicians' team leadership and crisis management skills.
2013-12-01
[List-of-figures excerpt: Figure 9, branch point occurrence for experimental and simulated cases at Rytov number 0.044; Figure 10, unwrapped phase data for experimental and simulated cases at Rytov number 0.044.]
DDS: The Dental Diagnostic Simulation System.
ERIC Educational Resources Information Center
Tira, Daniel E.
The Dental Diagnostic Simulation (DDS) System provides an alternative to simulation systems which represent diagnostic case studies of relatively limited scope. It may be used to generate simulated case studies in all of the dental specialty areas with case materials progressing through the gamut of the diagnostic process. The generation of a…
A meta-analysis of outcomes from the use of computer-simulated experiments in science education
NASA Astrophysics Data System (ADS)
Lejeune, John Van
The purpose of this study was to synthesize the findings from existing research on the effects of computer-simulated experiments on students in science education. Results from 40 reports were integrated by the process of meta-analysis to examine the effect of computer-simulated experiments and interactive videodisc simulations on student achievement and attitudes. Findings indicated significant positive differences in both low-level and high-level achievement of students who used computer-simulated experiments and interactive videodisc simulations as compared to students who used more traditional learning activities. No significant differences were found in retention or in student attitudes toward the subject or the educational method. Based on the findings of this study, computer-simulated experiments and interactive videodisc simulations should be used to enhance students' learning in science, especially in cases where traditional laboratory activities are expensive, dangerous, or impractical.
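The pooling step behind such a synthesis can be sketched generically. The following is a standard inverse-variance fixed-effect model; the study's actual effect-size coding is not given in the abstract, so the function and its inputs are illustrative.

```python
def fixed_effect_mean(effects, variances):
    """Inverse-variance weighted mean effect size (fixed-effect model):
    each study's effect is weighted by the reciprocal of its variance,
    and the pooled standard error is 1/sqrt(sum of weights)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return pooled, se
```

With equal variances this reduces to the simple mean; precise studies (small variance) dominate otherwise.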
Itô and Stratonovich integrals on compound renewal processes: the normal/Poisson case
NASA Astrophysics Data System (ADS)
Germano, Guido; Politi, Mauro; Scalas, Enrico; Schilling, René L.
2010-06-01
Continuous-time random walks, or compound renewal processes, are pure-jump stochastic processes with several applications in insurance, finance, economics and physics. Based on heuristic considerations, a definition is given for stochastic integrals driven by continuous-time random walks, which includes the Itô and Stratonovich cases. It is then shown how the definition can be used to compute these two stochastic integrals by means of Monte Carlo simulations. Our example is based on the normal compound Poisson process, which in the diffusive limit converges to the Wiener process.
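The left-point versus midpoint evaluation that distinguishes the two integrals shows up directly in a Monte Carlo sketch. Below is a minimal, illustrative version of the normal compound Poisson example for the integral of X dX; function names and default parameters are our own.

```python
import random

def compound_poisson_path(rate, T, jump_mu=0.0, jump_sigma=1.0, rng=random):
    """One path of a normal compound Poisson process on [0, T]:
    exponential waiting times, normally distributed jumps, X(0) = 0.
    Returns the (time, value) points at the jump epochs."""
    t, x = 0.0, 0.0
    path = [(0.0, 0.0)]
    while True:
        t += rng.expovariate(rate)
        if t > T:
            break
        x += rng.gauss(jump_mu, jump_sigma)
        path.append((t, x))
    return path

def ito_and_stratonovich(path):
    """Discrete sums approximating the integral of X dX along one
    pure-jump path: Ito evaluates X at the left endpoint of each jump,
    Stratonovich at the midpoint (average of pre- and post-jump values)."""
    ito = strat = 0.0
    for (t0, x0), (t1, x1) in zip(path, path[1:]):
        dx = x1 - x0
        ito += x0 * dx
        strat += 0.5 * (x0 + x1) * dx
    return ito, strat
```

The Stratonovich sum telescopes exactly to (X_T² − X_0²)/2, while the Itô sum is smaller by half the sum of squared jumps: the pure-jump analogue of the Itô correction.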
NASA Astrophysics Data System (ADS)
Léchappé, V.; Moulay, E.; Plestan, F.
2018-06-01
The stability of a prediction-based controller for linear time-invariant (LTI) systems is studied in the presence of time-varying input and output delays. The uncertain delay case is treated as well as the partial state knowledge case. The reduction method is used in order to prove the convergence of the closed-loop system including the state observer, the predictor and the plant. Explicit conditions that guarantee the closed-loop stability are given, thanks to a Lyapunov-Razumikhin analysis. Simulations illustrate the theoretical results.
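For the constant-delay, scalar, full-state case, the predictor-feedback idea in this abstract reduces to a few lines. This is a toy sketch under stated simplifications: the paper's time-varying delays, output delay, and observer are not reproduced, and all numerical values are illustrative.

```python
import math

def simulate_predictor_feedback(a=1.0, b=1.0, k=3.0, h=0.5,
                                T=6.0, dt=0.001, x0=1.0):
    """Euler simulation of the scalar input-delay plant
        x'(t) = a*x(t) + b*u(t - h)
    under predictor feedback u(t) = -k*x_p(t), where
        x_p(t) = e^{a h} x(t) + integral_{t-h}^{t} e^{a (t-s)} b u(s) ds
    predicts the state h seconds ahead (reduction / Artstein transform),
    so the closed loop behaves like x_p' = (a - b*k) x_p."""
    n_delay = int(round(h / dt))
    u_hist = [0.0] * n_delay          # stored inputs u(t-h), ..., u(t-dt)
    # rectangle-rule quadrature weights e^{a*age} * b * dt, oldest first
    weights = [math.exp(a * (h - j * dt)) * b * dt for j in range(n_delay)]
    x = x0
    for _ in range(int(round(T / dt))):
        integral = sum(w * u for w, u in zip(weights, u_hist))
        x_p = math.exp(a * h) * x + integral     # predicted state
        u = -k * x_p                             # predictor feedback
        x += dt * (a * x + b * u_hist[0])        # plant sees delayed input
        u_hist = u_hist[1:] + [u]
    return x
```

Even though the open-loop plant (a = 1) is unstable and the raw feedback is delayed by h, the predicted-state loop x_p' = (a − bk)x_p = −2 x_p drives the state to zero.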
NASA Astrophysics Data System (ADS)
Pillai, D.; Gerbig, C.; Kretschmer, R.; Beck, V.; Karstens, U.; Neininger, B.; Heimann, M.
2012-10-01
We present simulations of atmospheric CO2 concentrations provided by two modeling systems, run at high spatial resolution: the Eulerian-based Weather Research Forecasting (WRF) model and the Lagrangian-based Stochastic Time-Inverted Lagrangian Transport (STILT) model, both of which are coupled to a diagnostic biospheric model, the Vegetation Photosynthesis and Respiration Model (VPRM). The consistency of the simulations is assessed with special attention paid to the details of horizontal as well as vertical transport and mixing of CO2 concentrations in the atmosphere. The dependence of model mismatch (Eulerian vs. Lagrangian) on models' spatial resolution is further investigated. A case study using airborne measurements during which the two models showed large deviations from each other is analyzed in detail as an extreme case. Using aircraft observations and pulse release simulations, we identified differences in the representation of details in the interaction between turbulent mixing and advection through wind shear as the main cause of discrepancies between WRF and STILT transport at spatial resolutions of 2 and 6 km. Based on observations and inter-model comparisons of atmospheric CO2 concentrations, we show that a refinement of the parameterization of turbulent velocity variance and Lagrangian time-scale in STILT is needed to achieve a better match between the Eulerian and the Lagrangian transport at such a high spatial resolution (e.g. 2 and 6 km). Nevertheless, the inter-model differences in simulated CO2 time series for a tall tower observatory at Ochsenkopf in Germany are about a factor of two smaller than the model-data mismatch and about a factor of three smaller than the mismatch between the current global model simulations and the data.
A SEIR model for transmission of tuberculosis
NASA Astrophysics Data System (ADS)
Side, Syafruddin; Mulbar, Usman; Sidjara, Sahlan; Sanusi, Wahidah
2017-04-01
This paper describes tuberculosis (TB) transmission using a Susceptible-Exposed-Infected-Recovered (SEIR) model. The SEIR model for TB transmission was analyzed and simulated using data on the number of TB cases in South Sulawesi. The results showed that the basic reproduction ratio of the SEIR model satisfies R0 ≤ 1, meaning that the status of TB disease in South Sulawesi is not yet at an alarming stage; however, simulation results obtained with MATLAB predict that the number of infection cases will continue to increase, so the government needs to take preventive measures to control and reduce the number of TB infections in South Sulawesi.
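The SEIR dynamics described above can be sketched with a simple forward-Euler integration. The rates below are hypothetical placeholders (chosen so that R0 = beta/gamma = 0.5 ≤ 1, matching the abstract's regime), not the fitted South Sulawesi values.

```python
def seir(beta, sigma, gamma, s0, e0, i0, r0_, days, dt=0.1):
    """Forward-Euler integration of the classic SEIR model on
    population fractions (closed population, no births or deaths):
        S -> E at rate beta*S*I, E -> I at rate sigma*E,
        I -> R at rate gamma*I."""
    s, e, i, r = s0, e0, i0, r0_
    for _ in range(int(days / dt)):
        new_exposed = beta * s * i
        new_infected = sigma * e
        new_recovered = gamma * i
        s -= dt * new_exposed
        e += dt * (new_exposed - new_infected)
        i += dt * (new_infected - new_recovered)
        r += dt * new_recovered
    return s, e, i, r

# Hypothetical per-day rates; without vital dynamics R0 = beta / gamma.
beta, sigma, gamma = 0.02, 0.1, 0.04   # R0 = 0.5 < 1: infection declines
```

Because the total population fraction is conserved term by term, s + e + i + r stays at 1 up to floating-point rounding, which is a useful sanity check on any such implementation.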
Simulation of Benchmark Cases with the Terminal Area Simulation System (TASS)
NASA Technical Reports Server (NTRS)
Ahmad, Nash'at; Proctor, Fred
2011-01-01
The hydrodynamic core of the Terminal Area Simulation System (TASS) is evaluated against different benchmark cases. In the absence of closed form solutions for the equations governing atmospheric flows, the models are usually evaluated against idealized test cases. Over the years, various authors have suggested a suite of these idealized cases which have become standards for testing and evaluating the dynamics and thermodynamics of atmospheric flow models. In this paper, simulations of three such cases are described. In addition, the TASS model is evaluated against a test case that uses an exact solution of the Navier-Stokes equations. The TASS results are compared against previously reported simulations of these benchmark cases in the literature. It is demonstrated that the TASS model is highly accurate, stable and robust.
Long-term hydrological simulation based on the Soil Conservation Service curve number
NASA Astrophysics Data System (ADS)
Mishra, Surendra Kumar; Singh, Vijay P.
2004-05-01
Presenting a critical review of daily flow simulation models based on the Soil Conservation Service curve number (SCS-CN), this paper introduces a more versatile model based on the modified SCS-CN method, which specializes into seven cases. The proposed model was applied to the Hemavati watershed (area = 600 km2) in India and was found to yield satisfactory results in both calibration and validation. The model conserved monthly and annual runoff volumes satisfactorily. A sensitivity analysis of the model parameters was performed, including the effect of variation in storm duration. Finally, to investigate the model components, all seven variants of the modified version were tested for their suitability.
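The SCS-CN relation at the heart of this family of models fits in a few lines. The sketch below implements only the classic event formula with the conventional initial-abstraction ratio λ = 0.2; the paper's modified seven-case daily-flow model is not reproduced.

```python
def scs_cn_runoff(p_mm, cn, lam=0.2):
    """Event runoff depth (mm) from the classic SCS-CN relation:
        S  = 25400/CN - 254     potential maximum retention (mm)
        Ia = lam * S            initial abstraction
        Q  = (P - Ia)^2 / (P - Ia + S)   for P > Ia, else 0."""
    s = 25400.0 / cn - 254.0    # CN is dimensionless, in (0, 100]
    ia = lam * s
    if p_mm <= ia:
        return 0.0              # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

Two limiting behaviors make good checks: CN = 100 (impervious surface) converts all rainfall to runoff, and small storms on low-CN soils produce none.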
A Multi-Institutional Simulation Boot Camp for Pediatric Cardiac Critical Care Nurse Practitioners.
Brown, Kristen M; Mudd, Shawna S; Hunt, Elizabeth A; Perretta, Julianne S; Shilkofski, Nicole A; Diddle, J Wesley; Yurasek, Gregory; Bembea, Melania; Duval-Arnould, Jordan; Nelson McMillan, Kristen
2018-06-01
Assess the effect of a simulation "boot camp" on the ability of pediatric nurse practitioners to identify and treat a low cardiac output state in postoperative patients with congenital heart disease. Additionally, assess the pediatric nurse practitioners' confidence and satisfaction with simulation training. Prospective pre/post interventional pilot study. University simulation center. Thirty acute care pediatric nurse practitioners from 13 academic medical centers in North America. We conducted an expert opinion survey to guide curriculum development. The curriculum included didactic sessions, case studies, and high-fidelity simulation, based on high-complexity cases, congenital heart disease benchmark procedures, and a mix of lesion-specific postoperative complications. To cover multiple, high-complexity cases, we implemented the Rapid Cycle Deliberate Practice method of teaching for selected simulation scenarios using an expert-driven checklist. Knowledge was assessed with a pre-/posttest format (maximum score, 100%). A paired-sample t test showed a statistically significant increase in the posttest scores (mean [SD], pretest 36.8% [14.3%] vs posttest 56.0% [15.8%]; p < 0.001). Time to recognize and treat an acute deterioration was evaluated through the use of selected high-fidelity simulation. Median "time to task" improved overall across these scenarios. There was a significant increase in the proportion of clinically time-sensitive tasks completed within 5 minutes (pre, 60% [30/50] vs post, 86% [43/50]; p = 0.003). Confidence and satisfaction were evaluated with a validated tool ("Student Satisfaction and Self-Confidence in Learning"). Using a five-point Likert scale, the participants reported a high level of satisfaction (4.7 ± 0.30) and performance confidence (4.8 ± 0.31) with the simulation experience.
Although simulation boot camps have been used effectively for training physicians and educating critical care providers, this was a novel approach to educating pediatric nurse practitioners from multiple academic centers. The course improved overall knowledge, and the pediatric nurse practitioners reported satisfaction and confidence in the simulation experience.
NASA Astrophysics Data System (ADS)
Buzyurkin, A. E.; Gladky, I. L.; Kraus, E. I.
2015-03-01
Stress-strain curves of dynamic loading of VT6, OT4, and OT4-0 titanium-based alloys are constructed on the basis of experimental data, and the Johnson-Cook model parameters are determined. Results of LS-DYNA simulations of the processes of deformation and fracture of the fan casing after its high-velocity impact with a fan blade simulator are presented.
Continuation through Singularity of Continuum Multiphase Algorithms
2013-03-01
capturing simulation of two-phase flow; a singularity-free mesoscopic simulation that bridges atomic and continuum scales; and a physics-based closure...for free-surface flow. The full two-way coupling was found to be irrelevant to the overall objective of developing a closure model to allow...The method can be used for the study of single-species free-surface flow, for instance, in the case of pinch-off of a liquid thread during the
BACnet and Analog/Digital Interfaces of the Building Controls Virtual Testbed
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nouidui, Thierry Stephane; Wetter, Michael; Li, Zhengwei
2011-11-01
This paper gives an overview of recent developments in the Building Controls Virtual Test Bed (BCVTB), a framework for co-simulation and hardware-in-the-loop. First, a general overview of the BCVTB is presented. Second, we describe the BACnet interface, a link which has been implemented to couple BACnet devices to the BCVTB. We present a case study where the interface was used to couple a whole building simulation program to a building control system to assess in real-time the performance of a real building. Third, we present the ADInterfaceMCC, an analog/digital interface that allows a USB-based analog/digital converter to be linked to the BCVTB. In a case study, we show how the link was used to couple the analog/digital converter to a building simulation model for local loop control.
NASA Astrophysics Data System (ADS)
Luo, D.; Guan, Z.; Wang, C.; Yue, L.; Peng, L.
2017-06-01
Distribution of different parts to assembly lines is important for companies seeking to improve production. This study investigates the optimization of the distribution method for the logistics system of a third-party logistics company that provides professional services to an automobile manufacturing case company in China. It examines the leveling of material distribution and the unloading platform of the automobile logistics enterprise, and proposes a logistics distribution strategy, a material classification method, and a logistics scheduling approach. Moreover, the simulation tool Simio is applied to the assembly line logistics system, helping to find and validate an optimized distribution scheme through simulation experiments. Experimental results indicate that the proposed scheme solves the material leveling and balancing problem and relieves congestion at the unloading platform more efficiently than the original method employed by the case company.
Tracking the global maximum power point of PV arrays under partial shading conditions
NASA Astrophysics Data System (ADS)
Fennich, Meryem
This thesis presents the theoretical and simulation studies of the global maximum power point tracking (MPPT) for photovoltaic systems under partial shading. The main goal is to track the maximum power point of the photovoltaic module so that the maximum possible power can be extracted from the photovoltaic panels. When several panels are connected in series with some of them shaded partially either due to clouds or shadows from neighboring buildings, several local maxima appear in the power vs. voltage curve. A power increment based MPPT algorithm is effective in identifying the global maximum from the several local maxima. Several existing MPPT algorithms are explored and the state-of-the-art power increment method is simulated and tested for various partial shading conditions. The current-voltage and power-voltage characteristics of the PV model are studied under different partial shading conditions, along with five different cases demonstrating how the MPPT algorithm performs when shading switches from one state to another. Each case is supplemented with simulation results. The method of tracking the Global MPP is based on controlling the DC-DC converter connected to the output of the PV array. A complete system simulation including the PV array, the direct current to direct current (DC-DC) converter and the MPPT is presented and tested using MATLAB software. The simulation results show that the MPPT algorithm works very well with the buck converter, while the boost converter needs further changes and implementation.
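The global-search idea can be sketched as follows. This is an illustration of the multiple-local-maxima problem only, using a coarse voltage sweep and a toy two-peak curve standing in for a partially shaded array; it is not the thesis's power-increment algorithm or its DC-DC converter control loop.

```python
def global_mpp(pv_current, v_max=100.0, v_step=1.0):
    """Coarse full-range scan of the P-V curve: sample power across the
    whole voltage range and return the sample with the highest power.
    This finds the *global* peak that a plain hill-climbing MPPT can
    miss when partial shading creates several local maxima."""
    best_v, best_p = 0.0, 0.0
    for n in range(int(v_max / v_step) + 1):
        v = n * v_step
        p = v * pv_current(v)            # power = voltage * array current
        if p > best_p:
            best_v, best_p = v, p
    return best_v, best_p

def shaded_array_current(v):
    """Toy two-step current curve mimicking one shaded substring:
    two local power peaks, ~90 W near 30 V and ~80 W near 80 V."""
    if v <= 30.0:
        return 3.0
    if v <= 80.0:
        return 1.0
    return 0.0
```

A hill climber started on the right-hand slope would settle at the 80 V local peak; the full scan picks the 30 V global peak instead. In practice the scan would be followed by a fine local tracker around the winning voltage.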
Exploring Discretization Error in Simulation-Based Aerodynamic Databases
NASA Technical Reports Server (NTRS)
Aftosmis, Michael J.; Nemec, Marian
2010-01-01
This work examines the level of discretization error in simulation-based aerodynamic databases and introduces strategies for error control. Simulations are performed using a parallel, multi-level Euler solver on embedded-boundary Cartesian meshes. Discretization errors in user-selected outputs are estimated using the method of adjoint-weighted residuals, and we use adaptive mesh refinement to reduce these errors to specified tolerances. Using this framework, we examine the behavior of discretization error throughout a token database computed for a NACA 0012 airfoil consisting of 120 cases. We compare the cost and accuracy of two approaches for aerodynamic database generation. In the first approach, mesh adaptation is used to compute all cases in the database to a prescribed level of accuracy. The second approach conducts all simulations using the same computational mesh without adaptation. We quantitatively assess the error landscape and computational costs in both databases. This investigation highlights sensitivities of the database under a variety of conditions. The presence of transonic shocks or the stiffness in the governing equations near the incompressible limit are shown to dramatically increase discretization error requiring additional mesh resolution to control. Results show that such pathologies lead to error levels that vary by over a factor of 40 when using a fixed mesh throughout the database. Alternatively, controlling this sensitivity through mesh adaptation leads to mesh sizes which span two orders of magnitude. We propose strategies to minimize simulation cost in sensitive regions and discuss the role of error-estimation in database quality.
Pilot study on effectiveness of simulation for surgical robot design using manipulability.
Kawamura, Kazuya; Seno, Hiroto; Kobayashi, Yo; Fujie, Masakatsu G
2011-01-01
Medical technology has advanced with the introduction of robot technology, which facilitates some traditional medical treatments that previously were very difficult. However, at present, surgical robots are used in limited medical domains because these robots are designed using only data obtained from adult patients and are not suitable for targets having different properties, such as children. Therefore, surgical robots are required to perform specific functions for each clinical case. In addition, the robots must exhibit sufficiently high movability and operability for each case. In the present study, we focused on evaluating the mechanism and configuration of a surgical robot through a simulation based on movability and operability during an operation. We previously proposed the development of a simulator system that reproduces the conditions of a robot and a target in a virtual patient body to evaluate the operability of the surgeon during an operation. In the present paper, we describe a simple experiment to verify the condition of the surgical-assisting robot during an operation. In this experiment, an operation imitating a suturing motion was carried out in a virtual workspace, and the surgical robot was evaluated based on manipulability as an indicator of movability. As a result, it was confirmed that the robot operated with low manipulability in the left-side manipulator during suturing. This simulation system can identify poorly movable configurations of a robot before an actual robot is developed. Our results show the effectiveness of the proposed simulation system.
Spatial heterogeneity of leaf area index across scales from simulation and remote sensing
NASA Astrophysics Data System (ADS)
Reichenau, Tim G.; Korres, Wolfgang; Montzka, Carsten; Schneider, Karl
2016-04-01
Leaf area index (LAI, single sided leaf area per ground area) influences mass and energy exchange of vegetated surfaces. Therefore LAI is an input variable for many land surface schemes of coupled large scale models, which do not simulate LAI. Since these models typically run on rather coarse resolution grids, LAI is often inferred from coarse resolution remote sensing. However, especially in agriculturally used areas, a grid cell of these products often covers more than a single land-use. In that case, the given LAI does not apply to any single land-use. Therefore, the overall spatial heterogeneity in these datasets differs from that on resolutions high enough to distinguish areas with differing land-use. Detailed process-based plant growth models simulate LAI for separate plant functional types or specific species. However, limited availability of observations causes reduced spatial heterogeneity of model input data (soil, weather, land-use). Since LAI is strongly heterogeneous in space and time and since processes depend on LAI in a nonlinear way, a correct representation of LAI spatial heterogeneity is also desirable on coarse resolutions. The current study assesses this issue by comparing the spatial heterogeneity of LAI from remote sensing (RapidEye) and process-based simulations (DANUBIA simulation system) across scales. Spatial heterogeneity is assessed by analyzing LAI frequency distributions (spatial variability) and semivariograms (spatial structure). The test case is the arable land in the fertile loess plain of the Rur catchment near the Germany-Netherlands border.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yunyan; Klein, Stephen A.; Fan, Jiwen
2017-09-19
Based on long-term observations by the Atmospheric Radiation Measurement program at its Southern Great Plains site, a new composite case of continental shallow cumulus (ShCu) convection is constructed for large-eddy simulations (LES) and single-column models. The case represents a typical daytime nonprecipitating ShCu whose formation and dissipation are driven by the local atmospheric conditions and land surface forcing and are not influenced by synoptic weather events. The case includes early morning initial profiles of temperature and moisture with a residual layer; diurnally varying sensible and latent heat fluxes, which represent a domain average over different land surface types; simplified large-scale horizontal advective tendencies and subsidence; and horizontal winds with prevailing direction and average speed. Observed composite cloud statistics are provided for model evaluation. The observed diurnal cycle is well reproduced by LES; however, the cloud amount, liquid water path, and shortwave radiative effect are generally underestimated. LES are compared between simulations with an all-or-nothing bulk microphysics and a spectral bin microphysics. The latter shows improved agreement with observations in the total cloud cover and the amount of clouds with depths greater than 300 m. When compared with radar retrievals of in-cloud air motion, LES produce comparable downdraft vertical velocities, but a larger updraft area, velocity, and updraft mass flux. Both observations and LES show a significantly larger in-cloud downdraft fraction and downdraft mass flux than marine ShCu.