Sample records for baseline experiment simulator

  1. Flight Technical Error Analysis of the SATS Higher Volume Operations Simulation and Flight Experiments

    NASA Technical Reports Server (NTRS)

    Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.

    2005-01-01

    This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept for normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating the SATS HVO concept is viable and acceptable to low-time instrument-rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on the FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirmed the utility of the simulation platform for comparative Human-in-the-Loop (HITL) studies of SATS HVO and baseline operations.

  2. Impact of an Advanced Cardiac Life Support Simulation Laboratory Experience on Pharmacy Student Confidence and Knowledge.

    PubMed

    Maxwell, Whitney D; Mohorn, Phillip L; Haney, Jason S; Phillips, Cynthia M; Lu, Z Kevin; Clark, Kimberly; Corboy, Alex; Ragucci, Kelly R

    2016-10-25

    Objective. To assess the impact of an advanced cardiac life support (ACLS) simulation on pharmacy student confidence and knowledge. Design. Third-year pharmacy students participated in a simulation experience that consisted of team roles training, high-fidelity ACLS simulations, and debriefing. Students completed a pre/postsimulation confidence and knowledge assessment. Assessment. Overall, student knowledge assessment scores and student confidence scores improved significantly. Student confidence and knowledge changes from baseline were not significantly correlated. Conversely, a significant, weak positive correlation between presimulation studying and both presimulation confidence and presimulation knowledge was discovered. Conclusions. Overall, student confidence and knowledge assessment scores in ACLS significantly improved from baseline; however, student confidence and knowledge were not significantly correlated.

  3. Design of experiment for earth rotation and baseline parameter determination from very long baseline interferometry

    NASA Technical Reports Server (NTRS)

    Dermanis, A.

    1977-01-01

    The possibility of recovering earth rotation and network geometry (baseline) parameters is emphasized. The simulated numerical experiments are set up in an environment where station coordinates vary with respect to inertial space according to a simulated earth rotation model similar to the actual but unknown rotation of the earth. The basic technique of VLBI and its mathematical model are presented. The chosen parametrization of earth rotation is described and the resulting model is linearized. A simple analysis of the geometry of the observations leads to some useful hints on achieving maximum sensitivity of the observations with respect to the parameters considered. The basic philosophy for the simulation of data and their analysis through standard least squares adjustment techniques is presented. A number of characteristic network designs based on present and candidate station locations are chosen. The results of the simulations for each design are presented together with a summary of the conclusions.

  4. Crip for a day: The unintended negative consequences of disability simulations.

    PubMed

    Nario-Redmond, Michelle R; Gospodinov, Dobromir; Cobb, Angela

    2017-08-01

    To investigate the impact of disability simulations on mood, self-ascribed disability stereotypes, attitudes about interacting with disabled individuals, and behavioral intentions for improving campus accessibility. Experiment 1 evaluated disability-awareness simulations by randomly assigning undergraduates (N = 60) with and without disabilities to stations simulating either dyslexia, hearing or mobility impairments. Experiment 2 extended the field study into the lab where undergraduates (N = 50) with and without disabilities each completed low vision, hearing impairment, and dyslexia simulations. Both studies incorporated pretest-posttest measures of mood, self-ascribed disability stereotypes, and attitudinal measures. In both experiments, disability simulations made participants feel more confused, embarrassed, helpless, and more vulnerable to becoming disabled themselves compared to baseline. Following the simulations, empathetic concern (warmth) toward disabled people increased in both studies, but attitudes about interacting did not improve. In Experiment 1, postsimulation anxiety, embarrassment, and helplessness were highest for those who used wheelchairs or simulated dyslexia. In Experiment 2, participants judged themselves less competent, expressed more pity, expressed more interaction discomfort, and were not more willing to interview disabled students for an accessibility project following the simulations compared to baseline. In addition, Experiment 2 found frustration, guilt, anxiety, and depression were most pronounced among those who interacted with disabled people less than once per month. Simulating disabilities promotes distress and fails to improve attitudes toward disabled people, undermining efforts to improve integration even while participants report more empathetic concern and "understanding of what the disability experience is like." (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Numerical Modeling of Flow Control in a Boundary-Layer-Ingesting Offset Inlet Diffuser at Transonic Mach Numbers

    NASA Technical Reports Server (NTRS)

    Allan, Brian G.; Owens, Lewis R.

    2006-01-01

    This paper investigates the validation of the NASA-developed Reynolds-averaged Navier-Stokes (RANS) flow solver, OVERFLOW, for a boundary-layer-ingesting (BLI) offset (S-shaped) inlet in transonic flow with passive and active flow control devices as well as a baseline case. Numerical simulations are compared to wind tunnel results of a BLI inlet experiment conducted at the NASA Langley 0.3-Meter Transonic Cryogenic Tunnel. Comparisons of inlet flow distortion, pressure recovery, and inlet wall pressures are performed. The numerical simulations are compared to the BLI inlet data at a free-stream Mach number of 0.85 and a Reynolds number of approximately 2 million based on the fan-face diameter. Numerical simulations with and without tunnel walls are performed, quantifying tunnel wall effects on the BLI inlet flow. A comparison is made between the numerical simulations and the BLI inlet experiment for the baseline and VG vane cases at various inlet mass flow rates. A comparison is also made to a BLI inlet jet configuration for varying actuator mass flow rates at a fixed inlet mass flow rate. Overall, the numerical simulations predicted the baseline circumferential flow distortion, DPCPavg, very well within the designed operating range of the BLI inlet. A comparison of the average total pressure recovery showed that the simulations were able to predict the trends but were offset 0.01 below the experimental levels. Numerical simulations of the baseline inlet flow also showed good agreement with the experimental inlet centerline surface pressures. The vane case showed that the CFD predicted the correct trends in the circumferential distortion levels for varying inlet mass flow but had a distortion level that was nearly twice as large as the experiment. Comparison to circumferential distortion measurements for a 15 deg clocked 40-probe rake indicated that the circumferential distortion levels are very sensitive to the symmetry of the flow and that a misalignment of the vanes in the experiment could have resulted in this difference. The numerical simulations of the BLI inlet with jets showed good agreement with the circumferential inlet distortion levels for a range of jet actuator mass flow ratios at a fixed inlet mass flow rate. The CFD simulations for the jet case also predicted an average total pressure recovery 0.01 lower than the experiment, as was seen in the baseline. Comparisons of the flow features for the jet cases revealed that the CFD predicted a much larger vortex at the engine fan-face when compared to the experiment.
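    For readers unfamiliar with the metric, DPCPavg denotes a ring-averaged circumferential total-pressure distortion descriptor at the fan face. The form sketched below is a common SAE ARP1420-style definition and is an assumption here; the paper's exact definition may differ.

```latex
% Assumed circumferential distortion intensity for rake ring i, and its face average:
\[
  \mathrm{DPCP}_i \;=\; \frac{\overline{P}_{t,i} - \overline{P}_{t,i}^{\,\mathrm{low}}}{\overline{P}_{t,i}},
  \qquad
  \mathrm{DPCP}_{\mathrm{avg}} \;=\; \frac{1}{N_{\mathrm{rings}}} \sum_{i=1}^{N_{\mathrm{rings}}} \mathrm{DPCP}_i
\]
% \overline{P}_{t,i}: ring-average total pressure;
% \overline{P}_{t,i}^{low}: average total pressure over the below-ring-average (low-pressure) sector.
```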

  6. The Impact of Baseline Trend Control on Visual Analysis of Single-Case Data

    ERIC Educational Resources Information Center

    Mercer, Sterett H.; Sterling, Heather E.

    2012-01-01

    The impact of baseline trend control on visual analyses of AB intervention graphs was examined with simulated data at various values of baseline trend, autocorrelation, and effect size. Participants included 202 undergraduate students with minimal training in visual analysis and 10 graduate students and faculty with more training and experience in…

  7. Effect of computer game playing on baseline laparoscopic simulator skills.

    PubMed

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and a relationship between computer game playing and baseline performance on laparoscopic simulators has not been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. The setting was a local high school in Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.

  8. Skylab Medical Experiments Altitude Test (SMEAT)

    NASA Technical Reports Server (NTRS)

    Johnston, R. S. (Compiler)

    1973-01-01

    The Skylab 56-day environment simulation test provided baseline biomedical data on medical experiments to be included in the Skylab program. Also identified are problems in operating life support systems and medical equipment.

  9. The benefits of being a video gamer in laparoscopic surgery.

    PubMed

    Sammut, Matthew; Sammut, Mark; Andrejevic, Predrag

    2017-09-01

    Video games are mainly considered to be of entertainment value in our society. Laparoscopic surgery and video games are activities similarly requiring eye-hand and visual-spatial skills. Previous studies have not conclusively shown a positive correlation between video game experience and improved ability to accomplish visual-spatial tasks in laparoscopic surgery. This study was an attempt to investigate this relationship. The aim of the study was to investigate whether previous video gaming experience affects the baseline performance on a laparoscopic simulator trainer. Newly qualified medical officers with minimal experience in laparoscopic surgery were invited to participate in the study and assigned to the following groups: gamers (n = 20) and non-gamers (n = 20). Analysis included participants' demographic data and baseline video gaming experience. Laparoscopic skills were assessed using a laparoscopic simulator trainer. There were no significant demographic differences between the two groups. Each participant performed three laparoscopic tasks and mean scores between the two groups were compared. The gamer group had statistically significant better results in maintaining the laparoscopic camera horizon ± 15° (p value = 0.009), in the complex ball manipulation accuracy rates (p value = 0.024) and completed the complex laparoscopic simulator task in a significantly shorter time period (p value = 0.001). Although prior video gaming experience correlated with better results, there were no significant differences for camera accuracy rates (p value = 0.074) and in a two-handed laparoscopic exercise task accuracy rates (p value = 0.092). The results show that previous video-gaming experience improved the baseline performance in laparoscopic simulator skills. Copyright © 2017 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.

  10. Computational Fluid Dynamics (CFD) investigation onto passenger car disk brake design

    NASA Astrophysics Data System (ADS)

    Munisamy, Kannan M.; Kanasan Moorthy, Shangkari K.

    2013-06-01

    The aim of this study is to investigate the flow and heat transfer in ventilated disc brakes using Computational Fluid Dynamics (CFD). A NACA-series blade is designed for the ventilated disc brake, and its cooling characteristics are compared to the baseline design. The ventilated disc brakes are simulated using the commercial CFD software FLUENT™, with a simulation configuration obtained from experimental data. The NACA-series blade design shows improvements in Nusselt number compared to the baseline design.

  11. Effect of suspension kinematic on 14 DOF vehicle model

    NASA Astrophysics Data System (ADS)

    Wongpattananukul, T.; Chantharasenawong, C.

    2017-12-01

    Computer simulations play a major role in shaping modern science and engineering. They reduce time and resource consumption in new studies and designs. Vehicle simulations have been studied extensively to obtain a vehicle model suitable for minimum-lap-time solutions. Simulation accuracy depends on the ability of these models to represent the real phenomenon. Vehicle models with 7 degrees of freedom (DOF), 10 DOF, and 14 DOF are normally used in optimal control to solve for minimum lap time. However, suspension kinematics is always neglected in these models. Suspension kinematics is defined as wheel movement with respect to the vehicle body. Tire forces are expressed as a function of wheel slip and wheel position. Therefore, the suspension kinematic relation is appended to the 14 DOF vehicle model to investigate its effect on the accuracy of the simulated trajectory. The classical 14 DOF vehicle model is chosen as the baseline model. Experimental data collected from test runs of a Formula Student style car serve as the baseline for simulation and for comparison between the baseline model and the model with suspension kinematics. Results show that in a single long turn there is an accumulated trajectory error in the baseline model compared to the model with suspension kinematics, while in short alternating turns the trajectory error is much smaller. These results show that suspension kinematics affects the trajectory simulation of the vehicle, so an optimal control scheme based on the baseline model will be correspondingly inaccurate.

  12. Numerical Modeling of Flow Control in a Boundary-Layer-Ingesting Offset Inlet Diffuser at Transonic Mach Numbers

    NASA Technical Reports Server (NTRS)

    Allan, Brian G.; Owens, Lewis R.

    2006-01-01

    This paper investigates the validation of a NASA-developed Reynolds-averaged Navier-Stokes (RANS) flow solver, OVERFLOW, for a boundary-layer-ingesting (BLI) offset (S-shaped) inlet in transonic flow with passive and active flow control devices as well as the baseline case. Numerical simulations are compared to wind tunnel results of a BLI inlet experiment conducted at the NASA Langley 0.3-Meter Transonic Cryogenic Tunnel. Comparisons of inlet flow distortion, pressure recovery, and inlet wall pressures are performed. The numerical simulations are compared to the BLI inlet data at a freestream Mach number of 0.85 and a Reynolds number of approximately 2 million based on the fan-face diameter. Numerical simulations with and without wind tunnel walls are performed, quantifying the effects of the tunnel walls on the BLI inlet flow measurements. The wind tunnel test evaluated several different combinations of jet locations and mass flow rates as well as a vortex generator (VG) vane case. The numerical simulations are performed for a single jet configuration at varying actuator mass flow rates and a fixed inlet mass flow condition. Validation of the numerical simulations for the VG vane case is also performed for varying inlet mass flow rates. Overall, the numerical simulations predicted the baseline circumferential flow distortion, DPCPavg, very well for comparisons made within the designed operating range of the BLI inlet. However, the CFD simulations predicted a total pressure recovery that was 0.01 lower than the experiment. Numerical simulations of the baseline inlet flow also showed good agreement with the experimental inlet centerline surface pressures. The vane case showed that the CFD predicted the correct trends in the circumferential distortion for varying inlet mass flow but had a distortion level that was nearly twice as large as the experiment. Comparison to circumferential distortion measurements for a 15 deg clocked 40-probe rake indicated that the circumferential distortion levels are very sensitive to the symmetry of the flow and that a misalignment of the vanes in the experiment could have resulted in this difference. The numerical simulations of the BLI inlet with jets showed good agreement with the circumferential inlet distortion levels for a range of jet actuator mass flow ratios at a fixed inlet mass flow rate. The CFD simulations for the jet case also predicted an average total pressure recovery that was 0.01 lower than the experiment, as was seen in the baseline. Comparison of the flow features for the jet case revealed that the CFD predicted a much larger vortex at the engine fan-face when compared to the experiment.

  13. Gamma ray observatory dynamics simulator in Ada (GRODY)

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This experiment involved the parallel development of dynamics simulators for the Gamma Ray Observatory in both FORTRAN and Ada for the purpose of evaluating the applicability of Ada to the NASA/Goddard Space Flight Center's flight dynamics environment. The experiment successfully demonstrated that Ada is a viable, valuable technology for use in this environment. In addition to building a simulator, the Ada team evaluated training approaches, developed an Ada methodology appropriate to the flight dynamics environment, and established a baseline for evaluating future Ada projects.

  14. GLoBES: General Long Baseline Experiment Simulator

    NASA Astrophysics Data System (ADS)

    Huber, Patrick; Kopp, Joachim; Lindner, Manfred; Rolinec, Mark; Winter, Walter

    2007-09-01

    GLoBES (General Long Baseline Experiment Simulator) is a flexible software package to simulate neutrino oscillation long-baseline and reactor experiments. On the one hand, it contains a comprehensive abstract experiment definition language (AEDL), which allows one to describe most classes of long-baseline experiments at an abstract level. On the other hand, it provides a C library to process the experiment information in order to obtain oscillation probabilities, rate vectors, and Δχ²-values. Currently, GLoBES is available for GNU/Linux. Since the source code is included, a port to other operating systems is in principle possible. GLoBES is an open source code that has previously been described in Computer Physics Communications 167 (2005) 195 and in Ref. [7]. The source code and a comprehensive User Manual for GLoBES v3.0.8 are now available from the CPC Program Library as described in the Program Summary below. The home of GLoBES is http://www.mpi-hd.mpg.de/~globes/.
    Program summary
    Program title: GLoBES version 3.0.8
    Catalogue identifier: ADZI_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZI_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 145 295
    No. of bytes in distributed program, including test data, etc.: 1 811 892
    Distribution format: tar.gz
    Programming language: C
    Computer: GLoBES builds and installs on 32-bit and 64-bit Linux systems
    Operating system: 32-bit or 64-bit Linux
    RAM: Typically a few MB
    Classification: 11.1, 11.7, 11.10
    External routines: GSL—The GNU Scientific Library, www.gnu.org/software/gsl/
    Nature of problem: Neutrino oscillations are now established as the leading flavor transition mechanism for neutrinos. In a long history of many experiments, see, e.g., [1], two oscillation frequencies have been identified: the fast atmospheric and the slow solar oscillations, which are driven by the respective mass squared differences. In addition, there could be interference effects between these two oscillations, provided that the coupling given by the small mixing angle θ13 is large enough. Such interference effects include, for example, leptonic CP violation. In order to test the unknown oscillation parameters, i.e., the mixing angle θ13, the leptonic CP phase, and the neutrino mass hierarchy, new long-baseline and reactor experiments are proposed. These experiments send an artificial neutrino beam to a detector, or detect the neutrinos produced by a nuclear fission reactor. However, the presence of multiple solutions, which are intrinsic to neutrino oscillation probabilities [2-5], affects these measurements. Thus optimization strategies are required which maximally exploit complementarity between experiments. Therefore, a modern, complete experiment simulation and analysis tool does not only need a highly accurate beam and detector simulation, but also powerful means to analyze correlations and degeneracies, especially for the combination of several experiments. The GLoBES software package is such a tool [6,7].
    Solution method: GLoBES is a flexible software tool to simulate and analyze neutrino oscillation long-baseline and reactor experiments using a complete three-flavor description.
On the one hand, it contains a comprehensive abstract experiment definition language (AEDL), which makes it possible to describe most classes of long-baseline and reactor experiments at an abstract level. On the other hand, it provides a C library to process the experiment information in order to obtain oscillation probabilities, rate vectors, and Δχ²-values. In addition, it provides a binary program to test experiment definitions very quickly, before they are used by the application software.
    Restrictions: Currently restricted to discrete sets of sources and detectors. For example, the simulation of an atmospheric neutrino flux is not supported.
    Unusual features: Clear separation between experiment description and the simulation software.
    Additional comments: To find information on the latest version of the software and user manual, please check the authors' web site, http://www.mpi-hd.mpg.de/~globes
    Running time: The examples included in the distribution take only a few minutes to complete. More sophisticated problems can take up to several days.
    References:
    [1] V. Barger, D. Marfatia, K. Whisnant, Int. J. Mod. Phys. E 12 (2003) 569, hep-ph/0308123, and references therein.
    [2] G.L. Fogli, E. Lisi, Phys. Rev. D 54 (1996) 3667, hep-ph/9604415.
    [3] J. Burguet-Castell, M.B. Gavela, J.J. Gomez-Cadenas, P. Hernandez, O. Mena, Nucl. Phys. B 608 (2001) 301, hep-ph/0103258.
    [4] H. Minakata, H. Nunokawa, JHEP 0110 (2001) 001, hep-ph/0108085.
    [5] V. Barger, D. Marfatia, K. Whisnant, Phys. Rev. D 65 (2002) 073023, hep-ph/0112119.
    [6] P. Huber, M. Lindner, W. Winter, Comput. Phys. Commun. 167 (2005) 195.
    [7] P. Huber, J. Kopp, M. Lindner, M. Rolinec, W. Winter, Comput. Phys. Commun. 177 (2007) 432.
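    As an illustration of the C-library workflow described above, the minimal sketch below loads an AEDL experiment definition, sets a set of "true" oscillation parameters, computes event rates, and evaluates a systematics-only Δχ² at a test point. The file name my_experiment.glb and the numerical parameter values are placeholders, and the call sequence is a simplified example rather than a complete correlation/degeneracy analysis.

```c
/* Minimal GLoBES usage sketch (hypothetical experiment file and illustrative
   parameter values; not a complete analysis). */
#include <stdio.h>
#include <math.h>
#include <globes/globes.h>

int main(int argc, char *argv[])
{
    glbInit(argv[0]);                       /* initialize the GLoBES library */

    /* Load an AEDL experiment definition (placeholder file name) */
    glbInitExperiment("my_experiment.glb",
                      &glb_experiment_list[0], &glb_num_of_exps);

    /* "True" oscillation parameters (illustrative values; angles in rad, dm^2 in eV^2) */
    glb_params true_values = glbAllocParams();
    glbDefineParams(true_values, 0.59, 0.15, M_PI / 4.0, 0.0, 7.5e-5, 2.5e-3);
    glbSetDensityParams(true_values, 1.0, GLB_ALL);

    /* Compute the simulated event rates for the true parameters */
    glbSetOscillationParameters(true_values);
    glbSetRates();

    /* Evaluate a systematics-only chi^2 at a test point with theta13 = 0 */
    glb_params test_values = glbAllocParams();
    glbCopyParams(true_values, test_values);
    glbSetOscParams(test_values, 0.0, GLB_THETA_13);
    double chi2 = glbChiSys(test_values, GLB_ALL, GLB_ALL);
    printf("Delta chi^2 = %g\n", chi2);

    glbFreeParams(true_values);
    glbFreeParams(test_values);
    return 0;
}
```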

  15. Improvements in the simulation code of the SOX experiment

    NASA Astrophysics Data System (ADS)

    Caminata, A.; Agostini, M.; Altenmüeller, K.; Appel, S.; Atroshchenko, V.; Bellini, G.; Benziger, J.; Bick, D.; Bonfini, G.; Bravo, D.; Caccianiga, B.; Calaprice, F.; Carlini, M.; Cavalcante, P.; Chepurnov, A.; Choi, K.; Cribier, M.; D'Angelo, D.; Davini, S.; Derbin, A.; Di Noto, L.; Drachnev, I.; Durero, M.; Etenko, A.; Farinon, S.; Fischer, V.; Fomenko, K.; Franco, D.; Gabriele, F.; Gaffiot, J.; Galbiati, C.; Gschwender, M.; Ghiano, C.; Giammarchi, M.; Goeger-Neff, M.; Goretti, A.; Gromov, M.; Hagner, C.; Houdy, Th.; Hungerford, E.; Ianni, Aldo; Ianni, Andrea; Jonquères, N.; Jany, A.; Jedrzejczak, K.; Jeschke, D.; Kobychev, V.; Korablev, D.; Korga, G.; Kornoukhov, V.; Kryn, D.; Lachenmaier, T.; Lasserre, T.; Laubenstein, M.; Lehnert, B.; Link, J.; Litvinovich, E.; Lombardi, F.; Lombardi, P.; Ludhova, L.; Lukyanchenko, G.; Machulin, I.; Manecki, S.; Maneschg, W.; Manuzio, G.; Marcocci, S.; Maricic, J.; Mention, G.; Meroni, E.; Meyer, M.; Miramonti, L.; Misiaszek, M.; Montuschi, M.; Mosteiro, P.; Muratova, V.; Musenich, R.; Neumair, B.; Oberauer, L.; Obolensky, M.; Ortica, F.; Pallavicini, M.; Papp, L.; Pocar, A.; Ranucci, G.; Razeto, A.; Re, A.; Romani, A.; Roncin, R.; Rossi, N.; Schönert, S.; Scola, L.; Semenov, D.; Skorokhvatov, M.; Smirnov, O.; Sotnikov, A.; Sukhotin, S.; Suvorov, Y.; Tartaglia, R.; Testera, G.; Thurn, J.; Toropova, M.; Unzhakov, E.; Veyssiére, C.; Vishneva, A.; Vivier, M.; Vogelaar, R. B.; von Feilitzsch, F.; Wang, H.; Weinz, S.; Winter, J.; Wojcik, M.; Wurm, M.; Yokley, Z.; Zaimidoroga, O.; Zavatarelli, S.; Zuber, K.; Zuzel, G.

    2017-09-01

    The aim of the SOX experiment is to test the hypothesis of the existence of light sterile neutrinos through a short-baseline experiment. Electron antineutrinos will be produced by a high-activity source and detected in the Borexino experiment. Both an oscillometry approach and a conventional disappearance analysis will be performed and, if combined, SOX will be able to investigate most of the anomaly region at 95% C.L. This paper focuses on the improvements made to the simulation code and on the techniques (calibrations) used to validate the results.

  16. Teaching Palatoplasty Using a High-Fidelity Cleft Palate Simulator.

    PubMed

    Cheng, Homan; Podolsky, Dale J; Fisher, David M; Wong, Karen W; Lorenz, H Peter; Khosla, Rohit K; Drake, James M; Forrest, Christopher R

    2018-01-01

    Cleft palate repair is a challenging procedure for cleft surgeons to teach. A novel high-fidelity cleft palate simulator has been described for surgeon training. This study evaluates the simulator's effect on surgeon procedural confidence and palatoplasty knowledge among learners. Plastic surgery trainees attended a palatoplasty workshop consisting of a didactic session on cleft palate anatomy and repair followed by a simulation session. Participants completed a procedural confidence questionnaire and palatoplasty knowledge test immediately before and after the workshop. All participants reported significantly higher procedural confidence following the workshop (p < 0.05). Those with cleft palate surgery experience had higher procedural confidence before (p < 0.001) and after (p < 0.001) the session. Palatoplasty knowledge test scores increased in 90 percent of participants. The mean baseline test score was 28 ± 10.89 percent and 43 ± 18.86 percent following the workshop. Those with prior cleft palate experience did not have higher mean baseline test scores than those with no experience (30 percent versus 28 percent; p > 0.05), but did have significantly higher scores after the workshop (61 percent versus 35 percent; p < 0.05). All trainees strongly agreed or agreed that the simulator should be integrated into training and they would use it again. This study demonstrates the effective use of a novel cleft palate simulator as a training tool to teach palatoplasty. Improved procedural confidence and knowledge were observed after a single session, with benefits seen among trainees both with and without previous cleft experience.

  17. Design of a low parasitic inductance SiC power module with double-sided cooling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Fei; Liang, Zhenxian; Wang, Fei

    In this paper, a low-parasitic-inductance SiC power module with double-sided cooling is designed and compared with a baseline double-sided cooled module. With the unique 3D layout utilizing vertical interconnection, the power loop inductance is effectively reduced without sacrificing the thermal performance. Both simulations and experiments are carried out to validate the design. Q3D simulation results show a power loop inductance of 1.63 nH, verified by the experiment, indicating more than 60% reduction of power loop inductance compared with the baseline module. With 0 Ω external gate resistance turn-off at 600 V, the voltage overshoot is less than 9% of the bus voltage at a load of 44.6 A.
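    The connection between loop inductance and turn-off overshoot reported above follows from the standard switching relation (a general estimate, not a formula quoted from the paper):

```latex
% Turn-off voltage overshoot across the switch due to the power-loop inductance:
\[
  \Delta V_{\mathrm{overshoot}} \;\approx\; L_{\mathrm{loop}} \,\frac{di}{dt}
\]
% Reducing L_loop (here to 1.63 nH) proportionally reduces the overshoot for a
% given current slew rate di/dt at turn-off.
```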

  18. The accuracy of caries risk assessment in children attending South Australian School Dental Service: a longitudinal study

    PubMed Central

    Ha, Diep H; Spencer, A John; Slade, Gary D; Chartier, Andrew D

    2014-01-01

    Objectives: To determine the accuracy of the caries risk assessment system and the performance of clinicians in their attempts to predict caries for children during routine practice. Design: Longitudinal study. Setting and participants: Data on caries risk assessments conducted by clinicians during routine practice while providing care for children in the South Australian School Dental Service (SA SDS) were collected from electronic patient records. Baseline data on caries experience, clinicians' ratings of caries risk status and child demographics were obtained for all SA SDS patients aged 5–15 years examined during 2002–2005. Outcome measure: Children's caries incidence rate, calculated using examination data after a follow-up period of 6–48 months from baseline, was used as the gold standard to compute the sensitivity (Se) and specificity (Sp) of clinicians' baseline ratings of caries risk. Multivariate binomial regression models were used to evaluate effects of children's baseline characteristics on Se and Sp. Results: A total of 133 clinicians rated the caries risk status of 71 430 children during 2002–2005. The observed Se and Sp were 0.48 and 0.86, respectively (Se+Sp=1.34). Caries experience at baseline was the strongest factor influencing accuracy in the multivariable regression model. Among children with no caries experience at baseline, overall accuracy (Se+Sp) was only 1.05, whereas it was 1.28 among children with at least one tooth surface with caries experience at baseline. Conclusions: Clinicians' accuracy in predicting caries risk during routine practice was similar to levels reported in research settings that simulated patient care. Accuracy was acceptable in children who had prior caries experience at the baseline examination, while it was poor among children with no caries experience. PMID:24477318
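    For readers unfamiliar with the accuracy metrics used here, Se and Sp are computed against the caries-incidence gold standard in the usual way (standard definitions, not formulas specific to this study):

```latex
% Sensitivity and specificity of clinicians' baseline "high risk" ratings,
% with caries incidence over follow-up as the gold standard:
\[
  \mathrm{Se} = \frac{TP}{TP + FN}, \qquad
  \mathrm{Sp} = \frac{TN}{TN + FP}
\]
% TP: rated high risk, developed caries;   FN: rated low risk, developed caries;
% TN: rated low risk, stayed caries-free;  FP: rated high risk, stayed caries-free.
% Se + Sp > 1 indicates prediction better than chance (here 1.34 overall).
```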

  19. Simulated Performance of the Orbiting Wide-angle Light Collectors (OWL) Experiment

    NASA Technical Reports Server (NTRS)

    Krizmanic, J. F.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The Orbiting Wide-angle Light collectors (OWL) experiment is in NASA's mid-term strategic plan and will stereoscopically image, from equatorial orbit, the air fluorescence signal generated by air showers induced by the ultrahigh energy (E greater than a few x 10^19 eV) component of the cosmic radiation. The use of a space-based platform enables an extremely large event acceptance aperture and thus will allow a high statistics measurement of these rare events. Detailed Monte Carlo simulations are required to quantify the physics potential of the mission as well as optimize the instrumental parameters. This paper reports on the results of the GSFC Monte Carlo simulation for two different OWL instrument baseline designs. These results indicate that, assuming a continuation of the cosmic ray spectrum (approximately proportional to E^-2.75), OWL could have an event rate of 4000 events/year with E greater than or equal to 10^20 eV. Preliminary results, based upon these Monte Carlo simulations, indicate that events can be accurately reconstructed in the detector focal plane arrays for the OWL instrument baseline designs under consideration.

  20. Premission and postmission simulation studies of the foot-controlled maneuvering unit for Skylab experiment T-020. [astronaut maneuvering equipment - space environment simulation

    NASA Technical Reports Server (NTRS)

    Hewes, D. E.; Glover, K. E.

    1975-01-01

    A Skylab experiment was conducted to study the maneuvering capabilities of astronauts using a relatively simple self-locomotive device, referred to as the foot-controlled maneuvering unit, and to evaluate the effectiveness of ground-based facilities simulating the operation of this device in weightless conditions of space. Some of the special considerations given in the definition and development of the experiment as related to the two ground-based simulators are reviewed. These simulators were used to train the test subjects and to obtain baseline data which could be used for comparison with the in-flight tests that were performed inside the Skylab orbital workshop. The results of both premission and postmission tests are discussed, and subjective comparisons of the in-flight and ground-based test conditions are presented.

  1. Mitigating Task Saturation in Critical Care Air Transport Team Red Flag Checklist

    DTIC Science & Technology

    2015-04-14

    Cincinnati. Team and individual performances were scored using a validated assessment tool for NOTECHS. Salivary cortisol levels were measured at baseline... deployment experience (p < 0.04) continued to be significant. Salivary cortisol levels increased by 0.124 μg/dL over baseline as the result of the...

  2. On the ability of human listeners to distinguish between front and back.

    PubMed

    Zhang, Peter Xinya; Hartmann, William M

    2010-02-01

    In order to determine whether a sound source is in front or in back, listeners can use location-dependent spectral cues caused by diffraction from their anatomy. This capability was studied using a precise virtual reality technique (VRX) based on a transaural technology. Presented with a virtual baseline simulation accurate up to 16 kHz, listeners could not distinguish between the simulation and a real source. Experiments requiring listeners to discriminate between front and back locations were performed using controlled modifications of the baseline simulation to test hypotheses about the important spectral cues. The experiments concluded: (1) Front/back cues were not confined to any particular 1/3rd or 2/3rd octave frequency region. Often adequate cues were available in any of several disjoint frequency regions. (2) Spectral dips were more important than spectral peaks. (3) Neither monaural cues nor interaural spectral level difference cues were adequate. (4) Replacing baseline spectra by sharpened spectra had minimal effect on discrimination performance. (5) When presented with an interaural time difference less than 200 μs, which pulled the image to the side, listeners still successfully discriminated between front and back, suggesting that front/back discrimination is independent of azimuthal localization within certain limits. Copyright 2009 Elsevier B.V. All rights reserved.

  3. On the ability of human listeners to distinguish between front and back

    PubMed Central

    Zhang, Peter Xinya; Hartmann, William M.

    2009-01-01

    In order to determine whether a sound source is in front or in back, listeners can use location-dependent spectral cues caused by diffraction from their anatomy. This capability was studied using a precise virtual-reality technique (VRX) based on a transaural technology. Presented with a virtual baseline simulation accurate up to 16 kHz, listeners could not distinguish between the simulation and a real source. Experiments requiring listeners to discriminate between front and back locations were performed using controlled modifications of the baseline simulation to test hypotheses about the important spectral cues. The experiments concluded: (1) Front/back cues were not confined to any particular 1/3rd or 2/3rd octave frequency region. Often adequate cues were available in any of several disjoint frequency regions. (2) Spectral dips were more important than spectral peaks. (3) Neither monaural cues nor interaural spectral level difference cues were adequate. (4) Replacing baseline spectra by sharpened spectra had minimal effect on discrimination performance. (5) When presented with an interaural time difference less than 200 μs, which pulled the image to the side, listeners still successfully discriminated between front and back, suggesting that front/back discrimination is independent of azimuthal localization within certain limits. PMID:19900525

  4. Hybrid Reynolds-Averaged/Large-Eddy Simulations of a Co-Axial Supersonic Free-Jet Experiment

    NASA Technical Reports Server (NTRS)

    Baurle, R. A.; Edwards, J. R.

    2009-01-01

    Reynolds-averaged and hybrid Reynolds-averaged/large-eddy simulations have been applied to a supersonic coaxial jet flow experiment. The experiment utilized either helium or argon as the inner jet nozzle fluid, and the outer jet nozzle fluid consisted of laboratory air. The inner and outer nozzles were designed and operated to produce nearly pressure-matched Mach 1.8 flow conditions at the jet exit. The purpose of the computational effort was to assess the state-of-the-art for each modeling approach, and to use the hybrid Reynolds-averaged/large-eddy simulations to gather insight into the deficiencies of the Reynolds-averaged closure models. The Reynolds-averaged simulations displayed a strong sensitivity to choice of turbulent Schmidt number. The baseline value chosen for this parameter resulted in an over-prediction of the mixing layer spreading rate for the helium case, but the opposite trend was noted when argon was used as the injectant. A larger turbulent Schmidt number greatly improved the comparison of the results with measurements for the helium simulations, but variations in the Schmidt number did not improve the argon comparisons. The hybrid simulation results showed the same trends as the baseline Reynolds-averaged predictions. The primary reason conjectured for the discrepancy between the hybrid simulation results and the measurements centered around issues related to the transition from a Reynolds-averaged state to one with resolved turbulent content. Improvements to the inflow conditions are suggested as a remedy to this dilemma. Comparisons between resolved second-order turbulence statistics and their modeled Reynolds-averaged counterparts were also performed.
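    For context, the turbulent Schmidt number discussed above sets the modeled turbulent scalar (species) diffusivity in the RANS closure, so the sensitivity of the mixing-layer spreading rate follows directly from the standard gradient-diffusion relation (a general definition, not a formula quoted from the paper):

```latex
% Gradient-diffusion model for the turbulent scalar flux in a RANS closure:
\[
  \overline{u_i' c'} \;=\; -\,\frac{\nu_t}{Sc_t}\,\frac{\partial \bar{c}}{\partial x_i},
  \qquad
  D_t = \frac{\nu_t}{Sc_t}
\]
% nu_t: turbulent (eddy) viscosity;  Sc_t: turbulent Schmidt number.
% Increasing Sc_t lowers the turbulent diffusivity D_t and hence slows the predicted
% spreading of the injected helium or argon into the coflowing air.
```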

  5. Reactive flow modeling of small scale detonation failure experiments for a baseline non-ideal explosive

    NASA Astrophysics Data System (ADS)

    Kittell, David E.; Cummock, Nick R.; Son, Steven F.

    2016-08-01

    Small scale characterization experiments using only 1-5 g of a baseline ammonium nitrate plus fuel oil (ANFO) explosive are discussed and simulated using an ignition and growth reactive flow model. There exists a strong need for the small scale characterization of non-ideal explosives in order to adequately survey the wide parameter space in sample composition, density, and microstructure of these materials. However, it is largely unknown in the scientific community whether any useful or meaningful result may be obtained from detonation failure, and whether a minimum sample size or level of confinement exists for the experiments. In this work, it is shown that the parameters of an ignition and growth rate law may be calibrated using the small scale data, which is obtained from a 35 GHz microwave interferometer. Calibration is feasible when the samples are heavily confined and overdriven; this conclusion is supported with detailed simulation output, including pressure and reaction contours inside the ANFO samples. The resulting shock wave velocity is most likely a combined chemical-mechanical response, and simulations of these experiments require an accurate unreacted equation of state (EOS) in addition to the calibrated reaction rate. Other experiments are proposed to gain further insight into the detonation failure data, as well as to help discriminate between the role of the EOS and reaction rate in predicting the measured outcome.
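    The ignition and growth model referred to above is usually written as a pressure-dependent, multi-term reaction rate for the reacted mass fraction; a commonly used Lee-Tarver-style form is sketched below. This is the generic published form of the rate law, not the specific calibration obtained for ANFO in this work.

```latex
% Generic ignition-and-growth (Lee-Tarver) reaction rate, with mu = rho/rho_0 - 1 the
% compression, lambda the reacted mass fraction, and p the local pressure:
\[
  \frac{d\lambda}{dt} \;=\;
    I\,(1-\lambda)^{b}\,(\mu - a)^{x}
    \;+\; G_1\,(1-\lambda)^{c}\,\lambda^{d}\,p^{y}
    \;+\; G_2\,(1-\lambda)^{e}\,\lambda^{g}\,p^{z}
\]
% The ignition term and the two growth terms are each active only within cutoff ranges
% of lambda; I, G1, G2, a, b, c, d, e, g, x, y, z are the parameters that would be
% calibrated against the small-scale microwave-interferometer data.
```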

  6. Reactive flow modeling of small scale detonation failure experiments for a baseline non-ideal explosive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kittell, David E.; Cummock, Nick R.; Son, Steven F.

    2016-08-14

    Small scale characterization experiments using only 1–5 g of a baseline ammonium nitrate plus fuel oil (ANFO) explosive are discussed and simulated using an ignition and growth reactive flow model. There exists a strong need for the small scale characterization of non-ideal explosives in order to adequately survey the wide parameter space in sample composition, density, and microstructure of these materials. However, it is largely unknown in the scientific community whether any useful or meaningful result may be obtained from detonation failure, and whether a minimum sample size or level of confinement exists for the experiments. In this work, it is shown that the parameters of an ignition and growth rate law may be calibrated using the small scale data, which is obtained from a 35 GHz microwave interferometer. Calibration is feasible when the samples are heavily confined and overdriven; this conclusion is supported with detailed simulation output, including pressure and reaction contours inside the ANFO samples. The resulting shock wave velocity is most likely a combined chemical-mechanical response, and simulations of these experiments require an accurate unreacted equation of state (EOS) in addition to the calibrated reaction rate. Other experiments are proposed to gain further insight into the detonation failure data, as well as to help discriminate between the role of the EOS and reaction rate in predicting the measured outcome.

  7. The use of a virtual reality surgical simulator for cataract surgical skill assessment with 6 months of intervening operating room experience.

    PubMed

    Sikder, Shameema; Luo, Jia; Banerjee, P Pat; Luciano, Cristian; Kania, Patrick; Song, Jonathan C; Kahtani, Eman S; Edward, Deepak P; Towerki, Abdul-Elah Al

    2015-01-01

    To evaluate a haptic-based simulator, MicroVisTouch™, as an assessment tool for capsulorhexis performance in cataract surgery. The study is a prospective, unmasked, nonrandomized dual academic institution study conducted at the Wilmer Eye Institute at Johns Hopkins Medical Center (Baltimore, MD, USA) and King Khaled Eye Specialist Hospital (Riyadh, Saudi Arabia). This prospective study evaluated capsulorhexis simulator performance in 78 ophthalmology residents in the US and Saudi Arabia in the first round of testing and 40 residents in a second round for follow-up. Four variables (circularity, accuracy, fluency, and overall) were tested by the simulator and graded on a 0-100 scale. Circularity (42%), accuracy (55%), and fluency (3%) were compiled to give an overall score. Capsulorhexis performance was retested in the original cohort 6 months after baseline assessment. Average scores in all measured metrics demonstrated statistically significant improvement (except for circularity, which trended toward improvement) after baseline assessment. A reduction in standard deviation and improvement in process capability indices over the 6-month period was also observed. An interval objective improvement in capsulorhexis skill on a haptic-enabled cataract surgery simulator was associated with intervening operating room experience. Further work investigating the role of formalized simulator training programs requiring independent simulator use must be studied to determine its usefulness as an evaluation tool.

  8. How Do Simulated Error Experiences Impact Attitudes Related to Error Prevention?

    PubMed

    Breitkreuz, Karen R; Dougal, Renae L; Wright, Melanie C

    2016-10-01

    The objective of this project was to determine whether simulated exposure to error situations changes attitudes in a way that may have a positive impact on error prevention behaviors. Using a stratified quasi-randomized experiment design, we compared risk perception attitudes of a control group of nursing students who received standard error education (reviewed medication error content and watched movies about error experiences) to an experimental group of students who reviewed medication error content and participated in simulated error experiences. Dependent measures included perceived memorability of the educational experience, perceived frequency of errors, and perceived caution with respect to preventing errors. Experienced nursing students perceived the simulated error experiences to be more memorable than movies. Less experienced students perceived both simulated error experiences and movies to be highly memorable. After the intervention, compared with movie participants, simulation participants believed errors occurred more frequently. Both types of education increased the participants' intentions to be more cautious and reported caution remained higher than baseline for medication errors 6 months after the intervention. This study provides limited evidence of an advantage of simulation over watching movies describing actual errors with respect to manipulating attitudes related to error prevention. Both interventions resulted in long-term impacts on perceived caution in medication administration. Simulated error experiences made participants more aware of how easily errors can occur, and the movie education made participants more aware of the devastating consequences of errors.

  9. Motion simulator study of longitudinal stability requirements for large delta wing transport airplanes during approach and landing with stability augmentation systems failed

    NASA Technical Reports Server (NTRS)

    Snyder, C. T.; Fry, E. B.; Drinkwater, F. J., III; Forrest, R. D.; Scott, B. C.; Benefield, T. D.

    1972-01-01

    A ground-based simulator investigation was conducted in preparation for, and correlation with, an in-flight simulator program. The objective of these studies was to define minimum acceptable levels of static longitudinal stability for landing approach following stability augmentation system failures. The airworthiness authorities are presently attempting to establish the requirements for civil transports with only the backup flight control system operating. Using a baseline configuration representative of a large delta wing transport, 20 different configurations, many representing negative static margins, were assessed by three research test pilots in 33 hours of piloted operation. Verification of the baseline model to be used in the TIFS experiment was provided by computed and piloted comparisons with a well-validated reference airplane simulation. Pilot comments and ratings are included, as well as preliminary tracking performance and workload data.

  10. Statistical validation of predictive TRANSP simulations of baseline discharges in preparation for extrapolation to JET D-T

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Tae; Romanelli, M.; Yuan, X.; Kaye, S.; Sips, A. C. C.; Frassinetti, L.; Buchanan, J.; Contributors, JET

    2017-06-01

    This paper presents for the first time a statistical validation of predictive TRANSP simulations of plasma temperature using two transport models, GLF23 and TGLF, over a database of 80 baseline H-mode discharges in JET-ILW. While the accuracy of the predicted Te with TRANSP-GLF23 is affected by plasma collisionality, the dependency of predictions on collisionality is less significant when using TRANSP-TGLF, indicating that the latter model has a broader applicability across plasma regimes. TRANSP-TGLF also shows a good matching of predicted Ti with experimental measurements, allowing for a more accurate prediction of the neutron yields. The impact of input data and assumptions prescribed in the simulations is also investigated in this paper. The statistical validation and the assessment of uncertainty level in predictive TRANSP simulations for JET-ILW-DD will constitute the basis for the extrapolation to JET-ILW-DT experiments.

  11. Indentation experiments and simulation of ovine bone using a viscoelastic-plastic damage model

    PubMed Central

    Zhao, Yang; Wu, Ziheng; Turner, Simon; MacLeay, Jennifer; Niebur, Glen L.; Ovaert, Timothy C.

    2015-01-01

    Indentation methods have been widely used to study bone at the micro- and nanoscales. It has been shown that bone exhibits viscoelastic behavior with permanent deformation during indentation. At the same time, damage due to microcracks is induced by the stresses beneath the indenter tip. In this work, a simplified viscoelastic-plastic damage model was developed to more closely simulate indentation creep data, and the effect of the model parameters on the indentation curve was investigated. Experimentally, baseline and 2-year postovariectomized (OVX-2) ovine (sheep) bone samples were prepared and indented. The damage model was then applied via finite element analysis to simulate the bone indentation data. The mechanical properties of yielding, viscosity, and the damage parameter were obtained from the simulations. The results suggest that damage develops more quickly for OVX-2 samples than for baseline samples under the same indentation load conditions. PMID:26136623

  12. Very Long Baseline Interferometry Applied to Polar Motion, Relativity and Geodesy. Ph.D. Thesis - Maryland Univ.

    NASA Technical Reports Server (NTRS)

    Ma, C.

    1978-01-01

    The causes and effects of diurnal polar motion are described. An algorithm is developed for modeling the effects on very long baseline interferometry observables. Five years of radio-frequency very long baseline interferometry data from stations in Massachusetts, California, and Sweden are analyzed for diurnal polar motion. It is found that the effect is larger than predicted by McClure. Corrections to the standard nutation series caused by the deformability of the earth have a significant effect on the estimated diurnal polar motion scaling factor and the post-fit residual scatter. Simulations of high precision very long baseline interferometry experiments taking into account both measurement uncertainty and modeled errors are described.

  13. An Automated Baseline Correction Method Based on Iterative Morphological Operations.

    PubMed

    Chen, Yunliang; Dai, Liankui

    2018-05-01

    Raman spectra usually suffer from baseline drift caused by fluorescence or other factors. Therefore, baseline correction is a necessary and crucial step that must be performed before subsequent processing and analysis of Raman spectra. An automated baseline correction method based on iterative morphological operations is proposed in this work. The method first adaptively determines the structuring element and then gradually removes the spectral peaks during iteration to obtain an estimated baseline. Experiments on simulated data and real-world Raman data show that the proposed method is accurate, fast, and flexible for handling different kinds of baselines in various practical situations. The comparison of the proposed method with some state-of-the-art baseline correction methods demonstrates its advantages over the existing methods in terms of accuracy, adaptability, and flexibility. Although only Raman spectra are investigated in this paper, the proposed method should also be applicable to the baseline correction of other analytical instrumental signals, such as IR spectra and chromatograms.
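    To make the idea concrete, the sketch below estimates a spectral baseline by repeated morphological opening (erosion followed by dilation) with a flat structuring element, keeping the pointwise minimum at each pass so peaks are progressively removed. The fixed half-width and the convergence test are placeholder choices; the published method determines the structuring element adaptively, which is not reproduced here. The corrected spectrum is then simply the input minus the estimated baseline.

```c
/* Baseline estimation by iterative morphological opening (illustrative sketch;
   the paper's adaptive structuring-element selection is not implemented here). */
#include <stdlib.h>
#include <string.h>

static void erode(const double *in, double *out, int n, int w)
{
    for (int i = 0; i < n; i++) {
        double m = in[i];
        for (int j = i - w; j <= i + w; j++)
            if (j >= 0 && j < n && in[j] < m) m = in[j];
        out[i] = m;               /* local minimum over the window */
    }
}

static void dilate(const double *in, double *out, int n, int w)
{
    for (int i = 0; i < n; i++) {
        double m = in[i];
        for (int j = i - w; j <= i + w; j++)
            if (j >= 0 && j < n && in[j] > m) m = in[j];
        out[i] = m;               /* local maximum over the window */
    }
}

/* Iteratively open the spectrum until the baseline estimate stops changing. */
void estimate_baseline(const double *spectrum, double *baseline, int n,
                       int half_width, int max_iter)
{
    double *cur = malloc(n * sizeof *cur);
    double *tmp = malloc(n * sizeof *tmp);
    memcpy(cur, spectrum, n * sizeof *cur);
    for (int it = 0; it < max_iter; it++) {
        erode(cur, tmp, n, half_width);
        dilate(tmp, baseline, n, half_width);
        double change = 0.0;
        for (int i = 0; i < n; i++) {   /* keep pointwise minimum: peaks shrink */
            double v = baseline[i] < cur[i] ? baseline[i] : cur[i];
            change += (cur[i] - v) * (cur[i] - v);
            cur[i] = v;
        }
        if (change < 1e-12) break;      /* converged */
    }
    memcpy(baseline, cur, n * sizeof *cur);
    free(cur);
    free(tmp);
}
```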

  14. Shoulder arthroscopy simulator training improves shoulder arthroscopy performance in a cadaveric model.

    PubMed

    Henn, R Frank; Shah, Neel; Warner, Jon J P; Gomoll, Andreas H

    2013-06-01

    The purpose of this study was to quantify the benefits of shoulder arthroscopy simulator training with a cadaveric model of shoulder arthroscopy. Seventeen first-year medical students with no prior experience in shoulder arthroscopy were enrolled and completed this study. Each subject completed a baseline proctored arthroscopy on a cadaveric shoulder, which included controlling the camera and completing a standard series of tasks using the probe. The subjects were randomized, and 9 of the subjects received training on a virtual reality simulator for shoulder arthroscopy. All subjects then repeated the same cadaveric arthroscopy. The arthroscopic videos were analyzed in a blinded fashion for time to task completion and subjective assessment of technical performance. The 2 groups were compared by use of Student t tests, and change over time within groups was analyzed with paired t tests. There were no observed differences between the 2 groups on the baseline evaluation. The simulator group improved significantly from baseline with respect to time to completion and subjective performance (P < .05). Time to completion was significantly faster in the simulator group compared with controls at the final evaluation (P < .05). No difference was observed between the groups on the subjective scores at the final evaluation (P = .98). Shoulder arthroscopy simulator training resulted in significant benefits in clinical shoulder arthroscopy time to task completion in this cadaveric model. This study provides important additional evidence of the benefit of simulators in orthopaedic surgical training. There may be a role for simulator training in shoulder arthroscopy education. Copyright © 2013 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  15. Shoulder Arthroscopy Simulator Training Improves Shoulder Arthroscopy Performance in a Cadaver Model

    PubMed Central

    Henn, R. Frank; Shah, Neel; Warner, Jon J.P.; Gomoll, Andreas H.

    2013-01-01

    Purpose: The purpose of this study was to quantify the benefits of shoulder arthroscopy simulator training with a cadaver model of shoulder arthroscopy. Methods: Seventeen first-year medical students with no prior experience in shoulder arthroscopy were enrolled and completed this study. Each subject completed a baseline proctored arthroscopy on a cadaveric shoulder, which included controlling the camera and completing a standard series of tasks using the probe. The subjects were randomized, and nine of the subjects received training on a virtual reality simulator for shoulder arthroscopy. All subjects then repeated the same cadaveric arthroscopy. The arthroscopic videos were analyzed in a blinded fashion for time to task completion and subjective assessment of technical performance. The two groups were compared with Student's t-tests, and change over time within groups was analyzed with paired t-tests. Results: There were no observed differences between the two groups on the baseline evaluation. The simulator group improved significantly from baseline with respect to time to completion and subjective performance (p<0.05). Time to completion was significantly faster in the simulator group compared to controls at final evaluation (p<0.05). No difference was observed between the groups on the subjective scores at final evaluation (p=0.98). Conclusions: Shoulder arthroscopy simulator training resulted in significant benefits in clinical shoulder arthroscopy time to task completion in this cadaver model. This study provides important additional evidence of the benefit of simulators in orthopaedic surgical training. Clinical Relevance: There may be a role for simulator training in shoulder arthroscopy education. PMID:23591380

  16. Development of realtime connected element interferometry at the Goldstone Deep Space Communications Complex

    NASA Technical Reports Server (NTRS)

    Edwards, C. D.

    1990-01-01

    Connected-element interferometry (CEI) has the potential to provide high-accuracy angular spacecraft tracking on short baselines by making use of the very precise phase delay observable. Within the Goldstone Deep Space Communications Complex (DSCC), one of three tracking complexes in the NASA Deep Space Network, baselines of up to 21 km in length are available. Analysis of data from a series of short-baseline phase-delay interferometry experiments are presented to demonstrate the potential tracking accuracy on these baselines. Repeated differential observations of pairs of angularly close extragalactic radio sources were made to simulate differential spacecraft-quasar measurements. Fiber-optic data links and a correlation processor are currently being developed and installed at Goldstone for a demonstration of real-time CEI in 1990.

  17. Intercomparison of cloud model simulations of Arctic mixed-phase boundary layer clouds observed during SHEBA/FIRE-ACE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrison, H.; Zuidema, Paquita; Ackerman, Andrew

    2011-06-16

    An intercomparison of six cloud-resolving and large-eddy simulation models is presented. This case study is based on observations of a persistent mixed-phase boundary layer cloud gathered on 7 May 1998 from the Surface Heat Budget of Arctic Ocean (SHEBA) and First ISCCP Regional Experiment - Arctic Cloud Experiment (FIRE-ACE). Ice nucleation is constrained in the simulations in a way that holds the ice crystal concentration approximately fixed, with two sets of sensitivity runs in addition to the baseline simulations utilizing different specified ice nucleus (IN) concentrations. All of the baseline and sensitivity simulations group into two distinct quasi-steady states associated with either persistent mixed-phase clouds or all-ice clouds after the first few hours of integration, implying the existence of multiple equilibria. These two states are associated with distinctly different microphysical, thermodynamic, and radiative characteristics. Most but not all of the models produce a persistent mixed-phase cloud qualitatively similar to observations using the baseline IN/crystal concentration, while small increases in the IN/crystal concentration generally lead to rapid glaciation and conversion to the all-ice state. Budget analysis indicates that larger ice deposition rates associated with increased IN/crystal concentrations have a limited direct impact on dissipation of liquid in these simulations. However, the impact of increased ice deposition is greatly enhanced by several interaction pathways that lead to an increased surface precipitation flux, weaker cloud top radiative cooling and cloud dynamics, and reduced vertical mixing, promoting rapid glaciation of the mixed-phase cloud for deposition rates in the cloud layer greater than about 1-2 × 10⁻⁵ g kg⁻¹ s⁻¹. These results indicate the critical importance of precipitation-radiative-dynamical interactions in simulating cloud phase, which have been neglected in previous fixed-dynamical parcel studies of the cloud phase parameter space. Large sensitivity to the IN/crystal concentration also suggests the need for improved understanding of ice nucleation and its parameterization in models.

  18. Assessing Orchestrated Simulation Through Modeling to Quantify the Benefits of Unmanned-Teaming in a Tactical ASW Scenario

    DTIC Science & Technology

    2018-03-01

    Results are compared to a previous study using a similar design of experiments but different simulation software. The baseline scenario for exploring the...behaviors are mimicked in this research, enabling Solem's MANA results to be compared to our LITMUS' results. By design, the principal difference...missions when using the second order NOLH, and compares favorably with the over six million in the full factorial design. 3. Advantages of Cluster

  19. A pilot study examining experiential learning vs didactic education of abdominal compartment syndrome.

    PubMed

    Saraswat, Anju; Bach, John; Watson, William D; Elliott, John O; Dominguez, Edward P

    2017-08-01

    Current surgical education relies on simulated educational experiences or didactic sessions to teach low-frequency clinical events such as abdominal compartment syndrome (ACS). The purpose of this pilot study was to evaluate if simulation would improve performance and knowledge retention of ACS better than a didactic lecture. Nineteen general surgery residents were block randomized by postgraduate year level to a didactic or a simulation session. After 3 months, all residents completed a knowledge assessment before participating in an additional simulation. Two independent reviewers assessed resident performance via audio-video recordings. No baseline differences in ACS experience were noted between groups. The observational evaluation demonstrated a significant difference in performance between the didactic and simulation groups: 9.9 vs 12.5, P = .037 (effect size = 1.15). Knowledge retention was equivalent between groups. This pilot study suggests that simulation-based education may be more effective for teaching the basic concepts of ACS. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. The Small Aircraft Transportation System Higher Volume Operations (SATS HVO) Flight Experiment

    NASA Technical Reports Server (NTRS)

    Williams, Daniel M.; Murdoch, Jennifer L.; Adams, Catherine H.

    2005-01-01

    This paper provides a summary of conclusions from the Small Aircraft Transportation System (SATS) Higher Volume Operations (HVO) Flight Experiment which NASA conducted to determine pilot acceptability of the HVO concept for normal conditions. The SATS HVO concept improves efficiency at non-towered, non-radar airports in Instrument Meteorological Conditions (IMC) while achieving a level of safety equal to today's system. Reported are results from flight experiment data that indicate that the SATS HVO concept is viable. The success of the SATS HVO concept is based on acceptable pilot workload, performance, and subjective criteria when compared to the procedural control operations in use today at non-towered, non-radar controlled airfields in IMC. The HVO Flight Experiment, flown on NASA's Cirrus SR22, used a subset of the HVO Simulation Experiment scenarios and evaluation pilots in order to validate the simulation experiment results. HVO and Baseline (today's system) scenarios flown included: single aircraft arriving for a GPS non-precision approach; aircraft arriving for the approach with multiple traffic aircraft; and aircraft arriving for the approach with multiple traffic aircraft and then conducting a missed approach. Results reveal that all twelve low-time instrument-rated pilots preferred SATS HVO when compared to current procedural separation operations. These pilots also flew the HVO procedures safely and proficiently without additional workload in comparison to today's system (Baseline). Detailed results of pilot flight technical error, and their subjective assessments of workload and situation awareness are presented in this paper.

  1. A test of maternal programming of offspring stress response to predation risk in threespine sticklebacks.

    PubMed

    Mommer, Brett C; Bell, Alison M

    2013-10-02

    Non-genetic maternal effects are widespread across taxa and challenge our traditional understanding of inheritance. Maternal experience with predators, for example, can have lifelong consequences for offspring traits, including fitness. Previous work in threespine sticklebacks showed that females exposed to simulated predation risk produced eggs with higher cortisol content and offspring with altered anti-predator behavior. However, it is unknown whether this maternal effect is mediated via the offspring glucocorticoid stress response and if it is retained over the entire lifetime of offspring. Therefore, we tested the hypothesis that maternal exposure to simulated predation risk has long-lasting effects on the cortisol response to simulated predation risk in stickleback offspring. We measured circulating concentrations of cortisol before (baseline), 15 min after, and 60 min after exposure to a simulated predation risk. We compared adult offspring of predator-exposed mothers and control mothers in two different social environments (alone or in a group). Relative to baseline, offspring plasma cortisol was highest 15 min after exposure to simulated predation risk and decreased after 60 min. Offspring of predator-exposed mothers differed in the cortisol response to simulated predation risk compared to offspring of control mothers. In general, females had higher cortisol than males, and fish in a group had lower cortisol than fish that were by themselves. The buffering effect of the social environment did not differ between maternal treatments or between males and females. Altogether the results show that while a mother's experience with simulated predation risk might affect the physiological response of her adult offspring to a predator, sex and social isolation have much larger effects on the stress response to predation risk in sticklebacks. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Effects of low-dose alcohol exposure on simulated merchant ship piloting by maritime cadets.

    PubMed

    Howland, J; Rohsenow, D J; Cote, J; Gomez, B; Mangione, T W; Laramie, A K

    2001-03-01

    The US Department of Transportation (DOT) regulates on-the-job alcohol use by operators of certain categories of commercial transport. For aircraft, trains, and commercial vessels, operators are subject to sanctions for having ≥ 0.04 g% blood alcohol concentration (BAC). This study examines the effects of alcohol (between 0.04 and 0.05 g% BAC) on simulated merchant ship handling. A two-group randomized factorial design was used to compare beverage alcohol to placebo while controlling for baseline performance on a previous day. The study was conducted in the Maritime Simulation Center at Maine Maritime Academy, Castine, ME. Participants were 38 volunteer deck officer cadets in their junior or senior year, at least 21 years of age, with previous experience on a bridge simulator. Following a baseline trial on Day 1, on Day 2 participants were randomized to receive alcohol (0.6 g/kg for males and 0.5 g/kg for females) or placebo. After allowing time for absorption, participants completed a bridge simulator task. For baseline and performance trials, participants were randomized to one of four bridge simulator scenarios, each representing passage of a fully loaded container vessel through a channel with commercial traffic. The aggregate scenario score given by blinded maritime educators measured performance. A main effect for alcohol was found indicating that performance was significantly impaired by this low dose of alcohol relative to performance in the placebo condition. These findings are consistent with current federal regulations that limit low-dose alcohol exposure for the operators of commercial transport vehicles. Further research is required to determine effects at lower BACs.

  3. Stress and Temperature Distributions of Individual Particles in a Shock Wave Propagating through Dry and Wet Sand Mixtures

    NASA Astrophysics Data System (ADS)

    Schumaker, Merit; Stewart, Sarah T.; Borg, John P.

    2015-06-01

    Determining stress and temperature distributions of dynamically compacted particles is of interest to the geophysical and astrophysical research communities. However, these particle interactions during a shock event are not easily observed in planar shock experiments; mesoscale simulations are required to unravel them. Unlike homogeneous materials, the overall averaged Hugoniot state for heterogeneous granular materials differs from the individual stress and temperature states of particles during a shock event. Simulations of planar shock experiments on dry and wet sand mixtures were constructed using CTH. A baseline dry sand simulation was also set up for comparison with simulations in which water particles were present between the sand grains. From these simulations, the distributions of stress and temperature for individual sand and water particles are presented and compared in this document.

  4. Boundary Layer Protuberance Simulations in Channel Nozzle Arc-Jet

    NASA Technical Reports Server (NTRS)

    Marichalar, J. J.; Larin, M. E.; Campbell, C. H.; Pulsonetti, M. V.

    2010-01-01

    Two protuberance designs were modeled in the channel nozzle of the NASA Johnson Space Center Atmospheric Reentry Materials and Structures Facility with the Data-Parallel Line Relaxation computational fluid dynamics code. The heating on the protuberance was compared to nominal baseline heating at a single fixed arc-jet condition in order to obtain heating augmentation factors for flight traceability in the Boundary Layer Transition Flight Experiment on Space Shuttle Orbiter flights STS-119 and STS-128. The arc-jet simulations were performed in conjunction with the actual ground tests performed on the protuberances. The arc-jet simulations included non-uniform inflow conditions based on the current best practices methodology and used variable enthalpy and constant mass flow rate across the throat. Channel walls were modeled as fully catalytic isothermal surfaces, while the test section (consisting of Reaction Cured Glass tiles) was modeled as a partially catalytic radiative equilibrium wall. The results of the protuberance and baseline simulations were compared to the applicable ground test results, and the effects of the protuberance shock on the opposite channel wall were investigated.

  5. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Overview and summary

    NASA Technical Reports Server (NTRS)

    1989-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned Marshall Space Flight Center (MSFC) Payload Training Complex (PTC) required to meet this need will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs. This study was performed August 1988 to October 1989. Thus, the results are based on the SSFP August 1989 baseline, i.e., pre-Langley configuration/budget review (C/BR) baseline. Some terms, e.g., combined trainer, are being redefined. An overview of the study activities and a summary of study results are given here.

  6. Determination of the θ23 octant in long baseline neutrino experiments within and beyond the standard model

    NASA Astrophysics Data System (ADS)

    Das, C. R.; Pulido, João; Maalampi, Jukka; Vihonen, Sampsa

    2018-02-01

    The recent data indicate that the neutrino mixing angle θ23 deviates from the maximal-mixing value of 45°, showing two nearly degenerate solutions, one in the lower octant (LO) (θ23 < 45°) and one in the higher octant (HO) (θ23 > 45°). We investigate, using numerical simulations, the prospects for determining the octant of θ23 in the future long baseline oscillation experiments. We present our results as contour plots on the (θ23 − 45°, δ)-plane, where δ is the CP phase, showing the true values of θ23 for which the octant can be experimentally determined at 3σ, 2σ and 1σ confidence level. In particular, we study the impact of the possible nonunitarity of neutrino mixing on the experimental determination of θ23 in those experiments.
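
    A worked relation helps show where the octant ambiguity comes from: at leading order the appearance probability probed by long-baseline experiments depends on θ23 and θ13 only through the product below, so θ23 and 90° − θ23 yield nearly the same signal unless disappearance data, matter effects, or the δ dependence break the degeneracy. This standard approximation is quoted as background, not taken from the paper:

        P(\nu_\mu \to \nu_e) \;\approx\; \sin^2\theta_{23}\,\sin^2 2\theta_{13}\,
        \sin^2\!\left(\frac{\Delta m_{31}^2 L}{4E}\right)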

  7. Layover and shadow detection based on distributed spaceborne single-baseline InSAR

    NASA Astrophysics Data System (ADS)

    Huanxin, Zou; Bin, Cai; Changzhou, Fan; Yun, Ren

    2014-03-01

    Distributed spaceborne single-baseline InSAR is an effective technique for obtaining high-quality Digital Elevation Models. Layover and shadow are ubiquitous phenomena in SAR images because of the imaging geometry of SAR. In the signal processing of single-baseline InSAR, the phase singularity of layover and shadow regions makes the phase difficult to filter and unwrap. This paper analyzes the geometric and signal models of layover and shadow fields. Based on the interferometric signal autocorrelation matrix, the paper proposes a signal-number estimation method founded on information-theoretic criteria to distinguish layover and shadow from normal InSAR fields. The effectiveness and practicability of the proposed method are validated by simulation experiments and theoretical analysis.
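
    The information-theoretic step described above can be illustrated with a minimal sketch of the Wax-Kailath MDL criterion applied to the eigenvalues of a sample autocorrelation matrix: layover pixels show more than one dominant signal, shadow pixels essentially none. The synthetic two-channel data below are illustrative assumptions, not the paper's processing chain.

        import numpy as np

        def mdl_source_count(R, n_snapshots):
            """Estimate the number of signals from a p x p sample covariance matrix R."""
            lam = np.sort(np.linalg.eigvalsh(R))[::-1]        # eigenvalues, descending
            p = len(lam)
            mdl = np.empty(p)
            for k in range(p):
                tail = lam[k:]
                geo = np.exp(np.mean(np.log(tail)))           # geometric mean of noise eigenvalues
                ari = np.mean(tail)                           # arithmetic mean of noise eigenvalues
                mdl[k] = (-n_snapshots * (p - k) * np.log(geo / ari)
                          + 0.5 * k * (2 * p - k) * np.log(n_snapshots))
            return int(np.argmin(mdl))

        # Synthetic two-channel example with one dominant scatterer plus noise.
        rng = np.random.default_rng(0)
        N = 64
        s = rng.standard_normal(N) * np.exp(1j * rng.uniform(0, 2 * np.pi))
        x = np.vstack([s, s * np.exp(1j * 0.5)]) + 0.1 * (
            rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N)))
        R = x @ x.conj().T / N
        print("estimated number of signals:", mdl_source_count(R, N))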

  8. Video requirements for remote medical diagnosis

    NASA Technical Reports Server (NTRS)

    Davis, J. G.

    1974-01-01

    Minimal television system requirements for medical telediagnosis were studied. The experiment was conducted with the aid of a simulated telemedicine system. The first step involved making high quality videotape recordings of actual medical examinations conducted by a skilled nurse under the direction of a physician watching on closed circuit television. These recordings formed the baseline for the study. Next, these videotape recordings were electronically degraded to simulate television systems of less than broadcast quality. Finally, the baseline and degraded video recordings were shown (via a statistically randomized procedure) to a large number of physicians who attempted to reach a correct medical diagnosis and to visually recognize key physical signs for each patient. By careful scoring and analysis of the results of these viewings, the pictorial and diagnostic limitations as a function of technical video characteristics were to be defined.

  9. Structural assembly demonstration experiment, phase 1

    NASA Astrophysics Data System (ADS)

    Akin, David L.; Bowden, Mary L.; Miller, Rene H.

    1983-03-01

    The goal of this phase of the structural assembly and demonstration experiment (SADE) program was to begin to define a shuttle flight experiment that would yield data to compare on-orbit assembly operations of large space structures with neutral buoyancy simulations. In addition, the experiment would be an early demonstration of structural hardware and human capabilities in extravehicular activity (EVA). The objectives of the MIT study, as listed in the statement of work, were: to provide support in establishing a baseline neutral buoyancy testing data base, to develop a correlation technique between neutral buoyancy test results and on-orbit operations, and to prepare the SADE experiment plan (MSFC-PLAN-913).

  10. Crack-Detection Experiments on Simulated Turbine Engine Disks in NASA Glenn Research Center's Rotordynamics Laboratory

    NASA Technical Reports Server (NTRS)

    Woike, Mark R.; Abdul-Aziz, Ali

    2010-01-01

    The development of new health-monitoring techniques requires the use of theoretical and experimental tools to allow new concepts to be demonstrated and validated prior to use on more complicated and expensive engine hardware. In order to meet this need, significant upgrades were made to NASA Glenn Research Center's Rotordynamics Laboratory and a series of tests were conducted on simulated turbine engine disks as a means of demonstrating potential crack-detection techniques. The Rotordynamics Laboratory consists of a high-precision spin rig that can rotate subscale engine disks at speeds up to 12,000 rpm. The crack-detection experiment involved introducing a notch on a subscale engine disk and measuring its vibration response using externally mounted blade-tip-clearance sensors as the disk was operated at speeds up to 12,000 rpm. Testing was accomplished on both a clean baseline disk and a disk with an artificial crack: a 50.8-mm- (2-in.-) long introduced notch. The disk's vibration responses were compared and evaluated against theoretical models to investigate how successful the technique was in detecting cracks. This paper presents the capabilities of the Rotordynamics Laboratory, the baseline theory and experimental setup for the crack-detection experiments, and the associated results from the latest test campaign.

  11. Assessing the learning curve for the acquisition of laparoscopic skills on a virtual reality simulator.

    PubMed

    Sherman, V; Feldman, L S; Stanbridge, D; Kazmi, R; Fried, G M

    2005-05-01

    The aim of this study was to develop summary metrics and assess the construct validity for a virtual reality laparoscopic simulator (LapSim) by comparing the learning curves of three groups with different levels of laparoscopic expertise. Three groups of subjects ('expert', 'junior', and 'naïve') underwent repeated trials on three LapSim tasks. Formulas were developed to calculate scores for efficiency ('time-error') and economy of 'motion' ('motion') using metrics generated by the software after each drill. Data (mean +/- SD) were evaluated by analysis of variance (ANOVA). Significance was set at p < 0.05. All three groups improved significantly from baseline to final for both 'time-error' and 'motion' scores. There were significant differences between groups in 'time-error' performance at baseline and final, due to higher scores in the 'expert' group. A significant difference in 'motion' scores was seen only at baseline. We have developed summary metrics for the LapSim that differentiate among levels of laparoscopic experience. This study also provides evidence of construct validity for the LapSim.
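
    The study's actual scoring formulas are not reproduced in this abstract; purely as an illustration of how raw simulator metrics can be folded into normalized summary scores, a hypothetical sketch (field names, weights, and reference values invented here) might look like the following.

        # Illustrative only: not the study's "time-error" or "motion" formulas.
        def time_error_score(task_time_s, tissue_damage, max_time_s=300.0, max_damage=20.0):
            # Higher score = faster completion with fewer errors; terms normalized to [0, 1].
            time_term = max(0.0, 1.0 - task_time_s / max_time_s)
            error_term = max(0.0, 1.0 - tissue_damage / max_damage)
            return 100.0 * 0.5 * (time_term + error_term)

        def motion_score(path_length_mm, angular_path_deg, ref_path=1500.0, ref_angle=900.0):
            # Higher score = more economical instrument motion.
            path_term = max(0.0, 1.0 - path_length_mm / ref_path)
            angle_term = max(0.0, 1.0 - angular_path_deg / ref_angle)
            return 100.0 * 0.5 * (path_term + angle_term)

        print(time_error_score(120.0, 3.0), motion_score(800.0, 400.0))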

  12. Simulation of acute haemodynamic outcomes of the surgical strategies for the right ventricular failure treatment in pediatric LVAD.

    PubMed

    Di Molfetta, Arianna; Ferrari, Gianfranco; Iacobelli, Roberta; Filippelli, Sergio; Fresiello, Libera; Gagliardi, Maria G; Toscano, Alessandro; Trivella, Maria G; Amodeo, Antonio

    2015-12-01

    Right ventricular failure (RVF) is one of the major complications during LVAD support. Apart from drug therapy, the most reliable option is the implantation of an RVAD. However, BiVADs carry a poor prognosis and increased complications. Experiments have been conducted on alternative approaches, such as the creation of an atrial septal defect (ASD), a cavo-aortic shunt (CAS) including the LVAD, and a cavo-pulmonary connection (CPC). This work aims to develop a lumped parameter model (LPM) to compare the acute hemodynamic effects of ASD, CPC, CAS, and RVAD in pediatric LVAD patients with RVF. Data from 5 pediatric patients undergoing LVAD support were retrospectively collected to reproduce patients' baseline hemodynamics with the LPM. The effect of continuous-flow LVAD implantation complicated by RVF was simulated, and then the effects of ASD, CPC, CAS, and RVAD treatments were simulated for each patient. The model successfully reproduced patients' baselines and the hemodynamic effects of the surgical strategies. Simulating the different surgical strategies, an unloading of the right ventricle and an increment of left ventricular preload were observed, with an improvement of the hemodynamics (total cardiac output: ASD +15%, CPC +10%, CAS +70%, RVAD +20%; right ventricular external work: ASD -19%, CPC -46%, CAS -76%, RVAD -32%; left ventricular external work: ASD +12%, CPC +28%, RVAD +64%). The use of numerical models could offer additional support for clinical decision-making, also potentially reducing animal experiments, to compare the outcomes of different surgical strategies for treating RVF during LVAD support.
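
    For readers unfamiliar with lumped-parameter modelling, the sketch below shows the basic idea in its simplest form: the circulation is reduced to resistances and compliances, here a two-element Windkessel driven by a prescribed pulsatile inflow. The study's model is far richer (both ventricles, the VAD, and the surgical connections), and all parameter values below are illustrative, not patient data.

        import numpy as np
        from scipy.integrate import solve_ivp

        R_SYS = 1.2      # systemic resistance, mmHg*s/mL (illustrative)
        C_ART = 1.5      # arterial compliance, mL/mmHg (illustrative)
        T = 60.0 / 90.0  # cardiac period for a heart rate of 90 beats/min

        def q_in(t):
            """Pulsatile inflow (mL/s): half-sine ejection over the first third of each beat."""
            phase = t % T
            return 300.0 * np.sin(np.pi * phase / (T / 3.0)) if phase < T / 3.0 else 0.0

        def dp_dt(t, p):
            # Two-element Windkessel: C * dP/dt = Q_in(t) - P / R
            return [(q_in(t) - p[0] / R_SYS) / C_ART]

        sol = solve_ivp(dp_dt, (0.0, 10 * T), [80.0], max_step=1e-3)
        p = sol.y[0][sol.t > 5 * T]                    # discard the initial transient
        print(f"simulated arterial pressure ~ {p.min():.0f}/{p.max():.0f} mmHg")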

  13. Combustion-Powered Actuation for Dynamic Stall Suppression - Simulations and Low-Mach Experiments

    NASA Technical Reports Server (NTRS)

    Matalanis, Claude G.; Min, Byung-Young; Bowles, Patrick O.; Jee, Solkeun; Wake, Brian E.; Crittenden, Tom; Woo, George; Glezer, Ari

    2014-01-01

    An investigation on dynamic-stall suppression capabilities of combustion-powered actuation (COMPACT) applied to a tabbed VR-12 airfoil is presented. In the first section, results from computational fluid dynamics (CFD) simulations carried out at Mach numbers from 0.3 to 0.5 are presented. Several geometric parameters are varied including the slot chordwise location and angle. Actuation pulse amplitude, frequency, and timing are also varied. The simulations suggest that cycle-averaged lift increases of approximately 4% and 8% with respect to the baseline airfoil are possible at Mach numbers of 0.4 and 0.3 for deep and near-deep dynamic-stall conditions. In the second section, static-stall results from low-speed wind-tunnel experiments are presented. Low-speed experiments and high-speed CFD suggest that slots oriented tangential to the airfoil surface produce stronger benefits than slots oriented normal to the chordline. Low-speed experiments confirm that chordwise slot locations suitable for Mach 0.3-0.4 stall suppression (based on CFD) will also be effective at lower Mach numbers.

  14. The impact of simulation sequencing on perceived clinical decision making.

    PubMed

    Woda, Aimee; Hansen, Jamie; Paquette, Mary; Topp, Robert

    2017-09-01

    An emerging nursing education trend is to utilize simulated learning experiences as a means to optimize competency and decision making skills. The purpose of this study was to examine differences in students' perception of clinical decision making and clinical decision making-related self-confidence and anxiety based on the sequence (order) in which they participated in a block of simulated versus hospital-based learning experiences. A quasi-experimental crossover design was used. Between- and within-group differences were found relative to self-confidence with the decision making process. When comparing groups, at baseline the simulation followed by hospital group had significantly higher self-confidence scores; however, at 14 weeks both groups were not significantly different. Significant within-group differences were found in the simulation followed by hospital group only, demonstrating a significant decrease in clinical decision making-related anxiety across the semester. Finally, there was no significant difference in perceived clinical decision making within or between the groups at the two measurement points. Preliminary findings suggest that simulated learning experiences can be offered with alternating sequences without impacting the process, anxiety or confidence with clinical decision making. This study provides beginning evidence to guide curriculum development and allow flexibility based on student needs and available resources. Copyright © 2017. Published by Elsevier Ltd.

  15. Shuttle mission simulator baseline definition report, volume 1

    NASA Technical Reports Server (NTRS)

    Burke, J. F.; Small, D. E.

    1973-01-01

    A baseline definition of the space shuttle mission simulator is presented. The subjects discussed are: (1) physical arrangement of the complete simulator system in the appropriate facility, with a definition of the required facility modifications, (2) functional descriptions of all hardware units, including the operational features, data demands, and facility interfaces, (3) hardware features necessary to integrate the items into a baseline simulator system to include the rationale for selecting the chosen implementation, and (4) operating, maintenance, and configuration updating characteristics of the simulator hardware.

  16. Single element injector testing for STME injector technology

    NASA Technical Reports Server (NTRS)

    Hulka, J.; Schneider, J. A.; Davis, J.

    1992-01-01

    An oxidizer-swirled coaxial element injector is being developed for application in the liquid oxygen/gaseous hydrogen Space Transportation Main Engine (STME) for the National Launch System (NLS) vehicle. This paper reports on the first two parts of a four-part single-injector-element study for optimization of the STME injector design. Measurements of Rupe mixing efficiency and atomization characteristics are reported for single element versions of injection elements from two multielement injectors that have been recently hot fire tested. Rather than attempting to measure a definitive mixing efficiency or droplet size parameters of these injector elements, the purpose of these experiments was to provide a baseline comparison for evaluating future injector element design modifications. Hence, all the experiments reported here were conducted with cold-flow simulants at nonflowing, ambient conditions. Mixing experiments were conducted with liquid/liquid simulants to provide economical trend data. Atomization experiments were conducted with liquid/gas simulants without backpressure. The results, despite significant differences from hot fire conditions, were found to relate to mixing and atomization parameters deduced from the hot fire testing, suggesting that these experiments are valid for trend analyses. Single element and subscale multielement hot fire testing will verify optimized designs before committing to fullscale fabrication.

  17. Terrain Portrayal for Synthetic Vision Systems Head-Down Displays Evaluation Results: Compilation of Pilot Transcripts

    NASA Technical Reports Server (NTRS)

    Hughes, Monica F.; Glaab, Louis J.

    2007-01-01

    The Terrain Portrayal for Head-Down Displays (TP-HDD) simulation experiment addressed multiple objectives involving twelve display concepts (two baseline concepts without terrain and ten synthetic vision system (SVS) variations), four evaluation maneuvers (two en route and one approach maneuver, plus a rare-event scenario), and three pilot group classifications. The TP-HDD SVS simulation was conducted in the NASA Langley Research Center's (LaRC's) General Aviation WorkStation (GAWS) facility. The results from this simulation establish the relationship between terrain portrayal fidelity and pilot situation awareness, workload, stress, and performance and are published in the NASA TP entitled Terrain Portrayal for Synthetic Vision Systems Head-Down Displays Evaluation Results. This is a collection of pilot comments during each run of the TP-HDD simulation experiment. These comments are not the full transcripts, but a condensed version where only the salient remarks that applied to the scenario, the maneuver, or the actual research itself were compiled.

  18. Computational Modeling of Aerosol Hazard Arising from the Opening of an Anthrax Letter in an Open-Office Complex

    NASA Astrophysics Data System (ADS)

    Lien, F. S.; Ji, H.; Yee, E.

    Early experimental work, conducted at Defence R&D Canada — Suffield, measured and characterized the personal and environmental contamination associated with the simulated opening of anthrax-tainted letters under a number of different scenarios. A better understanding of the physical and biological processes is of considerable significance for detecting, assessing, and formulating potential mitigation strategies for managing these risks. These preliminary experimental investigations have been extended to simulate the contamination from the opening of anthrax-tainted letters in an Open-Office environment using Computational Fluid Dynamics (CFD). Bacillus globigii (BG) was used as a biological simulant for anthrax, with 0.1 gram of the simulant released from opened letters in the experiments conducted. The accuracy of the model for prediction of the spatial distribution of BG spores in the office is first assessed quantitatively by comparison with measured SF6 concentrations (the baseline experiment), and then qualitatively by comparison with measured BG concentrations obtained under a number of scenarios, some involving people moving within various offices.

  19. Computational Fluid Dynamic simulations of pipe elbow flow.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Homicz, Gregory Francis

    2004-08-01

    One problem facing today's nuclear power industry is flow-accelerated corrosion and erosion in pipe elbows. The Korean Atomic Energy Research Institute (KAERI) is performing experiments in their Flow-Accelerated Corrosion (FAC) test loop to better characterize these phenomena, and develop advanced sensor technologies for the condition monitoring of critical elbows on a continuous basis. In parallel with these experiments, Sandia National Laboratories is performing Computational Fluid Dynamic (CFD) simulations of the flow in one elbow of the FAC test loop. The simulations are being performed using the FLUENT commercial software developed and marketed by Fluent, Inc. The model geometry and mesh were created using the GAMBIT software, also from Fluent, Inc. This report documents the results of the simulations that have been made to date; baseline results employing the RNG k-ε turbulence model are presented. The predicted value for the diametrical pressure coefficient is in reasonably good agreement with published correlations. Plots of the velocities, pressure field, wall shear stress, and turbulent kinetic energy adjacent to the wall are shown within the elbow section. Somewhat to our surprise, these indicate that the maximum values of both wall shear stress and turbulent kinetic energy occur near the elbow entrance, on the inner radius of the bend. Additional simulations were performed for the same conditions, but with the RNG k-ε model replaced by either the standard k-ε or the realizable k-ε turbulence model. The predictions using the standard k-ε model are quite similar to those obtained in the baseline simulation. However, with the realizable k-ε model, more significant differences are evident. The maximums in both wall shear stress and turbulent kinetic energy now appear on the outer radius, near the elbow exit, and are approximately 11% and 14% greater, respectively, than those predicted in the baseline calculation; secondary maxima in both quantities still occur near the elbow entrance on the inner radius. Which set of results better reflects reality must await experimental corroboration. Additional calculations demonstrate that whether or not FLUENT's radial equilibrium pressure distribution option is used in the PRESSURE OUTLET boundary condition has no significant impact on the flowfield near the elbow. Simulations performed with and without the chemical sensor and associated support bracket that were present in the experiments demonstrate that the latter have a negligible influence on the flow in the vicinity of the elbow. The fact that the maxima in wall shear stress and turbulent kinetic energy occur on the inner radius is therefore not an artifact of having introduced the sensor into the flow.
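
    For reference, the diametrical pressure coefficient mentioned above is conventionally formed from the static-pressure difference across the elbow diameter (outer wall minus inner wall at the same axial station), nondimensionalized by the dynamic pressure based on the bulk velocity. This definition follows the usual convention rather than anything stated in the report:

        C_p = \frac{p_{\mathrm{outer}} - p_{\mathrm{inner}}}{\tfrac{1}{2}\,\rho\,U_b^{2}}

    where ρ is the fluid density and U_b the bulk (mean) velocity in the pipe.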

  20. CFD Code Validation of Wall Heat Fluxes for a GO2/GH2 Single Element Combustor

    NASA Technical Reports Server (NTRS)

    Lin, Jeff; West, Jeff S.; Williams, Robert W.; Tucker, P. Kevin

    2005-01-01

    This paper puts forth the case for the need for improved injector design tools to meet NASA's Vision for Space Exploration goals. Requirements for this improved tool are outlined and discussed. The potential for Computational Fluid Dynamics (CFD) to meet these requirements is noted along with its current shortcomings, especially relative to demonstrated solution accuracy. The concept of verification and validation is introduced as the primary process for building and quantifying the confidence necessary for CFD to be useful as an injector design tool. The verification and validation process is considered in the context of the Marshall Space Flight Center (MSFC) Combustion Devices CFD Simulation Capability Roadmap via the Simulation Readiness Level (SRL) concept. The portion of the validation process which demonstrates the ability of a CFD code to simulate heat fluxes to a rocket engine combustor wall is the focus of the current effort. The FDNS and Loci-CHEM codes are used to simulate a shear coaxial single element GO2/GH2 injector experiment. The experiment was conducted at a chamber pressure of 750 psia using hot propellants from preburners. A measured wall temperature profile is used as a boundary condition to facilitate the calculations. Converged solutions, obtained from both codes by using wall functions with the k-ε turbulence model and integrating to the wall using Menter's baseline turbulence model, are compared to the experimental data. The initial solutions from both codes revealed significant issues with the wall function implementation associated with the recirculation zone between the shear coaxial jet and the chamber wall. The FDNS solution with a corrected implementation shows marked improvement in overall character and level of comparison to the data. With the FDNS code, integrating to the wall with Menter's baseline turbulence model actually produces a degraded solution when compared to the wall function solution with the k-ε model. The Loci-CHEM solution, produced by integrating to the wall with Menter's baseline turbulence model, matches both the heat flux rise rate in the near injector region and the peak heat flux level very well. However, it moderately overpredicts the heat fluxes downstream of the reattachment point. The Loci-CHEM solution achieved by integrating to the wall with Menter's baseline turbulence model was clearly superior to the other solutions produced in this effort.

  1. The NOvA simulation chain

    NASA Astrophysics Data System (ADS)

    Aurisano, A.; Backhouse, C.; Hatcher, R.; Mayer, N.; Musser, J.; Patterson, R.; Schroeter, R.; Sousa, A.

    2015-12-01

    The NOνA experiment is a two-detector, long-baseline neutrino experiment operating in the recently upgraded NuMI muon neutrino beam. Simulating neutrino interactions and backgrounds requires many steps including: the simulation of the neutrino beam flux using FLUKA and the FLUGG interface; cosmic ray generation using CRY; neutrino interaction modeling using GENIE; and a simulation of the energy deposited in the detector using GEANT4. To shorten generation time, the modeling of detector-specific aspects, such as photon transport, detector and electronics noise, and readout electronics, employs custom, parameterized simulation applications. We will describe the NOνA simulation chain, and present details on the techniques used in modeling photon transport near the ends of cells, and in developing a novel data-driven noise simulation. Due to the high intensity of the NuMI beam, the Near Detector samples a high rate of muons originating in the surrounding rock. In addition, due to its location on the surface at Ash River, MN, the Far Detector collects a large rate (~140 kHz) of cosmic muons. We will discuss the methods used in NOνA for overlaying rock muons and cosmic ray muons with simulated neutrino interactions and show how realistically the final simulation reproduces the preliminary NOνA data.

  2. The NOvA simulation chain

    DOE PAGES

    Aurisano, A.; Backhouse, C.; Hatcher, R.; ...

    2015-12-23

    The NOvA experiment is a two-detector, long-baseline neutrino experiment operating in the recently upgraded NuMI muon neutrino beam. Simulating neutrino interactions and backgrounds requires many steps including: the simulation of the neutrino beam flux using FLUKA and the FLUGG interface, cosmic ray generation using CRY, neutrino interaction modeling using GENIE, and a simulation of the energy deposited in the detector using GEANT4. To shorten generation time, the modeling of detector-specific aspects, such as photon transport, detector and electronics noise, and readout electronics, employs custom, parameterized simulation applications. We will describe the NOvA simulation chain, and present details on the techniques used in modeling photon transport near the ends of cells, and in developing a novel data-driven noise simulation. Due to the high intensity of the NuMI beam, the Near Detector samples a high rate of muons originating in the surrounding rock. In addition, due to its location on the surface at Ash River, MN, the Far Detector collects a large rate (~140 kHz) of cosmic muons. Furthermore, we will discuss the methods used in NOvA for overlaying rock muons and cosmic ray muons with simulated neutrino interactions and show how realistically the final simulation reproduces the preliminary NOvA data.

  3. Safety Performance of Airborne Separation: Preliminary Baseline Testing

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria C.; Hoadley, Sherwood T.; Wing, David J.; Baxley, Brian T.

    2007-01-01

    The Safety Performance of Airborne Separation (SPAS) study is a suite of Monte Carlo simulation experiments designed to analyze and quantify safety behavior of airborne separation. This paper presents results of preliminary baseline testing. The preliminary baseline scenario is designed to be very challenging, consisting of randomized routes in generic high-density airspace in which all aircraft are constrained to the same flight level. Sustained traffic density is varied from approximately 3 to 15 aircraft per 10,000 square miles, approximating up to about 5 times today's traffic density in a typical sector. Research at high traffic densities and at multiple flight levels is planned within the next two years. Basic safety metrics for aircraft separation are collected and analyzed. During the progression of experiments, various errors, uncertainties, delays, and other variables potentially impacting system safety will be incrementally introduced to analyze the effect on safety of the individual factors as well as their interaction and collective effect. In this paper we report the results of the first experiment that addresses the preliminary baseline condition tested over a range of traffic densities. Early results at five times the typical traffic density in today's NAS indicate that, under the assumptions of this study, airborne separation can be safely performed. In addition, we report on initial observations from an exploration of four additional factors tested at a single traffic density: broadcast surveillance signal interference, extent of intent sharing, pilot delay, and wind prediction error.
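
    As a toy illustration of the kind of Monte Carlo experiment described above, the sketch below counts how often randomly routed aircraft at a single flight level come within a separation threshold as traffic density grows. The airspace size, speed, 5 NM threshold, and wrap-around geometry are generic assumptions for illustration, not the SPAS models or results.

        import numpy as np

        rng = np.random.default_rng(1)
        SIDE_NM, SPEED_KT, SEP_NM, DT_H, DURATION_H = 100.0, 420.0, 5.0, 1.0 / 120.0, 1.0

        def run_trial(n_aircraft):
            """Count pair-samples (pairs of aircraft, per time step) closer than SEP_NM."""
            pos = rng.uniform(0.0, SIDE_NM, size=(n_aircraft, 2))
            hdg = rng.uniform(0.0, 2 * np.pi, size=n_aircraft)
            vel = SPEED_KT * np.column_stack([np.cos(hdg), np.sin(hdg)])
            close_pairs = 0
            for _ in range(int(DURATION_H / DT_H)):
                pos = (pos + vel * DT_H) % SIDE_NM          # wrap-around boundaries
                d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
                np.fill_diagonal(d, np.inf)
                close_pairs += int((d < SEP_NM).sum() // 2)  # count each pair once
            return close_pairs

        for n in (3, 6, 9, 12, 15):
            trials = [run_trial(n) for _ in range(50)]
            print(f"{n:2d} aircraft: mean proximity pair-samples per hour = {np.mean(trials):.1f}")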

  4. The effects of video games on laparoscopic simulator skills.

    PubMed

    Jalink, Maarten B; Goris, Jetse; Heineman, Erik; Pierie, Jean-Pierre E N; ten Cate Hoedemaker, Henk O

    2014-07-01

    Recently, there has been a growth in studies supporting the hypothesis that video games have positive effects on basic laparoscopic skills. This review discusses all studies directly related to these effects. A search in the PubMed and EMBASE databases was performed using synonymous terms for video games and laparoscopy. All available articles concerning video games and their effects on skills on any laparoscopic simulator (box trainer, virtual reality, and animal models) were selected. Video game experience has been related to higher baseline laparoscopic skills in different studies. There is currently, however, no standardized method to assess video game experience, making it difficult to compare these studies. Several controlled experiments have, nevertheless, shown that video games can not only be used to improve basic laparoscopic skills in surgical novices, but can also serve as a warm-up before laparoscopic surgery. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Software Geometry in Simulations

    NASA Astrophysics Data System (ADS)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

    The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows for multiple authors to easily collaborate. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett Viren is the author of the framework discussed here, the General Geometry Description (GGD).

  6. Centimeter repeatability of the VLBI estimates of European baselines

    NASA Technical Reports Server (NTRS)

    Rius, Antonio; Zarraoa, Nestor; Sardon, Esther; Ma, Chopo

    1992-01-01

    In the last three years, the European Geodetic Very Long Baseline Interferometry (VLBI) Network has grown to a total of six fixed antennas placed in Germany, Italy, Spain and Sweden, all equipped with the standard geodetic VLBI instrumentation and data recording systems. During this period of time, several experiments have been carried out using this interferometer providing data of very high quality due to the excellent sensitivity and performance of the European stations. The purpose of this paper is to study the consistency of the VLBI geodetic results on the European baselines with respect to the different degrees of freedom in the analysis procedure. Both real and simulated data sets, two different software packages (OCCAM 3.0 and CALC 7.4/SOLVE), and a variety of data analysis strategies were used to complete this study.
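
    Repeatability in this context is commonly quantified as the weighted root-mean-square (WRMS) scatter of the individual session estimates of a baseline length about their weighted mean; the standard definition below is given as background for the reader rather than taken from the paper:

        \mathrm{WRMS} = \sqrt{\frac{\sum_i w_i\,(L_i - \bar{L})^2}{\sum_i w_i}}, \qquad
        \bar{L} = \frac{\sum_i w_i L_i}{\sum_i w_i}, \qquad w_i = \frac{1}{\sigma_i^{2}}

    where L_i is the baseline length estimated from session i and σ_i its formal uncertainty; centimeter repeatability means this scatter is at the centimeter level.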

  7. Modeling and analysis of pinhole occulter experiment: Initial study phase

    NASA Technical Reports Server (NTRS)

    Vandervoort, R. J.

    1985-01-01

    The feasibility of using a generic simulation, TREETOPS, to simulate the Pinhole/Occulter Facility (P/OF) to be tested on the space shuttle was demonstrated. The baseline control system was used to determine the pointing performance of the P/OF. The task included modeling the structure as a three-body problem (shuttle, instrument pointing system, P/OF), including the flexibility of the 32 meter P/OF boom. Modeling of sensors, actuators, and control algorithms was also required. Detailed mathematical models for the structure, sensors, and actuators are presented, as well as the control algorithm and corresponding design procedure. Closed loop performance using this controller and computer listings for the simulator are also given.

  8. Temporal pattern of emotions and cognitive load during simulation training and debriefing.

    PubMed

    Fraser, Kristin; McLaughlin, Kevin

    2018-04-24

    In the simulated clinical environment, there is a perceived benefit to the emotional activation experienced by learners; however, potential harm of excessive and/or negative emotions has also been hypothesized. An improved understanding of the emotional experiences of learners during each phase of the simulation session will inform instructional design. In this observational study, we asked 174 first-year medical students about their emotional state upon arrival to the simulation lab (t1). They were then trained on a standard simulation scenario, after which they rated their emotional state and perceived cognitive load (t2). After debriefing, we then asked them to again rate their emotions and cognitive load (t3). Students reported that their experience of tranquility (a positive and low-arousal state) dropped from pre-scenario (t1) to post-scenario (t2), and returned to baseline levels after debriefing (t3), from 0.69 (0.87) to 0.14 (0.78) to 0.62 (0.78). Post-scenario cognitive load was rated to be moderately high at 6.62 (1.12), and scores increased after debriefing to 6.90 (1.05) (d = 0.26, p < 0.001). Cognitive load was associated with the simultaneous measures of emotions at both t2 and t3. Participant emotions are significantly altered through the experience of medical simulation and emotions are associated with subjective ratings of cognitive load.
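
    The effect size reported here appears to be a Cohen's d; assuming the conventional form based on the pooled standard deviation of the two measurements (the exact estimator used by the authors is not stated), the reported numbers are reproduced:

        d = \frac{\bar{x}_{t_3} - \bar{x}_{t_2}}{\sqrt{(s_{t_2}^{2} + s_{t_3}^{2})/2}}
          \approx \frac{6.90 - 6.62}{\sqrt{(1.12^{2} + 1.05^{2})/2}} \approx 0.26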

  9. Improving interprofessional competence in undergraduate students using a novel blended learning approach.

    PubMed

    Riesen, Eleanor; Morley, Michelle; Clendinneng, Debra; Ogilvie, Susan; Ann Murray, Mary

    2012-07-01

    Interprofessional simulation interventions, especially when face-to-face, involve considerable resources and require that all participants convene in a single location at a specific time. Scheduling multiple people across different programs is an important barrier to implementing interprofessional education interventions. This study explored a novel way to overcome the challenges associated with scheduling interprofessional learning experiences through the use of simulations in a virtual environment (Web.Alive™) where learners interact as avatars. In this study, 60 recent graduates from nursing, paramedic, police, and child and youth service programs participated in a 2-day workshop designed to improve interprofessional competencies through a blend of learning environments that included virtual face-to-face experiences, traditional face-to-face experiences and online experiences. Changes in learners' interprofessional competence were assessed through three outcomes: change in interprofessional attitudes pre- to post-workshop, self-perceived changes in interprofessional competence and observer ratings of performance across three clinical simulations. Results from the study indicate that from baseline to post-intervention, there was significant improvement in learners' interprofessional competence across all outcomes, and that the blended learning environment provided an acceptable way to develop these competencies.

  10. 60 seconds to survival: A pilot study of a disaster triage video game for prehospital providers.

    PubMed

    Cicero, Mark X; Whitfill, Travis; Munjal, Kevin; Madhok, Manu; Diaz, Maria Carmen G; Scherzer, Daniel J; Walsh, Barbara M; Bowen, Angela; Redlener, Michael; Goldberg, Scott A; Symons, Nadine; Burkett, James; Santos, Joseph C; Kessler, David; Barnicle, Ryan N; Paesano, Geno; Auerbach, Marc A

    2017-01-01

    Disaster triage training for emergency medical service (EMS) providers is not standardized. Simulation training is costly and time-consuming. In contrast, educational video games enable low-cost and more time-efficient standardized training. We hypothesized that players of the video game "60 Seconds to Survival" (60S) would have greater improvements in disaster triage accuracy compared to control subjects who did not play 60S. Participants recorded their demographics and highest EMS training level and were randomized to play 60S (intervention) or serve as controls. At baseline, all participants completed a live school-shooting simulation in which manikins and standardized patients depicted 10 adult and pediatric victims. The intervention group then played 60S at least three times over the course of 13 weeks (time 2). Players triaged 12 patients in three scenarios (school shooting, house fire, tornado), and received in-game performance feedback. At time 2, the same live simulation was conducted for all participants. Controls had no disaster training during the study. The main outcome was improvement in triage accuracy in live simulations from baseline to time 2. Physicians and EMS providers predetermined expected triage level (RED/YELLOW/GREEN/BLACK) via modified Delphi method. There were 26 participants in the intervention group and 21 in the control group. There was no difference in gender, level of training, or years of EMS experience (median 5.5 years intervention, 3.5 years control, p = 0.49) between the groups. At baseline, both groups demonstrated median triage accuracy of 80 percent (IQR 70-90 percent, p = 0.457). At time 2, the intervention group had a significant improvement from baseline (median accuracy = 90 percent [IQR: 80-90 percent], p = 0.005), while the control group did not (median accuracy = 80 percent [IQR: 80-95 percent], p = 0.174). However, the mean improvement from baseline was not significant between the two groups (difference = 6.5, p = 0.335). The intervention group demonstrated a significant improvement in accuracy from baseline to time 2, while the control group did not. However, there was no significant difference in the improvement between the intervention and control groups. These results may be due to small sample size. Future directions include assessment of the game's effect on triage accuracy with a larger, multisite cohort and iterative development to improve 60S.

  11. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 2: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  12. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  13. Optimization of the Neutrino Factory, revisited

    NASA Astrophysics Data System (ADS)

    Agarwalla, Sanjib K.; Huber, Patrick; Tang, Jian; Winter, Walter

    2011-01-01

    We perform the baseline and energy optimization of the Neutrino Factory including the latest simulation results on the magnetized iron detector (MIND). We also consider the impact of τ decays, generated by νμ → ντ or νe → ντ appearance, on the mass hierarchy, CP violation, and θ13 discovery reaches, which we find to be negligible for the considered detector. For the baseline-energy optimization for small sin²2θ13, we qualitatively recover the results with earlier simulations of the MIND detector. We find optimal baselines of about 2500 km to 5000 km for the CP violation measurement, where now values of Eμ as low as about 12 GeV may be possible. However, for large sin²2θ13, we demonstrate that the lower threshold and the backgrounds reconstructed at lower energies allow in fact for muon energies as low as 5 GeV at considerably shorter baselines, such as FNAL-Homestake. This implies that with the latest MIND analysis, low- and high-energy versions of the Neutrino Factory are just two different versions of the same experiment optimized for different parts of the parameter space. Apart from a green-field study of the updated detector performance, we discuss specific implementations for the two-baseline Neutrino Factory, where the considered detector sites are taken to be currently discussed underground laboratories. We find that reasonable setups can be found for the Neutrino Factory source in Asia, Europe, and North America, and that a triangular-shaped storage ring is possible in all cases based on geometrical arguments only.
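
    The baseline-energy trade-off in such optimizations is driven largely by where the first oscillation maximum falls. Using the conventional practical form of the oscillation phase (a standard relation, not a result of the paper), the first maximum occurs when

        \frac{1.267\,\Delta m_{31}^{2}\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]} = \frac{\pi}{2}
        \;\;\Longrightarrow\;\; L/E \approx 500~\mathrm{km/GeV}
        \quad\text{for}\quad \Delta m_{31}^{2} \approx 2.5\times 10^{-3}~\mathrm{eV^2},

    so baselines of 2500-5000 km are naturally matched to neutrino energies of a few GeV and above, consistent with the muon energies discussed in the abstract.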

  14. Simulation-based education with mastery learning improves residents' lumbar puncture skills

    PubMed Central

    Cohen, Elaine R.; Caprio, Timothy; McGaghie, William C.; Simuni, Tanya; Wayne, Diane B.

    2012-01-01

    Objective: To evaluate the effect of simulation-based mastery learning (SBML) on internal medicine residents' lumbar puncture (LP) skills, assess neurology residents' acquired LP skills from traditional clinical education, and compare the results of SBML to traditional clinical education. Methods: This study was a pretest-posttest design with a comparison group. Fifty-eight postgraduate year (PGY) 1 internal medicine residents received an SBML intervention in LP. Residents completed a baseline skill assessment (pretest) using a 21-item LP checklist. After a 3-hour session featuring deliberate practice and feedback, residents completed a posttest and were expected to meet or exceed a minimum passing score (MPS) set by an expert panel. Simulator-trained residents' pretest and posttest scores were compared to assess the impact of the intervention. Thirty-six PGY2, 3, and 4 neurology residents from 3 medical centers completed the same simulated LP assessment without SBML. SBML posttest scores were compared to neurology residents' baseline scores. Results: PGY1 internal medicine residents improved from a mean of 46.3% to 95.7% after SBML (p < 0.001) and all met the MPS at final posttest. The performance of traditionally trained neurology residents was significantly lower than simulator-trained residents (mean 65.4%, p < 0.001) and only 6% met the MPS. Conclusions: Residents who completed SBML showed significant improvement in LP procedural skills. Few neurology residents were competent to perform a simulated LP despite clinical experience with the procedure. PMID:22675080

  15. Simulation and analyses of the aeroassist flight experiment attitude update method

    NASA Technical Reports Server (NTRS)

    Carpenter, J. R.

    1991-01-01

    A method which will be used to update the alignment of the Aeroassist Flight Experiment's Inertial Measuring Unit is simulated and analyzed. This method, the Star Line Maneuver, uses measurements from the Space Shuttle Orbiter star trackers along with an extended Kalman filter to estimate a correction to the attitude quaternion maintained by an Inertial Measuring Unit in the Orbiter's payload bay. This quaternion is corrupted by on-orbit bending of the Orbiter payload bay with respect to the Orbiter navigation base, which is incorporated into the payload quaternion when it is initialized via a direct transfer of the Orbiter attitude state. The method of updating this quaternion is examined through verification of baseline cases and Monte Carlo analysis using a simplified simulation. The simulation uses nominal state dynamics and measurement models from the Kalman filter as its real-world models and is programmed on a MicroVAX minicomputer using MATLAB, an interactive matrix analysis tool. Results are presented which confirm and augment previous performance studies, thereby enhancing confidence in the Star Line Maneuver design methodology.
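
    To illustrate the filtering step at the heart of such an attitude update, the sketch below performs one generic Kalman measurement update on a small-angle misalignment state driven by a star-tracker residual; the matrices and noise levels are assumed placeholders, not the actual Star Line Maneuver filter or its models.

      import numpy as np

      # Minimal sketch (not the AFE flight software): a single Kalman measurement
      # update estimating a small-angle misalignment 'x' (3-vector, rad) of a
      # payload-bay IMU from a star-tracker attitude residual 'z'.
      def kalman_update(x, P, z, H, R):
          S = H @ P @ H.T + R                   # innovation covariance
          K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
          x_new = x + K @ (z - H @ x)           # corrected misalignment estimate
          P_new = (np.eye(len(x)) - K @ H) @ P  # updated covariance
          return x_new, P_new

      x0 = np.zeros(3)                          # prior misalignment estimate (rad)
      P0 = np.diag([1e-4, 1e-4, 1e-4])          # prior covariance (assumed)
      H = np.eye(3)                             # residual assumed to observe misalignment directly
      R = np.diag([1e-8, 1e-8, 1e-8])           # star-tracker noise covariance (assumed)
      z = np.array([3e-3, -1e-3, 2e-3])         # measured attitude residual (rad)
      print(kalman_update(x0, P0, z, H, R)[0])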

  16. Light field geometry of a Standard Plenoptic Camera.

    PubMed

    Hahne, Christopher; Aggoun, Amar; Haxha, Shyqyri; Velisavljevic, Vladan; Fernández, Juan Carlos Jácome

    2014-11-03

    The Standard Plenoptic Camera (SPC) is an innovation in photography that allows two-dimensional images focused at different depths to be acquired from a single exposure. In contrast to conventional cameras, the SPC consists of a micro lens array and a main lens projecting virtual lenses into object space. For the first time, the present research provides an approach to estimate the distance and depth of refocused images extracted from captures obtained by an SPC. Furthermore, estimates for the position and baseline of virtual lenses which correspond to an equivalent camera array are derived. On the basis of the paraxial approximation, a ray tracing model employing linear equations has been developed and implemented using MATLAB. The optics simulation tool Zemax is utilized for validation purposes. Experiments with a realistically designed SPC demonstrate that a predicted image refocusing distance at 3.5 m deviates by less than 11% from the Zemax simulation, whereas baseline estimations indicate no significant difference. Applying the proposed methodology enables an alternative to traditional depth map acquisition by disparity analysis.
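
    The paraxial, linear-equation ray model referenced above can be illustrated with standard ABCD ray-transfer matrices; the focal lengths and spacing below are placeholder values, not the SPC design evaluated in the paper.

      import numpy as np

      # Paraxial (ABCD) ray tracing through a main lens, a gap, and one micro lens.
      def thin_lens(f):
          return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

      def propagate(d):
          return np.array([[1.0, d], [0.0, 1.0]])

      f_main, f_micro = 0.05, 0.002     # focal lengths in meters (assumed)
      gap = 0.052                       # main lens to micro lens spacing in meters (assumed)
      system = thin_lens(f_micro) @ propagate(gap) @ thin_lens(f_main)

      ray = np.array([0.001, 0.02])     # [height (m), angle (rad)] entering the main lens
      print(system @ ray)               # ray state just after the micro lens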

  17. Dynamic performance of an aero-assist spacecraft - AFE

    NASA Technical Reports Server (NTRS)

    Chang, Ho-Pen; French, Raymond A.

    1992-01-01

    Dynamic performance of the Aero-assist Flight Experiment (AFE) spacecraft was investigated using a high-fidelity 6-DOF simulation model. Baseline guidance logic, control logic, and a strapdown navigation system to be used on the AFE spacecraft are also modeled in the 6-DOF simulation. During the AFE mission, uncertainties in the environment and the spacecraft are described by an error space which includes both correlated and uncorrelated error sources. The principal error sources modeled in this study include navigation errors, initial state vector errors, atmospheric variations, aerodynamic uncertainties, center-of-gravity offsets, and weight uncertainties. The impact of the perturbations on spacecraft performance is investigated using Monte Carlo repetitive statistical techniques. During the Solid Rocket Motor (SRM) deorbit phase, a target flight path angle of -4.76 deg at entry interface (EI) offers a very high probability of avoiding SRM casing skip-out from the atmosphere. Generally speaking, the baseline designs of the guidance, navigation, and control systems satisfy most of the science and mission requirements.
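
    A toy Monte Carlo dispersion run of the kind described above might look like the following; the error magnitudes and the tolerance band are invented for illustration and are not the AFE error space or 6-DOF model.

      import numpy as np

      # Perturb a nominal entry flight-path angle with assumed, independent error
      # contributions and estimate how often the dispersed value stays within a band.
      rng = np.random.default_rng(0)
      n = 10000
      nominal_fpa = -4.76                              # deg at entry interface
      nav_err  = rng.normal(0.0, 0.03, n)              # navigation error, deg (assumed)
      burn_err = rng.normal(0.0, 0.05, n)              # deorbit-burn execution error, deg (assumed)
      atmo_err = rng.normal(0.0, 0.02, n)              # atmosphere/aero mismodeling, deg (assumed)
      fpa = nominal_fpa + nav_err + burn_err + atmo_err
      print(fpa.mean(), fpa.std(), (np.abs(fpa - nominal_fpa) < 0.15).mean())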

  18. A preliminary 6 DOF attitude and translation control system design for Starprobe

    NASA Technical Reports Server (NTRS)

    Mak, P.; Mettler, E.; Vijayarahgavan, A.

    1981-01-01

    The extreme thermal environment near perihelion and the high-accuracy gravitational science experiments impose unique design requirements on various subsystems of Starprobe. This paper examines some of these requirements and their impact on the preliminary design of a six-degree-of-freedom attitude and translational control system. Attention is given to design considerations, the baseline attitude/translational control system, system modeling, and simulation studies.

  19. Risk as Feelings in the Effect of Patient Outcomes on Physicians' Subsequent Treatment Decisions: A Randomized Trial and Manipulation Validation

    PubMed Central

    Hemmerich, Joshua A; Elstein, Arthur S; Schwarze, Margaret L; Moliski, Elizabeth G; Dale, William

    2013-01-01

    The present study tested predictions derived from the Risk as Feelings hypothesis about the effects of prior patients' negative treatment outcomes on physicians' subsequent treatment decisions. Two experiments at The University of Chicago, U.S.A., utilized a computer simulation of an abdominal aortic aneurysm (AAA) patient with enhanced realism to present participants with one of three experimental conditions: AAA rupture causing a watchful waiting death (WWD), perioperative death (PD), or a successful operation (SO), as well as the statistical treatment guidelines for AAA. Experiment 1 tested effects of these simulated outcomes on (n=76) laboratory participants' (university student sample) self-reported emotions, and their ratings of valence and arousal of the AAA rupture simulation and other emotion inducing picture stimuli. Experiment 2 tested two hypotheses: 1) that experiencing a patient WWD in the practice trial's experimental condition would lead physicians to choose surgery earlier, and 2) experiencing a patient PD would lead physicians to choose surgery later with the next patient. Experiment 2 presented (n=132) physicians (surgeons and geriatricians) with the same experimental manipulation and a second simulated AAA patient. Physicians then chose to either go to surgery or continue watchful waiting. The results of Experiment 1 demonstrated that the WWD experimental condition significantly increased anxiety, and was rated similarly to other negative and arousing pictures. The results of Experiment 2 demonstrated that, after controlling for demographics, baseline anxiety, intolerance for uncertainty, risk attitudes, and the influence of simulation characteristics, the WWD experimental condition significantly expedited decisions to choose surgery for the next patient. The results support the Risk as Feelings hypothesis on physicians' treatment decisions in a realistic AAA patient computer simulation. Bad outcomes affected emotions and decisions, even with statistical AAA rupture risk guidance present. These results suggest that bad patient outcomes cause physicians to experience anxiety and regret that influences their subsequent treatment decision-making for the next patient. PMID:22571890

  20. Risk as feelings in the effect of patient outcomes on physicians' future treatment decisions: a randomized trial and manipulation validation.

    PubMed

    Hemmerich, Joshua A; Elstein, Arthur S; Schwarze, Margaret L; Moliski, Elizabeth Ghini; Dale, William

    2012-07-01

    The present study tested predictions derived from the Risk as Feelings hypothesis about the effects of prior patients' negative treatment outcomes on physicians' subsequent treatment decisions. Two experiments at The University of Chicago, U.S.A., utilized a computer simulation of an abdominal aortic aneurysm (AAA) patient with enhanced realism to present participants with one of three experimental conditions: AAA rupture causing a watchful waiting death (WWD), perioperative death (PD), or a successful operation (SO), as well as the statistical treatment guidelines for AAA. Experiment 1 tested effects of these simulated outcomes on (n = 76) laboratory participants' (university student sample) self-reported emotions, and their ratings of valence and arousal of the AAA rupture simulation and other emotion-inducing picture stimuli. Experiment 2 tested two hypotheses: 1) that experiencing a patient WWD in the practice trial's experimental condition would lead physicians to choose surgery earlier, and 2) experiencing a patient PD would lead physicians to choose surgery later with the next patient. Experiment 2 presented (n = 132) physicians (surgeons and geriatricians) with the same experimental manipulation and a second simulated AAA patient. Physicians then chose to either go to surgery or continue watchful waiting. The results of Experiment 1 demonstrated that the WWD experimental condition significantly increased anxiety, and was rated similarly to other negative and arousing pictures. The results of Experiment 2 demonstrated that, after controlling for demographics, baseline anxiety, intolerance for uncertainty, risk attitudes, and the influence of simulation characteristics, the WWD experimental condition significantly expedited decisions to choose surgery for the next patient. The results support the Risk as Feelings hypothesis on physicians' treatment decisions in a realistic AAA patient computer simulation. Bad outcomes affected emotions and decisions, even with statistical AAA rupture risk guidance present. These results suggest that bad patient outcomes cause physicians to experience anxiety and regret that influences their subsequent treatment decision-making for the next patient. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Use of a national continuing medical education meeting to provide simulation-based training in temporary hemodialysis catheter insertion skills: a pre-test post-test study.

    PubMed

    Clark, Edward G; Paparello, James J; Wayne, Diane B; Edwards, Cedric; Hoar, Stephanie; McQuillan, Rory; Schachter, Michael E; Barsuk, Jeffrey H

    2014-01-01

    Simulation-based-mastery-learning (SBML) is an effective method to train nephrology fellows to competently insert temporary, non-tunneled hemodialysis catheters (NTHCs). Previous studies of SBML for NTHC-insertion have been conducted at a local level. Determine if SBML for NTHC-insertion can be effective when provided at a national continuing medical education (CME) meeting. Describe the correlation of demographic factors, prior experience with NTHC-insertion and procedural self-confidence with simulated performance of the procedure. Pre-test - post-test study. 2014 Canadian Society of Nephrology annual meeting. Nephrology fellows, internal medicine residents and medical students. Participants were surveyed regarding demographics, prior NTHC-insertion experience, procedural self-confidence and attitudes regarding the training they received. NTHC-insertion skills were assessed using a 28-item checklist. Participants underwent a pre-test of their NTHC-insertion skills at the internal jugular site using a realistic patient simulator and ultrasound machine. Participants then had a training session that included a didactic presentation and 2 hours of deliberate practice using the simulator. On the following day, trainees completed a post-test of their NTHC-insertion skills. All participants were required to meet or exceed a minimum passing score (MPS) previously set at 79%. Trainees who did not reach the MPS were required to perform more deliberate practice until the MPS was achieved. Twenty-two individuals participated in SBML training. None met or exceeded the MPS at baseline with a median checklist score of 20 (IQR, 7.25 to 21). Seventeen of 22 participants (77%) completed post-testing and improved their scores to a median of 27 (IQR, 26 to 28; p < 0.001). All met or exceeded the MPS on their first attempt. There were no significant correlations between demographics, prior experience or procedural self-confidence with pre-test performance. Small sample-size and self-selection of participants. Costs could limit the long-term feasibility of providing this type of training at a CME conference. Despite most participants reporting having previously inserted NTHCs in clinical practice, none met the MPS at baseline; this suggests their prior training may have been inadequate.

  2. Implementation of Slater Boundary Condition into OVERFLOW

    NASA Astrophysics Data System (ADS)

    Duncan, Sean

    Bleed is one of the primary methods of controlling the flow within a mixed compression inlet. In this work the Slater boundary condition, first applied in Wind-US, is implemented in OVERFLOW. In addition, a simulation using discrete bleed holes is run to show the differences between use of the boundary condition and use of the bleed hole geometry. Recent tests at Wright-Patterson Air Force Base seek to provide a baseline for the study of mixed compression inlets. The inlet used by the Air Force Research Laboratory is simulated in the modified OVERFLOW. The results from the experiment are compared to the CFD to qualitatively assess the accuracy of the simulations. The boundary condition is shown to be robust and viable for studying bleed.

  3. Dark focus of accommodation as dependent and independent variables in visual display technology

    NASA Technical Reports Server (NTRS)

    Jones, Sherrie; Kennedy, Robert; Harm, Deborah

    1992-01-01

    When inadequate stimuli are available for accommodation, as in the dark or under low-contrast conditions, the lens seeks its resting position. Individual differences in resting positions are reliable, under autonomic control, and can change with visual task demands. We hypothesized that motion sickness in a flight simulator might result in dark focus changes. Method: Subjects received training flights in three different Navy flight simulators. Two were helicopter simulators that entailed CRT presentation using infinity optics; one involved a dome presentation of a computer graphic visual projection system. Results: In all three experiments there were significant differences between dark focus activity before and after simulator exposure when comparisons were made between sick and not-sick pilot subjects. In two of these experiments, the average shift in dark focus for the sick subjects was toward increased myopia when each subject was compared to his own baseline. In the third experiment, the group showed a small average outward shift, and the subjects who were sick showed significantly less outward movement than those who were symptom free. Conclusions: Although the relationship is not a simple one, dark focus changes in simulator sickness imply parasympathetic activity. Because changes can occur in relation to endogenous and exogenous events, such measurement may have useful applications as dependent measures in studies of visually coupled systems, virtual reality systems, and space adaptation syndrome.

  4. The effect of human patient simulation on critical thinking and its predictors in prelicensure nursing students.

    PubMed

    Shinnick, Mary Ann; Woo, Mary A

    2013-09-01

    Human patient simulation (HPS) is becoming a popular teaching method in nursing education globally and is believed to enhance both knowledge and critical thinking. While there is evidence that HPS improves knowledge, there is no objective nursing data to support HPS impact on critical thinking. Therefore, we studied knowledge and critical thinking before and after HPS in prelicensure nursing students and attempted to identify the predictors of higher critical thinking scores. Using a one-group, quasi-experimental, pre-test post-test design, 154 prelicensure nursing students (age 25.7± 6.7; gender=87.7% female) from 3 schools were studied at the same point in their curriculum using a high-fidelity simulation. Pre- and post-HPS assessments of knowledge, critical thinking, and self-efficacy were done as well as assessments for demographics and learning style. There was a mean improvement in knowledge scores of 6.5 points (P<0.001), showing evidence of learning. However, there was no statistically significant change in the critical thinking scores. A logistic regression with 10 covariates revealed three variables to be predictors of higher critical thinking scores: greater "age" (P=0.01), baseline "knowledge" (P=0.04) and a low self-efficacy score ("not at all confident") in "baseline self-efficacy in managing a patient's fluid levels" (P=.05). This study reveals that gains in knowledge with HPS do not equate to changes in critical thinking. It does expose the variables of older age, higher baseline knowledge and low self-efficacy in "managing a patient's fluid levels" as being predictive of higher critical thinking ability. Further study is warranted to determine the effect of repeated or sequential simulations (dosing) and timing after the HPS experience on critical thinking gains. Copyright © 2012 Elsevier Ltd. All rights reserved.
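
    A logistic regression of the kind reported above can be sketched as follows; the dataframe, column names, and coefficients are synthetic stand-ins, not the study data or its ten covariates.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Synthetic example: predict a binary "higher critical thinking" outcome from
      # age, baseline knowledge, and a fluid-management self-efficacy rating.
      rng = np.random.default_rng(1)
      n = 154
      df = pd.DataFrame({
          "age": rng.normal(25.7, 6.7, n),
          "baseline_knowledge": rng.normal(60.0, 10.0, n),
          "fluid_self_eff": rng.integers(1, 5, n),   # 1 = "not at all confident"
      })
      lin = (0.05 * (df["age"] - 26) + 0.04 * (df["baseline_knowledge"] - 60)
             - 0.3 * (df["fluid_self_eff"] - 2))
      df["high_ct"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

      model = smf.logit("high_ct ~ age + baseline_knowledge + fluid_self_eff", data=df).fit()
      print(model.params)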

  5. Will the Playstation generation become better endoscopic surgeons?

    PubMed

    van Dongen, Koen W; Verleisdonk, Egbert-Jan M M; Schijven, Marlies P; Broeders, Ivo A M J

    2011-07-01

    A frequently heard comment is that the current "Playstation generation" will have superior baseline psychomotor skills. However, research has provided inconsistent results on this matter. The purpose of this study was to investigate whether the "Playstation generation" shows superior baseline psychomotor skills for endoscopic surgery on a virtual reality simulator. The 46 study participants were interns (mean age 24 years) of the department of surgery and schoolchildren (mean age 12.5 years) in the first year of a secondary school. Participants were divided into four groups: 10 interns with videogame experience and 10 without, and 13 schoolchildren with videogame experience and 13 without. They performed four tasks twice on a virtual reality simulator for basic endoscopic skills. One-way analysis of variance (ANOVA) with post hoc Tukey-Bonferroni testing and the independent Student's t test were used to determine differences in mean scores. Interns with videogame experience scored significantly higher on total score (93 vs. 74.5; p=0.014) than interns without this experience, as well as on efficiency (50.7 vs. 38.9; p=0.011) and speed (18.8 vs. 14.3; p=0.023). There was a nonsignificant difference in mean total scores between schoolchildren with and without videogame experience (61.69 vs. 55.46; p=0.411), and no statistical difference for efficiency (32.69 vs. 27.31; p=0.218) or speed (13.92 vs. 13.15; p=0.54). The scores on precision parameters did not differ for interns (23.5 vs. 21.3; p=0.79) or for schoolchildren (15.08 vs. 15; p=0.979). Our study results did not show an advantage of videogame experience in children with regard to superior psychomotor skills for endoscopic surgery. However, at adult age, a difference in favor of gaming is present. The next generation of surgeons might benefit from videogame experience during their childhood.
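
    The group comparisons above rest on an independent-samples t test and a one-way ANOVA; a minimal sketch with synthetic scores (means loosely matching those quoted, spreads assumed) follows.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      interns_gamers    = rng.normal(93.0, 15.0, 10)   # synthetic total scores
      interns_nongamers = rng.normal(74.5, 15.0, 10)
      kids_gamers       = rng.normal(61.7, 15.0, 13)
      kids_nongamers    = rng.normal(55.5, 15.0, 13)

      print(stats.ttest_ind(interns_gamers, interns_nongamers))   # two-group comparison
      print(stats.f_oneway(interns_gamers, interns_nongamers,
                           kids_gamers, kids_nongamers))          # one-way ANOVA across all four groups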

  6. Impact of Assimilation on Heavy Rainfall Simulations Using WRF Model: Sensitivity of Assimilation Results to Background Error Statistics

    NASA Astrophysics Data System (ADS)

    Rakesh, V.; Kantharao, B.

    2017-03-01

    Data assimilation is considered one of the effective tools for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing the data assimilation methodology. One of the critical components that determines how much observation information enters the analysis, and how it propagates, is the model background error statistics (BES). The objective of this study is to quantify how the BES used in data assimilation affect the simulation of heavy rainfall events over Karnataka, a southern state in India. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall is verified against high-resolution rain-gage observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation improved the heavy rainfall simulation. Our results showed that the experiment using regional BES outperformed the one which used global BES. Critical thermodynamic variables conducive to heavy rainfall, such as convective available potential energy, are more realistic when simulated using regional BES than with global BES. These results have important practical implications for the design of forecast platforms and for decision-making during extreme weather events.
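
    The role of the background error statistics can be seen in the generic statistical analysis update below (an optimal interpolation / 3D-Var form, not the WRF assimilation implementation); a broader background error correlation spreads a single observation's influence to neighboring grid points.

      import numpy as np

      n = 5
      x_b = np.zeros(n)                                    # background state
      dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
      B = np.exp(-(dist / 2.0) ** 2)                       # correlated background errors (assumed length scale)
      H = np.zeros((1, n)); H[0, 2] = 1.0                  # one observation at grid point 2
      R = np.array([[0.25]])                               # observation error variance (assumed)
      y = np.array([1.0])                                  # observed value

      K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)         # gain matrix set by B and R
      x_a = x_b + (K @ (y - H @ x_b)).ravel()              # analysis: nearby points adjust too
      print(x_a)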

  7. Controlled experiments for dense gas diffusion: Experimental design and execution, model comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egami, R.; Bowen, J.; Coulombe, W.

    1995-07-01

    A baseline CO2 release experiment at the DOE Spill Test Facility on the Nevada Test Site in Southern Nevada is described. This experiment was unique in its use of CO2 as a surrogate gas representative of a variety of specific chemicals. Introductory discussion places the experiment in historical perspective. CO2 was selected as a surrogate gas to provide a data base suitable for evaluation of model scenarios involving a variety of specific dense gases. The experiment design and setup are described, including the design rationale and the quality assurance methods employed. The resulting experimental data are summarized. Data usefulness is examined through a preliminary comparison of experimental results with simulations performed using the SLAB and DEGADIS dense gas models.

  8. Flight Simulator Evaluation of Synthetic Vision Display Concepts to Prevent Controlled Flight Into Terrain (CFIT)

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Kramer, Lynda J.; Parrish, Russell V.; Bailey, Randall E.

    2004-01-01

    In commercial aviation, over 30 percent of all fatal accidents worldwide are categorized as Controlled Flight Into Terrain (CFIT) accidents, in which a fully functioning airplane is inadvertently flown into the ground. The major hypothesis for a simulation experiment conducted at NASA Langley Research Center was that a Primary Flight Display (PFD) with synthetic terrain would improve pilots' ability to detect and avoid potential CFITs compared to conventional instrumentation. All display conditions, including the baseline, contained a Terrain Awareness and Warning System (TAWS) and a Vertical Situation Display (VSD) enhanced Navigation Display (ND). Each pilot flew twenty-two approach and departure maneuvers in Instrument Meteorological Conditions (IMC) to the terrain-challenged Eagle County Regional Airport (EGE) in Colorado. For the final run, flight guidance cues were altered such that the departure path went into terrain. All pilots with a synthetic vision system (SVS) PFD (twelve of sixteen pilots) noticed and avoided the potential CFIT situation. The four pilots who flew the anomaly with the conventional baseline PFD configuration (which included a TAWS and VSD enhanced ND) had a CFIT event. Additionally, all the SVS display concepts enhanced the pilots' situational awareness, decreased workload, and improved flight technical error (FTE) compared to the baseline configuration.

  9. Relating a Jet-Surface Interaction Experiment to a Commercial Supersonic Transport Aircraft Using Numerical Simulations

    NASA Technical Reports Server (NTRS)

    Dippold, Vance F. III; Friedlander, David

    2017-01-01

    Reynolds-Averaged Navier-Stokes (RANS) simulations were performed for a commercial supersonic transport aircraft concept and experimental hardware models designed to represent the installed propulsion system of the conceptual aircraft in an upcoming test campaign. The purpose of the experiment is to determine the effects of jet-surface interactions from supersonic aircraft on airport community noise. RANS simulations of the commercial supersonic transport aircraft concept were performed to relate the representative experimental hardware to the actual aircraft. RANS screening simulations were performed on the proposed test hardware to verify that it would be free from potential rig noise and to predict the aerodynamic forces on the model hardware to assist with structural design. The simulations showed a large region of separated flow formed in a junction region of one of the experimental configurations. This was dissimilar to the simulations of the aircraft and could invalidate the noise measurements. This configuration was modified and a subsequent RANS simulation showed that the size of the flow separation was greatly reduced. The aerodynamic forces found on the experimental models were found to be relatively small when compared to the expected loads from the model's own weight.

    RANS simulations were also completed for two configurations of a three-stream inverted velocity profile (IVP) nozzle and a baseline single-stream round nozzle (mixed-flow equivalent conditions). For the Sideline and Cutback flow conditions, while the IVP nozzles did not reduce the peak turbulent kinetic energy on the lower side of the jet plume, the IVP nozzles did significantly reduce the size of the region of peak turbulent kinetic energy when compared to the jet plume of the baseline nozzle cases. The IVP nozzle at Sideline conditions did suffer a region of separated flow from the inner stream nozzle splitter that produced an intense, but small, region of turbulent kinetic energy in the vicinity of the nozzle exit. When viewed with the understanding that jet noise is directly related to turbulent kinetic energy, these IVP nozzle simulations show the potential to reduce noise to observers located below the nozzle. However, these RANS simulations also show that some modifications may be needed to prevent the small region of separated flow-induced turbulent kinetic energy from the inner stream nozzle splitter at Sideline conditions.

  10. Modal identification experiment

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G.

    1992-01-01

    The Modal Identification Experiment (MIE) is a proposed on-orbit experiment being developed by NASA's Office of Aeronautics and Space Technology wherein a series of vibration measurements would be made on various configurations of Space Station Freedom (SSF) during its on-orbit assembly phase. The experiment is to be conducted in conjunction with station reboost operations and consists of measuring the dynamic responses of the spacecraft produced by the station-based attitude control system and reboost thrusters, recording and transmitting the data, and processing the data on the ground to identify the natural frequencies, damping factors, and shapes of significant vibratory modes. The experiment would likely be a part of the Space Station on-orbit verification. Basic research objectives of MIE are to evaluate and improve methods for analytically modeling large space structures, to develop techniques for performing in-space modal testing, and to validate candidate techniques for in-space modal identification. From an engineering point of view, MIE will provide the first opportunity to obtain vibration data for the fully-assembled structure because SSF is too large and too flexible to be tested as a single unit on the ground. Such full-system data is essential for validating the analytical model of SSF which would be used in any engineering efforts associated with structural or control system changes that might be made to the station as missions evolve over time. Extensive analytical simulations of on-orbit tests, as well as exploratory laboratory simulations using small-scale models, have been conducted in-house and under contract to develop a measurement plan and evaluate its potential performance. In particular, performance trade and parametric studies conducted as part of these simulations were used to resolve issues related to the number and location of the measurements, the type of excitation, data acquisition and data processing, effects of noise and nonlinearities, selection of target vibration modes, and the appropriate type of data analysis scheme. The purpose of this talk is to provide an executive-summary-type overview of the modal identification experiment which has emerged from the conceptual design studies conducted to date. Emphasis throughout is on those aspects of the experiment which should be of interest to those attending the subject utilization conference. The presentation begins with some preparatory remarks to provide background and motivation for the experiment, describe the experiment in general terms, and cite the specific technical objectives. This is followed by a summary of the major results of the conceptual design studies conducted to define the baseline experiment. The baseline experiment which has resulted from the studies is then described.
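
    As a simplified stand-in for the modal identification described above, the sketch below extracts a natural frequency from the spectral peak of a synthetic single-mode free-decay response and a damping ratio from its logarithmic decrement; the signal is invented, not MIE or SSF data.

      import numpy as np
      from scipy.signal import find_peaks

      fs = 50.0                                   # sample rate (Hz)
      t = np.arange(0.0, 60.0, 1.0 / fs)
      fn, zeta = 0.8, 0.02                        # true modal frequency (Hz) and damping ratio
      wn = 2.0 * np.pi * fn
      y = np.exp(-zeta * wn * t) * np.cos(wn * np.sqrt(1 - zeta**2) * t)

      freqs = np.fft.rfftfreq(len(y), 1.0 / fs)
      f_est = freqs[np.argmax(np.abs(np.fft.rfft(y)))]   # frequency from the spectral peak

      pk, _ = find_peaks(y)                               # successive response peaks
      delta = np.log(y[pk[0]] / y[pk[5]]) / 5.0           # log decrement over 5 cycles
      zeta_est = delta / np.sqrt(4.0 * np.pi**2 + delta**2)
      print(f_est, zeta_est)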

  11. Seasonal Parameterizations of the Tau-Omega Model Using the ComRAD Ground-Based SMAP Simulator

    NASA Technical Reports Server (NTRS)

    O'Neill, P.; Joseph, A.; Srivastava, P.; Cosh, M.; Lang, R.

    2014-01-01

    NASA's Soil Moisture Active Passive (SMAP) mission is scheduled for launch in November 2014. In the prelaunch time frame, the SMAP team has focused on improving retrieval algorithms for the various SMAP baseline data products. The SMAP passive-only soil moisture product depends on accurate parameterization of the tau-omega model to achieve the required accuracy in soil moisture retrieval. During a field experiment (APEX12) conducted in the summer of 2012 under dry conditions in Maryland, the Combined Radar/Radiometer (ComRAD) truck-based SMAP simulator collected active/passive microwave time series data at the SMAP incident angle of 40 degrees over corn and soybeans throughout the crop growth cycle. A similar experiment was conducted only over corn in 2002 under normal moist conditions. Data from these two experiments will be analyzed and compared to evaluate how changes in vegetation conditions throughout the growing season in both a drought and normal year can affect parameterizations in the tau-omega model for more accurate soil moisture retrieval.
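
    The zeroth-order tau-omega forward model referred to above is commonly written as TB = Ts(1 - rp)γ + Tc(1 - ω)(1 - γ)(1 + rpγ) with γ = exp(-τ/cosθ); a small sketch follows, with placeholder parameter values rather than the ComRAD/APEX12 calibration.

      import numpy as np

      def tau_omega_tb(T_soil, T_canopy, r_p, tau, omega, theta_deg):
          """Brightness temperature (K) at polarization p, zeroth-order tau-omega model."""
          gamma = np.exp(-tau / np.cos(np.radians(theta_deg)))   # canopy transmissivity
          soil_term = T_soil * (1.0 - r_p) * gamma               # attenuated soil emission
          veg_term = (T_canopy * (1.0 - omega) * (1.0 - gamma)
                      * (1.0 + r_p * gamma))                     # direct + soil-reflected canopy emission
          return soil_term + veg_term

      # Placeholder values at the SMAP incidence angle of 40 degrees.
      print(tau_omega_tb(T_soil=300.0, T_canopy=298.0, r_p=0.25, tau=0.12, omega=0.05, theta_deg=40.0))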

  12. A comparison of different ways of including baseline counts in negative binomial models for data from falls prevention trials.

    PubMed

    Zheng, Han; Kimber, Alan; Goodwin, Victoria A; Pickering, Ruth M

    2018-01-01

    A common design for a falls prevention trial is to assess falling at baseline, randomize participants into an intervention or control group, and ask them to record the number of falls they experience during a follow-up period of time. This paper addresses how best to include the baseline count in the analysis of the follow-up count of falls in negative binomial (NB) regression. We examine the performance of various approaches in simulated datasets where both counts are generated from a mixed Poisson distribution with shared random subject effect. Including the baseline count after log-transformation as a regressor in NB regression (NB-logged) or as an offset (NB-offset) resulted in greater power than including the untransformed baseline count (NB-unlogged). Cook and Wei's conditional negative binomial (CNB) model replicates the underlying process generating the data. In our motivating dataset, a statistically significant intervention effect resulted from the NB-logged, NB-offset, and CNB models, but not from NB-unlogged, and large, outlying baseline counts were overly influential in NB-unlogged but not in NB-logged. We conclude that there is little to lose by including the log-transformed baseline count in standard NB regression compared to CNB for moderate to larger sized datasets. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
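
    The NB-logged and NB-offset approaches compared above can be sketched with statsmodels as follows; the synthetic falls data (a shared gamma frailty driving both counts) and variable names are illustrative only.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)
      n = 500
      frailty = rng.gamma(2.0, 0.5, n)                         # shared subject effect
      treat = rng.integers(0, 2, n)
      base = rng.poisson(frailty * 2.0)                        # baseline falls count
      follow = rng.poisson(frailty * 2.0 * np.exp(-0.3 * treat))
      df = pd.DataFrame({"follow": follow, "treat": treat,
                         "log_base": np.log(base + 1.0)})      # +1 guards against log(0)

      # NB-logged: log-transformed baseline enters as a regressor with its own coefficient.
      nb_logged = smf.glm("follow ~ treat + log_base", data=df,
                          family=sm.families.NegativeBinomial()).fit()
      # NB-offset: log-transformed baseline enters as an offset (coefficient fixed at 1).
      nb_offset = smf.glm("follow ~ treat", data=df,
                          family=sm.families.NegativeBinomial(),
                          offset=df["log_base"]).fit()
      print(nb_logged.params, nb_offset.params)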

  13. Pathway concepts experiment for head-down synthetic vision displays

    NASA Astrophysics Data System (ADS)

    Prinzel, Lawrence J., III; Arthur, Jarvis J., III; Kramer, Lynda J.; Bailey, Randall E.

    2004-08-01

    Eight 757 commercial airline captains flew 22 approaches using the Reno Sparks 16R Visual Arrival under simulated Category I conditions. Approaches were flown using a head-down synthetic vision display to evaluate four tunnel ("minimal", "box", "dynamic pathway", "dynamic crow's feet") and three guidance ("ball", "tadpole", "follow-me aircraft") concepts and compare their efficacy to a baseline condition (i.e., no tunnel, ball guidance). The results showed that the tunnel concepts significantly improved pilot performance and situation awareness and lowered workload compared to the baseline condition. The dynamic crow's feet tunnel and follow-me aircraft guidance concepts were found to be the best candidates for future synthetic vision head-down displays. These results are discussed with implications for synthetic vision display design and future research.

  14. Simulations of High Current NuMI Magnetic Horn Striplines at FNAL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sipahi, Taylan; Biedron, Sandra; Hylen, James

    2016-06-01

    Both the NuMI (Neutrinos and the Main Injector) beam line, which has been providing intense neutrino beams for several Fermilab experiments (MINOS, MINERVA, NOVA), and the newly proposed LBNF (Long Baseline Neutrino Facility) beam line, which plans to produce the highest power neutrino beam in the world for DUNE (the Deep Underground Neutrino Experiment), need pulsed magnetic horns to focus the mesons which decay to produce the neutrinos. The high-current horn and stripline design has been evolving as NuMI reconfigures for higher beam power and to meet the needs of the LBNF design. The CSU particle accelerator group has aided the neutrino physics experiments at Fermilab by producing EM simulations of magnetic horns and the required high-current striplines. In this paper, we present calculations, using the Poisson and ANSYS Maxwell 3D codes, of the EM interaction of the stripline plates of the NuMI horns at critical stress points. In addition, we give the electrical simulation results using the ANSYS Electric code. These results are being used to support the development of evolving horn stripline designs to handle increased electrical current and higher beam power for NuMI upgrades and for LBNF.

  15. Dynamics of Active Separation Control at High Reynolds Numbers

    NASA Technical Reports Server (NTRS)

    Pack, LaTunia G.; Seifert, Avi

    2000-01-01

    A series of active flow control experiments were recently conducted at high Reynolds numbers on a generic separated configuration. The model simulates the upper surface of a 20% thick Glauert-Goldschmied type airfoil at zero angle of attack. The flow is fully turbulent since the tunnel sidewall boundary layer flows over the model. The main motivation for the experiments is to generate a comprehensive data base for validation of unsteady numerical simulation as a first step in the development of a CFD design tool, without which it would not be possible to effectively utilize the great potential of unsteady flow control. This paper focuses on the dynamics of several key features of the baseline as well as the controlled flow. It was found that the thickness of the upstream boundary layer has a negligible effect on the flow dynamics. It is speculated that separation is caused mainly by the highly convex surface while viscous effects are less important. The two-dimensional separated flow contains unsteady waves centered on a reduced frequency of 0.9, while in the three dimensional separated flow, frequencies around a reduced frequency of 0.3 and 1 are active. Several scenarios of resonant wave interaction take place at the separated shear-layer and in the pressure recovery region. The unstable reduced frequency bands for periodic excitation are centered on 1.5 and 5, but these reduced frequencies are based on the length of the baseline bubble that shortens due to the excitation. The conventional works well for the coherent wave features. Reproduction of these dynamic effects by a numerical simulation would provide benchmark validation.

  16. NASA's UAS Integration into the NAS: A Report on the Human Systems Integration Phase 1 Simulation Activities

    NASA Technical Reports Server (NTRS)

    Fern, Lisa; Rorie, R. Conrad; Shively, R. Jay

    2014-01-01

    In 2011 the National Aeronautics and Space Administration (NASA) began a five-year Project to address the technical barriers related to routine access of Unmanned Aerial Systems (UAS) in the National Airspace System (NAS). Planned in two phases, the goal of the first phase was to lay the foundations for the Project by identifying those barriers and key issues to be addressed to achieve integration. Phase 1 activities were completed two years into the five-year Project. The purpose of this paper is to review activities within the Human Systems Integration (HSI) subproject in Phase 1 toward its two objectives: 1) develop GCS guidelines for routine UAS access to the NAS, and 2) develop a prototype display suite within an existing Ground Control Station (GCS). The first objective directly addresses a critical barrier for UAS integration into the NAS - a lack of GCS design standards or requirements. First, the paper describes the initial development of a prototype GCS display suite and supporting simulation software capabilities. Then, three simulation experiments utilizing this simulation architecture are summarized. The first experiment sought to determine a baseline performance of UAS pilots operating in civil airspace under current instrument flight rules for manned aircraft. The second experiment examined the effect of currently employed UAS contingency procedures on Air Traffic Control (ATC) participants. The third experiment compared three GCS command and control interfaces on UAS pilot response times in compliance with ATC clearances. The authors discuss how the results of these and future simulation and flight-testing activities contribute to the development of GCS guidelines to support the safe integration of UAS into the NAS. Finally, the planned activities for Phase 2, including an integrated human-in-the-loop simulation and two flight tests are briefly described.

  17. Significant volume reduction of tank waste by selective crystallization: 1994 Annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herting, D.L.; Lunsford, T.R.

    1994-09-27

    The objective of this technology task plan is to develop and demonstrate a scaleable process to reclaim sodium nitrate (NaNO3) from Hanford waste tanks as a clean, nonradioactive salt. The purpose of the so-called Clean Salt Process is to reduce the volume of low level waste glass by as much as 70%. During the reporting period of October 1, 1993, through May 31, 1994, progress was made on four fronts: laboratory studies, surrogate waste compositions, contracting for university research, and flowsheet development and modeling. In the laboratory, experiments with simulated waste were done to explore the effects of crystallization parameters on the size and crystal habit of product NaNO3 crystals. Data were obtained to allow prediction of the decontamination factor as a function of solid/liquid separation parameters. Experiments with actual waste from tank 101-SY were done to determine the extent of contaminant occlusions in NaNO3 crystals. In preparation for defining surrogate waste compositions, single shell tanks were categorized according to the weight percent NaNO3 in each tank. A detailed process flowsheet and computer model were created using the ASPENPlus steady state process simulator. This is the same program being used by the Tank Waste Remediation System (TWRS) program for their waste pretreatment and disposal projections. Therefore, evaluations can be made of the effect of the Clean Salt Process on the low level waste volume and composition resulting from the TWRS baseline flowsheet. Calculations, using the same assumptions as the TWRS baseline where applicable, indicate that the number of low level glass vaults would be reduced from 44 to 16 if the Clean Salt Process were incorporated into the baseline flowsheet.

  18. Executive function on the 16th day of bed rest in young healthy men

    NASA Astrophysics Data System (ADS)

    Ishizaki, Yuko; Fukuoka, Hideoki; Tanaka, Hidetaka; Ishizaki, Tatsuro; Fujii, Yuri; Hattori-Uchida, Yuko; Nakamura, Minako; Ohkawa, Kaoru; Kobayashi, Hodaka; Taniuchi, Shoichiro; Kaneko, Kazunari

    2009-05-01

    Microgravity due to prolonged bed rest may cause changes in cerebral circulation, which is related to brain function. We evaluated the effect of simulated microgravity, induced by a 6° head-down tilt bed rest experiment, on executive function in 12 healthy young men. Four kinds of psychoneurological tests (the table tapping test, the trail making test, the pointing test, and losing at rock-paper-scissors) were performed at baseline and on day 16 of the experiment. There was no significant difference in results between baseline and day 16 on any test, which indicated that executive function was not impaired by the 16-day 6° head-down tilt bed rest. However, we cannot conclude that microgravity did not affect executive function because of the possible contribution of the following factors: (1) the timing of the tests, (2) the learning effect, or (3) changes in psychophysiology that were too small to affect higher brain function.

  19. Segmented beryllium target for a 2 MW super beam facility

    DOE PAGES

    Davenne, T.; Caretta, O.; Densham, C.; ...

    2015-09-14

    The Long Baseline Neutrino Facility (LBNF, formerly the Long Baseline Neutrino Experiment) is under design as a next generation neutrino oscillation experiment, with primary objectives to search for CP violation in the leptonic sector, to determine the neutrino mass hierarchy, and to provide a precise measurement of θ23. The facility will generate a neutrino beam at Fermilab by the interaction of a proton beam with a target material. At the ultimate anticipated proton beam power of 2.3 MW the target material must dissipate a heat load of between 10 and 25 kW depending on the target size. This paper presents a target concept based on an array of spheres and compares it to a cylindrical monolithic target such as that which currently operates at the T2K facility. The simulation results show that the proposed technology offers efficient cooling and lower stresses while delivering a neutrino production comparable with that of a conventional solid cylindrical target.

  20. The Fate of Saharan Dust Across the Atlantic and Implications for a Central American Dust Barrier

    NASA Technical Reports Server (NTRS)

    Nowottnick, E.; Colarco, P.; da Silva, A.; Hlavka, D.; McGill, M.

    2011-01-01

    Saharan dust was observed over the Caribbean basin during the summer 2007 NASA Tropical Composition, Cloud, and Climate Coupling (TC4) field experiment. Airborne Cloud Physics Lidar (CPL) and satellite observations from MODIS suggest a barrier to dust transport across Central America into the eastern Pacific. We use the NASA GEOS-5 atmospheric transport model with online aerosol tracers to perform simulations of the TC4 time period in order to understand the nature of this barrier. Our simulations are driven by the Modern Era Retrospective-Analysis for Research and Applications (MERRA) meteorological analyses. We evaluate our baseline simulated dust distributions using MODIS and CALIOP satellite observations and ground-based AERONET sun photometer observations. GEOS-5 reproduces the observed location, magnitude, and timing of major dust events, but our baseline simulation does not develop as strong a barrier to dust transport across Central America as observations suggest. Analysis of the dust transport dynamics and loss processes suggests that while both mechanisms play a role in defining the dust transport barrier, loss by wet removal of dust is about twice as important as transport. Sensitivity analyses with our model showed that the dust barrier would not exist without convective scavenging over the Caribbean. The best agreement between our model and the observations was obtained when dust wet removal was parameterized to be more aggressive, treating the dust as we do hydrophilic aerosols.

  1. Beam-based measurements of long-range transverse wakefields in the Compact Linear Collider main-linac accelerating structure

    DOE PAGES

    Zha, Hao; Latina, Andrea; Grudiev, Alexej; ...

    2016-01-20

    The baseline design of CLIC (Compact Linear Collider) uses X-band accelerating structures for its main linacs. In order to maintain beam stability in multibunch operation, long-range transverse wakefields must be suppressed by 2 orders of magnitude between successive bunches, which are separated in time by 0.5 ns. Such strong wakefield suppression is achieved by equipping every accelerating structure cell with four damping waveguides terminated with individual rf loads. A beam-based experiment to directly measure the effectiveness of this long-range transverse wakefield suppression and benchmark simulations was made in the FACET test facility at SLAC using a prototype CLIC accelerating structure. The experiment showed good agreement with the simulations and a strong suppression of the wakefields with an unprecedented minimum resolution of 0.1 V/(pC mm m).

  2. A Fatigue Crack Size Evaluation Method Based on Lamb Wave Simulation and Limited Experimental Data

    PubMed Central

    He, Jingjing; Ran, Yunmeng; Liu, Bin; Yang, Jinsong; Guan, Xuefei

    2017-01-01

    This paper presents a systematic and general method for Lamb wave-based crack size quantification using finite element simulations and Bayesian updating. The method consists of the construction of a baseline quantification model using finite element simulation data and Bayesian updating with limited Lamb wave data from the target structure. The baseline model correlates two proposed damage-sensitive features, namely the normalized amplitude and phase change, with the crack length through a response surface model. The two damage-sensitive features are extracted from the first received S0 mode wave package. The model parameters of the baseline model are estimated using finite element simulation data. To account for uncertainties from numerical modeling, geometry, material, and manufacturing between the baseline model and the target model, a Bayesian method is employed to update the baseline model with a few measurements acquired from the actual target structure. A rigorous validation is made using in-situ fatigue testing and Lamb wave data from coupon specimens and realistic lap-joint components. The effectiveness and accuracy of the proposed method are demonstrated under different loading and damage conditions. PMID:28902148
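
    A heavily simplified version of the Bayesian update described above is sketched below: an assumed baseline response-surface model maps crack length to a normalized-amplitude feature, and a few target-structure measurements update a gridded crack-length posterior. The quadratic model, noise level, and measurements are invented for illustration.

      import numpy as np

      a_grid = np.linspace(1.0, 30.0, 400)                 # candidate crack lengths (mm)

      def feature_model(a):
          """Assumed baseline response surface: normalized amplitude vs crack length."""
          return 1.0 - 0.015 * a - 2.0e-4 * a**2

      measured = np.array([0.72, 0.70, 0.73])              # features from the target structure (assumed)
      sigma = 0.03                                          # measurement/model uncertainty (assumed)

      log_prior = np.zeros_like(a_grid)                     # flat prior over the grid
      resid = measured[:, None] - feature_model(a_grid)[None, :]
      log_like = -0.5 * np.sum((resid / sigma) ** 2, axis=0)
      post = np.exp(log_prior + log_like - np.max(log_like))
      post /= post.sum()                                    # discrete posterior over the grid

      print(a_grid[np.argmax(post)])                        # posterior mode of the crack length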

  3. AgRISTARS: Foreign commodity production forecasting. The 1980 US corn and soybeans exploratory experiment

    NASA Technical Reports Server (NTRS)

    Malin, J. T.; Carnes, J. G. (Principal Investigator)

    1981-01-01

    The U.S. corn and soybeans exploratory experiment is described which consisted of evaluations of two technology components of a production forecasting system: classification procedures (crop labeling and proportion estimation at the level of a sampling unit) and sampling and aggregation procedures. The results from the labeling evaluations indicate that the corn and soybeans labeling procedure works very well in the U.S. corn belt with full season (after tasseling) LANDSAT data. The procedure should be readily adaptable to corn and soybeans labeling required for subsequent exploratory experiments or pilot tests. The machine classification procedures evaluated in this experiment were not effective in improving the proportion estimates. The corn proportions produced by the machine procedures had a large bias when the bias correction was not performed. This bias was caused by the manner in which the machine procedures handled spectrally impure pixels. The simulation test indicated that the weighted aggregation procedure performed quite well. Although further work can be done to improve both the simulation tests and the aggregation procedure, the results of this test show that the procedure should serve as a useful baseline procedure in future exploratory experiments and pilot tests.

  4. Simulation center training as a means to improve resident performance in percutaneous noncontinuous CT-guided fluoroscopic procedures with dose reduction.

    PubMed

    Mendiratta-Lala, Mishal; Williams, Todd R; Mendiratta, Vivek; Ahmed, Hafeez; Bonnett, John W

    2015-04-01

    The purpose of this study was to evaluate the effectiveness of a multifaceted simulation-based resident training for CT-guided fluoroscopic procedures by measuring procedural and technical skills, radiation dose, and procedure times before and after simulation training. A prospective analysis included 40 radiology residents and eight staff radiologists. Residents took an online pretest to assess baseline procedural knowledge. Second-through fourth-year residents' baseline technical skills with a procedural phantom were evaluated. First-through third-year residents then underwent formal didactic and simulation-based procedural and technical training with one of two interventional radiologists and followed the training with 1 month of supervised phantom-based practice. Thereafter, residents underwent final written and practical examinations. The practical examination included essential items from a 20-point checklist, including site and side marking, consent, time-out, and sterile technique along with a technical skills portion assessing pedal steps, radiation dose, needle redirects, and procedure time. The results indicated statistically significant improvement in procedural and technical skills after simulation training. For residents, the median number of pedal steps decreased by three (p=0.001), median dose decreased by 15.4 mGy (p<0.001), median procedure time decreased by 4.0 minutes (p<0.001), median number of needle redirects decreased by 1.0 (p=0.005), and median number of 20-point checklist items successfully completed increased by three (p<0.001). The results suggest that procedural skills can be acquired and improved by simulation-based training of residents, regardless of experience. CT simulation training decreases procedural time, decreases radiation dose, and improves resident efficiency and confidence, which may transfer to clinical practice with improved patient care and safety.

  5. The effect of noise and lipid signals on determination of Gaussian and non-Gaussian diffusion parameters in skeletal muscle.

    PubMed

    Cameron, Donnie; Bouhrara, Mustapha; Reiter, David A; Fishbein, Kenneth W; Choi, Seongjin; Bergeron, Christopher M; Ferrucci, Luigi; Spencer, Richard G

    2017-07-01

    This work characterizes the effect of lipid and noise signals on muscle diffusion parameter estimation in several conventional and non-Gaussian models, the ultimate objectives being to characterize popular fat suppression approaches for human muscle diffusion studies, to provide simulations to inform experimental work and to report normative non-Gaussian parameter values. The models investigated in this work were the Gaussian monoexponential and intravoxel incoherent motion (IVIM) models, and the non-Gaussian kurtosis and stretched exponential models. These were evaluated via simulations, and in vitro and in vivo experiments. Simulations were performed using literature input values, modeling fat contamination as an additive baseline to data, whereas phantom studies used a phantom containing aliphatic and olefinic fats and muscle-like gel. Human imaging was performed in the hamstring muscles of 10 volunteers. Diffusion-weighted imaging was applied with spectral attenuated inversion recovery (SPAIR), slice-select gradient reversal and water-specific excitation fat suppression, alone and in combination. Measurement bias (accuracy) and dispersion (precision) were evaluated, together with intra- and inter-scan repeatability. Simulations indicated that noise in magnitude images resulted in <6% bias in diffusion coefficients and non-Gaussian parameters (α, K), whereas baseline fitting minimized fat bias for all models, except IVIM. In vivo, popular SPAIR fat suppression proved inadequate for accurate parameter estimation, producing non-physiological parameter estimates without baseline fitting and large biases when it was used. Combining all three fat suppression techniques and fitting data with a baseline offset gave the best results of all the methods studied for both Gaussian diffusion and, overall, for non-Gaussian diffusion. It produced consistent parameter estimates for all models, except IVIM, and highlighted non-Gaussian behavior perpendicular to muscle fibers (α ~ 0.95, K ~ 3.1). These results show that effective fat suppression is crucial for accurate measurement of non-Gaussian diffusion parameters, and will be an essential component of quantitative studies of human muscle quality. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
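
    The additive-baseline fitting strategy discussed above can be sketched for the kurtosis representation S(b) = S0 exp(-bD + (bD)^2 K/6) + c, where c absorbs an unsuppressed fat signal; the b-values, parameters, and noise below are illustrative, not the study protocol.

      import numpy as np
      from scipy.optimize import curve_fit

      def kurtosis_baseline(b, S0, D, K, c):
          # Diffusional kurtosis signal model plus a constant baseline offset c.
          return S0 * np.exp(-b * D + (b * D) ** 2 * K / 6.0) + c

      b = np.array([0, 200, 400, 600, 800, 1000, 1500, 2000], dtype=float)   # s/mm^2 (assumed)
      true = dict(S0=1.0, D=1.4e-3, K=0.8, c=0.05)                            # assumed ground truth
      rng = np.random.default_rng(4)
      signal = kurtosis_baseline(b, **true) + rng.normal(0.0, 0.01, b.size)

      popt, _ = curve_fit(kurtosis_baseline, b, signal, p0=[1.0, 1.0e-3, 0.5, 0.0],
                          bounds=([0, 0, 0, 0], [2, 5e-3, 3, 0.5]))
      print(dict(zip(["S0", "D", "K", "c"], popt)))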

  6. Geoelectrical monitoring of simulated subsurface leakage to support high-hazard nuclear decommissioning at the Sellafield Site, UK.

    PubMed

    Kuras, Oliver; Wilkinson, Paul B; Meldrum, Philip I; Oxby, Lucy S; Uhlemann, Sebastian; Chambers, Jonathan E; Binley, Andrew; Graham, James; Smith, Nicholas T; Atherton, Nick

    2016-10-01

    A full-scale field experiment applying 4D (3D time-lapse) cross-borehole Electrical Resistivity Tomography (ERT) to the monitoring of simulated subsurface leakage was undertaken at a legacy nuclear waste silo at the Sellafield Site, UK. The experiment constituted the first application of geoelectrical monitoring in support of decommissioning work at a UK nuclear licensed site. Images of resistivity changes occurring since a baseline date prior to the simulated leaks revealed likely preferential pathways of silo liquor simulant flow in the vadose zone and upper groundwater system. Geophysical evidence was found to be compatible with historic contamination detected in permeable facies in sediment cores retrieved from the ERT boreholes. Results indicate that laterally discontinuous till units forming localized hydraulic barriers substantially affect flow patterns and contaminant transport in the shallow subsurface at Sellafield. We conclude that only geophysical imaging of the kind presented here has the potential to provide the detailed spatial and temporal information at the (sub-)meter scale needed to reduce the uncertainty in models of subsurface processes at nuclear sites. Copyright © 2016 British Geological Survey, NERC. Published by Elsevier B.V. All rights reserved.

  7. Comparison of virtual patient simulation with mannequin-based simulation for improving clinical performances in assessing and managing clinical deterioration: randomized controlled trial.

    PubMed

    Liaw, Sok Ying; Chan, Sally Wai-Chi; Chen, Fun-Gee; Hooi, Shing Chuan; Siau, Chiang

    2014-09-17

    Virtual patient simulation has grown substantially in health care education. A virtual patient simulation was developed as a refresher training course to reinforce nursing clinical performance in assessing and managing deteriorating patients. The objective of this study was to describe the development of the virtual patient simulation and evaluate its efficacy, by comparing it with a conventional mannequin-based simulation, for improving the nursing students' performances in assessing and managing patients with clinical deterioration. A randomized controlled study was conducted with 57 third-year nursing students who were recruited through email. After a baseline evaluation of all participants' clinical performance in a simulated environment, the experimental group received a 2-hour fully automated virtual patient simulation while the control group received 2-hour facilitator-led mannequin-based simulation training. All participants were then re-tested one day (first posttest) and 2.5 months (second posttest) after the intervention. The participants from the experimental group completed a survey to evaluate their learning experiences with the newly developed virtual patient simulation. Compared to their baseline scores, both experimental and control groups demonstrated significant improvements (P<.001) in first and second posttest scores. While the experimental group had significantly lower (P<.05) second posttest scores compared with the first posttest scores, no significant difference (P=.94) was found between these two scores for the control group. The scores between groups did not differ significantly over time (P=.17). The virtual patient simulation was rated positively. A virtual patient simulation for a refresher training course on assessing and managing clinical deterioration was developed. Although the randomized controlled study did not show that the virtual patient simulation was superior to mannequin-based simulation, both simulations were demonstrated to be effective refresher learning strategies for improving nursing students' clinical performance. Given the greater resource requirements of mannequin-based simulation, the virtual patient simulation provides a more promising alternative learning strategy to mitigate the decay of clinical performance over time.

  8. Default network connectivity decodes brain states with simulated microgravity.

    PubMed

    Zeng, Ling-Li; Liao, Yang; Zhou, Zongtan; Shen, Hui; Liu, Yadong; Liu, Xufeng; Hu, Dewen

    2016-04-01

    With great progress of space navigation technology, it becomes possible to travel beyond Earth's gravity. So far, it remains unclear whether the human brain can function normally within an environment of microgravity and confinement. Particularly, it is a challenge to figure out some neuroimaging-based markers for rapid screening diagnosis of disrupted brain function in microgravity environment. In this study, a 7-day -6° head down tilt bed rest experiment was used to simulate the microgravity, and twenty healthy male participants underwent resting-state functional magnetic resonance imaging scans at baseline and after the simulated microgravity experiment. We used a multivariate pattern analysis approach to distinguish the brain states with simulated microgravity from normal gravity based on the functional connectivity within the default network, resulting in an accuracy of no less than 85 % via cross-validation. Moreover, most discriminative functional connections were mainly located between the limbic system and cortical areas and were enhanced after simulated microgravity, implying a self-adaption or compensatory enhancement to fulfill the need of complex demand in spatial navigation and motor control functions in microgravity environment. Overall, the findings suggest that the brain states in microgravity are likely different from those in normal gravity and that brain connectome could act as a biomarker to indicate the brain state in microgravity.
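
    An illustrative sketch of the kind of multivariate pattern analysis described, i.e., a linear classifier trained on default-network functional connectivity features and evaluated by cross-validation, is given below. The ROI count, time-series length, classifier, and (random) data are assumptions made only for the sketch.

        # Illustrative sketch: classify baseline vs. simulated-microgravity scans
        # from default-network connectivity using cross-validation.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n_subjects, n_rois, n_timepoints = 20, 12, 180   # hypothetical sizes

        def connectivity_features(ts):
            """Vectorize the upper triangle of an ROI-by-ROI correlation matrix."""
            corr = np.corrcoef(ts.T)                     # ts has shape (time, ROI)
            iu = np.triu_indices_from(corr, k=1)
            return corr[iu]

        # Hypothetical resting-state time series: one scan at baseline and one after
        # bed rest per participant (random numbers stand in for real data here).
        scans = rng.normal(size=(2 * n_subjects, n_timepoints, n_rois))
        X = np.array([connectivity_features(ts) for ts in scans])
        y = np.array([0] * n_subjects + [1] * n_subjects)   # 0 = baseline, 1 = after bed rest

        clf = SVC(kernel="linear")
        print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())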

  9. Sensitivity of Boreal-Summer Circulation and Precipitation to Atmospheric Aerosols in Selected Regions: Part I Africa and India

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sud, Yogesh C.; Wilcox, Eric; Lau, William K.

    2009-10-23

    Version-4 of the Goddard Earth Observing System (GEOS-4) General Circulation Model (GCM) was employed to assess the influence of potential changes in aerosols on the regional circulation, ambient temperatures, and precipitation in four selected regions: India and Africa (current paper), as well as North and South America (companion paper). Ensemble simulations were carried out with the GCM to assess the aerosol direct and indirect effects, hereafter ADE and AIE. Each simulation was started from the NCEP-analyzed initial conditions for May 1 and was integrated through May-June-July-August of each year from 1982-1987 to provide an ensemble set of six simulations. In the first set, called the baseline experiment (#1), climatological aerosols were prescribed. The next two experiments (#2 and #3) had two sets of simulations each: one with 2X and another with 1/2X the climatological aerosols over each of the four selected regions. In experiment #2, the anomaly regions were advectively restricted (AR), i.e., the large-scale prognostic fields outside the aerosol anomaly regions were prescribed, while in experiment #3, the anomaly regions were advectively interactive (AI), as is the case in normal GCM integrations, but with the same aerosol anomalies as in experiment #2. Intercomparisons of circulation, diabatic heating, and precipitation difference fields showed large disparities among the AR and AI simulations, which raised serious questions about the AR assumption, commonly invoked in regional climate simulation studies. Consequently, the AI simulation mode was chosen for the subsequent studies. Two more experiments (#4 and #5) were performed in the AI mode in which ADE and AIE were activated one at a time. The results showed that ADE and AIE work in concert to make the joint influences larger than the sum of each acting alone. Moreover, the ADE and AIE influences were vastly different for the India and Africa regions, which suggests an imperative need to include them rationally in climate models. We also found that the aerosol-induced increase of tropical cirrus clouds would potentially offset any cirrus thinning that may occur due to global warming in response to CO2 increase.
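
    The difference-field comparisons described (for example, the 2X-aerosol ensemble mean minus the baseline ensemble mean for a given field) reduce to simple array arithmetic; a minimal sketch with assumed array shapes and variable names follows.

        # Sketch (assumed names and shapes): ensemble-mean difference field between
        # an aerosol-anomaly experiment and the baseline experiment.
        import numpy as np

        rng = np.random.default_rng(2)
        # Hypothetical seasonal-mean precipitation fields, (member, lat, lon),
        # for the six-member ensembles described above.
        baseline = rng.random((6, 90, 144))
        double_aerosol = rng.random((6, 90, 144))

        diff = double_aerosol.mean(axis=0) - baseline.mean(axis=0)   # 2X minus baseline
        print("largest local precipitation difference:", float(np.abs(diff).max()))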

  10. Sex Differences in the Effects of Marijuana on Simulated Driving Performance†

    PubMed Central

    Anderson, Beth M.; Rizzo, Matthew; Block, Robert I.; Pearlson, Godfrey D.; O'Leary, Daniel S.

    2011-01-01

    In the United States, one in six teenagers has driven under the influence of marijuana. Driving under the influence of marijuana and alcohol is equally prevalent, despite the fact that marijuana use is less common than alcohol use. Much of the research examining the effects of marijuana on driving performance was conducted in the 1970s and led to equivocal findings. During that time, few studies included women and driving simulators were rudimentary. Further, the potency of marijuana commonly used recreationally has increased. This study examined sex differences in the acute effects of marijuana on driving performance using a realistic, validated driving simulator. Eighty-five subjects (n = 50 males, 35 females) participated in this between-subjects, double-blind, placebo controlled study. In addition to an uneventful, baseline segment of driving, participants were challenged with collision avoidance and distracted driving scenarios. Under the influence of marijuana, participants decreased their speed and failed to show expected practice effects during a distracted drive. No differences were found during the baseline driving segment or collision avoidance scenarios. No differences attributable to sex were observed. This study enhances the current literature by identifying distracted driving and the integration of prior experience as particularly problematic under the influence of marijuana. PMID:20464803

  11. Simulation of Ventricular, Cavo-Pulmonary, and Biventricular Ventricular Assist Devices in Failing Fontan.

    PubMed

    Di Molfetta, Arianna; Amodeo, Antonio; Fresiello, Libera; Trivella, Maria Giovanna; Iacobelli, Roberta; Pilati, Mara; Ferrari, Gianfranco

    2015-07-01

    Considering the lack of donors, ventricular assist devices (VADs) could be an alternative to heart transplantation for failing Fontan patients, in spite of the lack of experience and the complex anatomy and physiopathology of these patients. Considering the high number of variables that play an important role such as type of Fontan failure, type of VAD connection, and setting (right VAD [RVAD], left VAD [LVAD], or biventricular VAD [BIVAD]), a numerical model could be useful to support clinical decisions. The aim of this article is to develop and test a lumped parameter model of the cardiovascular system simulating and comparing the VAD effects on failing Fontan. Hemodynamic and echocardiographic data of 10 Fontan patients were used to simulate the baseline patients' condition using a dedicated lumped parameter model. Starting from the simulated baseline and for each patient, a systolic dysfunction, a diastolic dysfunction, and an increment of the pulmonary vascular resistance were simulated. Then, for each patient and for each pathology, the RVAD, LVAD, and BIVAD implantations were simulated. The model can reproduce patients' baseline well. In the case of systolic dysfunction, the LVAD unloads the single ventricle and increases the cardiac output (CO) (35%) and the arterial systemic pressure (Pas) (25%). With RVAD, a decrement of inferior vena cava pressure (Pvci) (39%) was observed with 34% increment of CO, but an increment of the single ventricle external work (SVEW). With the BIVAD, an increment of Pas (29%) and CO (37%) was observed. In the case of diastolic dysfunction, the LVAD increases CO (42%) and the RVAD decreases the Pvci, while both increase the SVEW. In the case of pulmonary vascular resistance increment, the highest CO (50%) and Pas (28%) increment is obtained with an RVAD with the highest decrement of Pvci (53%) and an increment of the SVEW but with the lowest VAD power consumption. The use of numerical models could be helpful in this innovative field to evaluate the effect of VAD implantation on Fontan patients to support patient and VAD type selection personalizing the assistance. Copyright © 2015 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
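
    The lumped-parameter approach mentioned above treats vascular compartments as compliances, connections as resistances, and a continuous-flow VAD as an added flow source. The toy single-loop sketch below is only meant to illustrate that structure; the compartment layout, waveform, and all parameter values are hypothetical and far simpler than the authors' model.

        # Toy lumped-parameter sketch: systemic arterial and venous compartments,
        # a crude pulsatile ventricular outflow, and a constant-flow VAD source.
        import numpy as np
        from scipy.integrate import solve_ivp

        C_sa, C_sv = 1.2, 30.0    # compliances (ml/mmHg), hypothetical
        R_sys = 1.0               # systemic resistance (mmHg*s/ml), hypothetical
        Q_vad = 70.0              # VAD flow (ml/s), hypothetical constant assist

        def heart_outflow(t):
            """Crude pulsatile single-ventricle output (ml/s), hypothetical waveform."""
            return 80.0 * max(np.sin(2.0 * np.pi * t), 0.0)

        def rhs(t, y):
            P_sa, P_sv = y
            Q_sys = (P_sa - P_sv) / R_sys                 # flow through the systemic bed
            dP_sa = (heart_outflow(t) + Q_vad - Q_sys) / C_sa
            dP_sv = (Q_sys - heart_outflow(t) - Q_vad) / C_sv   # closed loop: venous return feeds ventricle and VAD
            return [dP_sa, dP_sv]

        sol = solve_ivp(rhs, (0.0, 20.0), [90.0, 5.0], max_step=0.01)
        print("final arterial / venous pressure (mmHg):", sol.y[0, -1], sol.y[1, -1])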

  12. Dissecting Driver Behaviors Under Cognitive, Emotional, Sensorimotor, and Mixed Stressors

    PubMed Central

    Pavlidis, I.; Dcosta, M.; Taamneh, S.; Manser, M.; Ferris, T.; Wunderlich, R.; Akleman, E.; Tsiamyrtzis, P.

    2016-01-01

    In a simulation experiment we studied the effects of cognitive, emotional, sensorimotor, and mixed stressors on driver arousal and performance with respect to (wrt) baseline. In a sample of n = 59 drivers, balanced in terms of age and gender, we found that all stressors incurred significant increases in mean sympathetic arousal accompanied by significant increases in mean absolute steering. The latter translated to a significantly larger range of lane departures only in the case of sensorimotor and mixed stressors, indicating more dangerous driving wrt baseline. In the case of cognitive or emotional stressors, often a smaller range of lane departures was observed, indicating safer driving wrt baseline. This paradox suggests an effective coping mechanism at work, which compensates for erroneous reactions precipitated by cognitive or emotional conflict. This mechanism’s grip slips, however, when the feedback loop is intermittently severed by sensorimotor distractions. Interestingly, mixed stressors did not affect crash rates in startling events, suggesting that the coping mechanism’s compensation time scale is above the range of neurophysiological latency. PMID:27170291

  13. Dissecting Driver Behaviors Under Cognitive, Emotional, Sensorimotor, and Mixed Stressors.

    PubMed

    Pavlidis, I; Dcosta, M; Taamneh, S; Manser, M; Ferris, T; Wunderlich, R; Akleman, E; Tsiamyrtzis, P

    2016-05-12

    In a simulation experiment we studied the effects of cognitive, emotional, sensorimotor, and mixed stressors on driver arousal and performance with respect to (wrt) baseline. In a sample of n = 59 drivers, balanced in terms of age and gender, we found that all stressors incurred significant increases in mean sympathetic arousal accompanied by significant increases in mean absolute steering. The latter translated to a significantly larger range of lane departures only in the case of sensorimotor and mixed stressors, indicating more dangerous driving wrt baseline. In the case of cognitive or emotional stressors, often a smaller range of lane departures was observed, indicating safer driving wrt baseline. This paradox suggests an effective coping mechanism at work, which compensates for erroneous reactions precipitated by cognitive or emotional conflict. This mechanism's grip slips, however, when the feedback loop is intermittently severed by sensorimotor distractions. Interestingly, mixed stressors did not affect crash rates in startling events, suggesting that the coping mechanism's compensation time scale is above the range of neurophysiological latency.

  14. Evaluation of Movement Restriction Zone Sizes in Controlling Classical Swine Fever Outbreaks

    PubMed Central

    Yadav, Shankar; Olynk Widmar, Nicole; Lay, Donald C.; Croney, Candace; Weng, Hsin-Yi

    2017-01-01

    The objective of this study was to compare the impacts of movement restriction zone sizes of 3, 5, 9, and 11 km with that of 7 km (the recommended zone size in the United States) in controlling a classical swine fever (CSF) outbreak. In addition to zone size, different compliance assumptions and outbreak types (single site and multiple site) were incorporated in the study. Three assumptions of compliance level were simulated: baseline, baseline ± 10%, and baseline ± 15%. The compliance level was held constant across all zone sizes in the baseline simulation. In the baseline ± 10% and baseline ± 15% simulations, the compliance level was increased for 3 and 5 km and decreased for 9 and 11 km from the baseline by the indicated percentages. The compliance level remained constant in all simulations for the 7-km zone size. Four single-site (i.e., with one index premises at the onset of outbreak) and four multiple-site (i.e., with more than one index premises at the onset of outbreak) CSF outbreak scenarios in Indiana were simulated incorporating various zone sizes and compliance assumptions using a stochastic between-premises disease spread model to estimate epidemic duration, percentage of infected, and preemptively culled swine premises. Furthermore, a risk assessment model that incorporated the results from the disease spread model was developed to estimate the number of swine premises under movement restrictions that would experience animal welfare outcomes of overcrowding or feed interruption during a CSF outbreak in Indiana. Compared with the 7-km zone size, the 3-km zone size resulted in a longer median epidemic duration, larger percentages of infected premises, and preemptively culled premises (P’s < 0.001) across all compliance assumptions and outbreak types. With the assumption of a higher compliance level, the 5-km zone size significantly (P < 0.001) reduced the epidemic duration and percentage of swine premises that would experience animal welfare outcomes in both outbreak types, whereas assumption of a lower compliance level for 9- and 11-km zone sizes significantly (P < 0.001) increased the epidemic duration and percentage of swine premises with animal welfare outcomes compared with the 7-km zone size. The magnitude of impact due to a zone size varied across the outbreak types (single site and multiple site). Overall, the 7-km zone size was found to be most effective in controlling CSF outbreaks, whereas the 5-km zone size was comparable to the 7-km zone size in some circumstances. PMID:28119920
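
    The geometric core of a movement restriction zone, i.e., every premises within a given radius of any infected premises is placed under restriction, can be sketched in a few lines; the coordinates below are hypothetical, and the study's stochastic between-premises spread model is of course far richer.

        # Toy sketch: count premises falling inside movement restriction zones of
        # different radii around infected (index) premises.
        import numpy as np

        rng = np.random.default_rng(3)
        premises = rng.uniform(0.0, 100.0, size=(500, 2))            # km coordinates, hypothetical
        infected = premises[rng.choice(500, size=3, replace=False)]  # index premises

        def n_restricted(zone_km):
            d = np.linalg.norm(premises[:, None, :] - infected[None, :, :], axis=2)
            return int((d.min(axis=1) <= zone_km).sum())

        for zone in (3, 5, 7, 9, 11):
            print(f"{zone}-km zone: {n_restricted(zone)} premises under movement restriction")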

  15. Changes in stress hormones and metabolism during a 105-day simulated Mars mission.

    PubMed

    Strollo, Felice; Vassilieva, Galina; Ruscica, Massimiliano; Masini, Mariangela; Santucci, Daniela; Borgia, Luisa; Magni, Paolo; Celotti, Fabio; Nikiporuc, Igor

    2014-08-01

    The Mars-105 project was aimed at simulating crew's activities, workload, and communication during a mission to Mars, evaluating the homeostatic adaptations to prolonged confinement and cohabitation. Fasting plasma glucose (FPG) and insulin, C-peptide, leptin, cortisol, and NGF and BDNF plasma levels were monitored in six healthy nonsmoking male subjects taking part in a 105-d Mars mission simulation. Samples were collected from each subject before (0 wk), during (2.5 wk; 5 wk; 10 wk; 15 wk), and after confinement (+1 wk). Confinement resulted in impaired glucometabolic parameters, since FPG increased during the first 5 wk (baseline: 85.2 ± 10.8 mg · dl⁻¹; 2.5 wk: 98.4 ± 4.7 mg · dl⁻¹; 5 wk: 92.5 ± 6.0 mg · dl⁻¹) and insulin dropped at 2.5 wk (baseline: 14.4 ± 4.8 mU · L⁻¹; 2.5 wk: 7.7 ± 2.1 mU · L⁻¹), subsequently returning to baseline values. HOMA-IR paralleled plasma insulin, dropping to 1.8 ± 0.5 at 2.5 wk (baseline: 3.0 ± 1.2). At all time-points tested, plasma leptin levels were decreased (baseline: 4.4 ± 3.3 ng · dl⁻¹; 2.5 wk: 1.6 ± 1.2 ng · dl⁻¹; 5 wk: 1.3 ± 0.8 ng · dl⁻¹; 10 wk: 1.5 ± 1.1 ng · dl⁻¹; 15 wk:1.7 ± 0.8 ng · dl⁻¹), whereas cortisol levels were increased (baseline: 10.8 ± 4.9 ng · dl⁻¹; 2.5 wk: 16.8 ± 3.5 ng · dl⁻¹; 5 wk: 18.1 ± 7.6 ng · dl⁻¹; 10 wk: 18.1 ± 8.3 ng · dl⁻¹; 15 wk:14.2 ± 4.4 ng · dl⁻¹), resulting in a negative correlation between these hormones. BDNF levels increased only at 5 and 10 wk (baseline: 67.1 ± 36.0 pg · ml⁻¹; 5 wk: 164 ± 54 pg · ml⁻¹; and 10 wk: 110.2 ± 28.9 pg · ml⁻¹). The data obtained with the Mars-105 experiment suggest that environmental stress has a strong impact upon metabolic and stress response, indicating the need for further studies and the implementation of specific countermeasures.
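
    The HOMA-IR values quoted are consistent with the conventional formula, fasting glucose (mg/dl) times fasting insulin (mU/l) divided by 405; a quick check against the reported means is shown below.

        # HOMA-IR from the conventional formula; inputs are the baseline and
        # 2.5-week group means quoted in the abstract.
        def homa_ir(glucose_mg_dl, insulin_mU_l):
            return glucose_mg_dl * insulin_mU_l / 405.0

        print(round(homa_ir(85.2, 14.4), 1))   # ~3.0, matching the reported baseline HOMA-IR
        print(round(homa_ir(98.4, 7.7), 1))    # ~1.9, close to the reported 1.8 at 2.5 wk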

  16. Tidal stirring and phytoplankton bloom dynamics in an estuary

    USGS Publications Warehouse

    Cloern, J.E.

    1991-01-01

    In South San Francisco Bay, estuarine phytoplankton biomass fluctuates at the time scale of days to weeks; much of this variability is associated with fluctuations in tidal energy. During the spring seasons of every year from 1980-1990, episodic blooms occurred in which phytoplankton biomass rose from a baseline of 2-4 mg chlorophyll a m-3, peaked at 20-40 mg chlorophyll a m-3, then returned to baseline values, all within several weeks. Each episode of biomass increase occurred during neap tides, and each bloom decline coincided with spring tides. This suggests that daily variations in the rate of vertical mixing by tidal stirring might control phytoplankton bloom dynamics in some estuaries. Simulation experiments with a numerical model of phytoplankton population dynamics support this hypothesis. -from Author

  17. Pathway Concepts Experiment for Head-Down Synthetic Vision Displays

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Arthur, Jarvis J., III; Kramer, Lynda J.; Bailey, Randall E.

    2004-01-01

    Eight 757 commercial airline captains flew 22 approaches using the Reno Sparks 16R Visual Arrival under simulated Category I conditions. Approaches were flown using a head-down synthetic vision display to evaluate four tunnel ("minimal", "box", "dynamic pathway", "dynamic crow's feet") and three guidance ("ball", "tadpole", "follow-me aircraft") concepts and compare their efficacy to a baseline condition (i.e., no tunnel, ball guidance). The results showed that the tunnel concepts significantly improved pilot performance and situation awareness and lowered workload compared to the baseline condition. The dynamic crow's feet tunnel and follow-me aircraft guidance concepts were found to be the best candidates for future synthetic vision head-down displays. These results are discussed with implications for synthetic vision display design and future research.

  18. Simulated effects of allocated and projected 2025 withdrawals from the Potomac-Raritan-Magothy aquifer system, Gloucester and Northeastern Salem Counties, New Jersey

    USGS Publications Warehouse

    Charles, Emmanuel; Nawyn, John P.; Voronin, Lois M.; Gordon, Alison D.

    2011-01-01

    Withdrawals from the Potomac-Raritan-Magothy aquifer system in New Jersey, which includes the Upper, Middle, and Lower Potomac-Raritan-Magothy aquifers, are the principal source of groundwater supply in northern Gloucester and northeastern Salem Counties in the New Jersey Coastal Plain. Water levels in these aquifers have declined in response to pumping. With increased population growth and development expected in Gloucester County and parts of Salem County over the next 2 decades (2005-2025), withdrawals from these aquifers also are expected to increase. A steady-state groundwater-flow model, developed to simulate flow in the Potomac-Raritan-Magothy aquifer system in northern Gloucester and northeastern Salem Counties, was calibrated to withdrawal conditions in 1998, when groundwater withdrawals from the Potomac-Raritan-Magothy aquifer system in the model area were more than 10,100 Mgal/yr (million gallons per year). Withdrawals from water-purveyor wells accounted for about 63 percent of these withdrawals, and withdrawals from industrial self-supply wells accounted for about 32 percent. Withdrawals from agricultural-irrigation, commercial self-supply, and domestic self-supply wells accounted for the remaining 5 percent. Results of the 2000 baseline groundwater-flow simulation, incorporating average annual 1999-2001 groundwater withdrawals, indicate that the average simulated water levels in the Upper, Middle, and Lower Potomac-Raritan-Magothy aquifers are 31, 27, and 30 feet below the National Geodetic Vertical Datum of 1929 (NGVD 29), respectively, and the lowest simulated water levels are 77, 65, and 59 feet below NGVD 29, respectively. In the full-allocation scenario, the maximum State-permitted (allocated) groundwater withdrawals totaled 16,567 Mgal/yr, an increase of 72 percent from the 2000 baseline simulation. Results of the full-allocation simulation indicate that the average simulated water levels in the Upper, Middle, and Lower Potomac-Raritan-Magothy aquifers are 49, 43, and 48 feet below NGVD 29, respectively, which are 18, 16, and 18 feet lower, respectively, than in the 2000 baseline simulation. The lowest simulated water levels are 156, 95, and 69 feet below NGVD 29, respectively, which are 79, 30, and 10 feet lower, respectively, than in the 2000 baseline simulation. Simulated net flow from the Potomac-Raritan-Magothy aquifer system to streams is 8,441 Mgal/yr in the 2000 baseline simulation but is 6,018 Mgal/yr in the full-allocation scenario, a decrease of 29 percent from the 2000 baseline simulation. Simulated net flow in the 2000 baseline simulation is 1,183 Mgal/yr from the aquifer system to the Delaware River but in the full-allocation scenario is 1,816 Mgal/yr from the river to the aquifer system. Four other simulations were conducted that incorporated full-allocation conditions at water-purveyor wells in Critical Area 2 but increased or decreased withdrawals at selected water-purveyor wells outside Critical Area 2 and agricultural-irrigation and industrial-self-supply wells in the study area. The results of the four simulations also indicate net flow from the Delaware River to the Potomac-Raritan-Magothy aquifer system. A growth scenario was developed to simulate future withdrawals in 2025 estimated from population projections for municipalities in the Salem-Gloucester study area. Simulated withdrawals for this scenario totaled 10,261 Mgal/yr, an increase of 6 percent from the 2000 baseline simulation. 
This total includes about 25 Mgal/yr withdrawn from the Englishtown aquifer system for domestic self-supply. This scenario incorporated full-allocation withdrawals at water-purveyor wells in Critical Area 2, and increased withdrawals at water-purveyor wells outside Critical Area 2. Results of this simulation indicate that the average simulated water levels in the Upper, Middle, and Lower Potomac-Raritan-Magothy aquifers are 32, 29, and 32 feet below NGVD 29, respectively, which are 1, 2, and

  19. Microwave window breakdown experiments and simulations on the UM/L-3 relativistic magnetron

    NASA Astrophysics Data System (ADS)

    Hoff, B. W.; Mardahl, P. J.; Gilgenbach, R. M.; Haworth, M. D.; French, D. M.; Lau, Y. Y.; Franzi, M.

    2009-09-01

    Experiments have been performed on the UM/L-3 (6-vane, L-band) relativistic magnetron to test a new microwave window configuration designed to limit vacuum side breakdown. In the baseline case, acrylic microwave windows were mounted between three of the waveguide coupling cavities in the anode block vacuum housing and the output waveguides. Each of the six 3 cm deep coupling cavities is separated from its corresponding anode cavity by a 1.75 cm wide aperture. In the baseline case, vacuum side window breakdown was observed to initiate at single waveguide output powers close to 20 MW. In the new window configuration, three Air Force Research Laboratory-designed, vacuum-rated directional coupler waveguide segments were mounted between the coupling cavities and the microwave windows. The inclusion of the vacuum side power couplers moved the microwave windows an additional 30 cm away from the anode apertures. Additionally, the Lucite microwave windows were replaced with polycarbonate windows and the microwave window mounts were redesigned to better maintain waveguide continuity in the region around the microwave windows. No vacuum side window breakdown was observed in the new window configuration at single waveguide output powers of 120+MW (a factor of 3 increase in measured microwave pulse duration and factor of 3 increase in measured peak power over the baseline case). Simulations were performed to investigate likely causes for the window breakdown in the original configuration. Results from these simulations have shown that in the original configuration, at typical operating voltage and magnetic field ranges, electrons emitted from the anode block microwave apertures strike the windows with a mean kinetic energy of 33 keV with a standard deviation of 14 keV. Calculations performed using electron impact angle and energy data predict a first generation secondary electron yield of 65% of the primary electron population. The effects of the primary aperture electron impacts, combined with multiplication of the secondary populations, were determined to be the likely causes of the poor microwave window performance in the original configuration.

  20. Dynamics of Active Separation Control at High Reynolds Numbers

    NASA Technical Reports Server (NTRS)

    Pack, LaTunia G.; Seifert, Avi

    2000-01-01

    A series of active flow control experiments were recently conducted at high Reynolds numbers on a generic separated configuration. The model simulates the upper surface of a 20% thick Glauert-Goldschmied type airfoil at zero angle of attack. The flow is fully turbulent since the tunnel sidewall boundary layer flows over the model. The main motivation for the experiments is to generate a comprehensive data base for validation of unsteady numerical simulation as a first step in the development of a CFD design tool, without which it would not be possible to effectively utilize the great potential of unsteady flow control. This paper focuses on the dynamics of several key features of the baseline as well as the controlled flow. It was found that the thickness of the upstream boundary layer has a negligible effect on the flow dynamics. It is speculated that separation is caused mainly by the highly convex surface while viscous effects are less important. The two-dimensional separated flow contains unsteady waves centered on a reduced frequency of 0.8, while in the three dimensional separated flow, frequencies around a reduced frequency of 0.3 and 1 are active. Several scenarios of resonant wave interaction take place at the separated shear-layer and in the pressure recovery region. The unstable reduced frequency bands for periodic excitation are centered on 1.5 and 5, but these reduced frequencies are based on the length of the baseline bubble that shortens due to the excitation. The conventional swept wing-scaling works well for the coherent wave features. Reproduction of these dynamic effects by a numerical simulation would provide benchmark validation.
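
    The reduced frequencies quoted above are, by the usual convention in this line of work (and, as the text notes, using the length of the baseline separation bubble as the length scale), defined as

        F^{+} = \frac{f \, X_{\mathrm{sep}}}{U_{\infty}}

    where f is the excitation (or dominant unsteady) frequency, X_sep is the length of the baseline separated region, and U_infinity is the free-stream velocity.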

  1. Implementing a Facial Recognition System to Improve Accessibility and Increase Utilization of Entry Control Points at Military Installations

    DTIC Science & Technology

    2011-06-01

    event simulation is used to model three alternatives to the ECP system. The baseline system which contains two manned kiosks, a fully automated system...experience is traffic delays in the morning for government employees accessing the bases. If one or two lanes were dedicated to 3 completely or even semi...purpose of clarity, the figure below displays only the two lowest levels of functions. This final functional decomposition identifies the sub functions

  2. Effects of combined dimension reduction and tabulation on the simulations of a turbulent premixed flame using a large-eddy simulation/probability density function method

    NASA Astrophysics Data System (ADS)

    Kim, Jeonglae; Pope, Stephen B.

    2014-05-01

    A turbulent lean-premixed propane-air flame stabilised by a triangular cylinder as a flame-holder is simulated to assess the accuracy and computational efficiency of combined dimension reduction and tabulation of chemistry. The computational condition matches the Volvo rig experiments. For the reactive simulation, the Lagrangian Large-Eddy Simulation/Probability Density Function (LES/PDF) formulation is used. A novel two-way coupling approach between LES and PDF is applied to obtain resolved density to reduce its statistical fluctuations. Composition mixing is evaluated by the modified Interaction-by-Exchange with the Mean (IEM) model. A baseline case uses In Situ Adaptive Tabulation (ISAT) to calculate chemical reactions efficiently. Its results demonstrate good agreement with the experimental measurements in turbulence statistics, temperature, and minor species mass fractions. For dimension reduction, 11 and 16 represented species are chosen and a variant of Rate Controlled Constrained Equilibrium (RCCE) is applied in conjunction with ISAT to each case. All the quantities in the comparison are indistinguishable from the baseline results using ISAT only. The combined use of RCCE/ISAT reduces the computational time for chemical reaction by more than 50%. However, for the current turbulent premixed flame, chemical reaction takes only a minor portion of the overall computational cost, in contrast to non-premixed flame simulations using LES/PDF, presumably due to the restricted manifold of purely premixed flame in the composition space. Instead, composition mixing is the major contributor to cost reduction since the mean-drift term, which is computationally expensive, is computed for the reduced representation. Overall, a reduction of more than 15% in the computational cost is obtained.
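
    For reference, the standard (unmodified) IEM model relaxes each notional particle's composition toward the local mean at a rate set by a turbulence frequency,

        \frac{d\phi^{*}}{dt} = -\tfrac{1}{2}\, C_{\phi}\, \Omega \left( \phi^{*} - \langle \phi \rangle \right)

    with phi* the particle composition, <phi> the local mean, Omega a mixing frequency, and C_phi a model constant; the modified IEM used in this work additionally carries the mean-drift term referred to above, which is not written out here.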

  3. Simulation of homogeneous condensation of small polyatomic systems in high pressure supersonic nozzle flows using Bhatnagar-Gross-Krook model

    NASA Astrophysics Data System (ADS)

    Kumar, Rakesh; Levin, Deborah A.

    2011-03-01

    In the present work, we have simulated the homogeneous condensation of carbon dioxide and ethanol using the Bhatnagar-Gross-Krook based approach. In an earlier work of Gallagher-Rogers et al. [J. Thermophys. Heat Transfer 22, 695 (2008)], it was found that it was not possible to simulate condensation experiments of Wegener et al. [Phys. Fluids 15, 1869 (1972)] using the direct simulation Monte Carlo method. Therefore, in this work, we have used the statistical Bhatnagar-Gross-Krook approach, which was found to be numerically more efficient than direct simulation Monte Carlo method in our previous studies [Kumar et al., AIAA J. 48, 1531 (2010)], to model homogeneous condensation of two small polyatomic systems, carbon dioxide and ethanol. A new weighting scheme is developed in the Bhatnagar-Gross-Krook framework to reduce the computational load associated with the study of homogeneous condensation flows. The solutions obtained by the use of the new scheme are compared with those obtained by the baseline Bhatnagar-Gross-Krook condensation model (without the species weighting scheme) for the condensing flow of carbon dioxide in the stagnation pressure range of 1-5 bars. Use of the new weighting scheme in the present work makes the simulation of homogeneous condensation of ethanol possible. We obtain good agreement between our simulated predictions for homogeneous condensation of ethanol and experiments in terms of the point of condensation onset and the distribution of mass fraction of ethanol condensed along the nozzle centerline.

  4. Transurethral Resection of Bladder Tumors: Next-generation Virtual Reality Training for Surgeons.

    PubMed

    Neumann, Eva; Mayer, Julian; Russo, Giorgio Ivan; Amend, Bastian; Rausch, Steffen; Deininger, Susanne; Harland, Niklas; da Costa, Inês Anselmo; Hennenlotter, Jörg; Stenzl, Arnulf; Kruck, Stephan; Bedke, Jens

    2018-05-22

    The number of virtual reality (VR) simulators is increasing. The aim of this prospective trial was to determine the benefit of VR cystoscopy (UC) and transurethral bladder tumor resection (TURBT) training in students. Medical students without endoscopic experience (n=51, median age=25 yr, median 4th academic year) were prospectively randomized into groups A and B. After an initial VR-UC and VR-TURBT task, group A (n=25) underwent a video-based tutorial by a skilled expert. Group B (n=26) was trained using a VR training program (Uro-Trainer). Following the training, every participant performed a final VR-UC and VR-TURBT task. Performance indicators were recorded via the simulator. Data was analyzed by Mann-Whitney U test. VR cystoscopy and TURBT. No baseline and post-training differences were found for VR-UC between groups. During baseline, VR-TURBT group A showed higher inspected bladder surface than group B (56% vs 73%, p=0.03). Subgroup analysis detected differences related to sex before training (male: 31.2% decreased procedure time; 38.1% decreased resectoscope movement; p=0.02). After training, significant differences in procedure time (3.9min vs 2.7min, p=0.007), resectoscope movement (857mm vs 529mm, p=0.005), and accidental bladder injury (n=3.0 vs n=0.88, p=0.003) were found. Male participants showed reduced blood loss (males: 3.92ml vs females: 10.12ml; p=0.03) after training. Measuring endoscopic skills within a virtual environment can be done easily. Short training improved efficacy and safety of VR-TURBT. Nevertheless, transfer of improved VR performance into real world surgery needs further clarification. We investigated how students without endoscopic experience profit from simulation-based training. The safe environment and repeated simulations can improve the surgical training. It may be possible to enhance patient's safety and the training of surgeons in long term. Copyright © 2018 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  5. The effect of entrapped nonaqueous phase liquids on tracer transport in heterogeneous porous media: Laboratory experiments at the intermediate scale

    USGS Publications Warehouse

    Barth, Gilbert R.; Illangasekare, T.H.; Rajaram, H.

    2003-01-01

    This work considers the applicability of conservative tracers for detecting high-saturation nonaqueous-phase liquid (NAPL) entrapment in heterogeneous systems. For this purpose, a series of experiments and simulations was performed using a two-dimensional heterogeneous system (10 × 1.2 m), which represents an intermediate scale between laboratory and field scales. Tracer tests performed prior to injecting the NAPL provide the baseline response of the heterogeneous porous medium. Two NAPL spill experiments were performed and the entrapped-NAPL saturation distribution measured in detail using a gamma-ray attenuation system. Tracer tests following each of the NAPL spills produced breakthrough curves (BTCs) reflecting the impact of entrapped NAPL on conservative transport. To evaluate significance, the impact of NAPL entrapment on the conservative-tracer breakthrough curves was compared to simulated breakthrough curve variability for different realizations of the heterogeneous distribution. Analysis of the results reveals that the NAPL entrapment has a significant impact on the temporal moments of conservative-tracer breakthrough curves. © 2003 Elsevier B.V. All rights reserved.
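
    The temporal moments referred to above are integrals of the breakthrough curve; a minimal sketch of how the mean arrival time and spread are computed from a measured C(t) is given below (the curve itself is hypothetical).

        # Sketch: temporal moments of a breakthrough curve C(t).
        import numpy as np

        def temporal_moments(t, c):
            m0 = np.trapz(c, t)                        # zeroth moment (recovered mass)
            m1 = np.trapz(t * c, t)                    # first moment
            m2 = np.trapz(t ** 2 * c, t)               # second moment
            mean_arrival = m1 / m0
            variance = m2 / m0 - mean_arrival ** 2     # spread about the mean arrival time
            return m0, mean_arrival, variance

        # Hypothetical breakthrough curve for illustration
        t = np.linspace(0.0, 50.0, 500)                # hours
        c = np.exp(-((t - 20.0) / 5.0) ** 2)           # concentration pulse
        print(temporal_moments(t, c))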

  6. The Effects of Gravitational Instabilities on Gas Giant Planet Migration in Protoplanetary Disks

    NASA Astrophysics Data System (ADS)

    Michael, Scott A.; Durisen, R. H.

    2010-05-01

    In this paper we conduct several three-dimensional radiative hydrodynamic simulations to explore the effect of the inclusion of gas giant planets in gravitationally unstable protoplanetary disks. We compare several simulations carried out with the CHYMERA code: a baseline simulation without a planet, and three simulations with planets of 0.3, 1, and 3 Jupiter masses. The planets are inserted into the baseline simulation after the gravitational instabilities (GIs) have grown to non-linear amplitude. The planets are inserted at the same radius, which coincides with the co-rotation radius of the dominant global mode in the baseline simulation. We examine the effect that the GIs have on migration rates as well as their potential to halt inward migration. We also examine the effect the insertion of the planet has on the global torques caused by the GIs. Furthermore, we explore the relationship between planet mass, migration rates, and the effect on the GIs.

  7. Investigation on aerodynamic characteristics of baseline-II E-2 blended wing-body aircraft with canard via computational simulation

    NASA Astrophysics Data System (ADS)

    Nasir, Rizal E. M.; Ali, Zurriati; Kuntjoro, Wahyu; Wisnoe, Wirachman

    2012-06-01

    Previous wind tunnel tests have demonstrated the improved aerodynamic characteristics of the Baseline-II E-2 Blended Wing-Body (BWB) aircraft studied at Universiti Teknologi Mara. The E-2 is a version of the Baseline-II BWB with a modified outer wing and a larger canard, designed solely to gain favourable longitudinal static stability during flight. This paper highlights some results from the current investigation of this aircraft via computational fluid dynamics simulation as a means of validating the wind tunnel test results. The simulation is conducted using the standard one-equation Spalart-Allmaras turbulence model with a polyhedral mesh. The simulated flight conditions are matched to those of the wind tunnel test. The simulation shows lift, drag, and moment results close to the values found in the wind tunnel test, but only within the range of angles of attack where the lift change is linear. Beyond the linear region, clear differences between the computational simulation and wind tunnel test results are observed. It is recommended that a different type of mathematical model be used to simulate flight conditions beyond the linear lift region.

  8. Effects of baseline conditions on the simulated hydrologic response to projected climate change

    USGS Publications Warehouse

    Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.

    2011-01-01

    Changes in temperature and precipitation projected from five general circulation models, using one late-twentieth-century and three twenty-first-century emission scenarios, were downscaled to three different baseline conditions. Baseline conditions are periods of measured temperature and precipitation data selected to represent twentieth-century climate. The hydrologic effects of the climate projections are evaluated using the Precipitation-Runoff Modeling System (PRMS), which is a watershed hydrology simulation model. The Almanor Catchment in the North Fork of the Feather River basin, California, is used as a case study. Differences and similarities between PRMS simulations of hydrologic components (i.e., snowpack formation and melt, evapotranspiration, and streamflow) are examined, and results indicate that the selection of a specific time period used for baseline conditions has a substantial effect on some, but not all, hydrologic variables. This effect seems to be amplified in hydrologic variables, which accumulate over time, such as soil-moisture content. Results also indicate that uncertainty related to the selection of baseline conditions should be evaluated using a range of different baseline conditions. This is particularly important for studies in basins with highly variable climate, such as the Almanor Catchment.

  9. An Analysis of Results of a High-Resolution World Ocean Circulation Model.

    DTIC Science & Technology

    1988-03-01

    Level Experiments: a. Baseline (Laplacian Mixing) Integration; b. Isopycnal Mixing Integration. 3. One-Half Degree, Twenty Level Experiments: a. Baseline (Three Year Interior Restoring) Integration ... TWENTY LEVEL EXPERIMENTS: 1. Baseline (Laplacian Mixing) Integration; 2. Isopycnal Mixing Integration

  10. Examination of the Entry to Burn and Burn Control for the ITER 15 MA Baseline and Other Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kesse, Charles E.; Kim, S-H.; Koechl, F.

    2014-09-01

    The entry to burn and flattop burn control in ITER will be a critical need from the first DT experiments. Simulations are used to address time-dependent behavior under a range of possible conditions that include injected power level, impurity content (W, Ar, Be), density evolution, H-mode regimes, controlled parameter (Wth, Pnet, Pfusion), and actuator (Paux, fueling, fAr), with a range of transport models. A number of physics issues at the L-H transition require better understanding to project to ITER; however, simulations indicate viable control with sufficient auxiliary power (up to 73 MW), while lower powers become marginal (as low as 43 MW).

  11. Handling Qualities Evaluations of Low Complexity Model Reference Adaptive Controllers for Reduced Pitch and Roll Damping Scenarios

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Schaefer, Jacob; Burken, John J.; Johnson, Marcus; Nguyen, Nhan

    2011-01-01

    National Aeronautics and Space Administration (NASA) researchers have conducted a series of flight experiments designed to study the effects of varying levels of adaptive controller complexity on the performance and handling qualities of an aircraft under various simulated failure or damage conditions. A baseline, nonlinear dynamic inversion controller was augmented with three variations of a model reference adaptive control design. The simplest design consisted of a single adaptive parameter in each of the pitch and roll axes computed using a basic gradient-based update law. A second design was built upon the first by increasing the complexity of the update law. The third and most complex design added an additional adaptive parameter to each axis. Flight tests were conducted using NASA's Full-scale Advanced Systems Testbed, a highly modified F-18 aircraft that contains a research flight control system capable of housing advanced flight controls experiments. Each controller was evaluated against a suite of simulated failures and damage ranging from destabilization of the pitch and roll axes to significant coupling between the axes. Two pilots evaluated the three adaptive controllers as well as the non-adaptive baseline controller in a variety of dynamic maneuvers and precision flying tasks designed to uncover potential deficiencies in the handling qualities of the aircraft, and adverse interactions between the pilot and the adaptive controllers. The work was completed as part of the Integrated Resilient Aircraft Control Project under NASA's Aviation Safety Program.
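
    A rough scalar sketch of what "a single adaptive parameter ... computed using a basic gradient-based update law" can look like is given below; the plant, reference model, gains, and signals are all hypothetical stand-ins, not the flight control laws flown in these experiments.

        # Toy sketch: one adaptive gain for one axis, updated with a basic
        # gradient (MIT-rule-like) law, augmenting a fixed baseline command.
        gamma, dt = 0.5, 0.01      # adaptation gain and time step, hypothetical
        theta = 0.0                # single adaptive parameter
        x, x_ref = 0.0, 0.0        # plant state and reference-model state

        for k in range(2000):
            cmd = 1.0 if k > 100 else 0.0     # pilot step command
            x_ref += dt * (cmd - x_ref) / 0.5 # first-order reference model
            e = x - x_ref                     # tracking error
            u = cmd + theta * x               # baseline command plus adaptive augmentation
            x += dt * (-0.2 * x + 0.5 * u)    # degraded (low-damping) plant, hypothetical
            theta -= gamma * e * x * dt       # basic gradient-based update law

        print("adapted parameter:", theta)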

  12. Focused and Corrective Feedback Versus Structured and Supported Debriefing in a Simulation-Based Cardiac Arrest Team Training: A Pilot Randomized Controlled Study.

    PubMed

    Kim, Ji-Hoon; Kim, Young-Min; Park, Seong Heui; Ju, Eun A; Choi, Se Min; Hong, Tai Yong

    2017-06-01

    The aim of the study was to compare the educational impact of two postsimulation debriefing methods-focused and corrective feedback (FCF) versus Structured and Supported Debriefing (SSD)-on team dynamics in simulation-based cardiac arrest team training. This was a pilot randomized controlled study conducted at a simulation center. Fourth-year medical students were randomly assigned to the FCF or SSD group, with each team composed of six students and a confederate. Each team participated in two simulations and the assigned debriefing (FCF or SSD) sessions and then underwent a test simulation. Two trained raters blindly assessed all of the recorded simulations using checklists. The primary outcome was the improvement in team dynamics scores between baseline and test simulation. The secondary outcomes were improvements before and after training in team clinical performance scores, self-assessed comprehension of and confidence in cardiac arrest management and team dynamics, as well as evaluations of the postsimulation debriefing intervention. In total, 95 students participated [FCF (8 teams, n = 47) and SSD (8 teams, n = 48)]. The SSD team dynamics score during the test simulation was higher than at baseline [baseline: 74.5 (65.9-80.9), test: 85.0 (71.9-87.6), P = 0.035]. However, there were no differences in the improvement in the team dynamics or team clinical performance scores between the two groups (P = 0.328, respectively). There was no significant difference in improvement in team dynamics scores during the test simulation compared with baseline between the SSD and FCF groups in a simulation-based cardiac arrest team training in fourth-year Korean medical students.

  13. Simulation Study of Flap Effects on a Commercial Transport Airplane in Upset Conditions

    NASA Technical Reports Server (NTRS)

    Cunningham, Kevin; Foster, John V.; Shah, Gautam H.; Stewart, Eric C.; Ventura, Robin N.; Rivers, Robert A.; Wilborn, James E.; Gato, William

    2005-01-01

    As part of NASA's Aviation Safety and Security Program, a simulation study of a twinjet transport airplane crew training simulation was conducted to address fidelity for upset or loss of control conditions and to study the effect of flap configuration in those regimes. Piloted and desktop simulations were used to compare the baseline crew training simulation model with an enhanced aerodynamic model that was developed for high-angle-of-attack conditions. These studies were conducted with various flap configurations and addressed the approach-to-stall, stall, and post-stall flight regimes. The enhanced simulation model showed that flap configuration had a significant effect on the character of departures that occurred during post-stall flight. Preliminary comparisons with flight test data indicate that the enhanced model is a significant improvement over the baseline. Some of the unrepresentative characteristics that are predicted by the baseline crew training simulation for flight in the post-stall regime have been identified. This paper presents preliminary results of this simulation study and discusses key issues regarding predicted flight dynamics characteristics during extreme upset and loss-of-control flight conditions with different flap configurations.

  14. Using Virtual Patient Simulations to Prepare Primary Health Care Professionals to Conduct Substance Use and Mental Health Screening and Brief Intervention.

    PubMed

    Albright, Glenn; Bryan, Craig; Adam, Cyrille; McMillan, Jeremiah; Shockley, Kristen

    Primary health care professionals are in an excellent position to identify, screen, and conduct brief interventions for patients with mental health and substance use disorders. However, discomfort in initiating conversations about behavioral health, time concerns, lack of knowledge about screening tools, and treatment resources are barriers. This study examines the impact of an online simulation where users practice role-playing with emotionally responsive virtual patients to learn motivational interviewing strategies to better manage screening, brief interventions, and referral conversations. Baseline data were collected from 227 participants who were then randomly assigned into the treatment or wait-list control groups. Treatment group participants then completed the simulation, postsimulation survey, and 3-month follow-up survey. Results showed significant increases in knowledge/skill to identify and engage in collaborative decision making with patients. Results strongly suggest that role-play simulation experiences can be an effective means of teaching screening and brief intervention.

  15. Contingency contracting with delinquents: effects of a brief training manual on staff contract negotiation and writing skills.

    PubMed

    Welch, S J; Holborn, S W

    1988-01-01

    A brief training manual was developed for the purpose of teaching child-care workers to contingency contract with delinquent youths living in residential care facilities. The manual was designed to require minimal supplementary training by a professional. In Experiment 1 a multiple baseline design was used to assess the effect of the manual on 4 child-care workers' contract negotiation and writing behaviors. Experiment 2 consisted of four A-B systematic replications. Behaviors were assessed within the context of analogue training simulations and generalization tests with delinquent youths. Results from the analogue simulations indicated that the manual was successful in increasing both types of behaviors to a level of proficiency that equaled or surpassed that of behaviorally trained graduate students, and results from the generalization tests indicated that the child-care workers were able to apply their newly acquired contracting skills with delinquent youths. Procedural reliability varied across child-care workers, but was usually high.

  16. Contingency contracting with delinquents: effects of a brief training manual on staff contract negotiation and writing skills.

    PubMed Central

    Welch, S J; Holborn, S W

    1988-01-01

    A brief training manual was developed for the purpose of teaching child-care workers to contingency contract with delinquent youths living in residential care facilities. The manual was designed to require minimal supplementary training by a professional. In Experiment 1 a multiple baseline design was used to assess the effect of the manual on 4 child-care workers' contract negotiation and writing behaviors. Experiment 2 consisted of four A-B systematic replications. Behaviors were assessed within the context of analogue training simulations and generalization tests with delinquent youths. Results from the analogue simulations indicated that the manual was successful in increasing both types of behaviors to a level of proficiency that equaled or surpassed that of behaviorally trained graduate students, and results from the generalization tests indicated that the child-care workers were able to apply their newly acquired contracting skills with delinquent youths. Procedural reliability varied across child-care workers, but was usually high. PMID:3225253

  17. Comparative-Effectiveness of Simulation-Based Deliberate Practice Versus Self-Guided Practice on Resident Anesthesiologists' Acquisition of Ultrasound-Guided Regional Anesthesia Skills.

    PubMed

    Udani, Ankeet Deepak; Harrison, T Kyle; Mariano, Edward R; Derby, Ryan; Kan, Jack; Ganaway, Toni; Shum, Cynthia; Gaba, David M; Tanaka, Pedro; Kou, Alex; Howard, Steven K

    2016-01-01

    Simulation-based education strategies to teach regional anesthesia have been described, but their efficacy has largely been assumed. We designed this study to determine whether residents trained using the simulation-based strategy of deliberate practice show greater improvement of ultrasound-guided regional anesthesia (UGRA) skills than residents trained using self-guided practice in simulation. Anesthesiology residents new to UGRA were randomized to participate in either simulation-based deliberate practice (intervention) or self-guided practice (control). Participants were recorded and assessed while performing simulated peripheral nerve blocks at baseline, immediately after the experimental condition, and 3 months after enrollment. Subject performance was scored from video by 2 blinded reviewers using a composite tool. The amount of time each participant spent in deliberate or self-guided practice was recorded. Twenty-eight participants completed the study. Both groups showed within-group improvement from baseline scores immediately after the curriculum and 3 months following study enrollment. There was no difference between groups in the change in composite scores from baseline either immediately after the curriculum (P = 0.461) or 3 months following study enrollment (P = 0.927). The average time subjects spent in simulation practice was 6.8 minutes for the control group compared with 48.5 minutes for the intervention group (P < 0.001). In this comparative effectiveness study, there was no difference in acquisition and retention of skills in UGRA for novice residents taught by either simulation-based deliberate practice or self-guided practice. Both methods increased skill from baseline; however, self-guided practice required less time and fewer faculty resources.

  18. Synthetic Vision Enhances Situation Awareness and RNP Capabilities for Terrain-Challenged Approaches

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Prinzel, Lawrence J., III; Bailey, Randall E.; Arthur, Jarvis J., III

    2003-01-01

    The Synthetic Vision Systems (SVS) Project of the Aviation Safety Program is striving to eliminate poor visibility as a causal factor in aircraft accidents, as well as to enhance the operational capabilities of all aircraft, through the display of computer-generated imagery derived from an onboard database of terrain, obstacle, and airport information. To achieve these objectives, NASA 757 flight test research was conducted at the Eagle-Vail, Colorado airport to evaluate three SVS display types (Head-Up Display, Head-Down Size A, Head-Down Size X) and two terrain texture methods (photo-realistic, generic) in comparison to the simulated Baseline Boeing-757 Electronic Attitude Direction Indicator and Navigation / Terrain Awareness and Warning System displays. These independent variables were evaluated for situation awareness, path error, and workload while making approaches to Runways 25 and 07 and during simulated engine-out Cottonwood 2 and KREMM departures. The results of the experiment showed significantly improved situation awareness and performance and reduced workload for the SVS concepts compared to the Baseline displays, and confirmed the retrofit capability of the Head-Up Display and Size A SVS concepts. The research also demonstrated that the pathway and pursuit guidance used within the SVS concepts achieved required navigation performance (RNP) criteria.

  19. Bronchoscopy Simulation Training as a Tool in Medical School Education.

    PubMed

    Gopal, Mallika; Skobodzinski, Alexus A; Sterbling, Helene M; Rao, Sowmya R; LaChapelle, Christopher; Suzuki, Kei; Litle, Virginia R

    2018-07-01

    Procedural simulation training is rare at the medical school level and little is known about its usefulness in improving anatomic understanding and procedural confidence in students. Our aim is to assess the impact of bronchoscopy simulation training on bronchial anatomy knowledge and technical skills in medical students. Medical students were recruited by email, consented, and asked to fill out a survey regarding their baseline experience. Two thoracic surgeons measured their knowledge of bronchoscopy on a virtual reality bronchoscopy simulator using the Bronchoscopy Skills and Tasks Assessment Tool (BSTAT), a validated 65-point checklist (46 for anatomy, 19 for simulation). Students performed four self-directed training sessions of 15 minutes per week. A posttraining survey and BSTAT were completed afterward. Differences between pretraining and posttraining scores were analyzed with paired Student's t tests and random intercept linear regression models accounting for baseline BSTAT score, total training time, and training year. The study was completed by 47 medical students with a mean training time of 81.5 ± 26.8 minutes. Mean total BSTAT score increased significantly from 12.3 ± 5.9 to 48.0 ± 12.9 (p < 0.0001); mean scores for bronchial anatomy increased from 0.1 ± 0.9 to 31.1 ± 12.3 (p < 0.0001); and bronchoscopy navigational skills increased from 12.1 ± 5.7 to 17.4 ± 2.5 (p < 0.0001). Total training time and frequency of training did not have a significant impact on level of improvement. Self-driven bronchoscopy simulation training in medical students led to improvements in bronchial anatomy knowledge and bronchoscopy skills. Further investigation is under way to determine the impact of bronchoscopy simulation training on future specialty interest and long-term skills retention. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  20. 'I didn't see that coming': simulated visual fields and driving hazard perception test performance.

    PubMed

    Glen, Fiona C; Smith, Nicholas D; Jones, Lee; Crabb, David P

    2016-09-01

    Evidence is limited regarding specific types of visual field loss associated with unsafe driving. We use novel gaze-contingent software to examine the effect of simulated visual field loss on computer-based driving hazard detection with the specific aim of testing the impact of scotomata located to the right and left of fixation. The 'hazard perception test' is a component of the UK driving licence examination, which measures the speed of detecting 15 different hazards in a series of real-life driving films. We have developed a novel eye-tracking and computer setup capable of generating a realistic gaze-contingent scotoma simulation (GazeSS) overlaid on film content. Thirty drivers with healthy vision completed three versions of the hazard perception test in a repeated-measures experiment. In two versions, GazeSS simulated a scotoma in the binocular field of view to the left or right of fixation. A third version was unmodified to establish baseline performance. Participants' mean baseline hazard perception test score was 51 ± 7 (out of 75). This reduced to 46 ± 9 and 46 ± 11 when completing the task with a binocular visual field defect located to the left and right of fixation, respectively. While the main effect of simulated visual field loss on performance was statistically significant (p = 0.007), there were no average differences in the experimental conditions where a scotoma was located in the binocular visual field to the right or left of fixation. Simulated visual field loss impairs driving hazard detection on a computer-based test. There was no statistically significant difference in average performance when the simulated scotoma was located to the right or left of fixation of the binocular visual field, but certain types of hazard caused more difficulties than others. © 2016 Optometry Australia.

  1. Short and Long Baseline Neutrino Experiments

    NASA Astrophysics Data System (ADS)

    Autiero, Dario

    2005-04-01

    These two lectures discuss the past and current neutrino oscillation experiments performed with man-made neutrino sources, such as accelerators and nuclear reactors. The search for neutrino oscillations is a remarkable effort that has been pursued over three decades. It is therefore interesting to discuss the short and long baseline neutrino experiments in their historical context and to see how this line of research evolved up to the present generation of experiments, looking at what was learnt from past experiments and how this experience is used in the current ones. The first lecture focuses on the past generation of short baseline experiments (NOMAD and CHORUS) performed at CERN and ends with LSND and MiniBooNE. The second lecture discusses how the long baseline experiments developed after the CHOOZ and atmospheric neutrino results, and presents in detail the K2K and MINOS experiments and the CNGS program.

  2. Using compressive sensing to recover images from PET scanners with partial detector rings.

    PubMed

    Valiollahzadeh, SeyyedMajid; Clark, John W; Mawlawi, Osama

    2015-01-01

    Most positron emission tomography/computed tomography (PET/CT) scanners consist of tightly packed discrete detector rings to improve scanner efficiency. The authors' aim was to use compressive sensing (CS) techniques in PET imaging to investigate the possibility of decreasing the number of detector elements per ring (introducing gaps) while maintaining image quality. A CS model based on a combination of gradient magnitude and wavelet domains (wavelet-TV) was developed to recover missing observations in PET data acquisition. The model was designed to minimize the total variation (TV) and L1-norm of wavelet coefficients while constrained by the partially observed data. The CS model also incorporated a Poisson noise term that modeled the observed noise while suppressing its contribution by penalizing the Poisson log likelihood function. Three experiments were performed to evaluate the proposed CS recovery algorithm: a simulation study, a phantom study, and six patient studies. The simulation dataset comprised six disks of various sizes in a uniform background with an activity concentration ratio of 5:1. The simulated image was multiplied by the system matrix to obtain the corresponding sinogram and then Poisson noise was added. The resultant sinogram was masked to create the effect of partial detector removal and then the proposed CS algorithm was applied to recover the missing PET data. In addition, different levels of noise were simulated to assess the performance of the proposed algorithm. For the phantom study, an IEC phantom with six internal spheres each filled with F-18 at an activity-to-background ratio of 10:1 was used. The phantom was imaged twice on an RX PET/CT scanner: once with all detectors operational (baseline) and once with four detector blocks (11%) turned off at each of 0°, 90°, 180°, and 270° (partially sampled). The partially acquired sinograms were then recovered using the proposed algorithm. For the third test, PET images from six patient studies were investigated using the same strategy as the phantom study. The recovered images using WTV and TV as well as the partially sampled images from all three experiments were then compared with the fully sampled images (the baseline). Comparisons were done by calculating the mean error (%bias), root mean square error (RMSE), contrast recovery (CR), and SNR of activity concentration in regions of interest drawn in the background as well as the disks, spheres, and lesions. For the simulation study, the mean error, RMSE, and CR for the WTV (TV) recovered images were 0.26% (0.48%), 2.6% (2.9%), 97% (96%), respectively, when compared to baseline. For the partially sampled images, these results were 22.5%, 45.9%, and 64%, respectively. For the simulation study, the average SNR for the baseline was 41.7, while for the WTV (TV) recovered images it was 44.2 (44.0). The phantom study showed similar trends with 5.4% (18.2%), 15.6% (18.8%), and 78% (60%), respectively, for the WTV (TV) images and 33%, 34.3%, and 69% for the partially sampled images. For the phantom study, the average SNR for the baseline was 14.7, while for the WTV (TV) recovered images it was 13.7 (11.9). Finally, the average of these values for the six patient studies for the WTV-recovered, TV, and partially sampled images was 1%, 7.2%, 92% and 1.3%, 15.1%, 87%, and 27%, 25.8%, 45%, respectively. CS with WTV is capable of recovering PET images with good quantitative accuracy from partially sampled data. Such an approach can be used to potentially reduce the cost of scanners while maintaining good image quality.
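
    As a purely illustrative reading of the wavelet-TV model described above (the weights λ and μ, the wavelet transform Ψ, and the sampling mask M are assumptions of this sketch, not notation taken from the paper), the recovery can be written as a penalized optimization of the form:

```latex
\hat{x} = \arg\min_{x \ge 0} \;
  \|x\|_{\mathrm{TV}}
  + \lambda \, \|\Psi x\|_{1}
  + \mu \sum_{i \in \Omega} \Bigl[ (MAx)_i - y_i \log (MAx)_i \Bigr]
```

    Here A is the PET system matrix, M masks the sinogram bins lost to the removed detector blocks, Ω indexes the observed bins, y is the partially observed sinogram, and the last term plays the role of the Poisson log-likelihood penalty mentioned in the abstract.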

  3. Using compressive sensing to recover images from PET scanners with partial detector rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valiollahzadeh, SeyyedMajid, E-mail: sv4@rice.edu; Clark, John W.; Mawlawi, Osama

    2015-01-15

    Purpose: Most positron emission tomography/computed tomography (PET/CT) scanners consist of tightly packed discrete detector rings to improve scanner efficiency. The authors’ aim was to use compressive sensing (CS) techniques in PET imaging to investigate the possibility of decreasing the number of detector elements per ring (introducing gaps) while maintaining image quality. Methods: A CS model based on a combination of gradient magnitude and wavelet domains (wavelet-TV) was developed to recover missing observations in PET data acquisition. The model was designed to minimize the total variation (TV) and L1-norm of wavelet coefficients while constrained by the partially observed data. The CS model also incorporated a Poisson noise term that modeled the observed noise while suppressing its contribution by penalizing the Poisson log likelihood function. Three experiments were performed to evaluate the proposed CS recovery algorithm: a simulation study, a phantom study, and six patient studies. The simulation dataset comprised six disks of various sizes in a uniform background with an activity concentration ratio of 5:1. The simulated image was multiplied by the system matrix to obtain the corresponding sinogram and then Poisson noise was added. The resultant sinogram was masked to create the effect of partial detector removal and then the proposed CS algorithm was applied to recover the missing PET data. In addition, different levels of noise were simulated to assess the performance of the proposed algorithm. For the phantom study, an IEC phantom with six internal spheres each filled with F-18 at an activity-to-background ratio of 10:1 was used. The phantom was imaged twice on an RX PET/CT scanner: once with all detectors operational (baseline) and once with four detector blocks (11%) turned off at each of 0°, 90°, 180°, and 270° (partially sampled). The partially acquired sinograms were then recovered using the proposed algorithm. For the third test, PET images from six patient studies were investigated using the same strategy as the phantom study. The recovered images using WTV and TV as well as the partially sampled images from all three experiments were then compared with the fully sampled images (the baseline). Comparisons were done by calculating the mean error (%bias), root mean square error (RMSE), contrast recovery (CR), and SNR of activity concentration in regions of interest drawn in the background as well as the disks, spheres, and lesions. Results: For the simulation study, the mean error, RMSE, and CR for the WTV (TV) recovered images were 0.26% (0.48%), 2.6% (2.9%), 97% (96%), respectively, when compared to baseline. For the partially sampled images, these results were 22.5%, 45.9%, and 64%, respectively. For the simulation study, the average SNR for the baseline was 41.7, while for the WTV (TV) recovered images it was 44.2 (44.0). The phantom study showed similar trends with 5.4% (18.2%), 15.6% (18.8%), and 78% (60%), respectively, for the WTV (TV) images and 33%, 34.3%, and 69% for the partially sampled images. For the phantom study, the average SNR for the baseline was 14.7, while for the WTV (TV) recovered images it was 13.7 (11.9). Finally, the average of these values for the six patient studies for the WTV-recovered, TV, and partially sampled images was 1%, 7.2%, 92% and 1.3%, 15.1%, 87%, and 27%, 25.8%, 45%, respectively. Conclusions: CS with WTV is capable of recovering PET images with good quantitative accuracy from partially sampled data. Such an approach can be used to potentially reduce the cost of scanners while maintaining good image quality.

  4. Pre-Test CFD for the Design and Execution of the Enhanced Injection and Mixing Project at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Drozda, Tomasz G.; Axdahl, Erik L.; Cabell, Karen F.

    2014-01-01

    With the increasing costs of physics experiments and the simultaneous increase in availability and maturity of computational tools, it is not surprising that computational fluid dynamics (CFD) is playing an increasingly important role, not only in post-test investigations, but also in the early stages of experimental planning. This paper describes a CFD-based effort executed in close collaboration between computational fluid dynamicists and experimentalists to develop a virtual experiment during the early planning stages of the Enhanced Injection and Mixing project at NASA Langley Research Center. This project aims to investigate supersonic combustion ramjet (scramjet) fuel injection and mixing physics, improve the understanding of underlying physical processes, and develop enhancement strategies and functional relationships relevant to flight Mach numbers greater than 8. The purpose of the virtual experiment was to provide flow field data to aid in the design of the experimental apparatus and the in-stream rake probes, to verify the nonintrusive measurements based on NO-PLIF, and to perform pre-test analysis of quantities obtainable from the experiment and CFD. The approach also allowed the joint team to develop common data processing and analysis tools, and to test research ideas. The virtual experiment consisted of a series of Reynolds-averaged simulations (RAS). These simulations included the facility nozzle, the experimental apparatus with a baseline strut injector, and the test cabin. Pure helium and helium-air mixtures were used to determine the efficacy of different inert gases to model hydrogen injection. The results of the simulations were analyzed by computing mixing efficiency, total pressure recovery, and stream thrust potential. As the experimental effort progresses, the simulation results will be compared with the experimental data to calibrate the modeling constants present in the CFD and validate simulation fidelity. CFD will also be used to investigate different injector concepts, improve understanding of the flow structure and flow physics, and develop functional relationships. Both RAS and large eddy simulations (LES) are planned for post-test analysis of the experimental data.
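
    For context, a commonly used definition of mixing efficiency in scramjet injection studies integrates the fuel mass flux that could burn at stoichiometric proportions over a cross-stream analysis plane; the abstract does not state which definition the project adopts, so the form below is only a representative example:

```latex
\eta_{mix} = \frac{\displaystyle\int \alpha_{R}\, \rho u \, dA}{\displaystyle\int \alpha\, \rho u \, dA},
\qquad
\alpha_{R} =
\begin{cases}
  \alpha, & \alpha \le \alpha_{s} \\[4pt]
  \alpha_{s}\,\dfrac{1-\alpha}{1-\alpha_{s}}, & \alpha > \alpha_{s}
\end{cases}
```

    where α is the local fuel mass fraction, α_s its stoichiometric value, and ρu the mass flux normal to the analysis plane.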

  5. Modeling forest dynamics along climate gradients in Bolivia

    NASA Astrophysics Data System (ADS)

    Seiler, C.; Hutjes, R. W. A.; Kruijt, B.; Quispe, J.; Añez, S.; Arora, V. K.; Melton, J. R.; Hickler, T.; Kabat, P.

    2014-05-01

    Dynamic vegetation models have been used to assess the resilience of tropical forests to climate change, but the global application of these modeling experiments often misrepresents carbon dynamics at a regional level, limiting the validity of future projections. Here a dynamic vegetation model (Lund Potsdam Jena General Ecosystem Simulator) was adapted to simulate present-day potential vegetation as a baseline for climate change impact assessments in the evergreen and deciduous forests of Bolivia. Results were compared to biomass measurements (819 plots) and remote sensing data. Using regional parameter values for allometric relations, specific leaf area, wood density, and disturbance interval, a realistic transition from the evergreen Amazon to the deciduous dry forest was simulated. This transition coincided with threshold values for precipitation (1400 mm yr-1) and water deficit (i.e., potential evapotranspiration minus precipitation) (-830 mm yr-1), beyond which leaf abscission became a competitive advantage. Significant correlations were found between modeled and observed values of seasonal leaf abscission (R2 = 0.6, p <0.001) and vegetation carbon (R2 = 0.31, p <0.01). Modeled Gross Primary Productivity (GPP) and remotely sensed normalized difference vegetation index showed that dry forests were more sensitive to rainfall anomalies than wet forests. GPP was positively correlated to the El Niño-Southern Oscillation index in the Amazon and negatively correlated to consecutive dry days. Decreasing rainfall trends were simulated to reduce GPP in the Amazon. The current model setup provides a baseline for assessing the potential impacts of climate change in the transition zone from wet to dry tropical forests in Bolivia.

  6. Comparison of Baseline Wander Removal Techniques considering the Preservation of ST Changes in the Ischemic ECG: A Simulation Study

    PubMed Central

    Pilia, Nicolas; Schulze, Walther H. W.; Dössel, Olaf

    2017-01-01

    The most important ECG marker for the diagnosis of ischemia or infarction is a change in the ST segment. Baseline wander is a typical artifact that corrupts the recorded ECG and can hinder the correct diagnosis of such diseases. For the purpose of finding the best-suited filter for the removal of baseline wander, the ground truth about the ST change prior to the corrupting artifact and the subsequent filtering process is needed. In order to create the desired reference, we used a large simulation study that allowed us to represent the ischemic heart at a multiscale level from the cardiac myocyte to the surface ECG. We also created a realistic model of baseline wander to evaluate five filtering techniques commonly used in the literature. In the simulation study, we included a total of 5.5 million signals coming from 765 electrophysiological setups. We found that the best-performing method was the wavelet-based baseline cancellation. However, for medical applications, the Butterworth high-pass filter is the better choice because it is computationally cheap and almost as accurate. Even though all methods modify the ST segment to some extent, all proved to be better than leaving baseline wander unfiltered. PMID:28373893
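
    A minimal sketch of the Butterworth high-pass option recommended above for clinical use; the 0.5 Hz cutoff, fourth-order design, and zero-phase application are common ECG-processing defaults assumed here, not parameters reported in the abstract.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def remove_baseline_wander(ecg, fs, cutoff_hz=0.5, order=4):
    """High-pass filter an ECG trace to suppress baseline wander.

    Zero-phase filtering (filtfilt) is used so the filter itself does not
    shift the ST segment in time. Cutoff and order are illustrative defaults.
    """
    nyquist = 0.5 * fs
    b, a = butter(order, cutoff_hz / nyquist, btype="highpass")
    return filtfilt(b, a, ecg)

# Example: a 10 s synthetic signal corrupted by a slow 0.2 Hz drift.
fs = 500  # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 1.2 * t)        # stand-in for cardiac activity
wander = 0.8 * np.sin(2 * np.pi * 0.2 * t)  # baseline drift
cleaned = remove_baseline_wander(signal + wander, fs)
```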

  7. The Effects of Attrition on Baseline Comparability in Randomized Experiments in Education: A Meta-Analysis

    ERIC Educational Resources Information Center

    Valentine, Jeffrey C.; McHugh, Cathleen M.

    2007-01-01

    Using meta-analysis, randomized experiments in education that either clearly did or clearly did not experience student attrition were examined for the baseline comparability of groups. Results from 35 studies suggested that after attrition, the observed measures of baseline comparability of groups did not differ more than would be expected given…

  8. Ensembles of novelty detection classifiers for structural health monitoring using guided waves

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Karpenko, Oleksii; Koricho, Ermias; Khomenko, Anton; Haq, Mahmoodul; Udpa, Lalita

    2018-01-01

    Guided wave structural health monitoring uses sparse sensor networks embedded in sophisticated structures for defect detection and characterization. The biggest challenge of those sensor networks is developing robust techniques for reliable damage detection under changing environmental and operating conditions (EOC). To address this challenge, we develop a novelty classifier for damage detection based on one class support vector machines. We identify appropriate features for damage detection and introduce a feature aggregation method which quadratically increases the number of available training observations. We adopt a two-level voting scheme by using an ensemble of classifiers and predictions. Each classifier is trained on a different segment of the guided wave signal, and each classifier makes an ensemble of predictions based on a single observation. Using this approach, the classifier can be trained using a small number of baseline signals. We study the performance using Monte-Carlo simulations of an analytical model and data from impact damage experiments on a glass fiber composite plate. We also demonstrate the classifier performance using two types of baseline signals: fixed and rolling baseline training set. The former requires prior knowledge of baseline signals from all EOC, while the latter does not and leverages the fact that EOC vary slowly over time and can be modeled as a Gaussian process.
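
    A minimal sketch of the two-level voting idea: one one-class SVM is trained per signal segment, and a new measurement is flagged as damage when a majority of the segment classifiers call it novel. Raw signal samples stand in for the paper's engineered and aggregated features, and the segment count, kernel settings, and voting threshold are assumptions of this illustration.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def train_segment_classifiers(baseline_signals, n_segments=8, nu=0.05):
    """Train one novelty detector per guided-wave signal segment.

    baseline_signals: array of shape (n_baselines, n_samples) recorded on the
    undamaged structure under varying environmental/operating conditions.
    """
    segments = np.array_split(baseline_signals, n_segments, axis=1)
    return [OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit(seg)
            for seg in segments]

def is_damaged(classifiers, new_signal, vote_fraction=0.5):
    """Flag damage when a majority of segment classifiers call the signal novel."""
    segments = np.array_split(new_signal.reshape(1, -1), len(classifiers), axis=1)
    votes = [clf.predict(seg)[0] == -1 for clf, seg in zip(classifiers, segments)]
    return np.mean(votes) > vote_fraction
```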

  9. The effects of response cost and response restriction on a multiple-response repertoire with humans

    PubMed Central

    Crosbie, John

    1993-01-01

    In two experiments a multiple-response repertoire of four free-operant responses was developed with university students as subjects using monetary gain as reinforcement. Following baseline, one of the responses was reduced either by making monetary loss contingent upon it (response cost) or by removing it from the repertoire (response restriction). In Experiment 1 a multielement baseline design was employed in which baseline and restriction or response-cost contingencies alternated semirandomly every 3 minutes. In Experiment 2 a reversal design was employed (i.e., baseline, restriction or response cost, then baseline), and each response required a different amount of effort. Both experiments had the following results: (a) The target response decreased substantially; (b) most nontarget responses increased, and the rest remained near their baseline levels; and (c) no support was found for Dunham's hierarchical, most frequent follower, or greatest temporal similarity rules. For several subjects, the least probable responses during baseline increased most, and the most probable responses increased least. Furthermore, in Experiment 2, responses with the lowest frequency of reinforcement increased most (for all 7 subjects), and those with the greatest frequency of reinforcement increased least (for 5 subjects). PMID:16812683

  10. Influences of brain tissue poroelastic constants on intracranial pressure (ICP) during constant-rate infusion.

    PubMed

    Li, Xiaogai; von Holst, Hans; Kleiven, Svein

    2013-01-01

    A 3D finite element (FE) model has been developed to study the mean intracranial pressure (ICP) response during constant-rate infusion using linear poroelasticity. Due to the uncertainties in the poroelastic constants for brain tissue, the influence of each of the main parameters on the transient ICP infusion curve was studied. As a prerequisite for transient analysis, steady-state simulations were performed first. The simulated steady-state pressure distribution in the brain tissue for a normal cerebrospinal fluid (CSF) circulation system showed good correlation with experiments from the literature. Furthermore, steady-state ICP closely followed the infusion experiments at different infusion rates. The verified steady-state models then served as a baseline for the subsequent transient models. For transient analysis, the simulated ICP shows a similar tendency to that found in the experiments, however, different values of the poroelastic constants have a significant effect on the infusion curve. The influence of the main poroelastic parameters including the Biot coefficient α, Skempton coefficient B, drained Young's modulus E, Poisson's ratio ν, permeability κ, CSF absorption conductance C(b) and external venous pressure p(b) was studied to investigate the influence on the pressure response. It was found that the value of the specific storage term S(ε) is the dominant factor that influences the infusion curve, and the drained Young's modulus E was identified as the dominant parameter second to S(ε). Based on the simulated infusion curves from the FE model, artificial neural network (ANN) was used to find an optimised parameter set that best fit the experimental curve. The infusion curves from both the FE simulation and using ANN confirmed the limitation of linear poroelasticity in modelling the transient constant-rate infusion.

  11. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study.

    PubMed

    Egbewale, Bolaji E; Lewis, Martyn; Sim, Julius

    2014-04-09

    Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. 126 hypothetical trial scenarios were evaluated (126,000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power.
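
    A minimal sketch of one such simulated scenario, assuming illustrative values for the treatment effect, pretest-posttest correlation, and baseline imbalance (the study itself evaluated 126 scenarios and 126,000 datasets):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, rho, effect, imbalance = 100, 0.5, 0.3, 0.2  # illustrative values only

# One two-arm trial with correlated pre/post scores and a baseline imbalance.
group = np.repeat([0, 1], n)
pre = rng.normal(0, 1, 2 * n) + imbalance * group
post = rho * pre + np.sqrt(1 - rho**2) * rng.normal(0, 1, 2 * n) + effect * group
df = pd.DataFrame({"group": group, "pre": pre, "post": post, "change": post - pre})

anova  = smf.ols("post ~ group", data=df).fit()        # ignores baseline
csa    = smf.ols("change ~ group", data=df).fit()      # change-score analysis
ancova = smf.ols("post ~ group + pre", data=df).fit()  # adjusts for baseline

for name, model in [("ANOVA", anova), ("CSA", csa), ("ANCOVA", ancova)]:
    print(f"{name:7s} treatment estimate = {model.params['group']:+.3f}")
```

    In this toy setup the ANOVA and change-score estimates drift away from the true effect in opposite directions once the groups are imbalanced at baseline, while the ANCOVA estimate stays close to it, mirroring the pattern reported above.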

  12. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    PubMed Central

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304

  13. Low gravity synthesis of polymers with controlled molecular configuration

    NASA Technical Reports Server (NTRS)

    Heimbuch, A. H.; Parker, J. A.; Schindler, A.; Olf, H. G.

    1975-01-01

    Heterogeneous chemical systems have been studied for the synthesis of isotactic polypropylene in order to establish baseline parameters for the reaction process and to develop sensitive and accurate methods of analysis. These parameters and analytical methods may be used to make a comparison between the polypropylene obtained at one g with that of zero g (gravity). Baseline reaction parameters have been established for the slurry (liquid monomer in heptane/solid catalyst) polymerization of propylene to yield high purity, 98% isotactic polypropylene. Kinetic data for the slurry reaction showed that a sufficient quantity of polymer for complete characterization can be produced in a reaction time of 5 min; this time is compatible with that available on a sounding rocket for a zero-g simulation experiment. The preformed (activated) catalyst was found to be more reproducible in its activity than the in situ formed catalyst.

  14. A VLBI variance-covariance analysis interactive computer program. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Bock, Y.

    1980-01-01

    An interactive computer program (in FORTRAN) for the variance-covariance analysis of VLBI experiments is presented for use in experiment planning, simulation studies and optimal design problems. The interactive mode is especially suited to these types of analyses, providing ease of operation as well as savings in time and cost. The geodetic parameters include baseline vector parameters and variations in polar motion and Earth rotation. A discussion of the theory on which the program is based provides an overview of the VLBI process, emphasizing the areas of interest to geodesy. Special emphasis is placed on the problem of determining correlations between simultaneous observations from a network of stations. A model suitable for covariance analyses is presented. Suggestions towards developing optimal observation schedules are included.

  15. Shuttle mission simulator baseline definition report, volume 2

    NASA Technical Reports Server (NTRS)

    Dahlberg, A. W.; Small, D. E.

    1973-01-01

    The baseline definition report for the space shuttle mission simulator is presented. The subjects discussed are: (1) the general configurations, (2) motion base crew station, (3) instructor operator station complex, (4) display devices, (5) electromagnetic compatibility, (6) external interface equipment, (7) data conversion equipment, (8) fixed base crew station equipment, and (9) computer complex. Block diagrams of the supporting subsystems are provided.

  16. Linear MALDI-ToF simultaneous spectrum deconvolution and baseline removal.

    PubMed

    Picaud, Vincent; Giovannelli, Jean-Francois; Truntzer, Caroline; Charrier, Jean-Philippe; Giremus, Audrey; Grangeat, Pierre; Mercier, Catherine

    2018-04-05

    Thanks to its reasonable cost and simple sample preparation procedure, linear MALDI-ToF spectrometry is a growing technology for clinical microbiology. With appropriate spectrum databases, this technology can be used for early identification of pathogens in body fluids. However, due to the low resolution of linear MALDI-ToF instruments, robust and accurate peak picking remains a challenging task. In this context we propose a new algorithm for extracting peaks from the raw spectrum. With this method the spectrum baseline and spectrum peaks are processed jointly. The approach relies on an additive model consisting of a smooth baseline part plus a sparse peak list convolved with a known peak shape. The model is then fitted under a Gaussian noise model. The proposed method is well suited to processing low resolution spectra with a strong baseline and unresolved peaks. We developed a new peak deconvolution procedure. The paper describes the method derivation and discusses some of its interpretations. The algorithm is then described in a pseudo-code form where the required optimization procedure is detailed. For synthetic data the method is compared to a more conventional approach. The new method reduces artifacts caused by the usual two-step procedure, baseline removal then peak extraction. Finally some results on real linear MALDI-ToF spectra are provided. We introduced a new method for peak picking, where peak deconvolution and baseline computation are performed jointly. On simulated data we showed that this global approach performs better than a classical one where baseline and peaks are processed sequentially. A dedicated experiment has been conducted on real spectra. In this study a collection of spectra of spiked proteins was acquired and then analyzed. Better performance of the proposed method, in terms of accuracy and reproducibility, was observed and validated by an extended statistical analysis.
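
    A minimal sketch of the additive-model idea (smooth baseline plus a sparse train of peaks with a known shape, fitted jointly). The Gaussian peak shape, polynomial baseline basis, and generic L1 solver below are assumptions for illustration and not the dedicated optimization scheme derived in the paper; in particular, the paper keeps the baseline part unpenalized, whereas for brevity this sketch penalizes every coefficient.

```python
import numpy as np
from sklearn.linear_model import Lasso

def joint_peak_baseline_fit(spectrum, mz, peak_width=1.0, poly_degree=3, alpha=0.01):
    """Jointly fit a smooth polynomial baseline and a sparse set of Gaussian
    peaks (known shape) to a short segment of a linear MALDI-ToF spectrum.

    Returns (baseline_estimate, peak_amplitudes). The dictionary has one peak
    atom per m/z bin, so this O(n^2) sketch is only meant for small segments.
    """
    # Candidate peaks: one Gaussian atom centred on every m/z bin.
    peaks = np.exp(-0.5 * ((mz[:, None] - mz[None, :]) / peak_width) ** 2)
    # Smooth baseline represented by a low-order polynomial basis.
    base = np.vander((mz - mz.mean()) / np.ptp(mz), poly_degree + 1)
    design = np.hstack([peaks, base])

    # L1-penalised least squares promotes a sparse peak list.
    coefs = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000).fit(design, spectrum).coef_
    amplitudes = coefs[: len(mz)]
    baseline = base @ coefs[len(mz):]
    return baseline, amplitudes
```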

  17. The Efficacy of Using Synthetic Vision Terrain-Textured Images to Improve Pilot Situation Awareness

    NASA Technical Reports Server (NTRS)

    Uenking, Michael D.; Hughes, Monica F.

    2002-01-01

    The General Aviation Element of the Aviation Safety Program's Synthetic Vision Systems (SVS) Project is developing technology to eliminate low-visibility-induced General Aviation (GA) accidents. SVS displays present computer generated 3-dimensional imagery of the surrounding terrain on the Primary Flight Display (PFD) to greatly enhance the pilot's situation awareness (SA), reducing or eliminating Controlled Flight into Terrain, as well as Low-Visibility Loss of Control accidents. SVS-conducted research is facilitating development of display concepts that provide the pilot with an unobstructed view of the outside terrain, regardless of weather conditions and time of day. A critical component of SVS displays is the appropriate presentation of terrain to the pilot. An experimental study is being conducted at NASA Langley Research Center (LaRC) to explore and quantify the relationship between the realism of the terrain presentation and resulting enhancements of pilot SA and performance. Composed of complementary simulation and flight test efforts, Terrain Portrayal for Head-Down Displays (TP-HDD) experiments will help researchers evaluate critical terrain portrayal concepts. The experimental effort is to provide data to enable design trades that optimize SVS applications, as well as develop requirements and recommendations to facilitate the certification process. In this part of the experiment a fixed-base flight simulator was equipped with various types of Head Down flight displays, ranging from conventional round dials (typical of most GA aircraft) to glass cockpit style PFDs. The variations of the PFD included an assortment of texturing and Digital Elevation Model (DEM) resolution combinations. A test matrix of 10 terrain display configurations (in addition to the baseline displays) was evaluated by 27 pilots of various backgrounds and experience levels. Qualitative (questionnaires) and quantitative (pilot performance and physiological) data were collected during the experimental runs. This paper focuses on the experimental set-up and final physiological results of the TP-HDD simulation experiment. The physiological measures of skin temperature, heart rate, and muscle response show a decreased engagement (while using the synthetic vision displays as compared to the baseline conventional display) of the sympathetic and somatic nervous system responses, which, in turn, indicates a reduced level of mental workload. This decreased level of workload is expected to enable improvement in the pilot's situation and terrain awareness.

  18. Baseline process description for simulating plutonium oxide production for precalc project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pike, J. A.

    Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective of studying the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process since the facility is operational and significant model validation data can be obtained. The process boundary as well as process and facility design details necessary for multi-scale, multi-physics models are provided.

  19. Pyranometer offsets triggered by ambient meteorology: insights from laboratory and field experiments

    NASA Astrophysics Data System (ADS)

    Oswald, Sandro M.; Pietsch, Helga; Baumgartner, Dietmar J.; Weihs, Philipp; Rieder, Harald E.

    2017-03-01

    This study investigates the effects of ambient meteorology on the accuracy of radiation (R) measurements performed with pyranometers contained in various heating and ventilation systems (HV-systems). It focuses particularly on instrument offsets observed following precipitation events. To quantify pyranometer responses to precipitation, a series of controlled laboratory experiments as well as two targeted field campaigns were performed in 2016. The results indicate that precipitation (as simulated by spray tests or observed under ambient conditions) significantly affects the thermal environment of the instruments and thus their stability. Statistical analyses of laboratory experiments showed that precipitation triggers zero offsets of -4 W m-2 or more, independent of the HV-system. Similar offsets were observed in field experiments under ambient environmental conditions, indicating a clear exceedance of BSRN (Baseline Surface Radiation Network) targets following precipitation events. All pyranometers required substantial time to return to their initial signal states after the simulated precipitation events. Therefore, for BSRN-class measurements, the recommendation would be to flag the radiation measurements during a natural precipitation event and 90 min after it in nighttime conditions. Further daytime experiments show pyranometer offsets of 50 W m-2 or more in comparison to the reference system. As they show a substantially faster recovery, the recommendation would be to flag the radiation measurements within a natural precipitation event and 10 min after it in daytime conditions.
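
    A sketch of how the flagging recommendation above could be applied to a time-indexed record with pandas; the column names ('precip', 'is_day') and the trailing-window logic are assumptions of this illustration rather than part of the published procedure.

```python
import pandas as pd

def flag_after_precipitation(df, night_window="90min", day_window="10min"):
    """Flag radiation samples taken during precipitation and within a recovery
    window afterwards (90 min at night, 10 min during the day).

    df: DataFrame with a DatetimeIndex and boolean columns 'precip' and 'is_day'.
    Returns a boolean Series: True where the measurement should be flagged.
    """
    flag = df["precip"].astype(bool).copy()
    for window, period in [(night_window, ~df["is_day"]), (day_window, df["is_day"])]:
        # True if any precipitation occurred within the trailing time window.
        recent = df["precip"].astype(float).rolling(window).max().fillna(0) > 0
        flag |= recent & period
    return flag
```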

  20. A study of cloud microphysics and precipitation over the Tibetan Plateau by radar observations and cloud-resolving model simulations: Cloud Microphysics over Tibetan Plateau

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Wenhua; Sui, Chung-Hsiung; Fan, Jiwen

    Cloud microphysical properties and precipitation over the Tibetan Plateau (TP) are unique because of the high terrains, clean atmosphere, and sufficient water vapor. With dual-polarization precipitation radar and cloud radar measurements during the Third Tibetan Plateau Atmospheric Scientific Experiment (TIPEX-III), the simulated microphysics and precipitation by the Weather Research and Forecasting model (WRF) with the Chinese Academy of Meteorological Sciences (CAMS) microphysics and other microphysical schemes are investigated through a typical plateau rainfall event on 22 July 2014. Results show that the WRF-CAMS simulation reasonably reproduces the spatial distribution of 24-h accumulated precipitation, but has limitations in simulating time evolutionmore » of precipitation rates. The model-calculated polarimetric radar variables have biases as well, suggesting bias in modeled hydrometeor types. The raindrop sizes in convective region are larger than those in stratiform region indicated by the small intercept of raindrop size distribution in the former. The sensitivity experiments show that precipitation processes are sensitive to the changes of warm rain processes in condensation and nucleated droplet size (but less sensitive to evaporation process). Increasing droplet condensation produces the best area-averaged rain rate during weak convection period compared with the observation, suggesting a considerable bias in thermodynamics in the baseline simulation. Increasing the initial cloud droplet size causes the rain rate reduced by half, an opposite effect to that of increasing droplet condensation.« less

  1. Baseline-dependent sampling and windowing for radio interferometry: data compression, field-of-interest shaping, and outer field suppression

    NASA Astrophysics Data System (ADS)

    Atemkeng, M.; Smirnov, O.; Tasse, C.; Foster, G.; Keimpema, A.; Paragi, Z.; Jonas, J.

    2018-07-01

    Traditional radio interferometric correlators produce regular-gridded samples of the true uv-distribution by averaging the signal over constant, discrete time-frequency intervals. This regular sampling and averaging then translate into irregular-gridded samples in the uv-space and result in a baseline-length-dependent loss of amplitude and phase coherence, which is dependent on the distance from the image phase centre. The effect is often referred to as `decorrelation' in the uv-space, which is equivalent in the source domain to `smearing'. This work discusses and implements a regular-gridded sampling scheme in the uv-space (baseline-dependent sampling) and windowing that allow for data compression, field-of-interest shaping, and source suppression. The baseline-dependent sampling requires irregular-gridded sampling in the time-frequency space, i.e. the time-frequency interval becomes baseline dependent. Analytic models and simulations are used to show that decorrelation remains constant across all the baselines when applying baseline-dependent sampling and windowing. Simulations using the MeerKAT telescope and the European Very Long Baseline Interferometry Network show that data compression, field-of-interest shaping, and outer field-of-interest suppression are all achieved.
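
    A minimal sketch of the baseline-dependent part of the idea: shorter baselines decorrelate more slowly, so they can be averaged over proportionally longer time intervals while the longest baseline keeps the shortest (reference) interval, holding smearing roughly constant across the array. The linear scaling and names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def baseline_dependent_intervals(baselines_m, ref_interval_s=1.0):
    """Return a per-baseline averaging interval, in seconds.

    baselines_m: array of baseline lengths |b| in metres.
    The longest baseline keeps the reference (shortest) interval; shorter
    baselines are averaged over proportionally longer intervals so the
    time-smearing loss is approximately the same on every baseline.
    """
    baselines_m = np.asarray(baselines_m, dtype=float)
    return ref_interval_s * baselines_m.max() / baselines_m

# Example: a 29 m baseline can be averaged ~276x longer than an 8 km one.
print(baseline_dependent_intervals([29.0, 1000.0, 8000.0], ref_interval_s=1.0))
```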

  2. First GPS baseline results from the North Andes

    NASA Technical Reports Server (NTRS)

    Kellogg, James N.; Freymueller, Jeffrey T.; Dixon, Timothy H.; Neilan, Ruth E.; Ropain, Clemente

    1990-01-01

    The CASA Uno GPS experiment (January-February 1988) has provided the first epoch baseline measurements for the study of plate motions and crustal deformation in and around the North Andes. Two-dimensional horizontal baseline repeatabilities are as good as 5 parts in 10 to the 8th for short baselines (100-1000 km), and better than 3 parts in 10 to the 8th for long baselines (greater than 1000 km). Vertical repeatabilities are typically 4-6 cm, with a weak dependence on baseline length. The expected rate of plate convergence across the Colombia Trench is 6-8 cm/yr, which should be detectable by the repeat experiment planned for 1991. Expected deformation rates within the North Andes are of the order of 1 cm/yr, which may be detectable with the 1991 experiment.
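
    A small back-of-the-envelope check of why the 1991 re-occupation should resolve plate convergence but only marginally resolve intra-plate deformation, using the repeatabilities quoted above (illustrative arithmetic only):

```python
# Horizontal repeatability of ~5 parts in 10^8 applies to 100-1000 km baselines.
horizontal_precision = 5e-8          # fraction of baseline length
baseline_km = 1000
precision_cm = horizontal_precision * baseline_km * 1e5   # km -> cm
years = 1991 - 1988

print(f"repeatability  ~{precision_cm:.0f} cm")                       # ~5 cm
print(f"convergence    {6 * years}-{8 * years} cm over {years} yr")   # 18-24 cm, well resolved
print(f"intra-Andes    ~{1 * years} cm over {years} yr")              # ~3 cm, marginal
```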

  3. Extending the Operational Envelope of a Turbofan Engine Simulation into the Sub-Idle Region

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Hamley, Andrew J.; Guo, Ten-Huei; Litt, Jonathan S.

    2016-01-01

    In many non-linear gas turbine simulations, operation in the sub-idle region can lead to model instability. This paper lays out a method for extending the operational envelope of a map-based gas turbine simulation to include the sub-idle region. This method develops a multi-simulation solution where the baseline component maps are extrapolated below the idle level and an alternate model is developed to serve as a safety net when the baseline model becomes unstable or unreliable. Sub-idle model development takes place in two distinct operational areas, windmilling/shutdown and purge/cranking/startup. These models are based on derived steady state operating points with transient values extrapolated between initial (known) and final (assumed) states. Model transitioning logic is developed to predict baseline model sub-idle instability, and transition smoothly and stably to the backup sub-idle model. Results from the simulation show a realistic approximation of sub-idle behavior as compared to generic sub-idle engine performance that allows the engine to operate continuously and stably from shutdown to full power.
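
    A sketch of the model-transitioning idea: trust the baseline map-based model at normal speeds, hand over to the sub-idle backup model at very low speeds, and blend between them in a narrow band so the handover is smooth. The corrected-speed test, band edges, and function names are assumptions for illustration, not the logic developed in the paper.

```python
def blended_output(baseline_model, subidle_model, state, blend_band=(0.05, 0.10)):
    """Select or blend between the baseline and sub-idle engine models.

    state: dict with at least 'corrected_speed' (fraction of design speed).
    Below the lower edge of blend_band only the sub-idle model is used; above
    the upper edge only the baseline (map-based) model is used; in between the
    two are blended linearly so the transition is smooth and stable.
    """
    lo, hi = blend_band
    n = state["corrected_speed"]
    if n <= lo:
        return subidle_model(state)
    if n >= hi:
        return baseline_model(state)
    w = (n - lo) / (hi - lo)   # 0 at lo, 1 at hi
    return w * baseline_model(state) + (1 - w) * subidle_model(state)
```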

  4. Extending the Operational Envelope of a Turbofan Engine Simulation into the Sub-Idle Region

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes Walter; Hamley, Andrew J.; Guo, Ten-Huei; Litt, Jonathan S.

    2016-01-01

    In many non-linear gas turbine simulations, operation in the sub-idle region can lead to model instability. This paper lays out a method for extending the operational envelope of a map based gas turbine simulation to include the sub-idle region. This method develops a multi-simulation solution where the baseline component maps are extrapolated below the idle level and an alternate model is developed to serve as a safety net when the baseline model becomes unstable or unreliable. Sub-idle model development takes place in two distinct operational areas, windmilling/shutdown and purge/cranking/startup. These models are based on derived steady state operating points with transient values extrapolated between initial (known) and final (assumed) states. Model transitioning logic is developed to predict baseline model sub-idle instability, and transition smoothly and stably to the backup sub-idle model. Results from the simulation show a realistic approximation of sub-idle behavior as compared to generic sub-idle engine performance that allows the engine to operate continuously and stably from shutdown to full power.

  5. THE EFFECT OF GAMBLING ACTIVITIES ON HAPPINESS LEVELS OF NURSING HOME RESIDENTS

    PubMed Central

    Dixon, Mark R; Nastally, Becky L; Waterman, Amber

    2010-01-01

    The current study evaluated the effect of participating in simulated gambling activities on happiness levels of 3 nursing home residents. A 4-component analysis was used to measure objective responses associated with happiness during baseline, varying durations of engagement in simulated gambling activities, and 2 follow-up periods. Results indicated that all residents exhibited a higher percentage of happiness levels while engaged in simulated gambling activities compared with baseline. Follow-up assessment took place 10 min and 30 min following the intervention; no lasting effects were observed. PMID:21358915

  6. An MDOE Investigation of Chevrons for Supersonic Jet Noise Reduction

    NASA Technical Reports Server (NTRS)

    Henderson, Brenda; Bridges, James

    2010-01-01

    The impact of chevron design on the noise radiated from heated, overexpanded, supersonic jets is presented. The experiments used faceted bi-conic convergent-divergent nozzles with design Mach numbers equal to 1.51 and 1.65. The purpose of the facets was to simulate divergent seals on a military-style nozzle. The nozzle throat diameter was equal to 4.5 inches. Modern Design of Experiments (MDOE) techniques were used to investigate the impact of chevron penetration, length, and width on the resulting acoustic radiation. All chevron configurations used 12 chevrons to match the number of facets in the nozzle. Most chevron designs resulted in increased broadband shock noise relative to the baseline nozzle. In the peak jet noise direction, the optimum chevron design reduced peak sound pressure levels by 4 dB relative to the baseline nozzle. The penetration was the parameter having the greatest impact on radiated noise at all observation angles. While increasing chevron penetration decreased acoustic radiation in the peak jet noise direction, broadband shock noise was adversely impacted. Decreasing chevron length increased noise at most observation angles. The impact of chevron width on radiated noise depended on frequency and observation angle.

  7. ITER Baseline Scenario with ECCD Applied to Neoclassical Tearing Modes in DIII-D

    NASA Astrophysics Data System (ADS)

    Welander, A. G.; La Haye, R. J.; Lohr, J. M.; Humphreys, D. A.; Prater, R.; Paz-Soldan, C.; Kolemen, E.; Turco, F.; Olofsson, E.

    2015-11-01

    The neoclassical tearing mode (NTM) is a magnetic island that can occur on flux surfaces where the safety factor q is a rational number. Both m/n=3/2 and 2/1 NTMs degrade confinement, and the 2/1 mode often locks to the wall and disrupts the plasma. An NTM can be suppressed by depositing electron cyclotron current drive (ECCD) on the q-surface by injecting microwave beams into the plasma from gyrotrons. Recent DIII-D experiments have studied the application of ECCD/ECRH in the ITER Baseline Scenario. The power required from the gyrotrons can be significant enough to impact the fusion gain, Q, in ITER. However, if gyrotron power could be minimized or turned off in ITER when not needed, this impact would be small. In fact, tearing-stable operation at low torque has been achieved previously in DIII-D without EC power. A vision for NTM control in ITER will be described together with results obtained from simulations and experiments in DIII-D under ITER-like conditions. Work supported by the US DOE under DE-FC02-04ER54698, DE-AC02-09CH11466, DE-FG02-04ER54761.

  8. Retention of colonoscopy skills after virtual reality simulator training by independent and proctored methods.

    PubMed

    Snyder, Christopher W; Vandromme, Marianne J; Tyra, Sharon L; Hawn, Mary T

    2010-07-01

    Virtual reality (VR) simulators may enhance surgical resident colonoscopy skills, but the duration of skill retention and the effects of different simulator training methods are unknown. Medical students participating in a randomized trial of independent (automated simulator feedback only) versus proctored (human expert feedback plus simulator feedback) simulator training performed a standardized VR colonoscopy scenario at baseline, at the end of training (posttraining), and after a median 4.5 months without practice (retention). Performances were scored on a 10-point scale based on expert proficiency criteria and compared for the independent and proctored groups. Thirteen trainees (8 proctored, 5 independent) were included. Performance at retention testing was significantly better than baseline (median score 10 vs. 5, P < 0.0001), and no different from posttraining (median score 10 vs. 10, P = 0.19). Score changes from baseline to retention and from posttraining to retention were no different for the proctored and independent groups. Overinsufflation and excessive force were the most common reasons for nonproficiency at retention. After proficiency-based VR simulator training, colonoscopy skills are retained for several months, regardless of whether an independent or proctored approach is used. Error avoidance skills may not be retained as well as speed and efficiency skills.

  9. Swept-Wing Ice Accretion Characterization and Aerodynamics

    NASA Technical Reports Server (NTRS)

    Broeren, Andy P.; Potapczuk, Mark G.; Riley, James T.; Villedieu, Philippe; Moens, Frederic; Bragg, Michael B.

    2013-01-01

    NASA, FAA, ONERA, the University of Illinois and Boeing have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large-scale, three-dimensional swept wings. The overall goal is to improve the fidelity of experimental and computational simulation methods for swept-wing ice accretion formation and resulting aerodynamic effect. A seven-phase research effort has been designed that incorporates ice-accretion and aerodynamic experiments and computational simulations. As the baseline, full-scale, swept-wing-reference geometry, this research will utilize the 65% scale Common Research Model configuration. Ice-accretion testing will be conducted in the NASA Icing Research Tunnel for three hybrid swept-wing models representing the 20%, 64% and 83% semispan stations of the baseline-reference wing. Three-dimensional measurement techniques are being developed and validated to document the experimental ice-accretion geometries. Artificial ice shapes of varying geometric fidelity will be developed for aerodynamic testing over a large Reynolds number range in the ONERA F1 pressurized wind tunnel and in a smaller-scale atmospheric wind tunnel. Concurrent research will be conducted to explore and further develop the use of computational simulation tools for ice accretion and aerodynamics on swept wings. The combined results of this research effort will result in an improved understanding of the ice formation and aerodynamic effects on swept wings. The purpose of this paper is to describe this research effort in more detail and report on the current results and status to date.

  10. Swept-Wing Ice Accretion Characterization and Aerodynamics

    NASA Technical Reports Server (NTRS)

    Broeren, Andy P.; Potapczuk, Mark G.; Riley, James T.; Villedieu, Philippe; Moens, Frederic; Bragg, Michael B.

    2013-01-01

    NASA, FAA, ONERA, the University of Illinois and Boeing have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large-scale, three-dimensional swept wings. The overall goal is to improve the fidelity of experimental and computational simulation methods for swept-wing ice accretion formation and resulting aerodynamic effect. A seven-phase research effort has been designed that incorporates ice-accretion and aerodynamic experiments and computational simulations. As the baseline, full-scale, swept-wing-reference geometry, this research will utilize the 65 percent scale Common Research Model configuration. Ice-accretion testing will be conducted in the NASA Icing Research Tunnel for three hybrid swept-wing models representing the 20, 64 and 83 percent semispan stations of the baseline-reference wing. Three-dimensional measurement techniques are being developed and validated to document the experimental ice-accretion geometries. Artificial ice shapes of varying geometric fidelity will be developed for aerodynamic testing over a large Reynolds number range in the ONERA F1 pressurized wind tunnel and in a smaller-scale atmospheric wind tunnel. Concurrent research will be conducted to explore and further develop the use of computational simulation tools for ice accretion and aerodynamics on swept wings. The combined results of this research effort will result in an improved understanding of the ice formation and aerodynamic effects on swept wings. The purpose of this paper is to describe this research effort in more detail and report on the current results and status to date.

  11. Baseline metal enrichment from Population III star formation in cosmological volume simulations

    NASA Astrophysics Data System (ADS)

    Jaacks, Jason; Thompson, Robert; Finkelstein, Steven L.; Bromm, Volker

    2018-04-01

    We utilize the hydrodynamic and N-body code GIZMO coupled with our newly developed sub-grid Population III (Pop III) Legacy model, designed specifically for cosmological volume simulations, to study the baseline metal enrichment from Pop III star formation at z > 7. In this idealized numerical experiment, we only consider Pop III star formation. We find that our model Pop III star formation rate density (SFRD), which peaks at ~10^-3 M⊙ yr^-1 Mpc^-3 near z ~ 10, agrees well with previous numerical studies and is consistent with the observed estimates for Pop II SFRDs. The mean Pop III metallicity rises smoothly from z = 25 to 7, but does not reach the critical metallicity value, Z_crit = 10^-4 Z⊙, required for the Pop III to Pop II transition in star formation mode until z ≃ 7. This suggests that, while individual haloes can suppress in situ Pop III star formation, the external enrichment is insufficient to globally terminate Pop III star formation. The maximum enrichment from Pop III star formation in star-forming dark matter haloes is Z ~ 10^-2 Z⊙, whereas the minimum found in externally enriched haloes is Z ≳ 10^-7 Z⊙. Finally, mock observations of our simulated IGM enriched with Pop III metals produce equivalent widths similar to observations of an extremely metal-poor damped Lyman alpha system at z = 7.04, which is thought to be enriched by Pop III star formation only.

  12. Uncertainty of future projections of species distributions in mountainous regions.

    PubMed

    Tang, Ying; Winkler, Julie A; Viña, Andrés; Liu, Jianguo; Zhang, Yuanbin; Zhang, Xiaofeng; Li, Xiaohong; Wang, Fang; Zhang, Jindong; Zhao, Zhiqiang

    2018-01-01

    Multiple factors introduce uncertainty into projections of species distributions under climate change. The uncertainty introduced by the choice of baseline climate information used to calibrate a species distribution model and to downscale global climate model (GCM) simulations to a finer spatial resolution is a particular concern for mountainous regions, as the spatial resolution of climate observing networks is often insufficient to detect the steep climatic gradients in these areas. Using the maximum entropy (MaxEnt) modeling framework together with occurrence data on 21 understory bamboo species distributed across the mountainous geographic range of the Giant Panda, we examined the differences in projected species distributions obtained from two contrasting sources of baseline climate information, one derived from spatial interpolation of coarse-scale station observations and the other derived from fine-spatial resolution satellite measurements. For each bamboo species, the MaxEnt model was calibrated separately for the two datasets and applied to 17 GCM simulations downscaled using the delta method. Greater differences in the projected spatial distributions of the bamboo species were observed for the models calibrated using the different baseline datasets than between the different downscaled GCM simulations for the same calibration. In terms of the projected future climatically-suitable area by species, quantification using a multi-factor analysis of variance suggested that the sum of the variance explained by the baseline climate dataset used for model calibration and the interaction between the baseline climate data and the GCM simulation via downscaling accounted for, on average, 40% of the total variation among the future projections. Our analyses illustrate that the combined use of gridded datasets developed from station observations and satellite measurements can help estimate the uncertainty introduced by the choice of baseline climate information to the projected changes in species distribution.
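
    A sketch of the variance-partitioning step: a two-factor ANOVA (baseline climate dataset × GCM) on the projected climatically suitable area of one species, with the interaction term capturing how strongly the downscaled GCM projections depend on the baseline dataset. The column names and long-format layout are assumptions of this illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def variance_shares(df):
    """Partition variance in projected suitable area among the baseline climate
    dataset, the GCM simulation, and their interaction.

    df: long-format DataFrame for one species with columns 'area' (projected
    suitable area), 'baseline' (e.g. 'station' vs 'satellite'), and 'gcm'
    (one of the 17 downscaled GCM simulations).
    Returns each term's share of the total sum of squares.
    """
    fit = smf.ols("area ~ C(baseline) * C(gcm)", data=df).fit()
    table = anova_lm(fit, typ=2)
    return table["sum_sq"] / table["sum_sq"].sum()
```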

  13. Uncertainty of future projections of species distributions in mountainous regions

    PubMed Central

    Tang, Ying; Viña, Andrés; Liu, Jianguo; Zhang, Yuanbin; Zhang, Xiaofeng; Li, Xiaohong; Wang, Fang; Zhang, Jindong; Zhao, Zhiqiang

    2018-01-01

    Multiple factors introduce uncertainty into projections of species distributions under climate change. The uncertainty introduced by the choice of baseline climate information used to calibrate a species distribution model and to downscale global climate model (GCM) simulations to a finer spatial resolution is a particular concern for mountainous regions, as the spatial resolution of climate observing networks is often insufficient to detect the steep climatic gradients in these areas. Using the maximum entropy (MaxEnt) modeling framework together with occurrence data on 21 understory bamboo species distributed across the mountainous geographic range of the Giant Panda, we examined the differences in projected species distributions obtained from two contrasting sources of baseline climate information, one derived from spatial interpolation of coarse-scale station observations and the other derived from fine-spatial resolution satellite measurements. For each bamboo species, the MaxEnt model was calibrated separately for the two datasets and applied to 17 GCM simulations downscaled using the delta method. Greater differences in the projected spatial distributions of the bamboo species were observed for the models calibrated using the different baseline datasets than between the different downscaled GCM simulations for the same calibration. In terms of the projected future climatically-suitable area by species, quantification using a multi-factor analysis of variance suggested that the sum of the variance explained by the baseline climate dataset used for model calibration and the interaction between the baseline climate data and the GCM simulation via downscaling accounted for, on average, 40% of the total variation among the future projections. Our analyses illustrate that the combined use of gridded datasets developed from station observations and satellite measurements can help estimate the uncertainty introduced by the choice of baseline climate information to the projected changes in species distribution. PMID:29320501

  14. Improved Lunar Lander Handling Qualities Through Control Response Type and Display Enhancements

    NASA Technical Reports Server (NTRS)

    Mueller, Eric Richard; Bilimoria, Karl D.; Frost, Chad Ritchie

    2010-01-01

    A piloted simulation that studied the handling qualities for a precision lunar landing task from final approach to touchdown is presented. A vehicle model based on NASA's Altair Lunar Lander was used to explore the design space around the nominal vehicle configuration to determine which combination of factors provides satisfactory pilot-vehicle performance and workload; details of the control and propulsion systems not available for that vehicle were derived from Apollo Lunar Module data. The experiment was conducted on a large motion base simulator. Eight Space Shuttle and Apollo pilot astronauts and three NASA test pilots served as evaluation pilots, providing Cooper-Harper ratings, Task Load Index ratings and qualitative comments. Each pilot flew seven combinations of control response types and three sets of displays, including two varieties of guidance and a nonguided approach. The response types included Rate Command with Attitude Hold, which was used in the original Apollo Moon landings, a Velocity Increment Command response type designed for up-and-away flight, three response types designed specifically for the vertical descent portion of the trajectory, and combinations of these. It was found that Velocity Increment Command significantly improved handling qualities when compared with the baseline Apollo design, receiving predominantly Level 1 ratings. This response type could be flown with or without explicit guidance cues, something that was very difficult with the baseline design, and resulted in touchdown accuracies and propellant usage approximately equivalent to those of the baseline response type. The response types designed to be used exclusively in the vertical descent portion of the trajectory did not improve handling qualities.

  15. Display system replacement baseline research report.

    DOT National Transportation Integrated Search

    2000-12-01

    This report provides baseline measurements on the Display System Replacement (DSR). These measurements followed six constructs: : safety, capacity, performance, workload, usability, and simulation fidelity. To collect these measurements, human factor...

  16. Capabilities of long-baseline experiments in the presence of a sterile neutrino

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, Debajyoti; Gandhi, Raj; Kayser, Boris

    Assuming that there is a sterile neutrino, we ask what then is the ability of long-baseline experiments to i) establish that neutrino oscillation violates CP, ii) determine the three-neutrino mass ordering, and iii) determine which CP-violating phase or phases are the cause of any CP violation that may be observed. We find that the ability to establish CP violation and to determine the mass ordering could be very substantial. However, the effects of the sterile neutrino could be quite large, and it might prove very difficult to determine which phase is responsible for an observed CP violation. We explain why a sterile neutrino changes the long-baseline sensitivities to CP violation and to the mass ordering in the ways that it does. We note that long-baseline experiments can probe the presence of sterile neutrinos in a way that is different from, and complementary to, the probes of short-baseline experiments. As a result, we explore the question of how large sterile-active mixing angles need to be before long-baseline experiments can detect their effects, or how small they need to be before the interpretation of these experiments can safely disregard the possible existence of sterile neutrinos.

  17. Capabilities of long-baseline experiments in the presence of a sterile neutrino

    DOE PAGES

    Dutta, Debajyoti; Gandhi, Raj; Kayser, Boris; ...

    2016-11-21

    Assuming that there is a sterile neutrino, we ask what then is the ability of long-baseline experiments to i) establish that neutrino oscillation violates CP, ii) determine the three-neutrino mass ordering, and iii) determine which CP-violating phase or phases are the cause of any CP violation that may be observed. We find that the ability to establish CP violation and to determine the mass ordering could be very substantial. However, the effects of the sterile neutrino could be quite large, and it might prove very difficult to determine which phase is responsible for an observed CP violation. We explain why a sterile neutrino changes the long-baseline sensitivities to CP violation and to the mass ordering in the ways that it does. We note that long-baseline experiments can probe the presence of sterile neutrinos in a way that is different from, and complementary to, the probes of short-baseline experiments. As a result, we explore the question of how large sterile-active mixing angles need to be before long-baseline experiments can detect their effects, or how small they need to be before the interpretation of these experiments can safely disregard the possible existence of sterile neutrinos.

  18. Adaptive Augmenting Control Flight Characterization Experiment on an F/A-18

    NASA Technical Reports Server (NTRS)

    VanZwieten, Tannen S.; Orr, Jeb S.; Wall, John H.; Gilligan, Eric T.

    2014-01-01

    This paper summarizes the Adaptive Augmenting Control (AAC) flight characterization experiments performed using an F/A-18 (TN 853). AAC was designed and developed specifically for launch vehicles, and is currently part of the baseline autopilot design for NASA's Space Launch System (SLS). The scope covered here includes a brief overview of the algorithm (covered in more detail elsewhere), motivation and benefits of flight testing, top-level SLS flight test objectives, applicability of the F/A-18 as a platform for testing a launch vehicle control design, test cases designed to fully vet the AAC algorithm, flight test results, and conclusions regarding the functionality of AAC. The AAC algorithm developed at Marshall Space Flight Center is a forward loop gain multiplicative adaptive algorithm that modifies the total attitude control system gain in response to sensed model errors or undesirable parasitic mode resonances. The AAC algorithm provides the capability to improve or decrease performance by balancing attitude tracking with the mitigation of parasitic dynamics, such as control-structure interaction or servo-actuator limit cycles. In the case of the latter, if unmodeled or mismodeled parasitic dynamics are present that would otherwise result in a closed-loop instability or near instability, the adaptive controller decreases the total loop gain to reduce the interaction between these dynamics and the controller. This is in contrast to traditional adaptive control logic, which focuses on improving performance by increasing gain. The computationally simple AAC attitude control algorithm has stability properties that are reconcilable in the context of classical frequency-domain criteria (i.e., gain and phase margin). The algorithm assumes that the baseline attitude control design is well-tuned for a nominal trajectory and is designed to adapt only when necessary. Furthermore, the adaptation is attracted to the nominal design and adapts only on an as-needed basis (see Figure 1). The MSFC algorithm design was formulated during the Constellation Program and reached a high maturity level during SLS through simulation-based development and internal and external analytical review. The AAC algorithm design has three summary-level objectives: (1) "Do no harm;" return to baseline control design when not needed, (2) Increase performance; respond to error in ability of vehicle to track command, and (3) Regain stability; respond to undesirable control-structure interaction or other parasitic dynamics. AAC has been successfully implemented as part of the Space Launch System baseline design, including extensive testing in high-fidelity 6-DOF simulations, the details of which are described in [1]. The Dryden Flight Research Center's F/A-18 Full-Scale Advanced Systems Testbed (FAST) platform is used to conduct an algorithm flight characterization experiment intended to fully vet the aforementioned design objectives. FAST was specifically designed with this type of test program in mind. The onboard flight control system has full-authority experiment control of ten aerodynamic effectors and two throttles. It has production and research sensor inputs, pilot engage/disengage, and real-time configuration of up to eight different experiments on a single flight. It has failure detection and automatic reversion to fail-safe mode.
The F/A-18 aircraft has an experiment envelope cleared for full-authority control and maneuvering and exhibits characteristics for robust recovery from unusual attitudes and configurations aided by the presence of a qualified test pilot. The F/A-18 aircraft has relatively high mass and inertia with exceptional performance; the F/A-18 also has a large thrust-to-weight ratio, owing to its military heritage. This enables the simulation of a portion of the ascent trajectory with a high degree of dynamic similarity to a launch vehicle, and the research flight control system can simulate unstable longitudinal dynamics. Parasitic dynamics such as slosh and bending modes, as well as atmospheric disturbances, are emulated on the airframe via modification of bending filters and the use of secondary control surfaces, including leading and trailing edge flaps, symmetric ailerons, and symmetric rudders. The platform also has the ability to inject signals in flight to simulate structural mode resonances or other challenging dynamics. This platform also offers more test maneuvers and longer maneuver times than a single rocket or missile test, which provides ample opportunity to fully and repeatedly exercise all aspects of the algorithm. Prior to testing on an F/A-18, AAC was the only component of the SLS autopilot design that had not been flight tested. The testing described in this paper raises the Technology Readiness Level (TRL) of AAC early in the SLS Program and demonstrates its capabilities and robustness in a flight environment.
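
    As a conceptual illustration only (not the MSFC flight code), the multiplicative gain adaptation described above can be sketched as a scalar update that raises the total loop gain when tracking error grows, lowers it when parasitic energy is sensed, and leaks back toward the nominal design; all gains and signal values below are assumptions.

      # Conceptual sketch: total-loop gain k_T that adapts only as needed and
      # returns to the nominal design value (k_T = 1) when adaptation is not required.
      def update_gain(k_t, tracking_err_power, parasitic_power, dt,
                      k_err=2.0, k_par=5.0, leak=0.5, k_min=0.5, k_max=2.0):
          dk = (k_err * tracking_err_power   # raise gain to restore command tracking
                - k_par * parasitic_power    # cut gain to damp control-structure interaction
                - leak * (k_t - 1.0))        # pull back toward the baseline design
          return min(max(k_t + dk * dt, k_min), k_max)

      k_t = 1.0
      for step in range(5):
          k_t = update_gain(k_t, tracking_err_power=0.02, parasitic_power=0.10, dt=0.02)
      print(k_t)  # gain backs off below nominal while parasitic content persists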

  19. Bayesian Techniques for Comparing Time-dependent GRMHD Simulations to Variable Event Horizon Telescope Observations

    NASA Astrophysics Data System (ADS)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan; Medeiros, Lia; Özel, Feryal; Psaltis, Dimitrios

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.

  20. BAYESIAN TECHNIQUES FOR COMPARING TIME-DEPENDENT GRMHD SIMULATIONS TO VARIABLE EVENT HORIZON TELESCOPE OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.

  1. Baccalaureate nursing students' perspectives of peer tutoring in simulation laboratory, a Q methodology study.

    PubMed

    Li, Ting; Petrini, Marcia A; Stone, Teresa E

    2018-02-01

    The study aim was to identify the perceived perspectives of baccalaureate nursing students toward peer tutoring in the simulation laboratory. Insight into the nursing students' experiences and baseline data related to their perception of peer tutoring will help improve nursing education. Q methodology was applied to explore the students' perspectives of peer tutoring in the simulation laboratory. A convenience P-sample of 40 baccalaureate nursing students was used. Each participant sorted 58 selected Q statements into the shape of a normal distribution using an 11-point bipolar scale ranging from -5 to +5. PQ Method software analyzed the collected data. Three discrete factors emerged: Factor I ("Facilitate or empower" knowledge acquisition), Factor II ("Safety Net" Support environment), and Factor III ("Mentoring" learn how to learn). The findings indicate that peer tutoring is an effective supplementary strategy for promoting baccalaureate students' knowledge acquisition, establishing a supportive safety net, and facilitating their ability to learn in the simulation laboratory. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Evaluation of a 6% hydrogen peroxide tooth-whitening gel on enamel microhardness after extended use.

    PubMed

    Toteda, Mariarosaria; Philpotts, Carole J; Cox, Trevor F; Joiner, Andrew

    2008-11-01

    To evaluate the effects of a 6% hydrogen peroxide tooth whitener, Xtra White, on sound human enamel microhardness in vitro after an extended and exaggerated simulated 8 weeks of product use. Polished human enamel specimens were prepared and baseline microhardness and color measurements determined. The enamel specimens were exposed to a fluoride-containing toothpaste for 30 seconds and then exposed to water, Xtra White, a control carbopol gel containing no hydrogen peroxide, or a carbonated beverage (each group, n = 8) for 20 minutes. Specimens were exposed to whole saliva at all other times. To simulate 8 weeks of extended product use (quadruple the duration specified in the manufacturer's instructions), 112 treatments were conducted. Microhardness measurements were taken after 2, 4, 6, and 8 weeks of simulated treatments, and color was measured after 2 and 8 weeks. The Xtra White-treated specimens showed a statistically significant (P < .0001) increase in L* and decrease in b* compared to the water-treated specimens after 2 weeks simulated use, indicating bleaching had occurred. The carbonated beverage-treated specimens were significantly softened (P = .0009) compared to baseline after only 1 treatment. The carbopol gel-treated specimens were significantly softened (P = .0028) after 2 weeks of simulated treatments compared to baseline. There were no statistically significant differences in enamel microhardness between baseline and all treatment times for the Xtra White and water groups. Xtra White does not have any deleterious effects on sound human enamel microhardness after an extended and exaggerated simulated 8 weeks of product use.

  3. War-gaming application for future space systems acquisition part 1: program and technical baseline war-gaming modeling and simulation approaches

    NASA Astrophysics Data System (ADS)

    Nguyen, Tien M.; Guillen, Andy T.

    2017-05-01

    This paper describes static Bayesian game models with "Pure" and "Mixed" games for the development of an optimum Program and Technical Baseline (PTB) solution for affordable acquisition of future space systems. The paper discusses System Engineering (SE) frameworks and analytical and simulation modeling approaches for developing the optimum PTB solutions from both the government and contractor perspectives.

  4. High-Fidelity Simulation for Advanced Cardiac Life Support Training

    PubMed Central

    Davis, Lindsay E.; Storjohann, Tara D.; Spiegel, Jacqueline J.; Beiber, Kellie M.

    2013-01-01

    Objective. To determine whether a high-fidelity simulation technique compared with lecture would produce greater improvement in advanced cardiac life support (ACLS) knowledge, confidence, and overall satisfaction with the training method. Design. This sequential, parallel-group, crossover trial randomized students into 2 groups distinguished by the sequence of teaching technique delivered for ACLS instruction (ie, classroom lecture vs high-fidelity simulation exercise). Assessment. Test scores on a written examination administered at baseline and after each teaching technique improved significantly from baseline in all groups but were highest when lecture was followed by simulation. Simulation was associated with a greater degree of overall student satisfaction compared with lecture. Participation in a simulation exercise did not improve pharmacy students’ knowledge of ACLS more than attending a lecture, but it was associated with improved student confidence in skills and satisfaction with learning and application. Conclusions. College curricula should incorporate simulation to complement but not replace lecture for ACLS education. PMID:23610477

  5. High-fidelity simulation for advanced cardiac life support training.

    PubMed

    Davis, Lindsay E; Storjohann, Tara D; Spiegel, Jacqueline J; Beiber, Kellie M; Barletta, Jeffrey F

    2013-04-12

    OBJECTIVE. To determine whether a high-fidelity simulation technique compared with lecture would produce greater improvement in advanced cardiac life support (ACLS) knowledge, confidence, and overall satisfaction with the training method. DESIGN. This sequential, parallel-group, crossover trial randomized students into 2 groups distinguished by the sequence of teaching technique delivered for ACLS instruction (ie, classroom lecture vs high-fidelity simulation exercise). ASSESSMENT. Test scores on a written examination administered at baseline and after each teaching technique improved significantly from baseline in all groups but were highest when lecture was followed by simulation. Simulation was associated with a greater degree of overall student satisfaction compared with lecture. Participation in a simulation exercise did not improve pharmacy students' knowledge of ACLS more than attending a lecture, but it was associated with improved student confidence in skills and satisfaction with learning and application. CONCLUSIONS. College curricula should incorporate simulation to complement but not replace lecture for ACLS education.

  6. Determining β-lactam exposure threshold to suppress resistance development in Gram-negative bacteria.

    PubMed

    Tam, Vincent H; Chang, Kai-Tai; Zhou, Jian; Ledesma, Kimberly R; Phe, Kady; Gao, Song; Van Bambeke, Françoise; Sánchez-Díaz, Ana María; Zamorano, Laura; Oliver, Antonio; Cantón, Rafael

    2017-05-01

    β-Lactams are commonly used for nosocomial infections and resistance to these agents among Gram-negative bacteria is increasing rapidly. Optimized dosing is expected to reduce the likelihood of resistance development during antimicrobial therapy, but the target for clinical dose adjustment is not well established. We examined the likelihood that various dosing exposures would suppress resistance development in an in vitro hollow-fibre infection model. Two strains of Klebsiella pneumoniae and two strains of Pseudomonas aeruginosa (baseline inocula of ∼10⁸ cfu/mL) were examined. Various dosing exposures of cefepime, ceftazidime and meropenem were simulated in the hollow-fibre infection model. Serial samples were obtained to ascertain the pharmacokinetic simulations and viable bacterial burden for up to 120 h. Drug concentrations were determined by a validated LC-MS/MS assay and the simulated exposures were expressed as Cmin/MIC ratios. Resistance development was detected by quantitative culture on drug-supplemented media plates (at 3× the corresponding baseline MIC). The Cmin/MIC breakpoint threshold to prevent bacterial regrowth was identified by classification and regression tree (CART) analysis. For all strains, the bacterial burden declined initially with the simulated exposures, but regrowth was observed in 9 out of 31 experiments. CART analysis revealed that a Cmin/MIC ratio ≥3.8 was significantly associated with regrowth prevention (100% versus 44%, P = 0.001). The development of β-lactam resistance during therapy could be suppressed by an optimized dosing exposure. Validation of the proposed target in a well-designed clinical study is warranted. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
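
    A small illustrative sketch of using the reported Cmin/MIC breakpoint of 3.8 to separate regrowth from suppression; the exposure values and outcomes below are invented placeholders, not the study data.

      import numpy as np

      CMIN_MIC_THRESHOLD = 3.8  # breakpoint reported by the CART analysis above

      # Hypothetical hollow-fibre experiments: simulated Cmin/MIC ratio and regrowth outcome
      cmin_mic = np.array([0.5, 1.2, 2.0, 3.9, 4.5, 6.0, 8.0])
      regrowth = np.array([True, True, False, False, False, False, False])

      above = cmin_mic >= CMIN_MIC_THRESHOLD
      for label, mask in [("Cmin/MIC >= 3.8", above), ("Cmin/MIC < 3.8", ~above)]:
          suppressed = 100.0 * (~regrowth[mask]).mean()
          print(f"{label}: regrowth suppressed in {suppressed:.0f}% of experiments")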

  7. Towards Full Aircraft Airframe Noise Prediction: Detached Eddy Simulations

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Mineck, Raymond E.

    2014-01-01

    Results from a computational study on the aeroacoustic characteristics of an 18%-scale, semi-span Gulfstream aircraft model are presented in this paper. NASA's FUN3D unstructured compressible Navier-Stokes solver was used to perform steady and unsteady simulations of the flow field associated with this high-fidelity aircraft model. Solutions were obtained for free-air at a Mach number of 0.2 with the flap deflected at 39 deg, with the main gear off and on (the two baseline configurations). Initially, the study focused on accurately predicting the prominent noise sources at both flap tips for the baseline configuration with deployed flap only. Building upon the experience gained from this initial effort, subsequent work involved the full landing configuration with both flap and main landing gear deployed. For the unsteady computations, we capitalized on the Detached Eddy Simulation capability of FUN3D to capture the complex time-dependent flow features associated with the flap and main gear. To resolve the noise sources over a broad frequency range, the tailored grid was very dense near the flap inboard and outboard tips and the region surrounding the gear. Extensive comparison of the computed steady and unsteady surface pressures with wind tunnel measurements showed good agreement for the global aerodynamic characteristics and the local flow field at the flap inboard tip. However, the computed pressure coefficients indicated that a zone of separated flow that forms in the vicinity of the outboard tip is larger in extent along the flap span and chord than measurements suggest. Computed farfield acoustic characteristics from a FW-H integral approach that used the simulated pressures on the model solid surface were in excellent agreement with corresponding measurements.

  8. Constraining Polarized Foregrounds for EoR Experiments. II. Polarization Leakage Simulations in the Avoidance Scheme

    NASA Astrophysics Data System (ADS)

    Nunhokee, C. D.; Bernardi, G.; Kohn, S. A.; Aguirre, J. E.; Thyagarajan, N.; Dillon, J. S.; Foster, G.; Grobler, T. L.; Martinot, J. Z. E.; Parsons, A. R.

    2017-10-01

    A critical challenge in the observation of the redshifted 21 cm line is its separation from bright Galactic and extragalactic foregrounds. In particular, the instrumental leakage of polarized foregrounds, which undergo significant Faraday rotation as they propagate through the interstellar medium, may harmfully contaminate the 21 cm power spectrum. We develop a formalism to describe the leakage due to instrumental widefield effects in visibility-based power spectra measured with redundant arrays, extending the delay-spectrum approach presented in Parsons et al. We construct polarized sky models and propagate them through the instrument model to simulate realistic full-sky observations with the Precision Array to Probe the Epoch of Reionization. We find that the leakage due to a population of polarized point sources is expected to be higher than diffuse Galactic polarization at any k mode for a 30 m reference baseline. For the same reference baseline, a foreground-free window at k > 0.3 h Mpc⁻¹ can be defined in terms of leakage from diffuse Galactic polarization even under the most pessimistic assumptions. If measurements of polarized foreground power spectra or a model of polarized foregrounds are given, our method is able to predict the polarization leakage in actual 21 cm observations, potentially enabling its statistical subtraction from the measured 21 cm power spectrum.

  9. Indirect MRI of 17O-labeled water using steady-state sequences: Signal simulation and preclinical experiment.

    PubMed

    Kudo, Kohsuke; Harada, Taisuke; Kameda, Hiroyuki; Uwano, Ikuko; Yamashita, Fumio; Higuchi, Satomi; Yoshioka, Kunihiro; Sasaki, Makoto

    2018-05-01

    Few studies have been reported on T2-weighted indirect 17O imaging. To evaluate the feasibility of steady-state sequences for indirect 17O brain imaging. Signal simulation, phantom measurements, and prospective animal experiments were performed in accordance with the institutional guidelines for animal experiments. Signal simulations of balanced steady-state free precession (bSSFP) were performed for concentrations of 17O ranging from 0.037% to 1.600%. Phantom measurements with concentrations of 17O water ranging from 0.037% to 1.566% were also conducted. Six healthy beagle dogs were scanned with intravenous administration of 20% 17O-labeled water (1 mL/kg). Dynamic 3D-bSSFP scans were performed on a 3 T MRI system. 17O-labeled water was injected 60 seconds after the scan start, and the total scan duration was 5 minutes. Based on the results of the signal simulation and phantom measurement, signal changes in the beagle dogs were measured and converted into 17O concentrations. The 17O concentrations were averaged for every 15 seconds and compared to the baseline (30-45 sec) with Dunnett's multiple comparison tests. Signal simulation revealed that the relationship between 17O concentration and the natural logarithm of the relative signal is linear. The intraclass correlation coefficient between relative signals in the phantom measurement and the signal simulations was 0.974. In the animal experiments, significant increases in 17O concentration (P < 0.05) were observed 60 seconds after the injection of 17O. At the end of scanning, mean 17O concentrations of 0.084 ± 0.026%, 0.117 ± 0.038%, 0.082 ± 0.037%, and 0.049 ± 0.004% were noted for the cerebral cortex, cerebellar cortex, cerebral white matter, and ventricle, respectively. Dynamic steady-state sequences were feasible for indirect 17O imaging, and absolute quantification was possible. This method can be applied for the measurement of permeability and blood flow in the brain, and for kinetic analysis of cerebrospinal fluid. Level of Evidence: 2. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2018;47:1373-1379. © 2017 International Society for Magnetic Resonance in Medicine.
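
    The reported log-linear relation between 17O concentration and relative signal suggests a simple calibration workflow: fit ln(S/S0) against concentration from phantom data, then invert it for in vivo signal changes. The sketch below uses invented phantom values and illustrates only the shape of the calculation, not the study's fitted coefficients.

      import numpy as np

      # Hypothetical phantom calibration: 17O concentration (%) vs relative bSSFP signal
      conc = np.array([0.037, 0.1, 0.4, 0.8, 1.566])          # natural abundance up to enriched
      rel_signal = np.array([1.00, 0.96, 0.82, 0.67, 0.46])   # invented values

      # Abstract: ln(relative signal) is linear in 17O concentration
      a, b = np.polyfit(conc, np.log(rel_signal), 1)

      def signal_to_concentration(s_rel):
          """Invert the calibration: relative signal -> 17O concentration (%)."""
          return (np.log(s_rel) - b) / a

      print(signal_to_concentration(0.90))  # e.g. a 10% signal drop in a dynamic scan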

  10. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F; Yu, Yi-Hsiang; Nielsen, Kim

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5, conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  11. VizieR Online Data Catalog: Kepler pipeline transit signal recovery. III. (Christiansen+, 2016)

    NASA Astrophysics Data System (ADS)

    Christiansen, J. L.; Clarke, B. D.; Burke, C. J.; Jenkins, J. M.; Bryson, S. T.; Coughlin, J. L.; Mullally, F.; Thompson, S. E.; Twicken, J. D.; Batalha, N. M.; Haas, M. R.; Catanzarite, J.; Campbell, J. R.; Uddin, A. K.; Zamudio, K.; Smith, J. C.; Henze, C. E.

    2018-03-01

    Here we describe the third transit injection experiment, which tests the entire Kepler observing baseline (Q1-Q17) for the first time across all 84 CCD channels. It was performed to measure the sensitivity of the Kepler pipeline used to generate the Q1-Q17 Data Release 24 (DR24) catalog of Kepler Objects of Interest (Coughlin et al. 2016, J/ApJS/224/12) available at the NASA Exoplanet Archive (Akeson et al. 2013PASP..125..989A). The average detection efficiency describes the likelihood that the Kepler pipeline would successfully recover a given transit signal. To measure this property we perform a Monte Carlo experiment where we inject the signatures of simulated transiting planets around 198154 target stars, one per star, across the focal plane starting with the Q1-Q17 DR24 calibrated pixels. The simulated transits are generated using the Mandel & Agol (2002ApJ...580L.171M) model. Of the injections, 159013 resulted in three or more injected transits (the minimum required for detection by the pipeline) and were used for the subsequent analysis. (1 data file).
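
    A hedged sketch of how the average detection efficiency described above is typically tabulated, i.e. the fraction of injected signals the pipeline recovers, binned here by orbital period; the period distribution and recovery flags are randomly generated stand-ins, not Kepler results.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical injection catalogue: one injected transit signal per target star
      n_inj = 159013                          # injections with >= 3 transits, as in the abstract
      period = rng.uniform(0.5, 500, n_inj)   # days, invented distribution
      recovered = rng.random(n_inj) < 0.7     # stand-in for pipeline recovery flags

      # Detection efficiency = fraction of injected signals recovered, per period bin
      bins = np.logspace(np.log10(0.5), np.log10(500), 11)
      idx = np.digitize(period, bins) - 1
      for i in range(len(bins) - 1):
          sel = idx == i
          if sel.any():
              print(f"{bins[i]:7.1f}-{bins[i+1]:7.1f} d: {recovered[sel].mean():.2f}")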

  12. Effects of Geometric Details on Slat Noise Generation and Propagation

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Lockard, David P.

    2009-01-01

    The relevance of geometric details to the generation and propagation of noise from leading-edge slats is considered. Typically, such details are omitted in computational simulations and model-scale experiments, thereby creating ambiguities in comparisons with acoustic results from flight tests. The current study uses two-dimensional computational simulations in conjunction with a Ffowcs Williams-Hawkings (FW-H) solver to investigate the effects of previously neglected slat "bulb" and "blade" seals on the local flow field and the associated acoustic radiation. The computations show that the presence of the "blade" seal at the cusp in the simulated geometry significantly changes the slat cove flow dynamics, reduces the amplitudes of the radiated sound, and to a lesser extent, alters the directivity beneath the airfoil. Furthermore, the computations suggest that a modest extension of the baseline "blade" seal further enhances the suppression of slat noise. As a side issue, the utility and equivalence of the FW-H methodology for calculating far-field noise, as opposed to a more direct approach, are examined and demonstrated.

  13. A method to detect layover and shadow based on distributed spaceborne single-baseline InSAR

    NASA Astrophysics Data System (ADS)

    Yun, Ren; Huanxin, Zou; Shilin, Zhou; Hao, Sun; Kefeng, Ji

    2014-03-01

    Layover and shadow are inevitable phenomena in InSAR, which seriously disrupt the continuity of interferometric phase images and present difficulties in the subsequent phase unwrapping. Thus, it is important to detect layover and shadow. This paper presents an approach to detecting layover and shadow using the auto-correlation matrix and amplitude of the two images. The method can make full use of the spatial information of neighboring pixels and effectively detect layover and shadow regions in the case of low registration accuracy. Experimental results on simulated data verify the effectiveness of the algorithm.

  14. Determination of Earth outgoing radiation using a constellation of satellites

    NASA Astrophysics Data System (ADS)

    Gristey, Jake; Chiu, Christine; Gurney, Robert; Han, Shin-Chan; Morcrette, Cyril

    2017-04-01

    The outgoing radiation fluxes at the top of the atmosphere, referred to as Earth outgoing radiation (EOR), constitute a vital component of the Earth's energy budget. This EOR exhibits strong diurnal signatures and is inherently connected to the rapidly evolving scene from which the radiation originates, so our ability to accurately monitor EOR with sufficient temporal resolution and spatial coverage is crucial for weather and climate studies. Despite vast improvements in satellite observations in recent decades, meeting these criteria remains challenging with current measurements. A technology revolution in small satellites and sensor miniaturisation has created a new and exciting opportunity for a novel, viable and sustainable observation strategy from a constellation of satellites, capable of providing both global coverage and high temporal resolution simultaneously. To explore the potential of a constellation approach for observing EOR, we perform a series of theoretical simulation experiments. Using the results from these simulation experiments, we will demonstrate a baseline constellation configuration capable of accurately monitoring global EOR at unprecedented temporal resolution. We will also show whether it is possible to reveal synoptic-scale, fast-evolving phenomena by applying a deconvolution technique to the simulated measurements. The ability to observe and understand the relationship between these phenomena and changes in EOR is of fundamental importance in constraining future warming of our climate system.

  15. A Multispecialty Evaluation of Thiel Cadavers for Surgical Training.

    PubMed

    Yiasemidou, Marina; Roberts, David; Glassman, Daniel; Tomlinson, James; Biyani, Shekhar; Miskovic, Danilo

    2017-05-01

    Changes in UK legislation allow for surgical procedures to be performed on cadavers. The aim of this study was to assess Thiel cadavers as high-fidelity simulators and to examine their suitability for surgical training. Surgeons from various specialties were invited to attend a 1 day dissection workshop using Thiel cadavers. The surgeons completed a baseline questionnaire on cadaveric simulation. At the end of the workshop, they completed a similar questionnaire based on their experience with Thiel cadavers. Comparing the answers in the pre- and post-workshop questionnaires assessed whether using Thiel cadavers had changed the surgeons' opinions of cadaveric simulation. According to the 27 participants, simulation is important for surgical training and a full-procedure model is beneficial for all levels of training. Currently, there is dissatisfaction with existing models and a need for high-fidelity alternatives. After the workshop, surgeons concluded that Thiel cadavers are suitable for surgical simulation (p = 0.015). Thiel cadavers were found to be realistic (p < 0.001), to have reduced odour (p = 0.002), and to be more cost-effective (p = 0.003). Ethical constraints were considered to be small. Thiel cadavers are suitable for training in most surgical specialties.

  16. Simulations of multi-component explosives using simplified geometric arrangements of their constituents

    NASA Astrophysics Data System (ADS)

    Butler, George; Pemberton, Steven

    2017-06-01

    Modeling and simulation is extremely important in the design and formulation of new explosives and explosive devices due to the high cost of experiment-based development. However, the efficacy of simulations depends on the accuracy of the equations of state (EOS) and reactive burn models used to characterize the energetic materials. We investigate the possibility of using the components of an explosive fill as discrete elements in a simulation, based on the relative amounts of the constituents. This is accomplished by assembling a mosaic, or "checkerboard", in which the cells reproduce the relative amounts of the constituents in the mixture; it is assumed that each constituent has a well-defined set of simulation parameters. We do not consider the underlying microstructure, and recognize there will be limitations to the usefulness of this technique. We are interested in determining whether there are applications for this technique that might prove useful. As a test of the concept, two binary explosives were considered. We considered shapes for a periodic cellular structure and compared results from the checkerboards with those of the baseline explosives; detonation rates, cylinder expansion, and gap test predictions were compared.
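
    A toy sketch of the constituent "checkerboard" idea, assuming each cell is filled by a single constituent and the cells together approximate the mixture's proportions; the grid size and 60/40 split are arbitrary assumptions, and a real hydrocode setup would attach each constituent's EOS and reactive burn model to its cells.

      import numpy as np

      def constituent_mosaic(nx, ny, fraction_a, seed=0):
          """Assign each periodic cell to constituent A with probability fraction_a.

          A crude stand-in for the mosaic described above; 1 marks constituent A
          and 0 marks constituent B.
          """
          rng = np.random.default_rng(seed)
          return (rng.random((nx, ny)) < fraction_a).astype(int)

      grid = constituent_mosaic(8, 8, fraction_a=0.6)  # e.g. a 60/40 binary explosive
      print(grid)
      print("realised A fraction:", grid.mean())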

  17. Synthesizing SMOS Zero-Baselines with Aquarius Brightness Temperature Simulator

    NASA Technical Reports Server (NTRS)

    Colliander, A.; Dinnat, E.; Le Vine, D.; Kainulainen, J.

    2012-01-01

    SMOS [1] and Aquarius [2] are ESA and NASA missions, respectively, to make L-band measurements from the Low Earth Orbit. SMOS makes passive measurements whereas Aquarius makes both passive and active measurements. SMOS was launched in November 2009 and Aquarius in June 2011. The scientific objectives of the missions are overlapping: both missions aim at mapping the global Sea Surface Salinity (SSS). Additionally, the SMOS mission produces a soil moisture product (however, Aquarius data will eventually be used for retrieving soil moisture too). The consistency of the brightness temperature observations made by the two instruments is essential for long-term studies of SSS and soil moisture. The key to establishing this consistency is the calibration of the instruments. The basis of the SMOS brightness temperature level is the measurements performed with the so-called zero-baselines [3]; SMOS employs an interferometric measurement technique which forms a brightness temperature image from several baselines constructed by combination of multiple receivers in an array; the zero-length baseline defines the overall brightness temperature level. The basis of the Aquarius brightness temperature level is resolved from the brightness temperature simulator combined with ancillary data such as antenna patterns and environmental models [4]. Consistency between the SMOS zero-baseline measurements and the simulator output would provide a robust basis for establishing the overall comparability of the missions.

  18. Estimability of geodetic parameters from space VLBI observables

    NASA Technical Reports Server (NTRS)

    Adam, Jozsef

    1990-01-01

    The feasibility of space very long baseline interferometry (VLBI) observables for geodesy and geodynamics is investigated. A brief review of space VLBI systems from the point of view of potential geodetic application is given. A selected notational convention is used to jointly treat the VLBI observables of different types of baselines within a combined ground/space VLBI network. The basic equations of the space VLBI observables appropriate for covariance analysis are derived and included. The corresponding equations for the ground-to-ground baseline VLBI observables are also given for comparison. The simplified expressions of the mathematical models for both space VLBI observables (time delay and delay rate) include the ground station coordinates, the satellite orbital elements, the earth rotation parameters, the radio source coordinates, and clock parameters. The observation equations with these parameters were examined in order to determine which of them are separable or nonseparable. Singularity problems arising from coordinate system definition and critical configuration are studied. Linear dependencies between partials are analytically derived. The mathematical models for ground-space baseline VLBI observables were tested with simulated data in a set of numerical experiments. Singularity due to datum defect is confirmed.
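
    For orientation, the ground-to-ground building block of these models is the geometric delay, tau = -(B · s_hat)/c; the sketch below evaluates it for an invented baseline and source direction, and its partials with respect to the baseline components (-s_hat/c) are the kind of quantities whose linear dependencies the covariance analysis examines.

      import numpy as np

      C = 299_792_458.0  # speed of light (m/s)

      def geometric_delay(baseline_m, source_unit_vec):
          """Standard ground-to-ground VLBI time-delay model, tau = -(B . s_hat)/c.

          Illustrative only; the space VLBI models discussed above add the satellite
          orbital elements and Earth rotation parameters to this basic form.
          """
          return -np.dot(baseline_m, source_unit_vec) / C

      B = np.array([3_000_000.0, 1_500_000.0, 500_000.0])  # invented baseline vector (m)
      s_hat = np.array([0.5, 0.5, np.sqrt(0.5)])           # unit vector to a radio source
      print(geometric_delay(B, s_hat), "s")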

  19. The Effects of Synthetic and Enhanced Vision Technologies for Lunar Landings

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Norman, Robert M.; Prinzel, Lawrence J., III; Bailey, Randall E.; Arthur, Jarvis J., III; Shelton, Kevin J.; Williams, Steven P.

    2009-01-01

    Eight pilots participated as test subjects in a fixed-base simulation experiment to evaluate advanced vision display technologies such as Enhanced Vision (EV) and Synthetic Vision (SV) for providing terrain imagery on flight displays in a Lunar Lander Vehicle. Subjects were asked to fly 20 approaches to the Apollo 15 lunar landing site with four different display concepts - Baseline (symbology only with no terrain imagery), EV only (terrain imagery from Forward Looking Infrared, or FLIR, and Light Detection and Ranging, or LIDAR, sensors), SV only (terrain imagery from onboard database), and Fused EV and SV concepts. As expected, manual landing performance was excellent (within a meter of landing site center) and not affected by the inclusion of EV or SV terrain imagery on the Lunar Lander flight displays. Subjective ratings revealed significant situation awareness improvements with the concepts employing EV and/or SV terrain imagery compared to the Baseline condition that had no terrain imagery. In addition, display concepts employing EV imagery (compared to the SV and Baseline concepts, which had none) were significantly better for pilot detection of intentional but unannounced navigation failures, since this imagery provided an intuitive and obvious visual means to monitor the validity of the navigation solution.

  20. Pilot study comparing simulation-based and didactic lecture-based critical care teaching for final-year medical students.

    PubMed

    Solymos, Orsolya; O'Kelly, Patrick; Walshe, Criona M

    2015-10-21

    Simulation-based medical education has rapidly evolved over the past two decades; despite this, there are few published reports of its use in critical care teaching. We hypothesised that simulation-based teaching of a critical care topic to final-year medical students is superior to lecture-based teaching. Thirty-nine final-year medical students were randomly assigned to either simulation-based or lecture-based teaching in the chosen critical care topic. The study was conducted over a 6-week period. Efficacy of each teaching method was compared through use of multiple-choice questionnaires (MCQs) administered at baseline, post-teaching, and 2-week follow-up. Student satisfaction was evaluated by means of a questionnaire. Feasibility and resource requirements were documented by teachers. Eighteen students were randomised to simulation-based, and 21 to lecture-based teaching. There were no differences in age and gender between groups (p > 0.05). Simulation proved more resource intensive, requiring specialised equipment, two instructors, and longer teaching sessions (126.7 min (SD = 4.71) vs 68.3 min (SD = 2.36)). Students ranked simulation-based teaching higher with regard to enjoyment (p = 0.0044), interest (p = 0.0068), relevance to taught subject (p = 0.0313), ease of understanding (p = 0.0476) and accessibility to posing questions (p = 0.001). Both groups demonstrated improvement in post-teaching MCQ scores from baseline (p = 0.0002), with greater improvement seen among the simulation group (p = 0.0387); however, baseline scores were higher among the lecture group. The results of the 2-week follow-up MCQ and post-teaching MCQ did not differ significantly when the two modalities were compared. Simulation was perceived as more enjoyable by students. Although there was a greater improvement in post-teaching MCQ scores among the simulation group, baseline scores were higher in the lecture group, which limits interpretation of efficacy. Simulation is more resource intensive, as demonstrated by the increased duration and personnel required, and this may have affected our results. The current pilot may be of use in informing future studies in this area.

  1. Future long-baseline neutrino oscillations: View from Asia

    NASA Astrophysics Data System (ADS)

    Hayato, Yoshinari

    2015-07-01

    Accelerator-based long-baseline neutrino oscillation experiments have been playing important roles in revealing the nature of neutrinos. However, it has turned out that the current experiments are not sufficient to address two major remaining problems: CP violation in the lepton sector and the neutrino mass hierarchy. Therefore, several new experiments have been proposed. Among them, two accelerator-based long-baseline neutrino oscillation experiments have been proposed in Asia: one combining the J-PARC neutrino beam with Hyper-Kamiokande, and MOMENT. These two projects are reviewed in this article.

  2. Reorganization of Equivalence Classes: Evidence for Contextual Control by Baseline Reviews before Probes

    ERIC Educational Resources Information Center

    Garotti, Marilice; De Rose, Julio C.

    2007-01-01

    Two experiments investigated baseline reviews as a relevant variable in reorganization of equivalence classes. After formation of three 4-member classes, participants learned reversals of baseline conditional discriminations and expanded the classes to 5 members each. In Experiment 1, 4 students responded on equivalence probes without baseline…

  3. Ensembles of novelty detection classifiers for structural health monitoring using guided waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dib, Gerges; Karpenko, Oleksii; Koricho, Ermias

    Guided wave structural health monitoring uses sparse sensor networks embedded in sophisticated structures for defect detection and characterization. The biggest challenge of those sensor networks is developing robust techniques for reliable damage detection under changing environmental and operating conditions. To address this challenge, we develop a novelty classifier for damage detection based on one-class support vector machines. We identify appropriate features for damage detection and introduce a feature aggregation method which quadratically increases the number of available training observations. We adopt a two-level voting scheme by using an ensemble of classifiers and predictions. Each classifier is trained on a different segment of the guided wave signal, and each classifier makes an ensemble of predictions based on a single observation. Using this approach, the classifier can be trained using a small number of baseline signals. We study the performance using Monte Carlo simulations of an analytical model and data from impact damage experiments on a glass fiber composite plate. We also demonstrate the classifier performance using two types of baseline signals: fixed and rolling baseline training set. The former requires prior knowledge of baseline signals from all environmental and operating conditions, while the latter does not and leverages the fact that environmental and operating conditions vary slowly over time and can be modeled as a Gaussian process.
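
    A minimal sketch of the ensemble-of-segments idea using scikit-learn's one-class SVM; the segment count, feature dimensions, and data are synthetic placeholders, and the feature aggregation and rolling-baseline machinery of the paper are omitted.

      import numpy as np
      from sklearn.svm import OneClassSVM

      rng = np.random.default_rng(1)
      n_segments, n_features = 4, 6            # guided-wave signal split into segments

      # Hypothetical baseline features: (observations, segments, features)
      baseline = rng.normal(0.0, 1.0, (30, n_segments, n_features))
      test_obs = rng.normal(0.5, 1.0, (n_segments, n_features))   # possibly damaged state

      # One novelty classifier per signal segment, trained on baseline data only
      classifiers = [OneClassSVM(nu=0.05, gamma="scale").fit(baseline[:, s, :])
                     for s in range(n_segments)]

      # First-level votes: each segment classifier flags the test observation (+1 or -1)
      votes = np.array([clf.predict(test_obs[s:s + 1])[0]
                        for s, clf in enumerate(classifiers)])

      # Second-level vote: declare damage if a majority of segments look anomalous (-1)
      print("damage" if (votes == -1).sum() > n_segments / 2 else "baseline")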

  4. How Much Can We Learn from a Single Chromatographic Experiment? A Bayesian Perspective.

    PubMed

    Wiczling, Paweł; Kaliszan, Roman

    2016-01-05

    In this work, we proposed and investigated a Bayesian inference procedure to find the desired chromatographic conditions based on known analyte properties (lipophilicity, pKa, and polar surface area) using one preliminary experiment. A previously developed nonlinear mixed effect model was used to specify the prior information about a new analyte with known physicochemical properties. Further, the prior (no preliminary data) and posterior predictive distribution (prior + one experiment) were determined sequentially to search towards the desired separation. The following isocratic high-performance reversed-phase liquid chromatographic conditions were sought: (1) retention time of a single analyte within the range of 4-6 min and (2) baseline separation of two analytes with retention times within the range of 4-10 min. The empirical posterior Bayesian distribution of parameters was estimated using the "slice sampling" Markov Chain Monte Carlo (MCMC) algorithm implemented in Matlab. The simulations with artificial analytes and experimental data of ketoprofen and papaverine were used to test the proposed methodology. The simulation experiment showed that, for a single analyte and for two randomly selected analytes, there is a 97% and 74% probability, respectively, of obtaining a successful chromatogram using none or one preliminary experiment. The desired separation for ketoprofen and papaverine was established based on a single experiment. It was confirmed that the search for a desired separation rarely requires a large number of chromatographic analyses, at least for a simple optimization problem. The proposed Bayesian-based optimization scheme is a powerful method of finding a desired chromatographic separation based on a small number of preliminary experiments.
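
    Once posterior (or posterior predictive) draws of the retention time are available from the MCMC step, criterion (1) reduces to counting the fraction of draws that land in the 4-6 min window; the sketch below uses a lognormal stand-in for those draws rather than the actual slice-sampling output.

      import numpy as np

      rng = np.random.default_rng(2)

      # Stand-in for posterior predictive draws of the retention time (min) of a new
      # analyte at candidate conditions; in the real workflow these come from the
      # slice-sampling posterior described above.
      t_r_draws = rng.lognormal(mean=np.log(5.0), sigma=0.35, size=10_000)

      # Probability that the candidate conditions meet criterion (1): 4 min <= tR <= 6 min
      p_success = np.mean((t_r_draws >= 4.0) & (t_r_draws <= 6.0))
      print(f"P(4 <= tR <= 6 min) = {p_success:.2f}")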

  5. Integrated orbit and attitude hardware-in-the-loop simulations for autonomous satellite formation flying

    NASA Astrophysics Data System (ADS)

    Park, Han-Earl; Park, Sang-Young; Kim, Sung-Woo; Park, Chandeok

    2013-12-01

    Development and experiment of an integrated orbit and attitude hardware-in-the-loop (HIL) simulator for autonomous satellite formation flying are presented. The integrated simulator system consists of an orbit HIL simulator for orbit determination and control, and an attitude HIL simulator for attitude determination and control. The integrated simulator involves four processes (orbit determination, orbit control, attitude determination, and attitude control), which interact with each other in the same way as actual flight processes do. Orbit determination is conducted by a relative navigation algorithm using double-difference GPS measurements based on the extended Kalman filter (EKF). Orbit control is performed by a state-dependent Riccati equation (SDRE) technique that is utilized as a nonlinear controller for the formation control problem. Attitude is determined from an attitude heading reference system (AHRS) sensor, and a proportional-derivative (PD) feedback controller is used to control the attitude HIL simulator using three momentum wheel assemblies. Integrated orbit and attitude simulations are performed for a formation reconfiguration scenario. By performing the four processes adequately, the desired formation reconfiguration from a baseline of 500-1000 m was achieved with meter-level position error and millimeter-level relative position navigation. This HIL simulation demonstrates the performance of the integrated HIL simulator and the feasibility of the applied algorithms in a real-time environment. Furthermore, the integrated HIL simulator system developed in the current study can be used as a ground-based testing environment to reproduce possible actual satellite formation operations.
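
    As a rough illustration of the attitude-control piece described above, a proportional-derivative feedback law mapping attitude and rate errors to momentum-wheel torque commands might look like the following; the gains and error values are assumptions, not the simulator's actual parameters.

      import numpy as np

      # Illustrative PD attitude controller in the spirit of the one described above
      Kp = np.diag([0.8, 0.8, 0.8])   # proportional gains (assumed)
      Kd = np.diag([2.0, 2.0, 2.0])   # derivative gains (assumed)

      def wheel_torque_command(att_err_rad, rate_err_rad_s):
          """Torque (N*m) requested from the three momentum wheel assemblies."""
          return -Kp @ att_err_rad - Kd @ rate_err_rad_s

      tau = wheel_torque_command(np.array([0.02, -0.01, 0.005]),
                                 np.array([0.001, 0.0, -0.002]))
      print(tau)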

  6. Modeling Nonlinear Errors in Surface Electromyography Due To Baseline Noise: A New Methodology

    PubMed Central

    Law, Laura Frey; Krishnan, Chandramouli; Avin, Keith

    2010-01-01

    The surface electromyographic (EMG) signal is often contaminated by some degree of baseline noise. It is customary for scientists to subtract baseline noise from the measured EMG signal prior to further analyses based on the assumption that baseline noise adds linearly to the observed EMG signal. The stochastic nature of both the baseline and EMG signal, however, may invalidate this assumption. Alternately, “true” EMG signals may be either minimally or nonlinearly affected by baseline noise. This information is particularly relevant at low contraction intensities when signal-to-noise ratios (SNR) may be lowest. Thus, the purpose of this simulation study was to investigate the influence of varying levels of baseline noise (approximately 2 – 40 % maximum EMG amplitude) on mean EMG burst amplitude and to assess the best means to account for signal noise. The simulations indicated baseline noise had minimal effects on mean EMG activity for maximum contractions, but increased nonlinearly with increasing noise levels and decreasing signal amplitudes. Thus, the simple baseline noise subtraction resulted in substantial error when estimating mean activity during low intensity EMG bursts. Conversely, correcting EMG signal as a nonlinear function of both baseline and measured signal amplitude provided highly accurate estimates of EMG amplitude. This novel nonlinear error modeling approach has potential implications for EMG signal processing, particularly when assessing co-activation of antagonist muscles or small amplitude contractions where the SNR can be low. PMID:20869716
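
    To make the contrast concrete, the sketch below compares the conventional linear baseline subtraction with one simple nonlinear alternative in which uncorrelated noise and signal powers add, so amplitudes combine in quadrature; this quadrature form is an assumption for illustration, not necessarily the fitted model of the study, but it shows why the linear correction degrades at low signal-to-noise ratios.

      import numpy as np

      def linear_correction(measured_rms, noise_rms):
          """Conventional approach criticised above: subtract baseline amplitude directly."""
          return measured_rms - noise_rms

      def quadrature_correction(measured_rms, noise_rms):
          """Assumed nonlinear alternative: uncorrelated powers add, so the 'true'
          EMG amplitude is recovered as a root difference of squares."""
          return np.sqrt(np.maximum(measured_rms**2 - noise_rms**2, 0.0))

      noise = 0.05                          # baseline RMS, arbitrary units
      for measured in (0.07, 0.15, 0.60):   # low to high contraction intensity
          print(measured,
                round(linear_correction(measured, noise), 3),
                round(quadrature_correction(measured, noise), 3))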

  7. Optimal Design of Passive Flow Control for a Boundary-Layer-Ingesting Offset Inlet Using Design-of-Experiments

    NASA Technical Reports Server (NTRS)

    Allan, Brian G.; Owens, Lewis R.; Lin, John C.

    2006-01-01

    This research will investigate the use of Design-of-Experiments (DOE) in the development of an optimal passive flow control vane design for a boundary-layer-ingesting (BLI) offset inlet in transonic flow. This inlet flow control is designed to minimize the engine fan-face distortion levels and first five Fourier harmonic half amplitudes while maximizing the inlet pressure recovery. Numerical simulations of the BLI inlet are computed using the Reynolds-averaged Navier-Stokes (RANS) flow solver, OVERFLOW, developed at NASA. These simulations are used to generate the numerical experiments for the DOE response surface model. In this investigation, two DOE optimizations were performed using a D-Optimal Response Surface model. The first DOE optimization was performed using four design factors which were vane height and angles-of-attack for two groups of vanes. One group of vanes was placed at the bottom of the inlet and a second group symmetrically on the sides. The DOE design was performed for a BLI inlet with a free-stream Mach number of 0.85 and a Reynolds number of 2 million, based on the length of the fan-face diameter, matching an experimental wind tunnel BLI inlet test. The first DOE optimization required a fifth order model having 173 numerical simulation experiments and was able to reduce the DC60 baseline distortion from 64% down to 4.4%, while holding the pressure recovery constant. A second DOE optimization was performed holding the vane heights at a constant value from the first DOE optimization with the two vane angles-of-attack as design factors. This DOE only required a second order model fit with 15 numerical simulation experiments and reduced DC60 to 3.5% with small decreases in the fourth and fifth harmonic amplitudes. The second optimal vane design was tested at the NASA Langley 0.3-Meter Transonic Cryogenic Tunnel in a BLI inlet experiment. The experimental results showed an 80% reduction of DPCPavg, the circumferential distortion level at the engine fan-face.

  8. Optimal Design of Passive Flow Control for a Boundary-Layer-Ingesting Offset Inlet Using Design-of-Experiments

    NASA Technical Reports Server (NTRS)

    Allan, Brian G.; Owens, Lewis R., Jr.; Lin, John C.

    2006-01-01

    This research will investigate the use of Design-of-Experiments (DOE) in the development of an optimal passive flow control vane design for a boundary-layer-ingesting (BLI) offset inlet in transonic flow. This inlet flow control is designed to minimize the engine fan face distortion levels and first five Fourier harmonic half amplitudes while maximizing the inlet pressure recovery. Numerical simulations of the BLI inlet are computed using the Reynolds-averaged Navier-Stokes (RANS) flow solver, OVERFLOW, developed at NASA. These simulations are used to generate the numerical experiments for the DOE response surface model. In this investigation, two DOE optimizations were performed using a D-Optimal Response Surface model. The first DOE optimization was performed using four design factors which were vane height and angles-of-attack for two groups of vanes. One group of vanes was placed at the bottom of the inlet and a second group symmetrically on the sides. The DOE design was performed for a BLI inlet with a free-stream Mach number of 0.85 and a Reynolds number of 2 million, based on the length of the fan face diameter, matching an experimental wind tunnel BLI inlet test. The first DOE optimization required a fifth order model having 173 numerical simulation experiments and was able to reduce the DC60 baseline distortion from 64% down to 4.4%, while holding the pressure recovery constant. A second DOE optimization was performed holding the vane heights at a constant value from the first DOE optimization with the two vane angles-of-attack as design factors. This DOE only required a second order model fit with 15 numerical simulation experiments and reduced DC60 to 3.5% with small decreases in the fourth and fifth harmonic amplitudes. The second optimal vane design was tested at the NASA Langley 0.3-Meter Transonic Cryogenic Tunnel in a BLI inlet experiment. The experimental results showed an 80% reduction of DPCPavg, the circumferential distortion level at the engine fan face.

  9. Impact of geoengineered aerosols on the troposphere and stratosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tilmes, S.; Garcia, Rolando R.; Kinnison, Douglas E.

    2009-06-27

    A coupled chemistry climate model, the Whole Atmosphere Community Climate Model, was used to perform a transient climate simulation to quantify the impact of geoengineered aerosols on atmospheric processes. In contrast to previous model studies, the impact on stratospheric chemistry, including heterogeneous chemistry in the polar regions, is considered in this simulation. In the geoengineering simulation, a constant stratospheric distribution of volcanic-sized, liquid sulfate aerosols is imposed in the period 2020–2050, corresponding to an injection of 2 Tg S/a. The aerosol cools the troposphere compared to a baseline simulation. Assuming an Intergovernmental Panel on Climate Change A1B emission scenario, global warming is delayed by about 40 years in the troposphere with respect to the baseline scenario. Large local changes of precipitation and temperatures may occur as a result of geoengineering. Comparison with simulations carried out with the Community Atmosphere Model indicates the importance of stratospheric processes for estimating the impact of stratospheric aerosols on the Earth’s climate. Changes in stratospheric dynamics and chemistry, especially faster heterogeneous reactions, reduce the recovery of the ozone layer in middle and high latitudes for the Southern Hemisphere. In the geoengineering case, the recovery of the Antarctic ozone hole is delayed by about 30 years on the basis of this model simulation. For the Northern Hemisphere, a onefold to twofold increase of the chemical ozone depletion occurs owing to a simulated stronger polar vortex and colder temperatures compared to the baseline simulation, in agreement with observational estimates.

  10. On the Feasibility of Intense Radial Velocity Surveys for Earth-twin Discoveries

    NASA Astrophysics Data System (ADS)

    Hall, Richard D.; Thompson, Samantha J.; Handley, Will; Queloz, Didier

    2018-06-01

    This work assesses the potential capability of the next generation of high-precision Radial Velocity (RV) instruments for Earth-twin exoplanet detection. From the perspective of the importance of data sampling, the Terra Hunting Experiment aims to do this through an intense series of nightly RV observations over a long baseline on a carefully selected target list, via the brand-new instrument HARPS3. This paper describes an end-to-end simulation of generating and processing such data to help us better understand the impact of uncharacterised stellar noise in the recovery of Earth-mass planets with orbital periods of the order of many months. We consider full Keplerian systems, realistic simulated stellar noise, instrument white noise, and location-specific weather patterns for our observation schedules. We use Bayesian statistics to assess various planetary models fitted to the synthetic data, and compare the successful planet recovery of the Terra Hunting Experiment schedule with a typical reference survey. We find that the Terra Hunting Experiment can detect Earth-twins in the habitable zones of solar-type stars, in single and multi-planet systems, and in the presence of stellar signals. We also find that it outperforms a typical reference survey in the accuracy of recovered parameters, and that it performs comparably to an uninterrupted space-based schedule.
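
    A rough sketch of the sampling question at the heart of the schedule comparison above: an Earth-twin-like radial-velocity signal is sampled on a dense nightly schedule and on a sparse schedule, and a Lomb-Scargle periodogram is used to check whether the orbital period is recovered. The semi-amplitude, noise level, and schedules are illustrative assumptions only; the study's full Keplerian systems, stellar noise model, weather patterns, and Bayesian model comparison are not reproduced here.

```python
# Compare period recovery for a dense (nightly) versus sparse observing schedule.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
P, K, sigma = 365.25, 0.10, 0.20          # period (d), semi-amplitude and noise (m/s), assumed

def rv(t):
    """Circular-orbit radial-velocity signal (a stand-in for a full Keplerian)."""
    return K * np.sin(2.0 * np.pi * t / P)

t_dense = np.arange(0.0, 5 * 365.25, 1.0)                     # nightly for 5 years
t_sparse = np.sort(rng.choice(t_dense, 120, replace=False))   # sparse reference schedule

ang_freqs = 2.0 * np.pi / np.linspace(50.0, 700.0, 2000)      # trial periods -> angular freqs
for name, t in (("dense", t_dense), ("sparse", t_sparse)):
    y = rv(t) + rng.normal(0.0, sigma, t.size)
    power = lombscargle(t, y - y.mean(), ang_freqs)
    best_period = 2.0 * np.pi / ang_freqs[np.argmax(power)]
    print(f"{name:6s} schedule: strongest periodogram peak at {best_period:6.1f} d")
```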

  11. Baseline Optimization for the Measurement of CP Violation, Mass Hierarchy, and θ23 Octant in a Long-Baseline Neutrino Oscillation Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bass, M.; Bishai, M.; Cherdack, D.

    2015-03-19

    Next-generation long-baseline electron neutrino appearance experiments will seek to discover CP violation, determine the mass hierarchy and resolve the θ23 octant. In light of the recent precision measurements of θ13, we consider the sensitivity of these measurements in a study to determine the optimal baseline, including practical considerations regarding beam and detector performance. We conclude that a detector at a baseline of at least 1000 km in a wide-band muon neutrino beam is the optimal configuration.

  12. Use of Ventricular Assist Device in Univentricular Physiology: The Role of Lumped Parameter Models.

    PubMed

    Di Molfetta, Arianna; Ferrari, Gianfranco; Filippelli, Sergio; Fresiello, Libera; Iacobelli, Roberta; Gagliardi, Maria G; Amodeo, Antonio

    2016-05-01

    Failing single-ventricle (SV) patients might benefit from ventricular assist devices (VADs) as a bridge to heart transplantation. Considering the complex physiopathology of SV patients and the lack of established experience, the aim of this work was to realize and test a lumped parameter model of the cardiovascular system, able to simulate SV hemodynamics and VAD implantation effects. Data of 30 SV patients (10 Norwood, 10 Glenn, and 10 Fontan) were retrospectively collected and used to simulate patients' baseline. Then, the effects of VAD implantation were simulated. Additionally, both the effects of ventricular assistance and cavopulmonary assistance were simulated in different pathologic conditions on Fontan patients, including systolic dysfunction, diastolic dysfunction, and pulmonary vascular resistance increment. The model can reproduce patients' baseline well. Simulation results suggest that the implantation of VAD: (i) increases the cardiac output (CO) in all the three palliation conditions (Norwood 77.2%, Glenn 38.6%, and Fontan 17.2%); (ii) decreases the SV external work (SVEW) (Norwood 55%, Glenn 35.6%, and Fontan 41%); (iii) increases the mean pulmonary arterial pressure (Pap) (Norwood 39.7%, Glenn 12.1%, and Fontan 3%). In Fontan circulation, with systolic dysfunction, the left VAD (LVAD) increases CO (35%), while the right VAD (RVAD) determines a decrement of inferior vena cava pressure (Pvci) (39%) with 34% increment of CO. With diastolic dysfunction, the LVAD increases CO (42%) and the RVAD decreases the Pvci. With pulmonary vascular resistance increment, the RVAD allows the highest CO (50%) increment with the highest decrement of Pvci (53%). The single-ventricle external work (SVEW) increases (decreases) with increasing VAD speed in cavopulmonary (ventricular) assistance. Numeric models could be helpful in this challenging and innovative field to support patient and VAD selection to optimize the clinical outcome and personalize the therapy. Copyright © 2015 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
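
    A minimal lumped-parameter ("electrical analogue") sketch in the spirit of the model class named above: a single time-varying-elastance ventricle ejects through a valve into a two-element Windkessel systemic circulation, and the simulated cardiac output is reported. All parameter values are generic textbook-style assumptions, not the patient data or the full single-ventricle/VAD model of the study.

```python
# Time-varying-elastance ventricle + two-element Windkessel, integrated with
# a simple explicit Euler loop. Units: pressure in mmHg, volume in ml, time in s.
import numpy as np

HR = 100.0                         # heart rate (beats/min), assumed
T = 60.0 / HR                      # cardiac period (s)
Emax, Emin, V0 = 2.0, 0.06, 10.0   # elastance limits (mmHg/ml), unstressed volume (ml)
R_in, R_ao, Rs = 0.05, 0.02, 1.2   # filling, valve, systemic resistances (mmHg*s/ml)
Ca, Pv = 1.5, 8.0                  # arterial compliance (ml/mmHg), venous pressure (mmHg)

def elastance(t):
    """Half-sine activation over the first 40% of each beat."""
    phase = (t % T) / T
    act = np.sin(np.pi * phase / 0.4) if phase < 0.4 else 0.0
    return Emin + (Emax - Emin) * act

dt, n_beats = 1e-4, 10
V, Pa = 120.0, 70.0                # initial ventricular volume and arterial pressure
q_sys = []
for i in range(int(n_beats * T / dt)):
    t = i * dt
    Plv = elastance(t) * (V - V0)
    q_ao = max(Plv - Pa, 0.0) / R_ao     # ejection through the valve (one-way)
    q_in = max(Pv - Plv, 0.0) / R_in     # ventricular filling (one-way)
    q_out = (Pa - Pv) / Rs               # systemic flow
    V += dt * (q_in - q_ao)
    Pa += dt * (q_ao - q_out) / Ca
    if t >= (n_beats - 1) * T:           # record only the last beat
        q_sys.append(q_out)

co = np.mean(q_sys) * 60.0 / 1000.0      # cardiac output in l/min
print(f"simulated cardiac output ~ {co:.2f} l/min")
```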

  13. Haemodynamic effects of adrenaline (epinephrine) depend on chest compression quality during cardiopulmonary resuscitation in pigs.

    PubMed

    Pytte, Morten; Kramer-Johansen, Jo; Eilevstjønn, Joar; Eriksen, Morten; Strømme, Taevje A; Godang, Kristin; Wik, Lars; Steen, Petter Andreas; Sunde, Kjetil

    2006-12-01

    Adrenaline (epinephrine) is used during cardiopulmonary resuscitation (CPR) based on animal experiments without supportive clinical data. Clinically CPR was reported recently to have much poorer quality than expected from international guidelines and what is generally done in laboratory experiments. We have studied the haemodynamic effects of adrenaline during CPR with good laboratory quality and with quality simulating clinical findings and the feasibility of monitoring these effects through VF waveform analysis. After 4 min of cardiac arrest, followed by 4 min of basic life support, 14 pigs were randomised to ClinicalCPR (intermittent manual chest compressions, compression-to-ventilation ratio 15:2, compression depth 30-38 mm) or LabCPR (continuous mechanical chest compressions, 12 ventilations/min, compression depth 45 mm). Adrenaline 0.02 mg/kg was administered 30 s thereafter. Plasma adrenaline concentration peaked earlier with LabCPR than with ClinicalCPR, median (range), 90 (30, 150) versus 150 (90, 270) s (p = 0.007), respectively. Coronary perfusion pressure (CPP) and cortical cerebral blood flow (CCBF) increased and femoral blood flow (FBF) decreased after adrenaline during LabCPR (mean differences (95% CI) CPP 17 (6, 29) mmHg (p = 0.01), FBF -5.0 (-8.8, -1.2) ml min(-1) (p = 0.02) and median difference CCBF 12% of baseline (p = 0.04)). There were no significant effects during ClinicalCPR (mean differences (95% CI) CPP 4.7 (-3.2, 13) mmHg (p = 0.2), FBF -0.2 (-4.6, 4.2) ml min(-1)(p = 0.9) and CCBF 3.6 (-1.8, 9.0)% of baseline (p = 0.15)). Slope VF waveform analysis reflected changes in CPP. Adrenaline improved haemodynamics during laboratory quality CPR in pigs, but not with quality simulating clinically reported CPR performance.

  14. Psychomotor testing predicts rate of skill acquisition for proficiency-based laparoscopic skills training.

    PubMed

    Stefanidis, Dimitrios; Korndorffer, James R; Black, F William; Dunne, J Bruce; Sierra, Rafael; Touchard, Cheri L; Rice, David A; Markert, Ronald J; Kastl, Peter R; Scott, Daniel J

    2006-08-01

    Laparoscopic simulator training translates into improved operative performance. Proficiency-based curricula maximize efficiency by tailoring training to meet the needs of each individual; however, because rates of skill acquisition vary widely, such curricula may be difficult to implement. We hypothesized that psychomotor testing would predict baseline performance and training duration in a proficiency-based laparoscopic simulator curriculum. Residents (R1, n = 20) were enrolled in an IRB-approved prospective study at the beginning of the academic year. All completed the following: a background information survey, a battery of 12 innate ability measures (5 motor, and 7 visual-spatial), and baseline testing on 3 validated simulators (5 videotrainer [VT] tasks, 12 virtual reality [minimally invasive surgical trainer-virtual reality, MIST-VR] tasks, and 2 laparoscopic camera navigation [LCN] tasks). Participants trained to proficiency, and training duration and number of repetitions were recorded. Baseline test scores were correlated to skill acquisition rate. Cutoff scores for each predictive test were calculated based on a receiver operator curve, and their sensitivity and specificity were determined in identifying slow learners. Only the Cards Rotation test correlated with baseline simulator ability on VT and LCN. Curriculum implementation required 347 man-hours (6-person team) and 795,000 dollars of capital equipment. With an attendance rate of 75%, 19 of 20 residents (95%) completed the curriculum by the end of the academic year. To complete training, a median of 12 hours (range, 5.5-21), and 325 repetitions (range, 171-782) were required. Simulator score improvement was 50%. Training duration and repetitions correlated with prior video game and billiard exposure, grooved pegboard, finger tap, map planning, Rey Figure Immediate Recall score, and baseline performance on VT and LCN. The map planning cutoff score proved most specific in identifying slow learners. Proficiency-based laparoscopic simulator training provides improvement in performance and can be effectively implemented as a routine part of resident education, but may require significant resources. Although psychomotor testing may be of limited value in the prediction of baseline laparoscopic performance, its importance may lie in the prediction of the rapidity of skill acquisition. These tests may be useful in optimizing curricular design by allowing the tailoring of training to individual needs.
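
    A simple sketch of the cutoff-score procedure mentioned above: candidate cutoffs on a predictive test score are swept, sensitivity and specificity for identifying "slow learners" are computed at each cutoff, and the cutoff with the largest Youden index (sensitivity + specificity - 1) is kept. The synthetic scores, the hypothetical test, and the definition of "slow" below are assumptions for illustration only.

```python
# Sweep cutoffs on a predictive test score and report the most informative one.
import numpy as np

rng = np.random.default_rng(6)
hours = rng.gamma(shape=4.0, scale=3.0, size=20)         # training duration (h), synthetic
slow = hours > np.median(hours)                          # "slow learner" label (assumed split)
# Hypothetical psychomotor test where lower scores go with longer training.
score = 100.0 - 2.0 * hours + rng.normal(0.0, 5.0, 20)

best = None
for cutoff in np.sort(score):
    pred_slow = score <= cutoff
    sens = np.mean(pred_slow[slow])                      # true positive rate
    spec = np.mean(~pred_slow[~slow])                    # true negative rate
    j = sens + spec - 1.0
    if best is None or j > best[0]:
        best = (j, cutoff, sens, spec)

j, cutoff, sens, spec = best
print(f"cutoff = {cutoff:.1f}: sensitivity = {sens:.2f}, "
      f"specificity = {spec:.2f}, Youden J = {j:.2f}")
```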

  15. DEEP UNDERGROUND NEUTRINO EXPERIMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Robert J.

    2016-03-03

    The Deep Underground Neutrino Experiment (DUNE) collaboration will perform an experiment centered on accelerator-based long-baseline neutrino studies along with nucleon decay and topics in neutrino astrophysics. It will consist of a modular 40-kt (fiducial) mass liquid argon TPC detector located deep underground at the Sanford Underground Research Facility in South Dakota and a high-resolution near detector at Fermilab in Illinois. This configuration provides a 1300-km baseline in a megawatt-scale neutrino beam provided by the Fermilab-hosted international Long-Baseline Neutrino Facility.

  16. Development of dual PZT transducers for reference-free crack detection in thin plate structures.

    PubMed

    Sohn, Hoon; Kim, Seuno Bum

    2010-01-01

    A new Lamb-wave-based nondestructive testing (NDT) technique, which does not rely on previously stored baseline data, is developed for crack monitoring in plate structures. Commonly, the presence of damage is identified by comparing "current data" measured from a potentially damaged stage of a structure with "baseline data" previously obtained at the intact condition of the structure. In practice, structural defects typically take place long after collection of the baseline data, and the baseline data can also be affected by external loading, temperature variations, and changing boundary conditions. To eliminate the dependence on the baseline data comparison, the authors previously developed a reference-free NDT technique using 2 pairs of collocated lead zirconate titanate (PZT) transducers placed on both sides of a plate. This reference-free technique is further advanced in the present study so that transducers need to be attached to only a single surface of a structure, as required for certain applications such as aircraft. To achieve this goal, a new design of PZT transducers called dual PZT transducers is proposed. Crack formation creates Lamb wave mode conversion due to a sudden thickness change of the structure. This crack appearance is instantly detected from the measured Lamb wave signals using the dual PZT transducers. This study also suggests a reference-free statistical approach that enables damage classification using only the currently measured data set. Numerical simulations and experiments were conducted using an aluminum plate with uniform thickness and fundamental Lamb wave modes to demonstrate the applicability of the proposed technique to reference-free crack detection.

  17. Measuring the sterile neutrino CP phase at DUNE and T2HK

    NASA Astrophysics Data System (ADS)

    Choubey, Sandhya; Dutta, Debajyoti; Pramanik, Dipyaman

    2018-04-01

    The CP phases associated with the sterile neutrino cannot be measured in the dedicated short-baseline experiments being built to test the sterile neutrino hypothesis. On the other hand, these phases can be measured in long-baseline experiments, even though the main goal of these experiments is not to test or measure sterile neutrino parameters. In particular, the sterile neutrino phase δ24 affects the charged-current electron appearance data in long-baseline experiments. In this paper we show how well the sterile neutrino phase δ24 can be measured by the next-generation long-baseline experiments DUNE, T2HK (and T2HKK). We also show the expected precision with which this sterile phase can be measured by combining the DUNE data with data from T2HK or T2HKK. The T2HK experiment is seen to be able to measure the sterile phase δ24 to a reasonable precision. We also present the sensitivity of these experiments to the sterile mixing angles, both by themselves, as well as when DUNE is combined with T2HK or T2HKK.

  18. VLBI geodesy - 2 parts-per-billion precision in length determinations for transcontinental baselines

    NASA Technical Reports Server (NTRS)

    Davis, J. L.; Herring, T. A.; Shapiro, I. I.

    1988-01-01

    VLBI was used to make twenty-two independent measurements, between September 1984 and December 1986, of the length of the 3900-km baseline between the Mojave site in California and the Haystack/Westford site in Massachusetts. These experiments differ from the typical geodetic VLBI experiments in that a large fraction of observations is obtained at elevation angles between 4 and 10 deg. Data from these low elevation angles allow the vertical coordinate of site position, and hence the baseline length, to be estimated with greater precision. For the sixteen experiments processed thus far, the weighted root-mean-square scatter of the estimates of the baseline length is 8 mm.
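
    A small sketch of the repeatability statistic quoted above: the weighted mean and weighted root-mean-square (WRMS) scatter of repeated baseline-length estimates, expressed also in parts per billion of the baseline length. The sixteen values below are synthetic stand-ins for a 3900-km baseline with roughly 8 mm scatter, not the actual Mojave-Haystack/Westford results.

```python
# Weighted mean and WRMS scatter of repeated baseline-length estimates.
import numpy as np

rng = np.random.default_rng(3)
L0 = 3_900_000_000.0                           # nominal baseline length (mm)
lengths = L0 + rng.normal(0.0, 8.0, 16)        # synthetic length estimates (mm)
sigmas = rng.uniform(5.0, 12.0, 16)            # assumed formal errors (mm)

w = 1.0 / sigmas**2
mean = np.sum(w * lengths) / np.sum(w)
wrms = np.sqrt(np.sum(w * (lengths - mean)**2) / np.sum(w))

print(f"weighted mean length : {mean * 1e-6:.6f} km")
print(f"WRMS scatter         : {wrms:.1f} mm "
      f"({wrms / mean * 1e9:.1f} parts per billion)")
```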

  19. Numerical simulation and optimal design of Segmented Planar Imaging Detector for Electro-Optical Reconnaissance

    NASA Astrophysics Data System (ADS)

    Chu, Qiuhui; Shen, Yijie; Yuan, Meng; Gong, Mali

    2017-12-01

    Segmented Planar Imaging Detector for Electro-Optical Reconnaissance (SPIDER) is a cutting-edge electro-optical imaging technology to realize miniaturization and complanation of imaging systems. In this paper, the principle of SPIDER has been numerically demonstrated based on the partially coherent light theory, and a novel concept of adjustable baseline pairing SPIDER system has further been proposed. Based on the results of simulation, it is verified that the imaging quality could be effectively improved by adjusting the Nyquist sampling density, optimizing the baseline pairing method and increasing the spectral channel of demultiplexer. Therefore, an adjustable baseline pairing algorithm is established for further enhancing the image quality, and the optimal design procedure in SPIDER for arbitrary targets is also summarized. The SPIDER system with adjustable baseline pairing method can broaden its application and reduce cost under the same imaging quality.

  20. Dust extinction in the first galaxies

    NASA Astrophysics Data System (ADS)

    Jaacks, Jason; Finkelstein, Steven L.; Bromm, Volker

    2018-04-01

    Using cosmological volume simulations and a custom-built sub-grid model for Population III (Pop III) star formation, we examine the baseline dust extinction in the first galaxies due to Pop III metal enrichment in the first billion years of cosmic history. We find that although the most enriched, high-density lines of sight in primordial galaxies can experience a measurable amount of extinction from Pop III dust [E(B - V)_max = 0.07, A_V,max ≈ 0.28], the average extinction is very low, with E(B - V) ≲ 10^-3. We derive a power-law relationship between dark matter halo mass and extinction of E(B - V) ∝ M_halo^0.80. Performing a Monte Carlo parameter study, we establish the baseline reddening of the ultraviolet spectra of dwarf galaxies at high redshift due to Pop III enrichment only. With this method, we find <β_UV> = -2.51 ± 0.07, which is nearly independent of both halo mass and redshift.
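
    A tiny sketch evaluating the scaling relation reported above, E(B - V) ∝ M_halo^0.80, for a few halo masses. The normalisation is an assumed placeholder chosen only so that the most massive example lands near the quoted maximum E(B - V) ≈ 0.07, and the A_V conversion simply reuses the ratio implied by the quoted maxima; neither is a fit from the paper.

```python
# Evaluate the quoted power-law reddening relation with an assumed normalisation.
import numpy as np

alpha = 0.80                       # power-law slope quoted in the abstract
M_ref, E_ref = 1e10, 0.07          # assumed anchor point (Msun, mag) - placeholder only
AV_RATIO = 0.28 / 0.07             # A_V / E(B-V) ratio implied by the quoted maxima

def ebv(m_halo):
    """Reddening from the quoted power law, with the assumed normalisation."""
    return E_ref * (m_halo / M_ref) ** alpha

for m in (1e8, 1e9, 1e10):
    e = ebv(m)
    print(f"M_halo = {m:.0e} Msun -> E(B-V) ~ {e:.4f} mag, A_V ~ {AV_RATIO * e:.3f} mag")
```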

  1. Flight Test Evaluation of Synthetic Vision Concepts at a Terrain Challenged Airport

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Prince, Lawrence J., III; Bailey, Randell E.; Arthur, Jarvis J., III; Parrish, Russell V.

    2004-01-01

    NASA's Synthetic Vision Systems (SVS) Project is striving to eliminate poor visibility as a causal factor in aircraft accidents as well as enhance operational capabilities of all aircraft through the display of computer generated imagery derived from an onboard database of terrain, obstacle, and airport information. To achieve these objectives, NASA 757 flight test research was conducted at the Eagle-Vail, Colorado airport to evaluate three SVS display types (Head-up Display, Head-Down Size A, Head-Down Size X) and two terrain texture methods (photo-realistic, generic) in comparison to the simulated Baseline Boeing-757 Electronic Attitude Direction Indicator and Navigation/Terrain Awareness and Warning System displays. The results of the experiment showed significantly improved situation awareness, performance, and workload for SVS concepts compared to the Baseline displays and confirmed the retrofit capability of the Head-Up Display and Size A SVS concepts. The research also demonstrated that the tunnel guidance display concept used within the SVS concepts achieved required navigation performance (RNP) criteria.

  2. A Mesh Refinement Study on the Impact Response of a Shuttle Leading-Edge Panel Finite Element Simulation

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Jackson, Karen E.; Lyle, Karen H.; Spellman, Regina L.

    2006-01-01

    A study was performed to examine the influence of varying mesh density on an LS-DYNA simulation of a rectangular-shaped foam projectile impacting the space shuttle leading edge Panel 6. The shuttle leading-edge panels are fabricated of reinforced carbon-carbon (RCC) material. During the study, nine cases were executed with all possible combinations of coarse, baseline, and fine meshes of the foam and panel. For each simulation, the same material properties and impact conditions were specified and only the mesh density was varied. In the baseline model, the shell elements representing the RCC panel are approximately 0.2-in. on edge, whereas the foam elements are about 0.5-in. on edge. The element nominal edge-length for the baseline panel was halved to create a fine panel (0.1-in. edge length) mesh and doubled to create a coarse panel (0.4-in. edge length) mesh. In addition, the element nominal edge-length of the baseline foam projectile was halved (0.25-in. edge length) to create a fine foam mesh and doubled (1.0-in. edge length) to create a coarse foam mesh. The initial impact velocity of the foam was 775 ft/s. The simulations were executed in LS-DYNA for 6 ms of simulation time. Contour plots of resultant panel displacement and effective stress in the foam were compared at four discrete time intervals. Also, time-history responses of internal and kinetic energy of the panel, kinetic and hourglass energy of the foam, and resultant contact force were plotted to determine the influence of mesh density.

  3. Two Hours of Teamwork Training Improves Teamwork in Simulated Cardiopulmonary Arrest Events.

    PubMed

    Mahramus, Tara L; Penoyer, Daleen A; Waterval, Eugene M E; Sole, Mary L; Bowe, Eileen M

    2016-01-01

    Teamwork during cardiopulmonary arrest events is important for resuscitation. Teamwork improvement programs are usually lengthy. This study assessed the effectiveness of a 2-hour teamwork training program. A prospective, pretest/posttest, quasi-experimental design assessed the teamwork training program targeted to resident physicians, nurses, and respiratory therapists. Participants took part in a simulated cardiac arrest. After the simulation, participants and trained observers assessed perceptions of teamwork using the Team Emergency Assessment Measure (TEAM) tool (ratings of 0 [low] to 4 [high]). A debriefing and 45 minutes of teamwork education followed. Participants then took part in a second simulated cardiac arrest scenario. Afterward, participants and observers assessed teamwork. Seventy-three team members participated: resident physicians (25%), registered nurses (32%), and respiratory therapists (41%). The physicians had significantly less experience on code teams (P < .001). Baseline teamwork scores were 2.57 to 2.72. Participants' mean (SD) scores on the TEAM tool for the first and second simulations were 3.2 (0.5) and 3.7 (0.4), respectively (P < .001). Observers' mean (SD) TEAM scores for the first and second simulations were 3.0 (0.5) and 3.7 (0.3), respectively (P < .001). Program evaluations by participants were positive. A 2-hour simulation-based teamwork educational intervention resulted in improved perceptions of teamwork behaviors. Participants reported interactions with other disciplines, teamwork behavior education, and debriefing sessions were beneficial for enhancing the program.

  4. Towards oscillations-based simulation of social systems: a neurodynamic approach

    NASA Astrophysics Data System (ADS)

    Plikynas, Darius; Basinskas, Gytis; Laukaitis, Algirdas

    2015-04-01

    This multidisciplinary work presents a synopsis of theories in the search for common field-like fundamental principles of self-organisation and communication existing on quantum, cellular, and even social levels. Based on these fundamental principles, we formulate a conceptually novel social neuroscience paradigm (OSIMAS), which envisages social systems emerging from the coherent neurodynamical processes taking place in the individual mind-fields. In this way, societies are understood as global processes emerging from the superposition of the conscious and subconscious mind-fields of individual members of society. For the experimental validation of the biologically inspired OSIMAS paradigm, we have designed a framework of EEG-based experiments. Initial baseline individual tests of spectral cross-correlations of EEG-recorded brainwave patterns for some mental states have been provided in this paper. Preliminary experimental results do not refute the main OSIMAS postulates. This paper also provides some insights for the construction of OSIMAS-based simulation models.
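
    A rough sketch of a spectral cross-correlation style comparison between two EEG-like recordings, loosely related to the baseline tests mentioned above: two synthetic channels share a weak 10 Hz (alpha-band) component on top of independent noise, and their magnitude-squared coherence is estimated. The sampling rate, band, and amplitudes are assumptions for illustration; this is not the OSIMAS experimental pipeline.

```python
# Magnitude-squared coherence between two synthetic EEG-like channels.
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(7)
fs, dur = 250.0, 60.0                            # EEG-like sampling rate (Hz), duration (s)
t = np.arange(0, dur, 1.0 / fs)
shared = 0.5 * np.sin(2.0 * np.pi * 10.0 * t)    # common alpha-band rhythm

ch1 = shared + rng.normal(0.0, 1.0, t.size)      # channel 1: shared rhythm + own noise
ch2 = shared + rng.normal(0.0, 1.0, t.size)      # channel 2: shared rhythm + own noise

f, cxy = coherence(ch1, ch2, fs=fs, nperseg=1024)
band = (f >= 8.0) & (f <= 12.0)
print(f"mean coherence in 8-12 Hz band : {cxy[band].mean():.2f}")
print(f"mean coherence elsewhere       : {cxy[~band].mean():.2f}")
```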

  5. An evaluation of the Johnson-Cook model to simulate puncture of 7075 aluminum plates.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corona, Edmundo; Orient, George Edgar

    The objective of this project was to evaluate the use of the Johnson-Cook strength and failure models in an adiabatic finite element model to simulate the puncture of 7075-T651 aluminum plates that were studied as part of an ASC L2 milestone by Corona et al. (2012). The Johnson-Cook model parameters were determined from material test data. The results show a marked improvement, in particular in the calculated threshold velocity between no puncture and puncture, over those obtained in 2012. The threshold velocity calculated using a baseline model is just 4% higher than the mean value determined from experiment, in contrast to 60% in the 2012 predictions. Sensitivity studies showed that the threshold velocity predictions were improved by calibrating the relations between the equivalent plastic strain at failure and stress triaxiality, strain rate and temperature, as well as by the inclusion of adiabatic heating.
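
    A sketch of the Johnson-Cook strength model named above: flow stress as the product of strain-hardening, strain-rate, and thermal-softening terms. The parameter values are representative literature-style numbers for a 7075-type aluminium alloy, assumed here for illustration; they are not the calibrated values used in the report, and the failure (damage) model is omitted.

```python
# Johnson-Cook flow stress: (A + B*eps^n) * (1 + C*ln(rate*)) * (1 - T*^m).
import math

def jc_flow_stress(eps_p, eps_rate, T,
                   A=520e6, B=477e6, n=0.52,      # strain hardening (Pa, -), assumed values
                   C=0.025, eps_rate0=1.0,        # rate sensitivity, reference rate (1/s)
                   m=1.0, T_room=293.0, T_melt=893.0):
    """Johnson-Cook flow stress sigma(eps_p, eps_rate, T) in Pa."""
    rate_term = 1.0 + C * math.log(max(eps_rate / eps_rate0, 1e-12))
    T_star = min(max((T - T_room) / (T_melt - T_room), 0.0), 1.0)
    return (A + B * eps_p**n) * rate_term * (1.0 - T_star**m)

for eps in (0.0, 0.05, 0.10):
    s = jc_flow_stress(eps, eps_rate=1e3, T=400.0)
    print(f"eps_p = {eps:.2f}: sigma ~ {s / 1e6:.0f} MPa")
```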

  6. 1-D Photochemical Modeling of the Martian Atmosphere: Seasonal Variations

    NASA Astrophysics Data System (ADS)

    Boxe, C.; Emmanuel, S.; Hafsa, U.; Griffith, E.; Moore, J.; Tam, J.; Khan, I.; Cai, Z.; Bocolod, B.; Zhao, J.; Ahsan, S.; Tang, N.; Bartholomew, J.; Rafi, R.; Caltenco, K.; Smith, K.; Rivas, M.; Ditta, H.; Alawlaqi, H.; Rowley, N.; Khatim, F.; Ketema, N.; Strothers, J.; Diallo, I.; Owens, C.; Radosavljevic, J.; Austin, S. A.; Johnson, L. P.; Zavala-Gutierrez, R.; Breary, N.; Saint-Hilaire, D.; Skeete, D.; Stock, J.; Blue, S.; Gurung, D.; Salako, O.

    2016-12-01

    High school and undergraduate students, representative of academic institutions throughout the USA's Tri-State Area (New York, New Jersey, Connecticut), utilize Caltech/JPL's one-dimensional atmospheric photochemical models. These sophisticated models were built over the course of the last four decades and describe all planetary bodies in our Solar System and selected extrasolar planets. Specifically, students employed the Martian one-dimensional photochemical model to assess the seasonal variability of molecules in its atmosphere. Students learned the overall model construct, ran a baseline simulation, and varied parameters (e.g., obliquity, orbital eccentricity) that affect the incoming solar radiation on Mars and the temperature and pressure changes induced by seasonal variations. Students also attained a 'real-world' experience that exemplifies the level of coding competency and innovativeness needed for building an environment that can simulate observations and forecast. Such skills permeate STEM-related occupations that model systems and/or predict how those systems may or will behave.

  7. A novel baseline-correction method for standard addition based derivative spectra and its application to quantitative analysis of benzo(a)pyrene in vegetable oil samples.

    PubMed

    Li, Na; Li, Xiu-Ying; Zou, Zhe-Xiang; Lin, Li-Rong; Li, Yao-Qun

    2011-07-07

    In the present work, a baseline-correction method based on peak-to-derivative baseline measurement was proposed for the elimination of complex matrix interference that was mainly caused by unknown components and/or background in the analysis of derivative spectra. This novel method was applicable particularly when the matrix interfering components showed a broad spectral band, which was common in practical analysis. The derivative baseline was established by connecting two crossing points of the spectral curves obtained with a standard addition method (SAM). The applicability and reliability of the proposed method was demonstrated through both theoretical simulation and practical application. Firstly, Gaussian bands were used to simulate 'interfering' and 'analyte' bands to investigate the effect of different parameters of interfering band on the derivative baseline. This simulation analysis verified that the accuracy of the proposed method was remarkably better than other conventional methods such as peak-to-zero, tangent, and peak-to-peak measurements. Then the above proposed baseline-correction method was applied to the determination of benzo(a)pyrene (BaP) in vegetable oil samples by second-derivative synchronous fluorescence spectroscopy. The satisfactory results were obtained by using this new method to analyze a certified reference material (coconut oil, BCR(®)-458) with a relative error of -3.2% from the certified BaP concentration. Potentially, the proposed method can be applied to various types of derivative spectra in different fields such as UV-visible absorption spectroscopy, fluorescence spectroscopy and infrared spectroscopy.
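
    A generic illustration of why derivative spectra help in this setting: a narrow analyte band sitting on a much broader interfering band is simulated, and the second derivative suppresses the broad background far more strongly than the narrow peak. This sketch does not reproduce the paper's crossing-point derivative-baseline construction or the standard addition method; band positions, widths, and heights are arbitrary assumptions.

```python
# Second derivative of a simulated spectrum: broad interference is suppressed.
import numpy as np

x = np.linspace(350.0, 450.0, 2001)                        # wavelength grid (nm)

def gauss(x, center, width, height):
    return height * np.exp(-0.5 * ((x - center) / width) ** 2)

analyte = gauss(x, 400.0, 2.0, 1.0)                        # narrow analyte band
interferent = gauss(x, 405.0, 25.0, 3.0)                   # broad matrix band
spectrum = analyte + interferent

d2 = np.gradient(np.gradient(spectrum, x), x)              # second derivative of the sum
d2_interf = np.gradient(np.gradient(interferent, x), x)    # second derivative of matrix only

i_pk = np.argmin(d2)                                       # analyte minimum in the derivative
print(f"raw spectrum at peak   : analyte {analyte[i_pk]:.2f}, "
      f"interferent {interferent[i_pk]:.2f}")
print(f"2nd derivative at peak : total {d2[i_pk]:.4f}, "
      f"interferent only {d2_interf[i_pk]:.4f}")
```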

  8. Noise Simulations of the High-Lift Common Research Model

    NASA Technical Reports Server (NTRS)

    Lockard, David P.; Choudhari, Meelan M.; Vatsa, Veer N.; O'Connell, Matthew D.; Duda, Benjamin; Fares, Ehab

    2017-01-01

    The PowerFLOW(TradeMark) code has been used to perform numerical simulations of the high-lift version of the Common Research Model (HL-CRM) that will be used for experimental testing of airframe noise. Time-averaged surface pressure results from PowerFLOW(TradeMark) are found to be in reasonable agreement with those from steady-state computations using FUN3D. Surface pressure fluctuations are highest around the slat break and nacelle/pylon region, and synthetic array beamforming results also indicate that this region is the dominant noise source on the model. The gap between the slat and pylon on the HL-CRM is not realistic for modern aircraft, and most nacelles include a chine that is absent in the baseline model. To account for those effects, additional simulations were completed with a chine and with the slat extended into the pylon. The case with the chine was nearly identical to the baseline, and the slat extension resulted in higher surface pressure fluctuations but slightly reduced radiated noise. The full-span slat geometry without the nacelle/pylon was also simulated and found to be around 10 dB quieter than the baseline over almost the entire frequency range. The current simulations are still considered preliminary as changes in the radiated acoustics are still being observed with grid refinement, and additional simulations with finer grids are planned.

  9. Transient regional climate change: analysis of the summer climate response in a high-resolution, century-scale, ensemble experiment over the continental United States

    PubMed Central

    Diffenbaugh, Noah S.; Ashfaq, Moetasim; Scherer, Martin

    2013-01-01

    Integrating the potential for climate change impacts into policy and planning decisions requires quantification of the emergence of sub-regional climate changes that could occur in response to transient changes in global radiative forcing. Here we report results from a high-resolution, century-scale, ensemble simulation of climate in the United States, forced by atmospheric constituent concentrations from the Special Report on Emissions Scenarios (SRES) A1B scenario. We find that 21st century summer warming permanently emerges beyond the baseline decadal-scale variability prior to 2020 over most areas of the continental U.S. Permanent emergence beyond the baseline annual-scale variability shows much greater spatial heterogeneity, with emergence occurring prior to 2030 over areas of the southwestern U.S., but not prior to the end of the 21st century over much of the southcentral and southeastern U.S. The pattern of emergence of robust summer warming contrasts with the pattern of summer warming magnitude, which is greatest over the central U.S. and smallest over the western U.S. In addition to stronger warming, the central U.S. also exhibits stronger coupling of changes in surface air temperature, precipitation, and moisture and energy fluxes, along with changes in atmospheric circulation towards increased anticyclonic anomalies in the mid-troposphere and a poleward shift in the mid-latitude jet aloft. However, as a fraction of the baseline variability, the transient warming over the central U.S. is smaller than the warming over the southwestern or northeastern U.S., delaying the emergence of the warming signal over the central U.S. Our comparisons with observations and the Coupled Model Intercomparison Project Phase 3 (CMIP3) ensemble of global climate model experiments suggest that near-term global warming is likely to cause robust sub-regional-scale warming over areas that exhibit relatively little baseline variability. In contrast, where there is greater variability in the baseline climate dynamics, there can be greater variability in the response to elevated greenhouse forcing, decreasing the robustness of the transient warming signal. PMID:24307747
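
    A schematic sketch of the "permanent emergence" diagnostic described above: the decade after which the decadal-mean warming stays above the baseline decadal-scale variability (taken here as two standard deviations of baseline decadal means). The synthetic temperature series (a linear trend plus noise) and the threshold choice are illustrative assumptions, not the ensemble output analysed in the study.

```python
# Time of "permanent emergence" of a warming signal beyond baseline decadal variability.
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(1950, 2100)
trend = np.where(years > 2000, 0.035 * (years - 2000), 0.0)    # forced warming (degC), assumed
temps = trend + rng.normal(0.0, 0.4, years.size)               # annual anomalies (degC)

def decadal_means(y, t):
    """Means over consecutive complete 10-year blocks starting at y[0]."""
    n_dec = y.size // 10
    return y[0] + 10 * np.arange(n_dec), t[:n_dec * 10].reshape(n_dec, 10).mean(axis=1)

base = (years >= 1950) & (years < 2000)
_, base_dec = decadal_means(years[base], temps[base])
threshold = base_dec.mean() + 2.0 * base_dec.std()             # baseline decadal variability

dec_years, dec_means = decadal_means(years, temps)
above = dec_means > threshold
# "Permanent" emergence: first decade after which every later decade stays above.
emerged = [int(y) for i, y in enumerate(dec_years) if above[i:].all()]
when = f"{emerged[0]}s" if emerged else f"not before {years[-1]}"
print(f"threshold = {threshold:+.2f} degC; permanent emergence: {when}")
```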

  10. [Status of the water-soluble component of the antioxidant defense system in the conditions of 520-day isolation].

    PubMed

    Morukov, B V; Popov, I N; Levin, G; Markin, A A; Zhuravleva, O A; Kuzichkin, D S

    2013-01-01

    In the 520-d chamber experiment within the international project Mars-500 blood samples of 6 male test-subjects of 28 to 39 years of age were analyzed for water-soluble antioxidants: total bilirubin and uric acid; in addition, total antioxidant capacity of blood plasma was determined. Maximal values of these parameters were associated with the most stressful periods of the experiment, i.e. adaptation to the life in isolation and confinement, simulation of the egress onto Martian surface, and change of the diet. On attainment of the homeostatic equilibrium the parameters stabilized on levels slightly lower relative to baseline (pre-isolation) values. Therefore, dynamics of the water-soluble antioxidants reflected adequately the homeostatic reactions to and compensation by organism of the effects of the 520-day life in isolation and confinement.

  11. An Interferometry Imaging Beauty Contest

    NASA Technical Reports Server (NTRS)

    Lawson, Peter R.; Cotton, William D.; Hummel, Christian A.; Monnier, John D.; Zhaod, Ming; Young, John S.; Thorsteinsson, Hrobjartur; Meimon, Serge C.; Mugnier, Laurent; LeBesnerais, Guy

    2004-01-01

    We present a formal comparison of the performance of algorithms used for synthesis imaging with optical/infrared long-baseline interferometers. Six different algorithms are evaluated based on their performance with simulated test data. Each set of test data is formatted in the interferometry Data Exchange Standard and is designed to simulate a specific problem relevant to long-baseline imaging. The data are calibrated power spectra and bispectra measured with a fictitious array, intended to be typical of existing imaging interferometers. The strengths and limitations of each algorithm are discussed.

  12. Baseline performance of solar collectors for NASA Langley solar building test facility

    NASA Technical Reports Server (NTRS)

    Knoll, R. H.; Johnson, S. M.

    1977-01-01

    The solar collector field contains seven collector designs. Before operation in the field, the experimental performances (thermal efficiencies) of the seven collector designs were measured in an indoor solar simulator. The resulting data provided a baseline for later comparison with actual field test data. The simulator test results are presented for the collectors as received, and after several weeks of outdoor exposure with no coolant (dry operation). Six of the seven collector designs tested showed substantial reductions in thermal efficiency after dry operation.

  13. U-10Mo Baseline Fuel Fabrication Process Description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hubbard, Lance R.; Arendt, Christina L.; Dye, Daniel F.

    This document provides a description of the U.S. High Power Research Reactor (USHPRR) low-enriched uranium (LEU) fuel fabrication process. This document is intended to be used in conjunction with the baseline process flow diagram (PFD) presented in Appendix A. The baseline PFD is used to document the fabrication process, communicate gaps in technology or manufacturing capabilities, convey alternatives under consideration, and as the basis for a dynamic simulation model of the fabrication process. The simulation model allows for the assessment of production rates, costs, and manufacturing requirements (manpower, fabrication space, numbers and types of equipment, etc.) throughout the lifecycle of the USHPRR program. This document, along with the accompanying PFD, is updated regularly.

  14. Technology research for strapdown inertial experiment and digital flight control and guidance

    NASA Technical Reports Server (NTRS)

    Carestia, R. A.; Cottrell, D. E.

    1985-01-01

    A helicopter flight-test program to evaluate the performance of Honeywell's Tetrad, a strapdown, laser gyro, inertial navigation system, is discussed. The results of 34 flights showed a mean final navigational velocity error of 5.06 knots, with a standard deviation of 3.84 knots; a corresponding mean final position error of 2.66 n.mi., with a standard deviation of 1.48 n.mi.; and a modeled mean-position-error growth rate for the 34 tests of 1.96 knots, with a standard deviation of 1.09 knots. Tetrad's four ring laser gyros provided reliable and accurate angular rate sensing during the test program, and no sensor failures were detected during the evaluation. Criteria suitable for investigating cockpit systems in rotorcraft were developed. These criteria led to the development of two basic simulators. The first was a standard simulator which could be used to obtain baseline information for studying pilot workload and interactions. The second was an advanced simulator which integrated the RODAAS developed by Honeywell. The second area also included surveying the aerospace industry to determine the level of use and impact of microcomputers and related components on avionics systems.

  15. Neutrino Factory Targets and the MICE Beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walaron, Kenneth Andrew

    2007-01-01

    The future of particle physics in the next 30 years must include detailed study of neutrinos. The first proof of physics beyond the Standard Model of particle physics is evident in results from recent neutrino experiments which imply that neutrinos have mass and flavour mixing. The Neutrino Factory is the leading contender to measure precisely the neutrino mixing parameters to probe beyond the Standard Model physics. Significantly, one must look to measure the mixing angle θ13 and investigate the possibility of leptonic CP violation. If found, this may provide a key insight into the origins of the matter/anti-matter asymmetry seen in the universe, through the mechanism of leptogenesis. The Neutrino Factory will be a large international multi-billion dollar experiment combining novel new accelerator and long-baseline detector technology. Arguably the most important and costly features of this facility are the proton driver and cooling channel. This thesis will present simulation work focused on determining the optimal proton driver energy to maximise pion production and also simulation of the transport of this pion flux through some candidate transport lattices. Bench-marking of pion cross-sections calculated by MARS and GEANT4 codes to measured data from the HARP experiment is also presented. The cooling channel aims to reduce the phase-space volume of the decayed muon beam to a level that can be efficiently injected into the accelerator system. The Muon Ionisation Cooling Experiment (MICE) hosted by the Rutherford Appleton laboratory, UK is a proof-of-principle experiment aimed at measuring ionisation cooling. The experiment will run parasitically to the ISIS accelerator and will produce muons from pion decay. The MICE beamline provides muon beams of variable emittance and momentum to the MICE experiment to enable measurement of cooling over a wide range of beam conditions. Simulation work in the design of this beamline is presented in this thesis as are results from an experiment to estimate the flux from the target into the beamline acceptance.

  16. Constraining Polarized Foregrounds for EoR Experiments. II. Polarization Leakage Simulations in the Avoidance Scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nunhokee, C. D.; Bernardi, G.; Foster, G.

    A critical challenge in the observation of the redshifted 21 cm line is its separation from bright Galactic and extragalactic foregrounds. In particular, the instrumental leakage of polarized foregrounds, which undergo significant Faraday rotation as they propagate through the interstellar medium, may harmfully contaminate the 21 cm power spectrum. We develop a formalism to describe the leakage due to instrumental widefield effects in visibility-based power spectra measured with redundant arrays, extending the delay-spectrum approach presented in Parsons et al. We construct polarized sky models and propagate them through the instrument model to simulate realistic full-sky observations with the Precision Array to Probe the Epoch of Reionization. We find that the leakage due to a population of polarized point sources is expected to be higher than diffuse Galactic polarization at any k mode for a 30 m reference baseline. For the same reference baseline, a foreground-free window at k > 0.3 h Mpc⁻¹ can be defined in terms of leakage from diffuse Galactic polarization even under the most pessimistic assumptions. If measurements of polarized foreground power spectra or a model of polarized foregrounds are given, our method is able to predict the polarization leakage in actual 21 cm observations, potentially enabling its statistical subtraction from the measured 21 cm power spectrum.

  17. CAMx Ozone Source Attribution in the Eastern United States using Guidance from Observations during DISCOVER-AQ Maryland

    PubMed Central

    Goldberg, Daniel L.; Vinciguerra, Timothy P.; Anderson, Daniel C.; Hembeck, Linda; Canty, Timothy P.; Ehrman, Sheryl H.; Martins, Douglas K.; Stauffer, Ryan M.; Thompson, Anne M.; Salawitch, Ross J.; Dickerson, Russell R.

    2018-01-01

    A Comprehensive Air-Quality Model with Extensions (CAMx) version 6.10 simulation was assessed through comparison with data acquired during NASA’s 2011 DISCOVER-AQ Maryland field campaign. Comparisons for the baseline simulation (CB05 chemistry, EPA 2011 National Emissions Inventory) show a model overestimate of NOy by +86.2% and an underestimate of HCHO by −28.3%. We present a new model framework (CB6r2 chemistry, MEGAN v2.1 biogenic emissions, 50% reduction in mobile NOx, enhanced representation of isoprene nitrates) that better matches observations. The new model framework attributes 31.4% more surface ozone in Maryland to electric generating units (EGUs) and 34.6% less ozone to on-road mobile sources. Surface ozone becomes more NOx-limited throughout the eastern United States compared to the baseline simulation. The baseline model therefore likely underestimates the effectiveness of anthropogenic NOx reductions as well as the current contribution of EGUs to surface ozone. PMID:29618849

  18. CAMx Ozone Source Attribution in the Eastern United States using Guidance from Observations during DISCOVER-AQ Maryland.

    PubMed

    Goldberg, Daniel L; Vinciguerra, Timothy P; Anderson, Daniel C; Hembeck, Linda; Canty, Timothy P; Ehrman, Sheryl H; Martins, Douglas K; Stauffer, Ryan M; Thompson, Anne M; Salawitch, Ross J; Dickerson, Russell R

    2016-03-16

    A Comprehensive Air-Quality Model with Extensions (CAMx) version 6.10 simulation was assessed through comparison with data acquired during NASA's 2011 DISCOVER-AQ Maryland field campaign. Comparisons for the baseline simulation (CB05 chemistry, EPA 2011 National Emissions Inventory) show a model overestimate of NOy by +86.2% and an underestimate of HCHO by -28.3%. We present a new model framework (CB6r2 chemistry, MEGAN v2.1 biogenic emissions, 50% reduction in mobile NOx, enhanced representation of isoprene nitrates) that better matches observations. The new model framework attributes 31.4% more surface ozone in Maryland to electric generating units (EGUs) and 34.6% less ozone to on-road mobile sources. Surface ozone becomes more NOx-limited throughout the eastern United States compared to the baseline simulation. The baseline model therefore likely underestimates the effectiveness of anthropogenic NOx reductions as well as the current contribution of EGUs to surface ozone.

  19. Southern Ocean Open Ocean Polynyas in Observations and from a Low- and a High-Resolution Fully-Coupled Earth System Model Simulation

    NASA Astrophysics Data System (ADS)

    Veneziani, C.; Kurtakoti, P. K.; Weijer, W.; Stoessel, A.

    2016-12-01

    In contrast to their better known coastal counterpart, open ocean polynyas (OOPs) form through complex driving mechanisms, involving pre-conditioning of the water column, external forcing and internal ocean dynamics, and are therefore much more elusive and less predictable than coastal polynyas. Yet, their impact on bottom water formation and the Meridional Overturning Circulation could prove substantial. Here, we characterize the formation of Southern Ocean OOPs by analyzing the full satellite NASA microwave imager and radiometer (SSMI/SMMR) data record from 1972 to present day. We repeat the same analysis within the low-resolution (LR) and high-resolution (HR) fully-coupled Earth System Model simulations that are part of the Accelerated Climate Model for Energy (ACME) v0 baseline experiments. The focus is on two OOPs that are more consistently seen in observations: the Maud Rise and the Weddell Sea polynyas. Results show that the LR simulation is unable to reproduce any OOP over the 195 years of its duration, while both Maud Rise and Weddell Sea polynyas are seen in the HR simulation, with extents similar to observations'. We explore possible mechanisms that would explain the asymmetric behavior, including topographic processes, eddy shedding events, and different water column stratification between the two simulations.

  20. Creation and Assessment of a Bad News Delivery Simulation Curriculum for Pediatric Emergency Medicine Fellows.

    PubMed

    Chumpitazi, Corrie E; Rees, Chris A; Chumpitazi, Bruno P; Hsu, Deborah C; Doughty, Cara B; Lorin, Martin I

    2016-05-01

    Background  Bad news in the context of health care has been broadly defined as significant information that negatively alters people's perceptions of the present or future. Effectively delivering bad news (DBN) in the setting of the emergency department requires excellent communication skills. Evidence shows that bad news is frequently given inadequately. Studies show that trainees need to devote more time to developing this skill through formalized training. This program's objectives were to utilize trained standardized patients in a simulation setting to assist pediatric emergency medicine (PEM) fellows in the development of effective, sensitive, and compassionate communication with patients and family members when conveying bad news, and to recognize and respond to the patient/parent's reaction to such news. Methods PEM fellows participated in a novel curriculum utilizing simulated patients (SPs) acting as the patient's parent and immersive techniques in a realistic and supportive environment. A baseline survey was conducted to ascertain participant demographics and previous experience with simulation and DBN. Experienced, multi-disciplinary faculty participated in a training workshop with the SPs one week prior to course delivery. Three scenarios were developed for bad news delivery. Instructors watched via remote video feed while the fellows individually interacted with the SPs and then participated in a confidential debriefing. Fellows later joined for group debriefing. Fellow characteristics, experience, and self-perceived comfort pre/post-course were collected.   Results Baseline data demonstrated that 78% of fellows reported DBN two or more times per month. Ninety-three percent of fellows in this study were present during the delivery of news about the death of a child to a parent or family member in the six-month period preceding this course. Fellows' self-reported comfort level in DBN to a patient/family and dealing with patient and parent emotions improved significantly (p=0.034 and p=0.046, respectively). Conclusions Pediatric emergency medicine fellows frequently deliver bad news. A course using SPs was well received by trainees and resulted in improvement in self-assessed skills and comfort. This curriculum provides the opportunity for fellows to receive patient/parent feedback of their communication skills and observations from skilled instructors. This methodology should be considered when creating training curricula for bad news delivery skills.

  1. The X-IFU end-to-end simulations performed for the TES array optimization exercise

    NASA Astrophysics Data System (ADS)

    Peille, Philippe; Wilms, J.; Brand, T.; Cobo, B.; Ceballos, M. T.; Dauser, T.; Smith, S. J.; Barret, D.; den Herder, J. W.; Piro, L.; Barcons, X.; Pointecouteau, E.; Bandler, S.; den Hartog, R.; de Plaa, J.

    2015-09-01

    The focal plane assembly of the Athena X-ray Integral Field Unit (X-IFU) includes as the baseline an array of ~4000 single size calorimeters based on Transition Edge Sensors (TES). Other sensor array configurations could however be considered, combining TES of different properties (e.g. size). In attempting to improve the X-IFU performance in terms of field of view, count rate performance, and even spectral resolution, two alternative TES array configurations to the baseline have been simulated, each combining a small and a large pixel array. With the X-IFU end-to-end simulator, a sub-sample of the Athena core science goals, selected by the X-IFU science team as potentially driving the optimal TES array configuration, has been simulated for the results to be scientifically assessed and compared. In this contribution, we will describe the simulation set-up for the various array configurations, and highlight some of the results of the test cases simulated.

  2. Analysis of surface EMG baseline for detection of hidden muscle activity

    NASA Astrophysics Data System (ADS)

    Zhang, Xu; Zhou, Ping

    2014-02-01

    Objective. This study explored the feasibility of detecting hidden muscle activity in surface electromyogram (EMG) baseline. Approach. Power spectral density (PSD) analysis and multi-scale entropy (MSE) analysis were used. Both analyses were applied to computer simulations of surface EMG baseline with the presence (representing activity data) or absence (representing reference data) of hidden muscle activity, as well as surface electrode array EMG baseline recordings of healthy control and amyotrophic lateral sclerosis (ALS) subjects. Main results. Although the simulated reference data and the activity data yielded no distinguishable difference in the time domain, they demonstrated a significant difference in the frequency and signal complexity domains with the PSD and MSE analyses. For a comparison using pooled data, such a difference was also observed when the PSD and MSE analyses were applied to surface electrode array EMG baseline recordings of healthy control and ALS subjects, which demonstrated no distinguishable difference in the time domain. Compared with the PSD analysis, the MSE analysis appeared to be more sensitive for detecting the difference in surface EMG baselines between the two groups. Significance. The findings implied the presence of a hidden muscle activity in surface EMG baseline recordings from the ALS subjects. To promote the presented analysis as a useful diagnostic or investigatory tool, future studies are necessary to assess the pathophysiological nature or origins of the hidden muscle activity, as well as the baseline difference at the individual subject level.
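
    An illustrative sketch of the multi-scale entropy (MSE) analysis named above: the signal is coarse-grained at several scales and sample entropy is computed at each scale. A surrogate "reference" baseline (white noise) is compared with an "activity" baseline containing a small correlated component standing in for hidden muscle activity. The values of m and r, the scales, and the surrogate signals are common defaults assumed for illustration only, not the study's recordings or parameters.

```python
# Multi-scale entropy: coarse-grain the signal, then compute sample entropy.
import numpy as np

def _count_matches(x, m, r, n_templates):
    """Count template pairs of length m within Chebyshev distance r."""
    templates = np.array([x[i:i + m] for i in range(n_templates)])
    count = 0
    for i in range(n_templates - 1):
        d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
        count += int(np.sum(d <= r))
    return count

def sample_entropy(x, m=2, r_factor=0.15):
    """SampEn(m, r) with Chebyshev distance and r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n_templates = x.size - m               # same template count for m and m+1
    b = _count_matches(x, m, r, n_templates)
    a = _count_matches(x, m + 1, r, n_templates)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(x, scale):
    """Non-overlapping averages of length `scale`."""
    n = (x.size // scale) * scale
    return x[:n].reshape(-1, scale).mean(axis=1)

rng = np.random.default_rng(5)
n = 3000
reference = rng.normal(0.0, 1.0, n)                     # noise-only baseline surrogate
hidden = np.convolve(rng.normal(0.0, 1.0, n),           # correlated "hidden activity"
                     np.ones(20) / 20.0, mode="same")
activity = reference + 1.5 * hidden                     # baseline + hidden activity

for scale in (1, 2, 3, 4, 5):
    se_ref = sample_entropy(coarse_grain(reference, scale))
    se_act = sample_entropy(coarse_grain(activity, scale))
    print(f"scale {scale}: SampEn reference = {se_ref:.2f}, activity = {se_act:.2f}")
```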

  3. Analysis of Surface EMG Baseline for Detection of Hidden Muscle Activity

    PubMed Central

    Zhang, Xu; Zhou, Ping

    2014-01-01

    Objective This study explored the feasibility of detecting hidden muscle activity in surface electromyogram (EMG) baseline. Approach Power spectral density (PSD) analysis and multi-scale entropy (MSE) analysis were used respectively. Both analyses were applied to computer simulations of surface EMG baseline with presence (representing activity data) or absence (representing reference data) of hidden muscle activity, as well as surface electrode array EMG baseline recordings of healthy control and amyotrophic lateral sclerosis (ALS) subjects. Main results Although the simulated reference data and the activity data yielded no distinguishable difference in the time domain, they demonstrated a significant difference in the frequency and signal complexity domains with the PSD and MSE analyses. For a comparison using pooled data, such a difference was also observed when the PSD and MSE analyses were applied to surface electrode array EMG baseline recordings of healthy control and ALS subjects, which demonstrated no distinguishable difference in the time domain. Compared with the PSD analysis, the MSE analysis appeared to be more sensitive for detecting the difference in surface EMG baselines between the two groups. Significance The findings implied presence of hidden muscle activity in surface EMG baseline recordings from the ALS subjects. To promote the presented analysis as a useful diagnostic or investigatory tool, future studies are necessary to assess the pathophysiological nature or origins of the hidden muscle activity, as well as the baseline difference at the individual subject level. PMID:24445526

  4. Simulations and Data analysis for the 35 ton Liquid Argon detector as a prototype for the DUNE experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warburton, Thomas Karl

    2017-01-01

    The Deep Underground Neutrino Experiment (DUNE) is a next-generation neutrino experiment which will be built at the Sanford Underground Research Facility (SURF), and will receive a wide-band neutrino beam from Fermilab, 1300 km away. At this baseline DUNE will be able to study many of the properties of neutrino mixing, including the neutrino mass hierarchy and the value of the CP-violating complex phase (δ_CP). DUNE will utilise Liquid Argon Time Projection Chamber (LArTPC) technology, and the Far Detector (FD) will consist of four modules, each containing 17.1 kt of LAr with a fiducial mass of around 10 kt. Each of these FD modules represents around an order of magnitude increase in size when compared to existing LArTPC experiments. The 35 ton detector is the first DUNE prototype for the single-phase (LAr) design of the FD. There were two running periods, one from November 2013 to February 2014, and a second from November 2015 to March 2016. During the second running period, a system of TPCs was installed, and cosmic-ray data were collected. A method of particle identification was developed using simulations, though this was not applied to the data due to the higher than expected noise level. A new method of determining the interaction time of a track, using the effects of longitudinal diffusion, was developed using the cosmic-ray data. A camera system was also installed in the detector for monitoring purposes, and to look for high voltage breakdowns. Simulations concerning the muon-induced background rate to nucleon decay are performed, following the incorporation of the MUon Simulations UNderground (MUSUN) generator into the DUNE software framework. A series of cuts based on Monte Carlo truth information is developed, designed to reject simulated background events whilst preserving simulated signal events in the n → K+ + e- decay channel. No background events are seen to survive the application of these cuts in a sample of 2 × 10^9 muons, representing 401.6 years of detector live time. This corresponds to an annual background rate of < 0.44 events Mt^-1 year^-1 at 90% confidence, using a fiducial mass of 13.8 kt.

  5. Evaluation of Synthetic Vision Display Concepts for Improved Awareness in Unusual Attitude Recovery Scenarios

    NASA Technical Reports Server (NTRS)

    Nicholas, Stephanie

    2016-01-01

    A recent study conducted by the Commercial Aviation Safety Team (CAST) determined that 40 percent of all fixed-wing fatal accidents, between 2001 and 2011, were caused by Loss-of-Control (LOC) in flight (National Transportation Safety Board, 2015). Based on their findings, CAST recommended manufacturers develop and implement virtual day-visual meteorological conditions (VMC) display systems, such as synthetic vision or equivalent systems (CAST, 2016). In a 2015 simulation study conducted at NASA Langley Research Center (LaRC), researchers gathered to test and evaluate virtual day-VMC displays under realistic flight operation scenarios capable of inducing reduced attention states in pilots. Each display concept was evaluated to determine its efficacy to improve attitude awareness. During the experiment, Evaluation Pilots (EPs) were shown the following three display concepts on the Primary Flight Display (PFD): Baseline, Synthetic Vision (SV) with color gradient, and SV with texture. The baseline configuration was a standard, conventional 'blue over brown' display. Experiment scenarios were simulated over water to evaluate Unusual Attitude (UA) recovery over 'featureless terrain' environments. Thus, the SV with color gradient configuration presented a 'blue over blue' display with a linear blue color progression, to differentiate attitude changes between sky and ocean. The SV with texture configuration presented a 'blue over blue' display with a black checkerboard texture atop a synthetic ocean. These displays were paired with a Background Attitude Indicator (BAI) concept. The BAI was presented across all four Head-Down Displays (HDDs), displaying a wide field-of-view blue-over-blue attitude indicator. The BAI aligned with the PFD and showed through the background of the navigation displays with opaque transparency. Each EP participated in a two-part experiment series with a total of seventy-five trial runs: Part I included a set of twenty-five Unusual Attitude Recovery (UAR) scenarios; Part II included a set of fifty Attitude Memory Recall Tasks (AMRT). At the conclusion of each trial, EPs were asked to complete a set of post-run questionnaires. Quantitative results showed that there were no statistically significant effects on UA recovery times when utilizing SV with or without the presence of a BAI. Qualitative results showed that the SV displays (color, texture) with BAI On were most preferred for both UA recognition and recovery when compared with the baseline display. When only comparing SV display concepts, EPs performed better when using the SV with texture, BAI On, than any other display configuration. This is an interesting finding, considering that most EPs noted their preference for the SV with color gradient when the BAI was on.

  6. Bias error reduction using ratios to baseline experiments. Heat transfer case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakroun, W.; Taylor, R.P.; Coleman, H.W.

    1993-10-01

    Employing a set of experiments devoted to examining the effect of surface finish (riblets) on convective heat transfer as an example, this technical note explores the notion that, just as precision uncertainties in experiments can be reduced by repeated trials and averaging, bias errors can be reduced by presenting results as ratios to baseline experiments. This scheme for bias error reduction can give considerable advantage when parametric effects are investigated experimentally. When the results of an experiment are presented as a ratio with the baseline results, a large reduction in the overall uncertainty can be achieved when all the bias limits in the variables of the experimental result are fully correlated with those of the baseline case.
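
    A small numeric sketch of the idea, with assumed numbers rather than values from the note: when the bias limits of an experiment and its baseline are fully correlated, their contribution cancels in the ratio, leaving mainly the precision terms.

    ```python
    import math

    frac_bias = 0.05        # 5% fractional bias limit, shared by both experiments
    frac_precision = 0.01   # 1% fractional precision limit in each experiment
    rho = 1.0               # fully correlated bias limits (same instruments/method)

    # Fractional uncertainty of a single result (root-sum-square of bias and precision).
    u_single = math.hypot(frac_bias, frac_precision)

    # Fractional uncertainty of the ratio r = result / baseline result.
    bias_term = frac_bias**2 + frac_bias**2 - 2 * rho * frac_bias * frac_bias
    precision_term = frac_precision**2 + frac_precision**2
    u_ratio = math.sqrt(bias_term + precision_term)

    print(f"single result: {u_single:.2%}")   # about 5.1%
    print(f"ratio result : {u_ratio:.2%}")    # about 1.4% -- bias terms cancel
    ```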

  7. Does video gaming affect orthopaedic skills acquisition? A prospective cohort-study.

    PubMed

    Khatri, Chetan; Sugand, Kapil; Anjum, Sharika; Vivekanantham, Sayinthen; Akhtar, Kash; Gupte, Chinmay

    2014-01-01

    Previous studies have suggested that there is a positive correlation between the extent of video gaming and efficiency of surgical skill acquisition on laparoscopic and endovascular surgical simulators amongst trainees. However, the link between video gaming and orthopaedic trauma simulation remains unexamined, in particular dynamic hip screw (DHS) simulation. To assess the effect of prior video gaming experience on virtual-reality (VR) haptic-enabled DHS simulator performance, 38 medical students, naïve to VR surgical simulation, were recruited and stratified relative to their video gaming exposure. Group 1 (n = 19, video-gamers) were defined as those who played video games for more than one hour per day in the last calendar year. Group 2 (n = 19, non-gamers) were defined as those who played video games for less than one hour per calendar year. Both cohorts performed five attempts at completing a VR DHS procedure and repeated the task after a week. Metrics assessed included time taken for the task, simulated fluoroscopy time and screw position. Medians and Bonett-Price 95% confidence intervals were calculated for seven real-time objective performance metrics. Data were confirmed as non-parametric by the Kolmogorov-Smirnov test. Analysis was performed using the Mann-Whitney U test for independent data, whilst the Wilcoxon signed-rank test was used for paired data. A result was deemed significant when a two-tailed p-value was less than 0.05. All 38 subjects completed the study. The groups were not significantly different at baseline. After ten attempts, there was no difference between Group 1 and Group 2 in any of the metrics tested. These included time taken for the task, simulated fluoroscopy time, number of retries, tip-apex distance, percentage cut-out and global score. Contrary to previous literature findings, there was no correlation between video gaming experience and gaining competency on a VR DHS simulator.

  8. A data driven partial ambiguity resolution: Two step success rate criterion, and its simulation demonstration

    NASA Astrophysics Data System (ADS)

    Hou, Yanqing; Verhagen, Sandra; Wu, Jie

    2016-12-01

    Ambiguity Resolution (AR) is a key technique in GNSS precise positioning. In the case of weak models (i.e., low precision of data), however, the success rate of AR may be low, which may consequently introduce large errors into the baseline solution in cases of wrong fixing. Partial Ambiguity Resolution (PAR) has therefore been proposed, such that the baseline precision can be improved by fixing only a subset of ambiguities with a high success rate. This contribution proposes a new PAR strategy that selects the subset so that the expected precision gain is maximized among a set of pre-selected subsets, while at the same time the failure rate is controlled. These pre-selected subsets are chosen to have the highest success rate among subsets of the same size. The strategy is called the Two-step Success Rate Criterion (TSRC) because it first tries to fix a relatively large subset, using the fixed failure rate ratio test (FFRT) to decide on acceptance or rejection. In case of rejection, a smaller subset is fixed and validated by the ratio test so as to fulfill the overall failure rate criterion. It is shown how the method can be used in practice without introducing a large additional computational effort and, more importantly, how it can improve (or at least not deteriorate) the availability in terms of baseline precision compared to the classical Success Rate Criterion (SRC) PAR strategy, based on a simulation validation. In the simulation validation, significant improvements are obtained for single-GNSS on short baselines with dual-frequency observations. For dual-constellation GNSS, the improvement for single-frequency observations on short baselines is very significant, on average 68%. For medium to long baselines with dual-constellation GNSS, the average improvement is around 20-30%.
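
    The subset-selection step can be sketched in a few lines. This is not the authors' TSRC implementation: the LAMBDA decorrelation and the FFRT are omitted, and a plain bootstrapped success rate with an assumed threshold stands in purely for illustration.

    ```python
    import numpy as np
    from scipy.stats import norm

    def bootstrap_success_rate(cond_std):
        """P_s = prod_i (2*Phi(1/(2*sigma_i)) - 1) for conditional std. deviations."""
        return np.prod(2.0 * norm.cdf(1.0 / (2.0 * np.asarray(cond_std))) - 1.0)

    def select_subset(cond_std, min_success_rate=0.999):
        """Largest subset of the best-determined ambiguities meeting the rate."""
        cond_std = np.asarray(cond_std)
        order = np.argsort(cond_std)                  # most precise first
        for k in range(len(cond_std), 0, -1):
            subset = order[:k]
            if bootstrap_success_rate(cond_std[subset]) >= min_success_rate:
                return subset
        return np.array([], dtype=int)

    # Toy conditional standard deviations (cycles) for five float ambiguities.
    cond_std = np.array([0.05, 0.07, 0.10, 0.25, 0.40])
    subset = select_subset(cond_std)
    print("fix ambiguities:", subset,
          "success rate:", bootstrap_success_rate(cond_std[subset]))
    ```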

  9. Long-Baseline Neutrino Facility (LBNF) and Deep Underground Neutrino Experiment (DUNE): Conceptual Design Report. Volume 1: The LBNF and DUNE Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acciarri, R.

    2016-01-22

    This document presents the Conceptual Design Report (CDR) put forward by an international neutrino community to pursue the Deep Underground Neutrino Experiment at the Long-Baseline Neutrino Facility (LBNF/DUNE), a groundbreaking science experiment for long-baseline neutrino oscillation studies and for neutrino astrophysics and nucleon decay searches. The DUNE far detector will be a very large modular liquid argon time-projection chamber (LArTPC) located deep underground, coupled to the LBNF multi-megawatt wide-band neutrino beam. DUNE will also have a high-resolution and high-precision near detector.

  10. Driver Injury Risk Variability in Finite Element Reconstructions of Crash Injury Research and Engineering Network (CIREN) Frontal Motor Vehicle Crashes.

    PubMed

    Gaewsky, James P; Weaver, Ashley A; Koya, Bharath; Stitzel, Joel D

    2015-01-01

    A 3-phase real-world motor vehicle crash (MVC) reconstruction method was developed to analyze injury variability as a function of precrash occupant position for 2 full-frontal Crash Injury Research and Engineering Network (CIREN) cases. Phase I: A finite element (FE) simplified vehicle model (SVM) was developed and tuned to mimic the frontal crash characteristics of the CIREN case vehicle (Camry or Cobalt) using frontal New Car Assessment Program (NCAP) crash test data. Phase II: The Toyota HUman Model for Safety (THUMS) v4.01 was positioned in 120 precrash configurations per case within the SVM. Five occupant positioning variables were varied using a Latin hypercube design of experiments: seat track position, seat back angle, D-ring height, steering column angle, and steering column telescoping position. An additional baseline simulation was performed that aimed to match the precrash occupant position documented in CIREN for each case. Phase III: FE simulations were then performed using kinematic boundary conditions from each vehicle's event data recorder (EDR). HIC15, combined thoracic index (CTI), femur forces, and strain-based injury metrics in the lung and lumbar vertebrae were evaluated to predict injury. Tuning the SVM to specific vehicle models resulted in close matches between simulated and test injury metric data, allowing the tuned SVM to be used in each case reconstruction with EDR-derived boundary conditions. Simulations with the most rearward seats and reclined seat backs had the greatest HIC15, head injury risk, CTI, and chest injury risk. Calculated injury risks for the head, chest, and femur closely correlated to the CIREN occupant injury patterns. CTI in the Camry case yielded a 54% probability of Abbreviated Injury Scale (AIS) 2+ chest injury in the baseline case simulation and ranged from 34 to 88% (mean = 61%) risk in the least and most dangerous occupant positions. The greater than 50% probability was consistent with the case occupant's AIS 2 hemomediastinum. Stress-based metrics were used to predict injury to the lower leg of the Camry case occupant. The regional-level injury metrics evaluated for the Cobalt case occupant indicated a low risk of injury; however, strain-based injury metrics better predicted pulmonary contusion. Approximately 49% of the Cobalt occupant's left lung was contused, though the baseline simulation predicted 40.5% of the lung to be injured. A method to compute injury metrics and risks as functions of precrash occupant position was developed and applied to 2 CIREN MVC FE reconstructions. The reconstruction process allows for quantification of the sensitivity and uncertainty of the injury risk predictions based on occupant position to further understand important factors that lead to more severe MVC injuries.
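
    A minimal sketch of the kind of Latin hypercube design of experiments described above, over the five occupant positioning variables; the variable ranges are illustrative assumptions, not the values used in the CIREN reconstructions.

    ```python
    from scipy.stats import qmc

    variables = ["seat_track_mm", "seat_back_angle_deg", "d_ring_height_mm",
                 "steering_col_angle_deg", "steering_col_telescope_mm"]
    lower = [0.0, 20.0, 0.0, 18.0, 0.0]      # assumed lower bounds
    upper = [240.0, 35.0, 80.0, 28.0, 50.0]  # assumed upper bounds

    sampler = qmc.LatinHypercube(d=len(variables), seed=1)
    unit_samples = sampler.random(n=120)             # 120 precrash configurations
    designs = qmc.scale(unit_samples, lower, upper)  # map to physical ranges

    for row in designs[:3]:
        print(dict(zip(variables, row.round(1))))
    ```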

  11. A Novel Study Paradigm for Long-term Prevention Trials in Alzheimer Disease: The Placebo Group Simulation Approach (PGSA): Application to MCI data from the NACC database.

    PubMed

    Berres, M; Kukull, W A; Miserez, A R; Monsch, A U; Monsell, S E; Spiegel, R

    2014-01-01

    The PGSA (Placebo Group Simulation Approach) aims at avoiding problems of sample representativeness and ethical issues typical of placebo-controlled secondary prevention trials with MCI patients. The PGSA uses mathematical modeling to forecast the distribution of quantified outcomes of MCI patient groups based on their own baseline data established at the outset of clinical trials. These forecasted distributions are then compared with the distribution of actual outcomes observed on candidate treatments, thus substituting for a concomitant placebo group. Here we investigate whether a PGSA algorithm that was developed from the MCI population of ADNI 1 can reliably simulate the distribution of composite neuropsychological outcomes from a larger, independently selected MCI subject sample. Data available from the National Alzheimer's Coordinating Center (NACC) were used. We included 1523 patients with single or multiple domain amnestic mild cognitive impairment (aMCI) and at least two follow-ups after baseline. In order to strengthen the analysis and to verify whether there was a drift over time in the neuropsychological outcomes, the NACC subject sample was split into 3 subsamples of similar size. The previously described PGSA algorithm for the trajectory of a composite neuropsychological test battery (NTB) score was adapted to the test battery used in NACC. Nine demographic, clinical, biological and neuropsychological candidate predictors were included in a mixed model; this model and its error terms were used to simulate trajectories of the adapted NTB. The distributions of empirically observed and simulated data after 1, 2 and 3 years were very similar, with some over-estimation of decline in all 3 subgroups. By far the most important predictor of the NTB trajectories is the baseline NTB score. Other significant predictors are the MMSE baseline score and the interactions of time with ApoE4 and FAQ (functional abilities). These are essentially the same predictors as determined for the original NTB score. An algorithm comprising a small number of baseline variables, notably cognitive performance at baseline, forecasts the group trajectory of cognitive decline in subsequent years with high accuracy. The current analysis of 3 independent subgroups of aMCI patients from the NACC database supports the validity of the PGSA longitudinal algorithm for an NTB. Use of the PGSA in long-term secondary AD prevention trials deserves consideration.
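
    The core PGSA idea of forecasting the outcome distribution from baseline data alone can be caricatured in a few lines; the coefficients, predictors and noise terms below are illustrative assumptions, not the fitted ADNI/NACC model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1523
    baseline_ntb = rng.normal(-0.5, 0.8, n)       # composite NTB score at baseline
    baseline_mmse = rng.normal(27.0, 1.8, n)
    apoe4 = rng.binomial(1, 0.4, n)

    def forecast_ntb(t_years):
        # Assumed fixed effects: annual decline depends on baseline NTB, MMSE, ApoE4.
        slope = (-0.10 + 0.05 * (baseline_ntb + 0.5)
                 + 0.02 * (baseline_mmse - 27.0) - 0.08 * apoe4)
        residual = rng.normal(0.0, 0.15, n) * np.sqrt(t_years)  # assumed error term
        return baseline_ntb + slope * t_years + residual

    for t in (1, 2, 3):
        sim = forecast_ntb(t)
        print(f"year {t}: simulated mean {sim.mean():+.2f}, SD {sim.std():.2f}")
    ```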

  12. Crustal dynamics project session 4 validation and intercomparison experiments 1979-1980 report

    NASA Technical Reports Server (NTRS)

    Liebrecht, P.; Kolenkiewicz, R.; Ryan, J.; Hothem, L.

    1983-01-01

    As part of the Crustal Dynamics Project, an experiment was performed to verify the ability of Satellite Laser Ranging (SLR), Very Long Baseline Interferometry (VLBI) and Doppler Satellite Positioning System (Doppler) techniques to estimate the baseline distances between several locations. The Goddard Space Flight Center (GSFC) lasers were in operation at all five sites available to them. The ten baselines involved were analyzed using monthly orbits and various methods of selecting data. The standard deviation of the monthly SLR baseline lengths was at the 7 cm level. The GSFC VLBI (Mark III) data was obtained during three separate experiments: November 1979 at Haystack and Owens Valley, and April and July 1980 at Haystack, Owens Valley, and Fort Davis. Repeatability of the VLBI in determining baseline lengths was calculated to be at the 2 cm level. Jet Propulsion Laboratory (JPL) VLBI (Mark II) data was acquired on the Owens Valley to Goldstone baseline on ten occasions between August 1979 and November 1980. The repeatability of these baseline length determinations was calculated to be at the 5 cm level. National Geodetic Survey (NGS) Doppler data was acquired at all five sites in January 1980. Repeatability of the Doppler-determined baseline lengths was calculated at approximately 30 cm. An intercomparison between baseline distances and associated parameters was made utilizing SLR, VLBI, and Doppler results on all available baselines. The VLBI and SLR length determinations were compared on four baselines with a resultant mean difference of -1 cm and a maximum difference of 12 cm. The SLR and Doppler length determinations were compared on ten baselines with a resultant mean difference of about 30 cm and a maximum difference of about 60 cm. The VLBI and Doppler lengths from seven baselines showed a resultant mean difference of about 30 cm and maximum difference of about 1 meter. The intercomparison of baseline orientation parameters was consistent with past analysis.

  13. Agent Based Modeling of Air Carrier Behavior for Evaluation of Technology Equipage and Adoption

    NASA Technical Reports Server (NTRS)

    Horio, Brant M.; DeCicco, Anthony H.; Stouffer, Virginia L.; Hasan, Shahab; Rosenbaum, Rebecca L.; Smith, Jeremy C.

    2014-01-01

    As part of ongoing research, the National Aeronautics and Space Administration (NASA) and LMI developed a research framework to assist policymakers in identifying impacts on the U.S. air transportation system (ATS) of potential policies and technology related to the implementation of the Next Generation Air Transportation System (NextGen). This framework, called the Air Transportation System Evolutionary Simulation (ATS-EVOS), integrates multiple models into a single process flow to best simulate responses by U.S. commercial airlines and other ATS stakeholders to NextGen-related policies, and in turn, how those responses impact the ATS. Development of this framework required NASA and LMI to create an agent-based model of airline and passenger behavior. This Airline Evolutionary Simulation (AIRLINE-EVOS) models airline decisions about tactical airfare and schedule adjustments, and strategic decisions related to fleet assignments, market prices, and equipage. AIRLINE-EVOS models its own heterogeneous population of passenger agents that interact with airlines; this interaction allows the model to simulate the cycle of action-reaction as airlines compete with each other and engage passengers. We validated a baseline configuration of AIRLINE-EVOS against Airline Origin and Destination Survey (DB1B) data and subject matter expert opinion, and we verified the ATS-EVOS framework and agent behavior logic through scenario-based experiments. These experiments demonstrated AIRLINE-EVOS's capabilities in responding to an input shock in fuel prices, and to equipage challenges in a series of analyses based on potential incentive policies for best equipped best served, optimal-wind routing, and traffic management initiative exemption concepts.
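
    A highly simplified sketch of the airline/passenger interaction described above; the class names, pricing rule and demand figures are illustrative stand-ins, not AIRLINE-EVOS logic.

    ```python
    import random

    class Airline:
        def __init__(self, name, fare):
            self.name, self.fare, self.seats = name, fare, 100

        def adjust_fare(self, load_factor):
            # Tactical rule: raise fares when flights fill, cut them when they do not.
            self.fare *= 1.05 if load_factor > 0.8 else 0.95

    class Passenger:
        def __init__(self):
            self.willingness_to_pay = random.uniform(150, 450)

        def choose(self, airlines):
            affordable = [a for a in airlines if a.fare <= self.willingness_to_pay]
            return min(affordable, key=lambda a: a.fare) if affordable else None

    random.seed(0)
    airlines = [Airline("A", 300.0), Airline("B", 320.0)]
    for day in range(30):
        bookings = {a.name: 0 for a in airlines}
        for _ in range(180):                       # daily passenger demand
            chosen = Passenger().choose(airlines)
            if chosen and bookings[chosen.name] < chosen.seats:
                bookings[chosen.name] += 1
        for a in airlines:
            a.adjust_fare(bookings[a.name] / a.seats)
    print({a.name: round(a.fare, 2) for a in airlines})
    ```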

  14. Shock Generation and Control Using DBD Plasma Actuators

    NASA Technical Reports Server (NTRS)

    Patel, Mehul P.; Cain, Alan B.; Nelson, Christopher C.; Corke, Thomas C.; Matlis, Eric H.

    2012-01-01

    This report is the final report of a NASA Phase I SBIR contract, with some revisions to remove company proprietary data. The Shock Boundary Layer Interaction (SBLI) phenomena in a supersonic inlet involve mutual interaction of oblique shocks with boundary layers, forcing the boundary layer to separate from the inlet wall. To improve the inlet efficiency, it is desired to prevent or delay shock-induced boundary layer separation. In this effort, Innovative Technology Applications Company (ITAC), LLC and the University of Notre Dame (UND) jointly investigated the use of dielectric-barrier-discharge (DBD) plasma actuators for control of SBLI in a supersonic inlet. The research investigated the potential for DBD plasma actuators to suppress flow separation caused by a shock in a turbulent boundary layer. The research involved both numerical and experimental investigations of plasma flow control for a few different SBLI configurations: (a) a 12-degree wedge flow test case at Mach 1.5 (numerical and experimental), (b) an impinging shock test case at Mach 1.5 using an airfoil as a shock generator (numerical and experimental), and (c) a Mach 2.0 nozzle flow case in a simulated 15 x 15 cm wind tunnel with a shock generator (numerical). Numerical studies were performed for all three test cases to examine the feasibility of plasma flow control concepts. These results were used to guide the wind tunnel experiments conducted on the Mach 1.5, 12-degree wedge flow (case a) and the Mach 1.5 impinging shock test case (case b), which were at similar flow conditions as the corresponding numerical studies, to obtain experimental evidence of plasma control effects for SBLI control. The experiments also generated data that were used in validating the numerical studies for the baseline cases (without plasma actuators). The experiments were conducted in a Mach 1.5 test section in the University of Notre Dame Hessert Laboratory. The simulation results from cases a and b indicated that multiple spanwise actuators in series and at a voltage of 75 kVp-p could fully suppress the flow separation downstream of the shock. The simulation results from case c showed that the streamwise plasma actuators are highly effective in creating pairs of counter-rotating vortices, much like the mechanical vortex generators, and could also potentially have beneficial effects for SBLI control. However, to achieve these effects, the positioning and the quantity of the DBD actuators used must be optimized. The wind tunnel experiments mapped the baseline flow with good agreement to the numerical simulations. The experiments were conducted with spanwise actuators for cases a and b, but were limited by the inability to generate a sufficiently high voltage due to arcing in the wind-tunnel test-section. The static pressure in the tunnel was lower than the static pressure in an inlet at flight conditions, promoting arcing and degrading the actuator performance.

  15. No difference in learning retention in manikin-based simulation based on role

    PubMed Central

    Giuliano, Dominic; McGregor, Marion

    2016-01-01

    Objective: We evaluated learning retention in interns exposed to simulation. It was hypothesized that learning would degrade after 6 months and there would be a difference in retention between interns who played a critical role versus those who did not. Methods: A total of 23 groups of 5 to 9 interns underwent a cardiac scenario twice during 1 simulation experience and again 6 months later. We captured 69 recordings (23 before debrief at baseline [PrDV], 23 after debrief at baseline [PoDV], and 23 at 6-month follow-up [FUV]). Students were assigned different roles, including the critical role of “doctor” in a blinded, haphazard fashion. At 6-month follow-up, 12 interns who played the role of doctor initially were assigned that role again, while 11 interns who played noncritical roles initially were newly assigned to doctor. All videos of intern performance were scored independently and in a blinded fashion, by 3 judges using a 15-item check list. Results: Repeated-measures analysis of variance for interns completing all 3 time points indicated a significant difference between time points (F2,22 = 112, p = .00). Contrasts showed a statistically significant difference between PrDV and PoDV (p = .00), and PrDV and FUV (p = .00), but no difference between PoDV and FUV (p = .98). This was consistent with results including all data points. Checklist scores were more than double for PoDV recordings (16) and FUV (15), compared to PrDV recordings (6.6). Follow-up scores comparing old to new doctors showed no statistically significant difference (15.4 vs 15.2 respectively, t21 = 0.26, p = .80, d = .11). Conclusions: Learning retention was maintained regardless of role. PMID:26367345

  16. Collision judgment when using an augmented-vision head-mounted display device

    PubMed Central

    Luo, Gang; Woods, Russell L; Peli, Eli

    2016-01-01

    Purpose We have developed a device to provide an expanded visual field to patients with tunnel vision by superimposing minified edge images of the wide scene, in which objects appear closer to the heading direction than they really are. We conducted experiments in a virtual environment to determine if users would overestimate collision risks. Methods Given simulated scenes of walking or standing with intention to walk towards a given direction (intended walking) in a shopping mall corridor, participants (12 normally sighted and 7 with tunnel vision) reported whether they would collide with obstacles appearing at different offsets from variable walking paths (or intended directions), with and without the device. The collision envelope (CE), a personal space based on perceived collision judgments, and judgment uncertainty (variability of response) were measured. When the device was used, combinations of two image scales (5× minified and 1:1) and two image types (grayscale or edge images) were tested. Results Image type did not significantly alter collision judgment (p>0.7). Compared to the without-device baseline, minification did not significantly change the CE of normally sighted subjects for simulated walking (p=0.12), but increased CE by 30% for intended walking (p<0.001). Their uncertainty was not affected by minification (p>0.25). For the patients, neither CE nor uncertainty was affected by minification (p>0.13) in both walking conditions. Baseline CE and uncertainty were greater for patients than normally-sighted subjects in simulated walking (p=0.03), but the two groups were not significantly different in all other conditions. Conclusion Users did not substantially overestimate collision risk, as the 5× minified images had only limited impact on collision judgments either during walking or before starting to walk. PMID:19458339

  17. Collision judgment when using an augmented-vision head-mounted display device.

    PubMed

    Luo, Gang; Woods, Russell L; Peli, Eli

    2009-09-01

    A device was developed to provide an expanded visual field to patients with tunnel vision by superimposing minified edge images of the wide scene, in which objects appear closer to the heading direction than they really are. Experiments were conducted in a virtual environment to determine whether users would overestimate collision risks. Given simulated scenes of walking or standing with intention to walk toward a given direction (intended walking) in a shopping mall corridor, participants (12 normally sighted and 7 with tunnel vision) reported whether they would collide with obstacles appearing at different offsets from variable walking paths (or intended directions), with and without the device. The collision envelope (CE), a personal space based on perceived collision judgments, and judgment uncertainty (variability of response) were measured. When the device was used, combinations of two image scales (5x minified and 1:1) and two image types (grayscale or edge images) were tested. Image type did not significantly alter collision judgment (P > 0.7). Compared to the without-device baseline, minification did not significantly change the CE of normally sighted subjects for simulated walking (P = 0.12), but increased CE by 30% for intended walking (P < 0.001). Their uncertainty was not affected by minification (P > 0.25). For the patients, neither CE nor uncertainty was affected by minification (P > 0.13) in both walking conditions. Baseline CE and uncertainty were greater for patients than normally sighted subjects in simulated walking (P = 0.03), but the two groups were not significantly different in all other conditions. Users did not substantially overestimate collision risk, as the x5 minified images had only limited impact on collision judgments either during walking or before starting to walk.

  18. Autogenic-feedback training: A preventive method for space adaptation syndrome

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.; Sharp, Joseph C.; Toscano, William B.; Kamiya, Joe; Miller, Neal E.

    1987-01-01

    The progress made to date on the reduction of data for Spacelab 3 Shuttle experiment No. 3AFT23 is reported. Four astronauts participated as subjects in this experiment. Crewmen A and B served as treatment subjects (i.e., received preflight training for control of their own motion sickness symptoms) and Crewmen C and D served as controls (i.e., did not receive training). A preliminary evaluation of Autogenic Feedback Training (AFT) was made from visual inspections of graphs that were generated from the preflight and inflight physiological data, which included: (1) Baseline rotating chair tests for all crewmen; (2) Posttraining rotating chair tests of treatment group subjects; (3) Preflight data from Joint Integrated Simulations for all crewmen; and (4) Flight data for all crewmen during mission days 0 through 4, and mission day 6 for treatment subjects only. A summary of the findings suggested by these data is outlined.

  19. Fault detection and diagnosis of photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Wu, Xing

    The rapid growth of the solar industry over the past several years has expanded the significance of photovoltaic (PV) systems. One of the primary aims of research in building-integrated PV systems is to improve the system's efficiency, availability, and reliability. Although much work has been done on technological design to increase a photovoltaic module's efficiency, there is little research so far on fault diagnosis for PV systems. Faults in a PV system, if not detected, may not only reduce power generation, but also threaten the availability and reliability, effectively the "security" of the whole system. In this paper, first a circuit-based simulation baseline model of a PV system with maximum power point tracking (MPPT) is developed using MATLAB software. MATLAB is one of the most popular tools for integrating computation, visualization and programming in an easy-to-use modeling environment. Second, data are collected from a PV system at variable surface temperatures and insolation levels under normal operation. The developed simulation model of the PV system is then calibrated and improved by comparing modeled I-V and P-V characteristics with measured I-V and P-V characteristics to make sure the simulated curves are close to the measured values from the experiments. Finally, based on the circuit-based simulation model, a PV model of various types of faults will be developed by changing conditions or inputs in the MATLAB model, and the I-V and P-V characteristic curves, and the time-dependent voltage and current characteristics of the fault modalities, will be characterized for each type of fault. These will be developed as benchmark I-V or P-V, or prototype transient curves. If a fault occurs in a PV system, polling and comparing actual measured I-V and P-V characteristic curves with both normal operational curves and these baseline fault curves will aid in fault diagnosis.
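
    The curve-comparison idea can be sketched with a simplified single-diode model; the parameters and alarm threshold below are illustrative assumptions, and the Python shown here merely stands in for the MATLAB environment used in the work.

    ```python
    import numpy as np

    def iv_curve(v, i_ph=8.0, i_0=1e-9, ideality=1.0, t_cell=318.0, cells=60):
        """Simplified single-diode model (series and shunt resistance neglected)."""
        k, q = 1.380649e-23, 1.602176634e-19
        v_t = cells * ideality * k * t_cell / q        # module thermal voltage
        return np.clip(i_ph - i_0 * np.expm1(v / v_t), 0.0, None)

    v = np.linspace(0.0, 38.0, 200)
    baseline = iv_curve(v)             # healthy module at reference conditions
    measured = iv_curve(v, i_ph=6.2)   # e.g. reduced photocurrent from shading/soiling

    rms_residual = np.sqrt(np.mean((measured - baseline) ** 2))
    print(f"RMS deviation from baseline: {rms_residual:.2f} A")
    if rms_residual > 0.5:             # assumed alarm threshold
        print("I-V curve departs from baseline -> possible fault")
    ```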

  20. 15 CFR 970.204 - Environmental and use conflict analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES... activities in the area, including the testing of integrated mining systems which simulate commercial recovery... baseline data or plans for acquiring them. The applicant may at his option delay submission of baseline and...

  1. 15 CFR 970.204 - Environmental and use conflict analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES... activities in the area, including the testing of integrated mining systems which simulate commercial recovery... baseline data or plans for acquiring them. The applicant may at his option delay submission of baseline and...

  2. 15 CFR 970.204 - Environmental and use conflict analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES... activities in the area, including the testing of integrated mining systems which simulate commercial recovery... baseline data or plans for acquiring them. The applicant may at his option delay submission of baseline and...

  3. 15 CFR 970.204 - Environmental and use conflict analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES... activities in the area, including the testing of integrated mining systems which simulate commercial recovery... baseline data or plans for acquiring them. The applicant may at his option delay submission of baseline and...

  4. 15 CFR 970.204 - Environmental and use conflict analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES... activities in the area, including the testing of integrated mining systems which simulate commercial recovery... baseline data or plans for acquiring them. The applicant may at his option delay submission of baseline and...

  5. Safe and sensible preprocessing and baseline correction of pupil-size data.

    PubMed

    Mathôt, Sebastiaan; Fabius, Jasper; Van Heusden, Elle; Van der Stigchel, Stefan

    2018-02-01

    Measurement of pupil size (pupillometry) has recently gained renewed interest from psychologists, but there is little agreement on how pupil-size data is best analyzed. Here we focus on one aspect of pupillometric analyses: baseline correction, i.e., analyzing changes in pupil size relative to a baseline period. Baseline correction is useful in experiments that investigate the effect of some experimental manipulation on pupil size. In such experiments, baseline correction improves statistical power by taking into account random fluctuations in pupil size over time. However, we show that baseline correction can also distort data if unrealistically small pupil sizes are recorded during the baseline period, which can easily occur due to eye blinks, data loss, or other distortions. Divisive baseline correction (corrected pupil size = pupil size/baseline) is affected more strongly by such distortions than subtractive baseline correction (corrected pupil size = pupil size - baseline). We discuss the role of baseline correction as a part of preprocessing of pupillometric data, and make five recommendations: (1) before baseline correction, perform data preprocessing to mark missing and invalid data, but assume that some distortions will remain in the data; (2) use subtractive baseline correction; (3) visually compare your corrected and uncorrected data; (4) be wary of pupil-size effects that emerge faster than the latency of the pupillary response allows (within ±220 ms after the manipulation that induces the effect); and (5) remove trials on which baseline pupil size is unrealistically small (indicative of blinks and other distortions).
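
    Recommendations (2) and (5) can be combined into a short sketch: subtract each trial's baseline and drop trials whose baseline pupil size is unrealistically small. The baseline window length and minimum-size threshold below are assumptions for illustration.

    ```python
    import numpy as np

    def baseline_correct(trials, baseline_samples=50, min_baseline=1.5):
        """trials: (n_trials, n_samples) pupil sizes in mm; returns kept, corrected trials."""
        trials = np.asarray(trials, dtype=float)
        baselines = np.nanmedian(trials[:, :baseline_samples], axis=1)
        keep = baselines >= min_baseline                   # drop blink-contaminated baselines
        corrected = trials[keep] - baselines[keep, None]   # subtractive correction
        return corrected, keep

    rng = np.random.default_rng(0)
    data = 3.0 + 0.2 * rng.standard_normal((10, 500))
    data[3, :40] = 0.1                                     # simulate a blink in one baseline
    corrected, keep = baseline_correct(data)
    print(f"kept {keep.sum()} of {len(keep)} trials")
    ```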

  6. Disk brake design for cooling improvement using Computational Fluid Dynamics (CFD)

    NASA Astrophysics Data System (ADS)

    Munisamy, Kannan M.; Shafik, Ramel

    2013-06-01

    The car disk brake design is improved by comparing two alternative blade designs with the baseline blade design. The two designs were simulated using computational fluid dynamics (CFD) to obtain heat transfer properties such as the Nusselt number and the heat transfer coefficient, which were compared against those of the baseline design. The improved shape showed the highest heat transfer performance, while the curved design was inferior to the baseline design in heat transfer performance.

  7. An Overview of NASA's SubsoniC Research Aircraft Testbed (SCRAT)

    NASA Technical Reports Server (NTRS)

    Baumann, Ethan; Hernandez, Joe; Ruhf, John

    2013-01-01

    National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft’s mission is to perform aeronautics research; more specifically, raising the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and gathering high-quality research data suitable for verifying the technologies, and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport class aircraft’s flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT’s research systems and capabilities.

  8. An Overview of NASA's Subsonic Research Aircraft Testbed (SCRAT)

    NASA Technical Reports Server (NTRS)

    Baumann, Ethan; Hernandez, Joe; Ruhf, John C.

    2013-01-01

    National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research; more specifically raising the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and gathering high-quality research data suitable for verifying the technologies, and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.

  9. Using Neural Networks to Explore Air Traffic Controller Workload

    NASA Technical Reports Server (NTRS)

    Martin, Lynne; Kozon, Thomas; Verma, Savita; Lozito, Sandra C.

    2006-01-01

    When a new system, concept, or tool is proposed in the aviation domain, one concern is the impact that this will have on operator workload. Because workload is a subjective experience, it is difficult to measure in a way that allows comparison of proposed systems with those already in existence. Chatterji and Sridhar (2001) suggested a method by which airspace parameters can be translated into workload ratings, using a neural network. This approach was employed and modified to accept input from a non-real-time airspace simulation model. The following sections describe the preparations and testing work that will enable comparison of a future airspace concept with a current-day baseline in terms of workload levels.
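
    As a toy illustration of translating airspace parameters into a workload rating with a small feedforward network (the feature set, network size and weights are stand-ins, not the Chatterji and Sridhar model):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Assumed feature vector: aircraft count, climbing/descending aircraft,
    # predicted conflicts, sector density.
    features = np.array([18.0, 4.0, 2.0, 0.35])

    w1, b1 = rng.normal(size=(8, 4)), np.zeros(8)   # untrained stand-in weights
    w2, b2 = rng.normal(size=(1, 8)), np.zeros(1)

    hidden = np.tanh(w1 @ features + b1)
    workload = 1 + 6 / (1 + np.exp(-(w2 @ hidden + b2)))   # squash onto a 1-7 scale
    print(f"predicted workload rating: {workload[0]:.2f}")
    ```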

  10. Low cost solar array project production process and equipment task: A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Several major modifications were made to the design presented at the PDR. The frame was deleted in favor of a "frameless" design which will provide a substantially improved cell packing factor. Potential shaded cell damage resulting from operation into a short circuit can be eliminated by a change in the cell series/parallel electrical interconnect configuration. The baseline process sequence defined for the MEPSDU was refined and equipment design and specification work was completed. SAMICS cost analysis work accelerated, format A's were prepared and computer simulations completed. Design work on the automated cell interconnect station was focused on bond technique selection experiments.

  11. Docking simulation analysis of range data requirements for the orbital maneuvering vehicle

    NASA Technical Reports Server (NTRS)

    Micheal, J. D.; Vinz, F. L.

    1985-01-01

    The results of an initial study to assess the controllability of the Orbital Maneuvering Vehicle (OMV) during terminal closure and docking are reported. The vehicle characteristics used in this study are those of the Marshall Space Flight Center (MSFC) baseline OMV which were published with the request for proposals for preliminary design of this vehicle. This simulation was conducted at MSFC using the Target Motion Simulator. The study focused on the OMV manual mode capability to accommodate both stabilized and tumbling target engagements with varying complements of range and range rate data displayed to the OMV operator. Four trained test subjects performed over 400 simulated orbital dockings during this study. A firm requirement for radar during the terminal closure and dock phase of the OMV mission was not established by these simulations. Fifteen pound thrusters recommended in the MSFC baseline design were found to be advantageous for initial rate matching maneuvers with unstabilized targets; however, lower thrust levels were desirable for making the final docking maneuvers.

  12. Reconstructing a Large-Scale Population for Social Simulation

    NASA Astrophysics Data System (ADS)

    Fan, Zongchen; Meng, Rongqing; Ge, Yuanzheng; Qiu, Xiaogang

    The advent of social simulation has provided an opportunity to research social systems. More and more researchers tend to describe the components of social systems at a more detailed level. Any simulation needs population data to initialize and drive the simulation systems. However, it is impossible to obtain data that provide full information about individuals and households. We propose a two-step method to reconstruct a large-scale population for a Chinese city according to Chinese culture. Firstly, a baseline population is generated by gathering individuals into households one by one; secondly, social relationships such as friendship are assigned to the baseline population. Through a case study, a population of 3,112,559 individuals gathered in 1,133,835 households is reconstructed for Urumqi city, and the results show that the generated data match the real data quite well. The generated data can be applied to support modeling of some social phenomena.
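
    The two-step method can be caricatured as follows; the household-size pool, population size and number of friendship links are illustrative assumptions, not the Urumqi data.

    ```python
    import random

    random.seed(1)
    household_size_pool = [1, 2, 3, 3, 4]        # assumed household-size distribution

    # Step 1: baseline population, gathering individuals into households one by one.
    individuals, households, person_id = [], [], 0
    while person_id < 10_000:
        size = random.choice(household_size_pool)
        members = list(range(person_id, person_id + size))
        households.append(members)
        individuals.extend(members)
        person_id += size

    # Step 2: assign social relationships (here, a few random friendship links each).
    friendships = set()
    for person in individuals:
        for friend in random.sample(individuals, k=3):
            if friend != person:
                friendships.add(tuple(sorted((person, friend))))

    print(len(individuals), "individuals in", len(households), "households,",
          len(friendships), "friendship links")
    ```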

  13. Emergence of nutrient limitation in tropical dry forests: hypotheses from simulation models

    NASA Astrophysics Data System (ADS)

    Medvigy, D.; Waring, B. G.; Xu, X.; Trierweiler, A.; Werden, L. K.; Wang, G.; Zhu, Q.; Powers, J. S.

    2017-12-01

    It is unclear to what extent tropical dry forest productivity may be limited by nutrients. Direct assessment of nutrient limitation through fertilization experiments has been rare, and paradigms pertaining to other ecosystems may not extend to tropical dry forests. For example, because dry tropical forests have a lower water supply than moist tropical forests, dry forests can have lower decomposition rates, higher soil carbon and nitrogen concentrations, and a more open nitrogen cycle than moist forests. We used a mechanistic, numerical model to generate hypotheses about nutrient limitation in tropical dry forests. The model dynamically couples ED2 (vegetation dynamics), MEND (biogeochemistry), and N-COM (plant-microbe competition for nutrients). Here, the MEND-component of the model has been extended to include nitrogen (N) and phosphorus (P) cycles. We focus on simulation of sixteen 25m x 25m plots in Costa Rica where a fertilization experiment has been underway since 2015. Baseline simulations are characterized by both nitrogen and phosphorus limitation of vegetation. Fertilization with N and P increased vegetation biomass, with N fertilization having a somewhat stronger effect. Nutrient limitation was also sensitive to climate and was more pronounced during drought periods. Overflow respiration was identified as a key process that mitigated nutrient limitation. These results suggest that, despite often having richer soils than tropical moist forests, tropical dry forests can also become nutrient-limited. If the climate becomes drier in the next century, as is expected for Central America, drier soils may decrease microbial activity and exacerbate nutrient limitation. The importance of overflow respiration underscores the need for appropriate treatment of microbial dynamics in ecosystem models. Ongoing and new nutrient fertilization experiments will present opportunities for testing whether, and how, nutrient limitation may indeed be emerging in tropical dry forests.

  14. Profile negotiation: An air/ground automation integration concept for managing arrival traffic

    NASA Technical Reports Server (NTRS)

    Williams, David H.; Arbuckle, P. Douglas; Green, Steven M.; Denbraven, Wim

    1993-01-01

    NASA Ames Research Center and NASA Langley Research Center conducted a joint simulation study to evaluate a profile negotiation process (PNP) between a time-based air traffic control ATC system and an airplane equipped with a four dimensional flight management system (4D FMS). Prototype procedures were developed to support the functional implementation of this process. The PNP was designed to provide an arrival trajectory solution that satisfies the separation requirements of ATC while remaining as close as possible to the airplane's preferred trajectory. The Transport Systems Research Vehicle cockpit simulator was linked in real-time to the Center/TRACON Automation System (CTAS) for the experiment. Approximately 30 hours of simulation testing were conducted over a three week period. Active airline pilot crews and active Center controller teams participated as test subjects. Results from the experiment indicate the potential for successful incorporation of airplane preferred arrival trajectories in the CTAS automation environment. Controllers were able to consistently and effectively negotiate nominally conflict-free trajectories with pilots flying a 4D-FMS-equipped airplane. The negotiated trajectories were substantially closer to the airplane's preference than would have otherwise been possible without the PNP. Airplane fuel savings relative to baseline CTAS were achieved in the test scenarios. The datalink procedures and clearances developed for this experiment, while providing the necessary functionality, were found to be operationally unacceptable to the pilots. Additional pilot control and understanding of the proposed airplane-preferred trajectory and a simplified clearance procedure were cited as necessary for operational implementation of the concept. From the controllers' perspective, the main concerns were the ability of the 4D airplane to accurately track the negotiated trajectory and the workload required to support the PNP as implemented in this study.

  15. Combining d-cycloserine with motor training does not result in improved general motor learning in neurologically intact people or in people with stroke

    PubMed Central

    Cherry, Kendra M.; Lenze, Eric J.

    2014-01-01

    Neurological rehabilitation involving motor training has resulted in clinically meaningful improvements in function but is unable to eliminate many of the impairments associated with neurological injury. Thus there is a growing need for interventions that facilitate motor learning during rehabilitation therapy, to optimize recovery. d-Cycloserine (DCS), a partial N-methyl-d-aspartate (NMDA) receptor agonist that enhances neurotransmission throughout the central nervous system (Ressler KJ, Rothbaum BO, Tannenbaum L, Anderson P, Graap K, Zimand E, Hodges L, Davis M. Arch Gen Psychiatry 61: 1136–1144, 2004), has been shown to facilitate declarative and emotional learning. We therefore tested whether combining DCS with motor training facilitates motor learning after stroke in a series of two experiments. Forty-one healthy adults participated in experiment I, and twenty adults with stroke participated in experiment II of this two-session, double-blind study. Session one consisted of baseline assessment, subject randomization, and oral administration of DCS or placebo (250 mg). Subjects then participated in training on a balancing task, a simulated feeding task, and a cognitive task. Subjects returned 1–3 days later for posttest assessment. We found that all subjects had improved performance from pretest to posttest on the balancing task, the simulated feeding task, and the cognitive task. Subjects who were given DCS before motor training, however, did not show enhanced learning on the balancing task, the simulated feeding task, or the associative recognition task compared with subjects given placebo. Moreover, training on the balancing task did not generalize to a similar, untrained balance task. Our findings suggest that DCS does not enhance motor learning or motor skill generalization in neurologically intact adults or in adults with stroke. PMID:24671538

  16. CORDEX Coordinated Output for Regional Evaluation

    NASA Astrophysics Data System (ADS)

    Gutowski, William; Giorgi, Filippo; Lake, Irene

    2017-04-01

    The Science Advisory Team for the Coordinated Regional Downscaling Experiment (CORDEX) has developed a baseline framework of specified regions, resolutions and simulation periods intended to provide a foundation for ongoing regional CORDEX activities: the CORDEX Coordinated Output for Regional Evaluation, or CORDEX-CORE. CORDEX-CORE was conceived in part to be responsive to IPCC needs for coordinated simulations that could provide regional climate downscaling (RCD) that yields fine-scale climate information beyond that resolved by GCMs. For each CORDEX region, a matrix of GCM-RCD experiments is designed based on the need to cover as much as possible different dimensions of the uncertainty space (e.g., different emissions and land-use scenarios, GCMs, RCD models and techniques). An appropriate set of driving GCMs can allow a program of simulations that efficiently addresses key scientific issues within CORDEX, while facilitating comparison and transfer of results and lessons learned across different regions. The CORDEX-CORE program seeks to provide, as much as possible, homogeneity across domains, so it is envisioned that a standard set of regional climate models (RCMs) and empirical statistical downscaling (ESD) methods will downscale a standard set of GCMs over all or at least most CORDEX domains for a minimum set of scenarios (high and low end). The focus is on historical climate simulations for the 20th century and projections for the 21st century, implying that data would be needed minimally for the period 1950-2100 (but ideally 1900-2100). This foundational ensemble can be regionally enriched with further contributions (additional GCM-RCD pairs) by individual groups over their selected domains of interest. The RCM resolution for these core experiments will be in the range of 10-20 km, a resolution that has been shown to provide substantial added value for a variety of climate variables and that represents a significant step forward for the CORDEX program. This presentation lays out the vision and structure of CORDEX-CORE while also soliciting discussion on plans for implementing the program.

  17. Development and Utility of a Piloted Flight Simulator for Icing Effects Training

    NASA Technical Reports Server (NTRS)

    Ratvasky, Thomas P.; Ranaudo, Richard J.; Barnhart, Billy P.; Dickes, Edward G.; Gingras, David R.

    2003-01-01

    A piloted flight simulator called the Ice Contamination Effects Flight Training Device (ICEFTD), which uses low-cost desktop components and a generic cockpit replication, is being developed. The purpose of this device is to demonstrate the effectiveness of its use for training pilots to recognize and recover from aircraft handling anomalies that result from airframe ice formations. High-fidelity flight simulation models for various baseline (non-iced) and iced configurations were developed from wind tunnel tests of a subscale DeHavilland DHC-6 Twin Otter aircraft model. These simulation models were validated with flight test data from the NASA Twin Otter Icing Research Aircraft, which included the effects of ice on wing and tail stall characteristics. These simulation models are being implemented into an ICEFTD that will provide representative aircraft characteristics due to airframe icing. Scenario-based exercises are being constructed to give an operational flavor to the simulation. Training pilots will learn to recognize iced aircraft characteristics from the baseline, and will practice and apply appropriate recovery procedures to a handling event.

  18. Pharmacy students' retention of knowledge and skills following training in automated external defibrillator use.

    PubMed

    Kopacek, Karen Birckelbaw; Dopp, Anna Legreid; Dopp, John M; Vardeny, Orly; Sims, J Jason

    2010-08-10

    To assess pharmacy students' retention of knowledge about appropriate automated external defibrillator use and counseling points following didactic training and simulated experience. Following a lecture on sudden cardiac arrest and automated external defibrillator use, second-year doctor of pharmacy (PharmD) students were assessed on their ability to perform basic life support and deliver a shock at baseline, 3 weeks, and 4 months. Students completed a questionnaire to evaluate recall of counseling points for laypeople/the public. Mean time to shock delivery at baseline was 74 ± 25 seconds, which improved significantly at 3 weeks (50 ± 17 seconds, p < 0.001) and was maintained at 4 months (47 ± 18 seconds, p < 0.001). Recall of all signs and symptoms of sudden cardiac arrest and automated external defibrillator counseling points was diminished after 4 months. Pharmacy students can use automated external defibrillators to quickly deliver a shock and are able to retain this ability after 4 months. Refresher training/courses will be required to improve students' retention of automated external defibrillator counseling points to ensure their ability to deliver appropriate patient education.

  19. Critical thinking of registered nurses in a fellowship program.

    PubMed

    Zori, Susan; Kohn, Nina; Gallo, Kathleen; Friedman, M Isabel

    2013-08-01

    Critical thinking is essential to nursing practice. This study examined differences in the critical thinking dispositions of registered nurses (RNs) in a nursing fellowship program. Control and experimental groups were used to compare differences in scores on the California Critical Thinking Disposition Inventory (CCTDI) of RNs at three points during a fellowship program: baseline, week 7, and month 5. The control group consisted of RNs who received no education in critical thinking. The experimental group received education in critical thinking using simulated scenarios and reflective journaling. CCTDI scores examined with analysis of variance showed no significant difference within groups over time or between groups. The baseline scores of the experimental group were slightly higher than those of the control group. Chi-square analysis of demographic variables between the two groups showed no significant differences. Critical thinking dispositions are a combination of attitudes, values, and beliefs that make up one's personality based on life experience. Lack of statistical significance using a quantitative approach did not capture the development of the critical thinking dispositions of participants. A secondary qualitative analysis of journal entries is being conducted. Copyright 2013, SLACK Incorporated.

  20. Lasting depression in corticomotor excitability associated with local scalp cooling.

    PubMed

    Tremblay, François; Remaud, Anthony; Mekonnen, Abeye; Gholami-Boroujeny, Shiva; Racine, Karl-Édouard; Bolic, Miodrag

    2015-07-23

    In this study, we investigated the effect of local scalp cooling on corticomotor excitability with transcranial magnetic stimulation (TMS). Participants (healthy male adults, n=12) were first assessed with TMS to derive a baseline measure of excitability from motor evoked potentials (MEPs) using the right first dorsal interosseous as the target muscle. Then, local cooling was induced on the right hemi-scalp (upper frontal region, ~15 cm²) by means of a cold wrap. The cooling was maintained for 10-15 min to obtain a decrease of at least 10°C from baseline temperature. In the post-cooling period, both scalp temperature and MEPs were reassessed at specific time intervals (i.e., T0, T10, T20 and T30 min). Scalp surface temperatures dropped on average by 12.5°C from baseline at T0 (p<0.001) with partial recovery at T10 (p<0.05) and full recovery at T20. Parallel analysis of post-cooling variations in MEP amplitude revealed significant reductions relative to baseline at T0, T10 and T20. No concurrent change in MEP latency was observed. A secondary control experiment was performed in a subset of participants (n=5) to account for the mild discomfort associated with the wrapping procedure without the cooling agent. Results showed no effect on any of the dependent variables (temperature, MEP amplitude and latency). To our knowledge, this report provides the first neurophysiological evidence linking changes in scalp temperature with lasting changes in corticomotor excitability. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  1. The Problem of Controlling for Imperfectly Measured Confounders on Dissimilar Populations: A Database Simulation Study

    PubMed Central

    Schonberger, Robert B; Gilbertsen, Todd; Dai, Feng

    2013-01-01

    Objective(s) Observational database research frequently relies on imperfect administrative markers to determine comorbid status, and it is difficult to infer to what extent the associated misclassification impacts validity in multivariable analyses. The effect that imperfect markers of disease will have on validity in situations where researchers attempt to match populations that have strong baseline health differences is underemphasized as a limitation in some otherwise high-quality observational studies. The present simulations were designed as a quantitative demonstration of the importance of this common and underappreciated issue. Design Two groups of Monte Carlo simulations were performed. The first demonstrated the degree to which controlling for a series of imperfect markers of disease between different populations taking 2 hypothetically harmless drugs would lead to spurious associations between drug assignment and mortality. The second Monte Carlo simulation applied this principle to a recent study in the field of anesthesiology that purported to show increased perioperative mortality in patients taking metoprolol versus atenolol. Setting/Participants/Interventions None. Measurements and Main Results Simulation 1: High type 1 error (ie, false positive findings of an independent association between drug assignment and mortality) was observed as sensitivity and specificity declined and as systematic differences in disease prevalence increased. Simulation 2: Propensity score matching across several imperfect markers was unlikely to eliminate important baseline health disparities in the referenced study. Conclusions In situations where large baseline health disparities exist between populations, matching on imperfect markers of disease may result in strong bias away from the null hypothesis. PMID:23962461
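
    As an illustration of the first simulation idea (not the authors' code), the following minimal Python sketch lets a true comorbidity drive both drug choice and mortality, records it only through a marker with imperfect sensitivity and specificity, and shows that adjusting for that marker still leaves an apparent "drug effect"; all variable names and parameter values are illustrative assumptions.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 50_000
        disease = rng.binomial(1, 0.3, n)                            # true comorbidity
        drug_b = rng.binomial(1, np.where(disease == 1, 0.7, 0.3))   # sicker patients get drug B
        death = rng.binomial(1, np.where(disease == 1, 0.20, 0.02))  # the drug itself is harmless
        # administrative marker: sensitivity 0.60, specificity 0.95
        marker = np.where(disease == 1, rng.binomial(1, 0.60, n), rng.binomial(1, 0.05, n))

        X = sm.add_constant(np.column_stack([drug_b, marker]))
        fit = sm.Logit(death, X).fit(disp=0)
        print(fit.params[1])   # log-odds for drug B stays well above zero despite no true effect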

  2. Simulation of turbulent separated flows using a novel, evolution-based, eddy-viscosity formulation

    NASA Astrophysics Data System (ADS)

    Castellucci, Paul

    Currently, there exists a lack of confidence in the computational simulation of turbulent separated flows at large Reynolds numbers. The most accurate methods available are too computationally costly to use in engineering applications. Thus, inexpensive models, developed using the Reynolds-averaged Navier-Stokes (RANS) equations, are often extended beyond their applicability. Although these methods will often reproduce integrated quantities within engineering tolerances, such metrics are often insensitive to details within a separated wake, and therefore, poor indicators of simulation fidelity. Using concepts borrowed from large-eddy simulation (LES), a two-equation RANS model is modified to simulate the turbulent wake behind a circular cylinder. This modification involves the computation of one additional scalar field, adding very little to the overall computational cost. When properly inserted into the baseline RANS model, this modification mimics LES in the separated wake, yet reverts to the unmodified form at the cylinder surface. In this manner, superior predictive capability may be achieved without the additional cost of fine spatial resolution associated with LES near solid boundaries. Simulations using modified and baseline RANS models are benchmarked against both LES and experimental data for a circular cylinder wake at Reynolds number 3900. In addition, the computational tool used in this investigation is subject to verification via the Method of Manufactured Solutions. Post-processing of the resultant flow fields includes both mean value and triple-decomposition analysis. These results reveal substantial improvements using the modified system and appear to drive the baseline wake solution toward that of LES, as intended.

  3. Neutrino parameters from reactor and accelerator neutrino experiments

    NASA Astrophysics Data System (ADS)

    Lindner, Manfred; Rodejohann, Werner; Xu, Xun-Jie

    2018-04-01

    We revisit correlations of neutrino oscillation parameters in reactor and long-baseline neutrino oscillation experiments. A framework based on an effective value of θ13 is presented, which can be used to analytically study the correlations and address several questions, including why and when δCP has the best-fit value of -π/2, and why current and future long-baseline experiments will have less precision on δCP around ±π/2 than around zero. Recent hints on the CP phase are then considered from the point of view that different reactor and long-baseline neutrino experiments currently provide different best-fit values of θ13 and θ23. We point out that the significance of the hints changes for the different available best-fit values.
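
    For orientation, a hedged sketch of the leading-order vacuum appearance probability that underlies such long-baseline analyses is given below; the parameter values are illustrative placeholders, and the CP-violating and matter-effect terms discussed in the paper are deliberately omitted.

        import numpy as np

        def p_mue_leading(L_km, E_GeV, sin2_2th13=0.085, sin2_th23=0.5, dm31_sq_eV2=2.5e-3):
            """Leading-order nu_mu -> nu_e appearance probability in vacuum."""
            phase = 1.267 * dm31_sq_eV2 * L_km / E_GeV   # units: eV^2, km, GeV
            return sin2_2th13 * sin2_th23 * np.sin(phase) ** 2

        print(p_mue_leading(L_km=295.0, E_GeV=0.6))   # roughly a few percent near the first maximum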

  4. Patient Simulation for Assessment of Layperson Management of Opioid Overdose with Intranasal Naloxone in a Recently-Released Prisoner Cohort

    PubMed Central

    Kobayashi, Leo; Green, Traci C.; Bowman, Sarah E.; Ray, Madeline C.; McKenzie, Michelle S.; Rich, Josiah D.

    2016-01-01

    Introduction Investigators applied simulation to an experimental program that educated, trained and assessed at-risk, volunteering prisoners on opioid overdose (OD) prevention, recognition and layperson management with intranasal (IN) naloxone. Methods Consenting inmates were assessed for OD-related experience and knowledge then exposed on-site to standardized didactics and educational DVD (without simulation). Subjects were provided with IN naloxone kits at time of release and scheduled for post-release assessment. At follow-up, subjects were evaluated for their performance of layperson opioid OD resuscitative skills during video-recorded simulations. Two investigators independently scored each subject’s resuscitative actions with a 21-item checklist; post-hoc video reviews were separately completed to adjudicate subjects’ interactions for overall benefit or harm. Results One hundred and three prisoners completed the baseline assessment and study intervention then were prescribed IN naloxone kits. One-month follow-up and simulation data were available for 85 subjects (82.5% of trained recruits) who had been released and resided in the community. Subjects’ simulation checklist median score was 12.0 (IQR 11.0–15.0) out of 21 total indicated actions. Forty-four participants (51.8%) correctly administered naloxone; 16 additional subjects (18.8%) suboptimally administered naloxone. Non-indicated actions, primarily chest compressions, were observed in 49.4% of simulations. Simulated resuscitative actions by 80 subjects (94.1%) were determined post-hoc to be beneficial overall for patients overdosing on opioids. Conclusions As part of an opioid OD prevention research program for at-risk inmates, investigators applied simulation to 1-month follow-up assessments of knowledge retention and skills acquisition in post-release participants. Simulation supplemented traditional research tools for investigation of layperson OD management. PMID:28146450

  5. On Simulating the Mid-western-us Drought of 1988 with a GCM

    NASA Technical Reports Server (NTRS)

    Sud, Y. C.; Mocko, D. M.; Lau, William K.-M.; Atlas, R.

    2002-01-01

    The primary cause of the midwestern North American drought in the summer of 1988 has been identified to be the La Nina SST anomalies. Yet with the SST anomalies prescribed, this drought has not been simulated satisfactorily by any general circulation model. Seven simulation experiments, each containing an ensemble of four sets of simulations, were conducted with the GEOS GCM for both 1987 and 1988. All simulations started from January 1 and continued through the end of August. In the first baseline case, Case 1, only the SST anomalies and some vegetation parameters were prescribed, while everything else (such as soil moisture, snow-cover, and clouds) was interactive. The GCM did produce some of the circulation features of a drought over North America, but they could only be identified on the planetary scales. The 1988 minus 1987 precipitation differences show that the GCM was successful in simulating reduced precipitation in the mid-west, but the accompanying circulation anomalies were not well simulated, leading one to infer that the GCM simulated the drought for the wrong reason. To isolate the causes of this unremarkable circulation, analyzed winds and soil moisture were prescribed in Case 2 and Case 3 as continuous updates by direct replacement of the GCM-predicted fields. These cases show that a large number of simulation biases emanate from wind biases that are carried into the North American region from surrounding regions. Prescribing soil moisture also helps to ameliorate the strong feedback between soil moisture and precipitation, which is perhaps even stronger in the model than in the real atmosphere. Case 2 simulated one type of surface temperature anomaly pattern, whereas Case 3, with the prescribed soil moisture, produced another.

  6. Enhancing student communication during end-of-life care: A pilot study.

    PubMed

    Bloomfield, Jacqueline G; O'Neill, Bernadette; Gillett, Karen

    2015-12-01

    Quality end-of-life care requires effective communication skills, yet medical and nursing students report limited opportunities to develop these skills and a lack of confidence and related competence. Our purpose was to design, implement, and evaluate an educational intervention employing simulated patient actors to enhance students' abilities to communicate with dying patients and their families. A study employing a mixed-methods design was conducted with prequalification nursing and medical students recruited from a London university. The first phase involved focus groups with students, which informed the development of an educational intervention involving simulated patient actors. Questionnaires measuring students' perceptions of confidence and competence levels when communicating with dying patients and their families were administered before and after the intervention. The themes from the focus groups related to responding to grief and anger, difficulties dealing with emotions, knowing the "right thing" to say, and a lack of experience. A significant increase (p < 0.05) in competence and confidence from baseline levels followed participation in the simulated scenarios. Simulation was found to be an effective means of preparing students to communicate with dying patients and their families. The opportunity to develop communication skills was valued. Integration of educational interventions employing simulated patient actors into nursing and medical curricula may assist in improving the care provided to patients at the end of life.

  7. Modeling the effects of snowpack on heterotrophic respiration across northern temperate and high latitude regions: Comparison with measurements of atmospheric carbon dioxide in high latitudes

    USGS Publications Warehouse

    McGuire, A.D.; Melillo, J.M.; Randerson, J.T.; Parton, W.J.; Heimann, Martin; Meier, R.A.; Clein, Joy S.; Kicklighter, D.W.; Sauf, W.

    2000-01-01

    Simulations by global terrestrial biogeochemical models (TBMs) consistently underestimate the concentration of atmospheric carbon dioxide (CO2) at high latitude monitoring stations during the nongrowing season. We hypothesized that heterotrophic respiration is underestimated during the nongrowing season primarily because TBMs do not generally consider the insulative effects of snowpack on soil temperature. To evaluate this hypothesis, we compared the performance of baseline and modified versions of three TBMs in simulating the seasonal cycle of atmospheric CO2 at high latitude CO2 monitoring stations; the modified version maintained soil temperature at 0 °C when modeled snowpack was present. The three TBMs include the Carnegie-Ames-Stanford Approach (CASA), Century, and the Terrestrial Ecosystem Model (TEM). In comparison with the baseline simulation of each model, the snowpack simulations caused higher releases of CO2 between November and March and greater uptake of CO2 between June and August for latitudes north of 30°N. We coupled the monthly estimates of CO2 exchange, the seasonal carbon dioxide flux fields generated by the HAMOCC3 seasonal ocean carbon cycle model, and fossil fuel source fields derived from standard sources to the three-dimensional atmospheric transport model TM2 forced by observed winds to simulate the seasonal cycle of atmospheric CO2 at each of seven high latitude monitoring stations. In comparison to the CO2 concentrations simulated with the baseline fluxes of each TBM, concentrations simulated using the snowpack fluxes are generally in better agreement with observed concentrations between August and March at each of the monitoring stations. Thus, representation of the insulative effects of snowpack in TBMs generally improves simulation of atmospheric CO2 concentrations in high latitudes during both the late growing season and nongrowing season. These simulations highlight the global importance of biogeochemical processes during the nongrowing season in estimating carbon balance of ecosystems in northern high and temperate latitudes.
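
    The snowpack modification described above amounts to a one-line rule; a hedged sketch (function name and inputs are illustrative, not taken from any of the three TBMs) is:

        def effective_soil_temperature(modeled_soil_temp_c, snowpack_present):
            """Modified-TBM rule: hold soil temperature at 0 degrees C whenever snowpack is present."""
            return 0.0 if snowpack_present else modeled_soil_temp_c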

  8. Higher mental workload is associated with poorer laparoscopic performance as measured by the NASA-TLX tool.

    PubMed

    Yurko, Yuliya Y; Scerbo, Mark W; Prabhu, Ajita S; Acker, Christina E; Stefanidis, Dimitrios

    2010-10-01

    Increased workload during task performance may increase fatigue and facilitate errors. The National Aeronautics and Space Administration-Task Load Index (NASA-TLX) is a previously validated tool for workload self-assessment. We assessed the relationship of workload and performance during simulator training on a complex laparoscopic task. NASA-TLX workload data from three separate trials were analyzed. All participants were novices (n = 28), followed the same curriculum on the fundamentals of laparoscopic surgery suturing model, and were tested in the animal operating room (OR) on a Nissen fundoplication model after training. Performance and workload scores were recorded at baseline, after proficiency achievement, and during the test. Performance, NASA-TLX scores, and inadvertent injuries during the test were analyzed and compared. Workload scores declined during training and mirrored performance changes. NASA-TLX scores correlated significantly with performance scores (r = -0.5, P < 0.001). Participants with higher workload scores caused more inadvertent injuries to adjacent structures in the OR (r = 0.38, P < 0.05). Increased mental and physical workload scores at baseline correlated with higher workload scores in the OR (r = 0.52-0.82; P < 0.05) and more inadvertent injuries (r = 0.52, P < 0.01). Increased workload is associated with inferior task performance and higher likelihood of errors. The NASA-TLX questionnaire accurately reflects workload changes during simulator training and may identify individuals more likely to experience high workload and more prone to errors during skill transfer to the clinical environment.

  9. Quantifying Intrinsic Variability of Sagittarius A* Using Closure Phase Measurements of the Event Horizon Telescope

    NASA Astrophysics Data System (ADS)

    Roelofs, Freek; Johnson, Michael D.; Shiokawa, Hotaka; Doeleman, Sheperd S.; Falcke, Heino

    2017-09-01

    General relativistic magnetohydrodynamic (GRMHD) simulations of accretion disks and jets associated with supermassive black holes show variability on a wide range of timescales. On timescales comparable to or longer than the gravitational timescale t_G = GM/c^3, variation may be dominated by orbital dynamics of the inhomogeneous accretion flow. Turbulent evolution within the accretion disk is expected on timescales comparable to the orbital period, typically an order of magnitude larger than t_G. For Sgr A*, t_G is much shorter than the typical duration of a VLBI experiment, enabling us to study this variability within a single observation. Closure phases, the sum of interferometric visibility phases on a triangle of baselines, are particularly useful for studying this variability. In addition to a changing source structure, variations in observed closure phase can also be due to interstellar scattering, thermal noise, and the changing geometry of projected baselines over time due to Earth rotation. We present a metric that is able to distinguish the latter two from intrinsic or scattering variability. This metric is validated using synthetic observations of GRMHD simulations of Sgr A*. When applied to existing multi-epoch EHT data of Sgr A*, this metric shows that the data are most consistent with source models containing intrinsic variability from source dynamics, interstellar scattering, or a combination of those. The effects of black hole inclination, orientation, spin, and morphology (disk or jet) on the expected closure phase variability are also discussed.
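
    The closure-phase definition used above (the sum of visibility phases around a baseline triangle) can be written as a short helper; this is a generic sketch, not the authors' pipeline, and it assumes the phases are supplied in radians.

        import numpy as np

        def closure_phase(phi_12, phi_23, phi_31):
            """Sum the three visibility phases on a triangle and wrap to (-pi, pi];
            station-based phase errors cancel in this combination."""
            return np.angle(np.exp(1j * (phi_12 + phi_23 + phi_31)))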

  10. A Novel Method to Simulate the Progression of Collagen Degeneration of Cartilage in the Knee: Data from the Osteoarthritis Initiative

    NASA Astrophysics Data System (ADS)

    Mononen, Mika E.; Tanska, Petri; Isaksson, Hanna; Korhonen, Rami K.

    2016-02-01

    We present a novel algorithm combined with computational modeling to simulate the development of knee osteoarthritis. The degeneration algorithm was based on excessive, cumulatively accumulated stresses within knee joint cartilage during physiological gait loading. In the algorithm, the collagen network stiffness of cartilage was reduced iteratively if excessive maximum principal stresses were observed. The developed algorithm was tested and validated against experimental baseline and 4-year follow-up Kellgren-Lawrence grades, indicating different levels of cartilage degeneration at the tibiofemoral contact region. Test groups consisted of normal weight and obese subjects of the same gender and similar age and height without osteoarthritic changes. The algorithm accurately simulated cartilage degeneration as compared to the Kellgren-Lawrence findings in the subject group with excess weight, while the healthy subject group's joint remained intact. Furthermore, the developed algorithm followed the experimentally found trend of cartilage degeneration in the obese group (R² = 0.95, p < 0.05 experiments vs. model), in which the rapid degeneration immediately after initiation of osteoarthritis (0-2 years, p < 0.001) was followed by a slow or negligible degeneration (2-4 years, p > 0.05). The proposed algorithm shows great potential to objectively simulate the progression of knee osteoarthritis.
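
    A minimal sketch of one iteration of the degeneration rule described above is shown below; the stress threshold and stiffness-reduction step are illustrative placeholders rather than the published values, and the arrays stand in for element-wise finite element fields.

        import numpy as np

        def degenerate_collagen(stiffness, max_principal_stress, threshold_mpa=7.0, step=0.05):
            """Where the maximum principal stress exceeds the threshold, reduce the local
            collagen network stiffness by a fixed fraction; elsewhere leave it unchanged."""
            overstressed = max_principal_stress > threshold_mpa
            return np.where(overstressed, stiffness * (1.0 - step), stiffness)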

  11. The Value of Interrupted Time-Series Experiments for Community Intervention Research

    PubMed Central

    Biglan, Anthony; Ary, Dennis; Wagenaar, Alexander C.

    2015-01-01

    Greater use of interrupted time-series experiments is advocated for community intervention research. Time-series designs enable the development of knowledge about the effects of community interventions and policies in circumstances in which randomized controlled trials are too expensive, premature, or simply impractical. The multiple baseline time-series design typically involves two or more communities that are repeatedly assessed, with the intervention introduced into one community at a time. It is particularly well suited to initial evaluations of community interventions and the refinement of those interventions. This paper describes the main features of multiple baseline designs and related repeated-measures time-series experiments, discusses the threats to internal validity in multiple baseline designs, and outlines techniques for statistical analyses of time-series data. Examples are given of the use of multiple baseline designs in evaluating community interventions and policy changes. PMID:11507793
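
    One common statistical treatment of such interrupted time-series data is segmented regression, with terms for the pre-intervention trend, the level change, and the slope change; the sketch below uses simulated monthly data and illustrative effect sizes, not data from any study cited here.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        t = np.arange(24)                       # 24 monthly observations
        post = (t >= 12).astype(float)          # intervention introduced after month 12
        y = 10 + 0.2 * t - 3.0 * post + rng.normal(0, 1, 24)

        # columns: pre-intervention trend, level shift at intervention, post-intervention slope change
        X = sm.add_constant(np.column_stack([t, post, post * (t - 12)]))
        print(sm.OLS(y, X).fit().params)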

  12. Systematic errors in long baseline oscillation experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Deborah A.; /Fermilab

    This article gives a brief overview of long baseline neutrino experiments and their goals, and then describes the different kinds of systematic errors that are encountered in these experiments. Particular attention is paid to the uncertainties that come about because of imperfect knowledge of neutrino cross sections and more generally how neutrinos interact in nuclei. Near detectors are planned for most of these experiments, and the extent to which certain uncertainties can be reduced by the presence of near detectors is also discussed.

  13. A novel approach for baseline correction in 1H-MRS signals based on ensemble empirical mode decomposition.

    PubMed

    Parto Dezfouli, Mohammad Ali; Dezfouli, Mohsen Parto; Rad, Hamidreza Saligheh

    2014-01-01

    Proton magnetic resonance spectroscopy ((1)H-MRS) is a non-invasive diagnostic tool for measuring biochemical changes in the human body. Acquired (1)H-MRS signals may be corrupted due to a wideband baseline signal generated by macromolecules. Recently, several methods have been developed for the correction of such baseline signals, however most of them are not able to estimate baseline in complex overlapped signal. In this study, a novel automatic baseline correction method is proposed for (1)H-MRS spectra based on ensemble empirical mode decomposition (EEMD). This investigation was applied on both the simulated data and the in-vivo (1)H-MRS of human brain signals. Results justify the efficiency of the proposed method to remove the baseline from (1)H-MRS signals.
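
    A hedged sketch of the EEMD-based idea, assuming the third-party PyEMD package provides the decomposition, is shown below; treating the slowest intrinsic mode functions plus the residue as the wideband baseline is one plausible reading of the approach, and the number of slow components kept is an illustrative choice.

        import numpy as np
        from PyEMD import EEMD   # assumes the PyEMD package is installed

        def remove_baseline_eemd(spectrum, n_slow=2, trials=100):
            """Decompose the 1D spectrum into IMFs with EEMD and subtract the sum of the
            slowest n_slow components as the estimated macromolecular baseline."""
            imfs = EEMD(trials=trials).eemd(np.asarray(spectrum, dtype=float))
            baseline = imfs[-n_slow:].sum(axis=0)
            return spectrum - baseline, baseline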

  14. Electrogastrographic and autonomic responses during oculovestibular recoupling in flight simulation.

    PubMed

    Cevette, Michael J; Pradhan, Gaurav N; Cocco, Daniela; Crowell, Michael D; Galea, Anna M; Bartlett, Jennifer; Stepanek, Jan

    2014-01-01

    Simulator sickness causes vestibulo-autonomic responses that increase sympathetic activity and decrease parasympathetic activity. The purpose of the study was to quantify these responses through electrogastrography and cardiac interbeat intervals during flight simulation. There were 29 subjects that were randomly assigned to 2 parallel arms: (1) oculovestibular recoupling, where galvanic vestibular stimulation was synchronous with the visual field; and (2) control. Electrogastrography and interbeat interval data were collected during baseline, simulation, and post-simulation periods. A simulator sickness questionnaire was administered. Statistically significant differences were observed in percentage of recording time with the dominant frequency of electrogastrography in normogastric and bradygastric domains between the oculovestibular recoupling and control groups. Normogastria was dominant during simulation in the oculovestibular recoupling group. In the control group, the percentage of recording time with the dominant frequency decreased by 22% in normogastria and increased by 20% in bradygastria. The percentage change of the dominant power instability coefficient from baseline to simulation was 26% in the oculovestibular recoupling group vs. 108% in the control group. The power of high-frequency components for interbeat intervals did not change significantly in the oculovestibular recoupling group and was decreased during simulation in the control group. Electrogastrography and interbeat intervals are sensitive indices of autonomic changes in subjects undergoing flight simulation. These data demonstrate the potential of oculovestibular recoupling to stabilize gastric activity and cardiac autonomic changes altered during simulator and motion sickness.

  15. Pile-Up Discrimination Algorithms for the HOLMES Experiment

    NASA Astrophysics Data System (ADS)

    Ferri, E.; Alpert, B.; Bennett, D.; Faverzani, M.; Fowler, J.; Giachero, A.; Hays-Wehle, J.; Maino, M.; Nucciotti, A.; Puiu, A.; Ullom, J.

    2016-07-01

    The HOLMES experiment is a new large-scale experiment for the electron neutrino mass determination by means of the electron capture decay of 163Ho. In such an experiment, random coincidence events are one of the main sources of background which impair the ability to identify the effect of a non-vanishing neutrino mass. In order to resolve these spurious events, detectors characterized by a fast response are needed, as well as pile-up recognition algorithms. For that reason, we have developed a code for testing the discrimination efficiency of various algorithms in recognizing pile-up events as a function of the time separation between two pulses. The tests are performed on simulated realistic TES signals and noise. Indeed, the pulse profile is obtained by solving the two coupled differential equations which describe the response of the TES according to the Irwin-Hilton model. To these pulses, a noise waveform which takes into account all the noise sources regularly present in a real TES is added. The amplitude of the generated pulses is distributed as the 163Ho calorimetric spectrum. Furthermore, the rise time of these pulses has been chosen taking into account the constraints given by both the bandwidth of the microwave multiplexing readout with flux ramp demodulation and the bandwidth of the ADC boards currently available for ROACH2. Among the different rejection techniques evaluated, the Wiener filter technique, a digital filter to gain time resolution, has shown an excellent pile-up rejection efficiency. The obtained time resolution closely matches the baseline specifications of the HOLMES experiment. We report here a description of our simulation code and a comparison of the different rejection techniques.
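
    As a generic illustration of the optimal-filtering idea mentioned above (not the HOLMES code), a frequency-domain Wiener-style filter can be built from a pulse template and a noise power spectrum; the record and template are assumed to have the same length, and the noise spectrum is assumed to be sampled on the same rfft frequency grid.

        import numpy as np

        def wiener_filter(template_fft, noise_psd):
            """H(f) = S*(f) / (|S(f)|^2 + N(f)): weights frequencies by signal-to-noise."""
            return np.conj(template_fft) / (np.abs(template_fft) ** 2 + noise_psd)

        def filter_record(record, template, noise_psd):
            """Apply the Wiener-style filter to one digitized record."""
            H = wiener_filter(np.fft.rfft(template), noise_psd)
            return np.fft.irfft(H * np.fft.rfft(record), n=len(record))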

  16. Effects of Geometric Details on Slat Noise Generation and Propagation

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Lockard, David P.

    2006-01-01

    The relevance of geometric details to the generation and propagation of noise from leading-edge slats is considered. Typically, such details are omitted in computational simulations and model-scale experiments thereby creating ambiguities in comparisons with acoustic results from flight tests. The current study uses two-dimensional, computational simulations in conjunction with a Ffowcs Williams-Hawkings (FW-H) solver to investigate the effects of previously neglected slat "bulb" and "blade" seals on the local flow field and the associated acoustic radiation. The computations clearly show that the presence of the "blade" seal at the cusp significantly changes the slat cove flow dynamics, reduces the amplitudes of the radiated sound, and to a lesser extent, alters the directivity beneath the airfoil. Furthermore, it is demonstrated that a modest extension of the baseline "blade" seal further enhances the suppression of slat noise. As a side issue, the utility and equivalence of FW-H methodology for calculating far-field noise as opposed to a more direct approach is examined and demonstrated.

  17. Multi-objective component sizing of a power-split plug-in hybrid electric vehicle powertrain using Pareto-based natural optimization machines

    NASA Astrophysics Data System (ADS)

    Mozaffari, Ahmad; Vajedi, Mahyar; Chehresaz, Maryyeh; Azad, Nasser L.

    2016-03-01

    The urgent need to meet increasingly tight environmental regulations and new fuel economy requirements has motivated system science researchers and automotive engineers to take advantage of emerging computational techniques to further advance hybrid electric vehicle and plug-in hybrid electric vehicle (PHEV) designs. In particular, research has focused on vehicle powertrain system design optimization, to reduce the fuel consumption and total energy cost while improving the vehicle's driving performance. In this work, two different natural optimization machines, namely the synchronous self-learning Pareto strategy and the elitism non-dominated sorting genetic algorithm, are implemented for component sizing of a specific power-split PHEV platform with a Toyota plug-in Prius as the baseline vehicle. To do this, a high-fidelity model of the Toyota plug-in Prius is employed for the numerical experiments using the Autonomie simulation software. Based on the simulation results, it is demonstrated that Pareto-based algorithms can successfully optimize the design parameters of the vehicle powertrain.
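
    Independent of the specific optimizers named above, the core of any Pareto-based sizing study is identifying non-dominated designs; the sketch below extracts the Pareto front of a toy two-objective (e.g., fuel consumption versus energy cost) minimization problem and is purely illustrative.

        import numpy as np

        def pareto_front_mask(objectives):
            """Return a boolean mask of non-dominated rows for minimization.
            objectives: array of shape (n_designs, n_objectives)."""
            n = objectives.shape[0]
            efficient = np.ones(n, dtype=bool)
            for i in range(n):
                others = np.delete(objectives, i, axis=0)
                dominated = np.any(np.all(others <= objectives[i], axis=1) &
                                   np.any(others < objectives[i], axis=1))
                efficient[i] = not dominated
            return efficient

        designs = np.random.default_rng(2).random((200, 2))   # toy objective values
        front = designs[pareto_front_mask(designs)]
        print(len(front), "non-dominated designs")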

  18. Instantaneous Real-Time Kinematic Decimeter-Level Positioning with BeiDou Triple-Frequency Signals over Medium Baselines.

    PubMed

    He, Xiyang; Zhang, Xiaohong; Tang, Long; Liu, Wanke

    2015-12-22

    Many applications, such as marine navigation, land vehicle location, etc., require real-time precise positioning under medium or long baseline conditions. In this contribution, we develop a model of real-time kinematic decimeter-level positioning with BeiDou Navigation Satellite System (BDS) triple-frequency signals over medium distances. The ambiguities of two extra-wide-lane (EWL) combinations are fixed first, and then a wide-lane (WL) combination is reformed based on the two EWL combinations for positioning. Theoretical and empirical analyses are given of the ambiguity fixing rate and the positioning accuracy of the presented method. The results indicate that the ambiguity fixing rate can be more than 98% when using BDS medium baseline observations, which is much higher than that of the dual-frequency Hatch-Melbourne-Wübbena (HMW) method. As for positioning accuracy, decimeter-level accuracy can be achieved with this method, which is comparable to that of the carrier-smoothed code differential positioning method. A signal interruption simulation experiment indicates that the proposed method can realize fast high-precision positioning, whereas the carrier-smoothed code differential positioning method needs several hundred seconds to obtain high-precision results. We conclude that a relatively high accuracy and high fixing rate can be achieved for the triple-frequency WL method with single-epoch observations, displaying a significant advantage over the traditional carrier-smoothed code differential positioning method.
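
    The reason extra-wide-lane ambiguities are easy to fix is their long effective wavelength; the sketch below computes the wavelengths of the between-frequency combinations from nominal BDS-2 B1I/B2I/B3I carrier frequencies (quoted here as commonly published values, so treat them as assumptions rather than authoritative constants).

        C = 299_792_458.0                                        # speed of light, m/s
        F_B1, F_B2, F_B3 = 1561.098e6, 1207.140e6, 1268.520e6    # nominal BDS-2 frequencies, Hz

        def combination_wavelength(fa, fb):
            """Wavelength (m) of the simple (1, -1) between-frequency combination."""
            return C / abs(fa - fb)

        print(combination_wavelength(F_B2, F_B3))   # extra-wide lane, roughly 4.9 m
        print(combination_wavelength(F_B1, F_B2))   # wide lane, roughly 0.85 m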

  19. CP-invariance violation at short-baseline experiments in 3+1 neutrino scenarios

    NASA Astrophysics Data System (ADS)

    de Gouvêa, André; Kelly, Kevin J.; Kobach, Andrew

    2015-03-01

    New neutrino degrees of freedom allow for more sources of charge-parity (CP) invariance violation (CPV). We explore the requirements for accessing CP-odd mixing parameters in the so-called 3+1 scenario, where one assumes the existence of one extra, mostly sterile neutrino degree of freedom, heavier than the other three mass eigenstates. As a first step, we concentrate on the νe→νμ appearance channel in a hypothetical, upgraded version of the νSTORM proposal. We establish that the optimal baseline for CPV studies depends strongly on the value of Δm²14 (the new mass-squared difference), and that the ability to observe CPV depends significantly on whether the experiment is performed at the optimal baseline. Even at the optimal baseline, it is very challenging to see CPV in 3+1 scenarios if one considers only one appearance channel. Full exploration of CPV in short-baseline experiments will require precision measurements of tau appearance, a challenge significantly beyond what is currently being explored by the experimental neutrino community.

  20. Does early-life family income influence later dental pain experience? A prospective 14-year study.

    PubMed

    Ghorbani, Z; Peres, M A; Liu, P; Mejia, G C; Armfield, J M; Peres, K G

    2017-12-01

    The aim of this study was to investigate the association between early-life family income and dental pain experience from childhood to early adulthood. Data came from a 14-year prospective study (1991/1992-2005/2006) carried out in South Australia, which included children and adolescents aged 4-17 years (N = 9875) at baseline. The outcome was dental pain experience obtained at baseline, 14 years later in adulthood and at a middle point of time. The main explanatory variable was early-life family income collected at baseline. The prevalence of dental pain was 22.8% at baseline, 19.3% at 'middle time' and 39.3% at follow up. The proportion of people classified as 'poor' at baseline was 27.7%. Being poor early in life was significantly associated with dental pain at 14-year follow up (odds ratio = 1.45; 95% confidence interval = 1.27-1.66). Early-life relative poverty is associated with more frequent dental pain across the 14-year follow up and may be a key exposure variable for later dental conditions. © 2017 Australian Dental Association.

  1. Recent results from the OPERA experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meregaglia, Anselmo

    2009-04-17

    OPERA is a long-baseline neutrino oscillation experiment whose main goal is to detect for the first time neutrino oscillations in an appearance mode. Using an almost pure νμ beam, we search for a νμ → ντ transition, detecting the τ lepton in a direct way. The detector is located on the high-energy, long-baseline CERN to LNGS beam (CNGS) at a baseline of 730 km. The apparatus consists of a target made of lead/emulsion-film bricks and of electronic detectors which are used to tag the neutrino interaction. The experiment description and results from the short but fruitful 2007 CNGS run are reported in detail.

  2. Assessment of Radiative Heating Uncertainty for Hyperbolic Earth Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Mazaheri, Alireza; Gnoffo, Peter A.; Kleb, W. L.; Sutton, Kenneth; Prabhu, Dinesh K.; Brandis, Aaron M.; Bose, Deepak

    2011-01-01

    This paper investigates the shock-layer radiative heating uncertainty for hyperbolic Earth entry, with the main focus being a Mars return. In Part I of this work, a baseline simulation approach involving the LAURA Navier-Stokes code with coupled ablation and radiation is presented, with the HARA radiation code being used for the radiation predictions. Flight cases representative of peak-heating Mars or asteroid return are defined and the strong influence of coupled ablation and radiation on their aerothermodynamic environments is shown. Structural uncertainties inherent in the baseline simulations are identified, with turbulence modeling, precursor absorption, grid convergence, and radiation transport uncertainties combining for a +34% and -24% structural uncertainty on the radiative heating. A parametric uncertainty analysis, which assumes interval uncertainties, is presented. This analysis accounts for uncertainties in the radiation models as well as heat of formation uncertainties in the flow field model. Discussions and references are provided to support the uncertainty range chosen for each parameter. A parametric uncertainty of +47.3% and -28.3% is computed for the stagnation-point radiative heating for the 15 km/s Mars-return case. A breakdown of the largest individual uncertainty contributors is presented, which includes C3 Swings cross-section, photoionization edge shift, and Opacity Project atomic lines. Combining the structural and parametric uncertainty components results in a total uncertainty of +81.3% and -52.3% for the Mars-return case. In Part II, the computational technique and uncertainty analysis presented in Part I are applied to 1960s era shock-tube and constricted-arc experimental cases. It is shown that experiments contain shock layer temperatures and radiative flux values relevant to the Mars-return cases of present interest. Comparisons between the predictions and measurements, accounting for the uncertainty in both, are made for a range of experiments. A measure of comparison quality is defined, which consists of the percent overlap of the predicted uncertainty bar with the corresponding measurement uncertainty bar. For nearly all cases, this percent overlap is greater than zero, and for most of the higher temperature cases (T > 13,000 K) it is greater than 50%. These favorable comparisons provide evidence that the baseline computational technique and uncertainty analysis presented in Part I are adequate for Mars-return simulations. In Part III, the computational technique and uncertainty analysis presented in Part I are applied to EAST shock-tube cases. These experimental cases contain wavelength dependent intensity measurements in a wavelength range that covers 60% of the radiative intensity for the 11 km/s, 5 m radius flight case studied in Part I. Comparisons between the predictions and EAST measurements are made for a range of experiments. The uncertainty analysis presented in Part I is applied to each prediction, and comparisons are made using the metrics defined in Part II. The agreement between predictions and measurements is excellent for velocities greater than 10.5 km/s. Both the wavelength dependent and wavelength integrated intensities agree within 30% for nearly all cases considered. This agreement provides confidence in the computational technique and uncertainty analysis presented in Part I, and provides further evidence that this approach is adequate for Mars-return simulations.
Part IV of this paper reviews existing experimental data that include the influence of massive ablation on radiative heating. It is concluded that this existing data is not sufficient for the present uncertainty analysis. Experiments to capture the influence of massive ablation on radiation are suggested as future work, along with further studies of the radiative precursor and improvements in the radiation properties of ablation products.

  3. Improving the Interview Skills of College Students Using Behavioral Skills Training

    ERIC Educational Resources Information Center

    Stocco, Corey S.; Thompson, Rachel H.; Hart, John M.; Soriano, Heidi L.

    2017-01-01

    Obtaining a job as a college graduate is partly dependent on interview performance. We used a multiple baseline design across skills to evaluate the effects of behavioral skills training with self-evaluation for five college students. Training effects were evaluated using simulated interviews as baseline and posttraining assessments. All…

  4. Simulation Study of Evacuation Control Center Operations Analysis

    DTIC Science & Technology

    2011-06-01

    [Report excerpt; only table-of-contents fragments were recoverable: 4.3 Baseline Manning (Runs 1, 2, & 3); 4.3.1 Baseline Statistics Interpretation; Appendix B. Key Statistic Matrix: Runs 1-12; Appendix C. Blue Dart; Paired T result, Run 5 v. Run 6: ECC Completion Time; Key Statistics: Run 3 vs. Run 9.]

  5. [An Improved Cubic Spline Interpolation Method for Removing Electrocardiogram Baseline Drift].

    PubMed

    Wang, Xiangkui; Tang, Wenpu; Zhang, Lai; Wu, Minghu

    2016-04-01

    The selection of fiducial points has an important effect on electrocardiogram (ECG) denoising with cubic spline interpolation. An improved cubic spline interpolation algorithm for suppressing ECG baseline drift is presented in this paper. First, the first-order derivative of the original ECG signal is calculated, and the maximum and minimum points of each beat are obtained; these are treated as the positions of the fiducial points. The original ECG is then fed into a high-pass filter with a 1.5 Hz cutoff frequency. The difference between the original and the filtered ECG at the fiducial points is taken as the amplitude of the fiducial points. Cubic spline interpolation is then fitted through the fiducial points, and the fitted curve is the baseline drift estimate. For the two simulated test cases, the correlation coefficients between the fitted curve from the presented algorithm and the simulated drift curve were increased by 0.242 and 0.13 compared with those from the traditional cubic spline interpolation algorithm. For the clinical baseline drift data, the average correlation coefficient from the presented algorithm reached 0.972.
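
    A hedged Python sketch of the spline-fitting step described above is given below; the filter order is an illustrative choice, the fiducial indices are assumed to be sorted and strictly increasing, and the function is a reading of the described procedure rather than the authors' implementation.

        import numpy as np
        from scipy.interpolate import CubicSpline
        from scipy.signal import butter, filtfilt

        def estimate_baseline_drift(ecg, fs_hz, fiducial_idx):
            """High-pass the ECG at 1.5 Hz, take the original-minus-filtered difference at the
            fiducial samples as knot amplitudes, and fit a cubic spline through the knots."""
            b, a = butter(2, 1.5 / (fs_hz / 2.0), btype="highpass")
            filtered = filtfilt(b, a, ecg)
            knot_amplitude = ecg[fiducial_idx] - filtered[fiducial_idx]
            drift = CubicSpline(fiducial_idx, knot_amplitude)(np.arange(len(ecg)))
            return ecg - drift, drift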

  6. Integrated tokamak modeling: when physics informs engineering and research planning

    NASA Astrophysics Data System (ADS)

    Poli, Francesca

    2017-10-01

    Simulations that integrate virtually all the relevant engineering and physics aspects of a real tokamak experiment are a powerful tool for experimental interpretation, model validation and planning for both present and future devices. This tutorial will guide through the building blocks of an ``integrated'' tokamak simulation, such as magnetic flux diffusion, thermal, momentum and particle transport, external heating and current drive sources, wall particle sources and sinks. Emphasis is given to the connection and interplay between external actuators and plasma response, between the slow time scales of the current diffusion and the fast time scales of transport, and how reduced and high-fidelity models can contribute to simulating a whole device. To illustrate the potential and limitations of integrated tokamak modeling for discharge prediction, a helium plasma scenario for the ITER pre-nuclear phase is taken as an example. This scenario presents challenges because it requires core-edge integration and advanced models for the interaction between waves and fast ions, which are subject to a limited experimental database for validation and guidance. Starting from a scenario obtained by re-scaling parameters from the demonstration inductive ``ITER baseline'', it is shown how self-consistent simulations that encompass both core and edge plasma regions, as well as high-fidelity heating and current drive source models, are needed to set constraints on the density, magnetic field and heating scheme. This tutorial aims at demonstrating how integrated modeling, when used with an adequate level of criticism, can not only support the design of operational scenarios, but also help to assess the limitations and gaps in the available models, thus indicating where improved modeling tools are required and how present experiments can help their validation and inform research planning. Work supported by DOE under DE-AC02-09CH1146.

  7. The Influence of Mesh Density on the Impact Response of a Shuttle Leading-Edge Panel Finite Element Simulation

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.; Spellman, Regina L.

    2004-01-01

    A study was performed to examine the influence of varying mesh density on an LS-DYNA simulation of a rectangular-shaped foam projectile impacting the space shuttle leading edge Panel 6. The shuttle leading-edge panels are fabricated of reinforced carbon-carbon (RCC) material. During the study, nine cases were executed with all possible combinations of coarse, baseline, and fine meshes of the foam and panel. For each simulation, the same material properties and impact conditions were specified and only the mesh density was varied. In the baseline model, the shell elements representing the RCC panel are approximately 0.2-in. on edge, whereas the foam elements are about 0.5-in. on edge. The element nominal edge-length for the baseline panel was halved to create a fine panel (0.1-in. edge length) mesh and doubled to create a coarse panel (0.4-in. edge length) mesh. In addition, the element nominal edge-length of the baseline foam projectile was halved (0.25-in. edge length) to create a fine foam mesh and doubled (1.0- in. edge length) to create a coarse foam mesh. The initial impact velocity of the foam was 775 ft/s. The simulations were executed in LS-DYNA version 960 for 6 ms of simulation time. Contour plots of resultant panel displacement and effective stress in the foam were compared at five discrete time intervals. Also, time-history responses of internal and kinetic energy of the panel, kinetic and hourglass energy of the foam, and resultant contact force were plotted to determine the influence of mesh density. As a final comparison, the model with a fine panel and fine foam mesh was executed with slightly different material properties for the RCC. For this model, the average degraded properties of the RCC were replaced with the maximum degraded properties. Similar comparisons of panel and foam responses were made for the average and maximum degraded models.

  8. High-Fidelity Modelng and Simulation for a High Flux Isotope Reactor Low-Enriched Uranium Core Design

    DOE PAGES

    Betzler, Benjamin R.; Chandler, David; Davidson, Eva E.; ...

    2017-05-08

    A high-fidelity model of the High Flux Isotope Reactor (HFIR) with a low-enriched uranium (LEU) fuel design and a representative experiment loading has been developed to serve as a new reference model for LEU conversion studies. With the exception of the fuel elements, this HFIR LEU model is completely consistent with the current highly enriched uranium HFIR model. Results obtained with the new LEU model provide a baseline for analysis of alternate LEU fuel designs and further optimization studies. The newly developed HFIR LEU model has an explicit representation of the HFIR-specific involute fuel plate geometry, including the within-plate fuel meat contouring, and a detailed geometry model of the fuel element side plates. Such high-fidelity models are necessary to accurately account for the self-shielding from 238U and the depletion of absorber materials present in the side plates. In addition, a method was developed to account for fuel swelling in the high-density LEU fuel plates during the depletion simulation. In conclusion, calculated time-dependent metrics for the HFIR LEU model include fission rate and cumulative fission density distributions, flux and reaction rates for relevant experiment locations, point kinetics data, and reactivity coefficients.

  9. Piloted Simulation of Various Synthetic Vision Systems Terrain Portrayal and Guidance Symbology Concepts for Low Altitude En-Route Scenario

    NASA Technical Reports Server (NTRS)

    Takallu, M. A.; Glaab, L. J.; Hughes, M. F.; Wong, D. T.; Bartolone, A. P.

    2008-01-01

    In support of the NASA Aviation Safety Program's Synthetic Vision Systems Project, a series of piloted simulations were conducted to explore and quantify the relationship between candidate Terrain Portrayal Concepts and Guidance Symbology Concepts, specific to General Aviation. The experiment scenario was based on a low altitude en route flight in Instrument Meteorological Conditions in the central mountains of Alaska. A total of 18 general aviation pilots, with three levels of pilot experience, evaluated a test matrix of four terrain portrayal concepts and six guidance symbology concepts. Quantitative measures included various pilot/aircraft performance data, flight technical errors and flight control inputs. The qualitative measures included pilot comments and pilot responses to structured questionnaires covering perceived workload, subjective situation awareness, pilot preferences, and rare event recognition. There were statistically significant effects from guidance symbology concepts and terrain portrayal concepts but no significant interactions between them. Lower flight technical errors and increased situation awareness were achieved using Synthetic Vision Systems displays, as compared to the baseline Pitch/Roll Flight Director and Blue Sky Brown Ground combination. Overall, the guidance symbology concepts that have both a path-based guidance cue and a tunnel display performed better than the other guidance concepts.

  10. Assembly considerations for large reflectors

    NASA Technical Reports Server (NTRS)

    Bush, H.

    1988-01-01

    The technologies developed at LaRC in the area of erectable structures are discussed. The information is of direct value to the Large Deployable Reflector (LDR) because an option for the LDR backup structure is to assemble it in space. The efforts in this area, which include development of joints, underwater assembly simulation tests, flight assembly/disassembly tests, and fabrication of 5-meter trusses, led to the use of the LaRC concept as the baseline configuration for the Space Station Structure. The Space Station joint is linear in the load and displacement range of interest to Space Station; the ability to manually assemble and disassemble a 45-foot truss structure was demonstrated by astronauts in space as part of the ACCESS Shuttle Flight Experiment. The structure was built in 26 minutes 46 seconds, and involved a total of 500 manipulations of untethered hardware. Also, the correlation of the space experience with the neutral buoyancy simulation was very good. Sections of the proposed 5-meter bay Space Station truss have been built on the ground. Activities at LaRC have included the development of mobile remote manipulator systems (which can traverse the Space Station 5-meter structure), preliminary LDR sun shield concepts, LDR construction scenarios, and activities in robotic assembly of truss-type structures.

  11. High-Fidelity Modelng and Simulation for a High Flux Isotope Reactor Low-Enriched Uranium Core Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Betzler, Benjamin R.; Chandler, David; Davidson, Eva E.

    A high-fidelity model of the High Flux Isotope Reactor (HFIR) with a low-enriched uranium (LEU) fuel design and a representative experiment loading has been developed to serve as a new reference model for LEU conversion studies. With the exception of the fuel elements, this HFIR LEU model is completely consistent with the current highly enriched uranium HFIR model. Results obtained with the new LEU model provide a baseline for analysis of alternate LEU fuel designs and further optimization studies. The newly developed HFIR LEU model has an explicit representation of the HFIR-specific involute fuel plate geometry, including the within-plate fuel meat contouring, and a detailed geometry model of the fuel element side plates. Such high-fidelity models are necessary to accurately account for the self-shielding from 238U and the depletion of absorber materials present in the side plates. In addition, a method was developed to account for fuel swelling in the high-density LEU fuel plates during the depletion simulation. In conclusion, calculated time-dependent metrics for the HFIR LEU model include fission rate and cumulative fission density distributions, flux and reaction rates for relevant experiment locations, point kinetics data, and reactivity coefficients.

  12. Time-to-Seizure Modeling of Lacosamide Used in Monotherapy in Patients with Newly Diagnosed Epilepsy.

    PubMed

    Lindauer, Andreas; Laveille, Christian; Stockis, Armel

    2017-11-01

    To quantify the relationship between exposure to lacosamide monotherapy and seizure probability, and to simulate the effect of changing the dose regimen. Structural time-to-event models for dropouts (not because of a lack of efficacy) and seizures were developed using data from 883 adult patients newly diagnosed with epilepsy and experiencing focal or generalized tonic-clonic seizures, participating in a trial (SP0993; ClinicalTrials.gov identifier: NCT01243177) comparing the efficacy of lacosamide and carbamazepine controlled-release monotherapy. Lacosamide dropout and seizure models were used for simulating the effect of changing the initial target dose on seizure freedom. Repeated time-to-seizure data were described by a Weibull distribution with parameters estimated separately for the first and subsequent seizures. Daily area under the plasma concentration-time curve was related linearly to the log-hazard. Disease severity, expressed as the number of seizures during the 3 months before the trial (baseline), was a strong predictor of seizure probability: patients with 7-50 seizures at baseline had a 2.6-fold (90% confidence interval 2.01-3.31) higher risk of seizures compared with the reference two to six seizures. Simulations suggested that a 400-mg/day, rather than a 200-mg/day initial target dose for patients with seven or more seizures at baseline could potentially result in an additional 8% of seizure-free patients for 6 months at the last evaluated dose level. Patients receiving lacosamide had a slightly lower dropout risk compared with those receiving carbamazepine. Baseline disease severity was the most important predictor of seizure probability. Simulations suggest that an initial target dose >200 mg/day could potentially benefit patients with greater disease severity.
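
    A minimal sketch of a Weibull proportional-hazards simulation in the spirit of the described model is shown below; the shape, scale, and exposure coefficient are illustrative placeholders rather than the fitted population values, and only the time to the first seizure is simulated.

        import numpy as np

        def simulate_first_seizure_days(auc_daily, beta_auc=0.002, shape=0.8, scale=300.0, seed=0):
            """Draw first-seizure times (days) by inverse transform from a Weibull hazard whose
            log-hazard is shifted linearly by the daily drug exposure (AUC)."""
            rng = np.random.default_rng(seed)
            eta = beta_auc * np.asarray(auc_daily, dtype=float)   # linear exposure effect on log-hazard
            u = rng.uniform(size=eta.shape)
            return scale * (-np.log(u) / np.exp(eta)) ** (1.0 / shape)

        print(simulate_first_seizure_days(np.array([50.0, 100.0, 200.0])))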

  13. The selection of the optimal baseline in the front-view monocular vision system

    NASA Astrophysics Data System (ADS)

    Xiong, Bincheng; Zhang, Jun; Zhang, Daimeng; Liu, Xiaomao; Tian, Jinwen

    2018-03-01

    In the front-view monocular vision system, the accuracy of the recovered depth field depends on the length of the inter-frame baseline and on the accuracy of image matching. In general, a longer baseline yields a more precise depth field. At the same time, however, the difference between the inter-frame images grows, which makes image matching harder, reduces matching accuracy, and may ultimately cause depth-field recovery to fail. A common practice is to use tracking-based matching to improve accuracy between images, but this approach is prone to matching drift between widely separated frames, so matching errors accumulate and the resulting depth field remains inaccurate. In this paper, we propose a depth field fusion algorithm based on the optimal baseline length. First, we analyze the quantitative relationship between depth-field accuracy and the inter-frame baseline length and determine the optimal baseline length through extensive experiments; second, we introduce the inverse depth filtering technique from sparse SLAM and solve the depth field under the constraint of the optimal baseline length. A large number of experiments show that our algorithm effectively eliminates mismatches caused by image changes and still recovers the depth field correctly in large-baseline scenes. Our algorithm is superior to the traditional SFM algorithm in time and space complexity. The optimal baseline obtained from these experiments serves as a guide for depth-field computation in front-view monocular vision.
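
    The trade-off motivating an optimal baseline can be sketched with the standard two-view triangulation error model, in which depth precision improves with baseline length while matching error worsens. The numbers and the matching-error growth model below are illustrative assumptions, not the paper's calibration.

        import numpy as np

        # Illustrative two-view triangulation error model (standard stereo approximation,
        # not the paper's exact front-view derivation). Depth Z = f*B/d, so for a matching
        # (disparity) error sigma_d the depth error is roughly sigma_Z = Z**2 / (f*B) * sigma_d.
        f = 800.0            # focal length in pixels (assumed)
        Z = 20.0             # true depth in metres (assumed)
        baselines = np.array([0.05, 0.1, 0.2, 0.5, 1.0])   # inter-frame baseline in metres

        # Assume matching error grows with baseline as the appearance change increases
        sigma_d = 0.3 + 2.0 * baselines    # pixels (purely illustrative model)

        sigma_Z = Z**2 / (f * baselines) * sigma_d
        for B, s in zip(baselines, sigma_Z):
            print(f"B = {B:4.2f} m  ->  depth std ~ {s:5.2f} m")

    With these assumptions the geometric factor shrinks as the baseline grows while the matching error grows, which is exactly the competition that produces an optimal baseline length.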

  14. Partnering to Establish and Study Simulation in International Nursing Education.

    PubMed

    Garner, Shelby L; Killingsworth, Erin; Raj, Leena

    The purpose of this article was to describe an international partnership to establish and study simulation in India. A pilot study was performed to determine interrater reliability among faculty new to simulation when evaluating nursing student competency performance. Interrater reliability was below the ideal agreement level. Findings in this study underscore the need to obtain baseline interrater reliability data before integrating competency evaluation into a simulation program.

  15. Agricultural Baseline (BL0) scenario

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinckel, Chad M [University of Tennessee] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154)

    2016-07-13

    Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used.
    Date the data set was last modified: 02/12/2016.
    How each parameter was produced (methods), format, and relationship to other data in the data set: the simulation was developed without offering a farmgate price to energy crops or residues, building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014). Data generated are .txt output files by year, simulation identifier, and county code (1-3109).
    Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B) supplied by the University of Tennessee APAC.
    Quality assurance and quality control applied:
    • Check for negative planted area, harvested area, production, yield, and cost values.
    • Check whether harvested area exceeds planted area for annuals.
    • Check FIPS codes.
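
    The quality-control checks listed above are simple row-level validations; a minimal sketch of how they might be applied to the county-level output files is shown below. The column names, delimiter, and file layout are assumptions, since the actual POLYSYS output format is not reproduced here.

        import csv

        # Minimal QA sketch over tab-delimited output rows with hypothetical column names.
        REQUIRED = ["fips", "year", "planted_area", "harvested_area", "production", "yield", "cost"]

        def qa_check(path, valid_fips):
            """Return (line number, message) pairs for rows failing the listed checks."""
            problems = []
            with open(path, newline="") as fh:
                for i, row in enumerate(csv.DictReader(fh, delimiter="\t"), start=2):
                    vals = {k: float(row[k]) for k in REQUIRED[2:]}
                    if any(v < 0 for v in vals.values()):
                        problems.append((i, "negative value"))
                    if vals["harvested_area"] > vals["planted_area"]:
                        problems.append((i, "harvested exceeds planted"))
                    if row["fips"] not in valid_fips:
                        problems.append((i, "unknown FIPS code"))
            return problems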

  16. A piloted-simulation evaluation of two electronic display formats for approach and landing

    NASA Technical Reports Server (NTRS)

    Steinmetz, G. G.; Morello, S. A.; Knox, C. E.; Person, L. H., Jr.

    1976-01-01

    The results of a piloted-simulation evaluation of the benefits of adding runway symbology and track information to a baseline electronic-attitude-director-indicator (EADI) format for the approach-to-landing task are presented. The evaluation was conducted for the baseline format and for the baseline format with the added symbology during 3 deg straight-in approaches with calm, cross-wind, and turbulence conditions. Flight-path performance data and pilot subjective comments were examined with regard to the pilot's tracking performance and mental workload for both display formats. The results show that the addition of a perspective runway image and relative track information to a basic situation-information EADI format improves the tracking performance both laterally and vertically during an approach-to-landing task and that the mental workload required to assess the approach situation is reduced as a result of the integration of information.

  17. Artificial intelligence (AI) based tactical guidance for fighter aircraft

    NASA Technical Reports Server (NTRS)

    Mcmanus, John W.; Goodrich, Kenneth H.

    1990-01-01

    A research program investigating the use of artificial intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range air combat engagements is discussed. The application of AI programming and problem solving methods in the development and implementation of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS), a second generation TDG, is presented. The knowledge-based systems used by CLAWS to aid in the tactical decision-making process are outlined in detail, and the results of tests to evaluate the performance of CLAWS versus a baseline TDG developed in FORTRAN to run in real time in the Langley Differential Maneuvering Simulator are presented. To date, these test results have shown significant performance gains with respect to the TDG baseline in one-versus-one air combat engagements, and the AI-based TDG software has proven to be much easier to modify and maintain than the baseline FORTRAN TDG programs.

  18. Ultrasound simulator-assisted teaching of cardiac anatomy to preclinical anatomy students: A pilot randomized trial of a three-hour learning exposure.

    PubMed

    Canty, David Jeffrey; Hayes, Jenny A; Story, David Andrew; Royse, Colin Forbes

    2015-01-01

    Ultrasound simulation allows students to virtually explore internal anatomy by producing accurate, moving, color, three-dimensional rendered slices from any angle or approach leaving the organs and their relationships intact without requirement for consumables. The aim was to determine the feasibility and efficacy of self-directed learning of cardiac anatomy with an ultrasound simulator compared to cadavers and plastic models. After a single cardiac anatomy lecture, fifty university anatomy students participated in a three-hour supervised self-directed learning exposure in groups of five, randomized to an ultrasound simulator or human cadaveric specimens and plastic models. Pre- and post-tests were conducted using pictorial and non-pictorial multiple-choice questions (MCQs). Simulator students completed a survey on their experience. Four simulator and seven cadaver group students did not attend after randomization. Simulator use in groups of five students was feasible and feedback from participants was very positive. Baseline test scores were similar (P = 0.9) between groups. After the learning intervention, there was no difference between groups in change in total test score (P = 0.37), whether they were pictorial (P = 0.6) or non-pictorial (P = 0.21). In both groups there was an increase in total test scores (simulator: +19.8% ± 12.4%, cadaver: +16.4% ± 10.2%, P < 0.0001), pictorial question scores (+22.9% ± 18.0%, +19.7% ± 19.3%, P < 0.001) and non-pictorial question scores (+16.7% ± 18.2%, +13.0% ± 15.4%, P = 0.002). The ultrasound simulator appears equivalent to human cadaveric prosections for learning cardiac anatomy. © 2014 American Association of Anatomists.

  19. Virtual reality lead extraction as a method for training new physicians: a pilot study.

    PubMed

    Maytin, Melanie; Daily, Thomas P; Carillo, Roger G

    2015-03-01

    It is estimated that the demand for transvenous lead extraction (TLE) has reached an annual extraction rate of nearly 24,000 patients worldwide. Despite technologic advances, TLE still has the potential for significant morbidity and mortality. Complication rates with TLE directly parallel operator experience. However, obtaining adequate training during and postfellowship can be difficult. Given the potential for catastrophic complications and the steep learning curve (up to 300 cases) associated with this procedure, we sought to validate a virtual reality (VR) lead extraction simulator as an innovative training and evaluation tool for physicians new to TLE. We randomized eight electrophysiology fellows to VR simulator versus conventional training. We compared procedural skill competency between the groups using simulator competency, tactile measurements, markers of proficiency and attitudes, and cognitive abilities battery. Practical skills and simulator complications differed significantly between the VR simulator and conventional training groups. The VR simulator group executed patient preparation and procedure performance better than the conventional group (P < 0.01). All four fellows randomized to conventional training experienced a simulator complication (two superior vena cava [SVC] tears, three right ventricle [RV] avulsions) versus one fellow in the VR simulator group (one SVC tear) (P = 0.02). Tactile measurements revealed a trend toward excess pushing versus pulling forces among the conventionally trained group. The time for lead removal was also significantly higher in the conventional training group (12.46 minutes vs 5.54 minutes, P = 0.02). There was no significant difference in baseline or posttraining cognitive ability. We contend that the implementation of alternative training tools such as a VR simulation model will improve physician training and allow for an innovative pathway to assess the achievement of competency. ©2014 Wiley Periodicals, Inc.

  20. Multibaseline gravitational wave radiometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talukder, Dipongkar; Bose, Sukanta; Mitra, Sanjit

    2011-03-15

    We present a statistic for the detection of stochastic gravitational wave backgrounds (SGWBs) using radiometry with a network of multiple baselines. We also quantitatively compare the sensitivities of existing baselines and their network to SGWBs. We assess how the measurement accuracy of signal parameters, e.g., the sky position of a localized source, can improve when using a network of baselines, as compared to any of the single participating baselines. The search statistic itself is derived from the likelihood ratio of the cross correlation of the data across all possible baselines in a detector network and is optimal in Gaussian noise. Specifically, it is the likelihood ratio maximized over the strength of the SGWB and is called the maximized-likelihood ratio (MLR). One of the main advantages of using the MLR over past search strategies for inferring the presence or absence of a signal is that the former does not require the deconvolution of the cross correlation statistic. Therefore, it does not suffer from errors inherent to the deconvolution procedure and is especially useful for detecting weak sources. In the limit of a single baseline, it reduces to the detection statistic studied by Ballmer [Classical Quantum Gravity 23, S179 (2006)] and Mitra et al. [Phys. Rev. D 77, 042002 (2008)]. Unlike past studies, here the MLR statistic enables us to compare quantitatively the performances of a variety of baselines searching for a SGWB signal in (simulated) data. Although we use simulated noise and SGWB signals for making these comparisons, our method can be straightforwardly applied on real data.
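
    A heavily simplified sketch of the network-combination idea follows: each baseline supplies a noisy cross-correlation estimate of the common SGWB strength, and the maximum-likelihood combination is the inverse-variance weighted average. This omits the radiometer sky response and the full MLR construction described in the abstract; it only illustrates why a network outperforms any single baseline. All numbers are invented.

        import numpy as np

        rng = np.random.default_rng(1)

        h2_true = 0.5                       # common signal power (arbitrary units, assumed)
        sigma = np.array([1.0, 2.0, 4.0])   # per-baseline measurement noise std (assumed)

        # Each baseline yields an unbiased but noisy estimate of the signal power
        y = h2_true + sigma * rng.standard_normal(3)

        w = 1.0 / sigma**2
        h2_hat = np.sum(w * y) / np.sum(w)            # maximum-likelihood combination
        snr_network = h2_hat * np.sqrt(np.sum(w))     # detection SNR of the combined estimate
        print(h2_hat, snr_network)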

  1. An evaluation of water vapor radiometer data for calibration of the wet path delay in very long baseline interferometry experiments

    NASA Technical Reports Server (NTRS)

    Kuehn, C. E.; Himwich, W. E.; Clark, T. A.; Ma, C.

    1991-01-01

    The internal consistency of the baseline-length measurements derived from analysis of several independent VLBI experiments is an estimate of the measurement precision. The paper investigates whether the inclusion of water vapor radiometer (WVR) data as an absolute calibration of the propagation delay due to water vapor improves the precision of VLBI baseline-length measurements. The paper analyzes 28 International Radio Interferometric Surveying runs between June 1988 and January 1989; WVR measurements were made during each session. The addition of WVR data decreased the scatter of the length measurements of the baselines by 5-10 percent. The observed reduction in the scatter of the baseline lengths is less than what is expected from the behavior of the formal errors, which suggest that the baseline-length measurement precision should improve 10-20 percent if WVR data are included in the analysis. The discrepancy between the formal errors and the baseline-length results can be explained as the consequence of systematic errors in the dry-mapping function parameters, instrumental biases in the WVR and the barometer, or both.

  2. Long-Term Climatic and Anthropogenic Impacts on Streamwater Salinity in New York State: INCA Simulations Offer Cautious Optimism.

    PubMed

    Gutchess, Kristina; Jin, Li; Ledesma, José L J; Crossman, Jill; Kelleher, Christa; Lautz, Laura; Lu, Zunli

    2018-02-06

    The long-term application of road salts has led to a rise in surface water chloride (Cl⁻) concentrations. While models have been used to assess the potential future impacts of continued deicing practices, prior approaches have not incorporated changes in climate that are projected to impact hydrogeology in the 21st century. We use an INtegrated CAtchment (INCA) model to simulate Cl⁻ concentrations in the Tioughnioga River watershed. The model was run over a baseline period (1961-1990) and climate simulations from a range of GCMs run over three 30-year intervals (2010-2039; 2040-2069; 2070-2099). Model projections suggest that Cl⁻ concentrations in the two river branches will continue to rise for several decades, before beginning to decline around 2040-2069, with all GCM scenarios indicating reductions in snowfall and associated salt applications over the 21st century. The delay in stream response is most likely attributed to climate change and continued contribution of Cl⁻ from aquifers. By 2100, surface water Cl⁻ concentrations will decrease to below 1960s values. Catchments dominated by urban lands will experience a decrease in average surface water Cl⁻, although moderate compared to more rural catchments.

  3. Changes in Gait with Anteriorly Added Mass: A Pregnancy Simulation Study

    PubMed Central

    Ogamba, Maureen I.; Loverro, Kari L.; Laudicina, Natalie M.; Gill, Simone V.; Lewis, Cara L.

    2016-01-01

    During pregnancy, the female body experiences structural changes, such as weight gain. As pregnancy advances, most of the additional mass is concentrated anteriorly on the lower trunk. The purpose of this study is to analyze kinematic and kinetic changes when load is added anteriorly to the trunk, simulating a physical change experienced during pregnancy. Twenty healthy females walked on a treadmill while wearing a custom-made pseudo-pregnancy sac (1 kg) under three load conditions: sac only, a 10-pound condition (4.535 kg added anteriorly), and a 20-pound condition (9.07 kg added anteriorly), the latter two simulating the second trimester and full-term pregnancy, respectively. The increase in anterior mass resulted in kinematic changes at the knee, hip, pelvis, and trunk in the sagittal and frontal planes. Additionally, ankle, knee, and hip joint moments normalized to baseline mass increased with increased load; however, these moments decreased when normalized to total mass. These kinematic and kinetic changes may suggest that women modify gait biomechanics to reduce the effect of added load. Furthermore, the increase in joint moments increases stress on the musculoskeletal system and may contribute to musculoskeletal pain. PMID:26958743
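
    The normalization effect noted above (joint moments rising when scaled by baseline mass but falling when scaled by total mass) follows from simple arithmetic whenever the moment grows more slowly than the carried mass. The numbers below are hypothetical and are used only to make that point; they are not the study's data.

        # Illustrative arithmetic only (hypothetical numbers):
        # a joint moment can rise when normalized to baseline mass yet fall when
        # normalized to total (baseline + added) mass.
        baseline_mass = 60.0                       # kg
        conditions = {"sac only": (1.0, 78.0),     # (added mass kg, peak hip moment N*m)
                      "10 lb":    (4.535, 82.0),
                      "20 lb":    (9.07, 86.0)}

        for name, (added, moment) in conditions.items():
            per_baseline = moment / baseline_mass
            per_total = moment / (baseline_mass + added)
            print(f"{name:8s}  M/baseline = {per_baseline:.2f}  M/total = {per_total:.2f} N*m/kg")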

  4. Smoothness of In vivo Spectral Baseline Determined by Mean Squared Error

    PubMed Central

    Zhang, Yan; Shen, Jun

    2013-01-01

    Purpose A nonparametric smooth line is usually added to the spectral model to account for background signals in in vivo magnetic resonance spectroscopy (MRS). The assumed smoothness of the baseline significantly influences quantitative spectral fitting. In this paper, a method is proposed to minimize baseline influences on estimated spectral parameters. Methods The non-parametric baseline function with a given smoothness was treated as a function of the spectral parameters. Its uncertainty was measured by root-mean-squared error (RMSE). The proposed method was demonstrated with a simulated spectrum and in vivo spectra of both short echo time (TE) and averaged echo times. The estimated in vivo baselines were compared with the metabolite-nulled spectra and the LCModel-estimated baselines. The accuracies of the estimated baseline and metabolite concentrations were further verified by cross-validation. Results An optimal smoothness condition was found that led to the minimal baseline RMSE. In this condition, the best fit was balanced against minimal baseline influences on metabolite concentration estimates. Conclusion Baseline RMSE can be used to indicate estimated baseline uncertainties and serve as the criterion for determining the baseline smoothness of in vivo MRS. PMID:24259436
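
    A generic illustration of choosing a baseline smoothness by a held-out RMSE criterion is sketched below with a smoothing spline. This is a simplification of the paper's approach, in which the baseline is coupled to the spectral parameters; the data, candidate smoothness values, and train/test split are all invented for the example.

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        rng = np.random.default_rng(2)
        x = np.linspace(0, 1, 400)
        true_baseline = 0.5 * np.sin(2 * np.pi * x) + x           # slowly varying background
        y = true_baseline + 0.05 * rng.standard_normal(x.size)    # noisy "metabolite-free" signal

        train = np.arange(x.size) % 2 == 0                        # even points fit, odd points test
        best = None
        for s in (0.1, 0.5, 1.0, 2.0, 5.0):                       # candidate smoothing factors
            spl = UnivariateSpline(x[train], y[train], s=s)
            rmse = np.sqrt(np.mean((spl(x[~train]) - y[~train]) ** 2))
            if best is None or rmse < best[1]:
                best = (s, rmse)
        print("selected smoothness s =", best[0], "held-out RMSE =", round(best[1], 4))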

  5. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

    DOE PAGES

    Worcester, Elizabeth

    2015-08-06

    In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

  6. Testing the H2O2-H2O hypothesis for life on Mars with the TEGA instrument on the Phoenix lander.

    PubMed

    Schulze-Makuch, Dirk; Turse, Carol; Houtkooper, Joop M; McKay, Christopher P

    2008-04-01

    In the time since the Viking life-detection experiments were conducted on Mars, many missions have enhanced our knowledge about the environmental conditions on the Red Planet. However, the martian surface chemistry and the Viking lander results remain puzzling. Nonbiological explanations that favor a strong inorganic oxidant are currently favored (e.g., Mancinelli, 1989; Plumb et al., 1989; Quinn and Zent, 1999; Klein, 1999; Yen et al., 2000), but problems remain regarding the lifetime, source, and abundance of that oxidant to account for the Viking observations (Zent and McKay, 1994). Alternatively, a hypothesis that favors the biological origin of a strong oxidizer has recently been advanced (Houtkooper and Schulze-Makuch, 2007). Here, we report on laboratory experiments that simulate the experiments to be conducted by the Thermal and Evolved Gas Analyzer (TEGA) instrument of the Phoenix lander, which is to descend on Mars in May 2008. Our experiments provide a baseline for an unbiased test for chemical versus biological responses, which can be applied at the time the Phoenix lander transmits its first results from the martian surface.

  7. Experimental Database with Baseline CFD Solutions: 2-D and Axisymmetric Hypersonic Shock-Wave/Turbulent-Boundary-Layer Interactions

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.; Brown, James L.; Gnoffo, Peter A.

    2013-01-01

    A database compilation of hypersonic shock-wave/turbulent boundary layer experiments is provided. The experiments selected for the database are either 2D or axisymmetric, and include both compression corner and impinging type SWTBL interactions. The strength of the interactions ranges from attached to incipient separation to fully separated flows. The experiments were chosen based on criteria intended to ensure the quality of the datasets, their relevance to NASA's missions, and their usefulness for validation and uncertainty assessment of CFD Navier-Stokes predictive methods, both now and in the future. The emphasis in the selected datasets is on surface pressures and surface heating throughout the interaction, but some wall shear stress distributions and flowfield profiles are also included. Included, for selected cases, are example CFD grids and setup information, along with surface pressure and wall heating results from simulations using current NASA real-gas Navier-Stokes codes, by which future CFD investigators can compare and evaluate physics modeling improvements and validation and uncertainty assessments of future CFD code developments. The experimental database is presented tabulated in the Appendices describing each experiment. The database is also provided in computer-readable ASCII files located on a companion DVD.

  8. Study of Anti-Neutrino Beam with Muon Monitor in the T2K experiment

    NASA Astrophysics Data System (ADS)

    Hiraki, Takahiro

    The T2K experiment is a long-baseline neutrino oscillation experiment. In 2013, the T2K collaboration observed electron neutrino appearance in a muon neutrino beam at 7.3 sigma significance. One of the next main goals of the T2K experiment is to measure electron anti-neutrino appearance. In June 2014 we took anti-neutrino beam data for the first time. The anti-neutrino beam was obtained by reversing the polarity of horn focusing magnets. To monitor the direction and intensity of the neutrino beam which is produced from the decay of pions and kaons, the muon beam is continuously measured by Muon Monitor (MUMON). To reconstruct the profile of the muon beam, MUMON is equipped with 49 sensors distributed on a plane behind the beam dump. In this report, we show some results of the anti-neutrino beam data taken by monitors including MUMON. In particular, dependence of the muon beam intensity on electric current of the horns, correlation between the proton beam position and the MUMON profile, and beam stability are presented. Comparison between the data and Monte Carlo simulation is also discussed.

  9. Impacts of climate change on river discharge in the northern Tien Shan: Results from the long-term observations and modelling

    NASA Astrophysics Data System (ADS)

    Shahgedanova, Maria; Afzal, Muhammad; Usmanova, Zamira; Kapitsa, Vasilii; Mayr, Elisabeth; Hagg, Wilfried; Severskiy, Igor; Zhumabayev, Dauren

    2017-04-01

    The study presents the results of an investigation of observed and projected changes in discharge of seven snow- and glacier-nourished rivers of the northern Tien Shan (south-eastern Kazakhstan). The observed trends were assessed using long-term (40-60 years) homogeneous daily records of discharge from gauging stations located in the mountains and unaffected by human activities, including water abstraction. Positive trends in discharge were registered at most sites between the 1950s and 2010s, with the strongest increase in summer and autumn, particularly in the 2000-2010s, in line with the positive temperature trends. The observed increase was most prominent in the catchments with a higher proportion of glacierized area. At the Ulken Almatinka and Kishi Almatinka rivers, where 16% and 12% of the catchment areas are glacierized, positive trends in summer and autumn discharge exceeded 1% per year. The strongest increase was observed in September, indicating that the melting period extends into early autumn. In September-November, the number of days with extreme discharge values, defined as daily values exceeding the 95th percentile (calculated for each meteorological season), increased at all rivers. Future changes in discharge were modelled using the HBV-ETH hydrological model and four climate change scenarios derived using the regional climate model PRECIS with 25 km spatial resolution, driven by the HadGEM GCM for the RCP 2.6 and RCP 8.5 scenarios and by the HadCM3Q0 and ECHAM5 GCMs for the A1B scenario. A range of glacier change scenarios was considered. All climate experiments project an increase in temperature, with the strongest warming projected by the HadGEM-driven simulation for the RCP 8.5 scenario and the HadCM3Q0-driven simulation for the A1B scenario. The projected changes in precipitation varied between models and seasons; however, most experiments did not show significant trends in precipitation within the studied catchments. The exception is the simulation driven by the HadGEM GCM for the RCP 8.5 scenario, which projects summer drying. All simulations project that in the 2020s discharge will remain close to its baseline (1990-2005) values, suggesting that peak flow has been reached in the northern Tien Shan. A significant decrease in discharge is projected for the post-2030s period for June-September. The strongest changes are expected in July and August, when discharge values are projected to decrease by 25-38% in 2030-2060 and decline further to as little as 50% of the baseline values in 2060-2099.

  10. Retention of laparoscopic and robotic skills among medical students: a randomized controlled trial.

    PubMed

    Orlando, Megan S; Thomaier, Lauren; Abernethy, Melinda G; Chen, Chi Chiung Grace

    2017-08-01

    Although simulation training beneficially contributes to traditional surgical training, there are fewer objective data on the retention of simulation skills. To investigate the retention of laparoscopic and robotic skills after simulation training. We present the second stage of a randomized single-blinded controlled trial in which 40 simulation-naïve medical students were randomly assigned to practice peg transfer tasks on either laparoscopic (N = 20, Fundamentals of Laparoscopic Surgery, Venture Technologies Inc., Waltham, MA) or robotic (N = 20, dV-Trainer, Mimic, Seattle, WA) platforms. In the first stage, two expert surgeons evaluated participants on both tasks before (Stage 1: Baseline) and immediately after training (Stage 1: Post-training) using a modified validated global rating scale of laparoscopic and robotic operative performance. In Stage 2, participants were evaluated on both tasks 11-20 weeks after training. Of the 40 students who participated in Stage 1, 23 (11 laparoscopic and 12 robotic) underwent repeat evaluation. During Stage 2, there were no significant differences between groups in objective or subjective measures for the laparoscopic task. Laparoscopic-trained participants' performances on the laparoscopic task were improved during Stage 2 compared to baseline, as measured by time to task completion, but not by the modified global rating scale. During the robotic task, the robotic-trained group demonstrated superior economy of motion (p = .017), Tissue Handling (p = .020), and fewer errors (p = .018) compared to the laparoscopic-trained group. Robotic skills acquisition from baseline, with no significant deterioration as measured by modified global rating scale scores, was observed among robotic-trained participants during Stage 2. Robotic skills acquired through simulation appear to be better maintained than laparoscopic simulation skills. This study is registered on ClinicalTrials.gov (NCT02370407).

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  12. A simulation-based efficiency comparison of AC and DC power distribution networks in commercial buildings

    DOE PAGES

    Gerber, Daniel L.; Vossos, Vagelis; Feng, Wei; ...

    2017-06-12

    Direct current (DC) power distribution has recently gained traction in buildings research due to the proliferation of on-site electricity generation and battery storage, and an increasing prevalence of internal DC loads. The research discussed in this paper uses Modelica-based simulation to compare the efficiency of DC building power distribution with an equivalent alternating current (AC) distribution. The buildings are all modeled with solar generation, battery storage, and loads that are representative of the most efficient building technology. A variety of parametric simulations determine how and when DC distribution proves advantageous. These simulations also validate previous studies that use simpler approaches and arithmetic efficiency models. This work shows that using DC distribution can be considerably more efficient: a medium sized office building using DC distribution has an expected baseline of 12% savings, but may also save up to 18%. In these results, the baseline simulation parameters are for a zero net energy (ZNE) building that can island as a microgrid. DC is most advantageous in buildings with large solar capacity, large battery capacity, and high voltage distribution.
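
    The efficiency argument can be made concrete with a back-of-envelope comparison of conversion stages between on-site DC generation and an internal DC load. The stage efficiencies below are assumed round numbers for illustration, not values taken from the Modelica models used in the paper.

        # Back-of-envelope conversion-loss comparison (hypothetical efficiencies):
        # solar DC generation serving an internal DC load.
        pv_energy = 100.0      # kWh generated at the PV array

        # AC distribution: inverter (DC->AC) then the load's power supply (AC->DC)
        eta_inverter, eta_ps = 0.96, 0.90
        ac_delivered = pv_energy * eta_inverter * eta_ps

        # DC distribution: a single DC/DC conversion stage
        eta_dcdc = 0.97
        dc_delivered = pv_energy * eta_dcdc

        savings = (dc_delivered - ac_delivered) / ac_delivered
        print(f"AC path: {ac_delivered:.1f} kWh, DC path: {dc_delivered:.1f} kWh, "
              f"DC advantage: {savings:.1%}")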

  13. A simulation-based efficiency comparison of AC and DC power distribution networks in commercial buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Daniel L.; Vossos, Vagelis; Feng, Wei

    Direct current (DC) power distribution has recently gained traction in buildings research due to the proliferation of on-site electricity generation and battery storage, and an increasing prevalence of internal DC loads. The research discussed in this paper uses Modelica-based simulation to compare the efficiency of DC building power distribution with an equivalent alternating current (AC) distribution. The buildings are all modeled with solar generation, battery storage, and loads that are representative of the most efficient building technology. A variety of parametric simulations determine how and when DC distribution proves advantageous. These simulations also validate previous studies that use simpler approaches and arithmetic efficiency models. This work shows that using DC distribution can be considerably more efficient: a medium sized office building using DC distribution has an expected baseline of 12% savings, but may also save up to 18%. In these results, the baseline simulation parameters are for a zero net energy (ZNE) building that can island as a microgrid. DC is most advantageous in buildings with large solar capacity, large battery capacity, and high voltage distribution.

  14. Payload crew training complex simulation engineer's handbook

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1984-01-01

    The Simulation Engineer's Handbook is a guide for new engineers assigned to Experiment Simulation and a reference for engineers previously assigned. The experiment simulation process, development of experiment simulator requirements, development of experiment simulator hardware and software, and the verification of experiment simulators are discussed. The training required for experiment simulation is extensive and is only referenced in the handbook.

  15. Precision Geodesy via Radio Interferometry.

    PubMed

    Hinteregger, H F; Shapiro, I I; Robertson, D S; Knight, C A; Ergas, R A; Whitney, A R; Rogers, A E; Moran, J M; Clark, T A; Burke, B F

    1972-10-27

    Very-long-baseline interferometry experiments, involving observations of extragalactic radio sources, were performed in 1969 to determine the vector separations between antenna sites in Massachusetts and West Virginia. The 845.130-kilometer baseline was estimated from two separate experiments. The results agreed with each other to within 2 meters in all three components and with a special geodetic survey to within 2 meters in length; the differences in baseline direction as determined by the survey and by interferometry corresponded to discrepancies of about 5 meters. The experiments also yielded positions for nine extragalactic radio sources, most to within 1 arc second, and allowed the hydrogen maser clocks at the two sites to be synchronized a posteriori with an uncertainty of only a few nanoseconds.

  16. Camx Ozone Source Attribution in the Eastern United States Using Guidance from Observations During DISCOVER-AQ Maryland

    NASA Technical Reports Server (NTRS)

    Goldberg, Daniel L.; Vinciguerra, Timothy P.; Anderson, Daniel C.; Hembeck, Linda; Canty, Timothy P.; Ehrman, Sheryl H.; Martins, Douglas K.; Stauffer, Ryan M.; Thompson, Anne M.; Salawitch, Ross J.

    2016-01-01

    A Comprehensive Air-Quality Model with Extensions (CAMx) version 6.10 simulation was assessed through comparison with data acquired during NASA's 2011 Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality (DISCOVER-AQ) Maryland field campaign. Comparisons for the baseline simulation (Carbon Bond 2005 (CB05) chemistry, Environmental Protection Agency 2011 National Emissions Inventory) show a model overestimate of NOy by +86.2% and an underestimate of HCHO by -28.3%. We present a new model framework (Carbon Bond 6 Revision 2 chemistry (CB6r2), Model of Emissions of Gases and Aerosols from Nature (MEGAN) version 2.1 biogenic emissions, 50% reduction in mobile NOx, enhanced representation of isoprene nitrates) that better matches observations. The new model framework attributes 31.4% more surface ozone in Maryland to electric generating units (EGUs) and 34.6% less ozone to on-road mobile sources. Surface ozone becomes more NOx limited throughout the eastern United States compared to the baseline simulation. The baseline model therefore likely underestimates the effectiveness of anthropogenic NOx reductions as well as the current contribution of EGUs to surface ozone.

  17. The effect of tracking network configuration on Global Positioning System (GPS) baseline estimates for the CASA (Central and South America) Uno experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolf, S.K.; Dixon, T.H.; Freymueller, J.T.

    1990-04-01

    Geodetic monitoring of subduction of the Nazca and Cocos plates is a goal of the CASA (Central and South America) Global Positioning System (GPS) experiments, and requires measurement of intersite distances (baselines) in excess of 500 km. The major error source in these measurements is the uncertainty in the position of the GPS satellites at the time of observation. A key aspect of the first CASA experiment, CASA Uno, was the initiation of a global network of tracking stations to minimize these errors. The authors studied the effect of using various subsets of this global tracking network on long (>100 km) baseline estimates in the CASA region. Best results were obtained with a global tracking network consisting of three U.S. fiducial stations, two sites in the southwest Pacific, and two sites in Europe. Relative to smaller subsets, this global network improved baseline repeatability, resolution of carrier phase cycle ambiguities, and formal errors of the orbit estimates. Describing baseline repeatability for horizontal components as σ = (a² + b²L²)^(1/2), where L is baseline length, the authors obtained a = 4 and 9 mm and b = 2.8×10⁻⁸ and 2.3×10⁻⁸ for the north and east components, respectively, on CASA baselines up to 1,000 km in length with this global network.
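
    Evaluating the quoted repeatability model at a few baseline lengths shows how the length-proportional term dominates on long baselines; the coefficients below are those reported above, and the sketch simply evaluates the formula.

        import numpy as np

        # Evaluate sigma = sqrt(a**2 + (b*L)**2) with the quoted coefficients
        # for the north and east baseline components.
        coeffs = {"north": (4e-3, 2.8e-8),   # a in metres, b dimensionless
                  "east":  (9e-3, 2.3e-8)}

        for L in (100e3, 500e3, 1000e3):     # baseline length in metres
            parts = {c: np.hypot(a, b * L) for c, (a, b) in coeffs.items()}
            print(f"L = {L/1e3:6.0f} km:  " +
                  "  ".join(f"{c} sigma = {v*1e3:.1f} mm" for c, v in parts.items()))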

  18. Dental caries experience, rather than toothbrushing, influences the incidence of dental caries in young Japanese adults.

    PubMed

    Sonoda, C; Ebisawa, M; Nakashima, H; Sakurai, Y

    2017-06-01

    A dose-response relationship between toothbrushing frequency and the incidence of dental caries has not been confirmed. Furthermore, no longitudinal study about this relationship has considered dental caries experience at baseline, which is an important factor influencing the frequency of future caries. To elucidate the association between the incidence of dental caries and toothbrushing frequency after adjusting for dental caries experience at baseline in a Japanese population. The 92 recruits of the Japan Maritime Self-Defense Force in Kure, Japan, in 2011 were followed up for 3 years. They underwent oral examination at the annual checkups and answered questions about toothbrushing frequency. Multiple logistic regression analysis was used to analyze the incidence of dental caries and to identify independent effects of toothbrushing frequency and dental caries experience at baseline. Furthermore, the relative importance of the incidence of dental caries was investigated among other independent variables using the partial adjusted R² score. Logistic regression analysis showed that toothbrushing frequency alone did not influence the increment in decayed, missing, and filled teeth (DMFT). However, DMFT at baseline alone was associated with the increment in DMFT (crude odds ratio [OR] 1.20, 95% confidence interval [CI] 1.08-1.33). In the fully adjusted model, only DMFT at baseline was associated with the increment in DMFT (adjusted OR 1.23, 95% CI 1.09-1.38). After three years, the incidence of dental caries in young adult Japanese males was influenced by DMFT at baseline, rather than toothbrushing frequency. Copyright © 2017 Dennis Barber Ltd.
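
    The reported adjusted odds ratio can be read back into logistic-regression terms: the per-DMFT log-odds coefficient is the log of the odds ratio, and an approximate standard error can be recovered from the confidence interval. The short sketch below does this back-calculation purely for illustration; the extrapolation to a 5-unit DMFT difference is a hypothetical example, not a result from the study.

        import numpy as np

        or_point, ci_low, ci_high = 1.23, 1.09, 1.38           # published adjusted OR and 95% CI
        beta = np.log(or_point)                                 # log-odds increase per DMFT unit
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)    # approximate standard error
        print(f"beta = {beta:.3f}, SE ~ {se:.3f}")

        # Odds multiplier for a child entering with, e.g., 5 more DMFT than another
        print(f"odds ratio for +5 DMFT ~ {np.exp(5 * beta):.2f}")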

  19. Single pilot scanning behavior in simulated instrument flight

    NASA Technical Reports Server (NTRS)

    Pennington, J. E.

    1979-01-01

    A simulation of tasks associated with single pilot general aviation flight under instrument flight rules was conducted as a baseline for future research studies on advanced flight controls and avionics. The tasks, ranging from simple climbs and turns to an instrument landing system approach, were flown on a fixed base simulator. During the simulation the control inputs, state variables, and the pilot's visual scan pattern, including point of regard, were measured and recorded.

  20. Simulation-guided cardiac auscultation improves medical students' clinical skills: the Pavia pilot experience.

    PubMed

    Perlini, Stefano; Salinaro, Francesco; Santalucia, Paola; Musca, Francesco

    2014-03-01

    Clinical evaluation is the cornerstone of any cardiac diagnosis, although excessive over-specialisation often leads students to disregard the value of clinical skills and to overemphasize the approach to instrumental cardiac diagnosis. Time restraints, low availability of "typical" cardiac patients on whom to perform effective bedside teaching, respect for patients, and the underscoring of the value of clinical skills all lead to a progressive decay in teaching. Simulation-guided cardiac auscultation may improve clinical training in medical students and residents. Harvey© is a mannequin encompassing more than 50 cardiac diagnoses that was designed and developed at the University of Miami (Florida, USA). One of the advantages of Harvey© simulation resides in the possibility of listening to, comparing, and discussing "real" murmurs. To objectively assess its teaching performance, the capability to identify five different cardiac diagnoses (atrial septal defect, normal young subject, mitral stenosis with tricuspid regurgitation, chronic mitral regurgitation, and pericarditis) out of more than 50 diagnostic possibilities was assessed in 523 III-year medical students (i.e. at the very beginning of their clinical experience), in 92 VI-year students, and in 42 residents before and after a formal 10-h teaching session with Harvey©. None of them had previously experienced simulation-based cardiac auscultation in addition to formal lecturing (all three groups) and bedside teaching (VI-year students and residents). In order to assess the "persistence" of the acquired knowledge over time, the test was repeated after 3 years in 85 students, who did not repeat the formal 10-h teaching session with Harvey© after the III year. As expected, the overall response was poor in the "beginners", who correctly identified 11.0% of the administered cardiac murmurs. After simulation-guided training, the ability to recognise the correct cardiac diagnoses was much better (72.0%; p < 0.001 vs. baseline). Rather unexpectedly, before the tutorial, the performance of VI-year students and of residents was not significantly different from that of their III-year colleagues, since the two groups correctly identified 14.2 and 16.2% of the diagnoses, respectively. After the tutorial, the VI-year students and the residents also improved their overall performance (to 73.1 and 76.1%, respectively; p < 0.001 for both when compared to before the tutorial). The persistence of this capability after 3 years was remarkable, since the 85 students who repeated the test without any further exposure to the 10-h teaching session with Harvey© correctly identified 68.4% of the possible cardiac diagnoses (p < 0.001 vs. baseline). These data underscore the importance of clinical training in order to improve auscultation skills in our academic setting, prompting a redesign of teaching curricula. Simulation-based cardiac auscultation should be considered as the "missing link" between formal lecturing and bedside teaching of heart sounds and murmurs.

  1. Spectral indices of cardiovascular adaptations to short-term simulated microgravity exposure

    NASA Technical Reports Server (NTRS)

    Patwardhan, A. R.; Evans, J. M.; Berk, M.; Grande, K. J.; Charles, J. B.; Knapp, C. F.

    1995-01-01

    We investigated the effects of exposure to microgravity on the baseline autonomic balance in cardiovascular regulation using spectral analysis of cardiovascular variables measured during supine rest. Heart rate, arterial pressure, radial flow, thoracic fluid impedance and central venous pressure were recorded from nine volunteers before and after simulated microgravity, produced by 20 hours of 6 degrees head down bedrest plus furosemide. Spectral powers increased after simulated microgravity in the low frequency region (centered at about 0.03 Hz) in arterial pressure, heart rate and radial flow, and decreased in the respiratory frequency region (centered at about 0.25 Hz) in heart rate. Reduced heart rate power in the respiratory frequency region indicates reduced parasympathetic influence on the heart. A concurrent increase in the low frequency power in arterial pressure, heart rate, and radial flow indicates increased sympathetic influence. These results suggest that the baseline autonomic balance in cardiovascular regulation is shifted towards increased sympathetic and decreased parasympathetic influence after exposure to short-term simulated microgravity.
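
    The band-power comparison described above can be illustrated by integrating a Welch power spectral density over a low-frequency band near 0.03 Hz and a respiratory band near 0.25 Hz. The signal below is synthetic and the band edges are assumptions chosen around the centre frequencies given in the abstract; this is not the study's analysis pipeline.

        import numpy as np
        from scipy.signal import welch

        fs = 4.0                                   # resampled heart-rate signal, Hz (assumed)
        t = np.arange(0, 600, 1 / fs)              # 10 minutes of data
        rng = np.random.default_rng(3)
        hr = (70 + 2.0 * np.sin(2 * np.pi * 0.03 * t)       # slow, sympathetic-like oscillation
                + 1.0 * np.sin(2 * np.pi * 0.25 * t)        # respiratory modulation
                + 0.5 * rng.standard_normal(t.size))

        f, pxx = welch(hr, fs=fs, nperseg=1024)

        def band_power(f, pxx, lo, hi):
            """Integrate the PSD over [lo, hi) Hz."""
            m = (f >= lo) & (f < hi)
            return np.trapz(pxx[m], f[m])

        print("LF power (0.01-0.08 Hz):  ", band_power(f, pxx, 0.01, 0.08))
        print("Resp power (0.15-0.35 Hz):", band_power(f, pxx, 0.15, 0.35))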

  2. Noise levels from a model turbofan engine with simulated noise control measures applied

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Woodward, Richard P.

    1993-01-01

    A study of estimated full-scale noise levels based on measured levels from the Advanced Ducted Propeller (ADP) sub-scale model is presented. Testing of this model was performed in the NASA Lewis Low Speed Anechoic Wind Tunnel at a simulated takeoff condition of Mach 0.2. Effective Perceived Noise Level (EPNL) estimates for the baseline configuration are documented, and also used as the control case in a study of the potential benefits of two categories of noise control. The effect of active noise control is evaluated by artificially removing various rotor-stator interaction tones. Passive noise control is simulated by applying a notch filter to the wind tunnel data. Cases with both techniques are included to evaluate hybrid active-passive noise control. The results for EPNL values are approximate because the original source data was limited in bandwidth and in sideline angular coverage. The main emphasis is on comparisons between the baseline and configurations with simulated noise control measures.
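
    The passive noise control treatment described above amounts to attenuating discrete interaction tones in the measured spectra; a minimal digital analogue is a narrow notch filter centred on a tone. The sample rate, tone frequency, and notch bandwidth below are placeholders, not the ADP model-test values.

        import numpy as np
        from scipy.signal import iirnotch, filtfilt

        fs = 50_000.0                       # sample rate, Hz (assumed)
        t = np.arange(0, 0.2, 1 / fs)
        rng = np.random.default_rng(4)
        tone = 4_000.0                      # rotor-stator interaction tone, Hz (assumed)
        x = np.sin(2 * np.pi * tone * t) + 0.3 * rng.standard_normal(t.size)   # tone + broadband

        b, a = iirnotch(w0=tone, Q=30.0, fs=fs)   # narrow notch centred on the tone
        y = filtfilt(b, a, x)                     # zero-phase filtering

        print("RMS before:", np.sqrt(np.mean(x**2)).round(3),
              " after:", np.sqrt(np.mean(y**2)).round(3))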

  3. Baseline dental plaque activity, mutans streptococci culture, and future caries experience in children.

    PubMed

    Hallett, Kerrod B; O'Rourke, Peter K

    2013-01-01

    The purpose of this study was to evaluate a chairside caries risk assessment protocol utilizing a caries prediction instrument, adenosine triphosphate (ATP) activity in dental plaque, mutans streptococci (MS) culture, and routine dental examination in five- to 10-year-old children at two regional Australian schools with high caries experience. Clinical indicators for future caries were assessed at baseline examination using a standardized prediction instrument. Plaque ATP activity was measured directly in relative light units (RLU) using a bioluminescence meter, and MS culture data were recorded. Each child's dentition was examined clinically and radiographically, and caries experience was recorded using enamel white spot lesions and decayed, missing, and filled surfaces for primary and permanent teeth indices. Univariate one-way analysis of variance between selected clinical indicators, ATP activity, MS count at baseline, and future new caries activity was performed, and a generalized linear model for prediction of new caries activity at 24 months was constructed. Future new caries activity was significantly associated with the presence of visible cavitations, reduced saliva flow, and orthodontic appliances at baseline (R² = 0.2, P < .001). Baseline plaque adenosine triphosphate activity and mutans streptococci counts were not significantly associated with caries activity at 24 months.

  4. Training Surgical Residents With a Haptic Robotic Central Venous Catheterization Simulator.

    PubMed

    Pepley, David F; Gordon, Adam B; Yovanoff, Mary A; Mirkin, Katelin A; Miller, Scarlett R; Han, David C; Moore, Jason Z

    Ultrasound guided central venous catheterization (CVC) is a common surgical procedure with complication rates ranging from 5 to 21 percent. Training is typically performed using manikins that do not simulate anatomical variations such as obesity and abnormal vessel positioning. The goal of this study was to develop and validate the effectiveness of a new virtual reality and force haptic based simulation platform for CVC of the right internal jugular vein. A CVC simulation platform was developed using a haptic robotic arm, 3D position tracker, and computer visualization. The haptic robotic arm simulated needle insertion force that was based on cadaver experiments. The 3D position tracker was used as a mock ultrasound device with realistic visualization on a computer screen. Upon completion of a practice simulation, performance feedback is given to the user through a graphical user interface including scoring factors based on good CVC practice. The effectiveness of the system was evaluated by training 13 first year surgical residents using the virtual reality haptic based training system over a 3 month period. The participants' performance increased from 52% to 96% on the baseline training scenario, approaching the average score of an expert surgeon: 98%. This also resulted in improvement in positive CVC practices, including a 61% decrease in the distance between the final needle tip position and the vein center, a decrease in mean insertion attempts from 1.92 to 1.23, and a 12% increase in time spent aspirating the syringe throughout the procedure. A virtual reality haptic robotic simulator for CVC was successfully developed. Surgical residents training on the simulation improved to near expert levels after three robotic training sessions. This suggests that this system could act as an effective training device for CVC. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  5. Recent evolution of the offline computing model of the NOvA experiment

    DOE PAGES

    Habig, Alec; Norman, A.; Group, Craig

    2015-12-23

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. Over the last few years there has been intense work to streamline the computing infrastructure in preparation for data, which started to flow in from the far detector in Fall 2013. Major accomplishments for this effort include migration to the use of off-site resources through the use of the Open Science Grid and upgrading the file-handling framework from simple disk storage to a tiered system using a comprehensive data management and delivery system to find and access files on either disk or tape storage. NOvA has already produced more than 6.5 million files and more than 1 PB of raw data and Monte Carlo simulation files which are managed under this model. In addition, the current system has demonstrated sustained rates of up to 1 TB/hour of file transfer by the data handling system. NOvA pioneered the use of new tools and this paved the way for their use by other Intensity Frontier experiments at Fermilab. Most importantly, the new framework places the experiment's infrastructure on a firm foundation, and is ready to produce the files needed for first physics.

  6. Recent Evolution of the Offline Computing Model of the NOvA Experiment

    NASA Astrophysics Data System (ADS)

    Habig, Alec; Norman, A.

    2015-12-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. Over the last few years there has been intense work to streamline the computing infrastructure in preparation for data, which started to flow in from the far detector in Fall 2013. Major accomplishments for this effort include migration to the use of off-site resources through the use of the Open Science Grid and upgrading the file-handling framework from simple disk storage to a tiered system using a comprehensive data management and delivery system to find and access files on either disk or tape storage. NOvA has already produced more than 6.5 million files and more than 1 PB of raw data and Monte Carlo simulation files which are managed under this model. The current system has demonstrated sustained rates of up to 1 TB/hour of file transfer by the data handling system. NOvA pioneered the use of new tools and this paved the way for their use by other Intensity Frontier experiments at Fermilab. Most importantly, the new framework places the experiment's infrastructure on a firm foundation, and is ready to produce the files needed for first physics.

  7. A Randomized Trial Comparing Acupuncture, Simulated Acupuncture, and Usual Care for Chronic Low Back Pain

    PubMed Central

    Cherkin, Daniel C.; Sherman, Karen J.; Avins, Andrew L.; Erro, Janet H.; Ichikawa, Laura; Barlow, William E.; Delaney, Kristin; Hawkes, Rene; Hamilton, Luisa; Pressman, Alice; Khalsa, Partap S.; Deyo, Richard A.

    2009-01-01

    Background Acupuncture is a popular complementary and alternative treatment for chronic back pain. Recent European trials suggest similar short-term benefits from real and sham acupuncture needling. This trial addresses the importance of needle placement and skin penetration in eliciting acupuncture effects for patients with chronic low back pain. Methods 638 adults with chronic mechanical low back pain were randomized to: individualized acupuncture, standardized acupuncture, simulated acupuncture, or usual care. Ten treatments were provided over 7 weeks by experienced acupuncturists. The primary outcomes were back-related dysfunction (Roland Disability score, range: 0 to 23) and symptom bothersomeness (0 to 10 scale). Outcomes were assessed at baseline and after 8, 26 and 52 weeks. Results At 8 weeks, mean dysfunction scores for the individualized, standardized, and simulated acupuncture groups improved by 4.4, 4.5, and 4.4 points, respectively, compared with 2.1 points for those receiving usual care (P<0.001). Participants receiving real or simulated acupuncture were more likely than those receiving usual care to experience clinically meaningful improvements on the dysfunction scale (60% vs. 39%, P<0.0001). Symptoms improved by 1.6 to 1.9 points in the treatment groups compared with 0.7 points in the usual care group (P<0.0001). After one year, participants in the treatment groups were more likely than those receiving usual care to experience clinically meaningful improvements in dysfunction (59% to 65% versus 50%, respectively, P=0.02) but not in symptoms (P>0.05). Conclusions Although acupuncture was found effective for chronic low back pain, tailoring needling sites to each patient and penetration of the skin appear to be unimportant in eliciting therapeutic benefits. These findings raise questions about acupuncture's purported mechanisms of action. It remains unclear whether acupuncture, or our simulated method of acupuncture, provides physiologically important stimulation or represents placebo or non-specific effects. PMID:19433697

  8. Confusing placebo effect with natural history in epilepsy: A big data approach.

    PubMed

    Goldenholz, Daniel M; Moss, Robert; Scott, Jonathan; Auh, Sungyoung; Theodore, William H

    2015-09-01

    For unknown reasons, placebos reduce seizures in clinical trials in many patients. It is also unclear why some drugs showing statistical superiority to placebo in one trial may fail to do so in another. Using Seizuretracker.com, a patient-centered database of 684,825 seizures, we simulated "placebo" and "drug" trials. These simulations were employed to clarify the sources of placebo effects in epilepsy, and to identify methods of diminishing placebo effects. Simulation 1 included 9 trials with a 6-week baseline and 6-week test period, starting at time 0, 3, 6…24 months. Here, "placebo" reduced seizures regardless of study start time. Regression-to-the-mean persisted only for 3 to 6 months. Simulation 2 comprised a 6-week baseline and then 2 years of follow-up. Seizure frequencies continued to improve throughout follow-up. Although the group improved, individuals switched from improvement to worsening and back. Simulation 3 involved a placebo-controlled "drug" trial, to explore methods of placebo response reduction. An efficacious "drug" failed to demonstrate a significant effect compared with "placebo" (p = 0.12), although modifications either in study start time (p = 0.025) or baseline population reduction (p = 0.0028) allowed the drug to achieve a statistically significant effect compared with placebo. In epilepsy clinical trials, some seizure reduction traditionally attributed to placebo effect may reflect the natural course of the disease itself. Understanding these dynamics will allow future investigations into optimal clinical trial design and may lead to identification of more effective therapies. Ann Neurol 2015;78:329-336. © 2015 American Neurological Association.
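
    The mechanism suggested above, in which apparent placebo response arises from natural fluctuation plus an enrolment threshold on the baseline period, can be reproduced with a toy simulation: no treatment effect is applied, yet the enrolled cohort improves on average. Parameters are illustrative and are not fitted to the SeizureTracker data.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 5_000
        base_rate = rng.gamma(shape=2.0, scale=3.0, size=n)        # seizures per 6 weeks

        def observe(rate):
            """Observed count: Poisson around a rate that wanders between periods."""
            wander = rng.lognormal(mean=0.0, sigma=0.4, size=rate.size)
            return rng.poisson(rate * wander)

        baseline = observe(base_rate)
        test = observe(base_rate)

        enrolled = baseline >= 4                                   # trial entry criterion
        pct_change = (test[enrolled] - baseline[enrolled]) / baseline[enrolled]
        responders = np.mean(pct_change <= -0.5)                   # >=50% reduction, no drug at all
        print(f"median change: {np.median(pct_change):+.1%}, 50%-responder rate: {responders:.1%}")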

  9. Using alcohol intoxication goggles (Fatal Vision® goggles) to detect alcohol related impairment in simulated driving.

    PubMed

    McCartney, Danielle; Desbrow, Ben; Irwin, Christopher

    2017-01-02

    Fatal vision goggles (FVGs) are image-distorting equipment used within driver education programs to simulate alcohol-related impairment. However, there is no empirical evidence comparing the behavioral effects associated with wearing FVGs to alcohol intoxication. The purpose of this study was to determine the validity of FVGs in producing alcohol-related impairment in simulated driving. Twenty-two healthy males (age: 23 ± 3 years, mean ± SD) participated in a placebo-controlled crossover design study involving 4 experimental trials. In each trial, participants completed a baseline simulated driving task followed by an experimental driving task, involving one of 4 treatments: (1) a dose of alcohol designed to elicit 0.080% breath alcohol concentration (BrAC; AB), (2) an alcohol placebo beverage (PB), (3) FVG (estimated % blood alcohol concentration [BAC] 0.070-0.100+), and (4) placebo goggles (PGs). The driving tasks included 3 separate scenarios lasting ∼5 min each; these were a simple driving scenario, a complex driving scenario, and a hazard perception driving scenario. Selected lateral control parameters (standard deviation of lane position [SDLP]; total number of lane crossings [LCs]) and longitudinal control parameters (average speed; standard deviation of speed [SDSP]; distance headway; minimum distance headway) were monitored during the simple and complex driving scenarios. Latency to 2 different stimuli (choice reaction time [CRT]) was tested in the hazard perception driving scenario. Subjective ratings of mood and attitudes toward driving were also provided during each of the trials. Neither placebo treatment influenced simulated driving performance. Mean BrAC was 0.060 ± 0.010% at the time of driving on the AB trial. Lateral control: In the simple driving scenario, SDLP and LC were not affected under any of the experimental treatments. However, in the complex driving scenario, significantly greater SDLP was observed on both the FVG and AB trials compared to their respective baseline drives. LC increased significantly from baseline on the AB trial only. Longitudinal control: Speed was not affected by any of the experimental treatments; however, SDSP increased significantly from baseline on the FVG trial. A significant reduction in distance headway and minimum distance headway was detected on the FVG trial compared to baseline. Hazard perception: Neither the AB nor the FVG trial influenced CRT. Subjective mood ratings were significantly altered on the AB and FVG trials compared to baseline and placebo conditions. Participants reported lower willingness and ability to drive under the active treatments (AB and FVG) than under the placebo treatments (PB and PG). FVGs may have some utility in replicating alcohol-related impairment on specific driving performance measurements. Hence, the equipment may offer an alternative approach to researching the impact of alcohol intoxication on simulated driving performance among populations where the provision of alcohol would otherwise be unethical (e.g., prelicensed drivers).

  10. Evaluation of a Standardized Program for Training Practicing Anesthesiologists in Ultrasound-Guided Regional Anesthesia Skills.

    PubMed

    Mariano, Edward R; Harrison, T Kyle; Kim, T Edward; Kan, Jack; Shum, Cynthia; Gaba, David M; Ganaway, Toni; Kou, Alex; Udani, Ankeet D; Howard, Steven K

    2015-10-01

    Practicing anesthesiologists have generally not received formal training in ultrasound-guided perineural catheter insertion. We designed this study to determine the efficacy of a standardized teaching program in this population. Anesthesiologists in practice for 10 years or more were recruited and enrolled to participate in a 1-day program: lectures and live-model ultrasound scanning (morning) and faculty-led iterative practice and mannequin-based simulation (afternoon). Participants were assessed and recorded while performing ultrasound-guided perineural catheter insertion at baseline, at midday (interval), and after the program (final). Videos were scored by 2 blinded reviewers using a composite tool and global rating scale. Participants were surveyed every 3 months for 1 year to report the number of procedures, efficacy of teaching methods, and implementation obstacles. Thirty-two participants were enrolled and completed the program; 31 of 32 (97%) completed the 1-year follow-up. Final scores [median (10th-90th percentiles)] were 21.5 (14.5-28.0) of 30 points compared to 14.0 (9.0-20.0) at interval (P < .001 versus final) and 12.0 (8.5-17.5) at baseline (P < .001 versus final), with no difference between interval and baseline. The global rating scale showed an identical pattern. Twelve of 26 participants without previous experience performed at least 1 perineural catheter insertion after training (P < .001). However, there were no differences in the monthly average number of procedures or complications after the course when compared to baseline. Practicing anesthesiologists without previous training in ultrasound-guided regional anesthesia can acquire perineural catheter insertion skills after a 1-day standardized course, but changing clinical practice remains a challenge. © 2015 by the American Institute of Ultrasound in Medicine.

  11. Long-baseline optical intensity interferometry. Laboratory demonstration of diffraction-limited imaging

    NASA Astrophysics Data System (ADS)

    Dravins, Dainis; Lagadec, Tiphaine; Nuñez, Paul D.

    2015-08-01

    Context. A long-held vision has been to realize diffraction-limited optical aperture synthesis over kilometer baselines. This will enable imaging of stellar surfaces and their environments, and reveal interacting gas flows in binary systems. An opportunity is now opening up with the large telescope arrays primarily erected for measuring Cherenkov light in air induced by gamma rays. With suitable software, such telescopes could be electronically connected and also used for intensity interferometry. Second-order spatial coherence of light is obtained by cross correlating intensity fluctuations measured in different pairs of telescopes. With no optical links between them, the error budget is set by the electronic time resolution of a few nanoseconds. Corresponding light-travel distances are approximately one meter, making the method practically immune to atmospheric turbulence or optical imperfections, permitting both very long baselines and observing at short optical wavelengths. Aims: Previous theoretical modeling has shown that full images should be possible to retrieve from observations with such telescope arrays. This project aims at verifying diffraction-limited imaging experimentally with groups of detached and independent optical telescopes. Methods: In a large optics laboratory, artificial stars (single and double, round and elliptic) were observed by an array of small telescopes. Using high-speed photon-counting solid-state detectors and real-time electronics, intensity fluctuations were cross-correlated over up to 180 baselines between pairs of telescopes, producing coherence maps across the interferometric Fourier-transform plane. Results: These interferometric measurements were used to extract parameters about the simulated stars, and to reconstruct their two-dimensional images. As far as we are aware, these are the first diffraction-limited images obtained from an optical array only linked by electronic software, with no optical connections between the telescopes. Conclusions: These experiments serve to verify the concepts for long-baseline aperture synthesis in the optical, somewhat analogous to radio interferometry.
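
    The measurement at the heart of the technique, the normalized cross-correlation of intensity fluctuations recorded by a pair of telescopes, is compact enough to sketch. The snippet below is a toy numpy illustration under assumed count rates and fluctuation amplitudes, not the laboratory pipeline; for a chaotic source this pair correlation tracks the squared visibility on that baseline, scaled by detector bandwidth and integration-time factors.

      import numpy as np

      rng = np.random.default_rng(1)

      def pair_correlation(i_a, i_b):
          """g2 - 1: normalized cross-correlation of intensity fluctuations for one telescope pair."""
          da, db = i_a - i_a.mean(), i_b - i_b.mean()
          return float(np.mean(da * db) / (i_a.mean() * i_b.mean()))

      # Toy photon-count streams for two telescopes: a shared fluctuating component (the partially
      # coherent starlight) plus independent shot noise.  All amplitudes here are assumptions.
      n = 500_000
      shared = rng.normal(0.0, 1.0, n)
      i_a = rng.poisson(np.clip(1000.0 + 40.0 * shared, 1.0, None)).astype(float)
      i_b = rng.poisson(np.clip(1000.0 + 24.0 * shared, 1.0, None)).astype(float)  # weaker shared part

      print(f"measured pair correlation (g2 - 1): {pair_correlation(i_a, i_b):.2e}")

    Repeating this over many telescope pairs fills the coherence map across the interferometric Fourier plane from which the image is then reconstructed.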

  12. Non-Dependent and Dependent Daily Cannabis Users Differ in Mental Health but Not Prospective Memory Ability

    PubMed Central

    Braidwood, Ruth; Mansell, Samantha; Waldron, Jon; Rendell, Peter G.; Kamboj, Sunjeev K.; Curran, H. Valerie

    2018-01-01

    Research suggests that daily cannabis users have impaired memory for past events, but it is not clear whether they are also impaired in prospective memory (PM) for future events. The present study examined PM in daily cannabis users who were either dependent (n = 18) or non-dependent (n = 18), and compared them with non-using controls (n = 18). The effect of future event simulation (FES) on PM performance was also examined. Participants were matched across groups on age, gender, and highest level of education. The virtual week (VW) was used to objectively assess PM abilities, both at baseline and following FES. Other measures used were: cannabis use variables, immediate and delayed prose recall, phonemic and category fluency, spot-the-word test (premorbid intelligence), Beck Depression Inventory, Beck Anxiety Inventory, and a measure of schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences: unusual experiences subscale). No group differences were found in PM performance on the VW, and FES did not improve PM performance in any group. Dependent cannabis users scored higher on depression, anxiety, and schizotypy than both other groups with non-dependent cannabis users scoring at a similar level to controls. There were no group differences in alcohol use. Findings suggest that when carefully matched on baseline variables, and not differing in premorbid IQ or alcohol use, young, near-daily cannabis users do not differ from non-using controls in PM performance. PMID:29636705

  13. The effect of tracking network configuration on GPS baseline estimates for the CASA Uno experiment

    NASA Technical Reports Server (NTRS)

    Wolf, S. Kornreich; Dixon, T. H.; Freymueller, J. T.

    1990-01-01

    The effect of the tracking network on long (greater than 100 km) GPS baseline estimates was assessed using various subsets of the global tracking network initiated by the first Central and South America (CASA Uno) experiment. It was found that best results could be obtained with a global tracking network consisting of three U.S. stations, two sites in the southwestern Pacific, and two sites in Europe. In comparison with smaller subsets, this global network improved the baseline repeatability, the resolution of carrier phase cycle ambiguities, and the formal errors of the orbit estimates.

  14. Numerical aerodynamic simulation facility preliminary study: Executive study

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A computing system was designed with the capability of providing an effective throughput of one billion floating point operations per second for three dimensional Navier-Stokes codes. The methodology used in defining the baseline design, and the major elements of the numerical aerodynamic simulation facility are described.

  15. Network Design in Close-Range Photogrammetry with Short Baseline Images

    NASA Astrophysics Data System (ADS)

    Barazzetti, L.

    2017-08-01

    The availability of automated software for image-based 3D modelling has changed the way people acquire images for photogrammetric applications. Short baseline images are required to match image points with SIFT-like algorithms, which yields more images than are necessary for "old-fashioned" photogrammetric projects based on manual measurements. This paper describes some considerations on network design for short baseline image sequences, especially on precision and reliability of bundle adjustment. Simulated results reveal that the large number of 3D points used for image orientation has very limited impact on network precision.

  16. Measuring continuous baseline covariate imbalances in clinical trial data

    PubMed Central

    Ciolino, Jody D.; Martin, Renee’ H.; Zhao, Wenle; Hill, Michael D.; Jauch, Edward C.; Palesch, Yuko Y.

    2014-01-01

    This paper presents and compares several methods of measuring continuous baseline covariate imbalance in clinical trial data. Simulations illustrate that though the t-test is an inappropriate method of assessing continuous baseline covariate imbalance, the test statistic itself is a robust measure in capturing imbalance in continuous covariate distributions. Guidelines to assess effects of imbalance on bias, type I error rate, and power for hypothesis test for treatment effect on continuous outcomes are presented, and the benefit of covariate-adjusted analysis (ANCOVA) is also illustrated. PMID:21865270
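
    The two ideas in this abstract, using the t statistic descriptively to quantify baseline imbalance and adjusting for the covariate with ANCOVA, can be made concrete with a minimal sketch. The code below uses simulated data and the standard scipy/statsmodels APIs; the sample size and effect sizes are assumptions, not values from the paper.

      import numpy as np
      from scipy import stats
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 200
      group = rng.integers(0, 2, n)                          # 0 = control, 1 = treatment
      x = rng.normal(50, 10, n) + 2.0 * group                # baseline covariate, mildly imbalanced
      y = 0.5 * x + 5.0 * group + rng.normal(0, 5, n)        # outcome; true treatment effect = 5

      # t statistic used descriptively as a measure of imbalance (not as a hypothesis test)
      t_stat, _ = stats.ttest_ind(x[group == 1], x[group == 0])
      print(f"baseline imbalance, t statistic: {t_stat:.2f}")

      # Unadjusted vs covariate-adjusted (ANCOVA) estimates of the treatment effect
      unadjusted = sm.OLS(y, sm.add_constant(group.astype(float))).fit()
      ancova = sm.OLS(y, sm.add_constant(np.column_stack([group, x]))).fit()
      print(f"unadjusted effect: {unadjusted.params[1]:.2f}")
      print(f"ANCOVA-adjusted effect: {ancova.params[1]:.2f}")

    The ANCOVA estimate is typically closer to the true effect and more precise whenever the covariate is correlated with the outcome, which is the benefit the paper illustrates.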

  17. Numerical simulation of supersonic flow using a new analytical bleed boundary condition

    NASA Technical Reports Server (NTRS)

    Harloff, G. J.; Smith, G. E.

    1995-01-01

    A new analytical bleed boundary condition is used to compute flowfields for a strong oblique shock wave/boundary layer interaction with a baseline and three bleed rates at a freestream Mach number of 2.47 with an 8 deg shock generator. The computational results are compared to experimental Pitot pressure profiles and wall static pressures through the interaction region. An algebraic turbulence model is employed for the bleed and baseline cases, and a one equation model is also used for the baseline case where the boundary layer is separated.

  18. The effect of perceptual grouping on haptic numerosity perception.

    PubMed

    Verlaers, K; Wagemans, J; Overvliet, K E

    2015-01-01

    We used a haptic enumeration task to investigate whether enumeration can be facilitated by perceptual grouping in the haptic modality. Eight participants were asked to count tangible dots as quickly and accurately as possible, while moving their finger pad over a tactile display. In Experiment 1, we manipulated the number and organization of the dots, while keeping the total exploration area constant. The dots were either evenly distributed on a horizontal line (baseline condition) or organized into groups based on either proximity (dots placed in closer proximity to each other) or configural cues (dots placed in a geometric configuration). In Experiment 2, we varied the distance between the subsets of dots. We hypothesized that when subsets of dots can be grouped together, the enumeration time will be shorter and accuracy will be higher than in the baseline condition. The results of both experiments showed faster enumeration for the configural condition than for the baseline condition, indicating that configural grouping also facilitates haptic enumeration. In Experiment 2, faster enumeration was also observed for the proximity condition than for the baseline condition. Thus, perceptual grouping speeds up haptic enumeration by both configural and proximity cues, suggesting that similar mechanisms underlie perceptual grouping in both visual and haptic enumeration.

  19. Computable general equilibrium model fiscal year 2013 capability development report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Brian Keith; Rivera, Michael Kelly; Boero, Riccardo

    This report documents progress made on continued developments of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC updated the treatment of the labor market and performed tests with the model to examine the properties of the solutions computed by the model. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine the simulation properties of the model in more detail.

  20. Predicting the Consequences of Workload Management Strategies with Human Performance Modeling

    NASA Technical Reports Server (NTRS)

    Mitchell, Diane Kuhl; Samma, Charneta

    2011-01-01

    Human performance modelers at the US Army Research Laboratory have developed an approach for establishing high Soldier workload that can be used for analyses of proposed system designs. Their technique includes three key components. To implement the approach in an experiment, the researcher would create two experimental conditions: a baseline and a design alternative. Next, they would identify a scenario in which the test participants perform all their representative concurrent interactions with the system. This scenario should include any events that would trigger a different set of goals for the human operators. They would collect workload values during both the baseline and alternative design conditions to see if the alternative increased workload and decreased performance. They have successfully implemented this approach for military vehicle designs using the human performance modeling tool IMPRINT. Although ARL researchers use IMPRINT to implement their approach, it can be applied to any workload analysis. Researchers using other modeling and simulation tools or conducting experiments or field tests can use the same approach.

  1. Utilization of cues in action anticipation in table tennis players.

    PubMed

    Zhao, Qi; Lu, Yingzhi; Jaquess, Kyle J; Zhou, Chenglin

    2018-04-11

    By manipulating the congruency between body kinematics and subsequent ball trajectory, this study investigated the anticipation capabilities of regional-level, college-level, and novice table tennis players using a full video simulation occluder paradigm. Participants watched footage containing congruent, incongruent, or no ball trajectory information, to predict the landing point of the ball. They were required to choose between two potential locations to make their prediction. Percent accuracy and relevant indexes (d-prime, criterion, effect size) were calculated for each condition. Results indicated that experienced table tennis players (both regional and college players) were superior to novices in the ability to anticipate ball trajectory using kinematic information, but no difference was found between regional-level and college-level players. The findings of this study further demonstrate the superior anticipation ability of experienced table tennis players. Furthermore, the present result suggests that there may be a certain "baseline" level of motor experience in racquet sports for effective action anticipation, while the addition of further motor experience does not appear to assist direction anticipation.

  2. S-Duct Engine Inlet Flow Control Using SDBD Plasma Streamwise Vortex Generators

    NASA Astrophysics Data System (ADS)

    Kelley, Christopher; He, Chuan; Corke, Thomas

    2009-11-01

    The results of a numerical simulation and experiment characterizing the performance of plasma streamwise vortex generators in controlling separation and secondary flow within a serpentine, diffusing duct are presented. A no flow control case is first run to check agreement of location of separation, development of secondary flow, and total pressure recovery between the experiment and numerical results. Upon validation, passive vane-type vortex generators and plasma streamwise vortex generators are implemented to increase total pressure recovery and reduce flow distortion at the aerodynamic interface plane: the exit of the S-duct. Total pressure recovery is found experimentally with a pitot probe rake assembly at the aerodynamic interface plane. Stagnation pressure distortion descriptors are also presented to show the performance increase with plasma streamwise vortex generators in comparison to the baseline no flow control case. These performance parameters show that streamwise plasma vortex generators are an effective alternative to vane-type vortex generators in total pressure recovery and total pressure distortion reduction in S-duct inlets.

  3. The Effects of Baseline Estimation on the Reliability, Validity, and Precision of CBM-R Growth Estimates

    ERIC Educational Resources Information Center

    Van Norman, Ethan R.; Christ, Theodore J.; Zopluoglu, Cengiz

    2013-01-01

    This study examined the effect of baseline estimation on the quality of trend estimates derived from Curriculum Based Measurement of Oral Reading (CBM-R) progress monitoring data. The authors used a linear mixed effects regression (LMER) model to simulate progress monitoring data for schedules ranging from 6-20 weeks for datasets with high and low…

  4. Implications of changing water cycle for the performance and yield characteristics of the multi-purpose Beas Reservoir in India

    NASA Astrophysics Data System (ADS)

    Adeloye, A. J.; Ojha, C. S.; Soundharajan, B.; Remesan, R.

    2013-12-01

    There is considerable change in both the spatial and temporal patterns of monsoon rainfall in India, with implications for water resources availability and security. 'Mitigating the Impacts of Climate Change on India Agriculture' (MICCI) is one of five on-going scientific efforts being sponsored as part of the UK-NERC/India-MOES Changing Water Cycle (South Asia) initiative to further the understanding of the problem and proffer solutions that are robust and effective. This paper focuses on assessing the implications of projected climate change on the yield and performance characteristics of the Pong Reservoir on the Beas River, Himachal Pradesh, India. The Pong serves both hydropower and irrigation needs and is therefore strategic for the socio-economic well-being of the region as well as sustaining the livelihoods of millions of farmers that rely on it for irrigation. Simulated baseline and climate-change perturbed hydro-climate scenarios developed as part of a companion Work Package of MICCI formed the basis of the analysis. For both of these scenarios, reservoir analyses were carried out using the Sequent Peak Algorithm (SPA) and Pong's existing level of releases to derive rule curves for the reservoir. These rule curves then formed the basis of further reservoir behaviour simulations in WEAP and the resulting performance of the reservoir was summarised in terms of reliability, resilience, vulnerability and sustainability. The whole exercise was implemented within a Monte Carlo framework for the benefit of characterising the variability in the assessments. The results show that the rule curves developed using future hydro-climate are significantly changed from the baseline in that higher storages will be required to be maintained in the Pong in the future to achieve reliable performance. As far as the overall performance of the reservoir is concerned, future reliability (both time-based and volume-based) is not significantly different from the baseline, provided the future simulations adopt the future rule curves. This is, however, not the case with the resilience, with the future hydro-climate resulting in a less resilient system when compared with the baseline. The resilience is the ability of the system to recover from a hydrological failure; consequently, lower resilience for the future systems is an indication that longer, continuous failure periods are likely with implications for the two purposes of the reservoir. For example, extended periods of water scarcity that may result from a low resilient system will mean that crops are likely to experience longer periods of water stress with implications for crop yields. In such situations, better operational practices that manage the available water through hedging and irrigation water scheduling will be required. Other interventions may include the introduction of water from other sources, e.g. groundwater.
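
    Two of the building blocks mentioned here, the Sequent Peak Algorithm used to size active storage for a given demand and the reliability/resilience/vulnerability summary of a reservoir behaviour simulation, are simple enough to sketch. The code below is a schematic Python illustration with synthetic inflows and an assumed constant demand; it is not the MICCI/WEAP workflow and ignores evaporation, rule curves, and multiple uses.

      import numpy as np

      def sequent_peak_storage(inflows, demand):
          """Sequent Peak Algorithm: minimum active storage needed to meet a constant demand."""
          deficit, worst = 0.0, 0.0
          for q in inflows:
              deficit = max(0.0, deficit + demand - q)   # cumulative shortfall, reset on surplus
              worst = max(worst, deficit)
          return worst

      def rrv(capacity, inflows, demand):
          """Time-based reliability, resilience, and vulnerability from a simple behaviour simulation."""
          storage, prev_failed = capacity, False
          failed, recoveries, shortfalls = [], 0, []
          for q in inflows:
              storage = min(capacity, storage + q)       # add inflow, spill anything above capacity
              release = min(demand, storage)
              storage -= release
              fail = release < demand
              failed.append(fail)
              if fail:
                  shortfalls.append(demand - release)
              if prev_failed and not fail:
                  recoveries += 1
              prev_failed = fail
          failed = np.array(failed)
          reliability = 1.0 - failed.mean()
          resilience = recoveries / max(int(failed.sum()), 1)
          vulnerability = float(np.mean(shortfalls)) / demand if shortfalls else 0.0
          return reliability, resilience, vulnerability

      inflows = np.random.default_rng(3).gamma(2.0, 50.0, 12 * 30)   # 30 years of synthetic monthly flows
      demand = 90.0
      capacity = sequent_peak_storage(inflows, demand)
      print(f"SPA storage estimate: {capacity:.0f}")
      print("RRV at full SPA storage:   ", rrv(capacity, inflows, demand))
      print("RRV at half of SPA storage:", rrv(0.5 * capacity, inflows, demand))

    Repeating the behaviour simulation with baseline and climate-perturbed inflow series, as the study does within a Monte Carlo framework, is what exposes changes in resilience even when reliability is preserved.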

  5. Cognitive Load in Mastoidectomy Skills Training: Virtual Reality Simulation and Traditional Dissection Compared.

    PubMed

    Andersen, Steven Arild Wuyts; Mikkelsen, Peter Trier; Konge, Lars; Cayé-Thomasen, Per; Sørensen, Mads Sølvsten

    2016-01-01

    The cognitive load (CL) theoretical framework suggests that working memory is limited, which has implications for learning and skills acquisition. Complex learning situations such as surgical skills training can potentially induce a cognitive overload, inhibiting learning. This study aims to compare CL in traditional cadaveric dissection training and virtual reality (VR) simulation training of mastoidectomy. A prospective, crossover study. Participants performed cadaveric dissection before VR simulation of the procedure or vice versa. CL was estimated by secondary-task reaction time testing at baseline and during the procedure in both training modalities. The national Danish temporal bone course. A total of 40 novice otorhinolaryngology residents. Reaction time was increased by 20% in VR simulation training and 55% in cadaveric dissection training of mastoidectomy compared with baseline measurements. Traditional dissection training increased CL significantly more than VR simulation training (p < 0.001). VR simulation training imposed a lower CL than traditional cadaveric dissection training of mastoidectomy. Learning complex surgical skills can be a challenge for the novice and mastoidectomy skills training could potentially be optimized by employing VR simulation training first because of the lower CL. Traditional dissection training could then be used to supplement skills training after basic competencies have been acquired in the VR simulation. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  6. Simulated effects of increased recharge on the ground-water flow system of Yucca Mountain and vicinity, Nevada-California

    USGS Publications Warehouse

    Czarnecki, J.B.

    1984-01-01

    A study was performed to assess the potential effects of changes in future climatic conditions on the groundwater system in the vicinity of Yucca Mountain, the site of a potential mined geologic repository for high-level nuclear wastes. These changes probably would result in greater rates of precipitation and, consequently, greater rates of recharge. The study was performed by simulating the groundwater system, using a two-dimensional, finite-element, groundwater flow model. The simulated position of the water table rose as much as 130 meters near the U.S. Department of Energy's preferred repository area at Yucca Mountain for a simulation involving a 100-percent increase in precipitation compared to modern-day conditions. Despite the water table rise, no flooding of the potential repository would occur at its current proposed location. According to the simulation, springs would discharge south and west of Timber Mountain, along Fortymile Canyon, in the Amargosa Desert near Lathrop Wells and Franklin Lake playa, and near Furnace Creek Ranch in Death Valley, where they presently discharge. Simulated directions of groundwater flow paths near the potential repository area generally would be the same for the baseline (modern-day climate) and the increased-recharge simulations, but the magnitude of flow would increase by 2 to 4 times that of the baseline-simulation flow. (USGS)

  7. Effects of Airport Tower Controller Decision Support Tool on Controllers Head-Up Time

    NASA Technical Reports Server (NTRS)

    Hayashi, Miwa; Cruz Lopez, Jose M.

    2013-01-01

    Although aircraft positions and movements can be easily monitored on the radar displays at major airports nowadays, it is still important for air traffic control tower (ATCT) controllers to look outside the window as much as possible to ensure safe traffic management. The present paper investigates whether the introduction of NASA's proposed Spot and Runway Departure Advisor (SARDA), a decision support tool for the ATCT controller, would increase or decrease the controllers' head-up time. SARDA provides the controller with departure-release schedule advisories, i.e., when to release each departure aircraft in order to minimize each aircraft's fuel consumption on taxiways and simultaneously maximize the overall runway throughput. The SARDA advisories were presented on electronic flight strips (EFS). To investigate effects on the head-up time, a human-in-the-loop simulation experiment with two retired ATCT controller participants was conducted in a high-fidelity ATCT cab simulator with a 360-degree computer-generated out-the-window view. Each controller participant wore a video camera on one side of the head, facing forward. The video data were later used to calculate their line of sight at each moment and eventually identify their head-up times. Four sessions were run with the SARDA advisories, and four sessions were run without (baseline). Traffic-load levels were varied in each session. The same user interface - the EFS and radar displays - was used in both the advisory and baseline sessions to make them directly comparable. The paper reports the findings and discusses their implications.

  8. Subarray Processing for Projection-based RFI Mitigation in Radio Astronomical Interferometers

    NASA Astrophysics Data System (ADS)

    Burnett, Mitchell C.; Jeffs, Brian D.; Black, Richard A.; Warnick, Karl F.

    2018-04-01

    Radio Frequency Interference (RFI) is a major problem for observations in Radio Astronomy (RA). Adaptive spatial filtering techniques such as subspace projection are promising candidates for RFI mitigation; however, for radio interferometric imaging arrays, these have primarily been used in engineering demonstration experiments rather than mainstream scientific observations. This paper considers one reason that adoption of such algorithms is limited: RFI decorrelates across the interferometric array because of long baseline lengths. This occurs when the relative RFI time delay along a baseline is large compared to the frequency channel inverse bandwidth used in the processing chain. Maximum achievable excision of the RFI is limited by covariance matrix estimation error when identifying interference subspace parameters, and decorrelation of the RFI introduces errors that corrupt the subspace estimate, rendering subspace projection ineffective over the entire array. In this work, we present an algorithm that overcomes this challenge of decorrelation by applying subspace projection via subarray processing (SP-SAP). Each subarray is designed to have a set of elements with high mutual correlation in the interferer for better estimation of subspace parameters. In an RFI simulation scenario for the proposed ngVLA interferometric imaging array with 15 kHz channel bandwidth for correlator processing, we show that compared to the former approach of applying subspace projection on the full array, SP-SAP improves mitigation of the RFI on the order of 9 dB. An example of improved image synthesis and reduced RFI artifacts for a simulated image “phantom” using the SP-SAP algorithm is presented.
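
    The core operation being adapted here, estimating the dominant interference subspace from a sample covariance matrix and projecting it out, is compact enough to sketch. The snippet below shows plain subspace projection on a single small array with a fully correlated toy interferer; SP-SAP's contribution, per the abstract, is to apply this per subarray chosen so that the RFI remains mutually correlated within each one. The array size, powers, and rank-1 interferer are assumptions for illustration.

      import numpy as np

      def subspace_project(R, n_rfi=1):
          """Remove the n_rfi strongest eigen-directions (estimated RFI subspace) from covariance R."""
          eigvals, eigvecs = np.linalg.eigh(R)            # eigenvalues in ascending order
          U = eigvecs[:, -n_rfi:]                         # dominant subspace = interference estimate
          P = np.eye(R.shape[0]) - U @ U.conj().T         # orthogonal projection away from it
          return P @ R @ P.conj().T

      rng = np.random.default_rng(4)
      n_ant, n_samp = 8, 20_000
      steer = np.exp(2j * np.pi * rng.random(n_ant))                          # toy RFI steering vector
      rfi = np.outer(steer, 10.0 * rng.normal(size=n_samp))                   # strong, fully correlated
      noise = (rng.normal(size=(n_ant, n_samp)) + 1j * rng.normal(size=(n_ant, n_samp))) / np.sqrt(2)
      x = rfi + noise
      R = x @ x.conj().T / n_samp

      print("largest eigenvalue before/after projection:",
            np.linalg.eigvalsh(R)[-1], np.linalg.eigvalsh(subspace_project(R))[-1])

    When the RFI decorrelates across long baselines, as the abstract describes, the rank-1 structure assumed above breaks down for the full array, which is why the projection is instead applied within subarrays where the correlation survives.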

  9. Estimation of Separation Buffers for Wind-Prediction Error in an Airborne Separation Assistance System

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria C.; Hoadley, Sherwood T.; Allen, B. Danette

    2009-01-01

    Wind prediction errors are known to affect the performance of automated air traffic management tools that rely on aircraft trajectory predictions. In particular, automated separation assurance tools, planned as part of the NextGen concept of operations, must be designed to account and compensate for the impact of wind prediction errors and other system uncertainties. In this paper we describe a high fidelity batch simulation study designed to estimate the separation distance required to compensate for the effects of wind-prediction errors throughout increasing traffic density on an airborne separation assistance system. These experimental runs are part of the Safety Performance of Airborne Separation experiment suite that examines the safety implications of prediction errors and system uncertainties on airborne separation assurance systems. In this experiment, wind-prediction errors were varied between zero and forty knots while traffic density was increased several times current traffic levels. In order to accurately measure the full unmitigated impact of wind-prediction errors, no uncertainty buffers were added to the separation minima. The goal of the study was to measure the impact of wind-prediction errors in order to estimate the additional separation buffers necessary to preserve separation and to provide a baseline for future analyses. Buffer estimations from this study will be used and verified in upcoming safety evaluation experiments under similar simulation conditions. Results suggest that the strategic airborne separation functions exercised in this experiment can sustain wind prediction errors up to 40kts at current day air traffic density with no additional separation distance buffer and at eight times the current day with no more than a 60% increase in separation distance buffer.

  10. Simulation of the effects of cavitation and anatomy in the shock path of model lithotripters

    PubMed Central

    Krimmel, Jeff; Colonius, Tim; Tanguay, Michel

    2011-01-01

    We report on recent efforts to develop predictive models for the pressure and other flow variables in the focal region of shock wave lithotripters. Baseline simulations of three representative lithotripters (electrohydraulic, electromagnetic, and piezoelectric) compare favorably with in vitro experiments (in a water bath). We proceed to model and investigate how shock focusing is altered by the presence of material interfaces associated with different types of tissue encountered along the shock path, and by the presence of cavitation bubbles that are excited by tensile pressures associated with the focused shock wave. We use human anatomical data, but simplify the description by assuming that the tissue behaves as a fluid, and by assuming cylindrical symmetry along the shock path. Scattering by material interfaces is significant, and regions of high pressure amplitudes (both compressive and tensile) are generated almost 4 cm postfocus. Bubble dynamics generate secondary shocks whose strength depends on the density of bubbles and the pulse repetition frequency (PRF). At sufficiently large densities, the bubbles also attenuate the shock. Together with experimental evidence, the simulations suggest that high PRF may be counter-productive for stone comminution. Finally, we discuss how the lithotripter simulations can be used as input to more detailed physical models that attempt to characterize the mechanisms by which collapsing cavitation models erode stones, and by which shock waves and bubbles may damage tissue. PMID:21063697

  11. Tractable flux-driven temperature, density, and rotation profile evolution with the quasilinear gyrokinetic transport model QuaLiKiz

    NASA Astrophysics Data System (ADS)

    Citrin, J.; Bourdelle, C.; Casson, F. J.; Angioni, C.; Bonanomi, N.; Camenen, Y.; Garbet, X.; Garzotti, L.; Görler, T.; Gürcan, O.; Koechl, F.; Imbeaux, F.; Linder, O.; van de Plassche, K.; Strand, P.; Szepesi, G.; Contributors, JET

    2017-12-01

    Quasilinear turbulent transport models are a successful tool for prediction of core tokamak plasma profiles in many regimes. Their success hinges on the reproduction of local nonlinear gyrokinetic fluxes. We focus on significant progress in the quasilinear gyrokinetic transport model QuaLiKiz (Bourdelle et al 2016 Plasma Phys. Control. Fusion 58 014036), which employs an approximated solution of the mode structures to significantly speed up computation time compared to full linear gyrokinetic solvers. Optimisation of the dispersion relation solution algorithm within integrated modelling applications leads to flux calculations 10^6-10^7 times faster than local nonlinear simulations. This allows tractable simulation of flux-driven dynamic profile evolution including all transport channels: ion and electron heat, main particles, impurities, and momentum. Furthermore, QuaLiKiz now includes the impact of rotation and temperature anisotropy induced poloidal asymmetry on heavy impurity transport, important for W-transport applications. Application within the JETTO integrated modelling code results in 1 s of JET plasma simulation within 10 h using 10 CPUs. Simultaneous predictions of core density, temperature, and toroidal rotation profiles for both JET hybrid and baseline experiments are presented, covering both ion and electron turbulence scales. The simulations are successfully compared to measured profiles, with agreement mostly in the 5%-25% range according to standard figures of merit. QuaLiKiz is now open source and available at www.qualikiz.com.

  12. Midplane neutral density profiles in the National Spherical Torus Experiment

    DOE PAGES

    Stotler, D. P.; Scotti, F.; Bell, R. E.; ...

    2015-08-13

    Atomic and molecular density data in the outer midplane of NSTX [Ono et al., Nucl. Fusion 40, 557 (2000)] are inferred from tangential camera data via a forward modeling procedure using the DEGAS 2 Monte Carlo neutral transport code. The observed Balmer-β light emission data from 17 shots during the 2010 NSTX campaign display no obvious trends with discharge parameters such as the divertor Balmer-α emission level or edge deuterium ion density. Simulations of 12 time slices in 7 of these discharges produce molecular densities near the vacuum vessel wall of 2-8 × 10^17 m^-3 and atomic densities ranging from 1 to 7 × 10^16 m^-3; neither has a clear correlation with other parameters. Validation of the technique, begun in an earlier publication, is continued with an assessment of the sensitivity of the simulated camera image and neutral densities to uncertainties in the data input to the model. The simulated camera image is sensitive to the plasma profiles and virtually nothing else. The neutral densities at the vessel wall depend most strongly on the spatial distribution of the source; simulations with a localized neutral source yield densities within a factor of two of the baseline, uniform source, case. Furthermore, the uncertainties in the neutral densities associated with other model inputs and assumptions are ≤ 50%.

  13. A Novel Method to Simulate the Progression of Collagen Degeneration of Cartilage in the Knee: Data from the Osteoarthritis Initiative

    PubMed Central

    Mononen, Mika E.; Tanska, Petri; Isaksson, Hanna; Korhonen, Rami K.

    2016-01-01

    We present a novel algorithm combined with computational modeling to simulate the development of knee osteoarthritis. The degeneration algorithm was based on excessive and cumulatively accumulated stresses within knee joint cartilage during physiological gait loading. In the algorithm, the collagen network stiffness of cartilage was reduced iteratively if excessive maximum principal stresses were observed. The developed algorithm was tested and validated against experimental baseline and 4-year follow-up Kellgren-Lawrence grades, indicating different levels of cartilage degeneration at the tibiofemoral contact region. Test groups consisted of normal weight and obese subjects with the same gender and similar age and height without osteoarthritic changes. The algorithm accurately simulated cartilage degeneration as compared to the Kellgren-Lawrence findings in the subject group with excess weight, while the healthy subject group’s joint remained intact. Furthermore, the developed algorithm followed the experimentally found trend of cartilage degeneration in the obese group (R2 = 0.95, p < 0.05; experiments vs. model), in which the rapid degeneration immediately after initiation of osteoarthritis (0–2 years, p < 0.001) was followed by a slow or negligible degeneration (2–4 years, p > 0.05). The proposed algorithm revealed a great potential to objectively simulate the progression of knee osteoarthritis. PMID:26906749
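
    The degeneration rule itself, iteratively reducing collagen stiffness wherever the maximum principal stress is excessive and then re-evaluating the stresses, can be sketched independently of the finite-element machinery. Below, a toy parallel load-sharing rule stands in for the FE gait simulation, and the threshold, reduction step, and stiffness floor are assumed values, not those of the published algorithm.

      import numpy as np

      def degenerate(E0, stress_fn, threshold, step=0.05, floor_frac=0.1, max_iter=200):
          """Iteratively soften elements whose maximum principal stress exceeds the threshold."""
          E = E0.copy()
          for _ in range(max_iter):
              overloaded = stress_fn(E) > threshold
              if not overloaded.any():
                  break
              E[overloaded] = np.maximum(E[overloaded] * (1.0 - step), floor_frac * E0[overloaded])
          return E

      rng = np.random.default_rng(5)
      E0 = rng.uniform(5.0, 15.0, 100)                    # element-wise collagen stiffness (arbitrary units)
      load = 800.0
      stress_fn = lambda E: load * E / E.sum()            # toy surrogate: stiffer elements attract more load

      E_final = degenerate(E0, stress_fn, threshold=9.0)
      print(f"{(E_final < E0).sum()} of {E0.size} elements degenerated; "
            f"peak stress {stress_fn(E_final).max():.2f} (threshold 9.0)")

    In the published model the stresses come from a full knee-joint FE simulation of gait and the iteration represents years of loading, but the feedback structure is the same: overloading softens the network, which redistributes load until stresses fall below the threshold.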

  14. Measurement and reduction of system latency in see-through helmet mounted display (HMD) systems

    NASA Astrophysics Data System (ADS)

    Vincenzi, Dennis A.; Deaton, John E.; Blickenderfer, Elizabeth L.; Pray, Rick; Williams, Barry; Buker, Timothy J.

    2010-04-01

    Future military aviation platforms such as the proposed Joint Strike Fighter F-35 will integrate helmet mounted displays (HMDs) with the avionics and weapon systems to the degree that the HMDs will become the aircraft's primary display system. In turn, training of pilot flight skills using HMDs will be essential in future training systems. In order to train these skills using simulation-based training, improvements must be made in the integration of HMDs with out-the-window (OTW) simulations. Currently, problems such as latency contribute to the onset of simulator sickness and provide distractions during training with HMD simulator systems that degrade the training experience. Previous research has used Kalman predictive filters as a means of mitigating the system latency present in these systems. While this approach has yielded some success, more work is needed to develop innovative and improved strategies that reduce system latency as well as to include data collected from the user perspective as a measured variable during test and evaluation of latency reduction strategies. The purpose of this paper is twofold. First, the paper describes a new method to measure and assess system latency from the user perspective. Second, the paper describes use of the testbed to examine the efficacy of an innovative strategy that combines a customized Kalman filter with a neural network approach to mitigate system latency. Results indicate that the combined approach reduced system latency significantly when compared to baseline data and the traditional Kalman filter. Reduced latency errors should mitigate the onset of simulator sickness and ease simulator sickness symptomatology. Implications for training systems will be discussed.
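
    The Kalman-filter half of the latency-mitigation strategy is the more standard piece: a constant-velocity filter tracks the head-orientation signal and extrapolates it forward by the measured system latency, so the rendered scene corresponds to where the head will be when the frame is displayed. The sketch below is a generic illustration with assumed noise parameters, sample rate, and latency; the paper's customized filter and its neural-network component are not reproduced here.

      import numpy as np

      def predictive_kalman(z, dt=1/60, latency=0.05, q=2000.0, r=0.04):
          """Constant-velocity Kalman filter that predicts the tracked angle `latency` seconds ahead."""
          F = np.array([[1.0, dt], [0.0, 1.0]])                 # state transition (angle, angular rate)
          H = np.array([[1.0, 0.0]])                            # only the angle is measured
          Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
          R = np.array([[r]])
          x, P = np.zeros(2), np.eye(2)
          predictions = []
          for zk in z:
              x, P = F @ x, F @ P @ F.T + Q                     # time update
              S = H @ P @ H.T + R
              K = P @ H.T @ np.linalg.inv(S)                    # Kalman gain
              x = x + (K @ (np.array([zk]) - H @ x)).ravel()    # measurement update
              P = (np.eye(2) - K @ H) @ P
              predictions.append(x[0] + latency * x[1])         # extrapolate past the display latency
          return np.array(predictions)

      # Toy head-yaw trace: compare the 50 ms look-ahead prediction with the true future yaw.
      t = np.arange(0.0, 5.0, 1 / 60)
      yaw = 30.0 * np.sin(2 * np.pi * 0.5 * t)
      noisy = yaw + np.random.default_rng(6).normal(0.0, 0.2, t.size)
      pred = predictive_kalman(noisy)
      lead = 3                                                  # 50 ms at 60 Hz is 3 frames
      print(f"RMS look-ahead error: {np.sqrt(np.mean((pred[:-lead] - yaw[lead:]) ** 2)):.2f} deg")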

  15. Evaluation of very long baseline interferometry atmospheric modeling improvements

    NASA Technical Reports Server (NTRS)

    Macmillan, D. S.; Ma, C.

    1994-01-01

    We determine the improvement in baseline length precision and accuracy using new atmospheric delay mapping functions and MTT by analyzing the NASA Crustal Dynamics Project research and development (R&D) experiments and the International Radio Interferometric Surveying (IRIS) A experiments. These mapping functions reduce baseline length scatter by about 20% below that using the CfA2.2 dry and Chao wet mapping functions. With the newer mapping functions, average station vertical scatter inferred from observed length precision (given by length repeatabilities) is 11.4 mm for the 1987-1990 monthly R&D series of experiments and 5.6 mm for the 3-week-long extended research and development experiment (ERDE) series. The inferred monthly R&D station vertical scatter is reduced by 2 mm, or by 7 mm in a root-sum-square (rss) sense. Length repeatabilities are optimum when observations below a 7-8 deg elevation cutoff are removed from the geodetic solution. Analyses of IRIS-A data from 1984 through 1991 and the monthly R&D experiments both yielded a nonatmospheric unmodeled station vertical error of about 8 mm. In addition, analysis of the IRIS-A experiments revealed systematic effects in the evolution of some baseline length measurements. The length rate of change has an apparent acceleration, and the length evolution has a quasi-annual signature. We show that the origin of these effects is unlikely to be related to atmospheric modeling errors. Rates of change of the transatlantic Westford-Wettzell and Richmond-Wettzell baseline lengths calculated from 1988 through 1991 agree with the NUVEL-1 plate motion model (Argus and Gordon, 1991) to within 1 mm/yr. Short-term (less than 90 days) variations of IRIS-A baseline length measurements contribute more than 90% of the observed scatter about a best fit line, and this short-term scatter has large variations on an annual time scale.

  16. Electric-hybrid-vehicle simulation

    NASA Astrophysics Data System (ADS)

    Pasma, D. C.

    The simulation of electric hybrid vehicles is to be performed using experimental data to model propulsion system components. The performance of an existing ac propulsion system will be used as the baseline for comparative purposes. Hybrid components to be evaluated include electrically and mechanically driven flywheels, and an elastomeric regenerative braking system.

  17. Transferability of Virtual Reality, Simulation-Based, Robotic Suturing Skills to a Live Porcine Model in Novice Surgeons: A Single-Blind Randomized Controlled Trial.

    PubMed

    Vargas, Maria V; Moawad, Gaby; Denny, Kathryn; Happ, Lindsey; Misa, Nana Yaa; Margulies, Samantha; Opoku-Anane, Jessica; Abi Khalil, Elias; Marfori, Cherie

    To assess whether a robotic simulation curriculum for novice surgeons can improve performance of a suturing task in a live porcine model. Randomized controlled trial (Canadian Task Force classification I). Academic medical center. Thirty-five medical students without robotic surgical experience. Participants were enrolled in an online session of training modules followed by an in-person orientation. Baseline performance testing on the Mimic Technologies da Vinci Surgical Simulator (dVSS) was also performed. Participants were then randomly assigned to the completion of 4 dVSS training tasks (camera clutching 1, suture sponge 1 and 2, and tubes) versus no further training. The intervention group performed each dVSS task until proficiency or up to 10 times. A final suturing task was performed on a live porcine model, which was video recorded and blindly assessed by experienced surgeons. The primary outcomes were Global Evaluative Assessment of Robotic Skills (GEARS) scores and task time. The study had 90% power to detect a mean difference of 3 points on the GEARS scale, assuming a standard deviation (SD) of 2.65, and 80% power to detect a mean difference of 3 minutes, assuming an SD of 3 minutes. There were no differences in demographics and baseline skills between the 2 groups. No significant differences in task time in minutes or GEARS scores were seen for the final suturing task between the intervention and control groups, respectively (9.2 [2.65] vs 9.9 [2.07] minutes, p = .406; and 15.37 [2.51] vs 15.25 [3.38], p = .603). The 95% confidence interval for the difference in mean task times was -2.36 to .96 minutes and for mean GEARS scores -1.91 to 2.15 points. Live suturing task performance was not improved with a proficiency-based virtual reality simulation suturing curriculum compared with standard orientation to the da Vinci robotic console in a group of novice surgeons. Published by Elsevier Inc.

  18. Pharmacy Students' Retention of Knowledge and Skills Following Training in Automated External Defibrillator Use

    PubMed Central

    Dopp, Anna Legreid; Dopp, John M.; Vardeny, Orly; Sims, J. Jason

    2010-01-01

    Objectives To assess pharmacy students' retention of knowledge about appropriate automated external defibrillator use and counseling points following didactic training and simulated experience. Design Following a lecture on sudden cardiac arrest and automated external defibrillator use, second-year doctor of pharmacy (PharmD) students were assessed on their ability to perform basic life support and deliver a shock at baseline, 3 weeks, and 4 months. Students completed a questionnaire to evaluate recall of counseling points for laypeople/the public. Assessment Mean time to shock delivery at baseline was 74 ± 25 seconds, which improved significantly at 3 weeks (50 ± 17 seconds, p < 0.001) and was maintained at 4 months (47 ± 18 seconds, p < 0.001). Recall of all signs and symptoms of sudden cardiac arrest and automated external defibrillator counseling points was diminished after 4 months. Conclusion Pharmacy students can use automated external defibrillators to quickly deliver a shock and are able to retain this ability after 4 months. Refresher training/courses will be required to improve students' retention of automated external defibrillator counseling points to ensure their ability to deliver appropriate patient education. PMID:21045951

  19. Mental Mechanisms for Topics Identification

    PubMed Central

    2014-01-01

    Topics identification (TI) is the process that consists in determining the main themes present in natural language documents. The current TI modeling paradigm aims at acquiring semantic information from statistic properties of large text datasets. We investigate the mental mechanisms responsible for the identification of topics in a single document given existing knowledge. Our main hypothesis is that topics are the result of accumulated neural activation of loosely organized information stored in long-term memory (LTM). We experimentally tested our hypothesis with a computational model that simulates LTM activation. The model assumes activation decay as an unavoidable phenomenon originating from the bioelectric nature of neural systems. Since decay should negatively affect the quality of topics, the model predicts the presence of short-term memory (STM) to keep the focus of attention on a few words, with the expected outcome of restoring quality to a baseline level. Our experiments measured topics quality of over 300 documents with various decay rates and STM capacity. Our results showed that accumulated activation of loosely organized information was an effective mental computational commodity to identify topics. It was furthermore confirmed that rapid decay is detrimental to topics quality but that limited capacity STM restores quality to a baseline level, even exceeding it slightly. PMID:24744775
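
    The hypothesized mechanism, activation accumulating over loosely associated concepts in long-term memory, decaying at every step, with a small-capacity STM holding the focus of attention, maps onto a very small simulation. The sketch below is a toy interpretation of that description with an invented association table and arbitrary parameters; it is not the authors' model or data.

      from collections import defaultdict

      def identify_topics(words, associations, decay=0.05, stm_capacity=4):
          """Accumulate decaying activation over associated concepts; STM re-activates recent words."""
          activation = defaultdict(float)
          stm = []
          for word in words:
              for concept in activation:                 # decay at every step
                  activation[concept] *= (1.0 - decay)
              stm = (stm + [word])[-stm_capacity:]       # focus of attention: the last few words
              for focus in stm:
                  for concept, weight in associations.get(focus, []):
                      activation[concept] += weight
          return sorted(activation.items(), key=lambda kv: -kv[1])

      # Invented mini association "LTM" for illustration only.
      associations = {
          "telescope": [("astronomy", 1.0), ("optics", 0.6)],
          "baseline":  [("interferometry", 0.8), ("statistics", 0.4)],
          "mirror":    [("optics", 1.0), ("astronomy", 0.5)],
          "star":      [("astronomy", 1.0)],
      }
      document = ["telescope", "mirror", "baseline", "star", "telescope"]
      print(identify_topics(document, associations)[:2])   # top-scoring candidate topics

    Raising the decay rate degrades the ranking, while the STM list keeps recent words re-contributing, the interplay the abstract describes between decay and the restorative role of short-term memory.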

  20. Dynamic cycling in atrial size and flow during obstructive apnoea.

    PubMed

    Pressman, Gregg S; Cepeda-Valery, Beatriz; Codolosa, Nicolas; Orban, Marek; Samuel, Solomon P; Somers, Virend K

    2016-01-01

    Obstructive sleep apnoea (OSA) is strongly associated with cardiovascular disease. However, acute cardiovascular effects of repetitive airway obstruction are poorly understood. While past research used a sustained Mueller manoeuver to simulate OSA we employed a series of gasping efforts to better simulate true obstructive apnoeas. This report describes acute changes in cardiac anatomy and flow related to sudden changes in intrathoracic pressure. 26 healthy, normal weight participants performed 5-6 gasping efforts (target intrathoracic pressure -40 mm Hg) while undergoing Doppler echocardiography. 14 participants had sufficient echocardiographic images to allow comparison of atrial areas during the manoeuver with baseline measurements. Mitral and tricuspid E-wave and A-wave velocities postmanoeuver were compared with baseline in all participants. Average atrial areas changed little during the manoeuver, but variance in both atrial areas was significantly greater than baseline. Further, an inverse relationship was noted with left atrial collapse and right atrial enlargement at onset of inspiratory effort. Significant inverse changes were noted in Doppler flow when comparing the first beat postmanoeuver (pMM1) with baseline. Mitral E-wave velocity increased 9.1 cm/s while tricuspid E-wave velocity decreased 7.0 cm/s; by the eighth beat postmanoeuver (pMM8) values were not different from baseline. Mitral and tricuspid A-wave velocities were not different from baseline at pMM1, but both were significantly higher by pMM8. Repetitive obstructive apnoeas produce dynamic, inverse changes in atrial size and Doppler flow across the atrioventricular valves. These observations have important implications for understanding the pathophysiology of OSA.

  1. Simulated effects of hydrologic, water quality, and land-use changes of the Lake Maumelle watershed, Arkansas, 2004–10

    USGS Publications Warehouse

    Hart, Rheannon M.; Green, W. Reed; Westerman, Drew A.; Petersen, James C.; DeLanois, Jeanne L.

    2012-01-01

    Lake Maumelle, located in central Arkansas northwest of the cities of Little Rock and North Little Rock, is one of two principal drinking-water supplies for the Little Rock and North Little Rock, Arkansas, metropolitan areas. Lake Maumelle and the Maumelle River (its primary tributary) are more pristine than most other reservoirs and streams in the region, with 80 percent of the land area in the entire watershed being forested. However, as the Lake Maumelle watershed becomes increasingly urbanized and timber harvesting becomes more extensive, concerns about the sustainability of the quality of the water supply also have increased. Two hydrodynamic and water-quality models were developed to examine the hydrology and water quality in the Lake Maumelle watershed and changes that might occur as the watershed becomes more urbanized and timber harvesting becomes more extensive. A Hydrologic Simulation Program–FORTRAN watershed model was developed using continuous streamflow and discrete suspended-sediment and water-quality data collected from January 2004 through 2010. A CE–QUAL–W2 model was developed to simulate reservoir hydrodynamics and selected water-quality characteristics using the simulated output from the Hydrologic Simulation Program–FORTRAN model from January 2004 through 2010. The calibrated Hydrologic Simulation Program–FORTRAN model and the calibrated CE–QUAL–W2 model were used to simulate three land-use scenarios and to examine the potential effects of these land-use changes, as defined in the model, on the water quality of Lake Maumelle during the 2004 through 2010 simulation period. These scenarios included a scenario that simulated conversion of most land in the watershed to forest (scenario 1), a scenario that simulated conversion of potentially developable land to low-intensity urban land use in part of the watershed (scenario 2), and a scenario that simulated timber harvest in part of the watershed (scenario 3). Simulated land-use changes for scenarios 1 and 3 resulted in little (generally less than 10 percent) overall effect on the simulated water quality in the Hydrologic Simulation Program–FORTRAN model. The land-use change of scenario 2 affected subwatersheds that include Bringle, Reece, and Yount Creek tributaries and most other subwatersheds that drain into the northern side of Lake Maumelle; large percent increases in loading rates (generally between 10 and 25 percent) included dissolved nitrite plus nitrate nitrogen, dissolved orthophosphate, total phosphorus, suspended sediment, dissolved ammonia nitrogen, total organic carbon, and fecal coliform bacteria. For scenario 1, the simulated changes in nutrient, suspended sediment, and total organic carbon loads from the Hydrologic Simulation Program–FORTRAN model resulted in very slight (generally less than 10 percent) changes in simulated water quality for Lake Maumelle, relative to the baseline condition. Following lake mixing in the falls of 2006 and 2007, phosphorus and nitrogen concentrations were higher than the baseline condition and chlorophyll a responded accordingly. The increased nutrient and chlorophyll a concentrations in late October and into 2007 were enough to increase concentrations, on average, for the entire simulation period (2004–10). 
For scenario 2, the simulated changes in nutrient, suspended sediment, total organic carbon, and fecal coliform bacteria loads from the Lake Maumelle watershed resulted in slight changes in simulated water quality for Lake Maumelle, relative to the baseline condition (total nitrogen decreased by 0.01 milligram per liter; dissolved orthophosphate increased by 0.001 milligram per liter; chlorophyll a decreased by 0.1 microgram per liter). The differences in these concentrations are approximately an order of magnitude less than the error between measured and simulated concentrations in the baseline model. During the driest summer in the simulation period (2006), phosphorus and nitrogen concentrations were lower than the baseline condition and chlorophyll a concentrations decreased during the same summer season. The decrease in nitrogen and chlorophyll a concentrations during the dry summer in 2006 was enough to decrease concentrations of these constituents very slightly, on average, for the entire simulation period (2004–10). For scenario 3, the changes in simulated nutrient, suspended sediment, total organic carbon, and fecal coliform bacteria loads from Lake Maumelle watershed resulted in very slight changes in simulated water quality within Lake Maumelle, relative to the baseline condition, for most of the reservoir. Among the implications of the results of the modeling described in this report are those related to scale in both space and time. Spatial scales include limited size and location of land-use changes, their effects on loading rates, and resultant effects on water quality of Lake Maumelle. Temporally, the magnitude of the water-quality changes simulated by the land-use change scenarios over the 7-year period (2004–10) are not necessarily indicative of the changes that could be expected to occur with similar land-use changes persisting over a 20-, 30-, or 40- year period, for example. These implications should be tempered by realization of the described model limitations. The Hydrologic Simulation Program–FORTRAN watershed model was calibrated to streamflow and water-quality data from five streamflow-gaging stations, and in general, these stations characterize a range of subwatershed areas with varying land-use types. The CE–QUAL–W2 reservoir model was calibrated to water-quality data collected during January 2004 through December 2010 at three reservoir stations, representing the upper, middle, and lower sections of the reservoir. In general, the baseline simulation for the Hydrologic Simulation Program–FORTRAN and the CE–QUAL–W2 models matched reasonably well to the measured data. Simulated and measured suspended-sediment concentrations during periods of base flow (streamflows not substantially influenced by runoff) agree reasonably well for Maumelle River at Williams Junction, the station representing the upper end of the watershed (with differences—simulated minus measured value—generally ranging from -15 to 41 milligrams per liter, and percent difference—relative to the measured value—ranging from -99 to 182 percent) and Maumelle River near Wye, the station just above the reservoir at the lower end (differences generally ranging from -20 to 22 milligrams per liter, and percent difference ranging from -100 to 194 percent). 
In general, water temperature and dissolved-oxygen concentration simulations followed measured seasonal trends for all stations with the largest differences occurring during periods of lowest temperatures or during the periods of lowest measured dissolved-oxygen concentrations. For the CE–QUAL–W2 model, simulated vertical distributions of water temperatures and dissolved-oxygen concentrations agreed with measured vertical distributions over time, even for the most complex water-temperature profiles. Considering the oligotrophic-mesotrophic (low to intermediate primary productivity and associated low nutrient concentrations) condition of Lake Maumelle, simulated algae, phosphorus, and nitrogen concentrations compared well with generally low measured concentrations.
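
    The scenario analysis above reduces to comparing constituent loads and concentrations from each land-use run against the baseline run and expressing the differences as percent changes. A minimal sketch of that bookkeeping is shown below; the constituent names and load values are hypothetical placeholders, not numbers from the report.

```python
# Minimal sketch: percent change of scenario loading rates relative to a baseline run.
# Constituent names and values are illustrative only, not taken from the report.

baseline_loads = {"total_phosphorus": 1200.0, "suspended_sediment": 85000.0}   # kg/yr (hypothetical)
scenario2_loads = {"total_phosphorus": 1450.0, "suspended_sediment": 99000.0}  # kg/yr (hypothetical)

def percent_change(scenario, baseline):
    """Return the percent change of each constituent load relative to the baseline."""
    return {name: 100.0 * (scenario[name] - baseline[name]) / baseline[name]
            for name in baseline}

for name, change in percent_change(scenario2_loads, baseline_loads).items():
    print(f"{name}: {change:+.1f} % relative to baseline")
```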

  2. Goal-directed transthoracic echocardiography during advanced cardiac life support: A pilot study using simulation to assess ability

    PubMed Central

    Greenstein, Yonatan Y.; Martin, Thomas J.; Rolnitzky, Linda; Felner, Kevin; Kaufman, Brian

    2015-01-01

    Introduction: Goal-directed echocardiography (GDE) is used to answer specific clinical questions which provide invaluable information to physicians managing a hemodynamically unstable patient. We studied perception and ability of housestaff previously trained in GDE to accurately diagnose common causes of cardiac arrest during simulated advanced cardiac life support (ACLS); we compared their results to those of expert echocardiographers. Methods: Eleven pulmonary and critical care medicine fellows, seven emergency medicine residents, and five cardiologists board-certified in echocardiography were enrolled. Baseline ability to acquire four transthoracic echocardiography views was assessed and participants were exposed to six simulated cardiac arrests and were asked to perform a GDE during ACLS. Housestaff performance was compared to the performance of five expert echocardiographers. Results: Average baseline and scenario views by housestaff were of good or excellent quality 89% and 83% of the time, respectively. Expert average baseline and scenario views were always of good or excellent quality. Housestaff and experts made the correct diagnosis in 68% and 77% of cases, respectively. On average, participants required 1.5 pulse checks to make the correct diagnosis. 94% of housestaff perceived this study as an accurate assessment of ability. Conclusions: In an ACLS compliant manner, housestaff are capable of diagnosing management altering pathologies the majority of the time and they reach similar diagnostic conclusions in the same amount of time as expert echocardiographers in a simulated cardiac arrest scenario. PMID:25932707

  3. Goal-Directed Transthoracic Echocardiography During Advanced Cardiac Life Support: A Pilot Study Using Simulation to Assess Ability.

    PubMed

    Greenstein, Yonatan Y; Martin, Thomas J; Rolnitzky, Linda; Felner, Kevin; Kaufman, Brian

    2015-08-01

    Goal-directed echocardiography (GDE) is used to answer specific clinical questions that provide invaluable information to physicians managing a hemodynamically unstable patient. We studied perception and ability of house staff previously trained in GDE to accurately diagnose common causes of cardiac arrest during simulated advanced cardiac life support (ACLS); we compared their results with those of expert echocardiographers. Eleven pulmonary and critical care medicine fellows, 7 emergency medicine residents, and 5 cardiologists board certified in echocardiography were enrolled. Baseline ability to acquire 4 transthoracic echocardiography views was assessed, and participants were exposed to 6 simulated cardiac arrests and were asked to perform a GDE during ACLS. House staff performance was compared with the performance of 5 expert echocardiographers. Average baseline and scenario views by house staff were of good or excellent quality 89% and 83% of the time, respectively. Expert average baseline and scenario views were always of good or excellent quality. House staff and experts made the correct diagnosis in 68% and 77% of cases, respectively. On average, participants required 1.5 pulse checks to make the correct diagnosis. Of house staff, 94% perceived this study as an accurate assessment of ability. In an ACLS-compliant manner, house staff are capable of diagnosing management-altering pathologies the majority of the time, and they reach similar diagnostic conclusions in the same amount of time as expert echocardiographers in a simulated cardiac arrest scenario.

  4. Microsurgical Performance After Sleep Interruption: A NeuroTouch Simulator Study.

    PubMed

    Micko, Alexander; Knopp, Karoline; Knosp, Engelbert; Wolfsberger, Stefan

    2017-10-01

    In times of the ubiquitous debate about doctors' working hour restrictions, it is still questionable if the physician's performance is impaired by high work load and long shifts. In this study, we evaluated the impact of sleep interruption on neurosurgical performance. Ten medical students and 10 neurosurgical residents were tested on the virtual-reality simulator NeuroTouch by performing an identical microsurgical task, well rested (baseline test), and after sleep interruption at night (stress test). Deviation of total score, timing, and excessive force on tissue were evaluated. In addition, vital parameters and self-assessment were analyzed. After sleep interruption, total performance score increased significantly (45.1 vs. 48.7, baseline vs. stress test, P = 0.048) while timing remained stable (10.1 vs. 10.4 minutes for baseline vs. stress test, P > 0.05) for both students and residents. Excessive force decreased in both groups during the stress test for the nondominant hand (P = 0.05). For the dominant hand, an increase of excessive force was encountered in the group of residents (P = 0.05). In contrast to their results, participants of both groups assessed their performance worse during the stress test. In our study, we found an increase of neurosurgical simulator performance in neurosurgical residents and medical students under simulated night shift conditions. Further, microsurgical dexterity remained unchanged. Based on our results and the data in the available literature, we cannot confirm that working hour restrictions will have a positive effect on neurosurgical performance. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Low-Cost Simulation to Teach Anesthetists' Non-Technical Skills in Rwanda.

    PubMed

    Skelton, Teresa; Nshimyumuremyi, Isaac; Mukwesi, Christian; Whynot, Sara; Zolpys, Lauren; Livingston, Patricia

    2016-08-01

    Safe anesthesia care is challenging in developing countries where there are shortages of personnel, drugs, equipment, and training. Anesthetists' Non-technical Skills (ANTS)-task management, team working, situation awareness, and decision making-are difficult to practice well in this context. Cesarean delivery is the most common surgical procedure in sub-Saharan Africa. This pilot study investigates whether a low-cost simulation model, with good psychological fidelity, can be used effectively to teach ANTS during cesarean delivery in Rwanda. Study participants were anesthesia providers working in a tertiary referral hospital in Rwanda. Baseline observations were conducted for 20 anesthesia providers during cesarean delivery using the established ANTS framework. After the first observation set was complete, participants were randomly assigned to either simulation intervention or control groups. The simulation intervention group underwent ANTS training using low-cost high psychological fidelity simulation with debriefing. No training was offered to the control group. Postintervention observations were then conducted in the same manner as the baseline observations. The primary outcome was the overall ANTS score (maximum, 16). The median (range) ANTS score of the simulation group was 13.5 (11-16). The ANTS score of the control group was 8 (8-9), with a statistically significant difference (P = .002). Simulation participants showed statistically significant improvement in subcategories and in the overall ANTS score compared with ANTS score before simulation exposure. Rwandan anesthesia providers show improvement in ANTS practice during cesarean delivery after 1 teaching session using a low-cost high psychological fidelity simulation model with debriefing.

  6. Agricultural Baseline (BL0) scenario of the 2016 Billion-Ton Report

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinkel, Chad [University of Tennessee, APAC] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Langholtz, Matthew H [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Myers, Aaron [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000320373827)

    2016-07-13

    Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used. Date the data set was last modified: 02/12/2016. How each parameter was produced (methods), format, and relationship to other data in the data set: the simulation was developed without offering a farmgate price to energy crops or residues (i.e., building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014)). Data generated are .txt output files by year, simulation identifier, and county code (1-3109). Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B) supplied by the University of Tennessee APAC. The quality assurance and quality control that have been applied: (1) check for negative planted area, harvested area, production, yield, and cost values; (2) check whether harvested area exceeds planted area for annuals; (3) check FIPS codes.
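
    The three QA/QC checks listed above are simple row-level validations of the county output files. A sketch of how they might be applied to one simulation year follows; the column names, delimiter, and file name are assumptions made for illustration, since the actual .txt layout is not described in this record.

```python
import csv

# Sketch of the three QA/QC checks named above, applied to one POLYSYS county
# output file. Column names, delimiter, and file name are hypothetical.
NUMERIC_COLS = ["planted_area", "harvested_area", "production", "yield", "cost"]

def check_row(row):
    problems = []
    # 1. No negative planted area, harvested area, production, yield, or cost values.
    for col in NUMERIC_COLS:
        if float(row[col]) < 0:
            problems.append(f"negative {col}")
    # 2. Harvested area must not exceed planted area for annual crops.
    if float(row["harvested_area"]) > float(row["planted_area"]):
        problems.append("harvested area exceeds planted area")
    # 3. FIPS county code should be a 5-digit numeric string.
    if not (row["fips"].isdigit() and len(row["fips"]) == 5):
        problems.append("malformed FIPS code")
    return problems

with open("BL0_2015.txt", newline="") as f:
    for row in csv.DictReader(f, delimiter="\t"):
        issues = check_row(row)
        if issues:
            print(row["fips"], "; ".join(issues))
```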

  7. Navigation Strategies for Primitive Solar System Body Rendezvous and Proximity Operations

    NASA Technical Reports Server (NTRS)

    Getzandanner, Kenneth M.

    2011-01-01

    A wealth of scientific knowledge regarding the composition and evolution of the solar system can be gained through reconnaissance missions to primitive solar system bodies. This paper presents analysis of a baseline navigation strategy designed to address the unique challenges of primitive body navigation. Linear covariance and Monte Carlo error analysis was performed on a baseline navigation strategy using simulated data from a design reference mission (DRM). The objective of the DRM is to approach, rendezvous, and maintain a stable orbit about the near-Earth asteroid 4660 Nereus. The outlined navigation strategy and resulting analyses, however, are not necessarily limited to this specific target asteroid, as they may be applicable to a diverse range of mission scenarios. The baseline navigation strategy included simulated data from Deep Space Network (DSN) radiometric tracking and optical image processing (OpNav). Results from the linear covariance and Monte Carlo analyses suggest the DRM navigation strategy is sufficient to approach and perform proximity operations in the vicinity of the target asteroid with meter-level accuracy.
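
    Linear covariance analysis propagates an error covariance directly, whereas the Monte Carlo analysis samples dispersed initial errors and measurement noise and examines the spread of the resulting navigation errors. A toy sketch of that Monte Carlo dispersion idea is given below; the sigma values, gain, and update loop are invented stand-ins, not the DRM filter or its parameters.

```python
import numpy as np

# Toy Monte Carlo dispersion analysis: sample an initial position error and
# measurement noise, apply a simple sequential update, and look at the spread
# of the final estimation error. All values are hypothetical.
rng = np.random.default_rng(0)
n_trials = 2000
sigma_initial = 100.0   # m, a-priori position uncertainty (hypothetical)
sigma_meas = 5.0        # m, per-measurement noise (hypothetical)
gain = 0.5              # fixed update gain, standing in for a navigation filter
n_updates = 25

final_errors = np.empty(n_trials)
for k in range(n_trials):
    err = rng.normal(0.0, sigma_initial)                 # true estimation error
    for _ in range(n_updates):
        residual = -err + rng.normal(0.0, sigma_meas)    # measured minus predicted
        err = err + gain * residual                      # update shrinks the error
    final_errors[k] = err

print(f"Monte Carlo 1-sigma position error after updates: {final_errors.std():.2f} m")
```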

  8. A new sum parameter to estimate the bioconcentration and baseline-toxicity of hydrophobic compounds in river water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loon, W.M.G.M. van; Hermens, J.L.M.

    1994-12-31

    A large part of all aquatic pollutants can be classified as narcosis-type (baseline toxicity) chemicals. Many chemicals contribute to a joint baseline aquatic toxicity even at trace concentrations. A novel surrogate parameter, which simulates bioconcentration of hydrophobic substances from water and estimates internal molar concentrations, has been explored by Verhaar et al. These estimated biological concentrations can be used to predict narcosis-type toxic effects, using the Lethal Body Burden (LBB) concept. The authors applied this toxicological-analytical concept to river water, and some recent technological developments and field results are pointed out. The simulation of bioconcentration is performed by extracting water samples with Empore (trademark) disks. The authors developed two extraction procedures, i.e., laboratory extraction and field extraction. Molar concentration measurements are performed using vapor pressure osmometry, GC-FID and GC-MS. Results on the molar concentrations of hydrophobic compounds which can be bioaccumulated from several Dutch river systems will be presented.

  9. Baseline tests for arc melter vitrification of INEL buried wastes. Volume II: Baseline test data appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, L.L.; O'Conner, W.K.; Turner, P.C.

    1993-11-19

    This report presents field results and raw data from the Buried Waste Integrated Demonstration (BWID) Arc Melter Vitrification Project Phase 1 baseline test series conducted by the Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM). The baseline test series was conducted using the electric arc melter facility at the USBM Albany Research Center in Albany, Oregon. Five different surrogate waste feed mixtures were tested that simulated thermally-oxidized, buried, TRU-contaminated, mixed wastes and soils present at the INEL. The USBM Arc Furnace Integrated Waste Processing Test Facility includes a continuous feed system, the arc melting furnace, an offgas control system, and utilities. The melter is a sealed, 3-phase alternating current (ac) furnace approximately 2 m high and 1.3 m wide. The furnace has a capacity of 1 metric ton of steel and can process as much as 1,500 lb/h of soil-type waste materials. The surrogate feed materials included five mixtures designed to simulate incinerated TRU-contaminated buried waste materials mixed with INEL soil. Process samples, melter system operations data and offgas composition data were obtained during the baseline tests to evaluate the melter performance and meet test objectives. Samples and data gathered during this program included (a) automatically and manually logged melter systems operations data, (b) process samples of slag, metal and fume solids, and (c) offgas composition, temperature, velocity, flowrate, moisture content, particulate loading and metals content. This report consists of 2 volumes: Volume I summarizes the baseline test operations. It includes an executive summary, system and facility description, review of the surrogate waste mixtures, and a description of the baseline test activities, measurements, and sample collection. Volume II contains the raw test data and sample analyses from samples collected during the baseline tests.

  10. Development of an OSSE Framework for a Global Atmospheric Data Assimilation System

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald; Errico, Ronald M.; Prive, N.

    2012-01-01

    Observing system simulation experiments (OSSEs) are powerful tools for estimating the usefulness of various configurations of envisioned observing systems and data assimilation techniques. Their utility stems from their being conducted in an entirely simulated context, utilizing simulated observations having simulated errors and drawn from a simulation of the earth's environment. Observations are generated by applying physically based algorithms to the simulated state, such as performed during data assimilation or using other appropriate algorithms. Adding realistic instrument plus representativeness errors, including their biases and correlations, can be critical for obtaining realistic assessments of the impact of a proposed observing system or analysis technique. If estimates of the expected accuracy of proposed observations are realistic, then the OSSE can be also used to learn how best to utilize the new information, accelerating its transition to operations once the real data are available. As with any inferences from simulations, however, it is first imperative that some baseline OSSEs are performed and well validated against corresponding results obtained with a real observing system. This talk provides an overview of, and highlights critical issues related to, the development of an OSSE framework for the tropospheric weather prediction component of the NASA GEOS-5 global atmospheric data assimilation system. The framework includes all existing observations having significant impact on short-term forecast skill. Its validity has been carefully assessed using a range of metrics that can be evaluated in both the OSSE and real contexts, including adjoint-based estimates of observation impact. A preliminary application to the Aeolus Doppler wind lidar mission, scheduled for launch by the European Space Agency in 2014, has also been investigated.
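
    A central step in an OSSE is drawing synthetic observations from the simulated "truth" and perturbing them with instrument plus representativeness error, including bias and correlation. A minimal sketch of that perturbation step is given below; the bias, noise level, correlation length, and profile are all assumptions for illustration, not GEOS-5 values.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_observations(truth_profile, bias, sigma, corr_length):
    """Perturb 'true' values from the nature run with biased, vertically
    correlated Gaussian error (a toy stand-in for instrument plus
    representativeness error)."""
    n = len(truth_profile)
    levels = np.arange(n)
    # exponential correlation between vertical levels
    corr = np.exp(-np.abs(levels[:, None] - levels[None, :]) / corr_length)
    cov = sigma**2 * corr
    noise = rng.multivariate_normal(np.zeros(n), cov)
    return truth_profile + bias + noise

truth = np.linspace(290.0, 220.0, 20)            # hypothetical temperature profile (K)
obs = simulate_observations(truth, bias=0.3, sigma=1.0, corr_length=3.0)
print(np.round(obs - truth, 2))                  # the added error at each level
```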

  11. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    PubMed

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined.
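
    The abstract names the main SED-ML building blocks (models, simulations, tasks, data generators, outputs). A rough sketch of how a minimal time-course description could be assembled with the Python standard library is shown below; the element and attribute names follow the SED-ML Level 1 Version 1 specification as best recalled here and should be checked against the official schema, since this is not produced by a SED-ML library.

```python
import xml.etree.ElementTree as ET

# Rough sketch of a minimal SED-ML-like document describing one time-course
# simulation of one model. Element/attribute names are written from memory of
# SED-ML L1V1 and may need checking against the official schema.
sed = ET.Element("sedML", {"level": "1", "version": "1"})

models = ET.SubElement(sed, "listOfModels")
ET.SubElement(models, "model", {"id": "model1",
                                "language": "urn:sedml:language:sbml",
                                "source": "model1.xml"})

sims = ET.SubElement(sed, "listOfSimulations")
ET.SubElement(sims, "uniformTimeCourse",
              {"id": "sim1", "initialTime": "0", "outputStartTime": "0",
               "outputEndTime": "100", "numberOfPoints": "1000"})

tasks = ET.SubElement(sed, "listOfTasks")
ET.SubElement(tasks, "task", {"id": "task1",
                              "modelReference": "model1",
                              "simulationReference": "sim1"})

print(ET.tostring(sed, encoding="unicode"))
```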

  12. Using Simulation for Launch Team Training and Evaluation

    NASA Technical Reports Server (NTRS)

    Peaden, Cary J.

    2005-01-01

    This document describes some of the history and uses of simulation systems and processes for the training and evaluation of Launch Processing, Mission Control, and Mission Management teams. It documents some of the types of simulations that are used at Kennedy Space Center (KSC) today and that could be utilized (and possibly enhanced) for future launch vehicles. This article is intended to provide an initial baseline for further research into simulation for launch team training in the near future.

  13. Hypoxia-induced changes in standing balance.

    PubMed

    Wagner, Linsey S; Oakley, Sarah R; Vang, Pao; Noble, Brie N; Cevette, Michael J; Stepanek, Jan P

    2011-05-01

    A few studies in the literature have reported postural changes with hypoxia, but none have quantified the magnitude of change. Further understanding of this condition could have implications for patients at risk for falls, individuals undergoing acute altitude exposure, and pilots and commercial passengers. The objective of this study was to evaluate the effect of different levels of hypoxia (oxygen-nitrogen mixtures) on postural standing balance using the computerized dynamic posturography (CDP) system. This improves upon previous protocols by manipulating vision and standing balance with a sway-referenced visual field and/or platform. Additionally, normative data were available for comparison with the cumulative test scores and scores for each condition. Altitude hypoxia was simulated by admixing nitrogen to the breathing gas to achieve equivalent altitudes of 1524 m, 2438 m, and 3048 m. Subjects were evaluated using the CDP system. Subjects showed an overall trend toward decreased performance at higher simulated altitudes, consistent with the initial hypothesis. Composite standing balance sway scores for the sensory organization subtest of CDP were decreased compared to baseline for simulated altitudes as low as 2438 m (mean sway scores: 81.92 at baseline; 81.85 at 1524 m; 79.15 at 2438 m; 79.15 at 3048 m). Reaction times to unexpected movements in the support surface for the motor control subtest (MCT) increased compared to baseline (mean composite scores: 133.3 ms at baseline; 135.9 ms at 1524 m; 138.0 ms at 2438 m; 140.9 ms at 3048 m). The CDP testing provided a reliable objective measurement of degradation of balance under hypoxic conditions.

  14. Benchmarking Model Variants in Development of a Hardware-in-the-Loop Simulation System

    NASA Technical Reports Server (NTRS)

    Aretskin-Hariton, Eliot D.; Zinnecker, Alicia M.; Kratz, Jonathan L.; Culley, Dennis E.; Thomas, George L.

    2016-01-01

    Distributed engine control architecture presents a significant increase in complexity over traditional implementations when viewed from the perspective of system simulation and hardware design and test. Even if the overall function of the control scheme remains the same, the hardware implementation can have a significant effect on the overall system performance due to differences in the creation and flow of data between control elements. A Hardware-in-the-Loop (HIL) simulation system is under development at NASA Glenn Research Center that enables the exploration of these hardware dependent issues. The system is based on, but not limited to, the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k). This paper describes the step-by-step conversion from the self-contained baseline model to the hardware in the loop model, and the validation of each step. As the control model hardware fidelity was improved during HIL system development, benchmarking simulations were performed to verify that engine system performance characteristics remained the same. The results demonstrate the goal of the effort; the new HIL configurations have similar functionality and performance compared to the baseline C-MAPSS40k system.

  15. Performance and driveline analyses of engine capacity in range extender engine hybrid vehicle

    NASA Astrophysics Data System (ADS)

    Praptijanto, Achmad; Santoso, Widodo Budi; Nur, Arifin; Wahono, Bambang; Putrasari, Yanuandri

    2017-01-01

    In this study, the range extender engine was designed to meet the power needs of the generator of a hybrid electric vehicle, a minimum of 18 kW. Using this baseline model, the following range extenders were compared: a conventional SI piston engine (Baseline, BsL) with an engine capacity of 1998 cm3, and efficiency-oriented SI piston engines with capacities of 999 cm3 and 499 cm3 (square gasoline engines with 86 mm bore and stroke). The comparison covered the performance and predicted emissions of the range extender engine and the state of charge, using engine and vehicle simulation software tools. In AVL Boost simulation software, the range extender engine was simulated at engine loads from 1000 to 6000 rpm. The highest peak engine brake power reached 38 kW at 4500 rpm, while the highest torque achieved was 100 Nm at 3500 rpm. Then, using AVL Cruise simulation software, a model of the range-extended electric vehicle in series configuration, with main components such as the internal combustion engine, generator, electric motor, and battery, was simulated over the Artemis rural road driving cycle. The simulation results show that the engine with a capacity of 999 cm3 gave the most economical performance, along with its predicted emissions and control of engine cycle parameters.

  16. KIGAM Seafloor Observation System (KISOS) for the baseline study in monitoring of gas hydrate test production in the Ulleung Basin, Korea

    NASA Astrophysics Data System (ADS)

    Lee, Sung-rock; Chun, Jong-hwa

    2013-04-01

    For the baseline study in the monitoring of gas hydrate test production in the Ulleung Basin, the Korea Institute of Geoscience and Mineral Resources (KIGAM) has developed the KIGAM Seafloor Observation System (KISOS) for seafloor exploration using an unmanned remotely operated vehicle connected to a ship by a cable. The KISOS consists of a transponder of an acoustic positioning system (USBL), a bottom-finding pinger, still camera, video camera, water sampler, and measuring devices (methane, oxygen, CTD, and turbidity sensors) mounted on the unmanned ROV, and a device for collecting sediment on the seafloor. It is very important to monitor the environmental risks (gas leakage and production water/drilling mud discharge) that may occur during the gas hydrate test production drilling. The KISOS will be applied to solely conduct the baseline study with the KIGAM seafloor monitoring system (KIMOS) of the Korean gas hydrate program in the future. The large-scale environmental monitoring program includes environmental impact assessments such as seafloor disturbance and subsidence, detection of methane gas leakage around the well and cold seeps, methane bubbles and dissolved methane, changes in marine environments, chemical variation of the water column and seabed, diffusion of drilling mud and production water, and biological factors of biodiversity and marine habitats at the test well and nearby areas before and after drilling. The design of the baseline survey will be determined based on the results of SIMAP simulation in 2013. The baseline survey will be performed to characterize the gas leakage and production water/drilling mud discharge before and after gas hydrate test production. The field data of the baseline study will be evaluated by simulation and verification with the SIMAP simulator in 2014. In the presentation, the authors would like to introduce the configuration of KISOS and its applicability to seafloor observation for the gas hydrate test production in the Ulleung Basin. This work was financially supported by the Ministry of Knowledge Economy (MKE) and the Gas Hydrate R/D Organization (GHDO).

  17. An Efficient Implementation of Fixed Failure-Rate Ratio Test for GNSS Ambiguity Resolution.

    PubMed

    Hou, Yanqing; Verhagen, Sandra; Wu, Jie

    2016-06-23

    Ambiguity Resolution (AR) plays a vital role in precise GNSS positioning. Correctly-fixed integer ambiguities can significantly improve the positioning solution, while incorrectly-fixed integer ambiguities can bring large positioning errors and, therefore, should be avoided. The ratio test is an extensively used test to validate the fixed integer ambiguities. To choose proper critical values of the ratio test, the Fixed Failure-rate Ratio Test (FFRT) has been proposed, which generates critical values according to user-defined tolerable failure rates. This contribution provides easy-to-implement fitting functions to calculate the critical values. With a massive Monte Carlo simulation, the functions for many different tolerable failure rates are provided, which enriches the choices of critical values for users. Moreover, the fitting functions for the fix rate are also provided, which for the first time allows users to evaluate the conditional success rate, i.e., the success rate once the integer candidates are accepted by FFRT. The superiority of FFRT over the traditional ratio test regarding controlling the failure rate and preventing unnecessary false alarms is shown by a simulation and a real data experiment. In the real data experiment with a baseline of 182.7 km, FFRT achieved much higher fix rates (up to 30% higher) and the same level of positioning accuracy from fixed solutions as compared to the traditional critical value.
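
    The ratio test itself compares the squared residual norms of the two best integer candidates against a critical value; in the fixed failure-rate version, that critical value is obtained from a fitted function of the tolerable failure rate and the strength of the underlying model rather than being a constant. A schematic sketch follows; the fitting function here is a deliberately crude placeholder, not the published FFRT fit.

```python
def ratio_test(q_best, q_second, critical_value):
    """Accept the best integer ambiguity candidate if the ratio of the
    second-best to best squared residual norm meets the critical value."""
    return (q_second / q_best) >= critical_value

def ffrt_critical_value(failure_rate_tolerance, ils_failure_rate):
    """Placeholder for the fitted FFRT function: in the published method the
    critical value is obtained from fitting functions of the tolerable failure
    rate and the model strength; the expression below is purely illustrative."""
    base = 1.0 + 2.0 * ils_failure_rate   # weaker model -> larger threshold (illustrative)
    return max(1.0, base / max(failure_rate_tolerance, 1e-6) ** 0.05)

# Traditional fixed threshold vs. a failure-rate-driven threshold (illustrative numbers).
q_best, q_second = 2.1, 5.0
print("fixed c = 3.0:", ratio_test(q_best, q_second, 3.0))
print("FFRT c       :", ratio_test(q_best, q_second, ffrt_critical_value(0.001, 0.05)))
```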

  18. What is $\Delta m^2_{ee}$?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parke, Stephen

    Here, the current short baseline reactor experiments, Daya Bay and RENO (Double Chooz), have measured (or are capable of measuring) an effective $\Delta m^2$ associated with the atmospheric oscillation scale of 0.5 km/MeV in electron antineutrino disappearance. In this paper, I compare and contrast the different definitions of such an effective $\Delta m^2$ and argue that the simple, L/E independent definition given by $\Delta m^2_{ee} \equiv \cos^2\theta_{12}\,\Delta m^2_{31} + \sin^2\theta_{12}\,\Delta m^2_{32}$, i.e. “the $\nu_e$ weighted average of $\Delta m^2_{31}$ and $\Delta m^2_{32}$,” is superior to all other definitions and is useful for both short baseline experiments mentioned above and for the future medium baseline experiments JUNO and RENO-50.
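
    Written out, the definition quoted above, combined with the identity $\Delta m^2_{31} = \Delta m^2_{32} + \Delta m^2_{21}$, gives the equivalent forms below (a restatement of the abstract's formula, not a new result):

```latex
\begin{align*}
\Delta m^2_{ee} &\equiv \cos^2\theta_{12}\,\Delta m^2_{31} + \sin^2\theta_{12}\,\Delta m^2_{32} \\
                &= \Delta m^2_{31} - \sin^2\theta_{12}\,\Delta m^2_{21}
                 = \Delta m^2_{32} + \cos^2\theta_{12}\,\Delta m^2_{21}.
\end{align*}
```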

  19. What is $\Delta m^2_{ee}$?

    DOE PAGES

    Parke, Stephen

    2016-03-09

    Here, the current short baseline reactor experiments, Daya Bay and RENO (Double Chooz), have measured (or are capable of measuring) an effective $\Delta m^2$ associated with the atmospheric oscillation scale of 0.5 km/MeV in electron antineutrino disappearance. In this paper, I compare and contrast the different definitions of such an effective $\Delta m^2$ and argue that the simple, L/E independent definition given by $\Delta m^2_{ee} \equiv \cos^2\theta_{12}\,\Delta m^2_{31} + \sin^2\theta_{12}\,\Delta m^2_{32}$, i.e. “the $\nu_e$ weighted average of $\Delta m^2_{31}$ and $\Delta m^2_{32}$,” is superior to all other definitions and is useful for both short baseline experiments mentioned above and for the future medium baseline experiments JUNO and RENO-50.

  20. Neutrino Oscillations with the MINOS, MINOS+, T2K, and NOvA Experiments

    DOE PAGES

    Nakaya, Tsuyoshi; Plunkett, Robert K.

    2016-01-18

    Our paper discusses results and near-term prospects of the long-baseline neutrino experiments MINOS, MINOS+, T2K, and NOvA. The non-zero value of the third neutrino mixing angle θ13 allows experimental analysis in a manner which explicitly exhibits appearance and disappearance dependencies on additional parameters associated with mass-hierarchy, CP violation, and any non-maximal θ23. Our current and near-future experiments begin the era of precision accelerator long-baseline measurements and lay the framework within which future experimental results will be interpreted.

  1. Long-Baseline Neutrino Experiments

    DOE PAGES

    Diwan, M. V.; Galymov, V.; Qian, X.; ...

    2016-10-19

    We review long-baseline neutrino experiments in which neutrinos are detected after traversing macroscopic distances. Over such distances neutrinos have been found to oscillate among flavor states. Experiments with solar, atmospheric, reactor, and accelerator neutrinos have resulted in a coherent picture of neutrino masses and mixing of the three known flavor states. We will summarize the current best knowledge of neutrino parameters and phenomenology with our focus on the evolution of the experimental technique. We will proceed from the first evidence produced by astrophysical neutrino sources to the current open questions and the goals of future research.

  2. The Case for Simulation-Based Mastery Learning Education Courses for Practicing Surgeons.

    PubMed

    Baumann, Lauren M; Barsness, Katherine A

    2018-03-12

    Pediatric surgeons rely on simulation courses to develop skills for safe minimally invasive repair of complex congenital anomalies. The majority of minimally invasive surgery (MIS) training courses occur during short "exposure courses" at annual conferences. Little data are available to support the benefit of these courses relative to the safe implementation of new skills. The purpose of this article is to determine the impact of an exposure course for advanced neonatal MIS on self-perceived comfort levels with independent performance of advanced MISs. Participants of a 4-hour hands-on course for neonatal MIS were surveyed regarding clinical practices and pre- and post-training perceived "comfort levels" of MIS skills for thoracoscopic esophageal atresia with tracheoesophageal fistula (tTEF) repair, thoracoscopic left upper lobe pulmonary lobectomy (tLobe), and laparoscopic duodenal atresia (lapDA) repair. Descriptive analyses were performed. Seventeen participants completed pre- and postcourse surveys. The majority of participants had no prior experience with tLobe (59%) or lapDA (53%), and 35% had no experience with tTEF repair. Similarly, the majority were "not comfortable" with these procedures. After the short course, the majority of surgeons reported that they were "likely to perform" these operations within 6 months, despite low levels of baseline experience and comfort levels. An exposure training course led to immediate perception of increased skills and confidence. However, these courses typically do not provide basic tenets of expert performance that demands deliberate practice. Future course design should transition to a mastery learning framework wherein regular skill assessments, milestones, and unlimited education time are prioritized before implementation of the new skills.

  3. Improving teamwork and communication in trauma care through in situ simulations.

    PubMed

    Miller, Daniel; Crandall, Cameron; Washington, Charles; McLaughlin, Steven

    2012-05-01

    Teamwork and communication often play a role in adverse clinical events. Due to the multidisciplinary and time-sensitive nature of trauma care, the effects of teamwork and communication can be especially pronounced in the treatment of the acutely injured patient. Our hypothesis was that an in situ trauma simulation (ISTS) program (simulating traumas in the trauma bay with all members of the trauma team) could be implemented in an emergency department (ED) and that this would improve teamwork and communication measured in the clinical setting. This was an observational study of the effect of an ISTS program on teamwork and communication during trauma care. The authors observed a convenience sample of 39 trauma activations. Cases were selected by their presenting to the resuscitation bay of a Level I trauma center between 09:00 and 16:00, Monday through Thursday, during the study period. Teamwork and communication were measured using the previously validated Clinical Teamwork Scale (CTS). The observers were three Trauma Nursing Core Course certified RNs trained on the CTS by observing simulated and actual trauma cases and following each of these cases with a discussion of appropriate CTS scores with two certified Advanced Trauma Life Support instructors/emergency physicians. Cases observed for measurement were scored in four phases: 1) preintervention phase (baseline); 2) didactic-only intervention, the phase following a lecture series on teamwork and communication in trauma care; 3) ISTS phase, real trauma cases scored during period when weekly ISTSs were performed; and 4) potential decay phase, observations following the discontinuation of the ISTSs. Multirater agreement was assessed with Krippendorf's alpha coefficient; agreement was excellent (mean agreement = 0.92). Nonparametric procedures (Kruskal-Wallis) were used to test the hypothesis that the scores observed during the various phases were different and to compare each individual phase to baseline scores. The ISTS program was implemented and achieved regular participation of all components of our trauma team. Data were collected on 39 cases. The scores for 11 of 14 measures improved from the baseline to the didactic phase, and the mean and median scores of all CTS component measures were greatest during the ISTS phase. When each phase was compared to baseline scores, using the baseline as a control, there were no significant differences seen during the didactic or the decay phases, but 12 of the 14 measures showed significant improvements from the baseline to the simulation phase. However, when the Kruskal-Wallis test was used to test for differences across all phases, only overall communication showed a significant difference. During the potential decay phase, the scores for every measure returned to baseline phase values. This study shows that an ISTS program can be implemented with participation from all members of a multidisciplinary trauma team in the ED of a Level I trauma center. While teamwork and communication in the clinical setting were improved during the ISTS program, this effect was not sustained after ISTS were stopped. © 2012 by the Society for Academic Emergency Medicine.
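
    For ordinal score data like the CTS measures, the nonparametric comparison across the four study phases can be reproduced with a standard Kruskal-Wallis test. A short sketch is given below; the scores are made up for illustration and are not the study's data.

```python
from scipy import stats

# Illustrative CTS "overall communication" scores by study phase (not the study's data).
baseline   = [5, 6, 6, 7, 5, 6]
didactic   = [6, 6, 7, 7, 6, 7]
simulation = [8, 9, 8, 9, 9, 8]
decay      = [5, 6, 6, 5, 7, 6]

# Kruskal-Wallis test for a difference across all four phases.
h_stat, p_value = stats.kruskal(baseline, didactic, simulation, decay)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")
```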

  4. Designing and Evaluating an Interactive Multimedia Web-Based Simulation for Developing Nurses’ Competencies in Acute Nursing Care: Randomized Controlled Trial

    PubMed Central

    Wong, Lai Fun; Chan, Sally Wai-Chi; Ho, Jasmine Tze Yin; Mordiffi, Siti Zubaidah; Ang, Sophia Bee Leng; Goh, Poh Sun; Ang, Emily Neo Kim

    2015-01-01

    Background: Web-based learning is becoming an increasingly important instructional tool in nursing education. Multimedia advancements offer the potential for creating authentic nursing activities for developing nursing competency in clinical practice. Objective: This study aims to describe the design, development, and evaluation of an interactive multimedia Web-based simulation for developing nurses’ competencies in acute nursing care. Methods: Authentic nursing activities were developed in a Web-based simulation using a variety of instructional strategies including animation video, multimedia instructional material, virtual patients, and online quizzes. A randomized controlled study was conducted on 67 registered nurses who were recruited from the general ward units of an acute care tertiary hospital. Following a baseline evaluation of all participants’ clinical performance in a simulated clinical setting, the experimental group received 3 hours of Web-based simulation and completed a survey to evaluate their perceptions of the program. All participants were re-tested for their clinical performances using a validated tool. Results: The clinical performance posttest scores of the experimental group improved significantly (P<.001) from the pretest scores after the Web-based simulation. In addition, compared to the control group, the experimental group had significantly higher clinical performance posttest scores (P<.001) after controlling the pretest scores. The participants from the experimental group were satisfied with their learning experience and gave positive ratings for the quality of the Web-based simulation. Themes emerging from the comments about the most valuable aspects of the Web-based simulation include relevance to practice, instructional strategies, and fostering problem solving. Conclusions: Engaging in authentic nursing activities using interactive multimedia Web-based simulation can enhance nurses’ competencies in acute care. Web-based simulations provide a promising educational tool in institutions where large groups of nurses need to be trained in acute nursing care and accessibility to repetitive training is essential for achieving long-term retention of clinical competency. PMID:25583029

  5. Designing and evaluating an interactive multimedia Web-based simulation for developing nurses' competencies in acute nursing care: randomized controlled trial.

    PubMed

    Liaw, Sok Ying; Wong, Lai Fun; Chan, Sally Wai-Chi; Ho, Jasmine Tze Yin; Mordiffi, Siti Zubaidah; Ang, Sophia Bee Leng; Goh, Poh Sun; Ang, Emily Neo Kim

    2015-01-12

    Web-based learning is becoming an increasingly important instructional tool in nursing education. Multimedia advancements offer the potential for creating authentic nursing activities for developing nursing competency in clinical practice. This study aims to describe the design, development, and evaluation of an interactive multimedia Web-based simulation for developing nurses' competencies in acute nursing care. Authentic nursing activities were developed in a Web-based simulation using a variety of instructional strategies including animation video, multimedia instructional material, virtual patients, and online quizzes. A randomized controlled study was conducted on 67 registered nurses who were recruited from the general ward units of an acute care tertiary hospital. Following a baseline evaluation of all participants' clinical performance in a simulated clinical setting, the experimental group received 3 hours of Web-based simulation and completed a survey to evaluate their perceptions of the program. All participants were re-tested for their clinical performances using a validated tool. The clinical performance posttest scores of the experimental group improved significantly (P<.001) from the pretest scores after the Web-based simulation. In addition, compared to the control group, the experimental group had significantly higher clinical performance posttest scores (P<.001) after controlling the pretest scores. The participants from the experimental group were satisfied with their learning experience and gave positive ratings for the quality of the Web-based simulation. Themes emerging from the comments about the most valuable aspects of the Web-based simulation include relevance to practice, instructional strategies, and fostering problem solving. Engaging in authentic nursing activities using interactive multimedia Web-based simulation can enhance nurses' competencies in acute care. Web-based simulations provide a promising educational tool in institutions where large groups of nurses need to be trained in acute nursing care and accessibility to repetitive training is essential for achieving long-term retention of clinical competency.
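
    "Controlling the pretest scores" amounts to comparing posttest performance after regressing out the pretest covariate (an ANCOVA-style adjustment). A minimal sketch with plain least squares is shown below; the scores are made up for illustration and are not the trial's data.

```python
import numpy as np

# Illustrative pre/post clinical-performance scores (not the trial's data).
pre   = np.array([52, 48, 60, 55, 47, 58, 50, 53], dtype=float)
post  = np.array([70, 66, 78, 72, 55, 62, 57, 60], dtype=float)
group = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=float)   # 1 = Web-based simulation, 0 = control

# Design matrix: intercept, group indicator, pretest covariate.
X = np.column_stack([np.ones_like(pre), group, pre])
coef, *_ = np.linalg.lstsq(X, post, rcond=None)
print(f"group effect on posttest score, adjusted for pretest: {coef[1]:+.1f} points")
```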

  6. Airborne Doppler radar detection of low altitude windshear

    NASA Technical Reports Server (NTRS)

    Bracalente, Emedio M.; Jones, William R.; Britt, Charles L.

    1990-01-01

    As part of an integrated windshear program, the Federal Aviation Administration, jointly with NASA, is sponsoring a research effort to develop airborne sensor technology for the detection of low altitude windshear during aircraft take-off and landing. One sensor being considered is microwave Doppler radar operating at X-band or above. Using a Microburst/Clutter/Radar simulation program, a preliminary feasibility study was conducted to assess the performance of Doppler radars for this application. Preliminary results from this study are presented. Analyses show that, using bin-to-bin Automatic Gain Control (AGC), clutter filtering, limited detection range, and suitable antenna tilt management, windshear from a wet microburst can be accurately detected 10 to 65 seconds (0.75 to 5 km) in front of the aircraft. Although a performance improvement can be obtained at higher frequency, the baseline X-band system that was simulated detected the presence of a windshear hazard for the dry microburst. Although this study indicates the feasibility of using an airborne Doppler radar to detect low altitude microburst windshear, further detailed studies, including future flight experiments, will be required to completely characterize the capabilities and limitations.

  7. Particle-in-Cell laser-plasma simulation on Xeon Phi coprocessors

    NASA Astrophysics Data System (ADS)

    Surmin, I. A.; Bastrakov, S. I.; Efimenko, E. S.; Gonoskov, A. A.; Korzhimanov, A. V.; Meyerov, I. B.

    2016-05-01

    This paper concerns the development of a high-performance implementation of the Particle-in-Cell method for plasma simulation on Intel Xeon Phi coprocessors. We discuss the suitability of the method for Xeon Phi architecture and present our experience in the porting and optimization of the existing parallel Particle-in-Cell code PICADOR. Direct porting without code modification gives performance on Xeon Phi close to that of an 8-core CPU on a benchmark problem with 50 particles per cell. We demonstrate step-by-step optimization techniques, such as improving data locality, enhancing parallelization efficiency and vectorization leading to an overall 4.2 × speedup on CPU and 7.5 × on Xeon Phi compared to the baseline version. The optimized version achieves 16.9 ns per particle update on an Intel Xeon E5-2660 CPU and 9.3 ns per particle update on an Intel Xeon Phi 5110P. For a real problem of laser ion acceleration in targets with surface grating, where a large number of macroparticles per cell is required, the speedup of Xeon Phi compared to CPU is 1.6 ×.

  8. VLBI-resolution radio-map algorithms: Performance analysis of different levels of data-sharing on multi-socket, multi-core architectures

    NASA Astrophysics Data System (ADS)

    Tabik, S.; Romero, L. F.; Mimica, P.; Plata, O.; Zapata, E. L.

    2012-09-01

    A broad area in astronomy focuses on simulating extragalactic objects based on Very Long Baseline Interferometry (VLBI) radio-maps. Several algorithms in this scope simulate what would be the observed radio-maps if emitted from a predefined extragalactic object. This work analyzes the performance and scaling of this kind of algorithms on multi-socket, multi-core architectures. In particular, we evaluate a sharing approach, a privatizing approach and a hybrid approach on systems with complex memory hierarchy that includes shared Last Level Cache (LLC). In addition, we investigate which manual processes can be systematized and then automated in future works. The experiments show that the data-privatizing model scales efficiently on medium scale multi-socket, multi-core systems (up to 48 cores) while regardless of algorithmic and scheduling optimizations, the sharing approach is unable to reach acceptable scalability on more than one socket. However, the hybrid model with a specific level of data-sharing provides the best scalability over all used multi-socket, multi-core systems.

  9. An Imaging Calorimeter for Access-Concept Study

    NASA Technical Reports Server (NTRS)

    Parnell, T. A.; Adams, James H.; Binns, R. W.; Christl, M. J.; Derrickson, J. H.; Fountain, W. F.; Howell, L. W.; Gregory, J. C.; Hink, P. L.; Israel, M. H.; hide

    2001-01-01

    A mission concept study to define the "Advanced Cosmic-ray Composition Experiment for Space Station (ACCESS)" was sponsored by the National Aeronautics and Space Administration (NASA). The ACCESS instrument complement contains a transition radiation detector and an ionization calorimeter to measure the spectrum of protons, helium, and heavier nuclei up to approximately 10(exp 15) eV to search for the limit of supernova shock wave acceleration, or evidence for other explanations of the spectra. Several calorimeter configurations have been studied, including the "baseline" totally active bismuth germanate instrument and sampling calorimeters utilizing various detectors. The Imaging Calorimeter for ACCESS (ICA) concept comprises a carbon target and a calorimeter using a high atomic number absorber sampled approximately each radiation length (rl) by thin scintillating fiber (SCIFI) detectors. The main features and options of the ICA instrument configuration are described in this paper. Since direct calibration is not possible over most of the energy range, the best approach must be decided from simulations of calorimeter performance extrapolated from CERN calibrations at 0.375 TeV. This paper presents results from the ICA simulations study.

  10. Aircraft Engine On-Line Diagnostics Through Dual-Channel Sensor Measurements: Development of a Baseline System

    NASA Technical Reports Server (NTRS)

    Kobayashi, Takahisa; Simon, Donald L.

    2008-01-01

    In this paper, a baseline system which utilizes dual-channel sensor measurements for aircraft engine on-line diagnostics is developed. This system is composed of a linear on-board engine model (LOBEM) and fault detection and isolation (FDI) logic. The LOBEM provides the analytical third channel against which the dual-channel measurements are compared. When the discrepancy among the triplex channels exceeds a tolerance level, the FDI logic determines the cause of the discrepancy. Through this approach, the baseline system achieves the following objectives: (1) anomaly detection, (2) component fault detection, and (3) sensor fault detection and isolation. The performance of the baseline system is evaluated in a simulation environment using faults in sensors and components.
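
    The comparison logic amounts to forming a triplex of the two sensor channels and the model-predicted value and flagging whichever member disagrees with the other two beyond a tolerance. A schematic sketch is given below; the thresholding scheme, tolerance, and example values are stand-ins, not the paper's FDI logic.

```python
def fdi_check(channel_a, channel_b, model_value, tol):
    """Toy triplex comparison: return which member of {channel A, channel B,
    model} disagrees with the other two beyond the tolerance, or None if all
    three agree. Illustrative only, not the published FDI logic."""
    d_ab = abs(channel_a - channel_b)
    d_am = abs(channel_a - model_value)
    d_bm = abs(channel_b - model_value)
    if max(d_ab, d_am, d_bm) <= tol:
        return None                       # no anomaly
    if d_ab <= tol:                       # sensors agree, model disagrees
        return "component fault (model mismatch)"
    if d_am <= tol:                       # channel A agrees with the model
        return "sensor fault: channel B"
    if d_bm <= tol:                       # channel B agrees with the model
        return "sensor fault: channel A"
    return "anomaly (unisolated)"

print(fdi_check(612.0, 611.5, 612.3, tol=2.0))   # -> None
print(fdi_check(640.0, 611.5, 612.3, tol=2.0))   # -> sensor fault: channel A
```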

  11. Computable General Equilibrium Model Fiscal Year 2013 Capability Development Report - April 2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Brian Keith; Rivera, Michael K.; Boero, Riccardo

    2014-04-01

    This report documents progress made on continued developments of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC continued development of the treatment of the labor market and performed tests with the model to examine the properties of the solutions computed by the model. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine the simulation properties of the model in more detail.

  12. Grid Sensitivity Study for Slat Noise Simulations

    NASA Technical Reports Server (NTRS)

    Lockard, David P.; Choudhari, Meelan M.; Buning, Pieter G.

    2014-01-01

    The slat noise from the 30P/30N high-lift system is being investigated through computational fluid dynamics simulations in conjunction with a Ffowcs Williams-Hawkings acoustics solver. Many previous simulations have been performed for the configuration, and the case was introduced as a new category for the Second AIAA workshop on Benchmark problems for Airframe Noise Configurations (BANC-II). However, the cost of the simulations has restricted the study of grid resolution effects to a baseline grid and coarser meshes. In the present study, two different approaches are being used to investigate the effect of finer resolution of near-field unsteady structures. First, a standard grid refinement by a factor of two is used, and the calculations are performed by using the same CFL3D solver employed in the majority of the previous simulations. Second, the OVERFLOW code is applied to the baseline grid, but with a 5th-order upwind spatial discretization as compared with the second-order discretization used in the CFL3D simulations. In general, the fine grid CFL3D simulation and OVERFLOW calculation are in very good agreement and exhibit the lowest levels of both surface pressure fluctuations and radiated noise. Although the smaller scales resolved by these simulations increase the velocity fluctuation levels, they appear to mitigate the influence of the larger scales on the surface pressure. These new simulations are used to investigate the influence of the grid on unsteady high-lift simulations and to gain a better understanding of the physics responsible for the noise generation and radiation.

  13. How self-interactions can reconcile sterile neutrinos with cosmology.

    PubMed

    Hannestad, Steen; Hansen, Rasmus Sloth; Tram, Thomas

    2014-01-24

    Short baseline neutrino oscillation experiments have shown hints of the existence of additional sterile neutrinos in the eV mass range. However, such neutrinos seem incompatible with cosmology because they have too large of an impact on cosmic structure formation. Here we show that new interactions in the sterile neutrino sector can prevent their production in the early Universe and reconcile short baseline oscillation experiments with cosmology.

  14. Utilization of exploration-based learning and video-assisted learning to teach GlideScope videolaryngoscopy.

    PubMed

    Johnston, Lindsay C; Auerbach, Marc; Kappus, Liana; Emerson, Beth; Zigmont, Jason; Sudikoff, Stephanie N

    2014-01-01

    GlideScope (GS) is used in pediatric endotracheal intubation (ETI) but requires a different technique compared to direct laryngoscopy (DL). This article was written to evaluate the efficacy of exploration-based learning on procedural performance using GS for ETI of simulated pediatric airways and establish baseline success rates and procedural duration using DL in airway trainers among pediatric providers at various levels. Fifty-five pediatric residents, fellows, and faculty from Pediatric Critical Care, NICU, and Pediatric Emergency Medicine were enrolled. Nine physicians from Pediatric Anesthesia benchmarked expert performance. Participants completed a demographic survey and viewed a video by the GS manufacturer. Subjects spent 15 minutes exploring GS equipment and practicing the intubation procedure. Participants then intubated neonatal, infant, child, and adult airway simulators, using GS and DL, in random order. Time to ETI was recorded. Procedural performance after exploration-based learning, measured as time to successful ETI, was shorter for DL than for GS for neonatal and child airways at the.05 significance level. Time to ETI in adult airway using DL was correlated with experience level (p =.01). Failure rates were not different among subgroups. A brief video and period of exploration-based learning is insufficient for implementing a new technology. Pediatricians at various levels of training intubated simulated airways faster using DL than GS.

  15. Numerical simulation of circular cylinders in free-fall

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romero-Gomez, Pedro; Richmond, Marshall C.

    2016-02-01

    In this work, we combined the use of (i) overset meshes, (ii) a 6 degree-of-freedom (6-DOF) motion solver, and (iii) an eddy-resolving flow simulation approach to resolve the drag and secondary movement of large-sized cylinders settling in a quiescent fluid at moderate terminal Reynolds numbers (1,500 < Re < 28,000). These three strategies were implemented in a series of computational fluid dynamics (CFD) solutions to describe the fluid-structure interactions and the resulting effects on the cylinder motion. Using the drag coefficient, oscillation period, and maximum angular displacement as baselines, the findings show good agreement between the present CFD results and corresponding data of published laboratory experiments. We discussed the computational expense incurred in using the present modeling approach. We also conducted a preceding simulation of flow past a fixed cylinder at Re = 3,900, which tested the influence of the turbulence approach (time-averaging vs. eddy-resolving) and the meshing strategy (continuous vs. overset) on the numerical results. The outputs indicated a strong effect of the former and an insignificant influence of the latter. The long-term motivation for the present study is the need to understand the motion of an autonomous sensor of cylindrical shape used to measure the hydraulic conditions occurring in operating hydropower turbines.

  16. Looking for Sterile Neutrinos via Neutral-Current Disappearance with NOvA

    NASA Astrophysics Data System (ADS)

    Yang, Shaokai; NOvA Collaboration

    2017-01-01

    Contradictory evidence has been presented on the issue of neutrino mixing between the three known active neutrinos and light sterile neutrinos. The excess of events as seen by the LSND and MiniBooNE experiments interpreted as short-baseline neutrino oscillations, the collective evidence of the reactor neutrino anomaly, and the gallium anomaly all point towards sterile neutrinos with mass at the 1 eV level. While these results are tantalizing, they are not conclusive as they are in tension with null results from other short-baseline experiments, and with disappearance searches in long-baseline and atmospheric experiments. Resolving the issue of the existence of light sterile neutrinos has profound implications for both particle physics and cosmology. The NOvA (NuMI Off-Axis νe Appearance) experiment may help clarify the situation by searching for disappearance of active neutrinos from the NuMI (Neutrinos from the Main Injector) beam over a baseline of 810 km. In this talk, we describe a method by which NOvA can look for oscillations into sterile neutrinos, with a focus on the disappearance of neutral-current (NC) neutrino events, present the first analysis result of this search, discuss its implications for constraining the existence of light sterile neutrinos, and outline the planned updates to this analysis.

  17. Airspace Concept Evaluation System (ACES), Concept Simulations using Communication, Navigation and Surveillance (CNS) System Models

    NASA Technical Reports Server (NTRS)

    Kubat, Greg; Vandrei, Don

    2006-01-01

    Project Objectives include: a) CNS Model Development; b) Design/Integration of baseline set of CNS Models into ACES; c) Implement Enhanced Simulation Capabilities in ACES; d) Design and Integration of Enhanced (2nd set) CNS Models; and e) Continue with CNS Model Integration/Concept evaluations.

  18. Applications of Human Performance Reliability Evaluation Concepts and Demonstration Guidelines

    DTIC Science & Technology

    1977-03-15

    ship stops dead in the water and the AN/SQS-26 operator recommends a new heading (000°). At T + 14 minutes, the target ship begins a hard turn to... [List-of-tables fragments: "... under Various Simulated Conditions", "Human Reliability for Each Simulated Operator (Baseline Run)", "Human and Equipment Availability under ..."]

  19. Evaluation of the Community Multiscale Air Quality Model for Simulating Winter Ozone Formation in the Uinta Basin.

    EPA Science Inventory

    The Weather Research and Forecasting (WRF) and Community Multiscale Air Quality (CMAQ) models were used to simulate a 10 day high‐ozone episode observed during the 2013 Uinta Basin Winter Ozone Study (UBWOS). The baseline model had a large negative bias when compared to ozo...

  20. Real-time simulation of the TF30-P-3 turbofan engine using a hybrid computer

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Bruton, W. M.

    1974-01-01

    A real-time, hybrid-computer simulation of the TF30-P-3 turbofan engine was developed. The simulation was primarily analog in nature but used the digital portion of the hybrid computer to perform bivariate function generation associated with the performance of the engine's rotating components. FORTRAN listings and analog patching diagrams are provided. The hybrid simulation was controlled by a digital computer programmed to simulate the engine's standard hydromechanical control. Both steady-state and dynamic data obtained from the digitally controlled engine simulation are presented. Hybrid simulation data are compared with data obtained from a digital simulation provided by the engine manufacturer. The comparisons indicate that the real-time hybrid simulation adequately matches the baseline digital simulation.
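
    The bivariate function generation mentioned above amounts to two-dimensional table lookup on the engine's rotating-component performance maps. A minimal Python sketch of one common approach (bilinear interpolation) follows; the map values and variable names are hypothetical and are not TF30-P-3 data.

```python
import numpy as np

def bivariate_lookup(x, y, x_grid, y_grid, table):
    """Bilinear interpolation in a 2-D performance map. x_grid and y_grid are
    ascending 1-D arrays; table has shape (len(x_grid), len(y_grid))."""
    i = int(np.clip(np.searchsorted(x_grid, x) - 1, 0, len(x_grid) - 2))
    j = int(np.clip(np.searchsorted(y_grid, y) - 1, 0, len(y_grid) - 2))
    tx = (x - x_grid[i]) / (x_grid[i + 1] - x_grid[i])
    ty = (y - y_grid[j]) / (y_grid[j + 1] - y_grid[j])
    return ((1 - tx) * (1 - ty) * table[i, j] + tx * (1 - ty) * table[i + 1, j]
            + (1 - tx) * ty * table[i, j + 1] + tx * ty * table[i + 1, j + 1])

# Hypothetical compressor map: rows = corrected speed, columns = pressure ratio
speed = np.array([0.6, 0.8, 1.0])
pressure_ratio = np.array([1.0, 1.5, 2.0])
corrected_flow = np.array([[10.0, 12.0, 13.0],
                           [14.0, 16.0, 17.5],
                           [18.0, 20.5, 22.0]])
print(bivariate_lookup(0.9, 1.75, speed, pressure_ratio, corrected_flow))
```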

  1. The impact of physical and mental tasks on pilot mental workload

    NASA Technical Reports Server (NTRS)

    Berg, S. L.; Sheridan, T. B.

    1986-01-01

    Seven instrument-rated pilots with a wide range of backgrounds and experience levels flew four different scenarios on a fixed-base simulator. The Baseline scenario was the simplest of the four and had few mental and physical tasks. An Activity scenario had many physical but few mental tasks. The Planning scenario had few physical and many mental tasks. A Combined scenario had high mental and physical task loads. The magnitude of each pilot's altitude and airspeed deviations was measured, subjective workload ratings were recorded, and the degree of pilot compliance with assigned memory/planning tasks was noted. Mental and physical performance was a strong function of the manual activity level, but was not influenced by the mental task load. High manual task loads resulted in a large percentage of mental errors even under low mental task loads. Although all the pilots gave similar subjective ratings when the manual task load was high, subjective ratings showed greater individual differences with high mental task loads. Altitude or airspeed deviations and subjective ratings were most correlated when the total task load was very high. Although airspeed deviations, altitude deviations, and subjective workload ratings were similar for both low experience and high experience pilots, at very high total task loads, mental performance was much lower for the low experience pilots.

  2. Low cost, high yield: simulation of obstetric emergencies for family medicine training.

    PubMed

    Magee, Susanna R; Shields, Robin; Nothnagle, Melissa

    2013-01-01

    Simulation is now the educational standard for emergency training in residency and is particularly useful on a labor and delivery unit, which is often a stressful environment for learners given the frequency of emergencies. However, simulation can be costly. This study aimed to assess the feasibility and effectiveness of low-cost simulated obstetrical emergencies in training family medicine residents. The study took place in a community hospital in an urban underserved setting in the northeast United States. Low-cost simulations were developed for postpartum hemorrhage (PPH) and preeclampsia/eclampsia (PEC). Twenty residents were randomly assigned to the intervention (simulated PPH or PEC followed by debriefing) or control (lecture on PPH or PEC) group, and equal numbers of residents were assigned to each scenario. All participants completed a written test at baseline and an oral exam 6 months later on the respective scenario to which they were assigned. The participants provided written feedback on their respective teaching interventions. We compared performance on pretests and posttests by group using Wilcoxon Rank Sum. Twenty residents completed the study. Both groups performed similarly on baseline tests for both scenarios. Compared to controls, intervention residents scored significantly higher on the examination on the management of PPH but not for PEC. All intervention group participants reported that the simulation training was "extremely useful," and most found it "enjoyable." We demonstrated the feasibility and acceptability of two low-cost obstetric emergency simulations and found that they may result in persistent increases in trainee knowledge.
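
    For readers unfamiliar with the Wilcoxon rank-sum comparison used above, a minimal SciPy sketch follows; the score values are invented for illustration and are not the study data.

```python
from scipy.stats import ranksums

# Hypothetical 6-month oral exam scores for intervention vs. control residents
intervention = [18, 17, 19, 16, 18, 17, 19, 18, 16, 17]
control = [14, 15, 13, 16, 14, 15, 13, 14, 15, 14]

statistic, p_value = ranksums(intervention, control)
print(f"Wilcoxon rank-sum statistic = {statistic:.2f}, p = {p_value:.4f}")
```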

  3. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    PubMed Central

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined. PMID:22172142
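
    To make the structure of a simulation experiment description more concrete, the Python sketch below mirrors the elements listed above (models, modifications, simulations, tasks, outputs) as plain dataclasses; the class and field names are illustrative only and do not reproduce the exact SED-ML Level 1 Version 1 schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Model:
    id: str
    source: str                                       # e.g., path or URN of an SBML/CellML file
    changes: List[str] = field(default_factory=list)  # modifications applied before use

@dataclass
class UniformTimeCourse:
    id: str
    initial_time: float
    output_start_time: float
    output_end_time: float
    number_of_points: int

@dataclass
class Task:
    id: str
    model_ref: str
    simulation_ref: str

@dataclass
class SimulationExperiment:
    models: List[Model]
    simulations: List[UniformTimeCourse]
    tasks: List[Task]
    outputs: List[str]                                # analysis results to report and how to present them

experiment = SimulationExperiment(
    models=[Model("m1", "model.xml")],
    simulations=[UniformTimeCourse("sim1", 0.0, 0.0, 100.0, 1000)],
    tasks=[Task("t1", model_ref="m1", simulation_ref="sim1")],
    outputs=["time-course plot of selected species"],
)
print(experiment.tasks[0])
```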

  4. A baseline-free procedure for transformation models under interval censorship.

    PubMed

    Gu, Ming Gao; Sun, Liuquan; Zuo, Guoxin

    2005-12-01

    An important property of the Cox regression model is that the estimation of regression parameters using the partial likelihood procedure does not depend on its baseline survival function. We call such a procedure baseline-free. Using marginal likelihood, we show that a baseline-free procedure can be derived for a class of general transformation models under the interval censoring framework. The baseline-free procedure results in a simplified and stable computation algorithm for some complicated and important semiparametric models, such as frailty models and heteroscedastic hazard/rank regression models, where the estimation procedures available so far involve estimation of the infinite-dimensional baseline function. A detailed computational algorithm using Markov chain Monte Carlo stochastic approximation is presented. The proposed procedure is demonstrated through extensive simulation studies, showing the validity of asymptotic consistency and normality. We also illustrate the procedure with a real data set from a study of breast cancer. A heuristic argument showing that the score function is a mean zero martingale is provided.
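
    The "baseline-free" property discussed above is easiest to see in the Cox partial likelihood, where the baseline hazard cancels from the risk-set ratios so the criterion depends on the regression parameters alone. A minimal sketch follows; the data are synthetic, tied event times are ignored, and this is the classical Cox case rather than the transformation-model procedure of the paper.

```python
import numpy as np

def cox_negative_log_partial_likelihood(beta, times, events, covariates):
    """Negative log partial likelihood for the Cox model (no ties).
    events = 1 for observed failures, 0 for censored observations."""
    beta = np.asarray(beta, dtype=float)
    eta = covariates @ beta                      # linear predictor for each subject
    order = np.argsort(times)                    # process failure times in increasing order
    eta, events = eta[order], np.asarray(events)[order]
    loglik = 0.0
    for i, is_event in enumerate(events):
        if is_event:
            # risk set = subjects whose time is >= the current failure time
            loglik += eta[i] - np.log(np.exp(eta[i:]).sum())
    return -loglik

# Tiny synthetic example (illustrative only)
rng = np.random.default_rng(0)
x = rng.normal(size=(20, 2))
t = rng.exponential(scale=np.exp(-x @ np.array([0.5, -0.3])))
d = rng.integers(0, 2, size=20)
print(cox_negative_log_partial_likelihood([0.5, -0.3], t, d, x))
```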

  5. Multilevel Analysis of Multiple-Baseline Data Evaluating Precision Teaching as an Intervention for Improving Fluency in Foundational Reading Skills for at Risk Readers

    ERIC Educational Resources Information Center

    Brosnan, Julie; Moeyaert, Mariola; Brooks Newsome, Kendra; Healy, Olive; Heyvaert, Mieke; Onghena, Patrick; Van den Noortgate, Wim

    2018-01-01

    In this article, multiple-baseline across participants designs were used to evaluate the impact of a precision teaching (PT) program, within a Tier 2 Response to Intervention framework, targeting fluency in foundational reading skills with at risk kindergarten readers. Thirteen multiple-baseline design experiments that included participation from…

  6. Scandinavia studies of recent crustal movements and the space geodetic baseline network

    NASA Technical Reports Server (NTRS)

    Anderson, A. J.

    1980-01-01

    A brief review of crustal movements within the Fenno-Scandia shield is given. Results from postglacial studies, projects for measuring active fault regions, and dynamic ocean loading experiments are presented. The 1979 Scandinavian Doppler Campaign Network is discussed. This network includes Doppler translocation baseline determination of future very long baseline interferometry baselines to be measured in Scandinavia. Intercomparison of earlier Doppler translocation measurements with a high precision terrestrial geodetic baseline in Scandinavia has yielded internal agreement of 6 cm over 887 km. This is a precision of better than 1 part in 10 to the 7th power.

  7. Construction of Gridded Daily Weather Data and its Use in Central-European Agroclimatic Study

    NASA Astrophysics Data System (ADS)

    Dubrovsky, M.; Trnka, M.; Skalak, P.

    2013-12-01

    The regional-scale simulations of weather-sensitive processes (e.g. hydrology, agriculture and forestry) for the present and/or future climate often require high resolution meteorological inputs in terms of the time series of selected surface weather characteristics (typically temperature, precipitation, solar radiation, humidity, wind) for a set of stations or on a regular grid. As even the latest Global and Regional Climate Models (GCMs and RCMs) do not provide realistic representation of statistical structure of the surface weather, the model outputs must be postprocessed (downscaled) to achieve the desired statistical structure of the weather data before being used as an input to the follow-up simulation models. One of the downscaling approaches, which is employed also here, is based on a weather generator (WG), which is calibrated using the observed weather series, interpolated, and then modified according to the GCM- or RCM-based climate change scenarios. The present contribution, in which the parametric daily weather generator M&Rfi is linked to the high-resolution RCM output (ALADIN-Climate/CZ model) and GCM-based climate change scenarios, consists of two parts: The first part focuses on a methodology. Firstly, the gridded WG representing the baseline climate is created by merging information from observations and high resolution RCM outputs. In this procedure, WG is calibrated with RCM-simulated multi-variate weather series, and the grid specific WG parameters are then de-biased by spatially interpolated correction factors based on comparison of WG parameters calibrated with RCM-simulated weather series vs. spatially scarcer observations. To represent the future climate, the WG parameters are modified according to the 'WG-friendly' climate change scenarios. These scenarios are defined in terms of changes in WG parameters and include - apart from changes in the means - changes in WG parameters, which represent the additional characteristics of the weather series (e.g. probability of wet day occurrence and lag-1 autocorrelation of daily mean temperature). The WG-friendly scenarios for the present experiment are based on comparison of future vs baseline surface weather series simulated by GCMs from a CMIP3 database. The second part will present results of climate change impact study based on an above methodology applied to Central Europe. The changes in selected climatic (focusing on the extreme precipitation and temperature characteristics) and agroclimatic (including number of days during vegetation season with heat and drought stresses) characteristics will be analysed. In discussing the results, the emphasis will be put on 'added value' of various aspects of above methodology (e.g. inclusion of changes in 'advanced' WG parameters into the climate change scenarios). Acknowledgements: The present experiment is made within the frame of projects WG4VALUE (project LD12029 sponsored by the Ministry of Education, Youth and Sports of CR), ALARO-Climate (project P209/11/2405 sponsored by the Czech Science Foundation), and VALUE (COST ES 1102 action).
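
    One step described above, de-biasing grid-specific WG parameters with spatially interpolated correction factors derived from station observations, might look roughly like the sketch below; the multiplicative form of the correction and all function and variable names are assumptions for illustration, not the M&Rfi implementation.

```python
import numpy as np
from scipy.interpolate import griddata

def debias_wg_parameter(grid_xy, param_rcm_grid, station_xy, param_obs_station, param_rcm_station):
    """De-bias a gridded weather-generator parameter calibrated from RCM output.
    Correction factors (observed / RCM-calibrated) are computed at station locations,
    interpolated to the grid, and applied multiplicatively (an assumption; some
    parameters may call for an additive correction instead)."""
    factors = np.asarray(param_obs_station) / np.asarray(param_rcm_station)
    factor_grid = griddata(station_xy, factors, grid_xy, method="linear")
    return param_rcm_grid * factor_grid

# Hypothetical example: four stations surrounding two grid points
stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
grid_points = np.array([[0.5, 0.5], [0.25, 0.75]])
print(debias_wg_parameter(grid_points, np.array([5.0, 6.0]),
                          stations, [2.0, 2.2, 1.8, 2.4], [1.8, 2.0, 2.0, 2.2]))
```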

  8. Airfoil/Wing Flow Control Using Flexible Extended Trailing Edge

    DTIC Science & Technology

    2009-02-27

    [Figure-caption fragments: (a)/(b) power spectra of the drag coefficient; Figure 4, mean velocity profiles for the baseline NACA0012 at AoA 18 deg and 20 deg; (a) fin amplitude and (b) power spectrum of fin amplitude; development of computational tools for simulating the time-dependent deformation of the flexible extended trailing edge.] The work used a combination of experimental, computational and theoretical methods. Compared with a Gurney flap and a conventional flap, this device enhanced lift at a smaller...

  9. Establishment of a rotor model basis

    NASA Technical Reports Server (NTRS)

    Mcfarland, R. E.

    1982-01-01

    Radial-dimension computations in the RSRA's blade-element model are modified for both the acquisition of extensive baseline data and for real-time simulation use. The baseline data, which are for the evaluation of model changes, use very small increments and are of high quality. The modifications to the real-time simulation model are for accuracy improvement, especially when a minimal number of blade segments is required for real-time synchronization. An accurate technique for handling tip loss in discrete blade models is developed. The mathematical consistency and convergence properties of summation algorithms for blade forces and moments are examined and generalized integration coefficients are applied to equal-annuli midpoint spacing. Rotor conditions identified as 'constrained' and 'balanced' are used and the propagation of error is analyzed.

  10. An Improved Rank Correlation Effect Size Statistic for Single-Case Designs: Baseline Corrected Tau.

    PubMed

    Tarlow, Kevin R

    2017-07-01

    Measuring treatment effects when an individual's pretreatment performance is improving poses a challenge for single-case experimental designs. It may be difficult to determine whether improvement is due to the treatment or due to the preexisting baseline trend. Tau-U is a popular single-case effect size statistic that purports to control for baseline trend. However, despite its strengths, Tau-U has substantial limitations: Its values are inflated and not bound between -1 and +1, it cannot be visually graphed, and its relatively weak method of trend control leads to unacceptable levels of Type I error wherein ineffective treatments appear effective. An improved effect size statistic based on rank correlation and robust regression, Baseline Corrected Tau, is proposed and field-tested with both published and simulated single-case time series. A web-based calculator for Baseline Corrected Tau is also introduced for use by single-case investigators.
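
    A rough sketch of the general logic described above (remove a robustly estimated baseline trend, then compute a rank correlation between phase membership and the corrected scores) is given below; it is an approximation based only on this abstract, and the decision rules of the published Baseline Corrected Tau procedure may differ.

```python
import numpy as np
from scipy.stats import kendalltau, theilslopes

def baseline_corrected_tau_sketch(baseline, treatment, correct_trend=True):
    """Illustrative baseline-trend-corrected rank-correlation effect size.
    A Theil-Sen (robust) line is fit to the baseline phase and subtracted from
    both phases; Kendall's tau is then computed between a phase dummy
    (0 = baseline, 1 = treatment) and the corrected scores."""
    baseline = np.asarray(baseline, dtype=float)
    treatment = np.asarray(treatment, dtype=float)
    scores = np.concatenate([baseline, treatment])
    session = np.arange(scores.size)
    if correct_trend:
        slope, intercept, _, _ = theilslopes(baseline, np.arange(baseline.size))
        scores = scores - (intercept + slope * session)   # remove the baseline trend everywhere
    phase = np.concatenate([np.zeros(baseline.size), np.ones(treatment.size)])
    tau, p = kendalltau(phase, scores)
    return tau, p

print(baseline_corrected_tau_sketch([3, 4, 5, 5, 6], [8, 9, 9, 10, 11]))
```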

  11. Searches for sterile neutrinos with NOvA

    DOE PAGES

    Davies, Gavin S.; Aurisano, Adam; Kafka, Gareth K.; ...

    2016-11-15

    Contradictory evidence has been presented on the issue of neutrino mixing between the three known active neutrinos and light sterile neutrino species. Apparent short-baseline neutrino oscillations observed by the LSND and MiniBooNE experiments, the collective evidence of the reactor neutrino anomaly, and the gallium anomaly all point towards sterile neutrinos with mass at the 1 eV level. While these results are tantalizing, they are not conclusive as they are in tension with null results from other short-baseline experiments, and with disappearance searches in long-baseline and atmospheric experiments. The NOvA (NuMI Off-Axis νe Appearance) experiment may help clarify the situation by searching for disappearance of active neutrinos from the NuMI (Neutrinos from the Main Injector) beam over a baseline of 810 km. We describe the method used by NOvA to look for oscillations into sterile neutrinos at the Far Detector (FD) through the disappearance of neutral-current (NC) neutrino events, including preliminary results of this search. In addition, the Near Detector (ND) is well suited for searching for anomalous short-baseline oscillations and probing the LSND and MiniBooNE sterile neutrino allowed regions using a variety of final states. We also present a novel method for selecting samples with high purity at the ND using convolutional neural networks. Furthermore, based on this method, the sensitivity to anomalous ντ appearance is shown, and searches for anomalous νe appearance and νμ disappearance at the NOvA ND are presented.

  12. Simulation-based Randomized Comparative Assessment of Out-of-Hospital Cardiac Arrest Resuscitation Bundle Completion by Emergency Medical Service Teams Using Standard Life Support or an Experimental Automation-assisted Approach.

    PubMed

    Choi, Bryan; Asselin, Nicholas; Pettit, Catherine C; Dannecker, Max; Machan, Jason T; Merck, Derek L; Merck, Lisa H; Suner, Selim; Williams, Kenneth A; Jay, Gregory D; Kobayashi, Leo

    2016-12-01

    Effective resuscitation of out-of-hospital cardiac arrest (OHCA) patients is challenging. Alternative resuscitative approaches using electromechanical adjuncts may improve provider performance. Investigators applied simulation to study the effect of an experimental automation-assisted, goal-directed OHCA management protocol on EMS providers' resuscitation performance relative to standard protocols and equipment. Two-provider (emergency medical technicians (EMT)-B and EMT-I/C/P) teams were randomized to control or experimental group. Each team engaged in 3 simulations: baseline simulation (standard roles); repeat simulation (standard roles); and abbreviated repeat simulation (reversed roles, i.e., basic life support provider performing ALS tasks). Control teams used standard OHCA protocols and equipment (with high-performance cardiopulmonary resuscitation training intervention); for second and third simulations, experimental teams performed chest compression, defibrillation, airway, pulmonary ventilation, vascular access, medication, and transport tasks with goal-directed protocol and resuscitation-automating devices. Videorecorders and simulator logs collected resuscitation data. Ten control and 10 experimental teams comprised 20 EMT-B's; 1 EMT-I, 8 EMT-C's, and 11 EMT-P's; study groups were not fully matched. Both groups suboptimally performed chest compressions and ventilations at baseline. For their second simulations, control teams performed similarly except for reduced on-scene time, and experimental teams improved their chest compressions (P=0.03), pulmonary ventilations (P<0.01), and medication administration (P=0.02); changes in their performance of chest compression, defibrillation, airway, and transport tasks did not attain significance against control teams' changes. Experimental teams maintained performance improvements during reversed-role simulations. Simulation-based investigation into OHCA resuscitation revealed considerable variability and improvable deficiencies in small EMS teams. Goal-directed, automation-assisted OHCA management augmented select resuscitation bundle element performance without comprehensive improvement.

  13. Monte Carlo Simulations for VLBI2010

    NASA Astrophysics Data System (ADS)

    Wresnik, J.; Böhm, J.; Schuh, H.

    2007-07-01

    Monte Carlo simulations are carried out at the Institute of Geodesy and Geophysics (IGG), Vienna, and at Goddard Space Flight Center (GSFC), Greenbelt (USA), with the goal of designing a new geodetic Very Long Baseline Interferometry (VLBI) system. Influences of the schedule, the network geometry and the main stochastic processes on the geodetic results are investigated. Therefore schedules are prepared with the software package SKED (Vandenberg 1999), and different strategies are applied to produce temporally very dense schedules which are compared in terms of baseline length repeatabilities. For the simulation of VLBI observations a Monte Carlo Simulator was set up which creates artificial observations by randomly simulating wet zenith delay and clock values as well as additive white noise representing the antenna errors. For the simulation at IGG the VLBI analysis software OCCAM (Titov et al. 2004) was adapted. Random walk processes with power spectral densities of 0.7 and 0.1 psec^2/sec are used for the simulation of wet zenith delays. The clocks are simulated with Allan Standard Deviations of 1*10^-14 @ 50 min and 2*10^-15 @ 15 min, and three levels of white noise (4 psec, 8 psec, and 16 psec) are added to the artificial observations. The variations of the power spectral densities of the clocks and wet zenith delays, and the application of different white noise levels show clearly that the wet delay is the critical factor for the improvement of the geodetic VLBI system. At GSFC the software CalcSolve is used for the VLBI analysis, therefore a comparison between the software packages OCCAM and CalcSolve was done with simulated data. For further simulations the wet zenith delay was modeled by a turbulence model. These data were provided by T. Nilsson and were added to the simulation work. Different schedules have been run.
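
    A toy version of such a Monte Carlo observation generator, with a random-walk wet zenith delay, a crude clock process, and additive white noise, could look like the sketch below; the clock scaling and the default values are illustrative assumptions, not the exact stochastic models used at IGG or GSFC.

```python
import numpy as np

def simulate_delay_errors(n_epochs, dt=60.0, psd_wzd=0.7e-24, asd_clock=1e-14,
                          white_noise=8e-12, seed=0):
    """Toy simulation of VLBI delay errors (all quantities in seconds).
    The wet zenith delay is a random walk whose step variance is PSD * dt;
    the clock is crudely modeled as a random walk scaled by the Allan deviation
    times dt; white noise represents antenna/receiver errors."""
    rng = np.random.default_rng(seed)
    wet_zenith_delay = np.cumsum(rng.normal(0.0, np.sqrt(psd_wzd * dt), n_epochs))
    clock = np.cumsum(rng.normal(0.0, asd_clock * dt, n_epochs))
    noise = rng.normal(0.0, white_noise, n_epochs)
    return wet_zenith_delay + clock + noise

errors = simulate_delay_errors(500)
print(f"RMS simulated delay error: {errors.std() * 1e12:.1f} psec")
```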

  14. Dynamic cycling in atrial size and flow during obstructive apnoea

    PubMed Central

    Pressman, Gregg S; Cepeda-Valery, Beatriz; Codolosa, Nicolas; Orban, Marek; Samuel, Solomon P; Somers, Virend K

    2016-01-01

    Objective Obstructive sleep apnoea (OSA) is strongly associated with cardiovascular disease. However, acute cardiovascular effects of repetitive airway obstruction are poorly understood. While past research used a sustained Mueller manoeuver to simulate OSA we employed a series of gasping efforts to better simulate true obstructive apnoeas. This report describes acute changes in cardiac anatomy and flow related to sudden changes in intrathoracic pressure. Methods and results 26 healthy, normal weight participants performed 5–6 gasping efforts (target intrathoracic pressure −40 mm Hg) while undergoing Doppler echocardiography. 14 participants had sufficient echocardiographic images to allow comparison of atrial areas during the manoeuver with baseline measurements. Mitral and tricuspid E-wave and A-wave velocities postmanoeuver were compared with baseline in all participants. Average atrial areas changed little during the manoeuver, but variance in both atrial areas was significantly greater than baseline. Further, an inverse relationship was noted with left atrial collapse and right atrial enlargement at onset of inspiratory effort. Significant inverse changes were noted in Doppler flow when comparing the first beat postmanoeuver (pMM1) with baseline. Mitral E-wave velocity increased 9.1 cm/s while tricuspid E-wave velocity decreased 7.0 cm/s; by the eighth beat postmanoeuver (pMM8) values were not different from baseline. Mitral and tricuspid A-wave velocities were not different from baseline at pMM1, but both were significantly higher by pMM8. Conclusions Repetitive obstructive apnoeas produce dynamic, inverse changes in atrial size and Doppler flow across the atrioventricular valves. These observations have important implications for understanding the pathophysiology of OSA. PMID:27127636

  15. A Mathematical Model for Vertical Attitude Takeoff and Landing (VATOL) Aircraft Simulation. Volume 1; Model Description Application

    NASA Technical Reports Server (NTRS)

    Fortenbaugh, R. L.

    1980-01-01

    A mathematical model of a high performance airplane capable of vertical attitude takeoff and landing (VATOL) was developed. An off line digital simulation program incorporating this model was developed to provide trim conditions and dynamic check runs for the piloted simulation studies and support dynamic analyses of proposed VATOL configuration and flight control concepts. Development details for the various simulation component models and the application of the off line simulation program, Vertical Attitude Take-Off and Landing Simulation (VATLAS), to develop a baseline control system for the Vought SF-121 VATOL airplane concept are described.

  16. Phase retrieval based wavefront sensing experimental implementation and wavefront sensing accuracy calibration

    NASA Astrophysics Data System (ADS)

    Mao, Heng; Wang, Xiao; Zhao, Dazun

    2009-05-01

    As a wavefront sensing (WFS) tool, the Baseline algorithm, which belongs to the iterative-transform class of phase retrieval algorithms, estimates the phase distribution at the pupil from known PSFs at defocused planes. By using multiple phase diversities and appropriate phase unwrapping methods, this algorithm can achieve a reliable unique solution and high-dynamic-range phase measurement. In this paper, a Baseline-algorithm-based wavefront sensing experiment with a modified phase unwrapping step has been implemented, and corresponding graphical user interface (GUI) software has also been developed. The adaptability and repeatability of the Baseline algorithm have been validated in experiments. Moreover, the WFS accuracy of this algorithm has been calibrated against ZYGO interferometric results.
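
    For background, the iterative-transform family to which the Baseline algorithm belongs alternates between the pupil and focal planes, enforcing the measured PSF amplitude in one plane and the aperture support in the other. The generic Gerchberg-Saxton-style sketch below illustrates that loop; it is not the Baseline algorithm itself (which uses multiple defocused PSFs and phase unwrapping), and the synthetic test data are hypothetical.

```python
import numpy as np

def iterative_transform_phase_retrieval(psf_amplitude, pupil_mask, n_iter=200, seed=0):
    """Generic iterative-transform phase retrieval from a single in-focus PSF.
    psf_amplitude is the square root of a measured PSF; pupil_mask is a binary
    aperture of the same shape."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(-np.pi, np.pi, pupil_mask.shape)
    for _ in range(n_iter):
        pupil_field = pupil_mask * np.exp(1j * phase)
        focal_field = np.fft.fft2(pupil_field)
        # impose the measured focal-plane amplitude, keep the estimated phase
        focal_field = psf_amplitude * np.exp(1j * np.angle(focal_field))
        phase = np.angle(np.fft.ifft2(focal_field))  # back to the pupil plane
    return phase * pupil_mask

# Synthetic test: recover a known pupil phase from its noiseless PSF
n = 64
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
mask = (x ** 2 + y ** 2 <= (n // 4) ** 2).astype(float)
true_phase = 0.5 * np.sin(2 * np.pi * x / n) * mask
psf_amplitude = np.abs(np.fft.fft2(mask * np.exp(1j * true_phase)))
estimated_phase = iterative_transform_phase_retrieval(psf_amplitude, mask)
```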

  17. Results of the Australian geodetic VLBI experiment

    NASA Technical Reports Server (NTRS)

    Harvey, B. R.; Stolz, A.; Jauncey, D. L.; Niell, A.; Morabito, D. D.; Preston, R.

    1983-01-01

    The 250-2500 km baseline vectors between radio telescopes located at Tidbinbilla (DSS43) near Canberra, Parkes, Fleurs (X3) near Sydney, Hobart and Alice Springs were determined from radio interferometric observations of extragalactic sources. The observations were made during two 24-hour sessions on 26 April and 3 May 1982, and one 12-hour night-time session on 28 April 1982. The 275 km Tidbinbilla - Parkes baseline was measured with an accuracy of plus or minus 6 cm. The remaining baselines were measured with accuracies ranging from 15 cm to 6 m. The higher accuracies were achieved for the better instrumented sites of Tidbinbilla, Parkes and Fleurs. The data reduction technique and results of the experiment are discussed.

  18. Effect of Two Advanced Noise Reduction Technologies on the Aerodynamic Performance of an Ultra High Bypass Ratio Fan

    NASA Technical Reports Server (NTRS)

    Hughes, Christoper E.; Gazzaniga, John A.

    2013-01-01

    A wind tunnel experiment was conducted in the NASA Glenn Research Center anechoic 9- by 15-Foot Low-Speed Wind Tunnel to investigate two new advanced noise reduction technologies in support of the NASA Fundamental Aeronautics Program Subsonic Fixed Wing Project. The goal of the experiment was to demonstrate the noise reduction potential and effect on fan model performance of the two noise reduction technologies in a scale model Ultra-High Bypass turbofan at simulated takeoff and approach aircraft flight speeds. The two novel noise reduction technologies are called Over-the-Rotor acoustic treatment and Soft Vanes. Both technologies were aimed at modifying the local noise source mechanisms of the fan tip vortex/fan case interaction and the rotor wake-stator interaction. For the Over-the-Rotor acoustic treatment, two noise reduction configurations were investigated. The results showed that the two noise reduction technologies, Over-the-Rotor and Soft Vanes, were able to reduce the noise level of the fan model, but the Over-the-Rotor configurations had a significant negative impact on the fan aerodynamic performance; the loss in fan aerodynamic efficiency was between 2.75 to 8.75 percent, depending on configuration, compared to the conventional solid baseline fan case rubstrip also tested. Performance results with the Soft Vanes showed that there was no measurable change in the corrected fan thrust and a 1.8 percent loss in corrected stator vane thrust, which resulted in a total net thrust loss of approximately 0.5 percent compared with the baseline reference stator vane set.

  19. Online monitoring of the Osiris reactor with the Nucifer neutrino detector

    NASA Astrophysics Data System (ADS)

    Boireau, G.; Bouvet, L.; Collin, A. P.; Coulloux, G.; Cribier, M.; Deschamp, H.; Durand, V.; Fechner, M.; Fischer, V.; Gaffiot, J.; Gérard Castaing, N.; Granelli, R.; Kato, Y.; Lasserre, T.; Latron, L.; Legou, P.; Letourneau, A.; Lhuillier, D.; Mention, G.; Mueller, Th. A.; Nghiem, T.-A.; Pedrol, N.; Pelzer, J.; Pequignot, M.; Piret, Y.; Prono, G.; Scola, L.; Starzinski, P.; Vivier, M.; Dumonteil, E.; Mancusi, D.; Varignon, C.; Buck, C.; Lindner, M.; Bazoma, J.; Bouvier, S.; Bui, V. M.; Communeau, V.; Cucoanes, A.; Fallot, M.; Gautier, M.; Giot, L.; Guilloux, G.; Lenoir, M.; Martino, J.; Mercier, G.; Milleto, T.; Peuvrel, N.; Porta, A.; Le Quéré, N.; Renard, C.; Rigalleau, L. M.; Roy, D.; Vilajosana, T.; Yermia, F.; Nucifer Collaboration

    2016-06-01

    Originally designed as a new nuclear reactor monitoring device, the Nucifer detector has successfully detected its first neutrinos. We provide the second-shortest baseline measurement of the reactor neutrino flux. The detection of electron antineutrinos emitted in the decay chains of the fission products, combined with reactor core simulations, provides a new tool to assess both the thermal power and the fissile content of the whole nuclear core and could be used by the International Atomic Energy Agency to enhance the safeguards of civil nuclear reactors. Deployed at only 7.2 m away from the compact Osiris research reactor core (70 MW) operating at the Saclay research center of the French Alternative Energies and Atomic Energy Commission, the experiment also exhibits a well-suited configuration to search for a new short baseline oscillation. We report the first results of the Nucifer experiment, describing the performances of the ~0.85 m3 detector remotely operating at a shallow depth equivalent to ~12 m of water and under intense background radiation conditions. Based on 145 (106) days of data with the reactor on (off), leading to the detection of an estimated 40,760 ν̄e, the mean number of detected antineutrinos is 281 ± 7 (stat) ± 18 (syst) ν̄e/day, in agreement with the prediction of 277 ± 23 ν̄e/day. Because of the large background, no conclusive results on the existence of light sterile neutrinos could be derived, however. As a first societal application we quantify how antineutrinos could be used for the Plutonium Management and Disposition Agreement.

  20. Simulation Evaluation of Synthetic Vision as an Enabling Technology for Equivalent Visual Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Williams, Steven P.; Bailey, Randall E.

    2008-01-01

    Enhanced Vision (EV) and synthetic vision (SV) systems may serve as enabling technologies to meet the challenges of the Next Generation Air Transportation System (NextGen) Equivalent Visual Operations (EVO) concept, that is, the ability to achieve or even improve on the safety of Visual Flight Rules (VFR) operations, maintain the operational tempos of VFR, and even, perhaps, retain VFR procedures independent of actual weather and visibility conditions. One significant challenge lies in the definition of required equipage on the aircraft and on the airport to enable the EVO concept objective. A piloted simulation experiment was conducted to evaluate the effects of the presence or absence of Synthetic Vision, the location of this information during an instrument approach (i.e., on a Head-Up or Head-Down Primary Flight Display), and the type of airport lighting information on landing minima. The quantitative data from this experiment were analyzed to begin the definition of performance-based criteria for all-weather approach and landing operations. Objective results from the present study showed that better approach performance was attainable with the head-up display (HUD) compared to the head-down display (HDD). A slight improvement in HDD performance was shown when SV was added, as the pilots descended below 200 ft to a 100 ft decision altitude, but this performance was not tested for statistical significance (nor was it expected to be statistically significant). The touchdown data showed that regardless of the display concept flown (SV HUD, Baseline HUD, SV HDD, Baseline HDD) a majority of the runs were within the performance-based defined approach and landing criteria in all the visibility levels, approach lighting systems, and decision altitudes tested. For this visual flight maneuver, RVR appeared to be the most significant influence in touchdown performance. The approach lighting system clearly impacted the pilot's ability to descend to 100 ft height above touchdown based on existing Federal Aviation Regulation (FAR) 91.175 using a 200 ft decision height, but did not appear to influence touchdown performance or approach path maintenance.

  1. Demonstration of obstacle avoidance system (OASYS) symbology in full mission simulation

    NASA Astrophysics Data System (ADS)

    Sharkey, Thomas J.

    1994-06-01

    The U. S. Army Aeroflightdynamics Directorate's (AFDD) Crew Station Research and Development Branch (CSRDB) conducted a multiphase effort to develop symbology displaying information from the Obstacle Avoidance System (OASYS) on the Aviator's Night Vision System (ANVIS) Head Up Display (HUD). The first phase of this program used static symbology displayed on a workstation to identify the types of information required from OASYS by the pilot. The second phase used a low-fidelity, pilot-in-the-loop simulation to evaluate fourteen different symbology-drive law combinations. Based on the results of phases 1 and 2 three candidate symbologies were selected, along with the baseline symbology developed by the OASYS contractor, for evaluation in full mission simulation. In addition, a full-daylight, full field-of-view condition and Night Vision Goggle (NVG) condition, both without OASYS symbology, were used as control conditions. The environmental conditions (e.g., ambient illumination, visual range) and task requirements (e.g., altitude and airspeed) used in the simulation were selected to severely tax the symbology. Reliable differences in performance between symbology conditions were found. Two of the symbologies developed during the earlier phases of this program resulted in reduced frequencies of ground strikes compared to OASYS baseline and NVG only conditions. The frequency of close approaches to wires was lower with the symbology developed in this program than with the baseline symbology. All OASYS symbologies improved performance relative to the NVG control condition. It is recommended that the OASYS symbology and drive laws developed during this program be used during OASYS flight tests.

  2. Airfoil Ice-Accretion Aerodynamics Simulation

    NASA Technical Reports Server (NTRS)

    Bragg, Michael B.; Broeren, Andy P.; Addy, Harold E.; Potapczuk, Mark G.; Guffond, Didier; Montreuil, E.

    2007-01-01

    NASA Glenn Research Center, ONERA, and the University of Illinois are conducting a major research program whose goal is to improve our understanding of the aerodynamic scaling of ice accretions on airfoils. The program when it is completed will result in validated scaled simulation methods that produce the essential aerodynamic features of the full-scale iced-airfoil. This research will provide some of the first, high-fidelity, full-scale, iced-airfoil aerodynamic data. An initial study classified ice accretions based on their aerodynamics into four types: roughness, streamwise ice, horn ice, and spanwise-ridge ice. Subscale testing using a NACA 23012 airfoil was performed in the NASA IRT and University of Illinois wind tunnel to better understand the aerodynamics of these ice types and to test various levels of ice simulation fidelity. These studies are briefly reviewed here and have been presented in more detail in other papers. Based on these results, full-scale testing at the ONERA F1 tunnel using cast ice shapes obtained from molds taken in the IRT will provide full-scale iced airfoil data from full-scale ice accretions. Using these data as a baseline, the final step is to validate the simulation methods in scale in the Illinois wind tunnel. Computational ice accretion methods including LEWICE and ONICE have been used to guide the experiments and are briefly described and results shown. When full-scale and simulation aerodynamic results are available, these data will be used to further develop computational tools. Thus the purpose of the paper is to present an overview of the program and key results to date.

  3. Closing the Knowledge Gap: Effects of Land Use Conversion on Belowground Carbon near the 100th Meridian

    NASA Astrophysics Data System (ADS)

    Waldron, S. E.; Phillips, R. L.; Dell, R.; Suddick, E. C.

    2012-12-01

    Native prairie of the northern Great Plains near the 100th meridian is currently under land use conversion pressure due to high commodity prices. From 2002 to 2007, approximately 303,515 hectares of prairie were converted to crop production in the Prairie Pothole Region (PPR) from Montana to the Dakotas. The spatiotemporal effects of land-use conversion on soil organic matter are still unclear for the PPR. Effects will vary with management, soil properties and time, making regional experiments and simulation modeling necessary. Grassland conservationists are interested in soil carbon data and soil carbon simulation models to inform potential voluntary carbon credit programs. These programs require quantification of changes in soil carbon associated with land-use conversion and management. We addressed this issue by 1) designing a regional-scale experiment, 2) collecting and analyzing soil data, and 3) interviewing producers about land management practices, as required for regional, process-based biogeochemical models. We selected farms at random within a 29,000 km2 area of interest and measured soil properties at multiple depths for native prairie and adjacent annual crop fields. The cores were processed at six different depths (between 0 and 100 cm) for bulk density, pH, texture, total carbon, inorganic carbon, and total nitrogen. We found that the largest difference in soil organic carbon occurred at the 0-10 cm depth, but the magnitude of the effect of land use varied with soil properties and land management. Results from this project, coupled with regional model simulations (Denitrification-Decomposition, DNDC) represent the baseline data needed for future voluntary carbon credit programs and long-term carbon monitoring networks. Enrollment in such programs could help ranchers and farmers realize a new income stream from maintaining their native prairie and the carbon stored beneath it.

  4. An Integrated Framework for Modeling Air Carrier Behavior, Policy, and Impacts in the U.S. Air Transportation System

    NASA Technical Reports Server (NTRS)

    Horio, Brant M.; Kumar, Vivek; DeCicco, Anthony H.; Hasan, Shahab; Stouffer, Virginia L.; Smith, Jeremy C.; Guerreiro, Nelson M.

    2015-01-01

    The implementation of the Next Generation Air Transportation System (NextGen) in the United States is an ongoing challenge for policymakers due to the complexity of the air transportation system (ATS) with its broad array of stakeholders and dynamic interdependencies between them. The successful implementation of NextGen has a hard dependency on the active participation of U.S. commercial airlines. To assist policymakers in identifying potential policy designs that facilitate the implementation of NextGen, the National Aeronautics and Space Administration (NASA) and LMI developed a research framework called the Air Transportation System Evolutionary Simulation (ATS-EVOS). This framework integrates large empirical data sets with multiple specialized models to simulate the evolution of the airline response to potential future policies and explore consequential impacts on ATS performance and market dynamics. In the ATS-EVOS configuration presented here, we leverage the Transportation Systems Analysis Model (TSAM), the Airline Evolutionary Simulation (AIRLINE-EVOS), the Airspace Concept Evaluation System (ACES), and the Aviation Environmental Design Tool (AEDT), all of which enable this research to comprehensively represent the complex facets of the ATS and its participants. We validated this baseline configuration of ATS-EVOS against Airline Origin and Destination Survey (DB1B) data and subject matter expert opinion, and we verified the ATS-EVOS framework and agent behavior logic through scenario-based experiments that explored potential implementations of a carbon tax, congestion pricing policy, and the dynamics for equipage of new technology by airlines. These experiments demonstrated ATS-EVOS's capabilities in responding to a wide range of potential NextGen-related policies and utility for decision makers to gain insights for effective policy design.

  5. Data Processing Algorithm for Diagnostics of Combustion Using Diode Laser Absorption Spectrometry.

    PubMed

    Mironenko, Vladimir R; Kuritsyn, Yuril A; Liger, Vladimir V; Bolshov, Mikhail A

    2018-02-01

    A new algorithm for the evaluation of the integral line intensity, for inferring the correct value of the temperature of a hot zone in the diagnostics of combustion by absorption spectroscopy with diode lasers, is proposed. The algorithm is based not on fitting the baseline (BL) but on expanding the experimental and simulated spectra in a series of orthogonal polynomials, subtracting the first three components of the expansion from both the experimental and simulated spectra, and fitting the spectra thus modified. The algorithm is tested in a numerical experiment by simulating the absorption spectra using a spectroscopic database and adding white noise and a parabolic BL. The absorption spectra constructed in this way are treated as experimental data in further calculations. The theoretical absorption spectra were simulated with parameters (temperature, total pressure, concentration of water vapor) close to those used for simulating the experimental data. Then, the spectra were expanded in the series of orthogonal polynomials and the first components were subtracted from both spectra. Correct integral line intensities, and hence a correct temperature evaluation, were obtained by fitting the thus modified experimental and simulated spectra. The dependence of the mean and standard deviation of the integral line intensity estimate on the linewidth and the number of subtracted components (first two or three) was examined. The proposed algorithm provides a correct estimation of temperature with a standard deviation better than 60 K (for T = 1000 K) for line half-widths up to 0.6 cm^-1. The proposed algorithm allows the parameters of a hot zone to be obtained without fitting the usually unknown BL.
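
    A minimal sketch of the orthogonal-polynomial step described above, implemented here as a least-squares projection onto low-order Legendre polynomials that is subtracted from the spectrum, is given below; the toy spectrum, line shape, and parabolic baseline are invented for illustration, and the same operation would be applied to both the measured and the simulated spectra before fitting.

```python
import numpy as np
from numpy.polynomial import legendre

def subtract_low_order_components(wavenumber, spectrum, n_components=3):
    """Remove the first n_components (degrees 0..n_components-1) of a Legendre
    expansion fitted over the spectral window, leaving the line shape plus the
    part of the baseline not captured by the low-order terms."""
    # map the wavenumber axis to [-1, 1], where Legendre polynomials are orthogonal
    x = 2.0 * (wavenumber - wavenumber.min()) / (wavenumber.max() - wavenumber.min()) - 1.0
    coeffs = legendre.legfit(x, spectrum, deg=n_components - 1)
    return spectrum - legendre.legval(x, coeffs)

# Toy example: Gaussian absorption line on a parabolic baseline plus white noise
nu = np.linspace(0.0, 2.0, 400)
line = 0.3 * np.exp(-((nu - 1.0) / 0.05) ** 2)
baseline = 0.2 + 0.1 * nu + 0.05 * nu ** 2
noisy = line + baseline + np.random.default_rng(1).normal(0.0, 0.005, nu.size)
corrected = subtract_low_order_components(nu, noisy)
```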

  6. A Four-Way Comparison of Cardiac Function with Normobaric Normoxia, Normobaric Hypoxia, Hypobaric Hypoxia and Genuine High Altitude

    PubMed Central

    Boos, Christopher John; O’Hara, John Paul; Mellor, Adrian; Hodkinson, Peter David; Tsakirides, Costas; Reeve, Nicola; Gallagher, Liam; Green, Nicholas Donald Charles; Woods, David Richard

    2016-01-01

    Background There has been considerable debate as to whether different modalities of simulated hypoxia induce similar cardiac responses. Materials and Methods This was a prospective observational study of 14 healthy subjects aged 22–35 years. Echocardiography was performed at rest and at 15 and 120 minutes following two hours exercise under normobaric normoxia (NN) and under similar PiO2 following genuine high altitude (GHA) at 3,375m, normobaric hypoxia (NH) and hypobaric hypoxia (HH) to simulate the equivalent hypoxic stimulus to GHA. Results All 14 subjects completed the experiment at GHA, 11 at NN, 12 under NH, and 6 under HH. The four groups were similar in age, sex and baseline demographics. At baseline rest right ventricular (RV) systolic pressure (RVSP, p = 0.0002), pulmonary vascular resistance (p = 0.0002) and acute mountain sickness (AMS) scores were higher and the SpO2 lower (p<0.0001) among all three hypoxic groups (GHA, NH and HH) compared with NN. At both 15 minutes and 120 minutes post exercise, AMS scores, Cardiac output, septal S', lateral S', tricuspid S' and A' velocities and RVSP were higher and SpO2 lower with all forms of hypoxia compared with NN. On post-test analysis, among the three hypoxia groups, SpO2 was lower at baseline and 15 minutes post exercise with GHA (89.3±3.4% and 89.3±2.2%) and HH (89.0±3.1% and 89.8±5.0%) compared with NH (92.9±1.7% and 93.6±2.5%). The RV Myocardial Performance (Tei) Index and RVSP were significantly higher with HH than NH at 15 and 120 minutes post exercise respectively and tricuspid A' was higher with GHA compared with NH at 15 minutes post exercise. Conclusions GHA, NH and HH produce similar cardiac adaptations over short duration rest despite lower SpO2 levels with GHA and HH compared with NH. Notable differences emerge following exercise in SpO2, RVSP and RV cardiac function. PMID:27100313

  7. Space shuttle visual simulation system design study

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The current and near-future state-of-the-art in visual simulation equipment technology is related to the requirements of the space shuttle visual system. Image source, image sensing, and displays are analyzed on a subsystem basis, and the principal conclusions are used in the formulation of a recommended baseline visual system. Perceptibility and visibility are also analyzed.

  8. Investigation of airfoil leading edge separation control with nanosecond plasma actuator

    NASA Astrophysics Data System (ADS)

    Zheng, J. G.; Cui, Y. D.; Zhao, Z. J.; Li, J.; Khoo, B. C.

    2016-11-01

    A combined numerical and experimental investigation of airfoil leading edge flow separation control with a nanosecond dielectric barrier discharge (DBD) plasma actuator is presented. Our study concentrates on describing the dynamics of the detailed flow actuation process and elucidating the nanosecond DBD actuation mechanism. A loose coupling methodology is employed to perform simulation, which consists of a self-similar plasma model for the description of pulsed discharge and two-dimensional Reynolds averaged Navier-Stokes (RANS) equations for the calculation of external airflow. A series of simulations of poststall flows around a NACA0015 airfoil is conducted with a Reynolds number range covering both low and high Re at Re = (0.05, 0.15, 1.2) × 10^6. Meanwhile, a wind-tunnel experiment is performed for two low Re flows to measure aerodynamic force on the airfoil model and the transient flow field with time-resolved particle image velocimetry (PIV). The PIV measurement provides possibly the clearest view of the flow reattachment process under the actuation of a nanosecond plasma actuator ever observed in experiments, which is highly comparable to that predicted by simulation. It is found from the detailed simulation that the discharge-induced residual heat rather than shock wave plays a dominant role in flow control. For any leading edge separations, the preliminary flow reattachment is realized by residual heat-induced spanwise vortices. After that, the nanosecond actuator functions by continuously exciting flow instability at poststall attack angles or acting as an active trip near the stall angle. As a result, the controlled flow is characterized by a train of repetitive, downstream moving vortices over the suction surface or an attached turbulent boundary layer, which depends on both angle of attack and Reynolds number. The advection of residual temperature with external flow offers a nanosecond plasma actuator a lot of flexibility to extend its influence region. Animations are provided for baseline flow and that subjected to plasma control at two typical Reynolds numbers.

  9. Study on the calibration and optimization of double theodolites baseline

    NASA Astrophysics Data System (ADS)

    Ma, Jing-yi; Ni, Jin-ping; Wu, Zhi-chao

    2018-01-01

    Because the baseline of a double-theodolite measurement system serves as the scale reference of the system and affects its accuracy, this paper puts forward a method for calibrating and optimizing the double-theodolite baseline. The double theodolites measure a reference ruler of known length, and the baseline is then obtained by inverting the measurement formula. Based on the error propagation law, the analyses show that the baseline error function is an important index of the accuracy of the system, and that the position and posture of the reference ruler have an impact on the baseline error. An optimization model is established with the baseline error function as the objective function, and the position and posture of the reference ruler are optimized. The simulation results show that the height of the reference ruler has no effect on the baseline error, that the effect of posture is not uniform, and that the baseline error is smallest when the reference ruler is placed at x = 500 mm and y = 1000 mm in the measurement space. The experimental results are consistent with the theoretical analyses in the measurement space. This study of the placement of the reference ruler provides a reference for improving the accuracy of the double-theodolite measurement system.

  10. Baseline tests for arc melter vitrification of INEL buried wastes. Volume 1: Facility description and summary data report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, L.L.; O'Connor, W.K.; Turner, P.C.

    1993-11-19

    This report presents field results and raw data from the Buried Waste Integrated Demonstration (BWID) Arc Melter Vitrification Project Phase 1 baseline test series conducted by the Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM). The baseline test series was conducted using the electric arc melter facility at the USBM Albany Research Center in Albany, Oregon. Five different surrogate waste feed mixtures were tested that simulated thermally-oxidized, buried, TRU-contaminated, mixed wastes and soils present at the INEL. The USBM Arc Furnace Integrated Waste Processing Test Facility includes a continuous feed system, the arc melting furnace, an offgas control system, and utilities. The melter is a sealed, 3-phase alternating current (ac) furnace approximately 2 m high and 1.3 m wide. The furnace has a capacity of 1 metric ton of steel and can process as much as 1,500 lb/h of soil-type waste materials. The surrogate feed materials included five mixtures designed to simulate incinerated TRU-contaminated buried waste materials mixed with INEL soil. Process samples, melter system operations data and offgas composition data were obtained during the baseline tests to evaluate the melter performance and meet test objectives. Samples and data gathered during this program included (a) automatically and manually logged melter systems operations data, (b) process samples of slag, metal and fume solids, and (c) offgas composition, temperature, velocity, flowrate, moisture content, particulate loading and metals content. This report consists of 2 volumes: Volume I summarizes the baseline test operations. It includes an executive summary, system and facility description, review of the surrogate waste mixtures, and a description of the baseline test activities, measurements, and sample collection. Volume II contains the raw test data and sample analyses from samples collected during the baseline tests.

  11. Long-Baseline Neutrino Facility (LBNF) and Deep Underground Neutrino Experiment (DUNE): Conceptual Design Report. Volume 3: Long-Baseline Neutrino Facility for DUNE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strait, James; McCluskey, Elaine; Lundin, Tracy

    2016-01-21

    This volume of the LBNF/DUNE Conceptual Design Report covers the Long-Baseline Neutrino Facility for DUNE and describes the LBNF Project, which includes design and construction of the beamline at Fermilab, the conventional facilities at both Fermilab and SURF, and the cryostat and cryogenics infrastructure required for the DUNE far detector.

  12. A multi-scale ensemble-based framework for forecasting compound coastal-riverine flooding: The Hackensack-Passaic watershed and Newark Bay

    NASA Astrophysics Data System (ADS)

    Saleh, F.; Ramaswamy, V.; Wang, Y.; Georgas, N.; Blumberg, A.; Pullen, J.

    2017-12-01

    Estuarine regions can experience compound impacts from coastal storm surge and riverine flooding. The challenges in forecasting flooding in such areas are multi-faceted due to uncertainties associated with meteorological drivers and interactions between hydrological and coastal processes. The objective of this work is to evaluate how uncertainties from meteorological predictions propagate through an ensemble-based flood prediction framework and translate into uncertainties in simulated inundation extents. A multi-scale framework, consisting of hydrologic, coastal and hydrodynamic models, was used to simulate two extreme flood events at the confluence of the Passaic and Hackensack rivers and Newark Bay. The events were Hurricane Irene (2011), a combination of inland flooding and coastal storm surge, and Hurricane Sandy (2012) where coastal storm surge was the dominant component. The hydrodynamic component of the framework was first forced with measured streamflow and ocean water level data to establish baseline inundation extents with the best available forcing data. The coastal and hydrologic models were then forced with meteorological predictions from 21 ensemble members of the Global Ensemble Forecast System (GEFS) to retrospectively represent potential future conditions up to 96 hours prior to the events. Inundation extents produced by the hydrodynamic model, forced with the 95th percentile of the ensemble-based coastal and hydrologic boundary conditions, were in good agreement with baseline conditions for both events. The USGS reanalysis of Hurricane Sandy inundation extents was encapsulated between the 50th and 95th percentile of the forecasted inundation extents, and that of Hurricane Irene was similar but with caveats associated with data availability and reliability. This work highlights the importance of accounting for meteorological uncertainty to represent a range of possible future inundation extents at high resolution (∼m).
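
    Collapsing the 21-member ensemble into percentile boundary conditions, as described above, reduces to a percentile taken across members at each time step; the sketch below is illustrative, with made-up member series standing in for GEFS-driven output.

```python
import numpy as np

def ensemble_percentile_series(member_series, q=95.0):
    """Collapse an ensemble of forecast time series (members x time steps) into a
    single percentile series, e.g., the 95th-percentile stage or discharge used
    to force the hydrodynamic model."""
    return np.percentile(np.asarray(member_series, dtype=float), q, axis=0)

# 21 hypothetical ensemble members, 96 hourly values each
members = np.random.default_rng(2).gamma(2.0, 1.0, size=(21, 96))
forcing_95th = ensemble_percentile_series(members, 95.0)
forcing_50th = ensemble_percentile_series(members, 50.0)
print(forcing_95th[:5], forcing_50th[:5])
```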

  13. Multiple Intravenous Infusions Phase 2b: Laboratory Study

    PubMed Central

    Pinkney, Sonia; Fan, Mark; Chan, Katherine; Koczmara, Christine; Colvin, Christopher; Sasangohar, Farzan; Masino, Caterina; Easty, Anthony; Trbovich, Patricia

    2014-01-01

    Background Administering multiple intravenous (IV) infusions to a single patient via infusion pump occurs routinely in health care, but there has been little empirical research examining the risks associated with this practice or ways to mitigate those risks. Objectives To identify the risks associated with multiple IV infusions and assess the impact of interventions on nurses’ ability to safely administer them. Data Sources and Review Methods Forty nurses completed infusion-related tasks in a simulated adult intensive care unit, with and without interventions (i.e., repeated-measures design). Results Errors were observed in completing common tasks associated with the administration of multiple IV infusions, including the following (all values from baseline, which was current practice): setting up and programming multiple primary continuous IV infusions (e.g., 11.7% programming errors); identifying IV infusions (e.g., 7.7% line-tracing errors); managing dead volume (e.g., 96.0% flush rate errors following IV syringe dose administration); setting up a secondary intermittent IV infusion (e.g., 11.3% secondary clamp errors); and administering an IV pump bolus (e.g., 11.5% programming errors). Of 10 interventions tested, 6 (1 practice, 3 technology, and 2 educational) significantly decreased or even eliminated errors compared to baseline. Limitations The simulation of an adult intensive care unit at 1 hospital limited the ability to generalize results. The study results were representative of nurses who received training in the interventions but had little experience using them. The longitudinal effects of the interventions were not studied. Conclusions Administering and managing multiple IV infusions is a complex and risk-prone activity. However, when a patient requires multiple IV infusions, targeted interventions can reduce identified risks. A combination of standardized practice, technology improvements, and targeted education is required. PMID:26316919

  14. Principles for Integrating Mars Analog Science, Operations, and Technology Research

    NASA Technical Reports Server (NTRS)

    Clancey, William J.

    2003-01-01

    During the Apollo program, the scientific community and NASA used terrestrial analog sites for understanding planetary features and for training astronauts to be scientists. Human factors studies (Harrison, Clearwater, & McKay 1991; Stuster 1996) have focused on the effects of isolation in extreme environments. More recently, with the advent of wireless computing, we have prototyped advanced EVA technologies for navigation, scheduling, and science data logging (Clancey 2002b; Clancey et al., in press). Combining these interests in a single expedition enables tremendous synergy and authenticity, as pioneered by Pascal Lee's Haughton-Mars Project (Lee 2001; Clancey 2000a) and the Mars Society's research stations on a crater rim on Devon Island in the High Canadian Arctic (Clancey 2000b; 2001b) and the Morrison Formation of southeast Utah (Clancey 2002a). Based on this experience, the following principles are proposed for conducting an integrated science, operations, and technology research program at analog sites: 1) Authentic work; 2) PI-based projects; 3) Unencumbered baseline studies; 4) Closed simulations; and 5) Observation and documentation. Following these principles, we have been integrating field science, operations research, and technology development at analog sites on Devon Island and in Utah over the past five years. Analytic methods include work practice simulation (Clancey 2002c; Sierhuis et al., 2000a,b), by which the interaction of human behavior, facilities, geography, tools, and procedures are formalized in computer models. These models are then converted into the runtime EVA system we call mobile agents (Clancey 2002b; Clancey et al., in press). Furthermore, we have found that the Apollo Lunar Surface Journal (Jones, 1999) provides a vast repository for understanding astronaut and CapCom interactions, serving as a baseline for Mars operations and quickly highlighting opportunities for computer automation (Clancey, in press).

  15. Simulating large atmospheric phase screens using a woofer-tweeter algorithm.

    PubMed

    Buscher, David F

    2016-10-03

    We describe an algorithm for simulating atmospheric wavefront perturbations over ranges of spatial and temporal scales spanning more than 4 orders of magnitude. An open-source implementation of the algorithm written in Python can simulate the evolution of the perturbations more than an order of magnitude faster than real time. Testing of the implementation using metrics appropriate to adaptive optics systems and long-baseline interferometers shows accuracies at the few percent level or better.
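    The woofer-tweeter algorithm splits the perturbations between coarse, large-scale screens and fine, small-scale screens to cover the full range of scales. As a much simpler illustration of the underlying idea of spectrally shaped random screens, the sketch below generates a single Kolmogorov-spectrum phase screen by FFT filtering of white noise; the grid size, Fried parameter, and normalization convention are illustrative assumptions and only approximate, and this is not the paper's open-source implementation.

        import numpy as np

        def kolmogorov_screen(n=256, dx=0.02, r0=0.1, seed=0):
            """Single FFT-based phase screen with a Kolmogorov power spectrum.
            n: grid points per side, dx: grid spacing [m], r0: Fried parameter [m]."""
            rng = np.random.default_rng(seed)
            f = np.fft.fftfreq(n, d=dx)              # spatial frequencies [1/m]
            fx, fy = np.meshgrid(f, f)
            fr = np.hypot(fx, fy)
            fr[0, 0] = 1.0                           # avoid division by zero at DC
            psd = 0.023 * r0 ** (-5.0 / 3.0) * fr ** (-11.0 / 3.0)
            psd[0, 0] = 0.0                          # remove the piston term
            noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
            df = 1.0 / (n * dx)
            screen = np.fft.ifft2(noise * np.sqrt(psd) * df) * n * n
            return screen.real                       # phase [rad], approximate scaling

        phi = kolmogorov_screen()
        print(phi.std())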

  16. Sea level static calibration of a compact multimission aircraft propulsion simulator with inlet flow distortion

    NASA Technical Reports Server (NTRS)

    Won, Mark J.

    1990-01-01

    Wind tunnel tests of propulsion-integrated aircraft models have identified inlet flow distortion as a major source of compressor airflow measurement error in turbine-powered propulsion simulators. Consequently, two Compact Multimission Aircraft Propulsion Simulator (CMAPS) units were statically tested at sea level ambient conditions to establish simulator operating performance characteristics and to calibrate the compressor airflow against an accurate bellmouth flowmeter in the presence of inlet flow distortions. The distortions were generated using various-shaped wire mesh screens placed upstream of the compressor. CMAPS operating maps and performance envelopes were obtained for inlet total pressure distortions (ratio of the difference between the maximum and minimum total pressures to the average total pressure) up to 35 percent, and were compared to baseline simulator operating characteristics for a uniform inlet. Deviations from CMAPS baseline performance were attributed to the coupled variation of both compressor inlet-flow distortion and Reynolds number index throughout the simulator operating envelope for each screen configuration. Four independent methods were used to determine CMAPS compressor airflow: direct compressor inlet and discharge measurements, an entering/exiting flow-balance relationship, and a correlation between the mixer pressure and the corrected compressor airflow. Of the four methods, the last yielded the least scatter in the compressor flow coefficient, approximately ±3 percent over the range of flow distortions.

  17. Uncertain Representations of Sub-Grid Pollutant Transport in Chemistry-Transport Models and Impacts on Long-Range Transport and Global Composition

    NASA Technical Reports Server (NTRS)

    Pawson, Steven; Zhu, Z.; Ott, L. E.; Molod, A.; Duncan, B. N.; Nielsen, J. E.

    2009-01-01

    Sub-grid transport, by convection and turbulence, is known to play an important role in lofting pollutants from their source regions. Consequently, the long-range transport and climatology of simulated atmospheric composition are impacted. This study uses the Goddard Earth Observing System, Version 5 (GEOS-5) atmospheric model to study pollutant transport. The baseline model uses a Relaxed Arakawa-Schubert (RAS) scheme that represents convection through a sequence of linearly entraining cloud plumes characterized by unique detrainment levels. Thermodynamics, moisture and trace gases are transported in the same manner. Various approximate forms of trace-gas transport are implemented, in which the box-averaged cloud mass fluxes from RAS are used with different numerical approaches. Substantial impacts on forward-model simulations of CO (using a linearized chemistry) are evident. In particular, some aspects of simulations using a diffusive form of sub-grid transport bear more resemblance to space-based CO observations than do the baseline simulations with RAS transport. Implications for transport in the real atmosphere will be discussed. Another issue of importance is that many adjoint/inversion computations use simplified representations of sub-grid transport that may be inconsistent with the forward models: implications will be discussed. Finally, simulations using a complex chemistry model in GEOS-5 (in place of the linearized CO model) are underway: noteworthy results from this simulation will be mentioned.

  18. Relationship between jump landing kinematics and peak ACL force during a jump in downhill skiing: a simulation study.

    PubMed

    Heinrich, D; van den Bogert, A J; Nachbauer, W

    2014-06-01

    Recent data highlight that competitive skiers face a high risk of injuries especially during off-balance jump landing maneuvers in downhill skiing. The purpose of the present study was to develop a musculo-skeletal modeling and simulation approach to investigate the cause-and-effect relationship between a perturbed landing position, i.e., joint angles and trunk orientation, and the peak force in the anterior cruciate ligament (ACL) during jump landing. A two-dimensional musculo-skeletal model was developed and a baseline simulation was obtained reproducing measurement data of a reference landing movement. Based on the baseline simulation, a series of perturbed landing simulations (n = 1000) was generated. Multiple linear regression was performed to determine a relationship between peak ACL force and the perturbed landing posture. Increased backward lean, hip flexion, knee extension, and ankle dorsiflexion as well as an asymmetric position were related to higher peak ACL forces during jump landing. The orientation of the trunk of the skier was identified as the most important predictor, accounting for 60% of the variance of the peak ACL force in the simulations. It was concluded that ACL injury prevention programs in downhill skiing should include the teaching of tactical decisions and exercise regimens that improve trunk control during landing motions. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
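    The cause-and-effect analysis described here regresses the peak ACL force from each perturbed simulation on the perturbed posture variables and ranks predictors by their standardized coefficients. A minimal sketch of that multiple-regression step follows; the synthetic posture perturbations, coefficient values, and variable names are invented for illustration and are not the paper's model or data.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 1000                                   # number of perturbed landing simulations
        # Illustrative posture perturbations [deg]: trunk lean, hip flexion, knee angle, ankle angle
        X = rng.normal(0.0, 5.0, size=(n, 4))
        # Synthetic peak ACL force [N]; trunk lean dominates by construction (assumed values)
        y = 1200 + 40 * X[:, 0] + 10 * X[:, 1] - 8 * X[:, 2] + 5 * X[:, 3] + rng.normal(0, 50, n)

        # Ordinary least squares on a design matrix with an intercept column
        A = np.column_stack([np.ones(n), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Standardized coefficients indicate the relative importance of each predictor
        std_coef = coef[1:] * X.std(axis=0) / y.std()
        print("coefficients:", coef)
        print("standardized:", std_coef)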

  19. Orion Orbit Control Design and Analysis

    NASA Technical Reports Server (NTRS)

    Jackson, Mark; Gonzalez, Rodolfo; Sims, Christopher

    2007-01-01

    The analysis of candidate thruster configurations for the Crew Exploration Vehicle (CEV) is presented. Six candidate configurations were considered for the prime contractor baseline design. The analysis included analytical assessments of control authority, control precision, efficiency and robustness, as well as simulation assessments of control performance. The principles used in the analytic assessments of controllability, robustness and fuel performance are covered and results provided for the configurations assessed. Simulation analysis was conducted using a pulse-width-modulated, 6-DOF reaction control system law with a simplex-based thruster selection algorithm. Control laws were automatically derived from hardware configuration parameters including thruster locations, directions, magnitude and specific impulse, as well as vehicle mass properties. This parameterized controller allowed rapid assessment of multiple candidate layouts. Simulation results are presented for final phase rendezvous and docking, as well as low lunar orbit attitude hold. Finally, on-going analysis to consider alternate Service Module designs and to assess the pilotability of the baseline design is discussed to provide a status of orbit control design work to date.
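    The simplex-based thruster selection mentioned above can be posed as a small linear program: choose nonnegative duty cycles for each thruster so that the commanded force and torque are produced while a fuel proxy (total duty cycle) is minimized. The sketch below illustrates that formulation with an invented planar four-thruster layout and command; it is not the CEV configuration, and scipy's linprog is used simply as a stand-in LP solver.

        import numpy as np
        from scipy.optimize import linprog

        # Columns of B map each thruster's unit firing to a body wrench [Fx, Fy, Tz]
        # (a planar 4-thruster toy layout, assumed purely for illustration).
        B = np.array([[1.0, -1.0,  0.0,  0.0],
                      [0.0,  0.0,  1.0, -1.0],
                      [0.5,  0.5, -0.5, -0.5]])
        wrench_cmd = np.array([0.6, 0.2, 0.1])     # commanded [Fx, Fy, Tz]

        # Minimize total duty cycle subject to producing the command exactly,
        # with each thruster's duty cycle limited to [0, 1].
        res = linprog(c=np.ones(B.shape[1]),
                      A_eq=B, b_eq=wrench_cmd,
                      bounds=[(0.0, 1.0)] * B.shape[1])
        print(res.x if res.success else "infeasible command")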

  20. Quasi-optical analysis of a far-infrared spatio-spectral space interferometer concept

    NASA Astrophysics Data System (ADS)

    Bracken, C.; O'Sullivan, C.; Murphy, J. A.; Donohoe, A.; Savini, G.; Lightfoot, J.; Juanola-Parramon, R.; Fisica Consortium

    2016-07-01

    FISICA (Far-Infrared Space Interferometer Critical Assessment) was a three-year study of a far-infrared spatio-spectral double-Fourier interferometer concept. One of the aims of the FISICA study was to set out a baseline optical design for such a system, and to use a model of the system to simulate realistic telescope beams for use with an end-to-end instrument simulator. This paper describes a two-telescope (and hub) baseline optical design that fulfils the requirements of the FISICA science case, while minimising the optical mass of the system. A number of different modelling techniques were required for the analysis: fast approximate simulation tools such as ray tracing and Gaussian beam methods were employed for initial analysis, with GRASP physical optics used for higher accuracy in the final analysis. Results are shown for the predicted far-field patterns of the telescope primary mirrors under illumination by smooth-walled rectangular feed horns. Far-field patterns for both on-axis and off-axis detectors are presented and discussed.

  1. Baseline Error Analysis and Experimental Validation for Height Measurement of Formation Insar Satellite

    NASA Astrophysics Data System (ADS)

    Gao, X.; Li, T.; Zhang, X.; Geng, X.

    2018-04-01

    In this paper, we propose a stochastic model of InSAR height measurement that accounts for the interferometric geometry. The model directly describes the relationship between baseline error and height measurement error. A simulation analysis using TanDEM-X parameters was then carried out to quantitatively evaluate the influence of baseline error on height measurement. Furthermore, a full emulation-based validation of the InSAR stochastic model was performed on the basis of the SRTM DEM and TanDEM-X parameters. The spatial distribution characteristics and error propagation rule of InSAR height measurement were fully evaluated.
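    For orientation, the way a perpendicular-baseline error maps into a height error can be written down from the usual first-order InSAR relations; the expressions below are a simplified, flat-terrain approximation (with p = 1 for bistatic, TanDEM-X-like acquisitions and p = 2 for monostatic repeat-pass ones) and are not necessarily the exact stochastic model of this paper:

        \[
        \varphi_{\mathrm{topo}} \approx -\frac{2\pi p}{\lambda}\,\frac{B_\perp}{R\sin\theta}\,h
        \quad\Longrightarrow\quad
        h = -\frac{\lambda R\sin\theta}{2\pi p\,B_\perp}\,\varphi_{\mathrm{topo}},
        \qquad
        \Delta h \approx \frac{\partial h}{\partial B_\perp}\,\Delta B_\perp
                 = -\frac{h}{B_\perp}\,\Delta B_\perp ,
        \]

    where \(\lambda\) is the wavelength, \(R\) the slant range, \(\theta\) the incidence angle, \(B_\perp\) the perpendicular baseline, and \(h\) the target height above the reference surface.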

  2. Baseline response rates affect resistance to change.

    PubMed

    Kuroda, Toshikazu; Cook, James E; Lattal, Kennon A

    2018-01-01

    The effect of response rates on resistance to change, measured as resistance to extinction, was examined in two experiments. In Experiment 1, responding in transition from a variable-ratio schedule and its yoked-interval counterpart to extinction was compared with pigeons. Following training on a multiple variable-ratio yoked-interval schedule of reinforcement, in which response rates were higher in the former component, reinforcement was removed from both components during a single extended extinction session. Resistance to extinction in the yoked-interval component was always either greater or equal to that in the variable-ratio component. In Experiment 2, resistance to extinction was compared for two groups of rats that exhibited either high or low response rates when maintained on identical variable-interval schedules. Resistance to extinction was greater for the lower-response-rate group. These results suggest that baseline response rate can contribute to resistance to change. Such effects, however, can only be revealed when baseline response rate and reinforcement rate are disentangled (Experiments 1 and 2) from the more usual circumstance where the two covary. Furthermore, they are more cleanly revealed when the programmed contingencies controlling high and low response rates are identical, as in Experiment 2. © 2017 Society for the Experimental Analysis of Behavior.

  3. Evaluation of Airframe Noise Reduction Concepts via Simulations Using a Lattice Boltzmann Approach

    NASA Technical Reports Server (NTRS)

    Fares, Ehab; Casalino, Damiano; Khorrami, Mehdi R.

    2015-01-01

    Unsteady computations are presented for a high-fidelity, 18% scale, semi-span Gulfstream aircraft model in landing configuration, i.e., flap deflected 39 degrees and main landing gear deployed. The simulations employ the lattice Boltzmann solver PowerFLOW® to simultaneously capture the flow physics and acoustics in the near field. Sound propagation to the far field is obtained using a Ffowcs Williams and Hawkings acoustic analogy approach. In addition to the baseline geometry, which was presented previously, various noise reduction concepts for the flap and main landing gear are simulated. In particular, care is taken to fully resolve the complex geometrical details associated with these concepts in order to capture the resulting intricate local flow field, thus enabling accurate prediction of their acoustic behavior. To determine aeroacoustic performance, the farfield noise predicted with the concepts applied is compared to high-fidelity simulations of the untreated baseline configurations. To assess the accuracy of the computed results, the aerodynamic and aeroacoustic impact of the noise reduction concepts is evaluated numerically and compared to experimental results for the same model. The trends and effectiveness of the simulated noise reduction concepts compare well with measured values and demonstrate that the computational approach is capable of capturing the primary effects of the acoustic treatment on a full aircraft model.

  4. An analysis on 45° sweep tail angle for blended wing body aircraft to the aerodynamics coefficients by wind tunnel experiment

    NASA Astrophysics Data System (ADS)

    Latif, M. Z. A. Abd; Ahmad, M. A.; Nasir, R. E. Mohd; Wisnoe, W.; Saad, M. R.

    2017-12-01

    This paper presents the analysis of a model of the UiTM Blended Wing Body (BWB) UAV, Baseline V, tested in the UPNM high-speed wind tunnel. Baseline V has a unique design due to the different NACA sections used for its fuselage, body, wing root, midwing, wingtip, tail root, and tail tip, and its tail is swept 45° backward. The purpose of this experiment is to study the aerodynamic characteristics when the tail is swept 45° backward. The experiments were conducted several times using a 71.5% scaled-down model at about 49.58 m/s airspeed, or 25 Hz. The tail deflection is fixed and set at zero angle. All the data obtained are analyzed and presented in terms of coefficient of lift, coefficient of drag and lift-to-drag ratio, and are plotted against various angles of attack. The angles of attack used in these experiments range from -10° to +30°. Blockage corrections such as solid blockage, wake blockage and streamline curvature blockage are calculated in order to obtain the true performance of the aircraft. From the observations, Baseline V tends to stall at around +15°. The maximum L/D ratio achieved for Baseline V is 20.8; however, it decreases slightly to 20.7 after blockage corrections.

  5. A Statistical Simulation Approach to Safe Life Fatigue Analysis of Redundant Metallic Components

    NASA Technical Reports Server (NTRS)

    Matthews, William T.; Neal, Donald M.

    1997-01-01

    This paper introduces a dual active load path fail-safe fatigue design concept analyzed by Monte Carlo simulation. The concept utilizes the inherent fatigue life differences between selected pairs of components for an active dual path system, enhanced by a stress level bias in one component. The design is applied to a baseline design: a safe-life fatigue problem studied in an American Helicopter Society (AHS) round robin. The dual active path design is compared with a two-element standby fail-safe system and the baseline design for life at specified reliability levels and weight. The sensitivity of life estimates for both the baseline and fail-safe designs was examined by considering normal and Weibull distribution laws and coefficient of variation levels. Results showed that the biased dual path system lifetimes, for both the first element failure and residual life, were much greater than for standby systems. The sensitivity of the residual life-weight relationship was not excessive at reliability levels up to R = 0.9999 and the weight penalty was small. The sensitivity of life estimates increases dramatically at higher reliability levels.
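    As a rough sketch of the kind of Monte Carlo comparison described (sampling component fatigue lives and reading the life at a target reliability off the empirical distribution), the snippet below uses invented Weibull parameters, stress-bias factor, and reliability level rather than the AHS round-robin values.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 200_000                                  # Monte Carlo samples

        # Illustrative Weibull fatigue lives (shape k, characteristic life eta in hours).
        k, eta = 4.0, 10_000.0
        life_a = eta * rng.weibull(k, n)             # path A at nominal stress
        life_b = 0.8 * eta * rng.weibull(k, n)       # path B biased to higher stress (shorter life)

        # Dual active path: the first element failure ends the unflawed life,
        # and the surviving element's failure ends the residual life.
        first_failure = np.minimum(life_a, life_b)
        residual_life = np.maximum(life_a, life_b)

        R = 0.9999                                   # target reliability
        print("life at R, first failure:", np.quantile(first_failure, 1 - R))
        print("life at R, residual:     ", np.quantile(residual_life, 1 - R))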

  6. Artificial Intelligence (AI) Based Tactical Guidance for Fighter Aircraft

    NASA Technical Reports Server (NTRS)

    McManus, John W.; Goodrich, Kenneth H.

    1990-01-01

    A research program investigating the use of Artificial Intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range (WVR) air combat engagements is discussed. The application of AI programming and problem solving methods in the development and implementation of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS), a second generation TDG, is presented. The Knowledge-Based Systems used by CLAWS to aid in the tactical decision-making process are outlined in detail, and the results of tests to evaluate the performance of CLAWS versus a baseline TDG, developed in FORTRAN to run in real time in the Langley Differential Maneuvering Simulator (DMS), are presented. To date, these test results have shown significant performance gains with respect to the TDG baseline in one-versus-one air combat engagements, and the AI-based TDG software has proven to be much easier to modify and maintain than the baseline FORTRAN TDG programs. Alternate computing environments and programming approaches, including the use of parallel algorithms and heterogeneous computer networks are discussed, and the design and performance of a prototype concurrent TDG system are presented.

  7. Terrain Portrayal for Head-Down Displays Experiment

    NASA Technical Reports Server (NTRS)

    Hughes, Monica F.; Takallu, M. A.

    2002-01-01

    The General Aviation Element of the Aviation Safety Program's Synthetic Vision Systems (SVS) Project is developing technology to eliminate low visibility induced General Aviation (GA) accidents. SVS displays present computer generated 3-dimensional imagery of the surrounding terrain on the Primary Flight Display (PFD) to greatly enhance pilots' situation awareness (SA), reducing or eliminating Controlled Flight into Terrain, as well as Low-Visibility Loss of Control accidents. SVS-conducted research is facilitating development of display concepts that provide the pilot with an unobstructed view of the outside terrain, regardless of weather conditions and time of day. A critical component of SVS displays is the appropriate presentation of terrain to the pilot. An experimental study has been conducted at NASA Langley Research Center (LaRC) to explore and quantify the relationship between the realism of the terrain presentation and resulting enhancements of pilot SA and pilot performance. Composed of complementary simulation and flight test efforts, Terrain Portrayal for Head-Down Displays (TP-HDD) experiments will help researchers evaluate critical terrain portrayal concepts. The experimental effort is to provide data to enable design trades that optimize SVS applications, as well as develop requirements and recommendations to facilitate the certification process. This paper focuses on the experimental set-up and preliminary qualitative results of the TP-HDD simulation experiment. In this experiment, a fixed-base flight simulator was equipped with various types of head-down flight displays, ranging from conventional round dials (typical of most GA aircraft) to glass-cockpit-style PFDs. The variations of the PFD included an assortment of texturing and Digital Elevation Model (DEM) resolution combinations. A test matrix of 10 terrain display configurations (in addition to the baseline displays) was evaluated by 27 pilots of various backgrounds and experience levels. Qualitative (questionnaires) and quantitative (pilot performance and physiological) data were collected during the experimental runs. Preliminary results indicate that all of the evaluation pilots favored SVS displays over standard gauges, in terms of terrain awareness, SA, and perceived pilot performance. Among the terrain portrayal concepts tested, most pilots preferred the higher-resolution DEM. In addition, with minimal training, low-hour VFR evaluation pilots were able to negotiate a precision approach using SVS displays with a tunnel-in-the-sky guidance concept.

  8. CVT/PCS phase 1 integrated testing

    NASA Technical Reports Server (NTRS)

    Mcbrayer, R. O.; Steadman, J. D.

    1973-01-01

    Five breadboard experiments representing three Sortie Lab experiment disciplines were installed in a payload carrier simulator. A description of the experiments and the payload carrier simulator was provided. An assessment of the experiment interface with the simulator and an assessment of the simulator experiment support systems were presented. The results indicate that a hardware integrator for each experiment is essential; a crew chief, or mission specialist, for systems management and experimenter liaison is a vital function; a payload specialist is a practical concept for experiment integration and operation; an integration fixture for a complex experiment is required to efficiently integrate the experiment and carrier; simultaneous experiment utilization of simulator systems caused unexpected problems in meeting individual experiment requirements; experimenter traffic inside the dual-floor simulator did not hamper experiment operations; and the requirement for zero-g operation will provide a significant design challenge for some experiments.

  9. Applied Time Domain Stability Margin Assessment for Nonlinear Time-Varying Systems

    NASA Technical Reports Server (NTRS)

    Kiefer, J. M.; Johnson, M. D.; Wall, J. H.; Dominguez, A.

    2016-01-01

    The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time varying nature of the dynamics of a launch vehicle in flight. An alternative technique for the evaluation of the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation. This technique was implemented by using the Stability Aerospace Vehicle Analysis Tool (SAVANT) computer simulation to evaluate the stability of the SLS system with the Adaptive Augmenting Control (AAC) active and inactive along its ascent trajectory. The gains for which the vehicle maintains apparent time-domain stability define the gain margins, and the time delay similarly defines the phase margin. This method of extracting the control stability margins from the time-domain simulation is relatively straightforward and the resultant margins can be compared to the linearized system results. The sections herein describe the techniques employed to extract the time-domain margins, compare the results between these nonlinear and the linear methods, and provide explanations for observed discrepancies. The SLS ascent trajectory was simulated with SAVANT and the classical linear stability margins were evaluated at one second intervals. The linear analysis was performed with the AAC algorithm disabled to attain baseline stability margins. At each time point, the system was linearized about the current operating point using Simulink's built-in solver. Each linearized system in time was evaluated for its rigid-body gain margin (high frequency gain margin), rigid-body phase margin, and aero gain margin (low frequency gain margin) for each control axis. Using the stability margins derived from the baseline linearization approach, the time domain derived stability margins were determined by executing time domain simulations in which axis-specific incremental gain and phase adjustments were made to the nominal system about the expected neutral stability point at specific flight times. The baseline stability margin time histories were used to shift the system gain to various values around the zero margin point such that a precise amount of expected gain margin was maintained throughout flight. When assessing the gain margins, the gain was applied starting at the time point under consideration, thereafter following the variation in the margin found in the linear analysis. When assessing the rigid-body phase margin, a constant time delay was applied to the system starting at the time point under consideration. If the baseline stability margins were correctly determined via the linear analysis, the time domain simulation results should contain unstable behavior at certain gain and phase values. Examples will be shown from repeated simulations with variable added gain and phase lag. Faithfulness of margins calculated from the linear analysis to the nonlinear system will be demonstrated.
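    The time-domain margin-extraction idea (add loop gain incrementally in the simulation until the response diverges, and read the last stable gain as the margin) can be illustrated on a toy plant. The sketch below is not SAVANT or the SLS model; it uses an invented third-order plant G(s) = 1/(s(s+1)(s+2)) under proportional control, whose analytic gain margin is K = 6, and flags instability when the tracking-error oscillation grows over the run.

        import numpy as np

        def growth_ratio(K, t_end=80.0, dt=0.001):
            """Unity-feedback step response of G(s) = 1/(s(s+1)(s+2)) with gain K
            (forward-Euler integration). Returns the ratio of the peak tracking
            error late in the run to the peak error early in the run; a ratio
            above 1 indicates a growing (unstable) oscillation."""
            A = np.array([[0.0, 1.0, 0.0],
                          [0.0, 0.0, 1.0],
                          [0.0, -2.0, -3.0]])
            x = np.zeros(3)
            n = int(t_end / dt)
            err = np.empty(n)
            for i in range(n):
                y = x[0]
                u = K * (1.0 - y)                     # proportional control toward setpoint 1
                x = x + dt * (A @ x + np.array([0.0, 0.0, u]))
                err[i] = abs(1.0 - y)
            early, late = err[: n // 5], err[-(n // 5):]
            return late.max() / early.max()

        # Sweep the added gain; the crossover from stable to unstable brackets K = 6.
        for K in [2.0, 4.0, 5.5, 6.5, 8.0]:
            r = growth_ratio(K)
            print(f"K = {K:4.1f}  growth ratio = {r:8.2f} ->", "unstable" if r > 1.0 else "stable")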

  10. A robust estimation of the effects of motorcycle autonomous emergency braking (MAEB) based on in-depth crashes in Australia.

    PubMed

    Savino, Giovanni; Mackenzie, Jamie; Allen, Trevor; Baldock, Matthew; Brown, Julie; Fitzharris, Michael

    2016-09-01

    Autonomous emergency braking (AEB) is a safety system that detects imminent forward collisions and reacts by slowing down the host vehicle without any action from the driver. AEB effectiveness in avoiding and mitigating real-world crashes has recently been demonstrated. Research suggests that a translation of AEB to powered 2-wheelers could also be beneficial. Previous studies have estimated the effects of a motorcycle AEB system (MAEB) via computer simulations. Though effects of MAEB were computed for motorcycle crashes derived from in-depth crash investigation, there may be some inaccuracies due to limitations of postcrash investigation (e.g., inaccuracies in preimpact velocity of the motorcycle). Furthermore, ideal MAEB technology was assumed, which may lead to overestimation of the benefits. This study sought to evaluate the sensitivity of the simulations to variations in reconstructed crash cases and the capacity of the MAEB system in order to provide a more robust estimation of MAEB effects. First, a comprehensive classification of accidents was used to identify scenarios in which MAEB was likely to apply, and representative crash cases from those available for this study were populated for each crash scenario. Second, 100 variant cases were generated by randomly varying a set of simulation parameters with given normal distributions around the baseline values. Variants reflected uncertainties in the original data. Third, the effects of MAEB were estimated in terms of the difference in the impact speed of the host motorcycle with and without the system via computer simulations of each variant case. Simulations were repeated assuming both an idealized and a realistic MAEB system. For each crash case, the results in the baseline case and in the variants were compared. A total of 36 crash cases representing 11 common crash scenarios were selected from 3 Australian in-depth data sets: 12 cases from New South Wales, 13 cases from Victoria, and 11 cases from South Australia. The reduction in impact speed elicited by MAEB in the baseline cases ranged from 2.8 to 10.0 km/h. The baseline cases over- or underestimated the mean impact speed reduction of the variant cases by up to 20%. Constraints imposed by simulating more realistic capabilities for an MAEB system produced a decrease in the estimated impact speed reduction of up to 14% (mean 5%) compared to an idealized system. The small difference between the baseline and variant case results demonstrates that the potential effects of MAEB computed from the cases described in in-depth crash reports are typically a good approximation, despite limitations of postcrash investigation. Furthermore, given that MAEB intervenes very close to the point of impact, limitations of the currently available technologies were not found to have a dramatic influence on the effects of the system.
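    The variant-generation idea (perturbing the reconstructed pre-crash values with normal distributions around the baseline and re-estimating the impact-speed reduction) can be sketched with a toy constant-deceleration model. All parameter values below (speeds, trigger time, deceleration, spreads) are invented placeholders and the kinematics are deliberately simplistic; this is not the authors' reconstruction tool.

        import numpy as np

        def impact_speed_with_maeb(v0, t_trigger, decel):
            """Impact speed [m/s] if MAEB applies a constant deceleration during the
            t_trigger seconds between activation and impact (toy kinematics)."""
            return max(v0 - decel * t_trigger, 0.0)

        rng = np.random.default_rng(3)

        # Baseline reconstructed values for one hypothetical crash case
        v0_base, t_trig_base, decel = 15.0, 0.4, 5.0      # m/s, s, m/s^2
        reduction_base = v0_base - impact_speed_with_maeb(v0_base, t_trig_base, decel)

        # 100 variants: perturb pre-impact speed and trigger time around the baseline
        v0 = rng.normal(v0_base, 1.5, 100)
        t_trig = rng.normal(t_trig_base, 0.05, 100)
        reductions = v0 - np.array([impact_speed_with_maeb(v, t, decel)
                                    for v, t in zip(v0, t_trig)])

        print("baseline reduction [m/s]:", reduction_base)
        print("variant mean / sd  [m/s]:", reductions.mean(), reductions.std())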

  11. Interior renovation of a general practitioner office leads to a perceptual bias on patient experience for over one year.

    PubMed

    Gauthey, Jérôme; Tièche, Raphaël; Streit, Sven

    2018-01-01

    Measuring patient experience is key when assessing quality of care but can be biased: a perceptual bias occurs when renovation of the interior design of a general practitioner (GP) office improves how patients assess quality of care. The aim was to assess the duration of this perceptual bias and whether it could be reproduced after a second renovation. A GP office with 2 GPs in Switzerland was renovated twice within 3 years. We assessed patient experience at baseline, 2 months and 14 months after the first renovation, and 3 months after the second renovation. Each time, we invited a sample of 180 consecutive patients who anonymously graded patient experience in 4 domains: appearance of the office; qualities of medical assistants and GPs; and general satisfaction. We compared crude mean scores per domain from baseline until follow-up. In a multivariate model, we adjusted for patient's age, gender and for how long patients had been with their GP. At baseline, patients' mean age was 60.9 (SD 17.7) years and 52% were female. After the first renovation, we found a regression to the baseline level of patient experience after 14 months except for appearance of the office (p<0.001). After the second renovation, patient experience improved again in appearance of the office (p = 0.008), qualities of the GP (p = 0.008), and general satisfaction (p = 0.014). Qualities of the medical assistant showed a slight improvement (p = 0.068). Results were unchanged in the multivariate model. Interior renovation of a GP office probably causes a perceptual bias for >1 year that improves how patients rate quality of care. This bias could be reproduced after a second renovation, strengthening a possible causal relationship. These findings imply that measurement of patient experience should be timed at least one year after interior renovation of a GP practice so that environmental changes do not influence the estimates.

  12. Interior renovation of a general practitioner office leads to a perceptual bias on patient experience for over one year

    PubMed Central

    2018-01-01

    Introduction Measuring patient experience is key when assessing quality of care but can be biased: a perceptual bias occurs when renovation of the interior design of a general practitioner (GP) office improves how patients assess quality of care. The aim was to assess the duration of this perceptual bias and whether it could be reproduced after a second renovation. Methods A GP office with 2 GPs in Switzerland was renovated twice within 3 years. We assessed patient experience at baseline, 2 months and 14 months after the first renovation, and 3 months after the second renovation. Each time, we invited a sample of 180 consecutive patients who anonymously graded patient experience in 4 domains: appearance of the office; qualities of medical assistants and GPs; and general satisfaction. We compared crude mean scores per domain from baseline until follow-up. In a multivariate model, we adjusted for patient's age, gender and for how long patients had been with their GP. Results At baseline, patients' mean age was 60.9 (SD 17.7) years and 52% were female. After the first renovation, we found a regression to the baseline level of patient experience after 14 months except for appearance of the office (p<0.001). After the second renovation, patient experience improved again in appearance of the office (p = 0.008), qualities of the GP (p = 0.008), and general satisfaction (p = 0.014). Qualities of the medical assistant showed a slight improvement (p = 0.068). Results were unchanged in the multivariate model. Conclusions Interior renovation of a GP office probably causes a perceptual bias for >1 year that improves how patients rate quality of care. This bias could be reproduced after a second renovation, strengthening a possible causal relationship. These findings imply that measurement of patient experience should be timed at least one year after interior renovation of a GP practice so that environmental changes do not influence the estimates. PMID:29462196

  13. Life Support Baseline Values and Assumptions Document

    NASA Technical Reports Server (NTRS)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  14. Life Support Baseline Values and Assumptions Document

    NASA Technical Reports Server (NTRS)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.; Wagner, Sandra A.

    2015-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. With the ability to accurately compare different technologies' performance for the same function, managers will be able to make better decisions regarding technology development.

  15. MCC level C formulation requirements. Shuttle TAEM guidance and flight control, STS-1 baseline

    NASA Technical Reports Server (NTRS)

    Carman, G. L.; Montez, M. N.

    1980-01-01

    The TAEM guidance and body rotational dynamics models required for the MCC simulation of the TAEM mission phase are defined. This simulation begins at the end of the entry phase and terminates at TAEM autoland interface. The logic presented is the required configuration for the first shuttle orbital flight (STS-1). The TAEM guidance is simulated in detail. The rotational dynamics simulation is a simplified model that assumes that the commanded rotational rates can be achieved in the integration interval. Thus, the rotational dynamics simulation is essentially a simulation of the autopilot commanded rates and integration of these rates to determine orbiter attitude. The rotational dynamics simulation also includes a simulation of the speedbrake deflection. The body flap and elevon deflections are computed in the orbiter aerodynamic simulation.

  16. Projected climate change impacts on winter recreation in the ...

    EPA Pesticide Factsheets

    A physically-based water and energy balance model is used to simulate natural snow accumulation at 247 winter recreation locations across the continental United States. We combine this model with projections of snowmaking conditions to determine downhill skiing, cross-country skiing, and snowmobiling season lengths under baseline and future climates, using data from five climate models and two emissions scenarios. The present-day simulations from the snow model without snowmaking are validated with observations of snow-water-equivalent from snow monitoring sites. Projected season lengths are combined with baseline estimates of winter recreation activity to monetize impacts to the selected winter recreation activity categories for the years 2050 and 2090. The goal is to estimate the physical and economic impact of climate change on winter recreation in the contiguous U.S.

  17. A simulator evaluation of an automatic terminal approach system

    NASA Technical Reports Server (NTRS)

    Hinton, D. A.

    1983-01-01

    The automatic terminal approach system (ATAS) is a concept for improving the pilot/machine interface with cockpit automation. The ATAS can automatically fly a published instrument approach by using stored instrument approach data to automatically tune airplane avionics, control the airplane's autopilot, and display status information to the pilot. A piloted simulation study was conducted to determine the feasibility of an ATAS, determine pilot acceptance, and examine pilot/ATAS interaction. Seven instrument-rated pilots each flew four instrument approaches with a baseline heading-select autopilot mode. The ATAS runs resulted in lower flight technical error, lower pilot workload, and fewer blunders than with the baseline autopilot. The ATAS status display enabled the pilots to maintain situational awareness during the automatic approaches. The system was well accepted by the pilots.

  18. Dispersion analysis for baseline reference mission 1. [flight simulation and trajectory analysis for space shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Kuhn, A. E.

    1975-01-01

    A dispersion analysis considering 3 sigma uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for the baseline reference mission (BRM) 1 of the space shuttle orbiter. The dispersion analysis is based on the nominal trajectory for the BRM 1. State vector and performance dispersions (or variations) which result from the indicated 3 sigma uncertainties were studied. The dispersions were determined at major mission events and fixed times from lift-off (time slices) and the results will be used to evaluate the capability of the vehicle to perform the mission within a 3 sigma level of confidence and to determine flight performance reserves. A computer program is given that was used for dynamic flight simulations of the space shuttle orbiter.

  19. Nonlinear Dynamic Inversion Baseline Control Law: Architecture and Performance Predictions

    NASA Technical Reports Server (NTRS)

    Miller, Christopher J.

    2011-01-01

    A model reference dynamic inversion control law has been developed to provide a baseline control law for research into adaptive elements and other advanced flight control law components. This controller has been implemented and tested in a hardware-in-the-loop simulation; the simulation results show excellent handling qualities throughout the limited flight envelope. A simple angular momentum formulation was chosen because it can be included in the stability proofs for many basic adaptive theories, such as model reference adaptive control. Many design choices and implementation details reflect the requirements placed on the system by the nonlinear flight environment and the desire to keep the system as basic as possible to simplify the addition of the adaptive elements. Those design choices are explained, along with their predicted impact on the handling qualities.

  20. Skin hydration analysis by experiment and computer simulations and its implications for diapered skin.

    PubMed

    Saadatmand, M; Stone, K J; Vega, V N; Felter, S; Ventura, S; Kasting, G; Jaworska, J

    2017-11-01

    Experimental work on skin hydration is technologically challenging, and mostly limited to observations where environmental conditions are constant. In some cases, like diapered baby skin, such work is practically unfeasible, yet it is important to understand potential effects of diapering on skin condition. To partly overcome this challenge, we developed a computer simulation model of reversible transient skin hydration effects. The skin hydration model of Li et al. (Chem Eng Sci, 138, 2015, 164) was further developed to simulate transient exposure conditions in which relative humidity (RH), wind velocity, and air and skin temperature can be any function of time. Computer simulations of evaporative water loss (EWL) decay after different occlusion times were compared with experimental data to calibrate the model. Next, we used the model to investigate EWL and SC thickness in different diapering scenarios. Key results from the experimental work were: (1) for occlusions longer than 30 minutes, the amount of water absorbed under RH = 100% and under free water is almost the same; (2) longer occlusion times result in higher water absorption by the SC. The EWL decay and skin water content predictions were in agreement with experimental data. Simulations also revealed that skin under occlusion hydrates mainly because the outflux is blocked, not because it absorbs water from the environment. Further, simulations demonstrated that hydration level is sensitive to time, RH and/or free water on skin. In simulated diapering scenarios, skin maintained hydration content very close to the baseline conditions without a diaper for the entire duration of a 24-hour period. Different diapers/diaper technologies are known to have different profiles in terms of their ability to provide wetness protection, which can result in consumer-noticeable differences in wetness. Simulation results based on published literature using data from a number of different diapers suggest that diapered skin hydrates within ranges considered reversible. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Use of an Online Game to Evaluate Health Professions Students' Attitudes toward People in Poverty.

    PubMed

    Richey Smith, Carriann E; Ryder, Priscilla; Bilodeau, Ann; Schultz, Michele

    2016-10-25

    Objective. To determine baseline attitudes of pharmacy, physician assistant studies, and communication science and disorders students toward people in poverty and to examine the effectiveness of using the online poverty simulation game SPENT to affect these attitudes. Methods. Students completed pre/postassessments using the validated Undergraduate Perceptions of Poverty Tracking Survey (UPPTS). Students played the online, open access, SPENT game alone and/or in pairs in a 50-minute class. Results. Significant improvements in scale scores were seen in students after playing SPENT. Quartile results by prescore indicated that students with the lowest attitudes towards patients in poverty improved the most. Results suggested that most students found the experience worthwhile for themselves and/or for their classmates. Conclusions. The results of this study suggest SPENT may improve perspectives of undergraduate pharmacy and other health professions students.

  2. Flow and Noise from Septa Nozzles

    NASA Technical Reports Server (NTRS)

    Zaman, K. B. M. Q.; Bridges, J. E.

    2017-01-01

    Flow and noise fields are explored for the concept of distributed propulsion. A model-scale experiment is performed with an 8:1 aspect ratio rectangular nozzle that is divided into six passages by five septa. The septa geometries are created by placing plastic inserts within the nozzle. It is found that the noise radiation from the septa nozzle can be significantly lower than that from the baseline rectangular nozzle. The reduction of noise is inferred to be due to the introduction of streamwise vortices in the flow. The streamwise vortices are produced by secondary flow within each passage. Thus, the geometry of the internal passages of the septa nozzle can have a large influence. The flow evolution is profoundly affected by slight changes in the geometry. These conclusions are reached by mostly experimental results of the flowfield aided by brief numerical simulations.

  3. Very long baseline interferometry using a communication satellite

    NASA Technical Reports Server (NTRS)

    Swenson, G. W., Jr.

    1975-01-01

    A planned experiment is discussed in long-baseline interferometry, using the Communications Technology Satellite to transmit the base-band signal from one telescope to another for real-time correlation. A 20 megabit data rate is planned, calling for a delay-line of 10 MHz bandwidth and controllable delay up to 275 milliseconds. A number of sources will be studied on baselines from Ontario to West Virginia and California.

  4. SoLid Detector Technology

    NASA Astrophysics Data System (ADS)

    Labare, Mathieu

    2017-09-01

    SoLid is a reactor anti-neutrino experiment in which a novel detector is deployed at a minimum distance of 5.5 m from a nuclear reactor core. The purpose of the experiment is three-fold: to search for neutrino oscillations at a very short baseline; to measure the pure 235U neutrino energy spectrum; and to demonstrate the feasibility of neutrino detectors for reactor monitoring. This report presents the unique features of the SoLid detector technology. The technology has been optimised for a high-background environment resulting from low overburden and the vicinity of a nuclear reactor. The versatility of the detector technology is demonstrated with a 288 kg detector prototype which was deployed at the BR2 nuclear reactor in 2015. The data presented include reactor-on, reactor-off, and calibration measurements. The measurement results are compared with Monte Carlo simulations. The 1.6 t SoLid detector is currently under construction, with an optimised design and upgraded material technology to enhance the detector capabilities. Its deployment on site is planned for the beginning of 2017 and offers the prospect of resolving the reactor anomaly within about two years.

  5. Chaotropic salts in liquid chromatographic method development for the determination of pramipexole and its impurities following quality-by-design principles.

    PubMed

    Vemić, Ana; Rakić, Tijana; Malenović, Anđelija; Medenica, Mirjana

    2015-01-01

    The aim of this paper is to present the development of a liquid chromatographic method in which chaotropic salts are used as mobile phase additives, following QbD principles. The effect of critical process parameters (column chemistry, salt nature and concentration, acetonitrile content and column temperature) on the critical quality attributes (retention of the first and last eluting peak and separation of the critical peak pairs) was studied applying the design of experiments-design space (DoE-DS) methodology. A D-optimal design was chosen in order to simultaneously examine both categorical and numerical factors in a minimal number of experiments. Two approaches to achieving quality assurance were applied and compared: the uncertainty originating from the models was assessed by Monte Carlo simulations, either propagating an error equal to the variance of the model residuals or propagating the error originating from the calculation of the model coefficients. Baseline separation of pramipexole and its five impurities was achieved, fulfilling all the required criteria, and method validation proved the method's reliability. Copyright © 2014 Elsevier B.V. All rights reserved.
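    One of the two uncertainty-propagation routes mentioned above, resampling the fitted model coefficients and pushing them through the response model, can be sketched as follows. The quadratic-with-interaction model form, coefficient values, and standard errors are invented placeholders, not the published DoE model.

        import numpy as np

        rng = np.random.default_rng(11)

        # Illustrative fitted model for a critical quality attribute (e.g., retention of
        # the last eluting peak) as a function of two coded factors (assumed form):
        # x1 = acetonitrile content, x2 = chaotropic salt concentration.
        beta_hat = np.array([12.0, -2.5, 1.8, 0.6])        # intercept, x1, x2, x1*x2
        beta_se = np.array([0.30, 0.15, 0.15, 0.10])       # coefficient standard errors

        def predict(beta, x1, x2):
            return beta[0] + beta[1] * x1 + beta[2] * x2 + beta[3] * x1 * x2

        # Monte Carlo propagation of coefficient uncertainty at a candidate operating point
        betas = rng.normal(beta_hat, beta_se, size=(10_000, 4))
        preds = np.array([predict(b, x1=0.5, x2=-0.3) for b in betas])
        print("95% prediction band:", np.percentile(preds, [2.5, 97.5]))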

  6. Laser plasma interaction in rugby-shaped hohlraums

    NASA Astrophysics Data System (ADS)

    Masson-Laborde, P.-E.; Philippe, F.; Tassin, V.; Monteil, M.-C.; Gauthier, P.; Casner, A.; Depierreux, S.; Seytor, P.; Teychenne, D.; Loiseau, P.; Freymerie, P.

    2014-10-01

    The rugby-shaped hohlraum has proven to give high performance compared to a classical cylinder hohlraum of similar diameter. Due to this performance, this hohlraum has been chosen as the baseline ignition target for the Laser MegaJoule (LMJ). Many experiments have therefore been performed in recent years on the Omega laser facility in order to study the rugby hohlraum in detail. In this talk, we will discuss the interpretation of these experiments from the point of view of the laser plasma instability problem. Experimental comparisons have been made between rugby, cylinder, and elliptically shaped rugby hohlraums, and we will discuss how the geometry differences affect the evolution of laser plasma instabilities (LPI). The efficiency of laser smoothing techniques against these instabilities will also be discussed, as well as the effect of gas fill. The experimental results will be compared with FCI2 hydroradiative calculations and linear postprocessing with Piranah. Experimental Raman and Brillouin spectra, from which we can infer the location of the parametric instabilities, will be compared to simulated ones, making it possible to compare LPI between the different hohlraum geometries.

  7. Uranium (VI) solubility in carbonate-free ERDA-6 brine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucchini, Jean-francois; Khaing, Hnin; Reed, Donald T

    2010-01-01

    When present, uranium is usually an element of importance in a nuclear waste repository. In the Waste Isolation Pilot Plant (WIPP), uranium is the most prevalent actinide component by mass, with about 647 metric tons to be placed in the repository. Therefore, the chemistry of uranium, and especially its solubility in the WIPP conditions, needs to be well determined. Long-term experiments were performed to measure the solubility of uranium (VI) in carbonate-free ERDA-6 brine, a simulated WIPP brine, at pC_H+ values between 8 and 12.5. These data, obtained from the over-saturation approach, were the first repository-relevant data for the VI actinide oxidation state. The solubility trends observed pointed towards low uranium solubility in WIPP brines and a lack of amphotericity. At the expected pC_H+ in the WIPP (≈ 9.5), measured uranium solubility approached 10⁻⁷ M. The objective of these experiments was to establish a baseline solubility to further investigate the effects of carbonate complexation on uranium solubility in WIPP brines.

  8. Interferometry of chemically peculiar stars: theoretical predictions versus modern observing facilities

    NASA Astrophysics Data System (ADS)

    Shulyak, D.; Paladini, C.; Causi, G. Li; Perraut, K.; Kochukhov, O.

    2014-09-01

    By means of numerical experiments we explore the application of interferometry to the detection and characterization of abundance spots in chemically peculiar (CP) stars, using the brightest star ε UMa as a case study. We find that the best spectral regions to search for spots and stellar rotation signatures are in the visual domain. The spots can clearly be detected already in the first visibility lobe and their signatures can be uniquely disentangled from that of rotation. The spots and rotation signatures can also be detected in the near-infrared at low spectral resolution, but baselines longer than 180 m are needed for all potential CP candidates. According to our simulations, an instrument like VEGA (or its successor, e.g., the Fibered and spectrally Resolved Interferometric Equipment New Design) should be able to detect, in the visual, the effect of spots and spots+rotation, provided that the instrument is able to measure V² ≈ 10⁻³ and/or closure phase. In the infrared, an instrument like AMBER but with longer baselines than the ones available so far would be able to measure rotation and spots. Our study provides necessary details about strategies of spot detection and the requirements for modern and planned interferometric facilities essential for CP star research.

  9. Tracking and Analyzing Individual Distress Following Terrorist Attacks Using Social Media Streams.

    PubMed

    Lin, Yu-Ru; Margolin, Drew; Wen, Xidao

    2017-08-01

    Risk research has theorized a number of mechanisms that might trigger, prolong, or potentially alleviate individuals' distress following terrorist attacks. These mechanisms are difficult to examine in a single study, however, because the social conditions of terrorist attacks are difficult to simulate in laboratory experiments and appropriate preattack baselines are difficult to establish with surveys. To address this challenge, we propose the use of computational focus groups and a novel analysis framework to analyze a social media stream that archives user history and location. The approach uses time-stamped behavior to quantify an individual's preattack behavior after an attack has occurred, enabling the assessment of time-specific changes in the intensity and duration of an individual's distress, as well as the assessment of individual and social-level covariates. To exemplify the methodology, we collected over 18 million tweets from 15,509 users located in Paris on November 13, 2015, and measured the degree to which they expressed anxiety, anger, and sadness after the attacks. The analysis resulted in findings that would be difficult to observe through other methods, such as that news media exposure had competing, time-dependent effects on anxiety, and that gender dynamics are complicated by baseline behavior. Opportunities for integrating computational focus group analysis with traditional methods are discussed. © 2017 Society for Risk Analysis.

  10. Proceedings of the Annual US Army Operations Research Symposium (AORS) (28th) Held in Fort Lee, Virginia on 10-12 October 1989. Volume 1

    DTIC Science & Technology

    1989-12-07

    Quantities in the Baseline vehicle fleet are shown as either active, reserve or rurOMue vehicles. Production costs include vehicle rebuys. That is...advancing force encounters a minefield. The simulation is a hybrid time-step/event simulation: it uses a nominal thirty-second time step through

  11. Aeroacoustic and Performance Simulations of a Test Scale Open Rotor

    NASA Technical Reports Server (NTRS)

    Claus, Russell W.

    2013-01-01

    This paper explores a comparison between experimental data and numerical simulations of the historical baseline F31/A31 open rotor geometry. The experimental data were obtained at the NASA Glenn Research Center's aeroacoustic facility and include performance and noise information for a variety of flow speeds (matching take-off and cruise). The numerical simulations provide both performance and aeroacoustic results using NUMECA's FINE/Turbo analysis code. A non-linear harmonic method is used to capture the rotor/rotor interaction.

  12. Reliance on Simulation in Initial Entry Rifle Marksmanship Training and Future Directions for Simulation

    DTIC Science & Technology

    2016-11-01

    Engagement Simulation Training, and a day of dry-fire. The comparison was conducted during training with iron sights. On the two criterion measures, the...other five days of training consisted of two days of Engagement Skills Trainer (EST) 2000 training, one day of dry-fire, and two days of live-fire...0 / RM1 Preliminary Marksmanship Training Same as Baseline 1 / RM2 EST 2000 (grouping/zeroing) Test-D Drills 2 / RM3 Dry-Fire Training 25m Live-Fire

  13. Piecewise exponential models to assess the influence of job-specific experience on the hazard of acute injury for hourly factory workers

    PubMed Central

    2013-01-01

    Background An inverse relationship between experience and risk of injury has been observed in many occupations. Due to statistical challenges, however, it has been difficult to characterize the role of experience in the hazard of injury. In particular, because the time observed up to injury is equivalent to the amount of experience accumulated, the baseline hazard of injury becomes the main parameter of interest, ruling out Cox proportional hazards models as applicable methods. Methods Using a data set of 81,301 hourly production workers of a global aluminum company at 207 US facilities, we compared competing parametric models for the baseline hazard to assess whether experience affected the hazard of injury at hire and after later job changes. Specific models considered included the exponential, the Weibull, and two two-piece exponential models (one hypothesis-driven and one data-driven) to formally test the null hypothesis that experience does not affect the hazard of injury. Results We highlighted the advantages of our comparative approach and the interpretability of our selected model: a two-piece exponential model that allowed the baseline hazard of injury to change with experience. Our findings suggested a 30% increase in the hazard in the first year after job initiation and/or change. Conclusions Piecewise exponential models may be particularly useful for modeling risk of injury as a function of experience and have the additional benefit of interpretability over other similarly flexible models. PMID:23841648
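    Because the hazard is constant within each experience interval, a two-piece exponential model of this kind can be fit by maximum likelihood with each piece's rate equal to its event count divided by its accumulated person-time. The sketch below illustrates that calculation with a hypothetical one-year changepoint and made-up records; it is not the study's code or data.

```python
# Minimal sketch of a two-piece exponential (piecewise constant hazard) fit.
# Changepoint, record format, and data are illustrative, not taken from the study.
def fit_two_piece_exponential(records, changepoint=1.0):
    """records: iterable of (time to injury or censoring in years, event indicator).
    Returns (lambda_early, lambda_late): MLE hazard rates before/after the changepoint."""
    exposure = [0.0, 0.0]   # person-years accumulated in each piece
    events = [0, 0]         # observed injuries in each piece
    for t, injured in records:
        exposure[0] += min(t, changepoint)
        if t <= changepoint:
            events[0] += injured
        else:
            exposure[1] += t - changepoint
            events[1] += injured
    return events[0] / exposure[0], events[1] / exposure[1]

# Example: hazard ratio comparing the first year in a job to later years (toy data).
data = [(0.3, 1), (0.8, 0), (1.5, 1), (2.0, 0), (3.2, 1), (0.5, 1), (4.0, 0)]
lam_early, lam_late = fit_two_piece_exponential(data)
print(f"hazard ratio (first year vs later): {lam_early / lam_late:.2f}")
```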

  14. A Theoretical and Empirical Integrated Method to Select the Optimal Combined Signals for Geometry-Free and Geometry-Based Three-Carrier Ambiguity Resolution.

    PubMed

    Zhao, Dongsheng; Roberts, Gethin Wyn; Lau, Lawrence; Hancock, Craig M; Bai, Ruibin

    2016-11-16

    Twelve GPS Block IIF satellites in the current constellation can transmit signals on three frequencies (L1, L2, L5). Taking advantage of these signals, Three-Carrier Ambiguity Resolution (TCAR) is expected to bring substantial benefits for ambiguity resolution. One active research area is finding the optimal combined signals for better ambiguity resolution in geometry-free (GF) and geometry-based (GB) modes. However, existing studies select the signals through either pure theoretical analysis or testing with simulated data, which might be biased because real observation conditions can differ from theoretical predictions or simulations. In this paper, we propose a theoretical and empirical integrated method, which first selects the possible optimal combined signals in theory and then refines these signals with real triple-frequency GPS data observed at eleven baselines of different lengths. An interpolation technique is also adopted to show how AR performance changes with baseline length. The results show that the AR success rate can be improved by 3% in GF mode and 8% in GB mode over certain intervals of baseline length. Therefore, TCAR can perform better by adopting the combined signals proposed in this paper when the baseline meets the length condition.
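    The combined signals considered in TCAR are integer linear combinations of the L1, L2, and L5 carrier phases, and each combination (i, j, k) has a combined wavelength and ionospheric amplification that largely determine its usefulness for ambiguity resolution. The sketch below computes these standard quantities for a few classic lane combinations; it is a generic GNSS calculation, not the selection procedure proposed in the paper.

```python
# Generic GNSS sketch (not the paper's method): wavelength and first-order
# ionospheric amplification of triple-frequency carrier-phase combinations.
C = 299_792_458.0                                  # speed of light, m/s
F1, F2, F5 = 1575.42e6, 1227.60e6, 1176.45e6       # GPS L1, L2, L5 carrier frequencies, Hz

def combination_properties(i, j, k):
    """Properties of the phase combination i*L1 + j*L2 + k*L5 (integer coefficients)."""
    f_c = i * F1 + j * F2 + k * F5                 # combined frequency, Hz
    wavelength = C / f_c                           # combined wavelength, m
    # first-order ionospheric delay of the combination, in units of the L1 delay
    iono_scale = F1**2 * (i / F1 + j / F2 + k / F5) / f_c
    return wavelength, iono_scale

for combo in [(0, 1, -1), (1, -1, 0), (1, 0, -1)]:  # extra-wide, wide, and medium lanes
    lam, q = combination_properties(*combo)
    print(f"(i,j,k)={combo}: wavelength = {lam:.3f} m, ionosphere scale vs L1 = {q:+.2f}")
```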

  15. A Theoretical and Empirical Integrated Method to Select the Optimal Combined Signals for Geometry-Free and Geometry-Based Three-Carrier Ambiguity Resolution

    PubMed Central

    Zhao, Dongsheng; Roberts, Gethin Wyn; Lau, Lawrence; Hancock, Craig M.; Bai, Ruibin

    2016-01-01

    Twelve GPS Block IIF satellites in the current constellation can transmit signals on three frequencies (L1, L2, L5). Taking advantage of these signals, Three-Carrier Ambiguity Resolution (TCAR) is expected to bring substantial benefits for ambiguity resolution. One active research area is finding the optimal combined signals for better ambiguity resolution in geometry-free (GF) and geometry-based (GB) modes. However, existing studies select the signals through either pure theoretical analysis or testing with simulated data, which might be biased because real observation conditions can differ from theoretical predictions or simulations. In this paper, we propose a theoretical and empirical integrated method, which first selects the possible optimal combined signals in theory and then refines these signals with real triple-frequency GPS data observed at eleven baselines of different lengths. An interpolation technique is also adopted to show how AR performance changes with baseline length. The results show that the AR success rate can be improved by 3% in GF mode and 8% in GB mode over certain intervals of baseline length. Therefore, TCAR can perform better by adopting the combined signals proposed in this paper when the baseline meets the length condition. PMID:27854324

  16. Future prospects for measurements of mass hierarchy and CP violation

    NASA Astrophysics Data System (ADS)

    Lang, Karol

    2015-03-01

    We present a brief overview of current plans to pursue two challenging goals: resolution of the neutrino mass hierarchy and determination of the CP phase of the PMNS neutrino mixing matrix. Future prospects include large atmospheric experiments (PINGU, ORCA, and INO-ICAL), medium-baseline reactor experiments (JUNO and RENO-50), and long-baseline accelerator experiments (LBNE, LBNO, and Hyper-Kamiokande). There are also new initiatives emerging: ESSνSB at the European Spallation Source and CHIPS in the NuMI neutrino beam. This is a multifaceted, vigorous, and technically difficult world-wide program. It will likely take more than a decade to start reaping its benefits.

  17. Sterile Neutrinos in Cold Climates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Benjamin J.P.

    Measurements of neutrino oscillations at short baselines contain an intriguing set of experimental anomalies that may be suggestive of new physics such as the existence of sterile neutrinos. This three-part thesis presents research directed towards understanding these anomalies and searching for sterile neutrino oscillations. Part I contains a theoretical discussion of neutrino coherence properties. The open-quantum-system picture of neutrino beams, which allows a rigorous prediction of coherence distances for accelerator neutrinos, is presented. The validity of the standard treatment of active and sterile neutrino oscillations at short baselines is verified, and non-standard coherence-loss effects at longer baselines are predicted. Part II concerns liquid argon detector development for the MicroBooNE experiment, which will search for short-baseline oscillations in the Booster Neutrino Beam at Fermilab. Topics include characterization and installation of the MicroBooNE optical system; test-stand measurements of liquid argon optical properties with dissolved impurities; optimization of wavelength-shifting coatings for liquid argon scintillation light detection; testing and deployment of high-voltage surge arrestors to protect TPC field cages; and software development for optical and TPC simulation and reconstruction. Part III presents a search for sterile neutrinos using the IceCube neutrino telescope, which has collected a large sample of atmospheric-neutrino-induced events in the 1-10 TeV energy range. Sterile neutrinos would modify the detected neutrino flux shape via MSW-resonant oscillations. Following a careful treatment of systematic uncertainties in the sample, no evidence for MSW-resonant oscillations is observed, and exclusion limits on 3+1 model parameter space are derived. Under the mixing assumptions made, the 90% confidence level exclusion limit extends to sin²2θ₂₄ ≤ 0.02 at Δm² ~ 0.3 eV², and the LSND and MiniBooNE allowed regions are excluded at >99% confidence level.
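    For a rough sense of the scale probed by an exclusion at sin²2θ₂₄ ≈ 0.02 and Δm² ≈ 0.3 eV², the textbook two-flavor vacuum disappearance probability can be evaluated directly; the sketch below is only for orientation and ignores the MSW matter effects on which the thesis's full 3+1 analysis relies.

```python
# Textbook two-flavor vacuum oscillation sketch (the actual analysis uses a full
# 3+1 treatment with MSW matter effects; this is only for orientation).
import math

def survival_probability(sin2_2theta, dm2_ev2, length_km, energy_gev):
    """P(nu -> nu) = 1 - sin^2(2*theta) * sin^2(1.27 * dm^2[eV^2] * L[km] / E[GeV])."""
    phase = 1.27 * dm2_ev2 * length_km / energy_gev
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Atmospheric neutrino crossing roughly the Earth's diameter at TeV energies.
p = survival_probability(sin2_2theta=0.02, dm2_ev2=0.3,
                         length_km=12742.0, energy_gev=2000.0)
print(f"vacuum survival probability: {p:.4f}")
```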

  18. The Southwest Configuration for the Next Generation Very Large Array

    NASA Astrophysics Data System (ADS)

    Irwin Kellermann, Kenneth; Carilli, Chris; Condon, James; Cotton, William; Murphy, Eric Joseph; Nyland, Kristina

    2018-01-01

    We discuss the planned array configuration for the Next Generation Very Large Array (ngVLA). The configuration, termed the "Southwest Array," consists of 214 antennas, each 18 m in diameter, distributed over the Southwest United States and Northern Mexico. The antenna locations have been set by applying rough real-world constraints, such as road, fiber, and power access. The antenna locations will be fixed, with roughly 50% of the antennas in a "core" of 2 km diameter located at the site of the JVLA. Another 30% of the antennas will be distributed over the Plains of San Augustin out to a diameter of 30 km, possibly along, or near, the current JVLA arms. The remaining 20% of the antennas will be distributed in a rough two-arm spiral pattern to the south and east, out to a maximum distance of 500 km, into Texas, Arizona, and Chihuahua. Years of experience with the VLA up to 50 GHz, plus intensive antenna testing up to 250 GHz for the ALMA prototype antennas, verify that the VLA site has very good observing conditions (opacity, phase stability) up to 115 GHz (ngVLA Memo No. 1). Using a suite of tools implemented in CASA, we have made extensive imaging simulations with this configuration. We find that good imaging performance can be obtained through appropriate weighting of the visibilities, for resolutions ranging from that of the core of the array (1" at 30 GHz) out to the longest baselines (10 mas at 30 GHz), with a loss of roughly a factor of two in sensitivity relative to natural weighting (ngVLA Memo No. 16). The offset core, located on the northern edge of the long-baseline configuration, provides excellent sensitivity even on the longest baselines. We are also considering a compact configuration of 16 close-packed 6 m antennas to obtain uv-coverage down to baselines of ~10 m for imaging large-scale structure, as well as a configuration including 9 stations distributed on continental scales.
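    The resolutions quoted for the core and the longest baselines follow from the usual λ/B scaling; the sketch below checks the rough numbers at 30 GHz. The actual synthesized beam depends on the visibility weighting used in the CASA simulations, so these are only order-of-magnitude figures.

```python
# Rough lambda/B resolution check at 30 GHz (actual synthesized beams depend on weighting).
import math

C = 299_792_458.0                                   # speed of light, m/s
RAD_TO_MAS = 180.0 / math.pi * 3600.0 * 1000.0      # radians -> milliarcseconds

def resolution_mas(frequency_hz, baseline_m):
    """Diffraction-limited angular resolution ~ lambda / B, in milliarcseconds."""
    wavelength = C / frequency_hz
    return wavelength / baseline_m * RAD_TO_MAS

for label, b in [("2 km core", 2e3), ("30 km plains", 30e3), ("500 km spiral", 500e3)]:
    print(f"{label:14s}: ~{resolution_mas(30e9, b):.1f} mas at 30 GHz")
```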

  19. Piloted simulation of an air-ground profile negotiation process in a time-based Air Traffic Control environment

    NASA Technical Reports Server (NTRS)

    Williams, David H.; Green, Steven M.

    1993-01-01

    Historically, development of airborne flight management systems (FMS) and ground-based air traffic control (ATC) systems has tended to focus on different objectives with little consideration for operational integration. A joint program, between NASA's Ames Research Center (Ames) and Langley Research Center (Langley), is underway to investigate the issues of, and develop systems for, the integration of ATC and airborne automation systems. A simulation study was conducted to evaluate a profile negotiation process (PNP) between the Center/TRACON Automation System (CTAS) and an aircraft equipped with a four-dimensional flight management system (4D FMS). Prototype procedures were developed to support the functional implementation of this process. The PNP was designed to provide an arrival trajectory solution which satisfies the separation requirements of ATC while remaining as close as possible to the aircraft's preferred trajectory. Results from the experiment indicate the potential for successful incorporation of aircraft-preferred arrival trajectories in the CTAS automation environment. Fuel savings on the order of 2 percent to 8 percent, compared to fuel required for the baseline CTAS arrival speed strategy, were achieved in the test scenarios. The data link procedures and clearances developed for this experiment, while providing the necessary functionality, were found to be operationally unacceptable to the pilots. In particular, additional pilot control and understanding of the proposed aircraft-preferred trajectory, and a simplified clearance procedure were cited as necessary for operational implementation of the concept.

  20. Relative astrometry of compact flaring structures in Sgr A* with polarimetric very long baseline interferometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Michael D.; Doeleman, Sheperd S.; Fish, Vincent L.

    2014-10-20

    We demonstrate that polarimetric interferometry can be used to extract precise spatial information about compact polarized flares of Sgr A*. We show that, for a faint dynamical component, a single interferometric baseline suffices to determine both its polarization and its projected displacement from the quiescent intensity centroid. A second baseline enables two-dimensional reconstruction of the displacement, and additional baselines can self-calibrate using the flare, enhancing synthesis imaging of the quiescent emission. We apply this technique to simulated 1.3 mm wavelength observations of a 'hot spot' embedded in a radiatively inefficient accretion disk around Sgr A*. Our results indicate that, even with current sensitivities, polarimetric interferometry with the Event Horizon Telescope can achieve ∼5 μas relative astrometry of compact flaring structures near Sgr A* on timescales of minutes.
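    As a toy illustration of the single-baseline idea, a compact polarized flare displaced from the quiescent intensity centroid imprints a phase of 2π u·Δx on the polarized visibility, where u is the projected baseline length in wavelengths; measuring that phase therefore constrains the projected displacement. The sketch below uses a simplified point-source model with placeholder flux values, not the authors' radiatively inefficient accretion-disk simulation.

```python
# Toy point-source sketch (not the authors' hot-spot model): the polarized flare's
# visibility phase on one baseline encodes its projected offset from the centroid.
import numpy as np

UAS_TO_RAD = np.pi / (180.0 * 3600.0 * 1e6)

def model_visibilities(u_wavelengths, quiescent_jy, flare_jy, offset_uas):
    """Total-intensity and polarized visibilities for an unpolarized quiescent point
    source at the phase centre plus a fully polarized flare offset by offset_uas."""
    fringe = np.exp(2j * np.pi * u_wavelengths * offset_uas * UAS_TO_RAD)
    v_total = quiescent_jy + flare_jy * fringe
    v_pol = flare_jy * fringe           # only the flare contributes polarized flux here
    return v_total, v_pol

# An EHT-like 1.3 mm baseline of ~3000 km is roughly 2.3e9 wavelengths long.
u = 2.3e9
v_i, v_p = model_visibilities(u, quiescent_jy=2.5, flare_jy=0.3, offset_uas=20.0)
# With the phase centre on the quiescent centroid, arg(V_pol) = 2*pi*u*offset
# (valid up to the fringe-spacing ambiguity, i.e. for offsets within ~1/u radians).
recovered = np.angle(v_p) / (2.0 * np.pi * u) / UAS_TO_RAD
print(f"|V_I| = {abs(v_i):.2f} Jy, arg V_pol = {np.angle(v_p):+.3f} rad "
      f"-> projected offset ~ {recovered:.1f} uas")
```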
