Sample records for progression simulation adds

  1. [Research progress and development trend of quantitative assessment techniques for urban thermal environment.

    PubMed

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has become a focus of urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made on quantitative assessment techniques and methods for the urban thermal environment. These techniques have evolved from statistical analysis of the urban-scale thermal environment using historical weather-station data to dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the development of ground meteorological observation, thermal infrared remote sensing and numerical simulation. Moreover, the potential advantages and disadvantages, applicability and development trends of these techniques were also summarized, aiming to add fundamental knowledge for understanding urban thermal environment assessment and optimization.

  2. Identification of deficiencies in seasonal rainfall simulated by CMIP5 climate models

    NASA Astrophysics Data System (ADS)

    Dunning, Caroline M.; Allan, Richard P.; Black, Emily

    2017-11-01

    An objective technique for analysing seasonality, in terms of regime, progression and timing of the wet seasons, is applied in the evaluation of CMIP5 simulations across continental Africa. Atmosphere-only and coupled integrations capture the gross observed patterns of seasonal progression and give mean onset/cessation dates within 18 days of the observational dates for 11 of the 13 regions considered. Accurate representation of seasonality over central-southern Africa and West Africa (excluding the southern coastline) adds credence for future projected changes in seasonality here. However, coupled simulations exhibit timing biases over the Horn of Africa, with the long rains 20 days late on average. Although both sets of simulations detect biannual rainfall seasonal cycles for East and Central Africa, coupled simulations fail to capture the biannual regime over the southern West African coastline. This is linked with errors in the Gulf of Guinea sea surface temperature (SST) and deficient representation of the SST/rainfall relationship.

  3. Distribution and progression of add power among people in need of near correction.

    PubMed

    Han, Xiaotong; Lee, Pei Ying; Liu, Chi; He, Mingguang

    2018-04-16

    This study helps to better understand the need for, and trend in, presbyopic add power in an aging society. The distribution and progression of presbyopic add power in East Asian populations are largely unknown. Prospective cohort study. A total of 303 participants from a population-based study of residents aged 35 years and older in Guangzhou, China. Visual acuity (VA) testing and non-cycloplegic automated refraction were performed at baseline in 2008 and at the 6-year follow-up per standardized protocol. Participants with presenting near VA ≤ 20/40 underwent distance subjective refraction and add power measurement by increasing plus lenses at a standard distance of 40 cm at each visit. Add power at baseline and follow-ups. Mean (standard deviation) age of the study participants was 57.6 (11.1) years and 50.2% were female. The mean add power at baseline was 1.43, 1.73, 2.03 and 2.20 diopters (D) for individuals in the age groups of 35-44, 45-54, 55-64 and 65+ years, respectively. Participants with older age and lower educational level had significantly higher add power requirements (P < 0.001). The overall 6-year increase in add power was 0.15 D (95% CI: 0.06 to 0.25), and was smaller in myopic subjects (P = 0.03). Baseline age and add power, but not changes in biometric factors, were associated with longitudinal change in add power (P < 0.001). The distribution and progression of add power in this Chinese population differed from those previously suggested by Caucasian studies. More studies are needed to establish up-to-date age-related add power prescription norms for populations of different ethnicities. © 2018 Royal Australian and New Zealand College of Ophthalmologists.

  4. SDAV Viz July Progress Update: LANL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sewell, Christopher Meyer

    2012-07-30

    SDAV Viz July Progress Update: (1) VPIC (Vector Particle in Cell) Kinetic Plasma Simulation Code - (a) Implemented first version of an in-situ adapter based on the ParaView CoProcessing Library, (b) Three pipelines: vtkDataSetMapper, vtkContourFilter, vtkPistonContour, (c) Next, resolve issue at boundaries of processor domains; add more advanced viz/analysis pipelines; (2) Halo finding/merger trees - (a) Summer student Wathsala W. from the University of Utah is working on a data-parallel halo finder algorithm using PISTON, (b) Timo Bremer (LLNL), Valerio Pascucci (Utah), George Zagaris (Kitware), and LANL people are interested in using merger trees for tracking the evolution of halos in cosmo simulations; discussed possible overlap with work by Salman Habib and Katrin Heitmann (Argonne) during their visit to LANL 7/11; (3) PISTON integration in ParaView - Now available from the ParaView github.

  5. Add+VantageMR® Assessments: A Case Study of Teacher and Student Gains

    ERIC Educational Resources Information Center

    Briand, Cathy

    2013-01-01

    This case study analyzes the effect of the Add+VantageMR® (AVMR) program on a teacher's pedagogy and on her students' progress in mathematics. AVMR, a professional development program in early mathematics, trains teachers to assess their students' progress and apply those insights to their teaching. The AVMR assessment uses a…

  6. Adaptive Crack Modeling with Interface Solid Elements for Plain and Fiber Reinforced Concrete Structures.

    PubMed

    Zhan, Yijian; Meschke, Günther

    2017-07-08

    The effective analysis of the nonlinear behavior of cement-based engineering structures not only demands physically-reliable models, but also computationally-efficient algorithms. Based on a continuum interface element formulation that is suitable to capture complex cracking phenomena in concrete materials and structures, an adaptive mesh processing technique is proposed for computational simulations of plain and fiber-reinforced concrete structures to progressively disintegrate the initial finite element mesh and to add degenerated solid elements into the interfacial gaps. In comparison with the implementation where the entire mesh is processed prior to the computation, the proposed adaptive cracking model allows simulating the failure behavior of plain and fiber-reinforced concrete structures with remarkably reduced computational expense.

  7. Adaptive Crack Modeling with Interface Solid Elements for Plain and Fiber Reinforced Concrete Structures

    PubMed Central

    Zhan, Yijian

    2017-01-01

    The effective analysis of the nonlinear behavior of cement-based engineering structures not only demands physically-reliable models, but also computationally-efficient algorithms. Based on a continuum interface element formulation that is suitable to capture complex cracking phenomena in concrete materials and structures, an adaptive mesh processing technique is proposed for computational simulations of plain and fiber-reinforced concrete structures to progressively disintegrate the initial finite element mesh and to add degenerated solid elements into the interfacial gaps. In comparison with the implementation where the entire mesh is processed prior to the computation, the proposed adaptive cracking model allows simulating the failure behavior of plain and fiber-reinforced concrete structures with remarkably reduced computational expense. PMID:28773130

  8. Peer interactions of normal and attention-deficit-disordered boys during free-play, cooperative task, and simulated classroom situations.

    PubMed

    Cunningham, C E; Siegel, L S

    1987-06-01

    Groups of 30 ADD-H boys and 90 normal boys were divided into 30 mixed dyads composed of a normal and an ADD-H boy, and 30 normal dyads composed of 2 normal boys. Dyads were videotaped interacting in 15-minute free-play, 15-minute cooperative task, and 15-minute simulated classroom settings. Mixed dyads engaged in more controlling interaction than normal dyads in both free-play and simulated classroom settings. In the simulated classroom, mixed dyads completed fewer math problems and were less compliant with the commands of peers. ADD-H children spent less simulated classroom time on task and scored lower on drawing tasks than normal peers. Older dyads proved less controlling, more compliant with peer commands, more inclined to play and work independently, less active, and more likely to remain on task during the cooperative task and simulated classroom settings. Results suggest that the ADD-H child prompts a more controlling, less cooperative pattern of responses from normal peers.

  9. P-8A Poseidon strategy for modeling & simulation verification validation & accreditation (VV&A)

    NASA Astrophysics Data System (ADS)

    Kropp, Derek L.

    2009-05-01

    One of the first challenges in addressing the need for Modeling & Simulation (M&S) Verification, Validation, & Accreditation (VV&A) is to develop an approach for applying structured and formalized VV&A processes. The P-8A Poseidon Multi-Mission Maritime Aircraft (MMA) Program Modeling and Simulation Accreditation Strategy documents the P-8A program's approach to VV&A. The P-8A strategy tailors a risk-based approach and leverages existing bodies of knowledge, such as the Defense Modeling and Simulation Office Recommended Practice Guide (DMSO RPG), to make the process practical and efficient. As the program progresses, the M&S team must continue to look for ways to streamline the process, add supplemental steps to enhance the process, and identify and overcome procedural, organizational, and cultural challenges. This paper includes some of the basics of the overall strategy, examples of specific approaches that have worked well, and examples of challenges that the M&S team has faced.

  10. Fatigue Damage of Collagenous Tissues: Experiment, Modeling and Simulation Studies

    PubMed Central

    Martin, Caitlin; Sun, Wei

    2017-01-01

    Mechanical fatigue damage is a critical issue for soft tissues and tissue-derived materials, particularly for musculoskeletal and cardiovascular applications; yet, our understanding of the fatigue damage process is incomplete. Soft tissue fatigue experiments are often difficult and time-consuming to perform, which has hindered progress in this area. However, the recent development of soft-tissue fatigue-damage constitutive models has enabled simulation-based fatigue analyses of tissues under various conditions. Computational simulations facilitate highly controlled and quantitative analyses to study the distinct effects of various loading conditions and design features on tissue durability; thus, they are advantageous over complex fatigue experiments. Although significant work to calibrate the constitutive models from fatigue experiments and to validate predictability remains, further development in these areas will add to our knowledge of soft-tissue fatigue damage and will facilitate the design of durable treatments and devices. In this review, the experimental, modeling, and simulation efforts to study collagenous tissue fatigue damage are summarized and critically assessed. PMID:25955007

  11. A cognitive robotics system: the symbolic and sub-symbolic robotic intelligence control system (SS-RICS)

    NASA Astrophysics Data System (ADS)

    Kelley, Troy D.; Avery, Eric

    2010-04-01

    This paper will detail the progress on the development of the Symbolic and Subsymbolic Robotics Intelligence Control System (SS-RICS). The system is a goal-oriented production system based loosely on the Adaptive Control of Thought-Rational (ACT-R) cognitive architecture, with some additions and changes. We have found that in order to simulate complex cognition on a robot, many aspects of cognition (long-term memory (LTM), perception) need to be in place before any generalized intelligent behavior can be produced. In working with ACT-R, we found that it was a good instantiation of working memory, but that we needed to add other aspects of cognition, including LTM and perception, to have a complete cognitive system. Our progress to date will be noted and the challenges that remain will be addressed.

  12. The effect of bifocal add on accommodative lag in myopic children with high accommodative lag.

    PubMed

    Berntsen, David A; Mutti, Donald O; Zadnik, Karla

    2010-12-01

    To determine the effect of a bifocal add and manifest correction on accommodative lag in myopic children with high accommodative lag, who have been reported to have the greatest reduction in myopia progression with progressive addition lenses (PALs). Monocular accommodative lag to a 4-D Badal stimulus was measured on two occasions 6 months apart in 83 children (mean ± SD age, 9.9 ± 1.3 years) with high lag randomized to wearing single-vision lenses (SVLs) or PALs. Accommodative lag was measured with the following corrections: habitual, manifest, manifest with +2.00-D add, and habitual with +2.00-D add (6-month visit only). At baseline, accommodative lag was higher (1.72 ± 0.37 D; mean ± SD) when measured with manifest correction than with habitual correction (1.51 ± 0.50; P < 0.05). This higher lag with manifest correction correlated with a larger amount of habitual undercorrection at baseline (r = -0.29, P = 0.009). A +2.00-D add over the manifest correction reduced lag by 0.45 ± 0.34 D at baseline and 0.33 ± 0.38 D at the 6-month visit. Lag results at 6 months were not different between PAL and SVL wearers (P = 0.92). A +2.00-D bifocal add did not eliminate accommodative lag and reduced lag by less than 25% of the bifocal power, indicating that children mainly responded to a bifocal by decreasing accommodation. If myopic progression is substantial, measuring lag with full correction can overestimate the hyperopic retinal blur that a child most recently experienced. (ClinicalTrials.gov number, NCT00335049.).
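The "less than 25%" figure in the conclusion above is simple arithmetic on the reported values, and it implies how much accommodation the children relaxed in response to the bifocal. A minimal illustration (values taken from the abstract; variable names are ours):

```python
# If a +2.00 D add reduces measured accommodative lag by only 0.45 D,
# the remaining 1.55 D of the add must have been "absorbed" by the child
# decreasing accommodation rather than by reduced lag.
add_power = 2.00       # diopters of bifocal add
lag_reduction = 0.45   # D reduction in lag at baseline (from the abstract)

fraction = lag_reduction / add_power
accommodation_decrease = add_power - lag_reduction

print(f"{fraction:.1%} of add power went to reducing lag")          # 22.5%
print(f"accommodation decreased by {accommodation_decrease:.2f} D")  # 1.55 D
```

Since 22.5% is below the 25% threshold the authors cite, the data support the interpretation that children mainly responded to the bifocal by relaxing accommodation.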

  13. The Effect of Bifocal Add on Accommodative Lag in Myopic Children with High Accommodative Lag

    PubMed Central

    Mutti, Donald O.; Zadnik, Karla

    2010-01-01

    Purpose. To determine the effect of a bifocal add and manifest correction on accommodative lag in myopic children with high accommodative lag, who have been reported to have the greatest reduction in myopia progression with progressive addition lenses (PALs). Methods. Monocular accommodative lag to a 4-D Badal stimulus was measured on two occasions 6 months apart in 83 children (mean ± SD age, 9.9 ± 1.3 years) with high lag randomized to wearing single-vision lenses (SVLs) or PALs. Accommodative lag was measured with the following corrections: habitual, manifest, manifest with +2.00-D add, and habitual with +2.00-D add (6-month visit only). Results. At baseline, accommodative lag was higher (1.72 ± 0.37 D; mean ± SD) when measured with manifest correction than with habitual correction (1.51 ± 0.50; P < 0.05). This higher lag with manifest correction correlated with a larger amount of habitual undercorrection at baseline (r = −0.29, P = 0.009). A +2.00-D add over the manifest correction reduced lag by 0.45 ± 0.34 D at baseline and 0.33 ± 0.38 D at the 6-month visit. Lag results at 6 months were not different between PAL and SVL wearers (P = 0.92). Conclusions. A +2.00-D bifocal add did not eliminate accommodative lag and reduced lag by less than 25% of the bifocal power, indicating that children mainly responded to a bifocal by decreasing accommodation. If myopic progression is substantial, measuring lag with full correction can overestimate the hyperopic retinal blur that a child most recently experienced. (ClinicalTrials.gov number, NCT00335049.) PMID:20688729

  14. 18F PET with florbetapir for the early diagnosis of Alzheimer's disease dementia and other dementias in people with mild cognitive impairment (MCI).

    PubMed

    Martínez, Gabriel; Vernooij, Robin Wm; Fuentes Padilla, Paulina; Zamora, Javier; Bonfill Cosp, Xavier; Flicker, Leon

    2017-11-22

    18F-florbetapir uptake by brain tissue measured by positron emission tomography (PET) is accepted by regulatory agencies like the Food and Drug Administration (FDA) and the European Medicines Agency (EMA) for assessing amyloid load in people with dementia. Its added value is mainly demonstrated by excluding Alzheimer's pathology in an established dementia diagnosis. However, the National Institute on Aging and Alzheimer's Association (NIA-AA) revised the diagnostic criteria for Alzheimer's disease, and confidence in the diagnosis of mild cognitive impairment (MCI) due to Alzheimer's disease may be increased when using amyloid biomarker tests like 18F-florbetapir. These tests, added to the MCI core clinical criteria, might increase the diagnostic test accuracy (DTA) of a testing strategy. However, the DTA of 18F-florbetapir for predicting the progression from MCI to Alzheimer's disease dementia (ADD) or other dementias has not yet been systematically evaluated. To determine the DTA of the 18F-florbetapir PET scan for detecting people with MCI at the time of performing the test who will clinically progress to ADD, other forms of dementia (non-ADD), or any form of dementia at follow-up. This review is current to May 2017. We searched MEDLINE (OvidSP), Embase (OvidSP), PsycINFO (OvidSP), BIOSIS Citation Index (Thomson Reuters Web of Science), Web of Science Core Collection, including the Science Citation Index (Thomson Reuters Web of Science) and the Conference Proceedings Citation Index (Thomson Reuters Web of Science), LILACS (BIREME), CINAHL (EBSCOhost), ClinicalTrials.gov (https://clinicaltrials.gov), and the World Health Organization International Clinical Trials Registry Platform (WHO ICTRP) (http://www.who.int/ictrp/search/en/). We also searched ALOIS, the Cochrane Dementia & Cognitive Improvement Group's specialised register of dementia studies (http://www.medicine.ox.ac.uk/alois/).
    We checked the reference lists of any relevant studies and systematic reviews, and performed citation tracking using the Science Citation Index to identify any additional relevant studies. No language or date restrictions were applied to the electronic searches. We included studies that had prospectively defined cohorts with any accepted definition of MCI at the time of performing the test and that used the 18F-florbetapir scan to evaluate the DTA of the progression from MCI to ADD or other forms of dementia. In addition, we only selected studies that applied a reference standard for Alzheimer's dementia diagnosis, for example, National Institute of Neurological and Communicative Disorders and Stroke and the Alzheimer's Disease and Related Disorders Association (NINCDS-ADRDA) or Diagnostic and Statistical Manual of Mental Disorders-IV (DSM-IV) criteria. We screened all titles and abstracts identified in electronic-database searches. Two review authors independently selected studies for inclusion and extracted data to create two-by-two tables, showing the binary test results cross-classified with the binary reference standard. We used these data to calculate sensitivities, specificities, and their 95% confidence intervals. Two independent assessors performed quality assessment using the QUADAS-2 tool plus some additional items to assess the methodological quality of the included studies. We included three studies, two of which evaluated the progression from MCI to ADD and one of which evaluated the progression from MCI to any form of dementia. Progression from MCI to ADD was evaluated in 448 participants. The studies reported data on 401 participants with 1.6 years of follow-up and on 47 participants with three years of follow-up.
    Sixty-one (15.2%) participants converted at 1.6 years of follow-up; nine (19.1%) participants converted at three years of follow-up. Progression from MCI to any form of dementia was evaluated in five participants with 1.5 years of follow-up, with three (60%) participants converting to any form of dementia. There were concerns regarding applicability of the reference standard in all three studies. Regarding the domain of flow and timing, two studies were considered at high risk of bias. MCI to ADD: progression from MCI to ADD in those with a follow-up between two and less than four years had a sensitivity of 67% (95% CI 30 to 93) and a specificity of 71% (95% CI 54 to 85) by visual assessment (n = 47, 1 study). Progression from MCI to ADD in those with a follow-up between one and less than two years had a sensitivity of 89% (95% CI 78 to 95) and a specificity of 58% (95% CI 53 to 64) by visual assessment, and a sensitivity of 87% (95% CI 76 to 94) and a specificity of 51% (95% CI 45 to 56) by quantitative assessment using the standardised uptake value ratio (SUVR) (n = 401, 1 study). MCI to any form of dementia: progression from MCI to any form of dementia in those with a follow-up between one and less than two years had a sensitivity of 67% (95% CI 9 to 99) and a specificity of 50% (95% CI 1 to 99) by visual assessment (n = 5, 1 study). MCI to any other form of dementia (non-ADD): there was no information regarding the progression from MCI to any other form of dementia (non-ADD).
    Although sensitivity was good in one included study, considering the poor specificity and the limited data available in the literature, we cannot recommend routine use of 18F-florbetapir PET in clinical practice to predict the progression from MCI to ADD. Because of the poor sensitivity and specificity, the limited number of included participants, and the limited data available in the literature, we cannot recommend its routine use in clinical practice to predict the progression from MCI to any form of dementia. Because of the high financial costs of 18F-florbetapir, clearly demonstrating the DTA and standardising the process of this modality are important prior to its wider use.
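The two-by-two tables described in the methods above feed directly into the sensitivity/specificity figures the review reports. A minimal sketch in Python; the counts are illustrative reconstructions roughly consistent with the n = 47 study (9 converters, 38 non-converters), not the review's raw data, and a Wilson score interval is used here, which may differ from the review's exact CI method:

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

def dta(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity and specificity (with CIs) from a two-by-two table."""
    sens = tp / (tp + fn)   # converters correctly test-positive
    spec = tn / (tn + fp)   # non-converters correctly test-negative
    return sens, wilson_ci(tp, tp + fn), spec, wilson_ci(tn, tn + fp)

# Illustrative counts: 6 of 9 converters positive, 27 of 38 non-converters negative.
sens, sens_ci, spec, spec_ci = dta(tp=6, fn=3, tn=27, fp=11)
print(f"sensitivity {sens:.0%}, 95% CI {sens_ci[0]:.0%} to {sens_ci[1]:.0%}")
print(f"specificity {spec:.0%}, 95% CI {spec_ci[0]:.0%} to {spec_ci[1]:.0%}")
```

The very wide intervals such small tables produce are exactly why the review hedges its conclusions despite point estimates that look reasonable.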

  15. Add Control: plant virtualization for control solutions in WWTP.

    PubMed

    Maiza, M; Bengoechea, A; Grau, P; De Keyser, W; Nopens, I; Brockmann, D; Steyer, J P; Claeys, F; Urchegui, G; Fernández, O; Ayesa, E

    2013-01-01

    This paper summarizes part of the research work carried out in the Add Control project, which proposes an extension of the wastewater treatment plant (WWTP) models and modelling architectures used in traditional WWTP simulation tools, addressing, in addition to the classical mass transformations (transport, physico-chemical phenomena, biological reactions), all the instrumentation, actuation and automation & control components (sensors, actuators, controllers), considering their real behaviour (signal delays, noise, failures and power consumption of actuators). Its ultimate objective is to allow a rapid transition from the simulation of the control strategy to its implementation at full-scale plants. Thus, this paper presents the application of the Add Control simulation platform for the design and implementation of new control strategies at the WWTP of Mekolalde.

  16. Nintedanib with Add-on Pirfenidone in Idiopathic Pulmonary Fibrosis. Results of the INJOURNEY Trial.

    PubMed

    Vancheri, Carlo; Kreuter, Michael; Richeldi, Luca; Ryerson, Christopher J; Valeyre, Dominique; Grutters, Jan C; Wiebe, Sabrina; Stansen, Wibke; Quaresma, Manuel; Stowasser, Susanne; Wuyts, Wim A

    2018-02-01

    Nintedanib and pirfenidone slow the progression of idiopathic pulmonary fibrosis (IPF), but the disease continues to progress. More data are needed on the safety and efficacy of combination therapy with nintedanib and add-on pirfenidone. To investigate safety, tolerability, and pharmacokinetic and exploratory efficacy endpoints in patients treated with nintedanib and add-on pirfenidone versus nintedanib alone. Patients with IPF and FVC greater than or equal to 50% predicted at screening who completed a 4- to 5-week run-in with nintedanib 150 mg twice daily without dose reduction or treatment interruption were randomized to receive nintedanib 150 mg twice daily with add-on pirfenidone (titrated to 801 mg three times daily) or nintedanib 150 mg twice daily alone in an open-label manner for 12 weeks. The primary endpoint was the percentage of patients with on-treatment gastrointestinal adverse events from baseline to Week 12. Analyses were descriptive and exploratory. On-treatment gastrointestinal adverse events were reported in 37 of 53 patients (69.8%) treated with nintedanib with add-on pirfenidone and 27 of 51 patients (52.9%) treated with nintedanib alone. Predose plasma trough concentrations of nintedanib were similar when it was administered alone or with add-on pirfenidone. Mean (SE) changes from baseline in FVC at Week 12 were -13.3 (17.4) ml and -40.9 (31.4) ml in patients treated with nintedanib with add-on pirfenidone (n = 48) and nintedanib alone (n = 44), respectively. Nintedanib with add-on pirfenidone had a manageable safety and tolerability profile in patients with IPF, in line with the adverse event profiles of each drug. These data support further research into combination regimens in the treatment of IPF. Clinical trial registered with www.clinicaltrials.gov (NCT02579603).

  17. Dissipative preparation of antiferromagnetic order in the Fermi-Hubbard model

    NASA Astrophysics Data System (ADS)

    Kaczmarczyk, J.; Weimer, H.; Lemeshko, M.

    2016-09-01

    The Fermi-Hubbard model is one of the key models of condensed matter physics, which holds a potential for explaining the mystery of high-temperature superconductivity. Recent progress in ultracold atoms in optical lattices has paved the way to studying the model's phase diagram using the tools of quantum simulation, which emerged as a promising alternative to the numerical calculations plagued by the infamous sign problem. However, the temperatures achieved using elaborate laser cooling protocols so far have been too high to show the appearance of antiferromagnetic (AF) and superconducting quantum phases directly. In this work, we demonstrate that using the machinery of dissipative quantum state engineering, one can observe the emergence of the AF order in the Fermi-Hubbard model with fermions in optical lattices. The core of the approach is to add incoherent laser scattering in such a way that the AF state emerges as the dark state of the driven-dissipative dynamics. The controlled dissipation channels proposed in this work are straightforward to add to existing experimental setups.

  18. Transport link scanner: simulating geographic transport network expansion through individual investments

    NASA Astrophysics Data System (ADS)

    Jacobs-Crisioni, C.; Koopmans, C. C.

    2016-07-01

    This paper introduces a GIS-based model that simulates the geographic expansion of transport networks by several decision-makers with varying objectives. The model progressively adds extensions to a growing network by choosing the most attractive investments from a limited choice set. Attractiveness is defined as a function of variables in which revenue and broader societal benefits may play a role and can be based on empirically underpinned parameters that may differ according to private or public interests. The choice set is selected from an exhaustive set of links and presumably contains those investment options that best meet a private operator's objectives by balancing the revenues of additional fare against construction costs. The investment options consist of geographically plausible routes with potential detours. These routes are generated using a fine-meshed regularly latticed network and shortest path finding methods. Additionally, two indicators of the geographic accuracy of the simulated networks are introduced. A historical case study is presented to demonstrate the model's first results. These results show that the modelled networks reproduce relevant results of the historically built network with reasonable accuracy.
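The greedy loop the abstract describes — score a limited choice set, build the single most attractive link, repeat — can be sketched as follows. The link names, the revenue/cost numbers, and the scoring function are illustrative assumptions, not the paper's actual formulation:

```python
def expand_network(candidates, score, n_rounds):
    """Greedily add the highest-scoring candidate link each round.

    Stops early when no remaining candidate has a positive score,
    mirroring an operator that only builds profitable extensions.
    """
    built = []
    pool = list(candidates)
    for _ in range(n_rounds):
        best = max(pool, key=lambda link: score(link, built), default=None)
        if best is None or score(best, built) <= 0:
            break
        built.append(best)
        pool.remove(best)
    return built

# Toy choice set: (name, expected fare revenue, construction cost).
# Attractiveness here is simply revenue minus cost.
links = [("A-B", 10, 4), ("B-C", 6, 5), ("A-C", 3, 7)]
score = lambda link, built: link[1] - link[2]
print(expand_network(links, score, n_rounds=3))  # builds A-B, then B-C; skips A-C
```

In the actual model the score function would depend on the network built so far (via shortest-path accessibility on the lattice), which is why `score` receives `built` as an argument even though this toy version ignores it.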

  19. Investigation of cloud/water vapor motion winds from geostationary satellite

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This report summarizes the research work accomplished on the NASA grant contract NAG8-892 during 1992. Research goals of this contract are the following: to complete upgrades to the Cooperative Institute for Meteorological Satellite Studies (CIMSS) wind system procedures for assigning heights and incorporating first guess information; to evaluate these modifications using simulated tracer fields; to add an automated quality control system to minimize the need for manual editing, while maintaining product quality; and to benchmark the upgraded algorithm in tests with NMC and/or MSFC. Work progressed on all these tasks and is detailed. This work was done in collaboration with CIMSS NOAA/NESDIS scientists working on the operational winds software, so that NASA funded research can benefit NESDIS operational algorithms.

  20. Comparison of the clinical outcomes between antiviral-naïve patients treated with entecavir and lamivudine-resistant patients receiving adefovir add-on lamivudine combination treatment

    PubMed Central

    Kim, Hong Joo; Park, Soo Kyung; Yang, Hyo Joon; Jung, Yoon Suk; Park, Jung Ho; Park, Dong Il; Cho, Yong Kyun; Sohn, Chong Il; Jeon, Woo Kyu; Kim, Byung Ik; Choi, Kyu Yong

    2016-01-01

    Background/Aims To analyze the effects of preexisting lamivudine (LAM) resistance and applying antiviral treatment (adefovir [ADV] add-on LAM combination treatment) on long-term treatment outcomes, and comparing the clinical outcomes of antiviral-naïve chronic hepatitis B patients receiving entecavir (ETV) monotherapy. Methods This study enrolled 73 antiviral-naïve patients who received 0.5-mg ETV as an initial therapy and 54 patients who received ADV add-on LAM combination treatment as a rescue therapy from July 2006 to July 2010. Results During 24-month treatments, the decreases in serum log10HBV-DNA values (copies/mL) were significantly greater in the antiviral-naïve patients treated with ETV than the patients receiving ADV add-on LAM combination treatment. The biochemical response rates for alanine aminotransferase normalization at 6 months (ETV) and 12 months (ADV add-on LAM) were 90.4% (66/73) and 77.8% (42/54), respectively (P=0.048). A Kaplan-Meier analysis indicated that the rates of serologic response, viral breakthrough, and emergence of genotypic resistance did not differ significantly between the two patient groups. There were also no significant intergroup differences in the rates of disease progression (PD) and new development of hepatocellular carcinoma (HCC). Conclusion The long-term clinical outcomes of antiviral-naïve patients treated with ETV and LAM-resistant patients receiving ADV add-on LAM combination treatment were comparable in terms of the emergence of HCC and disease progression. PMID:27729626

  1. 77 FR 12596 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-01

    ... OMB Review; Comment Request Title: Objective Work Plan (OWP), Objective Progress Report (OPR) and... grantees and better gauge grantee progress. The standardized format allows ANA to report results across all...) to report the results for activities within each Project Objective. The grantee may continue to add...

  2. 18F PET with flutemetamol for the early diagnosis of Alzheimer's disease dementia and other dementias in people with mild cognitive impairment (MCI).

    PubMed

    Martínez, Gabriel; Vernooij, Robin Wm; Fuentes Padilla, Paulina; Zamora, Javier; Flicker, Leon; Bonfill Cosp, Xavier

    2017-11-22

18 F-flutemetamol uptake by brain tissue, measured by positron emission tomography (PET), is accepted by regulatory agencies such as the Food and Drug Administration (FDA) and the European Medicines Agency (EMA) for assessing amyloid load in people with dementia. Its added value is mainly demonstrated by excluding Alzheimer's pathology in an established dementia diagnosis. However, the National Institute on Aging and Alzheimer's Association (NIA-AA) revised the diagnostic criteria for Alzheimer's disease, and the confidence in the diagnosis of mild cognitive impairment (MCI) due to Alzheimer's disease may be increased when using some amyloid biomarker tests, such as 18 F-flutemetamol. These tests, added to the MCI core clinical criteria, might increase the diagnostic test accuracy (DTA) of a testing strategy. However, the DTA of 18 F-flutemetamol to predict the progression from MCI to Alzheimer's disease dementia (ADD) or other dementias has not yet been systematically evaluated. To determine the DTA of the 18 F-flutemetamol PET scan for detecting people with MCI at the time of performing the test who will clinically progress to ADD, other forms of dementia (non-ADD) or any form of dementia at follow-up. The most recent search for this review was performed in May 2017. We searched MEDLINE (OvidSP), Embase (OvidSP), PsycINFO (OvidSP), BIOSIS Citation Index (Thomson Reuters Web of Science), Web of Science Core Collection, including the Science Citation Index (Thomson Reuters Web of Science) and the Conference Proceedings Citation Index (Thomson Reuters Web of Science), LILACS (BIREME), CINAHL (EBSCOhost), ClinicalTrials.gov (https://clinicaltrials.gov), and the World Health Organization International Clinical Trials Registry Platform (WHO ICTRP) (http://www.who.int/ictrp/search/en/). We also searched ALOIS, the Cochrane Dementia & Cognitive Improvement Group's specialised register of dementia studies (http://www.medicine.ox.ac.uk/alois/). 
We checked the reference lists of any relevant studies and systematic reviews, and performed citation tracking using the Science Citation Index to identify any additional relevant studies. No language or date restrictions were applied to the electronic searches. We included studies that had prospectively defined cohorts with any accepted definition of MCI at the time of performing the test and the use of the 18 F-flutemetamol scan to evaluate the DTA of the progression from MCI to ADD or other forms of dementia. In addition, we only selected studies that applied a reference standard for Alzheimer's dementia diagnosis, for example, National Institute of Neurological and Communicative Disorders and Stroke and the Alzheimer's Disease and Related Disorders Association (NINCDS-ADRDA) or Diagnostic and Statistical Manual of Mental Disorders-IV (DSM-IV) criteria. We screened all titles and abstracts identified in electronic-database searches. Two review authors independently selected studies for inclusion and extracted data to create two-by-two tables, showing the binary test results cross-classified with the binary reference standard. We used these data to calculate sensitivities, specificities, and their 95% confidence intervals. Two independent assessors performed quality assessment using the QUADAS-2 tool plus some additional items to assess the methodological quality of the included studies. Progression from MCI to ADD was evaluated in 243 participants from two studies. The studies reported data on 19 participants with two years of follow-up and on 224 participants with three years of follow-up. Nine (47.4%) participants converted at two years of follow-up and 81 (36.2%) converted at three years of follow-up. There were concerns about participant selection and sampling in both studies. The index test domain in one study was considered unclear, and in the second study it was considered at low risk of bias. 
For the reference standard domain, one study was considered at low risk and the second study was considered to have an unclear risk of bias. Regarding the domains of flow and timing, both studies were considered at high risk of bias. MCI to ADD: progression from MCI to ADD at two years of follow-up had a sensitivity of 89% (95% CI 52 to 100) and a specificity of 80% (95% CI 44 to 97) by quantitative assessment by SUVR (n = 19, 1 study). Progression from MCI to ADD at three years of follow-up had a sensitivity of 64% (95% CI 53 to 75) and a specificity of 69% (95% CI 60 to 76) by visual assessment (n = 224, 1 study). There was no information regarding the other two objectives in this systematic review (SR): progression from MCI to other forms of dementia and progression to any form of dementia at follow-up. Due to the varying sensitivity and specificity for predicting the progression from MCI to ADD and the limited data available, we cannot recommend routine use of 18 F-flutemetamol in clinical practice. 18 F-flutemetamol has high financial costs; therefore, clearly demonstrating its DTA and standardising the process of the 18 F-flutemetamol modality is important prior to its wider use.
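The review's accuracy statistics come from two-by-two tables of index test results against the reference standard. A minimal sketch of that calculation is below; the counts are merely consistent with the reported two-year results (8/9 true positives, 8/10 true negatives), and Wilson score intervals are used because the review's exact confidence-interval method is not stated here.

```python
from math import sqrt

def sens_spec(tp, fn, tn, fp, z=1.96):
    """Sensitivity/specificity with Wilson score CIs from a 2x2 table."""
    def wilson(k, n):
        p = k / n
        denom = 1 + z * z / n
        centre = p + z * z / (2 * n)
        half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
        return ((centre - half) / denom, (centre + half) / denom)
    return {
        "sensitivity": (tp / (tp + fn), wilson(tp, tp + fn)),
        "specificity": (tn / (tn + fp), wilson(tn, tn + fp)),
    }

# Illustrative counts consistent with the reported two-year data (n = 19):
res = sens_spec(tp=8, fn=1, tn=8, fp=2)
```

Exact binomial (Clopper-Pearson) intervals would likely come closer to the published 52-100 and 44-97 ranges; Wilson intervals keep the sketch dependency-free.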

  3. Progression of Patterns (POP): A Machine Classifier Algorithm to Identify Glaucoma Progression in Visual Fields

    PubMed Central

    Goldbaum, Michael H.; Lee, Intae; Jang, Giljin; Balasubramanian, Madhusudhanan; Sample, Pamela A.; Weinreb, Robert N.; Liebmann, Jeffrey M.; Girkin, Christopher A.; Anderson, Douglas R.; Zangwill, Linda M.; Fredette, Marie-Josee; Jung, Tzyy-Ping; Medeiros, Felipe A.; Bowd, Christopher

    2012-01-01

    Purpose. We evaluated Progression of Patterns (POP) for its ability to identify progression of glaucomatous visual field (VF) defects. Methods. POP uses variational Bayesian independent component mixture model (VIM), a machine learning classifier (MLC) developed previously. VIM separated Swedish Interactive Thresholding Algorithm (SITA) VFs from a set of 2,085 normal and glaucomatous eyes into nine axes (VF patterns): seven glaucomatous. Stable glaucoma was simulated in a second set of 55 patient eyes with five VFs each, collected within four weeks. A third set of 628 eyes with 4,186 VFs (mean ± SD of 6.7 ± 1.7 VFs over 4.0 ± 1.4 years) was tested for progression. Tested eyes were placed into suspect and glaucoma categories at baseline, based on VFs and disk stereoscopic photographs; a subset of eyes had stereophotographic evidence of progressive glaucomatous optic neuropathy (PGON). Each sequence of fields was projected along seven VIM glaucoma axes. Linear regression (LR) slopes generated from projections onto each axis yielded a degree of confidence (DOC) that there was progression. At 95% specificity, progression cutoffs were established for POP, visual field index (VFI), and mean deviation (MD). Guided progression analysis (GPA) was also compared. Results. POP identified a statistically similar number of eyes (P > 0.05) as progressing compared with VFI, MD, and GPA in suspects (3.8%, 2.7%, 5.6%, and 2.9%, respectively), and more eyes than GPA (P = 0.01) in glaucoma (16.0%, 15.3%, 12.0%, and 7.3%, respectively), and more eyes than GPA (P = 0.05) in PGON eyes (26.3%, 23.7%, 27.6%, and 14.5%, respectively). Conclusions. POP, with its display of DOC of progression and its identification of progressing VF defect pattern, adds to the information available to the clinician for detecting VF progression. PMID:22786913
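POP's decision step, projecting each visual-field series onto the learned axes and thresholding per-axis linear-regression slopes, can be sketched as follows; the random axes, the 52-point field representation, and the slope cutoff are all synthetic stand-ins for the trained VIM components and the 95%-specificity cutoff.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 7 unit-norm "glaucoma axes" in a 52-point VF space.
axes = rng.standard_normal((7, 52))
axes /= np.linalg.norm(axes, axis=1, keepdims=True)

def axis_slopes(vf_series, years):
    """Project a sequence of VFs onto each axis and fit a linear trend per axis."""
    proj = vf_series @ axes.T                  # (n_exams, 7) projections
    return np.array([np.polyfit(years, proj[:, k], 1)[0] for k in range(7)])

years = np.linspace(0, 4, 7)                   # 7 exams over 4 years
worsening = np.outer(years, axes[2])           # defect deepening along axis 2
series = worsening + 0.05 * rng.standard_normal((7, 52))

slopes = axis_slopes(series, years)
progressing = np.abs(slopes).max() > 0.5       # cutoff would be set at 95% specificity
```

In the real classifier the cutoff per axis is calibrated on the simulated stable-glaucoma set so that only 5% of stable eyes are flagged.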

  4. Spatial and spectral simulation of LANDSAT images of agricultural areas

    NASA Technical Reports Server (NTRS)

    Pont, W. F., Jr. (Principal Investigator)

    1982-01-01

A LANDSAT scene simulation capability was developed to study the effects of small fields and misregistration on LANDSAT-based crop proportion estimation procedures. The simulation employs a pattern of ground polygons, each with a crop ID, planting date, and scale factor. Historical greenness/brightness crop development profiles generate the mean signal values for each polygon. Historical within-field covariances add texture to pixels in each polygon. The planting dates and scale factors create between-field/within-crop variation; between-field/between-crop variation arises from these factors together with differences among crop profiles. The LANDSAT point spread function is used to add correlation between nearby pixels; its net effect is to blur the image. Mixed pixels and misregistration are also simulated.
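The per-polygon signal model described above (mean from a crop profile, texture from a within-field covariance, and a blur standing in for the LANDSAT point spread function) can be sketched as follows; all profile and covariance numbers are invented placeholders, and the separable 3-tap kernel is only a crude proxy for the real PSF.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented placeholders: greenness/brightness means and a within-field covariance.
crop_mean = {"wheat": np.array([30.0, 90.0]), "corn": np.array([45.0, 70.0])}
within_field_cov = np.array([[4.0, 1.0], [1.0, 3.0]])

def simulate_field(crop, shape, scale=1.0):
    """Mean signal from the crop profile plus correlated within-field texture."""
    n = shape[0] * shape[1]
    pixels = rng.multivariate_normal(scale * crop_mean[crop], within_field_cov, size=n)
    return pixels.reshape(shape + (2,))

def psf_blur(band, kernel=np.array([0.25, 0.5, 0.25])):
    """Separable 3-tap smoothing standing in for the point spread function."""
    def smooth(img):
        out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
        return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)
    return smooth(band) / smooth(np.ones_like(band))  # normalize the edges

field = simulate_field("wheat", (16, 16))
blurred = psf_blur(field[:, :, 0])
```

The blur correlates nearby pixels and so reduces per-pixel variance, which is exactly the mixed-pixel effect the study set out to examine.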

  5. Straight A's: Public Education Policy and Progress. Volume 6, Number 6

    ERIC Educational Resources Information Center

    Amos, Jason, Ed.

    2006-01-01

    "Straight A's: Public Education Policy and Progress" is a biweekly newsletter that focuses on education news and events both in Washington, DC and around the country. The following articles are included in this issue: (1) Senate Approves Budget Plan: Amendment Adds Funds for Education Programs; (2) The Hits Keep Coming: House…

  6. Modeling absolute plate and plume motions

    NASA Astrophysics Data System (ADS)

    Bodinier, G. P.; Wessel, P.; Conrad, C. P.

    2016-12-01

    Paleomagnetic evidence for plume drift has made modeling of absolute plate motions challenging, especially since direct observations of plume drift are lacking. Predictions of plume drift arising from mantle convection models and broadly satisfying observed paleolatitudes have so far provided the only framework for deriving absolute plate motions over moving hotspots. However, uncertainties in mantle rheology, temperature, and initial conditions make such models nonunique. Using simulated and real data, we will show that age progressions along Pacific hotspot trails provide strong constraints on plume motions for all major trails, and furthermore that it is possible to derive models for relative plume drift from these data alone. Relative plume drift depends on the inter-hotspot distances derived from age progressions but lacks a fixed reference point and orientation. By incorporating paleolatitude histories for the Hawaii and Louisville chains we add further constraints on allowable plume motions, yet one unknown parameter remains: a longitude shift that applies equally to all plumes. To obtain a solution we could restrict either the Hawaii or Louisville plume to have latitudinal motion only, thus satisfying paleolatitude constraints. Yet, restricting one plume to latitudinal motion while all others move freely is not realistic. Consequently, it is only possible to resolve the motion of hotspots relative to an overall and unknown longitudinal shift as a function of time. Our plate motions are therefore dependent on the same shift via an unknown rotation about the north pole. Yet, as plume drifts are consequences of mantle convection, our results place strong constraints on the pattern of convection. 
Other considerations, such as imposed limits on plate speed, plume speed, proximity to LLSVP edges, model smoothness, or relative plate motions via ridge-spotting may add further constraints that allow a unique model of Pacific absolute plate and plume motions to be inferred. Our modeling suggests that the acquisition of new age and paleomagnetic data from hotspot trails where data are lacking would add valuable constraints on both plume and plate motions. At present, the limiting factor is inconsistencies between paleomagnetic, geometric, and chronologic data, leading to large uncertainties in the results.

  7. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited for Space Physics post-processing. Building on the work from the Center For Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The tool-kit plugin currently provides tools for reading LFM and Enlil data sets, and provides automated tools for data comparison with NASA's CDAweb database. As work progresses, many additional tools will be added, and through open-source collaboration we hope to add readers for additional model types, as well as any other tools the scientific community deems necessary. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  8. Cost-effectiveness of exenatide twice daily vs insulin glargine as add-on therapy to oral antidiabetic agents in patients with type 2 diabetes in China.

    PubMed

    Gu, Shuyan; Wang, Xiaoyong; Qiao, Qing; Gao, Weiguo; Wang, Jian; Dong, Hengjin

    2017-12-01

To estimate the long-term cost-effectiveness of exenatide twice daily vs insulin glargine once daily as add-on therapy to oral antidiabetic agents (OADs) for Chinese patients with type 2 diabetes (T2DM). The Cardiff Diabetes Model was used to simulate disease progression and estimate the long-term effects of exenatide twice daily vs insulin glargine once daily. Patient profiles and treatment effects required for the model were obtained from literature reviews (English and Chinese databases) and from a meta-analysis of 8 randomized controlled trials comparing exenatide twice daily with insulin glargine once daily add-on to OADs for T2DM in China. Medical expenditure data were collected from 639 patients with T2DM (aged ≥18 years) with and without complications incurred between January 1, 2014 and December 31, 2015 from claims databases in Shandong, China. Costs (2014 Chinese Yuan [¥]) and benefits were estimated, from the payers' perspective, over 40 years at a discount rate of 3%. A series of sensitivity analyses were performed. Patients on exenatide twice daily + OAD had a lower predicted incidence of most cardiovascular and hypoglycaemic events and lower total costs compared with those on insulin glargine once daily + OAD. Exenatide twice daily was associated with a gain of 1.94 quality-adjusted life years (QALYs) per patient at a cost saving of ¥117 706 (i.e. a saving of ¥60 764 per QALY gained) vs insulin glargine once daily. In Chinese patients with T2DM inadequately controlled by OADs, exenatide twice daily is a cost-effective add-on therapy alternative to insulin glargine once daily, and may address the unmet medical need arising from weight gain and hypoglycaemia in T2DM treatment. © 2017 John Wiley & Sons Ltd.

  9. An Extension to the Multilevel Logic Simulator for Microcomputers.

    DTIC Science & Technology

    1987-06-01

Add or delete gates … 61; 3. Add or delete inputs … 61; 4. Add or delete outputs … the gates affected by the deletion. The only modification that will be done in the circuit is the insertion (deletion) of … recompilation of the circuit. TABLE 25: THE DELINP CASE FOR THE ALU CIRCUIT.

  10. Pirfenidone in patients with rapidly progressive interstitial lung disease associated with clinically amyopathic dermatomyositis

    NASA Astrophysics Data System (ADS)

    Li, Ting; Guo, Li; Chen, Zhiwei; Gu, Liyang; Sun, Fangfang; Tan, Xiaoming; Chen, Sheng; Wang, Xiaodong; Ye, Shuang

    2016-09-01

    To evaluate the efficacy of pirfenidone in patients with rapidly progressive interstitial lung disease (RPILD) related to clinically amyopathic dermatomyositis (CADM), we conducted an open-label, prospective study with matched retrospective controls. Thirty patients diagnosed with CADM-RPILD with a disease duration <6 months at Renji Hospital South Campus from June 2014 to November 2015 were prospectively enrolled and treated with pirfenidone at a target dose of 1800 mg/d in addition to conventional treatment, such as a glucocorticoid and/or other immunosuppressants. Matched patients without pirfenidone treatment (n = 27) were retrospectively selected as controls between October 2012 and September 2015. We found that the pirfenidone add-on group displayed a trend of lower mortality compared with the control group (36.7% vs 51.9%, p = 0.2226). Furthermore, the subgroup analysis indicated that the pirfenidone add-on had no impact on the survival of acute ILD patients (disease duration <3 months) (50% vs 50%, p = 0.3862) while for subacute ILD patients (disease duration 3-6 months), the pirfenidone add-on (n = 10) had a significantly higher survival rate compared with the control subgroup (n = 9) (90% vs 44.4%, p = 0.0450). Our data indicated that the pirfenidone add-on may improve the prognosis of patients with subacute ILD related to CADM.

  11. Pirfenidone in patients with rapidly progressive interstitial lung disease associated with clinically amyopathic dermatomyositis.

    PubMed

    Li, Ting; Guo, Li; Chen, Zhiwei; Gu, Liyang; Sun, Fangfang; Tan, Xiaoming; Chen, Sheng; Wang, Xiaodong; Ye, Shuang

    2016-09-12

    To evaluate the efficacy of pirfenidone in patients with rapidly progressive interstitial lung disease (RPILD) related to clinically amyopathic dermatomyositis (CADM), we conducted an open-label, prospective study with matched retrospective controls. Thirty patients diagnosed with CADM-RPILD with a disease duration <6 months at Renji Hospital South Campus from June 2014 to November 2015 were prospectively enrolled and treated with pirfenidone at a target dose of 1800 mg/d in addition to conventional treatment, such as a glucocorticoid and/or other immunosuppressants. Matched patients without pirfenidone treatment (n = 27) were retrospectively selected as controls between October 2012 and September 2015. We found that the pirfenidone add-on group displayed a trend of lower mortality compared with the control group (36.7% vs 51.9%, p = 0.2226). Furthermore, the subgroup analysis indicated that the pirfenidone add-on had no impact on the survival of acute ILD patients (disease duration <3 months) (50% vs 50%, p = 0.3862); while for subacute ILD patients (disease duration 3-6 months), the pirfenidone add-on (n = 10) had a significantly higher survival rate compared with the control subgroup (n = 9) (90% vs 44.4%, p = 0.0450). Our data indicated that the pirfenidone add-on may improve the prognosis of patients with subacute ILD related to CADM.

  12. A developmental dose-response analysis of the effects of methylphenidate on the peer interactions of attention deficit disordered boys.

    PubMed

    Cunningham, C E; Siegel, L S; Offord, D R

    1985-11-01

    Mixed dyads of 42 normal and 42 ADD boys were videotaped in free play, co-operative task, and simulated classrooms. ADD boys received placebo, 0.15 mg/kg, and 0.50 mg/kg of methylphenidate. ADD boys were more active and off task, watched peers less, and scored lower on mathematics and visual-motor tasks. Older boys interacted less, ignored peer interactions and play more frequently, were less controlling, and more compliant. In class, methylphenidate improved visual motor scores, and reduced the controlling behaviour, activity level, and off task behaviour of ADD boys. Normal peers displayed reciprocal reductions in controlling behaviour, activity level, and off task behaviour.

  13. Approach to the unfolding and folding dynamics of add A-riboswitch upon adenine dissociation using a coarse-grained elastic network model

    NASA Astrophysics Data System (ADS)

    Li, Chunhua; Lv, Dashuai; Zhang, Lei; Yang, Feng; Wang, Cunxin; Su, Jiguo; Zhang, Yang

    2016-07-01

Riboswitches are noncoding mRNA segments that can regulate gene expression by altering their structures in response to specific metabolite binding. We proposed a coarse-grained Gaussian network model (GNM) to examine the unfolding and folding dynamics of the adenosine deaminase (add) A-riboswitch upon adenine dissociation, in which the RNA is modeled by a nucleotide chain with interaction networks formed by connecting adjoining atomic contacts. It was shown that adenine binding is critical to the folding of the add A-riboswitch, while removal of the ligand can result in a drastic increase of the thermodynamic fluctuations, especially in the junction regions between helix domains. Under the assumption that the native contacts with the highest thermodynamic fluctuations break first, the iterative GNM simulations showed that the unfolding process of the adenine-free add A-riboswitch starts with denaturation of the terminal helix stem, followed by the loops and junctions involving the ligand binding pocket, and then the central helix domains. Despite the simplified coarse-grained modeling, the unfolding dynamics and pathways are in close agreement with the results from atomic-level MD simulations and from NMR and single-molecule force spectroscopy experiments. Overall, the study demonstrates a new avenue to investigate the binding and folding dynamics of the add A-riboswitch molecule, one which can be readily extended to other RNA molecules.

  14. Multidisciplinary model-based-engineering for laser weapon systems: recent progress

    NASA Astrophysics Data System (ADS)

    Coy, Steve; Panthaki, Malcolm

    2013-09-01

We are working to develop a comprehensive, integrated software framework and toolset to support model-based engineering (MBE) of laser weapons systems. MBE has been identified by the Office of the Director, Defense Science and Engineering as one of four potentially "game-changing" technologies that could bring about revolutionary advances across the entire DoD research and development and procurement cycle. To be effective, however, MBE requires robust underlying modeling and simulation technologies capable of modeling all the pertinent systems, subsystems, components, effects, and interactions at any level of fidelity that may be required in order to support crucial design decisions at any point in the system development lifecycle. Very often the greatest technical challenges are posed by systems involving interactions that cut across two or more distinct scientific or engineering domains; even in cases where there are excellent tools available for modeling each individual domain, generally none of these domain-specific tools can be used to model the cross-domain interactions. In the case of laser weapons systems R&D these tools need to be able to support modeling of systems involving combined interactions among structures, thermal, and optical effects, including both ray optics and wave optics, controls, atmospheric effects, target interaction, computational fluid dynamics, and spatiotemporal interactions between lasing light and the laser gain medium. To address this problem we are working to extend Comet™ to add the additional modeling and simulation capabilities required for this particular application area. In this paper we will describe our progress to date.

  15. Program For Parallel Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.; Blume, Leo R.; Geiselman, John S.; Presley, Matthew T.; Wedel, John J., Jr.; Bellenot, Steven F.; Diloreto, Michael; Hontalas, Philip J.; Reiher, Peter L.; Weiland, Frederick P.

    1991-01-01

    User does not have to add any special logic to aid in synchronization. Time Warp Operating System (TWOS) computer program is special-purpose operating system designed to support parallel discrete-event simulation. Complete implementation of Time Warp mechanism. Supports only simulations and other computations designed for virtual time. Time Warp Simulator (TWSIM) subdirectory contains sequential simulation engine interface-compatible with TWOS. TWOS and TWSIM written in, and support simulations in, C programming language.
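TWOS itself implements optimistic Time Warp synchronization with rollback, which is well beyond a sketch; the sequential core that TWSIM provides, executing events strictly in virtual-time order, can be illustrated as follows (the class and handler names are hypothetical, not TWOS/TWSIM API).

```python
import heapq

class Simulator:
    """Minimal sequential discrete-event engine: events execute in virtual-time
    order, and handlers may only schedule events at later virtual times."""
    def __init__(self):
        self.now = 0
        self._queue = []
        self._seq = 0          # tie-breaker for simultaneous events
        self.log = []

    def schedule(self, time, handler):
        if time < self.now:
            raise ValueError("cannot schedule into the virtual past")
        heapq.heappush(self._queue, (time, self._seq, handler))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, handler = heapq.heappop(self._queue)
            self.log.append(self.now)
            handler(self)

# Toy usage: an event that reschedules itself in virtual time.
def ping(sim):
    if sim.now < 3:
        sim.schedule(sim.now + 1, ping)

sim = Simulator()
sim.schedule(0, ping)
sim.run()
```

Time Warp removes the "no scheduling into the past" restriction for remote events by checkpointing state and rolling back when a straggler message arrives, which is what lets TWOS run such simulations in parallel.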

  16. Adding Badging to a Marketing Simulation to Increase Motivation to Learn

    ERIC Educational Resources Information Center

    Saxton, M. Kim

    2015-01-01

    Badging has become a popular tool for obtaining social recognition for personal accomplishments. This innovation describes a way to add badging to a marketing simulation to increase student motivation to achieve the simulation's goals. Assessments indicate that badging both motivates students to perform better and helps explain students' perceived…

  17. Age-dependent effects of topiramate on the acquisition and the retention of rapid kindling.

    PubMed

    Mazarati, Andréy; Shin, Don; Auvin, Stéphane; Sankar, Raman

    2007-04-01

To examine antiepileptogenic, disease-modifying, and anticonvulsant effects of topiramate under conditions of rapid kindling at different stages of development. Afterdischarge threshold (ADT) and duration (ADD) were examined in 2-, 3-, and 5-week-old Wistar rats before and after administration of topiramate (200 mg/kg). Animals underwent a rapid kindling protocol (sixty 10-s trains, bipolar 20 Hz square wave pulses delivered every 5 min). The progression of behavioral and electrographic seizures, and responses to test stimulations 24 h after the protocol were compared between topiramate-treated and vehicle-treated control rats. In addition, rats that were previously given vehicle only prior to kindling, were then given topiramate to examine the effect on established kindled seizures. In 2-week-old animals, topiramate affected neither the baseline afterdischarge, nor the progression of kindled seizures. In 3-week-old rats, topiramate did not modify the baseline afterdischarge, but significantly delayed the occurrence of full motor seizures in response to repeated stimulations. Topiramate treatment of 5-week-old rats increased baseline ADT, shortened ADD, and delayed the progression of kindled seizures. Twenty-four h after the last kindling stimulation, animals of all ages exhibited a decreased ADT, an increased ADD, and developed behavioral seizures in response to threshold stimulation. Vehicle-treated kindled rats that were then given topiramate displayed significantly attenuated behavioral seizures induced by the threshold stimulation. Topiramate exhibited age-dependent disease-modifying effects under conditions of rapid kindling, but failed to block epileptogenesis. Topiramate also inhibited kindled seizures with equal efficacy across the three ages.

  18. Age-Dependent Effects of Topiramate on the Acquisition and the Retention of Rapid Kindling

    PubMed Central

    Mazarati, Andréy; Shin, Don; Auvin, Stéphane; Sankar, Raman

    2008-01-01

Summary Purpose To examine antiepileptogenic, disease-modifying, and anticonvulsant effects of topiramate under conditions of rapid kindling at different stages of development. Methods Afterdischarge threshold (ADT) and duration (ADD) were examined in two-, three-, and five-week-old Wistar rats before and after administration of topiramate (200 mg/kg). Animals underwent a rapid kindling protocol (sixty 10-second trains, bipolar 20 Hz square wave pulses delivered every five minutes). The progression of behavioral and electrographic seizures, and responses to test stimulations 24 hours after the protocol were compared between topiramate-treated and vehicle-treated control rats. In addition, rats that were previously given vehicle only prior to kindling, were then given topiramate to examine the effect on established kindled seizures. Results In two-week-old animals, topiramate affected neither the baseline afterdischarge, nor the progression of kindled seizures. In three-week-old rats, topiramate did not modify the baseline afterdischarge, but significantly delayed the occurrence of full motor seizures in response to repeated stimulations. Topiramate treatment of five-week-old rats increased baseline ADT, shortened ADD, and delayed the progression of kindled seizures. Twenty-four hours after the last kindling stimulation, animals of all ages exhibited a decreased ADT, an increased ADD, and developed behavioral seizures in response to threshold stimulation. Vehicle-treated kindled rats that were then given topiramate displayed significantly attenuated behavioral seizures induced by the threshold stimulation. Conclusions Topiramate exhibited age-dependent disease-modifying effects under conditions of rapid kindling, but failed to block epileptogenesis. Topiramate also inhibited kindled seizures with equal efficacy across the three ages. PMID:17319916

  19. Coupling of Peridynamics and Finite Element Formulation for Multiscale Simulations

    DTIC Science & Technology

    2012-10-16

unidirectional fiber-reinforced composites, Computer Methods in Applied Mechanics and Engineering 217 (2012) 247-261. [44] S. A. Silling, M. Epton … (3) numerical testing for different grid-width-to-horizon ratios, (4) development of an approach to add another material variable in the given approach … partition of unity principle.

  20. Euclid Cosmological Simulations Requirements and Implementation Plan

    NASA Technical Reports Server (NTRS)

    Kiessling, Alina

    2012-01-01

Simulations are essential for the successful undertaking of the Euclid mission. The simulations requirements for the Euclid mission are vast! It is an enormous undertaking that includes development of software and acquisition of hardware facilities. The simulations requirements are currently being finalised - please contact me or Elisabetta Semboloni if you would like to add/modify any requirements (or if you would like to be involved in the development of the simulations).

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chunhua; Department of Computational Medicine and Bioinformatics, University of Michigan, Ann Arbor, Michigan 45108; Lv, Dashuai

Riboswitches are noncoding mRNA segments that can regulate gene expression by altering their structures in response to specific metabolite binding. We proposed a coarse-grained Gaussian network model (GNM) to examine the unfolding and folding dynamics of the adenosine deaminase (add) A-riboswitch upon adenine dissociation, in which the RNA is modeled by a nucleotide chain with interaction networks formed by connecting adjoining atomic contacts. It was shown that adenine binding is critical to the folding of the add A-riboswitch, while removal of the ligand can result in a drastic increase of the thermodynamic fluctuations, especially in the junction regions between helix domains. Under the assumption that the native contacts with the highest thermodynamic fluctuations break first, the iterative GNM simulations showed that the unfolding process of the adenine-free add A-riboswitch starts with denaturation of the terminal helix stem, followed by the loops and junctions involving the ligand binding pocket, and then the central helix domains. Despite the simplified coarse-grained modeling, the unfolding dynamics and pathways are in close agreement with the results from atomic-level MD simulations and from NMR and single-molecule force spectroscopy experiments. Overall, the study demonstrates a new avenue to investigate the binding and folding dynamics of the add A-riboswitch molecule, one which can be readily extended to other RNA molecules.

  2. The effect of different control point sampling sequences on convergence of VMAT inverse planning

    NASA Astrophysics Data System (ADS)

    Pardo Montero, Juan; Fenwick, John D.

    2011-04-01

    A key component of some volumetric-modulated arc therapy (VMAT) optimization algorithms is the progressive addition of control points to the optimization. This idea was introduced in Otto's seminal VMAT paper, in which a coarse sampling of control points was used at the beginning of the optimization and new control points were progressively added one at a time. A different form of the methodology is also present in the RapidArc optimizer, which adds new control points in groups called 'multiresolution levels', each doubling the number of control points in the optimization. This progressive sampling accelerates convergence, improving the results obtained, and has similarities with the ordered subset algorithm used to accelerate iterative image reconstruction. In this work we have used a VMAT optimizer developed in-house to study the performance of optimization algorithms that use different control point sampling sequences, most of which fall into three classes: doubling sequences, which add new control points in groups such that the number of control points in the optimization is (roughly) doubled; Otto-like progressive sampling, which adds one control point at a time; and equi-length sequences, which contain several multiresolution levels, each with the same number of control points. Results are presented in this study for two clinical geometries, prostate and head-and-neck treatments. A dependence of the quality of the final solution on the number of starting control points has been observed, in agreement with previous works. We have found that some sequences, especially E20 and E30 (equi-length sequences with 20 and 30 multiresolution levels, respectively), generate better results than a 5-multiresolution-level RapidArc-like sequence. The final value of the cost function is reduced by up to 20%, with such reductions leading to small improvements in dosimetric parameters characterizing the treatments: slightly more homogeneous target doses and better sparing of the organs at risk.
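The three sequence classes can be made concrete with a short sketch. The starting count of 11 control points and the final count of 176 are illustrative assumptions, not values taken from the paper.

```python
# Sketch of the three control-point sampling sequence classes; each list
# gives the number of active control points at each multiresolution level.

def doubling_sequence(n_cp, start=11):
    """RapidArc-like: each level roughly doubles the control points."""
    levels, n = [], start
    while n < n_cp:
        levels.append(n)
        n = min(2 * n, n_cp)
    levels.append(n_cp)
    return levels

def otto_sequence(n_cp, start=11):
    """Otto-like progressive sampling: one control point at a time."""
    return list(range(start, n_cp + 1))

def equi_length_sequence(n_cp, n_levels, start=11):
    """Equi-length: n_levels levels, each adding a group of equal size."""
    step = (n_cp - start) / (n_levels - 1)
    return [round(start + i * step) for i in range(n_levels)]

print(doubling_sequence(176))              # → [11, 22, 44, 88, 176]
print(len(equi_length_sequence(176, 20)))  # an "E20"-style sequence → 20
```

An E20 sequence spends many short optimization stages on nearly full resolution, which is one plausible reading of why it edges out the 5-level doubling scheme.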

  3. Classroom as Reality: Demonstrating Campaign Effects through Live Simulation

    ERIC Educational Resources Information Center

    Coffey, Daniel J.; Miller, William J.; Feuerstein, Derek

    2011-01-01

    Scholastic research has demonstrated that when conducted properly, active learning exercises are successful at increasing student awareness, student interest, and knowledge retention. Face-to-face simulations, in particular, have been demonstrated to add positively to classrooms focusing on comparative politics, international relations, public…

  4. High-Throughput Toxicity Testing: New Strategies for Assessing Chemical Safety

    EPA Science Inventory

    In recent years, the food industry has made progress in improving safety testing methods focused on microbial contaminants in order to promote food safety. However, food industry toxicologists must also assess the safety of food-relevant chemicals including pesticides, direct add...

  5. An Unembalmed Cadaveric Preparation for Simulating Pleural Effusion: A Pilot Study of Chest Percussion Involving Medical Students

    ERIC Educational Resources Information Center

    Cook, Mark S.; Kernahan, Peter J.

    2017-01-01

    Cadaveric simulations are an effective way to add clinical context to an anatomy course. In this study, unembalmed (fresh) cadavers were uniquely prepared to simulate pleural effusion to teach chest percussion and review thoracic anatomy. Thirty first-year medical students were assigned to either an intervention (Group A) or control group (Group…

  6. Diagnostic and Prognostic Value of the Combination of Two Measures of Verbal Memory in Mild Cognitive Impairment due to Alzheimer's Disease.

    PubMed

    Sala, Isabel; Illán-Gala, Ignacio; Alcolea, Daniel; Sánchez-Saudinós, Ma Belén; Salgado, Sergio Andrés; Morenas-Rodríguez, Estrella; Subirana, Andrea; Videla, Laura; Clarimón, Jordi; Carmona-Iragui, María; Ribosa-Nogué, Roser; Blesa, Rafael; Fortea, Juan; Lleó, Alberto

    2017-01-01

    Episodic memory impairment is the core feature of typical Alzheimer's disease. To evaluate the performance of two commonly used verbal memory tests to detect mild cognitive impairment due to Alzheimer's disease (MCI-AD) and to predict progression to Alzheimer's disease dementia (AD-d). Prospective study of MCI patients in a tertiary memory disorder unit. Patients underwent an extensive neuropsychological battery including two tests of declarative verbal memory: The Free and Cued Selective Reminding Test (FCSRT) and the word list learning task from the Consortium to Establish a Registry for Alzheimer's disease (CERAD-WL). Cerebrospinal fluid (CSF) was obtained from all patients and MCI-AD was defined by means of the t-Tau/Aβ1-42 ratio. Logistic regression analyses tested whether the combination of FCSRT and CERAD-WL measures significantly improved the prediction of MCI-AD. Progression to AD-d was analyzed in a Cox regression model. A total of 202 MCI patients with a mean follow-up of 34.2±24.2 months were included and 98 (48.5%) met the criteria for MCI-AD. The combination of FCSRT and CERAD-WL measures improved MCI-AD classification accuracy based on CSF biomarkers. Both tests yielded similar global predictive values (59.9-65.3% and 59.4-62.8% for FCSRT and CERAD-WL, respectively). MCI-AD patients with deficits in both FCSRT and CERAD-WL had a faster progression to AD-d than patients with deficits in only one test. The combination of FCSRT and CERAD-WL improves the classification of MCI-AD and defines different prognostic profiles. These findings have important implications for clinical practice and the design of clinical trials.

  7. Preliminary efficacy, safety, pharmacokinetics, pharmacodynamics and quality of life study of pegylated recombinant human arginase 1 in patients with advanced hepatocellular carcinoma.

    PubMed

    Yau, Thomas; Cheng, Paul N; Chan, Pierre; Chen, Li; Yuen, Jimmy; Pang, Roberta; Fan, Sheung Tat; Wheatley, Denys N; Poon, Ronnie T

    2015-04-01

    This study was designed to evaluate the efficacy, safety profile, pharmacokinetics, pharmacodynamics and quality of life of pegylated recombinant human arginase 1 (Peg-rhAgr1) in patients with advanced hepatocellular carcinoma (HCC). Patients were given weekly doses of Peg-rhAgr1 (1600 U/kg). Tumour response was assessed every 8 weeks using RECIST 1.1 and modified RECIST criteria. A total of 20 patients were recruited, of whom 15 were deemed evaluable for treatment efficacy. Eighteen patients (90%) were hepatitis B carriers. Median age was 61.5 (range 30-75). Overall disease control rate was 13%, with 2 of the 15 patients achieving stable disease for >8 weeks. The median progression-free survival (PFS) was 1.7 (95% CI: 1.67-1.73) months, with median overall survival (OS) of all 20 enrolled patients being 5.2 (95% CI: 3.3-12.0) months. PFS was significantly prolonged in patients with adequate arginine depletion (ADD) >2 months versus those who had ≤2 months of ADD (6.4 versus 1.7 months; p = 0.01). The majority of adverse events (AEs) were grade 1/2 non-hematological toxicities. Transient liver dysfunctions (25%) were the most commonly reported serious AEs and likely due to disease progression. Pharmacokinetic and pharmacodynamic data showed that Peg-rhAgr1 induced rapid and sustained arginine depletion. The overall quality of life of the enrolled patients was well preserved. Peg-rhAgr1 is well tolerated with a good toxicity profile in patients with advanced HCC. A weekly dose of 1600 U/kg is sufficient to induce ADD. Significantly longer PFS times were recorded for patients who had ADD for >2 months.
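Median PFS and OS figures such as those above are read off a Kaplan-Meier curve. Below is a minimal sketch using synthetic follow-up times, not the trial's data; exact fractions avoid missing the median to floating-point rounding at S = 0.5.

```python
from fractions import Fraction

# Minimal Kaplan-Meier estimator; the PFS times below are synthetic and
# illustrate the method only.

def kaplan_meier(times, events):
    """Return the survival curve as [(t, S(t))], stepping at event times."""
    data = sorted(zip(times, events))
    curve, surv, at_risk, i = [], Fraction(1), len(data), 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]   # events/censors at t
        deaths = sum(tied)
        if deaths:
            surv *= Fraction(at_risk - deaths, at_risk)
            curve.append((t, surv))
        at_risk -= len(tied)
        i += len(tied)
    return curve

def median_time(curve):
    """First time at which S(t) drops to 0.5 or below."""
    return next((t for t, s in curve if s <= Fraction(1, 2)), None)

times  = [1.0, 1.7, 1.7, 2.0, 3.5, 5.2, 6.4, 9.0]   # months (synthetic)
events = [1,   1,   1,   1,   0,   1,   1,   0]     # 0 = censored
print(median_time(kaplan_meier(times, events)))     # → 2.0
```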

  8. Kissing loop interaction in adenine riboswitch: insights from umbrella sampling simulations.

    PubMed

    Di Palma, Francesco; Bottaro, Sandro; Bussi, Giovanni

    2015-01-01

    Riboswitches are cis-acting regulatory RNA elements prevalently located in the leader sequences of bacterial mRNA. An adenine-sensing riboswitch cis-regulates the adenosine deaminase gene (add) in Vibrio vulnificus. The structural mechanism regulating its conformational changes upon ligand binding mostly remains to be elucidated. In this open framework it has been suggested that the ligand stabilizes the interaction of the distal "kissing loop" complex. Using accurate full-atom molecular dynamics with explicit solvent, in combination with enhanced sampling techniques and advanced analysis methods, it could be possible to provide a more detailed perspective on the formation of these tertiary contacts. In this work, we used umbrella sampling simulations to study the thermodynamics of the kissing loop complex in the presence and in the absence of the cognate ligand. We enforced the breaking/formation of the loop-loop interaction by restraining the distance between the two loops. We also assessed the convergence of the results by using two alternative initialization protocols. A structural analysis was performed using a novel approach to analyze base contacts. Contacts between the two loops were progressively lost when larger inter-loop distances were enforced. Inter-loop Watson-Crick contacts survived at larger separation when compared with non-canonical pairing and stacking interactions. Intra-loop stacking contacts remained formed upon loop undocking. Our simulations qualitatively indicated that the ligand could stabilize the kissing loop complex. We also compared our results with previously published simulation studies. The kissing complex stabilization given by the ligand was compatible with available experimental data. However, the dependence of its value on the initialization protocol of the umbrella sampling simulations posed some questions on the quantitative interpretation of the results and called for better converged enhanced sampling simulations.
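Umbrella sampling itself is easy to illustrate on a 1D toy. The double-well potential, restraint stiffness, and Metropolis move size below are all assumptions for illustration; the real study restrains the inter-loop distance of the RNA.

```python
import math, random

random.seed(3)

# 1D toy of umbrella sampling (not the RNA kissing-loop system): a stiff
# harmonic restraint pins the reaction coordinate near a window centre,
# so even the free-energy barrier at x = 0 is sampled well.

def potential(x):
    return (x * x - 1.0) ** 2            # double well, minima at x = ±1

def sample_window(centre, k=50.0, n_steps=20000, beta=1.0):
    """Metropolis Monte Carlo under U(x) + (k/2)(x - centre)^2."""
    def biased(x):
        return potential(x) + 0.5 * k * (x - centre) ** 2
    x, samples = centre, []
    for _ in range(n_steps):
        y = x + random.uniform(-0.2, 0.2)
        du = biased(y) - biased(x)
        if du <= 0 or random.random() < math.exp(-beta * du):
            x = y
        samples.append(x)
    return samples

xs = sample_window(0.0)                  # window at the barrier top
print(round(sum(xs) / len(xs), 2))       # stays near the window centre
```

Stitching many such windows together (e.g. with WHAM) would recover the full free-energy profile along the coordinate.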

  9. Adventures in Assessment

    ERIC Educational Resources Information Center

    Hawkey, Kate; Thorne, Sally; Arkinstall, Philip; Bryant, Matthew; Rawlings, David; Kennett, Richard; Fletcher, Adele

    2015-01-01

    In "Teaching History 157, Assessment Edition," a number of different teachers shared the ways in which their departments were approaching the assessment and reporting of students' progress in a "post-levels" world. This article adds to those examples, first by illustrating how teachers from different schools in the Bristol area…

  10. Recent developments in computer modeling add ecological realism to landscape genetics

    EPA Science Inventory

    Background / Question / Methods A factor limiting the rate of progress in landscape genetics has been the shortage of spatial models capable of linking life history attributes such as dispersal behavior to complex dynamic landscape features. The recent development of new models...

  11. Performance of Hippocampus Volumetry with FSL-FIRST for Prediction of Alzheimer's Disease Dementia in at Risk Subjects with Amnestic Mild Cognitive Impairment.

    PubMed

    Suppa, Per; Hampel, Harald; Kepp, Timo; Lange, Catharina; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph

    2016-01-01

    MRI-based hippocampus volume, a core feasible biomarker of Alzheimer's disease (AD), is not yet widely used in clinical patient care, partly due to lack of validation of software tools for hippocampal volumetry that are compatible with routine workflow. Here, we evaluate fully-automated and computationally efficient hippocampal volumetry with FSL-FIRST for prediction of AD dementia (ADD) in subjects with amnestic mild cognitive impairment (aMCI) from phase 1 of the Alzheimer's Disease Neuroimaging Initiative. Receiver operating characteristic analysis of FSL-FIRST hippocampal volume (corrected for head size and age) revealed an area under the curve of 0.79, 0.70, and 0.70 for prediction of aMCI-to-ADD conversion within 12, 24, or 36 months, respectively. Thus, FSL-FIRST provides about the same power for prediction of progression to ADD in aMCI as other volumetry methods.
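The area-under-the-curve figures quoted above can be computed directly in the Mann-Whitney form of the ROC AUC. The volumes below are invented for illustration, not ADNI measurements; conversion risk is scored by negated volume because smaller hippocampi predict conversion.

```python
# ROC AUC as the Mann-Whitney probability that a random converter
# scores higher than a random non-converter, counting ties as 1/2.

def auc(pos_scores, neg_scores):
    wins = sum((p > q) + 0.5 * (p == q)
               for p in pos_scores for q in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

converters = [2.1, 2.4, 2.6]   # corrected hippocampal volumes, synthetic
stable     = [2.5, 3.0, 3.2]
risk = lambda vols: [-v for v in vols]   # smaller volume = higher risk
print(round(auc(risk(converters), risk(stable)), 3))  # → 0.889
```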

  12. Simulation modeling for the health care manager.

    PubMed

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
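The Monte Carlo style of model the article describes can be sketched with a single-server clinic: exponential inter-arrival and service times (both parameter values hypothetical) and the Lindley recursion for patient waits.

```python
import random, statistics

random.seed(1)

# Monte Carlo sketch in the spirit of a spreadsheet add-in: one doctor,
# random arrivals and service durations, simulated patient waiting times.

def simulate_waits(n_patients, mean_gap=12.0, mean_service=10.0):
    """Wait (minutes) before each patient is seen by the single doctor."""
    waits, w = [], 0.0
    for _ in range(n_patients):
        waits.append(w)
        service = random.expovariate(1.0 / mean_service)
        gap = random.expovariate(1.0 / mean_gap)   # until next arrival
        w = max(0.0, w + service - gap)            # Lindley recursion
    return waits

waits = simulate_waits(20000)
print(round(statistics.mean(waits), 1), "min average wait")
```

For these rates, M/M/1 queueing theory predicts a mean wait of ρ/(μ−λ) = 50 minutes, so the simulated mean should land in that neighbourhood; the value of such models is that the distributions can then be swapped for empirical ones.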

  13. Bribes or Rewards.

    ERIC Educational Resources Information Center

    Megyeri, Kathy A.

    Small tangible rewards for student progress, such as candy bars, pens, or ribbons, add potency to the verbal and written praise offered by the teacher, thus increasing student motivation. Giving students small prizes enhances the cooperative atmosphere of learning, especially for those who do not normally do well. Research indicates that low…

  14. RESEARCH PROJECT A: MAPPING DISPARITIES IN BIRTH OUTCOMES

    EPA Science Inventory

    We plan to continue working on each of the areas described in the progress report's summary of accomplishments section. Achieving a better understanding of exposure to air toxins, as well as particulate ma...

  15. In-depth characterization of the microRNA transcriptome in a leukemia progression model

    PubMed Central

    Kuchenbauer, Florian; Morin, Ryan D.; Argiropoulos, Bob; Petriv, Oleh I.; Griffith, Malachi; Heuser, Michael; Yung, Eric; Piper, Jessica; Delaney, Allen; Prabhu, Anna-Liisa; Zhao, Yongjun; McDonald, Helen; Zeng, Thomas; Hirst, Martin; Hansen, Carl L.; Marra, Marco A.; Humphries, R. Keith

    2008-01-01

    MicroRNAs (miRNAs) have been shown to play important roles in physiological as well as multiple malignant processes, including acute myeloid leukemia (AML). In an effort to gain further insight into the role of miRNAs in AML, we have applied the Illumina massively parallel sequencing platform to carry out an in-depth analysis of the miRNA transcriptome in a murine leukemia progression model. This model simulates the stepwise conversion of a myeloid progenitor cell by an engineered overexpression of the nucleoporin 98 (NUP98)–homeobox HOXD13 fusion gene (ND13), to aggressive AML inducing cells upon transduction with the oncogenic collaborator Meis1. From this data set, we identified 307 miRNA/miRNA* species in the ND13 cells and 306 miRNA/miRNA* species in ND13+Meis1 cells, corresponding to 223 and 219 miRNA genes. Sequence counts varied between two and 136,558, indicating a remarkable expression range between the detected miRNA species. The large number of miRNAs expressed and the nature of differential expression suggest that leukemic progression as modeled here is dictated by the repertoire of shared, but differentially expressed miRNAs. Our finding of extensive sequence variations (isomiRs) for almost all miRNA and miRNA* species adds additional complexity to the miRNA transcriptome. A stringent target prediction analysis coupled with in vitro target validation revealed the potential for miRNA-mediated release of oncogenes that facilitates leukemic progression from the preleukemic to leukemia inducing state. Finally, 55 novel miRNAs species were identified in our data set, adding further complexity to the emerging world of small RNAs. PMID:18849523

  16. Management of Wood Products Manufacturing Using Simulation/Animation

    Treesearch

    D. Earl Kline; J.K. Wiedenbeck; Philip A. Araman

    1992-01-01

    Managers of hardwood processing facilities need timely information on which to base important decisions such as when to add costly equipment or how to improve profitability subject to time-varying demands. The overall purpose of this paper is to introduce a method that can effectively provide such timely information. A simulation/animation modeling procedure is...

  17. Slow and Steady: Ocean Circulation. The Influence of Sea Surface Height on Ocean Currents

    NASA Technical Reports Server (NTRS)

    Haekkinen, Sirpa

    2000-01-01

    The study of ocean circulation is vital to understanding how our climate works. The movement of the ocean is closely linked to the progression of atmospheric motion. Winds close to sea level add momentum to ocean surface currents. At the same time, heat that is stored and transported by the ocean warms the atmosphere above and alters air pressure distribution. Therefore, any attempt to model climate variation accurately must include reliable calculations of ocean circulation. Unlike movement of the atmosphere, movement of the ocean's waters takes place mostly near the surface. The major patterns of surface circulation form gigantic circular cells known as gyres. They are categorized according to their general location (equatorial, subtropical, subpolar, or polar) and may run across an entire ocean. The smaller-scale cell of ocean circulation is known as an eddy. Eddies are much more common than gyres and much more difficult to track in computer simulations of ocean currents.

  18. Stray light rejection in giant externally-occulted solar coronagraphs: experimental developments

    NASA Astrophysics Data System (ADS)

    Venet, M.; Bazin, C.; Koutchmy, S.; Lamy, P.

    2017-11-01

    The advent of giant, formation-flight, externally-occulted solar coronagraphs such as ASPIICS (Association de Satellites Pour l'Imagerie et l'Interférométrie de la Couronne Solaire [1,2,3,4]), selected by the European Space Agency (ESA) for its third PROBA (Project for On-Board Autonomy) formation-flying demonstration mission (presently in phase B), and Hi-RISE, proposed in the framework of the ESA Cosmic Vision program, presents formidable challenges for the study and calibration of instrumental stray light. With distances between the external occulter (EO) and the optical pupil (OP) exceeding a hundred meters and occulter sizes larger than a meter, it becomes impossible to perform tests at full scale. The requirement to limit the over-occultation to less than 1.05 Rsun, orders of magnitude tighter than what has been achieved in past coronagraphs, further adds to the challenge. We are approaching the problem experimentally using reduced-scale simulators and present below a progress report on our work.

  19. AN/SLQ-32 EW System Model: An Expandable, Object-Oriented, Process-Based Simulation

    DTIC Science & Technology

    1992-09-01

    CONST threshold = 0.1; timetol = 0.01; orientol = 5.8;
    VAR rec, recLast : BufferBeamRecType;
        time, power : REAL;
        powerl, orientation : REAL;
    BEGIN NEW...PulseGroup); rec := ASK BufferBeam Remove(); time := rec.time;
    orientation := rec.orientation; OUTPUT("ORIENREF", orientation); recLast := ASK BufferBeam Last...
    TO Add(rec); IF (rec = recLast) EXIT; END IF; rec := ASK BufferBeam Remove();
    ELSE ASK BufferBeam TO Add(rec); IF (rec = recLast) EXIT; END IF; rec

  20. Applications of aerospace technology

    NASA Technical Reports Server (NTRS)

    Rouse, D. J.; Brown, J. N., Jr.; Cleland, John; Lehrman, Stephen; Trachtman, Lawrence; Wallace, Robert; Winfield, Daniel; Court, Nancy; Maggin, Bernard; Barnett, Reed

    1987-01-01

    Highlights are presented for the Research Triangle Institute (RTI) Applications Team activities over the past quarter. Progress in fulfilling the requirements of the contract is summarized, along with the status of the eight add-on tasks. New problem statements are presented. Transfer activities for ongoing projects with the NASA Centers are included.

  1. Measurement Science and Training.

    ERIC Educational Resources Information Center

    Bunderson, C. Victor

    The need for training and retraining is a central element in current discussions about the economy of the United States. This paper is designed to introduce training practitioners to some new concepts about how measurement science can provide a new framework for assessing progress and can add new discipline to the development, implementation, and…

  2. It's No Secret: Progress Prized in Brownsville

    ERIC Educational Resources Information Center

    Zehr, Mary Ann

    2008-01-01

    This article features Brownsville Independent School District which was awarded the prestigious 2008 Broad Prize for Urban Education for being the nation's most improved urban school district. The Texas border district sees teacher training and data-based instruction as paths to learning gains--and the $1 million Broad award adds validation. In…

  3. Scientific Arguments as Learning Artifacts: Designing for Learning from the Web with KIE.

    ERIC Educational Resources Information Center

    Bell, Philip; Linn, Marcia C.

    2000-01-01

    Examines how students use evidence, determines when they add further ideas and claims, and measures progress in understanding light propagation. Uses the Scaffolded Knowledge Integration (SKI) instructional framework for design decisions. Discusses design studies that test and elaborate on the instructional framework. (Contains 33 references.)…

  4. Methodological Concerns about the Education Value-Added Assessment System

    ERIC Educational Resources Information Center

    Amrein-Beardsley, Audrey

    2008-01-01

    Value-added models help to evaluate the knowledge that school districts, schools, and teachers add to student learning as students progress through school. In this article, the well-known Education Value-Added Assessment System (EVAAS) is examined. The author presents a practical investigation of the methodological issues associated with the…

  5. Alabama's Education Report Card, 2011-2012

    ERIC Educational Resources Information Center

    Alabama State Department of Education, 2013

    2013-01-01

    Educational progress has been moving in the right direction for several years in Alabama. Now, with the implementation of Alabama's own Plan 2020, an even higher level of accountability for students, teachers, administrators, support systems, and schools/school systems, Alabama is poised to experience unprecedented growth. Add to that, the Alabama…

  6. Predicting reading outcomes with progress monitoring slopes among middle grade students

    PubMed Central

    Tolar, Tammy D.; Barth, Amy E.; Fletcher, Jack M.; Francis, David J.; Vaughn, Sharon

    2013-01-01

    Effective implementation of response-to-intervention (RTI) frameworks depends on efficient tools for monitoring progress. Evaluations of growth (i.e., slope) may be less efficient than evaluations of status at a single time point, especially if slopes do not add to predictions of outcomes over status. We examined progress monitoring slope validity for predicting reading outcomes among middle school students by evaluating latent growth models for different progress monitoring measure-outcome combinations. We used multi-group modeling to evaluate the effects of reading ability, reading intervention, and progress monitoring administration condition on slope validity. Slope validity was greatest when progress monitoring was aligned with the outcome (i.e., word reading fluency slope was used to predict fluency outcomes in contrast to comprehension outcomes), but effects varied across administration conditions (viz., repeated reading of familiar vs. novel passages). Unless the progress monitoring measure is highly aligned with outcome, slope may be an inefficient method for evaluating progress in an RTI context. PMID:24659899
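A progress-monitoring slope is simply an ordinary least-squares growth rate over assessment occasions, while status is the score at a single time point. The weekly scores below are invented for illustration.

```python
# Per-student progress-monitoring slope via OLS against occasion 0..n-1.

def pm_slope(scores):
    """OLS slope of scores regressed on occasion number."""
    n = len(scores)
    x_bar = (n - 1) / 2
    y_bar = sum(scores) / n
    num = sum((i - x_bar) * (y - y_bar) for i, y in enumerate(scores))
    den = sum((i - x_bar) ** 2 for i in range(n))
    return num / den

weekly_wcpm = [10, 12, 15, 15, 18]   # words correct per minute, synthetic
print(pm_slope(weekly_wcpm))         # → 1.9 (gain per week)
status = weekly_wcpm[-1]             # the single-time-point alternative
```

The study's question is whether slopes like this predict end-of-year outcomes over and above `status`, which is far cheaper to obtain.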

  7. A molecular simulation protocol to avoid sampling redundancy and discover new states.

    PubMed

    Bacci, Marco; Vitalis, Andreas; Caflisch, Amedeo

    2015-05-01

    For biomacromolecules or their assemblies, experimental knowledge is often restricted to specific states. Ambiguity pervades simulations of these complex systems because there is no prior knowledge of relevant phase space domains, and sampling recurrence is difficult to achieve. In molecular dynamics methods, ruggedness of the free energy surface exacerbates this problem by slowing down the unbiased exploration of phase space. Sampling is inefficient if dwell times in metastable states are large. We suggest a heuristic algorithm to terminate and reseed trajectories run in multiple copies in parallel. It uses a recent method to order snapshots, which provides notions of "interesting" and "unique" for individual simulations. We define criteria to guide the reseeding of runs from more "interesting" points if they sample overlapping regions of phase space. Using a pedagogical example and an α-helical peptide, the approach is demonstrated to amplify the rate of exploration of phase space and to discover metastable states not found by conventional sampling schemes. Evidence is provided that accurate kinetics and pathways can be extracted from the simulations. The method, termed PIGS for Progress Index Guided Sampling, proceeds in unsupervised fashion, is scalable, and benefits synergistically from larger numbers of replicas. Results confirm that the underlying ideas are appropriate and sufficient to enhance sampling. In molecular simulations, errors caused by not exploring relevant domains in phase space are always unquantifiable and can be arbitrarily large. Our protocol adds to the toolkit available to researchers in reducing these types of errors. This article is part of a Special Issue entitled "Recent developments of molecular dynamics". Copyright © 2014 Elsevier B.V. All rights reserved.
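A deliberately crude caricature of the terminate-and-reseed idea (this is not the PIGS progress index): replicas diffuse on a line, and any replica overlapping an earlier one is restarted from the most isolated position sampled in that epoch.

```python
import random

random.seed(7)

# Toy reseeding heuristic: "unique" is approximated by the point with the
# largest nearest-neighbour distance; overlapping replicas restart there.

def most_isolated(points):
    best, best_d = points[0], -1.0
    for i, p in enumerate(points):
        d = min(abs(p - q) for j, q in enumerate(points) if j != i)
        if d > best_d:
            best, best_d = p, d
    return best

def reseeded_walk(n_replicas=5, epochs=40, steps=25, overlap=0.5):
    pos = [0.0] * n_replicas
    for _ in range(epochs):
        # each replica takes `steps` diffusive steps in parallel
        pos = [p + sum(random.gauss(0, 0.2) for _ in range(steps))
               for p in pos]
        seed = most_isolated(pos)
        # terminate redundant replicas; reseed from the frontier point
        pos = [seed if any(abs(p - q) < overlap for q in pos[:i]) else p
               for i, p in enumerate(pos)]
    return pos

print(sorted(round(p, 2) for p in reseeded_walk()))
```

The real method replaces the nearest-neighbour distance with a progress-index-based notion of "interesting" and "unique" snapshots, but the steering logic is analogous.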

  8. Additional confirmation of the validity of laboratory simulation of cloud radiances

    NASA Technical Reports Server (NTRS)

    Davis, J. M.; Cox, S. K.

    1986-01-01

    The results of a laboratory experiment are presented that provide additional verification of the methodology adopted for simulation of the radiances reflected from fields of optically thick clouds using the Cloud Field Optical Simulator (CFOS) at Colorado State University. The comparison of these data with their theoretically derived counterparts indicates that the crucial mechanism of cloud-to-cloud radiance field interaction is accurately simulated in the CFOS experiments and adds confidence to the manner in which the optical depth is scaled.

  9. 18F PET with florbetaben for the early diagnosis of Alzheimer's disease dementia and other dementias in people with mild cognitive impairment (MCI).

    PubMed

    Martínez, Gabriel; Vernooij, Robin Wm; Fuentes Padilla, Paulina; Zamora, Javier; Flicker, Leon; Bonfill Cosp, Xavier

    2017-11-22

    18F-florbetaben uptake by brain tissue, measured by positron emission tomography (PET), is accepted by regulatory agencies like the Food and Drug Administration (FDA) and the European Medicines Agency (EMA) for assessing amyloid load in people with dementia. Its added value is mainly demonstrated by excluding Alzheimer's pathology in an established dementia diagnosis. However, the National Institute on Aging and Alzheimer's Association (NIA-AA) revised the diagnostic criteria for Alzheimer's disease, and confidence in the diagnosis of mild cognitive impairment (MCI) due to Alzheimer's disease may be increased when using some amyloid biomarker tests like 18F-florbetaben. These tests, added to the MCI core clinical criteria, might increase the diagnostic test accuracy (DTA) of a testing strategy. However, the DTA of 18F-florbetaben to predict the progression from MCI to Alzheimer's disease dementia (ADD) or other dementias has not yet been systematically evaluated. To determine the DTA of the 18F-florbetaben PET scan for detecting people with MCI at the time of performing the test who will clinically progress to ADD, other forms of dementia (non-ADD), or any form of dementia at follow-up. The most recent search for this review was performed in May 2017. We searched MEDLINE (OvidSP), Embase (OvidSP), PsycINFO (OvidSP), BIOSIS Citation Index (Thomson Reuters Web of Science), Web of Science Core Collection, including the Science Citation Index (Thomson Reuters Web of Science) and the Conference Proceedings Citation Index (Thomson Reuters Web of Science), LILACS (BIREME), CINAHL (EBSCOhost), ClinicalTrials.gov (https://clinicaltrials.gov), and the World Health Organization International Clinical Trials Registry Platform (WHO ICTRP) (http://www.who.int/ictrp/search/en/). We also searched ALOIS, the Cochrane Dementia & Cognitive Improvement Group's specialised register of dementia studies (http://www.medicine.ox.ac.uk/alois/). 
We checked the reference lists of any relevant studies and systematic reviews, and performed citation tracking using the Science Citation Index to identify any additional relevant studies. No language or date restrictions were applied to electronic searches. We included studies that had prospectively defined cohorts with any accepted definition of MCI at the time of performing the test and the use of the 18F-florbetaben scan to evaluate the DTA of the progression from MCI to ADD or other forms of dementia. In addition, we only selected studies that applied a reference standard for Alzheimer's dementia diagnosis, for example, the National Institute of Neurological and Communicative Disorders and Stroke and the Alzheimer's Disease and Related Disorders Association (NINCDS-ADRDA) or Diagnostic and Statistical Manual of Mental Disorders-IV (DSM-IV) criteria. We screened all titles and abstracts identified in electronic-database searches. Two review authors independently selected studies for inclusion and extracted data to create two-by-two tables, showing the binary test results cross-classified with the binary reference standard. We used these data to calculate sensitivities, specificities, and their 95% confidence intervals. Two independent assessors performed quality assessment using the QUADAS-2 tool plus some additional items to assess the methodological quality of the included studies. Progression from MCI to ADD, any other form of dementia, and any form of dementia was evaluated in one study (Ong 2015). It reported data on 45 participants at four years of follow-up; 21 participants met NINCDS-ADRDA criteria for Alzheimer's disease dementia at four years of follow-up, so the proportion converting to ADD was 47% of the 45 participants, and 11% of the 45 participants met criteria for other types of dementias (three cases of frontotemporal dementia (FTD), one of dementia with Lewy bodies (DLB), and one of progressive supranuclear palsy (PSP)). 
We considered the study to be at high risk of bias in the domains of the reference standard, flow, and timing (QUADAS-2). MCI to ADD; 18F-florbetaben PET scan analysed visually: the sensitivity was 100% (95% confidence interval (CI) 84% to 100%) and the specificity was 83% (95% CI 63% to 98%) (n = 45, 1 study). Analysed quantitatively: the sensitivity was 100% (95% CI 84% to 100%) and the specificity was 88% (95% CI 68% to 97%) for the diagnosis of ADD at follow-up (n = 45, 1 study). MCI to any other form of dementia (non-ADD); 18F-florbetaben PET scan analysed visually: the sensitivity was 0% (95% CI 0% to 52%) and the specificity was 38% (95% CI 23% to 54%) (n = 45, 1 study). Analysed quantitatively: the sensitivity was 0% (95% CI 0% to 52%) and the specificity was 40% (95% CI 25% to 57%) for the diagnosis of any other form of dementia at follow-up (n = 45, 1 study). MCI to any form of dementia; 18F-florbetaben PET scan analysed visually: the sensitivity was 81% (95% CI 61% to 93%) and the specificity was 79% (95% CI 54% to 94%) (n = 45, 1 study). Analysed quantitatively: the sensitivity was 81% (95% CI 61% to 93%) and the specificity was 84% (95% CI 60% to 97%) for the diagnosis of any form of dementia at follow-up (n = 45, 1 study). Although we were able to calculate an estimate of DTA, specifically for the prediction of progression from MCI to ADD at four years of follow-up, the small number of participants implies imprecision of the sensitivity and specificity estimates. We cannot make any recommendation regarding the routine use of 18F-florbetaben in clinical practice based on a single study with 45 participants. 18F-florbetaben has high financial costs; therefore, clearly demonstrating its DTA and standardising the process of the 18F-florbetaben modality are important prior to its wider use.
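The sensitivity and specificity figures above follow from a two-by-two table. The cell counts below are our reconstruction, consistent with the reported point estimates (21 converters, 24 non-converters), not figures quoted from the study; the Wilson score interval is one common choice for the 95% CI.

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for the proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# Hypothetical 2x2 for the visual read, MCI to ADD:
tp, fn = 21, 0    # converters to ADD: all test-positive
fp, tn = 4, 20    # non-converters: 4 false positives

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(round(sensitivity, 2), round(specificity, 2))        # → 1.0 0.83
print(tuple(round(v, 2) for v in wilson_ci(tp, tp + fn)))  # CI for sensitivity
```

With 21/21 converters detected, the Wilson lower bound comes out near 0.85, close to the review's reported 84% lower limit (the review may have used a different interval method).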

  10. Design and Evaluation of Wood Processing Facilities Using Object-Oriented Simulation

    Treesearch

    D. Earl Kline; Philip A. Araman

    1992-01-01

    Managers of hardwood processing facilities need timely information on which to base important decisions such as when to add costly equipment or how to improve profitability subject to time-varying demands. The overall purpose of this paper is to introduce a tool that can effectively provide such timely information. A simulation/animation modeling procedure is described...

  11. Burn severity mapping using simulation modeling and satellite imagery

    Treesearch

    Eva C. Karau; Robert E. Keane

    2010-01-01

    Although burn severity maps derived from satellite imagery provide a landscape view of fire impacts, fire effects simulation models can provide spatial fire severity estimates and add a biotic context in which to interpret severity. In this project, we evaluated two methods of mapping burn severity in the context of rapid post-fire assessment for four wildfires in...

  12. The Tides--A Neglected Topic.

    ERIC Educational Resources Information Center

    Hartel, Hermann

    2000-01-01

    Finds that computer simulations can be used to visualize the processes involved with lunar tides. Technology adds value, thus opening new paths for a more distinct analysis and increased learning results. (Author/CCM)

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romanov, Gennady; /Fermilab

CST Particle Studio combines electromagnetic field simulation, multi-particle tracking, adequate post-processing, and an advanced probabilistic emission model, which is the most important new capability in multipactor simulation. The emission model includes the stochastic properties of emission in the simulation and adds primary electron elastic and inelastic reflection from the surfaces. Simulations of multipactor in coaxial waveguides have been performed to study the effects of these innovations on the multipactor threshold and the range over which multipactor can occur. The results, compared with available previous experiments and simulations, as well as the technique of MP simulation with CST PS, are presented and discussed.

  14. Teaching and Learning Physics: Performance Art Evoking Insight

    ERIC Educational Resources Information Center

    Sommer, Wilfried

    2015-01-01

    Doing experiments in physics lessons can create a magical moment if students become really intrigued with the experimental progression. They add a new quality to what the experiment shows. Their attention and nature's revelations flow together: a performance is taking place. It's similar to a moment during a theatrical performance, when the…

  15. Computational simulation of progressive fracture in fiber composites

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    Computational methods for simulating and predicting progressive fracture in fiber composite structures are presented. These methods are integrated into a computer code of modular form. The modules include composite mechanics, finite element analysis, and fracture criteria. The code is used to computationally simulate progressive fracture in composite laminates with and without defects. The simulation tracks the fracture progression in terms of modes initiating fracture, damage growth, and imminent global (catastrophic) laminate fracture.

  16. Investigation of parabolic computational techniques for internal high-speed viscous flows

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Power, G. D.

    1985-01-01

    A feasibility study was conducted to assess the applicability of an existing parabolic analysis (ADD-Axisymmetric Diffuser Duct), developed previously for subsonic viscous internal flows, to mixed supersonic/subsonic flows with heat addition simulating a SCRAMJET combustor. A study was conducted with the ADD code modified to include additional convection effects in the normal momentum equation when supersonic expansion and compression waves were present. It is concluded from the present study that for the class of problems where strong viscous/inviscid interactions are present a global iteration procedure is required.

  17. A Hartree-Fock Application Using UPC++ and the New DArray Library

    DOE PAGES

    Ozog, David; Kamil, Amir; Zheng, Yili; ...

    2016-07-21

The Hartree-Fock (HF) method is the fundamental first step for incorporating quantum mechanics into many-electron simulations of atoms and molecules, and it is an important component of computational chemistry toolkits like NWChem. The GTFock code is an HF implementation that, while it does not have all the features in NWChem, represents crucial algorithmic advances that reduce communication and improve load balance by doing an up-front static partitioning of tasks, followed by work stealing whenever necessary. To enable innovations in algorithms and exploit next-generation exascale systems, it is crucial to support quantum chemistry codes using expressive and convenient programming models and runtime systems that are also efficient and scalable. This paper presents an HF implementation similar to GTFock using UPC++, a partitioned global address space model that includes flexible communication, asynchronous remote computation, and a powerful multidimensional array library. UPC++ offers runtime features that are useful for HF such as active messages, a rich calculus for array operations, hardware-supported fetch-and-add, and functions for ensuring asynchronous runtime progress. We present a new distributed array abstraction, DArray, that is convenient for the kinds of random-access array updates and linear algebra operations on block-distributed arrays with irregular data ownership. Finally, we analyze the performance of atomic fetch-and-add operations (relevant for load balancing) and runtime attentiveness, then compare various techniques and optimizations for each. Our optimized implementation of HF using UPC++ and the DArray library shows up to 20% improvement over GTFock with Global Arrays at scales up to 24,000 cores.
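
    The load-balancing role of fetch-and-add can be illustrated outside UPC++. The following sketch (a Python-threads analogy under stated assumptions, not GTFock or UPC++ code) uses a lock-protected fetch-and-add counter so that each worker atomically claims the next task index, the same pattern the hardware-supported operation accelerates:

```python
import threading

class FetchAndAdd:
    """Software analogue of hardware fetch-and-add: atomically return the old
    value and increment, so concurrent workers never claim the same task."""
    def __init__(self):
        self._value = 0
        self._lock = threading.Lock()

    def fetch_add(self, delta=1):
        with self._lock:
            old = self._value
            self._value += delta
            return old

def run_workers(num_tasks, num_workers):
    counter = FetchAndAdd()
    claimed = [None] * num_tasks          # records which worker processed each task

    def worker(wid):
        while True:
            i = counter.fetch_add()       # atomically claim the next task index
            if i >= num_tasks:
                return                    # no work left
            claimed[i] = wid              # "process" task i

    threads = [threading.Thread(target=worker, args=(w,)) for w in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return claimed

claimed = run_workers(num_tasks=100, num_workers=4)
```

    Because the counter hands out each index exactly once, load balances dynamically: fast workers simply claim more tasks.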

  19. Review of Flight Training Technology

    DTIC Science & Technology

    1976-07-01

the cockpit. They might be used to train pilots in procedures to cope with NOE-altitude emergencies; however, a combination of cinematic simulation...airplanes. Although cockpit motion adds realism, thereby improving pilot performance in the simulator (Feddersen...; Guercio and Wall...; Ince...)...operations. Light aircraft, part-task trainers, motion pictures and video tapes, cinematic simulators, and digital teaching machines are among the

  20. Seeking Global Minima

    NASA Astrophysics Data System (ADS)

    Tajuddin, Wan Ahmad

    1994-02-01

    Ease in finding the configuration at the global energy minimum in a symmetric neural network is important for combinatorial optimization problems. We carry out a comprehensive survey of available strategies for seeking global minima by comparing their performances in the binary representation problem. We recall our previous comparison of steepest descent with analog dynamics, genetic hill-climbing, simulated diffusion, simulated annealing, threshold accepting and simulated tunneling. To this, we add comparisons to other strategies including taboo search and one with field-ordered updating.
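
    Of the strategies surveyed, simulated annealing is the easiest to sketch. The toy example below is illustrative only (the energy function and cooling schedule are assumptions, not the paper's symmetric network): it anneals single-bit flips of a binary state under a geometric cooling schedule, accepting uphill moves with probability exp(-dE/T):

```python
import math
import random

def simulated_anneal(energy, state, steps=5000, t0=2.0, t_end=0.01, seed=0):
    """Toy simulated annealing over binary states: propose one bit flip per step,
    always accept downhill moves, accept uphill moves with prob exp(-dE/T)."""
    rng = random.Random(seed)
    cool = (t_end / t0) ** (1.0 / steps)   # geometric cooling factor
    temp = t0
    best, best_e = state[:], energy(state)
    e = best_e
    for _ in range(steps):
        i = rng.randrange(len(state))
        state[i] ^= 1                      # propose a single bit flip
        e_new = energy(state)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / temp):
            e = e_new                      # accept the move
            if e < best_e:
                best, best_e = state[:], e
        else:
            state[i] ^= 1                  # reject: undo the flip
        temp *= cool
    return best, best_e

# Toy problem: energy reaches its global minimum (0) when the bits encode 42.
target = 42
energy = lambda bits: abs(sum(b << k for k, b in enumerate(bits)) - target)
best, best_e = simulated_anneal(energy, [0] * 8)
```

    The same driver loop accommodates threshold accepting by replacing the probabilistic accept test with a fixed, shrinking acceptance threshold.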

  1. ARES Modeling of High-foot Implosions (NNSA Milestone #5466)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurricane, O. A.

    ARES “capsule only” simulations demonstrated results of applying an ASC code to a suite of high-foot ICF implosion experiments. While a capability to apply an asymmetric FDS drive to the capsule-only model using add-on Python routines exists, it was not exercised here. The ARES simulation results resemble the results from HYDRA simulations documented in A. Kritcher, et al., Phys. Plasmas, 23, 052709 (2016); namely, 1D simulation and data are in reasonable agreement for the lowest velocity experiments, but diverge from each other at higher velocities.

  2. Preliminary results for the design, fabrication, and performance of a backside-illuminated avalanche drift detector

    NASA Astrophysics Data System (ADS)

    Qiao, Yun; Liang, Kun; Chen, Wen-Fei; Han, De-Jun

    2013-10-01

The detection of low-level light is a key technology in various experimental scientific studies. As a photon detector, the silicon photomultiplier (SiPM) has gradually become an alternative to the photomultiplier tube (PMT) in many applications in high-energy physics, astroparticle physics, and medical imaging because of its high photon detection efficiency (PDE), good resolution for single-photon detection, insensitivity to magnetic fields, low operating voltage, compactness, and low cost. However, primarily because of the geometric fill factor, the PDE of most SiPMs is not very high; in particular, for those SiPMs with a high density of microcells, the effective area is small, and the bandwidth of the light response is narrow. As a building block of the SiPM, the concept of the backside-illuminated avalanche drift detector (ADD) was first proposed by the Max Planck Institute of Germany eight years ago; the ADD promises high PDE over the full energy range of optical photons, even ultraviolet light and X-ray light, and because the avalanche multiplication region is very small, the ADD is beneficial for the fabrication of large-area SiPMs. However, because of difficulties in design and fabrication, no significant progress had been made, and the concept had not yet been verified. In this paper, preliminary results in the design, fabrication, and performance of a backside-illuminated ADD are reported; the difficulties in and limitations to the backside-illuminated ADD are analyzed.

  3. Emotional Intelligence and Perceived Social Support among Italian High School Students

    ERIC Educational Resources Information Center

    Di Fabio, Annamaria; Kenny, Maureen E.

    2012-01-01

    Emotional intelligence (EI) has emerged in recent research as a teachable skill that is distinct from personality and is relevant to scholastic and work success and progress in career development. This study adds to that research by examining the relationship of performance and self-report measures of EI and personality traits with perceived…

  4. An Analysis of Multiple Configurations of Next-Generation Cathodes in a Low Power Hall Thruster

    DTIC Science & Technology

    2009-03-01

    compressor, the roughing pump , and the cryo-head temperature indicators. Figure 6. SPASS lab vacuum chamber and associated components. To measure...in progress to add additional cryo- pumps to the existing vacuum chamber that may allow higher propellant flow rates without exceeding ~1x10-5 torr... Vacuum Facility .........................................................................................................45 Test Assembly

  5. Read-Aloud and the English Language Learner

    ERIC Educational Resources Information Center

    Watson, Tanya Elaine

    2013-01-01

    Reading aloud (read-aloud) is quickly progressing as a useful strategy on the middle school level, yet research has not adequately caught up with is use with special populations such as middle school students and English language learners (ELLs). The purpose of this study was to add to the limited research on the read-aloud instructional strategy…

  6. Using Automated Scores of Student Essays to Support Teacher Guidance in Classroom Inquiry

    ERIC Educational Resources Information Center

    Gerard, Libby F.; Linn, Marcia C.

    2016-01-01

    Computer scoring of student written essays about an inquiry topic can be used to diagnose student progress both to alert teachers to struggling students and to generate automated guidance. We identify promising ways for teachers to add value to automated guidance to improve student learning. Three teachers from two schools and their 386 students…

  7. Sharing "Cat Games" and Cookies: Special Education Students Investigate Division

    ERIC Educational Resources Information Center

    Taber, Susan B.; Canonica, Michele

    2008-01-01

    Learning mathematics has traditionally been thought of as a sequential progression. Children learn to count to 10, then to 20, and then to 100. They learn to add without regrouping and then with regrouping. The authors teach addition before multiplication and the two-times table before the six-times table. They usually teach division as a separate…

  8. Multiple Counseling in Open and Closed Time-Extended Groups.

    ERIC Educational Resources Information Center

    Chambers, W. M.

    The open time-extended group, run by multiple counselors, adds a facilitating dimension to the counseling function--a dimension that exemplifies the concepts of self-growth and self-actualization by first providing the atmosphere for the client and then by allowing him to progress at his own rate and to a depth which he determines. An open group…

  9. draco: Analysis and simulation of drift scan radio data

    NASA Astrophysics Data System (ADS)

    Shaw, J. Richard

    2017-12-01

    draco analyzes transit radio data with the m-mode formalism. It is telescope agnostic, and is used as part of the analysis and simulation pipeline for the CHIME (Canadian Hydrogen Intensity Mapping Experiment) telescope. It can simulate time stream data from maps of the sky (using the m-mode formalism) and add gain fluctuations and correctly correlated instrumental noise (i.e. Wishart distributed). Further, it can perform various cuts on the data and make maps of the sky from data using the m-mode formalism.
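
    The "correctly correlated (Wishart distributed)" noise can be made concrete with a small sketch (illustrative only, not draco's API): a sum of outer products of correlated Gaussian draws is a Wishart sample, and it is symmetric positive semidefinite by construction. The 2x2 covariance and degrees of freedom below are arbitrary stand-ins:

```python
import random

def chol2(c):
    """Cholesky factor L (lower-triangular) of a 2x2 covariance c, with L L^T = c."""
    l00 = c[0][0] ** 0.5
    l10 = c[1][0] / l00
    l11 = (c[1][1] - l10 * l10) ** 0.5
    return [[l00, 0.0], [l10, l11]]

def wishart_sample(cov, dof, seed=0):
    """Draw W = sum over dof draws of x x^T with x ~ N(0, cov):
    a Wishart(cov, dof) sample, the distribution used to model noise covariances."""
    rng = random.Random(seed)
    l = chol2(cov)
    w = [[0.0, 0.0], [0.0, 0.0]]
    for _ in range(dof):
        z = [rng.gauss(0, 1), rng.gauss(0, 1)]
        x = [l[0][0] * z[0], l[1][0] * z[0] + l[1][1] * z[1]]  # correlated draw
        for i in range(2):
            for j in range(2):
                w[i][j] += x[i] * x[j]
    return w

cov = [[2.0, 0.6], [0.6, 1.0]]
w = wishart_sample(cov, dof=50)
```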

  10. In vitro simulation of the equine hindgut as a tool to study the influence of phytosterol consumption on the excretion of anabolic-androgenic steroids in horses.

    PubMed

    Decloedt, A I; Bailly-Chouriberry, L; Vanden Bussche, J; Garcia, P; Popot, M-A; Bonnaire, Y; Vanhaecke, L

    2015-08-01

Traditionally, steroids other than testosterone are considered to be synthetic, anabolic steroids. Nevertheless, in stallions, it has been shown that β-Bol can originate from naturally present testosterone. Other precursors, including phytosterols from feed, have been put forward to explain the prevalence of low levels of steroids (including β-Bol and ADD) in urine of mares and geldings. However, the possible biotransformation and identification of the precursors has thus far not been investigated in horses. To study the possible endogenous digestive transformation, in vitro simulations of the horse hindgut were set up, using fecal inocula obtained from eight different horses. The functionality of the in vitro model was confirmed by monitoring the formation of short-chain fatty acids and the consumption of amino acids and carbohydrates throughout the digestion process. In vitro digestion samples were analyzed with a validated UHPLC-MS/MS method. The addition of β-Bol gave rise to the formation of ADD (androsta-1,4-diene-3,17-dione) or αT. Upon addition of ADD to the in vitro digestions, the transformation of ADD to β-Bol was observed for all eight horses' inocula, in line with previously obtained in vivo results, again confirming the functionality of the in vitro model. The transformation ratio proved to be inoculum- and thus horse-dependent. The addition of pure phytosterols (50% β-sitosterol) or phytosterol-rich herbal supplements, on the other hand, did not induce the detection of β-Bol; only low concentrations of AED, a testosterone precursor, were found (0.1 ng/mL). As such, the digestive transformation of ADD could be linked to the detection of β-Bol, and the consumption of phytosterols to low concentrations of AED, but there is no direct link between phytosterols and β-Bol. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Equilibrium expert: an add-in to Microsoft Excel for multiple binding equilibrium simulations and parameter estimations.

    PubMed

    Raguin, Olivier; Gruaz-Guyon, Anne; Barbet, Jacques

    2002-11-01

    An add-in to Microsoft Excel was developed to simulate multiple binding equilibriums. A partition function, readily written even when the equilibrium is complex, describes the experimental system. It involves the concentrations of the different free molecular species and of the different complexes present in the experiment. As a result, the software is not restricted to a series of predefined experimental setups but can handle a large variety of problems involving up to nine independent molecular species. Binding parameters are estimated by nonlinear least-square fitting of experimental measurements as supplied by the user. The fitting process allows user-defined weighting of the experimental data. The flexibility of the software and the way it may be used to describe common experimental situations and to deal with usual problems such as tracer reactivity or nonspecific binding is demonstrated by a few examples. The software is available free of charge upon request.
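
    The partition-function approach can be illustrated on the simplest case such software covers, 1:1 binding P + L ⇌ PL. The sketch below is a generic textbook formula, not the add-in's code: it solves the mass-balance quadratic for the equilibrium complex concentration given total concentrations and a dissociation constant Kd (the example concentrations are arbitrary):

```python
def bound_complex(p_total, l_total, kd):
    """Equilibrium [PL] for P + L <=> PL with Kd = [P][L]/[PL].
    Mass balance gives PL^2 - (Pt + Lt + Kd) PL + Pt Lt = 0;
    the physically meaningful root is the smaller one."""
    s = p_total + l_total + kd
    return (s - (s * s - 4.0 * p_total * l_total) ** 0.5) / 2.0

# Example: 1 uM protein, 1 uM ligand, Kd = 0.1 uM (all concentrations in uM)
pl = bound_complex(1.0, 1.0, 0.1)
p_free = 1.0 - pl
l_free = 1.0 - pl
```

    Multi-site systems replace this closed form with a numerical solve of the partition function, but the mass-balance bookkeeping is the same.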

  12. A design multifunctional plasmonic optical device by micro ring system

    NASA Astrophysics Data System (ADS)

    Pornsuwancharoen, N.; Youplao, P.; Amiri, I. S.; Ali, J.; Yupapin, P.

    2018-03-01

A multi-function electronic device based on a plasmonic circuit is designed and simulated using a micro-ring system, in which a nonlinear micro-ring resonator is employed and selected electronic device functions such as rectifier, amplifier, regulator and filter are investigated. The system consists of a nonlinear micro-ring resonator, known as a modified add-drop filter, made of an InGaAsP/InP material. The stacked waveguide of InGaAsP/InP - graphene - gold/silver is formed as a part of the device; the required output signals are formed by the specific control of input signals via the input and add ports. The material and device aspects are reviewed. The simulation results are obtained using the Opti-wave and MATLAB software programs; all device parameters are based on the fabrication technology capability.

  13. USACDEC Experimentation Manual

    DTIC Science & Technology

    1981-10-01

Commander, Instrumentation Command (Prov), who is responsible for the cinematic form of the films. The writing requirements for discrete sections of the...level of simulated realism required. Higher levels of simulated realism will require higher degrees of control to ensure the test events occur as...experimentation, the "enemy" created to add realism. Aggressor forces may be represented by live troops in the field or by mechanical targets with or

  14. Simulated electronic heterodyne recording and processing of pulsed-laser holograms

    NASA Technical Reports Server (NTRS)

    Decker, A. J.

    1979-01-01

    The electronic recording of pulsed-laser holograms is proposed. The polarization sensitivity of each resolution element of the detector is controlled independently to add an arbitrary phase to the image waves. This method which can be used to simulate heterodyne recording and to process three-dimensional optical images, is based on a similar method for heterodyne recording and processing of continuous-wave holograms.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falk, H.; Herbert, J.T.; Edmonds, L.

Four cases of childhood hepatic angiosarcoma (HAS), representing the malignant form of infantile hemangioendothelioma, are described. The morphologic appearance of childhood HAS differs from the adult form in the following features: the associated presence of benign infantile hemangioendothelioma; the presence of dysontogenetic features; and an altered appearance of the angiosarcoma itself. It is postulated for these cases that the benign infantile hemangioendothelioma progressed to the malignant angiosarcoma. One of the four cases had exposure to elevated levels of arsenic in the environment that may have contributed to this progression. This latter case adds to published reports associating arsenic exposure with increased risk for hepatic angiosarcoma.

  16. Numerical investigation of internal high-speed viscous flows using a parabolic technique

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Power, G. D.

    1985-01-01

A feasibility study has been conducted to assess the applicability of an existing parabolic analysis (ADD-Axisymmetric Diffuser Duct), developed previously for subsonic viscous internal flows, to mixed supersonic/subsonic flows with heat addition simulating a SCRAMJET combustor. A study was conducted with the ADD code modified to include additional convection effects in the normal momentum equation when supersonic expansion and compression waves are present. A set of test problems with weak shock and expansion waves has been analyzed with this modified ADD method, and stable and accurate solutions were demonstrated provided the streamwise step size was maintained at levels larger than the boundary layer displacement thickness. Calculations made with further reductions in step size encountered departure solutions consistent with strong interaction theory. Calculations were also performed for a flow field with a flame front in which a specific heat release was imposed to simulate a SCRAMJET combustor. In this case the flame front generated relatively thick shear layers which aggravated the departure solution problem. Qualitatively correct results were obtained for these cases using a marching technique with the convective terms in the normal momentum equation suppressed. It is concluded from the present study that for the class of problems where strong viscous/inviscid interactions are present a global iteration procedure is required.

  17. Filling the Simulated Sandtrap

    NASA Image and Video Library

    2009-06-30

Rover team members Mike Seibert (left) and Paolo Bellutta add a barrowful of soil mixture to the sloped box where a test rover will be used for assessing possible maneuvers for NASA rover Spirit to use in escaping from a sandtrap on Mars.

  18. Depression prevention, labour force participation and income of older working aged Australians: A microsimulation economic analysis.

    PubMed

    Veerman, J Lennert; Shrestha, Rupendra N; Mihalopoulos, Cathrine; Passey, Megan E; Kelly, Simon J; Tanton, Robert; Callander, Emily J; Schofield, Deborah J

    2015-05-01

    Depression has economic consequences not only for the health system, but also for individuals and society. This study aims to quantify the potential economic impact of five-yearly screening for sub-syndromal depression in general practice among Australians aged 45-64 years, followed by a group-based psychological intervention to prevent progression to depression. We used an epidemiological simulation model to estimate reductions in prevalence of depression, and a microsimulation model, Health&WealthMOD2030, to estimate the impact on labour force participation, personal income, savings, taxation revenue and welfare expenditure. Group therapy is estimated to prevent around 5,200 prevalent cases of depression (2.2%) and add about 520 people to the labour force. Private incomes are projected to increase by $19 million per year, tax revenues by $2.4 million, and transfer payments are reduced by $2.6 million. Group-based psychological intervention to prevent depression could result in considerable economic benefits in addition to its clinical effects. © The Royal Australian and New Zealand College of Psychiatrists 2014.

  19. Guidelines to electrode positioning for human and animal electrical impedance myography research

    NASA Astrophysics Data System (ADS)

    Sanchez, Benjamin; Pacheck, Adam; Rutkove, Seward B.

    2016-09-01

The positioning of electrodes in electrical impedance myography (EIM) is critical for accurately assessing disease progression and effectiveness of treatment. In human and animal trials for neuromuscular disorders, inconsistent electrode positioning adds errors to the muscle impedance. Despite its importance, how the reproducibility of resistance and reactance, the two parameters that define EIM, is affected by changes in electrode positioning remains unknown. In this paper, we present a novel approach founded on biophysical principles to study the reproducibility of resistance and reactance under electrode misplacements. The analytical framework presented allows the user to quantify a priori the effect on the muscle resistance and reactance using only one parameter: the uncertainty in placing the electrodes. We also provide quantitative data on the precision needed to position the electrodes and the minimum muscle length needed to achieve a pre-specified EIM reproducibility. The results reported here are confirmed with finite element model simulations and measurements on five healthy subjects. Ultimately, our data can serve as normative values to enhance the reliability of EIM as a biomarker and facilitate comparability of future human and animal studies.

  20. Comparing self-guided learning and educator-guided learning formats for simulation-based clinical training.

    PubMed

    Brydges, Ryan; Carnahan, Heather; Rose, Don; Dubrowski, Adam

    2010-08-01

    In this paper, we tested the over-arching hypothesis that progressive self-guided learning offers equivalent learning benefit vs. proficiency-based training while limiting the need to set proficiency standards. We have shown that self-guided learning is enhanced when students learn on simulators that progressively increase in fidelity during practice. Proficiency-based training, a current gold-standard training approach, requires achievement of a criterion score before students advance to the next learning level. Baccalaureate nursing students (n = 15/group) practised intravenous catheterization using simulators that differed in fidelity (i.e. students' perceived realism). Data were collected in 2008. Proficiency-based students advanced from low- to mid- to high-fidelity after achieving a proficiency criterion at each level. Progressive students self-guided their progression from low- to mid- to high-fidelity. Yoked control students followed an experimenter-defined progressive practice schedule. Open-ended students moved freely between the simulators. One week after practice, blinded experts evaluated students' skill transfer on a standardized patient simulation. Group differences were examined using analyses of variance. Proficiency-based students scored highest on the high-fidelity post-test (effect size = 1.22). An interaction effect showed that the Progressive and Open-ended groups maintained their performance from post-test to transfer test, whereas the Proficiency-based and Yoked control groups experienced a significant decrease (P < 0.05). Surprisingly, most Open-ended students (73%) chose the progressive practice schedule. Progressive training and proficiency-based training resulted in equivalent transfer test performance, suggesting that progressive students effectively self-guided when to transition between simulators. 
Students' preference for the progressive practice schedule indicates that educators should consider this sequence for simulation-based training.

  1. Commentary: The Development of Hippocampal-Dependent Memory Functions: Theoretical Comments on Jabès and Nelson Review (2015)

    ERIC Educational Resources Information Center

    Bachevalier, Jocelyne

    2015-01-01

    Studies investigating the development of memory processes and their neural substrates have flourished over the past two decades. The review by Jabès and Nelson (2015) adds an important piece to our understanding of the maturation of different elements and circuits within the hippocampal system and their association with the progressive development…

  2. Certificate Structure Study: Do Stackable Certificates Really "Add" up to a Degree? Research Report 17-2

    ERIC Educational Resources Information Center

    Washington State Board for Community and Technical Colleges, 2017

    2017-01-01

    The purpose of this study is to answer key questions about the structure of certificates and their function in employability and degree attainment in the Washington State Community and Technical College (CTC) System. Specifically, this study addresses the following: (1) Do certificates play a role in helping students progress along career pathways…

  3. Background and Recent Progress in Anomalous Transport Simulation

    DTIC Science & Technology

    2017-07-19

Briefing Charts, 14 June 2017 - 19 July 2017. Background and Recent Progress in Anomalous Transport Simulation, 19 Jul 2017, Justin Koo, AFRL/RQRS, Edwards AFB, CA. DISTRIBUTION A: Approved for public release. ...Baalrud, S.D. and Chabert, P., “Theory for the anomalous electron transport in Hall effect thrusters. I. Insights from particle-in-cell simulations

  4. WestPro: a computer program for simulating uneven-aged Douglas-fir stand growth and yield in the Pacific Northwest.

    Treesearch

    Rebecca Ralston; Joseph Buongiorno; Benedict Schulte; Jeremy Fried

    2003-01-01

    WestPro is an add-in program designed to work with Microsoft Excel to simulate the growth of uneven-aged Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco) stands in the Pacific Northwest region of the United States. Given the initial stand state, defined as the number of softwood and hardwood trees per acre by diameter class, WestPro predicts the...

  5. Extending the Use and Effectiveness of the Monopoly® Board Game as an In-Class Economic Simulation in the Introductory Financial Accounting Course

    ERIC Educational Resources Information Center

    Shanklin, Stephen B.; Ehlen, Craig R.

    2017-01-01

    This paper extends the use of the Monopoly® board game as an economic simulation exercise designed to reinforce an understanding of how the accounting cycle impacts the financial statements used to evaluate management performance. This extension adds elements of debt not previously utilized to allow for an introduction of the fundamentals of ratio…

  6. Downscaling with a nested regional climate model in near-surface fields over the contiguous United States

    NASA Astrophysics Data System (ADS)

    Wang, Jiali; Kotamarthi, Veerabhadra R.

    2014-07-01

    The Weather Research and Forecasting (WRF) model is used for dynamic downscaling of 2.5-degree National Centers for Environmental Prediction-U.S. Department of Energy Reanalysis II (NCEP-R2) data for 1980-2010 at 12 km resolution over most of North America. The model's performance for surface air temperature and precipitation is evaluated by comparison with high-resolution observational data sets. The model's ability to add value is investigated by comparison with NCEP-R2 data and a 50 km regional climate simulation. The causes for major model bias are studied through additional sensitivity experiments with various model setup/integration approaches and physics representations. The WRF captures the main features of the spatial patterns and annual cycles of air temperature and precipitation over most of the contiguous United States. However, simulated air temperatures over the south central region and precipitation over the Great Plains and the Southwest have significant biases. Allowing longer spin-up time, reducing the nudging strength, or replacing the WRF Single-Moment six-class microphysics with Morrison microphysics reduces the bias over some subregions. However, replacing the Grell-Devenyi cumulus parameterization with Kain-Fritsch shows no improvement. The 12 km simulation does add value above the NCEP-R2 data and the 50 km simulation over mountainous and coastal zones.

  7. Stabilization of a progressive hemangioblastoma under treatment with thalidomide.

    PubMed

    Piribauer, Maria; Czech, Thomas; Dieckmann, Karin; Birner, Peter; Hainfellner, Johannes A; Prayer, Daniela; Fazeny-Dörner, Barbara; Weinländer, Georg; Marosi, Christine

    2004-02-01

    After the second recurrence of spinal seeding in a hemangioblastoma not associated with von Hippel-Lindau disease, we treated an adult female patient with thalidomide 200 mg orally per day at night for longer than 1 year. The patient reported subjective relief of symptoms after 1 month. Magnetic resonance imaging (MRI) controls 1, 6 and 11 months after the start of thalidomide treatment did not show further tumor progression. She remained wheelchair-bound, but the mobility of her arms continuously improved. There were no thalidomide-associated side effects in this patient until her death from pneumonia due to Legionnaires' disease. Antiangiogenic treatment with interferon (IFN) alpha-2a and IFN alpha-2b and with SU 5416 has been reported to be effective and well tolerated in several patients with previously progressive angioblastomas and hemangioblastomas. This case adds further evidence of the efficacy of an antiangiogenic treatment concept in a progressive hemangioblastoma.

  8. Effect of prescribed prism on monocular interpupillary distances and fitting heights for progressive add lenses.

    PubMed

    Brooks, C W; Riley, H D

    1994-06-01

    Success in fitting progressive addition lenses is dependent upon the accurate placement of the progressive zone. Both eyes must track simultaneously within the boundary of the progressive corridor. Vertical prism will displace the wearer's lines of sight and consequently eye position. Because fitting heights are measured using an empty frame, subjects with vertical phorias usually will fuse, and not show the vertical differences in pupil heights during the measuring process. Therefore, when prescriptions contain vertical prism one must consider the changes in measured fitting heights that will occur once the lenses are placed in the frame. Fitting heights must be altered approximately 0.3 mm for each vertical prism diopter prescribed. The fitting height adjustment is opposite from the base direction of the prescribed prism. An explanation of the effect of prescribed horizontal prism on monocular interpupillary distance (PD) measurements is also included.
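    The rule of thumb in the abstract (about 0.3 mm per prism diopter, applied opposite the base direction) can be written as a small helper (hypothetical code, not from the study):

```python
def fitting_height_adjustment(prism_diopters: float, base_direction: str) -> float:
    """Return the fitting-height change in mm for prescribed vertical prism.

    Rule of thumb from the abstract: ~0.3 mm per prism diopter, applied
    opposite to the base direction (base-up prism -> lower the fitting height).
    """
    per_diopter_mm = 0.3
    sign = -1.0 if base_direction == "up" else 1.0  # opposite the base
    return sign * per_diopter_mm * prism_diopters

# e.g. 2 prism diopters base-up -> lower the fitting height by 0.6 mm
print(fitting_height_adjustment(2.0, "up"))  # -0.6
```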

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pinilla, Maria Isabel

    This report seeks to study and benchmark code predictions against experimental data; determine parameters to match MCNP-simulated detector response functions to experimental stilbene measurements; add stilbene processing capabilities to DRiFT; and improve NEUANCE detector array modeling and analysis using new MCNP6 and DRiFT features.

  10. The optics of occupational progressive lenses.

    PubMed

    Sheedy, James E; Hardy, Raymond F

    2005-08-01

    Occupational progressive lenses (OPLs) utilize progressive power optics and are designed primarily to meet near and intermediate viewing needs such as working at a computer workstation for presbyopic patients. OPLs are fabricated to have the prescribed near power in the lower part of the lens and the power in the upper portion of the lens is determined by the amount of power "degression" (decrease in plus power) relative to the near power. Independent measurements of the optical characteristics of these lenses have not been reported previously. Manufacturers of 7 different OPL designs provided sample lenses for a patient with +2.50 D add that were measured with a Rotlex Class Plus lens analyzer (Rotlex Inc., Israel). Power measurements were normalized to the location specified by the manufacturer, and the vertical location of each lens was normalized to pupil center based on manufacturer fitting guidelines. Large optical differences exist among the OPL designs. The results show clear differences between the designs in terms of the add powers, their vertical location, and zone width. The size and location of the near, near-intermediate, far-intermediate, and far viewing zones were determined. The literature and clinical experience support that OPLs are successful at meeting the computer, general office, and other intermediate viewing distance needs of many patients. However, because of the large differences in the several OPL designs, patient success can likely be enhanced by selecting the design that best suits his or her viewing needs.

  11. Support for Simulation-Based Learning; The Effects of Model Progression and Assignments on Learning about Oscillatory Motion.

    ERIC Educational Resources Information Center

    Swaak, Janine; And Others

    In this study, learners worked with a simulation of harmonic oscillation. Two supportive measures were introduced: model progression and assignments. In model progression, the model underlying the simulation is not offered in its full complexity from the start, but variables are gradually introduced. Assignments are small exercises that help the…

  12. Sequence-dependent folding landscapes of adenine riboswitch aptamers.

    PubMed

    Lin, Jong-Chin; Hyeon, Changbong; Thirumalai, D

    2014-04-14

    Expression of a large fraction of genes in bacteria is controlled by riboswitches, which are found in the untranslated region of mRNA. Structurally, riboswitches have a conserved aptamer domain to which a metabolite binds, resulting in a conformational change in the downstream expression platform. Prediction of the functions of riboswitches requires a quantitative description of the folding landscape so that the barriers and time scales for the conformational change in the switching region of the aptamer can be estimated. Using a combination of all-atom molecular dynamics (MD) and coarse-grained model simulations, we studied the response of the adenine (A) binding add and pbuE A-riboswitches to mechanical force. The two riboswitches contain a structurally similar three-way junction formed by three paired helices, P1, P2, and P3, but carry out different functions. Using pulling simulations, with structures generated in MD simulations, we show that after P1 rips, the dominant unfolding pathway in the add A-riboswitch is the rupture of P2 followed by unraveling of P3. In the pbuE A-riboswitch, after P1 unfolds, P3 ruptures ahead of P2. The order of unfolding of the helices, which is in accord with single-molecule pulling experiments, is determined by the relative stabilities of the individual helices. Our results show that the stability of isolated helices determines the order of assembly and response to force in these non-coding regions. We use the simulated free energy profile for the pbuE A-riboswitch to estimate the time scale for allosteric switching, which shows that this riboswitch is under kinetic control, lending additional support to the conclusion based on single-molecule pulling experiments. A consequence of the stability hypothesis is that a single point mutation (U28C) in the P2 helix of the add A-riboswitch, which increases the stability of P2, would make the folding landscapes of the two riboswitches similar. This prediction can be tested in single-molecule pulling experiments.

  13. Investigation of the Vehicle Mobility in Fording

    DTIC Science & Technology

    2016-05-29

    Conference on Multibody System Dynamics, May 29 - June 2, 2016, Montréal, Canada. Arman Pazouki et al. The strategy outlined has been implemented in Chrono as a dedicated add-on called Chrono::FSI [3]. Figure 1 shows a vehicle model used in a fording simulation... rigid objects. Chrono::FSI has been used for vehicle mobility in fording operations, as shown in Figure 2. The computational time per simulation time...

  14. PySM: Python Sky Model

    NASA Astrophysics Data System (ADS)

    Thorne, Ben; Alonso, David; Naess, Sigurd; Dunkley, Jo

    2017-04-01

    PySM generates full-sky simulations of Galactic foregrounds in intensity and polarization relevant for CMB experiments. The components simulated are thermal dust, synchrotron, AME, free-free, and CMB at a given Nside, with options to integrate over a top-hat bandpass, to add white instrument noise, and to smooth with a given beam. PySM is based on the large-scale Galactic part of the Planck Sky Model code and uses some of its inputs.
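    The bandpass-integration and noise steps described above can be sketched generically (a NumPy stand-in for illustration; this is not the actual PySM interface, and the power-law component is invented):

```python
import numpy as np

def emission(f):
    """Toy power-law foreground amplitude map at frequency f (GHz)."""
    return np.ones(12) * (f / 30.0) ** -3  # 12-pixel stand-in for a sky map

def tophat_bandpass(emission_at_freq, freqs_ghz):
    """Average component emission over a top-hat bandpass (equal weights)."""
    maps = np.array([emission_at_freq(f) for f in freqs_ghz])
    return maps.mean(axis=0)

def add_white_noise(sky_map, sigma_per_pixel, rng):
    """Add uncorrelated (white) instrument noise to every pixel."""
    return sky_map + rng.normal(0.0, sigma_per_pixel, size=sky_map.shape)

rng = np.random.default_rng(0)
band_map = tophat_bandpass(emission, freqs_ghz=[95.0, 100.0, 105.0])
observed = add_white_noise(band_map, sigma_per_pixel=0.01, rng=rng)
```

    Beam smoothing, the remaining step, is a convolution of the map with the beam profile (in practice done in harmonic space on the sphere).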

  15. Advanced Technology for Portable Personal Visualization.

    DTIC Science & Technology

    1992-06-01

    Progress Report, January - June 1992. Sections cover interactive radiosity and virtual-environment ultrasound, and extensions to the system with support for textures, model partitioning, more complex radiosity emitters, and the replacement of model parts with objects from our model libraries. Planned work: add real-time, interactive radiosity to the display program on Pixel-Planes 5; move the real-time model mesh-generation to the...

  16. Will Moore's law be sufficient?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeBenedictis, Erik P.

    2004-07-01

    It seems well understood that supercomputer simulation is an enabler for scientific discoveries, weapons, and other activities of value to society. It also seems widely believed that Moore's Law will make progressively more powerful supercomputers over time and thus enable more of these contributions. This paper seeks to add detail to these arguments, revealing them to be generally correct but not a smooth and effortless progression. This paper will review some key problems that can be solved with supercomputer simulation, showing that more powerful supercomputers will be useful up to a very high yet finite limit of around 10^21 FLOPS (1 Zettaflops). The review will also show the basic nature of these extreme problems. This paper will review work by others showing that the theoretical maximum supercomputer power is very high indeed, but will explain how a straightforward extrapolation of Moore's Law will lead to technological maturity in a few decades. The power of a supercomputer at the maturity of Moore's Law will be very high by today's standards, at 10^16-10^19 FLOPS (100 Petaflops to 10 Exaflops), depending on architecture, but distinctly below the level required for the most ambitious applications. Having established that Moore's Law will not be the last word in supercomputing, this paper will explore the nearer-term issue of what a supercomputer will look like at the maturity of Moore's Law. Our approach will quantify the maximum performance permitted by the laws of physics for extension of current technology and then find a design that approaches this limit closely. We study a 'multi-architecture' for supercomputers that combines a microprocessor with other 'advanced' concepts and find it can reach the limits as well. This approach should be quite viable in the future because the microprocessor would provide compatibility with existing codes and programming styles while the 'advanced' features would provide a boost to the limits of performance.
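    The distance between the projected maturity level and the application limit can be illustrated with a quick doubling-time calculation (the 18-month doubling period and the starting point are illustrative assumptions, not figures from the paper):

```python
import math

def years_to_reach(start_flops, target_flops, doubling_months=18):
    """Years of Moore's-law doubling needed to grow from start to target."""
    doublings = math.log2(target_flops / start_flops)
    return doublings * doubling_months / 12.0

# From a 10^18 FLOPS (1 Exaflops) machine to the 10^21 FLOPS
# (1 Zettaflops) application limit: log2(1000) ~ 9.97 doublings.
print(round(years_to_reach(1e18, 1e21), 1))  # 14.9
```

    The point of the paper is that this extrapolation halts when the underlying technology matures, so the last few orders of magnitude cannot be reached by scaling alone.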

  17. Simulating 3D Spacecraft Constellations for Low Frequency Radio Imaging

    NASA Astrophysics Data System (ADS)

    Hegedus, A. M.; Amiri, N.; Lazio, J.; Belov, K.; Kasper, J. C.

    2016-12-01

    Constellations of small spacecraft could be used to realize a low-frequency phased array for either heliophysics or astrophysics observations. However, there are issues that arise with an orbiting array that do not occur on the ground, thus rendering much of the existing radio astronomy software inadequate for data analysis and simulation. In this work we address these issues and consider the performance of two constellation concepts. The first is a 32-spacecraft constellation for astrophysical observations, and the second is a 5-element concept for pointing to the location of radio emission from coronal mass ejections (CMEs). For the first, we fill the software gap by extending the APSYNSIM software to simulate the aperture synthesis for a radio interferometer in orbit. This involves using the dynamic baselines from the relative motion of the individual spacecraft as well as the capability to add galactic noise. The ability to simulate phase errors corresponding to positional uncertainty of the antennas was also added. The upgraded software was then used to model the imaging of a 32-spacecraft constellation that would orbit the Moon to image radio galaxies like Cygnus A at 0.3-30 MHz. Animated images show the improvement of the dirty image as the orbits progress, and RMSE plots show how well the dirty image matches the input image as a function of integration time. For the second concept we performed radio interferometric simulations of the Sun Radio Interferometer Space Experiment (SunRISE) using the Common Astronomy Software Applications (CASA) package. SunRISE is a five-spacecraft phased array that would orbit Earth to localize the low-frequency radio emission from CMEs. This involved simulating the array in CASA, creating truth images for the CMEs over the entire frequency band of SunRISE, and observing them with the simulated array to see how well it could localize the true position of the CME. The results of our analysis show that we can localize the radio emission originating from the head or flanks of the CMEs in spite of the phase errors introduced by uncertainties in orbit and clock estimation.
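    The RMSE diagnostic used for the dirty images is straightforward to compute; a minimal sketch with synthetic arrays standing in for the truth and dirty images:

```python
import numpy as np

def rmse(dirty_image, truth_image):
    """Root-mean-square error between a dirty image and the input truth."""
    return float(np.sqrt(np.mean((dirty_image - truth_image) ** 2)))

rng = np.random.default_rng(1)
truth = rng.random((64, 64))
# As integration time grows and uv coverage fills in, the dirty image
# should converge toward the truth, so the RMSE should fall.
snapshots = [truth + rng.normal(0, s, truth.shape) for s in (0.5, 0.2, 0.05)]
errors = [rmse(d, truth) for d in snapshots]
assert errors[0] > errors[1] > errors[2]
```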

  18. 10 CFR 611.101 - Application.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., including vehicle simulations using industry standard model (need to add name and location of this open.... All such information and data must include assumptions made in their preparation and the range of... any product (vehicle or component) to be produced by or through the project, including relevant data...

  19. Integrated Spatio-Temporal Ecological Modeling System

    DTIC Science & Technology

    1998-07-01

    ...models that we hold in our conscious (and subconscious) minds. Chapter 3 explores how this approach is being augmented with the more formal capture... This approach makes it possible to add new simulation model components to I-STEMS without having to reprogram existing components. The steps required...

  20. A third-order silicon racetrack add-drop filter with a moderate feature size

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Zhou, Xin; Chen, Qian; Shao, Yue; Chen, Xiangning; Huang, Qingzhong; Jiang, Wei

    2018-01-01

    In this work, we design and fabricate a highly compact third-order racetrack add-drop filter consisting of silicon waveguides with modified widths on a silicon-on-insulator (SOI) wafer. Compared to a previous approach that requires an exceedingly narrow coupling gap of less than 100 nm, we propose a new approach that enlarges the minimum feature size of the whole device to 300 nm to reduce the process requirements. The three-dimensional finite-difference time-domain (3D-FDTD) method is used for simulation, and experimental results show good agreement with the simulations. In the experiment, the filter shows a nearly box-like channel-dropping response, with a large flat 3-dB bandwidth (~3 nm), a relatively large FSR (~13.3 nm), and out-of-band rejection larger than 14 dB at the drop port, within a footprint of 0.0006 mm². The device is small and simple enough to have a wide range of applications in large-scale on-chip photonic integrated circuits.

  1. Progressive Fracture of Fiber Composite Build-Up Structures

    NASA Technical Reports Server (NTRS)

    Gotsis, Pascal K.; Chamis, C. C.; Minnetyan, Levon

    1997-01-01

    Damage progression and fracture of built-up composite structures are evaluated by using computational simulation. The objective is to examine the behavior and response of a stiffened composite (0/±45/90)s6 laminate panel by simulating the damage initiation, growth, accumulation, progression, and propagation to structural collapse. An integrated computer code, CODSTRAN, was augmented for the simulation of the progressive damage and fracture of built-up composite structures under mechanical loading. Results show that damage initiation and progression have a significant effect on the structural response. The influence of the type of loading on the damage initiation, propagation, and final fracture of the built-up composite panel is also investigated.

  2. Progressive Fracture of Fiber Composite Build-Up Structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Gotsis, Pascal K.; Chamis, C. C.

    1997-01-01

    Damage progression and fracture of built-up composite structures are evaluated by using computational simulation. The objective is to examine the behavior and response of a stiffened composite (0/±45/90)s6 laminate panel by simulating the damage initiation, growth, accumulation, progression, and propagation to structural collapse. An integrated computer code, CODSTRAN, was augmented for the simulation of the progressive damage and fracture of built-up composite structures under mechanical loading. Results show that damage initiation and progression have a significant effect on the structural response. The influence of the type of loading on the damage initiation, propagation, and final fracture of the built-up composite panel is also investigated.

  3. Downscaling with a nested regional climate model in near-surface fields over the contiguous United States: WRF dynamical downscaling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jiali; Kotamarthi, Veerabhadra R.

    The Weather Research and Forecasting (WRF) model is used for dynamic downscaling of 2.5-degree National Centers for Environmental Prediction-U.S. Department of Energy Reanalysis II (NCEP-R2) data for 1980-2010 at 12 km resolution over most of North America. The model's performance for surface air temperature and precipitation is evaluated by comparison with high-resolution observational data sets. The model's ability to add value is investigated by comparison with NCEP-R2 data and a 50 km regional climate simulation. The causes for major model bias are studied through additional sensitivity experiments with various model setup/integration approaches and physics representations. The WRF captures the main features of the spatial patterns and annual cycles of air temperature and precipitation over most of the contiguous United States. However, simulated air temperatures over the south central region and precipitation over the Great Plains and the Southwest have significant biases. Allowing longer spin-up time, reducing the nudging strength, or replacing the WRF Single-Moment 6-class microphysics with Morrison microphysics reduces the bias over some subregions. However, replacing the Grell-Devenyi cumulus parameterization with Kain-Fritsch shows no improvement. The 12 km simulation does add value above the NCEP-R2 data and the 50 km simulation over mountainous and coastal zones.

  4. Progressive compressive imager

    NASA Astrophysics Data System (ADS)

    Evladov, Sergei; Levi, Ofer; Stern, Adrian

    2012-06-01

    We have designed and built a working automatic progressive-sampling imaging system based on the vector sensor concept, which utilizes a unique sampling scheme of Radon projections. This sampling scheme makes it possible to progressively add information, resulting in a tradeoff between compression and the quality of reconstruction. The uniqueness of our sampling is that at any moment of the acquisition process the reconstruction can produce a reasonable version of the image. The advantage of the gradual addition of samples is seen when the sparsity rate of the object, and thus the number of needed measurements, is unknown. We have developed the iterative algorithm OSO (Ordered Sets Optimization), which employs our sampling scheme to create nearly uniformly distributed sets of samples, allowing the reconstruction of megapixel images. We demonstrate good-quality reconstruction from compressed data at ratios of 1:20.

  5. 10 CFR 611.101 - Application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., including vehicle simulations using industry standard model (need to add name and location of this open source model) to show projected fuel economy; (d) A detailed estimate of the total project costs together..., equity, and debt, and the liability of parties associated with the project; (f) Applicant's business plan...

  6. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Appendix B: ROBSIM programmer's guide

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.; Wolfe, W. J.; Nguyen, T.

    1986-01-01

    The purpose of the Robotic Simulation (ROBSIM) program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotic systems. ROBSIM is programmed in FORTRAN 77 and implemented on a VAX 11/750 computer using the VMS operating system. The programmer's guide describes the ROBSIM implementation and program logic flow, and the functions and structures of the different subroutines. With the manual and the in-code documentation, an experienced programmer can incorporate additional routines and modify existing ones to add desired capabilities.

  7. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, appendix B

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.

    1984-01-01

    The purpose of the Robotics Simulation (ROBSIM) program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotic systems. ROBSIM is programmed in FORTRAN 77 and implemented on a VAX 11/750 computer using the VMS operating system. This programmer's guide describes the ROBSIM implementation and program logic flow, and the functions and structures of the different subroutines. With this manual and the in-code documentation, an experienced programmer can incorporate additional routines and modify existing ones to add desired capabilities.

  8. Real-time failure control (SAFD)

    NASA Technical Reports Server (NTRS)

    Panossian, Hagop V.; Kemp, Victoria R.; Eckerling, Sherry J.

    1990-01-01

    The Real Time Failure Control program involves development of a failure detection algorithm, referred to as the System for Failure and Anomaly Detection (SAFD), for the Space Shuttle Main Engine (SSME). This failure detection approach is signal-based and entails monitoring SSME measurement signals against predetermined and computed mean values and standard deviations. Twenty-four engine measurements are included in the algorithm, and provisions are made to add more parameters if needed. Six major areas of research are presented: (1) SAFD algorithm development; (2) SAFD simulations; (3) Digital Transient Model failure simulation; (4) closed-loop simulation; (5) current SAFD limitations; and (6) planned enhancements.

  9. Phytoremediation of Atmospheric Methane

    DTIC Science & Technology

    2013-04-15

    Final report for DARPA project W911NF1010027, Phytoremediation of Atmospheric Methane. We have transformed a plant, Arabidopsis thaliana, with the...

  10. Marine Mammals and Low-Frequency Sound: Progress Since 1994

    DTIC Science & Technology

    2000-03-03

    Acoustical Society of America 100:2611. Abstract. Stafford, K.M., C.G. Fox, and D.S. Clark. 1998. Long-range acoustic detection and localization... adds further incentive for documenting the occurrence and types of vocalizations of blue whales in the Pacific Ocean (Stafford et al., 1998). ...K.M. Stafford, C.G. Fox, H.W. Braham, M.A. McDonald, and J. Thomason. 1999. Acoustic and visual detection of large...

  11. Math Scores Add Up for Hispanic Students: States and School Districts Notable for Recent Gains by Hispanic Students in Mathematics. Publication#2014-59

    ERIC Educational Resources Information Center

    Pane, Natalia E.

    2014-01-01

    This report shows significant gains in math achievement by Hispanic fourth- and eighth-graders across the nation--the equivalent of one grade level in the last ten years (2003-2013). Using data from the National Assessment of Educational Progress, Child Trends reviewed and compared fourth- and eighth-grade math scores in the nation, states, large…

  12. Dextromethorphan attenuated inflammation and combined opioid use in humans undergoing methadone maintenance treatment.

    PubMed

    Chen, Shiou-Lan; Lee, Sheng-Yu; Tao, Pao-Luh; Chang, Yun-Hsuan; Chen, Shih-Heng; Chu, Chun-Hsien; Chen, Po See; Lee, I Hui; Yeh, Tzung Lieh; Yang, Yen Kuang; Hong, Jau-Shyong; Lu, Ru-Band

    2012-12-01

    Recent studies show that proinflammatory cytokines might be related to the development of opioid dependence (physiological, psychological, or both). In a double-blind, randomly stratified clinical trial investigating whether add-on dextromethorphan (60-120 mg/day) attenuated inflammation and the combined use of opioids in heroin-dependent patients undergoing methadone maintenance treatment, we evaluated whether inflammation is related to the progression of opioid dependence. All participants (107 heroin-dependent patients and 84 nondependent healthy controls) were recruited from National Cheng Kung University Hospital. Their plasma cytokine levels were measured to evaluate the effect of add-on dextromethorphan. Plasma TNF-α and IL-8 levels were significantly higher in long-term heroin-dependent patients than in healthy controls (p < 0.001). Chronic heroin-use-induced TNF-α and IL-8 levels were significantly (p < 0.05) attenuated in patients treated for 12 weeks with add-on dextromethorphan. Moreover, both tolerance to methadone and the combined use of opioids were significantly (p < 0.05) attenuated in patients taking dextromethorphan. We conclude that dextromethorphan might be a feasible adjuvant therapeutic for attenuating inflammation and inhibiting methadone tolerance and combined opioid use in heroin-dependent patients.

  13. Safinamide: an add-on treatment for managing Parkinson’s disease

    PubMed Central

    Müller, Thomas

    2018-01-01

    Heterogeneous expression of neurotransmitter deficits results from the onset and progression of Parkinson’s disease. Intervals, characterized by the reappearance of motor and certain associated nonmotor symptoms, determine the end of good tolerability and efficacy of oral levodopa therapy. These “OFF” states result from levodopa pharmacokinetics and disease progression-related deterioration of the central buffering capacity for fluctuations of dopamine levels. This review discusses safinamide as an add-on therapeutic agent in orally levodopa-treated patients with “OFF” phenomena. Safinamide provided beneficial effects on “OFF” symptoms in pivotal trials with doses of 50 or 100 mg once daily. Safinamide reversibly inhibits monoamine oxidase B and decreases abnormal glutamate release by modulating potassium and sodium ion channels. An ideal candidate for combination with safinamide is opicapone. This inhibitor of peripheral catechol-O-methyltransferase supports continuous brain delivery of levodopa and, thus, the continuous dopaminergic stimulation concept. Both compounds, with their once-daily application and good tolerability, may complement each other by reducing the necessary oral levodopa intake and “OFF” times. Thus, a promising future option will be the combination of safinamide and opicapone in one formulation. It will reduce adherence issues and may complement levodopa treatment. It will probably cause less nausea and edema than a dopamine agonist/levodopa regimen. PMID:29670409

  14. Safinamide: an add-on treatment for managing Parkinson's disease.

    PubMed

    Müller, Thomas

    2018-01-01

    Heterogeneous expression of neurotransmitter deficits results from the onset and progression of Parkinson's disease. Intervals, characterized by the reappearance of motor and certain associated nonmotor symptoms, determine the end of good tolerability and efficacy of oral levodopa therapy. These "OFF" states result from levodopa pharmacokinetics and disease progression-related deterioration of the central buffering capacity for fluctuations of dopamine levels. This review discusses safinamide as an add-on therapeutic agent in orally levodopa-treated patients with "OFF" phenomena. Safinamide provided beneficial effects on "OFF" symptoms in pivotal trials with doses of 50 or 100 mg once daily. Safinamide reversibly inhibits monoamine oxidase B and decreases abnormal glutamate release by modulating potassium and sodium ion channels. An ideal candidate for combination with safinamide is opicapone. This inhibitor of peripheral catechol-O-methyltransferase supports continuous brain delivery of levodopa and, thus, the continuous dopaminergic stimulation concept. Both compounds, with their once-daily application and good tolerability, may complement each other by reducing the necessary oral levodopa intake and "OFF" times. Thus, a promising future option will be the combination of safinamide and opicapone in one formulation. It will reduce adherence issues and may complement levodopa treatment. It will probably cause less nausea and edema than a dopamine agonist/levodopa regimen.

  15. Progressive Fracture of Fiber Composite Builtup Structures

    NASA Technical Reports Server (NTRS)

    Gotsis, Pascal K.; Chamis, Christos C.; Minnetyan, Levon

    1996-01-01

    The damage progression and fracture of builtup composite structures were evaluated by using computational simulation to examine the behavior and response of a stiffened composite [0/±45/90]s6 laminate panel subjected to a bending load. The damage initiation, growth, accumulation, progression, and propagation to structural collapse were simulated. An integrated computer code (CODSTRAN) was augmented for the simulation of the progressive damage and fracture of builtup composite structures under mechanical loading. Results showed that damage initiation and progression have a significant effect on the structural response. Also investigated was the influence of different types of bending load on the damage initiation, propagation, and final fracture of the builtup composite panel.
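
    The kind of load-stepping damage loop that a code such as CODSTRAN automates can be illustrated with a deliberately minimal sketch. The equal-load-sharing rule, the single strength value per ply, and all numbers below are illustrative assumptions, not the CODSTRAN formulation:

    ```python
    def progressive_failure(ply_strengths, load_step=10.0):
        """ply_strengths: allowable stress for each ply. The load increases
        stepwise; surviving plies share the load equally, and a ply fails
        once its stress reaches its strength, until structural collapse."""
        surviving = list(ply_strengths)
        load = 0.0
        history = []                               # (load, plies failed) events
        while surviving:
            load += load_step
            ply_stress = load / len(surviving)     # equal load sharing (toy rule)
            failed = [s for s in surviving if ply_stress >= s]
            if failed:
                history.append((load, len(failed)))
                surviving = [s for s in surviving if ply_stress < s]
        return history

    # Damage initiates at the weakest ply, then progresses to collapse.
    events = progressive_failure([50.0, 60.0, 80.0, 100.0])
    ```

    As in the panel study, the response is unremarkable until damage initiation, after which failures cascade quickly as surviving plies pick up the shed load.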

  16. Inference of Transmission Network Structure from HIV Phylogenetic Trees

    DOE PAGES

    Giardina, Federica; Romero-Severson, Ethan Obie; Albert, Jan; ...

    2017-01-13

    Phylogenetic inference is an attractive means to reconstruct transmission histories and epidemics. However, there is not a perfect correspondence between transmission history and virus phylogeny. Both node height and topological differences may occur, depending on the interaction between within-host evolutionary dynamics and between-host transmission patterns. To investigate these interactions, we added a within-host evolutionary model in epidemiological simulations and examined if the resulting phylogeny could recover different types of contact networks. To further improve realism, we also introduced patient-specific differences in infectivity across disease stages, and on the epidemic level we considered incomplete sampling and the age of the epidemic. Second, we implemented an inference method based on approximate Bayesian computation (ABC) to discriminate among three well-studied network models and jointly estimate both network parameters and key epidemiological quantities such as the infection rate. Our ABC framework used both topological and distance-based tree statistics for comparison between simulated and observed trees. Overall, our simulations showed that a virus time-scaled phylogeny (genealogy) may be substantially different from the between-host transmission tree. This has important implications for the interpretation of what a phylogeny reveals about the underlying epidemic contact network. In particular, we found that while the within-host evolutionary process obscures the transmission tree, the diversification process and infectivity dynamics also add discriminatory power to differentiate between different types of contact networks. We also found that the possibility to differentiate contact networks depends on how far an epidemic has progressed, where distance-based tree statistics have more power early in an epidemic. Finally, we applied our ABC inference on two different outbreaks from the Swedish HIV-1 epidemic.
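
    The rejection-ABC model-choice step described above can be sketched compactly. Everything here is an illustrative stand-in: `simulate_tree_stats` replaces the study's epidemic-plus-within-host simulator, a single scalar replaces their topological and distance-based tree statistics, and the priors and tolerance are arbitrary:

    ```python
    import random

    def simulate_tree_stats(model, rate, rng):
        # Stand-in for simulating an epidemic on a contact network and
        # computing a summary statistic of the resulting virus genealogy.
        base = {"random": 1.0, "scale_free": 2.0, "small_world": 1.5}[model]
        return base * rate + rng.gauss(0.0, 0.1)

    def abc_reject(observed_stat, models, n_draws=2000, tol=0.2, seed=0):
        rng = random.Random(seed)
        accepted = []
        for _ in range(n_draws):
            model = rng.choice(models)           # prior over network models
            rate = rng.uniform(0.1, 2.0)         # prior over infection rate
            stat = simulate_tree_stats(model, rate, rng)
            if abs(stat - observed_stat) < tol:  # keep draws whose trees match
                accepted.append((model, rate))
        return accepted

    models = ["random", "scale_free", "small_world"]
    posterior = abc_reject(observed_stat=1.0, models=models)
    # Acceptance frequencies approximate posterior model probabilities.
    freq = {m: sum(1 for mm, _ in posterior if mm == m) / len(posterior)
            for m in models}
    ```

    Joint estimation falls out of the same accepted set: the `rate` values among the accepted draws approximate the posterior of the infection rate.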

  17. Inference of Transmission Network Structure from HIV Phylogenetic Trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giardina, Federica; Romero-Severson, Ethan Obie; Albert, Jan

    Phylogenetic inference is an attractive means to reconstruct transmission histories and epidemics. However, there is not a perfect correspondence between transmission history and virus phylogeny. Both node height and topological differences may occur, depending on the interaction between within-host evolutionary dynamics and between-host transmission patterns. To investigate these interactions, we added a within-host evolutionary model in epidemiological simulations and examined if the resulting phylogeny could recover different types of contact networks. To further improve realism, we also introduced patient-specific differences in infectivity across disease stages, and on the epidemic level we considered incomplete sampling and the age of the epidemic. Second, we implemented an inference method based on approximate Bayesian computation (ABC) to discriminate among three well-studied network models and jointly estimate both network parameters and key epidemiological quantities such as the infection rate. Our ABC framework used both topological and distance-based tree statistics for comparison between simulated and observed trees. Overall, our simulations showed that a virus time-scaled phylogeny (genealogy) may be substantially different from the between-host transmission tree. This has important implications for the interpretation of what a phylogeny reveals about the underlying epidemic contact network. In particular, we found that while the within-host evolutionary process obscures the transmission tree, the diversification process and infectivity dynamics also add discriminatory power to differentiate between different types of contact networks. We also found that the possibility to differentiate contact networks depends on how far an epidemic has progressed, where distance-based tree statistics have more power early in an epidemic. Finally, we applied our ABC inference on two different outbreaks from the Swedish HIV-1 epidemic.

  18. No evidence for an effect on brain atrophy rate of atorvastatin add-on to interferon β1b therapy in relapsing-remitting multiple sclerosis (the ARIANNA study).

    PubMed

    Lanzillo, Roberta; Quarantelli, Mario; Pozzilli, Carlo; Trojano, Maria; Amato, Maria Pia; Marrosu, Maria G; Francia, Ada; Florio, Ciro; Orefice, Giuseppe; Tedeschi, Gioacchino; Bellantonio, Paolo; Annunziata, Pasquale; Grimaldi, Luigi M; Comerci, Marco; Brunetti, Arturo; Bonavita, Vincenzo; Alfano, Bruno; Marini, Stefano; Brescia Morra, Vincenzo

    2016-08-01

    A previous phase 2 trial has suggested that statins might delay brain atrophy in secondary progressive multiple sclerosis. The objective of this study was to evaluate the effect of atorvastatin add-on therapy on cerebral atrophy in relapsing-remitting multiple sclerosis. This randomised, placebo-controlled study compared atorvastatin 40 mg or placebo add-on therapy to interferon β1b for 24 months. Brain magnetic resonance imaging, multiple sclerosis functional composite score, Rao neuropsychological battery and expanded disability status scale were evaluated over 24 months. A total of 154 patients were randomly assigned, 75 in the atorvastatin and 79 in the placebo arms, with a comparable drop-out rate (overall 23.4%). Brain atrophy over 2 years was not different in the two arms (-0.38% and -0.32% for the atorvastatin and placebo groups, respectively). Relapse rate, expanded disability status scale, multiple sclerosis functional composite score or cognitive changes were not different in the two arms. Patients withdrawing from the study had a higher number of relapses in the previous 2 years (P=0.04) and a greater probability of relapsing within 12 months. Our results suggest that the combination of atorvastatin and interferon β1b is not justified in early relapsing-remitting multiple sclerosis and adds to the body of evidence indicating an absence of significant radiological and clinical benefit of statins in relapsing-remitting multiple sclerosis. © The Author(s), 2015.

  19. Damage progression in Composite Structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    1996-01-01

    A computational simulation tool is used to evaluate the various stages of damage progression in composite materials during Iosipescu shear testing. Unidirectional composite specimens with either the major or minor material axis in the load direction are considered. Damage progression characteristics are described for each specimen using two types of boundary conditions. A procedure is outlined regarding the use of computational simulation in composites testing. Iosipescu shear testing using the V-notched beam specimen is a convenient method to measure both shear strength and shear stiffness simultaneously. The evaluation of composite test response can be made more productive and informative via computational simulation of progressive damage and fracture. Computational simulation performs a complete evaluation of laminated composite fracture via assessment of ply and subply level damage/fracture processes.

  20. Systems Epidemiology: What’s in a Name?

    PubMed Central

    Dammann, O.; Gray, P.; Gressens, P.; Wolkenhauer, O.; Leviton, A.

    2014-01-01

    Systems biology is an interdisciplinary effort to integrate molecular, cellular, tissue, organ, and organism levels of function into computational models that facilitate the identification of general principles. Systems medicine adds a disease focus. Systems epidemiology adds yet another level consisting of antecedents that might contribute to the disease process in populations. In etiologic and prevention research, systems-type thinking about multiple levels of causation will allow epidemiologists to identify contributors to disease at multiple levels as well as their interactions. In public health, systems epidemiology will contribute to the improvement of syndromic surveillance methods. We encourage the creation of computational simulation models that integrate information about disease etiology, pathogenetic data, and the expertise of investigators from different disciplines. PMID:25598870

  1. Dissemination and support of ARGUS for accelerator applications. Technical progress report, April 24, 1991--January 20, 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The ARGUS code is a three-dimensional code system for simulating interactions between charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling for release to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.

  2. Advanced secondary recovery demonstration for the Sooner Unit. Progress report, July 1--September 30, 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sippel, M.A.; Cammon, T.J.

    1995-09-30

    The objective of this project is to increase production from the Cretaceous "D" Sand in the Denver-Julesburg (D-J) Basin through geologically targeted infill drilling and improved reservoir management of waterflood operations. This project involves multi-disciplinary reservoir characterization using high-density 3-D seismic, detailed stratigraphy and reservoir simulation studies. Infill drilling, water-injection conversion and recompleting some wells to add short-radius laterals will be based on the results of the reservoir characterization studies. Production response will be evaluated using reservoir simulation and production tests. Technology transfer will utilize workshops, presentations and technical papers which will emphasize the economic advantages of implementing the demonstrated technologies. The success of this project and effective technology transfer should prompt re-appraisal of older waterflood projects and implementation of new projects in oil provinces such as the D-J Basin. Three wells have been drilled by the project based on 3-D seismic and the integrated reservoir characterization study. Oil production increased in September to 54.0 m^3/D (340 bopd) after the completion of the SU 21-16-9. Combination-attribute maps from 3-D seismic data closely predicted the net-pay thickness of the new well. Inter-well tracer tests with sodium bromide indicate a high-permeability channel between two wells. An oral presentation was made at the Rocky Mountain AAPG meeting in Reno, NV.

  3. Trade-off between competition and facilitation defines gap colonization in mountains

    PubMed Central

    Lembrechts, Jonas J.; Milbau, Ann; Nijs, Ivan

    2015-01-01

    Recent experimental observations show that gap colonization in small-stature (e.g. grassland and dwarf shrubs) vegetation strongly depends on the abiotic conditions within them. At the same time, within-gap variation in biotic interactions such as competition and facilitation, caused by distance to the gap edge, would affect colonizer performance, but a theoretical framework to explore such patterns is missing. Here, we model how competition, facilitation and environmental conditions together determine the small-scale patterns of gap colonization along a cold gradient in mountains, by simulating colonizer survival in gaps of various sizes. Our model adds another dimension to the known effects of biotic interactions along a stress gradient by focussing on the trade-off between competition and facilitation in the within-gap environment. We show that this trade-off defines a peak in colonizer survival at a specific distance from the gap edge, which progressively shifts closer to the edge as the environment gets colder, ultimately leaving a large fraction of gaps unsuitable for colonization in facilitation-dominated systems. This is reinforced when vegetation size and temperature amelioration are manipulated simultaneously with temperature in order to simulate an elevational gradient more realistically. Interestingly, all other conditions being equal, the magnitude of the realized survival peak was always lower in large than in small gaps, making large gaps harder to colonize. The model is relevant to predict effects of non-native plant invasions and climate warming on colonization processes in mountains. PMID:26558706
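
    A toy version of this competition-facilitation trade-off can be written down directly. The exponential decay scales, the stress parameterisation, and the survival formula below are illustrative assumptions, not the authors' model; the point is only that the trade-off produces a survival peak that moves toward the gap edge as cold stress increases:

    ```python
    import math

    def survival(d, cold_stress, comp_scale=0.5, fac_scale=1.5):
        """Colonizer survival at distance d from the gap edge (toy model)."""
        competition = math.exp(-d / comp_scale)   # neighbours compete, strongest at edge
        facilitation = math.exp(-d / fac_scale)   # neighbours shelter, strongest at edge
        baseline = 1.0 - cold_stress              # abiotic survival far from the edge
        return max(0.0, baseline + cold_stress * facilitation - competition)

    def peak_distance(cold_stress, step=0.01, max_d=5.0):
        # Grid search for the distance from the edge that maximizes survival.
        ds = [i * step for i in range(int(max_d / step) + 1)]
        return max(ds, key=lambda d: survival(d, cold_stress))

    warm_peak = peak_distance(cold_stress=0.2)   # benign conditions
    cold_peak = peak_distance(cold_stress=0.8)   # facilitation-dominated conditions
    ```

    In this sketch the cold-climate peak sits closer to the edge than the warm-climate one, mirroring the progressive shift toward the edge that the model predicts as the environment gets colder.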

  4. Distribution Feeder Modeling for Time-Series Simulation of Voltage Management Strategies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giraldez Miner, Julieta I; Gotseff, Peter; Nagarajan, Adarsh

    This paper presents techniques to create baseline distribution models using a utility feeder from Hawai'ian Electric Company. It describes the software-to-software conversion, steady-state, and time-series validations of a utility feeder model. It also presents a methodology to add secondary low-voltage circuit models to accurately capture the voltage at the customer meter level. This enables preparing models to perform studies that simulate how customer-sited resources integrate into legacy utility distribution system operations.
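
    Why the secondary circuit matters for meter-level voltage can be seen from a toy calculation. All impedances and loads below are assumed placeholder values, not the Hawai'ian Electric feeder data:

    ```python
    # Voltage at the customer meter after the drop across an assumed secondary
    # service conductor; the transformer and primary system are ignored here.
    v_transformer = 120.0 + 0j        # volts at the service transformer secondary
    z_service = 0.02 + 0.008j         # ohms, assumed service-drop impedance
    load_kw, load_kvar = 5.0, 1.5     # assumed customer load

    s_va = (load_kw + 1j * load_kvar) * 1000.0
    current = (s_va / v_transformer).conjugate()   # from S = V * conj(I)
    v_meter = v_transformer - current * z_service
    drop_percent = 100.0 * (abs(v_transformer) - abs(v_meter)) / abs(v_transformer)
    ```

    A model that stops at the primary would report the transformer-side voltage and miss this drop entirely, which is the motivation for adding explicit secondary low-voltage circuit models.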

  5. Progress in the Utilization of High-Fidelity Simulation in Basic Science Education

    ERIC Educational Resources Information Center

    Helyer, Richard; Dickens, Peter

    2016-01-01

    High-fidelity patient simulators are mainly used to teach clinical skills and remain underutilized in teaching basic sciences. This article summarizes our current views on the use of simulation in basic science education and identifies pitfalls and opportunities for progress.

  6. Towards physical implementation of an optical add-drop multiplexer (OADM) based upon properties of 12-fold photonic quasicrystals

    NASA Astrophysics Data System (ADS)

    Gauthier, Robert C.; Mnaymneh, Khaled

    2005-09-01

    The key feature that gives photonic crystals (PhCs) their ability to form photonic band gaps (PBGs), analogous to the electronic band gaps of semiconductors, is their translational symmetry. In recent years, however, it has been found that structures that possess only rotational symmetries can also have PBGs. In addition, these structures, known as photonic quasicrystals (PhQs), have other interesting qualities that set them apart from their translational cousins. One interesting feature is how defect states can be created in PhQs: if the rotational symmetry is disturbed, defect states analogous to those created in PhCs can be obtained. Simulation results for these defect states and other propagation properties of planar 12-fold photonic quasicrystal patterns, and their physical implementation in Silicon-On-Insulator (SOI), are presented. The main mechanisms required for any optical multiplexing system are propagation, stop bands, and add/drop ports. With the rotational symmetry of the PhQ producing the stop bands, line defects facilitating propagation, and these specially designed defect states acting as add/drop ports, a physical implementation of an OADM can be presented. Theoretical, practical and manufacturing benefits of PhQs are discussed. Simulated transmission plots are shown for various fill factors, dielectric contrasts and propagation directions. It is shown that low-index waveguides can be produced using the quasicrystal photonic crystal pattern. Fabrication steps and results are shown.

  7. Cultural Norms of Clinical Simulation in Undergraduate Nursing Education

    PubMed Central

    2015-01-01

    Simulated practice of clinical skills has occurred in skills laboratories for generations, and there is strong evidence to support high-fidelity clinical simulation as an effective tool for learning performance-based skills. What are less known are the processes within clinical simulation environments that facilitate the learning of socially bound and integrated components of nursing practice. Our purpose in this study was to ethnographically describe the situated learning within a simulation laboratory for baccalaureate nursing students within the western United States. We gathered and analyzed data from observations of simulation sessions as well as interviews with students and faculty to produce a rich contextualization of the relationships, beliefs, practices, environmental factors, and theoretical underpinnings encoded in cultural norms of the students’ situated practice within simulation. Our findings add to the evidence linking learning in simulation to the development of broad practice-based skills and clinical reasoning for undergraduate nursing students. PMID:28462300

  8. Coal desulfurization by a microwave process. Technical progress report, February 1981-May 1981

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zavitsanos, P.D.; Golden, J.A.; Bleiler, K.W.

    1981-01-01

    Desulfurization experiments were carried out using the 6 kW, 2450 MHz flow reactor system. The program has been directed toward combining physical separation and microwave exposure with NaOH to increase sulfur removal. The following treatment sequence has been used with good results: (1) expose 1/4 to 1 in. raw coal to microwaves; (2) crush the treated coal and separate the sample into float/sink fractions; (3) add NaOH to the float fraction and re-expose the sample to microwaves; and (4) wash, add NaOH and expose to microwaves. This procedure has produced up to 89% sulfur removal and as low as 0.31 lb S/10^6 Btu. Ash analyses on these samples showed as much as a 40% reduction. The calorific value was increased in almost all samples. Data on sulfur, ash and calorific values are summarized.

  9. Synchronization Of Parallel Discrete Event Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S.

    1992-01-01

    Adaptive, parallel, discrete-event-simulation-synchronization algorithm, Breathing Time Buckets, developed in Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. Algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while simulation in progress. Combines best of optimistic and conservative synchronization strategies while avoiding major disadvantages. Well suited for modeling communication networks, for large-scale war games, for simulated flights of aircraft, for simulations of computer equipment, for mathematical modeling, for interactive engineering simulations, and for depictions of flows of information.
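
    The core of a Breathing Time Buckets cycle, optimistic event processing up to an adaptively shrinking event horizon, can be sketched as follows. This toy version commits only the events before the horizon and omits rollback, message passing, and the rest of the SPEEDES machinery:

    ```python
    import heapq

    def process_cycle(pending):
        """pending: list of events (timestamp, new_events), where new_events is
        a list of events (same shape) scheduled by handling this event.
        Returns the committed timestamps and the sorted remaining timestamps."""
        heapq.heapify(pending)
        horizon = float("inf")   # earliest timestamp generated this cycle
        committed = []
        while pending and pending[0][0] < horizon:
            ts, new_events = heapq.heappop(pending)
            committed.append(ts)
            for ev in new_events:
                heapq.heappush(pending, ev)
                horizon = min(horizon, ev[0])   # the bucket "breathes" inward
        return committed, sorted(t for t, _ in pending)

    # The event at t=1 schedules one at t=5; the event at t=2 schedules one at
    # t=3, which closes the bucket before t=3 itself is reached.
    committed, remaining = process_cycle(
        [(1.0, [(5.0, [])]), (2.0, [(3.0, [])]), (6.0, [])])
    ```

    Because the horizon is recomputed from the events actually generated, the cycle length adapts to the simulation's own activity rather than a fixed global time step.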

  10. Vietnam plunges ahead

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, M.T.

    1995-07-01

    Vietnam is moving fast. Facing the need to double its installed power generation capacity by the year 2000, Vietnam is pursuing a range of development alternatives to add an estimated 3,000 MW of new power plants. As part of the country's progress toward a market economy, Vietnam has relaxed its rules regarding investment in power plants. The country enacted a new electricity law early in 1995, paving the way for private participation in the power sector.

  11. Indian Ocean Surface Circulations and Their Connection to Indian Ocean Dipole, Identified From Ocean Surface Currents Analysis Real Time (OSCAR) Data

    DTIC Science & Technology

    2008-06-01

    …summary of the monsoon system in the Indian Ocean. The top part indicates the wind cycle; the lower part shows the major currents that develop in… energy interests in the Indian Ocean's waters. The rapid economic progress in developing nations, such as India and South Africa, also adds up their…

  12. The Effects of Check-In, Check-Up, Check-Out for Students with Moderate Intellectual Disability during On- and Off-Site Vocational Training

    ERIC Educational Resources Information Center

    Boden, Lauren J.; Jolivette, Kristine; Alberto, Paul A.

    2018-01-01

    Check-in/check-out is a secondary-tier intervention within the positive behavior interventions and supports framework. Check-in/check-out pairs the use of an adult mentor with a daily progress report to help students meet individualized behavioral goals. This study adds to the research base by examining the effects of check-in, check-up, check-out…

  13. Hadley circulation extent and strength in a wide range of simulated climates

    NASA Astrophysics Data System (ADS)

    D'Agostino, Roberta; Adam, Ori; Lionello, Piero; Schneider, Tapio

    2017-04-01

    Understanding the Hadley circulation (HC) dynamics is crucial because its changes affect the seasonal migration of the ITCZ, the extent of subtropical arid regions and the strength of the monsoons. Despite decades of study, the factors controlling its strength and extent have remained unclear. Here we analyse how HC strength and extent change over a wide range of climate conditions from the Last Glacial Maximum to future projections. The large climate change between paleoclimate simulations and future scenarios offers the chance to analyse robust HC changes and their link to large-scale factors. The HC shrinks and strengthens in the coldest simulation relative to the warmest. A progressive poleward shift of its edges is evident as the climate warms (at a rate of 0.35° lat./K in each hemisphere). The HC extent and strength both depend on the isentropic slope, which in turn is related to the meridional temperature gradient, subtropical static stability and tropopause height. In a multiple robust regression analysis using these as predictors, we find that the tropical tropopause height does not add relevant information to the model beyond surface temperature. Therefore, primarily the static stability and secondarily the meridional temperature contrast together account for the bulk of the total HC variance. However, the regressions leave some of the variance of the northern HC edge and of the southern HC strength unexplained. The effectiveness of this analysis is limited by the correlation among the predictors and their relationship with mean temperature. In fact, for all simulations, tropical temperature explains the variations of the HC well, except for its Southern Hemisphere intensity. Hence, it can be used as the sole predictor to diagnose the HC response to greenhouse-induced global warming. How to account for the evolution of the southern HC strength remains unclear, because of the large inter-model spread in this quantity.
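
    The regression diagnostic described here is straightforward to reproduce on synthetic numbers. The coefficients and sample below are fabricated for illustration; only the procedure, ordinary least squares of HC extent on static stability and meridional temperature contrast, mirrors the kind of analysis in the abstract:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 24                                   # e.g. one value per simulation
    stability = rng.normal(0.0, 1.0, n)      # standardized static stability
    temp_contrast = rng.normal(0.0, 1.0, n)  # standardized meridional contrast
    noise = rng.normal(0.0, 0.1, n)
    # Synthetic "HC extent" with known coefficients to recover.
    hc_extent = 30.0 + 2.0 * stability - 1.0 * temp_contrast + noise

    # Ordinary least squares with an intercept column.
    X = np.column_stack([np.ones(n), stability, temp_contrast])
    coef, *_ = np.linalg.lstsq(X, hc_extent, rcond=None)
    predicted = X @ coef
    r2 = 1.0 - np.sum((hc_extent - predicted) ** 2) / \
               np.sum((hc_extent - hc_extent.mean()) ** 2)
    ```

    With correlated predictors, as the authors note, such regressions should be read cautiously: variance attributed to one predictor may belong partly to another.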

  14. Compensating additional optical power in the central zone of a multifocal contact lens for minimization of the shrinkage error of the shell mold in the injection molding process.

    PubMed

    Vu, Lien T; Chen, Chao-Chang A; Lee, Chia-Cheng; Yu, Chia-Wei

    2018-04-20

    This study aims to develop a compensating method to minimize the shrinkage error of the shell mold (SM) in the injection molding (IM) process to obtain uniform optical power in the central optical zone of soft, axially symmetric multifocal contact lenses (CL). The Z-shrinkage error along the Z (axial) axis of the anterior SM, corresponding to the anterior surface of a dry contact lens in the IM process, can be minimized by optimizing IM process parameters and then compensating with additional (Add) power in the central zone of the original lens design. First, the shrinkage error is minimized by optimizing three levels of four IM parameters, including mold temperature, injection velocity, packing pressure, and cooling time, in 18 IM simulations based on an orthogonal array L18 (2^1 × 3^4). Then, based on the Z-shrinkage error from the IM simulation, three new contact lens designs are obtained by increasing the Add power in the central zone of the original multifocal CL design to compensate for the optical power errors. Results obtained from the IM process simulations and the optical simulations show that the new CL design with a 0.1 D increase in Add power has the closest shrinkage profile to the original anterior SM profile, with a 55% reduction in absolute Z-shrinkage error and more uniform power in the central zone than in the other two cases. Moreover, actual IM experiments of SMs for casting soft multifocal CLs have been performed, and wet CLs have been produced for both the original design and the new design. Measurements of optical performance have verified the improvement of the compensated design, proving the feasibility of this compensating method. Results of this study can be further applied to predict or compensate for the total optical power errors of soft multifocal CLs.
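
    The main-effects screening behind such an orthogonal-array study can be sketched on a toy response surface. For brevity this sketch enumerates a full 3^3 factorial over three hypothetical parameters rather than the actual L18 (2^1 × 3^4) array, and the `shrinkage` function is an invented placeholder, not the molding simulation:

    ```python
    import itertools

    # Three IM parameters, each coded at levels 0, 1, 2 (illustrative only).
    levels = {"mold_temp": [0, 1, 2], "velocity": [0, 1, 2], "packing": [0, 1, 2]}

    def shrinkage(mold_temp, velocity, packing):
        # Hypothetical response: lower mold temperature and higher packing
        # pressure reduce Z-shrinkage in this toy model.
        return 1.0 + 0.3 * mold_temp - 0.2 * packing + 0.05 * velocity

    runs = [dict(zip(levels, combo)) for combo in itertools.product(*levels.values())]
    results = [(run, shrinkage(**run)) for run in runs]

    def main_effect(factor):
        # Mean response at each level of one factor, averaged over the others.
        return [sum(r for run, r in results if run[factor] == lv) /
                sum(1 for run, _ in results if run[factor] == lv)
                for lv in levels[factor]]

    # Pick, per factor, the level that minimizes mean shrinkage.
    best = {f: min(range(3), key=lambda lv: main_effect(f)[lv]) for f in levels}
    ```

    An orthogonal array achieves the same per-factor averages with far fewer runs (18 instead of the full factorial), which is the point of the L18 design in the study.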

  15. Teaching childbirth with high-fidelity simulation. Is it better observing the scenario during the briefing session?

    PubMed

    Cuerva, Marcos J; Piñel, Carlos S; Martin, Lourdes; Espinosa, Jose A; Corral, Octavio J; Mendoza, Nicolás

    2018-02-12

    The design of optimal courses for obstetric undergraduate teaching is a relevant question. This study evaluates two designs of a simulator-based learning activity on childbirth with regard to respect for the patient, obstetric manoeuvres, interpretation of cardiotocography (CTG) tracings and infection prevention. In this randomised experimental study, two groups of undergraduate students performed simulator-based learning activities on childbirth that differed only in the content of their briefing sessions. The first group's briefing included observation of a scenario properly performed by the teachers according to Spanish clinical practice guidelines on care in normal childbirth, whereas the second group observed the properly performed scenario only after the simulation. The group that observed the properly performed scenario after the simulation obtained worse grades during the simulation but better grades during the debriefing and evaluation. Simulator use in childbirth may therefore be more fruitful when medical students observe correct performance at the completion of the scenario rather than at its start. Impact statement What is already known on this subject? There is a scarcity of literature about the design of optimal high-fidelity simulation training in childbirth. Preparing simulator-based learning activities is a complex process; simulator-based learning includes the following steps: briefing, simulation, debriefing and evaluation. The most important part of high-fidelity simulation is the debriefing, and a good briefing and simulation are of high relevance for a fruitful debriefing session. What do the results of this study add? Our study describes a full simulator-based learning activity on childbirth that can be reproduced in similar facilities. The findings add that high-fidelity simulation training in childbirth is favoured by a short briefing session and an abrupt start to the scenario, rather than a long briefing session that includes direct instruction in the scenario. What are the implications of these findings for clinical practice and/or further research? The findings of this study reveal what to include in the briefing of simulator-based learning activities on childbirth, with implications for medical teaching and medical practice.

  16. Motion Simulator

    NASA Technical Reports Server (NTRS)

    1993-01-01

    MOOG, Inc. supplies hydraulic actuators for the Space Shuttle. When MOOG learned NASA was interested in electric actuators for possible future use, the company designed them with assistance from Marshall Space Flight Center. They also decided to pursue the system's commercial potential. This led to a partnership with InterActive Simulation, Inc. for production of cabin flight simulators for museums, expositions, etc. The resulting products, the Magic Motion Simulator 30 Series, are the first electric powered simulators. Movements are computer-guided, including free fall, to heighten the sense of moving through space. A projection system provides visual effects, and the 11 speakers of a digital laser-based sound system add to the realism. The electric actuators are easier to install and have lower operating costs, noise, heat and staff requirements. The U.S. Space & Rocket Center and several other organizations have purchased the simulators.

  17. Extended frequency turbofan model

    NASA Technical Reports Server (NTRS)

    Mason, J. R.; Park, J. W.; Jaekel, R. F.

    1980-01-01

    The fan model was developed using two dimensional modeling techniques to add dynamic radial coupling between the core stream and the bypass stream of the fan. When incorporated into a complete TF-30 engine simulation, the fan model greatly improved compression system frequency response to planar inlet pressure disturbances up to 100 Hz. The improved simulation also matched engine stability limits at 15 Hz, whereas the one dimensional fan model required twice the inlet pressure amplitude to stall the simulation. With verification of the two dimensional fan model, this program formulated a high frequency F-100(3) engine simulation using row by row compression system characteristics. In addition to the F-100(3) remote splitter fan, the program modified the model fan characteristics to simulate a proximate splitter version of the F-100(3) engine.

  18. Simulated characteristics of the DEGAS γ-detector array

    NASA Astrophysics Data System (ADS)

    Li, G. S.; Lizarazo, C.; Gerl, J.; Kojouharov, I.; Schaffner, H.; Górska, M.; Pietralla, N.; Saha, S.; Liu, M. L.; Wang, J. G.

    2018-05-01

The performance of the novel HPGe-Cluster array DEGAS, to be used at FAIR, has been studied through GEANT4 simulations using accurate geometries of most of the detector components. The simulation framework was validated by comparison with experimental data from various detector setups. The study showed that the DEGAS system could provide a clear improvement in photo-peak efficiency over the previous RISING array. In addition, the active BGO back-catcher could greatly enhance the background-suppression capability. The add-back analysis revealed that even at a γ multiplicity of six, sensitivity is improved by adding back the energy depositions of neighboring Ge crystals.
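The add-back step can be illustrated with a minimal sketch, assuming a toy crystal layout and invented energies (not the actual DEGAS geometry): hits in mutually neighbouring Ge crystals are clustered and their deposited energies summed into a single reconstructed photon.

```python
def add_back(hits, neighbours):
    """Group crystal hits into clusters of mutual neighbours and sum them.

    hits: dict {crystal_id: deposited_energy_keV}
    neighbours: dict {crystal_id: set of adjacent crystal ids}
    Returns a list of reconstructed photon energies.
    """
    remaining = set(hits)
    photons = []
    while remaining:
        # seed a cluster from the highest-energy remaining hit
        seed = max(remaining, key=lambda c: hits[c])
        cluster, frontier = {seed}, [seed]
        while frontier:
            c = frontier.pop()
            for n in neighbours.get(c, set()) & (remaining - cluster):
                cluster.add(n)
                frontier.append(n)
        photons.append(sum(hits[c] for c in cluster))
        remaining -= cluster
    return photons

# Two neighbouring crystals sharing one Compton-scattered photon,
# plus an isolated full-energy hit (all values invented):
hits = {0: 900.0, 1: 433.0, 5: 662.0}
neighbours = {0: {1}, 1: {0}, 5: set()}
print(sorted(add_back(hits, neighbours)))  # [662.0, 1333.0]
```

Here the 900 keV and 433 keV deposits in adjacent crystals are added back into a single 1333 keV photon, while the isolated 662 keV hit is left alone.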

  19. Cattle Uterus: A Novel Animal Laboratory Model for Advanced Hysteroscopic Surgery Training

    PubMed Central

    Ewies, Ayman A. A.; Khan, Zahid R.

    2015-01-01

    In recent years, due to reduced training opportunities, the major shift in surgical training is towards the use of simulation and animal laboratories. Despite the merits of Virtual Reality Simulators, they are far from representing the real challenges encountered in theatres. We introduce the “Cattle Uterus Model” in the hope that it will be adopted in training courses as a low cost and easy-to-set-up tool. It adds new dimensions to the advanced hysteroscopic surgery training experience by providing tactile sensation and simulating intraoperative difficulties. It complements conventional surgical training, aiming to maximise clinical exposure and minimise patients' harm. PMID:26265918

  20. Implementation of the U.S. Environmental Protection Agency's Waste Reduction (WAR) Algorithm in Cape-Open Based Process Simulators

    EPA Science Inventory

    The Sustainable Technology Division has recently completed an implementation of the U.S. EPA's Waste Reduction (WAR) Algorithm that can be directly accessed from a Cape-Open compliant process modeling environment. The WAR Algorithm add-in can be used in AmsterChem's COFE (Cape-Op...

  1. Enterprise Requirements and Acquisition Model (ERAM) Analysis and Extension

    DTIC Science & Technology

    2014-02-20

add them to the ERAM simulation. References: Arena, M. V., Obaid, Y., Galway, L. A., Fox, B., Graser, J. C., Sollinger, J. M., Wu, F., & Wong, C. (2006). Impossible certainty: Cost risk analysis for Air Force systems (MG-415

  2. Exploring the optimum step size for defocus curves.

    PubMed

    Wolffsohn, James S; Jinabhai, Amit N; Kingsnorth, Alec; Sheppard, Amy L; Naroo, Shehzad A; Shah, Sunil; Buckhurst, Phillip; Hall, Lee A; Young, Graeme

    2013-06-01

    To evaluate the effect of reducing the number of visual acuity measurements made in a defocus curve on the quality of data quantified. Midland Eye, Solihull, United Kingdom. Evaluation of a technique. Defocus curves were constructed by measuring visual acuity on a distance logMAR letter chart, randomizing the test letters between lens presentations. The lens powers evaluated ranged between +1.50 diopters (D) and -5.00 D in 0.50 D steps, which were also presented in a randomized order. Defocus curves were measured binocularly with the Tecnis diffractive, Rezoom refractive, Lentis rotationally asymmetric segmented (+3.00 D addition [add]), and Finevision trifocal multifocal intraocular lenses (IOLs) implanted bilaterally, and also for the diffractive IOL and refractive or rotationally asymmetric segmented (+3.00 D and +1.50 D adds) multifocal IOLs implanted contralaterally. Relative and absolute range of clear-focus metrics and area metrics were calculated for curves fitted using 0.50 D, 1.00 D, and 1.50 D steps and a near add-specific profile (ie, distance, half the near add, and the full near-add powers). A significant difference in simulated results was found in at least 1 of the relative or absolute range of clear-focus or area metrics for each of the multifocal designs examined when the defocus-curve step size was increased (P<.05). Faster methods of capturing defocus curves from multifocal IOL designs appear to distort the metric results and are therefore not valid. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2013 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
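The sensitivity of area metrics to defocus-curve step size can be sketched numerically; the bimodal acuity profile, the 0.3 logMAR clear-vision criterion, and all constants below are invented for illustration and are not the study's data.

```python
import numpy as np

def acuity(defocus_d):
    """Synthetic logMAR acuity vs. defocus for a bifocal-like profile (invented)."""
    return (0.6
            - 0.5 * np.exp(-(defocus_d ** 2) / 0.5)            # distance peak
            - 0.4 * np.exp(-((defocus_d + 3.0) ** 2) / 0.5))   # near-add peak

def area_metric(step_d):
    """Trapezoidal area where acuity is better than 0.3 logMAR,
    sampled across the +1.50 D to -5.00 D test range at the given step."""
    d = np.arange(1.5, -5.0 - 1e-9, -step_d)
    clear = np.clip(0.3 - acuity(d), 0.0, None)  # zero where acuity is worse
    return float(np.sum((clear[:-1] + clear[1:]) / 2.0) * step_d)

for step in (0.5, 1.0, 1.5):
    print(f"{step:.2f} D steps: area metric = {area_metric(step):.3f}")
```

Coarser sampling both misses the narrow acuity peaks and shifts the integration nodes, so the same underlying curve yields different metric values, which is the distortion the study reports.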

  3. Real-time visual simulation of APT system based on RTW and Vega

    NASA Astrophysics Data System (ADS)

    Xiong, Shuai; Fu, Chengyu; Tang, Tao

    2012-10-01

The Matlab/Simulink simulation model of an APT (acquisition, pointing, and tracking) system is analyzed and established. The model's C code, which can be used for real-time simulation, is then generated by RTW (Real-Time Workshop). Practical experiments show that running the C code yields the same simulation results as running the Simulink model directly in the Matlab environment. MultiGen-Vega is a real-time 3D scene simulation software system. With it and OpenGL, an APT scene simulation platform is developed and used to render and display virtual scenes of the APT system. To add necessary graphics effects to the virtual scenes in real time, GLSL (OpenGL Shading Language) shaders are used on the programmable GPU. By calling the C code, the scene simulation platform can adjust system parameters on-line and obtain the APT system's real-time simulation data to drive the scenes. Practical application shows that this visual simulation platform has high efficiency, low cost, and good simulation effect.

  4. Viewing Our Aged Selves: Age Progression Simulations Increase Young Adults' Aging Anxiety and Negative Stereotypes of Older Adults.

    PubMed

    Rittenour, Christine E; Cohen, Elizabeth L

    2016-04-01

This experiment tests the effect of an old-age progression simulation on young adults' (N = 139) reported aging anxiety and perceptions of older adults as a social group. College students were randomly assigned to one of three conditions: self-aged simulation, stranger-aged simulation, or a control group. Compared with the control group, groups exposed to an age progression experienced more negative affect, and individuals in the self-aged condition reported greater aging anxiety. In accordance with stereotype activation theorizing, the self-aged simulation group also perceived older adults as less competent and expressed more pity and less envy toward older adults. Compared with the stranger-aged group, participants who observed their own age progression were also more likely to deny the authenticity of their transformed image. These findings highlight potential negative social and psychological consequences of using age simulations to promote positive health outcomes, and they shed light on how virtual experiences can affect stereotyping of older adults. © The Author(s) 2016.

  5. Comprehensive study of unexpected microscope condensers formed in sample arrangements commonly used in optical microscopy.

    PubMed

    Desai, Darshan B; Aldawsari, Mabkhoot Mudith S; Alharbi, Bandar Mohammed H; Sen, Sanchari; Grave de Peralta, Luis

    2015-09-01

We show that various setups for optical microscopy commonly used in biomedical laboratories behave like efficient microscope condensers that are responsible for the observed subwavelength resolution. We present a series of experiments and simulations revealing how inclined illumination from such unexpected condensers occurs when the sample is perpendicularly illuminated by a microscope's built-in white-light source. In addition, we demonstrate an inexpensive add-on optical module that serves as an efficient and lightweight microscope condenser. Using such an add-on optical module in combination with a low-numerical-aperture objective lens and the Fourier plane imaging microscopy technique, we demonstrate detection of photonic crystals with a period nearly eight times smaller than the Rayleigh resolution limit.

  6. Ageing, health status and coverage rate effects on community prescription costs in Ireland.

    PubMed

    Kenneally, Martin; Lynch, Brenda

    2018-06-01

This paper aims to explore how GMS drug costs depend on age, gender, income, health status, and community drug scheme coverage rates, and whether they display significant differences across regions of Ireland. We also aim to find out whether the GMS drug costs of high- and low-income cohorts respond similarly to changes in their health status. The paper projects GMS drug costs in 2026 and examines the separate costs of population ageing and population growth over the period. We also simulate the estimated model to show how much giving free prescription drugs to all persons aged under 5 would add to 2026 GMS drug costs, and how much giving universal GMS coverage to all persons in 2026 would add. We construct a multivariate logistic regression model of GMS community drug costs in Ireland, extending the methodology of earlier studies by explicitly modelling how regional incomes and regional health status interact in determining GMS drug costs. An age-cohort and region breakdown of the simulated GMS drug costs, for both projected demographic trends and public policy measures that have been adopted or are under consideration, is also investigated. We find that GMS drug costs depend on age (but not gender), income, health status, and community drug scheme coverage rates, and that they are significantly lower for all age cohorts in Donegal and the North West region. The GMS drug costs of high-income cohorts tend to increase as their health status improves, whereas those of low-income cohorts tend to decrease as their health status improves. A uniform 1% gain in health status has little impact on total GMS prescribing costs. Similarly, if the health status of all Irish regions had improved to match that of the East region in 2010, public prescription costs would have fallen by only around €32 million of the €1.8 billion GMS drug bill.
We find that giving free prescription drugs to all persons aged under 5 in 2010 would have had only a minor impact on 2010 GMS drug costs, whereas giving universal GMS coverage to all persons would have doubled public prescription costs from €1.8 billion to roughly €3.6 billion.

  7. Intensity-modulated radiotherapy (IMRT) in pediatric low-grade glioma.

    PubMed

    Paulino, Arnold C; Mazloom, Ali; Terashima, Keita; Su, Jack; Adesina, Adekunle M; Okcu, M Faith; Teh, Bin S; Chintagumpala, Murali

    2013-07-15

    The objective of this study was to evaluate local control and patterns of failure in pediatric patients with low-grade glioma (LGG) who received treatment with intensity-modulated radiation therapy (IMRT). In total, 39 children received IMRT after incomplete resection or disease progression. Three methods of target delineation were used. The first was to delineate the gross tumor volume (GTV) and add a 1-cm margin to create the clinical target volume (CTV) (Method 1; n = 19). The second was to add a 0.5-cm margin around the GTV to create the CTV (Method 2; n = 6). The prescribed dose to the GTV was the same as dose to the CTV for both Methods 1 and 2 (median, 50.4 grays [Gy]). The final method was dose painting, in which a GTV was delineated with a second target volume (2TV) created by adding 1 cm to the GTV (Method 3; n = 14). Different doses were prescribed to the GTV (median, 50.4 Gy) and the 2TV (median, 41.4 Gy). The 8-year progression-free and overall survival rates were 78.2% and 93.7%, respectively. Seven failures occurred, all of which were local in the high-dose (≥95%) region of the IMRT field. On multivariate analysis, age ≤5 years at time of IMRT had a detrimental impact on progression-free survival. IMRT provided local control rates comparable to those provided by 2-dimensional and 3-dimensional radiotherapy. Margins ≥1 cm added to the GTV may not be necessary, because excellent local control was achieved by adding a 0.5-cm margin (Method 2) and by dose painting (Method 3). © 2013 American Cancer Society.
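The GTV-to-CTV expansion described above amounts to an isotropic dilation of a binary tumour mask; the brute-force sketch below, with its toy single-voxel GTV and hypothetical `expand_margin` helper, is illustrative only and not a clinical tool.

```python
import numpy as np

def expand_margin(mask, margin_mm, voxel_mm):
    """Return the mask dilated by an isotropic margin (brute-force distance check)."""
    tumour = np.argwhere(mask) * voxel_mm               # GTV voxel coordinates
    grid = np.argwhere(np.ones_like(mask)) * voxel_mm   # every voxel coordinate
    d2 = ((grid[:, None, :] - tumour[None, :, :]) ** 2).sum(axis=-1)
    return (d2.min(axis=1) <= margin_mm ** 2).reshape(mask.shape)

# Toy 2-D slice, 1 mm voxels, single-voxel GTV at the centre,
# expanded by a 5 mm margin (as in Method 2):
gtv = np.zeros((21, 21), dtype=bool)
gtv[10, 10] = True
ctv = expand_margin(gtv, margin_mm=5.0, voxel_mm=1.0)
print(ctv.sum())  # number of CTV voxels
```

In practice treatment-planning systems perform this dilation in 3-D on the delineated contours; the sketch only shows the geometric operation behind "add a 0.5-cm margin around the GTV".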

  8. Electroencephalogram and Alzheimer's Disease: Clinical and Research Approaches

    PubMed Central

    Tsolaki, Anthoula; Kazis, Dimitrios; Kompatsiaris, Ioannis; Kosmidou, Vasiliki; Tsolaki, Magda

    2014-01-01

    Alzheimer's disease (AD) is a neurodegenerative disorder that is characterized by cognitive deficits, problems in activities of daily living, and behavioral disturbances. Electroencephalogram (EEG) has been demonstrated as a reliable tool in dementia research and diagnosis. The application of EEG in AD has a wide range of interest. EEG contributes to the differential diagnosis and the prognosis of the disease progression. Additionally such recordings can add important information related to the drug effectiveness. This review is prepared to form a knowledge platform for the project entitled “Cognitive Signal Processing Lab,” which is in progress in Information Technology Institute in Thessaloniki. The team tried to focus on the main research fields of AD via EEG and recent published studies. PMID:24868482

  9. Upper crustal structure of Alabama from regional magnetic and gravity data: Using geology to interpret geophysics, and vice versa

    USGS Publications Warehouse

    Steltenpohl, Mark G.; Horton, J. Wright; Hatcher, Robert D.; Zietz, Isidore; Daniels, David L.; Higgins, Michael W.

    2013-01-01

    Aeromagnetic and gravity data sets obtained for Alabama (United States) have been digitally merged and filtered to enhance upper-crustal anomalies. Beneath the Appalachian Basin in northwestern Alabama, broad deep-crustal anomalies of the continental interior include the Grenville front and New York–Alabama lineament (dextral fault). Toward the east and south, high-angle discordance between the northeast-trending Appalachians and the east-west–trending wedge of overlapping Mesozoic and Cenozoic Gulf Coastal Plain sediments reveals how bedrock geophysical signatures progressively change with deeper burial. High-frequency magnetic anomalies in the Appalachian deformed domain (ADD) correspond to amphibolites and mylonites outlining terranes, while broader, lower-amplitude domains include Paleozoic intrusive bodies and Grenville basement gneiss. Fundamental ADD structures (e.g., the Alexander City, Towaliga, and Goat Rock–Bartletts Ferry faults) can be traced southward beneath the Gulf Coastal Plain to the suture with Gondwanan crust of the Suwannee terrane. Within the ADD, there is clear magnetic distinction between Laurentian crust and the strongly linear, high-frequency magnetic highs of peri-Gondwanan (Carolina-Uchee) arc terranes. The contact (Central Piedmont suture) corresponds to surface exposures of the Bartletts Ferry fault. ADD magnetic and gravity signatures are truncated by the east-west–trending Altamaha magnetic low associated with the Suwannee suture. Arcuate northeast-trending magnetic linears of the Suwannee terrane reflect internal structure and Mesozoic failed-rift trends. Geophysical data can be used to make inferences on surface and subsurface geology and vice versa, which has applicability anywhere that bedrock is exposed or concealed beneath essentially non-magnetic sedimentary cover.

  10. [Additional qualification in health economics--a pre-condition for ENT leadership positions?].

    PubMed

    Lehnerdt, G; Schöffski, O; Mattheis, S; Hoffmann, T K; Lang, S

    2013-11-01

Increasing medical-technical progress and dramatic demographic change are causing problems with regard to the rapid expansion of medical services, the allocation of resources, and a financing shortfall in the German public health system. This economization can also be perceived in ENT departments. After an internet search of the rapidly growing market for qualification measures in health economics, we conducted an anonymous survey of ENT senior doctors and the directors of the 34 German university departments to evaluate their attitudes towards, as well as their expectations of, such an add-on qualification. Since the German government finalized the health care reform in the year 2000, such qualification measures have developed rapidly: among others, 26 postgraduate, extra-occupational master's programs have been inaugurated. The anonymous survey was answered by 105 ENT doctors (63 senior doctors, 27 vice professors, and 15 directors). Of these, 63% considered such an add-on qualification to be mandatory. 41% of the colleagues were already qualified in that field, only 10 of them by means of a study program. 71 of 105 colleagues (68%) considered the add-on qualification to be advantageous for their future personal career. With regard to the designated contents of the study program, "Staff Management" was prioritized even over "Hospital Financing" and "Cost Accounting". Aspects of management and basic (health-)economic knowledge have become an integral part of the daily routine for "first-line management doctors", including those in (university) ENT departments. © Georg Thieme Verlag KG Stuttgart · New York.

  11. Developing a Theory-Based Simulation Educator Resource.

    PubMed

    Thomas, Christine M; Sievers, Lisa D; Kellgren, Molly; Manning, Sara J; Rojas, Deborah E; Gamblian, Vivian C

    2015-01-01

    The NLN Leadership Development Program for Simulation Educators 2014 faculty development group identified a lack of a common language/terminology to outline the progression of expertise of simulation educators. The group analyzed Benner's novice-to-expert model and applied its levels of experience to simulation educator growth. It established common operational categories of faculty development and used them to organize resources that support progression toward expertise. The resulting theory-based Simulator Educator Toolkit outlines levels of ability and provides quality resources to meet the diverse needs of simulation educators and team members.

  12. Spironolactone Add-on for Preventing or Slowing the Progression of Diabetic Nephropathy: A Meta-analysis.

    PubMed

    Hou, Jing; Xiong, Weiquan; Cao, Ling; Wen, Xiangqiong; Li, Ailing

    2015-09-01

    The aim of this meta-analysis was to evaluate the benefits and potential adverse effects of adding spironolactone to standard antidiabetic/renoprotective/antihypertensive (AD/RP/AHT) treatment in patients with diabetic nephropathy (DN). PubMed/MEDLINE and Web of Knowledge were searched for relevant randomized, controlled studies (RCTs) or quasi-RCTs of the effects of adding spironolactone to standard AD/RP/AHT treatment in patients with DN. Results were summarized with a random-effects model or a fixed-effects model. According to the outcomes measured (benefits and risks of adding spironolactone to standard AD/RP/AHT treatment), compared with controls, the addition of spironolactone significantly decreased end-of-treatment (EOT) 24-hour urinary albumin/protein excretion and significantly increased percentage reduction from baseline in urinary albumin/creatinine ratio (UACR), although it did not significantly affect EOT UACR. The addition of spironolactone further led to a significantly greater reduction from baseline in glomerular filtration rate (GFR)/estimated (e) GFR, although it did not significantly affect EOT GFR/eGFR. Further, the addition of spironolactone significantly reduced EOT in-office, 24-hour, and daytime systolic and diastolic blood pressure (SBP and DBP, respectively) and led to significantly greater reductions from baseline in in-office SBP and DBP, although it did not significantly affect nighttime SBP or DBP. Finally, the addition of spironolactone significantly increased mean serum/plasma potassium levels and the risk for hyperkalemia. Spironolactone could be added to preexisting AD/RP/AHT therapy in patients with DN to prevent or slow DN progression by reducing proteinuria. The addition of spironolactone would likely provide even more beneficial effect in patients with DN and hypertension due to the BP reduction associated with spironolactone use. 
However, the beneficial effects of spironolactone add-on should be weighed against its potential risks, especially hyperkalemia. The long-term effects of spironolactone add-on on renal outcomes and mortality need to be studied. Copyright © 2015. Published by Elsevier Inc.
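The fixed-effects and random-effects pooling used in such meta-analyses can be sketched with standard inverse-variance weighting and the DerSimonian-Laird between-study variance; the three trial effects and variances below are invented for illustration, not data from this meta-analysis.

```python
def pool(effects, variances):
    """Inverse-variance fixed-effect and DerSimonian-Laird random-effects
    pooled estimates of a set of study effect sizes."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic and the DL between-study variance tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    wr = [1.0 / (v + tau2) for v in variances]
    random = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    return fixed, random

# Three invented trials reporting a mean difference and its variance:
fixed, random_ = pool([-0.30, -0.10, -0.45], [0.01, 0.02, 0.04])
print(f"fixed-effect: {fixed:.3f}, random-effects: {random_:.3f}")
```

The random-effects weights shrink toward equality as between-study heterogeneity (tau^2) grows, which is why the abstract distinguishes between the two models.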

  13. The environmental variables that impact human decomposition in terrestrially exposed contexts within Canada.

    PubMed

    Cockle, Diane Lyn; Bell, Lynne S

    2017-03-01

Little is known about the nature and trajectory of human decomposition in Canada. This study involved the examination of 96 retrospective police death investigation cases selected using the Canadian ViCLAS (Violent Crime Linkage Analysis System) and sudden-death police databases. A classification system was designed and applied based on the latest visible stages of autolysis (stages 1-2), putrefaction (3-5), and skeletonisation (6-8) observed. The analysis of the progression of decomposition using time (post-mortem interval (PMI), in days) and accumulated degree-days (ADD) of temperature found considerable variability during the putrefaction and skeletonisation phases, with poor predictability noted after stage 5 (post bloat). The visible progression of decomposition outdoors was characterized by a brown-to-black discolouration at stage 5 and remnant desiccated black tissue at stage 7. No bodies were totally skeletonised in under one year. Mummification of tissue was rare, with earlier onset in winter than in summer, likely due to lower seasonal humidity. Neither ADD nor PMI was a significant dependent variable for the decomposition score, with correlations of 53% for temperature and 41% for time. It took almost twice as much time and 1.5 times more accumulated temperature (ADD) for the set of cases exposed to cold and freezing temperatures (4°C or less) to reach putrefaction compared with the warm group. The amount of precipitation and/or clothing had a negligible impact on the advancement of decomposition, whereas the lack of sun exposure (full shade) had a small positive effect. This study found that the poor predictability of the onset and duration of late-stage decomposition, combined with our limited understanding of the full range of variables that influence the speed of decomposition, makes PMI estimations for exposed terrestrial cases in Canada unreliable, and also calls into question PMI estimations elsewhere.
Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.
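Accumulated degree-days as used in such decomposition studies can be sketched in a few lines; a 0 °C base temperature is a common convention, and the temperature series below is invented.

```python
def accumulated_degree_days(daily_mean_temps_c, base_c=0.0):
    """ADD: running total of daily mean temperature above the base,
    so days at or below the base (e.g., freezing days) contribute nothing."""
    return sum(max(0.0, t - base_c) for t in daily_mean_temps_c)

# A freezing week followed by a mild week (invented values):
temps = [-6.0, -2.0, 0.0, 1.5, 4.0, 8.0, 10.5, 12.0, 11.0, 9.5]
print(accumulated_degree_days(temps))  # 56.5
```

This is why cold-exposed cases accumulate ADD much more slowly than warm-exposed ones: sub-zero days add elapsed PMI time without adding any thermal input.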

  14. MRI signal and texture features for the prediction of MCI to Alzheimer's disease progression

    NASA Astrophysics Data System (ADS)

    Martínez-Torteya, Antonio; Rodríguez-Rojas, Juan; Celaya-Padilla, José M.; Galván-Tejada, Jorge I.; Treviño, Victor; Tamez-Peña, José G.

    2014-03-01

An early diagnosis of Alzheimer's disease (AD) confers many benefits. Several biomarkers from different information modalities have been proposed for the prediction of MCI-to-AD progression, among which features extracted from MRI have played an important role. However, studies have focused almost exclusively on the morphological characteristics of the images. This study aims to determine whether features relating to the signal and texture of the image could add predictive power. Baseline clinical, biological, and PET information, and MP-RAGE images for 62 subjects from the Alzheimer's Disease Neuroimaging Initiative were used in this study. Images were divided into 83 regions, and 50 features were extracted from each of these. A multimodal database was constructed, and a feature selection algorithm was used to obtain an accurate and small logistic regression model, which achieved a cross-validation accuracy of 0.96. This model included six features, five of them obtained from the MP-RAGE image and one obtained from genotyping. A risk analysis divided the subjects into low-risk and high-risk groups according to a prognostic index, showing that the two groups are statistically different (p-value of 2.04e-11). The results demonstrate that MRI features related to both signal and texture add predictive power for MCI-to-AD progression, and support the idea that multimodal biomarkers outperform single-modality biomarkers.
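A prognostic-index split of the kind described can be sketched as follows, assuming synthetic features and invented coefficients (not the study's fitted model): the logistic model's linear predictor serves as the index, and a median split defines the low- and high-risk groups.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 62
X = rng.normal(size=(n, 6))             # six selected features (synthetic)
beta = np.array([1.2, -0.8, 0.9, 0.5, -1.1, 0.7])  # invented coefficients

prognostic_index = X @ beta             # linear predictor of the logistic model
risk_prob = 1.0 / (1.0 + np.exp(-prognostic_index))  # progression probability
high_risk = prognostic_index > np.median(prognostic_index)

print(f"high-risk group: {high_risk.sum()} of {n} subjects, "
      f"mean progression probability {risk_prob[high_risk].mean():.2f} "
      f"vs {risk_prob[~high_risk].mean():.2f} in the low-risk group")
```

The study's risk analysis then tests whether outcomes in the two groups differ; here the split is purely illustrative, so no outcome comparison is made.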

  15. Certification Study of a Derivative Model of a Small Jet Transport Airplane Using a Piloted Research Simulator

    DTIC Science & Technology

    1977-06-01

Research simulator. Raymond O. Forrest, Systems Research and Development Service, Federal Aviation Administration, Ames Research Center, Moffett Field, CA 94035. … dynamic stability derivatives of a complete airplane. The method utilizes potential flow theory to compute the surface flow fields and pressures on any

  16. Making the diagnosis of Sjögren's syndrome in patients with dry eye.

    PubMed

    Beckman, Kenneth A; Luchs, Jodi; Milner, Mark S

    2016-01-01

Sjögren's syndrome (SS) is a chronic and progressive systemic autoimmune disease that often presents initially with symptoms of dry eye and dry mouth. Symptoms are often nonspecific and develop gradually, making diagnosis difficult. Patients with dry eye complaints warrant a step-wise evaluation for possible SS. Initial evaluation requires establishment of a dry eye diagnosis using a combination of patient questionnaires and objective ocular tests, including inflammatory biomarker testing. Additional work-up using the Schirmer test and tear film break-up time can differentiate between aqueous-deficient dry eye (ADDE) and evaporative dry eye. The presence of ADDE should trigger further work-up to differentiate between SS-ADDE and non-SS-ADDE. There are numerous non-ocular manifestations of SS, and monitoring for SS-related comorbid findings can aid in diagnosis, ideally in collaboration with a rheumatologist. The clinical work-up of SS can involve a variety of tests, including tear function tests, serological tests for autoantibody biomarkers, and minor salivary gland and lacrimal gland biopsies. Examination of classic SS biomarkers (SS-A/Ro, SS-B/La, antinuclear antibody, and rheumatoid factor) is a convenient and non-invasive way of evaluating patients for the presence of SS, even years prior to confirmed diagnosis, although not all SS patients will test positive, particularly those with early disease. Recently, newer biomarkers have been identified, including autoantibodies to salivary gland protein-1, parotid secretory protein, and carbonic anhydrase VI, and may allow for earlier diagnosis of SS. A diagnostic test kit is commercially available (Sjö®), incorporating these new biomarkers along with the classic autoantibodies. This advanced test has been shown to identify SS patients who previously tested negative against traditional biomarkers only.
All patients with clinically significant ADDE should be considered for serological assessment for SS, given the availability of new serological diagnostic tests and the potentially serious consequences of missing the diagnosis.

  18. Computational wear simulation of patellofemoral articular cartilage during in vitro testing.

    PubMed

    Li, Lingmin; Patil, Shantanu; Steklov, Nick; Bae, Won; Temple-Wong, Michele; D'Lima, Darryl D; Sah, Robert L; Fregly, Benjamin J

    2011-05-17

Though changes in normal joint motions and loads (e.g., following anterior cruciate ligament injury) contribute to the development of knee osteoarthritis, the precise mechanism by which these changes induce osteoarthritis remains unknown. As a first step toward identifying this mechanism, this study evaluates computational wear simulations of a patellofemoral joint specimen wear-tested on a knee simulator machine. A multibody dynamic model of the specimen mounted in the simulator machine was constructed in commercial computer-aided engineering software. A custom elastic foundation contact model was used to calculate contact pressures and wear on the femoral and patellar articular surfaces using geometry created from laser scan and MR data. Two different wear simulation approaches were investigated: one that wore the surface geometries gradually over a sequence of 10 one-cycle dynamic simulations (termed the "progressive" approach), and one that wore the surface geometries abruptly using results from a single one-cycle dynamic simulation (termed the "non-progressive" approach). The progressive approach with laser scan geometry reproduced the experimentally measured wear depths and areas for both the femur and patella. The less costly non-progressive approach predicted deeper wear depths, especially on the patella, but had little influence on predicted wear areas. Use of MR data for creating the articular and subchondral bone geometry altered wear depth and area predictions by at most 13%. These results suggest that MR-derived geometry may be sufficient for simulating articular cartilage wear in vivo and that a progressive simulation approach may be needed for the patella and tibia, since both remain in continuous contact with the femur. Copyright © 2011 Elsevier Ltd. All rights reserved.
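The contrast between the two approaches can be sketched with an Archard-type wear law on a toy one-dimensional elastic-foundation contact; all constants, the gap profile, and the simplified pressure model are assumptions for illustration, not the study's implementation.

```python
import numpy as np

K_WEAR = 2e-9            # Archard-type wear factor -- assumed
STIFFNESS = 50.0         # elastic-foundation stiffness, MPa/mm -- assumed
SLIDE_PER_CYCLE = 10.0   # sliding distance per loading cycle, mm -- assumed

def pressure(gap, penetration=0.05):
    """Elastic-foundation contact: pressure proportional to local overlap."""
    return STIFFNESS * np.clip(penetration - gap, 0.0, None)

def wear_progressive(gap, cycles, steps=10):
    """Re-solve the pressure after each of `steps` wear updates,
    mirroring the sequence-of-simulations strategy."""
    g = gap.copy()
    for _ in range(steps):
        g += K_WEAR * pressure(g) * SLIDE_PER_CYCLE * (cycles / steps)
    return g - gap                      # accumulated wear depth field

def wear_nonprogressive(gap, cycles):
    """One pressure solution on the unworn geometry, scaled by cycle count."""
    return K_WEAR * pressure(gap) * SLIDE_PER_CYCLE * cycles

x = np.linspace(-1.0, 1.0, 21)
gap = 0.05 * x ** 2                     # initial articular gap profile, mm
prog = wear_progressive(gap, cycles=2_000_000)
nonp = wear_nonprogressive(gap, cycles=2_000_000)
print(f"max wear depth: progressive {prog.max():.4f} mm, "
      f"non-progressive {nonp.max():.4f} mm")
```

Because the progressive variant lets the contact pressure relax as material is removed, it predicts shallower wear than the non-progressive scaling, matching the direction of the difference reported above.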

  19. Optimizing smoke and plume rise modeling approaches at local scales

    Treesearch

    Derek V. Mallia; Adam K. Kochanski; Shawn P. Urbanski; John C. Lin

    2018-01-01

    Heating from wildfires adds buoyancy to the overlying air, often producing plumes that vertically distribute fire emissions throughout the atmospheric column over the fire. The height of the rising wildfire plume is a complex function of the size of the wildfire, fire heat flux, plume geometry, and atmospheric conditions, which can make simulating plume rises difficult...

  20. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.

  1. Mild Normobaric Hypoxia Exposure for Human-Autonomy System Testing

    NASA Technical Reports Server (NTRS)

    Stephens, Chad L.; Kennedy, Kellie D.; Crook, Brenda L.; Williams, Ralph A.; Schutte, Paul

    2017-01-01

An experiment investigated the impact of normobaric hypoxia induction on aircraft pilot performance, specifically to evaluate the use of hypoxia as a method of inducing mild cognitive impairment for exploring human-autonomous systems integration opportunities. Results of this exploratory study show that a simulated altitude of 15,000 feet did not induce cognitive deficits as indicated by performance on written, computer-based, or simulated flight tasks. However, the subjective data demonstrated increased effort by the human test subject pilots to maintain equivalent performance in a flight simulation task. This study represents ongoing research intended to add to current knowledge of performance decrement and pilot workload assessment, with the goals of improving automation support and increasing aviation safety.

  2. Description of the GMAO OSSE for Weather Analysis Software Package: Version 3

    NASA Technical Reports Server (NTRS)

Koster, Randal D. (Editor); Errico, Ronald M.; Prive, Nikki C.; Carvalho, David; Sienkiewicz, Meta; El Akkraoui, Amal; Guo, Jing; Todling, Ricardo; McCarty, Will; Putman, William M.

    2017-01-01

    The Global Modeling and Assimilation Office (GMAO) at the NASA Goddard Space Flight Center has developed software and products for conducting observing system simulation experiments (OSSEs) for weather analysis applications. Such applications include estimations of potential effects of new observing instruments or data assimilation techniques on improving weather analysis and forecasts. The GMAO software creates simulated observations from nature run (NR) data sets and adds simulated errors to those observations. The algorithms employed are much more sophisticated, adding a much greater degree of realism, compared with OSSE systems currently available elsewhere. The algorithms employed, software designs, and validation procedures are described in this document. Instructions for using the software are also provided.
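The basic observation-simulation step the abstract describes can be sketched as sampling a nature-run field at observation locations and adding simulated instrument error. The field, bias, and noise statistics below are invented placeholders; the GMAO system uses far more sophisticated, realistic error models:

```python
import random

random.seed(0)  # deterministic for illustration

def nature_run_temperature(lat):
    # Hypothetical stand-in for interpolating the NR state (here, a
    # surface temperature in kelvin) to an observation location.
    return 288.0 - 0.7 * abs(lat)

def simulate_obs(lats, bias=0.2, sigma=1.0):
    # Simulated observation = NR "truth" + systematic bias + random noise.
    obs = []
    for lat in lats:
        truth = nature_run_temperature(lat)
        obs.append(truth + bias + random.gauss(0.0, sigma))
    return obs

lats = [-60, -30, 0, 30, 60]
obs = simulate_obs(lats)
```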

  3. Trade-off between competition and facilitation defines gap colonization in mountains.

    PubMed

    Lembrechts, Jonas J; Milbau, Ann; Nijs, Ivan

    2015-11-10

    Recent experimental observations show that gap colonization in small-stature (e.g. grassland and dwarf shrubs) vegetation strongly depends on the abiotic conditions within them. At the same time, within-gap variation in biotic interactions such as competition and facilitation, caused by distance to the gap edge, would affect colonizer performance, but a theoretical framework to explore such patterns is missing. Here, we model how competition, facilitation and environmental conditions together determine the small-scale patterns of gap colonization along a cold gradient in mountains, by simulating colonizer survival in gaps of various sizes. Our model adds another dimension to the known effects of biotic interactions along a stress gradient by focussing on the trade-off between competition and facilitation in the within-gap environment. We show that this trade-off defines a peak in colonizer survival at a specific distance from the gap edge, which progressively shifts closer to the edge as the environment gets colder, ultimately leaving a large fraction of gaps unsuitable for colonization in facilitation-dominated systems. This is reinforced when vegetation size and temperature amelioration are manipulated simultaneously with temperature in order to simulate an elevational gradient more realistically. Interestingly, all other conditions being equal, the magnitude of the realized survival peak was always lower in large than in small gaps, making large gaps harder to colonize. The model is relevant to predict effects of non-native plant invasions and climate warming on colonization processes in mountains. Published by Oxford University Press on behalf of the Annals of Botany Company.
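The competition-facilitation trade-off can be illustrated with a toy survival model in which both interactions decay with distance from the gap edge, competition faster than facilitation, and cold stress strengthens facilitation. All functional forms and constants are hypothetical, not the published model:

```python
import math

def survival(d, cold):
    # d: distance from the gap edge; cold in [0, 1] scales climatic stress.
    # Competition from edge vegetation decays quickly with distance;
    # facilitation (shelter, temperature amelioration) decays more slowly
    # and matters more in cold environments.
    competition = 0.6 * math.exp(-d / 0.5)
    facilitation = (0.3 + 0.6 * cold) * math.exp(-d / 2.0)
    base = 0.8 - 0.5 * cold   # harsher abiotic conditions when cold
    return base - competition + facilitation

def peak_distance(cold, d_max=10.0, step=0.01):
    # Grid search for the distance that maximizes colonizer survival.
    ds = [i * step for i in range(int(d_max / step) + 1)]
    return max(ds, key=lambda d: survival(d, cold))

warm_peak = peak_distance(cold=0.1)
cold_peak = peak_distance(cold=0.9)
```

In this toy, the survival optimum sits at an intermediate distance from the edge and shifts closer to the edge as the environment gets colder, mirroring the pattern the model predicts.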

  4. Predicting Failure Progression and Failure Loads in Composite Open-Hole Tension Coupons

    NASA Technical Reports Server (NTRS)

    Arunkumar, Satyanarayana; Przekop, Adam

    2010-01-01

Failure types and failure loads in carbon-epoxy [45n/90n/-45n/0n]ms laminate coupons with central circular holes subjected to tensile load are simulated using progressive failure analysis (PFA) methodology. The progressive failure methodology is implemented using a VUMAT subroutine within the ABAQUS™/Explicit nonlinear finite element code. The degradation model adopted in the present PFA methodology uses an instantaneous complete stress reduction (COSTR) approach to simulate damage at a material point when failure occurs. In-plane modeling parameters such as element size and shape are held constant in the finite element models, irrespective of laminate thickness and hole size, to predict failure loads and failure progression. Comparison to published test data indicates that this methodology accurately simulates brittle, pull-out and delamination failure types. The sensitivity of the failure progression and the failure load to analytical loading rates and the solver's precision is demonstrated.
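The instantaneous complete-stress-reduction idea can be sketched at a single material point: the moment a failure mode's criterion is violated, the corresponding stress drops to zero with no gradual softening. The failure criterion and strength values below are simplified placeholders, not the paper's VUMAT implementation:

```python
# Minimal COSTR-style degradation step at one material point.

def costr_update(state, stress, strengths):
    # state: mode -> already-failed flag (mutated in place);
    # stress/strengths: mode -> current stress / allowable (MPa).
    degraded = dict(stress)
    for mode in stress:
        if state[mode] or abs(stress[mode]) >= strengths[mode]:
            state[mode] = True
            degraded[mode] = 0.0   # instantaneous complete stress reduction
    return degraded

state = {"fiber": False, "matrix": False, "shear": False}
strengths = {"fiber": 2000.0, "matrix": 60.0, "shear": 90.0}  # illustrative
step1 = costr_update(state, {"fiber": 800.0, "matrix": 75.0, "shear": 40.0}, strengths)
step2 = costr_update(state, {"fiber": 900.0, "matrix": 20.0, "shear": 40.0}, strengths)
```

Matrix stress is zeroed the moment its strength is exceeded and stays zero on later increments, while intact modes continue to carry load.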

  5. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

The following reports are presented on this project: a first-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; a second-year progress report on the same effort; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.

  6. The Need for a Comprehensive Mental Health Information System. 1. Data Requirements of Local Clinicians and Administrators in Navy Psychiatry.

    DTIC Science & Technology

    1980-07-01

management, e.g., treatment plans and goals, current objectives of treatment, patient progress, and the results of any medical consultations. A planned... medical care utilization follow mental health interventions. Add to this fact the estimate that 50% of patients entering the Navy health care delivery... Captain Sears is Chief, Psychiatric Services, Naval Regional Medical Center, San Diego, CA 92134. Report No. 80-19, supported by Naval Medical Research

  7. Formation of Carbon Nanotube Based Gears: Quantum Chemistry and Molecular Mechanics Study of the Electrophilic Addition of o-Benzyne to Fullerenes, Graphene, and Nanotubes

    NASA Technical Reports Server (NTRS)

    Jaffe, Richard; Han, Jie; Globus, Al; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    Considerable progress has been made in recent years in chemical functionalization of fullerene molecules. In some cases, the predominant reaction products are different from those obtained (using the same reactants) from polycyclic aromatic hydrocarbons (PAHs). One such example is the cycloaddition of o-benzyne to C60. It is well established that benzyne adds across one of the rings in naphthalene, anthracene and other PAHs forming the [2+4] cycloaddition product (benzobicyclo[2.2.2.]-octatriene with naphthalene and triptycene with anthracene). However, Hoke et al demonstrated that the only reaction path for o-benzyne with C60 leads to the [2+2] cycloaddition product in which benzyne adds across one of the interpentagonal bonds (forming a cyclobutene ring in the process). Either reaction product results in a loss of aromaticity and distortion of the PAH or fullerene substrate, and in a loss of strain in the benzyne. It is not clear, however, why different products are preferred in these cases. In the current paper, we consider the stability of benzyne-nanotube adducts and the ability of Brenner's potential energy model to describe the structure and stability of these adducts. The Brenner potential has been widely used for describing diamondoid and graphitic carbon. Recently it has also been used for molecular mechanics and molecular dynamics simulations of fullerenes and nanotubes. However, it has not been tested for the case of functionalized fullerenes (especially with highly strained geometries). We use the Brenner potential for our companion nanogear simulations and believe that it should be calibrated to insure that those simulations are physically reasonable. In the present work, Density Functional theory (DFT) calculations are used to determine the preferred geometric structures and energetics for this calibration. The DFT method is a kind of ab initio quantum chemistry method for determining the electronic structure of molecules. 
For a given basis set expansion, it is comparable in accuracy to the MP2 method (better than Hartree-Fock, but less accurate than more extensive electron correlation methods such as MP4 or CCSD). However, for systems with large numbers of basis functions it is more efficient than other methods that include electron correlation effects. In this presentation we show the results of DFT calculations for the reaction of benzyne with naphthalene, C60, and nanotube models. We compare energies for [2+2] and [2+4] cycloaddition products. The preferred products for the naphthalene and C60 reactions have been determined by experiment and, thus, these cases serve as a validation of our quantum chemical approach. We also compare the DFT and Brenner potential results. Finally, we predict the likelihood of reaction between benzyne and nanotubes.

  8. Progressive Damage and Fracture in Composites Under Dynamic Loading

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    1994-01-01

A computational simulation tool is used to evaluate the various stages of damage progression in composite materials during Iosipescu shear testing. Unidirectional composite specimens with either the major or minor material axis in the load direction are considered. Damage progression characteristics are described for each specimen using two types of boundary conditions. A procedure is outlined regarding the use of computational simulation in the testing of composite materials.

  9. Exome sequencing results in successful riboflavin treatment of a rapidly progressive neurological condition

    PubMed Central

    Petrovski, Slavé; Shashi, Vandana; Petrou, Steven; Schoch, Kelly; McSweeney, Keisha Melodi; Dhindsa, Ryan S.; Krueger, Brian; Crimian, Rebecca; Case, Laura E.; Khalid, Roha; El-Dairi, Maysantoine A.; Jiang, Yong-Hui; Mikati, Mohamad A.; Goldstein, David B.

    2015-01-01

    Genetically targeted therapies for rare Mendelian conditions are improving patient outcomes. Here, we present the case of a 20-mo-old female suffering from a rapidly progressing neurological disorder. Although diagnosed initially with a possible autoimmune condition, analysis of the child's exome resulted in a diagnosis of Brown–Vialetto–Van Laere syndrome 2 (BVVLS2). This new diagnosis led to a change in the therapy plan from steroids and precautionary chemotherapy to high-dose riboflavin. Improvements were reported quickly, including in motor strength after 1 mo. In this case, the correct diagnosis and appropriate treatment would have been unlikely in the absence of exome sequencing and careful interpretation. This experience adds to a growing list of examples that emphasize the importance of early genome-wide diagnostics. PMID:27148561

  10. Studying the Formation and Development of Molecular Clouds: With the CCAT Heterodyne Array Instrument (CHAI)

    NASA Technical Reports Server (NTRS)

    Goldsmith, Paul F.

    2012-01-01

Surveys of all different types provide basic data using different tracers. Molecular clouds have structure over a very wide range of scales, so "high resolution" surveys and studies of selected nearby clouds add critical information. The combination of large area and high resolution allows increased spatial dynamic range, which in turn enables detection of new and perhaps critical morphology (e.g., filaments). Theoretical modeling has made major progress and suggests that multiple forces are at work. Galactic-scale modeling is also progressing and indicates that stellar feedback is required. Models must strive to reproduce observed cloud structure at all scales. Astrochemical observations also bear on questions of cloud evolution and star formation, but we are still learning how to use this capability.

  11. THE ORGANIZATION OF A COURSE FOR INDIVIDUAL PROGRESS AT THEODORE HIGH SCHOOL--SYSTEM ANALYSIS AND SIMULATION.

    ERIC Educational Resources Information Center

    BRATTEN, JACK E.

    THE BIOLOGY COURSE OF THEODORE HIGH SCHOOL AT THEODORE, ALABAMA, WAS STUDIED AS A SYSTEM FOR "PROCESSING" STUDENTS AND WAS SIMULATED ON A COMPUTER. AN EXPERIMENTAL VERSION OF THE COURSE WAS SIMULATED AND COMPARED WITH THE ACTUAL COURSE. THE PURPOSES OF THIS STUDY WERE (1) TO EXAMINE THE CONCEPT OF INDIVIDUAL PROGRESS AS IT RELATED TO THE…

  12. Renal function preservation with pioglitazone or with basal insulin as an add-on therapy for patients with type 2 diabetes mellitus.

    PubMed

    Chang, Yu-Hung; Hwu, Der-Wei; Chang, Dao-Ming; An, Ling-Wang; Hsieh, Chang-Hsun; Lee, Yau-Jiunn

    2017-06-01

    Clinical outcome may differ owing to the distinct pharmacological characteristics of insulin sensitizers and insulin. This study was performed to compare the metabolic and renal function changes with add-on pioglitazone treatment versus basal insulin in patients with type 2 diabetes mellitus (DM) in whom sulfonylurea and metformin regimens failed. Patients who were consecutively managed in the diabetes comprehensive program with add-on pioglitazone or detemir/glargine treatment for at least 2 years following sulfonylurea and metformin treatment failure were included. A total of 1002 patients were enrolled (pioglitazone: 559, detemir: 264, glargine: 179). After propensity score matching, there were 105 patients with matchable baseline characteristics in each group. After a mean of 3.5 years of follow-up, the pioglitazone group showed a greater HbA1c reduction than the detemir group and the glargine group. Despite patients in all three groups exhibiting significant body weight gain, those in the pioglitazone group and the glargine group showed greater body weight increases than the patients in the detemir group (2.1, 1.6 and 0.8 kg, respectively, p < 0.05). Interestingly, Cox regression analysis indicated that patients under detemir or glargine treatment had a higher probability of CKD progression as compared with the pioglitazone group, with hazard ratios of 2.63 (95% CI 1.79-3.88) and 3.13 (95% CI 2.01-4.87), respectively. Our study first showed that treatment with both pioglitazone and basal insulin improved glycemic control, while only pioglitazone treatment was observed to be advantageous in terms of preserving renal function when used as an add-on therapy for patients with type 2 DM in whom sulfonylurea and metformin regimens failed.

  13. The DIAN-TU Next Generation Alzheimer’s prevention trial: adaptive design and disease progression model

    PubMed Central

    Bateman, Randall J.; Benzinger, Tammie L.; Berry, Scott; Clifford, David B.; Duggan, Cynthia; Fagan, Anne M.; Fanning, Kathleen; Farlow, Martin R.; Hassenstab, Jason; McDade, Eric M.; Mills, Susan; Paumier, Katrina; Quintana, Melanie; Salloway, Stephen P.; Santacruz, Anna; Schneider, Lon S.; Wang, Guoqiao; Xiong, Chengjie

    2016-01-01

    INTRODUCTION The Dominantly Inherited Alzheimer Network Trials Unit (DIAN-TU) trial is an adaptive platform trial testing multiple drugs to slow or prevent the progression of Alzheimer’s disease in autosomal dominant Alzheimer’s disease (ADAD) families. With completion of enrollment of the first two drug arms, the DIAN-TU now plans to add new drugs to the platform, designated as the Next Generation Prevention Trial (NexGen). METHODS In collaboration with ADAD families, philanthropic organizations, academic leaders, the DIAN-TU Pharma Consortium, the NIH, and regulatory colleagues, the DIAN-TU developed innovative clinical study designs for the DIAN-TU NexGen trial. RESULTS Our expanded trials toolbox consists of a Disease Progression Model for ADAD, primary endpoint DIAN-TU cognitive performance composite, biomarker development, self-administered cognitive assessments, adaptive dose adjustments, and blinded data collection through the last participant completion. CONCLUSION These steps represent elements to improve efficacy of the adaptive platform trial and a continued effort to optimize prevention and treatment trials in ADAD. PMID:27583651

  14. Does information available at admission for delivery improve prediction of vaginal birth after cesarean?

    PubMed Central

    Grobman, William A.; Lai, Yinglei; Landon, Mark B.; Spong, Catherine Y.; Leveno, Kenneth J.; Rouse, Dwight J.; Varner, Michael W.; Moawad, Atef H.; Simhan, Hyagriv N.; Harper, Margaret; Wapner, Ronald J.; Sorokin, Yoram; Miodovnik, Menachem; Carpenter, Marshall; O'sullivan, Mary J.; Sibai, Baha M.; Langer, Oded; Thorp, John M.; Ramin, Susan M.; Mercer, Brian M.

    2010-01-01

Objective To construct a predictive model for vaginal birth after cesarean (VBAC) that combines factors that can be ascertained only as the pregnancy progresses with those known at initiation of prenatal care. Study design Using multivariable modeling, we constructed a predictive model for VBAC that included patient factors known at the initial prenatal visit as well as those that only became evident as the pregnancy progressed to the admission for delivery. Results 9616 women were analyzed. The regression equation for VBAC success included multiple factors that could not be known at the first prenatal visit. The area under the curve for this model was significantly greater (P < .001) than that of a model that included only factors available at the first prenatal visit. Conclusion A prediction model for VBAC success that incorporates factors that can be ascertained only as the pregnancy progresses adds to the predictive accuracy of a model that uses only factors available at a first prenatal visit. PMID:19813165

  15. Cognitive simulators for medical education and training.

    PubMed

    Kahol, Kanav; Vankipuram, Mithra; Smith, Marshall L

    2009-08-01

Simulators for honing procedural skills (such as surgical skills and central venous catheter placement) have proven to be valuable tools for medical educators and students. While such simulations represent an effective paradigm in surgical education, there is an opportunity to add a layer of cognitive exercises to these basic simulations that can facilitate robust skill learning in residents. This paper describes a controlled methodology, inspired by neuropsychological assessment tasks and embodied cognition, to develop cognitive simulators for laparoscopic surgery. These simulators provide psychomotor skill training and offer the additional challenge of accomplishing cognitive tasks in realistic environments. A generic framework for design, development and evaluation of such simulators is described. The presented framework is generalizable and can be applied to different task domains. It is independent of the types of sensors, simulation environment and feedback mechanisms that the simulators use. A proof of concept of the framework is provided through developing a simulator that includes cognitive variations to a basic psychomotor task. The results of two pilot studies are presented that show the validity of the methodology in providing effective evaluation and learning environments for surgeons.

  16. Evolving MEMS Resonator Designs for Fabrication

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Kraus, William F.; Lohn, Jason D.

    2008-01-01

Because of their small size and high reliability, microelectromechanical (MEMS) devices have the potential to revolutionize many areas of engineering. As with conventionally-sized engineering design, there is likely to be a demand for the automated design of MEMS devices. This paper describes our current status as we progress toward our ultimate goal of using an evolutionary algorithm and a generative representation to produce designs of a MEMS device and successfully demonstrate its transfer to an actual chip. To produce designs that are likely to transfer to reality, we present two ways to modify evaluation of designs. The first is to add location noise, differences between the actual dimensions of the design and the design blueprint, which is a technique we have used for our work in evolving antennas and robots. The second method is to add prestress to model the warping that occurs during the extreme heat of fabrication. In the future we expect to fabricate and test some MEMS resonators that are evolved in this way.

  17. Adding Value to the Health Care System: Identifying Value-Added Systems Roles for Medical Students.

    PubMed

    Gonzalo, Jed D; Graaf, Deanna; Johannes, Bobbie; Blatt, Barbara; Wolpaw, Daniel R

    To catalyze learning in Health Systems Science and add value to health systems, education programs are seeking to incorporate students into systems roles, which are not well described. The authors sought to identify authentic roles for students within a range of clinical sites and explore site leaders' perceptions of the value of students performing these roles. From 2013 to 2015, site visits and interviews with leadership from an array of clinical sites (n = 30) were conducted. Thematic analysis was used to identify tasks and benefits of integrating students into interprofessional care teams. Types of systems roles included direct patient benefit activities, including monitoring patient progress with care plans and facilitating access to resources, and clinic benefit activities, including facilitating coordination and improving clinical processes. Perceived benefits included improved value of the clinical mission and enhanced student education. These results elucidate a framework for student roles that enhance learning and add value to health systems.

  18. The Umbra Simulation and Integration Framework Applied to Emergency Response Training

    NASA Technical Reports Server (NTRS)

    Hamilton, Paul Lawrence; Britain, Robert

    2010-01-01

    The Mine Emergency Response Interactive Training Simulation (MERITS) is intended to prepare personnel to manage an emergency in an underground coal mine. The creation of an effective training environment required realistic emergent behavior in response to simulation events and trainee interventions, exploratory modification of miner behavior rules, realistic physics, and incorporation of legacy code. It also required the ability to add rich media to the simulation without conflicting with normal desktop security settings. Our Umbra Simulation and Integration Framework facilitated agent-based modeling of miners and rescuers and made it possible to work with subject matter experts to quickly adjust behavior through script editing, rather than through lengthy programming and recompilation. Integration of Umbra code with the WebKit browser engine allowed the use of JavaScript-enabled local web pages for media support. This project greatly extended the capabilities of Umbra in support of training simulations and has implications for simulations that combine human behavior, physics, and rich media.

  19. Linkage analysis of systolic blood pressure: a score statistic and computer implementation

    PubMed Central

    Wang, Kai; Peng, Yingwei

    2003-01-01

    A genome-wide linkage analysis was conducted on systolic blood pressure using a score statistic. The randomly selected Replicate 34 of the simulated data was used. The score statistic was applied to the sibships derived from the general pedigrees. An add-on R program to GENEHUNTER was developed for this analysis and is freely available. PMID:14975145

  20. Genetic Adaptive Control for PZT Actuators

    NASA Technical Reports Server (NTRS)

    Kim, Jeongwook; Stover, Shelley K.; Madisetti, Vijay K.

    1995-01-01

A piezoelectric transducer (PZT) is capable of providing linear motion if controlled correctly and could provide a replacement for traditional heavy and large motor-driven servo systems. This paper focuses on a genetic model reference adaptive control technique (GMRAC) for a PZT that moves a mirror, where the goal is to keep the mirror velocity constant. Genetic Algorithms (GAs) are an integral part of the GMRAC technique, acting as the search engine for an optimal PID controller. Two methods are suggested to control the actuator in this research: the first is to change the PID parameters, and the other is to add an additional reference input to the system. Simulated Annealing (SA) is also used to solve the problem, and the simulation results of the two methods, GAs, and SA are compared, with GAs showing the best results. The entire model is designed using the MathWorks' Simulink tool.
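The GA-as-search-engine idea can be sketched with a minimal example: evolve PID gains so a simple first-order "actuator" tracks a constant reference velocity. The plant model, cost function, and GA settings are illustrative assumptions, not the paper's PZT model or GMRAC scheme:

```python
import random

random.seed(1)  # deterministic for illustration
DT, STEPS, REF = 0.01, 200, 1.0

def cost(gains):
    # Integral of absolute tracking error for a discrete PID loop
    # driving the toy first-order plant dv/dt = -v + u.
    kp, ki, kd = gains
    v, integ, prev_err, iae = 0.0, 0.0, REF, 0.0
    for _ in range(STEPS):
        err = REF - v
        integ += err * DT
        deriv = (err - prev_err) / DT
        prev_err = err
        u = kp * err + ki * integ + kd * deriv
        v += DT * (-v + u)
        if abs(v) > 1e6:            # penalize unstable gain sets
            return float("inf")
        iae += abs(err) * DT
    return iae

def evolve(pop_size=30, gens=40):
    # Simple elitist GA: keep the better half, refill with crossover
    # plus Gaussian mutation, clamped to non-negative gains.
    pop = [[random.uniform(0, 10) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # crossover
            child[random.randrange(3)] += random.gauss(0, 0.5)   # mutation
            children.append([max(0.0, g) for g in child])
        pop = parents + children
    return min(pop, key=cost)

best = evolve()
baseline = cost([1.0, 0.0, 0.0])  # weak hand-picked P-only controller
```

The evolved gains should track the constant reference with lower accumulated error than the hand-picked proportional-only baseline.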

  1. System Engineering Infrastructure Evolution Galileo IOV and the Steps Beyond

    NASA Astrophysics Data System (ADS)

    Eickhoff, J.; Herpel, H.-J.; Steinle, T.; Birn, R.; Steiner, W.-D.; Eisenmann, H.; Ludwig, T.

    2009-05-01

The trend toward more and more constrained financial budgets in satellite engineering requires ongoing optimization of the S/C system engineering processes and infrastructure. In recent years Astrium has built up a system simulation infrastructure - the "Model-based Development & Verification Environment" (MDVE) - which is now well known across Europe and is established as Astrium's standard approach for ESA and DLR projects, and now even for the EU/ESA project Galileo IOV. The key feature of the MDVE/FVE approach is to provide an entire S/C simulation (with a full-featured OBC simulation) already in early phases, so that OBSW code tests can start on a simulated S/C, with hardware later added in the loop step by step up to an entire "Engineering Functional Model (EFM)" or "FlatSat". The subsequent enhancements to this simulator infrastructure with respect to spacecraft design data handling are reported in the following sections.

  2. Simulated Surgery-an exam for our time? Summary of the current status and development of the MRCGP Simulated Surgery module.

    PubMed

    Hawthorne, Kamila; Denney, Mei Ling; Bewick, Mike; Wakeford, Richard

    2006-01-01

WHAT IS ALREADY KNOWN IN THIS AREA • The Simulated Surgery module of the MRCGP examination has been shown to be a valid and reliable assessment of clinical consulting skills. WHAT THIS WORK ADDS • This paper describes the further development of the Simulated Surgery methodology, showing the type of data analysis currently used to assure its quality and reliability. The measures taken to tighten up case quality are discussed. SUGGESTIONS FOR FUTURE RESEARCH • The future development of clinical skills assessments in general practice is discussed. More work is needed on the effectiveness and reliability of lay assessors in complex integrated clinical cases. New methods to test areas that are difficult to reproduce in a simulated environment (such as acute emergencies and cases with the very young or very old) are also needed.

  3. Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Mcmanus, John William

    1992-01-01

    Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.

  4. The future of human cerebral cartography: a novel approach

    PubMed Central

    Frackowiak, Richard; Markram, Henry

    2015-01-01

    Cerebral cartography can be understood in a limited, static, neuroanatomical sense. Temporal information from electrical recordings contributes information on regional interactions adding a functional dimension. Selective tagging and imaging of molecules adds biochemical contributions. Cartographic detail can also be correlated with normal or abnormal psychological or behavioural data. Modern cerebral cartography is assimilating all these elements. Cartographers continue to collect ever more precise data in the hope that general principles of organization will emerge. However, even detailed cartographic data cannot generate knowledge without a multi-scale framework making it possible to relate individual observations and discoveries. We propose that, in the next quarter century, advances in cartography will result in progressively more accurate drafts of a data-led, multi-scale model of human brain structure and function. These blueprints will result from analysis of large volumes of neuroscientific and clinical data, by a process of reconstruction, modelling and simulation. This strategy will capitalize on remarkable recent developments in informatics and computer science and on the existence of much existing, addressable data and prior, though fragmented, knowledge. The models will instantiate principles that govern how the brain is organized at different levels and how different spatio-temporal scales relate to each other in an organ-centred context. PMID:25823868

  5. An Enriched Shell Finite Element for Progressive Damage Simulation in Composite Laminates

    NASA Technical Reports Server (NTRS)

    McElroy, Mark W.

    2016-01-01

A formulation is presented for an enriched shell finite element capable of progressive damage simulation in composite laminates. The element uses a discrete adaptive splitting approach for damage representation that allows for a straightforward model creation procedure based on an initially low fidelity mesh. The enriched element is verified for Mode I, Mode II, and mixed Mode I/II delamination simulation using numerical benchmark data. Experimental validation is performed using test data from a delamination-migration experiment. Good correlation was found between the enriched shell element model results and the numerical and experimental data sets. The work presented in this paper is meant to serve as a first milestone in the enriched element's development, with an ultimate goal of simulating three-dimensional progressive damage processes in multidirectional laminates.

  6. Pore blocking: An innovative formulation strategy for the design of alcohol resistant multi-particulate dosage forms.

    PubMed

    Schrank, Simone; Jedinger, Nicole; Wu, Shengqian; Piller, Michael; Roblegg, Eva

    2016-07-25

    In this work calcium stearate (CaSt) multi-particulates loaded with codeine phosphate (COP) were developed in an attempt to provide extended release (ER) combined with alcohol dose dumping (ADD) resistance. The pellets were prepared via wet/extrusion spheronization and ER characteristics were obtained after fluid bed drying at 30°C. Pore blockers (i.e., xanthan, guar gum and TiO2) were integrated to control the uptake of ethanolic media, the CaSt swelling and consequently, the COP release. While all three pore blockers are insoluble in ethanol, xanthan dissolves, guar gum swells and TiO2 does not interact with water. The incorporation of 10 and 15% TiO2 still provided ER characteristics and yielded ADD resistance in up to 40v% ethanol. The in-vitro data were subjected to PK simulations, which revealed similar codeine plasma levels when the medication is used concomitantly with alcoholic beverages. Taken together the in-vitro and in-silico results demonstrate that the incorporation of appropriate pore blockers presents a promising strategy to provide ADD resistance of multi-particulate systems. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Modeling and simulation in biomedicine.

    PubMed Central

    Aarts, J.; Möller, D.; van Wijk van Brievingh, R.

    1991-01-01

    A group of researchers and educators in The Netherlands, Germany and Czechoslovakia have developed and adapted mathematical computer models of phenomena in the field of physiology and biomedicine for use in higher education. The models are graphical and highly interactive, and are all written in TurboPascal or the mathematical simulation language PSI. An educational shell has been developed to launch the models. The shell allows students to interact with the models and teachers to edit the models, to add new models and to monitor the achievements of the students. The models and the shell have been implemented on a MS-DOS personal computer. This paper describes the features of the modeling package and presents the modeling and simulation of the heart muscle as an example. PMID:1807745

  8. OpenMM 7: Rapid development of high performance algorithms for molecular dynamics

    PubMed Central

    Swails, Jason; Zhao, Yutong; Beauchamp, Kyle A.; Wang, Lee-Ping; Stern, Chaya D.; Brooks, Bernard R.; Pande, Vijay S.

    2017-01-01

    OpenMM is a molecular dynamics simulation toolkit with a unique focus on extensibility. It allows users to easily add new features, including forces with novel functional forms, new integration algorithms, and new simulation protocols. Those features automatically work on all supported hardware types (including both CPUs and GPUs) and perform well on all of them. In many cases they require minimal coding, just a mathematical description of the desired function. They also require no modification to OpenMM itself and can be distributed independently of OpenMM. This makes it an ideal tool for researchers developing new simulation methods, and also allows those new methods to be immediately available to the larger community. PMID:28746339
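OpenMM's hallmark, as the abstract notes, is that a new force can be specified as "just a mathematical description of the desired function". The toy class below mimics that idea in miniature: an energy expression string evaluated numerically, with the force obtained by differentiation. It is an illustration of the design principle, not OpenMM's actual API (OpenMM compiles such expressions into efficient CPU/GPU kernels).

```python
import math

class ExpressionForce:
    """Toy analogue of a force defined by a mathematical expression string.
    Illustrative only: here the expression is simply evaluated in Python."""
    def __init__(self, energy_expr, **params):
        self.code = compile(energy_expr, "<energy>", "eval")
        self.params = params

    def energy(self, x):
        return eval(self.code, {"math": math, "x": x, **self.params})

    def force(self, x, h=1e-6):
        # F = -dE/dx, approximated by a central difference
        return -(self.energy(x + h) - self.energy(x - h)) / (2 * h)

# A harmonic restraint defined purely by its energy expression.
spring = ExpressionForce("0.5 * k * x**2", k=10.0)
e = spring.energy(2.0)   # E(2) = 0.5 * 10 * 4 = 20.0
f = spring.force(2.0)    # F(2) = -k*x = -20.0
```

The point of the design is that users supply only the energy expression; differentiation and evaluation on every hardware backend are the framework's job.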

  9. Comparing multiple turbulence restoration algorithms performance on noisy anisoplanatic imagery

    NASA Astrophysics Data System (ADS)

    Rucci, Michael A.; Hardie, Russell C.; Dapore, Alexander J.

    2017-05-01

    In this paper, we compare the performance of multiple turbulence mitigation algorithms to restore imagery degraded by atmospheric turbulence and camera noise. In order to quantify and compare algorithm performance, imaging scenes were simulated by applying noise and varying levels of turbulence. For the simulation, a Monte-Carlo wave optics approach is used to simulate the spatially and temporally varying turbulence in an image sequence. A Poisson-Gaussian noise mixture model is then used to add noise to the observed turbulence image set. These degraded image sets are processed with three separate restoration algorithms: Lucky Look imaging, bispectral speckle imaging, and a block matching method with restoration filter. These algorithms were chosen because they incorporate different approaches and processing techniques. The results quantitatively show how well the algorithms are able to restore the simulated degraded imagery.
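A Poisson-Gaussian mixture model of the kind mentioned above combines signal-dependent shot noise with signal-independent read noise. The sketch below, in plain Python with illustrative parameter names (`gain`, `read_sigma` are assumptions, not values from the paper), shows one common way such a model is applied to a clean image.

```python
import math
import random

def add_poisson_gaussian_noise(image, gain=1.0, read_sigma=2.0, seed=0):
    """Degrade a clean image with a Poisson-Gaussian mixture:
    Poisson shot noise in the detected counts plus Gaussian read noise."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplicative method; adequate for modest intensities
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    return [[poisson(px * gain) / gain + rng.gauss(0.0, read_sigma)
             for px in row] for row in image]

clean = [[50.0, 100.0], [150.0, 200.0]]
noisy = add_poisson_gaussian_noise(clean)
```

Because the Poisson term scales with the signal while the Gaussian term does not, bright regions end up shot-noise limited and dark regions read-noise limited, which is the behavior the restoration algorithms must contend with.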

  10. Onyx-Advanced Aeropropulsion Simulation Framework Created

    NASA Technical Reports Server (NTRS)

    Reed, John A.

    2001-01-01

The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.

  11. Variability of Valuation of Non-Monetary Incentives: Motivating and Implementing the Combinatorial Retention Auction Mechanism

    DTIC Science & Technology

    2009-03-01

homeport, geographic stability for two tours, and compressed work week; homeport, lump sum SRB, and telecommuting). The Monte Carlo simulation... geographic stability for 2 tours, and compressed work week). The Add 2 combination includes home port choice, lump sum SRB, and telecommuting... VALUATION OF NON-MONETARY INCENTIVES: MOTIVATING AND IMPLEMENTING THE COMBINATORIAL RETENTION AUCTION MECHANISM, by Jason Blake Ellis, March 2009

  12. Lectin Enzyme Assay Detection of Viruses, Tissue Culture, and a Mycotoxin Simulant

    DTIC Science & Technology

    1988-09-01

micromix vibrator at 37 °C for 10-30 min. 9. Read color development at 10, 20, and 30 min. Table 5. LEAD Test (Procedure II). 1. Add 0.1 mL of virus... or TC concentrations to 0.1 mL of WGA-peroxidase in microtiter tray. 2. Mix on Yankee rotator or micromix vibrator at room temperature for 10 min. 3

  13. SouthPro : a computer program for managing uneven-aged loblolly pine stands

    Treesearch

    Benedict Schulte; Joseph Buongiorno; Ching-Rong Lin; Kenneth E. Skog

    1998-01-01

    SouthPro is a Microsoft Excel add-in program that simulates the management, growth, and yield of uneven-aged loblolly pine stands in the Southern United States. The built-in growth model of this program was calibrated from 991 uneven-aged plots in seven states, covering most growing conditions and sites. Stands are described by the number of trees in 13 size classes...

  14. Memory Subsystem Performance of Programs with Intensive Heap Allocation

    DTIC Science & Technology

    1993-12-13

improves all organizations. However, the improvement in going from one-way to two-way set associativity is much smaller than... simulating multi-cycle instructions, we cannot determine their exact penalty... Program | Total | Div | Mul | Add | Sub

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serne, R. Jeffrey; Lanigan, David C.; Westsik, Joseph H.

This revision to the original report adds two sets of longer-term leach data and provides more discussion and graphics on how to interpret the results from long-term laboratory leach tests. The leach tests were performed at Pacific Northwest National Laboratory (PNNL) for Washington River Protection Solutions (WRPS) to evaluate the release of key constituents from monoliths of Cast Stone prepared with four simulated low-activity waste (LAW) liquid waste streams.

  16. Energy, Transportation, Air Quality, Climate Change, Health Nexus: Sustainable Energy is Good for Our Health.

    PubMed

    Erickson, Larry E; Jennings, Merrisa

    2017-01-01

The Paris Agreement on Climate Change has the potential to improve air quality and human health by encouraging the electrification of transportation and a transition from coal to sustainable energy. There will be human health benefits from reducing combustion emissions in all parts of the world. Solar-powered charging infrastructure for electric vehicles adds renewable electricity generation, shaded parking, and the charging capability needed to reduce range anxiety. The costs of wind power, solar panels, and batteries are falling because of technological progress, the magnitude of commercial activity, production experience, and competition associated with new trillion-dollar markets. These energy and transportation transitions can have a very positive impact on health. The energy, transportation, air quality, climate change, health nexus may benefit from additional progress in developing solar-powered charging infrastructure.

  17. Wa-Chair: A concept for development of economical stair-climbing wheelchair

    NASA Astrophysics Data System (ADS)

    Jyoti Baishya, Nayan; Ogai, Harutoshi

    2018-02-01

In this paper, a concept for the development of a cost-effective and reliable stair-climbing wheelchair is proposed. A slider-crank mechanism is used to compensate for any variation in the inclination angle of the wheelchair during ascent or descent on stairs. Controlling the wheelchair's inclination angle can reduce risk for the rider, as it prevents the wheelchair from toppling. A prototype has been developed to validate the proposed mechanism, which also allows the rider to face the direction of travel, adding further safety.

  18. From Progressive to Flat: How Tax Reform would Affect the Military

    DTIC Science & Technology

    2012-06-01

...deduction. Attach Form... 36 Add lines 23 through 35... 36 37 Subtract line 36 from line 22. This is your adjusted gross income... 37 156... name | Foreign province/county | Foreign postal code...

  19. [News in neurology 2013].

    PubMed

    Spatola, Marianna; Rossetti, Andrea O; Michel, Patrick; Kuntzer, Thierry; Benninger, David; Nater, Bernard; Démonet, Jean-François; Schluep, Myriam; Du Pasquier, Renaud A; Vingerhoets, François

    2014-01-15

In 2013, perampanel was approved as an add-on treatment for generalised and focal seizures in pharmaco-resistant epilepsy. New anticoagulants are superior to antivitamin K in secondary stroke prevention in case of atrial fibrillation. DBS remains a valid therapeutic option for advanced Parkinson's disease. Intranasal ketamine seems to reduce the intensity of severe migraine aura. High concentrations of topical capsaicin improve post-herpetic neuralgia. In Alzheimer's disease, statins might deteriorate cognitive functions. Oral immuno-modifying treatments for relapsing-remitting multiple sclerosis have been shown to slow cerebral atrophy progression at two years.

  20. Catalysts for CO2/epoxide ring-opening copolymerization

    PubMed Central

    Trott, G.; Saini, P. K.; Williams, C. K.

    2016-01-01

    This article summarizes and reviews recent progress in the development of catalysts for the ring-opening copolymerization of carbon dioxide and epoxides. The copolymerization is an interesting method to add value to carbon dioxide, including from waste sources, and to reduce pollution associated with commodity polymer manufacture. The selection of the catalyst is of critical importance to control the composition, properties and applications of the resultant polymers. This review highlights and exemplifies some key recent findings and hypotheses, in particular using examples drawn from our own research. PMID:26755758

  1. Remodeling a tissue: subtraction adds insight.

    PubMed

    Axelrod, Jeffrey D

    2012-11-27

    Sculpting a body plan requires both patterning of gene expression and translating that pattern into morphogenesis. Developmental biologists have made remarkable strides in understanding gene expression patterning, but despite a long history of fascination with the mechanics of morphogenesis, knowledge of how patterned gene expression drives the emergence of even simple shapes and forms has grown at a slower pace. The successful merging of approaches from cell biology, developmental biology, imaging, engineering, and mathematical and computational sciences is now accelerating progress toward a fuller and better integrated understanding of the forces shaping morphogenesis.

  2. Rational growth of Bi2S3 nanotubes from quasi-two-dimensional precursors.

    PubMed

    Ye, Changhui; Meng, Guowen; Jiang, Zhi; Wang, Yinhai; Wang, Guozhong; Zhang, Lide

    2002-12-25

Synthesis of Bi2S3 nanotubes by rolling of the quasi-two-dimensional (2-D) layered precursor represents new progress in the synthetic approach and adds new members to the present inorganic fullerene family. These nanotubes display multiwalled structures that resemble those of a multiwalled carbon nanotube. The successful synthesis of Bi2S3 nanotubes highlights the feasibility of inorganic fullerene-like structures from other chemicals that possess layered crystalline structures, not only the well-known 2-D family, but possibly also those quasi-2-D members.

  3. Effect of Worked Examples on Mental Model Progression in a Computer-Based Simulation Learning Environment

    ERIC Educational Resources Information Center

    Darabi, Aubteen; Nelson, David W.; Meeker, Richard; Liang, Xinya; Boulware, Wilma

    2010-01-01

    In a diagnostic problem solving operation of a computer-simulated chemical plant, chemical engineering students were randomly assigned to two groups: one studying product-oriented worked examples, the other practicing conventional problem solving. Effects of these instructional strategies on the progression of learners' mental models were examined…

  4. Logs Analysis of Adapted Pedagogical Scenarios Generated by a Simulation Serious Game Architecture

    ERIC Educational Resources Information Center

    Callies, Sophie; Gravel, Mathieu; Beaudry, Eric; Basque, Josianne

    2017-01-01

    This paper presents an architecture designed for simulation serious games, which automatically generates game-based scenarios adapted to learner's learning progression. We present three central modules of the architecture: (1) the learner model, (2) the adaptation module and (3) the logs module. The learner model estimates the progression of the…

  5. Comparing the cost-effectiveness of simulation modalities: a case study of peripheral intravenous catheterization training.

    PubMed

    Isaranuwatchai, Wanrudee; Brydges, Ryan; Carnahan, Heather; Backstein, David; Dubrowski, Adam

    2014-05-01

While the ultimate goal of simulation training is to enhance learning, cost-effectiveness is a critical factor. Research that compares simulation training in terms of educational- and cost-effectiveness will lead to better-informed curricular decisions. Using previously published data, we conducted a cost-effectiveness analysis of three simulation-based programs. Medical students (n = 15 per group) practiced in one of three 2-h intravenous catheterization skills training programs: low-fidelity (virtual reality), high-fidelity (mannequin), or progressive (consisting of virtual reality, task trainer, and mannequin simulator). One week later, all performed a transfer test on a hybrid simulation (standardized patient with a task trainer). We used a net benefit regression model to identify the most cost-effective training program via paired comparisons. We also created a cost-effectiveness acceptability curve to visually represent the probability that one program is more cost-effective when compared to its comparator at various 'willingness-to-pay' values. We conducted separate analyses for implementation and total costs. The results showed that the progressive program had the highest total cost (p < 0.001), whereas the high-fidelity program had the highest implementation cost (p < 0.001). While the most cost-effective program depended on the decision makers' willingness-to-pay value, the progressive training program was generally the most educationally- and cost-effective. Our analyses suggest that a progressive program that strategically combines simulation modalities provides a cost-effective solution. More generally, we have introduced how a cost-effectiveness analysis may be applied to simulation training; a method that medical educators may use to inform investment decisions (e.g., purchasing cost-effective and educationally sound simulators).
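The net benefit framework underlying such analyses reduces to a simple formula: the incremental net monetary benefit INB = λ·ΔE − ΔC, where λ is the willingness-to-pay per unit of effectiveness. A short sketch with hypothetical numbers (not the study's data) shows how the preferred program can flip as λ varies:

```python
def incremental_net_benefit(delta_effect, delta_cost, wtp):
    """INB = wtp * delta_effect - delta_cost; a positive value favours
    the new (here: progressive) program at that willingness-to-pay."""
    return wtp * delta_effect - delta_cost

# Hypothetical numbers, not the study's data: suppose the progressive program
# scores 0.8 points higher on the transfer test and costs $120 more per trainee.
delta_e, delta_c = 0.8, 120.0
prefer_progressive = {wtp: incremental_net_benefit(delta_e, delta_c, wtp) > 0
                      for wtp in (100, 160, 200)}
# The decision flips once willingness-to-pay exceeds delta_c / delta_e = $150 per point.
```

Plotting the probability that INB > 0 against λ, allowing for sampling uncertainty, yields exactly the cost-effectiveness acceptability curve described in the abstract.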

  6. Refinement of Objective Motion Cueing Criteria Investigation Based on Three Flight Tasks

    NASA Technical Reports Server (NTRS)

    Zaal, Petrus M. T.; Schroeder, Jeffery A.; Chung, William W.

    2017-01-01

The objective of this paper is to refine objective motion cueing criteria for commercial transport simulators based on pilots' performance in three flying tasks. Actuator hardware and software algorithms determine motion cues. Today, during a simulator qualification, engineers objectively evaluate only the hardware. Pilot inspectors subjectively assess the overall motion cueing system (i.e., hardware plus software); however, attributing any deficiencies that arise to either the hardware or the software is acknowledged to be challenging. ICAO 9625 has an Objective Motion Cueing Test (OMCT), which is now a required test in the FAA's part 60 regulations for new devices, evaluating the software and hardware together; however, it lacks accompanying fidelity criteria. Hosman has documented OMCT results for a statistical sample of eight simulators, which is useful, but having validated criteria would be an improvement. In a previous experiment, we developed initial objective motion cueing criteria that this paper seeks to refine. Sinacori suggested simple criteria which are in reasonable agreement with much of the literature. These criteria often necessitate motion displacements greater than most training simulators can provide. While some previous studies used transport aircraft, the majority used fighter aircraft or helicopters. Those that used transport aircraft considered degraded flight characteristics. As a result, earlier criteria lean more towards being sufficient, rather than necessary, criteria for typical transport aircraft training applications. Considering the prevalence of 60-inch, six-legged hexapod training simulators, a relevant question is "what are the necessary criteria that can be used with the ICAO 9625 diagnostic?" This study adds to the literature as follows. First, it examines well-behaved transport aircraft characteristics, but in three challenging tasks. 
The tasks are equivalent to the ones used in our previous experiment, allowing us to directly compare the results and add to the previous data. Second, it uses the Vertical Motion Simulator (VMS), the world's largest vertical displacement simulator. This allows inclusion of relatively large motion conditions, much larger than a typical training simulator can provide. Six new motion configurations were used that explore the motion responses between the initial objective motion cueing boundaries found in a previous experiment and what current hexapod simulators typically provide. Finally, a sufficiently large pilot pool added statistical reliability to the results.

  7. GENOA-PFA: Progressive Fracture in Composites Simulated Computationally

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.

    2000-01-01

    GENOA-PFA is a commercial version of the Composite Durability Structural Analysis (CODSTRAN) computer program that simulates the progression of damage ultimately leading to fracture in polymer-matrix-composite (PMC) material structures under various loading and environmental conditions. GENOA-PFA offers several capabilities not available in other programs developed for this purpose, making it preferable for use in analyzing the durability and damage tolerance of complex PMC structures in which the fiber reinforcements occur in two- and three-dimensional weaves and braids. GENOA-PFA implements a progressive-fracture methodology based on the idea that a structure fails when flaws that may initially be small (even microscopic) grow and/or coalesce to a critical dimension where the structure no longer has an adequate safety margin to avoid catastrophic global fracture. Damage is considered to progress through five stages: (1) initiation, (2) growth, (3) accumulation (coalescence of propagating flaws), (4) stable propagation (up to the critical dimension), and (5) unstable or very rapid propagation (beyond the critical dimension) to catastrophic failure. The computational simulation of progressive failure involves formal procedures for identifying the five different stages of damage and for relating the amount of damage at each stage to the overall behavior of the deteriorating structure. In GENOA-PFA, mathematical modeling of the composite physical behavior involves an integration of simulations at multiple, hierarchical scales ranging from the macroscopic (lamina, laminate, and structure) to the microscopic (fiber, matrix, and fiber/matrix interface), as shown in the figure. The code includes algorithms to simulate the progression of damage from various source defects, including (1) through-the-thickness cracks and (2) voids with edge, pocket, internal, or mixed-mode delaminations.

  8. Regional climate model sensitivity to domain size

    NASA Astrophysics Data System (ADS)

    Leduc, Martin; Laprise, René

    2009-05-01

    Regional climate models are increasingly used to add small-scale features that are not present in their lateral boundary conditions (LBC). It is well known that the limited area over which a model is integrated must be large enough to allow the full development of small-scale features. On the other hand, integrations on very large domains have shown important departures from the driving data, unless large scale nudging is applied. The issue of domain size is studied here by using the “perfect model” approach. This method consists first of generating a high-resolution climatic simulation, nicknamed big brother (BB), over a large domain of integration. The next step is to degrade this dataset with a low-pass filter emulating the usual coarse-resolution LBC. The filtered nesting data (FBB) are hence used to drive a set of four simulations (LBs for Little Brothers), with the same model, but on progressively smaller domain sizes. The LB statistics for a climate sample of four winter months are compared with BB over a common region. The time average (stationary) and transient-eddy standard deviation patterns of the LB atmospheric fields generally improve in terms of spatial correlation with the reference (BB) when domain gets smaller. The extraction of the small-scale features by using a spectral filter allows detecting important underestimations of the transient-eddy variability in the vicinity of the inflow boundary, which can penalize the use of small domains (less than 100 × 100 grid points). The permanent “spatial spin-up” corresponds to the characteristic distance that the large-scale flow needs to travel before developing small-scale features. The spin-up distance tends to grow in size at higher levels in the atmosphere.
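The degradation step of the big-brother protocol is, at heart, a low-pass filter applied to the high-resolution fields. A one-dimensional boxcar version, purely illustrative (the study uses a spectral filter on 2-D fields), conveys the idea:

```python
def low_pass(field, half_width=1):
    """Boxcar smoothing that suppresses scales finer than ~(2*half_width + 1)
    grid points, mimicking how high-resolution 'big brother' output is
    degraded into coarse driving data for the 'little brother' runs."""
    out = []
    for i in range(len(field)):
        lo, hi = max(0, i - half_width), min(len(field), i + half_width + 1)
        out.append(sum(field[lo:hi]) / (hi - lo))
    return out

# A grid-scale (two-point) oscillation: the filter damps it strongly, just as
# coarse LBC lack the small scales the nested model must regenerate downstream.
fine = [10.0, 0.0, 10.0, 0.0, 10.0, 0.0]
coarse = low_pass(fine)
```

Comparing the little-brother runs against the unfiltered big-brother fields then isolates how much of the removed small-scale variability each domain size can regenerate, and over what "spatial spin-up" distance.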

  9. Monte Carlo study of the effects of system geometry and antiscatter grids on cone-beam CT scatter distributions

    PubMed Central

    Sisniega, A.; Zbijewski, W.; Badal, A.; Kyprianou, I. S.; Stayman, J. W.; Vaquero, J. J.; Siewerdsen, J. H.

    2013-01-01

    Purpose: The proliferation of cone-beam CT (CBCT) has created interest in performance optimization, with x-ray scatter identified among the main limitations to image quality. CBCT often contends with elevated scatter, but the wide variety of imaging geometry in different CBCT configurations suggests that not all configurations are affected to the same extent. Graphics processing unit (GPU) accelerated Monte Carlo (MC) simulations are employed over a range of imaging geometries to elucidate the factors governing scatter characteristics, efficacy of antiscatter grids, guide system design, and augment development of scatter correction. Methods: A MC x-ray simulator implemented on GPU was accelerated by inclusion of variance reduction techniques (interaction splitting, forced scattering, and forced detection) and extended to include x-ray spectra and analytical models of antiscatter grids and flat-panel detectors. The simulator was applied to small animal (SA), musculoskeletal (MSK) extremity, otolaryngology (Head), breast, interventional C-arm, and on-board (kilovoltage) linear accelerator (Linac) imaging, with an axis-to-detector distance (ADD) of 5, 12, 22, 32, 60, and 50 cm, respectively. Each configuration was modeled with and without an antiscatter grid and with (i) an elliptical cylinder varying 70–280 mm in major axis; and (ii) digital murine and anthropomorphic models. The effects of scatter were evaluated in terms of the angular distribution of scatter incident upon the detector, scatter-to-primary ratio (SPR), artifact magnitude, contrast, contrast-to-noise ratio (CNR), and visual assessment. Results: Variance reduction yielded improvements in MC simulation efficiency ranging from ∼17-fold (for SA CBCT) to ∼35-fold (for Head and C-arm), with the most significant acceleration due to interaction splitting (∼6 to ∼10-fold increase in efficiency). 
The benefit of a more extended geometry was evident by virtue of a larger air gap—e.g., for a 16 cm diameter object, the SPR reduced from 1.5 for ADD = 12 cm (MSK geometry) to 1.1 for ADD = 22 cm (Head) and to 0.5 for ADD = 60 cm (C-arm). Grid efficiency was higher for configurations with shorter air gap due to a broader angular distribution of scattered photons—e.g., scatter rejection factor ∼0.8 for MSK geometry versus ∼0.65 for C-arm. Grids reduced cupping for all configurations but had limited improvement on scatter-induced streaks and resulted in a loss of CNR for the SA, Breast, and C-arm. Relative contribution of forward-directed scatter increased with a grid (e.g., Rayleigh scatter fraction increasing from ∼0.15 without a grid to ∼0.25 with a grid for the MSK configuration), resulting in scatter distributions with greater spatial variation (the form of which depended on grid orientation). Conclusions: A fast MC simulator combining GPU acceleration with variance reduction provided a systematic examination of a range of CBCT configurations in relation to scatter, highlighting the magnitude and spatial uniformity of individual scatter components, illustrating tradeoffs in CNR and artifacts, and distinguishing the system geometries for which grids are more beneficial (e.g., MSK) from those in which an extended geometry is the better defense (e.g., C-arm head imaging). Compact geometries with an antiscatter grid challenge assumptions of slowly varying scatter distributions due to increased contribution of Rayleigh scatter. PMID:23635285

  10. Home care for the disabled elderly: predictors and expected costs.

    PubMed Central

    Coughlin, T A; McBride, T D; Perozek, M; Liu, K

    1992-01-01

    While interest in publicly funded home care for the disabled elderly is keen, basic policy issues need to be addressed before an appropriate program can be adopted and financed. This article presents findings from a study in which the cost implications of anticipated behavioral responses (for example, caregiver substitution) are estimated. Using simulation techniques, the results demonstrate that anticipated behavioral responses would likely add between $1.8 and $2.7 billion (1990 dollars) to the costs of a public home care program. Results from a variety of cost simulations are presented. The data base for the study was the 1982 National Long-Term Care Survey. PMID:1399652

  11. Study on the characteristics of multi-infeed HVDC

    NASA Astrophysics Data System (ADS)

    Li, Ming; Song, Xinli; Liu, Wenzhuo; Xiang, Yinxing; Zhao, Shutao; Su, Zhida; Meng, Hang

    2017-09-01

China has built more than ten HVDC transmission projects in recent years [1], and east China has now formed a multi-infeed HVDC grid. Studying the interactions among multiple HVDC links and the resulting system characteristics is therefore urgent. In this paper, an electromechanical-electromagnetic hybrid model is built from the electromechanical data of an actual power network: electromagnetic models simulate the HVDC section, while electromechanical models simulate the AC power network [2]. To study the characteristics of the grid, faults are applied to the lines and the resulting fault characteristics are analysed.

  12. Time reversal and charge conjugation in an embedding quantum simulator.

    PubMed

    Zhang, Xiang; Shen, Yangchao; Zhang, Junhua; Casanova, Jorge; Lamata, Lucas; Solano, Enrique; Yung, Man-Hong; Zhang, Jing-Ning; Kim, Kihwan

    2015-08-04

A quantum simulator is an important device that may soon outperform current classical computations. A basic arithmetic operation, the complex conjugate, however, is considered impossible to implement in such a quantum system due to the linear character of quantum mechanics. Here, we present the experimental quantum simulation of such an unphysical operation beyond the regime of unitary and dissipative evolutions through the embedding of a quantum dynamics in the electronic multilevels of a (171)Yb(+) ion. We perform time reversal and charge conjugation, which are paradigmatic examples of antiunitary symmetry operators, in the evolution of a Majorana equation without the tomographic knowledge of the evolving state. Thus, these operations can be applied regardless of the system size. Our approach offers the possibility to add unphysical operations to the toolbox of quantum simulation, and provides a route to efficiently compute otherwise intractable quantities, such as entanglement monotones.
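The embedding trick can be illustrated classically: writing a complex state as a real vector of twice the dimension turns complex conjugation, which is antilinear and hence not directly implementable, into an ordinary linear map on the enlarged space. A minimal sketch (illustrative arithmetic only, not the ion-trap protocol itself):

```python
def embed(psi):
    """Map a complex state onto a real vector of twice the dimension:
    psi -> (Re psi, Im psi)."""
    return [z.real for z in psi] + [z.imag for z in psi]

def conjugate_embedded(v):
    """Complex conjugation in the embedded space: flip the imaginary half.
    This is a linear map, which is why it becomes implementable once the
    dynamics are embedded in a larger physical system."""
    n = len(v) // 2
    return v[:n] + [-x for x in v[n:]]

def unembed(v):
    """Recover the complex state from its real embedding."""
    n = len(v) // 2
    return [complex(v[i], v[i + n]) for i in range(n)]

psi = [complex(0.6, 0.0), complex(0.0, 0.8)]
conj = unembed(conjugate_embedded(embed(psi)))  # the conjugated state
```

In the experiment, the doubled real space is realized by extra electronic levels of the ion, so the antiunitary operation is carried out without tomographic reconstruction of the state.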

  13. Time reversal and charge conjugation in an embedding quantum simulator

    PubMed Central

    Zhang, Xiang; Shen, Yangchao; Zhang, Junhua; Casanova, Jorge; Lamata, Lucas; Solano, Enrique; Yung, Man-Hong; Zhang, Jing-Ning; Kim, Kihwan

    2015-01-01

    A quantum simulator is an important device that may soon outperform current classical computations. A basic arithmetic operation, the complex conjugate, however, is considered impossible to implement in such a quantum system because of the linear character of quantum mechanics. Here, we present the experimental quantum simulation of such an unphysical operation, beyond the regime of unitary and dissipative evolutions, through the embedding of a quantum dynamics in the electronic multilevels of a 171Yb+ ion. We perform time reversal and charge conjugation, which are paradigmatic examples of antiunitary symmetry operators, in the evolution of a Majorana equation without tomographic knowledge of the evolving state. Thus, these operations can be applied regardless of the system size. Our approach offers the possibility to add unphysical operations to the toolbox of quantum simulation, and provides a route to efficiently compute otherwise intractable quantities, such as entanglement monotones. PMID:26239028

  14. Progressive learning in endoscopy simulation training improves clinical performance: a blinded randomized trial.

    PubMed

    Grover, Samir C; Scaffidi, Michael A; Khan, Rishad; Garg, Ankit; Al-Mazroui, Ahmed; Alomani, Tareq; Yu, Jeffrey J; Plener, Ian S; Al-Awamy, Mohamed; Yong, Elaine L; Cino, Maria; Ravindran, Nikila C; Zasowski, Mark; Grantcharov, Teodor P; Walsh, Catharine M

    2017-11-01

    A structured comprehensive curriculum (SCC) that uses simulation-based training (SBT) can improve clinical colonoscopy performance. This curriculum may be enhanced through the application of progressive learning, a training strategy centered on incrementally challenging learners. We aimed to determine whether a progressive learning-based curriculum (PLC) would lead to superior clinical performance compared with an SCC. This was a single-blinded randomized controlled trial conducted at a single academic center. Thirty-seven novice endoscopists were recruited and randomized to either a PLC (n = 18) or to an SCC (n = 19). The PLC comprised 6 hours of SBT, which progressed in complexity and difficulty. The SCC included 6 hours of SBT, with cases of random order of difficulty. Both groups received expert feedback and 4 hours of didactic teaching. Participants were assessed at baseline, immediately after training, and 4 to 6 weeks after training. The primary outcome was participants' performance during their first 2 clinical colonoscopies, as assessed by using the Joint Advisory Group Direct Observation of Procedural Skills assessment tool (JAG DOPS). Secondary outcomes were differences in endoscopic knowledge, technical and communication skills, and global performance in the simulated setting. The PLC group outperformed the SCC group during first and second clinical colonoscopies, measured by JAG DOPS (P < .001). Additionally, the PLC group had superior technical and communication skills and global performance in the simulated setting (P < .05). There were no differences between groups in endoscopic knowledge (P > .05). Our findings demonstrate the superiority of a PLC for endoscopic simulation, compared with an SCC. Challenging trainees progressively is a simple, theory-based approach to simulation whereby the performance of clinical colonoscopies can be improved. (Clinical trial registration number: NCT02000180.). 

  15. Joint coverage probability in a simulation study on Continuous-Time Markov Chain parameter estimation.

    PubMed

    Benoit, Julia S; Chan, Wenyaw; Doody, Rachelle S

    2015-01-01

    Parameter dependency within data sets in simulation studies is common, especially in models such as Continuous-Time Markov Chains (CTMC). Additionally, the literature lacks a comprehensive examination of estimation performance for the likelihood-based general multi-state CTMC. Among studies attempting to assess the estimation, none have accounted for dependency among parameter estimates. The purpose of this research is twofold: 1) to develop a multivariate approach for assessing accuracy and precision in simulation studies; and 2) to add to the literature a comprehensive examination of the estimation of a general 3-state CTMC model. Simulation studies are conducted to analyze longitudinal data with a trinomial outcome using a CTMC with and without covariates. Measures of performance including bias, component-wise coverage probabilities, and joint coverage probabilities are calculated. An application is presented using Alzheimer's disease caregiver stress levels. Comparisons of joint and component-wise parameter estimates yield conflicting inferential results in simulations from models with and without covariates. In conclusion, caution should be taken when conducting simulation studies aiming to assess performance, and the choice of inference should properly reflect the purpose of the simulation.
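The sketch below illustrates why joint and component-wise assessments can disagree. It is not the paper's CTMC model: the true parameter values, standard errors, and correlation are assumed for illustration. Correlated parameter estimates are simulated, and the coverage of each 95% Wald interval is compared with the joint coverage of both intervals at once.

```python
import numpy as np

# Illustrative sketch (not the paper's CTMC estimators): with correlated
# parameter estimates, joint coverage of two 95% intervals differs from
# the component-wise coverages.
rng = np.random.default_rng(0)
true = np.array([0.5, 1.2])            # hypothetical true parameters
se = np.array([0.1, 0.2])              # assumed standard errors
rho = 0.8                              # assumed correlation between estimates
cov = np.array([[se[0]**2, rho*se[0]*se[1]],
                [rho*se[0]*se[1], se[1]**2]])
z = 1.96                               # 95% normal critical value

est = rng.multivariate_normal(true, cov, size=20000)
lo, hi = est - z*se, est + z*se        # per-replication Wald intervals
covered = (lo <= true) & (true <= hi)

componentwise = covered.mean(axis=0)   # each close to 0.95
joint = covered.all(axis=1).mean()     # both parameters covered at once
print(componentwise, joint)
```

By construction the joint coverage can never exceed either component-wise coverage, and with dependence it is not simply their product, which is the kind of discrepancy the paper's joint coverage probability is designed to expose.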

  16. Flight investigation of a four-dimensional terminal area guidance system for STOL aircraft

    NASA Technical Reports Server (NTRS)

    Neuman, F.; Hardy, G. H.

    1981-01-01

    A series of flight tests and fast-time simulations were conducted, using the augmentor wing jet STOL research aircraft and the STOLAND 4D-RNAV system to add to the growing data base of 4D-RNAV system performance capabilities. To obtain statistically meaningful data a limited amount of flight data were supplemented by a statistically significant amount of data obtained from fast-time simulation. The results of these tests are reported. Included are comparisons of the 4D-RNAV estimated winds with actual winds encountered in flight, as well as data on along-track navigation and guidance errors, and time-of-arrival errors at the final approach waypoint. In addition, a slight improvement of the STOLAND 4D-RNAV system is proposed and demonstrated, using the fast-time simulation.

  17. Solving Problems With SINDA/FLUINT

    NASA Technical Reports Server (NTRS)

    2002-01-01

    SINDA/FLUINT, the NASA standard software system for thermohydraulic analysis, provides computational simulation of interacting thermal and fluid effects in designs modeled as heat transfer and fluid flow networks. The product saves time and money by making the user's design process faster and easier, and allowing the user to gain a better understanding of complex systems. The code is completely extensible, allowing the user to choose the features, accuracy and approximation levels, and outputs. Users can also add their own customizations as needed to handle unique design tasks or to automate repetitive tasks. Applications for SINDA/FLUINT include the pharmaceutical, petrochemical, biomedical, electronics, and energy industries. The system has been used to simulate nuclear reactors, windshield wipers, and human windpipes. In the automotive industry, it simulates the transient liquid/vapor flows within air conditioning systems.
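The network idea behind a thermal analyzer of this kind can be sketched as a steady-state conductance network. This is only an illustration of the concept, not SINDA/FLUINT's actual input language or solver; the conductance and boundary temperatures are assumed values.

```python
import numpy as np

# Minimal thermal-network sketch: four nodes in a chain joined by
# conductances G, with nodes 0 and 3 held at fixed boundary temperatures.
# Steady state follows from the energy balance at each internal node.
G = 0.5                          # W/K, assumed uniform conductance
T_left, T_right = 350.0, 300.0   # K, boundary temperatures

# Energy balance at internal nodes 1 and 2:
#   G*(T0 - T1) + G*(T2 - T1) = 0
#   G*(T1 - T2) + G*(T3 - T2) = 0
A = np.array([[2*G, -G],
              [-G, 2*G]])
b = np.array([G*T_left, G*T_right])
T_internal = np.linalg.solve(A, b)
print(T_internal)                # linear profile between the boundaries
```

Real tools generalize this to thousands of nodes, temperature-dependent conductances, transients, and coupled fluid loops, but the per-node balance equation is the same building block.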

  18. Transport properties and equation of state for HCNO mixtures in and beyond the warm dense matter regime

    DOE PAGES

    Ticknor, Christopher; Collins, Lee A.; Kress, Joel D.

    2015-08-04

    We present simulations of a four-component mixture of HCNO with orbital-free molecular dynamics (OFMD). These simulations were conducted for 5–200 eV with densities ranging between 0.184 and 36.8 g/cm³. We extract the equation of state from the simulations and compare to average-atom models, finding that only a cold-curve model needs to be added to obtain excellent agreement. In addition, we studied mass transport properties. We present fits to the self-diffusion and shear viscosity that reproduce the transport properties over the parameter range studied. We compare these OFMD results to models based on the Coulomb coupling parameter and one-component plasmas.

  19. A Layered Solution for Supercomputing Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grider, Gary

    To solve the supercomputing challenge of memory keeping up with processing speed, a team at Los Alamos National Laboratory developed two innovative memory management and storage technologies. Burst buffers peel off data onto flash memory to support the checkpoint/restart paradigm of large simulations. MarFS adds a thin software layer enabling a new tier for campaign storage—based on inexpensive, failure-prone disk drives—between disk drives and tape archives.

  20. CalPro: a spreadsheet program for the management of California mixed-conifer stands.

    Treesearch

    Jingjing Liang; Joseph Buongiorno; Robert A. Monserud

    2004-01-01

    CalPro is an add-in program developed to work with Microsoft Excel to simulate the growth and management of uneven-aged mixed-conifer stands in California. Its built-in growth model was calibrated from 177 uneven-aged plots on industry and other private lands. Stands are described by the number of trees per acre in each of nineteen 2-inch diameter classes in...

  1. Enhancing robustness of interdependent network by adding connectivity and dependence links

    NASA Astrophysics Data System (ADS)

    Cui, Pengshuai; Zhu, Peidong; Wang, Ke; Xun, Peng; Xia, Zhuoqun

    2018-05-01

    Enhancing the robustness of interdependent networks by adding connectivity links has been researched extensively; however, few studies focus on adding both connectivity and dependence links. In this paper, we study how to allocate a limited budget to add both connectivity and dependence links. First, we divide attackers into stubborn attackers and smart attackers according to whether they change their attack mode as the network structure changes. Then, through simulations, link-addition strategies are given for each type of attacker, with which the limited budget can be allocated between connectivity and dependence links to achieve more robustness than adding either type alone. The results show that, compared with adding only connectivity links or only dependence links, allocating the limited resources reasonably across both types brings more robustness to the interdependent networks.
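The underlying robustness question can be illustrated with a toy single-network experiment (this is a sketch only: it has one network rather than the paper's interdependent pair, no dependence links, and arbitrary assumed sizes): does adding extra connectivity links enlarge the surviving giant component after a random node attack?

```python
import random

# Toy robustness sketch: a ring network with and without extra random
# chords, attacked by removing 30% of nodes at random. Robustness is
# measured as the size of the largest surviving connected component.
def largest_component(nodes, adj):
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        stack, comp = [start], 0
        seen.add(start)
        while stack:
            u = stack.pop(); comp += 1
            for v in adj[u]:
                if v in nodes and v not in seen:
                    seen.add(v); stack.append(v)
        best = max(best, comp)
    return best

def attack(extra_links, seed=1):
    rng = random.Random(seed)
    n = 200
    adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}  # ring network
    for _ in range(extra_links):                             # added links
        u, v = rng.sample(range(n), 2)
        adj[u].add(v); adj[v].add(u)
    survivors = set(rng.sample(range(n), int(0.7 * n)))      # 30% removed
    return largest_component(survivors, adj)

print(attack(0), attack(100))   # ring only vs ring plus 100 chords
```

The extra chords let surviving ring fragments reconnect, so the giant component after attack is much larger; the paper's contribution is deciding how to split a budget between such connectivity links and inter-network dependence links.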

  2. Broadband Transmission Loss Due to Reverberant Excitation

    NASA Technical Reports Server (NTRS)

    Barisciano, Lawrence P. Jr.

    1999-01-01

    The noise transmission characteristics of candidate curved aircraft sidewall panel constructions are examined analytically using finite element models of the selected panel geometries. The models are validated by experimental modal analyses and transmission loss testing. The structural and acoustic response of the models is then examined when subjected to random or reverberant excitation, the simulation of which is also discussed. For a candidate curved honeycomb panel, the effect of add-on trim panel treatments is examined. Specifically, two different mounting configurations are discussed and their effect on the transmission loss of the panel is presented. This study finds that the add-on acoustical treatments do improve the primary structure's transmission loss characteristics; however, much more research is necessary to draw valid conclusions about the optimal configuration for maximum noise transmission loss. This paper describes several directions for the extension of this work.

  3. Interference Canceller Based on Cycle-and-Add Property for Single User Detection in DS-CDMA

    NASA Astrophysics Data System (ADS)

    Hettiarachchi, Ranga; Yokoyama, Mitsuo; Uehara, Hideyuki; Ohira, Takashi

    In this paper, the performance of a novel interference cancellation technique for single-user detection in a direct-sequence code-division multiple access (DS-CDMA) system is investigated. The new algorithm is based on the Cycle-and-Add property of PN (pseudorandom noise) sequences and can be applied to both synchronous and asynchronous systems. The proposed strategy provides a simple method that deletes interference signals one by one regardless of their power levels. It is therefore possible to overcome the near-far problem (NFP) in a successive manner without using transmit power control (TPC) techniques. The validity of the proposed procedure is corroborated by computer simulations in additive white Gaussian noise (AWGN) and frequency-nonselective fading channels. Performance results indicate that the proposed receiver outperforms the conventional receiver and, in many cases, does so with a considerable gain.
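The cycle-and-add (shift-and-add) property the receiver exploits can be demonstrated directly: XOR-ing an m-sequence with any cyclic shift of itself yields another cyclic shift of the same sequence. The sketch below uses a short LFSR (x³ + x + 1, period 7) purely for illustration, not the paper's actual spreading codes.

```python
# Generate a period-7 m-sequence from a 3-stage Fibonacci LFSR and
# verify the cycle-and-add property: seq XOR rotate(seq, k) is again
# a rotation of seq.
def m_sequence():
    state = [1, 0, 0]
    out = []
    for _ in range(7):
        out.append(state[-1])
        fb = state[-1] ^ state[1]        # feedback taps for x^3 + x + 1
        state = [fb] + state[:-1]
    return out

def rotate(seq, k):
    return seq[k:] + seq[:k]

seq = m_sequence()
shifted_xor = [a ^ b for a, b in zip(seq, rotate(seq, 3))]
rotations = [rotate(seq, k) for k in range(7)]
print(seq, shifted_xor in rotations)     # the XOR is again a rotation
```

Because the sum of two shifts of the code is itself a (differently shifted) replica of the code, an interferer's contribution can be reconstructed and subtracted one user at a time, which is what makes the successive cancellation in the paper possible.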

  4. 3-D Image Encryption Based on Rubik's Cube and RC6 Algorithm

    NASA Astrophysics Data System (ADS)

    Helmy, Mai; El-Rabaie, El-Sayed M.; Eldokany, Ibrahim M.; El-Samie, Fathi E. Abd

    2017-12-01

    A novel encryption algorithm based on the 3-D Rubik's cube is proposed in this paper to achieve 3D encryption of a group of images. The proposed algorithm begins with RC6 as a first step for encrypting multiple images separately. The resulting encrypted images are then further encrypted with the 3-D Rubik's cube, with the RC6-encrypted images used as the faces of the cube. In terms of image encryption concepts, the RC6 algorithm adds a degree of diffusion, while the Rubik's cube algorithm adds a degree of permutation. The simulation results demonstrate that the proposed encryption algorithm is efficient and exhibits strong robustness and security. The encrypted images are further transmitted over a wireless Orthogonal Frequency Division Multiplexing (OFDM) system and decrypted at the receiver side. Evaluation of the quality of the decrypted images at the receiver side shows good results.
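A minimal sketch of the permutation stage only (the RC6 diffusion stage and the actual 3-D cube construction are omitted; the key vectors and the toy 3×3 "image" are illustrative): Rubik's-cube-style row and column rotations driven by a key, which are exactly invertible by applying the opposite rotations in reverse order.

```python
# Rubik's-cube-style permutation sketch: rotate each row and each
# column of a small matrix by key-specified amounts, then invert.
def rotate_rows(img, key):
    return [row[-k % len(row):] + row[:-k % len(row)] for row, k in zip(img, key)]

def rotate_cols(img, key):
    cols = [list(c) for c in zip(*img)]          # transpose
    cols = rotate_rows(cols, key)                # rotate the "columns"
    return [list(r) for r in zip(*cols)]         # transpose back

def encrypt(img, row_key, col_key):
    return rotate_cols(rotate_rows(img, row_key), col_key)

def decrypt(img, row_key, col_key):
    inv = lambda key: [-k for k in key]          # opposite rotations
    return rotate_rows(rotate_cols(img, inv(col_key)), inv(row_key))

img = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]          # toy 3x3 "image"
row_key, col_key = [1, 2, 0], [2, 0, 1]
enc = encrypt(img, row_key, col_key)
print(enc != img, decrypt(enc, row_key, col_key) == img)
```

Permutation alone only scrambles pixel positions, which is why the paper pairs it with RC6: the cipher changes pixel values (diffusion) while the cube rotations change pixel locations (permutation).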

  5. A Generalized Method for Automatic Downhand and Wirefeed Control of a Welding Robot and Positioner

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken; Cook, George E.

    1988-01-01

    A generalized method for controlling a six degree-of-freedom (DOF) robot and a two-DOF positioner used for arc welding operations is described. The welding path is defined in the part reference frame, and the robot/positioner joint angles of the equivalent eight-DOF serial linkage are determined via an iterative solution. Three algorithms are presented: the first controls motion of the eight-DOF mechanism such that proper torch motion is achieved while minimizing the sum of squares of joint displacements; the second adds two constraint equations to achieve torch control while maintaining part orientation so that welding occurs in the downhand position; and the third adds the ability to control the proper orientation of a wire-feed mechanism used in gas tungsten arc (GTA) welding operations. The algorithms are verified using ROBOSIM, a NASA-developed computer graphic simulation software package designed for robot systems development.
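The minimum-displacement idea can be sketched on a much smaller problem than the paper's 8-DOF linkage. Below, a redundant 3-link planar arm (assumed unit link lengths and an arbitrary start pose) is driven toward a nearby tool target with damped least-squares steps, dq = Jᵀ(JJᵀ + λI)⁻¹e, each of which moves the tool while keeping the joint displacement small; this is an illustration of the principle, not the paper's algorithm.

```python
import numpy as np

# Redundant 3-link planar arm: forward kinematics, Jacobian, and an
# iterative damped least-squares solution that favors small joint moves.
L = np.array([1.0, 1.0, 1.0])             # assumed link lengths

def fk(q):
    a = np.cumsum(q)                      # absolute link angles
    return np.array([np.sum(L*np.cos(a)), np.sum(L*np.sin(a))])

def jacobian(q):
    a = np.cumsum(q)
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -np.sum(L[i:]*np.sin(a[i:]))
        J[1, i] = np.sum(L[i:]*np.cos(a[i:]))
    return J

q = np.array([0.3, 0.3, 0.3])             # current joint angles
target = fk(q) + np.array([0.10, -0.05])  # nearby tool target
lam = 0.01                                # damping factor
for _ in range(50):
    J = jacobian(q)
    err = target - fk(q)
    q = q + J.T @ np.linalg.solve(J @ J.T + lam*np.eye(2), err)
print(np.linalg.norm(target - fk(q)))     # residual tool-position error
```

The arm has one redundant DOF (3 joints, 2 task coordinates), so infinitely many joint solutions reach the target; the damped pseudoinverse step picks the one with the smallest joint motion, the same criterion the paper's first algorithm applies to its 8-DOF robot-plus-positioner chain.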

  6. Cyber Security Research Frameworks For Coevolutionary Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rush, George D.; Tauritz, Daniel Remy

    Several architectures have been created for developing and testing systems used in network security, but most are meant to provide a platform for running cyber security experiments rather than automating experiment processes. In the first paper, we propose a framework termed Distributed Cyber Security Automation Framework for Experiments (DCAFE) that enables experiment automation and control in a distributed environment. Predictive analysis of adversaries is another thorny issue in cyber security. Game theory can be used to mathematically analyze adversary models, but its scalability limitations restrict its use. Computational game theory allows us to scale classical game theory to larger, more complex systems. In the second paper, we propose a framework termed Coevolutionary Agent-based Network Defense Lightweight Event System (CANDLES) that can coevolve attacker and defender agent strategies and capabilities and evaluate potential solutions with a custom network defense simulation. The third paper is a continuation of the CANDLES project in which we rewrote key parts of the framework. Attackers and defenders have been redesigned to evolve pure strategies, and a new network security simulation is devised which specifies the network architecture and adds a temporal aspect. We also add a hill climber algorithm to evaluate the search space and justify the use of a coevolutionary algorithm.
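A hill climber of the kind used to probe a search space can be sketched in a few lines. The fitness function below is a ones-counting stand-in, not CANDLES's network-defense evaluation; the point is only the baseline search behavior against which a coevolutionary algorithm would be justified.

```python
import random

# Minimal bit-flip hill climber: flip one random bit, keep the change
# if fitness does not get worse, otherwise revert it.
def hill_climb(fitness, n_bits=32, steps=2000, seed=0):
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    best = fitness(x)
    for _ in range(steps):
        i = rng.randrange(n_bits)
        x[i] ^= 1                  # flip one bit
        f = fitness(x)
        if f >= best:
            best = f               # keep the (non-worsening) change
        else:
            x[i] ^= 1              # revert the flip
    return x, best

bits, score = hill_climb(sum)      # ones-counting stand-in fitness
print(score)
```

On a smooth landscape like this the climber reaches the optimum easily; on deceptive or adversarial landscapes it stalls at local optima, which is the kind of evidence used to argue for coevolutionary search instead.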

  7. 3D Laser Imprint Using a Smoother Ray-Traced Power Deposition Method

    NASA Astrophysics Data System (ADS)

    Schmitt, Andrew J.

    2017-10-01

    Imprinting of laser nonuniformities in directly driven ICF targets is a challenging problem to simulate accurately with large radiation-hydro codes. One of the most challenging aspects is the proper construction of the complex and rapidly changing laser interference structure driving the imprint using the reduced laser propagation models (usually ray tracing) found in these codes. We have upgraded the modelling capability in our massively parallel fastrad3d code by adding a more realistic EM-wave interference structure. This interference model adds an axial laser speckle to the previous transverse-only laser structure and can be impressed on our improved smoothed 3D ray-trace package. This latter package, which connects rays to form bundles and performs power deposition calculations on the bundles, is intended to decrease ray-trace noise (which can mask or add to imprint) while using fewer rays. We apply this improved model to 3D simulations of recent imprint experiments performed on the Omega-EP laser and the Nike laser that examined the reduction of imprinting due to very thin high-Z target coatings. We report on the conditions in which this new model makes a significant impact on the development of laser imprint. Supported by US DoE/NNSA.

  8. Validating and Optimizing the Effects of Model Progression in Simulation-Based Inquiry Learning

    ERIC Educational Resources Information Center

    Mulder, Yvonne G.; Lazonder, Ard W.; de Jong, Ton; Anjewierden, Anjo; Bollen, Lars

    2012-01-01

    Model progression denotes the organization of the inquiry learning process in successive phases of increasing complexity. This study investigated the effectiveness of model progression in general, and explored the added value of either broadening or narrowing students' possibilities to change model progression phases. Results showed that…

  9. MAISIE: a multipurpose astronomical instrument simulator environment

    NASA Astrophysics Data System (ADS)

    O'Brien, Alan; Beard, Steven; Geers, Vincent; Klaassen, Pamela

    2016-07-01

    Astronomical instruments often need simulators to preview their data products and test their data reduction pipelines. Instrument simulators have tended to be purpose-built with a single instrument in mind, and attempting to reuse one of these simulators for a different purpose is often a slow and difficult task. MAISIE is a simulator framework designed for reuse on different instruments. An object-oriented design encourages reuse of functionality and structure, while offering the flexibility to create new classes with new functionality. MAISIE is a set of Python classes, interfaces and tools to help build instrument simulators. MAISIE can just as easily build simulators for single- and multi-channel instruments, imagers and spectrometers, and ground- and space-based instruments. To remain easy to use and to facilitate the sharing of simulators across teams, MAISIE is written in Python, a freely available and open-source language. New functionality can be created for MAISIE by creating new classes that represent optical elements. This approach allows new and novel instruments to add functionality and take advantage of the existing MAISIE classes. MAISIE has recently been used successfully to develop the simulator for the JWST/MIRI Medium Resolution Spectrometer.
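The object-oriented idea can be sketched as follows. The class names and `transmit` interface below are illustrative inventions, not MAISIE's actual API: optical elements share one interface and are chained into a simulator, so a new element type only has to implement its own behavior.

```python
# Hypothetical sketch of an element-chain simulator design (names are
# illustrative, not MAISIE's real classes).
class OpticalElement:
    def transmit(self, flux):
        raise NotImplementedError

class Mirror(OpticalElement):
    def __init__(self, reflectivity):
        self.reflectivity = reflectivity
    def transmit(self, flux):
        return flux * self.reflectivity

class Filter(OpticalElement):
    def __init__(self, throughput):
        self.throughput = throughput
    def transmit(self, flux):
        return flux * self.throughput

class Simulator:
    def __init__(self, elements):
        self.elements = elements
    def run(self, flux):
        for el in self.elements:       # light path through the chain
            flux = el.transmit(flux)
        return flux

sim = Simulator([Mirror(0.95), Mirror(0.95), Filter(0.6)])
print(sim.run(1000.0))                 # flux after the optical chain
```

Because each element only implements its own `transmit`, a new instrument adds classes without touching the framework, which is the reuse argument the abstract makes.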

  10. Progressive but Previously Untreated CLL Patients with Greater Array CGH Complexity Exhibit a Less Durable Response to Chemoimmunotherapy

    PubMed Central

    Kay, Neil E.; Eckel-Passow, Jeanette E.; Braggio, Esteban; VanWier, Scott; Shanafelt, Tait D.; Van Dyke, Daniel L.; Jelinek, Diane F.; Tschumper, Renee C.; Kipps, Thomas; Byrd, John C.; Fonseca, Rafael

    2010-01-01

    To better understand the implications of genomic instability and outcome in B-cell CLL, we sought to address genomic complexity as a predictor of chemosensitivity and ultimately clinical outcome in this disease. We employed array-based comparative genomic hybridization (aCGH), using a one-million probe array and identified gains and losses of genetic material in 48 patients treated on a chemoimmunotherapy (CIT) clinical trial. We identified chromosomal gain or loss in ≥6% of the patients on chromosomes 3, 8, 9, 10, 11, 12, 13, 14 and 17. Higher genomic complexity, as a mechanism favoring clonal selection, was associated with shorter progression-free survival and predicted a poor response to treatment. Of interest, CLL cases with loss of p53 surveillance showed more complex genomic features and were found both in patients with a 17p13.1 deletion and in the more favorable genetic subtype characterized by the presence of 13q14.1 deletion. This aCGH study adds information on the association between inferior trial response and increasing genetic complexity as CLL progresses. PMID:21156228

  11. The DIAN-TU Next Generation Alzheimer's prevention trial: Adaptive design and disease progression model.

    PubMed

    Bateman, Randall J; Benzinger, Tammie L; Berry, Scott; Clifford, David B; Duggan, Cynthia; Fagan, Anne M; Fanning, Kathleen; Farlow, Martin R; Hassenstab, Jason; McDade, Eric M; Mills, Susan; Paumier, Katrina; Quintana, Melanie; Salloway, Stephen P; Santacruz, Anna; Schneider, Lon S; Wang, Guoqiao; Xiong, Chengjie

    2017-01-01

    The Dominantly Inherited Alzheimer Network Trials Unit (DIAN-TU) trial is an adaptive platform trial testing multiple drugs to slow or prevent the progression of Alzheimer's disease in autosomal dominant Alzheimer's disease (ADAD) families. With completion of enrollment of the first two drug arms, the DIAN-TU now plans to add new drugs to the platform, designated as the Next Generation (NexGen) prevention trial. In collaboration with ADAD families, philanthropic organizations, academic leaders, the DIAN-TU Pharma Consortium, the National Institutes of Health, and regulatory colleagues, the DIAN-TU developed innovative clinical study designs for the DIAN-TU NexGen prevention trial. Our expanded trial toolbox consists of a disease progression model for ADAD, primary end point DIAN-TU cognitive performance composite, biomarker development, self-administered cognitive assessments, adaptive dose adjustments, and blinded data collection through the last participant completion. These steps represent elements to improve efficacy of the adaptive platform trial and a continued effort to optimize prevention and treatment trials in ADAD. Copyright © 2016 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  12. Next Processor Module: A Hardware Accelerator of UT699 LEON3-FT System for On-Board Computer Software Simulation

    NASA Astrophysics Data System (ADS)

    Langlois, Serge; Fouquet, Olivier; Gouy, Yann; Riant, David

    2014-08-01

    On-Board Computers (OBC) increasingly use integrated systems-on-chip (SoC) that embed processors running from 50 MHz up to several hundred MHz, around which dedicated communication controllers and other input/output channels are attached. For ground testing and On-Board SoftWare (OBSW) validation purposes, a representative simulation of these systems, faster than real time and with cycle-true execution timing, is not achieved with current purely software simulators. For a few years, hybrid solutions have been put in place ([1], [2]), including hardware in the loop to add accuracy and performance to the computer software simulation. This paper presents the results of the work engaged by Thales Alenia Space (TAS-F) at the end of 2010, which led to a validated hardware simulator of the UT699 by mid-2012 that is now qualified and fully used in operational contexts.

  13. Benchmark tests for a Formula SAE Student car prototyping

    NASA Astrophysics Data System (ADS)

    Mariasiu, Florin

    2011-12-01

    Aerodynamic characteristics of a vehicle are important elements in its design and construction. A low drag coefficient brings significant fuel savings and increased engine power efficiency. When designing and developing vehicles, dedicated CFD (Computational Fluid Dynamics) software packages are used to determine the vehicle's aerodynamic characteristics through computer simulation. However, the results obtained by this faster and cheaper method are usually validated by experiments in wind tunnels, which are expensive and require complex testing equipment at relatively high cost. Therefore, the emergence and development of new low-cost testing methods to validate CFD simulation results would bring great economic benefits to the vehicle prototyping process. This paper presents the initial development of a Formula SAE Student race-car prototype using CFD simulation, and also presents a measurement system based on low-cost sensors through which the CFD simulation results were experimentally validated. The CFD software package used for simulation was SolidWorks with the FloXpress add-on, and the experimental measurement system was built using four piezoresistive FlexiForce force sensors.
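The calculation behind such a low-cost validation is short: the force sensors measure the total drag force, from which the drag coefficient follows as Cd = F / (½ρv²A) and can be compared against the CFD prediction. All numbers below are illustrative, not the prototype's actual data.

```python
# Back out a drag coefficient from a measured drag force
# (illustrative values, not the paper's measurements).
rho = 1.225        # kg/m^3, air density at sea level
v = 25.0           # m/s, test airspeed
A = 1.1            # m^2, assumed frontal area of the race car
F_measured = 350.0 # N, summed reading of the four force sensors

q = 0.5 * rho * v**2          # dynamic pressure, Pa
Cd = F_measured / (q * A)
print(round(Cd, 3))           # compare against the CFD-predicted Cd
```

Agreement between this sensor-derived Cd and the CFD value (within the sensors' error band) is what would count as an experimental validation of the simulation.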

  14. Improvements in flight table dynamic transparency for hardware-in-the-loop facilities

    NASA Astrophysics Data System (ADS)

    DeMore, Louis A.; Mackin, Rob; Swamp, Michael; Rusterholtz, Roger

    2000-07-01

    Flight tables are a 'necessary evil' in Hardware-In-The-Loop (HWIL) simulation. Adding actual or prototypic flight hardware to the loop, in order to increase the realism of the simulation, forces us to add motion simulation to the process. Flight table motion bases bring unwanted dynamics, non-linearities, transport delays, etc., to an already difficult problem, sometimes requiring the simulation engineer to compromise the results. We desire that flight tables be 'dynamically transparent' to the simulation scenario. This paper presents a State Variable Feedback (SVF) control system architecture with feed-forward techniques that improves the flight table's dynamic transparency by significantly reducing the table's low-frequency phase lag. We offer actual results with existing flight tables that demonstrate the improved transparency. These results come from a demonstration conducted on a flight table in the KHILS laboratory at Eglin AFB and from the refurbishment of a flight table for the Boeing Company of St. Charles, Missouri.

  15. Recent progress in simulating galaxy formation from the largest to the smallest scales

    NASA Astrophysics Data System (ADS)

    Faucher-Giguère, Claude-André

    2018-05-01

    Galaxy formation simulations are an essential part of the modern toolkit of astrophysicists and cosmologists alike. Astrophysicists use the simulations to study the emergence of galaxy populations from the Big Bang, as well as the formation of stars and supermassive black holes. For cosmologists, galaxy formation simulations are needed to understand how baryonic processes affect measurements of dark matter and dark energy. Owing to the extreme dynamic range of galaxy formation, advances are driven by novel approaches using simulations with different tradeoffs between volume and resolution. Large-volume but low-resolution simulations provide the best statistics, while higher-resolution simulations of smaller cosmic volumes can be evolved with self-consistent physics and reveal important emergent phenomena. I summarize recent progress in galaxy formation simulations, including major developments in the past five years, and highlight some key areas likely to drive further advances over the next decade.

  16. Big five personality factors and cigarette smoking: a 10-year study among US adults.

    PubMed

    Zvolensky, Michael J; Taha, Farah; Bono, Amanda; Goodwin, Renee D

    2015-04-01

    The present study examined the relation between the big five personality traits and any lifetime cigarette use, progression to daily smoking, and smoking persistence among adults in the United States (US) over a ten-year period. Data were drawn from the Midlife Development in the US (MIDUS) I and II (N = 2101). Logistic regression was used to examine the relationship between continuously measured personality factors and any lifetime cigarette use, smoking progression, and smoking persistence at baseline (1995-1996) and at follow-up (2004-2006). The results revealed that higher levels of openness to experience and neuroticism were each significantly associated with increased risk of any lifetime cigarette use. Neuroticism also was associated with increased risk of progression from ever smoking to daily smoking and persistent daily smoking over a ten-year period. In contrast, conscientiousness was associated with decreased risk of lifetime cigarette use, progression to daily smoking, and smoking persistence. Most, but not all, associations between smoking and personality persisted after adjusting for demographic characteristics, depression, anxiety disorders, and substance use problems. The findings suggest that openness to experience and neuroticism may be involved in any lifetime cigarette use and smoking progression, and that conscientiousness appears to protect against smoking progression and persistence. These data add to a growing literature suggesting that certain personality factors--most consistently neuroticism--are important to assess and perhaps target during intervention programs for smoking behavior. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Big five personality factors and cigarette smoking: A 10-year study among US adults

    PubMed Central

    Zvolensky, Michael J.; Taha, Farah; Bono, Amanda; Goodwin, Renee D.

    2015-01-01

    The present study examined the relation between the big five personality traits and any lifetime cigarette use, progression to daily smoking, and smoking persistence among adults in the United States (US) over a ten-year period. Data were drawn from the Midlife Development in the US (MIDUS) I and II (N=2,101). Logistic regression was used to examine the relationship between continuously measured personality factors and any lifetime cigarette use, smoking progression, and smoking persistence at baseline (1995–1996) and at follow-up (2004–2006). The results revealed that higher levels of openness to experience and neuroticism were each significantly associated with increased risk of any lifetime cigarette use. Neuroticism also was associated with increased risk of progression from ever smoking to daily smoking and persistent daily smoking over a ten-year period. In contrast, conscientiousness was associated with decreased risk of lifetime cigarette use, progression to daily smoking, and smoking persistence. Most, but not all, associations between smoking and personality persisted after adjusting for demographic characteristics, depression, anxiety disorders, and substance use problems. The findings suggest that openness to experience and neuroticism may be involved in any lifetime cigarette use and smoking progression, and that conscientiousness appears to protect against smoking progression and persistence. These data add to a growing literature suggesting that certain personality factors—most consistently neuroticism—are important to assess and perhaps target during intervention programs for smoking behavior. PMID:25799395

  18. Nondestructive Intervention to Multi-Agent Systems through an Intelligent Agent

    PubMed Central

    Han, Jing; Wang, Lin

    2013-01-01

    For a given multi-agent system where the local interaction rule of the existing agents can not be re-designed, one way to intervene the collective behavior of the system is to add one or a few special agents into the group which are still treated as normal agents by the existing ones. We study how to lead a Vicsek-like flocking model to reach synchronization by adding special agents. A popular method is to add some simple leaders (fixed-headings agents). However, we add one intelligent agent, called ‘shill’, which uses online feedback information of the group to decide the shill's moving direction at each step. A novel strategy for the shill to coordinate the group is proposed. It is strictly proved that a shill with this strategy and a limited speed can synchronize every agent in the group. The computer simulations show the effectiveness of this strategy in different scenarios, including different group sizes, shill speed, and with or without noise. Compared to the method of adding some fixed-heading leaders, our method can guarantee synchronization for any initial configuration in the deterministic scenario and improve the synchronization level significantly in low density groups, or model with noise. This suggests the advantage and power of feedback information in intervention of collective behavior. PMID:23658695
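
    The fixed-heading-leader baseline that the shill is compared against can be sketched in a few lines. The model below is an illustrative noise-free Vicsek-style alignment rule with assumed parameters (group size, radius, speed), not the authors' implementation; the interaction radius is chosen large enough that the group stays connected:

```python
import numpy as np

rng = np.random.default_rng(1)

# Noise-free Vicsek-style alignment: each agent adopts the circular mean
# heading of the agents within radius r (including itself).  Agent 0 is
# a fixed-heading "leader", the simple baseline the shill is compared
# against.  The radius is large so the interaction graph stays connected.
n, r, speed, steps = 40, 10.0, 0.05, 400
pos = rng.random((n, 2)) * 5.0
theta = rng.uniform(-np.pi, np.pi, n)
leader_heading = 0.0
theta[0] = leader_heading

for _ in range(steps):
    new_theta = np.empty(n)
    for i in range(n):
        nb = np.linalg.norm(pos - pos[i], axis=1) < r
        new_theta[i] = np.arctan2(np.sin(theta[nb]).sum(),
                                  np.cos(theta[nb]).sum())
    new_theta[0] = leader_heading          # the leader never updates
    theta = new_theta
    pos = pos + speed * np.column_stack([np.cos(theta), np.sin(theta)])

# Order parameter: 1.0 means every heading is identical.
order = np.abs(np.exp(1j * theta).mean())
```

    A shill, by contrast, would choose its heading each step from feedback on the current group state rather than keeping it fixed.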

  19. Development of Macroscale Models of UO2 Fuel Sintering and Densification using Multiscale Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenquist, Ian; Tonks, Michael

    2016-10-01

    Light water reactor fuel pellets are fabricated using sintering to final densities of 95% or greater. During reactor operation, the porosity remaining in the fuel after fabrication decreases further due to irradiation-assisted densification. While empirical models have been developed to describe this densification process, a mechanistic model is needed as part of the ongoing work by the NEAMS program to develop a more predictive fuel performance code. In this work we will develop a phase field model of sintering of UO2 in the MARMOT code, and validate it by comparing to published sintering data. We will then add the capability to capture irradiation effects into the model, and use it to develop a mechanistic model of densification that will go into the BISON code and add another essential piece to the microstructure-based materials models. The final step will be to add the effects of applied fields, to model field-assisted sintering of UO2. The results of the phase field model will be validated by comparing to data from field-assisted sintering. Tasks over three years: (1) develop a sintering model for UO2 in MARMOT; (2) expand the model to account for irradiation effects; (3) develop a mechanistic macroscale model of densification for BISON.
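
    As a generic illustration of the phase field ingredients mentioned above (an order parameter relaxing under a double-well free energy plus a gradient-energy term), the sketch below evolves a 1-D Allen-Cahn equation with explicit time stepping. It is not the MARMOT sintering model; all parameters are illustrative:

```python
import numpy as np

# 1-D Allen-Cahn evolution: an order parameter phi relaxes toward the
# stable phases phi = 0 and phi = 1 under a double-well free energy
# f = phi^2 (1 - phi)^2 plus a gradient-energy term (coefficient kappa).
nx, dx, dt, kappa, mob, steps = 100, 1.0, 0.1, 2.0, 1.0, 200
x = np.arange(nx) * dx
phi = np.where(x < nx * dx / 2, 1.0, 0.0)   # sharp grain/pore interface

def dfdphi(p):
    # Derivative of the double-well free energy.
    return 2.0 * p * (1.0 - p) * (1.0 - 2.0 * p)

for _ in range(steps):
    lap = (np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1)) / dx**2
    phi = phi + dt * mob * (kappa * lap - dfdphi(phi))
```

    The sharp initial interface relaxes into a smooth diffuse profile, the basic behaviour a sintering phase field model builds on.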

  20. Use of Electronic Health Record Simulation to Understand the Accuracy of Intern Progress Notes

    PubMed Central

    March, Christopher A.; Scholl, Gretchen; Dversdal, Renee K.; Richards, Matthew; Wilson, Leah M.; Mohan, Vishnu; Gold, Jeffrey A.

    2016-01-01

    Background With the widespread adoption of electronic health records (EHRs), there is a growing awareness of problems in EHR training for new users and subsequent problems with the quality of information present in EHR-generated progress notes. By standardizing the case, simulation allows for the discovery of EHR patterns of use as well as a modality to aid in EHR training. Objective To develop a high-fidelity EHR training exercise for internal medicine interns to understand patterns of EHR utilization in the generation of daily progress notes. Methods Three months after beginning their internship, 32 interns participated in an EHR simulation designed to assess patterns in note writing and generation. Each intern was given a simulated chart and instructed to create a daily progress note. Notes were graded for use of copy-paste, macros, and accuracy of presented data. Results A total of 31 out of 32 interns (97%) completed the exercise. There was wide variance in use of macros to populate data, with multiple macro types used for the same data category. Three-quarters of notes contained either copy-paste elements or the elimination of active medical problems from the prior days' notes. This was associated with a significant number of quality issues, including failure to recognize a lack of deep vein thrombosis prophylaxis, medications stopped on admission, and issues in prior discharge summary. Conclusions Interns displayed wide variation in the process of creating progress notes. Additional studies are being conducted to determine the impact EHR-based simulation has on standardization of note content. PMID:27168894

  1. Use of Electronic Health Record Simulation to Understand the Accuracy of Intern Progress Notes.

    PubMed

    March, Christopher A; Scholl, Gretchen; Dversdal, Renee K; Richards, Matthew; Wilson, Leah M; Mohan, Vishnu; Gold, Jeffrey A

    2016-05-01

    Background With the widespread adoption of electronic health records (EHRs), there is a growing awareness of problems in EHR training for new users and subsequent problems with the quality of information present in EHR-generated progress notes. By standardizing the case, simulation allows for the discovery of EHR patterns of use as well as a modality to aid in EHR training. Objective To develop a high-fidelity EHR training exercise for internal medicine interns to understand patterns of EHR utilization in the generation of daily progress notes. Methods Three months after beginning their internship, 32 interns participated in an EHR simulation designed to assess patterns in note writing and generation. Each intern was given a simulated chart and instructed to create a daily progress note. Notes were graded for use of copy-paste, macros, and accuracy of presented data. Results A total of 31 out of 32 interns (97%) completed the exercise. There was wide variance in use of macros to populate data, with multiple macro types used for the same data category. Three-quarters of notes contained either copy-paste elements or the elimination of active medical problems from the prior days' notes. This was associated with a significant number of quality issues, including failure to recognize a lack of deep vein thrombosis prophylaxis, medications stopped on admission, and issues in prior discharge summary. Conclusions Interns displayed wide variation in the process of creating progress notes. Additional studies are being conducted to determine the impact EHR-based simulation has on standardization of note content.

  2. Using Automated Scores of Student Essays to Support Teacher Guidance in Classroom Inquiry

    NASA Astrophysics Data System (ADS)

    Gerard, Libby F.; Linn, Marcia C.

    2016-02-01

    Computer scoring of student written essays about an inquiry topic can be used to diagnose student progress both to alert teachers to struggling students and to generate automated guidance. We identify promising ways for teachers to add value to automated guidance to improve student learning. Three teachers from two schools and their 386 students participated. We draw on evidence from student progress, observations of how teachers interact with students, and reactions of teachers. The findings suggest that alerts for teachers prompted rich teacher-student conversations about energy in photosynthesis. In one school, the combination of the automated guidance plus teacher guidance was more effective for student science learning than two rounds of personalized, automated guidance. In the other school, both approaches resulted in equal learning gains. These findings suggest optimal combinations of automated guidance and teacher guidance to support students to revise explanations during inquiry and build integrated understanding of science.

  3. Energy, Transportation, Air Quality, Climate Change, Health Nexus: Sustainable Energy is Good for Our Health

    PubMed Central

    Erickson, Larry E.; Jennings, Merrisa

    2017-01-01

    The Paris Agreement on Climate Change has the potential to improve air quality and human health by encouraging the electrification of transportation and a transition from coal to sustainable energy. There will be human health benefits from reducing combustion emissions in all parts of the world. Solar-powered charging infrastructure for electric vehicles adds renewable generating capacity, shaded parking, and the charging capability needed to reduce range anxiety. The costs of wind power, solar panels, and batteries are falling because of technological progress, the magnitude of commercial activity, production experience, and competition associated with new trillion-dollar markets. These energy and transportation transitions can have a very positive impact on health. The energy, transportation, air quality, climate change, and health nexus may benefit from additional progress in developing solar-powered charging infrastructure. PMID:29922702

  4. The principles and practices of educational neuroscience: Comment on Bowers (2016).

    PubMed

    Howard-Jones, Paul A; Varma, Sashank; Ansari, Daniel; Butterworth, Brian; De Smedt, Bert; Goswami, Usha; Laurillard, Diana; Thomas, Michael S C

    2016-10-01

    In his recent critique of Educational Neuroscience, Bowers argues that neuroscience has no role to play in informing education, which he equates with classroom teaching. Neuroscience, he suggests, adds nothing to what we can learn from psychology. In this commentary, we argue that Bowers' assertions misrepresent the nature and aims of the work in this new field. We suggest that, by contrast, psychological and neural levels of explanation complement rather than compete with each other. Bowers' analysis also fails to include a role for educational expertise, a guiding principle of our new field. On this basis, we conclude that his critique is potentially misleading. We set out the well-documented goals of research in Educational Neuroscience, and show how, in collaboration with educators, significant progress has already been achieved, with the prospect of even greater progress in the future. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  5. Traumatic brain injury history and progression from mild cognitive impairment to Alzheimer disease.

    PubMed

    LoBue, Christian; Woon, Fu L; Rossetti, Heidi C; Hynan, Linda S; Hart, John; Cullum, C Munro

    2018-05-01

    To examine whether history of traumatic brain injury (TBI) is associated with more rapid progression from mild cognitive impairment (MCI) to Alzheimer's disease (AD). Data from 2,719 subjects with MCI were obtained from the National Alzheimer's Coordinating Center. TBI was categorized based on presence (TBI+) or absence (TBI-) of reported TBI with loss of consciousness (LOC) without chronic deficit occurring >1 year prior to diagnosis of MCI. Survival analyses were used to determine if a history of TBI predicted progression from MCI to AD up to 8 years. Random regression models were used to examine whether TBI history also predicted rate of decline on the Clinical Dementia Rating scale Sum of Boxes score (CDR-SB) among subjects who progress to AD. Across 8 years, TBI history was not significantly associated with progression from MCI to a diagnosis of AD in unadjusted (HR = 0.80; 95% CI [0.63, 1.01]; p = .06) and adjusted (p = .15) models. Similarly, a history of TBI was a nonsignificant predictor for rate of decline on CDR-SB among subjects who progressed to AD (b = 0.15, p = .38). MCI was, however, diagnosed a mean of 2.6 years earlier (p < .001) in TBI+ subjects compared with the TBI- group. A history of TBI with LOC was not associated with progression from MCI to AD, but was linked to an earlier age of MCI diagnosis. These findings add to a growing literature suggesting that TBI might reduce the threshold for onset of MCI and certain neurodegenerative conditions, but appears unrelated to progression from MCI to AD. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
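
    Survival analyses of time-to-progression typically rest on estimators such as Kaplan-Meier. A minimal sketch on toy data (not the National Alzheimer's Coordinating Center data) is:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.
    times: follow-up times; events: 1 = progressed, 0 = censored."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    event_times = np.unique(times[events == 1])
    surv, s = [], 1.0
    for t in event_times:
        at_risk = (times >= t).sum()          # still under observation
        d = ((times == t) & (events == 1)).sum()  # progressions at t
        s *= 1.0 - d / at_risk
        surv.append(s)
    return event_times, np.array(surv)

# Toy data: progression at years 1, 2 and 4; one subject censored at year 3.
t, s = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

    Comparing such curves between TBI+ and TBI- groups, with covariate adjustment via a Cox model, is the standard route to the hazard ratios reported above.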

  6. Signal detection in global mean temperatures after "Paris": an uncertainty and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Visser, Hans; Dangendorf, Sönke; van Vuuren, Detlef P.; Bregman, Bram; Petersen, Arthur C.

    2018-02-01

    In December 2015, 195 countries agreed in Paris to hold the increase in global mean surface temperature (GMST) well below 2.0 °C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5 °C. Since large financial flows will be needed to keep GMSTs below these targets, it is important to know how GMST has progressed since pre-industrial times. However, the Paris Agreement is not conclusive as regards methods to calculate it. Should trend progression be deduced from GCM simulations or from instrumental records by (statistical) trend methods? Which simulations or GMST datasets should be chosen, and which trend models? What is pre-industrial and, finally, are the Paris targets formulated for total warming, originating from both natural and anthropogenic forcing, or do they refer to anthropogenic warming only? To find answers to these questions we performed an uncertainty and sensitivity analysis where datasets and model choices have been varied. For all cases we evaluated trend progression along with uncertainty information. To do so, we analysed four trend approaches and applied these to the five leading observational GMST products. We find GMST progression to be largely independent of various trend model approaches. However, GMST progression is significantly influenced by the choice of GMST datasets. Uncertainties due to natural variability are largest in size. As a parallel path, we calculated GMST progression from an ensemble of 42 GCM simulations. Mean progression derived from GCM-based GMSTs appears to lie in the range of trend-dataset combinations. A difference between both approaches appears to be the width of uncertainty bands: GCM simulations show a much wider spread. Finally, we discuss various choices for pre-industrial baselines and the role of warming definitions. Based on these findings we propose an estimate for signal progression in GMSTs since pre-industrial.
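
    The simplest member of the family of trend approaches compared above is an ordinary-least-squares line; the sketch below fits one, with its standard error, to synthetic annual anomalies generated with a known trend (illustrative numbers, not any real GMST dataset):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic annual anomalies with a known trend of 0.02 degC/yr plus
# white noise (illustrative only, not a real GMST product).
years = np.arange(1900, 2018)
anom = 0.02 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

# OLS trend and its standard error.
t = years - years.mean()
slope = (t * anom).sum() / (t * t).sum()
resid = anom - anom.mean() - slope * t
se = np.sqrt((resid ** 2).sum() / (t.size - 2) / (t * t).sum())
```

    Real GMST series violate the white-noise assumption behind this standard error, which is one reason the paper compares several trend models and uncertainty treatments.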

  7. A Randomized Trial of Soft Multifocal Contact Lenses for Myopia Control: Baseline Data and Methods.

    PubMed

    Walline, Jeffrey J; Gaume Giannoni, Amber; Sinnott, Loraine T; Chandler, Moriah A; Huang, Juan; Mutti, Donald O; Jones-Jordan, Lisa A; Berntsen, David A

    2017-09-01

    The Bifocal Lenses In Nearsighted Kids (BLINK) study is the first soft multifocal contact lens myopia control study to compare add powers and measure peripheral refractive error in the vertical meridian, so it will provide important information about the potential mechanism of myopia control. The BLINK study is a National Eye Institute-sponsored, double-masked, randomized clinical trial to investigate the effects of soft multifocal contact lenses on myopia progression. This article describes the subjects' baseline characteristics and study methods. Subjects were 7 to 11 years old, had -0.75 to -5.00 spherical component and less than 1.00 diopter (D) astigmatism, and had 20/25 or better logMAR distance visual acuity with manifest refraction in each eye and with +2.50-D add soft bifocal contact lenses on both eyes. Children were randomly assigned to wear Biofinity single-vision, Biofinity Multifocal "D" with a +1.50-D add power, or Biofinity Multifocal "D" with a +2.50-D add power contact lenses. We examined 443 subjects at the baseline visits, and 294 (66.4%) subjects were enrolled. Of the enrolled subjects, 177 (60.2%) were female, and 200 (68%) were white. The mean (± SD) age was 10.3 ± 1.2 years, and 117 (39.8%) of the eligible subjects were younger than 10 years. The mean spherical equivalent refractive error, measured by cycloplegic autorefraction was -2.39 ± 1.00 D. The best-corrected binocular logMAR visual acuity with glasses was +0.01 ± 0.06 (20/21) at distance and -0.03 ± 0.08 (20/18) at near. The BLINK study subjects are similar to patients who would routinely be eligible for myopia control in practice, so the results will provide clinical information about soft bifocal contact lens myopia control as well as information about the mechanism of the treatment effect, if one occurs.

  8. Rapid response to methylphenidate as an add-on therapy to mirtazapine in the treatment of major depressive disorder in terminally ill cancer patients: a four-week, randomized, double-blinded, placebo-controlled study.

    PubMed

    Ng, Chong Guan; Boks, Marco P M; Roes, Kit C B; Zainal, Nor Zuraida; Sulaiman, Ahmad Hatim; Tan, Seng Beng; de Wit, Niek J

    2014-04-01

    This is a 4-week, randomized, double-blind, placebo-controlled study to examine the effects of methylphenidate as add-on therapy to mirtazapine, compared to placebo, for the treatment of depression in terminally ill cancer patients. It involved 88 terminally ill cancer patients from University of Malaya Medical Centre, Kuala Lumpur, Malaysia. They were randomized and treated with either methylphenidate or placebo as add-on to mirtazapine. The change in Montgomery-Åsberg Depression Rating Scale (MADRS) score from baseline to day 3 was analyzed by linear regression. Changes in MADRS and Clinical Global Impression-Severity Scale (CGI-S) scores over 28 days were analyzed using mixed model repeated measures (MMRM). MADRS response rates, defined as a 50% or greater reduction from the baseline score, were analyzed as a secondary outcome. A significantly larger reduction in MADRS score in the methylphenidate group was observed from day 3 (B=4.14; 95% CI=1.83-6.45). The response rate in the methylphenidate-treated group was superior from day 14. Improvement in CGI-S was greater in the methylphenidate-treated group from day 3 until day 28. The drop-out rates were 52.3% in the methylphenidate group and 59.1% in the placebo group (relative risk=0.86, 95% CI=0.54-1.37), due to cancer progression. Nervous system adverse events were more common in methylphenidate-treated subjects (20.5% vs 9.1%, p=0.13). In conclusion, methylphenidate as add-on therapy to mirtazapine demonstrated an earlier antidepressant response in terminally ill cancer patients, although at an increased risk of nervous system side effects. Copyright © 2014 Elsevier B.V. and ECNP. All rights reserved.
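
    The relative risk reported for drop-out can be computed with the standard log-method confidence interval. The sketch below uses illustrative counts consistent with the reported percentages (23/44 vs 26/44); it is not the trial's analysis code:

```python
import math

# Relative risk with a 95% confidence interval (log method).  The counts
# are illustrative, chosen to match the reported drop-out percentages
# (52.3% and 59.1% of 44 per arm); this is not the trial's analysis.
def relative_risk(a, n1, b, n2):
    """Risk ratio of a/n1 versus b/n2 with a 95% CI."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of log(RR)
    return rr, rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)

rr, lo, hi = relative_risk(23, 44, 26, 44)
```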

  9. Bridging the Gap Between Science and Clinical Efficacy: Physiology, Imaging, and Modeling of Aerosols in the Lung.

    PubMed

    Darquenne, Chantal; Fleming, John S; Katz, Ira; Martin, Andrew R; Schroeter, Jeffry; Usmani, Omar S; Venegas, Jose; Schmid, Otmar

    2016-04-01

    Development of a new drug for the treatment of lung disease is a complex and time consuming process involving numerous disciplines of basic and applied sciences. During the 2015 Congress of the International Society for Aerosols in Medicine, a group of experts including aerosol scientists, physiologists, modelers, imagers, and clinicians participated in a workshop aiming at bridging the gap between basic research and clinical efficacy of inhaled drugs. This publication summarizes the current consensus on the topic. It begins with a short description of basic concepts of aerosol transport and a discussion on targeting strategies of inhaled aerosols to the lungs. It is followed by a description of both computational and biological lung models, and the use of imaging techniques to determine aerosol deposition distribution (ADD) in the lung. Finally, the importance of ADD to clinical efficacy is discussed. Several gaps were identified between basic science and clinical efficacy. One gap between scientific research aimed at predicting, controlling, and measuring ADD and the clinical use of inhaled aerosols is the considerable challenge of obtaining, in a single study, accurate information describing the optimal lung regions to be targeted, the effectiveness of targeting determined from ADD, and some measure of the drug's effectiveness. Other identified gaps were the language and methodology barriers that exist among disciplines, along with the significant regulatory hurdles that need to be overcome for novel drugs and/or therapies to reach the marketplace and benefit the patient. Despite these gaps, much progress has been made in recent years to improve clinical efficacy of inhaled drugs. Also, the recent efforts by many funding agencies and industry to support multidisciplinary networks including basic science researchers, R&D scientists, and clinicians will go a long way to further reduce the gap between science and clinical efficacy.

  10. Bridging the Gap Between Science and Clinical Efficacy: Physiology, Imaging, and Modeling of Aerosols in the Lung

    PubMed Central

    Fleming, John S.; Katz, Ira; Martin, Andrew R.; Schroeter, Jeffry; Usmani, Omar S.; Venegas, Jose

    2016-01-01

    Abstract Development of a new drug for the treatment of lung disease is a complex and time consuming process involving numerous disciplines of basic and applied sciences. During the 2015 Congress of the International Society for Aerosols in Medicine, a group of experts including aerosol scientists, physiologists, modelers, imagers, and clinicians participated in a workshop aiming at bridging the gap between basic research and clinical efficacy of inhaled drugs. This publication summarizes the current consensus on the topic. It begins with a short description of basic concepts of aerosol transport and a discussion on targeting strategies of inhaled aerosols to the lungs. It is followed by a description of both computational and biological lung models, and the use of imaging techniques to determine aerosol deposition distribution (ADD) in the lung. Finally, the importance of ADD to clinical efficacy is discussed. Several gaps were identified between basic science and clinical efficacy. One gap between scientific research aimed at predicting, controlling, and measuring ADD and the clinical use of inhaled aerosols is the considerable challenge of obtaining, in a single study, accurate information describing the optimal lung regions to be targeted, the effectiveness of targeting determined from ADD, and some measure of the drug's effectiveness. Other identified gaps were the language and methodology barriers that exist among disciplines, along with the significant regulatory hurdles that need to be overcome for novel drugs and/or therapies to reach the marketplace and benefit the patient. Despite these gaps, much progress has been made in recent years to improve clinical efficacy of inhaled drugs. Also, the recent efforts by many funding agencies and industry to support multidisciplinary networks including basic science researchers, R&D scientists, and clinicians will go a long way to further reduce the gap between science and clinical efficacy. PMID:26829187

  11. Damage Progression in Bolted Composites

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C.; Gotsis, Pascal K.

    1998-01-01

    Structural durability, damage tolerance, and progressive fracture characteristics of bolted graphite/epoxy composite laminates are evaluated via computational simulation. Constituent material properties and stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for bolted composites. Single and double bolted composite specimens with various widths and bolt spacings are evaluated. The effect of bolt spacing is investigated with regard to the structural durability of a bolted joint. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Results show the damage progression sequence and structural fracture resistance during different degradation stages. A procedure is outlined for the use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of experimental results with insight for design decisions.

  12. Damage Progression in Bolted Composites

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos; Gotsis, Pascal K.

    1998-01-01

    Structural durability, damage tolerance, and progressive fracture characteristics of bolted graphite/epoxy composite laminates are evaluated via computational simulation. Constituent material properties and stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for bolted composites. Single and double bolted composite specimens with various widths and bolt spacings are evaluated. The effect of bolt spacing is investigated with regard to the structural durability of a bolted joint. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Results show the damage progression sequence and structural fracture resistance during different degradation stages. A procedure is outlined for the use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of experimental results with insight for design decisions.

  13. Data-Driven Anomaly Detection Performance for the Ares I-X Ground Diagnostic Prototype

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Schwabacher, Mark A.; Matthews, Bryan L.

    2010-01-01

    In this paper, we will assess the performance of a data-driven anomaly detection algorithm, the Inductive Monitoring System (IMS), which can be used to detect simulated Thrust Vector Control (TVC) system failures. However, the ability of IMS to detect these failures in a true operational setting may be related to the realistic nature of how they are simulated. As such, we will investigate both a low fidelity and high fidelity approach to simulating such failures, with the latter based upon the underlying physics. Furthermore, the ability of IMS to detect anomalies that were previously unknown and not previously simulated will be studied in earnest, as well as apparent deficiencies or misapplications that result from using the data-driven paradigm. Our conclusions indicate that robust detection performance of simulated failures using IMS is not appreciably affected by the use of a high fidelity simulation. However, we have found that the inclusion of a data-driven algorithm such as IMS into a suite of deployable health management technologies does add significant value.
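
    The core idea behind a data-driven monitor like IMS can be sketched compactly: learn what nominal telemetry looks like, then score new vectors by their distance to it. The real IMS builds hyper-box clusters; the minimal variant below just uses nearest-neighbour distance on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Learn what "nominal" telemetry vectors look like, then flag vectors
# whose distance to the nominal set exceeds a threshold.  The real IMS
# builds hyper-box clusters from nominal data; this minimal variant just
# uses nearest-neighbour distance on synthetic 3-sensor vectors.
nominal = rng.normal(0.0, 1.0, (500, 3))
threshold = 1.0   # tuning parameter, set from held-out nominal data

def anomaly_score(vec):
    return np.linalg.norm(nominal - vec, axis=1).min()

ok_point = np.zeros(3)                  # well inside the nominal cloud
bad_point = np.array([8.0, 8.0, 8.0])   # a gross simulated failure
```

    The paper's observation that such detectors hinge on the realism of the training and failure data applies directly: the detector only knows "far from nominal", not physics.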

  14. Phase imaging using shifted wavefront sensor images.

    PubMed

    Zhang, Zhengyun; Chen, Zhi; Rehman, Shakil; Barbastathis, George

    2014-11-01

    We propose a new approach to the complete retrieval of a coherent field (amplitude and phase) using the same hardware configuration as a Shack-Hartmann sensor but with two modifications: first, we add a transversally shifted measurement to resolve ambiguities in the measured phase; and second, we employ factored form descent (FFD), an inverse algorithm for coherence retrieval, with a hard rank constraint. We verified the proposed approach using both numerical simulations and experiments.
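
    A Shack-Hartmann-style sensor measures local wavefront slopes, and in 1-D the phase follows from the slopes by integration, up to an unknown constant (piston) term. The toy sketch below illustrates only this slope-to-phase step, not the factored form descent algorithm from the paper:

```python
import numpy as np

# A Shack-Hartmann-style sensor samples local wavefront slopes; in 1-D
# the phase is recovered by integrating the slopes, up to an unknown
# constant (piston) term.  Toy sketch only.
n, dx = 200, 0.01
x = np.arange(n) * dx
phase = np.sin(2 * np.pi * x)        # "true" wavefront phase
slopes = np.gradient(phase, dx)      # what the sensor would measure

recovered = np.cumsum(slopes) * dx   # crude rectangle-rule integration
recovered -= recovered[0] - phase[0] # remove the piston ambiguity
```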

  15. A Layered Solution for Supercomputing Storage

    ScienceCinema

    Grider, Gary

    2018-06-13

    To solve the supercomputing challenge of memory keeping up with processing speed, a team at Los Alamos National Laboratory developed two innovative memory management and storage technologies. Burst buffers peel off data onto flash memory to support the checkpoint/restart paradigm of large simulations. MarFS adds a thin software layer enabling a new tier for campaign storage—based on inexpensive, failure-prone disk drives—between disk drives and tape archives.

  16. SimWorx: An ADA Distributed Simulation Application Framework Supporting HLA and DIS

    DTIC Science & Technology

    1996-12-01

    The authors emphasize that most real systems have elements of several architectural styles; these are called heterogeneous architectures. In order for frameworks to be used, understood, and maintained, Adair emphasizes they must be clearly documented.

  17. WestProPlus: a stochastic spreadsheet program for the management of all-aged Douglas-fir–hemlock forests in the Pacific Northwest.

    Treesearch

    Jingjing Liang; Joseph Buongiorno; Robert A. Monserud

    2006-01-01

    WestProPlus is an add-in program developed to work with Microsoft Excel to simulate the growth and management of all-aged Douglas-fir–western hemlock (Pseudotsuga menziesii (Mirb.) Franco–Tsuga heterophylla (Raf.) Sarg.) stands in Oregon and Washington. Its built-in growth model was calibrated from 2,706 permanent plots in the...

  18. A Two-Player Game of Life

    NASA Astrophysics Data System (ADS)

    Levene, Mark; Roussos, George

    We present a new extension of Conway's game of life for two players, which we call "p2life". P2life allows one of two types of token, black or white, to inhabit a cell, and adds competitive elements into the birth and survival rules of the original game. We solve the mean-field equation for p2life and determine, using simulation, that the asymptotic density of p2life approaches 0.0362.
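
    For reference, the single-player update rule that p2life extends (birth on exactly three live neighbours, survival on two or three) can be written as:

```python
import numpy as np

def life_step(grid):
    """One step of Conway's game of life on a periodic grid."""
    # Count the eight neighbours of every cell.
    nbrs = sum(np.roll(np.roll(grid, i, axis=0), j, axis=1)
               for i in (-1, 0, 1) for j in (-1, 0, 1)
               if (i, j) != (0, 0))
    # Birth on exactly three neighbours; survival on two or three.
    return ((nbrs == 3) | ((grid == 1) & (nbrs == 2))).astype(np.uint8)

# A blinker oscillates with period two.
g = np.zeros((5, 5), dtype=np.uint8)
g[2, 1:4] = 1                        # horizontal bar
g1 = life_step(g)                    # becomes a vertical bar
```

    P2life keeps this skeleton but makes birth and survival depend on the counts of black and white neighbours separately.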

  19. Environmental triggers in IBD: a review of progress and evidence.

    PubMed

    Ananthakrishnan, Ashwin N; Bernstein, Charles N; Iliopoulos, Dimitrios; Macpherson, Andrew; Neurath, Markus F; Ali, Raja A Raja; Vavricka, Stephan R; Fiocchi, Claudio

    2018-01-01

    A number of environmental factors have been associated with the development of IBD. Alteration of the gut microbiota, or dysbiosis, is closely linked to initiation or progression of IBD, but whether dysbiosis is a primary or secondary event is unclear. Nevertheless, early-life events such as birth, breastfeeding and exposure to antibiotics, as well as later childhood events, are considered potential risk factors for IBD. Air pollution, a consequence of the progressive contamination of the environment by countless compounds, is another factor associated with IBD, as particulate matter or other components can alter the host's mucosal defences and trigger immune responses. Hypoxia associated with high altitude is also a factor under investigation as a potential new trigger of IBD flares. A key issue is how to translate environmental factors into mechanisms of IBD, and systems biology is increasingly recognized as a strategic tool to unravel the molecular alterations leading to IBD. Environmental factors add a substantial level of complexity to the understanding of IBD pathogenesis but also promote the fundamental notion that complex diseases such as IBD require complex therapies that go well beyond the current single-agent treatment approach. This Review describes the current conceptualization, evidence, progress and direction surrounding the association of environmental factors with IBD.

  20. Fifty years of IMOG (International Meetings on Organic Geochemistry)

    USGS Publications Warehouse

    Kvenvolden, Keith A.

    2012-01-01

    IMOG2011 is the 25th of a series of international meetings on organic geochemistry that began in 1962. Thus, this 25th meeting marks the 50th anniversary year of IMOG, which has (a) had a rich history with meetings taking place in 11 different countries, (b) published Proceedings, titled “Advances in Organic Geochemistry,” from each meeting that now number 24 volumes totaling almost 18,000 pages, and (c) documented the content and development of the science of organic geochemistry. IMOG2011 adds a new milestone to the progress of organic geochemistry through time.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saccomanno, G.

This work, supported by the United States Department of Energy, continues to add data on the health effects of cigarette smoking and radon exposure. Since the beginning of this contract, 473 sputum samples have been collected from 286 uranium workers who are routinely screened in an effort to identify cell changes that could signal possible progression to lung cancer; seven new lung cancer cases have been identified during this period. At this time, there are 426 lung cancer cases in the uranium miner tumor registry with diagnostic slides from surgery and/or autopsy; an additional 40 cases have been diagnosed with sputum cytology only.

  2. Systemic Problems: A perspective on stem cell aging and rejuvenation.

    PubMed

    Conboy, Irina M; Conboy, Michael J; Rebo, Justin

    2015-10-01

    This review provides balanced analysis of the advances in systemic regulation of young and old tissue stem cells and suggests strategies for accelerating development of therapies to broadly combat age-related tissue degenerative pathologies. Many highlighted recent reports on systemic tissue rejuvenation combine parabiosis with a "silver bullet" putatively responsible for the positive effects. Attempts to unify these papers reflect the excitement about this experimental approach and add value in reproducing previous work. At the same time, defined molecular approaches, which are "beyond parabiosis" for the rejuvenation of multiple old organs represent progress toward attenuating or even reversing human tissue aging.

  3. Restoration of Function With Acupuncture Following Severe Traumatic Brain Injury: A Case Report.

    PubMed

    Wolf, Jacob; Sparks, Linda; Deng, Yong; Langland, Jeffrey

    2015-11-01

    This case report illustrates the improvement of an acupuncture-treated patient who incurred a severe traumatic brain injury (TBI) from a snowboarding accident. Over 4 years, the patient progressed from initially not being able to walk, having difficulty with speech, and suffering from poor eyesight to where he has now regained significant motor function, speech, and vision and has returned to snowboarding. A core acupuncture protocol plus specific points added to address the patient's ongoing concerns was used. This case adds to the medical literature by demonstrating the potential role of acupuncture in TBI treatment.

  4. Longhorn Business Jets

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Developed in NASA's Aircraft Energy Efficiency program and manufactured by Gates Learjet Corporation, the winglet is an aerodynamic innovation designed to reduce fuel consumption and improve airplane performance. Winglets are lifting surfaces designed to operate in the "vortex" or air whirlpool which occurs at an airplane's wingtip. Complex flow of air around wingtip creates drag which retards the plane's progress. Winglet reduces strength of vortex and thereby reduces strength of drag. Additionally, winglet generates its own lift, producing forward thrust in the manner of a boat's sail. Combination of reduced drag and additional thrust adds up to significant improvement in fuel efficiency.

  5. Nanocomposite Hydrogels: 3D Polymer-Nanoparticle Synergies for On-Demand Drug Delivery.

    PubMed

    Merino, Sonia; Martín, Cristina; Kostarelos, Kostas; Prato, Maurizio; Vázquez, Ester

    2015-05-26

    Considerable progress in the synthesis and technology of hydrogels makes these materials attractive structures for designing controlled-release drug delivery systems. In particular, this review highlights the latest advances in nanocomposite hydrogels as drug delivery vehicles. The inclusion/incorporation of nanoparticles in three-dimensional polymeric structures is an innovative means for obtaining multicomponent systems with diverse functionality within a hybrid hydrogel network. Nanoparticle-hydrogel combinations add synergistic benefits to the new 3D structures. Nanogels as carriers for cancer therapy and injectable gels with improved self-healing properties have also been described as new nanocomposite systems.

  6. Tumorigenesis: it takes a village.

    PubMed

    Tabassum, Doris P; Polyak, Kornelia

    2015-08-01

    Although it is widely accepted that most cancers exhibit some degree of intratumour heterogeneity, we are far from understanding the dynamics that operate among subpopulations within tumours. There is growing evidence that cancer cells behave as communities, and increasing attention is now being directed towards the cooperative behaviour of subclones that can influence disease progression. As expected, these interactions can add a greater layer of complexity to therapeutic interventions in heterogeneous tumours, often leading to a poor prognosis. In this Review, we highlight studies that demonstrate such interactions in cancer and postulate ways to overcome them with better-designed therapeutic strategies.

  7. Advanced Platform for Development and Evaluation of Grid Interconnection Systems Using Hardware-in-the-Loop (Poster)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, B.; Shirazi, M.; Coddington, M.

    2013-02-01

This poster describes a Grid Interconnection System Evaluator (GISE) that leverages hardware-in-the-loop (HIL) simulation techniques to rapidly evaluate the grid interconnection standard conformance of an interconnection system (ICS) according to the procedures in IEEE Std 1547.1. The architecture and test sequencing of this evaluation tool, along with a set of representative ICS test results from three different photovoltaic (PV) inverters, are presented. The GISE adds to the National Renewable Energy Laboratory's (NREL) evaluation platform, which now allows for rapid development of ICS control algorithms using controller HIL (CHIL) techniques, testing of the dc input characteristics of PV-based ICSs through the use of a PV simulator capable of simulating real-world dynamics using power HIL (PHIL), and evaluation of ICS grid interconnection conformance.

  8. Simulating Society Transitions: Standstill, Collapse and Growth in an Evolving Network Model

    PubMed Central

    Xu, Guanghua; Yang, Junjie; Li, Guoqing

    2013-01-01

We developed a model society composed of various occupations that interact with each other and the environment, with the capability of simulating three widely recognized societal transition patterns: standstill, collapse and growth, which are important components of societal evolutionary dynamics. Each occupation is equipped with a number of inhabitants that may randomly flow to other occupations, during which process new occupations may be created and then interact with existing ones. The total population of the society is associated with productivity, which is determined by the structure and volume of the society. We ran the model under scenarios such as parasitism, environment fluctuation and invasion, which correspond to different driving forces of societal transition, and obtained reasonable simulation results. This work adds to our understanding of societal evolutionary dynamics as well as provides theoretical clues to sustainable development. PMID:24086530
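The inhabitant-flow mechanism described above can be sketched as follows. This is an illustrative toy, not the authors' model: the migration fraction, creation probability, and seeding rule are all assumptions. Note that every move conserves the total population.

```python
import random

def step(pops, migrate_frac=0.1, create_p=0.02, rng=random.Random(0)):
    """One update of a toy occupation-flow model (parameters are
    illustrative assumptions, not taken from the paper)."""
    pops = list(pops)
    # inhabitants randomly flow from each occupation to another
    for i in range(len(pops)):
        leaving = int(pops[i] * migrate_frac)
        pops[i] -= leaving
        pops[rng.randrange(len(pops))] += leaving
    # occasionally a new occupation is created, seeded by migrants
    if rng.random() < create_p:
        donor = rng.randrange(len(pops))
        seed, pops[donor] = pops[donor] // 10, pops[donor] - pops[donor] // 10
        pops.append(seed)
    return pops
```

Because inhabitants only move between occupations, the total population stays fixed across steps unless an explicit productivity rule adds or removes people.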

  9. Microscopic modeling of multi-lane highway traffic flow

    NASA Astrophysics Data System (ADS)

    Hodas, Nathan O.; Jagota, Anand

    2003-12-01

    We discuss a microscopic model for the study of multi-lane highway traffic flow dynamics. Each car experiences a force resulting from a combination of the desire of the driver to attain a certain velocity, aerodynamic drag, and change of the force due to car-car interactions. The model also includes multi-lane simulation capability and the ability to add and remove obstructions. We implement the model via a Java applet, which is used to simulate traffic jam formation, the effect of bottlenecks on traffic flow, and the existence of light, medium, and heavy traffic flow. The simulations also provide insight into how the properties of individual cars result in macroscopic behavior. Because the investigation of emergent characteristics is so common in physics, the study of traffic in this manner sheds new light on how the micro-to-macro transition works in general.
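The force law described above — relaxation toward a desired velocity, aerodynamic drag, and a car-car interaction — can be sketched as a single-lane Euler integration. All parameter values and the specific interaction form below are illustrative assumptions, not taken from the applet:

```python
def acceleration(v, v_desired, gap, v_front,
                 tau=2.0, c_drag=0.001, k=5.0):
    """Toy per-car force law (parameter values are assumptions)."""
    relax = (v_desired - v) / tau                 # driver's desired speed
    drag = -c_drag * v * v                        # aerodynamic drag
    interact = -k * max(0.0, v - v_front) / max(gap, 1.0)  # car-car term
    return relax + drag + interact

def step(xs, vs, v_desired, dt=0.1):
    """Euler update for one lane; cars are ordered by position and the
    lead car sees open road ahead."""
    n = len(xs)
    accs = []
    for i in range(n):
        if i + 1 < n:                             # car i+1 is directly ahead
            a = acceleration(vs[i], v_desired, xs[i + 1] - xs[i], vs[i + 1])
        else:
            a = acceleration(vs[i], v_desired, float('inf'), v_desired)
        accs.append(a)
    xs = [x + v * dt for x, v in zip(xs, vs)]
    vs = [max(0.0, v + a * dt) for v, a in zip(vs, accs)]
    return xs, vs
```

Iterating `step` over many cars with randomized desired speeds is enough to reproduce the qualitative light/medium/heavy flow regimes the abstract mentions.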

  10. Research on a Queue Scheduling Algorithm in Wireless Communications Network

    NASA Astrophysics Data System (ADS)

    Yang, Wenchuan; Hu, Yuanmei; Zhou, Qiancai

This paper proposes QS-CT, a queue scheduling mechanism based on multiple access in ad hoc networks, which adds queue scheduling to the RTS-CTS-DATA multiple access protocol. By endowing different queues with different scheduling mechanisms, it makes network access to the channel much fairer and more effective, and greatly enhances performance. To observe the final performance of a network using the QS-CT protocol, we simulate it and compare it with MACA/C-T without QS-CT. In contrast to MACA/C-T, the simulation results show that QS-CT greatly improves the throughput, delay, packet loss rate and other key indicators.
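The abstract does not specify QS-CT's scheduling discipline, so as a generic illustration of endowing different queues with different service shares, one round of weighted scheduling might look like the sketch below (queue names and weights are assumptions):

```python
from collections import deque

class WeightedScheduler:
    """Illustrative weighted queue scheduler: each queue is served up to
    `weight` packets per round, higher weights first. This is NOT the
    QS-CT algorithm itself, only a sketch of the general idea."""
    def __init__(self, weights):
        self.weights = weights
        self.queues = {name: deque() for name in weights}

    def enqueue(self, name, packet):
        self.queues[name].append(packet)

    def next_round(self):
        """Serve each queue in weight order, draining at most its weight."""
        served = []
        for name in sorted(self.weights, key=self.weights.get, reverse=True):
            q = self.queues[name]
            for _ in range(self.weights[name]):
                if q:
                    served.append(q.popleft())
        return served
```

With a higher weight on delay-sensitive traffic, that class of packets reaches the channel more often per round, which is the fairness/throughput trade-off such mechanisms tune.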

  11. Conditioning with compound stimuli in Drosophila melanogaster in the flight simulator.

    PubMed

    Brembs, B; Heisenberg, M

    2001-08-01

    Short-term memory in Drosophila melanogaster operant visual learning in the flight simulator is explored using patterns and colours as a compound stimulus. Presented together during training, the two stimuli accrue the same associative strength whether or not a prior training phase rendered one of the two stimuli a stronger predictor for the reinforcer than the other (no blocking). This result adds Drosophila to the list of other invertebrates that do not exhibit the robust vertebrate blocking phenomenon. Other forms of higher-order learning, however, were detected: a solid sensory preconditioning and a small second-order conditioning effect imply that associations between the two stimuli can be formed, even if the compound is not reinforced.

  12. Simulation of complex pharmacokinetic models in Microsoft Excel.

    PubMed

    Meineke, Ingolf; Brockmöller, Jürgen

    2007-12-01

With the arrival of powerful personal computers in the office, numerical methods have become accessible to everybody. Simulation of complex processes has therefore become an indispensable tool in research and education. In this paper Microsoft EXCEL is used as a platform for a universal differential equation solver. The software is designed as an add-in aiming at a minimum of required user input to perform a given task. Four examples are included to demonstrate both the simplicity of use and the versatility of possible applications. While the layout of the program is admittedly geared to the needs of pharmacokineticists, it can be used in any field where sets of differential equations are involved. The software package is available upon request.
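As an illustration of the kind of system such a solver integrates (a plain-Python sketch, not the Excel add-in itself), a one-compartment pharmacokinetic model with first-order absorption can be stepped with a classical fourth-order Runge-Kutta scheme; all parameter values below are assumed for illustration:

```python
def rk4_step(f, y, t, dt):
    """One classical RK4 step for a system dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, [yi + dt / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + dt / 2, [yi + dt / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + dt, [yi + dt * ki for yi, ki in zip(y, k3)])
    return [yi + dt / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def one_compartment(ka=1.0, ke=0.2, V=10.0, dose=100.0, dt=0.05, t_end=24.0):
    """First-order absorption into a single compartment (illustrative
    parameters). State: [amount in gut, plasma concentration]."""
    def f(t, y):
        a_gut, conc = y
        return [-ka * a_gut,                  # gut empties
                ka * a_gut / V - ke * conc]   # absorption minus elimination
    y, t, trace = [dose, 0.0], 0.0, []
    while t < t_end:
        y = rk4_step(f, y, t, dt)
        t += dt
        trace.append((t, y[1]))               # (time, concentration)
    return trace
```

The concentration-time curve rises during absorption, peaks, and then decays under first-order elimination, which is the familiar shape such pharmacokinetic simulations produce.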

  13. Progress in Modeling and Simulation of Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, John A

    2016-01-01

Modeling and simulation of batteries, in conjunction with theory and experiment, are important research tools that offer opportunities for advancement of technologies that are critical to electric vehicles. The development of data from the application of these tools can provide the basis for managerial and technical decision-making. Together, these will continue to transform batteries for electric vehicles. This collection of nine papers presents the modeling and simulation of batteries and the continuing contribution being made to this impressive progress, including topics that cover: thermal behavior and characteristics; battery management system design and analysis; moderately high-fidelity 3D capabilities; and optimization techniques and durability. As electric vehicles continue to gain interest from manufacturers and consumers alike, improvements in economy and affordability, as well as adoption of alternative fuel sources to meet government mandates, are driving battery research and development. Progress in modeling and simulation will continue to contribute to battery improvements that deliver increased power, energy storage, and durability to further enhance the appeal of electric vehicles.

  14. Improvement of Progressive Damage Model to Predicting Crashworthy Composite Corrugated Plate

    NASA Astrophysics Data System (ADS)

    Ren, Yiru; Jiang, Hongyong; Ji, Wenyuan; Zhang, Hanyu; Xiang, Jinwu; Yuan, Fuh-Gwo

    2018-02-01

To predict the crashworthiness of composite corrugated plates, different single and stacked shell models are evaluated and compared, and a stacked shell progressive damage model combined with continuum damage mechanics is proposed and investigated. To simulate and predict the failure behavior, both intra- and inter-laminar failure mechanisms are considered. The tiebreak contact method, 1D spot weld elements and cohesive elements are adopted in the stacked shell models, and a surface-based cohesive behavior is used to capture delamination in the proposed model. The impact load and failure behavior of the proposed and conventional progressive damage models are demonstrated. Results show that the single shell model can simulate the impact load curve but lacks the ability to simulate delamination. The general stacked shell model can simulate the interlaminar failure behavior. The improved stacked shell model with continuum damage mechanics and cohesive elements not only agrees well with the impact load, but also captures the fiber damage, matrix debonding, and interlaminar failure of the composite structure.

  15. Progressive Fracture of Composite Structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    2001-01-01

    This report includes the results of a research in which the COmposite Durability STRuctural ANalysis (CODSTRAN) computational simulation capabilities were augmented and applied to various structures for demonstration of the new features and verification. The first chapter of this report provides an introduction to the computational simulation or virtual laboratory approach for the assessment of damage and fracture progression characteristics in composite structures. The second chapter outlines the details of the overall methodology used, including the failure criteria and the incremental/iterative loading procedure with the definitions of damage, fracture, and equilibrium states. The subsequent chapters each contain an augmented feature of the code and/or demonstration examples. All but one of the presented examples contains laminated composite structures with various fiber/matrix constituents. For each structure simulated, damage initiation and progression mechanisms are identified and the structural damage tolerance is quantified at various degradation stages. Many chapters contain the simulation of defective and defect free structures to evaluate the effects of existing defects on structural durability.

  16. Simulating Initial and Progressive Failure of Open-Hole Composite Laminates under Tension

    NASA Astrophysics Data System (ADS)

    Guo, Zhangxin; Zhu, Hao; Li, Yongcun; Han, Xiaoping; Wang, Zhihua

    2016-12-01

A finite element (FE) model is developed for the progressive failure analysis of fiber reinforced polymer laminates. The failure criteria for fiber and matrix failure are implemented in the FE code Abaqus using the user-defined material subroutine UMAT. The gradual degradation of the material properties is controlled by the individual fracture energies of fiber and matrix. The failure and damage in composite laminates containing a central hole subjected to uniaxial tension are simulated. The numerical results show that the damage model can be used to accurately predict the progressive failure behaviour both qualitatively and quantitatively.

  17. Curriculum-Based Measurement of Oral Reading: Quality of Progress Monitoring Outcomes

    ERIC Educational Resources Information Center

    Christ, Theodore J.; Zopluoglu, Cengiz; Long, Jeffery D.; Monaghen, Barbara D.

    2012-01-01

    Curriculum-based measurement of oral reading (CBM-R) is frequently used to set student goals and monitor student progress. This study examined the quality of growth estimates derived from CBM-R progress monitoring data. The authors used a linear mixed effects regression (LMER) model to simulate progress monitoring data for multiple levels of…
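The kind of progress monitoring data such a study simulates can be sketched as a linear growth trend plus measurement error, with the growth rate recovered by least squares. This is a simplified illustration — the authors used a linear mixed effects (LMER) model, and every parameter value here is an assumption:

```python
import random

def simulate_cbmr(weeks=10, probes_per_week=1, intercept=50.0,
                  slope=1.5, residual_sd=10.0, rng=random.Random(42)):
    """Simulate weekly CBM-R scores as a linear growth trend plus
    Gaussian measurement error (illustrative parameters only)."""
    data = []
    for w in range(weeks):
        for _ in range(probes_per_week):
            score = intercept + slope * w + rng.gauss(0, residual_sd)
            data.append((w, score))
    return data

def ols_slope(data):
    """Ordinary least-squares slope of score on week."""
    n = len(data)
    mx = sum(w for w, _ in data) / n
    my = sum(s for _, s in data) / n
    sxy = sum((w - mx) * (s - my) for w, s in data)
    sxx = sum((w - mx) ** 2 for w, _ in data)
    return sxy / sxx
```

Repeating the simulation many times at different schedule lengths and noise levels shows how quickly (or slowly) the estimated weekly growth rate stabilizes around the true slope, which is the core question of the study.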

  18. Progress in Unsteady Turbopump Flow Simulations

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin C.; Chan, William; Kwak, Dochan; Williams, Robert

    2002-01-01

    This viewgraph presentation discusses unsteady flow simulations for a turbopump intended for a reusable launch vehicle (RLV). The simulation process makes use of computational grids and parallel processing. The architecture of the parallel computers used is discussed, as is the scripting of turbopump simulations.

  19. Effects of added chelated trace minerals, organic selenium, yeast culture, direct-fed microbials, and Yucca schidigera extract in horses: II. Nutrient excretion and potential environmental impact.

    PubMed

    Gordon, M E; Edwards, M S; Sweeney, C R; Jerina, M L

    2013-08-01

The objective of this study was to test the hypothesis that an equine diet formulated with chelated trace minerals, organic selenium, yeast culture, direct-fed microbials (DFM) and Yucca schidigera extract would decrease excretion of nutrients that have potential for environmental impact. Horses were acclimated to 100% pelleted diets formulated with (ADD) and without (CTRL) the aforementioned additives. Chelated sources of Cu, Zn, Mn, and Co were included in the ADD diet at a 100% replacement rate of sulfate forms used in the CTRL diet. Additionally, the ADD diet included organic selenium yeast, DFM, and Yucca schidigera extract. Ten horses were fed the 2 experimental diets during two 42-d periods in a crossover design. Total fecal and urine collection occurred during the last 14 d of each period. Results indicate no significant differences between Cu, Zn, Mn, and Co concentrations excreted via urine (P > 0.05) due to dietary treatment. There was no difference between fecal Cu and Mn concentrations (P > 0.05) based on diet consumed. Mean fecal Zn and Co concentrations excreted by horses consuming ADD were greater than CTRL (P < 0.003). Differences due to diet were found for selenium fecal (P < 0.0001) and urine (P < 0.0001) excretions, with decreased concentrations found for horses consuming organic selenium yeast (ADD). In contrast, fecal K (%) was greater (P = 0.0421) for horses consuming ADD, whereas concentrations of fecal solids, total N, ammonia N, P, total ammonia, and fecal output did not differ between dietary treatments (P > 0.05). In feces stockpiled to simulate a crude composting method, no differences due to diet were detected for particle size, temperature, moisture, OM, total N, P, phosphate, K, potash, or ammonia N (P > 0.05). Although no difference (P = 0.2737) in feces stockpile temperature due to diet was found, temperature differences over time were documented (P < 0.0001). In conclusion, the addition of certain chelated mineral sources, organic Se yeast, DFM, and Yucca schidigera extract did not decrease most nutrient concentrations excreted. Horses consuming organic selenium as part of the additive diet had lower fecal and urine Se concentrations, as well as greater fecal K concentrations.

  20. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)

    2005-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  1. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    2004-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  2. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M&S environments and infrastructure.

  3. STS-26 simulation activities in JSC Mission Control Center (MCC)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Overall view of JSC Mission Control Center (MCC) Bldg 30 Flight Control Room (FCR) during Flight Day 1 of STS-26 integrated simulations in progress between MCC and JSC Mission Simulation and Training Facility Bldg 5 fixed-base (FB) shuttle mission simulator (SMS).

  4. Body Constraints on Motor Simulation in Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Conson, Massimiliano; Hamilton, Antonia; De Bellis, Francesco; Errico, Domenico; Improta, Ilaria; Mazzarella, Elisabetta; Trojano, Luigi; Frolli, Alessandro

    2016-01-01

    Developmental data suggested that mental simulation skills become progressively dissociated from overt motor activity across development. Thus, efficient simulation is rather independent from current sensorimotor information. Here, we tested the impact of bodily (sensorimotor) information on simulation skills of adolescents with Autism Spectrum…

  5. FERN - a Java framework for stochastic simulation and evaluation of reaction networks.

    PubMed

    Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf

    2008-08-29

Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either a) do not provide the most efficient simulation algorithms and are difficult to extend, b) cannot be easily integrated into other applications or c) do not allow the user to monitor and intervene during the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand-alone program and within new systems biology applications. Finally, complex scenarios requiring intervention during the simulation progress can be modelled easily with FERN.
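The exact stochastic simulation algorithm at the core of such frameworks, Gillespie's direct method, can be sketched in a few lines. This is a generic illustration, not FERN's actual Java implementation:

```python
import random

def gillespie(rates, stoich, state, t_end, rng=random.Random(0)):
    """Minimal Gillespie direct-method SSA sketch.

    rates[i](state) -> propensity of reaction i
    stoich[i]       -> state-change vector of reaction i
    """
    t, state = 0.0, list(state)
    while t < t_end:
        props = [r(state) for r in rates]
        total = sum(props)
        if total == 0:
            break                        # no reaction can fire anymore
        t += rng.expovariate(total)      # exponential waiting time
        # choose reaction i with probability props[i] / total
        u, acc = rng.random() * total, 0.0
        for i, p in enumerate(props):
            acc += p
            if u < acc:
                state = [s + d for s, d in zip(state, stoich[i])]
                break
    return state
```

For an irreversible conversion A → B with mass-action propensity k·[A], every firing moves one molecule from A to B until A is exhausted, so the long-time state is fully converted regardless of the random seed.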

  6. Quarterly Progress Report: Modeling and Simulation of the Homopolar Motor Test Apparatus

    DTIC Science & Technology

    2006-05-01

This quarterly progress report (Contract # N00014-1-0588) covers modeling and simulation of the homopolar motor test apparatus, part of work on a superconducting homopolar motor/generator (SCHPMG) machine for ship propulsion. Electrical contact (brush/slip ring) performance is a limiting factor in SCHPMG designs. Subject terms: superconducting homopolar motors; inhomogeneous brush wear; polarity dependence; destabilized force.

  7. Necrotizing Fasciitis: An Emergency Medicine Simulation Scenario.

    PubMed

    Galust, Henrik; Oliverio, Matthew H; Giorgio, Daniel J; Espinal, Alexis M; Ahmed, Rami

    2016-08-31

    Necrotizing fasciitis (NF) is a rare and rapidly progressing life-threatening infectious process. By progressing through a simulation involving a patient with NF and participating in a post-scenario debriefing, learners will gain the necessary skills and knowledge to properly diagnose and manage patients with NF. Learners are taught to initiate appropriate and timely treatment and to advocate on behalf of their patient after inappropriate pushback from consultants to improve outcomes.

  8. Quantum simulation from the bottom up: the case of rebits

    NASA Astrophysics Data System (ADS)

    Enshan Koh, Dax; Yuezhen Niu, Murphy; Yoder, Theodore J.

    2018-05-01

    Typically, quantum mechanics is thought of as a linear theory with unitary evolution governed by the Schrödinger equation. While this is technically true and useful for a physicist, with regards to computation it is an unfortunately narrow point of view. Just as a classical computer can simulate highly nonlinear functions of classical states, so too can the more general quantum computer simulate nonlinear evolutions of quantum states. We detail one particular simulation of nonlinearity on a quantum computer, showing how the entire class of -unitary evolutions (on n qubits) can be simulated using a unitary, real-amplitude quantum computer (consisting of n  +  1 qubits in total). These operators can be represented as the sum of a linear and antilinear operator, and add an intriguing new set of nonlinear quantum gates to the toolbox of the quantum algorithm designer. Furthermore, a subgroup of these nonlinear evolutions, called the -Cliffords, can be efficiently classically simulated, by making use of the fact that Clifford operators can simulate non-Clifford (in fact, non-linear) operators. This perspective of using the physical operators that we have to simulate non-physical ones that we do not is what we call bottom-up simulation, and we give some examples of its broader implications.
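The real-encoding trick behind rebit simulation can be made concrete with a minimal sketch (an illustration of the standard construction, not necessarily the paper's exact one): a complex n-qubit state is stored in the amplitudes of n + 1 real-amplitude qubits by stacking real and imaginary parts, and a complex unitary U = X + iY acts as the real orthogonal block matrix [[X, -Y], [Y, X]].

```python
def encode(psi):
    """Map a complex amplitude vector to a real one of twice the length:
    the extra 'rebit' index selects real vs. imaginary parts."""
    return [z.real for z in psi] + [z.imag for z in psi]

def encode_unitary(U):
    """Complex matrix U = X + iY becomes the real block matrix
    [[X, -Y], [Y, X]], orthogonal whenever U is unitary."""
    n = len(U)
    R = [[0.0] * (2 * n) for _ in range(2 * n)]
    for i in range(n):
        for j in range(n):
            R[i][j] = U[i][j].real          # X block
            R[i][j + n] = -U[i][j].imag     # -Y block
            R[i + n][j] = U[i][j].imag      # Y block
            R[i + n][j + n] = U[i][j].real  # X block
    return R

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]
```

Applying the encoded S gate (phase i on |1⟩) to the encoded state |1⟩ moves the amplitude from the real half to the imaginary half of the real vector, exactly mirroring multiplication by i in the complex picture.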

  9. Synthesis of High-Speed Digital Systems.

    DTIC Science & Technology

    1985-11-08


  10. Modeling the dynamics of chromosomal alteration progression in cervical cancer: A computational model

    PubMed Central

    2017-01-01

    Computational modeling has been applied to simulate the heterogeneity of cancer behavior. The development of Cervical Cancer (CC) is a process in which the cell acquires dynamic behavior from non-deleterious and deleterious mutations, exhibiting chromosomal alterations as a manifestation of this dynamic. To further determine the progression of chromosomal alterations in precursor lesions and CC, we introduce a computational model to study the dynamics of deleterious and non-deleterious mutations as an outcome of tumor progression. The analysis of chromosomal alterations mediated by our model reveals that multiple deleterious mutations are more frequent in precursor lesions than in CC. Cells with lethal deleterious mutations would be eliminated, which would mitigate cancer progression; on the other hand, cells with non-deleterious mutations would become dominant, which could predispose them to cancer progression. The study of somatic alterations through computer simulations of cancer progression provides a feasible pathway for insights into the transformation of cell mechanisms in humans. During cancer progression, tumors may acquire new phenotype traits, such as the ability to invade and metastasize or to become clinically important when they develop drug resistance. Non-deleterious chromosomal alterations contribute to this progression. PMID:28723940

  11. Membrane, action, and oscillatory potentials in simulated protocells

    NASA Technical Reports Server (NTRS)

    Syren, R. M.; Fox, S. W.; Przybylski, A. T.; Stratten, W. P.

    1982-01-01

    Electrical membrane potentials, oscillations, and action potentials are observed in proteinoid microspheres impaled with (3 M KCl) microelectrodes. Although effects are of greater magnitude when the vesicles contain glycerol and natural or synthetic lecithin, the results in the purely synthetic thermal protein structures are substantial, attaining 20 mV amplitude in some cases. The results add the property of electrical potential to the other known properties of proteinoid microspheres, in their role as models for protocells.

  12. High Resolution WENO Simulation of 3D Detonation Waves

    DTIC Science & Technology

    2012-02-27

Only fragments of the report text survive: "…pocket behind the detonation front was not observed in their results because the rotating transverse detonation completely consumed the unburned gas…"; "…We add source terms (functions of x, y, z and t) to the PDE system so that the following functions are exact solutions to…"; "…the detonation rotates counter-clockwise, opposite to that in [48]. It can be seen that the triple lines and transverse waves collide with the walls, and strong…"

  13. Ultranarrow bandwidth spectral filtering for long-range free-space quantum key distribution at daytime.

    PubMed

    Höckel, David; Koch, Lars; Martin, Eugen; Benson, Oliver

    2009-10-15

    We describe a Fabry-Perot-based spectral filter for free-space quantum key distribution (QKD). A multipass etalon filter was built, and its performance was studied. The whole filter setup was carefully optimized to add less than 2 dB attenuation to a signal beam but block stray light by 21 dB. Simulations show that such a filter might be sufficient to allow QKD satellite downlinks during daytime with the current technology.
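    The decibel figures quoted above convert to linear power transmission by the standard relation T = 10^(-dB/10); a quick check (standard dB arithmetic, not from the paper):

```python
def db_to_transmission(loss_db):
    """Convert an attenuation in dB to a linear power transmission factor."""
    return 10 ** (-loss_db / 10)

signal = db_to_transmission(2)    # < 2 dB signal attenuation
stray = db_to_transmission(21)    # 21 dB stray-light suppression
print(f"signal transmitted: {signal:.0%}, stray light leaked: {stray:.2%}")
```

    So the filter passes roughly 63% of the signal while letting through under 1% of the stray light.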

  14. The Physlet Approach to Simulation Design

    NASA Astrophysics Data System (ADS)

    Christian, Wolfgang; Belloni, Mario; Esquembre, Francisco; Mason, Bruce A.; Barbato, Lyle; Riggsbee, Matt

    2015-10-01

    Over the past two years, the AAPT/ComPADRE staff and the Open Source Physics group have published the second edition of Physlet Physics and Physlet Quantum Physics, delivered as interactive web pages on AAPT/ComPADRE and as free eBooks available through iTunes and Google Play. These two websites, and their associated books, add over 1000 interactive exercises for the teaching of introductory physics, introductory and intermediate modern physics, and quantum mechanics to AAPT/ComPADRE.

  15. NEAMS-IPL MOOSE Midyear Framework Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Permann, Cody; Alger, Brian; Peterson, John

    The MOOSE Framework is a modular pluggable framework for building complex simulations. The ability to add new objects with custom syntax is a core capability that makes MOOSE a powerful platform for coupling multiple applications together within a single environment. The creation of a new, more standardized JSON syntax output improves the external interfaces for generating graphical components or for validating input file syntax. The design of this interface and the requirements it satisfies are covered in this short report.
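    A JSON syntax dump of the kind described lends itself to generic tooling for input validation; the schema below is a made-up stand-in for illustration, not MOOSE's actual output format:

```python
import json

# hypothetical fragment of a JSON syntax dump: blocks and their parameters
syntax = json.loads("""
{
  "Kernels": {"Diffusion": {"parameters": {"variable": {"required": true}}}},
  "Mesh":    {"parameters": {"dim": {"required": true}, "nx": {"required": false}}}
}
""")

def required_params(block):
    """List the required parameter names declared for a syntax block."""
    params = syntax.get(block, {}).get("parameters", {})
    return sorted(name for name, meta in params.items() if meta.get("required"))

print(required_params("Mesh"))    # ['dim']
```

    A graphical front end or input checker can walk such a tree without knowing anything about the physics modules that registered the blocks.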

  16. Challenges and Opportunities in Propulsion Simulations

    DTIC Science & Technology

    2015-09-24

    Slide excerpts: leverage Nvidia GPU accelerators; release common computational infrastructure as Distro A for collaboration; add physics modules as either... Titan vs. Summit comparison: Interconnect: Gemini (6.4 GB/s) vs. Dual Rail EDR-IB (23 GB/s); Interconnect topology: 3D Torus vs. Non-blocking Fat Tree; Processors: AMD Opteron™ with NVIDIA Kepler™ vs. IBM POWER9 with NVIDIA Volta™; File system: 32 PB at 1 TB/s (Lustre®) vs. 120 PB at 1 TB/s (GPFS™); Peak power consumption: 9 MW vs. 10 MW. Source: R

  17. The effect of a scanning flat fold mirror on a cosmic microwave background B-mode experiment.

    PubMed

    Grainger, William F; North, Chris E; Ade, Peter A R

    2011-06-01

    We investigate the possibility of using a flat-fold beam-steering mirror for a cosmic microwave background B-mode experiment. An aluminium flat-fold mirror is found to add ∼0.075% polarization, which varies in a scan-synchronous way. Time-domain simulations of a realistic scanning pattern are performed, the effect on the power spectrum is illustrated, and a possible method of correction is applied. © 2011 American Institute of Physics.

  18. Development and evaluation of packet video schemes

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Y. C.; Hadenfeldt, A. C.

    1990-01-01

    Results reflecting the two tasks proposed for the current year, namely a feasibility study of simulating the NASA network and a study of progressive transmission schemes, are presented. The view of the NASA network, gleaned from the various technical reports made available to us, is provided. Also included is a brief overview of how the current simulator could be modified to accomplish the goal of simulating the NASA network. As the material in this section would be the basis for the actual simulation, it is important to make sure that it is an accurate reflection of the requirements on the simulator. Brief descriptions of the set of progressive transmission algorithms selected for the study are included. The results available in the literature were obtained under a variety of different assumptions, not all of which are stated. As such, the only way to compare the efficiency and the implementational complexity of the various algorithms is to simulate them.

  19. Astrophysical Computation in Research, the Classroom and Beyond

    NASA Astrophysics Data System (ADS)

    Frank, Adam

    2009-03-01

    In this talk I review progress in the use of simulations as a tool for astronomical research, for education and public outreach. The talk will include the basic elements of numerical simulations as well as advances in algorithms which have led to recent dramatic progress such as the use of Adaptive Mesh Refinement methods. The scientific focus of the talk will be star formation jets and outflows while the educational emphasis will be on the use of advanced platforms for simulation based learning in lecture and integrated homework. Learning modules for science outreach websites such as DISCOVER magazine will also be highlighted.

  20. Vadose Zone Fate and Transport Simulation of Chemicals Associated with Coal Seam Gas Extraction

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Mallants, D.; Jacques, D.; Van Genuchten, M.

    2017-12-01

    The HYDRUS-1D and HYDRUS (2D/3D) computer software packages are widely used finite element models for simulating the one-dimensional and two- or three-dimensional movement of water, heat, and multiple solutes in variably-saturated media, respectively. While the standard HYDRUS models consider only the fate and transport of individual solutes or solutes subject to first-order degradation reactions, several specialized HYDRUS add-on modules can simulate far more complex biogeochemical processes. The objective of this presentation is to provide an overview of the HYDRUS models and their add-on modules, and to demonstrate applications of the software to the subsurface fate and transport of chemicals involved in coal seam gas extraction and water management operations. One application uses the standard HYDRUS model to evaluate the natural soil attenuation potential of hydraulic fracturing chemicals and their transformation products in case of an accidental release. By coupling the processes of retardation, first-order degradation and convective-dispersive transport of the biocide bronopol and its degradation products, we demonstrated how natural attenuation reduces initial concentrations by more than a factor of one hundred in the top 5 cm of the vadose zone. A second application uses the UnsatChem module to explore the possible use of coal seam gas produced water for sustainable irrigation. Simulations with different irrigation waters (untreated, amended with surface water, and reverse osmosis treated) provided detailed results regarding chemical indicators of soil and plant health, notably SAR, EC and sodium concentrations. A third application uses the coupled HYDRUS-PHREEQC module to analyze trace metal transport involving cation exchange and surface complexation sorption reactions in the vadose zone leached with coal seam gas produced water following an accidental water release scenario.
Results show that the main process responsible for trace metal migration is complexation of naturally present trace metals with inorganic ligands such as (bi)carbonate that enter the soil upon infiltration with alkaline produced water.
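    The hundred-fold attenuation figure is the kind of result the textbook solution for piston flow with retardation and first-order decay produces, C/C0 = exp(-kRz/v). The parameter values below are illustrative, not the study's bronopol values:

```python
import math

def attenuation(z, v, R, k):
    """Fraction of input concentration remaining at depth z for piston
    flow at pore velocity v, retardation factor R and first-order decay
    rate k; the travel time to depth z is R*z/v."""
    return math.exp(-k * R * z / v)

# illustrative values: 5 cm depth, 1 cm/day pore velocity,
# retardation factor 2, decay half-life of ~1.5 days
k = math.log(2) / 1.5
frac = attenuation(z=5.0, v=1.0, R=2.0, k=k)
print(f"fraction remaining at 5 cm: {frac:.4f}")  # ~1/100 for these values
```

    Retardation enters multiplicatively through the travel time, which is why sorbing, degradable compounds attenuate so strongly over even a few centimetres.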

  1. Governator vs. Hunter and Aggregator: A simulation of party competition with vote-seeking and office-seeking rules.

    PubMed

    Lehrer, Roni; Schumacher, Gijs

    2018-01-01

    The policy positions parties choose are central to both attracting voters and forming coalition governments. How then should parties choose positions to best represent voters? Laver and Sergenti show that in an agent-based model with boundedly rational actors a decision rule (Aggregator) that takes the mean policy position of its supporters is the best rule to achieve high congruence between voter preferences and party positions. But this result only pertains to representation by the legislature, not representation by the government. To evaluate this we add a coalition formation procedure with boundedly rational parties to the Laver and Sergenti model of party competition. We also add two new decision rules that are sensitive to government formation outcomes rather than voter positions. We develop two simulations: a single-rule one in which parties with the same rule compete and an evolutionary simulation in which parties with different rules compete. In these simulations we analyze party behavior under a large number of different parameters that describe real-world variance in political parties' motives and party system characteristics. Our most important conclusion is that Aggregators also produce the best match between government policy and voter preferences. Moreover, even though citizens often frown upon politicians' interest in the prestige and rents that come with winning political office (office pay-offs), we find that citizens actually receive better representation by the government if politicians are motivated by these office pay-offs in contrast to politicians with ideological motivations (policy pay-offs). Finally, we show that while more parties are linked to better political representation, how parties choose policy positions affects political representation as well. 
Overall, we conclude that to understand variation in the quality of political representation scholars should look beyond electoral systems and take into account variation in party behavior as well.
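    The Aggregator rule at the heart of this result is simple to state: each party moves to the mean position of its current supporters. A minimal one-dimensional sketch (the published agent-based model is far richer, with bounded rationality, multiple rules, and coalition formation):

```python
def nearest_party(voter, parties):
    """Index of the party closest to a voter on a 1-D policy scale."""
    return min(range(len(parties)), key=lambda i: abs(parties[i] - voter))

def aggregator_step(parties, voters):
    """One update: each party adopts the mean position of its supporters
    (the voters for whom it is nearest); unsupported parties stay put."""
    support = {i: [] for i in range(len(parties))}
    for v in voters:
        support[nearest_party(v, parties)].append(v)
    return [sum(s) / len(s) if s else parties[i]
            for i, s in support.items()]

voters = [-0.9, -0.5, -0.1, 0.2, 0.6, 1.0]
parties = [-1.0, 1.0]
for _ in range(10):
    parties = aggregator_step(parties, voters)
print(parties)
```

    With these voters the two parties converge quickly to the mean positions of their respective support blocs, which is the mechanism behind the congruence result.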

  3. Impact of add-on laboratory testing at an academic medical center: a five year retrospective study.

    PubMed

    Nelson, Louis S; Davis, Scott R; Humble, Robert M; Kulhavy, Jeff; Aman, Dean R; Krasowski, Matthew D

    2015-01-01

    Clinical laboratories frequently receive orders to perform additional tests on existing specimens ('add-ons'). Previous studies have examined add-on ordering patterns over short periods of time. The objective of this study was to analyze add-on ordering patterns over an extended time period. We also analyzed the impact of a robotic specimen archival/retrieval system on add-on testing procedure and manual effort. In this retrospective study at an academic medical center, electronic health records were searched to obtain all add-on orders placed between May 2, 2009 and December 31, 2014. During the study period, 880,359 add-on tests were ordered on 96,244 different patients. Add-on testing comprised 3.3 % of total test volumes. There were 443,411 unique ordering instances, leading to an average of 1.99 add-on tests per instance. Some patients had multiple episodes of add-on test orders at different points in time, leading to an average of 9.15 add-on tests per patient. The majority of add-on orders were for chemistry tests (78.8 % of total add-ons) with the next most frequent being hematology and coagulation tests (11.2 % of total add-ons). Inpatient orders accounted for 66.8 % of total add-on orders, while the emergency department and outpatient clinics had 14.8 % and 18.4 % of total add-on orders, respectively. The majority of add-ons were placed within 8 hours (87.3 %) and nearly all by 24 hours (96.8 %). Nearly 100 % of add-on orders within the emergency department were placed within 8 hours. The introduction of a robotic specimen archival/retrieval unit saved an average of 2.75 minutes of laboratory staff manual time per unique add-on order. This translates to 24.1 hours/day less manual effort in dealing with add-on orders. Our study reflects the previous literature in showing that add-on orders significantly impact the workload of the clinical laboratory. 
The majority of add-on orders are clinical chemistry tests, and most add-on orders occur within 24 hours of original specimen collection. Robotic specimen archival/retrieval units can reduce manual effort in the clinical laboratory associated with add-on orders.
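    The per-instance and per-patient averages quoted follow directly from the reported totals:

```python
total_tests = 880_359   # add-on tests ordered over the study period
instances = 443_411     # unique ordering instances
patients = 96_244       # distinct patients with add-on orders

print(f"tests per instance: {total_tests / instances:.2f}")  # 1.99
print(f"tests per patient:  {total_tests / patients:.2f}")   # 9.15
```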

  4. Policy and organizational implications of gender imbalance in the NHS.

    PubMed

    Miller, Karen

    2007-01-01

    The purpose of the paper is to examine the policy and organizational implications of gender imbalance in management, which research suggests exists in the NHS. The research in this paper involved a qualitative approach with an analysis of elite interviews conducted with a non-random sample of officials involved in health policy and interviews with a random sample of senior managers in NHS Scotland. The research formed part of a larger study, which explored the enablers and inhibitors to female career progression in various Scottish sectors. The paper finds that gender imbalance in management exists in the NHS. This is manifested in a masculine organizational context, leadership and policy decision-making process, which have implications for female career advancement opportunities and subsequently access to macro policy decisions. The paper involved a sample (30 percent) of senior managers and examined policy processes in NHS Scotland. To improve the external validity of the findings further research should be conducted in NHS organizations in England and Wales. The findings in the paper suggest that gender imbalance in management and a masculine organizational context and leadership style within the NHS create a less than conducive environment for female employees. This has practical implications in terms of levels of part-time employment, career progression and attrition rates. The paper adds to the debate of gender and organizational studies by examining the health sector, which has high levels of female employment but low levels of female representation at senior management levels. The paper therefore adds to an often-neglected area of study, women in leadership and senior managerial positions. The paper is original in its approach by examining the micro and meso organizational dimensions which impact on women's ability to influence macro health policy.

  5. A DMPA Langmuir monolayer study: from gas to solid phase. An atomistic description by molecular dynamics Simulation.

    PubMed

    Giner-Casares, J J; Camacho, L; Martín-Romero, M T; Cascales, J J López

    2008-03-04

    In this work, a DMPA Langmuir monolayer at the air/water interface was studied by molecular dynamics simulations. Thus, an atomistic picture of a Langmuir monolayer was drawn from its expanded gas phase to its final solid condensed one. In this sense, some properties of monolayers that were traditionally poorly reproduced, or not reproduced at all, in computer simulations, such as lipid domain formation or the pressure-area per lipid isotherm, were properly reproduced in this work. Thus, the physical laws that control lipid domain formation in the gas phase and the structure of lipid monolayers from the gas phase to the solid condensed phase were studied. Thanks to the atomistic information provided by the molecular dynamics simulations, we were able to add valuable information to the experimental description of these processes and to access data related to lipid monolayers in their expanded phase, which is difficult or impossible to study by experimental techniques. In this sense, properties such as lipid head hydration and lipid structure were studied.

  6. Simulating Progressive Damage of Notched Composite Laminates with Various Lamination Schemes

    NASA Astrophysics Data System (ADS)

    Mandal, B.; Chakrabarti, A.

    2017-05-01

    A three dimensional finite element based progressive damage model has been developed for the failure analysis of notched composite laminates. The material constitutive relations and the progressive damage algorithms are implemented into finite element code ABAQUS using user-defined subroutine UMAT. The existing failure criteria for the composite laminates are modified by including the failure criteria for fiber/matrix shear damage and delamination effects. The proposed numerical model is quite efficient and simple compared to other progressive damage models available in the literature. The efficiency of the present constitutive model and the computational scheme is verified by comparing the simulated results with the results available in the literature. A parametric study has been carried out to investigate the effect of change in lamination scheme on the failure behaviour of notched composite laminates.

  7. Towards Direct Numerical Simulation of mass and energy fluxes at the soil-atmospheric interface with advanced Lattice Boltzmann methods

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Krafczyk, Manfred; Geier, Martin; Schönherr, Martin

    2014-05-01

    The quantification of soil evaporation and of soil water content dynamics near the soil surface is critical in the physics of land-surface processes on many scales and is dominated by multi-component and multi-phase mass and energy fluxes between the ground and the atmosphere. Although it is widely recognized that both liquid and gaseous water movement are fundamental factors in the quantification of soil heat flux and surface evaporation, their computation has only started to be taken into account using simplified macroscopic models. As the flow field over the soil can safely be considered turbulent, it would be natural to study the detailed transient flow dynamics by means of Large Eddy Simulation (LES [1]), where the three-dimensional flow field is resolved down to the laminar sub-layer. Yet this requires very finely resolved meshes, with a grid spacing at least one order of magnitude below the typical grain diameter of the soil under consideration. In order to gain reliable turbulence statistics, up to several hundred eddy turnover times have to be simulated, which adds up to several seconds of real time. Yet the time scale of the receding saturated water front dynamics in the soil is on the order of hours. Thus we are faced with the task of solving a transient turbulent flow problem, including the advection-diffusion of water vapour over the soil-atmospheric interface, represented by a realistic tomographic reconstruction of a real porous medium taken from laboratory probes. Our flow solver is based on the Lattice Boltzmann method (LBM) [2], which has been extended by a Cumulant approach similar to the one described in [3,4] to minimize the spurious coupling between the degrees of freedom present in previous LBM approaches, and which can be used as an implicit LES turbulence model due to its low numerical dissipation and increased stability at high Reynolds numbers. 
    The kernel has been integrated into the research code Virtualfluids [5] and delivers up to 30% of the peak performance of modern General Purpose Graphics Processing Units (GPGPU, [6]), allowing the simulation of several minutes of real time for an LES LBM model. In our contribution we will present detailed profiles of the velocity distribution for different surface roughnesses, describe our multi-scale approach for the advection-diffusion and estimate water vapour fluxes from transient simulations of the coupled problem. REFERENCES [1] J. Fröhlich and D. von Terzi. Hybrid LES/RANS methods for the simulation of turbulent flows. Progress in Aerospace Sciences, 44(5):349-377, 2008. [2] S. Chen and G. D. Doolen. Annual Review of Fluid Mechanics, 30:329, 1998. [3] S. Seeger and K. H. Hoffmann. The cumulant method for computational kinetic theory. Continuum Mech. Thermodyn., 12:403-421, 2000. [4] S. Seeger and K. H. Hoffmann. The cumulant method applied to a mixture of Maxwell gases. Continuum Mech. Thermodyn., 14:321-335, 2002. [5] S. Freudiger, J. Hegewald and M. Krafczyk. A parallelisation concept for a multi-physics Lattice Boltzmann prototype based on hierarchical grids. Progress in Computational Fluid Dynamics, 8(1):168-178, 2008. [6] M. Schönherr, K. Kucher, M. Geier, M. Stiebler, S. Freudiger and M. Krafczyk. Multi-thread implementations of the Lattice Boltzmann method on non-uniform grids for CPUs and GPUs. Computers & Mathematics with Applications, 61(12):3730-3743, 2011.
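    As background to the collision operators cited above: the cumulant approach generalizes the standard BGK relaxation, which on a D2Q9 lattice amounts to streaming each population along its lattice velocity and then relaxing toward a discrete equilibrium. A minimal periodic-domain sketch (plain BGK, not the paper's cumulant scheme):

```python
import numpy as np

# D2Q9 lattice: discrete velocities and quadrature weights
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    """Second-order discrete Maxwellian for each lattice direction."""
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy
    usq = ux**2 + uy**2
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def lbm_step(f, tau=0.8):
    """One lattice Boltzmann step with BGK collision on a periodic grid."""
    # streaming: shift each population along its lattice velocity
    for i, (cx, cy) in enumerate(c):
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    # macroscopic moments
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    # BGK relaxation toward local equilibrium
    return f - (f - equilibrium(rho, ux, uy)) / tau

# start from a uniform fluid with a small density bump
f = equilibrium(np.ones((32, 32)), np.zeros((32, 32)), np.zeros((32, 32)))
f[:, 16, 16] *= 1.01
for _ in range(100):
    f = lbm_step(f)
print(f"total mass: {f.sum():.6f}")   # mass is conserved by construction
```

    The cumulant collision of [3,4] replaces the single-relaxation-time step with relaxation in cumulant space, which decouples the moments and is what gives the scheme its stability at high Reynolds numbers.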

  8. Decision-Making Accuracy of CBM Progress-Monitoring Data

    ERIC Educational Resources Information Center

    Hintze, John M.; Wells, Craig S.; Marcotte, Amanda M.; Solomon, Benjamin G.

    2018-01-01

    This study examined the diagnostic accuracy associated with decision making as is typically conducted with curriculum-based measurement (CBM) approaches to progress monitoring. Using previously published estimates of the standard errors of estimate associated with CBM, 20,000 progress-monitoring data sets were simulated to model student reading…
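    The simulation logic described (true linear growth plus measurement error at a given standard error of estimate, then a slope-based decision) can be sketched as follows; the slope, SEE, goal, and schedule here are placeholders, not the study's values:

```python
import random

def simulate_cbm_series(true_slope=1.5, intercept=50.0, see=8.0,
                        weeks=12, seed=0):
    """Weekly CBM scores = true linear growth + N(0, SEE) error."""
    rng = random.Random(seed)
    return [intercept + true_slope*w + rng.gauss(0, see) for w in range(weeks)]

def ols_slope(y):
    """Ordinary least-squares slope of scores against week number."""
    n = len(y)
    xbar, ybar = (n - 1) / 2, sum(y) / n
    sxy = sum((x - xbar) * (yi - ybar) for x, yi in enumerate(y))
    sxx = sum((x - xbar) ** 2 for x in range(n))
    return sxy / sxx

# decision accuracy: how often does the estimated slope flag
# "inadequate progress" (slope below a goal of 1.0) when true growth is 1.5?
flags = sum(ols_slope(simulate_cbm_series(seed=s)) < 1.0 for s in range(2000))
print(f"false 'inadequate progress' decisions: {flags / 2000:.1%}")
```

    Even with adequate true growth, measurement error at a plausible SEE produces a substantial rate of wrong decisions over a 12-week window, which is the phenomenon the study quantifies.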

  9. Psychometric properties and norms of the German ABC-Community and PAS-ADD Checklist.

    PubMed

    Zeilinger, Elisabeth L; Weber, Germain; Haveman, Meindert J

    2011-01-01

    The aim of the present study was to standardize and generate psychometric evidence of the German language versions of two well-established English language mental health instruments: the Aberrant Behavior Checklist-Community (ABC-C) and the Psychiatric Assessment Schedule for Adults with Developmental Disabilities (PAS-ADD) Checklist. New methods in this field were introduced: a simulation method for testing the factor structure and an exploration of long-term stability over two years. The checklists were both administered to a representative sample of 270 individuals with intellectual disability (ID) and, two years later in a second data collection, to 128 participants of the original sample. Principal component analysis and parallel analysis were performed. Reliability measures, long-term stability, subscale intercorrelations, as well as standardized norms were generated. Prevalence of mental health problems was examined. Psychometric properties were mostly excellent, with long-term stability showing moderate to strong effects. The original factor structure of the ABC-C was replicated. PAS-ADD Checklist produced a similar, but still different structure compared with findings from the English language area. The overall prevalence rate of mental health problems in the sample was about 20%. Considering the good results on the measured psychometric properties, the two checklists are recommended for the early detection of mental health problems in persons with ID. Copyright © 2011 Elsevier Ltd. All rights reserved.
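    Horn's parallel analysis, used above alongside principal component analysis, retains components whose observed eigenvalues exceed those of random data of the same shape; a minimal sketch on synthetic data (not the authors' implementation):

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Number of principal components whose eigenvalue exceeds the mean
    eigenvalue of correlation matrices of random normal data with the
    same number of rows and columns (Horn's criterion)."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.zeros(p)
    for _ in range(n_iter):
        r = rng.standard_normal((n, p))
        rand += np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    rand /= n_iter
    return int(np.sum(obs > rand))

# two latent factors driving six observed items (synthetic example)
rng = np.random.default_rng(1)
factors = rng.standard_normal((300, 2))
data = factors @ rng.standard_normal((2, 6)) + 0.5 * rng.standard_normal((300, 6))
print("components retained:", parallel_analysis(data))
```

    Unlike the Kaiser eigenvalue-greater-than-one rule, the random baseline adapts to sample size and item count, which is why it is preferred for checking factor structure.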

  10. Effects on Task Performance and Psychophysiological Measures of Performance During Normobaric Hypoxia Exposure

    NASA Technical Reports Server (NTRS)

    Stephens, Chad; Kennedy, Kellie; Napoli, Nicholas; Demas, Matthew; Barnes, Laura; Crook, Brenda; Williams, Ralph; Last, Mary Carolyn; Schutte, Paul

    2017-01-01

    Human-autonomous systems have the potential to mitigate pilot cognitive impairment and improve aviation safety. A research team at NASA Langley conducted an experiment to study the impact of mild normobaric hypoxia induction on aircraft pilot performance and psychophysiological state. A within-subjects design involved non-hypoxic and hypoxic exposures while performing three 10-minute tasks. Results indicated that a 15,000-foot simulated altitude did not induce significant performance decrement but did increase perceived workload. Analyses of psychophysiological responses evince the potential of biomarkers for mild hypoxia onset. Analyses involving coupling across physiological systems and wavelet transforms of cortical activity revealed patterns that can discern between the simulated altitude conditions. Specifically, multivariate entropy of ECG/respiration components was found to be a significant predictor (p < 0.02) of hypoxia. Furthermore, in EEG there was a significant decrease in mid-level beta (15.19-18.37 Hz) during the hypoxic condition at thirteen of sixteen sites across the scalp. The potential for identifying shifts in underlying cortical and physiological systems could serve as a means to identify the onset of deteriorated cognitive state. Enabling such assessment in future flight decks could permit increasingly autonomous systems-supported operations, and augmenting the human operator through assessment of cognitive impairment has the potential to further improve operator performance and mitigate human error in safety-critical contexts. This study represents ongoing work at NASA intended to add to the current knowledge of psychophysiologically based input to automation to increase aviation safety.

  11. Learning outcomes in a simulation game for associate degree nursing students.

    PubMed

    Clark-C

    1977-01-01

    Learning outcomes of a simulation game designed to have one-to-one correspondence between behavioral objectives and game plays are reported. The behavioral objectives were core concepts in psychiatric mental health nursing taught to associate degree nursing students. The decision to use the simulation game method grew out of difficulties inherent in the community college nursing program, as well as the need for self-paced, efficient, learner-centered learning and evaluative tools. After the trial and revision of the game, a number of research hypotheses were tested. Simulation gaming was found to be an effective mode of learning, and students who acted as teachers for other students learned significantly more than those who were taught. Recommendations for further research include studying varied nursing populations, adding a control group, testing the long-range learning effects of playing the game, decreasing experimenter bias, studying transfer of learning to actual nurse-patient situations and changes in attitudes toward psychiatric patients, and developing more simulation games for nursing education.

  12. Cost effectiveness of pomalidomide in patients with relapsed and refractory multiple myeloma in Sweden.

    PubMed

    Borg, Sixten; Nahi, Hareth; Hansson, Markus; Lee, Dawn; Elvidge, Jamie; Persson, Ulf

    2016-05-01

    Multiple myeloma (MM) patients who have progressed following treatment with both bortezomib and lenalidomide have a poor prognosis. In this late stage, other effective alternatives are limited, and patients in Sweden are often left with best supportive care. Pomalidomide is a new anti-angiogenic and immunomodulatory drug for the treatment of MM. Our objective was to evaluate the cost effectiveness of pomalidomide as an add-on to best supportive care in patients with relapsed and refractory MM in Sweden. We developed a health-economic discrete event simulation model of a patient's course through stable disease and progressive disease, until death. It estimates life expectancy, quality-adjusted life years (QALYs) and costs from a societal perspective. Effectiveness data and utilities were taken from the MM-003 trial comparing pomalidomide plus low-dose dexamethasone with high-dose dexamethasone (HIDEX). Cost data were taken from official Swedish price lists, government sources and literature. The model estimates that, if a patient is treated with HIDEX, life expectancy is 1.12 years and the total cost is SEK 179 976 (€19 100), mainly indirect costs. With pomalidomide plus low-dose dexamethasone, life expectancy is 2.33 years, with a total cost of SEK 767 064 (€81 500), mainly in drug and indirect costs. Compared to HIDEX, pomalidomide treatment gives a QALY gain of 0.7351 and an incremental cost of SEK 587 088 (€62 400) consisting of increased drug costs (59%), incremental indirect costs (33%) and other healthcare costs (8%). The incremental cost-effectiveness ratio is SEK 798 613 (€84 900) per QALY gained. In a model of late-stage MM patients with a poor prognosis in the Swedish setting, pomalidomide is associated with a relatively high incremental cost per QALY gained. This model was accepted by the national Swedish reimbursement authority TLV, and pomalidomide was granted reimbursement in Sweden.
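    The reported incremental cost-effectiveness ratio is simply the incremental cost divided by the QALY gain; recomputing from the quoted figures reproduces it to within rounding of the published QALY gain:

```python
cost_hidex = 179_976   # SEK, total cost in the HIDEX arm
cost_pom = 767_064     # SEK, pomalidomide + low-dose dexamethasone arm
qaly_gain = 0.7351     # incremental QALYs

inc_cost = cost_pom - cost_hidex
icer = inc_cost / qaly_gain
print(f"incremental cost: SEK {inc_cost}")       # SEK 587088
print(f"ICER: SEK {icer:,.0f} per QALY gained")
```

    The result lands within a fraction of a percent of the published SEK 798 613 per QALY; the small residual reflects rounding of the QALY gain in the abstract.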

  13. Numerical simulations of quasi-perpendicular collisionless shocks

    NASA Technical Reports Server (NTRS)

    Goodrich, C. C.

    1985-01-01

    Numerical simulations of collisionless quasi-perpendicular shock waves are reviewed. The strengths and limitations of these simulations are discussed and their experimental (laboratory and spacecraft) context is given. Recent simulation results are emphasized that, with ISEE bow shock observations, are responsible for recent progress in understanding quasi-steady shock structure.

  14. Student Engagement with a Science Simulation: Aspects That Matter

    ERIC Educational Resources Information Center

    Rodrigues, Susan; Gvozdenko, Eugene

    2011-01-01

    It is argued that multimedia technology affords an opportunity to better visualise complex relationships often seen in chemistry. This paper describes the influence of chemistry simulation design facets on user progress through a simulation. Three versions of an acid-base titration simulation were randomly allocated to 36 volunteers to examine…

  15. Abaqus Simulations of Rock Response to Dynamic Loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steedman, David W.; Coblentz, David

    The LANL Geodynamics Team has been applying Abaqus modeling to achieve increasingly complex simulations. Advancements in Abaqus model-building and simulation tools allow this progress. We use Lab-developed constitutive models, the fully coupled CEL Abaqus, and general contact to simulate the response of realistic sites to explosively driven shock.

  16. DDS: The Dental Diagnostic Simulation System.

    ERIC Educational Resources Information Center

    Tira, Daniel E.

    The Dental Diagnostic Simulation (DDS) System provides an alternative to simulation systems which represent diagnostic case studies of relatively limited scope. It may be used to generate simulated case studies in all of the dental specialty areas with case materials progressing through the gamut of the diagnostic process. The generation of a…

  17. The progression of myopia from its onset at age 8-12 to adulthood and the influence of heredity and external factors on myopic progression. A 23-year follow-up study.

    PubMed

    Pärssinen, Olavi; Kauppinen, Markku; Viljanen, Anne

    2014-12-01

To examine myopic progression and factors connected with myopic progression. Myopic schoolchildren, with no previous spectacles, 119 boys and 121 girls, were recruited during 1983-1984 into a randomized 3-year clinical trial of bifocal treatment of myopia with a subsequent 20-year follow-up. Participants' mean age at baseline was 10.9 years, ranging from 8.7 to 12.8 years. An ophthalmological examination was carried out annually for 3 years and twice thereafter at ca. 10-year intervals. Additional refraction values were received from prescriptions issued by different ophthalmologists and opticians. Altogether, 1915 refraction values were available. Reading distance and accommodation were measured at each control visit. Data on parents' myopia, daily time spent on reading and close work, outdoor activities and watching television were gathered with a structured questionnaire. Using bifocals (+1.75 add) or reading without glasses or accommodation stimulus during the 3-year period in childhood did not correlate with adulthood refraction. Short reading distance in childhood predicted higher adulthood myopia among females. The factors predicting faster myopic progression were parents' myopia and less time spent on sports and outdoor activities in childhood. Time spent on reading and close work in childhood was related to myopic progression during the first 3 years but did not predict adulthood myopia. Myopia throughout follow-up was higher among those who watched television <3 hr daily than those who spent more time watching television. Mean myopic progression 8 years after age 20-24 was -0.45 D ± 0.71 (SD), and in 45% of cases, progression was ≥0.5 D. In nearly half of the cases, myopia beginning at school continued to progress into adulthood. Higher adulthood myopia was mainly related to parents' myopia and less time spent on sports and outdoor activities in childhood. © 2014 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  18. Thermal Characterization of a Simulated Fission Engine via Distributed Fiber Bragg Gratings

    NASA Astrophysics Data System (ADS)

    Duncan, Roger G.; Fielder, Robert S.; Seeley, Ryan J.; Kozikowski, Carrie L.; Raum, Matthew T.

    2005-02-01

    We report the use of distributed fiber Bragg gratings to monitor thermal conditions within a simulated nuclear reactor core located at the Early Flight Fission Test Facility of the NASA Marshall Space Flight Center. Distributed fiber-optic temperature measurements promise to add significant capability and advance the state-of-the-art in high-temperature sensing. For the work reported herein, seven probes were constructed with ten sensors each for a total of 70 sensor locations throughout the core. These discrete temperature sensors were monitored over a nine hour period while the test article was heated to over 700 °C and cooled to ambient through two operational cycles. The sensor density available permits a significantly elevated understanding of thermal effects within the simulated reactor. Fiber-optic sensor performance is shown to compare very favorably with co-located thermocouples where such co-location was feasible.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orea, Adrian; Betancourt, Minerba

The objective of this project was to use MINERvA data to tune the simulation models in order to obtain the precision needed for current and future neutrino experiments. To do this, the current models need to be validated and then improved. Validation was done by recreating figures used in previous publications, comparing data from the detector with the simulation model (GENIE). Additionally, a newer version of GENIE was compared with the version used for the publications, both to validate the new version and to note any improvements. Another objective was to add new samples into the NUISANCE framework, which was used to compare detector data with simulation models. Specifically, the added sample was the two-dimensional histogram of the double differential cross section as a function of the transverse and z-direction momentum for numu and numubar, which was also used for validation.

  20. Advanced Platform for Development and Evaluation of Grid Interconnection Systems Using Hardware-in-the-Loop: Part III -- Grid Interconnection System Evaluator: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, B.; Shirazi, M.; Coddington, M.

    2013-01-01

This paper, presented at the IEEE Green Technologies Conference 2013, describes a Grid Interconnection System Evaluator (GISE) that leverages hardware-in-the-loop (HIL) simulation techniques to rapidly evaluate the grid interconnection standard conformance of an ICS according to the procedures in IEEE Std 1547.1 (TM). The architecture and test sequencing of this evaluation tool, along with a set of representative ICS test results from three different photovoltaic (PV) inverters, are presented. The GISE adds to the National Renewable Energy Laboratory's (NREL) evaluation platform that now allows for rapid development of ICS control algorithms using controller HIL (CHIL) techniques, the ability to test the dc input characteristics of PV-based ICSs through the use of a PV simulator capable of simulating real-world dynamics using power HIL (PHIL), and evaluation of ICS grid interconnection conformance.

  1. Advanced Platform for Development and Evaluation of Grid Interconnection Systems Using Hardware-in-the-Loop: Part III - Grid Interconnection System Evaluator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, B.; Shirazi, M.; Coddington, M.

    2013-01-01

This paper describes a Grid Interconnection System Evaluator (GISE) that leverages hardware-in-the-loop (HIL) simulation techniques to rapidly evaluate the grid interconnection standard conformance of an ICS according to the procedures in IEEE Std 1547.1. The architecture and test sequencing of this evaluation tool, along with a set of representative ICS test results from three different photovoltaic (PV) inverters, are presented. The GISE adds to the National Renewable Energy Laboratory's (NREL) evaluation platform that now allows for rapid development of ICS control algorithms using controller HIL (CHIL) techniques, the ability to test the dc input characteristics of PV-based ICSs through the use of a PV simulator capable of simulating real-world dynamics using power HIL (PHIL), and evaluation of ICS grid interconnection conformance.

  2. Interprofessional communication in healthcare: An integrative review.

    PubMed

    Foronda, Cynthia; MacWilliams, Brent; McArthur, Erin

    2016-07-01

    The link between miscommunication and poor patient outcomes has been well documented. To understand the current state of knowledge regarding interprofessional communication, an integrative review was performed. The review suggested that nurses and physicians are trained differently and they exhibit differences in communication styles. The distinct frustrations that nurses and physicians expressed with each other were discussed. Egos, lack of confidence, lack of organization and structural hierarchies hindered relationships and communications. Research suggested that training programs with the use of standardized tools and simulation are effective in improving interprofessional communication skills. Recommendations include education beyond communication techniques to address the broader related constructs of patient safety, valuing diversity, team science, and cultural humility. Future directions in education are to add courses in patient safety to the curriculum, use handover tools that are interprofessional in nature, practice in simulation hospitals for training, and use virtual simulation to unite the professions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Dynamic Modeling of Starting Aerodynamics and Stage Matching in an Axi-Centrifugal Compressor

    NASA Technical Reports Server (NTRS)

    Wilkes, Kevin; OBrien, Walter F.; Owen, A. Karl

    1996-01-01

    A DYNamic Turbine Engine Compressor Code (DYNTECC) has been modified to model speed transients from 0-100% of compressor design speed. The impetus for this enhancement was to investigate stage matching and stalling behavior during a start sequence as compared to rotating stall events above ground idle. The model can simulate speed and throttle excursions simultaneously as well as time varying bleed flow schedules. Results of a start simulation are presented and compared to experimental data obtained from an axi-centrifugal turboshaft engine and companion compressor rig. Stage by stage comparisons reveal the front stages to be operating in or near rotating stall through most of the start sequence. The model matches the starting operating line quite well in the forward stages with deviations appearing in the rearward stages near the start bleed. Overall, the performance of the model is very promising and adds significantly to the dynamic simulation capabilities of DYNTECC.

  4. PhET: The Best Education Software You Can't Buy

    NASA Astrophysics Data System (ADS)

    Dubson, M.; Duncan, D. K.

    2009-12-01

Project PhET provides free educational software in the form of stand-alone Java and Flash simulations and associated classroom materials. Our motto is "It's the best educational software that money can buy, except you can't buy it, because it's free." You can start playing with PhET sims right now at http://phet.colorado.edu and add to our 1 million hits per month. PhET originally stood for Physics Education Technology, but we now include other science fields, so PhET is now a brand name. Our site has about 80 simulations, mostly in physics and math, but also in chemistry, geology, and biology. Based on careful research and student interviews, our sims have no instructions because no one reads instructions. These simulations can be used in lecture demonstrations, classroom activities, and homework assignments. The PhET site includes a long list of user-tested classroom activities and teacher tips.

  5. Management of queues in out-patient departments: the use of computer simulation.

    PubMed

    Aharonson-Daniel, L; Paul, R J; Hedley, A J

    1996-01-01

    Notes that patients attending public outpatient departments in Hong Kong spend a long time waiting for a short consultation, that clinics are congested and that both staff and patients are dissatisfied. Points out that experimentation of management changes in a busy clinical environment can be both expensive and difficult. Demonstrates computerized simulation modelling as a potential tool for clarifying processes occurring within such systems, improving clinic operation by suggesting possible answers to problems identified and evaluating the solutions, without interfering with the clinic routine. Adds that solutions can be implemented after they had proved to be successful on the model. Demonstrates some ways in which managers in health care facilities can benefit from the use of computerized simulation modelling. Specifically, shows the effect of changing the duration of consultation and the effect of the application of an appointment system on patients' waiting time.
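The effect the abstract describes (consultation length and appointment scheduling driving waiting time) can be illustrated with a toy single-doctor queue. This is a minimal sketch with invented numbers, not the paper's model:

```python
# Minimal single-server clinic sketch: contrast all patients queueing at
# opening time with staggered appointment slots, at a fixed consultation length.

def mean_wait(arrivals, consult_min):
    """Mean waiting time (minutes) for one doctor seeing patients in arrival order."""
    t, total_wait = 0.0, 0.0
    for a in sorted(arrivals):
        start = max(t, a)          # doctor free, or patient just arrived
        total_wait += start - a
        t = start + consult_min
    return total_wait / len(arrivals)

n, consult = 20, 6.0                            # 20 patients, 6-minute consultations
walk_in = [0.0] * n                             # everyone queues at the door
appointments = [i * consult for i in range(n)]  # one slot per consultation

print(mean_wait(walk_in, consult))       # 57.0 min: waits of 0, 6, ..., 114
print(mean_wait(appointments, consult))  # 0.0 min: perfectly staggered slots
```

Real clinics add randomness in arrivals and consultation times, which is exactly what the computerized simulation models in the paper are for.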

  6. Forces required for a knife to penetrate a variety of clothing types.

    PubMed

    Nolan, Gary; Hainsworth, Sarah V; Rutty, Guy N

    2013-03-01

In stabbing incidents, it is usual for the victim to be clothed and therefore a knife penetrates both clothes and skin. Clothes (other than leather) have been thought to make little difference to the penetration force. However, there is little quantitative data in the literature. In this study, a range of clothes has been tested, either singly or in layers (for example, a T-shirt and shirt), to quantify the additional force required when clothes are present. A materials testing system has been used to test the penetration force required to stab through clothes into a foam-silicone rubber skin simulant. The results show that the force required can be significantly different, particularly when layers of clothing are penetrated. A cotton T-shirt adds c. 8 N to the penetration force, while a T-shirt and jacket can add an additional 21 N. The results allow a more quantitative assessment of forces required in stabbing. © 2012 American Academy of Forensic Sciences.
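The quoted increments compose additively with the bare skin-simulant force. A small bookkeeping sketch; the 50 N baseline is an invented placeholder, only the +8 N and +21 N increments come from the abstract:

```python
# Clothing adds a measured increment to the baseline skin-simulant penetration
# force. Baseline is a placeholder; increments are the abstract's figures.

CLOTHING_INCREMENT_N = {"none": 0.0, "cotton T-shirt": 8.0, "T-shirt + jacket": 21.0}

def penetration_force(baseline_n: float, clothing: str) -> float:
    """Total force = bare skin-simulant force plus the clothing increment."""
    return baseline_n + CLOTHING_INCREMENT_N[clothing]

print(penetration_force(50.0, "T-shirt + jacket"))  # 71.0
```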

  7. Spatial frequency spectrum of the x-ray scatter distribution in CBCT projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bootsma, G. J.; Verhaegen, F.; Department of Oncology, Medical Physics Unit, McGill University, Montreal, Quebec H3G 1A4

    2013-11-15

Purpose: X-ray scatter is a source of significant image quality loss in cone-beam computed tomography (CBCT). The use of Monte Carlo (MC) simulations separating primary and scattered photons has allowed the structure and nature of the scatter distribution in CBCT to become better elucidated. This work seeks to quantify the structure and determine a suitable basis function for the scatter distribution by examining its spectral components using Fourier analysis. Methods: The scatter distribution projection data were simulated using a CBCT MC model based on the EGSnrc code. CBCT projection data, with separated primary and scatter signal, were generated for a 30.6 cm diameter water cylinder [single angle projection with varying axis-to-detector distance (ADD) and bowtie filters] and two anthropomorphic phantoms (head and pelvis, 360 projections sampled every 1°, with and without a compensator). The Fourier transform of the resulting scatter distributions was computed and analyzed both qualitatively and quantitatively. A novel metric called the scatter frequency width (SFW) is introduced to determine the scatter distribution's frequency content. The frequency content results are used to determine a set of basis functions, consisting of low-frequency sine and cosine functions, to fit and denoise the scatter distribution generated from MC simulations using a reduced number of photons and projections. The signal recovery is implemented using Fourier filtering (low-pass Butterworth filter) and interpolation. Estimates of the scatter distribution are used to correct and reconstruct simulated projections. Results: The spatial and angular frequencies are contained within a maximum frequency of 0.1 cm⁻¹ and 7/(2π) rad⁻¹ for the imaging scenarios examined, with these values varying depending on the object and imaging setup (e.g., ADD and compensator).
These data indicate spatial and angular sampling every 5 cm and π/7 rad (∼25°) can be used to properly capture the scatter distribution, with reduced sampling possible depending on the imaging scenario. Using a low-pass Butterworth filter, tuned with the SFW values, to denoise the scatter projection data generated from MC simulations using 10⁶ photons resulted in an error reduction of greater than 85% for estimating the scatter in single and multiple projections. Analysis showed that the use of a compensator helped reduce the error in estimating the scatter distribution from limited photon simulations by more than 37% when compared to the case without a compensator for the head and pelvis phantoms. Reconstructions of simulated head phantom projections corrected by the filtered and interpolated scatter estimates showed improvements in overall image quality. Conclusions: The spatial frequency content of the scatter distribution in CBCT is found to be contained within the low frequency domain. The frequency content is modulated both by object and imaging parameters (ADD and compensator). The low-frequency nature of the scatter distribution allows for a limited set of sine and cosine basis functions to be used to accurately represent the scatter signal in the presence of noise and reduced data sampling, decreasing MC-based scatter estimation time. Compensator-induced modulation of the scatter distribution reduces the frequency content and improves the fitting results.
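The Fourier-filtering step described above can be sketched in one dimension: a low-pass Butterworth magnitude response applied in the frequency domain pulls a noisy, low-frequency scatter-like profile back toward the underlying signal. The grid, signal and noise levels below are invented; only the 0.1 cm⁻¹ cutoff mirrors the maximum spatial frequency quoted in the abstract:

```python
import numpy as np

# Sketch of Fourier filtering with a low-pass Butterworth response: denoise a
# synthetic 1-D "scatter profile" corrupted by Monte Carlo photon noise.

rng = np.random.default_rng(0)
dx = 0.5                                            # detector sampling, cm
x = np.arange(0, 40, dx)
clean = 1.0 + 0.5 * np.cos(2 * np.pi * 0.05 * x)    # smooth scatter-like signal
noisy = clean + rng.normal(0.0, 0.2, x.size)        # noise from few simulated photons

def butterworth_lowpass(signal, dx, cutoff, order=4):
    """Zero-phase low-pass: n-th order Butterworth magnitude applied to the rFFT."""
    freqs = np.fft.rfftfreq(signal.size, d=dx)      # cycles per cm
    response = 1.0 / np.sqrt(1.0 + (freqs / cutoff) ** (2 * order))
    return np.fft.irfft(np.fft.rfft(signal) * response, n=signal.size)

denoised = butterworth_lowpass(noisy, dx, cutoff=0.1)   # 0.1 cm^-1 cutoff

# The filtered estimate should sit closer to the clean profile than the raw data.
print(np.abs(noisy - clean).mean() > np.abs(denoised - clean).mean())  # True
```

The paper's version operates on 2-D projections and the angular direction as well, with cutoffs tuned by the SFW metric.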

  8. Nuclear Power Plant Mechanical Component Flooding Fragility Experiments Status

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, C. L.; Savage, B.; Johnson, B.

    This report describes progress on Nuclear Power Plant mechanical component flooding fragility experiments and supporting research. The progress includes execution of full scale fragility experiments using hollow-core doors, design of improvements to the Portal Evaluation Tank, equipment procurement and initial installation of PET improvements, designation of experiments exploiting the improved PET capabilities, fragility mathematical model development, Smoothed Particle Hydrodynamic simulations, wave impact simulation device research, and pipe rupture mechanics research.

  9. Necrotizing Fasciitis: An Emergency Medicine Simulation Scenario

    PubMed Central

    Galust, Henrik; Oliverio, Matthew H; Giorgio, Daniel J; Espinal, Alexis M

    2016-01-01

    Necrotizing fasciitis (NF) is a rare and rapidly progressing life-threatening infectious process. By progressing through a simulation involving a patient with NF and participating in a post-scenario debriefing, learners will gain the necessary skills and knowledge to properly diagnose and manage patients with NF. Learners are taught to initiate appropriate and timely treatment and to advocate on behalf of their patient after inappropriate pushback from consultants to improve outcomes. PMID:27733963

  10. HIV-1 Strategies of Immune Evasion

    NASA Astrophysics Data System (ADS)

    Castiglione, F.; Bernaschi, M.

    We simulate the progression of the HIV-1 infection in untreated host organisms. The phenotype features of the virus are represented by the replication rate, the probability of activating the transcription, the mutation rate and the capacity to stimulate an immune response (the so-called immunogenicity). It is very difficult to study in-vivo or in-vitro how these characteristics of the virus influence the evolution of the disease. Therefore we resorted to simulations based on a computer model validated in previous studies. We observe, by means of computer experiments, that the virus continuously evolves under the selective pressure of an immune response whose effectiveness downgrades along with the disease progression. The results of the simulations show that immunogenicity is the most important factor in determining the rate of disease progression but, by itself, it is not sufficient to drive the disease to a conclusion in all cases.

  11. Comparative effectiveness of incorporating a hypothetical DCIS prognostic marker into breast cancer screening.

    PubMed

    Trentham-Dietz, Amy; Ergun, Mehmet Ali; Alagoz, Oguzhan; Stout, Natasha K; Gangnon, Ronald E; Hampton, John M; Dittus, Kim; James, Ted A; Vacek, Pamela M; Herschorn, Sally D; Burnside, Elizabeth S; Tosteson, Anna N A; Weaver, Donald L; Sprague, Brian L

    2018-02-01

    Due to limitations in the ability to identify non-progressive disease, ductal carcinoma in situ (DCIS) is usually managed similarly to localized invasive breast cancer. We used simulation modeling to evaluate the potential impact of a hypothetical test that identifies non-progressive DCIS. A discrete-event model simulated a cohort of U.S. women undergoing digital screening mammography. All women diagnosed with DCIS underwent the hypothetical DCIS prognostic test. Women with test results indicating progressive DCIS received standard breast cancer treatment and a decrement to quality of life corresponding to the treatment. If the DCIS test indicated non-progressive DCIS, no treatment was received and women continued routine annual surveillance mammography. A range of test performance characteristics and prevalence of non-progressive disease were simulated. Analysis compared discounted quality-adjusted life years (QALYs) and costs for test scenarios to base-case scenarios without the test. Compared to the base case, a perfect prognostic test resulted in a 40% decrease in treatment costs, from $13,321 to $8005 USD per DCIS case. A perfect test produced 0.04 additional QALYs (16 days) for women diagnosed with DCIS, added to the base case of 5.88 QALYs per DCIS case. The results were sensitive to the performance characteristics of the prognostic test, the proportion of DCIS cases that were non-progressive in the model, and the frequency of mammography screening in the population. A prognostic test that identifies non-progressive DCIS would substantially reduce treatment costs but result in only modest improvements in quality of life when averaged over all DCIS cases.
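The cost side of this result follows from simple expected-value arithmetic over DCIS cases. A back-of-envelope sketch: the $13,321 base cost is from the abstract, the 40% non-progressive share is inferred from the reported 40% cost drop, and the test characteristics are illustrative assumptions:

```python
# Expected treatment cost per DCIS case under a prognostic test: women whose
# test flags non-progressive disease skip treatment; everyone else is treated.

def expected_cost(base_cost, p_nonprog, sensitivity):
    """Expected cost per case; sensitivity = share of non-progressive cases flagged."""
    p_skip = p_nonprog * sensitivity
    return base_cost * (1 - p_skip)

base = 13_321
perfect = expected_cost(base, 0.40, 1.0)
print(round(perfect))                      # 7993, close to the paper's $8,005
print(expected_cost(base, 0.40, 0.8))      # an imperfect test saves less
```

The small gap versus the published $8,005 reflects model details (e.g., surveillance costs for untreated cases) not captured in this sketch.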

  12. Modeling snail breeding in a bioregenerative life support system

    NASA Astrophysics Data System (ADS)

    Kovalev, V. S.; Manukovsky, N. S.; Tikhomirov, A. A.; Kolmakova, A. A.

    2015-07-01

    The discrete-time model of snail breeding consists of two sequentially linked submodels: "Stoichiometry" and "Population". In both submodels, a snail population is split up into twelve age groups within one year of age. The first submodel is used to simulate the metabolism of a single snail in each age group via the stoichiometric equation; the second submodel is used to optimize the age structure and the size of the snail population. Daily intake of snail meat by crewmen is a guideline which specifies the population productivity. The mass exchange of the snail unit inhabited by land snails of Achatina fulica is given as an outcome of step-by-step modeling. All simulations are performed using Solver Add-In of Excel 2007.

  13. Simulation of metastatic progression using a computer model including chemotherapy and radiation therapy.

    PubMed

    Bethge, Anja; Schumacher, Udo; Wedemann, Gero

    2015-10-01

    Despite considerable research efforts, the process of metastasis formation is still a subject of intense discussion, and even established models differ considerably in basic details and in the conclusions drawn from them. Mathematical and computational models add a new perspective to the research as they can quantitatively investigate the processes of metastasis and the effects of treatment. However, existing models look at only one treatment option at a time. We enhanced a previously developed computer model (called CaTSiT) that enables quantitative comparison of different metastasis formation models with clinical and experimental data, to include the effects of chemotherapy, external beam radiation, radioimmunotherapy and radioembolization. CaTSiT is based on a discrete event simulation procedure. The growth of the primary tumor and its metastases is modeled by a piecewise-defined growth function that describes the growth behavior of the primary tumor and metastases during various time intervals. The piecewise-defined growth function is composed of analytical functions describing the growth behavior of the tumor based on characteristics of the tumor, such as dormancy, or the effects of various therapies. The spreading of malignant cells into the blood is modeled by intravasation events, which are generated according to a rate function. Further events in the model describe the behavior of the released malignant cells until the formation of a new metastasis. The model is published under the GNU General Public License version 3. To demonstrate the application of the computer model, a case of a patient with a hepatocellular carcinoma and multiple metastases in the liver was simulated. Besides the untreated case, different treatments were simulated at two time points: one directly after diagnosis of the primary tumor and the other several months later. Except for early applied radioimmunotherapy, no treatment strategy was able to eliminate all metastases. 
These results emphasize the importance of early diagnosis and of proceeding with treatment even if no clinically detectable metastases are present at the time of diagnosis of the primary tumor. CaTSiT could be a valuable tool for quantitative investigation of the process of tumor growth and metastasis formation, including the effects of various treatment options. Copyright © 2015 Elsevier Inc. All rights reserved.
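The piecewise-defined growth function at the heart of CaTSiT can be sketched as Gompertz growth interrupted by an exponential-kill interval while a therapy is active. All parameter values below are invented for illustration; the actual model composes many such segments per tumor and metastasis:

```python
import math

# Piecewise growth: Gompertz segments before and after a therapy window,
# exponential cell kill inside it. Parameters are illustrative only.

def tumour_cells(t, n0=1e6, a=0.08, b=1e11, therapy=(200.0, 260.0), kill=0.05):
    """Cell count at time t (days) for one tumor with a single treatment window."""
    def gompertz(n, dt):
        # Gompertz growth from n over dt days toward carrying capacity b.
        return b * math.exp(math.log(n / b) * math.exp(-a * dt))

    start, end = therapy
    if t <= start:
        return gompertz(n0, t)
    n_start = gompertz(n0, start)
    if t <= end:
        return n_start * math.exp(-kill * (t - start))   # therapy interval
    n_end = n_start * math.exp(-kill * (end - start))
    return gompertz(n_end, t - end)                      # regrowth after therapy

print(tumour_cells(100) > tumour_cells(0))        # True: growth before therapy
print(tumour_cells(260) < tumour_cells(200))      # True: shrinkage under therapy
```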

  14. A multilingual audiometer simulator software for training purposes.

    PubMed

    Kompis, Martin; Steffen, Pascal; Caversaccio, Marco; Brugger, Urs; Oesch, Ivo

    2012-04-01

    A set of algorithms, which allows a computer to determine the answers of simulated patients during pure tone and speech audiometry, is presented. Based on these algorithms, a computer program for training in audiometry was written and found to be useful for teaching purposes. To develop a flexible audiometer simulator software as a teaching and training tool for pure tone and speech audiometry, both with and without masking. First a set of algorithms, which allows a computer to determine the answers of a simulated, hearing-impaired patient, was developed. Then, the software was implemented. Extensive use was made of simple, editable text files to define all texts in the user interface and all patient definitions. The software 'audiometer simulator' is available for free download. It can be used to train pure tone audiometry (both with and without masking), speech audiometry, measurement of the uncomfortable level, and simple simulation tests. Due to the use of text files, the user can alter or add patient definitions and all texts and labels shown on the screen. So far, English, French, German, and Portuguese user interfaces are available and the user can choose between German or French speech audiometry.
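The core of such a simulated patient can be sketched simply: the "patient" responds whenever the presented level reaches a hidden threshold. This is a minimal illustration with an invented audiogram, not the paper's algorithms, which also cover masking, bone conduction and speech audiometry:

```python
# Minimal simulated audiometry patient: responds when a pure tone at a given
# frequency reaches its hidden hearing threshold. Thresholds are invented.

THRESHOLDS_DB = {250: 15, 500: 20, 1000: 35, 2000: 50, 4000: 65}

def responds(freq_hz: int, level_db: int) -> bool:
    """True if the simulated patient hears a pure tone at this level."""
    return level_db >= THRESHOLDS_DB[freq_hz]

def find_threshold(freq_hz: int) -> int:
    """Simple ascending search in 5 dB steps, as a trainee might perform it."""
    level = -10
    while not responds(freq_hz, level):
        level += 5
    return level

print(find_threshold(1000))  # 35
```

Defining the patient as a data table is what makes the text-file-driven patient definitions described above possible.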

  15. High fidelity simulations of infrared imagery with animated characters

    NASA Astrophysics Data System (ADS)

    Näsström, F.; Persson, A.; Bergström, D.; Berggren, J.; Hedström, J.; Allvar, J.; Karlsson, M.

    2012-06-01

    High fidelity simulations of IR signatures and imagery tend to be slow and do not have effective support for animation of characters. Simplified rendering methods based on computer graphics methods can be used to overcome these limitations. This paper presents a method to combine these tools and produce simulated high fidelity thermal IR data of animated people in terrain. Infrared signatures for human characters have been calculated using RadThermIR. To handle multiple character models, these calculations use a simplified material model for the anatomy and clothing. Weather and temperature conditions match the IR-texture used in the terrain model. The calculated signatures are applied to the animated 3D characters that, together with the terrain model, are used to produce high fidelity IR imagery of people or crowds. For high level animation control and crowd simulations, HLAS (High Level Animation System) has been developed. There are tools available to create and visualize skeleton based animations, but tools that allow control of the animated characters on a higher level, e.g. for crowd simulation, are usually expensive and closed source. We need the flexibility of HLAS to add animation into an HLA enabled sensor system simulation framework.

  16. Formalism of photons in a nonlinear microring resonator

    NASA Astrophysics Data System (ADS)

    Tran, Quang Loc; Yupapin, Preecha

    2018-03-01

In this paper, using short Gaussian pulses input from a monochromatic light source, we simulate the photon distribution and analyse the output gate signals of the PANDA nonlinear ring resonator. The present analysis is restricted to directional couplers characterized by two parameters, the power coupling coefficient κ and the power coupling loss γ. Add/drop filters are also employed and investigated for their suitability for implementation in practical communication systems. The experiment was conducted using a combination of the Lumerical FDTD Solutions and Lumerical MODE Solutions software.

  17. Electromelting of confined monolayer ice.

    PubMed

    Qiu, Hu; Guo, Wanlin

    2013-05-10

In sharp contrast to the prevailing view that electric fields promote water freezing, here we show by molecular dynamics simulations that monolayer ice confined between two parallel plates can melt into liquid water under a perpendicularly applied electric field. The melting temperature of the monolayer ice decreases with increasing strength of the external field, owing to field-induced disruption of the well-ordered hydrogen-bond network created by the water-wall interaction. This electromelting process should add an important new ingredient to the physics of water.

  18. Domestic Ice Breaking (DOMICE) Simulation Model User Guide

    DTIC Science & Technology

    2013-02-01

    Second, add new ice data to the variable “D9 Historical Ice Data (SIGRID Coded) NBL Waterways” (D9_historical_ice_d3), which contains the...within that “ NBL ” scheme. The interpretation of the SIGRID ice codes into ice thickness estimates is also contained within the sub- module “District 9...User Guide)  “D9 Historical Ice Data (SIGRID Coded) NBL Waterways” (see Section 5.1.1.3.2 of this User Guide)  “Historical District 1 Weekly Air

  19. Design, Simulation, Fabrication and Testing of a Bio-Inspired Amphibious Robot with Multiple Modes of Mobility

    DTIC Science & Technology

    2012-01-01

performance. Obstacle climbing using the tail is compared to results from a previous robot with a posterior body segment and body flexion joint. Actual...3. Mechanisms of Locomotion for Multi-Modal Mobility 3.1. Gait and Tail Design Demands of multi-modal locomotion motivated a quadruped design for...tail instead of a rear body segment simplifies waterproofing design requirements and adds stability both on land and in water. This new morphology is

  20. Noise-enhanced CVQKD with untrusted source

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoqun; Huang, Chunhui

    2017-06-01

    The performance of one-way and two-way continuous variable quantum key distribution (CVQKD) protocols can be increased by adding some noise on the reconciliation side. In this paper, we propose to add noise at the reconciliation end to improve the performance of CVQKD with untrusted source. We derive the key rate of this case and analyze the impact of the additive noise. The simulation results show that the optimal additive noise can improve the performance of the system in terms of maximum transmission distance and tolerable excess noise.

  1. Primary Biliary Cholangitis: advances in management and treatment of the disease.

    PubMed

    Invernizzi, Pietro; Floreani, Annarosa; Carbone, Marco; Marzioni, Marco; Craxi, Antonio; Muratori, Luigi; Vespasiani Gentilucci, Umberto; Gardini, Ivan; Gasbarrini, Antonio; Kruger, Paola; Mennini, Francesco Saverio; Ronco, Virginia; Lanati, Elena; Canonico, Pier Luigi; Alvaro, Domenico

    2017-08-01

Primary Biliary Cholangitis, previously known as Primary Biliary Cirrhosis, is a rare disease which mainly affects women in their fifth to seventh decades of life. It is a chronic autoimmune disease characterized by progressive damage of the interlobular bile ducts leading to ductopenia, chronic cholestasis and bile acid retention. Although the disease usually presents a long asymptomatic phase and slow progression, in many patients it may progress faster toward cirrhosis and its complications. The 10-year mortality is greater than in diseases such as human immunodeficiency virus/Hepatitis C Virus coinfection and breast cancer. Ursodeoxycholic acid is the only treatment available today; although it is effective in counteracting disease progression for the majority of patients, in approximately 40% it is not able to effectively decrease alkaline phosphatase, a surrogate marker of disease activity. Recently, obeticholic acid received European Medicines Agency conditional approval as an add-on treatment in patients who do not respond to or are intolerant of ursodeoxycholic acid. The present paper illustrates the opinion of a working group, composed of clinical pharmacologists, gastroenterologists/hepatologists with specific expertise in Primary Biliary Cholangitis, and patient associations, on the state of the art and future perspectives of disease management. Agreement on the document was reached through an Expert Meeting. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Genetic Analysis of Oncorhynchus Nerka : 1991 Annual Progress Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brannon, E.L.; Setter, A.L.; Welsh, T.L.

    1992-01-01

    The 1990 project to develop DNA assessment techniques for the purpose of determining relationships among populations of Oncorhynchus nerka demonstrated differences that had potential for such application. The work was continued in 1991 with specific application of the techniques to develop DNA probes that could be used in separating populations of O. nerka associated with the lakes in the upper Salmon River, principally those in Redfish Lake. Research included sockeye-kokanee life history studies that might add supporting evidence for assessing the degree of difference or similarity among populations in the lake systems. This report summarizes the annual activities under the work plan.

  3. Structural Analysis in a Conceptual Design Framework

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.

    2012-01-01

    Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.

  4. Angelcare mobile system: homecare patient monitoring using bluetooth and GPRS.

    PubMed

    Ribeiro, Anna G D; Maitelli, Andre L; Valentim, Ricardo A M; Brandao, Glaucio B; Guerreiro, Ana M G

    2010-01-01

    Rapid technological progress has brought new paradigms to computing, and with them many benefits to society. The paradigm of ubiquitous computing applies computing to people's daily lives without being noticed, combining several existing technologies such as wireless communications and sensors. Several of these benefits have reached the medical area, bringing new methods for surgery, appointments and examinations. This work presents telemedicine software that adds the idea of ubiquity to the medical area, innovating the relation between doctor and patient. It also brings security and confidence to a patient being monitored in homecare.

  5. Fuzzy set methods for object recognition in space applications

    NASA Technical Reports Server (NTRS)

    Keller, James M.

    1992-01-01

    Progress on the following tasks is reported: feature calculation; membership calculation; clustering methods (including initial experiments on pose estimation); and acquisition of images (including camera calibration information for digitization of the model). The report consists of 'stand alone' sections describing the activities in each task. We would like to highlight that during this quarter we believe we have made a major breakthrough in the area of fuzzy clustering: we have discovered a method to remove the probabilistic constraint that the memberships of each point must sum to 1 across all classes (as in the fuzzy c-means). A paper describing this approach is included.
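    The probabilistic constraint mentioned above is the fuzzy c-means requirement that each data point's memberships sum to 1 across the c classes. A minimal sketch of the standard membership update that enforces this constraint (our illustration, not the project's code; the data and centers are made up):

```python
import numpy as np

def fcm_memberships(X, centers, m=2.0):
    """Fuzzy c-means membership update: u[i, j] is the membership of
    point j in cluster i, constrained to sum to 1 over clusters."""
    # squared distances from each center to each point, shape (c, n)
    d2 = ((X[None, :, :] - centers[:, None, :]) ** 2).sum(axis=2)
    d2 = np.maximum(d2, 1e-12)          # guard against division by zero
    inv = d2 ** (-1.0 / (m - 1.0))      # u_ij proportional to d_ij^(-2/(m-1))
    return inv / inv.sum(axis=0, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
centers = np.array([[-1.0, 0.0], [1.0, 0.0]])
U = fcm_memberships(X, centers)
# the probabilistic constraint: each column (point) sums to 1
print(np.allclose(U.sum(axis=0), 1.0))  # True
```

Removing this sum-to-1 normalization (as in possibilistic clustering variants) lets memberships behave as absolute degrees of typicality rather than relative shares.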

  6. A Competence-Based Science Learning Framework Illustrated Through the Study of Natural Hazards and Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Oyao, Sheila G.; Holbrook, Jack; Rannikmäe, Miia; Pagunsan, Marmon M.

    2015-09-01

    This article proposes a competence-based learning framework for science teaching, applied to the study of 'big ideas', in this case to the study of natural hazards and disaster risk reduction (NH&DRR). The framework focuses on new visions of competence, placing emphasis on nurturing connectedness and behavioral actions toward resilience and sustainability. The framework draws together competences familiarly expressed as cognitive knowledge and skills, plus dispositions and adds connectedness and action-related behaviors, and applies this by means of a progression shift associated with NH&DRR from abilities to capabilities. The target is enhanced scientific literacy approached through an education through science focus, amplified through the study of a big idea, promotion of sustained resilience in the face of disaster and the taking of responsibilities for behavioral actions. The framework is applied to a learning progression for each interrelated education dimension, thus serving as a guide for both the development of abilities and as a platform for stimulating student capabilities within instruction and assessment.

  7. Serum CEACAM1 Elevation Correlates with Melanoma Progression and Failure to Respond to Adoptive Cell Transfer Immunotherapy

    PubMed Central

    Ortenberg, R.; Sapoznik, S.; Zippel, D.; Shapira-Frommer, R.; Itzhaki, O.; Kubi, A.; Zikich, D.; Besser, M. J.; Schachter, J.; Markel, G.

    2015-01-01

    Malignant melanoma is a devastating disease whose incidence is continuously rising. The recently approved antimelanoma therapies carry new hope for metastatic patients for the first time in decades. However, the clinical management of melanoma is severely hampered by the absence of effective screening tools. The expression of the CEACAM1 adhesion molecule on melanoma cells is a strong predictor of poor prognosis. Interestingly, a melanoma-secreted form of CEACAM1 (sCEACAM1) has recently emerged as a potential tumor biomarker. Here we add novel evidence supporting the prognostic role of serum CEACAM1 by using a mouse xenograft model of human melanoma and showing a correlation between serum CEACAM1 and tumor burden. Moreover, we demonstrate that serum CEACAM1 is elevated over time in progressive melanoma patients who fail to respond to immunotherapy, as opposed to responders and stable-disease patients, establishing a correlation between sCEACAM1, response to treatment, and clinical deterioration. PMID:26688824

  8. Text mining for neuroanatomy using WhiteText with an updated corpus and a new web application

    PubMed Central

    French, Leon; Liu, Po; Marais, Olivia; Koreman, Tianna; Tseng, Lucia; Lai, Artemis; Pavlidis, Paul

    2015-01-01

    We describe the WhiteText project, and its progress towards automatically extracting statements of neuroanatomical connectivity from text. We review progress to date on the three main steps of the project: recognition of brain region mentions, standardization of brain region mentions to neuroanatomical nomenclature, and connectivity statement extraction. We further describe a new version of our manually curated corpus that adds 2,111 connectivity statements from 1,828 additional abstracts. Cross-validation classification within the new corpus replicates results on our original corpus, recalling 67% of connectivity statements at 51% precision. The resulting merged corpus provides 5,208 connectivity statements that can be used to seed species-specific connectivity matrices and to better train automated techniques. Finally, we present a new web application that allows fast interactive browsing of the over 70,000 sentences indexed by the system, as a tool for accessing the data and assisting in further curation. Software and data are freely available at http://www.chibi.ubc.ca/WhiteText/. PMID:26052282
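    The reported cross-validation figures (recalling 67% of connectivity statements at 51% precision) follow the usual definitions over extracted statements. A small sketch, with hypothetical true-positive, false-positive and false-negative counts chosen only to reproduce those rates:

```python
def precision_recall(tp, fp, fn):
    """Precision = TP/(TP+FP); recall = TP/(TP+FN)."""
    return tp / (tp + fp), tp / (tp + fn)

# Hypothetical counts consistent with ~51% precision / ~67% recall:
p, r = precision_recall(tp=67, fp=64, fn=33)
print(round(p, 2), round(r, 2))  # 0.51 0.67
```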

  9. Computational Wear Simulation of Patellofemoral Articular Cartilage during In Vitro Testing

    PubMed Central

    Li, Lingmin; Patil, Shantanu; Steklov, Nick; Bae, Won; Temple-Wong, Michele; D'Lima, Darryl D.; Sah, Robert L.; Fregly, Benjamin J.

    2011-01-01

    Though changes in normal joint motions and loads (e.g., following anterior cruciate ligament injury) contribute to the development of knee osteoarthritis, the precise mechanism by which these changes induce osteoarthritis remains unknown. As a first step toward identifying this mechanism, this study evaluates computational wear simulations of a patellofemoral joint specimen wear tested on a knee simulator machine. A multi-body dynamic model of the specimen mounted in the simulator machine was constructed in commercial computer-aided engineering software. A custom elastic foundation contact model was used to calculate contact pressures and wear on the femoral and patellar articular surfaces using geometry created from laser scan and MR data. Two different wear simulation approaches were investigated – one that wore the surface geometries gradually over a sequence of 10 one-cycle dynamic simulations (termed the “progressive” approach), and one that wore the surface geometries abruptly using results from a single one-cycle dynamic simulation (termed the “non-progressive” approach). The progressive approach with laser scan geometry reproduced the experimentally measured wear depths and areas for both the femur and patella. The less costly non-progressive approach predicted deeper wear depths, especially on the patella, but had little influence on predicted wear areas. Use of MR data for creating the articular and subchondral bone geometry altered wear depth and area predictions by at most 13%. These results suggest that MR-derived geometry may be sufficient for simulating articular cartilage wear in vivo and that a progressive simulation approach may be needed for the patella and tibia since both remain in continuous contact with the femur. PMID:21453922
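    The gap between the two approaches can be illustrated with a one-dimensional elastic-foundation sketch (our simplification, not the authors' model; the stiffness, wear coefficient and geometry are made-up values): the progressive approach recomputes contact pressure as the worn surface recedes, while the non-progressive approach scales one cycle's wear by the cycle count and therefore overestimates depth.

```python
import numpy as np

# 1-D elastic foundation under a fixed imposed approach (illustrative values)
K = 10.0          # foundation stiffness (pressure per unit deformation)
k_wear = 1e-3     # wear coefficient (depth per unit pressure per cycle)
cycles = 100
x = np.linspace(-1.0, 1.0, 201)
gap = x**2        # parabolic gap between the undeformed surfaces
delta = 0.5       # imposed approach of the surfaces

def pressure(wear):
    # contact pressure wherever the worn surface still interpenetrates
    return K * np.maximum(delta - gap - wear, 0.0)

# non-progressive: wear from one cycle's pressure, scaled by all cycles
w_nonprog = cycles * k_wear * pressure(np.zeros_like(x))

# progressive: accumulate wear and recompute pressure every cycle
w_prog = np.zeros_like(x)
for _ in range(cycles):
    w_prog += k_wear * pressure(w_prog)

print(w_nonprog.max() > w_prog.max())  # True: non-progressive wears deeper
```

Because progressive wear relieves the contact pressure that drives further wear, the predicted depth saturates, mirroring the shallower depths the progressive simulations reproduced experimentally.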

  10. The VIIRS Ocean Data Simulator Enhancements and Results

    NASA Technical Reports Server (NTRS)

    Robinson, Wayne D.; Patt, Fredrick S.; Franz, Bryan A.; Turpie, Kevin R.; McClain, Charles R.

    2011-01-01

    The VIIRS Ocean Science Team (VOST) has been developing an Ocean Data Simulator to create realistic VIIRS SDR datasets based on MODIS water-leaving radiances. The simulator is helping to assess instrument performance and scientific processing algorithms. Several changes were made in the last two years to complete the simulator and broaden its usefulness. The simulator is now fully functional and includes all sensor characteristics measured during prelaunch testing, including electronic and optical crosstalk influences, polarization sensitivity, and relative spectral response. Also included is the simulation of cloud and land radiances to make more realistic data sets and to understand their important influence on nearby ocean color data. The atmospheric tables used in the processing, including aerosol and Rayleigh reflectance coefficients, have been modeled using VIIRS relative spectral responses. The capabilities of the simulator were expanded to work in an unaggregated sample mode and to produce scans with additional samples beyond the standard scan. These features improve the capability to realistically add artifacts which act upon individual instrument samples prior to aggregation and which may originate from beyond the actual scan boundaries. The simulator was expanded to simulate all 16 M-bands and the EDR processing was improved to use these bands to make an SST product. The simulator is being used to generate global VIIRS data from and in parallel with the MODIS Aqua data stream. Studies have been conducted using the simulator to investigate the impact of instrument artifacts. This paper discusses the simulator improvements and results from the artifact impact studies.

  11. The VIIRS ocean data simulator enhancements and results

    NASA Astrophysics Data System (ADS)

    Robinson, Wayne D.; Patt, Frederick S.; Franz, Bryan A.; Turpie, Kevin R.; McClain, Charles R.

    2011-10-01

    The VIIRS Ocean Science Team (VOST) has been developing an Ocean Data Simulator to create realistic VIIRS SDR datasets based on MODIS water-leaving radiances. The simulator is helping to assess instrument performance and scientific processing algorithms. Several changes were made in the last two years to complete the simulator and broaden its usefulness. The simulator is now fully functional and includes all sensor characteristics measured during prelaunch testing, including electronic and optical crosstalk influences, polarization sensitivity, and relative spectral response. Also included is the simulation of cloud and land radiances to make more realistic data sets and to understand their important influence on nearby ocean color data. The atmospheric tables used in the processing, including aerosol and Rayleigh reflectance coefficients, have been modeled using VIIRS relative spectral responses. The capabilities of the simulator were expanded to work in an unaggregated sample mode and to produce scans with additional samples beyond the standard scan. These features improve the capability to realistically add artifacts which act upon individual instrument samples prior to aggregation and which may originate from beyond the actual scan boundaries. The simulator was expanded to simulate all 16 M-bands and the EDR processing was improved to use these bands to make an SST product. The simulator is being used to generate global VIIRS data from and in parallel with the MODIS Aqua data stream. Studies have been conducted using the simulator to investigate the impact of instrument artifacts. This paper discusses the simulator improvements and results from the artifact impact studies.

  12. Simulation of the Interactions Between Gamma-Rays and Detectors Using BSIMUL

    NASA Technical Reports Server (NTRS)

    Haywood, S. E.; Rester, A. C., Jr.

    1996-01-01

    Progress made during 1995 on the Monte-Carlo gamma-ray spectrum simulation program BSIMUL is discussed. Several features have been added, including the ability to model shields that are tapered cylinders. Several simulations were made of the Near Earth Asteroid Rendezvous detector.

  13. Implementation of interconnect simulation tools in spice

    NASA Technical Reports Server (NTRS)

    Satsangi, H.; Schutt-Aine, J. E.

    1993-01-01

    Accurate computer simulation of high speed digital computer circuits and communication circuits requires a multimode approach to simulate both the devices and the interconnects between devices. Classical circuit analysis algorithms (lumped parameter) are needed for circuit devices and the network formed by the interconnected devices. The interconnects, however, have to be modeled as transmission lines which incorporate electromagnetic field analysis. An approach to writing a multimode simulator is to take an existing software package which performs either lumped parameter analysis or field analysis and add the missing type of analysis routines to the package. In this work a traditionally lumped parameter simulator, SPICE, is modified so that it will perform lossy transmission line analysis using a different model approach. Modifying SPICE3E2 or any other large software package is not a trivial task. An understanding of the programming conventions used, simulation software, and simulation algorithms is required. This thesis was written to clarify the procedure for installing a device into SPICE3E2. The installation of three devices is documented and the installations of the first two provide a foundation for installation of the lossy line which is the third device. The details of discussions are specific to SPICE, but the concepts will be helpful when performing installations into other circuit analysis packages.

  14. Compact, self-contained enhanced-vision system (EVS) sensor simulator

    NASA Astrophysics Data System (ADS)

    Tiana, Carlo

    2007-04-01

    We describe the model SIM-100 PC-based simulator, for imaging sensors used, or planned for use, in Enhanced Vision System (EVS) applications. Typically housed in a small-form-factor PC, it can be easily integrated into existing out-the-window visual simulators for fixed-wing or rotorcraft, to add realistic sensor imagery to the simulator cockpit. Multiple bands of infrared (short-wave, midwave, extended-midwave and longwave) as well as active millimeter-wave RADAR systems can all be simulated in real time. Various aspects of physical and electronic image formation and processing in the sensor are accurately (and optionally) simulated, including sensor random and fixed pattern noise, dead pixels, blooming, B-C scope transformation (MMWR). The effects of various obscurants (fog, rain, etc.) on the sensor imagery are faithfully represented and can be selected by an operator remotely and in real-time. The images generated by the system are ideally suited for many applications, ranging from sensor development engineering tradeoffs (Field Of View, resolution, etc.), to pilot familiarization and operational training, and certification support. The realistic appearance of the simulated images goes well beyond that of currently deployed systems, and beyond that required by certification authorities; this level of realism will become necessary as operational experience with EVS systems grows.

  15. Psychometric properties of the Aberrant Behavior Checklist, the Anxiety, Depression and Mood Scale, the Assessment of Dual Diagnosis and the Social Performance Survey Schedule in adults with intellectual disabilities.

    PubMed

    Rojahn, Johannes; Rowe, Ellen W; Kasdan, Shana; Moore, Linda; van Ingen, Daniel J

    2011-01-01

    Progress in clinical research and in empirically supported interventions in the area of psychopathology in intellectual disabilities (ID) depends on high-quality assessment instruments. To this end, psychometric properties of four instruments were examined: the Aberrant Behavior Checklist (ABC), the Assessment of Dual Diagnosis (ADD), the Anxiety, Depression and Mood Scale (ADAMS), and the Social Performance Survey Schedule (SPSS). Data were collected in two community-based groups of adults with mild to profound ID (n = 263). Subscale reliability (internal consistency) ranged from fair to excellent for the ABC, the ADAMS, and the SPSS: the mean coefficient α across the ABC subscales was .87 (ranging from fair to excellent), across the ADAMS subscales .83 (ranging from fair to good), and across the SPSS subscales .91 (ranging from good to excellent). The ADD subscales had generally lower reliability scores, with a mean of .59 (ranging from unacceptable to good). Convergent and discriminant validity was determined by bivariate Spearman ρ correlations between subscales of one instrument and the subscales of the other three instruments. For the most part, all four instruments showed solid convergent and discriminant validity. To examine the factorial validity, Confirmatory Factor Analyses (CFA) were attempted with the inter-item covariance matrix of each instrument. Generally, the data did not show good fits with the measurement models for the SPSS, ABC, or the ADAMS (CFA analyses with the ADD would not converge). However, most of the items on these three instruments had significant loadings on their respective factors. Copyright © 2011 Elsevier Ltd. All rights reserved.
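    The internal-consistency statistic quoted above is coefficient (Cronbach's) α, computed from an item-score matrix as α = k/(k−1) · (1 − Σ item variances / variance of the total score). A sketch with a hypothetical 5-item subscale (the scores below are invented, not study data):

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# hypothetical 5-item subscale scored by 6 respondents
scores = np.array([
    [1, 1, 1, 1, 1],
    [2, 2, 3, 2, 2],
    [3, 3, 3, 3, 2],
    [4, 4, 4, 3, 4],
    [5, 4, 5, 5, 4],
    [2, 3, 2, 2, 3],
])
print(round(cronbach_alpha(scores), 2))  # high alpha: items covary strongly
```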

  16. CERT TST November 2016 Visit Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, Robert Currier; Bailey, Teresa S.; Kahler, III, Albert Comstock

    2017-04-27

    The dozen-plus presentations covered the span of the Center’s activities, including experimental progress, simulations of the experiments (both for calibration and validation), UQ analysis, nuclear data impacts, status of simulation codes, methods development, computational science progress, and plans for upcoming priorities. All three institutions comprising the Center (Texas A&M, University of Colorado Boulder, and Simon Fraser University) were represented. Center-supported students not only gave two of the oral presentations but also highlighted their research in a number of excellent posters.

  17. STS-26 simulation activities in JSC Mission Control Center (MCC)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    In JSC Mission Control Center (MCC) Bldg 30 Flight Control Room (FCR), astronauts John O. Creighton (right) and L. Blaine Hammond review their notes while serving as spacecraft communicators (CAPCOMs) for STS-26 simulations in progress between MCC and JSC Mission Simulation and Training Facility Bldg 5 fixed-base (FB) shuttle mission simulator (SMS).

  18. STS-26 simulation activities in JSC Mission Control Center (MCC)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    In JSC Mission Control Center (MCC) Bldg 30 Flight Control Room (FCR), flight directors (FDs) Lee Briscoe (left) and Charles W. Shaw, seated at FD console, view front visual display monitors during STS-26 simulations in progress between MCC and JSC Mission Simulation and Training Facility Bldg 5 fixed-base (FB) shuttle mission simulator (SMS).

  19. Toward Endemic Deployment of Educational Simulation Games: A Review of Progress and Future Recommendations

    ERIC Educational Resources Information Center

    Moizer, Jonathan; Lean, Jonathan

    2010-01-01

    This article presents a conceptual analysis of simulation game adoption and use across university faculty. The metaphor of epidemiology is used to characterize the diffusion of simulation games for teaching and learning. A simple stock-flow diagram is presented to illustrate this dynamic. Future scenarios for simulation game adoption are…

  20. Reporting the national antimicrobial consumption in Danish pigs: influence of assigned daily dosage values and population measurement.

    PubMed

    Dupont, Nana; Fertner, Mette; Kristensen, Charlotte Sonne; Toft, Nils; Stege, Helle

    2016-05-03

    Transparent calculation methods are crucial when investigating trends in antimicrobial consumption over time and between populations. Until 2011, one single standardized method was applied when quantifying the Danish pig antimicrobial consumption with the unit "Animal Daily Dose" (ADD). However, two new methods for assigning values for ADDs have recently emerged, one implemented by DANMAP, responsible for publishing annual reports on antimicrobial consumption, and one by the Danish Veterinary and Food Administration (DVFA), responsible for the Yellow Card initiative. In addition to new ADD assignment methods, Denmark has also experienced a shift in the production pattern, towards a larger export of live pigs. The aims of this paper were to (1) describe previous and current ADD assignment methods used by the major Danish institutions and (2) to illustrate how ADD assignment method and choice of population and population measurement affect the calculated national antimicrobial consumption in pigs (2007-2013). The old VetStat ADD-values were based on SPCs in contrast to the new ADD-values, which were based on active compound, concentration and administration route. The new ADD-values stated by both DANMAP and DVFA were only identical for 48 % of antimicrobial products approved for use in pigs. From 2007 to 2013, the total number of ADDs per year increased by 9 % when using the new DVFA ADD-values, but decreased by 2 and 7 % when using the new DANMAP ADD-values or the old VetStat ADD-values, respectively. Through 2007 to 2013, the production of pigs increased from 26.1 million pigs per year with 18 % exported live to 28.7 million with 34 % exported live. In the same time span, the annual pig antimicrobial consumption increased by 22.2 %, when calculated using the new DVFA ADD-values and pigs slaughtered per year as population measurement (13.0 ADDs/pig/year to 15.9 ADDs/pig/year). 
However, when based on the old VetStat ADD values and pigs produced per year (including live export), a 10.9 % decrease was seen (10.6 ADDs/pig/year to 9.4 ADDs/pig/year). The findings of this paper clearly highlight that calculated national antimicrobial consumption is highly affected by chosen population measurement and the applied ADD-values.
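    The sensitivity to the applied ADD-values and population measurement reduces to simple arithmetic: consumption per pig is total ADDs divided by the chosen population, so both the numerator (which ADD-values are assigned) and the denominator (pigs slaughtered vs. pigs produced including live export) shift the trend. Using the per-pig figures quoted above (small rounding differences versus the published percentages are expected, since those were computed from unrounded totals):

```python
def pct_change(old, new):
    """Percent change between two per-pig consumption figures."""
    return 100.0 * (new - old) / old

# New DVFA ADD-values, pigs slaughtered per year as denominator (2007 vs 2013)
print(round(pct_change(13.0, 15.9), 1))  # 22.3 (reported as 22.2%)
# Old VetStat ADD-values, pigs produced per year incl. live export (2007 vs 2013)
print(round(pct_change(10.6, 9.4), 1))   # -11.3 (reported as -10.9%)
```

The same antimicrobial use thus appears as either a double-digit increase or a double-digit decrease depending solely on the calculation conventions.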

  1. Microscopic Car Modeling for Intelligent Traffic and Scenario Generation in the UCF Driving Simulator : Year 2

    DOT National Transportation Integrated Search

    2000-01-01

    A multi-year project was initiated to introduce autonomous vehicles in the University of Central Florida (UCF) Driving Simulator for real-time interaction with the simulator vehicle. This report describes the progress during the second year. In the f...

  2. Attention Deficit Disorder. NICHCY Briefing Paper.

    ERIC Educational Resources Information Center

    Fowler, Mary

    This briefing paper uses a question-and-answer format to provide basic information about children with attention deficit disorder (ADD). Questions address the following concerns: nature and incidence of ADD; causes of ADD; signs of ADD (impulsivity, hyperactivity, disorganization, social skill deficits); the diagnostic ADD assessment; how to get…

  3. Progressive Fracture of Fiber Composite Thin Shell Structures Under Internal Pressure and Axial Loads

    NASA Technical Reports Server (NTRS)

    Gotsis, Pascal K.; Chamis, Christos C.; Minnetyan, Levon

    1996-01-01

    Graphite/epoxy composite thin shell structures were simulated to investigate damage and fracture progression due to internal pressure and axial loading. Defective and defect-free structures (thin cylinders) were examined. The three different laminates examined had fiber orientations of (90/0/±θ)s, where θ is 45, 60, and 75 deg. CODSTRAN, an integrated computer code that scales up constituent-level properties to the structural level and accounts for all possible failure modes, was used to simulate composite degradation under loading. Damage initiation, growth, accumulation, and propagation to fracture were included in the simulation. Burst pressures for defective and defect-free shells were compared to evaluate damage tolerance. The results showed that damage initiation began with matrix failure, whereas damage and/or fracture progression occurred as a result of additional matrix failure and fiber fracture. In both thin-cylinder cases examined (defective and defect-free), the optimum layup configuration was (90/0/±60)s because it had the best damage tolerance with respect to the burst pressure.

  4. Progress of the equation of state table for supernova simulations and its influence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sumiyoshi, Kohsuke

    2012-11-12

    We describe recent progress of the EOS tables for numerical simulations of core-collapse supernovae and related astrophysical phenomena. Based on the Shen EOS table, which has been widely used in supernova simulations, there is systematic progress by extending the degrees of freedom such as hyperons and quarks. These extended EOS tables have been used, for example, to study the neutrino bursts from the gravitational collapse of massive stars leading to black hole formation. Observations of such neutrinos from galactic events in the future will provide us with information on the EOS. Recently, studies of the supernova EOS with a multi-composition of nuclei under nuclear statistical equilibrium have been made beyond the single-nucleus approximation used in the Shen EOS. It has been found that light elements including deuterons are abundant in wide regions of the supernova cores. We discuss that neutrino-deuteron reactions may have a possible influence on the explosion mechanism through modifications of neutrino heating rates.

  5. Development and application of the dynamic system doctor to nuclear reactor probabilistic risk assessments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kunsman, David Marvin; Aldemir, Tunc; Rutt, Benjamin

    2008-05-01

    This LDRD project has produced a tool that makes probabilistic risk assessments (PRAs) of nuclear reactors - analyses which are very resource intensive - more efficient. PRAs of nuclear reactors are being increasingly relied on by the United States Nuclear Regulatory Commission (U.S.N.R.C.) for licensing decisions for current and advanced reactors. Yet, PRAs are produced much as they were 20 years ago. The work here applied a modern systems analysis technique to the accident progression analysis portion of the PRA; the technique was a system-independent multi-task computer driver routine. Initially, the objective of the work was to fuse the accident progression event tree (APET) portion of a PRA to the dynamic system doctor (DSD) created by Ohio State University. Instead, during the initial efforts, it was found that the DSD could be linked directly to a detailed accident progression phenomenological simulation code - the type on which APET construction and analysis relies, albeit indirectly - and thereby directly create and analyze the APET. The expanded DSD computational architecture and infrastructure that was created during this effort is called ADAPT (Analysis of Dynamic Accident Progression Trees). ADAPT is a system software infrastructure that supports execution and analysis of multiple dynamic event-tree simulations on distributed environments. A simulator abstraction layer was developed, and a generic driver was implemented for executing simulators on a distributed environment. As a demonstration of the use of the methodological tool, ADAPT was applied to quantify the likelihood of competing accident progression pathways occurring for a particular accident scenario in a particular reactor type using MELCOR, an integrated severe accident analysis code developed at Sandia. (ADAPT was intentionally created with flexibility, however, and is not limited to interacting with only one code.
With minor coding changes to input files, ADAPT can be linked to other such codes.) The results of this demonstration indicate that the approach can significantly reduce the resources required for Level 2 PRAs. From the phenomenological viewpoint, ADAPT can also treat the associated epistemic and aleatory uncertainties. This methodology can also be used for analyses of other complex systems. Any complex system can be analyzed using ADAPT if the workings of that system can be displayed as an event tree, there is a computer code that simulates how those events could progress, and that simulator code has switches to turn on and off system events, phenomena, etc. Applying ADAPT to particular problems is not independent of human expertise: while the resources needed to create and analyze the accident progression are significantly decreased, knowledgeable analysts are still necessary to apply ADAPT successfully to a given project. This research and development effort has met its original goals and then exceeded them.
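    The core dynamic event-tree idea (drive a simulator forward, branch at each stochastic event, and carry the product of branch probabilities along each pathway) can be sketched generically. This is our illustration, not ADAPT code; the branch points and probabilities are hypothetical:

```python
from itertools import product

# Hypothetical branch points: each maps a branch outcome to its probability.
branch_points = [
    {"valve_opens": 0.9, "valve_stuck": 0.1},
    {"pump_on": 0.7, "pump_fails": 0.3},
]

def enumerate_pathways(branch_points):
    """Enumerate every accident-progression pathway; each pathway's
    likelihood is the product of the branch probabilities along it."""
    pathways = []
    for combo in product(*(bp.items() for bp in branch_points)):
        names = tuple(name for name, _ in combo)
        prob = 1.0
        for _, p in combo:
            prob *= p
        pathways.append((names, prob))
    return pathways

paths = enumerate_pathways(branch_points)
for names, p in paths:
    print(names, p)
# total probability over all pathways is conserved
print(abs(sum(p for _, p in paths) - 1.0) < 1e-12)  # True
```

In a real dynamic event-tree driver the branch outcomes would toggle switches in the simulator (MELCOR in the demonstration above) and re-run it from the branch point, rather than being enumerated up front.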

  6. Establishing prostate cancer patient derived xenografts: lessons learned from older studies.

    PubMed

    Russell, Pamela J; Russell, Peter; Rudduck, Christina; Tse, Brian W C; Williams, Elizabeth D; Raghavan, Derek

    2015-05-01

    Understanding the progression of prostate cancer to androgen independence/castrate resistance and the development of preclinical testing models are important for developing new prostate cancer therapies. This report describes studies performed 30 years ago, which demonstrate the utility and shortfalls of xenografting for preclinical modeling. We subcutaneously implanted male nude mice with small prostate cancer fragments from transurethral resection of the prostate (TURP) from 29 patients. Successful xenografts were passaged into new host mice. They were characterized using histology, immunohistochemistry for marker expression, flow cytometry for ploidy status, and in some cases by electron microscopy and response to testosterone. Two xenografts were karyotyped by G-banding. Tissues from 3/29 donors (10%) gave rise to xenografts that were successfully serially passaged in vivo. Two (UCRU-PR-1, which subsequently was replaced by a mouse fibrosarcoma, and UCRU-PR-2, which combined epithelial and neuroendocrine features) have been described. The UCRU-PR-4 line was a poorly differentiated prostatic adenocarcinoma derived from a patient who had undergone estrogen therapy and bilateral castration after his cancer relapsed. Histologically, this comprised diffusely infiltrating small acinar cell carcinoma with more solid aggregates of poorly differentiated adenocarcinoma. The xenografted line showed histology consistent with a poorly differentiated adenocarcinoma and stained positively for prostatic acid phosphatase (PAcP), epithelial membrane antigen (EMA) and the cytokeratin cocktail CAM5.2, with weak staining for prostate-specific antigen (PSA). The line failed to grow in female nude mice. Castration of three male nude mice after xenograft establishment resulted in cessation of growth in one, growth regression in another and transient growth in another, suggesting that some cells had retained androgen sensitivity. 
The karyotype (from passage 1) was 43-46, XY, dic(1;12)(p11;p11), der(3)t(3:?5)(q13;q13), -5, inv(7)(p15q35) x2, +add(7)(p13), add(8)(p22), add(11)(p14), add(13)(p11), add(20)(p12), -22, +r4[cp8]. Xenografts provide a clinically relevant model of prostate cancer, although establishing serially transplantable prostate cancer patient derived xenografts is challenging and requires rigorous characterization and high quality starting material. Xenografting from advanced prostate cancer is more likely to succeed, as xenografting from well differentiated, localized disease has not been achieved in our experience. Strong translational correlations can be demonstrated between the clinical disease state and the xenograft model. © 2015 The Authors. The Prostate published by Wiley Periodicals, Inc.

  7. Establishing Prostate Cancer Patient Derived Xenografts: Lessons Learned From Older Studies

    PubMed Central

    Russell, Pamela J; Russell, Peter; Rudduck, Christina; Tse, Brian W-C; Williams, Elizabeth D; Raghavan, Derek

    2015-01-01

    Background Understanding the progression of prostate cancer to androgen-independence/castrate resistance and development of preclinical testing models are important for developing new prostate cancer therapies. This report describes studies performed 30 years ago, which demonstrate utility and shortfalls of xenografting to preclinical modeling. Methods We subcutaneously implanted male nude mice with small prostate cancer fragments from transurethral resection of the prostate (TURP) from 29 patients. Successful xenografts were passaged into new host mice. They were characterized using histology, immunohistochemistry for marker expression, flow cytometry for ploidy status, and in some cases by electron microscopy and response to testosterone. Two xenografts were karyotyped by G-banding. Results Tissues from 3/29 donors (10%) gave rise to xenografts that were successfully serially passaged in vivo. Two, (UCRU-PR-1, which subsequently was replaced by a mouse fibrosarcoma, and UCRU-PR-2, which combined epithelial and neuroendocrine features) have been described. UCRU-PR-4 line was a poorly differentiated prostatic adenocarcinoma derived from a patient who had undergone estrogen therapy and bilateral castration after his cancer relapsed. Histologically, this comprised diffusely infiltrating small acinar cell carcinoma with more solid aggregates of poorly differentiated adenocarcinoma. The xenografted line showed histology consistent with a poorly differentiated adenocarcinoma and stained positively for prostatic acid phosphatase (PAcP), epithelial membrane antigen (EMA) and the cytokeratin cocktail, CAM5.2, with weak staining for prostate specific antigen (PSA). The line failed to grow in female nude mice. Castration of three male nude mice after xenograft establishment resulted in cessation of growth in one, growth regression in another and transient growth in another, suggesting that some cells had retained androgen sensitivity. 
The karyotype (from passage 1) was 43–46, XY, dic(1;12)(p11;p11), der(3)t(3:?5)(q13;q13), -5, inv(7)(p15q35) x2, +add(7)(p13), add(8)(p22), add(11)(p14), add(13)(p11), add(20)(p12), -22, +r4[cp8]. Conclusions Xenografts provide a clinically relevant model of prostate cancer, although establishing serially transplantable prostate cancer patient derived xenografts is challenging and requires rigorous characterization and high quality starting material. Xenografting from advanced prostate cancer is more likely to succeed, as xenografting from well differentiated, localized disease has not been achieved in our experience. Strong translational correlations can be demonstrated between the clinical disease state and the xenograft model. Prostate 75: 628–636, 2015. © The Authors. The Prostate published by Wiley Periodicals, Inc. PMID:25560784

  8. Simulation Training for the Office-Based Anesthesia Team.

    PubMed

    Ritt, Richard M; Bennett, Jeffrey D; Todd, David W

    2017-05-01

    An OMS office is a complex environment. Within such an environment, a diverse scope of complex surgical procedures is performed with different levels of anesthesia, ranging from local anesthesia to general anesthesia, on patients with varying comorbidities. Optimal patient outcomes require a functional surgical and anesthetic team, who are familiar with both standard operational principles and emergency recognition and management. Offices with high volume and time pressure add further stress and potential risk to the office environment. Creating and maintaining a functional surgical and anesthetic team that is competent with a culture of patient safety and risk reduction is a significant challenge that requires time, commitment, planning, and dedication. This article focuses on the role of simulation training in office training and preparation. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Simulator Motion as a Factor in Flight Simulator Training Effectiveness.

    ERIC Educational Resources Information Center

    Jacobs, Robert S.

    The document reviews the literature concerning the training effectiveness of flight simulators and describes an experiment in progress at the University of Illinois' Institute of Aviation which is an initial attempt to develop systematically the relationship between motion cue fidelity and resultant training effectiveness. The literature review…

  10. Direct solar-pumped iodine laser amplifier

    NASA Technical Reports Server (NTRS)

    Han, K. S.

    1985-01-01

This semiannual progress report covers the period from April 1, 1985 to September 30, 1985 under NASA grant NAS1-441, entitled "Direct Solar Pumped Iodine Laser Amplifier". During this period, parametric studies of the iodine laser oscillator pumped by a Vortek simulator were carried out before the amplifier studies. The amplifier studies were postponed to the extended period following completion of the parametric studies. In addition, kinetic modeling of a solar-pumped iodine laser amplifier and experimental work on a solar-pumped dye laser amplifier are in progress. This report contains three parts: (1) the radiation characteristics of the solar simulator and the parametric characteristics of a photodissociation iodine laser continuously pumped by a Vortek solar simulator; (2) kinetic modeling of a solar-pumped iodine laser amplifier; and (3) the study of the dye laser amplifier pumped by a Tamarack solar simulator.

  11. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 
The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.

  12. Advanced Computation in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Tang, William

    2001-10-01

Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop MPP's to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 
The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.

  13. Patient-Specific Computational Modeling of Keratoconus Progression and Differential Responses to Collagen Cross-linking

    PubMed Central

    Sinha Roy, Abhijit

    2011-01-01

    Purpose. To model keratoconus (KC) progression and investigate the differential responses of central and eccentric cones to standard and alternative collagen cross-linking (CXL) patterns. Methods. Three-dimensional finite element models (FEMs) were generated with clinical tomography and IOP measurements. Graded reductions in regional corneal hyperelastic properties and thickness were imposed separately in the less affected eye of a KC patient. Topographic results, including maximum curvature and first-surface, higher-order aberrations (HOAs), were compared to those of the more affected contralateral eye. In two eyes with central and eccentric cones, a standard broad-beam CXL protocol was simulated with 200- and 300-μm treatment depths and compared to spatially graded broad-beam and cone-centered CXL simulations. Results. In a model of KC progression, maximum curvature and HOA increased as regional corneal hyperelastic properties were decreased. A topographic cone could be generated without a reduction in corneal thickness. Simulation of standard 9-mm-diameter CXL produced decreases in corneal curvature comparable to clinical reports and affected cone location. A 100-μm increase in CXL depth enhanced flattening by 24% to 34% and decreased HOA by 22% to 31%. Topographic effects were greatest with cone-centered CXL simulations. Conclusions. Progressive hyperelastic weakening of a cornea with subclinical KC produced topographic features of manifest KC. The clinical phenomenon of topographic flattening after CXL was replicated. The magnitude and higher-order optics of this response depended on IOP and the spatial distribution of stiffening relative to the cone location. Smaller diameter simulated treatments centered on the cone provided greater reductions in curvature and HOA than a standard broad-beam CXL pattern. PMID:22039252

  14. The role of simulation in mixed-methods research: a framework & application to patient safety.

    PubMed

    Guise, Jeanne-Marie; Hansen, Matthew; Lambert, William; O'Brien, Kerth

    2017-05-04

Research in patient safety is an important area of health services research and is a national priority. It is challenging to investigate rare occurrences, explore potential causes, and account for the complex, dynamic context of healthcare - yet all are required in patient safety research. Simulation technologies have become widely accepted as education and clinical tools, but have yet to become a standard tool for research. We developed a framework for research that integrates accepted patient safety models with mixed-methods research approaches and describe the performance of the framework in a working example of a large National Institutes of Health (NIH)-funded R01 investigation. This worked example of a framework in action identifies the strengths and limitations of qualitative and quantitative research approaches commonly used in health services research. Each approach builds essential layers of knowledge. We describe how the use of simulation ties these layers of knowledge together and adds new and unique dimensions of knowledge. A mixed-methods research approach that includes simulation provides a broad multi-dimensional approach to health services and patient safety research.

  15. Piloted simulation of one-on-one helicopter air combat at NOE flight levels

    NASA Technical Reports Server (NTRS)

    Lewis, M. S.; Aiken, E. W.

    1985-01-01

A piloted simulation designed to examine the effects of terrain proximity and control system design on helicopter performance during one-on-one air combat maneuvering (ACM) is discussed. The NASA Ames vertical motion simulator (VMS) and the computer generated imagery (CGI) systems were modified to allow two aircraft to be independently piloted on a single CGI data base. Engagements were begun with the blue aircraft already in a tail-chase position behind the red, and also with the two aircraft originating from positions unknown to each other. Maneuvering was very aggressive, and safety requirements for minimum altitude, separation, and maximum bank angles typical of flight test were not used. Results indicate that the presence of terrain features adds an order of complexity to the task performed over clear-air ACM and that a mix of attitude and rate command-type stability and control augmentation system (SCAS) design may be desirable. The evaluation pilots compared the simulation system design, the flight paths flown, and the tactics used favorably to actual flight test experiments.

  16. Fusion Simulation Project Workshop Report

    NASA Astrophysics Data System (ADS)

    Kritz, Arnold; Keyes, David

    2009-03-01

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved 46 physicists, applied mathematicians and computer scientists, from 21 institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a 3-day workshop in May 2007.

  17. Enhanced conformational sampling of carbohydrates by Hamiltonian replica-exchange simulation.

    PubMed

    Mishra, Sushil Kumar; Kara, Mahmut; Zacharias, Martin; Koca, Jaroslav

    2014-01-01

    Knowledge of the structure and conformational flexibility of carbohydrates in an aqueous solvent is important to improving our understanding of how carbohydrates function in biological systems. In this study, we extend a variant of the Hamiltonian replica-exchange molecular dynamics (MD) simulation to improve the conformational sampling of saccharides in an explicit solvent. During the simulations, a biasing potential along the glycosidic-dihedral linkage between the saccharide monomer units in an oligomer is applied at various levels along the replica runs to enable effective transitions between various conformations. One reference replica runs under the control of the original force field. The method was tested on disaccharide structures and further validated on biologically relevant blood group B, Lewis X and Lewis A trisaccharides. The biasing potential-based replica-exchange molecular dynamics (BP-REMD) method provided a significantly improved sampling of relevant conformational states compared with standard continuous MD simulations, with modest computational costs. Thus, the proposed BP-REMD approach adds a new dimension to existing carbohydrate conformational sampling approaches by enhancing conformational sampling in the presence of solvent molecules explicitly at relatively low computational cost.
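The replica-exchange step underlying BP-REMD-style methods is a standard Metropolis swap between neighbouring replicas that run under different biasing levels. A minimal sketch of that acceptance test, assuming the simplified single-temperature Hamiltonian-exchange form (function and parameter names are invented, not the authors' code):

```python
import math
import random

def exchange_accept(beta, e_ii, e_jj, e_ij, e_ji, rng=random.random):
    """Metropolis criterion for a Hamiltonian replica-exchange swap.

    e_ii: energy of replica i's configuration under Hamiltonian i,
    e_ij: energy of replica i's configuration under Hamiltonian j, etc.
    beta is 1/kT (both replicas share one temperature in H-REMD).
    Returns True if the swap between the two replicas is accepted.
    """
    # Energy change caused by exchanging the two configurations.
    delta = beta * ((e_ij + e_ji) - (e_ii + e_jj))
    # Always accept downhill swaps; accept uphill swaps with prob. exp(-delta).
    return delta <= 0 or rng() < math.exp(-delta)
```

In BP-REMD the Hamiltonians differ only by the level of the biasing potential on the glycosidic dihedrals, so the cross-energies are cheap to evaluate; one reference replica (zero bias) is retained for unbiased statistics.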

  18. Kinematic Evolution of Simulated Star-Forming Galaxies

    NASA Technical Reports Server (NTRS)

    Kassin, Susan A.; Brooks, Alyson; Governato, Fabio; Weiner, Benjamin J.; Gardner, Jonathan P.

    2014-01-01

    Recent observations have shown that star-forming galaxies like our own Milky Way evolve kinematically into ordered thin disks over the last approximately 8 billion years since z = 1.2, undergoing a process of "disk settling." For the first time, we study the kinematic evolution of a suite of four state of the art "zoom in" hydrodynamic simulations of galaxy formation and evolution in a fully cosmological context and compare with these observations. Until now, robust measurements of the internal kinematics of simulated galaxies were lacking as the simulations suffered from low resolution, overproduction of stars, and overly massive bulges. The current generation of simulations has made great progress in overcoming these difficulties and is ready for a kinematic analysis. We show that simulated galaxies follow the same kinematic trends as real galaxies: they progressively decrease in disordered motions (sigma(sub g)) and increase in ordered rotation (V(sub rot)) with time. The slopes of the relations between both sigma(sub g) and V(sub rot) with redshift are consistent between the simulations and the observations. In addition, the morphologies of the simulated galaxies become less disturbed with time, also consistent with observations. This match between the simulated and observed trends is a significant success for the current generation of simulations, and a first step in determining the physical processes behind disk settling.

  19. Layout-aware simulation of soft errors in sub-100 nm integrated circuits

    NASA Astrophysics Data System (ADS)

    Balbekov, A.; Gorbunov, M.; Bobkov, S.

    2016-12-01

A Single Event Transient (SET) caused by a charged particle traveling through a sensitive volume of an integrated circuit (IC) may in some cases lead to errors in digital circuits. In technologies below 180 nm, a single particle can affect multiple devices, causing multiple SETs. This adds complexity to the design of fault-tolerant devices, because schematic-level design techniques become useless without layout consideration. The most common layout mitigation technique is spatial separation of the sensitive nodes of hardened circuits. Spatial separation decreases circuit performance and increases power consumption; spacing should thus be reasonable, and its scaling follows the trend of device-dimension scaling. This paper presents the development of a SET simulation approach comprising SPICE simulation with a "double exponent" current source as the SET model. The technique uses the layout in GDSII format to locate nearby devices that can be affected by a single particle and that can share the generated charge. The developed software tool automates multiple simulations and gathers the produced data to present it as a sensitivity map. Examples of simulations of fault-tolerant cells and their sensitivity maps are presented in this paper.
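The "double exponent" current source mentioned above is conventionally written as I(t) = Q/(τf − τr) · (e^(−t/τf) − e^(−t/τr)), which injects a total charge Q with rise constant τr and fall constant τf. A minimal sketch of that pulse model (the parameter values are illustrative assumptions, not taken from the paper):

```python
import math

def set_current(t, q_coll=0.1e-12, tau_r=5e-12, tau_f=200e-12):
    """Double-exponential SET current pulse in amperes.

    q_coll : total collected charge (C); tau_r / tau_f : rise and fall
    time constants (s). Values here are illustrative only.
    """
    if t < 0:
        return 0.0
    return (q_coll / (tau_f - tau_r)) * (math.exp(-t / tau_f) - math.exp(-t / tau_r))

# Sanity check: integrating the pulse recovers the collected charge q_coll.
dt = 1e-13
q = sum(set_current(i * dt) * dt for i in range(100000))
```

In a SPICE deck the same waveform is typically realized with the exponential independent-source form, attached between the struck node and ground at each candidate location found in the GDSII layout.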

  20. Incorporation of EGPWS in the NASA Ames Research Center 747-400 Flight Simulator

    NASA Technical Reports Server (NTRS)

    Sallant, Ghislain; DeGennaro, Robert A.

    2001-01-01

The NASA Ames Research Center CAE Boeing 747-400 flight simulator is used primarily for the study of human factors in aviation safety. The simulator is constantly upgraded to maintain a configuration match to a specific United Airlines aircraft and maintains the highest level of FAA certification to ensure the credibility of research results. United's 747-400 fleet, and hence the simulator, are transitioning from the older Ground Proximity Warning System (GPWS) to the state-of-the-art Enhanced Ground Proximity Warning System (EGPWS). GPWS was an early attempt to reduce or eliminate Controlled Flight Into Terrain (CFIT). Basic GPWS alerting modes include: excessive descent rate, excessive terrain closure rate, altitude loss after takeoff, unsafe terrain clearance, excessive deviation below glideslope, advisory callouts, and windshear alerting. However, since GPWS uses the radar altimeter, which looks straight down, ample warning is not always provided. EGPWS retains all of the basic functions of GPWS but adds the ability to look ahead by comparing the aircraft position to an internal terrain database, providing additional alerting and display capabilities. This paper evaluates three methods of incorporating EGPWS in the simulator and describes the implementation and architecture of the preferred option.
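The look-ahead principle described above — projecting the flight path against a terrain database rather than relying only on the downward-looking radar altimeter — can be illustrated with a deliberately simplified check. This is a toy sketch: the thresholds, the per-second terrain profile, and the function name are invented for illustration, and real EGPWS alert envelopes are far more elaborate:

```python
def egpws_lookahead_alert(alt_ft, vs_fpm, terrain_profile_ft,
                          lookahead_s=60, clearance_ft=500):
    """Toy terrain look-ahead check in the spirit of EGPWS.

    alt_ft : current altitude (ft MSL); vs_fpm : vertical speed (ft/min);
    terrain_profile_ft : terrain elevation along track, one sample per
    second of flight. Returns an alert string or None.
    """
    for t in range(1, min(lookahead_s, len(terrain_profile_ft)) + 1):
        # Project altitude t seconds ahead assuming constant vertical speed.
        projected_alt = alt_ft + (vs_fpm / 60.0) * t
        if projected_alt - terrain_profile_ft[t - 1] < clearance_ft:
            return "TERRAIN AHEAD"
    return None
```

For example, level flight at 3000 ft toward 2600 ft terrain trips the alert (400 ft predicted clearance), while the same flight over 1000 ft terrain does not.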

  1. Conservative GRMHD simulations of moderately thin, tilted accretion disks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teixeira, Danilo Morales; Fragile, P. Chris; Zhuravlev, Viacheslav V.

    2014-12-01

This paper presents our latest numerical simulations of accretion disks that are misaligned with respect to the rotation axis of a Kerr black hole. In this work, we use a new, fully conservative version of the Cosmos++ general relativistic magnetohydrodynamics (GRMHD) code, coupled with an ad hoc cooling function designed to control the thickness of the disk. Together these allow us to simulate the thinnest tilted accretion disks ever using a GRMHD code. In this way, we are able to probe the regime where the dimensionless stress and scale height of the disk become comparable. We present results for both prograde and retrograde cases. The simulated prograde tilted disk shows no sign of Bardeen-Petterson alignment even in the innermost parts of the disk. The simulated retrograde tilted disk, however, does show modest alignment. The implication of these results is that the parameter space associated with Bardeen-Petterson alignment for prograde disks may be rather small, only including very thin disks. Unlike our previous work, we find no evidence for standing shocks in our simulated tilted disks. We ascribe this to the black hole spin, tilt angle, and disk scale height all being small in these simulations. We also add to the growing body of literature pointing out that the turbulence driven by the magnetorotational instability in global simulations of accretion disks is not isotropic. Finally, we provide a comparison between our moderately thin, untilted reference simulation and other numerical simulations of thin disks in the literature.

  2. 2016 Institutional Computing Progress Report for w14_firetec

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Judith W.; Linn, Rodman

    2016-07-14

    This is a computing progress report for w14_firetec. FIRETEC simulations will explore the prescribed fire ignition methods to achieve burning objectives (understory reduction and ecosystem health) but at the same time minimize the risk of escaped fire.

  3. Idelalisib for the treatment of non-Hodgkin lymphoma

    PubMed Central

    Gopal, Ajay; Graf, Solomon

    2016-01-01

    Introduction B-cell Non-Hodgkin lymphomas (B-NHLs) include a number of disease subtypes, each defined by the tempo of disease progression and the identity of the cancerous cell. Idelalisib is a potent, selective inhibitor of the delta isoform of phosphatidylinositol-3-kinase (PI3K), a lipid kinase whose over-activity in B-NHL drives disease progression. Idelalisib has demonstrated activity in indolent B-NHL (iB-NHL) and is approved for use as monotherapy in patients with follicular lymphoma and small lymphocytic lymphoma and in combination with rituximab in patients with chronic lymphocytic leukemia. Areas Covered Herein we review the development and pharmacology of idelalisib, its safety and efficacy in clinical studies of iB-NHL, and its potential for inclusion in future applications in iB-NHL and in combination with other therapies. Expert Opinion Idelalisib adds to the growing arsenal of iB-NHL pharmacotherapeutics and to the progression of the field toward precision agents with good efficacy and reduced toxicities. Nevertheless, idelalisib carries important risks that require careful patient counseling and monitoring. The appropriate sequencing of idelalisib with other proven treatment options in addition to its potential for combination with established or novel drugs will be borne out in ongoing and planned investigations. PMID:26818003

  4. A cross sectional study of two independent cohorts identifies serum biomarkers for facioscapulohumeral muscular dystrophy (FSHD)

    PubMed Central

    Petek, Lisa M.; Rickard, Amanda M.; Budech, Christopher; Poliachik, Sandra L.; Shaw, Dennis; Ferguson, Mark R.; Tawil, Rabi; Friedman, Seth D.; Miller, Daniel G.

    2016-01-01

    Measuring the severity and progression of facioscapulohumeral muscular dystrophy (FSHD) is particularly challenging because muscle weakness progresses over long periods of time and can be sporadic. Biomarkers are essential for measuring disease burden and testing treatment strategies. We utilized the sensitive, specific, high-throughput SomaLogic proteomics platform of 1129 proteins to identify proteins with levels that correlate with FSHD severity in a cross-sectional study of two independent cohorts. We discovered biomarkers that correlate with clinical severity and disease burden measured by magnetic resonance imaging. Sixty-eight proteins in the Rochester cohort (n = 48) and 51 proteins in the Seattle cohort (n = 30) had significantly different levels in FSHD-affected individuals when compared with controls (p-value ≤ .005). A subset of these varied by at least 1.5 fold and four biomarkers were significantly elevated in both cohorts. Levels of creatine kinase MM and MB isoforms, carbonic anhydrase III, and troponin I type 2 reliably predicted the disease state and correlated with disease severity. Other novel biomarkers were also discovered that may reveal mechanisms of disease pathology. Assessing the levels of these biomarkers during clinical trials may add significance to other measures of quantifying disease progression or regression. PMID:27185459

  5. Boundary acquisition for setup of numerical simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diegert, C.

    1997-12-31

The author presents a work flow diagram that includes a path that begins with taking experimental measurements and ends with obtaining insight from results produced by numerical simulation. Two examples illustrate this path: (1) Three-dimensional imaging measurement at micron scale, using X-ray tomography, provides information on the boundaries of irregularly-shaped alumina oxide particles held in an epoxy matrix. A subsequent numerical simulation predicts the electrical field concentrations that would occur in the observed particle configurations. (2) Three-dimensional imaging measurement at meter scale, again using X-ray tomography, provides information on the boundaries of fossilized bone fragments in a Parasaurolophus crest recently discovered in New Mexico. A subsequent numerical simulation predicts the acoustic response of the elaborate internal structure of nasal passageways defined by the fossil record. The author must both add value and change the format of the three-dimensional imaging measurements before they define the geometric boundary initial conditions for automatic mesh generation and subsequent numerical simulation. The author applies a variety of filters and statistical classification algorithms to estimate the extents of the structures relevant to the subsequent numerical simulation, and captures these extents as faceted geometries. The author describes the particular combination of manual and automatic methods used in the above two examples.

  6. Figures of Merit for Lunar Simulants

    NASA Technical Reports Server (NTRS)

    Slane, Frederick A.; Rickman, Douglas L.

    2012-01-01

At an earlier SRR the concept for an international standard on Lunar regolith simulants was presented. The international standard, ISO 10788, Lunar Simulants, has recently been published. This paper presents the final content of the standard. Therefore, we are presenting an update of the following: The collection and analysis of lunar samples from 1969 to present has yielded large amounts of data. Published analyses give some idea of the complex nature of the regolith at all scales: rocks, soils, and the smaller particulates commonly referred to as dust. Data recently acquired in support of NASA's simulant effort has markedly increased our knowledge and quantitatively demonstrates that complexity. It is anticipated that future analyses will further add to the known complexity. In an effort to communicate among the diverse technical communities performing research on, or research using, regolith samples and simulants, a set of Figures of Merit (FoM) has been devised. The objective is to allow consistent and concise comparative communication between researchers from multiple organizations and nations engaged in lunar exploration. This paper describes the Figures of Merit in a new international standard for Lunar Simulants. The FoM methodology uses scientific understanding of the lunar samples to formulate parameters which are reproducibly quantifiable. Contaminants and impurities in the samples are also addressed.

  7. Ensemble-Biased Metadynamics: A Molecular Simulation Method to Sample Experimental Distributions

    PubMed Central

    Marinelli, Fabrizio; Faraldo-Gómez, José D.

    2015-01-01

    We introduce an enhanced-sampling method for molecular dynamics (MD) simulations referred to as ensemble-biased metadynamics (EBMetaD). The method biases a conventional MD simulation to sample a molecular ensemble that is consistent with one or more probability distributions known a priori, e.g., experimental intramolecular distance distributions obtained by double electron-electron resonance or other spectroscopic techniques. To this end, EBMetaD adds an adaptive biasing potential throughout the simulation that discourages sampling of configurations inconsistent with the target probability distributions. The bias introduced is the minimum necessary to fulfill the target distributions, i.e., EBMetaD satisfies the maximum-entropy principle. Unlike other methods, EBMetaD does not require multiple simulation replicas or the introduction of Lagrange multipliers, and is therefore computationally efficient and straightforward in practice. We demonstrate the performance and accuracy of the method for a model system as well as for spin-labeled T4 lysozyme in explicit water, and show how EBMetaD reproduces three double electron-electron resonance distance distributions concurrently within a few tens of nanoseconds of simulation time. EBMetaD is integrated into the open-source PLUMED plug-in (www.plumed-code.org) and can therefore be readily used with multiple MD engines. PMID:26083917
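    The minimum-bias property can be illustrated by the fixed point such an adaptive bias approaches: for a collective variable s with unbiased free energy F(s), a bias of the form V(s) = -F(s) - kT ln ρ_exp(s) (up to a constant) makes the biased Boltzmann weight proportional to the target distribution. The toy check below verifies this identity on a grid; it is a sketch of the converged limit, not of the adaptive EBMetaD algorithm itself, and the free energy and target are invented.

```python
import math

kT = 1.0
s_grid = [i * 0.05 for i in range(-60, 61)]            # collective variable, e.g. a spin-label distance
F = [0.5 * s * s for s in s_grid]                      # toy unbiased free energy
rho = [math.exp(-(s - 1.0) ** 2 / 0.5) for s in s_grid]  # toy target (a DEER-like distance distribution)
Z = sum(rho)
rho = [r / Z for r in rho]                             # normalize the target

# converged bias: V(s) = -F(s) - kT*ln(rho(s)); biased weight ∝ exp(-(F+V)/kT) ∝ rho
V = [-f - kT * math.log(r) for f, r in zip(F, rho)]
w = [math.exp(-(f + v) / kT) for f, v in zip(F, V)]
W = sum(w)
p_biased = [x / W for x in w]                          # sampled distribution under the bias
```

Up to floating-point error, `p_biased` equals the target `rho`, which is the sense in which the biased ensemble "fulfills" the experimental distribution.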

  8. Simulation and training in Urology - in collaboration with ESU/ESUT.

    PubMed

    Veneziano, Domenico; Cacciamani, Giovanni; Shekhar Biyani, Chandra

    2018-01-01

    Being a surgeon today means shouldering countless responsibilities. It is a high-stakes job, yet it still requires no practical training certification, and its professionals do not follow the intense, focused, and demanding training schedules of other equally risky fields. Simulation was introduced in the aviation field in the early '30s with the "Link Trainer", designed to reproduce the most difficult flying scenario: landing on an aircraft carrier. After almost a century, flight simulation continues to grow more sophisticated, while surgical training is slowly starting to fill the gap. The aim of a simulator is to produce an "imitation of the operation of a real-world process or system over time". This short but effective definition explains why simulators are utilised across different fields. There is no doubt that surgeons continuously operate under stress, even in nonthreatening situations, while performing a procedure. This condition adds a relevant variable to surgery, meaning that mastering technical skills is not always equal to "safe surgery". This is why "non-technical skills" (NTS) training should be part of any simulation-based training opportunity and will probably become an ever greater part of hands-on training programs.

  9. A Functional Comparison of Lunar Regoliths and Their Simulants

    NASA Technical Reports Server (NTRS)

    Rickman, D.; Edmunson, J.; McLemore, C.

    2012-01-01

    Lunar regolith simulants are essential to the development of technology for human exploration of the Moon. Any equipment that will interact with the surface environment must be tested with simulant to mitigate risk. To reduce the greatest amount of risk, the simulant must replicate the lunar surface as well as possible. To quantify the similarities and differences between simulants, the Figures of Merit were developed. The Figures of Merit software compares the simulants and regolith by particle size, particle shape, density, and bulk chemistry and mineralogy; these four properties dictate the majority of the remaining characteristics of a geologic material. There are limitations to both the current Figures of Merit approach and simulants in general. The effect of particle textures is lacking in the Figures of Merit software, and research into this topic has only recently begun with applications to simulants. In addition, not all of the properties for lunar regolith are defined sufficiently for simulant reproduction or comparison; for example, the size distribution of particles greater than 1 centimeter and the makeup of particles less than 10 micrometers are not well known. For simulants, contamination by terrestrial weathering products or undesired trace phases in feedstock material is a major issue. Vapor deposited rims have not yet been created for simulants. Fortunately, previous limitations such as the lack of agglutinates in simulants have been addressed and commercial companies are now making agglutinate material for simulants. Despite some limitations, the Figures of Merit sufficiently quantify the comparison between simulants and regolith for useful application in lunar surface technology. Over time, the compilation and analysis of simulant user data will add an advantageous predictive capability to the Figures of Merit, accurately relating Figures of Merit characteristics to simulant user parameters.
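    The general idea of a figure of merit can be sketched as a single score comparing a simulant property curve against the regolith reference, e.g. cumulative particle-size distributions evaluated on a common set of size bins. This is a schematic score for illustration only, not the formula used by the Figures of Merit software or ISO 10788; the distributions are hypothetical.

```python
def figure_of_merit(cdf_ref, cdf_sim):
    """1 minus the mean absolute difference between two cumulative distributions.

    A score of 1.0 means the simulant curve matches the reference exactly;
    lower scores mean a poorer match.
    """
    assert len(cdf_ref) == len(cdf_sim)
    diff = sum(abs(a - b) for a, b in zip(cdf_ref, cdf_sim)) / len(cdf_ref)
    return 1.0 - diff

regolith = [0.0, 0.1, 0.3, 0.6, 0.85, 1.0]   # hypothetical reference CDF over size bins
simulant = [0.0, 0.05, 0.2, 0.55, 0.8, 1.0]  # a hypothetical, slightly coarser simulant
```

With these toy curves the simulant scores just under 1.0; a perfect replica scores exactly 1.0.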

  10. Young women as smokers and nonsmokers: a qualitative social identity approach.

    PubMed

    Lennon, Alexia; Gallois, Cindy; Owen, Neville; McDermott, Liane

    2005-12-01

    The authors used a social identity perspective to explore young women's perceptions of smoking. They carried out 13 focus groups and 6 intercept interviews with women aged 16 to 28 years regarding the social identities that might influence young women's smoking behavior. Three identities emerged: the cool smoker applied to the initiation of smoking; considerate smokers, who were older addicted smokers; and the actual and anticipated good mother identity, which applied to young women who quit smoking during pregnancy. These identities add to our understanding of the meaning of smoking within the lives of young women and might allow more focused initiatives with this group to prevent the progression to regular addicted smoking.

  11. VISUALIZING IRON IN MULTIPLE SCLEROSIS

    PubMed Central

    Bagnato, Francesca; Hametner, Simon; Welch, Edward Brian

    2012-01-01

    Magnetic resonance imaging (MRI) protocols that are designed to be sensitive to iron typically take advantage of (1) iron effects on the relaxation of water protons and/or (2) iron-induced local magnetic field susceptibility changes. Increasing evidence sustains the notion that imaging iron in brain of patients with multiple sclerosis (MS) may add some specificity toward the identification of the disease pathology. The present review summarizes currently reported in vivo and post mortem MRI evidence of (1) iron detection in white matter and grey matter of MS brains, (2) pathological and physiological correlates of iron as disclosed by imaging and (3) relations between iron accumulation and disease progression as measured by clinical metrics. PMID:23347601

  12. A User’s Guide for LOGATAK. A Simulation Model to Analyze Logistic Network Distribution and Interdiction,

    DTIC Science & Technology

    1981-04-15

    BLANK ADD LINK DATA TO LIBRARY 0 DELETE LINK DATA FROM LIBRARY C CHANGE ONE OR MORE VARIABLES OF DEFINED LINK IN LIBRARY 25 ALTERATIONS ... DELETE TERMINAL DATA FROM LIBRARY C CHANGE ONE OR MORE VARIABLES OF DEFINED TERMINAL IN LIBRARY Unlike link data, only one card is needed to change the ... sector data cards identify the sets of data to be selected from the data library. Each data set is defined by a five-digit number. The first four digits

  13. Simulating the Rayleigh-Taylor instability with the Ising model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ball, Justin R.; Elliott, James B.

    2011-08-26

    The Ising model, implemented with the Metropolis algorithm and Kawasaki dynamics, makes a system with its own physics, distinct from the real world. These physics are sophisticated enough to model behavior similar to the Rayleigh-Taylor instability, and by better understanding these physics, we can learn how to modify the system to better reflect reality. For example, we could add a v_x and a v_y to each spin and modify the exchange rules to incorporate them, possibly using two-body scattering laws to construct a more realistic system.
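    Kawasaki dynamics conserves the total magnetization by exchanging neighboring spins rather than flipping them, which is what makes the model usable for two-phase mixing problems like Rayleigh-Taylor (heavy phase atop light phase). A minimal Metropolis-accepted exchange sweep on a periodic lattice is sketched below; the per-spin velocity components proposed above are not included, and the lattice size and temperature are illustrative.

```python
import math, random

def kawasaki_sweep(lat, beta, rng, n_attempts):
    """Metropolis-accepted nearest-neighbor spin exchanges (conserves magnetization)."""
    L = len(lat)

    def local_energy(i, j, i2, j2):
        # nearest-neighbor Ising energy of the two sites (pair bond double-counted
        # consistently before and after the swap, so the difference is exact)
        E = 0
        for a, b in ((i, j), (i2, j2)):
            s = lat[a][b]
            for da, db in ((0, 1), (0, -1), (1, 0), (-1, 0)):
                E -= s * lat[(a + da) % L][(b + db) % L]
        return E

    for _ in range(n_attempts):
        i, j = rng.randrange(L), rng.randrange(L)
        di, dj = rng.choice(((0, 1), (1, 0)))
        i2, j2 = (i + di) % L, (j + dj) % L
        if lat[i][j] == lat[i2][j2]:
            continue                      # swapping equal spins changes nothing
        e0 = local_energy(i, j, i2, j2)
        lat[i][j], lat[i2][j2] = lat[i2][j2], lat[i][j]
        dE = local_energy(i, j, i2, j2) - e0
        if dE > 0 and rng.random() >= math.exp(-beta * dE):
            lat[i][j], lat[i2][j2] = lat[i2][j2], lat[i][j]  # reject: undo swap

# heavy phase (+1) initially stacked on top of light phase (-1)
rng = random.Random(1)
N = 8
lattice = [[1 if row < N // 2 else -1 for _ in range(N)] for row in range(N)]
kawasaki_sweep(lattice, beta=0.4, rng=rng, n_attempts=5000)
```

Because only exchanges are allowed, the total spin (the "amount" of each fluid) is exactly conserved no matter how many moves are attempted.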

  14. Periodically poled silicon

    NASA Astrophysics Data System (ADS)

    Hon, Nick K.; Tsia, Kevin K.; Solli, Daniel R.; Jalali, Bahram

    2009-03-01

    We propose a new class of photonic devices based on periodic stress fields in silicon that enable second-order nonlinearity as well as quasi-phase matching. Periodically poled silicon (PePSi) adds the periodic poling capability to silicon photonics and allows the excellent crystal quality and advanced manufacturing capabilities of silicon to be harnessed for devices based on second-order nonlinear effects. As an example of the utility of the PePSi technology, we present simulations showing that midwave infrared radiation can be efficiently generated through difference frequency generation from near-infrared with a conversion efficiency of 50%.

  15. A survey of Applied Psychological Services' models of the human operator

    NASA Technical Reports Server (NTRS)

    Siegel, A. I.; Wolf, J. J.

    1979-01-01

    A historical perspective is presented in terms of the major features and status of two families of computer simulation models in which the human operator plays the primary role. Both task-oriented and message-oriented models are included. Two other recent efforts, which deal with visual information processing, are also summarized. They involve not whole-model development but a family of subroutines customized to add the human aspects to existing models. A global diagram of the generalized model development/validation process is presented and related to 15 criteria for model evaluation.

  16. A new collage steganographic algorithm using cartoon design

    NASA Astrophysics Data System (ADS)

    Yi, Shuang; Zhou, Yicong; Pun, Chi-Man; Chen, C. L. Philip

    2014-02-01

    Existing collage steganographic methods suffer from a low payload of embedded messages. To improve the payload while providing a high level of security protection to messages, this paper introduces a new collage steganographic algorithm using cartoon design. It embeds messages into the least significant bits (LSBs) of color cartoon objects, applies different permutations to each object, and adds the objects to a cartoon cover image to obtain the stego image. Computer simulations and comparisons demonstrate that the proposed algorithm offers significantly higher message-embedding capacity than existing collage steganographic methods.
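    The LSB embedding step itself is simple to sketch: each message bit overwrites the least significant bit of one pixel value, changing it by at most 1. The snippet below shows only this step on toy 8-bit values; the per-object permutations and collage composition described in the paper are omitted.

```python
def embed_lsb(pixels, bits):
    """Write one message bit into the least significant bit of each pixel."""
    assert len(bits) <= len(pixels)
    out = list(pixels)
    for k, bit in enumerate(bits):
        out[k] = (out[k] & 0xFE) | bit   # clear LSB, then set it to the message bit
    return out

def extract_lsb(pixels, n_bits):
    """Recover the first n_bits message bits from the pixel LSBs."""
    return [p & 1 for p in pixels[:n_bits]]

cover = [200, 13, 57, 88, 255, 0, 91, 164]   # toy 8-bit pixel values
message = [1, 0, 1, 1, 0, 1]
stego = embed_lsb(cover, message)
```

Extraction inverts embedding exactly, and no pixel moves by more than one gray level, which is why LSB changes are visually imperceptible in a cover image.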

  17. Development of advanced avionics systems applicable to terminal-configured vehicles

    NASA Technical Reports Server (NTRS)

    Heimbold, R. L.; Lee, H. P.; Leffler, M. F.

    1980-01-01

    A technique to add the time constraint to the automatic descent feature of the existing L-1011 aircraft Flight Management System (FMS) was developed. Software modifications were incorporated in the FMS computer program and the results checked by lab simulation and on a series of eleven test flights. An arrival time dispersion (2 sigma) of 19 seconds was achieved. The 4-D descent technique can be integrated with the time-based metering method of air traffic control. Substantial reductions in delays at today's busy airports should result.

  18. Characterizing bias correction uncertainty in wheat yield predictions

    NASA Astrophysics Data System (ADS)

    Ortiz, Andrea Monica; Jones, Julie; Freckleton, Robert; Scaife, Adam

    2017-04-01

    Farming systems are under increased pressure due to current and future climate change, variability and extremes. Research on the impacts of climate change on crop production typically relies on the output of complex Global and Regional Climate Models, which are used as input to crop impact models. Yield predictions from these top-down approaches can have high uncertainty for several reasons, including diverse model construction and parameterization, future emissions scenarios, and inherent or response uncertainty. These uncertainties propagate down each step of the 'cascade of uncertainty' that flows from climate input to impact predictions, leading to yield predictions that may be too complex for their intended use in practical adaptation options. In addition to uncertainty from impact models, uncertainty can also stem from the intermediate steps that are used in impact studies to adjust climate model simulations to become more realistic when compared to observations, or to correct the spatial or temporal resolution of climate simulations, which are often not directly applicable as input into impact models. These important steps of bias correction or calibration also add uncertainty to final yield predictions, given the various approaches that exist to correct climate model simulations. In order to address how much uncertainty the choice of bias correction method can add to yield predictions, we use several evaluation runs from Regional Climate Models from the Coordinated Regional Downscaling Experiment over Europe (EURO-CORDEX) at different resolutions together with different bias correction methods (linear and variance scaling, power transformation, quantile-quantile mapping) as input to a statistical crop model for wheat, a staple European food crop.
The objective of our work is to compare the resulting simulation-driven hindcasted wheat yields to climate observation-driven wheat yield hindcasts from the UK and Germany in order to determine ranges of yield uncertainty that result from different climate model simulation input and bias correction methods. We simulate wheat yields using a General Linear Model that includes the effects of seasonal maximum temperatures and precipitation, since wheat is sensitive to heat stress during important developmental stages. We use the same statistical model to predict future wheat yields using the recently available bias-corrected simulations of EURO-CORDEX-Adjust. While statistical models are often criticized for their lack of complexity, an advantage is that we are here able to consider only the effect of the choice of climate model, resolution or bias correction method on yield. Initial results using both past and future bias-corrected climate simulations with a process-based model will also be presented. Through these methods, we make recommendations in preparing climate model output for crop models.
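    Of the bias correction methods named above, empirical quantile-quantile mapping is the easiest to sketch: each raw model value is replaced by the observed value at the same quantile rank in a calibration period. This is a schematic version with invented numbers; operational implementations interpolate between quantiles and handle values outside the calibration range.

```python
def quantile_map(model_cal, obs_cal, values):
    """Map each value through the model's empirical CDF onto observed quantiles."""
    sorted_model = sorted(model_cal)
    sorted_obs = sorted(obs_cal)
    corrected = []
    for x in values:
        rank = sum(1 for v in sorted_model if v <= x)        # empirical CDF position
        idx = min(max(rank - 1, 0), len(sorted_obs) - 1)     # clamp to valid index
        corrected.append(sorted_obs[idx])
    return corrected

obs = [12.1, 9.4, 15.0, 7.7, 11.2]     # hypothetical observed seasonal maxima (deg C)
model = [x + 2.0 for x in obs]         # a model run with a uniform +2 deg C warm bias
fixed = quantile_map(model, obs, model)
```

Because the toy model series is the observations plus a constant bias, the mapping recovers the observed values exactly; with real climate model output the corrected series matches the observed distribution only in its quantiles, not value by value.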

  19. Nanoscale plasmonic waveguides for filtering and demultiplexing devices

    NASA Astrophysics Data System (ADS)

    Akjouj, A.; Noual, A.; Pennec, Y.; Djafari-Rouhani, B.

    2010-05-01

    Numerical simulations, based on a FDTD (finite-difference-time-domain) method, of infrared light propagation for add/drop filtering in two-dimensional (2D) Ag-SiO2-Ag resonators are reported to design 2D Y-bent plasmonic waveguides with possible applications in telecommunication WDM (wavelength demultiplexing). First, we study optical transmission and reflection of a nanoscale SiO2 waveguide coupled to a nanocavity of the same insulator located either inside or on the side of a linear waveguide sandwiched between Ag. According to the inside or outside positioning of the nanocavity with respect to the waveguide, the transmission spectrum displays peaks or dips, respectively, which occur at the same central frequency. A fundamental study of the possible cavity modes in the near-infrared frequency band is also given. These filtering properties are then exploited to propose a nanoscale demultiplexer based on a Y-shaped plasmonic waveguide for separation of two different wavelengths, in selection or rejection, from an input broadband signal around 1550 nm. We detail coupling of the 2D add/drop Y connector to two cavities inserted on each of its branches.
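    The FDTD machinery behind such simulations reduces, in one dimension, to a leapfrog update of staggered E and H fields. Below is a minimal normalized 1-D sketch with a soft Gaussian source; it contains no metal or dielectric structure, so it is far from the 2-D Ag-SiO2-Ag geometry studied here, and the grid size, source position, and pulse parameters are all illustrative.

```python
import math

def fdtd_1d(size=400, n_steps=150, src=200):
    """Normalized 1-D FDTD (Courant number 1): pulses advect one cell per step."""
    ez = [0.0] * size   # electric field
    hy = [0.0] * size   # magnetic field, staggered half a cell to the right
    for t in range(n_steps):
        for k in range(size - 1):
            hy[k] += ez[k + 1] - ez[k]          # update H from the curl of E
        for k in range(1, size):
            ez[k] += hy[k] - hy[k - 1]          # update E from the curl of H
        ez[src] += math.exp(-((t - 30.0) / 8.0) ** 2)  # soft additive Gaussian source
    return ez

field = fdtd_1d()
```

The injected pulse splits into two counter-propagating waves that, by step 150, have traveled roughly 120 cells to either side of the source; adding material interfaces to such a grid is what produces the transmission peaks and dips exploited for filtering.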

  20. Jet crackle: skewness transport budget and a mechanistic source model

    NASA Astrophysics Data System (ADS)

    Buchta, David; Freund, Jonathan

    2016-11-01

    The sound from high-speed (supersonic) jets, such as on military aircraft, is distinctly different from that of lower-speed jets, such as on commercial airliners. Atop the already loud noise, a higher speed adds an intense, fricative, and intermittent character. The observed pressure wave patterns have strong peaks followed by relatively long shallows; notably, their pressure skewness is Sk >= 0.4. Direct numerical simulations of free-shear-flow turbulence show that these skewed pressure waves occur immediately adjacent to the turbulence source for M >= 2.5. Additionally, the near-field waves are seen to intersect and nonlinearly merge with other waves. Statistical analysis of terms in a pressure skewness transport equation shows that, starting just beyond δ99, the nonlinear wave mechanics that add to Sk are balanced by damping molecular effects, consistent with this aspect of the sound arising in the source region. A gas dynamics description is developed that neglects rotational turbulence dynamics and yet reproduces the key crackle features. At its core, this mechanism shows simply that nonlinear compressive effects lead directly to stronger compressions than expansions and thus Sk > 0.
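    The crackle criterion is just the standardized third central moment of the pressure record, Sk = m3 / m2^(3/2). A small helper and an illustrative "sharp peaks, long shallows" signal are sketched below; the signals are synthetic toys, not jet DNS output.

```python
import math

def skewness(x):
    """Standardized third central moment, Sk = m3 / m2**1.5."""
    n = len(x)
    mu = sum(x) / n
    m2 = sum((v - mu) ** 2 for v in x) / n
    m3 = sum((v - mu) ** 3 for v in x) / n
    return m3 / m2 ** 1.5

# rare sharp peaks followed by long shallows -> strongly positive skewness
crackly = [9.0 if k % 10 == 0 else -1.0 for k in range(1000)]
# a symmetric waveform has (near) zero skewness
smooth = [math.sin(2 * math.pi * k / 100) for k in range(1000)]
```

The asymmetric signal comfortably exceeds the Sk >= 0.4 threshold quoted above, while the sine wave sits at essentially zero.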

  1. Advanced Software Development Workstation Project

    NASA Technical Reports Server (NTRS)

    Lee, Daniel

    1989-01-01

    The Advanced Software Development Workstation Project, funded by Johnson Space Center, is investigating knowledge-based techniques for software reuse in NASA software development projects. Two prototypes have been demonstrated and a third is now in development. The approach is to build a foundation that provides passive reuse support, add a layer that uses domain-independent programming knowledge, add a layer that supports the acquisition of domain-specific programming knowledge to provide active support, and enhance maintainability and modifiability through an object-oriented approach. The development of new application software would use specification-by-reformulation, based on a cognitive theory of retrieval from very long-term memory in humans, and using an Ada code library and an object base. Current tasks include enhancements to the knowledge representation of Ada packages and abstract data types, extensions to support Ada package instantiation knowledge acquisition, integration with Ada compilers and relational databases, enhancements to the graphical user interface, and demonstration of the system with a NASA contractor-developed trajectory simulation package. Future work will focus on investigating issues involving scale-up and integration.

  2. Quadrature Moments Method for the Simulation of Turbulent Reactive Flows

    NASA Technical Reports Server (NTRS)

    Raman, Venkatramanan; Pitsch, Heinz; Fox, Rodney O.

    2003-01-01

    A sub-filter model for reactive flows, namely the DQMOM model, was formulated for Large Eddy Simulation (LES) using the filtered mass density function. Transport equations required to determine the location and size of the delta-peaks were then formulated for a 2-peak decomposition of the FDF. The DQMOM scheme was implemented in an existing structured-grid LES solver. Simulations of a scalar shear layer based on an experimental configuration showed that the first and second moments of both reactive and inert scalars are in good agreement with a conventional Lagrangian scheme that evolves the same FDF. Comparisons with LES simulations performed using a laminar chemistry assumption for the reactive scalar show that the new method provides vast improvements at minimal computational cost. Currently, the DQMOM model is being implemented for use with the progress variable/mixture fraction model of Pierce. Comparisons with experimental results and with LES simulations using a single environment for the progress variable are planned. Future studies will aim at understanding the effect of increasing the number of environments on predictions.

  3. Recent progress in utilization planning for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Bartoe, John-David F.; Thiringer, Peter S.

    1991-01-01

    The progress made in utilization planning for the redesigned Space Station Freedom (SSF) concept is described. Consideration is given to the SSF user capabilities, the strategic planning process, the strategic planning organizations, and the Consolidated Operations and Utilization Plan (COUP, which will be released in January 1993) as well as to the COUP development process and implementation. The process by which the COUP will be produced was exercised in the international Multilateral Strategic and Tactical Integration Process (MUSTIP) simulation. The paper describes the MUSTIP simulation and its activities along with MUSTIP findings and recommendations.

  4. Design for progressive fracture in composite shell structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Murthy, Pappu L. N.

    1992-01-01

    The load carrying capability and structural behavior of composite shell structures and stiffened curved panels are investigated to provide accurate early design loads. An integrated computer code is utilized for the computational simulation of composite structural degradation under practical loading for realistic design. Damage initiation, growth, accumulation, and propagation to structural fracture are included in the simulation. Progressive fracture investigations providing design insight for several classes of composite shells are presented. Results demonstrate the significance of local defects, interfacial regions, and stress concentrations on the structural durability of composite shells.

  5. Effects of magnetospheric lobe cell convection on dayside upper thermospheric winds at high latitudes

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Wang, W.; Wu, Q.; Knipp, D.; Kilcommons, L.; Brambles, O. J.; Liu, J.; Wiltberger, M.; Lyon, J. G.; Häggström, I.

    2016-08-01

    This paper investigates a possible physical mechanism of the observed dayside high-latitude upper thermospheric wind using numerical simulations from the coupled magnetosphere-ionosphere-thermosphere (CMIT) model. Results show that the CMIT model is capable of reproducing the unexpected afternoon equatorward winds in the upper thermosphere observed by the High altitude Interferometer WIND observation (HIWIND) balloon. Models that lack adequate coupling produce poleward winds. The modeling study suggests that ion drag driven by magnetospheric lobe cell convection is another possible mechanism for turning the climatologically expected dayside poleward winds to the observed equatorward direction. The simulation results are validated by HIWIND, European Incoherent Scatter, and Defense Meteorological Satellite Program observations. The results suggest a strong momentum coupling between high-latitude ionospheric plasma circulation and thermospheric neutral winds in the summer hemisphere during positive IMF Bz periods, through the formation of magnetospheric lobe cell convection driven by persistent positive IMF By. The CMIT simulation adds important insight into the role of dayside coupling during intervals of otherwise quiet geomagnetic activity.

  6. Experimental monitoring and numerical study of pesticide (carbofuran) transfer in an agricultural soil at a field site

    NASA Astrophysics Data System (ADS)

    Hmimou, Abderrahim; Maslouhi, Abdellatif; Tamoh, Karim; Candela, Lucila

    2014-09-01

    We studied the transport of a pesticide, the carbofuran molecule, at field scale; carbofuran is known for its high mobility, especially in sandy soils with high hydraulic conductivity and low organic matter. To add to our knowledge of the fate of this high-mobility molecule in this type of soil, we developed a mechanistic numerical model allowing the simulation of water and solute transfers (bromide and carbofuran) in the soil. We carried out this study in an agricultural plot in the region of Mnasra in Morocco. Comparison of the measured and simulated values allowed calibration of the water transfer and carbofuran parameters. The developed model accurately reproduces the measured values. Despite a weak irrigation and precipitation regime, carbofuran was practically leached beyond the root zone. Prospective simulations show that under a more intensive irrigation regime, carbofuran reaches a 100-cm depth, whereas it does not exceed 60 cm under a deficit regime.

  7. Real-time out-of-plane artifact subtraction tomosynthesis imaging using prior CT for scanning beam digital x-ray system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Meng, E-mail: mengwu@stanford.edu; Fahrig, Rebecca

    2014-11-01

    Purpose: The scanning beam digital x-ray system (SBDX) is an inverse geometry fluoroscopic system with high dose efficiency and the ability to perform continuous real-time tomosynthesis in multiple planes. This system could be used for image guidance during lung nodule biopsy. However, the reconstructed images suffer from strong out-of-plane artifact due to the small tomographic angle of the system. Methods: The authors propose an out-of-plane artifact subtraction tomosynthesis (OPAST) algorithm that utilizes a prior CT volume to augment the run-time image processing. A blur-and-add (BAA) analytical model, derived from the project-to-backproject physical model, permits the generation of tomosynthesis images that are a good approximation to the shift-and-add (SAA) reconstructed image. A computationally practical algorithm is proposed to simulate images and out-of-plane artifacts from patient-specific prior CT volumes using the BAA model. A 3D image registration algorithm to align the simulated and reconstructed images is described. The accuracy of the BAA analytical model and the OPAST algorithm was evaluated using three lung cancer patients’ CT data. The OPAST and image registration algorithms were also tested with added nonrigid respiratory motions. Results: Image similarity measurements, including the correlation coefficient, mean squared error, and structural similarity index, indicated that the BAA model is very accurate in simulating the SAA images from the prior CT for the SBDX system. The shift-variant effect of the BAA model can be ignored when the shifts between SBDX images and CT volumes are within ±10 mm in the x and y directions. The nodule visibility and depth resolution are improved by subtracting simulated artifacts from the reconstructions. The image registration and OPAST are robust in the presence of added respiratory motions.
The dominant artifacts in the subtraction images are caused by mismatches between the real object and the prior CT volume. Conclusions: The proposed prior CT-augmented OPAST reconstruction algorithm improves lung nodule visibility and depth resolution for the SBDX system.
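    The shift-and-add (SAA) reconstruction that the BAA model approximates can be sketched in one dimension: each projection is shifted by an amount proportional to its source offset and the selected depth, then averaged, so structures in the selected plane align and reinforce while off-plane structures smear out. The toy below uses a single point object and invented offsets; it is not the SBDX geometry.

```python
def shift_and_add(projections, offsets, depth, width):
    """Average projections after undoing each view's depth-dependent parallax."""
    recon = [0.0] * width
    for proj, s in zip(projections, offsets):
        d = round(s * depth)                   # parallax shift for this view
        for x in range(width):
            recon[x] += proj[(x + d) % width]
    return [v / len(projections) for v in recon]

# forward model: a point at x=pos, depth=true_depth appears at x + s*depth in view s
width, offsets, true_depth, pos = 64, [-2, -1, 0, 1, 2], 3, 30
projections = []
for s in offsets:
    p = [0.0] * width
    p[(pos + s * true_depth) % width] = 1.0
    projections.append(p)

in_plane = shift_and_add(projections, offsets, true_depth, width)   # focused at depth 3
off_plane = shift_and_add(projections, offsets, 0, width)           # focused at wrong depth
```

At the true depth all five shifted copies coincide and the point is recovered at full amplitude; at the wrong depth the copies land at five different positions, which is exactly the out-of-plane blur that OPAST estimates from the prior CT and subtracts.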

  8. Annual Research Briefs

    NASA Technical Reports Server (NTRS)

    Spinks, Debra (Compiler)

    1997-01-01

    This report contains the 1997 annual progress reports of the research fellows and students supported by the Center for Turbulence Research (CTR). Titles include: Invariant modeling in large-eddy simulation of turbulence; Validation of large-eddy simulation in a plane asymmetric diffuser; Progress in large-eddy simulation of trailing-edge turbulence and aeroacoustics; Resolution requirements in large-eddy simulations of shear flows; A general theory of discrete filtering for LES in complex geometry; On the use of discrete filters for large eddy simulation; Wall models in large eddy simulation of separated flow; Perspectives for ensemble average LES; Anisotropic grid-based formulas for subgrid-scale models; Some modeling requirements for wall models in large eddy simulation; Numerical simulation of 3D turbulent boundary layers using the V2F model; Accurate modeling of impinging jet heat transfer; Application of turbulence models to high-lift airfoils; Advances in structure-based turbulence modeling; Incorporating realistic chemistry into direct numerical simulations of turbulent non-premixed combustion; Effects of small-scale structure on turbulent mixing; Turbulent premixed combustion in the laminar flamelet and the thin reaction zone regime; Large eddy simulation of combustion instabilities in turbulent premixed burners; On the generation of vorticity at a free surface; Active control of turbulent channel flow; A generalized framework for robust control in fluid mechanics; Combined immersed-boundary/B-spline methods for simulations of flow in complex geometries; and DNS of shock boundary-layer interaction - preliminary results for compression ramp flow.

  9. Design for inadvertent damage in composite laminates

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.; Chamis, Christos C.

    1992-01-01

    Simplified predictive methods and models to computationally simulate durability and damage in polymer matrix composite materials/structures are described. The models include (1) progressive fracture, (2) progressively damaged structural behavior, (3) progressive fracture in aggressive environments, (4) stress concentrations, and (5) impact resistance. Several examples are included to illustrate applications of the models and to identify significant parameters and sensitivities. Comparisons with limited experimental data are made.

  10. The impact of individual-level heterogeneity on estimated infectious disease burden: a simulation study.

    PubMed

    McDonald, Scott A; Devleesschauwer, Brecht; Wallinga, Jacco

    2016-12-08

    Disease burden is not evenly distributed within a population; this uneven distribution can be due to individual heterogeneity in progression rates between disease stages. Composite measures of disease burden that are based on disease progression models, such as the disability-adjusted life year (DALY), are widely used to quantify the current and future burden of infectious diseases. Our goal was to investigate to what extent ignoring the presence of heterogeneity could bias DALY computation. Simulations using individual-based models for hypothetical infectious diseases with short and long natural histories were run assuming either "population-averaged" progression probabilities between disease stages, or progression probabilities that were influenced by an a priori defined individual-level frailty (i.e., heterogeneity in disease risk) distribution, and DALYs were calculated. Under the assumption of heterogeneity in transition rates and increasing frailty with age, the short natural history disease model predicted 14% fewer DALYs compared with the homogenous population assumption. Simulations of a long natural history disease indicated that assuming homogeneity in transition rates when heterogeneity was present could overestimate total DALYs, in the present case by 4% (95% quantile interval: 1-8%). The consequences of ignoring population heterogeneity should be considered when defining transition parameters for natural history models and when interpreting the resulting disease burden estimates.
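    The direction of the bias can be seen with a back-of-envelope expectation calculation: the probability of progressing within T time steps, 1 - (1 - p)^T, is concave in p, so averaging it over a frailty distribution gives less expected burden than evaluating it at the population-mean probability (Jensen's inequality). The sketch below computes expectations deterministically; it is not the individual-based simulation of the study, and the probabilities, frailty multipliers, and DALY weight are invented.

```python
def frac_progressed(p, n_steps):
    """Probability an individual progresses at least once in n_steps."""
    return 1.0 - (1.0 - p) ** n_steps

def mean_burden(base_p, frailties, n_steps, daly_per_case=2.0):
    """Expected DALYs per person for a cohort with per-person risk multipliers."""
    probs = [frac_progressed(min(base_p * f, 1.0), n_steps) for f in frailties]
    return daly_per_case * sum(probs) / len(probs)

# same mean per-step risk (0.05), with and without individual-level heterogeneity
homogeneous = mean_burden(0.05, [1.0, 1.0], n_steps=10)
heterogeneous = mean_burden(0.05, [0.5, 1.5], n_steps=10)
```

The heterogeneous cohort yields a smaller expected burden, matching the study's finding that assuming homogeneity when frailty is present overestimates total DALYs.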

  11. Future directions for simulation of recreation use

    Treesearch

    David N. Cole

    2005-01-01

    As the case studies in Chapter 4 illustrate, simulation modeling can be a valuable tool for recreation planning and management. Although simulation modeling is already well developed for business applications, its adaptation to recreation management is less developed. Relatively few resources have been devoted to realizing its potential. Further progress is needed in...

  12. Computer-Aided Engineering Tools | Water Power | NREL

    Science.gov Websites

    Computer-aided engineering tools for wave energy converters will provide a full range of simulation capabilities for single devices and arrays. Simulation of water power technologies on high-performance computers enables the study of complex systems and complements experimentation. Such simulation is critical to accelerate progress in energy programs within the U.S. Department of Energy.

  13. SAGRAD: A Program for Neural Network Training with Simulated Annealing and the Conjugate Gradient Method.

    PubMed

    Bernal, Javier; Torres-Jimenez, Jose

    2015-01-01

    SAGRAD (Simulated Annealing GRADient), a Fortran 77 program for computing neural networks for classification using batch learning, is discussed. Neural network training in SAGRAD is based on a combination of simulated annealing and Møller's scaled conjugate gradient algorithm, the latter a variation of the traditional conjugate gradient method, better suited for the nonquadratic nature of neural networks. Different aspects of the implementation of the training process in SAGRAD are discussed, such as the efficient computation of gradients and multiplication of vectors by Hessian matrices that are required by Møller's algorithm; the (re)initialization of weights with simulated annealing required to (re)start Møller's algorithm the first time and each time thereafter that it shows insufficient progress in reaching a possibly local minimum; and the use of simulated annealing when Møller's algorithm, after possibly making considerable progress, becomes stuck at a local minimum or flat area of weight space. Outlines of the scaled conjugate gradient algorithm, the simulated annealing procedure and the training process used in SAGRAD are presented together with results from running SAGRAD on two examples of training data.
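    SAGRAD itself is Fortran 77; as a rough illustration of the two-phase idea (simulated annealing to (re)initialize weights, then gradient-based refinement), here is a minimal Python sketch. It substitutes plain gradient descent on a stand-in quadratic loss for Møller's scaled conjugate gradient, and every numeric choice (perturbation scale, cooling rate, step counts) is an assumption for illustration.

```python
import math, random
random.seed(0)

TARGET = [1.0, -2.0, 0.5]  # stand-in optimum; a real run minimizes training error

def loss(w):
    return sum((wi - ti) ** 2 for wi, ti in zip(w, TARGET))

def anneal(w, temp=1.0, cooling=0.95, steps=200):
    # Annealing phase: random perturbations accepted by the Metropolis rule,
    # used here (as in SAGRAD's role for it) to find a starting point.
    cur, cur_l = list(w), loss(w)
    best, best_l = list(cur), cur_l
    for _ in range(steps):
        cand = [wi + random.gauss(0, 0.5) for wi in cur]
        cand_l = loss(cand)
        if cand_l < cur_l or random.random() < math.exp((cur_l - cand_l) / temp):
            cur, cur_l = cand, cand_l
            if cur_l < best_l:
                best, best_l = list(cur), cur_l
        temp *= cooling
    return best

def refine(w, lr=0.1, steps=100):
    # Gradient phase: plain gradient descent on the quadratic stand-in
    # (the analytic gradient of loss is 2*(w - TARGET)).
    for _ in range(steps):
        w = [wi - lr * 2.0 * (wi - ti) for wi, ti in zip(w, TARGET)]
    return w

w = refine(anneal([0.0, 0.0, 0.0]))
```

    In SAGRAD the annealing step is also re-invoked whenever the conjugate gradient phase stalls at a local minimum or flat region; that restart logic is omitted here.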

  14. Toward a social-ecological theory of forest macrosystems for improved ecosystem management

    USGS Publications Warehouse

    Kleindl, William J.; Stoy, Paul C.; Binford, Michael W.; Desai, Ankur R.; Dietze, Michael C.; Schultz, Courtney A.; Starr, Gregory; Staudhammer, Christina; Wood, David J. A.

    2018-01-01

    The implications of cumulative land-use decisions and shifting climate on forests require us to integrate our understanding of ecosystems, markets, policy, and resource management into a social-ecological system. Humans play a central role in macrosystem dynamics, which complicates ecological theories that do not explicitly include human interactions. These dynamics also impact ecological services and related markets, which challenges economic theory. Here, we use two forest macroscale management initiatives to develop a theoretical understanding of how management interacts with ecological functions and services at these scales and how the multiple large-scale management goals work either in concert or conflict with other forest functions and services. We suggest that calling upon theories developed for organismal ecology, ecosystem ecology, and ecological economics adds to our understanding of social-ecological macrosystems. To initiate progress, we propose future research questions to add rigor to macrosystem-scale studies: (1) What are the ecosystem functions that operate at macroscales, their necessary structural components, and how do we observe them? (2) How do systems at one scale respond if altered at another scale? (3) How do we both effectively measure these components and interactions, and communicate that information in a meaningful manner for policy and management across different scales?

  15. Authorship and citation manipulation in academic research

    PubMed Central

    2017-01-01

    Some scholars add authors to their research papers or grant proposals even when those individuals contribute nothing to the research effort. Some journal editors coerce authors to add citations that are not pertinent to their work and some authors pad their reference lists with superfluous citations. How prevalent are these types of manipulation, why do scholars stoop to such practices, and who among us is most susceptible to such ethical lapses? This study builds a framework around how intense competition for limited journal space and research funding can encourage manipulation and then uses that framework to develop hypotheses about who manipulates and why they do so. We test those hypotheses using data from over 12,000 responses to a series of surveys sent to more than 110,000 scholars from eighteen different disciplines spread across science, engineering, social science, business, and health care. We find widespread misattribution in publications and in research proposals with significant variation by academic rank, discipline, sex, publication history, co-authors, etc. Even though the majority of scholars disapprove of such tactics, many feel pressured to make such additions while others suggest that it is just the way the game is played. The findings suggest that certain changes in the review process might help to stem this ethical decline, but progress could be slow. PMID:29211744

  16. Authorship and citation manipulation in academic research.

    PubMed

    Fong, Eric A; Wilhite, Allen W

    2017-01-01

    Some scholars add authors to their research papers or grant proposals even when those individuals contribute nothing to the research effort. Some journal editors coerce authors to add citations that are not pertinent to their work and some authors pad their reference lists with superfluous citations. How prevalent are these types of manipulation, why do scholars stoop to such practices, and who among us is most susceptible to such ethical lapses? This study builds a framework around how intense competition for limited journal space and research funding can encourage manipulation and then uses that framework to develop hypotheses about who manipulates and why they do so. We test those hypotheses using data from over 12,000 responses to a series of surveys sent to more than 110,000 scholars from eighteen different disciplines spread across science, engineering, social science, business, and health care. We find widespread misattribution in publications and in research proposals with significant variation by academic rank, discipline, sex, publication history, co-authors, etc. Even though the majority of scholars disapprove of such tactics, many feel pressured to make such additions while others suggest that it is just the way the game is played. The findings suggest that certain changes in the review process might help to stem this ethical decline, but progress could be slow.

  17. Thermal Protection Supplement for Reducing Interface Thermal Mismatch

    NASA Technical Reports Server (NTRS)

    Stewart, David A. (Inventor); Leiser, Daniel B. (Inventor)

    2017-01-01

    A thermal protection system that reduces a mismatch of thermal expansion coefficients (CTE) between a first material layer (CTE1) and a second material layer (CTE2) at a first layer-second layer interface. A portion of aluminum borosilicate (abs) or another suitable additive (add), whose CTE value, CTE(add), satisfies (CTE(add)-CTE1)(CTE(add)-CTE2) < 0, is distributed with variable additive density, ρ(z;add), in the first material layer and/or in the second material layer, with ρ(z;add) near the materials interface being relatively high (alternatively, relatively low) and ρ(z;add) in a region spaced apart from the interface being relatively low (alternatively, relatively high).
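    The sign condition quoted above is equivalent to requiring the additive's CTE to lie strictly between the two layers' values, which a one-line predicate makes explicit (the numeric values below are arbitrary examples, not materials data):

```python
def cte_between(cte_add, cte1, cte2):
    # (CTE_add - CTE1)*(CTE_add - CTE2) < 0 holds exactly when the two
    # differences have opposite signs, i.e. CTE_add is strictly between
    # CTE1 and CTE2 (in either order).
    return (cte_add - cte1) * (cte_add - cte2) < 0

cte_between(5.0, 3.0, 8.0)   # additive CTE between the layers: condition holds
cte_between(9.0, 3.0, 8.0)   # outside the interval: condition fails
```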

  18. Flash-Type Discrimination

    NASA Technical Reports Server (NTRS)

    Koshak, William J.

    2010-01-01

    This viewgraph presentation describes the significant progress made in the flash-type discrimination algorithm development. The contents include: 1) Highlights of Progress for GLM-R3 Flash-Type discrimination Algorithm Development; 2) Maximum Group Area (MGA) Data; 3) Retrieval Errors from Simulations; and 4) Preliminary Global-scale Retrieval.

  19. LVC Architecture Roadmap Implementation - Results of the First Two Years

    DTIC Science & Technology

    2012-03-01

    NOTES Presented at the Simulation Interoperability Standards Organization's (SISO) Spring Simulation Interoperability Workshop (SIW), 26-30 March... presented at the semi-annual Simulation Interoperability Workshops (SIWs) and the annual Interservice/Industry Training, Simulation & Education Conference (I/ITSEC), as well as other venues. For example, a full-day workshop on the initial progress of the effort was conducted at the 2010 Spring SIW [2]

  20. Effectiveness and tolerability of second-line treatment with vildagliptin versus other oral drugs for type 2 diabetes in a real-world setting in the Middle East: results from the EDGE study

    PubMed Central

    Saab, Charles; Al-Saber, Feryal A; Haddad, Jihad; Jallo, Mahir Khalil; Steitieh, Habib; Bader, Giovanni; Ibrahim, Mohamed

    2015-01-01

    Background Type 2 diabetes mellitus (T2DM) is a chronic progressive disease that requires treatment intensification with antihyperglycemic agents due to progressive deterioration of β-cell function. A large observational study of 45,868 patients with T2DM across 27 countries (EDGE) assessed the effectiveness and safety of vildagliptin as add-on to other oral antidiabetic drugs (OADs) versus other comparator OAD combinations. Here, we present results from the Middle East countries (Bahrain, Jordan, Kuwait, Lebanon, Oman, Palestine, and the United Arab Emirates). Methods Patients inadequately controlled with OAD monotherapy were eligible after the add-on treatment was chosen by the physician based on clinical judgment and patient need. Patients were assigned to either vildagliptin or comparator OADs (sulfonylureas, thiazolidinediones, glinides, α-glucosidase inhibitors, or metformin, except incretin-based therapies) based on the add-on therapy. The primary endpoint was the proportion of patients achieving a glycated hemoglobin (HbA1c) reduction of >0.3% without peripheral edema, hypoglycemia, discontinuation due to a gastrointestinal event, or weight gain ≥5%. One of the secondary endpoints was the proportion of patients achieving HbA1c <7% without hypoglycemia or weight gain. Change in HbA1c from baseline to study endpoint and safety were also assessed. Results Of the 4,780 patients enrolled in the Middle East, 2,513 received vildagliptin and 2,267 received other OADs. Overall, the mean (± standard deviation) age at baseline was 52.1±10.2 years, mean HbA1c was 8.5%±1.3%, and mean T2DM duration was 4.2±4.0 years. The proportion of patients achieving the primary (76.1% versus 61.6%, P<0.0001) and secondary (54.8% versus 29.9%, P<0.0001) endpoints was higher with vildagliptin than with the comparator OADs. 
The unadjusted odds ratios for the primary and secondary endpoints were 1.98 (95% confidence interval 1.75–2.25) and 2.8 (95% confidence interval 2.5–3.2), respectively, in favor of vildagliptin. Vildagliptin achieved a numerically greater reduction in HbA1c (1.7%) from baseline versus comparator OADs (1.4%). The overall incidence of adverse events was comparable between studied cohorts. Conclusion In real life, treatment with vildagliptin was associated with a higher proportion of patients with T2DM achieving better glycemic control without tolerability issues in the Middle East. PMID:25750538
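    The reported unadjusted odds ratios can be reproduced directly from the abstract's proportions and cohort sizes; the event counts below are reconstructed by rounding the proportions, and the Woolf log-odds standard error recovers the published 95% confidence interval.

```python
import math

# Figures taken from the abstract: cohort sizes and primary-endpoint responders
n_v, n_c = 2513, 2267        # vildagliptin, comparator OADs
p_v, p_c = 0.761, 0.616      # proportions achieving the primary endpoint

# Unadjusted odds ratio from the proportions
odds_ratio = (p_v / (1 - p_v)) / (p_c / (1 - p_c))

# 95% CI on the log-odds scale, using counts reconstructed from the proportions
a = round(p_v * n_v); b = n_v - a    # vildagliptin: responders / non-responders
c = round(p_c * n_c); d = n_c - c    # comparator: responders / non-responders
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
```

    This yields an odds ratio of about 1.98 with a 95% CI of roughly 1.75 to 2.25, matching the values stated in the record.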

  1. Effectiveness and tolerability of second-line treatment with vildagliptin versus other oral drugs for type 2 diabetes in a real-world setting in the Middle East: results from the EDGE study.

    PubMed

    Saab, Charles; Al-Saber, Feryal A; Haddad, Jihad; Jallo, Mahir Khalil; Steitieh, Habib; Bader, Giovanni; Ibrahim, Mohamed

    2015-01-01

    Type 2 diabetes mellitus (T2DM) is a chronic progressive disease that requires treatment intensification with antihyperglycemic agents due to progressive deterioration of β-cell function. A large observational study of 45,868 patients with T2DM across 27 countries (EDGE) assessed the effectiveness and safety of vildagliptin as add-on to other oral antidiabetic drugs (OADs) versus other comparator OAD combinations. Here, we present results from the Middle East countries (Bahrain, Jordan, Kuwait, Lebanon, Oman, Palestine, and the United Arab Emirates). Patients inadequately controlled with OAD monotherapy were eligible after the add-on treatment was chosen by the physician based on clinical judgment and patient need. Patients were assigned to either vildagliptin or comparator OADs (sulfonylureas, thiazolidinediones, glinides, α-glucosidase inhibitors, or metformin, except incretin-based therapies) based on the add-on therapy. The primary endpoint was the proportion of patients achieving a glycated hemoglobin (HbA1c) reduction of >0.3% without peripheral edema, hypoglycemia, discontinuation due to a gastrointestinal event, or weight gain ≥5%. One of the secondary endpoints was the proportion of patients achieving HbA1c <7% without hypoglycemia or weight gain. Change in HbA1c from baseline to study endpoint and safety were also assessed. Of the 4,780 patients enrolled in the Middle East, 2,513 received vildagliptin and 2,267 received other OADs. Overall, the mean (± standard deviation) age at baseline was 52.1±10.2 years, mean HbA1c was 8.5%±1.3%, and mean T2DM duration was 4.2±4.0 years. The proportion of patients achieving the primary (76.1% versus 61.6%, P<0.0001) and secondary (54.8% versus 29.9%, P<0.0001) endpoints was higher with vildagliptin than with the comparator OADs. 
The unadjusted odds ratios for the primary and secondary endpoints were 1.98 (95% confidence interval 1.75-2.25) and 2.8 (95% confidence interval 2.5-3.2), respectively, in favor of vildagliptin. Vildagliptin achieved a numerically greater reduction in HbA1c (1.7%) from baseline versus comparator OADs (1.4%). The overall incidence of adverse events was comparable between studied cohorts. In real life, treatment with vildagliptin was associated with a higher proportion of patients with T2DM achieving better glycemic control without tolerability issues in the Middle East.

  2. Hybrid Architectural Framework for C4ISR and Discrete-Event Simulation (DES) to Support Sensor-Driven Model Synthesis in Real-World Scenarios

    DTIC Science & Technology

    2013-09-01

    which utilizes FTA and then loads it into a DES engine to generate simulation results... Figure 21. This simulation architecture is... While Discrete Event Simulation (DES) can provide accurate time estimation and fast simulation speed, models utilizing it often suffer... C4ISR progress in MDW is developed in this research to demonstrate the feasibility of AEMF-DES and explore its potential. The simulation (MDSIM

  3. A simulation model for forecasting downhill ski participation

    Treesearch

    Daniel J. Stynes; Daniel M. Spotts

    1980-01-01

    The purpose of this paper is to describe progress in the development of a general computer simulation model to forecast future levels of outdoor recreation participation. The model is applied and tested for downhill skiing in Michigan.

  4. Progressive Fracture of Laminated Fiber-Reinforced Composite Stiffened Plate Under Pressure

    NASA Technical Reports Server (NTRS)

    Gotsis, Pascalis K.; Abdi, Frank; Chamis, Christos C.; Tsouros, Konstantinos

    2007-01-01

    S-Glass/epoxy laminated fiber-reinforced composite stiffened plate structure with laminate configuration (0/90)5 was simulated to investigate damage and fracture progression, under uniform pressure. For comparison reasons a simple plate was examined, in addition with the stiffened plate. An integrated computer code was used for the simulation. The damage initiation began with matrix failure in tension, continuous with damage and/or fracture progression as a result of additional matrix failure and fiber fracture and followed by additional interply delamination. Fracture through the thickness began when the damage accumulation was 90%. After that stage, the cracks propagate rapidly and the structures collapse. The collapse load for the simple plate is 21.57 MPa (3120 psi) and for the stiffened plate 25.24 MPa (3660 psi).

  5. Skin dose saving of the staff in 90Y/177Lu peptide receptor radionuclide therapy with the automatic dose dispenser.

    PubMed

    Fioroni, Federica; Grassi, Elisa; Giorgia, Cavatorta; Sara, Rubagotti; Piccagli, Vando; Filice, Angelina; Mostacci, Domiziano; Versari, Annibale; Iori, Mauro

    2016-10-01

    When handling 90Y-labelled and 177Lu-labelled radiopharmaceuticals, skin exposure is mainly due to β-particles. This study aimed to investigate the equivalent dose saving of the staff when changing from an essentially manual radiolabelling procedure to an automatic dose dispenser (ADD). The chemist and physician were asked to wear thermoluminescence dosimeters on their fingertips to evaluate the quantity Hp(0.07) on the skin. Data collected were divided into two groups: before introducing ADD (no ADD) and after introducing ADD. For the chemist, the mean values (95th percentile) of Hp(0.07) for no ADD and ADD are 0.030 (0.099) and 0.019 (0.076) mSv/GBq, respectively, for 90Y, and 0.022 (0.037) and 0.007 (0.023) mSv/GBq, respectively, for 177Lu. The reduction for ADD was significant (t-test with P<0.05) for both isotopes. The relative differences before and after ADD collected for every finger were treated using the Wilcoxon test, proving a significantly higher reduction in extremity dose to each fingertip for 177Lu than for 90Y (P<0.05). For the medical staff, the mean values of Hp(0.07) (95th percentile) for no ADD and ADD are 0.021 (0.0762) and 0.0143 (0.0565) mSv/GBq, respectively, for 90Y, and 0.0011 (0.00196) and 0.0009 (0.00263) mSv/GBq, respectively, for 177Lu. The t-test provided a P-value less than 0.05 for both isotopes, making the difference between ADD and no ADD significant. ADD positively affects the dose saving of the chemist in handling both isotopes. For the medical staff not directly involved with the introduction of the ADD system, the analysis shows a learning curve of the workers over a 5-year period. Specific devices and procedures allow staff skin dose to be limited.

  6. Regulatory role of glycogen synthase kinase 3 for transcriptional activity of ADD1/SREBP1c.

    PubMed

    Kim, Kang Ho; Song, Min Jeong; Yoo, Eung Jae; Choe, Sung Sik; Park, Sang Dai; Kim, Jae Bum

    2004-12-10

    Adipocyte determination- and differentiation-dependent factor 1 (ADD1) plays important roles in lipid metabolism and insulin-dependent gene expression. Because insulin stimulates carbohydrate and lipid synthesis, it would be important to decipher how the transcriptional activity of ADD1/SREBP1c is regulated in the insulin signaling pathway. In this study, we demonstrated that glycogen synthase kinase (GSK)-3 negatively regulates the transcriptional activity of ADD1/SREBP1c. GSK3 inhibitors enhanced a transcriptional activity of ADD1/SREBP1c and expression of ADD1/SREBP1c target genes including fatty acid synthase (FAS), acetyl-CoA carboxylase 1 (ACC1), and stearoyl-CoA desaturase 1 (SCD1) in adipocytes and hepatocytes. In contrast, overexpression of GSK3beta down-regulated the transcriptional activity of ADD1/SREBP1c. GSK3 inhibitor-mediated ADD1/SREBP1c target gene activation did not require de novo protein synthesis, implying that GSK3 might affect transcriptional activity of ADD1/SREBP1c at the level of post-translational modification. Additionally, we demonstrated that GSK3 efficiently phosphorylated ADD1/SREBP1c in vitro and in vivo. Therefore, these data suggest that GSK3 inactivation is crucial to confer stimulated transcriptional activity of ADD1/SREBP1c for insulin-dependent gene expression, which would coordinate lipid and glucose metabolism.

  7. An Innovative and Successful Simulation Day.

    PubMed

    Bowling, Ann M; Eismann, Michelle

    This article discusses the development of a creative and innovative plan to incorporate independent activities, including skill reviews and scenarios, into a single eight-hour day, using small student groups to enhance the learning process for pediatric nursing students. The simulation day consists of skills activities and pediatric simulation scenarios using the human patient simulator. Using small student groups in simulation captures the students' attention and enhances motivation to learn. The simulation day is a work in progress; appropriate changes are continually being made to improve the simulation experience for students.

  8. Population Simulation, AKA: Grahz, Rahbitz and Fawkzes

    NASA Technical Reports Server (NTRS)

    Bangert, Tyler R.

    2008-01-01

    In an effort to give students a more visceral experience of science and instill a deeper working knowledge of concepts, activities that utilize hands-on, laboratory and simulated experiences are recommended because these activities have a greater impact on student learning, especially for Native American students. Because it is not usually feasible to take large and/or multiple classes of high school science students into the field to count numbers of organisms of a particular species, especially over a long period of time and covering a large area of an environment, the population simulation presented in this paper was created to aid students in understanding population dynamics by working with a simulated environment, which can be done in the classroom. Students create an environment and populate the environment with imaginary species. Then, using a sequence of "rules" that allow organisms to eat, reproduce, move and age, students see how the population of a species changes over time. In particular, students practice collecting data, summarizing information, plotting graphs, and interpreting graphs for such information as carrying capacity, predator-prey relationships, and how specific species factors impact population and the environment. Students draw conclusions from their results and suggest further research, which may involve changes in simulation parameters, prediction of outcomes, and testing predictions. The population simulation has demonstrated success in the above student activities using a "board game" version of the population simulation. A computer version of the population simulation needs more testing, but preliminary runs are promising. A second, more complicated computer simulation will simulate the same things and will add simulated population genetics.
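    A continuous-rate analogue of such eat/reproduce/age rules can be sketched as a logistic prey population with a predator, stepped with explicit Euler integration. All coefficients below are invented for illustration and differ from the classroom game's discrete rules.

```python
# Prey ("rabbits") grow logistically toward a carrying capacity and are eaten
# by predators ("foxes"); foxes grow by eating and otherwise die off.
# A small time step keeps the explicit Euler update well-behaved.
r, f, dt = 100.0, 20.0, 0.1
history = [(r, f)]
for _ in range(300):
    dr = 0.1 * r * (1 - r / 500.0) - 0.005 * r * f   # logistic growth - predation
    df = 0.001 * r * f - 0.05 * f                    # conversion of prey - mortality
    r, f = r + dt * dr, f + dt * df
    history.append((r, f))
```

    Plotting `history` shows the carrying-capacity and predator-prey behavior the students read off their graphs: prey are capped near the carrying capacity, and the two populations settle toward a coexistence equilibrium.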

  9. A priori and a posteriori analyses of the flamelet/progress variable approach for supersonic combustion

    NASA Astrophysics Data System (ADS)

    Saghafian, Amirreza; Pitsch, Heinz

    2012-11-01

    A compressible flamelet/progress variable approach (CFPV) has been devised for high-speed flows. Temperature is computed from the transported total energy and tabulated species mass fractions, and the source term of the progress variable is rescaled with pressure and temperature. The combustion is thus modeled by three additional scalar equations and a chemistry table that is computed in a pre-processing step. Three-dimensional direct numerical simulation (DNS) databases of a reacting supersonic turbulent mixing layer with detailed chemistry are analyzed to assess the underlying assumptions of CFPV. Large eddy simulations (LES) of the same configuration using the CFPV method have been performed and compared with the DNS results. The LES computations are based on presumed subgrid PDFs of mixture fraction and progress variable, a beta function and a delta function respectively, which are assessed using the DNS databases. The flamelet equation budget is also computed to verify the validity of the CFPV method for high-speed flows.
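    The presumed beta-PDF step can be sketched as follows: given the filtered mean and variance of the mixture fraction, a tabulated flamelet quantity is averaged against the corresponding beta distribution. The table function and moment values here are hypothetical stand-ins, not actual flamelet data.

```python
import math

def beta_pdf(z, a, b):
    # Beta(a, b) density via the gamma function
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return z ** (a - 1) * (1 - z) ** (b - 1) / B

def presumed_pdf_average(table, z_mean, z_var, n=2000):
    # Average a tabulated quantity table(Z) under a beta PDF whose shape
    # parameters are fit to the given mean and variance of mixture fraction.
    gamma = z_mean * (1 - z_mean) / z_var - 1.0
    a, b = z_mean * gamma, (1 - z_mean) * gamma
    num = den = 0.0
    for i in range(n):               # midpoint quadrature on (0, 1)
        z = (i + 0.5) / n
        w = beta_pdf(z, a, b)
        num += table(z) * w
        den += w
    return num / den                 # normalization cancels the grid spacing
```

    As a sanity check, averaging the identity table(Z) = Z must return the mean mixture fraction itself.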

  10. Modeling Geometry and Progressive Failure of Material Interfaces in Plain Weave Composites

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Cheng, Ron-Bin

    2010-01-01

    A procedure combining a geometrically nonlinear, explicit-dynamics contact analysis, computer aided design techniques, and elasticity-based mesh adjustment is proposed to efficiently generate realistic finite element models for meso-mechanical analysis of progressive failure in textile composites. In the procedure, the geometry of fiber tows is obtained by imposing a fictitious expansion on the tows. Meshes resulting from the procedure are conformal with the computed tow-tow and tow-matrix interfaces but are incongruent at the interfaces. The mesh interfaces are treated as cohesive contact surfaces not only to resolve the incongruence but also to simulate progressive failure. The method is employed to simulate debonding at the material interfaces in a ceramic-matrix plain weave composite with matrix porosity and in a polymeric matrix plain weave composite without matrix porosity, both subject to uniaxial cyclic loading. The numerical results indicate progression of the interfacial damage during every loading and reverse loading event in a constant strain amplitude cyclic process. However, the composites show different patterns of damage advancement.

  11. On disciplinary fragmentation and scientific progress.

    PubMed

    Balietti, Stefano; Mäs, Michael; Helbing, Dirk

    2015-01-01

    Why are some scientific disciplines, such as sociology and psychology, more fragmented into conflicting schools of thought than other fields, such as physics and biology? Furthermore, why does high fragmentation tend to coincide with limited scientific progress? We analyzed a formal model where scientists seek to identify the correct answer to a research question. Each scientist is influenced by three forces: (i) signals received from the correct answer to the question; (ii) peer influence; and (iii) noise. We observed the emergence of different macroscopic patterns of collective exploration, and studied how the three forces affect the degree to which disciplines fall apart into divergent fragments, or so-called "schools of thought". We conducted two simulation experiments where we tested (A) whether the three forces foster or hamper progress, and (B) whether disciplinary fragmentation causally affects scientific progress and vice versa. We found that fragmentation critically limits scientific progress. Strikingly, there is no effect in the opposite causal direction. What is more, our results show that at the heart of the mechanisms driving scientific progress we find (i) social interactions, and (ii) peer disagreement. In fact, fragmentation is increased and progress limited if the simulated scientists are open to influence only by peers with very similar views, or when within-school diversity is lost. Finally, disciplines where the scientists received strong signals from the correct answer were less fragmented and experienced faster progress. We discuss the model's implications for the design of social institutions fostering interdisciplinarity and participation in science.
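    The three forces can be illustrated with a toy agent-based sketch: each agent's opinion is pulled toward the correct answer (signal), toward peers within a bounded-confidence window (peer influence), and perturbed by noise. The update weights, confidence threshold, and noise level are illustrative assumptions, not the paper's calibration.

```python
import random
random.seed(42)

N, STEPS, TRUTH = 50, 100, 0.7
opinions = [random.random() for _ in range(N)]

for _ in range(STEPS):
    updated = []
    for o in opinions:
        # peers within the bounded-confidence window (always includes self)
        peers = [p for p in opinions if abs(p - o) < 0.2]
        peer_mean = sum(peers) / len(peers)
        o += 0.2 * (TRUTH - o)          # (i) signal from the correct answer
        o += 0.1 * (peer_mean - o)      # (ii) peer influence
        o += random.gauss(0, 0.01)      # (iii) noise
        updated.append(min(1.0, max(0.0, o)))
    opinions = updated
```

    With a strong signal, opinions converge near the truth and the discipline stays unfragmented; shrinking the signal weight or the confidence window in this sketch lets separate clusters ("schools of thought") persist instead.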

  12. On Disciplinary Fragmentation and Scientific Progress

    PubMed Central

    Balietti, Stefano; Mäs, Michael; Helbing, Dirk

    2015-01-01

    Why are some scientific disciplines, such as sociology and psychology, more fragmented into conflicting schools of thought than other fields, such as physics and biology? Furthermore, why does high fragmentation tend to coincide with limited scientific progress? We analyzed a formal model where scientists seek to identify the correct answer to a research question. Each scientist is influenced by three forces: (i) signals received from the correct answer to the question; (ii) peer influence; and (iii) noise. We observed the emergence of different macroscopic patterns of collective exploration, and studied how the three forces affect the degree to which disciplines fall apart into divergent fragments, or so-called “schools of thought”. We conducted two simulation experiments where we tested (A) whether the three forces foster or hamper progress, and (B) whether disciplinary fragmentation causally affects scientific progress and vice versa. We found that fragmentation critically limits scientific progress. Strikingly, there is no effect in the opposite causal direction. What is more, our results show that at the heart of the mechanisms driving scientific progress we find (i) social interactions, and (ii) peer disagreement. In fact, fragmentation is increased and progress limited if the simulated scientists are open to influence only by peers with very similar views, or when within-school diversity is lost. Finally, disciplines where the scientists received strong signals from the correct answer were less fragmented and experienced faster progress. We discuss the model’s implications for the design of social institutions fostering interdisciplinarity and participation in science. PMID:25790025

  13. Adaptive Language Games with Robots

    NASA Astrophysics Data System (ADS)

    Steels, Luc

    2010-11-01

    This paper surveys recent research into language evolution using computer simulations and robotic experiments. This field has made tremendous progress in the past decade going from simple simulations of lexicon formation with animallike cybernetic robots to sophisticated grammatical experiments with humanoid robots.

  14. Biomarkers predict outcome in Charcot-Marie-Tooth disease 1A.

    PubMed

    Fledrich, Robert; Mannil, Manoj; Leha, Andreas; Ehbrecht, Caroline; Solari, Alessandra; Pelayo-Negro, Ana L; Berciano, José; Schlotter-Weigel, Beate; Schnizer, Tuuli J; Prukop, Thomas; Garcia-Angarita, Natalia; Czesnik, Dirk; Haberlová, Jana; Mazanec, Radim; Paulus, Walter; Beissbarth, Tim; Walter, Maggie C; CMT-TRIAAL; Hogrel, Jean-Yves; Dubourg, Odile; Schenone, Angelo; Baets, Jonathan; De Jonghe, Peter; Shy, Michael E; Horvath, Rita; Pareyson, Davide; Seeman, Pavel; Young, Peter; Sereda, Michael W

    2017-11-01

    Charcot-Marie-Tooth disease type 1A (CMT1A) is the most common inherited neuropathy, a debilitating disease without known cure. Among patients with CMT1A, disease manifestation, progression and severity are strikingly variable, which poses major challenges for the development of new therapies. Hence, there is a strong need for sensitive outcome measures such as disease and progression biomarkers, which would add powerful tools to monitor therapeutic effects in CMT1A. We established a pan-European and American consortium comprising nine clinical centres including 311 patients with CMT1A in total. From all patients, the CMT neuropathy score and secondary outcome measures were obtained and a skin biopsy collected. In order to assess and validate disease severity and progression biomarkers, we performed qPCR on a set of 16 animal model-derived potential biomarkers in skin biopsy mRNA extracts. In 266 patients with CMT1A, a cluster of eight cutaneous transcripts differentiates disease severity with a sensitivity and specificity of 90% and 76.1%, respectively. In an additional cohort of 45 patients with CMT1A, from whom a second skin biopsy was taken after 2-3 years, the cutaneous mRNA expression of GSTT2, CTSA, PPARG, CDA, ENPP1 and NRG1-I is changing over time and correlates with disease progression. In summary, we provide evidence that cutaneous transcripts in patients with CMT1A serve as disease severity and progression biomarkers and, if implemented into clinical trials, they could markedly accelerate the development of a therapy for CMT1A. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. Disparities in child mortality trends: what is the evidence from disadvantaged states in India? the case of Orissa and Madhya Pradesh.

    PubMed

    Nguyen, Kim-Huong; Jimenez-Soto, Eliana; Dayal, Prarthna; Hodge, Andrew

    2013-06-27

    The Millennium Development Goals prompted renewed international efforts to reduce under-five mortality and measure national progress. However, scant evidence exists about the distribution of child mortality at low sub-national levels, which in diverse and decentralized countries like India are required to inform policy-making. This study estimates changes in child mortality across a range of markers of inequalities in Orissa and Madhya Pradesh, two of India's largest, poorest, and most disadvantaged states. Estimates of under-five and neonatal mortality rates were computed using seven datasets from three available sources--sample registration system, summary birth histories in surveys, and complete birth histories. Inequalities were gauged by comparison of mortality rates within four sub-state populations defined by the following characteristics: rural-urban location, ethnicity, wealth, and district. Trend estimates suggest that progress has been made in mortality rates at the state levels. However, reduction rates have been modest, particularly for neonatal mortality. Different mortality rates are observed across all the equity markers, although there is a pattern of convergence between rural and urban areas, largely due to inadequate progress in urban settings. Inter-district disparities and differences between socioeconomic groups are also evident. Although child mortality rates continue to decline at the national level, our evidence shows that considerable disparities persist. While progress in reducing under-five and neonatal mortality rates in urban areas appears to be levelling off, policies targeting rural populations and scheduled caste and tribe groups appear to have achieved some success in reducing mortality differentials. The results of this study thus add weight to recent government initiatives targeting these groups. 
Equitable progress, particularly for neonatal mortality, requires continuing efforts to strengthen health systems and overcome barriers to identify and reach vulnerable groups.

  16. Who pays for healthcare in Bangladesh? An analysis of progressivity in health systems financing.

    PubMed

    Molla, Azaher Ali; Chi, Chunhuei

    2017-09-06

The relationship between payments towards healthcare and ability to pay is a measure of financial fairness. Analysis of progressivity is important from an equity perspective as well as for macroeconomic and political analysis of healthcare systems. Bangladesh health systems financing is characterized by high out-of-pocket payments (63.3%), and that share is increasing. Hence, we aimed to see who pays what part of this high out-of-pocket expenditure. To our knowledge, this was the first progressivity analysis of health systems financing in Bangladesh. We used data from the Bangladesh Household Income and Expenditure Survey, 2010, a cross-sectional and nationally representative sample of 12,240 households consisting of 55,580 individuals. For quantification of progressivity, we adopted the 'ability-to-pay' principle developed by O'Donnell, van Doorslaer, Wagstaff, and Lindelow (2008). We used the Kakwani index to measure the magnitude of progressivity. Health systems financing in Bangladesh is regressive. Inequality increases due to healthcare payments. The Kakwani index (the difference between the concentration index of payments and the Gini coefficient of income) is negative for all sources of finance, which indicates regressivity and that the financing burden is more concentrated among the poor. Income inequality increases due to high out-of-pocket payments. The increase in income inequality caused by out-of-pocket payments is 89% due to the negative vertical effect and 11% due to horizontal inequity. Our findings add substantial evidence of the impact of health systems financing on the inequitable burden of healthcare costs and on income. The heavy reliance on out-of-pocket payments may affect household living standards. If the government and people of Bangladesh are concerned about an equitable financing burden, our study suggests that Bangladesh needs to reform its health systems financing scheme.
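The Kakwani index used in this record is the concentration index of healthcare payments minus the Gini coefficient of income; a negative value indicates regressive financing. Below is a minimal sketch using the rank-based formula, with made-up illustrative data rather than the survey values:

```python
import numpy as np

def gini(income):
    """Gini coefficient of income (rank formula, income sorted ascending)."""
    y = np.sort(np.asarray(income, dtype=float))
    n = len(y)
    ranks = np.arange(1, n + 1)
    return 2 * np.sum(ranks * y) / (n * y.sum()) - (n + 1) / n

def concentration_index(payments, income):
    """Concentration index of payments, with households ranked by income."""
    p = np.asarray(payments, dtype=float)[np.argsort(income)]
    n = len(p)
    ranks = np.arange(1, n + 1)
    return 2 * np.sum(ranks * p) / (n * p.sum()) - (n + 1) / n

def kakwani(payments, income):
    """Kakwani index: C_payments - G_income. Negative => regressive."""
    return concentration_index(payments, income) - gini(income)
```

For payments proportional to income the index is zero; a flat per-household payment yields a negative index, the regressive pattern the study reports for out-of-pocket spending.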

  17. Project Assessment Skills Web Application

    NASA Technical Reports Server (NTRS)

    Goff, Samuel J.

    2013-01-01

The purpose of this project is to use Ruby on Rails to create a web application that will replace a spreadsheet used to keep track of training courses and tasks. The goal is a fast and easy-to-use web application that lets users track and update their progress on all of the training required of them. The training courses will be organized by group and by user, improving readability, and will also let group leads and administrators see how everyone is progressing in training. Currently, updating and finding information in this spreadsheet is a long and tedious task. With a web application, finding and updating information, as well as adding new training courses and tasks, will be much easier. Accessing the data will also be simpler: users just go to a website and log in with NDC credentials rather than requesting the relevant spreadsheet from its holder. In addition to Ruby on Rails, I will be using JavaScript, CSS, and jQuery to add functionality and ease of use to the application. The application will include a number of features that help update and track training progress. For example, one feature will track the progress of a whole group of users, to see how the group as a whole is progressing. Another will assign tasks to either a user or a group of users. Together these will create a user-friendly and functional web application.

  18. Predicted accommodative response from image quality in young eyes fitted with different dual-focus designs.

    PubMed

    Faria-Ribeiro, Miguel; Amorim-de-Sousa, Ana; González-Méijome, José M

    2018-05-01

To investigate the separate and combined influences of inner zone (IZ) diameter and effective add power of dual-focus contact lenses (CL) on image quality at distance and near viewing, in a functional accommodating model eye. Computational wave-optics methods were used to define zonal bifocal pupil functions, representing the optic zones of nine dual-focus centre-distance CLs. The dual-focus pupil functions were defined having IZ diameters of 2.10 mm, 3.36 mm and 4.00 mm, with add powers of 1.5 D, 2.0 D and 2.5 D (dioptres), for each design, that resulted in a ratio of 64%/36% between the distance and treatment zone areas, bounded by a 6 mm entrance pupil. A through-focus routine was implemented in MATLAB to simulate the changes in image quality, calculated from the Visual Strehl ratio, as the eye with the dual-focus accommodates, from 0 to -3.00 D target vergences. Accommodative responses were defined as the changes in the defocus coefficient, combined with a change in fourth and sixth order spherical aberration, which produced a peak in image quality at each target vergence. Distance viewing image quality was marginally affected by IZ diameter but not by add power. Near image quality obtained when focussing the image formed by the near optics was only slightly higher for the small IZ diameter than for the other two IZ diameters. The mean ± standard deviation values obtained with the three adds were 0.28 ± 0.02, 0.23 ± 0.02 and 0.22 ± 0.02, for the small, medium and larger IZ diameters, respectively. On the other hand, near image quality predicted by focussing the image formed by the distance optics was considerably lower for the small IZ diameter relative to the other two IZ diameters. The mean ± standard deviation values obtained with the three adds were 0.15 ± 0.01, 0.38 ± 0.00 and 0.54 ± 0.01, for the small, medium and larger IZ diameters, respectively.
During near viewing through dual-focus CLs, image quality depends on the diameter of the most inner zone of the CL, while add power only affects the range of clear focus when focussing the image formed by the CL near optics. When only image quality gain is taken into consideration, medium and large IZ diameters designs are most likely to promote normal accommodative responses driven by the CL distance optics, while a smaller IZ diameter design is most likely to promote a reduced accommodative response driven by the dual-focus CL near optics. © 2018 The Authors Ophthalmic & Physiological Optics © 2018 The College of Optometrists.

  19. KSC-2013-4388

    NASA Image and Video Library

    2013-12-13

    CAPE CANAVERAL, Fla. – At a training location near Launch Complex 39 at NASA’s Kennedy Space Center in Florida, members of the Emergency Response Team, or ERT, participate in specialized training simulations in order to keep their skills current. They are wearing full protective gear and carrying non-lethal firearms, which are denoted in blue, for the training exercises. Often, ERT leadership serves as simulated suspects to add realism to the training exercises. Recently, eight members of the ERT competed in the 31st Annual SWAT Roundup International competition in Orlando, Fla., and received recognition by placing in the top five overall. In keeping with NASA’s commitment to safety and security of workforce and assets, the ERT is part of Kennedy’s security team and is trained to respond in the event of an emergency at the center. Photo credit: NASA/Daniel Casper

  20. A multi-scale residual-based anti-hourglass control for compatible staggered Lagrangian hydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kucharik, M.; Scovazzi, Guglielmo; Shashkov, Mikhail Jurievich

Hourglassing is a well-known pathological numerical artifact affecting the robustness and accuracy of Lagrangian methods. There exist a large number of hourglass control/suppression strategies. In the community of staggered compatible Lagrangian methods, the approach of sub-zonal pressure forces is among the most widely used. However, this approach is known to add numerical strength to the solution, which can cause problems in certain types of simulations, for instance simulations of various instabilities. To avoid this complication, we have adapted the multi-scale residual-based stabilization typically used in the finite element approach to the staggered compatible framework. In this study, we describe two discretizations of the new approach, demonstrate their properties, and compare them with the method of sub-zonal pressure forces on selected numerical problems.

  1. Addition of simultaneous heat and solute transport and variable fluid viscosity to SEAWAT

    USGS Publications Warehouse

    Thorne, D.; Langevin, C.D.; Sukop, M.C.

    2006-01-01

SEAWAT is a finite-difference computer code designed to simulate coupled variable-density ground water flow and solute transport. This paper describes a new version of SEAWAT that adds the ability to simultaneously model energy and solute transport, which is necessary, for example, for simulating the transport of heat and salinity in coastal aquifers. This work extends the equation of state for fluid density to vary as a function of temperature and/or solute concentration. The program has also been modified to represent the effects of variable fluid viscosity as a function of temperature and/or concentration. The viscosity mechanism is verified against an analytical solution, and a test of temperature-dependent viscosity is provided. Finally, the classic Henry-Hilleke problem is solved with the new code. © 2006 Elsevier Ltd. All rights reserved.
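The extended equation of state can be sketched as a linear density law in concentration and temperature plus an empirical temperature-viscosity relation. The coefficients below are illustrative defaults (in SEAWAT they are user-supplied inputs), so this is a sketch of the idea, not the code's actual implementation:

```python
def fluid_density(conc, temp, rho0=1000.0, t0=25.0,
                  drho_dc=0.7, drho_dt=-0.375):
    """Linear equation of state: density (kg/m^3) varies with solute
    concentration (kg/m^3) and temperature (deg C). Slopes are
    illustrative values of the kind a user would supply."""
    return rho0 + drho_dc * conc + drho_dt * (temp - t0)

def fluid_viscosity(temp, conc, dmu_dc=1.923e-6):
    """Dynamic viscosity (kg/m/s): an empirical temperature relation for
    water plus an assumed linear solute term (illustrative)."""
    mu_t = 2.394e-5 * 10.0 ** (248.37 / (temp + 133.15))
    return mu_t + dmu_dc * conc
```

At 20 °C and zero concentration the temperature relation returns about 1.002e-3 kg/m/s, the familiar viscosity of fresh water, and viscosity falls as temperature rises, which is what drives the flow-field effects the paper tests.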

  2. A Novel Approach to Realize of All Optical Frequency Encoded Dibit Based XOR and XNOR Logic Gates Using Optical Switches with Simulated Verification

    NASA Astrophysics Data System (ADS)

    Ghosh, B.; Hazra, S.; Haldar, N.; Roy, D.; Patra, S. N.; Swarnakar, J.; Sarkar, P. P.; Mukhopadhyay, S.

    2018-03-01

Over the last few decades, optics has demonstrated strong potential for parallel logic, arithmetic and algebraic operations owing to its very high speed in communication and computation. Many different logical and sequential operations using all-optical frequency encoding techniques have been proposed by several authors. Here, we adopt the all-optical dibit representation technique, which offers high-speed operation and reduces the bit-error problem. Exploiting this, we propose all-optical frequency-encoded dibit-based XOR and XNOR logic gates using optical switches, namely the add/drop multiplexer (ADM) and the reflective semiconductor optical amplifier (RSOA). The operations of these gates have been verified through simulation using MATLAB (R2008a).

  3. Design of Improved Arithmetic Logic Unit in Quantum-Dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Heikalabad, Saeed Rasouli; Gadim, Mahya Rahimpour

    2018-06-01

Quantum-dot cellular automata (QCA) can replace CMOS to overcome the limitations of that technology. An arithmetic logic unit (ALU) is a basic building block of any computing device. In this paper, the design of an improved single-bit arithmetic logic unit in quantum-dot cellular automata is presented. The proposed ALU structure supports AND, OR, XOR and ADD operations. A unique 2:1 multiplexer, an ultra-efficient two-input XOR and a low-complexity full adder are used in the proposed structure. An extended design of this structure for a two-bit ALU is also provided. The proposed ALU structure is simulated with QCADesigner and the simulation results are evaluated. Evaluation results show that the proposed design has the best performance in terms of area, complexity and delay compared to previous designs.

  4. Toward textbook multigrid efficiency for fully implicit resistive magnetohydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Mark F.; Samtaney, Ravi, E-mail: samtaney@pppl.go; Brandt, Achi

    2010-09-01

Multigrid methods can solve some classes of elliptic and parabolic equations to accuracy below the truncation error with a work-cost equivalent to a few residual calculations - so-called 'textbook' multigrid efficiency. We investigate methods to solve the system of equations that arise in time dependent magnetohydrodynamics (MHD) simulations with textbook multigrid efficiency. We apply multigrid techniques such as geometric interpolation, full approximate storage, Gauss-Seidel smoothers, and defect correction for fully implicit, nonlinear, second-order finite volume discretizations of MHD. We apply these methods to a standard resistive MHD benchmark problem, the GEM reconnection problem, and add a strong magnetic guide field, which is a critical characteristic of magnetically confined fusion plasmas. We show that our multigrid methods can achieve near textbook efficiency on fully implicit resistive MHD simulations.
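The "textbook efficiency" idea — a few cheap smoothing sweeps per level plus a coarse-grid correction — can be illustrated on a 1D Poisson model problem rather than the authors' implicit MHD system. The sketch below is a minimal geometric multigrid V-cycle with Gauss-Seidel smoothing, full-weighting restriction and linear prolongation, assuming homogeneous Dirichlet boundaries:

```python
import numpy as np

def gauss_seidel(u, f, h, sweeps):
    """Lexicographic Gauss-Seidel sweeps for -u'' = f, Dirichlet BCs."""
    for _ in range(sweeps):
        for i in range(1, len(u) - 1):
            u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / h**2
    return r

def restrict(r):
    """Full-weighting restriction to the next coarser grid."""
    n = len(r) - 1
    rc = np.zeros(n // 2 + 1)
    rc[1:-1] = 0.25 * r[1:n - 1:2] + 0.5 * r[2:n:2] + 0.25 * r[3:n + 1:2]
    return rc

def prolong(ec):
    """Linear-interpolation prolongation to the next finer grid."""
    nc = len(ec) - 1
    ef = np.zeros(2 * nc + 1)
    ef[::2] = ec
    ef[1::2] = 0.5 * (ec[:-1] + ec[1:])
    return ef

def v_cycle(u, f, h, pre=2, post=2):
    """One multigrid V-cycle; recurses until a 3-point grid remains."""
    if len(u) <= 3:                      # coarsest level: solve exactly
        u[1] = 0.5 * (u[0] + u[2] + h * h * f[1])
        return u
    gauss_seidel(u, f, h, pre)           # pre-smoothing
    rc = restrict(residual(u, f, h))     # restrict the residual
    ec = v_cycle(np.zeros_like(rc), rc, 2 * h, pre, post)
    u += prolong(ec)                     # coarse-grid correction
    gauss_seidel(u, f, h, post)          # post-smoothing
    return u
```

A handful of V-cycles drives the residual down by many orders of magnitude at O(N) cost per cycle, which is the behavior the "textbook efficiency" label refers to.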

  5. Toward textbook multigrid efficiency for fully implicit resistive magnetohydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Mark F.; Samtaney, Ravi; Brandt, Achi

    2010-09-01

Multigrid methods can solve some classes of elliptic and parabolic equations to accuracy below the truncation error with a work-cost equivalent to a few residual calculations - so-called "textbook" multigrid efficiency. We investigate methods to solve the system of equations that arise in time dependent magnetohydrodynamics (MHD) simulations with textbook multigrid efficiency. We apply multigrid techniques such as geometric interpolation, full approximate storage, Gauss-Seidel smoothers, and defect correction for fully implicit, nonlinear, second-order finite volume discretizations of MHD. We apply these methods to a standard resistive MHD benchmark problem, the GEM reconnection problem, and add a strong magnetic guide field, which is a critical characteristic of magnetically confined fusion plasmas. We show that our multigrid methods can achieve near textbook efficiency on fully implicit resistive MHD simulations.

  6. Toward textbook multigrid efficiency for fully implicit resistive magnetohydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Mark F.; Samtaney, Ravi; Brandt, Achi

    2013-12-14

Multigrid methods can solve some classes of elliptic and parabolic equations to accuracy below the truncation error with a work-cost equivalent to a few residual calculations - so-called "textbook" multigrid efficiency. We investigate methods to solve the system of equations that arise in time dependent magnetohydrodynamics (MHD) simulations with textbook multigrid efficiency. We apply multigrid techniques such as geometric interpolation, full approximate storage, Gauss-Seidel smoothers, and defect correction for fully implicit, nonlinear, second-order finite volume discretizations of MHD. We apply these methods to a standard resistive MHD benchmark problem, the GEM reconnection problem, and add a strong magnetic guide field, which is a critical characteristic of magnetically confined fusion plasmas. We show that our multigrid methods can achieve near textbook efficiency on fully implicit resistive MHD simulations.

  7. Design and Implementation of Hybrid CORDIC Algorithm Based on Phase Rotation Estimation for NCO

    PubMed Central

    Zhang, Chaozhu; Han, Jinan; Li, Ke

    2014-01-01

The numerically controlled oscillator (NCO) has wide application in radar, digital receivers, and software radio systems. This paper first introduces the traditional CORDIC algorithm. Then, in order to improve computing speed and save resources, it proposes a hybrid CORDIC algorithm based on phase rotation estimation for use in an NCO. By estimating the direction of part of the phase rotation, the algorithm reduces the number of phase rotations and add-subtract units, thereby decreasing delay. Furthermore, the numerically controlled oscillator is simulated and implemented with the Quartus II and ModelSim software. Finally, simulation results indicate that the proposed algorithm improves on the traditional CORDIC algorithm in terms of ease of computation, resource utilization, and computing speed/delay while maintaining precision. It is suitable for high-speed, high-precision digital modulation and demodulation. PMID:25110750
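The traditional rotation-mode CORDIC that the paper takes as its baseline computes sine and cosine from a small table of arctangents using only adds and binary shifts. The sketch below models it in floating point for clarity (hardware NCOs use fixed-point shift-add registers), so it illustrates the algorithm rather than the authors' hybrid design:

```python
import math

def cordic_sin_cos(theta, iterations=24):
    """Rotation-mode CORDIC: returns (sin(theta), cos(theta)) for
    |theta| <~ 1.74 rad (the algorithm's natural convergence range)."""
    # Per-iteration micro-rotation angles atan(2^-i), a small ROM table
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    # Aggregate gain correction: product of 1/sqrt(1 + 2^-2i)
    k = 1.0
    for i in range(iterations):
        k /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = k, 0.0, theta        # start on the x-axis, pre-scaled by k
    for i in range(iterations):
        d = 1.0 if z >= 0.0 else -1.0   # rotate toward the residual angle
        # 2^-i scaling is a pure bit-shift in hardware
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return y, x
```

Each iteration adds roughly one bit of precision; the hybrid scheme in the paper estimates the direction of some of these micro-rotations up front to shorten this serial chain.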

  8. Design of Improved Arithmetic Logic Unit in Quantum-Dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Heikalabad, Saeed Rasouli; Gadim, Mahya Rahimpour

    2018-03-01

Quantum-dot cellular automata (QCA) can replace CMOS to overcome the limitations of that technology. An arithmetic logic unit (ALU) is a basic building block of any computing device. In this paper, the design of an improved single-bit arithmetic logic unit in quantum-dot cellular automata is presented. The proposed ALU structure supports AND, OR, XOR and ADD operations. A unique 2:1 multiplexer, an ultra-efficient two-input XOR and a low-complexity full adder are used in the proposed structure. An extended design of this structure for a two-bit ALU is also provided. The proposed ALU structure is simulated with QCADesigner and the simulation results are evaluated. Evaluation results show that the proposed design has the best performance in terms of area, complexity and delay compared to previous designs.

  9. A multi-scale residual-based anti-hourglass control for compatible staggered Lagrangian hydrodynamics

    DOE PAGES

    Kucharik, M.; Scovazzi, Guglielmo; Shashkov, Mikhail Jurievich; ...

    2017-10-28

Hourglassing is a well-known pathological numerical artifact affecting the robustness and accuracy of Lagrangian methods. There exist a large number of hourglass control/suppression strategies. In the community of staggered compatible Lagrangian methods, the approach of sub-zonal pressure forces is among the most widely used. However, this approach is known to add numerical strength to the solution, which can cause problems in certain types of simulations, for instance simulations of various instabilities. To avoid this complication, we have adapted the multi-scale residual-based stabilization typically used in the finite element approach to the staggered compatible framework. In this study, we describe two discretizations of the new approach, demonstrate their properties, and compare them with the method of sub-zonal pressure forces on selected numerical problems.

  10. Autonomous Cryogenics Loading Operations Simulation Software: Knowledgebase Autonomous Test Engineer

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S., Jr.

    2013-01-01

Working on the ACLO (Autonomous Cryogenics Loading Operations) project, I have had the opportunity to add functionality to the physics simulation software known as KATE (Knowledgebase Autonomous Test Engineer), create a new application allowing WYSIWYG (what-you-see-is-what-you-get) creation of KATE schematic files, and begin a preliminary design and implementation of a new subsystem that will provide vision services on the IHM (Integrated Health Management) bus. The functionality I added to KATE over the past few months includes a dynamic visual representation of the fluid height in a pipe, based on the number of gallons of fluid in the pipe, and an implementation of the IHM bus connection within KATE. I also fixed a broken feature in the system called the Browser Display, implemented many bug fixes and made changes to the GUI (Graphical User Interface).

  11. GOSA, a simulated annealing-based program for global optimization of nonlinear problems, also reveals transyears

    PubMed Central

    Czaplicki, Jerzy; Cornélissen, Germaine; Halberg, Franz

    2009-01-01

Transyears in biology have been documented thus far by the extended cosinor approach, including linear-nonlinear rhythmometry. We here confirm the existence of transyears by simulated annealing, a method originally developed for much broader use, but described and introduced herein to validate its application to time series. The method is illustrated both on an artificial test case with known components and on biological data. We provide a table comparing results by the two methods and trust that the procedure will serve the budding sciences of chronobiology (the study of mechanisms underlying biological time structure), chronomics (the mapping of time structures in and around us), and chronobioethics, which uses the foregoing disciplines to add to concern for illnesses of individuals and to a budding focus on diseases of nations and civilizations. PMID:20414480
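The linear-nonlinear idea behind such rhythmometry can be illustrated by letting simulated annealing search only the nonlinear parameter (the period) while the MESOR, amplitude and acrophase are fitted linearly at each trial period via cosine and sine regressors. The data and all tuning parameters below are synthetic; this is a sketch of the technique, not the GOSA program:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic series: one cosine component of period 10 (a "transyear"
# would be a period between about 1.0 and 1.9 years in real data).
t = np.linspace(0.0, 60.0, 200)
y = 5.0 + 2.0 * np.cos(2 * np.pi * t / 10.0 + 0.8)

def cost(period):
    """Residual sum of squares after linearly fitting MESOR, amplitude
    and acrophase (via cos/sin terms) at this trial period."""
    X = np.column_stack([np.ones_like(t),
                         np.cos(2 * np.pi * t / period),
                         np.sin(2 * np.pi * t / period)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ coef) ** 2))

# Simulated annealing over the single nonlinear parameter (the period).
p_lo, p_hi = 5.0, 20.0
tau = rng.uniform(p_lo, p_hi)
cur = cost(tau)
best_tau, best_cost = tau, cur
temp, cooling = 50.0, 0.997
for _ in range(3000):
    cand = float(np.clip(tau + rng.normal(0.0, 0.5 + temp / 50.0), p_lo, p_hi))
    c = cost(cand)
    # Metropolis rule: always accept improvements, sometimes accept
    # uphill moves so the search can escape local minima.
    if c < cur or rng.random() < np.exp((cur - c) / temp):
        tau, cur = cand, c
        if cur < best_cost:
            best_tau, best_cost = tau, cur
    temp *= cooling
```

With the temperature high the walk roams the whole period range; as it cools, acceptance becomes greedy and the best-so-far period settles near the true value of 10.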

  12. Design and Application of Interactive Simulations in Problem-Solving in University-Level Physics Education

    ERIC Educational Resources Information Center

    Ceberio, Mikel; Almudí, José Manuel; Franco, Ángel

    2016-01-01

    In recent years, interactive computer simulations have been progressively integrated in the teaching of the sciences and have contributed significant improvements in the teaching-learning process. Practicing problem-solving is a key factor in science and engineering education. The aim of this study was to design simulation-based problem-solving…

  13. An overview of computational simulation methods for composite structures failure and life analysis

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1993-01-01

    Three parallel computational simulation methods are being developed at the LeRC Structural Mechanics Branch (SMB) for composite structures failure and life analysis: progressive fracture CODSTRAN; hierarchical methods for high-temperature composites; and probabilistic evaluation. Results to date demonstrate that these methods are effective in simulating composite structures failure/life/reliability.

  14. A Multiple-Sessions Interactive Computer-Based Learning Tool for Ability Cultivation in Circuit Simulation

    ERIC Educational Resources Information Center

    Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.

    2011-01-01

    An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…

  15. MD simulations of ligand-bound and ligand-free aptamer: molecular level insights into the binding and switching mechanism of the add A-riboswitch.

    PubMed

    Sharma, Monika; Bulusu, Gopalakrishnan; Mitra, Abhijit

    2009-09-01

Riboswitches are structural cis-acting genetic regulatory elements in 5' UTRs of mRNAs, consisting of an aptamer domain that regulates the behavior of an expression platform in response to its recognition of, and binding to, specific ligands. While our understanding of the ligand-bound structure of the aptamer domain of the adenine riboswitches is based on crystal structure data and is well characterized, understanding of the structure and dynamics of the ligand-free aptamer is limited to indirect inferences from physicochemical probing experiments. Here we report the results of 15-nsec-long explicit-solvent molecular dynamics simulations of the add A-riboswitch crystal structure (1Y26), both in the adenine-bound (CLOSED) state and in the adenine-free (OPEN) state. Root-mean-square deviation, root-mean-square fluctuation, dynamic cross-correlation, and backbone torsion angle analyses are carried out on the two trajectories. These, along with solvent accessible surface area analysis of the two average structures, are benchmarked against available experimental data and are shown to constitute the basis for obtaining reliable insights into the molecular level details of the binding and switching mechanism. Our analysis reveals the interaction network responsible for, and conformational changes associated with, the communication between the binding pocket and the expression platform. It further highlights the significance of a hitherto unreported noncanonical W:H trans base pairing between A73 and A24 in the OPEN state, and also helps us to propose a possibly crucial role of U51 in the context of ligand binding and ligand discrimination.
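The RMSD and RMSF analyses mentioned above rest on least-squares superposition of trajectory frames. A minimal sketch using the Kabsch (SVD) algorithm on plain N x 3 coordinate arrays is shown below; real MD analyses would use dedicated trajectory tooling, so this only illustrates the underlying math:

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """Least-squares superpose coordinate set P (N x 3) onto Q and
    return the RMSD, using the Kabsch (SVD) algorithm."""
    P = P - P.mean(axis=0)                    # remove translation
    Q = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)         # covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # optimal rotation
    return float(np.sqrt(((P @ R.T - Q) ** 2).sum() / len(P)))

def rmsf(traj):
    """Per-atom root-mean-square fluctuation over frames (F x N x 3),
    assuming the frames are already aligned."""
    mean = traj.mean(axis=0)
    return np.sqrt(((traj - mean) ** 2).sum(axis=2).mean(axis=0))
```

After superposition, a rigidly rotated and translated copy of a structure has an RMSD of essentially zero, which is the property the alignment step exists to guarantee.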

  16. Evaluation of accuracy of linear regression models in predicting urban stormwater discharge characteristics.

    PubMed

    Madarang, Krish J; Kang, Joo-Hyon

    2014-06-01

Stormwater runoff has been identified as a source of pollution for the environment, especially for receiving waters. In order to quantify and manage the impacts of stormwater runoff on the environment, predictive models and mathematical models have been developed. Predictive tools such as regression models have been widely used to predict stormwater discharge characteristics. Storm event characteristics, such as antecedent dry days (ADD), have been related to response variables, such as pollutant loads and concentrations. However, whether ADD is an important variable for predicting stormwater discharge characteristics has been a controversial issue across studies. In this study, we examined the accuracy of general linear regression models in predicting discharge characteristics of roadway runoff. A total of 17 storm events were monitored in two highway segments, located in Gwangju, Korea. Data from the monitoring were used to calibrate the United States Environmental Protection Agency's Storm Water Management Model (SWMM). The calibrated SWMM was simulated for 55 storm events, and the results of total suspended solid (TSS) discharge loads and event mean concentrations (EMC) were extracted. From these data, linear regression models were developed. R² and p-values of the regressions on ADD for both TSS loads and EMCs were investigated. Results showed that pollutant loads were better predicted than pollutant EMCs in the multiple regression models. Regression may not capture the true effect of site-specific characteristics, due to uncertainty in the data. Copyright © 2014 The Research Centre for Eco-Environmental Sciences, Chinese Academy of Sciences. Published by Elsevier B.V. All rights reserved.
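A simple linear regression of a discharge characteristic on ADD, reporting R² and the slope's p-value as the abstract describes, can be sketched as follows. The event data here are hypothetical stand-ins, not the Gwangju measurements:

```python
import numpy as np
from scipy import stats

# Hypothetical storm events: antecedent dry days (ADD) and TSS loads.
add_days = np.array([1, 2, 3, 5, 7, 9, 12, 15, 18, 21, 25, 30], float)
rng = np.random.default_rng(42)
tss_load = 4.0 + 2.0 * add_days + rng.normal(0.0, 3.0, add_days.size)

# Ordinary least squares: slope, R^2 and the slope's p-value.
res = stats.linregress(add_days, tss_load)
r_squared = res.rvalue ** 2
# A small p-value suggests ADD helps predict loads; the study finds
# loads are predicted better than EMCs by such regressions.
```

The same call applied to an EMC response with weak dependence on ADD would yield a low R² and a large p-value, which is how the abstract's load-versus-EMC contrast shows up numerically.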

  17. Cost-utility of ranolazine for the symptomatic treatment of patients with chronic angina pectoris in Spain.

    PubMed

    Hidalgo-Vega, Alvaro; Ramos-Goñi, Juan Manuel; Villoro, Renata

    2014-12-01

    Ranolazine is an antianginal agent that was approved in the EU in 2008 as an add-on therapy for symptomatic chronic angina pectoris treatment in patients who are inadequately controlled by, or are intolerant to, first-line antianginal therapies. These patients' quality of life is significantly affected by more frequent angina events, which increase the risk of revascularization. To assess the cost-utility of ranolazine versus placebo as an add-on therapy for the symptomatic treatment of patients with chronic angina pectoris in Spain. A decision tree model with 1-year time horizon was designed. Transition probabilities and utility values for different angina frequencies were obtained from the literature. Costs were obtained from Spanish official DRGs for patients with chronic angina pectoris. We calculated the incremental cost-utility ratio of using ranolazine compared with a placebo. Sensitivity analyses, by means of Monte Carlo simulations, were performed. Acceptability curves and expected value of perfect information were calculated. The incremental cost-utility ratio was €8,455 per quality-adjusted life-year (QALY) per patient in Spain. Sensitivity analyses showed that if the decision makers' willingness to pay is €15,000 per QALY, the treatment with ranolazine will be cost effective at a 95 % level of confidence. The incremental cost-utility ratio is particularly sensitive to changes in utility values of those non-hospitalized patients with mild or moderate angina frequency. Ranolazine is a highly efficient add-on therapy for the symptomatic treatment of chronic angina pectoris in patients who are inadequately controlled by, or intolerant to, first-line antianginal therapies in Spain.
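A probabilistic sensitivity analysis of the kind described — Monte Carlo draws of incremental costs and QALYs, a mean incremental cost-utility ratio, and acceptability at a willingness-to-pay threshold — can be sketched as follows. The distributions are entirely hypothetical, chosen only so the mean lands near the reported €8,455/QALY:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical incremental cost (EUR) and incremental QALYs per patient
# for ranolazine vs placebo; parameters are illustrative, not the
# decision-tree model's actual inputs.
d_cost = rng.normal(845.5, 100.0, n)
d_qaly = rng.normal(0.10, 0.02, n)

icur = d_cost.mean() / d_qaly.mean()   # EUR per QALY gained
wtp = 15_000.0                         # willingness to pay per QALY
nmb = wtp * d_qaly - d_cost            # net monetary benefit per draw
acceptability = (nmb > 0).mean()       # P(cost effective at this WTP)
```

Plotting `acceptability` over a range of `wtp` values yields the cost-effectiveness acceptability curve the study reports; at a €15,000/QALY threshold the fraction of favorable draws corresponds to its stated confidence level.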

  18. Duct- and Acinar-Derived Pancreatic Ductal Adenocarcinomas Show Distinct Tumor Progression and Marker Expression.

    PubMed

    Ferreira, Rute M M; Sancho, Rocio; Messal, Hendrik A; Nye, Emma; Spencer-Dene, Bradley; Stone, Richard K; Stamp, Gordon; Rosewell, Ian; Quaglia, Alberto; Behrens, Axel

    2017-10-24

    The cell of origin of pancreatic ductal adenocarcinoma (PDAC) has been controversial. Here, we show that identical oncogenic drivers trigger PDAC originating from both ductal and acinar cells with similar histology but with distinct pathophysiology and marker expression dependent on cell of origin. Whereas acinar-derived tumors exhibited low AGR2 expression and were preceded by pancreatic intraepithelial neoplasias (PanINs), duct-derived tumors displayed high AGR2 and developed independently of a PanIN stage via non-mucinous lesions. Using orthotopic transplantation and chimera experiments, we demonstrate that PanIN-like lesions can be induced by PDAC as bystanders in adjacent healthy tissues, explaining the co-existence of mucinous and non-mucinous lesions and highlighting the need to distinguish between true precursor PanINs and PanIN-like bystander lesions. Our results suggest AGR2 as a tool to stratify PDAC according to cell of origin, highlight that not all PanIN-like lesions are precursors of PDAC, and add an alternative progression route to the current model of PDAC development. Copyright © 2017 Francis Crick Institute. Published by Elsevier Inc. All rights reserved.

  19. Penile involvement in Systemic Sclerosis: New Diagnostic and Therapeutic Aspects

    PubMed Central

    Aversa, Antonio; Bruzziches, Roberto; Francomano, Davide; Rosato, Edoardo; Salsano, Felice; Spera, Giovanni

    2010-01-01

Systemic Sclerosis (SSc) is a connective tissue disorder featuring vascular alterations and an immunological activation leading to a progressive and widespread fibrosis of several organs such as the skin, lung, gastrointestinal tract, heart, and kidney. Men with SSc are at increased risk of developing erectile dysfunction (ED) because of the evolution of early microvascular tissue damage into corporeal fibrosis. The extent of penile vascular damage in SSc patients has been demonstrated using Duplex ultrasonography and functional infra-red imaging, and it is now clear that this is a true clinical entity invariably occurring irrespective of age and disease duration and constituting the "sclerodermic penis". Once-daily phosphodiesterase type-5 (PDE5) inhibitors improve both sexual function and vascular measures of cavernous arteries by improving surrogate markers of endothelial dysfunction, that is, plasma endothelin-1 and adrenomedullin levels, which may play a potential role in preventing progression of penile fibrosis and ED. Also, the beneficial effect of long-term PDE5i add-on therapy to SSc therapy in the treatment of Raynaud's phenomenon is described. PMID:20981315

  20. Dr.LiTHO: a development and research lithography simulator

    NASA Astrophysics Data System (ADS)

    Fühner, Tim; Schnattinger, Thomas; Ardelean, Gheorghe; Erdmann, Andreas

    2007-03-01

This paper introduces Dr.LiTHO, a research- and development-oriented lithography simulation environment developed at Fraunhofer IISB to flexibly integrate our simulation models into one coherent platform. We propose a light-weight approach to a lithography simulation environment: the use of a scripting (batch) language as an integration platform. Out of the great variety of scripting languages, Python proved superior in many ways: it exhibits a good-natured learning curve, it is efficient, it is available on virtually any platform, and it provides sophisticated integration mechanisms for existing programs. In this paper, we will describe the steps required to provide Python bindings for existing programs and to finally generate an integrated simulation environment. In addition, we will give a short introduction to selected software design demands associated with the development of such a framework, focusing especially on testing and (both technical and user-oriented) documentation issues. Dr.LiTHO Python files contain not only all simulation parameter settings but also the simulation flow, providing maximum flexibility. In addition to relatively simple batch jobs, repetitive tasks can be pooled in libraries. And since Python is a full-blown programming language, users can add virtually any functionality, which is especially useful for simulation studies or optimization tasks that often require large numbers of evaluations. Furthermore, we will give a short overview of the numerous existing Python packages. Several examples demonstrate the feasibility and productiveness of integrating Python packages into custom Dr.LiTHO scripts.

  1. Simulation-based bronchoscopy training: systematic review and meta-analysis.

    PubMed

    Kennedy, Cassie C; Maldonado, Fabien; Cook, David A

    2013-07-01

    Simulation-based bronchoscopy training is increasingly used, but effectiveness remains uncertain. We sought to perform a comprehensive synthesis of published work on simulation-based bronchoscopy training. We searched MEDLINE, EMBASE, CINAHL, PsycINFO, ERIC, Web of Science, and Scopus for eligible articles through May 11, 2011. We included all original studies involving health professionals that evaluated, in comparison with no intervention or an alternative instructional approach, simulation-based training for flexible or rigid bronchoscopy. Study selection and data abstraction were performed independently and in duplicate. We pooled results using random effects meta-analysis. From an initial pool of 10,903 articles, we identified 17 studies evaluating simulation-based bronchoscopy training. In comparison with no intervention, simulation training was associated with large benefits on skills and behaviors (pooled effect size, 1.21 [95% CI, 0.82-1.60]; n=8 studies) and moderate benefits on time (0.62 [95% CI, 0.12-1.13]; n=7). In comparison with clinical instruction, behaviors with real patients showed nonsignificant effects favoring simulation for time (0.61 [95% CI, -1.47 to 2.69]) and process (0.33 [95% CI, -1.46 to 2.11]) outcomes (n=2 studies each), although variation in training time might account for these differences. Four studies compared alternate simulation-based training approaches. Inductive analysis to inform instructional design suggested that longer or more structured training is more effective, authentic clinical context adds value, and animal models and plastic part-task models may be superior to more costly virtual-reality simulators. Simulation-based bronchoscopy training is effective in comparison with no intervention. Comparative effectiveness studies are few.
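The random-effects pooling named in the record above is commonly done with the DerSimonian-Laird estimator. The sketch below shows that standard computation with made-up effect sizes and variances; it is consistent with, but not necessarily identical to, the pooling the review actually used, and the numbers are not the review's studies.

```python
import math

def dersimonian_laird(effects, variances):
    """Pooled effect and 95% CI via the standard DerSimonian-Laird
    random-effects estimator (illustrative, not the review's code)."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    # Cochran's Q heterogeneity statistic
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Illustrative effect sizes and variances (not the studies in the review):
pooled, ci = dersimonian_laird([1.4, 0.9, 1.3], [0.10, 0.08, 0.12])
print(round(pooled, 2), [round(x, 2) for x in ci])
```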

  2. Block Oriented Simulation System (BOSS)

    NASA Technical Reports Server (NTRS)

    Ratcliffe, Jaimie

    1988-01-01

    Computer simulation is assuming greater importance as a flexible and expedient approach to modeling system and subsystem behavior. Simulation has played a key role in the growth of complex, multiple access space communications such as those used by the space shuttle and the TRW-built Tracking and Data Relay Satellites (TDRS). A powerful new simulator for use in designing and modeling the communication system of NASA's planned Space Station is being developed. Progress to date on the Block (Diagram) Oriented Simulation System (BOSS) is described.

  3. Electronic prototyping

    NASA Technical Reports Server (NTRS)

    Hopcroft, J.

    1987-01-01

    The potential benefits of automation in space are significant. The science base needed to support this automation not only will help control costs and reduce lead-time in the earth-based design and construction of space stations, but also will advance the nation's capability for computer design, simulation, testing, and debugging of sophisticated objects electronically. Progress in automation will require the ability to electronically represent, reason about, and manipulate objects. Discussed here is the development of representations, languages, editors, and model-driven simulation systems to support electronic prototyping. In particular, it identifies areas where basic research is needed before further progress can be made.

  4. Flat-plate techniques for measuring reflectance of macro-algae (Ulva curvata)

    USGS Publications Warehouse

    Ramsey, Elijah W.; Rangoonwala, Amina; Thomsen, Mads Solgaard; Schwarzschild, Arthur

    2012-01-01

    We tested the consistency and accuracy of flat-plate spectral measurements (400–1000 nm) of the marine macrophyte Ulva curvata. With sequential addition of Ulva thallus layers, the reflectance progressively increased from 6% to 9% with six thalli in the visible (VIS) and from 5% to 19% with ten thalli in the near infrared (NIR). This progressive increase was simulated by a mathematical calculation based on an Ulva thallus diffuse reflectance weighted by a transmittance power series. Experimental and simulated reflectance differences that were particularly high in the NIR most likely resulted from residual water and layering structure unevenness in the experimental progression. High spectral overlap existed between fouled and non-fouled Ulva mats and the coexistent lagoon mud in the VIS, whereas in the NIR, spectral contrast was retained but substantially dampened by fouling.
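One simple form the "diffuse reflectance weighted by a transmittance power series" calculation could take is a stack model in which each added thallus contributes its reflectance attenuated by two passes through the layers above it. The coefficients below are illustrative, not fitted Ulva values, and the paper's exact formulation may differ.

```python
def stack_reflectance(r1, t1, n):
    """Reflectance of n stacked thalli: single-layer reflectance r1
    weighted by a power series in the two-way transmittance t1**2.
    A plausible sketch of the calculation described, not the paper's
    exact model."""
    return r1 * sum((t1 ** 2) ** k for k in range(n))

# NIR-like example: single-thallus reflectance 5%, transmittance 60%.
# Reflectance rises with each added layer, then saturates.
for n in (1, 3, 6, 10):
    print(n, round(stack_reflectance(0.05, 0.60, n), 3))
```

The geometric series saturates at r1 / (1 - t1**2), which mirrors the diminishing returns seen experimentally as thalli are added.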

  5. Dynamic Shade and Irradiance Simulation of Aquatic ...

    EPA Pesticide Factsheets

    Penumbra is a landscape shade and irradiance simulation model that simulates how solar energy spatially and temporally interacts within dynamic ecosystems such as riparian zones, forests, and other terrain that cast topological shadows. Direct and indirect solar energy accumulates across landscapes and is the main energy driver for increasing aquatic and landscape temperatures at both local and holistic scales. Landscape disturbances such as landuse change, clear cutting, and fire can cause significant variations in the resulting irradiance reaching particular locations. Penumbra can simulate solar angles and irradiance at definable temporal grains as low as one minute while simulating landscape shadowing up to an entire year. Landscapes can be represented at sub-meter resolutions with appropriate spatial data inputs, such as field data or elevation and surface object heights derived from light detection and ranging (LiDAR) data. This work describes Penumbra’s framework and methodology, external model integration capability, and appropriate model application for a variety of watershed restoration project types. First, an overview of Penumbra’s framework reveals what this model adds to the existing ecological modeling domain. Second, Penumbra’s stand-alone and integration modes are explained and demonstrated. Stand-alone modeling results are showcased within the 3-D visualization tool VISTAS (VISualizing Terrestrial-Aquatic Systems), which fluently summariz
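Any shade/irradiance model of the kind described must repeatedly evaluate solar geometry. The sketch below shows the standard astronomical relation for solar elevation from latitude, declination, and local solar time; it is a generic textbook calculation, not Penumbra's actual code.

```python
import math

def solar_elevation(lat_deg, decl_deg, hour):
    """Solar elevation angle (degrees) from latitude, solar declination,
    and local solar time, via the standard relation
    sin(e) = sin(lat)sin(dec) + cos(lat)cos(dec)cos(H).
    Generic sketch, not Penumbra's implementation."""
    lat = math.radians(lat_deg)
    dec = math.radians(decl_deg)
    h = math.radians(15.0 * (hour - 12.0))  # hour angle: 15 deg per hour
    sin_e = (math.sin(lat) * math.sin(dec)
             + math.cos(lat) * math.cos(dec) * math.cos(h))
    return math.degrees(math.asin(sin_e))

# At 45 N on the equinox (declination 0), solar-noon elevation is 45 deg
print(round(solar_elevation(45.0, 0.0, 12.0), 1))
```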

  6. The Promise of Quantum Simulation.

    PubMed

    Muller, Richard P; Blume-Kohout, Robin

    2015-08-25

    Quantum simulations promise to be one of the primary applications of quantum computers, should one be constructed. This article briefly summarizes the history of quantum simulation in light of the recent result of Wang and co-workers, demonstrating calculation of the ground and excited states for a HeH(+) molecule, and concludes with a discussion of why this and other recent progress in the field suggest that quantum simulations of quantum chemistry have a bright future.

  7. The Progress of Research Project for Magnetized Target Fusion in China

    NASA Astrophysics Data System (ADS)

    Yang, Xian-Jun

    2015-11-01

    The fusion of magnetized plasma, called magnetized target fusion (MTF), has recently become a hot research area. It may significantly reduce the cost and size of fusion systems. Great progress has been achieved over the past decades around the world. Five years ago, China initiated an MTF project and has made progress as follows: 1. verifying the feasibility of MTF ignition by means of first-principles and MHD simulation; 2. generating magnetic fields over 1400 Tesla, which can suppress heat conduction by charged particles, deposit the alpha-particle energy to promote the ignition process, and produce stable magnetized plasma as the ignition target; 3. the imploding facility FP-1 can deliver several megajoules of energy to a solid liner of about ten grams within a rise time in the microsecond range, and a simulation tool has been developed for design and analysis of the process; 4. the FRC target can be generated by the "YG 1 facility", and some simulation tools have been developed. Over the next five years, the above theoretical work and the MTF experiments may be integrated and stepped up into a national project, which may give our team an important leading role and is expected to achieve further progress in China. Supported by the National Natural Science Foundation of China under Grant No 11175028.

  8. 2017 National Household Travel Survey - California Add-On |

    Science.gov Websites

    Transportation Secure Data Center | NREL. The California add-on survey supplements the 2017 National Household Travel Survey (NHTS) with additional household samples and detailed travel data.

  9. Etiology of Attention Disorders: A Neurological/Genetic Perspective.

    ERIC Educational Resources Information Center

    Grantham, Madeline Kay

    This paper explores the historical origins of attention deficit disorder/attention deficit hyperactivity disorder (ADD/ADHD) as a neurological disorder, current neurological and genetic research concerning the etiology of ADD/ADHD, and implications for diagnosis and treatment. First, ADD/ADHD is defined and then the origins of ADD/ADHD as a…

  10. Illustration of cross flow of polystyrene melts through a coathanger die

    NASA Astrophysics Data System (ADS)

    Schöppner, V.; Henke, B.

    2015-05-01

    To design an optimal coathanger die with a uniform flow rate distribution and low pressure drop, it is essential to understand the flow conditions in the die. This is important because the quality of the product is influenced by the flow velocity and the flow rate distribution. In extrusion dies, cross flows, which run perpendicular to the main flow, also occur in addition to the main flow. This results in pressure gradients in the extrusion direction, which influence the flow distribution and pressure drop in the die. In recent decades, quantitative representation and analysis of physical flow processes have made considerable progress in predicting the weather, developing drive technologies, and designing aircraft using simulation methods and lab trials. Using the flow-line method, the flow in flat-film extrusion dies with a rectangular cross-section is analyzed, in particular the cross flows. The simplest method of visualizing the flow is based on measuring the orientation of individual particles added to the flow field. A near-surface flow field can be visualized using wool or textile yarns: by sticking thin strands of wool, frayed at the ends, to the surface to be examined, cross flows, near-wall flow profiles, and vortex and separation regions can be made visible. A further possibility is to add glass fibers and analyze the fiber orientation by microscopy and X-ray analysis. In this paper, the influence of process parameters (e.g. melt temperature and throughput) on cross flow and fiber orientation is described.

  11. A Numerical Study of Factors Affecting Fracture-Fluid Cleanup and Produced Gas/Water in Marcellus Shale: Part II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seales, Maxian B.; Dilmore, Robert; Ertekin, Turgay

    Horizontal wells combined with successful multistage-hydraulic-fracture treatments are currently the most-established method for effectively stimulating and enabling economic development of gas-bearing organic-rich shale formations. Fracture cleanup in the stimulated reservoir volume (SRV) is critical to stimulation effectiveness and long-term well performance. But fluid cleanup is often hampered by formation damage, and post-fracture well performance frequently falls short of expectations. A systematic study of the factors that hinder fracture-fluid cleanup in shale formations can help optimize fracture treatments and better quantify long-term volumes of produced water and gas. Fracture-fluid cleanup is a complex process influenced by multiphase flow through porous media (relative permeability hysteresis, capillary pressure), reservoir-rock and -fluid properties, fracture-fluid properties, proppant placement, fracture-treatment parameters, and subsequent flowback and field operations. Changing SRV and fracture conductivity as production progresses further adds to the complexity of this problem. Numerical simulation is the best and most-practical approach to investigate such a complicated blend of mechanisms, parameters, their interactions, and subsequent effect on fracture-fluid cleanup and well deliverability. Here, a 3D, two-phase, dual-porosity model was used to investigate the effect of multiphase flow, proppant crushing, proppant diagenesis, shut-in time, reservoir-rock compaction, gas slippage, and gas desorption on fracture-fluid cleanup and well performance in Marcellus Shale. Our findings have shed light on the factors that substantially constrain efficient fracture-fluid cleanup in gas shales, and we have provided guidelines for improved fracture-treatment designs and water management.

  12. Structure-Property Relationships in Amorphous Transparent Conducting Oxides

    NASA Astrophysics Data System (ADS)

    Moffitt, Stephanie Lucille

    Over the last 20 years a new field of amorphous transparent conducting oxides (a-TCOs) has developed. The amorphous nature of these films makes them well suited for large area applications. In addition, a-TCOs can be made at low temperatures and through solution processing methods. These assets provide promising opportunities to improve applications such as solar cells and back-lit displays where traditional crystalline TCOs are used. In addition, it opens the door for new technological applications including the possibility for transparent, flexible electronics. Despite the recent growth in this field, fundamental understanding of the true nature of conductivity and the amorphous structure in this materials system is still progressing. To develop a greater understanding of a-TCOs, structure-property relationships were developed in the a-IGO and a-IZO systems. From the combination of element-specific local structure studies and liquid quench molecular dynamics simulations it is clear that a degree of structure remains in a-TCOs. By understanding this structure, the effect of gallium on thermal stability, carrier concentration and carrier mobility is understood. The source of charge carriers in a-IZO is identified as oxygen vacancies through the application of in situ Brouwer analysis. The continued development of the Brouwer analysis technique for use in amorphous oxides adds to the available methods for studying defects in amorphous systems. Finally, the foundational knowledge gained from the in-depth study of a-IGO was extended to understand the role of combustion processing and pulsed laser deposition as growth methods for transistors based on a-IGO.

  13. Numerical investigation of depth profiling capabilities of helium and neon ions in ion microscopy

    PubMed Central

    Rzeznik, Lukasz; Wirtz, Tom

    2016-01-01

    The analysis of polymers by secondary ion mass spectrometry (SIMS) has been a topic of interest for many years. In recent years, the primary ion species evolved from heavy monatomic ions to cluster and massive cluster primary ions in order to preserve a maximum of organic information. The progress in less-damaging sputtering goes along with a loss in lateral resolution for 2D and 3D imaging. By contrast, the development of a mass spectrometer as an add-on tool for the helium ion microscope (HIM), which uses finely focussed He+ or Ne+ beams, allows for the analysis of secondary ions and small secondary cluster ions with unprecedented lateral resolution. Irradiation-induced damage and depth profiling capabilities obtained with these light rare gas species have been far less investigated than ion species used classically in SIMS. In this paper we simulated the sputtering of multi-layered polymer samples using the BCA (binary collision approximation) code SD_TRIM_SP to study preferential sputtering and atomic mixing in such samples up to a fluence of 10^18 ions/cm^2. Results show that helium primary ions are completely inappropriate for depth profiling applications with this kind of sample material, while results for neon are similar to argon. The latter is commonly used as a primary ion species in SIMS. For the two heavier species, layers separated by 10 nm can be distinguished for impact energies of a few keV. These results are encouraging for 3D imaging applications where lateral and depth information are of importance. PMID:28144525

  14. A Numerical Study of Factors Affecting Fracture-Fluid Cleanup and Produced Gas/Water in Marcellus Shale: Part II

    DOE PAGES

    Seales, Maxian B.; Dilmore, Robert; Ertekin, Turgay; ...

    2017-04-01

    Horizontal wells combined with successful multistage-hydraulic-fracture treatments are currently the most-established method for effectively stimulating and enabling economic development of gas-bearing organic-rich shale formations. Fracture cleanup in the stimulated reservoir volume (SRV) is critical to stimulation effectiveness and long-term well performance. But fluid cleanup is often hampered by formation damage, and post-fracture well performance frequently falls short of expectations. A systematic study of the factors that hinder fracture-fluid cleanup in shale formations can help optimize fracture treatments and better quantify long-term volumes of produced water and gas. Fracture-fluid cleanup is a complex process influenced by multiphase flow through porous media (relative permeability hysteresis, capillary pressure), reservoir-rock and -fluid properties, fracture-fluid properties, proppant placement, fracture-treatment parameters, and subsequent flowback and field operations. Changing SRV and fracture conductivity as production progresses further adds to the complexity of this problem. Numerical simulation is the best and most-practical approach to investigate such a complicated blend of mechanisms, parameters, their interactions, and subsequent effect on fracture-fluid cleanup and well deliverability. Here, a 3D, two-phase, dual-porosity model was used to investigate the effect of multiphase flow, proppant crushing, proppant diagenesis, shut-in time, reservoir-rock compaction, gas slippage, and gas desorption on fracture-fluid cleanup and well performance in Marcellus Shale. Our findings have shed light on the factors that substantially constrain efficient fracture-fluid cleanup in gas shales, and we have provided guidelines for improved fracture-treatment designs and water management.

  15. A Primer for Agent-Based Simulation and Modeling in Transportation Applications

    DOT National Transportation Integrated Search

    2013-11-01

    Agent-based modeling and simulation (ABMS) methods have been applied in a spectrum of research domains. This primer focuses on ABMS in the transportation interdisciplinary domain, describes the basic concepts of ABMS and the recent progress of ABMS i...

  16. Modelling the effect of wheat canopy architecture as affected by sowing density on Septoria tritici epidemics using a coupled epidemic–virtual plant model

    PubMed Central

    Baccar, Rim; Fournier, Christian; Dornbusch, Tino; Andrieu, Bruno; Gouache, David; Robert, Corinne

    2011-01-01

    Background and Aims The relationship between Septoria tritici, a splash-dispersed disease, and its host is complex because of the interactions between the dynamic plant architecture and the vertical progress of the disease. The aim of this study was to test the capacity of a coupled virtual wheat–Septoria tritici epidemic model (Septo3D) to simulate disease progress on the different leaf layers for contrasted sowing density treatments. Methods A field experiment was performed with winter wheat ‘Soissons’ grown at three contrasted densities. Plant architecture was characterized to parameterize the wheat model, and disease dynamics were monitored for comparison with simulations. Three simulation scenarios, differing in the degree of detail with which plant variability of development was represented, were defined. Key Results Despite architectural differences between density treatments, few differences were found in disease progress; only the lower-density treatment resulted in a slightly higher rate of lesion development. Model predictions were consistent with field measurements but did not reproduce the higher rate of lesion progress in the low density. The canopy reconstruction scenario in which inter-plant variability was taken into account yielded the best agreement between measured and simulated epidemics. Simulations performed with the canopy represented by a population of the same average plant deviated strongly from the observations. Conclusions It was possible to compare the predicted and measured epidemics on detailed variables, supporting the hypothesis that the approach is able to provide new insights into the processes and plant traits that contribute to the epidemics. On the other hand, the complex and dynamic responses to sowing density made it difficult to test the model precisely and to disentangle the various aspects involved. 
This could be overcome by comparing more contrasted and/or simpler canopy architectures such as those resulting from quasi-isogenic lines differing by single architectural traits. PMID:21724656

  17. Multidisciplinary research leading to utilization of extraterrestrial resources

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Progress of the research accomplished during fiscal year 1972 is reported. The summaries presented include: (1) background analysis and coordination, (2) surface properties of rock in simulated lunar environment, (3) rock failure processes, strength and elastic properties in simulated lunar environment, (4) thermal fragmentation, and thermophysical and optical properties in simulated lunar environment, and (5) use of explosives on the moon.

  18. SAGRAD: A Program for Neural Network Training with Simulated Annealing and the Conjugate Gradient Method

    PubMed Central

    Bernal, Javier; Torres-Jimenez, Jose

    2015-01-01

    SAGRAD (Simulated Annealing GRADient), a Fortran 77 program for computing neural networks for classification using batch learning, is discussed. Neural network training in SAGRAD is based on a combination of simulated annealing and Møller’s scaled conjugate gradient algorithm, the latter a variation of the traditional conjugate gradient method, better suited for the nonquadratic nature of neural networks. Different aspects of the implementation of the training process in SAGRAD are discussed, such as the efficient computation of gradients and multiplication of vectors by Hessian matrices that are required by Møller’s algorithm; the (re)initialization of weights with simulated annealing required to (re)start Møller’s algorithm the first time and each time thereafter that it shows insufficient progress in reaching a possibly local minimum; and the use of simulated annealing when Møller’s algorithm, after possibly making considerable progress, becomes stuck at a local minimum or flat area of weight space. Outlines of the scaled conjugate gradient algorithm, the simulated annealing procedure and the training process used in SAGRAD are presented together with results from running SAGRAD on two examples of training data. PMID:26958442
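The training strategy described above (gradient-based refinement, with an annealing-style random restart whenever progress stalls) can be sketched as a toy control flow. Plain gradient descent stands in for Møller's scaled conjugate gradient here, so this shows the hybrid structure, not SAGRAD itself; all names and constants are illustrative.

```python
import random

def hybrid_train(grad, loss, dim, iters=200, seed=0):
    """Toy sketch of a SAGRAD-like hybrid: gradient steps while they
    help, annealing-style perturbations from the best point when they
    stop helping. Not the actual SAGRAD algorithm."""
    rng = random.Random(seed)
    temp = 1.0
    w = [rng.uniform(-1, 1) for _ in range(dim)]        # SA-style init
    best_w, best_loss = w[:], loss(w)
    for _ in range(iters):
        g = grad(w)
        w = [wi - 0.1 * gi for wi, gi in zip(w, g)]      # gradient step
        cur = loss(w)
        if cur < best_loss - 1e-9:
            best_w, best_loss = w[:], cur
        else:
            # insufficient progress: random restart around the best point
            w = [bi + temp * rng.gauss(0, 1) for bi in best_w]
            temp *= 0.9                                  # cool the schedule
    return best_w, best_loss

# Quadratic bowl with minimum at (1, -2): the hybrid should find it
loss = lambda w: (w[0] - 1) ** 2 + (w[1] + 2) ** 2
grad = lambda w: [2 * (w[0] - 1), 2 * (w[1] + 2)]
w, l = hybrid_train(grad, loss, 2)
print([round(x, 2) for x in w], round(l, 4))
```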

  19. New algorithms for field-theoretic block copolymer simulations: Progress on using adaptive-mesh refinement and sparse matrix solvers in SCFT calculations

    NASA Astrophysics Data System (ADS)

    Sides, Scott; Jamroz, Ben; Crockett, Robert; Pletzer, Alexander

    2012-02-01

    Self-consistent field theory (SCFT) for dense polymer melts has been highly successful in describing complex morphologies in block copolymers. Field-theoretic simulations such as these are able to access large length and time scales that are difficult or impossible for particle-based simulations such as molecular dynamics. The modified diffusion equations that arise as a consequence of the coarse-graining procedure in the SCF theory can be efficiently solved with a pseudo-spectral (PS) method that uses fast Fourier transforms on uniform Cartesian grids. However, PS methods can be difficult to apply in many block copolymer SCFT simulations (e.g., confinement, interface adsorption) in which small spatial regions might require finer resolution than the rest of the simulation grid. Progress on using new solver algorithms to address these problems will be presented. The Tech-X Chompst project aims at marrying the best of adaptive mesh refinement with linear matrix solver algorithms. The Tech-X code PolySwift++ is an SCFT simulation platform that leverages ongoing development in coupling Chombo, a package for solving PDEs via block-structured AMR calculations and embedded boundaries, with PETSc, a toolkit that includes a large assortment of sparse linear solvers.
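The pseudo-spectral method mentioned above advances the modified diffusion equation dq/ds = lap(q) - w*q along the chain contour by operator splitting: a half-step of the field term in real space, a full diffusion step in Fourier space, then the remaining half-step. The sketch below shows that textbook scheme in 1-D with NumPy; it is a generic illustration, not the PolySwift++ implementation.

```python
import numpy as np

def ps_step(q, w, ds, k2):
    """One pseudo-spectral Strang-splitting step for dq/ds = lap(q) - w*q
    (generic textbook scheme, not PolySwift++ code)."""
    q = np.exp(-0.5 * ds * w) * q                        # half-step: field term
    q = np.fft.ifftn(np.exp(-ds * k2) * np.fft.fftn(q))  # diffusion in k-space
    return np.real(np.exp(-0.5 * ds * w) * q)            # remaining half-step

n, L, ds = 64, 10.0, 0.01
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
k2 = k ** 2                                   # 1-D grid for brevity
x = np.linspace(0, L, n, endpoint=False)
w = np.zeros(n)                               # zero field: pure diffusion
q = np.exp(-((x - L / 2) ** 2))               # initial chain propagator
for _ in range(100):
    q = ps_step(q, w, ds, k2)
print(round(float(q.sum() * (L / n)), 4))     # diffusion conserves total mass
```

With w = 0 the scheme reduces to exact spectral diffusion, so the integral of q is conserved, which is a convenient sanity check on the splitting.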

  20. Reduction technique of drop voltage and power losses to improve power quality using ETAP Power Station simulation model

    NASA Astrophysics Data System (ADS)

    Satrio, Reza Indra; Subiyanto

    2018-03-01

    The growth of electric loads has a direct impact on power distribution systems, where drop voltage and power losses are among the most important considerations. This paper presents a modelling approach used to restructure the electrical network configuration, reduce drop voltage, reduce power losses, and add a new distribution transformer to enhance the reliability of the distribution system. Restructuring the electrical network was aimed at analysing and investigating the electric loads of a distribution transformer. Measurements of real voltage and real current were performed twice for each consumer, in the morning and at night during peak load. Design and simulation were conducted using ETAP Power Station software. Based on the results of simulation and real measurement, the percentages of drop voltage and total power losses did not comply with SPLN (Standard PLN) 72:1987. After adding a new distribution transformer and restructuring the network configuration, the simulation showed that drop voltage could be reduced from 1.3 %-31.3 % to 8.1 %-9.6 % and power losses from 646.7 W to 233.29 W. The results show that restructuring the network configuration and adding a new distribution transformer can be applied as an effective method to reduce drop voltage and power losses.
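The two quantities checked against the SPLN 72:1987 limit in the record above are simple ratios. The sketch below computes percentage voltage drop and the relative loss reduction; the 220 V / 200 V pair is an illustrative example, while the loss figures are the totals stated in the abstract.

```python
def drop_voltage_pct(v_source, v_load):
    """Percentage voltage drop relative to the source voltage."""
    return 100.0 * (v_source - v_load) / v_source

def loss_reduction_pct(p_before, p_after):
    """Relative reduction in line losses after reconfiguration."""
    return 100.0 * (p_before - p_after) / p_before

print(round(drop_voltage_pct(220.0, 200.0), 1))     # illustrative feeder
print(round(loss_reduction_pct(646.7, 233.29), 1))  # the paper's loss totals
```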
