Update to core reporting practices in structural equation modeling.
Schreiber, James B
This paper is a technical update to "Core Reporting Practices in Structural Equation Modeling" [1]. As such, the content covered in this paper includes sample size, missing data, specification and identification of models, estimation method choices, fit and residual concerns, nested, alternative, and equivalent models, and unique issues within the SEM family of techniques. Copyright © 2016 Elsevier Inc. All rights reserved.
Mission Operations and Information Management Area Spacecraft Monitoring and Control Working Group
NASA Technical Reports Server (NTRS)
Lokerson, Donald C. (Editor)
2005-01-01
Working group goals for this year are: Goal 1: In response to the many review comments, the green books will be updated and made available for re-review by CCSDS, followed by submission of the green books to CCSDS for approval. Goal 2: Produce an initial set of four new drafts of the red books as follows: SM&C Protocol: update with received comments. SM&C Common Services: update with received comments and expand the service specification. SM&C Core Services: update with received comments and expand the information model. SM&C Time Services (target objective): produce an initial draft following the template of the Core Services book.
Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2011
DOE Office of Scientific and Technical Information (OSTI.GOV)
David W. Nigg; Devin A. Steuhm
2011-09-01
Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance and, to some extent, experiment management are obsolete, inconsistent with the state of modern nuclear engineering practice, and are becoming increasingly difficult to properly verify and validate (V&V). Furthermore, the legacy staff knowledge required for application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In 2009 the Idaho National Laboratory (INL) initiated a focused effort to address this situation through the introduction of modern high-fidelity computational software and protocols, with appropriate V&V, within the next 3-4 years via the ATR Core Modeling and Simulation and V&V Update (or 'Core Modeling Update') Project. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF). The ATR Core Modeling Update Project, targeted for full implementation in phase with the anticipated ATR Core Internals Changeout (CIC) in the 2014 time frame, began during the last quarter of Fiscal Year 2009, and has just completed its first full year. Key accomplishments so far have encompassed both computational and experimental work. A new suite of stochastic and deterministic transport-theory-based reactor physics codes and their supporting nuclear data libraries (SCALE, KENO-6, HELIOS, NEWT, and ATTILA) have been installed at the INL under various permanent sitewide license agreements and corresponding baseline models of the ATR and ATRC are now operational, demonstrating the basic feasibility of these code packages for their intended purpose. Furthermore, a capability for rigorous sensitivity analysis and uncertainty quantification based on the TSUNAMI system is being implemented and initial computational results have been obtained. This capability will have many applications in 2011 and beyond as a tool for understanding the margins of uncertainty in the new models as well as for validation experiment design and interpretation. Finally, we note that although full implementation of the new computational models and protocols will extend over a period of 3-4 years as noted above, interim applications in the much nearer term have already been demonstrated. In particular, these demonstrations included an analysis that was useful for understanding the cause of some issues in December 2009 that were triggered by a larger-than-acceptable discrepancy between the measured excess core reactivity and a calculated value that was based on the legacy computational methods. As the Modeling Update project proceeds we anticipate further such interim, informal applications in parallel with formal qualification of the system under the applicable INL Quality Assurance procedures and standards.
Update on the Health Services Research Doctoral Core Competencies.
Burgess, James F; Menachemi, Nir; Maciejewski, Matthew L
2018-03-13
To present revised core competencies for doctoral programs in health services research (HSR), modalities to deliver these competencies, and suggested methods for assessing mastery of these competencies. Core competencies were originally developed in 2005, updated (but unpublished) in 2008, modestly updated for a 2016 HSR workforce conference, and revised based on feedback from attendees. Additional feedback was obtained from doctoral program directors, employer/workforce experts, and attendees of a presentation on these competencies at AcademyHealth's June 2017 Annual Research Meeting. The current version (V2.1) competencies include the ethical conduct of research, conceptual models, development of research questions, study designs, data measurement and collection methods, statistical methods for analyzing data, professional collaboration, and knowledge dissemination. These competencies represent a core that defines what HSR researchers should master in order to address the complexities of microsystem-to-macrosystem research that HSR entails. There are opportunities to conduct formal evaluation of newer delivery modalities (e.g., flipped classrooms) and to integrate the new Learning Health System Researcher Core Competencies, developed by AHRQ, into the HSR core competencies. Core competencies in HSR are a continually evolving work in progress because new research questions arise, new methods are developed, and the trans-disciplinary nature of the field leads to new multidisciplinary and team-building needs. © Health Research and Educational Trust.
Numerical modeling and model updating for smart laminated structures with viscoelastic damping
NASA Astrophysics Data System (ADS)
Lu, Jun; Zhan, Zhenfei; Liu, Xu; Wang, Pan
2018-07-01
This paper presents a numerical modeling method combined with model updating techniques for the analysis of smart laminated structures with viscoelastic damping. Starting with finite element formulation, the dynamics model with piezoelectric actuators is derived based on the constitutive law of the multilayer plate structure. The frequency-dependent characteristics of the viscoelastic core are represented utilizing the anelastic displacement fields (ADF) parametric model in the time domain. The analytical model is validated experimentally and used to analyze the influencing factors of kinetic parameters under parametric variations. Emphasis is placed upon model updating for smart laminated structures to improve the accuracy of the numerical model. Key design variables are selected through the smoothing spline ANOVA statistical technique to mitigate the computational cost. This updating strategy not only corrects the natural frequencies but also improves the accuracy of damping prediction. The effectiveness of the approach is examined through an application problem of a smart laminated plate. It is shown that a good consistency can be achieved between updated results and measurements. The proposed method is computationally efficient.
Noble, Lorraine M; Scott-Smith, Wesley; O'Neill, Bernadette; Salisbury, Helen
2018-04-22
Clinical communication is a core component of undergraduate medical training. A consensus statement on the essential elements of the communication curriculum was co-produced in 2008 by the communication leads of UK medical schools. This paper discusses the relational, contextual and technological changes which have affected clinical communication since then and presents an updated curriculum for communication in undergraduate medicine. The consensus was developed through an iterative consultation process with the communication leads who represent their medical schools on the UK Council of Clinical Communication in Undergraduate Medical Education. The updated curriculum defines the underpinning values, core components and skills required within the context of contemporary medical care. It incorporates the evolving relational issues associated with the more prominent role of the patient in the consultation, reflected through legal precedent and changing societal expectations. The impact on clinical communication of the increased focus on patient safety, the professional duty of candour and digital medicine are discussed. Changes in the way medicine is practised should lead rapidly to adjustments to the content of curricula. The updated curriculum provides a model of best practice to help medical schools develop their teaching and argue for resources. Copyright © 2018 Elsevier B.V. All rights reserved.
Bumper 3 Update for IADC Protection Manual
NASA Technical Reports Server (NTRS)
Christiansen, Eric L.; Nagy, Kornel; Hyde, Jim
2016-01-01
The Bumper code has been the standard in use by NASA and contractors to perform meteoroid/debris risk assessments since 1990. It has undergone extensive revisions and updates [NASA JSC HITF website; Christiansen et al., 1992, 1997]. NASA Johnson Space Center (JSC) has applied BUMPER to risk assessments for Space Station, Shuttle, Mir, Extravehicular Mobility Unit (EMU) space suits, and other spacecraft (e.g., LDEF, Iridium, TDRS, and Hubble Space Telescope). Bumper continues to be updated with changes in the ballistic limit equations describing the failure threshold of various spacecraft components, as well as changes in the meteoroid and debris environment models. Significant efforts are expended to validate Bumper and benchmark it against other meteoroid/debris risk assessment codes. Bumper 3 is a refactored version of Bumper II. The structure of the code was extensively modified to improve maintenance, performance and flexibility. The architecture was changed to separate the frequently updated ballistic limit equations from the relatively stable common core functions of the program. These updates allow NASA to produce specific editions of Bumper 3 that are tailored for specific customer requirements. The core consists of common code necessary to process the Micrometeoroid and Orbital Debris (MMOD) environment models, assess shadowing and calculate MMOD risk. The library of target response subroutines includes a broad range of different types of MMOD shield ballistic limit equations as well as equations describing damage to various spacecraft subsystems or hardware (thermal protection materials, windows, radiators, solar arrays, cables, etc.). The core and library of ballistic response subroutines are maintained under configuration control. A change in the core will affect all editions of the code, whereas a change in one or more of the response subroutines will affect only those editions that contain the modified subroutines. Note that the Bumper II program is no longer maintained or distributed by NASA.
NONROAD2008a Installation and Updates
NONROAD2008 is the overall set of modeling files including the core model, default data files, graphical user interface (GUI), and reporting utility. NONROAD2008a is essentially the same, but with one correction to the NOx emission factor data file.
NASA Astrophysics Data System (ADS)
Mielikainen, Jarno; Huang, Bormin; Huang, Allen H.-L.
2015-05-01
Intel Many Integrated Core (MIC) architecture ushers in a new era of supercomputing speed, performance, and compatibility. It allows developers to run code at trillions of calculations per second using a familiar programming model. In this paper, we present our results of optimizing the updated Goddard shortwave radiation Weather Research and Forecasting (WRF) scheme on Intel MIC hardware. The Intel Xeon Phi coprocessor is the first product based on the Intel MIC architecture, and it consists of up to 61 cores connected by a high-performance on-die bidirectional interconnect. The coprocessor supports all important Intel development tools, so the development environment is a familiar one to a vast number of CPU developers. However, getting maximum performance out of the Xeon Phi requires some novel optimization techniques, which are discussed in this paper. The results show that the optimizations improved the performance of the original code on a Xeon Phi 7120P by a factor of 1.3x.
Ando, David; Singh, Jahnavi; Keasling, Jay D.; García Martín, Héctor
2018-01-01
Determination of internal metabolic fluxes is crucial for fundamental and applied biology because they map how carbon and electrons flow through metabolism to enable cell function. 13C Metabolic Flux Analysis (13C MFA) and Two-Scale 13C Metabolic Flux Analysis (2S-13C MFA) are two techniques used to determine such fluxes. Both operate on the simplifying approximation that metabolic flux from peripheral metabolism into central “core” carbon metabolism is minimal, and can be omitted when modeling isotopic labeling in core metabolism. The validity of this “two-scale” or “bow tie” approximation is supported both by the ability to accurately model experimental isotopic labeling data, and by experimentally verified metabolic engineering predictions using these methods. However, the boundaries of core metabolism that satisfy this approximation can vary across species, and across cell culture conditions. Here, we present a set of algorithms that (1) systematically calculate flux bounds for any specified “core” of a genome-scale model so as to satisfy the bow tie approximation and (2) automatically identify an updated set of core reactions that can satisfy this approximation more efficiently. First, we leverage linear programming to simultaneously identify the lowest fluxes from peripheral metabolism into core metabolism compatible with the observed growth rate and extracellular metabolite exchange fluxes. Second, we use Simulated Annealing to identify an updated set of core reactions that allow for a minimum of fluxes into core metabolism to satisfy these experimental constraints. Together, these methods accelerate and automate the identification of a biologically reasonable set of core reactions for use with 13C MFA or 2S-13C MFA, as well as provide for a substantially lower set of flux bounds for fluxes into the core as compared with previous methods. We provide an open source Python implementation of these algorithms at https://github.com/JBEI/limitfluxtocore. PMID:29300340
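The two-step procedure lends itself to a compact linear-programming illustration. Below is a minimal sketch of step (1) using scipy.optimize.linprog on a toy two-metabolite network; the stoichiometry, flux names, and measured values are invented for illustration and are not taken from the authors' limitfluxtocore package.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: A and B are "core" metabolites.
# v0: uptake -> A, v1: A -> B, v2: B -> export, v3: periphery -> B
S = np.array([[1, -1,  0, 0],    # steady-state mass balance on A
              [0,  1, -1, 1]])   # steady-state mass balance on B
c = np.array([0, 0, 0, 1])       # objective: minimize influx from periphery (v3)
bounds = [(10, 10),              # measured uptake exchange flux
          (0, 100),
          (12, 12),              # measured export/growth-linked flux
          (0, 100)]
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print(res.x)  # -> [10., 10., 12., 2.]: at least 2 flux units must enter the core
```

Step (2) would then perturb the chosen core reaction set (e.g., via simulated annealing) and re-solve this LP at each proposal, keeping sets that shrink the required influx.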
Orbai, Ana-Maria; de Wit, Maarten; Mease, Philip J; Callis Duffin, Kristina; Elmamoun, Musaab; Tillett, William; Campbell, Willemina; FitzGerald, Oliver; Gladman, Dafna D; Goel, Niti; Gossec, Laure; Hoejgaard, Pil; Leung, Ying Ying; Lindsay, Chris; Strand, Vibeke; van der Heijde, Désirée M; Shea, Bev; Christensen, Robin; Coates, Laura; Eder, Lihi; McHugh, Neil; Kalyoncu, Umut; Steinkoenig, Ingrid; Ogdie, Alexis
2017-10-01
To include the patient perspective in accordance with the Outcome Measures in Rheumatology (OMERACT) Filter 2.0 in the updated Psoriatic Arthritis (PsA) Core Domain Set for randomized controlled trials (RCT) and longitudinal observational studies (LOS). At OMERACT 2016, research conducted to update the PsA Core Domain Set was presented and discussed in breakout groups. The updated PsA Core Domain Set was voted on and endorsed by OMERACT participants. We conducted a systematic literature review of domains measured in PsA RCT and LOS, and identified 24 domains. We conducted 24 focus groups with 130 patients from 7 countries representing 5 continents to identify patient domains. We achieved consensus through 2 rounds of separate surveys with 50 patients and 75 physicians, and a nominal group technique meeting with 12 patients and 12 physicians. We conducted a workshop and breakout groups at OMERACT 2016 in which findings were presented and discussed. The updated PsA Core Domain Set endorsed with 90% agreement by OMERACT 2016 participants included musculoskeletal disease activity, skin disease activity, fatigue, pain, patient's global assessment, physical function, health-related quality of life, and systemic inflammation, which were recommended for all RCT and LOS. These were important, but not required in all RCT and LOS: economic cost, emotional well-being, participation, and structural damage. Independence, sleep, stiffness, and treatment burden were on the research agenda. The updated PsA Core Domain Set was endorsed at OMERACT 2016. Next steps for the PsA working group include evaluation of PsA outcome measures and development of a PsA Core Outcome Measurement Set.
NASA Astrophysics Data System (ADS)
Kreutz, K. J.; Campbell, S. W.; Winski, D.; Osterberg, E. C.; Kochtitzky, W. H.; Copland, L.; Dixon, D.; Introne, D.; Medrzycka, D.; Main, B.; Bernsen, S.; Wake, C. P.
2017-12-01
A growing array of high-resolution paleoclimate records from the terrestrial region bordering the Gulf of Alaska (GoA) continues to reveal details about ocean-atmosphere variability in the region during the Common Era. Ice core records from high-elevation ranges in proximity to the GoA provide key information on extratropical hydroclimate, and potential teleconnections to low-latitude regions. In particular, stable water isotope and snow accumulation reconstructions from ice cores collected in high-precipitation locations are uniquely tied to regional water cycle changes. Here we present new data collected in 2016 and 2017 from the St. Elias Mountains (Eclipse Icefield, Yukon Territory, Canada), including a range of ice core and geophysical measurements. Low- and high-frequency ice-penetrating radar data enable detailed mapping of icefield bedrock topography and internal reflector stratigraphy. The 1912 Katmai eruption layer can be clearly traced across the icefield, and tied definitively to the coeval ash layer found in the 345 meter ice core drilled at Eclipse Icefield in 2002. High-resolution radar data are used to map spatial variability in 2015/16 and 2016/17 snow accumulation. Ice velocity data from repeat GPS stake measurements and remote sensing feature tracking reveal a clear divide flow regime on the icefield. Shallow firn/ice cores (20 meters in 2017 and 65 meters in 2016) are used to update the 345 meter ice core drilled at Eclipse Icefield in 2002. We use new algorithm-based layer counting software to improve and provide error estimates on the new ice core chronology, which extends from 2017 to 1450 AD. 3D finite element modeling, incorporating all available geophysical data, is used to refine the reconstructed accumulation rate record and account for vertical and horizontal ice flow. Together with high-resolution stable water isotope data, the updated Eclipse record provides detailed, sub-annual resolution data on several aspects of the regional water cycle (e.g., accumulation/precipitation, moisture source and trajectory, coupled ocean/atmosphere variability). We compare the updated Eclipse record with other data in the North Pacific region, including the new Denali 1200-year ice core datasets, to assess regional hydroclimate variability during the Common Era.
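Algorithmic annual-layer counting of the kind mentioned above is, at its simplest, peak detection on a seasonally varying signal. The sketch below counts annual cycles in a synthetic delta-18O-like series with scipy.signal.find_peaks; the sampling resolution, seasonal amplitude, and noise level are assumptions for illustration, not values from the Eclipse core.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic depth series: ~12 samples per year of a seasonal isotope cycle plus noise.
rng = np.random.default_rng(42)
n_years, per_year = 50, 12
t = np.arange(n_years * per_year)
signal = np.sin(2 * np.pi * t / per_year) + 0.3 * rng.normal(size=t.size)

# Roughly one summer peak per year; 'distance' enforces a minimum peak spacing
# and 'prominence' suppresses noise wiggles.
peaks, _ = find_peaks(signal, distance=int(0.7 * per_year), prominence=0.5)
print(f"counted {len(peaks)} annual layers in {n_years} simulated years")
```

Error estimates then follow from rerunning the counter under perturbed detection thresholds and reporting the spread of counted years.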
Single-Trial Event-Related Potential Correlates of Belief Updating
Murawski, Carsten; Bode, Stefan
2015-01-01
Belief updating—the process by which an agent alters an internal model of its environment—is a core function of the CNS. Recent theory has proposed broad principles by which belief updating might operate, but more precise details of its implementation in the human brain remain unclear. In order to address this question, we studied how two components of the human event-related potential encoded different aspects of belief updating. Participants completed a novel perceptual learning task while electroencephalography was recorded. Participants learned the mapping between the contrast of a dynamic visual stimulus and a monetary reward and updated their beliefs about a target contrast on each trial. A Bayesian computational model was formulated to estimate belief states at each trial and was used to quantify the following two variables: belief update size and belief uncertainty. Robust single-trial regression was used to assess how these model-derived variables were related to the amplitudes of the P3 and the stimulus-preceding negativity (SPN), respectively. Results showed a positive relationship between belief update size and P3 amplitude at one fronto-central electrode, and a negative relationship between SPN amplitude and belief uncertainty at a left central and a right parietal electrode. These results provide evidence that belief update size and belief uncertainty have distinct neural signatures that can be tracked in single trials in specific ERP components. This, in turn, provides evidence that the cognitive mechanisms underlying belief updating in humans can be described well within a Bayesian framework. PMID:26473170
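As a concrete, deliberately simplified illustration of the two model-derived regressors, a Gaussian belief about the target contrast can be updated in Kalman fashion, yielding an update size and a prior uncertainty on each trial. This sketch is a generic Bayesian observer, not the authors' task-specific model; the observation-noise value is an assumption.

```python
import numpy as np

def belief_update(mu, var, obs, obs_var):
    """One Gaussian (Kalman-style) belief update. Returns the posterior mean
    and variance plus two trial-wise quantities in the spirit of the study's
    regressors: |update size| (cf. P3) and prior uncertainty (cf. SPN)."""
    gain = var / (var + obs_var)       # learning rate set by relative uncertainty
    update = gain * (obs - mu)         # signed belief update
    return mu + update, (1.0 - gain) * var, abs(update), np.sqrt(var)

mu, var = 0.0, 1.0                     # prior belief about the target contrast
for obs in np.random.default_rng(1).normal(0.6, 0.3, 5):
    mu, var, update_size, uncertainty = belief_update(mu, var, obs, 0.3**2)
```

The per-trial |update| and uncertainty series would then enter a robust regression against single-trial P3 and SPN amplitudes, respectively.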
NASA Astrophysics Data System (ADS)
Downey, N.; Begnaud, M. L.; Hipp, J. R.; Ballard, S.; Young, C. S.; Encarnacao, A. V.
2017-12-01
The SALSA3D global 3D velocity model of the Earth was developed to improve the accuracy and precision of seismic travel time predictions for a wide suite of regional and teleseismic phases. Recently, the global SALSA3D model was updated to include additional body wave phases, including mantle phases, core phases, reflections off the core-mantle boundary, and underside reflections off the surface of the Earth. We show that this update improves travel time predictions and leads directly to significant improvements in the accuracy and precision of seismic event locations as compared to locations computed using standard 1D velocity models like ak135 or 2½D models like RSTT. A key feature of our inversions is that the path-specific model uncertainty of travel time predictions is calculated using the full 3D model covariance matrix computed during tomography, which results in more realistic uncertainty ellipses that directly reflect tomographic data coverage. Application of this method can also be done at a regional scale: we present a velocity model with uncertainty obtained using data from the University of Utah Seismograph Stations. These results show a reduction in travel-time residuals for relocated events compared with those obtained using previously published models.
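Path-specific uncertainty of this kind is the standard quadratic form of the model covariance: if g holds a ray's sensitivities to the slowness parameters it samples and C is the posterior model covariance from tomography, then sigma_T = sqrt(g' C g). A minimal numeric sketch follows; the two-cell ray and covariance values are invented for illustration.

```python
import numpy as np

def traveltime_sigma(g, C):
    """Propagate model covariance into a travel-time prediction uncertainty:
    sigma_T = sqrt(g^T C g)."""
    return float(np.sqrt(g @ C @ g))

g = np.array([120.0, 80.0])          # ray path length in each of two cells (km)
C = np.array([[2.5e-7, 5.0e-8],      # covariance of cell slownesses, (s/km)^2
              [5.0e-8, 2.5e-7]])
print(f"sigma_T = {traveltime_sigma(g, C):.3f} s")   # ~0.08 s for this toy ray
```

Because g is zero for cells the ray never samples, poorly covered paths inherit the large prior variances of those cells, which is what makes the resulting uncertainty ellipses reflect data coverage.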
Update on the Common Core State Standards Initiative
ERIC Educational Resources Information Center
Ritter, Bill, Jr.
2009-01-01
In this update the National Governors Association presents the testimony of Honorable Bill Ritter, Jr., as submitted to the U.S. House Education and Labor Committee. Ritter speaks about the Common Core State Standards Initiative, a joint project by the National Governors Association (NGA) and Council of Chief State School Officers (CCSSO) to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mollerach, R.; Leszczynski, F.; Fink, J.
2006-07-01
In 2005 the Argentine Government took the decision to complete the construction of the Atucha-II nuclear power plant, which had been progressing slowly during the last ten years. Atucha-II is a 745 MWe nuclear station moderated and cooled with heavy water, of German (Siemens) design, located in Argentina. It has a pressure-vessel design with 451 vertical coolant channels, and the fuel assemblies (FA) are clusters of 37 natural UO2 rods with an active length of 530 cm. For the reactor physics area, a revision and update of calculation methods and models (cell, supercell and reactor) was recently carried out, covering cell, supercell (control rod) and core calculations. As a validation of the new models, some benchmark comparisons were done with Monte Carlo calculations with MCNP5. This paper presents comparisons of cell and supercell benchmark problems, based on a slightly idealized model of the Atucha-I core, obtained with the WIMS-D5 and DRAGON codes against MCNP5 results. The Atucha-I core was selected because it is smaller, similar from a neutronic point of view, and more symmetric than Atucha-II. Cell parameters compared include cell k-infinity, relative power levels of the different rings of fuel rods, and some two-group macroscopic cross sections. Supercell comparisons include supercell k-infinity changes due to the control rods (tubes) of steel and hafnium. (authors)
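When benchmarking a deterministic lattice code against a Monte Carlo reference, k-infinity differences are conventionally quoted as reactivity differences in pcm (per cent mille). A small helper, with illustrative values that are not taken from the Atucha comparisons:

```python
def reactivity_diff_pcm(k_code, k_ref):
    """Reactivity difference in pcm between a code result and a reference:
    d_rho = (1/k_ref - 1/k_code) * 1e5, since rho = (k - 1)/k."""
    return (1.0 / k_ref - 1.0 / k_code) * 1e5

# e.g., a deterministic cell k-inf vs. an MCNP5 reference (numbers invented)
print(f"{reactivity_diff_pcm(1.1050, 1.1032):+.0f} pcm")   # about +148 pcm
```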
Gutknecht, J.; Kluber, L. A.; Hanson, P. J.; Schadt, C. W. (Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.)
2016-06-01
This data set provides the peat water content and peat temperature at time of sampling for peat cores collected before and during the SPRUCE Whole Ecosystem Warming (WEW) study. Cores for the current data set were collected during the following bulk peat sampling events: 13 June 2016 and 23 August 2016. Over time, this dataset will be updated with each new major bulk peat sampling event, and dates/methods will be updated accordingly.
Models of a partially hydrated Titan interior with clathrate crust
NASA Astrophysics Data System (ADS)
Lunine, J. I.; Castillo-Rogez, J.
2012-04-01
We present an updated model of the interior evolution of Titan over time, assuming the silicate core was hydrated early in Titan's history and is dehydrating over time. The original model presented in Castillo-Rogez and Lunine (2010) was motivated by a Cassini-derived moment of inertia (Iess et al., 2010) for Titan too large to be accommodated by classical fully differentiated models in which an anhydrous silicate core was overlain by a water ice (with possible perched ocean) mantle. Our model consisted of a silicate core still in the process of dehydrating today, a situation made possible by the leaching of radiogenic potassium from the silicates into the liquid water ocean. The crust of Titan was assumed to be pure water ice I. The model was consistent with the moment of inertia of Titan, but neglected the presence of large amounts of methane in the upper crust invoked to explain methane's persistence at present and through geologic time (Tobie et al. 2006). We have updated our model with such a feature. We have also improved our modeling with a better physical model for the dehydration of antigorite and other hydrated minerals. In particular our modeling now simulates heat advection resulting from water circulation (e.g., Seipold and Schilling 2003), rather than the purely conductive heat transfer regime assumed in the first version of our model. The modeling proceeds as in Castillo-Rogez and Lunine (2010), with the thermal conductivity of the methane clathrate crust rather than that of ice I. The former is several times lower than that of the latter, and the two have rather different temperature dependences (English and Tse, 2009). The crust turns out to have essentially no bearing on the temperature of the silicate core and hence the timing of dehydration, but it profoundly affects the thickness of the high-pressure ice layer beneath the ocean. Indeed, with the insulating methane clathrate crust, there must be a liquid water ocean beneath the methane clathrate crust and in contact with the silicates beneath for most of Titan's history. Although a high-pressure ice layer is likely in place today, it is thin enough that plumes of hot water from the dehydrating core probably breach the high pressure ice layer maintaining contact between the ocean and the silicate core. Part of this work has been performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract to NASA. Government sponsorship acknowledged.
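The insulating effect of a clathrate crust can be seen with the simplest steady-state conduction estimate, dT = q*d/k for a planar layer. The sketch below contrasts a commonly used ice I conductivity (k ~ 651/T W/m/K) with a roughly temperature-independent clathrate value near 0.5 W/m/K; the heat flux, crust thickness, and reference temperature are illustrative assumptions, not model values from the paper.

```python
def delta_T_conductive(q_wm2, d_m, k_wmk):
    """Steady-state temperature drop across a planar conductive layer:
    dT = q * d / k (constant conductivity)."""
    return q_wm2 * d_m / k_wmk

q, d = 5e-3, 10e3   # ~5 mW/m^2 heat flux through a 10 km crust (illustrative)
for name, k in [("ice I near 260 K (k ~ 651/T)", 651.0 / 260.0),
                ("CH4 clathrate (k ~ 0.5)", 0.5)]:
    print(f"{name}: dT ~ {delta_T_conductive(q, d, k):.0f} K")
# ice I: ~20 K; clathrate: ~100 K -- the clathrate crust is far more insulating
```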
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortensi, Javier; Baker, Benjamin Allen; Schunert, Sebastian
The INL is currently evolving the modeling and simulation (M&S) capability that will enable improved core operation as well as design and analysis of TREAT experiments. This M&S capability primarily uses MAMMOTH, a reactor physics application being developed under the Multiphysics Object-Oriented Simulation Environment (MOOSE) framework. MAMMOTH allows the coupling of a number of other MOOSE-based applications. This second year of work has been devoted to the generation of a deterministic reference solution for the full core, the preparation of anisotropic diffusion coefficients, the testing of the SPH equivalence method, and the improvement of the control rod modeling. In addition, this report includes the progress made in the modeling of the M8 core configuration and experiment vehicle since January of this year.
2017-04-13
Demonstrations ported to OmpSs included a basic algorithm for image processing applications, a mini application representative of an ocean modelling code, a parallel benchmark, and a communication-avoiding version of the QR algorithm. Further, several improvements to the OmpSs model were [...] movement; and a port of the dynamic load balancing library to OmpSs. Finally, several updates to the tools infrastructure were accomplished, including: an [...]
Impact of Neutrino Opacities on Core-collapse Supernova Simulations
NASA Astrophysics Data System (ADS)
Kotake, Kei; Takiwaki, Tomoya; Fischer, Tobias; Nakamura, Ko; Martínez-Pinedo, Gabriel
2018-02-01
The accurate description of neutrino opacities is central to both the core-collapse supernova (CCSN) phenomenon and the validity of the explosion mechanism itself. In this work, we study in a systematic fashion the role of a variety of well-selected neutrino opacities in CCSN simulations where the multi-energy, three-flavor neutrino transport is solved using the isotropic diffusion source approximation (IDSA) scheme. To verify our code, we first present results from one-dimensional (1D) simulations following the core collapse, bounce, and ∼250 ms postbounce of a 15 M⊙ star using a standard set of neutrino opacities by Bruenn. A detailed comparison with published results supports the reliability of our three-flavor IDSA scheme using the standard opacity set. We then investigate in 1D simulations how individual opacity updates lead to differences with the baseline run with the standard opacity set. Through detailed comparisons with previous work, we check the validity of our implementation of each update in a step-by-step manner. Individual neutrino opacities with the largest impact on the overall evolution in 1D simulations are selected for systematic comparisons in our two-dimensional (2D) simulations. Special attention is given to the criterion of explodability in the 2D models. We discuss the implications of these results as well as their limitations and the requirements for future, more elaborate CCSN modeling.
Update on the NASA GEOS-5 Aerosol Forecasting and Data Assimilation System
NASA Technical Reports Server (NTRS)
Colarco, Peter; da Silva, Arlindo; Aquila, Valentina; Bian, Huisheng; Buchard, Virginie; Castellanos, Patricia; Darmenov, Anton; Follette-Cook, Melanie; Govindaraju, Ravi; Keller, Christoph;
2017-01-01
GEOS-5 is the Goddard Earth Observing System model, maintained by the NASA Global Modeling and Assimilation Office (GMAO). Core development takes place within the GMAO and the Goddard Atmospheric Chemistry and Dynamics Laboratory, and with external partners. Primary GEOS-5 functions: an Earth system model for studying climate variability and change; research-quality reanalyses supporting NASA instrument teams and the scientific community; and near-real-time forecasts of meteorology, aerosols, and other atmospheric constituents to support NASA airborne campaigns.
American Contact Dermatitis Society Core Allergen Series: 2017 Update.
Schalock, Peter C; Dunnick, Cory A; Nedorost, Susan; Brod, Bruce; Warshaw, Erin; Mowad, Christen
The American Contact Dermatitis Society Core Allergen Series was introduced in 2012. After 4 years of use, changes in our recommended allergens are necessary. For the updated series, we have reordered the first 4 panels to approximately mirror the current TRUE Test and removed parthenolide, triclosan, glutaraldehyde, and jasmine. Polymyxin B, lavender, sodium benzoate, ethylhexylglycerin, and benzoic acid are new additions to the American Contact Dermatitis Society series.
NASA Astrophysics Data System (ADS)
McAllister, M.; Gochis, D.; Dugger, A. L.; Karsten, L. R.; McCreight, J. L.; Pan, L.; Rafieeinasab, A.; Read, L. K.; Sampson, K. M.; Yu, W.
2017-12-01
The community WRF-Hydro modeling system is publicly available and provides researchers and operational forecasters a flexible and extensible multi-scale, multi-physics capability for hydrologic modeling that can be run independently of, or fully interactively with, the WRF atmospheric model. The core WRF-Hydro physics model contains very high-resolution descriptions of terrestrial hydrologic process representations such as land-atmosphere exchanges of energy and moisture, snowpack evolution, infiltration, terrain routing, channel routing, basic reservoir representation and hydrologic data assimilation. Complementing the core physics components of WRF-Hydro is an ecosystem of pre- and post-processing tools that facilitate the preparation of terrain and meteorological input data, an open-source hydrologic model evaluation toolset (Rwrfhydro), hydrologic data assimilation capabilities with DART and advanced model visualization capabilities. The National Center for Atmospheric Research (NCAR), through collaborative support from the National Science Foundation and other funding partners, provides community support for the entire WRF-Hydro system through a variety of mechanisms. This presentation summarizes the enhanced user support capabilities that are being developed for the community WRF-Hydro modeling system. These products and services include a new website, open-source code repositories, documentation and user guides, test cases, online training materials, live hands-on training sessions, an email list serve, and individual user support via email through a new help desk ticketing system. The WRF-Hydro modeling system and supporting tools, which now include re-gridding scripts and model calibration, have recently been updated to Version 4 and are converging toward the capabilities of the National Water Model.
Ex-Vessel Core Melt Modeling Comparison between MELTSPREAD-CORQUENCH and MELCOR 2.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robb, Kevin R.; Farmer, Mitchell; Francis, Matthew W.
System-level code analyses by both United States and international researchers predict major core melting, bottom head failure, and corium-concrete interaction for Fukushima Daiichi Unit 1 (1F1). Although system codes such as MELCOR and MAAP are capable of capturing a wide range of accident phenomena, they currently do not contain detailed models for evaluating some ex-vessel core melt behavior. However, specialized codes containing more detailed modeling are available for melt spreading, such as MELTSPREAD, as well as for long-term molten corium-concrete interaction (MCCI) and debris coolability, such as CORQUENCH. In a preceding study, Enhanced Ex-Vessel Analysis for Fukushima Daiichi Unit 1: Melt Spreading and Core-Concrete Interaction Analyses with MELTSPREAD and CORQUENCH, the MELTSPREAD-CORQUENCH codes predicted that the 1F1 core melt readily cooled, in contrast to predictions by MELCOR. The user community has taken notice and is in the process of updating its system codes, specifically MAAP and MELCOR, to improve and reduce conservatism in their ex-vessel core melt models. This report investigates why the MELCOR v2.1 code and the MELTSPREAD and CORQUENCH 3.03 codes yield differing predictions of ex-vessel melt progression. To accomplish this, the differences in the treatment of the ex-vessel melt with respect to melt spreading and long-term coolability are examined. The differences in modeling approaches are summarized, and a comparison of example code predictions is provided.
Rethinking the core list of journals for libraries that serve schools and colleges of pharmacy.
Beckett, Robert D; Cole, Sabrina W; Rogers, Hannah K; Bickett, Skye; Seeger, Christina; McDaniel, Jennifer A
2014-10-01
The Core List of Journals for Libraries that Serve Schools and Colleges of Pharmacy is a guide for developing and maintaining pharmacy-affiliated library collections. A work group was created to update the list and design a process for updating that will streamline future revisions. Work group members searched the National Library of Medicine catalog for an initial list of journals and then applied inclusion criteria to narrow the list. The work group finalized the fifth edition of the list with 225 diverse publications and produced a sustainable set of criteria for journal inclusion, providing a structured, objective process for future updates.
Valenta, Annette L; Meagher, Emma A; Tachinardi, Umberto
2016-01-01
Since the inception of the Clinical and Translational Science Award (CTSA) program in 2006, leaders in education across CTSA sites have been developing and updating core competencies for Clinical and Translational Science (CTS) trainees. By 2009, 14 competency domains, including biomedical informatics, had been identified and published. Since that time, the evolution of the CTSA program, changes in the practice of CTS, the rapid adoption of electronic health records (EHRs), the growth of biomedical informatics, the explosion of big data, and the realization that some of the competencies had proven to be difficult to apply in practice have made it clear that the competencies should be updated. This paper describes the process undertaken and puts forth a new set of competencies that has been recently endorsed by the Clinical Research Informatics Workgroup of AMIA. In addition to providing context and background for the current version of the competencies, we hope this will serve as a model for revision of competencies over time. PMID:27121608
Functionally dissociable influences on learning rate in a dynamic environment
McGuire, Joseph T.; Nassar, Matthew R.; Gold, Joshua I.; Kable, Joseph W.
2015-01-01
Maintaining accurate beliefs in a changing environment requires dynamically adapting the rate at which one learns from new experiences. Beliefs should be stable in the face of noisy data, but malleable in periods of change or uncertainty. Here we used computational modeling, psychophysics and fMRI to show that adaptive learning is not a unitary phenomenon in the brain. Rather, it can be decomposed into three computationally and neuroanatomically distinct factors that were evident in human subjects performing a spatial-prediction task: (1) surprise-driven belief updating, related to BOLD activity in visual cortex; (2) uncertainty-driven belief updating, related to anterior prefrontal and parietal activity; and (3) reward-driven belief updating, a context-inappropriate behavioral tendency related to activity in ventral striatum. These distinct factors converged in a core system governing adaptive learning. This system, which included dorsomedial frontal cortex, responded to all three factors and predicted belief updating both across trials and across individuals. PMID:25459409
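A common reduced Bayesian formulation in this line of research (after Nassar and colleagues) captures how surprise and uncertainty jointly set the learning rate: alpha_t = Omega_t + (1 - Omega_t) * tau_t, where Omega is the change-point probability and tau the relative uncertainty. The sketch below is that delta rule in miniature; the inputs are invented, and the full model's estimation of Omega and tau from the data is omitted.

```python
def adaptive_delta_rule(belief, outcome, cpp, ru):
    """One step of a reduced Bayesian delta rule: the learning rate rises
    with surprise (change-point probability, cpp) and with relative
    uncertainty (ru), keeping beliefs stable under noise but malleable
    after environmental change."""
    alpha = cpp + (1.0 - cpp) * ru    # adaptive learning rate in [0, 1]
    delta = outcome - belief          # prediction error
    return belief + alpha * delta, alpha

belief = 0.0
for outcome, cpp, ru in [(0.1, 0.05, 0.3), (0.2, 0.05, 0.2), (5.0, 0.9, 0.2)]:
    belief, alpha = adaptive_delta_rule(belief, outcome, cpp, ru)
# the surprising final outcome (cpp = 0.9) produces a near-total belief update
```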
Khan, Aziz; Fornes, Oriol; Stigliani, Arnaud; Gheorghe, Marius; Castro-Mondragon, Jaime A; van der Lee, Robin; Bessy, Adrien; Chèneby, Jeanne; Kulkarni, Shubhada R; Tan, Ge; Baranasic, Damir; Arenillas, David J; Sandelin, Albin; Vandepoele, Klaas; Lenhard, Boris; Ballester, Benoît; Wasserman, Wyeth W; Parcy, François; Mathelier, Anthony
2018-01-04
JASPAR (http://jaspar.genereg.net) is an open-access database of curated, non-redundant transcription factor (TF)-binding profiles stored as position frequency matrices (PFMs) and TF flexible models (TFFMs) for TFs across multiple species in six taxonomic groups. In the 2018 release of JASPAR, the CORE collection has been expanded with 322 new PFMs (60 for vertebrates and 262 for plants) and 33 PFMs were updated (24 for vertebrates, 8 for plants and 1 for insects). These new profiles represent a 30% expansion compared to the 2016 release. In addition, we have introduced 316 TFFMs (95 for vertebrates, 218 for plants and 3 for insects). This release incorporates clusters of similar PFMs in each taxon and each TF class per taxon. The JASPAR 2018 CORE vertebrate collection of PFMs was used to predict TF-binding sites in the human genome. The predictions are made available to the scientific community through a UCSC Genome Browser track data hub. Finally, this update comes with a new web framework with an interactive and responsive user-interface, along with new features. All the underlying data can be retrieved programmatically using a RESTful API and through the JASPAR 2018 R/Bioconductor package. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
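Programmatic retrieval works roughly as below. The /api/v1/matrix/<ID>/ endpoint and the response fields shown follow the published API description, but treat both as assumptions and consult the live documentation at http://jaspar.genereg.net/api/ before relying on them.

```python
import requests

# Fetch one TF-binding profile from the JASPAR RESTful API (endpoint and
# field names assumed from the published API description).
resp = requests.get("http://jaspar.genereg.net/api/v1/matrix/MA0004.1/",
                    headers={"Accept": "application/json"}, timeout=30)
resp.raise_for_status()
profile = resp.json()
print(profile["name"], profile["matrix_id"])  # TF name and profile ID
pfm = profile["pfm"]  # position frequency matrix, keyed by base: {"A": [...], ...}
```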
Doyle, Caoilainn; Smeaton, Alan F.; Roche, Richard A. P.; Boran, Lorraine
2018-01-01
To elucidate the core executive function profile (strengths and weaknesses in inhibition, updating, and switching) associated with dyslexia, this study explored executive function in 27 children with dyslexia and 29 age matched controls using sensitive z-mean measures of each ability and controlled for individual differences in processing speed. This study found that developmental dyslexia is associated with inhibition and updating, but not switching impairments, at the error z-mean composite level, whilst controlling for processing speed. Inhibition and updating (but not switching) error composites predicted both dyslexia likelihood and reading ability across the full range of variation from typical to atypical. The predictive relationships were such that those with poorer performance on inhibition and updating measures were significantly more likely to have a diagnosis of developmental dyslexia and also demonstrate poorer reading ability. These findings suggest that inhibition and updating abilities are associated with developmental dyslexia and predict reading ability. Future studies should explore executive function training as an intervention for children with dyslexia as core executive functions appear to be modifiable with training and may transfer to improved reading ability. PMID:29892245
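A z-mean composite of the kind used here is simply the average of z-scored component measures, optionally residualized on processing speed before averaging. A minimal sketch with toy scores; the linear residualization stands in for the study's statistical control and is our own simplification.

```python
import numpy as np

rng = np.random.default_rng(3)
speed = rng.normal(size=56)                                # processing-speed covariate
errors = rng.normal(size=(56, 3)) + 0.4 * speed[:, None]   # three inhibition error measures

# Residualize each measure on processing speed (simple linear control) ...
resid = np.column_stack([e - np.polyval(np.polyfit(speed, e, 1), speed)
                         for e in errors.T])

# ... then z-score each measure and average into one composite per child.
z = (resid - resid.mean(axis=0)) / resid.std(axis=0, ddof=1)
inhibition_composite = z.mean(axis=1)
```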
eVolv2k: A new ice core-based volcanic forcing reconstruction for the past 2000 years
NASA Astrophysics Data System (ADS)
Toohey, Matthew; Sigl, Michael
2016-04-01
Radiative forcing resulting from stratospheric aerosols produced by major volcanic eruptions is a dominant driver of climate variability in the Earth's past. The ability of climate model simulations to accurately recreate past climate is tied directly to the accuracy of the volcanic forcing time series used in the simulations. We present here a new volcanic forcing reconstruction, based on newly updated ice core composites from Antarctica and Greenland. Ice core records are translated into stratospheric aerosol properties for use in climate models through the Easy Volcanic Aerosol (EVA) module, which provides an analytic representation of volcanic stratospheric aerosol forcing based on available observations and aerosol model results, prescribing the aerosol's radiative properties and primary modes of spatial and temporal variability. The eVolv2k volcanic forcing dataset covers the past 2000 years and has been provided for use in the Paleoclimate Modelling Intercomparison Project (PMIP) and VolMIP experiments within CMIP6. Here, we describe the construction of the eVolv2k data set, compare it with prior forcing sets, and show initial simulation results.
Romanowicz, Barbara; Cao, Aimin; Godwal, Budhiram; ...
2016-01-06
Using an updated data set of ballistic PKIKP travel time data at antipodal distances, we test different models of anisotropy in the Earth's innermost inner core (IMIC) and obtain significantly better fits for a fast axis aligned with Earth's rotation axis, rather than a quasi-equatorial direction, as proposed recently. Reviewing recent results on the single crystal structure and elasticity of iron at core conditions, we find that an hcp structure with the fast c axis parallel to Earth's rotation axis is more likely, but a body-centered cubic structure with the [111] axis aligned in that direction results in very similar predictions for seismic anisotropy. These models are therefore not distinguishable based on current seismological data. In addition, to match the seismological observations, the inferred strength of anisotropy in the IMIC (6-7%) implies almost perfect alignment of iron crystals, an intriguing, albeit unlikely situation, especially in the presence of heterogeneity, which calls for further studies. Key points: the fast axis of anisotropy in the central part of the inner core is aligned with Earth's axis of rotation; the structure of iron in the inner core is most likely hcp, not bcc; it is not currently possible to distinguish between hcp and bcc structures from seismic observations.
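Cylindrical inner core anisotropy is usually parameterized through the angle zeta between the ray path in the core and Earth's rotation axis, commonly as dv/v = a + b*cos^2(zeta) + c*cos^4(zeta). The sketch below is a deliberately minimal one-parameter version using the ~6-7% strength quoted in the abstract; the simplification to a single cos^2 term is our own.

```python
import numpy as np

def dv_over_v(zeta_deg, eps=0.065):
    """Velocity perturbation vs. angle from the rotation axis for a simple
    polar-fast cylindrical anisotropy (one-parameter sketch of the usual
    a + b*cos^2 + c*cos^4 parameterization)."""
    return eps * np.cos(np.radians(zeta_deg)) ** 2

for zeta in (0, 45, 90):   # polar, oblique, and equatorial ray paths
    print(zeta, f"{100 * dv_over_v(zeta):+.1f}%")
# fastest along polar paths; zero perturbation for equatorial paths
```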
Consent-based access to core EHR information. Collaborative approaches in Norway.
Heimly, Vigdis; Berntsen, Kirsti E
2009-01-01
Lack of access to updated drug information is a challenge for healthcare providers in Norway. Drug charts are updated in separate EHR systems but exchange of drug information between them is lacking. In order to provide ready access to updated medication information, a project for consent-based access to a core EHR has been established. End users have developed requirements for additions to the medication modules in the EHR systems in cooperation with vendors, researchers and standardization workers. The modules are then implemented by the vendors, tested in the usability lab, and finally tested by the national testing and approval service before implementation. An ethnographic study, with focus on future users and their interaction with other actors regarding medicines and medication, has included semi-/unstructured interviews with the involved organizational units. The core EHR uses the EHR kept by the patient's regular GP as the main source of information. A server-based solution has been chosen in order to keep the core EHR accessible outside the GP's regular work hours. The core EHR is being tested, and the EHR-vendors are implementing additions to their systems in order to facilitate communication with the core EHR. All major EHR-system vendors in Norway participate in the project. The core EHR provides a generic basis that may be used as a pilot for a national patient summary. Examples of a wider use of the core EHR can be: shared individual plans to support continuity of care, summary of the patient's contacts with health providers in different organizations, and core EHR information such as important diagnoses, allergies and contact information. Extensive electronic cooperation and communication requires that all partners adjust their documentation practices to fit with other actors' needs. The implementation effects on future work practices will be followed by researchers.
Tarzian, Anita J
2013-01-01
Ethics consultation has become an integral part of the fabric of U.S. health care delivery. This article summarizes the second edition of the Core Competencies for Health Care Ethics Consultation report of the American Society for Bioethics and Humanities. The core knowledge and skills competencies identified in the first edition of Core Competencies have been adopted by various ethics consultation services and education programs, providing evidence of their endorsement as health care ethics consultation (HCEC) standards. This revised report was prompted by thinking in the field that has evolved since the original report. Patients, family members, and health care providers who encounter ethical questions or concerns that ethics consultants could help address deserve access to efficient, effective, and accountable HCEC services. All individuals providing such services should be held to the standards of competence and quality described in the revised report.
Update on matter radii of O-2417
NASA Astrophysics Data System (ADS)
Fortune, H. T.
2018-05-01
The appearance of new theoretical papers concerning matter radii of neutron-rich oxygen nuclei has prompted a return to this problem. New results provide no better agreement with experimental values than did previous calculations with a simple model. I maintain that there is no reason to adjust the 22O core in the 24O nucleus, and the case of 24O should be reexamined experimentally.
Pena, Jose M; Manguno-Mire, Gina; Kinzie, Erik; Johnson, Janet E
2016-04-01
The authors describe the Tulane Model for teaching cultural competence to psychiatry residents in order to outline an innovative approach to curricula development in academic psychiatry. The authors focus on the didactic experience that takes place during the first and second postgraduate years and present seven core concepts that should inform the emerging clinician's thinking in the formulation of every clinical case. The authors discuss the correspondence between each core concept and the Outline for Cultural Formulation, introduced in Diagnostic and Statistical Manual of Mental Disorders (DSM)-IV and updated in DSM-5. The authors illustrate how each of the core concepts is utilized as a guideline for teaching residents a process for eliciting culturally relevant information from their patients and their personal histories and how to apply that knowledge in the assessment and treatment of patients in clinical settings.
Essential Distinctiveness: Strategic Alternatives in Updating the Business Core Curriculum
ERIC Educational Resources Information Center
Alstete, Jeffrey W.
2013-01-01
Purpose: This paper seeks to propose the use of specific strategic management tools for identifying opportunities for gaining competitive advantage in the business core curricula offered at colleges and universities. Design/methodology/approach: A brief review of the literature on business core curriculum innovation and change is examined, and…
ERIC Educational Resources Information Center
Carroll, Kathleen
2015-01-01
The challenge of updating curriculum to align with Common Core State Standards is a national one felt by states, districts, and teachers alike. Teachers generally express enthusiasm for the Common Core, but consistently cite a lack of high-quality curricula as an impediment to teaching them. The demand for core-aligned quality materials has far…
McKerrow, Alexa; Davidson, A.; Earnhardt, Todd; Benson, Abigail L.; Toth, Charles; Holm, Thomas; Jutz, Boris
2014-01-01
Over the past decade, great progress has been made to develop national-extent land cover mapping products to address natural resource issues. One of the core products of the GAP Program is range-wide species distribution models for nearly 2000 terrestrial vertebrate species in the U.S. We rely on deductive modeling of habitat affinities using these products to create models of habitat availability. That approach requires a thematically rich and ecologically meaningful map legend to support the modeling effort. In this work, we tested the integration of the Multi-Resolution Land Characteristics (MRLC) Consortium's National Land Cover Database 2011 and LANDFIRE's Disturbance Products to update the 2001 National GAP Vegetation Dataset to reflect 2011 conditions. The revised product can then be used to update the species models. We tested the update approach in three geographic areas (Northeast, Southeast, and Interior Northwest). We used the NLCD product to identify areas where the cover type mapped in 2011 was different from what was in the 2001 land cover map. We used Google Earth and ArcGIS base maps as reference imagery in order to label areas identified as "changed" with the appropriate class from our map legend. Areas mapped as urban or water in the 2011 NLCD map that were mapped differently in the 2001 GAP map were accepted without further validation and recoded to the corresponding GAP class. We used LANDFIRE's Disturbance Products to identify changes that are the result of recent disturbance and to inform the reassignment of areas to their updated thematic label. We ran species habitat models for three species: Lewis's woodpecker (Melanerpes lewis), the white-tailed jackrabbit (Lepus townsendii), and the brown-headed nuthatch (Sitta pusilla). For each of the three species we found important differences in the amount and location of suitable habitat between the 2001 and 2011 habitat maps. Specifically, brown-headed nuthatch habitat in 2011 was 14% smaller than the 2001 modeled habitat, whereas Lewis's woodpecker habitat increased by 4%. The white-tailed jackrabbit had a net change of -1% (11% decline, 10% gain). For that species we found the updates related to the opening of forest due to burning and to regenerating shrubs following harvest to be the main locally important transitions. In the Southeast, updates related to timber management and urbanization were locally important.
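In raster terms, the accept-without-review rule for urban and water reduces to a few array operations. A toy numpy sketch follows; the class codes and tiny grids are invented, and real GAP/NLCD legends are far richer.

```python
import numpy as np

URBAN, WATER = 21, 11             # hypothetical NLCD-style class codes
GAP_URBAN, GAP_WATER = 901, 902   # hypothetical GAP legend codes

rng = np.random.default_rng(7)
gap2001 = rng.integers(100, 110, size=(5, 5))   # toy 2001 GAP classes
nlcd2001 = rng.integers(10, 30, size=(5, 5))    # toy co-registered NLCD rasters
nlcd2011 = nlcd2001.copy()
nlcd2011[1, 1], nlcd2011[3, 2] = URBAN, WATER   # simulated change pixels

changed = nlcd2011 != nlcd2001                  # candidate change mask
updated = gap2001.copy()
updated[changed & (nlcd2011 == URBAN)] = GAP_URBAN   # accepted without review
updated[changed & (nlcd2011 == WATER)] = GAP_WATER
# Remaining changed pixels would be relabeled from reference imagery,
# informed by the LANDFIRE disturbance layers.
```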
Systems Modeling for Crew Core Body Temperature Prediction Postlanding
NASA Technical Reports Server (NTRS)
Cross, Cynthia; Ochoa, Dustin
2010-01-01
The Orion Crew Exploration Vehicle, NASA's latest crewed spacecraft project, presents many challenges to its designers, including ensuring crew survivability during nominal and off-nominal landing conditions. With a nominal water landing planned off the coast of San Clemente, California, off-nominal water landings could range from the far North Atlantic Ocean to the middle of the equatorial Pacific Ocean. For all of these conditions, the vehicle must provide sufficient life support resources to ensure that the crew members' core body temperatures are maintained at a safe level prior to crew rescue. This paper will examine the natural environments, the environments created inside the cabin, and the constraints associated with post-landing operations that affect the temperature of the crew members. Models of the capsule and the crew members are examined and analysis results are compared to the requirement for safe human exposure. Further, recommendations for updated modeling techniques and operational limits are included.
State Accountability in the Transition to Common Core. Updated
ERIC Educational Resources Information Center
Sears, Victoria
2014-01-01
The Common Core is at a critical juncture. While many surveys show that support for the standards themselves remains strong, implementation has not been without major challenges. "State Accountability in the Transition to Common Core," a new policy brief from the Thomas B. Fordham Institute, provides cautionary advice about what key…
Accelerating deep neural network training with inconsistent stochastic gradient descent.
Wang, Linnan; Yang, Yi; Min, Renqiang; Chakradhar, Srimat
2017-09-01
Stochastic Gradient Descent (SGD) updates a Convolutional Neural Network (CNN) with a noisy gradient computed from a random batch, and each batch updates the network exactly once per epoch. This scheme applies the same training effort to each batch, but it overlooks the fact that the gradient variance, induced by sampling bias and intrinsic image difference, produces different training dynamics across batches. In this paper, we develop a new training strategy for SGD, referred to as Inconsistent Stochastic Gradient Descent (ISGD), to address this problem. The core concept of ISGD is inconsistent training, which dynamically adjusts the training effort with respect to the loss. ISGD models training as a stochastic process that gradually reduces the mean of the batch loss, and it uses a dynamic upper control limit to identify a large-loss batch on the fly. ISGD stays on the identified batch to accelerate training with additional gradient updates, and it also has a constraint to penalize drastic parameter changes. ISGD is straightforward, computationally efficient, and requires no auxiliary memory. A series of empirical evaluations on real-world datasets and networks demonstrate the promising performance of inconsistent training. Copyright © 2017 Elsevier Ltd. All rights reserved.
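A minimal sketch of the inconsistent-training idea on a toy least-squares problem follows; the control-limit form (mean plus three standard deviations of recent batch losses), the extra-update cap, and the proximal coefficient are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression standing in for a CNN: loss(w) = ||X w - y||^2 / batch size.
X, true_w = rng.normal(size=(512, 10)), rng.normal(size=10)
y = X @ true_w + 0.1 * rng.normal(size=512)
w = np.zeros(10)

def batch_loss_grad(w, idx):
    r = X[idx] @ w - y[idx]
    return (r @ r) / len(idx), 2 * X[idx].T @ r / len(idx)

lr, window, extra_cap, prox = 0.05, 20, 5, 0.5
recent = []  # running record of batch losses used for the control limit

for epoch in range(10):
    for idx in np.array_split(rng.permutation(len(y)), 16):
        loss, g = batch_loss_grad(w, idx)
        w -= lr * g
        if len(recent) >= window:
            # Dynamic upper control limit (assumed form: mean + 3 sigma).
            limit = np.mean(recent) + 3 * np.std(recent)
            w0, extra = w.copy(), 0
            while loss > limit and extra < extra_cap:
                loss, g = batch_loss_grad(w, idx)
                # Proximal term penalizes drastic parameter changes while the
                # solver "stays" on the identified large-loss batch.
                w -= lr * (g + prox * (w - w0))
                extra += 1
        recent = (recent + [loss])[-window:]
```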
NASA Astrophysics Data System (ADS)
Pietrinferni, Adriano; Cassisi, Santi; Salaris, Maurizio; Castelli, Fiorella
2004-09-01
We present a large and updated stellar evolution database for low-, intermediate-, and high-mass stars in a wide metallicity range, suitable for studying Galactic and extragalactic simple and composite stellar populations using population synthesis techniques. The stellar mass range is between ~0.5 and 10 Msolar with a fine mass spacing. The metallicity [Fe/H] comprises 10 values ranging from -2.27 to 0.40, with a scaled solar metal distribution. The initial He mass fraction ranges from Y=0.245, for the more metal-poor composition, up to 0.303 for the more metal-rich one, with ΔY/ΔZ~1.4. For each adopted chemical composition, the evolutionary models have been computed without (canonical models) and with overshooting from the Schwarzschild boundary of the convective cores during the central H-burning phase. Semiconvection is included in the treatment of core convection during the He-burning phase. The whole set of evolutionary models can be used to compute isochrones in a wide age range, from ~30 Myr to ~15 Gyr. Both evolutionary models and isochrones are available in several observational planes, employing an updated set of bolometric corrections and color-Teff relations computed for this project. The number of points along the models and the resulting isochrones is selected in such a way that interpolation for intermediate metallicities not contained in the grid is straightforward; a simple quadratic interpolation produces results of sufficient accuracy for population synthesis applications. We compare our isochrones with results from a series of widely used stellar evolution databases and perform some empirical tests of the reliability of our models. Since this work is devoted to scaled solar chemical compositions, we focus our attention on the Galactic disk stellar populations, employing multicolor photometry of unevolved field main-sequence stars with precise Hipparcos parallaxes, well-studied open clusters, and one eclipsing binary system with precise measurements of masses, radii, and [Fe/H] of both components. We find that the predicted metallicity dependence of the location of the lower, unevolved main sequence in the color-magnitude diagram (CMD) appears in satisfactory agreement with empirical data. When comparing our models with CMDs of selected, well-studied open clusters, we were again able to match the whole observed evolutionary sequences by assuming cluster distance and reddening estimates in satisfactory agreement with empirical evaluations of these quantities. In general, models including overshooting during the H-burning phase provide a better match to the observations, at least for ages below ~4 Gyr. At [Fe/H] around solar and higher ages (i.e., smaller convective cores) before the onset of radiative cores, the selected efficiency of core overshooting may be too high in our models, as well as in various other models in the literature. Since we also provide canonical models, the reader is strongly encouraged to always compare the results from both sets in this critical age range.
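The quadratic interpolation mentioned above can be illustrated in a few lines of numpy; the [Fe/H] grid nodes and the turn-off luminosities below are placeholder values, not numbers from the database.

```python
import numpy as np

# Hypothetical grid: an isochrone quantity (e.g., turn-off log L/Lsun)
# tabulated at three grid metallicities bracketing the target value.
feh_grid = np.array([-0.66, -0.35, 0.06])   # grid [Fe/H] nodes (illustrative)
logL_to  = np.array([0.52, 0.47, 0.41])     # corresponding values (made up)

# Degree-2 (quadratic) interpolation, as the abstract suggests is sufficient.
coeffs = np.polyfit(feh_grid, logL_to, deg=2)
print(np.polyval(coeffs, -0.20))            # value at an off-grid metallicity
```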
NASA Astrophysics Data System (ADS)
Tracy, James L., Jr.
A study of ground-state binding energy values listed in the Atomic Mass Evaluation 2012 (AME2012) using an interpretive approach, as opposed to the exploratory methods of previous models, is presented. This model is based on a postulate requiring all protons to pair with available neutrons to form bound alpha clusters as the ground state for an N = Z core upon which excess neutrons are added. For each core, the trend of the binding energy as a function of excess neutrons in the isotopic chain can be fit with a three-term quadratic function. The quadratic parameter reveals a smooth decaying exponential function. By re-envisioning the determination of mass excess, the constant-term fit parameters, representing N = Z nuclei, reveal a near-symmetry around Z = 50. The linear fit parameters exhibit trends which are linear functions of core size. A neutron drip-line prediction is compared against current models. By considering the possibility of an alpha-cluster core, a new ground-state structure grouping scheme is presented; nucleon-nucleon pairing is shown to have a greater role in level filling. This model, referred to as the Alpha-Deuteron-Neutron Model, yields promising first results when considering root-mean-square variances from the AME2012. The beta decay of the neutron-rich isotope 74Cu has been studied using three high-purity germanium clover detectors at the Holifield Radioactive Ion Beam Facility at Oak Ridge National Laboratory. A high-resolution mass separator greatly improved the purity of the 74Cu beam by removing isobaric contaminants, thus allowing decay through its isobar chain to the stable 74Ge at the center of the LeRIBSS detector array without any decay chain member dominating. Using coincidence gating techniques, 121 gamma rays associated with 74Cu were isolated from the collective singles spectrum. Eighty-seven of these were placed in an expanded level scheme, and updated beta-feeding level intensities and log(ft) values are presented based on multiple newly placed excited states up to 6.8 MeV. The progression of simulated Total Absorption gamma-ray Spectroscopy (TAGS) from known levels and beta-feeding values of previous measurements to this evaluation is presented, demonstrating the need for a TAGS measurement of this isotope to gain a more complete understanding of its decay scheme.
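A minimal sketch of the chain-by-chain fit described above follows; the binding-energy values are placeholders, not AME2012 data.

```python
import numpy as np

# Placeholder binding energies (MeV) along one isotopic chain versus the
# number of excess neutrons beyond the N = Z alpha-cluster core.
n_excess = np.arange(8)
B = np.array([160.6, 168.9, 176.5, 183.4, 189.6, 195.1, 199.9, 204.0])

# Three-term quadratic fit B(n) = a*n^2 + b*n + c, as described in the model;
# the quadratic parameter a, collected across cores, is the quantity whose
# trend the abstract characterizes as a decaying exponential.
a, b, c = np.polyfit(n_excess, B, deg=2)
print(f"quadratic parameter a = {a:.3f} MeV")
```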
Core dynamics and the nutations of the Earth.
NASA Astrophysics Data System (ADS)
Dehant, V. M. A.; Laguerre, R.; Rekier, J.; Rivoldini, A.; Trinh, A.; Triana, A. S.; Van Hoolst, T.; Zhu, P.
2016-12-01
We here present an overview of recent activities within the project RotaNut - Rotation and Nutation of a Wobbly Earth, funded by an ERC Advanced Grant from the European Research Council. We have recomputed the Basic Earth Parameters from recent VLBI series and interpret them in terms of the physics of the Earth's deep interior. This includes updates of the nutational constraints on the Earth's internal magnetic field and inner core viscosity, as well as of the coupling constants at the core-mantle boundary (CMB) and inner core boundary (ICB). We have explored, on simplified Earth models, the interactions between rotational and gravito-inertial modes. With the help of numerical simulations, we have also addressed the coupling between the global rotation and the inertial waves in the fluid core through parametric instabilities. Special attention has been given to the influence of the inner core on the stability properties of the liquid core and on large-scale structure formation in the turbulent flow through an inverse cascade of energy. The role of precession and nutation forcing for the liquid core is characterized, as well as the interaction between the Free Core Nutation (called the tilt-over mode in the fluid core community) and the inertial waves. This research represents the first steps in the project RotaNut, financed by the European Research Council under ERC Advanced Grant 670874 for 2015-2020.
Parallel transformation of K-SVD solar image denoising algorithm
NASA Astrophysics Data System (ADS)
Liang, Youwen; Tian, Yu; Li, Mei
2017-02-01
The images obtained by observing the Sun through a large telescope always suffer from noise due to the low SNR. The K-SVD denoising algorithm can effectively remove Gaussian white noise, but training dictionaries for sparse representations is a time-consuming task, due to the large size of the data involved and the complexity of the training algorithms. In this paper, OpenMP parallel programming is used to transform the serial algorithm into a parallel version, following a data-parallelism model. The biggest change is that multiple atoms, rather than one, are updated simultaneously. The denoising effect and acceleration performance were tested after completion of the parallel algorithm. The program achieves a speedup of 13.563 using 16 cores. This parallel version can fully utilize multi-core CPU hardware resources, greatly reducing running time, and is easily ported to multi-core platforms.
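The paper's implementation uses OpenMP in C; the following Python sketch illustrates the same structural idea of updating many dictionary atoms concurrently from one snapshot of the current dictionary and sparse codes (a Jacobi-style variant of the serial one-atom-at-a-time sweep). All sizes, the sparsity level, and the worker count are illustrative.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def update_atom(k, Y, D, X):
    """Rank-1 K-SVD update of atom k over the signals that use it."""
    omega = np.nonzero(X[k])[0]
    if omega.size == 0:
        return k, D[:, k], X[k]
    # Residual with atom k's own contribution restored, restricted to omega.
    E = Y[:, omega] - D @ X[:, omega] + np.outer(D[:, k], X[k, omega])
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    x_new = np.zeros_like(X[k])
    x_new[omega] = s[0] * Vt[0]
    return k, U[:, 0], x_new

rng = np.random.default_rng(1)
Y = rng.normal(size=(64, 1000))                     # noisy patches as columns
D = rng.normal(size=(64, 128))
D /= np.linalg.norm(D, axis=0)                      # unit-norm atoms
X = rng.normal(size=(128, 1000)) * (rng.random((128, 1000)) < 0.05)

# Serial K-SVD updates atoms one by one; here all updates are computed in
# parallel from the same (D, X) snapshot, then applied together.
with ThreadPoolExecutor(max_workers=16) as pool:
    results = list(pool.map(lambda k: update_atom(k, Y, D, X), range(D.shape[1])))
for k, d_k, x_k in results:
    D[:, k], X[k] = d_k, x_k
```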
Role of nuclear reactions on stellar evolution of intermediate-mass stars
NASA Astrophysics Data System (ADS)
Möller, H.; Jones, S.; Fischer, T.; Martínez-Pinedo, G.
2018-01-01
The evolution of intermediate-mass stars (8-12 solar masses) represents one of the most challenging subjects in nuclear astrophysics. Their final fate is highly uncertain and strongly model dependent. They can become white dwarfs, they can undergo electron-capture or core-collapse supernovae, or they might even proceed towards explosive oxygen burning and a subsequent thermonuclear explosion. We believe that an accurate description of nuclear reactions is crucial for determining the pre-supernova structure of these stars. We argue that, due to the possible development of an oxygen deflagration, a hydrodynamic description has to be used. We implement a nuclear reaction network with ∼200 nuclear species in the implicit hydrodynamic code AGILE. The reaction network considers all relevant nuclear electron captures and beta decays. For selected relevant nuclear species, we include a set of updated reaction rates and discuss their role in the evolution of the stellar core using selected stellar models as examples. We find that the final fate of these intermediate-mass stars depends sensitively on the density threshold for the weak processes that deleptonize the core.
Updating medical school psychiatry curricula to meet projected mental health needs.
Thomas, Susan; Pai, Nagesh; Dawes, Kerry; Wilson, Coralie; Williams, Virginia
2013-12-01
In view of the growing disease burden of mental disorders, we consider the pressing need to update medical school psychiatry education to better equip doctors to recognise and treat these conditions. Key challenges to the delivery of medical school mental health curricula, and possible directions for reform, are reviewed with the aims of stimulating collaboration and enhancing efficiency across schools. In Australia, medical school expansion provides opportunities to prepare many training doctors to meet growing mental health care needs. Despite this, published reviews of practice and curriculum models are notably lacking. Australia, unlike other countries, has yet to agree on a core curriculum in medical school psychiatry, with practices varying widely between schools. Curricula should equip doctors to better recognise and treat common mental disorders during early stages, as well as prepare some for specialist psychiatry training. High-quality, multidisciplinary teaching in varied clinical settings may boost teaching resources. Additionally, medical education provides opportunities to better equip doctors to take care of their own mental health. Key challenges are to achieve consensus on core curricula across Australian medical schools, and an appropriate proportion of medical school curriculum time for mental disorders, relative to their complexity and large disease burden.
Invasive Species Science Update (No. 5)
Dean Pearson; Yvette Ortega
2011-01-01
Welcome to the fifth issue of the Rocky Mountain Research Station's (RMRS) Invasive Species Science Update. The newsletter is produced by the RMRS Invasive Species Working Group (ISWG), which is a core group of scientists who volunteer to coordinate outreach of RMRS invasive species science to managers and the public. After publishing the past four newsletters, we...
Geomagnetic Jerks in the Swarm Era
NASA Astrophysics Data System (ADS)
Brown, William; Beggan, Ciaran; Macmillan, Susan
2016-08-01
The timely provision of geomagnetic observations as part of the European Space Agency (ESA) Swarm mission means that up-to-date analysis and modelling of the Earth's magnetic field can be conducted rapidly, in a manner not possible before. Observations from each of the three Swarm constellation satellites are available within 4 days, and a database of close-to-definitive ground observatory measurements is updated every 3 months. This makes it possible to study very recent variations of the core magnetic field. Here we investigate rapid, unpredictable internal field variations known as geomagnetic jerks. Given that jerks represent (currently) unpredictable changes in the core field, and that a jerk has been identified in 2014, since Swarm was launched, we ask what impact this might have on the future accuracy of the International Geomagnetic Reference Field (IGRF). We assess the performance of each of the IGRF-12 secular variation model candidates in light of recent jerks, given that four of the nine candidates are novel physics-based predictive models.
Super-nodal methods for space-time kinetics
NASA Astrophysics Data System (ADS)
Mertyurek, Ugur
The purpose of this research has been to develop an advanced Super-Nodal method to reduce the run time of 3-D core neutronics models, such as in the NESTLE reactor core simulator and FORMOSA nuclear fuel management optimization codes. Computational performance of the neutronics model is increased by reducing the number of spatial nodes used in the core modeling. However, as the number of spatial nodes decreases, the error in the solution increases. The Super-Nodal method reduces the error associated with the use of coarse nodes by providing a new set of cross sections and ADFs (Assembly Discontinuity Factors) for the new nodalization. These so-called homogenization parameters are obtained by employing a consistent collapsing technique. During this research a new type of singularity, namely the "fundamental mode singularity," is addressed in the ANM (Analytical Nodal Method) solution. The "Coordinate Shifting" approach is developed as a method to address this singularity. Also, the "Buckling Shifting" approach is developed as an alternative and more accurate method to address the zero-buckling singularity, which is a more common and well-known singularity problem in the ANM solution. In the course of addressing the treatment of these singularities, an effort was made to provide better and more robust results from the Super-Nodal method by developing several new methods for determining the transverse leakage and the collapsed diffusion coefficient, which generally are the two main approximations in the ANM methodology. Unfortunately, the proposed new transverse leakage and diffusion coefficient approximations failed to provide a consistent improvement over the current methodology. However, improvement in the Super-Nodal solution is achieved by updating the homogenization parameters at several time points during a transient. The update is achieved by employing a refinement technique similar to pin-power reconstruction. A simple error analysis based on the relative residual in the 3-D few-group diffusion equation at the fine-mesh level is also introduced in this work.
NASA Astrophysics Data System (ADS)
Paxton, Bill; Cantiello, Matteo; Arras, Phil; Bildsten, Lars; Brown, Edward F.; Dotter, Aaron; Mankovich, Christopher; Montgomery, M. H.; Stello, Dennis; Timmes, F. X.; Townsend, Richard
2013-09-01
We substantially update the capabilities of the open source software package Modules for Experiments in Stellar Astrophysics (MESA), and its one-dimensional stellar evolution module, MESA star. Improvements in MESA star's ability to model the evolution of giant planets now extend its applicability down to masses as low as one-tenth that of Jupiter. The dramatic improvement in asteroseismology enabled by the space-based Kepler and CoRoT missions motivates our full coupling of the ADIPLS adiabatic pulsation code with MESA star. This also motivates a numerical recasting of the Ledoux criterion that is more easily implemented when many nuclei are present at non-negligible abundances. This impacts the way in which MESA star calculates semi-convective and thermohaline mixing. We exhibit the evolution of 3-8 M⊙ stars through the end of core He burning, the onset of He thermal pulses, and arrival on the white dwarf cooling sequence. We implement diffusion of angular momentum and chemical abundances that enable calculations of rotating-star models, which we compare thoroughly with earlier work. We introduce a new treatment of radiation-dominated envelopes that allows the uninterrupted evolution of massive stars to core collapse. This enables the generation of new sets of supernova, long gamma-ray burst, and pair-instability progenitor models. We substantially modify the way in which MESA star solves the fully coupled stellar structure and composition equations, and we show how this has improved the scaling of MESA's calculational speed on multi-core processors. Updates to the modules for equation of state, opacity, nuclear reaction rates, and atmospheric boundary conditions are also provided. We describe the MESA Software Development Kit that packages all the required components needed to form a unified, maintained, and well-validated build environment for MESA. We also highlight a few tools developed by the community for rapid visualization of MESA star results.
The COMET Initiative database: progress and activities update (2015).
Gargon, E; Williamson, P R; Altman, D G; Blazeby, J M; Tunis, S; Clarke, M
2017-02-03
This letter describes the substantial activity on the Core Outcome Measures in Effectiveness Trials (COMET) website in 2015, updating our earlier progress reports for the period from the launch of the COMET website and database in August 2011 to December 2014. As in previous years, 2015 saw further increases in the annual number of visits to the website, the number of pages viewed and the number of searches undertaken. The sustained growth in use of the website and database suggests that COMET is continuing to gain interest and prominence, and that the resources are useful to people interested in the development of core outcome sets.
Mercier, Tracey J.; Brownfield, Michael E.; Johnson, Ronald C.; Self, Jesse G.
1998-01-01
This CD-ROM includes updated files containing Fischer assays of samples from core holes and cuttings from exploration drill holes drilled in the Eocene Green River Formation in the Piceance Basin of northwestern Colorado. A database was compiled that includes more than 321,380 Fischer assays from 782 boreholes. Most of the oil yield data were analyzed by the former U.S. Bureau of Mines oil shale laboratory in Laramie, Wyoming, and some analyses were made by private laboratories. Location data for 1,042 core and rotary holes, oil and gas tests, as well as a few surface sections are listed in a spreadsheet included on the CD-ROM. These assays are part of a larger collection of subsurface information held by the U.S. Geological Survey, including geophysical and lithologic logs, water data, and chemical and X-ray diffraction analyses having to do with the Green River oil shale deposits in Colorado, Wyoming, and Utah. Because of increased interest in oil shale, this CD-ROM containing updated Fischer assay data for the Piceance Basin oil shale deposits in northwestern Colorado is being released to the public.
Neutron star cooling and pion condensation
NASA Technical Reports Server (NTRS)
Umeda, Hideyuki; Nomoto, Ken'ichi; Tsuruta, Sachiko; Muto, Takumi; Tatsumi, Toshitaka
1994-01-01
The nonstandard cooling of a neutron star with a central pion core is explored. By adopting the latest results from pion condensation theory, the neutrino emissivity is calculated for both pure charged pions and a mixture of charged and neutral pions, and equations of state are constructed for the pion condensate. The effect of superfluidity on cooling is investigated, adopting methods more realistic than in previous studies. Our theoretical models are compared with the currently updated observational data, and possible implications are explored.
SAM Photovoltaic Model Technical Reference 2016 Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilman, Paul; DiOrio, Nicholas A; Freeman, Janine M
This manual describes the photovoltaic performance model in the System Advisor Model (SAM) software, Version 2016.3.14 Revision 4 (SSC Version 160). It is an update to the 2015 edition of the manual, which describes the photovoltaic model in SAM 2015.1.30 (SSC 41). This new edition includes corrections of errors in the 2015 edition and descriptions of new features introduced in SAM 2016.3.14, including: a 3D shade calculator; a battery storage model; DC power optimizer loss inputs; a snow loss model; a plane-of-array irradiance input option from the weather file; support for sub-hourly simulations; self-shading that works with all four subarrays and uses the same algorithm for fixed arrays and one-axis tracking; a linear self-shading algorithm for thin-film modules; and loss percentages that replace derate factors. The photovoltaic performance model is one of the modules in the SAM Simulation Core (SSC), which is part of both SAM and the SAM SDK. SAM is a user-friendly desktop application for analysis of renewable energy projects. The SAM SDK (Software Development Kit) is for developers writing their own renewable energy analysis software based on SSC. This manual is written for users of both SAM and the SAM SDK who want to learn more about the details of SAM's photovoltaic model.
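One of the listed changes is that loss percentages replace derate factors; the generic arithmetic relating the two conventions is shown below (ordinary percentage algebra, not SAM's internal code, and the loss values are made up).

```python
# Derate factors multiply (0.95 means a 5% loss); loss percentages subtract.
# Combining several loss percentages multiplicatively gives the equivalent
# single derate factor and total loss percentage.
losses_pct = [2.0, 3.0, 0.5]        # e.g. soiling, wiring, mismatch (made up)
derate = 1.0
for p in losses_pct:
    derate *= 1.0 - p / 100.0
total_loss_pct = (1.0 - derate) * 100.0
print(f"combined derate {derate:.4f} = total loss {total_loss_pct:.2f}%")
```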
Xiao, Ying; Bernstein, Karen De Amorim; Chetty, Indrin J; Eifel, Patricia; Hughes, Lesley; Klein, Eric E; McDermott, Patrick; Prisciandaro, Joann; Paliwal, Bhudatt; Price, Robert A; Werner-Wasik, Maria; Palta, Jatinder R
2011-11-15
In 2004, the American Society for Radiation Oncology (ASTRO) published its first physics education curriculum for residents, which was updated in 2007. A committee composed of physicists and physicians from various residency program teaching institutions was reconvened again to update the curriculum in 2009. Members of this committee have associations with ASTRO, the American Association of Physicists in Medicine, the Association of Residents in Radiation Oncology, the American Board of Radiology (ABR), and the American College of Radiology. Members reviewed and updated assigned subjects from the last curriculum. The updated curriculum was carefully reviewed by a representative from the ABR and other physics and clinical experts. The new curriculum resulted in a recommended 56-h course, excluding initial orientation. Learning objectives are provided for each subject area, and a detailed outline of material to be covered is given for each lecture hour. Some recent changes in the curriculum include the addition of Radiation Incidents and Bioterrorism Response Training as a subject and updates that reflect new treatment techniques and modalities in a number of core subjects. The new curriculum was approved by the ASTRO board in April 2010. We anticipate that physicists will use this curriculum for structuring their teaching programs, and subsequently the ABR will adopt this educational program for its written examination. Currently, the American College of Radiology uses the ASTRO curriculum for their training examination topics. In addition to the curriculum, the committee updated suggested references and the glossary. The ASTRO physics education curriculum for radiation oncology residents has been updated. To ensure continued commitment to a current and relevant curriculum, the subject matter will be updated again in 2 years. Copyright © 2011 Elsevier Inc. All rights reserved.
Modeling Radicalization Phenomena in Heterogeneous Populations.
Galam, Serge; Javarone, Marco Alberto
2016-01-01
The phenomenon of radicalization is investigated within a mixed population composed of core and sensitive subpopulations, the latter including first- to third-generation immigrants. The respective ways of life may be partially incompatible. In case of a conflict, core agents behave as inflexible about the issue. In contrast, sensitive agents can decide either to live peacefully, adjusting their way of life to that of the core, or to oppose it, eventually joining violent activities. The interplay between peaceful and opponent sensitive agents is driven by pairwise interactions, occurring both within the sensitive population and through mixing with core agents. The update process is monitored using a Lotka-Volterra-like ordinary differential equation. Given an initial tiny minority of opponents coexisting with both inflexible and peaceful agents, we investigate the implications for the emergence of radicalization. Opponents try to turn peaceful agents into opponents, driving radicalization. However, inflexible core agents may step in to bring opponents back to a peaceful choice, thus weakening the phenomenon. The required minimum individual core involvement to actually curb radicalization is calculated. It is found to be a function of both the majority or minority status of the sensitive subpopulation with respect to the core subpopulation and the degree of activeness of the opponents. The results highlight the instrumental role core agents can have in hindering radicalization within the sensitive subpopulation. Some hints are outlined to favor novel public policies towards social integration.
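As an illustration of the kind of Lotka-Volterra-like ODE monitoring described above, here is a minimal sketch with peaceful (P) and opponent (O) fractions of the sensitive subpopulation and an individual core-involvement parameter x; the functional form and all parameter values are assumptions for illustration, not the paper's exact equations.

```python
import numpy as np
from scipy.integrate import odeint

def rhs(y, t, x, a, b):
    """Illustrative pairwise dynamics: opponents convert peaceful agents at
    rate a, while core involvement x pulls opponents back at rate b*x."""
    P, O = y
    dP = -a * P * O + b * x * O
    dO = a * P * O - b * x * O
    return [dP, dO]

t = np.linspace(0.0, 50.0, 500)
for x in (0.05, 0.4):  # weak versus strong individual core involvement
    P, O = odeint(rhs, [0.95, 0.05], t, args=(x, 1.0, 1.0)).T
    print(f"x = {x}: final opponent fraction ~ {O[-1]:.3f}")
```

In this toy form, opponents grow while the peaceful fraction exceeds the effective threshold set by x, so raising core involvement shrinks the final opponent fraction, mirroring the paper's qualitative conclusion.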
Supplemental Thermal-Hydraulic Transient Analyses of BR2 in Support of Conversion to LEU Fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Licht, J.; Dionne, B.; Sikik, E.
2016-01-01
Belgian Reactor 2 (BR2) is a research and test reactor located in Mol, Belgium, used primarily for radioisotope production and materials testing. The Materials Management and Minimization (M3) Reactor Conversion Program of the National Nuclear Security Administration (NNSA) is supporting the conversion of the BR2 reactor from Highly Enriched Uranium (HEU) fuel to Low Enriched Uranium (LEU) fuel. The RELAP5/Mod 3.3 code has been used to perform transient thermal-hydraulic safety analyses of the BR2 reactor to support reactor conversion. A RELAP5 model of BR2 has been validated against selected transient BR2 reactor experiments performed in 1963 by showing agreement with measured cladding temperatures. Following the validation, the RELAP5 model was updated to represent the current use of the reactor, taking into account core configuration, neutronic parameters, trip settings, component changes, etc. Simulations of the 1963 experiments were repeated with this updated model to re-evaluate the boiling risks associated with the currently allowed maximum heat flux limit of 470 W/cm2 and the temporary heat flux limit of 600 W/cm2. This document provides analysis of additional transient simulations required as part of a modern BR2 safety analysis report (SAR): the effect of pool temperature, reduced steady-state flow rate, in-pool loss-of-coolant accidents, and loss of external cooling. The simulations described in this document have been performed for both an HEU- and an LEU-fueled core.
Invasive Species Science Update (No. 8)
Dean Pearson; Yvette Ortega; Jack Butler
2015-01-01
Invasive Species Science Updates are designed to keep managers and other users up-to-date with recently completed and ongoing research by RMRS scientists, as well as highlight breaking news related to invasive species issues. The newsletter is produced by the RMRS Invasive Species Working Group (ISWG), which is a core group of scientists who volunteer to coordinate...
Invasive Species Science Update (No. 7)
Dean Pearson; Yvette Ortega; Jack Butler
2014-01-01
Invasive Species Science Updates are designed to keep managers and other users up-to-date with recently completed and ongoing research by RMRS scientists, as well as highlight breaking news related to invasive species issues. The newsletter is produced by the RMRS Invasive Species Working Group (ISWG), which is a core group of scientists who volunteer to coordinate...
NASA Technical Reports Server (NTRS)
Berg, Melanie D.; LaBel, Kenneth A.
2018-01-01
The following are updated or new subjects added to the FPGA SEE Test Guidelines manual: academic versus mission-specific device evaluation; single event latch-up (SEL) test and analysis; SEE response visibility enhancement during radiation testing; mitigation evaluation (embedded and user-implemented); unreliable design and its effects on SEE data; testing flushable versus non-flushable architectures; intellectual property core (IP core) test and evaluation (addressing embedded and user-inserted cores); heavy-ion energy and linear energy transfer (LET) selection; proton versus heavy-ion testing; fault injection; mean-fluence-to-failure analysis; and mission-specific system-level single event upset (SEU) response prediction. Most sections within the guidelines manual provide information regarding best practices for test structure and test system development. The scope of this manual addresses academic versus mission-specific device evaluation and visibility enhancement in IP core testing.
NASA Astrophysics Data System (ADS)
Day, E. A.; Ward, J. A.; Bastow, I. D.; Irving, J. C. E.
2016-12-01
The Earth's inner core is a surprisingly complex region of our planet. Simple models of inner core solidification and evolution would lead us to expect a layered structure that has "frozen in" information about the state of the core at the time of solidification. However, seismic observations of Earth's inner core are not dominated by a radial, "tree-ring"-like pattern; instead they have revealed a hemispherical dichotomy in addition to depth-dependent variations. There is a degree-one structure in isotropic and anisotropic velocities and in attenuation between the so-called eastern and western hemispheres of the inner core, with different depth distributions proposed for these varying phenomena. A range of mechanisms has been proposed to explain the hemispherical differences, including models that require differences between the two hemispheres at the time of formation, post-solidification texturing, convection in the inner core, or hybrid mechanisms. Regional observations suggest that a simple division between East and West may not fully capture the structure present in the inner core, and more detailed seismic observations will help us understand the puzzle of the inner core's evolution. In this study we focus on updating observations of the seismic phase P'P', an inner-core-sensitive body wave with a more complex path than those typically used to study the inner core. By making new measurements of P'P' we illuminate new regions of the core with a high-frequency phase that is sensitive to small-scale structures. We examine the differential travel times of the different branches of P'P' (PKIKPPKIKP and PKPPKP), comparing the arrival time of the inner-core-turning branch, P'P'df, with the arrival times of branches that turn in the outer core. P'P' is a relatively small-amplitude phase, so we use both linear and non-linear stacking methods to make observations of the P'P' signals. These measurements are sensitive to the broad-scale hemispherical pattern of anisotropy in the inner core as well as smaller-scale variations.
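The non-linear stacking mentioned above is often an N-th root stack, which suppresses incoherent noise more aggressively than a linear mean; the sketch below contrasts the two on synthetic traces (the choice of N and all signal parameters are illustrative, and the study's exact stacking method may differ).

```python
import numpy as np

def nth_root_stack(traces, n=4):
    """N-th root stack: root, average, then restore the power (sign-preserving)."""
    r = np.mean(np.sign(traces) * np.abs(traces) ** (1.0 / n), axis=0)
    return np.sign(r) * np.abs(r) ** n

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 1000)
signal = 0.2 * np.exp(-((t - 5.0) ** 2) / 0.02)        # weak arrival at t = 5
traces = signal + 0.5 * rng.normal(size=(40, t.size))  # 40 noisy records

linear = traces.mean(axis=0)
nonlin = nth_root_stack(traces, n=4)
for name, stack in (("linear", linear), ("4th-root", nonlin)):
    print(name, "picked arrival at t =", t[np.argmax(np.abs(stack))])
```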
NASA Astrophysics Data System (ADS)
Finlay, Christopher C.; Olsen, Nils; Kotsiaros, Stavros; Gillet, Nicolas; Tøffner-Clausen, Lars
2016-07-01
We use more than 2 years of magnetic data from the Swarm mission, and monthly means from 160 ground observatories as available in March 2016, to update the CHAOS time-dependent geomagnetic field model. The new model, CHAOS-6, provides information on time variations of the core-generated part of the Earth's magnetic field between 1999.0 and 2016.5. We present details of the secular variation (SV) and secular acceleration (SA) from CHAOS-6 at Earth's surface and downward continued to the core surface. At Earth's surface, we find evidence for positive acceleration of the field intensity in 2015 over a broad area around longitude 90°E that is also seen at ground observatories such as Novosibirsk. At the core surface, we are able to map the SV up to at least degree 16. The radial field SA at the core surface in 2015 is found to be largest at low latitudes under the India-South-East Asia region, under the region of northern South America, and at high northern latitudes under Alaska and Siberia. Surprisingly, there is also evidence for significant SA in the central Pacific region, for example near Hawaii, where radial field SA is observed on either side of a jerk in 2014. On the other hand, little SV or SA has occurred over the past 17 years in the southern polar region. Inverting for a quasi-geostrophic core flow that accounts for this SV, we obtain a prominent planetary-scale, anti-cyclonic gyre centred on the Atlantic hemisphere. We also find oscillations of non-axisymmetric, azimuthal jets at low latitudes, for example close to 40°W, that may be responsible for localized SA oscillations. In addition to scalar data from Ørsted, CHAMP, SAC-C and Swarm, and vector data from Ørsted, CHAMP and Swarm, CHAOS-6 benefits from the inclusion of along-track differences of scalar and vector field data from both CHAMP and the three Swarm satellites, as well as east-west differences between the lower pair of Swarm satellites, Alpha and Charlie. Moreover, ground observatory SV estimates are fit to a Huber-weighted rms level of 3.1 nT/year for the eastward component and 3.8 and 3.7 nT/year for the vertical and southward components. We also present an update of the CHAOS high-degree lithospheric field, making use of along-track differences of CHAMP scalar and vector field data to produce a new static field model that agrees well with the MF7 field model out to degree 110.
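The quoted misfit levels are Huber-weighted rms values; a minimal sketch of that statistic follows, with an assumed Huber threshold (c = 1.5), a robust scale estimate, and synthetic observatory residuals.

```python
import numpy as np

def huber_weighted_rms(r, c=1.5):
    """RMS with Huber weights: unit weight inside c*scale, downweighted outside."""
    scale = np.median(np.abs(r)) / 0.6745          # robust estimate of sigma
    w = np.minimum(1.0, c * scale / np.abs(r))     # Huber IRLS-style weights
    return np.sqrt(np.sum(w * r**2) / np.sum(w))

rng = np.random.default_rng(3)
r = rng.normal(0.0, 3.5, size=500)   # synthetic SV residuals in nT/year
r[:5] += 40.0                        # a few outlying observatory series
print(huber_weighted_rms(r))         # near the bulk sigma, despite outliers
```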
Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
David W. Nigg, Principal Investigator; Kevin A. Steuhm, Project Manager
Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance, and to some extent, experiment management, are inconsistent with the state of modern nuclear engineering practice, and are difficult, if not impossible, to properly verify and validate (V&V) according to modern standards. Furthermore, the legacy staff knowledge required for application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In late 2009, the Idaho National Laboratory (INL) initiated a focused effort, the ATR Core Modeling Update Project, to address this situation through the introduction of modern high-fidelity computational software and protocols. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF). The ATR Core Modeling Update Project, targeted for full implementation in phase with the next anticipated ATR Core Internals Changeout (CIC) in the 2014-2015 time frame, began during the last quarter of Fiscal Year 2009, and has just completed its third full year. Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL under various licensing arrangements. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009, Cycle 145A through Cycle 151B, was successfully completed during 2012. This major effort supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR Core Safety Analysis Package (CSAP) preparation process, in parallel with the established PDQ-based methodology, beginning late in Fiscal Year 2012. Acquisition of the advanced SERPENT (VTT-Finland) and MC21 (DOE-NR) Monte Carlo stochastic neutronics simulation codes was also initiated during the year and some initial applications of SERPENT to ATRC experiment analysis were demonstrated. These two new codes will offer significant additional capability, including the possibility of full-3D Monte Carlo fuel management support capabilities for the ATR at some point in the future. Finally, a capability for rigorous sensitivity analysis and uncertainty quantification based on the TSUNAMI system has been implemented and initial computational results have been obtained. This capability will have many applications as a tool for understanding the margins of uncertainty in the new models as well as for validation experiment design and interpretation.
The Updated BaSTI Stellar Evolution Models and Isochrones. I. Solar-scaled Calculations
NASA Astrophysics Data System (ADS)
Hidalgo, Sebastian L.; Pietrinferni, Adriano; Cassisi, Santi; Salaris, Maurizio; Mucciarelli, Alessio; Savino, Alessandro; Aparicio, Antonio; Silva Aguirre, Victor; Verma, Kuldeep
2018-04-01
We present an updated release of the BaSTI (a Bag of Stellar Tracks and Isochrones) stellar model and isochrone library for a solar-scaled heavy element distribution. The main input physics that have been changed from the previous BaSTI release include the solar metal mixture, electron conduction opacities, a few nuclear reaction rates, bolometric corrections, and the treatment of the overshooting efficiency for shrinking convective cores. The new model calculations cover a mass range between 0.1 and 15 M ⊙, 22 initial chemical compositions between [Fe/H] = ‑3.20 and +0.45, with helium to metal enrichment ratio dY/dZ = 1.31. The isochrones cover an age range between 20 Myr and 14.5 Gyr, consistently take into account the pre-main-sequence phase, and have been translated to a large number of popular photometric systems. Asteroseismic properties of the theoretical models have also been calculated. We compare our isochrones with results from independent databases and with several sets of observations to test the accuracy of the calculations. All stellar evolution tracks, asteroseismic properties, and isochrones are made available through a dedicated web site.
Status and threats analysis for the Florida manatee (Trichechus manatus latirostris), 2012
Runge, Michael C.; Langtimm, Catherine A.; Martin, Julien; Fonnesbeck, Christopher J.
2015-01-01
The endangered West Indian manatee (Trichechus manatus), especially the Florida subspecies (T. m. latirostris), has been the focus of conservation efforts and extensive research since its listing under the Endangered Species Act. On the basis of the best information available as of December 2012, the threats facing the Florida manatee were determined to be less severe than previously thought, either because the conservation efforts have been successful, or because our knowledge of the demographic effects of those threats has increased, or both. Using the manatee Core Biological Model, we estimated the probability of the Florida manatee population on either the Atlantic or Gulf coast falling below 500 adults in the next 150 years to be 0.92 percent. The primary threats remain watercraft-related mortality and long-term loss of warm-water habitat. Since 2009, however, there have been a number of unusual events that have not yet been incorporated into this analysis, including several severely cold winters, a severe red-tide die-off, and substantial loss of seagrass habitat in Brevard County, Fla. Further, the version of the Core Biological Model used in 2012 makes a number of assumptions that are under investigation. A revision of the Core Biological Model and an update of this quantitative threats analysis are underway as of 2015.
Factors influencing infants’ ability to update object representations in memory
Moher, Mariko; Feigenson, Lisa
2013-01-01
Remembering persisting objects over occlusion is critical to representing a stable environment. Infants remember hidden objects at multiple locations and can update their representation of a hidden array when an object is added or subtracted. However, the factors influencing these updating abilities have received little systematic exploration. Here we examined the flexibility of infants’ ability to update object representations. We tested 11-month-olds in a looking-time task in which objects were added to or subtracted from two hidden arrays. Across five experiments, infants successfully updated their representations of hidden arrays when the updating occurred successively at one array before beginning at the other. But when updating required alternating between two arrays, infants failed. However, simply connecting the two arrays with a thin strip of foam-core led infants to succeed. Our results suggest that infants’ construal of an event strongly affects their ability to update memory representations of hidden objects. When construing an event as containing multiple updates to the same array, infants succeed, but when construing the event as requiring the revisiting and updating of previously attended arrays, infants fail. PMID:24049245
Primary Sources. Update: Teachers' Views on Common Core State Standards
ERIC Educational Resources Information Center
Scholastic Inc. and the Bill & Melinda Gates Foundation, 2014
2014-01-01
Scholastic and the Bill & Melinda Gates Foundation fielded the third edition of the "Primary Sources" survey of America's teachers in July 2013 (see ED562664). Twenty thousand pre-K through grade 12 public school teachers responded, sharing their perspectives on issues important to their profession, including the Common Core State…
The Alzheimer’s Disease Neuroimaging Initiative Informatics Core: A Decade in Review
Toga, Arthur W.; Crawford, Karen L.
2015-01-01
The Informatics Core of the Alzheimer’s Disease Neuroimaging Initiative (ADNI) has coordinated data integration and dissemination for a continually growing and complex dataset in which both data contributors and recipients span institutions, scientific disciplines and geographic boundaries. This article provides an update on the accomplishments and future plans. PMID:26194316
30 CFR 550.297 - What information must a CID contain?
Code of Federal Regulations, 2014 CFR
2014-07-01
... drilled before your CID submittal that define the extent of the reservoirs. You must notify BOEM of any well that is drilled to total depth during the CID evaluation period and you may be required to update..., caliper curves) curves in an acceptable digital format; (4) Sidewall core/whole core and pressure-volume...
30 CFR 550.297 - What information must a CID contain?
Code of Federal Regulations, 2013 CFR
2013-07-01
... drilled before your CID submittal that define the extent of the reservoirs. You must notify BOEM of any well that is drilled to total depth during the CID evaluation period and you may be required to update..., caliper curves) curves in an acceptable digital format; (4) Sidewall core/whole core and pressure-volume...
30 CFR 550.297 - What information must a CID contain?
Code of Federal Regulations, 2012 CFR
2012-07-01
... drilled before your CID submittal that define the extent of the reservoirs. You must notify BOEM of any well that is drilled to total depth during the CID evaluation period and you may be required to update..., caliper curves) curves in an acceptable digital format; (4) Sidewall core/whole core and pressure-volume...
Economics America: Content Statements for State Standards in Economics, K-12.
ERIC Educational Resources Information Center
National Council on Economic Education, New York, NY.
This updated list of content standards covering economics is suggested for states developing their own economics standards. The list outlines the core requirements for basic literacy in economics for grades K-12. The statements are similar to designated content standards from other core subject areas. Key economic concepts describing their basic…
The Comparative Toxicogenomics Database: update 2017.
Davis, Allan Peter; Grondin, Cynthia J; Johnson, Robin J; Sciaky, Daniela; King, Benjamin L; McMorran, Roy; Wiegers, Jolene; Wiegers, Thomas C; Mattingly, Carolyn J
2017-01-04
The Comparative Toxicogenomics Database (CTD; http://ctdbase.org/) provides information about interactions between chemicals and gene products, and their relationships to diseases. Core CTD content (chemical-gene, chemical-disease and gene-disease interactions manually curated from the literature) is integrated both internally and with select external datasets to generate expanded networks and predict novel associations. Today, core CTD includes more than 30.5 million toxicogenomic connections relating chemicals/drugs, genes/proteins, diseases, taxa, Gene Ontology (GO) annotations, pathways, and gene interaction modules. In this update, we report a 33% increase in our core data content since 2015, describe our new exposure module (which harmonizes exposure science information with core toxicogenomic data) and introduce a novel dataset of GO-disease inferences (which identify common molecular underpinnings for seemingly unrelated pathologies). These advancements centralize and contextualize real-world chemical exposures with molecular pathways to help scientists generate testable hypotheses in an effort to understand the etiology and mechanisms underlying environmentally influenced diseases. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
ERIC Educational Resources Information Center
British Columbia Council on Admissions and Transfer, 2010
2010-01-01
In 2008, a number of changes were identified that expanded the scope of the updating required for Block Transfer for tourism management as follows: a new core curriculum for diploma programs; the need for expanded information on diploma to diploma transfer; and, a growing need for an expanded system of transfer identified in Campus 2020…
Core Noise: Implications of Emerging N+3 Designs and Acoustic Technology Needs
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.
2011-01-01
This presentation is a summary of the core-noise implications of NASA's primary N+3 aircraft concepts. These concepts are the MIT/P&W D8.5 Double Bubble design, the Boeing/GE SUGAR Volt hybrid gas-turbine/electric engine concept, the NASA N3-X Turboelectric Distributed Propulsion aircraft, and the NASA TBW-XN Truss-Braced Wing concept. The first two are future concepts for the Boeing 737/Airbus A320 US transcontinental mission of 180 passengers and a maximum range of 3000 nm. The last two are future concepts for the Boeing 777 transpacific mission of 350 passengers and a 7500 nm range. Sections of the presentation cover: turbofan design trends in the N+1.5 time frame and the already emerging importance of core noise; the NASA N+3 concepts and associated core-noise challenges; the historical trends for engine bypass ratio (BPR), overall pressure ratio (OPR), and combustor exit temperature; and a brief discussion of a noise research roadmap being developed to address the core-noise challenges identified for the N+3 concepts. The N+3 conceptual aircraft have (i) ultra-high bypass ratios, in the range of 18-30, accomplished either by a small-size, high-power-density core, by a hybrid design that allows for an increased fan size, or by a turboelectric distributed propulsion design; and (ii) very high OPR, in the 50-70 range. These trends will elevate the overall importance of turbomachinery core noise. The N+3 conceptual designs specify the need for the development and application of advanced liners and passive and active control strategies to reduce core noise. Current engineering prediction of core noise uses semi-empirical methods based on older turbofan engines, with (at best) updates for more recent designs. These models have not seen the same level of development and maturity as those for fan and jet noise and are grossly inadequate for the designs considered for the N+3 time frame. An aggressive program for the development of updated noise prediction tools for integrated core assemblies, as well as strategies for noise reduction and control, is needed in order to meet the NASA N+3 noise goals. The NASA Fundamental Aeronautics Program has the principal objective of overcoming today's national challenges in air transportation. The SFW Reduced-Perceived-Noise Technical Challenge aims to develop concepts and technologies to dramatically reduce perceived aircraft noise outside of airport boundaries. This reduction of aircraft noise is critical to enabling the anticipated large increase in future air traffic.
Mathelier, Anthony; Fornes, Oriol; Arenillas, David J.; Chen, Chih-yu; Denay, Grégoire; Lee, Jessica; Shi, Wenqiang; Shyr, Casper; Tan, Ge; Worsley-Hunt, Rebecca; Zhang, Allen W.; Parcy, François; Lenhard, Boris; Sandelin, Albin; Wasserman, Wyeth W.
2016-01-01
JASPAR (http://jaspar.genereg.net) is an open-access database storing curated, non-redundant transcription factor (TF) binding profiles representing transcription factor binding preferences as position frequency matrices for multiple species in six taxonomic groups. For this 2016 release, we expanded the JASPAR CORE collection with 494 new TF binding profiles (315 in vertebrates, 11 in nematodes, 3 in insects, 1 in fungi and 164 in plants) and updated 59 profiles (58 in vertebrates and 1 in fungi). The introduced profiles represent an 83% expansion and 10% update when compared to the previous release. We updated the structural annotation of the TF DNA binding domains (DBDs) following a published hierarchical structural classification. In addition, we introduced 130 transcription factor flexible models trained on ChIP-seq data for vertebrates, which capture dinucleotide dependencies within TF binding sites. This new JASPAR release is accompanied by a new web tool to infer JASPAR TF binding profiles recognized by a given TF protein sequence. Moreover, we provide the users with a Ruby module complementing the JASPAR API to ease programmatic access and use of the JASPAR collection of profiles. Finally, we provide the JASPAR2016 R/Bioconductor data package with the data of this release. PMID:26531826
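To make the position-frequency-matrix representation concrete, the sketch below converts a hypothetical 4 x 6 PFM into a log-odds position weight matrix against a uniform background and scores one site; the counts are invented, not an actual JASPAR profile.

```python
import numpy as np

# Invented 4 x 6 position frequency matrix; rows are A, C, G, T counts.
pfm = np.array([[ 4,  1, 95, 90,  3,  5],
                [30,  2,  1,  2,  5, 10],
                [60, 95,  2,  3, 90, 80],
                [ 6,  2,  2,  5,  2,  5]], dtype=float)

ppm = (pfm + 0.25) / (pfm.sum(axis=0) + 1.0)   # probabilities with a pseudocount
pwm = np.log2(ppm / 0.25)                      # log-odds vs. uniform background

site = "GGAAGG"                                # consensus of the invented matrix
rows = {"A": 0, "C": 1, "G": 2, "T": 3}
score = sum(pwm[rows[b], j] for j, b in enumerate(site))
print(f"log2-odds score of {site}: {score:.2f}")
```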
The Mixing of Regolith on the Moon and Beyond: A Model Refreshed
NASA Astrophysics Data System (ADS)
Costello, E.; Ghent, R. R.; Lucey, P. G.
2017-12-01
Meteoritic impactors constantly mix the lunar regolith, affecting stratigraphy, the lifetime of rays and other anomalous surface features, and the burial, exposure, and breakdown of volatiles and rocks. In this work we revisit the pioneering regolith mixing model presented by Gault et al. (1974), with updated assumptions and input parameters. Our updates significantly widen the parameter space and allow us to explore mixing as it is driven by different impactors in different materials (e.g. radar-dark halos and melt ponds). The updated treatment of micrometeorites suggests a very high rate of processing at the immediate lunar surface, with implications for rock breakdown and regolith production on melt ponds. We find that the inclusion of secondary impacts has a very strong effect on the rate and magnitude of mixing at all depths and timescales. Our calculations are in good agreement with the timescale of reworking in the top 2-3 cm of regolith that was predicted by observations of LROC temporal pairs and by the depth profile of 26Al abundance in Apollo drill cores. Further, our calculations with secondaries included are consistent with the depth profile of in situ exposure age calculated from Is/FeO and cosmic-ray track abundance in Apollo deep drill cores down to 50 cm. The mixing we predict is also consistent with the erasure of density anomalies, or `cold spots', observed in the top decimeters of regolith by LRO Diviner, and the 1 Gyr lifetime of 1-10 m thick Copernican rays. This exploration of the Moon's surface evolution has profound implications for our understanding of other planetary bodies. We take advantage of this computationally inexpensive analytic model and apply it to describe mixing on a variety of bodies across the solar system, including asteroids, Mercury, and Europa. We use the results of ongoing studies that describe porosity calculations and cratering laws in porous asteroid-like material to explore the reworking rate experienced by an asteroid. On Mercury, we apply this model to describe the rate at which reworking depletes water ice and calculate the maximum age of Mercury's polar ice deposits. We apply the model to Europa to understand the impact portion of its regolith evolution and provide insight into the sampling zone intended for a future Europa lander.
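As an illustration of the kind of computationally inexpensive analytic gardening estimate described above, the sketch below counts expected overturns at a given depth from a power-law crater production function. The constants C, B, and the excavation-depth fraction are hypothetical placeholders, not the calibrated values of the paper.

```python
import numpy as np

# Illustrative Gault-style regolith-overturn estimate. The production-
# function constants C and B below are placeholders, not the values used
# by Costello et al.; only the structure of the calculation is shown.

C = 1e-11              # craters per m^2 per yr with diameter > 1 m (hypothetical)
B = 2.9                # cumulative size-frequency slope (hypothetical)
DEPTH_FRACTION = 0.25  # assumed excavation depth as a fraction of crater diameter

def overturns(depth_m, time_yr):
    """Expected number of overturns of a layer at `depth_m` within `time_yr`.

    A point counts as overturned when a crater excavating at least to that
    depth forms on top of it, so the rate follows the cumulative production
    of craters with diameter > depth / DEPTH_FRACTION.
    """
    d_min = depth_m / DEPTH_FRACTION
    return C * d_min ** (-B) * time_yr

depths = np.logspace(-2, 1, 200)                  # 1 cm to 10 m
n = overturns(depths, 1e9)                        # overturn counts over 1 Gyr
d_once = np.interp(1.0, n[::-1], depths[::-1])    # depth overturned once per Gyr
print(f"{d_once:.3f} m")
```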
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dash, Satyakam; Khodayari, Ali; Zhou, Jilai
2017-05-02
Background. Clostridium thermocellum is a Gram-positive anaerobe with the ability to hydrolyze and metabolize cellulose into biofuels such as ethanol, making it an attractive candidate for consolidated bioprocessing (CBP). At present, metabolic engineering in C. thermocellum is hindered due to the incomplete description of its metabolic repertoire and regulation within a predictive metabolic model. Genome-scale metabolic (GSM) models augmented with kinetic models of metabolism have been shown to be effective at recapitulating perturbed metabolic phenotypes. Results. In this effort, we first update a second-generation genome-scale metabolic model (iCth446) for C. thermocellum by correcting cofactor dependencies, restoring elemental and charge balances, and updating GAM and NGAM values to improve phenotype predictions. The iCth446 model is next used as a scaffold to develop a core kinetic model (k-ctherm118) of the C. thermocellum central metabolism using the Ensemble Modeling (EM) paradigm. Model parameterization is carried out by simultaneously imposing fermentation yield data in lactate, malate, acetate, and hydrogen production pathways for 19 measured metabolites spanning a library of 19 distinct single and multiple gene knockout mutants along with 18 intracellular metabolite concentration data for a Δgldh mutant and ten experimentally measured Michaelis–Menten kinetic parameters. Conclusions. The k-ctherm118 model captures significant metabolic changes caused by (1) nitrogen limitation leading to increased yields for lactate, pyruvate, and amino acids, and (2) ethanol stress causing an increase in intracellular sugar phosphate concentrations (~1.5-fold) due to upregulation of cofactor pools. Robustness analysis of k-ctherm118 alludes to the presence of a secondary activity of ketol-acid reductoisomerase and possible regulation by valine and/or leucine pool levels. In addition, cross-validation and robustness analysis allude to missing elements in k-ctherm118 and suggest additional experiments to improve kinetic model prediction fidelity. Overall, the study quantitatively assesses the advantages of EM-based kinetic modeling towards improved prediction of C. thermocellum metabolism and develops a predictive kinetic model which can be used to design biofuel-overproducing strains.
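A minimal sketch of the kind of ODE system an ensemble-modeling kinetic model is assembled from may help fix ideas: a toy two-step pathway with Michaelis-Menten kinetics. All reaction names and Vmax/Km values are hypothetical.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy two-step pathway (substrate -> intermediate -> product) with
# Michaelis-Menten kinetics, sketching the kind of ODE system a kinetic
# model such as k-ctherm118 is built from. All parameters are placeholders.

def rates(t, y, vmax1=1.0, km1=0.5, vmax2=0.8, km2=0.3):
    s, i, p = y
    v1 = vmax1 * s / (km1 + s)   # substrate -> intermediate
    v2 = vmax2 * i / (km2 + i)   # intermediate -> product
    return [-v1, v1 - v2, v2]

sol = solve_ivp(rates, (0.0, 20.0), [5.0, 0.0, 0.0])
print(sol.y[:, -1])              # concentrations at t = 20

# Ensemble modeling then samples many parameter sets consistent with a
# reference flux state and keeps those that reproduce measured knockout
# phenotypes, rather than fitting a single parameter vector.
```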
Defining the system of care concept and philosophy: to update or not to update?
Stroul, Beth A; Blau, Gary M
2010-02-01
This commentary considers the task of updating the system of care concept and philosophy within its historical context, reviewing the original intent of the definition and clarifying misconceptions about its meaning. The authors identify the aspects of the concept and philosophy that should be updated based on the latest thinking, experience, and data, such as incorporating applicability to a broader range of populations, increasing the emphasis on the core values, specifying desired outcomes, and adding accountability as a critical element. An updated definition and values and principles are proposed, and the importance of always presenting the definition along with the accompanying specification of the philosophy is emphasized in order to increase its utility in assisting the field to move from theory to practice.
NASA Technical Reports Server (NTRS)
Long, M. S.; Yantosca, R.; Nielsen, J. E; Keller, C. A.; Da Silva, A.; Sulprizio, M. P.; Pawson, S.; Jacob, D. J.
2015-01-01
The GEOS-Chem global chemical transport model (CTM), used by a large atmospheric chemistry research community, has been re-engineered to also serve as an atmospheric chemistry module for Earth system models (ESMs). This was done using an Earth System Modeling Framework (ESMF) interface that operates independently of the GEOS-Chem scientific code, permitting the exact same GEOS-Chem code to be used as an ESM module or as a standalone CTM. In this manner, the continual stream of updates contributed by the CTM user community is automatically passed on to the ESM module, which remains state of science and referenced to the latest version of the standard GEOS-Chem CTM. A major step in this re-engineering was to make GEOS-Chem grid independent, i.e., capable of using any geophysical grid specified at run time. GEOS-Chem data sockets were also created for communication between modules and with external ESM code. The grid-independent, ESMF-compatible GEOS-Chem is now the standard version of the GEOS-Chem CTM. It has been implemented as an atmospheric chemistry module into the NASA GEOS-5 ESM. The coupled GEOS-5/GEOS-Chem system was tested for scalability and performance with a tropospheric oxidant-aerosol simulation (120 coupled species, 66 transported tracers) using 48-240 cores and message-passing interface (MPI) distributed-memory parallelization. Numerical experiments demonstrate that the GEOS-Chem chemistry module scales efficiently for the number of cores tested, with no degradation as the number of cores increases. Although inclusion of atmospheric chemistry in ESMs is computationally expensive, the excellent scalability of the chemistry module means that the relative cost goes down with an increasing number of cores in a massively parallel environment.
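The key software idea, grid independence, can be sketched in a few lines: the chemistry operator receives the grid geometry as a run-time argument instead of a compile-time constant. The class and function names below are illustrative, not the actual GEOS-Chem interfaces.

```python
from dataclasses import dataclass
import numpy as np

# Sketch of a grid-independent chemistry operator: the grid is a run-time
# argument, so the same column-wise science code can serve a standalone CTM
# or an ESM host at any resolution. Names are illustrative, not GEOS-Chem's.

@dataclass
class GridSpec:
    n_lon: int
    n_lat: int
    n_lev: int

def chemistry_step(grid: GridSpec, conc: np.ndarray, dt: float) -> np.ndarray:
    """Advance species concentrations (species, lev, lat, lon) by one step.

    The loop body only ever sees one column, so nothing here depends on the
    horizontal resolution chosen by the host model.
    """
    out = conc.copy()
    for j in range(grid.n_lat):
        for i in range(grid.n_lon):
            out[:, :, j, i] *= np.exp(-0.01 * dt)   # placeholder first-order loss
    return out

grid = GridSpec(n_lon=72, n_lat=46, n_lev=47)        # any resolution works
c = np.ones((6, grid.n_lev, grid.n_lat, grid.n_lon)) # 6 toy species
c = chemistry_step(grid, c, dt=600.0)
```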
NASA Astrophysics Data System (ADS)
Yang, Yang; Wang, Ziyu; Ding, Yi; Lu, Zhihong; Sun, Haoliang; Li, Ya; Wei, Jianhong; Xiong, Rui; Shi, Jing; Liu, Zhengyou; Lei, Qingquan
2013-11-01
This work reports the excellent dielectric properties of polyimide (PI) embedded with CaCu3Ti4O12 (CCTO) nanofibers. The dielectric behaviors were investigated over the frequency range 100 Hz to 1 MHz. It is shown that embedding CCTO nanofibers with a high aspect ratio (67) is an effective means to enhance the dielectric permittivity and reduce the percolation threshold. The dielectric permittivity of the PI/CCTO nanofiber composites reaches 85 at a filler loading of only 1.5 vol.%, while the dielectric loss is only 0.015 at 100 Hz. Monte Carlo simulation, using excluded-volume theory with soft-core and hard-core models, was used to investigate the percolation threshold of the CCTO-nanofiber-reinforced polyimide matrix. The results are in good agreement with percolation theory, and the hard-core model explains the percolation phenomena in the PI/CCTO nanofiber composites well. The dielectric properties of the composites meet the practical requirements for application in high-dielectric-constant capacitors and high-energy-density materials.
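For readers unfamiliar with excluded-volume percolation estimates for high-aspect-ratio fillers, the toy Monte Carlo sketch below drops random sticks in a unit square and checks for a spanning cluster with union-find. It is 2D and idealized, illustrating only the method, not the PI/CCTO system.

```python
import numpy as np

rng = np.random.default_rng(0)

def segments_intersect(p1, p2, p3, p4):
    # strict crossing test via signed areas (collinear touching ignored)
    d = lambda a, b, c: (b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0])
    return (d(p1, p2, p3) * d(p1, p2, p4) < 0) and (d(p3, p4, p1) * d(p3, p4, p2) < 0)

def spans(n_sticks, length=0.1):
    """True if randomly placed sticks form a left-to-right spanning cluster."""
    mid = rng.random((n_sticks, 2))
    ang = rng.random(n_sticks) * np.pi
    half = 0.5 * length * np.stack([np.cos(ang), np.sin(ang)], axis=1)
    a, b = mid - half, mid + half
    parent = list(range(n_sticks + 2))       # +2 virtual left/right walls
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    def union(x, y):
        parent[find(x)] = find(y)
    LEFT, RIGHT = n_sticks, n_sticks + 1
    for i in range(n_sticks):
        if min(a[i, 0], b[i, 0]) <= 0.0: union(i, LEFT)
        if max(a[i, 0], b[i, 0]) >= 1.0: union(i, RIGHT)
        for j in range(i):
            if segments_intersect(a[i], b[i], a[j], b[j]):
                union(i, j)
    return find(LEFT) == find(RIGHT)

# Fraction of realizations spanning at a given stick count:
print(np.mean([spans(400) for _ in range(20)]))
```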
Relativistic tests with lunar laser ranging
NASA Astrophysics Data System (ADS)
Hofmann, F.; Müller, J.
2018-02-01
This paper presents the recent version of the lunar laser ranging (LLR) analysis model at the Institut für Erdmessung (IfE), Leibniz Universität Hannover, and highlights a few tests of Einstein's theory of gravitation using LLR data. Investigations related to a possible temporal variation of the gravitational constant, the equivalence principle, the PPN parameters β and γ, and the geodetic precession were carried out. The LLR analysis model was updated to include gravitational effects of the Sun and planets on the Moon as an extended body. The higher-order gravitational interaction between Earth and Moon as well as the effects of the solid Earth tides on the lunar motion were refined. The modeled lunar rotation is now based on a 2-layer core/mantle model according to the DE430 ephemeris. The validity of Einstein's theory was studied using this updated analysis model and an LLR data set from 1970 to January 2015. Within the estimated accuracies, no deviations from Einstein's theory are detected. A relative temporal variation of the gravitational constant is estimated as $\dot{G}/G_0 = (7.1 \pm 7.6) \times 10^{-14}\,\mathrm{yr}^{-1}$, the test of the equivalence principle gives $\Delta(m_g/m_i)_{\mathrm{EM}} = (-3 \pm 5) \times 10^{-14}$, and a corresponding estimate was obtained for the Nordtvedt parameter.
Benchmarking GPU and CPU codes for Heisenberg spin glass over-relaxation
NASA Astrophysics Data System (ADS)
Bernaschi, M.; Parisi, G.; Parisi, L.
2011-06-01
We present a set of possible implementations for Graphics Processing Units (GPU) of the Over-relaxation technique applied to the 3D Heisenberg spin glass model. The results show that a carefully tuned code can achieve more than 100 GFlops/s of sustained performance and update a single spin in about 0.6 nanoseconds. A multi-hit technique that exploits the GPU shared memory further reduces this time. Such results are compared with those obtained by means of a highly-tuned vector-parallel code on latest generation multi-core CPUs.
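A minimal sketch of the over-relaxation move itself may be helpful: each spin is reflected about its local molecular field, which conserves energy exactly while decorrelating the configuration. The sketch assumes uniform ferromagnetic couplings rather than the quenched random bonds of a spin glass, and uses a checkerboard split so the simultaneous NumPy update stays valid (each sublattice's fields come only from the other one).

```python
import numpy as np

rng = np.random.default_rng(1)

L = 16
spins = rng.normal(size=(L, L, L, 3))
spins /= np.linalg.norm(spins, axis=-1, keepdims=True)
parity = np.indices((L, L, L)).sum(axis=0) % 2
even = (parity == 0)[..., None]

def local_field(s):
    # sum of the six nearest-neighbour spins (periodic boundaries)
    h = np.zeros_like(s)
    for axis in range(3):
        h += np.roll(s, 1, axis=axis) + np.roll(s, -1, axis=axis)
    return h

def half_sweep(s, mask):
    h = local_field(s)
    h2 = np.sum(h * h, axis=-1, keepdims=True)
    proj = np.sum(s * h, axis=-1, keepdims=True) / np.where(h2 > 0, h2, 1.0)
    return np.where(mask, 2.0 * proj * h - s, s)   # reflect s about h

energy = lambda s: -0.5 * np.sum(s * local_field(s))
e0 = energy(spins)
spins = half_sweep(half_sweep(spins, even), ~even)
print(abs(energy(spins) - e0) / abs(e0))           # ~1e-16: energy conserved
```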
Evaluation of Teachers and Leaders. State Implementation of Common Core State Standards
ERIC Educational Resources Information Center
Anderson, Kimberly; Mira, Mary Elizabeth
2014-01-01
By 2012, all of the states in this study had started implementing new or revised teacher and leader evaluation systems. There are many and varying updates to these systems, and some of them have been made to meet conditions for a state's federal "Race to the Top" (RTT) grant. Other updates have been made to meet conditions for a state's…
Hippo Signaling in Mitosis: An Updated View in Light of the MEN Pathway.
Hergovich, Alexander
2017-01-01
The Hippo pathway is an essential tumor suppressor signaling network that coordinates cell proliferation, death, and differentiation in higher eukaryotes. Intriguingly, the core components of the Hippo pathway are conserved from yeast to man, with the yeast analogs of mammalian MST1/2 (fly Hippo), MOB1 (fly Mats), LATS1/2 (fly Warts), and NDR1/2 (fly Tricornered) functioning as essential components of the mitotic exit network (MEN). Here, we update our previous summary of mitotic functions of Hippo core components in Drosophila melanogaster and mammals, with particular emphasis on similarities between the yeast MEN pathway and mitotic Hippo signaling. Mitotic functions of YAP and TAZ, the two main effectors of Hippo signaling, are also discussed.
Enhanced backgrounds in scene rendering with GTSIMS
NASA Astrophysics Data System (ADS)
Prussing, Keith F.; Pierson, Oliver; Cordell, Chris; Stewart, John; Nielson, Kevin
2018-05-01
A core component of modeling visible and infrared sensor responses is the ability to faithfully recreate background noise and clutter in a synthetic image. Most tracking and detection algorithms use a combination of signal-to-noise or clutter-to-noise ratios to determine if a signature is of interest. A primary source of clutter is the background that defines the environment in which a target is placed. Over the past few years, the Electro-Optical Systems Laboratory (EOSL) at the Georgia Tech Research Institute has made significant improvements to its in-house simulation framework GTSIMS. First, we have expanded our terrain models to include the effects of terrain orientation on emission and reflection. Second, we have included the ability to model dynamic reflections with full BRDF support. Third, we have added the ability to render physically accurate cirrus clouds. And finally, we have updated the overall rendering procedure to reduce the time necessary to generate a single frame by taking advantage of hardware acceleration. Here, we present the updates to GTSIMS that better predict clutter and noise due to non-uniform backgrounds. Specifically, we show how the addition of clouds, terrain, and improved non-uniform sky rendering improves our ability to represent clutter during scene generation.
ERIC Educational Resources Information Center
Winebrenner, Susan
2014-01-01
A gold mine of practical, easy-to-use teaching methods, strategies, and tips to improve learning outcomes for students who score below proficiency levels. This fully revised and updated third edition provides information on integrated learning, problem solving, and critical thinking in line with Common Core State Standards and 21st-century…
ERIC Educational Resources Information Center
Goodyear, Rodney K.; Brewer, Dominic J.; Gallagher, Karen Symms; Tracey, Terence J. G.; Claiborn, Charles D.; Lichtenberg, James W.; Wampold, Bruce E.
2009-01-01
Academic journals are the primary mode of communication among researchers, and they play a central role in the creation, diffusion, and use of knowledge. This article updates previous attempts to identify a core set of journals that most education scholars would acknowledge as consequential sources. On the basis of nominations from a panel of…
NASA Technical Reports Server (NTRS)
Righter, K.; Danielson, L.; Pando, K.; Shofner, G.; Lee, C. -T.
2013-01-01
Siderophile elements have been used to constrain conditions of core formation and differentiation for the Earth, Mars and other differentiated bodies [1]. Recent models for the Earth have concluded that the mantle and core did not fully equilibrate, and that the siderophile element contents of the mantle can only be explained under conditions where the oxygen fugacity changes from low to high during accretion while the mantle and core do not fully equilibrate [2,3]. However, these conclusions go against several physical and chemical constraints. First, calculations suggest that even with the composition of accreting material changing from reduced to oxidized over time, the fO2 defined by metal-silicate equilibrium does not change substantially, only by approximately 1 log fO2 unit [4]. An increase of more than 2 log fO2 units in mantle oxidation is required in the models of [2,3]. Secondly, calculations also show that metallic impacting material will become deformed and sheared during accretion to a large body, such that it becomes emulsified to a fine scale that allows equilibrium at nearly all conditions except possibly at the length scale of giant impacts [5] (contrary to the conclusions of [6]). Using new data for D(Mo) metal/silicate at high pressures, together with updated partitioning expressions for many other elements, we will show that metal-silicate equilibrium across a long span of Earth's accretion history may explain the concentrations of many siderophile elements in Earth's mantle. The modeling includes the refractory elements Ni, Co, Mo, and W, as well as the highly siderophile elements Au, Pd and Pt, and the volatile elements Cd, In, Bi, Sb, Ge and As.
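The "partitioning expressions" referred to above generally take a standard regression form in temperature, pressure, and oxygen fugacity. The sketch below shows that form with hypothetical coefficients; they are not the fitted D(Mo) values.

```python
# Hedged sketch of the standard parameterization used for metal-silicate
# partition coefficients: log D is fit as a function of temperature,
# pressure, and oxygen fugacity. The coefficients a, b, c below are
# hypothetical placeholders, not values fitted to experimental data.

def log_d(t_kelvin, p_gpa, delta_iw, valence, a=1.0, b=2500.0, c=-50.0):
    """log10 of a metal/silicate partition coefficient.

    delta_iw: oxygen fugacity relative to the iron-wustite buffer (log units).
    valence:  cation charge n; fO2 enters through the -n/4 exchange term.
    """
    return a + b / t_kelvin + c * p_gpa / t_kelvin - (valence / 4.0) * delta_iw

# Example: a 4+ cation at 3500 K, 40 GPa, two log units below IW.
print(10 ** log_d(3500.0, 40.0, delta_iw=-2.0, valence=4))
```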
NASA Astrophysics Data System (ADS)
Mielikainen, Jarno; Huang, Bormin; Huang, Allen
2015-10-01
The Thompson cloud microphysics scheme is a sophisticated cloud microphysics scheme in the Weather Research and Forecasting (WRF) model. The scheme is very suitable for massively parallel computation as there are no interactions among horizontal grid points. Compared to the earlier microphysics schemes, the Thompson scheme incorporates a large number of improvements. Thus, we have optimized the speed of this important part of WRF. The Intel Many Integrated Core (MIC) architecture ushers in a new era of supercomputing speed, performance, and compatibility, allowing developers to run code at trillions of calculations per second using a familiar programming model. In this paper, we present our results of optimizing the Thompson microphysics scheme on Intel MIC hardware. The Intel Xeon Phi coprocessor is the first product based on the Intel MIC architecture, and it consists of up to 61 cores connected by a high-performance on-die bidirectional interconnect. The coprocessor supports all important Intel development tools, so the development environment is a familiar one to a vast number of CPU developers. Achieving maximum performance on MIC, however, requires some novel optimization techniques. New optimizations for an updated Thompson scheme are discussed in this paper. The optimizations improved the performance of the original Thompson code on the Xeon Phi 7120P by a factor of 1.8x. Furthermore, the same optimizations improved the performance of the Thompson scheme on a dual-socket configuration of eight-core Intel Xeon E5-2670 CPUs by a factor of 1.8x compared to the original Thompson code.
Ordering of guarded and unguarded stores for no-sync I/O
Gara, Alan; Ohmacht, Martin
2013-06-25
A parallel computing system processes at least one store instruction. A first processor core issues a store instruction. A first queue, associated with the first processor core, stores the store instruction. A second queue, associated with a first local cache memory device of the first processor core, stores the store instruction. The first processor core updates first data in the first local cache memory device according to the store instruction. A third queue, associated with at least one shared cache memory device, stores the store instruction. The first processor core invalidates second data, associated with the store instruction, in the at least one shared cache memory. The first processor core invalidates third data, associated with the store instruction, in the other local cache memory devices of other processor cores. Finally, the first processor core flushes only the first queue.
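A toy software model of the claimed ordering scheme, with three queues, a local update, invalidation of other copies, and a flush of only the issue queue, may make the data flow easier to follow. Names and structure are illustrative, not the patented hardware design.

```python
from collections import deque

class Core:
    def __init__(self, name):
        self.name = name
        self.issue_q = deque()   # first queue: per-core issue order
        self.l1_q = deque()      # second queue: local cache update order
        self.l1 = {}             # local cache contents

shared_q = deque()               # third queue: shared-cache ordering
shared_cache = {}

def store(core, others, addr, value):
    op = (addr, value)
    core.issue_q.append(op)
    core.l1_q.append(op)
    shared_q.append(op)
    core.l1[addr] = value            # update own local cache
    shared_cache.pop(addr, None)     # invalidate the shared copy
    for other in others:
        other.l1.pop(addr, None)     # invalidate other cores' local copies
    core.issue_q.clear()             # flush only the first queue

c0, c1 = Core("core0"), Core("core1")
c1.l1["x"] = 0
store(c0, [c1], "x", 42)
print(c0.l1, c1.l1, len(c0.issue_q), len(c0.l1_q), len(shared_q))
```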
Mathelier, Anthony; Fornes, Oriol; Arenillas, David J; Chen, Chih-Yu; Denay, Grégoire; Lee, Jessica; Shi, Wenqiang; Shyr, Casper; Tan, Ge; Worsley-Hunt, Rebecca; Zhang, Allen W; Parcy, François; Lenhard, Boris; Sandelin, Albin; Wasserman, Wyeth W
2016-01-04
JASPAR (http://jaspar.genereg.net) is an open-access database storing curated, non-redundant transcription factor (TF) binding profiles representing transcription factor binding preferences as position frequency matrices for multiple species in six taxonomic groups. For this 2016 release, we expanded the JASPAR CORE collection with 494 new TF binding profiles (315 in vertebrates, 11 in nematodes, 3 in insects, 1 in fungi and 164 in plants) and updated 59 profiles (58 in vertebrates and 1 in fungi). The introduced profiles represent an 83% expansion and 10% update when compared to the previous release. We updated the structural annotation of the TF DNA binding domains (DBDs) following a published hierarchical structural classification. In addition, we introduced 130 transcription factor flexible models trained on ChIP-seq data for vertebrates, which capture dinucleotide dependencies within TF binding sites. This new JASPAR release is accompanied by a new web tool to infer JASPAR TF binding profiles recognized by a given TF protein sequence. Moreover, we provide the users with a Ruby module complementing the JASPAR API to ease programmatic access and use of the JASPAR collection of profiles. Finally, we provide the JASPAR2016 R/Bioconductor data package with the data of this release. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
The Silicon Trypanosome: a test case of iterative model extension in systems biology
Achcar, Fiona; Fadda, Abeer; Haanstra, Jurgen R.; Kerkhoven, Eduard J.; Kim, Dong-Hyun; Leroux, Alejandro E.; Papamarkou, Theodore; Rojas, Federico; Bakker, Barbara M.; Barrett, Michael P.; Clayton, Christine; Girolami, Mark; Luise Krauth-Siegel, R.; Matthews, Keith R.; Breitling, Rainer
2016-01-01
The African trypanosome, Trypanosoma brucei, is a unicellular parasite causing African Trypanosomiasis (sleeping sickness in humans and nagana in animals). Due to some of its unique properties, it has emerged as a popular model organism in systems biology. A predictive quantitative model of glycolysis in the bloodstream form of the parasite has been constructed and updated several times. The Silicon Trypanosome (SilicoTryp) is a project that brings together modellers and experimentalists to improve and extend this core model with new pathways and additional levels of regulation. These new extensions and analyses use computational methods that explicitly take different levels of uncertainty into account. During this project, numerous tools and techniques have been developed for this purpose, which can now be used for a wide range of different studies in systems biology. PMID:24797926
Experiments with a Regional Vector-Vorticity Model, and Comparison with Other Models
NASA Astrophysics Data System (ADS)
Konor, C. S.; Dazlich, D. A.; Jung, J.; Randall, D. A.
2017-12-01
The Vector-Vorticity Model (VVM) is an anelastic model with a unique dynamical core that predicts the three-dimensional vorticity instead of the three-dimensional momentum. The VVM is used in the CRMs of the Global Quasi-3D Multiscale Modeling Framework, which is discussed by Joon-Hee Jung and collaborators elsewhere in this session. We are updating the physics package of the VVM, replacing it with the physics package of the System for Atmosphere Modeling (SAM). The new physics package includes a double-moment microphysics, Mellor-Yamada turbulence, Monin-Obukhov surface fluxes, and the RRTMG radiation parameterization. We briefly describe the VVM and show results from standard test cases, including TWP-ICE. We compare the results with those obtained using the earlier physics. We also show results from experiments on convection aggregation in radiative-convective equilibrium, and compare with those obtained using both SAM and the Regional Atmospheric Modeling System (RAMS).
ERIC Educational Resources Information Center
Patterson, Brian F.; Mattern, Krista D.
2013-01-01
The continued accumulation of validity evidence for the core uses of educational assessments is critical to ensure that proper inferences will be made for those core purposes. To that end, the College Board has continued to follow previous cohorts of college students and this report provides updated validity evidence for using the SAT to predict…
Dynamic shaping of dopamine signals during probabilistic Pavlovian conditioning.
Hart, Andrew S; Clark, Jeremy J; Phillips, Paul E M
2015-01-01
Cue- and reward-evoked phasic dopamine activity during Pavlovian and operant conditioning paradigms is well correlated with reward-prediction errors from formal reinforcement learning models, which feature teaching signals in the form of discrepancies between actual and expected reward outcomes. Additionally, in learning tasks where conditioned cues probabilistically predict rewards, dopamine neurons show sustained cue-evoked responses that are correlated with the variance of reward and are maximal to cues predicting rewards with a probability of 0.5. Therefore, it has been suggested that sustained dopamine activity after cue presentation encodes the uncertainty of impending reward delivery. In the current study we examined the acquisition and maintenance of these neural correlates using fast-scan cyclic voltammetry in rats implanted with carbon fiber electrodes in the nucleus accumbens core during probabilistic Pavlovian conditioning. The advantage of this technique is that we can sample from the same animal and recording location throughout learning with single trial resolution. We report that dopamine release in the nucleus accumbens core contains correlates of both expected value and variance. A quantitative analysis of these signals throughout learning, and during the ongoing updating process after learning in probabilistic conditions, demonstrates that these correlates are dynamically encoded during these phases. Peak CS-evoked responses are correlated with expected value and predominate during early learning while a variance-correlated sustained CS signal develops during the post-asymptotic updating phase. Copyright © 2014 Elsevier Inc. All rights reserved.
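For concreteness, the reward-prediction-error account referenced above can be sketched in a few lines of simulation: a delta-rule value update under probabilistic reward, where the learned value tracks the reward probability p and the outcome variance p(1-p) peaks at p = 0.5. Parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Delta-rule sketch of cue-value learning under probabilistic reward.
# The prediction error at outcome is actual minus expected reward; with
# p = 0.5 the outcome variance p(1-p) is maximal, the proposed correlate
# of the sustained cue-evoked dopamine signal.

p, alpha, n_trials = 0.5, 0.1, 500
v = 0.0
errors = []
for _ in range(n_trials):
    r = float(rng.random() < p)   # probabilistic reward delivery
    delta = r - v                 # reward-prediction error at outcome
    v += alpha * delta            # incremental value update
    errors.append(delta)

print(round(v, 2), round(p * (1 - p), 2))   # learned value ~ p; variance p(1-p)
```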
The NASA Lightning Nitrogen Oxides Model (LNOM): Recent Updates and Applications
NASA Technical Reports Server (NTRS)
Koshak, William; Peterson, Harold; Biazar, Arastoo; Khan, Maudood; Wang, Lihua; Park, Yee-Hun
2011-01-01
Improvements to the NASA Marshall Space Flight Center Lightning Nitrogen Oxides Model (LNOM) and its application to the Community Multiscale Air Quality (CMAQ) modeling system are presented. The LNOM analyzes Lightning Mapping Array (LMA) and National Lightning Detection Network(tm) (NLDN) data to estimate the raw (i.e., unmixed and otherwise environmentally unmodified) vertical profile of lightning NOx (= NO + NO2). Lightning channel length distributions and lightning 10-m segment altitude distributions are also provided. In addition to NOx production from lightning return strokes, the LNOM now includes non-return stroke lightning NOx production due to: hot core stepped and dart leaders, stepped leader corona sheath, K-changes, continuing currents, and M-components. The impact of including LNOM-estimates of lightning NOx for an August 2006 run of CMAQ is discussed.
NASA Astrophysics Data System (ADS)
Kordy, M.; Wannamaker, P.; Maris, V.; Cherkaev, E.; Hill, G.
2016-01-01
Following the creation described in Part I of a deformable edge finite-element simulator for 3-D magnetotelluric (MT) responses using direct solvers, in Part II we develop an algorithm named HexMT for 3-D regularized inversion of MT data including topography. Direct solvers parallelized on large-RAM, symmetric multiprocessor (SMP) workstations are used also for the Gauss-Newton model update. By exploiting the data-space approach, the computational cost of the model update becomes much less in both time and computer memory than the cost of the forward simulation. In order to regularize using the second norm of the gradient, we factor the matrix related to the regularization term and apply its inverse to the Jacobian, which is done using the MKL PARDISO library. For dense matrix multiplication and factorization related to the model update, we use the PLASMA library which shows very good scalability across processor cores. A synthetic test inversion using a simple hill model shows that including topography can be important; in this case depression of the electric field by the hill can cause false conductors at depth or mask the presence of resistive structure. With a simple model of two buried bricks, a uniform spatial weighting for the norm of model smoothing recovered more accurate locations for the tomographic images compared to weightings which were a function of parameter Jacobians. We implement joint inversion for static distortion matrices tested using the Dublin secret model 2, for which we are able to reduce nRMS to ~1.1 while avoiding oscillatory convergence. Finally we test the code on field data by inverting full impedance and tipper MT responses collected around Mount St Helens in the Cascade volcanic chain. Among several prominent structures, the north-south trending, eruption-controlling shear zone is clearly imaged in the inversion.
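The computational advantage of the data-space approach mentioned above comes from inverting a data-sized rather than model-sized matrix. In the standard data-space Gauss-Newton/Occam form (notation assumed here, not copied from the paper):

```latex
% Data-space Gauss-Newton/Occam update (standard form; notation assumed):
% J is the Jacobian, C_m the model covariance (inverse regularization),
% C_d the data covariance, \hat{d} = d - f(m_k) + J(m_k - m_0).
\begin{align}
  \beta_{k+1} &= \left[\lambda C_d + J\, C_m\, J^{\mathsf T}\right]^{-1} \hat{d},\\
  m_{k+1} &= m_0 + C_m\, J^{\mathsf T}\, \beta_{k+1}.
\end{align}
```

Because the bracketed matrix is only of size N_d x N_d, the update cost is governed by the number of data rather than the much larger number of model cells.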
NASA Astrophysics Data System (ADS)
Jackson, S. J.; Reynolds, C.; Krevor, S. C.
2017-12-01
Predictions of the flow behaviour and storage capacity of CO2 in subsurface reservoirs are dependent on accurate modelling of multiphase flow and trapping. A number of studies have shown that small scale rock heterogeneities have a significant impact on CO2 flow propagating to larger scales. The need to simulate flow in heterogeneous reservoir systems has led to the development of numerical upscaling techniques which are widely used in industry. Less well understood, however, is the best approach for incorporating laboratory characterisations of small scale heterogeneities into models. At small scales, heterogeneity in the capillary pressure characteristic function becomes significant. We present a digital rock workflow that combines core flood experiments with numerical simulations to characterise sub-core scale capillary pressure heterogeneities within rock cores from several target UK storage reservoirs - the Bunter, Captain and Ormskirk sandstone formations. Measured intrinsic properties (permeability, capillary pressure, relative permeability) and 3D saturation maps from steady-state core flood experiments were the primary inputs to construct a 3D digital rock model in CMG IMEX. We used vertical end-point scaling to iteratively update the voxel by voxel capillary pressure curves from the average MICP curve; with each iteration more closely predicting the experimental saturations and pressure drops. Once characterised, the digital rock cores were used to predict equivalent flow functions, such as relative permeability and residual trapping, across the range of flow conditions estimated to prevail in the CO2 storage reservoirs. In the case of the Captain sandstone, rock cores were characterised across an entire 100 m vertical transect of the reservoir. This allowed analysis of the upscaled impact of small scale heterogeneity on flow and trapping. Figure 1 shows the varying degree to which heterogeneity impacted flow depending on the capillary number in the Captain sandstone. At low capillary numbers, typical of regions where flow is dominated by buoyancy, fluid flow is impeded and trapping enhanced. At high capillary numbers, typical of the near wellbore environment, the fluid distributed homogeneously and the equivalent relative permeability was higher leading to improved injectivity.
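A schematic version of the voxel-wise capillary-pressure characterization is sketched below: each voxel carries the core-averaged Pc(Sw) curve times a local scaling factor, iterated until a synthetic observed saturation map is matched. The curve shape, equilibrium pressure, and update rule are illustrative, not the CMG IMEX workflow itself.

```python
import numpy as np

rng = np.random.default_rng(3)

def pc_avg(sw):
    return 10.0 * (sw ** -0.5 - 1.0)           # placeholder MICP-style curve (kPa)

def sw_at(pc, factor):
    return (pc / (10.0 * factor) + 1.0) ** -2  # invert factor * pc_avg(sw) = pc

n_vox = 1000
observed_sw = np.clip(rng.normal(0.5, 0.1, n_vox), 0.05, 0.95)
factors = np.ones(n_vox)                       # per-voxel Pc scaling factors
pc_eq = 5.0                                    # uniform capillary pressure (kPa)

for _ in range(50):                            # fixed-point sweep on the factors
    model_sw = sw_at(pc_eq, factors)
    factors *= observed_sw / model_sw          # lower the Pc curve where Sw is too high

print(np.max(np.abs(sw_at(pc_eq, factors) - observed_sw)))   # ~0 after iteration
```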
NASA Astrophysics Data System (ADS)
Dullo, Bililign T.; Graham, Alister W.
2014-11-01
New surface brightness profiles from 26 early-type galaxies with suspected partially depleted cores have been extracted from the full radial extent of Hubble Space Telescope images. We have carefully quantified the radial stellar distributions of the elliptical galaxies using the core-Sérsic model, whereas for the lenticular galaxies a core-Sérsic bulge plus an exponential disc model gives the best representation. We additionally caution about the use of excessive multiple Sérsic functions for decomposing galaxies and compare with past fits in the literature. The structural parameters obtained from our fitted models are, in general, in good agreement with our initial study using radially limited (R ≲ 10 arcsec) profiles, and are used here to update several `central' as well as `global' galaxy scaling relations. We find near-linear relations between the break radius R_b and the spheroid luminosity L such that R_b ∝ L^{1.13 ± 0.13}, and with the supermassive black hole mass M_BH such that R_b ∝ M_BH^{0.83 ± 0.21}. This is internally consistent with the notion that major, dry mergers add the stellar and black hole mass in equal proportion, i.e. M_BH ∝ L. In addition, we observe a linear relation R_b ∝ R_e^{0.98 ± 0.15} for the core-Sérsic elliptical galaxies, where R_e is the galaxies' effective half-light radius, which is collectively consistent with the approximately linear bright end of the curved L-R_e relation. Finally, we measure accurate stellar mass deficits M_def that are in general 0.5-4 M_BH, and we identify two galaxies (NGC 1399, NGC 5061) that, due to their high M_def/M_BH ratio, may have experienced oscillatory core-passage by a (gravitational radiation)-kicked black hole. The galaxy scaling relations and stellar mass deficits favour core-Sérsic galaxy formation through a few `dry' major merger events involving supermassive black holes such that M_def ∝ M_BH^{3.70 ± 0.76}, for M_BH ≳ 2 × 10^8 M_⊙.
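For orientation, the core-Sérsic profile used in such fits is commonly written as below: a power law of slope γ inside the break radius R_b joined smoothly to a Sérsic profile outside. This is the standard published form of the model, quoted from the general literature rather than from this paper.

```latex
% Core-Sersic surface-brightness profile (standard form):
\begin{equation}
  I(R) = I' \left[ 1 + \left(\frac{R_b}{R}\right)^{\alpha} \right]^{\gamma/\alpha}
         \exp\!\left\{ -b \left[ \frac{R^{\alpha} + R_b^{\alpha}}{R_e^{\alpha}}
         \right]^{1/(\alpha n)} \right\}
\end{equation}
% alpha controls the sharpness of the transition at R_b, n is the outer
% Sersic index, and b is the usual function of n chosen so that R_e
% encloses half of the total light.
```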
Davis, Katherine; Gorst, Sarah L; Harman, Nicola; Smith, Valerie; Gargon, Elizabeth; Altman, Douglas G; Blazeby, Jane M; Clarke, Mike; Tunis, Sean; Williamson, Paula R
2018-01-01
Core outcome sets (COS) comprise a minimum set of outcomes that should be measured and reported in all trials for a specific health condition. The COMET (Core Outcome Measures in Effectiveness Trials) Initiative maintains an up-to-date, publicly accessible online database of published and ongoing COS. An annual systematic review update is an important part of this process. This review employed the same multifaceted approach that was used in the original review and the previous two updates, identifying studies that sought to determine which outcomes/domains to measure in clinical trials of a specific condition. This update includes an analysis of the inclusion of participants from low- and middle-income countries (LMICs), as identified by the OECD, in these COS. Eighteen publications, relating to 15 new studies describing the development of 15 COS, were eligible for inclusion in the review. Results show an increase in the use of mixed methods, including Delphi surveys. Clinical experts remain the most common stakeholder group involved. Overall, only 16% of the 259 COS studies published up to the end of 2016 have included participants from LMICs. This review highlights opportunities for greater public participation in COS development and the involvement of stakeholders from a wider range of geographical settings, in particular LMICs.
Rare Earth Geochemistry of Rock Core from WY Reservoirs
Quillinan, Scott; Bagdonnas, Davin; McLaughlin, J. Fred; Nye, Charles
2016-10-01
These data include major, minor, trace and rare earth element concentration of geologic formations in Wyoming oil and gas fields. *Note - Link below contains updated version of spreadsheet (6/14/2017)
Agent-Based Modeling of China's Rural-Urban Migration and Social Network Structure.
Fu, Zhaohao; Hao, Lingxin
2018-01-15
We analyze China's rural-urban migration and endogenous social network structures using agent-based modeling. The agents from census micro data are located in their rural origin with an empirical-estimated prior propensity to move. The population-scale social network is a hybrid one, combining observed family ties and locations of the origin with a parameter space calibrated from census, survey and aggregate data and sampled using a stepwise Latin Hypercube Sampling method. At monthly intervals, some agents migrate and these migratory acts change the social network by turning within-nonmigrant connections to between-migrant-nonmigrant connections, turning local connections to nonlocal connections, and adding among-migrant connections. In turn, the changing social network structure updates migratory propensities of those well-connected nonmigrants who become more likely to move. These two processes iterate over time. Using a core-periphery method developed from the k-core decomposition method, we identify and quantify the network structural changes and map these changes with the migration acceleration patterns. We conclude that network structural changes are essential for explaining migration acceleration observed in China during the 1995-2000 period.
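The k-core decomposition underlying the core-periphery measurement is easy to demonstrate; the sketch below uses networkx on an arbitrary toy graph.

```python
import networkx as nx

# core_number(G) assigns each node the largest k such that it survives in
# the k-core: the maximal subgraph where every node has degree >= k.
# The toy graph is arbitrary: a triangle with a pendant chain attached.

G = nx.Graph()
G.add_edges_from([
    (1, 2), (1, 3), (2, 3),   # triangle: survives pruning to degree >= 2
    (3, 4), (4, 5), (5, 6),   # chain: peeled away in the 2-core
])

print(nx.core_number(G))          # {1: 2, 2: 2, 3: 2, 4: 1, 5: 1, 6: 1}
print(nx.k_core(G, k=2).nodes())  # nodes in the 2-core: the triangle
```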
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.
2012-01-01
This presentation is a technical summary of and outlook for NASA-internal and NASA-sponsored external research on core noise funded by the Fundamental Aeronautics Program Subsonic Fixed Wing (SFW) Project. Sections of the presentation cover: the SFW system-level noise metrics for the 2015 (N+1), 2020 (N+2), and 2025 (N+3) timeframes; SFW strategic thrusts and technical challenges; SFW advanced subsystems that are broadly applicable to N+3 vehicle concepts, with an indication where further noise research is needed; the components of core noise (compressor, combustor and turbine noise) and a rationale for NASA's current emphasis on the combustor-noise component; the increase in the relative importance of core noise due to turbofan design trends; the need to understand and mitigate core-noise sources for high-efficiency small gas generators; and the current research activities in the core-noise area, with additional details given about forthcoming updates to NASA's Aircraft Noise Prediction Program (ANOPP) core-noise prediction capabilities, two NRA efforts (Honeywell International, Phoenix, AZ and University of Illinois at Urbana-Champaign, respectively) to improve the understanding of core-noise sources and noise propagation through the engine core, and an effort to develop oxide/oxide ceramic-matrix-composite (CMC) liners for broadband noise attenuation suitable for turbofan-core application. Core noise must be addressed to ensure that the N+3 noise goals are met. Focused, but long-term, core-noise research is carried out to enable the advanced high-efficiency small gas-generator subsystem, common to several N+3 conceptual designs, needed to meet NASA's technical challenges. Intermediate updates to prediction tools are implemented as the understanding of the source structure and engine-internal propagation effects is improved. The NASA Fundamental Aeronautics Program has the principal objective of overcoming today's national challenges in air transportation. The SFW Quiet-Aircraft Subproject aims to develop concepts and technologies to reduce perceived community noise attributable to aircraft with minimal impact on weight and performance. This reduction of aircraft noise is critical to enabling the anticipated large increase in future air traffic.
Bruland, Philipp; Doods, Justin; Storck, Michael; Dugas, Martin
2017-01-01
Data dictionaries provide structural meta-information about data definitions in health information technology (HIT) systems. In this regard, reusing healthcare data for secondary purposes offers several advantages (e.g. reduce documentation times or increased data quality). Prerequisites for data reuse are its quality, availability and identical meaning of data. In diverse projects, research data warehouses serve as core components between heterogeneous clinical databases and various research applications. Given the complexity (high number of data elements) and dynamics (regular updates) of electronic health record (EHR) data structures, we propose a clinical metadata warehouse (CMDW) based on a metadata registry standard. Metadata of two large hospitals were automatically inserted into two CMDWs containing 16,230 forms and 310,519 data elements. Automatic updates of metadata are possible as well as semantic annotations. A CMDW allows metadata discovery, data quality assessment and similarity analyses. Common data models for distributed research networks can be established based on similarity analyses.
Kumaran, Dharshan; Banino, Andrea; Blundell, Charles; Hassabis, Demis; Dayan, Peter
2016-12-07
Knowledge about social hierarchies organizes human behavior, yet we understand little about the underlying computations. Here we show that a Bayesian inference scheme, which tracks the power of individuals, better captures behavioral and neural data compared with a reinforcement learning model inspired by rating systems used in games such as chess. We provide evidence that the medial prefrontal cortex (MPFC) selectively mediates the updating of knowledge about one's own hierarchy, as opposed to that of another individual, a process that underpinned successful performance and involved functional interactions with the amygdala and hippocampus. In contrast, we observed domain-general coding of rank in the amygdala and hippocampus, even when the task did not require it. Our findings reveal the computations underlying a core aspect of social cognition and provide new evidence that self-relevant information may indeed be afforded a unique representational status in the brain. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
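The two model classes being compared can be caricatured in a few lines: a chess-style (Elo) rating update, which nudges a point estimate toward each observed outcome, versus Bayesian tracking of a latent power with a Gaussian posterior. Constants are illustrative, and neither snippet reproduces the paper's actual models.

```python
def elo_update(r_a, r_b, outcome, k=32.0):
    """Reinforcement-learning-style rating: move toward the observed outcome."""
    expected = 1.0 / (1.0 + 10.0 ** ((r_b - r_a) / 400.0))
    return r_a + k * (outcome - expected)

def bayes_update(mu, var, observed_power, obs_var=1.0):
    """Gaussian posterior update of a tracked power estimate.

    Unlike Elo, the update gain shrinks as uncertainty (var) falls,
    and the posterior carries an explicit uncertainty estimate.
    """
    gain = var / (var + obs_var)
    return mu + gain * (observed_power - mu), (1.0 - gain) * var

print(elo_update(1500.0, 1600.0, outcome=1.0))          # win vs a stronger foe
print(bayes_update(mu=0.0, var=1.0, observed_power=0.8))
```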
Correction and update to 'The earth's C21 and S21 gravity coefficients and the rotation of the core'
NASA Technical Reports Server (NTRS)
Wahr, John
1990-01-01
Wahr (1987) used satellite constraints on C21 and S21 (the spherical harmonic coefficients of the earth's external gravitational potential) to infer certain properties of the core and core/mantle boundary. It is shown here, contrary to the claim by Wahr, that it is not possible to use C21 and S21 to place bounds on the core's products of inertia. As a result, Wahr's constraints on the l = 2, m = 1 components of the core/mantle boundary topography and on the angular orientation of the inner core with respect to the earth's rotation vector are not justified. On the other hand, Wahr's conclusions about the time-averaged torque between the core and mantle and the resulting implications for the l = 2, m = 1 components of fluid pressure at the top of the core can be strengthened. Wahr's conclusions about the mean rotational flow in the core are unaltered.
St. Petersburg Coastal and Marine Science Center's Core Archive Portal
Reich, Chris; Streubert, Matt; Dwyer, Brendan; Godbout, Meg; Muslic, Adis; Umberger, Dan
2012-01-01
This Web site contains information on rock cores archived at the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center (SPCMSC). Archived cores consist of 3- to 4-inch-diameter coral cores, 1- to 2-inch-diameter rock cores, and a few unlabeled loose coral and rock samples. This document - and specifically the archive Web site portal - is intended to be a 'living' document that will be updated continually as additional cores are collected and archived. This document may also contain future references and links to a catalog of sediment cores. Sediment cores will include vibracores, pushcores, and other loose sediment samples collected for research purposes. This document will: (1) serve as a database for locating core material currently archived at the USGS SPCMSC facility; (2) provide a protocol for entry of new core material into the archive system; and, (3) set the procedures necessary for checking out core material for scientific purposes. Core material may be loaned to other governmental agencies, academia, or non-governmental organizations at the discretion of the USGS SPCMSC curator.
caCORE: a common infrastructure for cancer informatics.
Covitz, Peter A; Hartel, Frank; Schaefer, Carl; De Coronado, Sherri; Fragoso, Gilberto; Sahni, Himanso; Gustafson, Scott; Buetow, Kenneth H
2003-12-12
Sites with substantive bioinformatics operations are challenged to build data processing and delivery infrastructure that provides reliable access and enables data integration. Locally generated data must be processed and stored such that relationships to external data sources can be presented. Consistency and comparability across data sets requires annotation with controlled vocabularies and, further, metadata standards for data representation. Programmatic access to the processed data should be supported to ensure the maximum possible value is extracted. Confronted with these challenges at the National Cancer Institute Center for Bioinformatics, we decided to develop a robust infrastructure for data management and integration that supports advanced biomedical applications. We have developed an interconnected set of software and services called caCORE. Enterprise Vocabulary Services (EVS) provide controlled vocabulary, dictionary and thesaurus services. The Cancer Data Standards Repository (caDSR) provides a metadata registry for common data elements. Cancer Bioinformatics Infrastructure Objects (caBIO) implements an object-oriented model of the biomedical domain and provides Java, Simple Object Access Protocol and HTTP-XML application programming interfaces. caCORE has been used to develop scientific applications that bring together data from distinct genomic and clinical science sources. caCORE downloads and web interfaces can be accessed from links on the caCORE web site (http://ncicb.nci.nih.gov/core). caBIO software is distributed under an open source license that permits unrestricted academic and commercial use. Vocabulary and metadata content in the EVS and caDSR, respectively, is similarly unrestricted, and is available through web applications and FTP downloads. http://ncicb.nci.nih.gov/core/publications contains links to the caBIO 1.0 class diagram and the caCORE 1.0 Technical Guide, which provide detailed information on the present caCORE architecture, data sources and APIs. Updated information appears on a regular basis on the caCORE web site (http://ncicb.nci.nih.gov/core).
VKCDB: voltage-gated K+ channel database updated and upgraded.
Gallin, Warren J; Boutet, Patrick A
2011-01-01
The Voltage-gated K(+) Channel DataBase (VKCDB) (http://vkcdb.biology.ualberta.ca) makes a comprehensive set of sequence data readily available for phylogenetic and comparative analysis. The current update contains 2063 entries for full-length or nearly full-length unique channel sequences from Bacteria (477), Archaea (18) and Eukaryotes (1568), an increase from 346 solely eukaryotic entries in the original release. In addition to protein sequences for channels, corresponding nucleotide sequences of the open reading frames corresponding to the amino acid sequences are now available and can be extracted in parallel with sets of protein sequences. Channels are categorized into subfamilies by phylogenetic analysis and by using hidden Markov model analyses. Although the raw database contains a number of fragmentary, duplicated, obsolete and non-channel sequences that were collected in early steps of data collection, the web interface will only return entries that have been validated as likely K(+) channels. The retrieval function of the web interface allows retrieval of entries that contain a substantial fraction of the core structural elements of VKCs, fragmentary entries, or both. The full database can be downloaded as either a MySQL dump or as an XML dump from the web site. We have now implemented automated updates at quarterly intervals.
Recent Updates of A Multi-Phase Transport (AMPT) Model
NASA Astrophysics Data System (ADS)
Lin, Zi-Wei
2008-10-01
We will present recent updates to the AMPT model, a Monte Carlo transport model for high energy heavy ion collisions, since its first public release in 2004 and the corresponding detailed descriptions in Phys. Rev. C 72, 064901 (2005). The updates often result from user requests. Some of these updates expand the physics processes or descriptions in the model, while some updates improve the usability of the model such as providing the initial parton distributions or help avoid crashes on some operating systems. We will also explain how the AMPT model is being maintained and updated.
2015-11-01
The AAOHN Competency document is one of the core documents that define occupational health nursing practice. This article provides a description of the process used to update the competencies, as well as a description of the new competencies. © 2015 The Author(s).
Sulphur chemistry in the L1544 pre-stellar core
NASA Astrophysics Data System (ADS)
Vastel, Charlotte; Quénard, D.; Le Gal, R.; Wakelam, V.; Andrianasolo, A.; Caselli, P.; Vidal, T.; Ceccarelli, C.; Lefloch, B.; Bachiller, R.
2018-05-01
The L1544 pre-stellar core has been observed as part of the ASAI IRAM 30m Large Program as well as follow-up programs. These observations have revealed the chemical richness of the earliest phases of low-mass star-forming regions. In this paper we focus on the twenty-one sulphur-bearing species (ions, isotopomers and deuterated forms) that have been detected in this spectral survey through fifty-one transitions: CS, CCS, C3S, SO, SO2, H2CS, OCS, HSCN, NS, HCS+, NS+ and H2S. We also report the tentative detection (at the 4σ level) of methyl mercaptan (CH3SH). LTE and non-LTE radiative transfer modelling has been performed, and we used the NAUTILUS chemical code, updated with the most recent chemical network for sulphur, to explain our observations. From the chemical modelling we expect a strong radial variation of the abundances of these species, which mostly are emitted in the external layer where non-thermal desorption of other species has previously been observed. We show that this chemical study cannot be compared directly with what has been done for the TMC-1 dark cloud, where the abundance is assumed constant along the line of sight, and conclude that a strong sulphur depletion is necessary to fully reproduce our observations of the prototypical pre-stellar core L1544.
Trace Assessment for BWR ATWS Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, L.Y.; Diamond, D.; Arantxa Cuadra, Gilad Raitses, Arnold Aronson
2010-04-22
A TRACE/PARCS input model has been developed in order to be able to analyze anticipated transients without scram (ATWS) in a boiling water reactor. The model is based on one developed previously for the Browns Ferry reactor for doing loss-of-coolant accident analysis. This model was updated by adding the control systems needed for ATWS and a core model using PARCS. The control systems were based on models previously developed for the TRAC-B code. The PARCS model is based on information (e.g., exposure and moderator density (void) history distributions) obtained from General Electric Hitachi and cross sections for GE14 fuel obtained from an independent source. The model is able to calculate an ATWS, initiated by the closure of main steam isolation valves, with recirculation pump trip, water level control, injection of borated water from the standby liquid control system and actuation of the automatic depressurization system. The model is not considered complete and recommendations are made on how it should be improved.
Situation Model Updating in Young and Older Adults: Global versus Incremental Mechanisms
Bailey, Heather R.; Zacks, Jeffrey M.
2015-01-01
Readers construct mental models of situations described by text. Activity in narrative text is dynamic, so readers must frequently update their situation models when dimensions of the situation change. Updating can be incremental, such that a change leads to updating just the dimension that changed, or global, such that the entire model is updated. Here, we asked whether older and young adults make differential use of incremental and global updating. Participants read narratives containing changes in characters and spatial location and responded to recognition probes throughout the texts. Responses were slower when probes followed a change, suggesting that situation models were updated at changes. When either dimension changed, responses to probes for both dimensions were slowed; this provides evidence for global updating. Moreover, older adults showed stronger evidence of global updating than did young adults. One possibility is that older adults perform more global updating to offset reduced ability to manipulate information in working memory. PMID:25938248
Steps towards Improving GNSS Systematic Errors and Biases
NASA Astrophysics Data System (ADS)
Herring, T.; Moore, M.
2017-12-01
Four general areas of analysis-method improvements, three related to data analysis models and the fourth to calibration methods, have been recommended at the recent unified analysis workshop (UAW), and we discuss aspects of these areas for improvement. The gravity fields used in the GNSS orbit integrations should be updated to match modern fields, making them consistent with the fields being used by the other IAG services. The update would include the static part of the field and a time-variable component. The force models associated with radiation forces are the most uncertain, and modeling of these forces can be made more consistent with the exchange of attitude information. The International GNSS Service (IGS) will develop an attitude format and make attitude information available so that analysis centers can validate their models. The IGS has noted the appearance of the GPS draconitic period and harmonics of this period in time series of various geodetic products (e.g., positions and Earth orientation parameters). An updated short-period (diurnal and semidiurnal) model is needed, along with a method to determine the best model. The final area, not directly related to analysis models, is the recommendation that site-dependent calibration of GNSS antennas is needed, since these calibrations have a direct effect on the ITRF realization and on position offsets when antennas are changed. Evaluation of the effects of the use of antenna-specific phase center models will be investigated for those sites where these values are available without disturbing an existing antenna installation. Potential development of an in-situ antenna calibration system is strongly encouraged. In-situ calibration would be deployed at core sites where GNSS sites are tied to other geodetic systems. With the recent expansion of the number of GPS satellites transmitting unencrypted codes on the GPS L2 frequency and the availability of software GNSS receivers, in-situ calibration between an existing installation and a movable directional antenna is now more likely to generate accurate results than earlier analog switching systems. With all of these improvements, there is the expectation that there will be better agreement between the space geodetic methods, thus allowing more definitive assessment and modeling of the Earth's time-variable shape and gravity field.
A study of internet of things real-time data updating based on WebSocket
NASA Astrophysics Data System (ADS)
Wei, Shoulin; Yu, Konglin; Dai, Wei; Liang, Bo; Zhang, Xiaoli
2015-12-01
The Internet of Things (IoT) is gradually entering the industrial stage. In IoT Web applications such as monitoring, instant messaging, and real-time quotation systems, changes need to be transmitted to the client in real time without the client constantly refreshing and sending requests. These applications often need to be as fast as possible and to provide nearly real-time components. Real-time data updating is becoming the core part of application-layer visualization technology in IoT. With support for server-side data push, the running state of "things" in IoT can be displayed in real time. This paper discusses several current real-time data updating methods and explores the advantages and disadvantages of each. We explore the use of WebSocket in a new approach for real-time data updating in IoT, since WebSocket provides a low-delay, low-network-throughput solution for full-duplex communication.
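As a rough illustration of the server-push pattern the paper describes (not the authors' implementation), the following minimal Python sketch uses the third-party websockets package to stream sensor readings to every connected client over a persistent full-duplex connection; the sensor name, payload format, and one-second update rate are assumptions.

    import asyncio, json, random
    import websockets  # third-party: pip install websockets

    async def push_readings(websocket, path=None):  # older library versions also pass a path
        # Server-initiated push: the client never polls or refreshes.
        while True:
            reading = {"sensor": "temp-01", "value": round(random.uniform(18, 27), 2)}
            await websocket.send(json.dumps(reading))
            await asyncio.sleep(1.0)  # assumed one new datum per second

    async def main():
        async with websockets.serve(push_readings, "localhost", 8765):
            await asyncio.Future()  # run the server until cancelled

    if __name__ == "__main__":
        asyncio.run(main())

A browser client would open a WebSocket to ws://localhost:8765 once and handle each pushed message, which is exactly the refresh-free pattern discussed above.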
Family History Collection Practices: National Survey of Pediatric Primary Care Providers.
Tarini, Beth A; Gornick, Michele C; Zikmund-Fisher, Brian J; Saal, Howard M; Edmondson, Laurie; Uhlmann, Wendy R
2018-05-01
While family history (FH) collection is a core responsibility of pediatric primary care providers (PCPs), few details about this practice are known. We surveyed a random national sample of 1200 pediatricians and family medicine physicians about FH collection practices. A total of 86% of respondents (n = 289 pediatricians; n = 152 family medicine physicians) indicated that they collect a FH "always" or "most of the time," with 77% reporting collection at the first visit, regardless of whether it is a health maintenance or problem-focused visit. Less than half (36.3%) ask about relatives other than parents, siblings, or grandparents. Among respondents, 42% routinely update the FH at every health maintenance visit, while 6% update it at every visit. Pediatric PCPs use a variety of methods to collect a FH that is limited in scope and variably updated. Our results suggest that interventions are needed to help pediatric PCPs collect a systematic, efficient, and updated FH.
Le, Thang M; Borghi, John A; Kujawa, Autumn J; Klein, Daniel N; Leung, Hoi-Chung
2017-01-01
The present study examined the impacts of major depressive disorder (MDD) on visual and prefrontal cortical activity as well as their connectivity during visual working memory updating and related them to the core clinical features of the disorder. Impairment in working memory updating is typically associated with the retention of irrelevant negative information which can lead to persistent depressive mood and abnormal affect. However, performance deficits have been observed in MDD on tasks involving little or no demand on emotion processing, suggesting dysfunctions may also occur at the more basic level of information processing. Yet, it is unclear how various regions in the visual working memory circuit contribute to behavioral changes in MDD. We acquired functional magnetic resonance imaging data from 18 unmedicated participants with MDD and 21 age-matched healthy controls (CTL) while they performed a visual delayed recognition task with neutral faces and scenes as task stimuli. Selective working memory updating was manipulated by inserting a cue in the delay period to indicate which one or both of the two memorized stimuli (a face and a scene) would remain relevant for the recognition test. Our results revealed several key findings. Relative to the CTL group, the MDD group showed weaker postcue activations in visual association areas during selective maintenance of face and scene working memory. Across the MDD subjects, greater rumination and depressive symptoms were associated with more persistent activation and connectivity related to no-longer-relevant task information. Classification of postcue spatial activation patterns of the scene-related areas was also less consistent in the MDD subjects compared to the healthy controls. Such abnormalities appeared to result from a lack of updating effects in postcue functional connectivity between prefrontal and scene-related areas in the MDD group. In sum, disrupted working memory updating in MDD was revealed by alterations in activity patterns of the visual association areas, their connectivity with the prefrontal cortex, and their relationship with core clinical characteristics. These results highlight the role of information updating deficits in the cognitive control and symptomatology of depression.
DOT National Transportation Integrated Search
2009-10-01
Transit Operations Decision Support Systems (TODSS) are systems designed to support dispatchers and others in real-time operations : management in response to incidents, special events, and other changing conditions in order to improve operating spee...
Mapping the literature of occupational therapy: an update.
Potter, Jonathan
2010-07-01
This study updated Reed's 1999 "Mapping the Literature of Occupational Therapy." An analysis of citation patterns and indexing coverage was undertaken to identify the core literature of occupational therapy and to determine access to that literature. Citations from three source journals for the years 2006 through 2008 were studied following the common methodology of the "Mapping the Literature of Allied Health Project." Bradford's Law of Scattering was applied to analyze the productivity of cited journals. A comparative analysis of indexing was conducted across three bibliographic databases. A total of 364 articles cited 10,425 references. Journals were the most frequently cited format, accounting for 65.3% of the references, an increase of 4.1% over the 1999 study. Approximately one-third of the journal references cited a cluster of 9 journals, with the American Journal of Occupational Therapy dominating the field. An additional 120 journals were identified as moderately important based on times cited. CINAHL provided the most comprehensive indexing of core journals, while MEDLINE provided the best overall coverage. Occupational therapy is a multidisciplinary field with a strong core identity and an increasingly diverse literature. Indexing has improved overall since 1999, but gaps in the coverage are still evident.
Finite element modelling and updating of a lively footbridge: The complete process
NASA Astrophysics Data System (ADS)
Živanović, Stana; Pavic, Aleksandar; Reynolds, Paul
2007-03-01
The finite element (FE) model updating technology was originally developed in the aerospace and mechanical engineering disciplines to automatically update numerical models of structures to match their experimentally measured counterparts. The updating process identifies the drawbacks in the FE modelling, and the updated FE model can be used to produce more reliable results in further dynamic analysis. In the last decade, the updating technology has been introduced into civil structural engineering, where it can serve as an advanced tool for obtaining reliable modal properties of large structures. The updating process has four key phases: initial FE modelling, modal testing, manual model tuning and automatic updating (conducted using specialist software). However, the published literature does not connect these phases well, although this linking is crucial when implementing the updating technology. This paper therefore aims to clarify the importance of this linking and to describe the complete model updating process as applicable in civil structural engineering. The complete process consisting of the four phases is outlined and brief theory is presented as appropriate. The procedure is then implemented on a lively steel box girder footbridge. It was found that even a very detailed initial FE model underestimated the natural frequencies of all seven experimentally identified modes of vibration, with the maximum error being almost 30%. Manual FE model tuning by trial and error found that flexible supports in the longitudinal direction should be introduced at the girder ends to improve correlation between the measured and FE-calculated modes. This significantly reduced the maximum frequency error to only 4%. It was demonstrated that only then could the FE model be automatically updated in a meaningful way. The automatic updating was successfully conducted by updating 22 uncertain structural parameters. Finally, a physical interpretation of all parameter changes is discussed; this interpretation is often missing in the published literature. It was found that the composite slabs were less stiff than originally assumed and that the asphalt layer contributed considerably to the deck stiffness.
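To make the automatic-updating phase concrete, here is a hedged Python sketch of the general idea: tune uncertain structural parameters so that model-predicted natural frequencies match measured ones. The two-degree-of-freedom system, masses, and "measured" frequencies below are invented for illustration and are not the footbridge model from the paper, which updated 22 parameters with specialist software.

    import numpy as np
    from scipy.linalg import eigh
    from scipy.optimize import least_squares

    def natural_frequencies(params):
        # Toy 2-DOF model: two lumped masses in series with stiffnesses k1, k2.
        k1, k2 = params
        M = np.diag([1000.0, 800.0])                   # assumed masses (kg)
        K = np.array([[k1 + k2, -k2], [-k2, k2]])      # stiffness matrix (N/m)
        lam = eigh(K, M, eigvals_only=True)            # generalized eigenvalues
        return np.sqrt(lam) / (2 * np.pi)              # natural frequencies (Hz)

    f_measured = np.array([2.0, 5.5])                  # hypothetical measured modes (Hz)

    res = least_squares(
        lambda p: natural_frequencies(p) - f_measured, # frequency residuals
        x0=[1.0e5, 1.0e5],                             # initial FE estimates
        bounds=([1e4, 1e4], [1e7, 1e7]),               # keep parameters physical
    )
    print(res.x)  # updated stiffnesses that minimize the frequency error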
Numerical model updating technique for structures using firefly algorithm
NASA Astrophysics Data System (ADS)
Sai Kubair, K.; Mohan, S. C.
2018-03-01
Numerical model updating is a technique for updating numerical models of structures in civil, mechanical, automotive, marine, aerospace and other engineering fields. The basic concept behind this technique is to update the numerical model so that it closely matches experimental data obtained from real or prototype test structures. The present work involves the development of a numerical model using MATLAB as a computational tool, with mathematical equations that define the experimental model. The firefly algorithm is used as the optimization tool in this study. In this updating process a response parameter of the structure has to be chosen, which helps to correlate the numerical model with the experimental results. The variables for the updating can be material properties, geometrical properties of the model, or both. In this study, to verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame is analyzed for its natural frequencies. Both models are updated with their respective response values obtained from experimental results. The numerical results after updating show that the numerical models can be brought into close agreement with the experimental ones.
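The abstract does not give the algorithm's internals, so below is a generic firefly-algorithm sketch in Python applied to a toy version of the cantilever case: the uncertain parameter (flexural rigidity EI) is updated until the computed tip deflection matches a hypothetical measured value. All numbers are assumptions.

    import numpy as np

    def firefly_minimize(objective, bounds, n_fireflies=20, n_iter=100,
                         beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
        # Standard firefly moves: dimmer fireflies move toward brighter
        # (lower-objective) ones with distance-attenuated attractiveness.
        rng = np.random.default_rng(seed)
        lo, hi = bounds[:, 0], bounds[:, 1]
        x = rng.uniform(lo, hi, size=(n_fireflies, len(lo)))
        f = np.array([objective(xi) for xi in x])
        for _ in range(n_iter):
            for i in range(n_fireflies):
                for j in range(n_fireflies):
                    if f[j] < f[i]:
                        r2 = np.sum((x[i] - x[j]) ** 2)
                        beta = beta0 * np.exp(-gamma * r2)
                        x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(len(lo)) - 0.5)
                        x[i] = np.clip(x[i], lo, hi)
                        f[i] = objective(x[i])
        best = int(np.argmin(f))
        return x[best], f[best]

    # Toy updating problem: find EI so the Euler-Bernoulli tip deflection
    # of a cantilever under a tip load matches a "measured" 12 mm.
    L, P, measured_tip = 2.0, 1000.0, 0.012            # m, N, m (assumed)
    objective = lambda p: abs(P * L**3 / (3.0 * p[0]) - measured_tip)
    best, err = firefly_minimize(objective, np.array([[1e4, 1e6]]))
    print(best, err)  # EI near 2.2e5 N*m^2 reproduces the measured deflection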
Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2013)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David C.; Syamlal, Madhava; Cottrell, Roger
2013-09-30
The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories’ core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI’s industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI’s academic participants (Carnegie Mellon University, Princeton University, West Virginia University, Boston University and the University of Texas at Austin) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 13, CCSI announced the initial release of its first set of computational tools and models during the October 2012 meeting of its Industry Advisory Board. This initial release led to five companies licensing the CCSI Toolset under a Test and Evaluation Agreement this year. By the end of FY13, the CCSI Technical Team had completed development of an updated suite of computational tools and models. The list below summarizes the new and enhanced toolset components that were released following comprehensive testing during October 2013. 1. FOQUS, the Framework for Optimization and Quantification of Uncertainty and Sensitivity. The package includes the FOQUS graphical user interface (GUI), a simulation-based optimization engine, the Turbine Client, and heat integration capabilities. There is also an updated simulation interface and new configuration GUI for connecting Aspen Plus or Aspen Custom Modeler (ACM) simulations to FOQUS and the Turbine Science Gateway. 2. A new MFIX-based Computational Fluid Dynamics (CFD) model to predict particle attrition. 3. A new dynamic reduced model (RM) builder, which generates computationally efficient RMs of the behavior of a dynamic system. 4. A completely re-written version of the algebraic surrogate model builder for optimization (ALAMO).
The new version is several orders of magnitude faster than the initial release and eliminates the MATLAB dependency. 5. A new suite of high resolution filtered models for the hydrodynamics associated with horizontal cylindrical objects in a flow path. 6. The new Turbine Science Gateway (Cluster), which supports FOQUS for running multiple simulations for optimization or UQ using a local computer or cluster. 7. A new statistical tool (BSS-ANOVA-UQ) for calibration and validation of CFD models. 8. A new basic data submodel in Aspen Plus format for a representative high viscosity capture solvent, the 2-MPZ system. 9. An updated RM tool for CFD (REVEAL) that can create a RM from MFIX. A new lightweight, stand-alone version will be available in late 2013. 10. An updated RM integration tool to convert the RM from REVEAL into a CAPE-OPEN or ACM model for use in a process simulator. 11. An updated suite of unified steady-state and dynamic process models for solid sorbent carbon capture, including bubbling fluidized bed and moving bed reactors. 12. An updated and unified set of compressor models including a steady-state design point model and a dynamic model with surge detection. 13. A new framework for the synthesis and optimization of coal oxycombustion power plants using advanced optimization algorithms. This release focuses on modeling and optimization of a cryogenic air separation unit (ASU). 14. A new technical risk model in spreadsheet format. 15. An updated version of the sorbent kinetic/equilibrium model for parameter estimation for the 1st generation sorbent model. 16. An updated process synthesis superstructure model to determine optimal process configurations utilizing surrogate models from ALAMO for adsorption and regeneration in a solid sorbent process. 17. Validation models for the NETL Carbon Capture Unit utilizing sorbent AX. Additional validation models will be available for sorbent 32D in 2014. 18. An updated hollow fiber membrane model and system example for carbon capture. 19. An updated reference power plant model in Thermoflex that includes additional steam extraction and reinjection points to enable the heat integration module. 20. An updated financial risk model in spreadsheet format.
Staskowski, Maureen
2012-05-01
Educational reform is sweeping the country. The adoption and implementation of the Common Core State Standards in almost every state are meant to transform education: to update the way schools educate and the way students learn, and ultimately to prepare the nation's next generation for the global workplace. This article describes the Common Core State Standards initiative and the underlying concerns about the quality of education in the United States, as well as the opportunities this reform initiative affords speech-language pathologists.
Deployment of a tool for measuring freeway safety performance.
DOT National Transportation Integrated Search
2011-12-01
This project updated and deployed a freeway safety performance measurement tool, building upon a previous project that developed the core methodology. The tool evaluates the cumulative risk over time of an accident or a particular kind of accident. T...
Social anxiety is characterized by biased learning about performance and the self.
Koban, Leonie; Schneider, Rebecca; Ashar, Yoni K; Andrews-Hanna, Jessica R; Landy, Lauren; Moscovitch, David A; Wager, Tor D; Arch, Joanna J
2017-12-01
People learn about their self from social information, and recent work suggests that healthy adults show a positive bias for learning self-related information. In contrast, social anxiety disorder (SAD) is characterized by a negative view of the self, yet what causes and maintains this negative self-view is not well understood. Here the authors use a novel experimental paradigm and computational model to test the hypothesis that biased social learning regarding self-evaluation and self-feelings represents a core feature that distinguishes adults with SAD from healthy controls. Twenty-one adults with SAD and 35 healthy controls (HCs) performed a speech in front of 3 judges. They subsequently evaluated themselves and received performance feedback from the judges and then rated how they felt about themselves and the judges. Affective updating (i.e., change in feelings about the self over time, in response to feedback from the judges) was modeled using an adapted Rescorla-Wagner learning model. HCs demonstrated a positivity bias in affective updating, which was absent in SAD. Further, self-performance ratings revealed group differences in learning from positive feedback, a difference that endured at an average of 1-year follow-up. These findings demonstrate the presence and long-term endurance of positively biased social learning about the self among healthy adults, a bias that is absent or reversed among socially anxious adults. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Idzerda, Leanne; Rader, Tamara; Tugwell, Peter; Boers, Maarten
2014-05-01
The usefulness of randomized control trials to advance clinical care depends upon the outcomes reported, but disagreement on the choice of outcome measures has resulted in inconsistency and the potential for reporting bias. One solution to this problem is the development of a core outcome set: a minimum set of outcome measures deemed critical for clinical decision making. Within rheumatology the Outcome Measures in Rheumatology (OMERACT) initiative has pioneered the development of core outcome sets since 1992. As the number of diseases addressed by OMERACT has increased and its experience in formulating core sets has grown, clarification and update of the conceptual framework and formulation of a more explicit process of area/domain core set development has become necessary. As part of the update process of the OMERACT Filter criteria to version 2, a literature review was undertaken to compare and contrast the OMERACT conceptual framework with others within and outside rheumatology. A scoping search was undertaken to examine the extent, range, and nature of conceptual frameworks for core set outcome selection in health. We searched the following resources: Cochrane Library Methods Group Register; Medline; Embase; PsycInfo; Environmental Studies and Policy Collection; and ABI/INFORM Global. We also conducted a targeted Google search. Five conceptual frameworks were identified: the WHO tripartite definition of health; the 5 Ds (discomfort, disability, drug toxicity, dollar cost, and death); the International Classification of Functioning (ICF); PROMIS (Patient-Reported Outcomes Measurement System); and the Outcomes Hierarchy. Of these, only the 5 Ds and ICF frameworks have been systematically applied in core set development. Outside the area of rheumatology, several core sets were identified; these had been developed through a limited range of consensus-based methods with varying degrees of methodological rigor. None applied a framework to ensure content validity of the end product. This scoping review reinforced the need for clear methods and standards for core set development. Based on these findings, OMERACT will make its own conceptual framework and working process more explicit. Proposals for how to achieve this were discussed at the OMERACT 11 conference.
1981-12-01
Map file naming conventions: Symbol Map: library-file.library-unit(.subunit).SYMAP; Statement Map: library-file.library-unit(.subunit).SMAP; Type Map: library-file.library-unit(.subunit).TMAP. The code generator produces the SYMAP (Symbol Map), the updated SMAP (Statement Map), and the TMAP (Type Map). Also described are the PUNIT command and Example A-3, a compiler command stream for the code generator (Texas Instruments Ada Optimizing Compiler).
Sundvall, Erik; Wei-Kleiner, Fang; Freire, Sergio M; Lambrix, Patrick
2017-01-01
Archetype-based Electronic Health Record (EHR) systems using generic reference models from e.g. openEHR, ISO 13606 or CIMI should be easy to update and reconfigure with new types (or versions) of data models or entries, ideally with very limited programming or manual database tweaking. Exploratory research (e.g. epidemiology) leading to ad-hoc querying on a population-wide scale can be a challenge in such environments. This publication describes the implementation and testing of an archetype-aware Dewey encoding optimization that can be used to produce such systems in environments supporting relational operations, e.g. RDBMSs and distributed map-reduce frameworks like Hadoop. Initial testing was done using a nine-node 2.2 GHz quad-core Hadoop cluster querying a dataset consisting of targeted extracts from more than 4 million real patient EHRs; query results with sub-minute response times were obtained.
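For readers unfamiliar with Dewey labels, here is a tiny Python sketch of the underlying idea (illustrative only; the paper's archetype-aware encoding is more elaborate): each node of a hierarchical EHR entry gets a label that prefixes all of its descendants' labels, so containment queries become relational prefix comparisons that map naturally onto RDBMS or map-reduce operations. The example rows are invented.

    rows = [
        ("1",     "composition"),
        ("1.1",   "observation.blood_pressure"),
        ("1.1.1", "systolic=142"),
        ("1.1.2", "diastolic=91"),
        ("1.2",   "observation.heart_rate"),
    ]

    def descendants(dewey, table):
        # A row is a descendant iff its label starts with the ancestor's
        # label plus a dot; in SQL this is a LIKE 'prefix.%' predicate.
        return [(d, v) for d, v in table if d.startswith(dewey + ".")]

    print(descendants("1.1", rows))  # both blood-pressure leaf values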
Timing Interactions in Social Simulations: The Voter Model
NASA Astrophysics Data System (ADS)
Fernández-Gracia, Juan; Eguíluz, Víctor M.; Miguel, Maxi San
The recent availability of huge high-resolution datasets on human activities has revealed the heavy-tailed nature of interevent time distributions. In social simulations of interacting agents the standard approach has been to use Poisson processes to update the state of the agents, which gives rise to very homogeneous activity patterns with a well defined characteristic interevent time. Taking the voter model as a paradigmatic opinion model, we review the standard update rules and propose two new update rules that are able to account for heterogeneous activity patterns. For the new update rules each node is updated with a probability that depends on the time since the last event of the node, where an event can be an update attempt (exogenous update) or a change of state (endogenous update). We find that both update rules can give rise to power-law interevent time distributions, although the endogenous one does so more robustly. Apart from that, for the exogenous update rule and the standard update rules the voter model does not reach consensus in the infinite-size limit, while for the endogenous update there exists a coarsening process that drives the system toward consensus configurations.
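A minimal Python sketch of the endogenous rule on a complete graph may help fix ideas; the activation probability 1/(t - t_last), the system size, and the step budget are assumptions consistent with the description above, not the authors' exact simulation.

    import numpy as np

    def voter_endogenous(n=200, steps=20000, seed=1):
        rng = np.random.default_rng(seed)
        state = rng.integers(0, 2, n)       # binary opinions
        last_change = np.zeros(n)           # time of each node's last state change
        for t in range(1, steps + 1):
            # Endogenous aging: activation probability decays with the time
            # elapsed since the node last changed state.
            p = 1.0 / (t - last_change)
            for i in np.where(rng.random(n) < p)[0]:
                j = rng.integers(n)         # random neighbor on a complete graph
                if state[j] != state[i]:
                    state[i] = state[j]
                    last_change[i] = t      # reset the node's clock on a change
            if state.sum() in (0, n):
                return t                    # consensus time
        return None                         # no consensus within the budget

    print(voter_endogenous())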
Michigan's forest resources, 2009
S.A. Pugh
2010-01-01
This publication provides an overview of forest resource attributes for Michigan based on an annual inventory (2005-2009) conducted by the Forest Inventory and Analysis (FIA) program of the Northern Research Station, U.S. Forest Service. These estimates, along with web-posted core tables, are updated annually.
Nebraska's forest resources, 2008
D.M. Meneguzzo
2010-01-01
This publication provides an overview of forest resource attributes for Nebraska based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the USDA Forest Service. These estimates, along with web-posted core tables, will be updated annually.
2017-09-01
Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions
Bouwense, Matthew D.
Norsigian, Charles J; Kavvas, Erol; Seif, Yara; Palsson, Bernhard O; Monk, Jonathan M
2018-01-01
Acinetobacter baumannii has become an urgent clinical threat due to the recent emergence of multi-drug resistant strains. There is thus a significant need to discover new therapeutic targets in this organism. One means for doing so is through the use of high-quality genome-scale reconstructions. Well-curated and accurate genome-scale models (GEMs) of A. baumannii would be useful for improving treatment options. We present an updated and improved genome-scale reconstruction of A. baumannii AYE, named iCN718, that improves and standardizes previous A. baumannii AYE reconstructions. iCN718 has 80% accuracy for predicting gene essentiality data and additionally can predict large-scale phenotypic data with as much as 89% accuracy, a new capability for an A. baumannii reconstruction. We further demonstrate that iCN718 can be used to analyze conserved metabolic functions in the A. baumannii core genome and to build strain-specific GEMs of 74 other A. baumannii strains from genome sequence alone. iCN718 will serve as a resource to integrate and synthesize new experimental data being generated for this urgent threat pathogen.
Sriram, Vinay K; Montgomery, Doug
2017-07-01
The Internet is subject to attacks due to vulnerabilities in its routing protocols. One proposed approach to attain greater security is to cryptographically protect network reachability announcements exchanged between Border Gateway Protocol (BGP) routers. This study proposes and evaluates the performance and efficiency of various optimization algorithms for validation of digitally signed BGP updates. In particular, this investigation focuses on the BGPSEC (BGP with SECurity extensions) protocol, currently under consideration for standardization in the Internet Engineering Task Force. We analyze three basic BGPSEC update processing algorithms: Unoptimized, Cache Common Segments (CCS) optimization, and Best Path Only (BPO) optimization. We further propose and study cache management schemes to be used in conjunction with the CCS and BPO algorithms. The performance metrics used in the analyses are: (1) routing table convergence time after BGPSEC peering reset or router reboot events and (2) peak-second signature verification workload. Both analytical modeling and detailed trace-driven simulation were performed. Results show that the BPO algorithm is 330% to 628% faster than the unoptimized algorithm for routing table convergence in a typical Internet core-facing provider edge router.
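The CCS idea can be conveyed with a short, heavily simplified Python sketch: memoize the verification result of each signed path segment so segments shared across many updates are cryptographically checked once. The crypto_verify stand-in below is not real ECDSA, and the segment encoding is invented; only the caching pattern is the point.

    from functools import lru_cache
    import hashlib

    def crypto_verify(segment: bytes, signature: bytes, key_id: str) -> bool:
        # Stand-in for a real signature check (illustration only): a
        # "signature" is valid iff it equals SHA-256(segment || key_id).
        return signature == hashlib.sha256(segment + key_id.encode()).digest()

    @lru_cache(maxsize=100_000)
    def verify_segment(segment: bytes, signature: bytes, key_id: str) -> bool:
        # Memoized verification: common AS-path segments are verified once.
        return crypto_verify(segment, signature, key_id)

    def validate_update(segments) -> bool:
        # A BGPSEC update validates only if every signed segment verifies.
        return all(verify_segment(s, sig, kid) for s, sig, kid in segments)

    seg = b"AS65001->AS65002 prefix 192.0.2.0/24"
    sig = hashlib.sha256(seg + b"rpki-key-1").digest()
    print(validate_update([(seg, sig, "rpki-key-1")]))   # True
    print(verify_segment.cache_info().currsize)          # 1 cached segment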
ERM model analysis for adaptation to hydrological model errors
NASA Astrophysics Data System (ADS)
Baymani-Nezhad, M.; Han, D.
2018-05-01
Hydrological conditions change continuously, and these changes generate errors in flood forecasting models that lead to unrealistic results. To overcome these difficulties, a concept called model updating has been proposed in hydrological studies. Real-time model updating is one of the challenging processes in the hydrological sciences and has not been entirely solved, due to lack of knowledge about the future state of the catchment under study. In terms of the flood forecasting process, errors propagated from the rainfall-runoff model are the main source of uncertainty in the forecasting model. Hence, to handle the existing errors, several methods have been proposed by researchers to update rainfall-runoff models, such as parameter updating, model state updating, and correction of input data. The current study investigates the ability of rainfall-runoff model parameters to cope with three types of error common in hydrological modelling: timing, shape and volume. The new lumped model, the ERM model, has been selected for this study, and its parameters are evaluated for use in model updating to cope with the stated errors. Investigation of ten events proves that the ERM model parameters can be updated to cope with the errors without the need to recalibrate the model.
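As a generic illustration of correcting the three error types (not the ERM model itself), the Python sketch below updates a time shift (timing), a gain (volume), and an exponent (shape) so a simulated hydrograph matches an observed one; the synthetic Gaussian hydrographs are assumptions.

    import numpy as np
    from scipy.optimize import minimize

    def corrected(sim, shift, gain, shape):
        # Shift the hydrograph in time, then rescale its volume and shape.
        t = np.arange(len(sim), dtype=float)
        shifted = np.interp(t - shift, t, sim)
        return gain * shifted ** shape

    def update_parameters(sim, obs):
        loss = lambda p: np.mean((corrected(sim, *p) - obs) ** 2)
        return minimize(loss, x0=[0.0, 1.0, 1.0], method="Nelder-Mead").x

    t = np.arange(100.0)
    obs = 1.2 * np.exp(-0.5 * ((t - 40) / 8) ** 2)   # synthetic observed peak
    sim = np.exp(-0.5 * ((t - 35) / 8) ** 2)         # model: too early, too low
    print(update_parameters(sim, obs))               # roughly [5, 1.2, 1]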
A review of statistical updating methods for clinical prediction models.
Su, Ting-Li; Jaki, Thomas; Hickey, Graeme L; Buchan, Iain; Sperrin, Matthew
2018-01-01
A clinical prediction model is a tool for predicting healthcare outcomes, usually within a specific population and context. A common approach is to develop a new clinical prediction model for each population and context; however, this wastes potentially useful historical information. A better approach is to update or incorporate existing clinical prediction models already developed for use in similar contexts or populations. In addition, clinical prediction models commonly become miscalibrated over time and need replacing or updating. In this article, we review a range of approaches for re-using and updating clinical prediction models; these fall into three main categories: simple coefficient updating, combining multiple previous clinical prediction models in a meta-model, and dynamic updating of models. We evaluated the performance (discrimination and calibration) of the different strategies using data on mortality following cardiac surgery in the United Kingdom. We found that no single strategy performed sufficiently well to be used to the exclusion of the others. In conclusion, useful tools exist for updating existing clinical prediction models for a new population or context, and these should be implemented, using a breadth of complementary statistical methods, rather than developing a new clinical prediction model from scratch.
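The simplest of the three categories, coefficient updating, can be sketched in a few lines of Python: keep the original model's linear predictor and re-estimate only a calibration intercept and slope on data from the new setting ("logistic recalibration"). The coefficients and simulated data below are illustrative assumptions, not the cardiac surgery dataset.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X_new = rng.normal(size=(500, 3))                     # new-population predictors
    old_coef, old_int = np.array([0.8, -0.5, 0.3]), -1.0  # original model (assumed)
    lp = X_new @ old_coef + old_int                       # original linear predictor

    # Simulate outcomes whose true calibration differs from the old model.
    y_new = rng.random(500) < 1 / (1 + np.exp(-(0.5 * lp - 0.2)))

    recal = LogisticRegression().fit(lp.reshape(-1, 1), y_new)
    slope, intercept = recal.coef_[0, 0], recal.intercept_[0]
    print(f"calibration slope={slope:.2f}, intercept={intercept:.2f}")
    # Updated risk for a new patient: sigmoid(intercept + slope * lp)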
NASA Astrophysics Data System (ADS)
Reyer, D.; Philipp, S. L.
2014-09-01
Information about geomechanical and physical rock properties, particularly uniaxial compressive strength (UCS), is needed for geomechanical model development and updating with logging-while-drilling methods to minimise the costs and risks of the drilling process. The following parameters, important at different stages of geothermal exploitation and drilling, are presented for typical sedimentary and volcanic rocks of the Northwest German Basin (NWGB): physical parameters (P-wave velocities, porosity, and bulk and grain density) and geomechanical parameters (UCS, static Young's modulus, destruction work and indirect tensile strength, both perpendicular and parallel to bedding) for 35 rock samples from quarries and 14 core samples of sandstones and carbonate rocks. With regression analyses (linear and non-linear), empirical relations are developed to predict UCS values from all other parameters. Analyses focus on sedimentary rocks and were repeated separately for clastic or carbonate rock samples as well as for outcrop or core samples. The empirical relations have high statistical significance for Young's modulus, tensile strength and destruction work; for the physical properties, there is a wider scatter of data and prediction of UCS is less precise. For most relations, properties of core samples plot within the scatter of outcrop samples and lie within the 90% prediction bands of the developed regression functions. The results indicate the applicability of empirical relations based on outcrop data to questions related to drilling operations, when the database contains a sufficient number of samples with varying rock properties. The presented equations may help to predict UCS values for sedimentary rocks at depth, and thus to develop suitable geomechanical models for adapting the drilling strategy to rock mechanical conditions in the NWGB.
NASA Astrophysics Data System (ADS)
Fu, Y.; Yang, W.; Xu, O.; Zhou, L.; Wang, J.
2017-04-01
To address time-variant and nonlinear characteristics in industrial processes, a soft sensor modelling method based on time difference, moving-window recursive partial least squares (PLS) and adaptive model updating is proposed. In this method, time-difference values of input and output variables are used as training samples to construct the model, which can reduce the effects of nonlinear characteristics on modelling accuracy while retaining the advantages of the recursive PLS algorithm. To avoid an excessively high model updating frequency, a confidence value is introduced, which is updated adaptively according to the results of model performance assessment; once the confidence value is updated, the model can be updated. The proposed method has been used to predict the 4-carboxy-benzaldehyde (CBA) content in the purified terephthalic acid (PTA) oxidation reaction process. The results show that the proposed soft sensor modelling method can reduce computation effectively, improve prediction accuracy by making use of process information, and reflect the process characteristics accurately.
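A hedged Python sketch of the scheme's control flow follows, using scikit-learn's PLSRegression on time-difference samples in a moving window; the confidence check is reduced here to a prediction-error threshold, and the window size, component count, and threshold are assumptions.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    class AdaptiveTDPLS:
        def __init__(self, window=200, n_components=3, err_threshold=0.5):
            self.window, self.err_threshold = window, err_threshold
            self.pls = PLSRegression(n_components=n_components)
            self.X, self.y = [], []

        def _fit(self):
            # Refit on the most recent moving window of time-difference samples.
            self.pls.fit(np.array(self.X[-self.window:]), np.array(self.y[-self.window:]))

        def step(self, dx, dy):
            # dx, dy are time differences: x_t - x_{t-1}, y_t - y_{t-1}.
            pred = None
            if len(self.X) >= self.window:
                pred = self.pls.predict(dx.reshape(1, -1))[0, 0]
                if abs(pred - dy) > self.err_threshold:    # confidence check fails
                    self.X.append(dx); self.y.append(dy)
                    self._fit()                            # update only when needed
                    return pred
            self.X.append(dx); self.y.append(dy)
            if len(self.X) == self.window:
                self._fit()                                # initial training
            return pred

    # Usage on assumed 5-variable process data with a noisy linear response:
    rng = np.random.default_rng(0)
    model = AdaptiveTDPLS()
    for _ in range(300):
        dx = rng.normal(size=5)
        dy = float(dx @ np.array([1.0, 0.5, 0.0, -0.3, 0.2])) + rng.normal(scale=0.05)
        model.step(dx, dy)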
Update rules and interevent time distributions: slow ordering versus no ordering in the voter model.
Fernández-Gracia, J; Eguíluz, V M; San Miguel, M
2011-07-01
We introduce a general methodology of update rules accounting for arbitrary interevent time (IET) distributions in simulations of interacting agents. We consider in particular update rules that depend on the state of the agent, so that the update becomes part of the dynamical model. As an illustration we consider the voter model in fully connected, random, and scale-free networks with an activation probability inversely proportional to the time since the last action, where an action can be an update attempt (an exogenous update) or a change of state (an endogenous update). We find that in the thermodynamic limit, at variance with standard updates and the exogenous update, the system orders slowly for the endogenous update. The approach to the absorbing state is characterized by a power-law decay of the density of interfaces, and we observe that the mean time to reach the absorbing state might not be well defined. The IET distributions resulting from both update schemes show power-law tails.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-10
... care transactions results in higher use of EDI by health care providers.\\8\\ We expect usage of EFT and..., surveys, and straw polls, and shared updates on the CAQH CORE and NACHA Web sites. On August 1, 2011 CAQH...
Indiana's forest resources, 2006
C.W. Woodall; J. Gallion
2007-01-01
This publication provides an overview of forest resource attributes for this state based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the USDA Forest Service. These annual estimates, along with web-posted core tables, will be updated annually.
MISR CMVs and Multiangular Views of Tropical Cyclone Inner-Core Dynamics
NASA Technical Reports Server (NTRS)
Wu, Dong L.; Diner, David J.; Garay, Michael J; Jovanovic, Veljko M.; Lee, Jae N.; Moroney, Catherine M.; Mueller, Kevin J.; Nelson, David L.
2010-01-01
Multi-camera stereo imaging of cloud features from the MISR (Multiangle Imaging SpectroRadiometer) instrument on NASA's Terra satellite provides accurate and precise measurements of cloud top heights (CTH) and cloud motion vector (CMV) winds. MISR observes each cloudy scene from nine viewing angles (nadir, ±26°, ±46°, ±60°, ±70°) with approximately 275-m pixel resolution. This paper provides an update on MISR CMV and CTH algorithm improvements, and explores a high-resolution retrieval of tangential winds inside the eyewall of tropical cyclones (TC). The MISR CMV and CTH retrievals from the updated algorithm are significantly improved in terms of spatial coverage and systematic errors. A new product, the 1.1-km cross-track wind, provides high accuracy and precision in measuring convective outflows. Preliminary results obtained from the 1.1-km tangential wind retrieval inside the TC eyewall show that the inner-core rotation is often faster near the eyewall, and this faster rotation appears to be related linearly to cyclone intensity.
Nighttime Foreground Pedestrian Detection Based on Three-Dimensional Voxel Surface Model.
Li, Jing; Zhang, Fangbing; Wei, Lisong; Yang, Tao; Lu, Zhaoyang
2017-10-16
Pedestrian detection is among the most frequently-used preprocessing tasks in many surveillance application fields, from low-level people counting to high-level scene understanding. Even though many approaches perform well in the daytime with sufficient illumination, pedestrian detection at night is still a critical and challenging problem for video surveillance systems. To respond to this need, in this paper, we provide an affordable solution with a near-infrared stereo network camera, as well as a novel three-dimensional foreground pedestrian detection model. Specifically, instead of using an expensive thermal camera, we build a near-infrared stereo vision system with two calibrated network cameras and near-infrared lamps. The core of the system is a novel voxel surface model, which is able to estimate the dynamic changes of three-dimensional geometric information of the surveillance scene and to segment and locate foreground pedestrians in real time. A free update policy for unknown points is designed for model updating, and the extracted shadow of the pedestrian is adopted to remove foreground false alarms. To evaluate the performance of the proposed model, the system is deployed in several nighttime surveillance scenes. Experimental results demonstrate that our method is capable of nighttime pedestrian segmentation and detection in real time under heavy occlusion. In addition, the qualitative and quantitative comparison results show that our work outperforms classical background subtraction approaches and a recent RGB-D method, as well as achieving comparable performance with the state-of-the-art deep learning pedestrian detection method even with a much lower hardware cost.
Gerwin, Philip M; Norinsky, Rada M; Tolwani, Ravi J
2018-03-01
Laboratory animal programs and core laboratories often set service rates based on cost estimates. However, actual costs may be unknown, and service rates may not reflect the actual cost of services. Accurately evaluating the actual costs of services can be challenging and time-consuming. We used a time-driven activity-based costing (ABC) model to determine the cost of services provided by a resource laboratory at our institution. The time-driven approach is a more efficient approach to calculating costs than using a traditional ABC model. We calculated only 2 parameters: the time required to perform an activity and the unit cost of the activity based on employee cost. This method allowed us to rapidly and accurately calculate the actual cost of services provided, including microinjection of a DNA construct, microinjection of embryonic stem cells, embryo transfer, and in vitro fertilization. We successfully implemented a time-driven ABC model to evaluate the cost of these services and the capacity of labor used to deliver them. We determined how actual costs compared with current service rates. In addition, we determined that the labor supplied to conduct all services (10,645 min/wk) exceeded the practical labor capacity (8400 min/wk), indicating that the laboratory team was highly efficient and that additional labor capacity was needed to prevent overloading of the current team. Importantly, this time-driven ABC approach allowed us to establish a baseline model that can easily be updated to reflect operational changes or changes in labor costs. We demonstrated that a time-driven ABC model is a powerful management tool that can be applied to other core facilities as well as to entire animal programs, providing valuable information that can be used to set rates based on the actual cost of services and to improve operating efficiency.
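The arithmetic at the heart of the time-driven approach is simple enough to show directly: multiply the minutes an activity takes by the per-minute labor cost, and compare total minutes demanded with practical capacity. In this Python sketch the 8400 min/wk capacity comes from the abstract; every other number is an invented placeholder.

    PRACTICAL_CAPACITY = 8400                  # practical labor capacity (min/wk)
    cost_per_min = 1.10                        # assumed fully loaded labor cost ($/min)

    minutes_per_service = {                    # assumed activity times (min)
        "DNA microinjection": 90,
        "ES-cell microinjection": 120,
        "embryo transfer": 45,
        "in vitro fertilization": 150,
    }
    weekly_demand = {                          # assumed services delivered per week
        "DNA microinjection": 30, "ES-cell microinjection": 20,
        "embryo transfer": 60, "in vitro fertilization": 15,
    }

    unit_cost = {s: m * cost_per_min for s, m in minutes_per_service.items()}
    used = sum(minutes_per_service[s] * n for s, n in weekly_demand.items())
    print(unit_cost)
    print(f"labor used: {used} min/wk vs capacity: {PRACTICAL_CAPACITY} min/wk")
    # used > capacity signals overloading, mirroring the finding above.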
Seismological Modeling of the Delta Scuti Star: CD-24 7599
NASA Astrophysics Data System (ADS)
Bradley, Paul A.; Guzik, Joyce A.
1996-01-01
A major goal of asteroseismology is a better understanding of stellar evolution via "snapshots" of many stars of different masses in different evolutionary states. For stars of about 2 M☉ near the main sequence, δ Scuti stars are the usual suspects. There is an ongoing renaissance in theoretical modeling of δ Scuti stars brought on by improvements in constitutive physics and by a dramatic increase in the number of modes observed. FG Virginis and CD-24°7599 are two of the best studied objects, with 19 and 13 known frequencies, respectively. We create models using an updated and modified version of the Iben code described by Guzik & Cox that includes either of the two versions of the OPAL opacities. We use the star's observed location on the H-R diagram as a starting point for our seismological modeling. Because there is no evidence for observed l = 3 modes, we only consider l = 0, 1, and 2 modes in our analysis. We take into account rotational splitting (about 5-10 μHz) in our frequency matching. Several observed modes must be rotationally split members of a given mode. CD-24°7599 is less than halfway through core hydrogen burning, and the modes appear to be a set of consecutive 3rd through 5th overtones of l = 0 through 2 modes. With only 13 modes, we find satisfactory fits with models between 1.9 and 2.0 M☉ that fall within the observed luminosity and effective temperature range. By contrast, Guzik & Bradley suggest that FG Virginis is over halfway through core hydrogen burning and that the best-fitting models lie near 1.80 or 2.00 M☉. We see persistent discrepancies in some low frequency modes, which suggests we may need a small amount of core overshoot or a slight change in metallicity to duplicate FG Virginis.
Mathelier, Anthony; Zhao, Xiaobei; Zhang, Allen W.; Parcy, François; Worsley-Hunt, Rebecca; Arenillas, David J.; Buchman, Sorana; Chen, Chih-yu; Chou, Alice; Ienasescu, Hans; Lim, Jonathan; Shyr, Casper; Tan, Ge; Zhou, Michelle; Lenhard, Boris; Sandelin, Albin; Wasserman, Wyeth W.
2014-01-01
JASPAR (http://jaspar.genereg.net) is the largest open-access database of matrix-based nucleotide profiles describing the binding preference of transcription factors from multiple species. The fifth major release greatly expands the heart of JASPAR—the JASPAR CORE subcollection, which contains curated, non-redundant profiles—with 135 new curated profiles (74 in vertebrates, 8 in Drosophila melanogaster, 10 in Caenorhabditis elegans and 43 in Arabidopsis thaliana; a 30% increase in total) and 43 older updated profiles (36 in vertebrates, 3 in D. melanogaster and 4 in A. thaliana; a 9% update in total). The new and updated profiles are mainly derived from published chromatin immunoprecipitation-seq experimental datasets. In addition, the web interface has been enhanced with advanced capabilities in browsing, searching and subsetting. Finally, the new JASPAR release is accompanied by a new BioPython package, a new R tool package and a new R/Bioconductor data package to facilitate access for both manual and automated methods. PMID:24194598
NASA Astrophysics Data System (ADS)
Clary, W. A.; Worthington, L. L.; Daigle, H.; Slagle, A. L.; Gulick, S. P. S.
2016-12-01
Sediments offshore Southern Alaska offer a natural laboratory to study glacial erosion, sediment deposition, and orogenesis. A major goal of Integrated Ocean Drilling Program (IODP) Expedition 341 was the investigation of interrelationships among tectonic processes, paleoclimate, and glacial activity. Here, we focus on core-log-seismic integration of IODP Sites U1420 and U1421 on the shallow shelf and slope near the Bering Trough, a glacially derived shelf-crossing landform. These sites sample glacial and marine sediments that record a history of sedimentation following the onset of glacial intensification near the mid-Pleistocene transition (1.2 Ma) and Yakutat microplate convergence with North America. Ocean drilling provides important stratigraphic, physical-property, and age data in depth, which support development of a stratigraphic model that can be extended across the shelf if carefully calibrated to local and regional seismic surveys. We use high-resolution multichannel seismic, core, and logging data to develop a time-depth relationship (TDR) and update the developing chronostratigraphic model based on correlation of seismic sequence boundaries and drilling-related data, including biostratigraphic and paleomagnetic age controls. We calibrate, combine, and interpolate core and logging data at each site to minimize gaps in physical property information and generate synthetic seismic traces. At Site U1421, vertical seismic profiling further constrains the TDR and provides input for the initial velocity model during the tie. Finally, we match reflectors in the synthetic trace with events in nearby seismic reflection data to establish a TDR at each site. We can use this relationship to better interpret the development of the Bering Trough, a recurring and favored path for ice streams and glacial advance. Initial results suggest late Pleistocene sedimentation rates of at least 1 km/m.y. on average, with variable rates that are possibly correlated with paleoenvironmental indicators such as sea-ice-related diatom species.
A Probabilistic Approach to Model Update
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Reaves, Mercedes C.; Voracek, David F.
2001-01-01
Finite element models are often developed for load validation, structural certification, response predictions, and to study alternate design concepts. On rare occasions, models developed with a nominal set of parameters agree with experimental data without the need to update parameter values. Today, model updating is generally heuristic and often performed by a skilled analyst with in-depth understanding of the model assumptions. Parameter uncertainties play a key role in understanding the model update problem, and therefore probabilistic analysis tools, developed for reliability and risk analysis, may be used to incorporate uncertainty into the analysis. In this work, probability analysis (PA) tools are used to aid the parameter update task using experimental data and some basic knowledge of potential error sources. Discussed here is the first application of PA tools to update parameters of a finite element model for a composite wing structure. Static deflection data at six locations are used to update five parameters. It is shown that while individual response values may not be matched identically, the system response is significantly improved with moderate changes in parameter values.
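In the same spirit, though not the paper's specific PA toolchain, parameter updating under uncertainty can be sketched as simple Monte Carlo filtering: draw parameter samples from a prior, score each by how well a model reproduces measured deflections, and keep the best-scoring samples as an approximate posterior. The toy two-parameter static model and all numbers are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def deflections(k):
        # Toy static model: deflection = load / stiffness at six locations,
        # three locations governed by each of two stiffness parameters.
        F = np.array([1.0, 2.0, 1.5])
        return np.concatenate([F / k[0], F / k[1]])

    truth = np.array([4.0, 7.0])                       # "unknown" stiffnesses
    measured = deflections(truth) + rng.normal(0, 0.01, 6)

    prior = rng.uniform([1.0, 1.0], [10.0, 10.0], size=(20000, 2))
    mse = np.array([np.mean((deflections(k) - measured) ** 2) for k in prior])
    posterior = prior[np.argsort(mse)[:500]]           # keep best-fitting 2.5%
    print(posterior.mean(axis=0), posterior.std(axis=0))  # concentrates near truth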
NASA Technical Reports Server (NTRS)
Bower, Chad; Padilla, Sebastian; Iacomini, Christie; Paul, Heather L.
2010-01-01
This paper details the validation of modeling methods for the three core components of a Metabolic heat regenerated Temperature Swing Adsorption (MTSA) subassembly, developed for use in a Portable Life Support System (PLSS). The first core component in the subassembly is a sorbent bed, used to capture and reject metabolically produced carbon dioxide (CO2). The sorbent bed performance can be augmented with a temperature swing driven by a liquid CO2 (LCO2) sublimation heat exchanger (SHX) for cooling the sorbent bed, and a condensing, icing heat exchanger (CIHX) for warming the sorbent bed. As part of the overall MTSA effort, scaled design validation test articles for each of these three components have been independently tested in laboratory conditions. Previously described modeling methodologies developed for implementation in Thermal Desktop and SINDA/FLUINT are reviewed and updated, their application in test article models outlined, and the results of those model correlations relayed. Assessment of the applicability of each modeling methodology to the challenge of simulating the response of the test articles and their extensibility to a full scale integrated subassembly model is given. The independent verified and validated modeling methods are applied to the development of a MTSA subassembly prototype model and predictions of the subassembly performance are given. These models and modeling methodologies capture simulation of several challenging and novel physical phenomena in the Thermal Desktop and SINDA/FLUINT software suite. Novel methodologies include CO2 adsorption front tracking and associated thermal response in the sorbent bed, heat transfer associated with sublimation of entrained solid CO2 in the SHX, and water mass transfer in the form of ice as low as 210 K in the CIHX.
The Role of the Mathematics Supervisor in K-12 Education
ERIC Educational Resources Information Center
Greenes, Carole
2013-01-01
The implementation of "the Common Core Standards for Mathematics" and the assessments of those concepts, skills, reasoning methods, and mathematical practices that are in development necessitate the updating of teachers' knowledge of content, pedagogical techniques to enhance engagement and persistence, and strategies for responding to…
School Library Certification Requirements: 1978 Update
ERIC Educational Resources Information Center
Franklin, Ann Y.
1978-01-01
State certification requirements are listed for school librarians and media specialists. Two charts are included for comparision and study: the first delineates basic core courses or subject areas as required by states, and the second gives certificate information, number of hours required, audiovisual education information, and the accreditation…
76 FR 3877 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-21
...: Communication and Outreach. OMB Control Number: None. Form Number(s): None. Type of Request: Emergency... be used to revise/ update its collateral materials, outreach strategies and program services in a manner that effectively matches the interests and needs of its core constituency and outreach strategies...
NASA Technical Reports Server (NTRS)
Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant
2014-01-01
In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.
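To give a flavor of the integrated-commodities DES approach (using the open-source simpy package as a stand-in for the Rockwell Arena model, with wholly invented numbers), the sketch below runs countdowns that draw on a shared commodity tank and scrub with a 48-hour turnaround whenever the tank cannot cover the demand.

    import random
    import simpy  # third-party DES library: pip install simpy

    def countdown(env, lh2, log):
        while True:
            need = random.uniform(40, 60)              # tanking demand (units, assumed)
            if lh2.level >= need:
                yield lh2.get(need)                    # commodity consumed; launch
                log.append(("launch", env.now))
                break
            log.append(("scrub", env.now))             # insufficient commodity: scrub
            yield env.timeout(48)                      # 48-hour scrub turnaround
            yield lh2.put(min(30, lh2.capacity - lh2.level))  # resupply meanwhile

    env = simpy.Environment()
    lh2 = simpy.Container(env, capacity=100, init=35)  # shared hydrogen storage
    log = []
    env.process(countdown(env, lh2, log))
    env.run()
    print(log)  # e.g. [('scrub', 0), ('launch', 48.0)]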
Nuclear Engine System Simulation (NESS). Version 2.0: Program user's guide
NASA Technical Reports Server (NTRS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman
1993-01-01
This Program User's Guide discusses the Nuclear Thermal Propulsion (NTP) engine system design features and capabilities modeled in the Nuclear Engine System Simulation (NESS): Version 2.0 program (referred to as NESS throughout the remainder of this document), as well as its operation. NESS was upgraded to include many new modeling capabilities not available in the original version delivered to NASA LeRC in Dec. 1991. NESS's new features include the following: (1) an improved input format; (2) an advanced solid-core NERVA-type reactor system model (ENABLER 2); (3) a bleed-cycle engine system option; (4) an axial-turbopump design option; (5) an automated pump-out turbopump assembly sizing option; (6) an off-design gas generator engine cycle design option; (7) updated hydrogen properties; (8) an improved output format; and (9) personal computer operation capability. Sample design cases are presented in the user's guide that demonstrate many of the new features associated with this upgraded version of NESS, as well as design modeling features associated with the original version of NESS.
Hodges, Mary K.V.; Davis, Linda C.; Bartholomay, Roy C.
2018-01-30
In 1990, the U.S. Geological Survey, in cooperation with the U.S. Department of Energy Idaho Operations Office, established the Lithologic Core Storage Library at the Idaho National Laboratory (INL). The facility was established to consolidate, catalog, and permanently store nonradioactive drill cores and cuttings from subsurface investigations conducted at the INL, and to provide a location for researchers to examine, sample, and test these materials. The facility is open by appointment to researchers for examination, sampling, and testing of cores and cuttings. This report describes the facility and the cores and cuttings stored at the facility. Descriptions of cores and cuttings include the corehole names, corehole locations, and depth intervals available. Most cores and cuttings stored at the facility were drilled at or near the INL, on the eastern Snake River Plain; however, two cores drilled on the western Snake River Plain are stored for comparative studies. Basalt, rhyolite, sedimentary interbeds, and surficial sediments compose most cores and cuttings, most of which are continuous from land surface to their total depth. The deepest continuously drilled core stored at the facility was drilled to 5,000 feet below land surface. This report describes procedures and researchers' responsibilities for access to the facility and for examination, sampling, and return of materials.
Visualization assisted by parallel processing
NASA Astrophysics Data System (ADS)
Lange, B.; Rey, H.; Vasques, X.; Puech, W.; Rodriguez, N.
2011-01-01
This paper discusses the experimental results of our visualization model for data extracted from sensors. The objective is to find a computationally efficient method to produce real-time rendering of a large amount of data. We developed a visualization method to monitor the temperature variance of a data center. Sensors are placed on three layers and do not cover the whole room. We use a particle paradigm to interpolate the sensor data; particles model the "space" of the room. In this work we partition the particle set using two mathematical methods, Delaunay triangulation and Voronoi cells, presented by Avis and Bhattacharya. Particles provide information on the room temperature at different coordinates over time. To locate and update particle data we define a computational cost function. To solve this function efficiently, we use a client-server paradigm: the server computes the data and clients display it on different kinds of hardware. This paper is organized as follows. The first part presents related algorithms used to visualize large flows of data. The second part presents the different platforms and methods that were evaluated to determine the best solution for the proposed task. The benchmark uses the computational cost of our algorithm, based on locating particles relative to sensors and on updating particle values. The benchmark was run on a personal computer using CPU, multi-core, GPU, and hybrid GPU/CPU programming. GPU programming is a growing research field; it allows real-time rendering instead of precomputed rendering. To improve our results, we also ran our algorithm on a High Performance Computing (HPC) cluster; this benchmark was used to improve the multi-core method. HPC is commonly used in data visualization (astronomy, physics, etc.) to improve rendering and achieve real time.
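As a rough illustration of the sensor-to-particle interpolation step (not the paper's GPU implementation), the sketch below uses SciPy's Delaunay-based LinearNDInterpolator to assign temperatures to a grid of particles; sensor positions and readings are random placeholders.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator  # Delaunay-based internally

rng = np.random.default_rng(0)
sensors = rng.uniform(0.0, 10.0, size=(30, 3))   # scattered sensor positions [m]
readings = 18.0 + 10.0 * rng.random(30)          # temperatures [deg C]

interp = LinearNDInterpolator(sensors, readings, fill_value=np.nan)

# Particle set: a coarse regular grid modeling the "space" of the room.
xs = np.linspace(0.5, 9.5, 20)
particles = np.array(np.meshgrid(xs, xs, xs)).reshape(3, -1).T
temps = interp(particles)                        # one interpolated value per particle

print(f"{np.isfinite(temps).sum()} particles lie inside the sensor hull")
```

Particles outside the convex hull of the sensors get NaN here, which mirrors the paper's observation that the sensors do not cover the whole room.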
NASA Astrophysics Data System (ADS)
Guo, Ning; Yang, Zhichun; Wang, Le; Ouyang, Yan; Zhang, Xinping
2018-05-01
Aiming at providing a precise dynamic structural finite element (FE) model for dynamic strength evaluation in addition to dynamic analysis, a dynamic FE model updating method is presented to correct the uncertain parameters of the FE model of a structure using strain mode shapes and natural frequencies. The strain mode shape, which is sensitive to local changes in the structure, is used instead of the displacement mode for enhancing model updating. The coordinate strain modal assurance criterion is developed to evaluate the correlation level at each coordinate between the experimental and the analytical strain mode shapes. Moreover, the natural frequencies, which provide global information about the structure, are used to guarantee the accuracy of the modal properties of the global model. Then, the weighted summation of the natural frequency residual and the coordinate strain modal assurance criterion residual is used as the objective function in the proposed dynamic FE model updating procedure. A hybrid genetic/pattern-search optimization algorithm is adopted to perform the updating. Numerical simulation and a model updating experiment for a clamped-clamped beam are performed to validate the feasibility and effectiveness of the present method. The results show that the proposed method can update the uncertain parameters with good robustness, and that the updated dynamic FE model of the beam structure, which correctly predicts both the natural frequencies and the local dynamic strains, is reliable for subsequent dynamic analysis and dynamic strength evaluation.
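A sketch of what such a criterion and objective might look like, using the common COMAC-style definition per coordinate; the paper's exact formulation may differ.

```python
import numpy as np

def coordinate_mac(phi_exp, phi_ana):
    """COMAC-style correlation per coordinate; phi_*: (n_coords, n_modes)."""
    num = np.abs(phi_exp * phi_ana).sum(axis=1) ** 2
    den = (phi_exp**2).sum(axis=1) * (phi_ana**2).sum(axis=1)
    return num / den          # 1.0 = perfect correlation at that coordinate

def objective(f_exp, f_ana, phi_exp, phi_ana, w_f=1.0, w_c=1.0):
    freq_res = np.sum(((f_ana - f_exp) / f_exp) ** 2)           # global information
    comac_res = np.sum(1.0 - coordinate_mac(phi_exp, phi_ana))  # local information
    return w_f * freq_res + w_c * comac_res   # minimized by the hybrid optimizer
```

The optimizer (genetic search followed by pattern search, in the paper) would evaluate this objective for each candidate set of uncertain FE parameters.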
Static and Dynamic Model Update of an Inflatable/Rigidizable Torus Structure
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Reaves, Mercedes C.
2006-01-01
The present work addresses the development of an experimental and computational procedure for validating finite element models. A torus structure, part of an inflatable/rigidizable Hexapod, is used to demonstrate the approach. Because of fabrication, material, and geometric uncertainties, a statistical approach combined with optimization is used to modify key model parameters. Static test results are used to update stiffness parameters, and dynamic test results are used to update the mass distribution. Updated parameters are computed using gradient-based and non-gradient-based optimization algorithms. Results show significant improvements in model predictions after the parameters are updated. Lessons learned in the areas of test procedures, modeling approaches, and uncertainty quantification are presented.
The CoreWall Project: An Update for 2007
NASA Astrophysics Data System (ADS)
Yu-Chung Chen, J.; Higgins, S.; Hur, H.; Ito, E.; Jenkins, C. J.; Johnson, A.; Leigh, J.; Morin, P.; Lee, J.
2007-12-01
The CoreWall Suite is an NSF-supported collaborative development of real-time core description (Corelyzer), stratigraphic correlation (Correlator), and data visualization (CoreNavigator) software to be used by the marine, terrestrial, and Antarctic science communities. The overall goal of the CoreWall software development is to bring portable, cross-platform tools to the broader drilling and coring communities to expand and enhance data visualization and collaborative integration of multiple datasets. The CoreWall Project is now in its second year, and significant progress has been made on all three software components. Corelyzer has undergone two field deployments and testing, by the ANDRILL program in 2006 (and again in Fall 2007) and by ICDP's SAFOD project (summer 2007). In addition, the CoreWall group and ICDP are working together so that the core description (DIS) system can feed DIS core data directly into Corelyzer seamlessly and be available to future ICDP and IODP Mission-Specific Platform expeditions. Educators have also taken note of the software's ease of use and strong visualization capabilities and have begun exploring curriculum projects with the Corelyzer software. To ensure that the software development is integrated with other community IT activities and with the development of the U.S. IODP Phase 2 Scientific Ocean Drilling Vessel (SODV), a Steering Committee was constituted. It is composed of key U.S. IODP and related database (e.g., CHRONOS, SedDB) developers and users as well as representatives of other core-based enterprises (e.g., ANDRILL, ICDP, LacCore). Corelyzer (CoreWall's main visual core description tool) displays digital core images from one or more cores along with discrete data streams (e.g., physical properties, downhole logs) and nested images (e.g., thin sections, fossils) to provide a robust approach to the description of sediment cores. Corelyzer's digital image handling allows cores to be viewed from micron to kilometer scale, determined by the image resolution, along a sliding plane, effectively making it a "digital microscope". Detailed features such as lithologic variation, macroscopic grain size variation, bioturbation intensity, chemical composition, and micropaleontology are easier to interpret and annotate. Significant new capabilities have been added to allow importing multiple images and data types, sharing/exporting Corelyzer "work sessions" for multiple users, enhanced annotations, and support for other activities such as examining clasts and sample requests. The new Correlator software, an updated version of the Splicer/Sagan software used by ODP for over 10 years, has been ported into a single new analysis tool that will work across multiple platforms and interact seamlessly with JANUS (ODP's relational database), CHRONOS, PetDB, SedDB, dbSEABED, and other databases. This functionality will result in a CoreWall Suite module that can be used and distributed anywhere for stratigraphic and age correlation tasks. CoreNavigator, a spatial data discovery tool, has taken on a virtual globe interface that allows users to enter Corelyzer from a geographic-visual standpoint.
Monte Carlo modelling of TRIGA research reactor
NASA Astrophysics Data System (ADS)
El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.
2010-10-01
The Moroccan 2 MW TRIGA MARK II research reactor at Centre des Etudes Nucléaires de la Maâmora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively support various fields of basic nuclear research, manpower training, and production of radioisotopes for use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparison with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) of the Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous-energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents all components of the core in detail, with essentially no physical approximation. Continuous-energy cross-section data from the most recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3), as well as S(α, β) thermal neutron scattering functions distributed with the MCNP code, were used. The cross-section libraries were generated using the NJOY99 system updated to its most recent patch file "up259". The consistency and accuracy of both the Monte Carlo simulation and the neutron transport physics were established by benchmarking against the TRIGA experiments. Core excess reactivity, total and integral control rod worths, as well as power peaking factors were used in the validation process. Results of the calculations are analysed and discussed.
Dynamic Simulation of Human Thermoregulation and Heat Transfer for Spaceflight Applications
NASA Technical Reports Server (NTRS)
Miller, Thomas R.; Nelson, David A.; Bue, Grant; Kuznetz, Lawrence
2011-01-01
Models of human thermoregulation and heat transfer date from the early 1970s and have been developed for applications ranging from evaluating thermal comfort in spacecraft and aircraft cabin environments to predicting heat stress during EVAs. Most lumped or compartment models represent the body as an assemblage of cylindrical and spherical elements, which may be subdivided into layers to describe tissue heterogeneity. Many existing models are of limited usefulness in asymmetric thermal environments, such as may be encountered during an EVA. Conventional whole-body clothing models also limit the ability to describe local surface thermal and evaporation effects in sufficient detail. A further limitation is that models based on a standard-man model are not readily scalable to represent large or small subjects. This work describes the development of a new human thermal model derived from the 41-node man model. Each segment is divided into four concentric, constant-thickness cylinders made up of a central core surrounded by muscle, fat, and skin, respectively. These cylinders are connected by the flow of blood from a central blood pool to each part. The central blood pool is updated at each time step, based on a whole-body energy balance. Results show the model simulates core and surface temperature histories, sweat evaporation, and metabolic rates that are generally consistent with controlled exposures of human subjects. Scaling rules are developed to enable simulation of small and large subjects (5th percentile and 95th percentile). Future refinements will include a clothing model that addresses local surface insulation and permeation effects, and control equations to describe thermoregulatory effects such as may occur with prolonged weightlessness or with aging.
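A minimal sketch of the central-blood-pool update described above, with all coefficients invented for illustration rather than taken from the 41-node man model:

```python
import numpy as np

n, dt = 10, 1.0                      # body segments, time step [s]
T_core = np.full(n, 36.8)            # segment core temperatures [deg C]
C_core = np.full(n, 8.0e3)           # segment heat capacities [J/K] (assumed)
mdot_cp = np.full(n, 15.0)           # blood flow x specific heat [W/K] (assumed)
q_met = np.full(n, 20.0)             # metabolic heat per segment [W] (assumed)
UA_env = np.full(n, 1.5)             # loss to environment [W/K] (assumed)
T_env, T_pool, C_pool = 24.0, 37.0, 1.5e4   # environment and blood pool (assumed)

for _ in range(3600):                # simulate one hour
    q_blood = mdot_cp * (T_pool - T_core)    # pool -> core heat flow [W]
    q_env = UA_env * (T_core - T_env)        # core -> environment loss [W]
    T_core += (q_blood + q_met - q_env) * dt / C_core
    T_pool -= q_blood.sum() * dt / C_pool    # whole-body balance on the pool

print(f"pool {T_pool:.2f} C, core range {T_core.min():.2f}-{T_core.max():.2f} C")
```

Scaling to 5th- or 95th-percentile subjects amounts to rescaling the capacities, flows, and surface conductances segment by segment.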
Calhaz-Jorge, Carlos; Feki, Anis; Farquharson, Roy
2015-07-01
Specialist training in reproductive medicine within Europe continues to evolve. Recent revisions, updates, and initiatives have helped to refine the core educational needs for the specialist trainee. Copyright © 2015 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
The Real Face of Separatism. Quebec Report.
ERIC Educational Resources Information Center
Kelebay, Yarema Gregory
1997-01-01
Provides an update on the continuing debate over separatism for the Canadian province of Quebec. Briefly profiles "La Patente," a group that eventually became the organizational core of the separatist movement. Maintains that the separatist movement is a well-organized minority that doesn't reflect the majority opinion in Quebec. (MJP)
ERIC Educational Resources Information Center
What Works Clearinghouse, 2013
2013-01-01
"Scott Foresman-Addison Wesley Elementary Mathematics" is a core mathematics curriculum for students in prekindergarten through grade 6. The program aims to improve students' understanding of key math concepts through problem-solving instruction, hands-on activities, and math problems that involve reading and writing. The curriculum…
ERIC Educational Resources Information Center
Sargent, John
The Office of Technology Policy analyzed Bureau of Labor Statistics' growth projections for the core occupational classifications of IT (information technology) workers to assess future demand in the United States. Classifications studied were computer engineers, systems analysts, computer programmers, database administrators, computer support…
ERIC Educational Resources Information Center
Berenson, Mark L.; Koppel, Nicole B.; Lord, Richard A.; Chapdelaine, Laura L.
2018-01-01
Typically, the core-required undergraduate business statistics course covers a broad spectrum of topics with applications pertaining to all functional areas of business. The recently updated American Statistical Association's GAISE (Guidelines for Assessment and Instruction in Statistics Education) College Report once again stresses the…
Gorst, Sarah L; Gargon, Elizabeth; Clarke, Mike; Smith, Valerie; Williamson, Paula R
2016-01-01
The COMET (Core Outcome Measures in Effectiveness Trials) Initiative promotes the development and application of core outcome sets (COS), including relevant studies in an online database. In order to keep the database current, an annual search of the literature is undertaken. This study aimed to update a previous systematic review, in order to identify any further studies where a COS has been developed. Furthermore, no prioritization for COS development had previously been undertaken; therefore, this study also aimed to identify COS relevant to the world's most prevalent health conditions. The methods used in this updated review followed the same approach used in the original review and the previous update. A survey was also sent to the corresponding authors of COS identified for inclusion in this review, to ascertain what lessons they had learnt from developing their COS. Additionally, the COMET database was searched to identify COS that might be relevant to the conditions with the highest global prevalence. Twenty-five reports relating to 22 new studies were eligible for inclusion in the review. Further improvements were identified in relation to the description of the scope of the COS, use of the Delphi technique, and the inclusion of patient participants within the development process. Additionally, 33 published and ongoing COS were identified for 13 of the world's most prevalent conditions. The development of a reporting guideline and minimum standards should contribute towards future improvements in the development and reporting of COS. This study has also described a first approach to identifying gaps in existing COS, and to priority setting in this area. Important gaps have been identified, on the basis of global burden of disease, and the development and application of COS in these areas should be considered a priority.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hwang, Ho-Ling; Davis, Stacy Cagle
2009-12-01
This report is designed to document the analysis process and estimation models currently used by the Federal Highway Administration (FHWA) to estimate off-highway gasoline consumption and public-sector fuel consumption. An overview of the entire FHWA attribution process is provided, along with specifics related to the latest update (2008) of the Off-Highway Gasoline Use Model and the Public Use of Gasoline Model. The Off-Highway Gasoline Use Model is made up of five individual modules, one for each of the off-highway categories: agricultural, industrial and commercial, construction, aviation, and marine. This 2008 update of the off-highway models was the second major update (the first was conducted during 2002-2003) since the models were originally developed in the mid-1990s. The agricultural model methodology, specifically, underwent a significant revision because of changes in data availability since 2003. Some revision to the model was necessary due to the removal of certain data elements used in the original estimation method. The revised agricultural model also made use of newly available information published by the data source agency in recent years. The other model methodologies were not drastically changed, though many data elements were updated to improve the accuracy of these models. Note that components of the Public Use of Gasoline Model were not updated in 2008. A major challenge in updating the estimation methods applied by the public-use model is that they would have to rely on significant new data collection efforts. In addition, due to resource limitations, several components of the models (both off-highway and public-use models) that utilized regression modeling approaches were not recalibrated under the 2008 study. An investigation of the Environmental Protection Agency's NONROAD2005 model was also carried out under the 2008 model update. Results generated from the NONROAD2005 model were analyzed, examined, and compared, to the extent possible on the overall totals, to the current FHWA estimates. Because the NONROAD2005 model was designed for emission estimation purposes (i.e., not for measuring fuel consumption), it covers different equipment populations from those the FHWA models were based on. Thus, a direct comparison generally was not possible in most sectors. As a result, NONROAD2005 data were not used in the 2008 update of the FHWA off-highway models. The quality of the fuel use estimates directly affects the data quality of many tables published in Highway Statistics. Although updates have been made to the Off-Highway Gasoline Use Model and the Public Use of Gasoline Model, some challenges remain due to aging model equations and the discontinuation of data sources.
Prediction-error variance in Bayesian model updating: a comparative study
NASA Astrophysics Data System (ADS)
Asadollahi, Parisa; Li, Jian; Huang, Yong
2017-04-01
In Bayesian model updating, the likelihood function is commonly formulated by stochastic embedding, in which the maximum-information-entropy probability model of the prediction error variances plays an important role; it is a Gaussian distribution subject to the first two moments as constraints. The selection of prediction error variances can be formulated as a model class selection problem, which automatically involves a trade-off between the average data-fit of the model class and the information it extracts from the data. Therefore, it is critical for robust updating of the structural model, especially in the presence of modeling errors. To date, three ways of considering prediction error variances have been seen in the literature: 1) setting constant values empirically, 2) estimating them based on the goodness-of-fit of the measured data, and 3) updating them as uncertain parameters by applying Bayes' Theorem at the model class level. In this paper, the effect of these different strategies for dealing with the prediction error variances on the model updating performance is investigated explicitly. A six-story shear building model with six uncertain stiffness parameters is employed as an illustrative example. Transitional Markov Chain Monte Carlo is used to draw samples of the posterior probability density function of the structural model parameters as well as the uncertain prediction variances. Different levels of modeling uncertainty and complexity are represented through three FE models: a true model, a model with more complexity, and a model with modeling error. Bayesian updating is performed for the three FE models considering the three aforementioned treatments of the prediction error variances. The effect of the number of measurements on the model updating performance is also examined in the study. The results are compared based on model class assessment and indicate that updating the prediction error variances as uncertain parameters at the model class level produces more robust results, especially when the number of measurements is small.
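For concreteness, here is a sketch of strategy 3), where the prediction-error variance is carried as an uncertain parameter inside the Gaussian log-likelihood (e.g., for use inside a TMCMC sampler); the forward model and data below are toy placeholders.

```python
import numpy as np

def log_likelihood(theta, data, model):
    """theta = (structural parameters ..., log prediction-error variance)."""
    *k, log_var = theta
    var = np.exp(log_var)                # positivity by construction
    resid = data - model(np.asarray(k))  # prediction errors
    n = resid.size
    # maximum-entropy (Gaussian) probability model for the prediction error:
    return -0.5 * (n * np.log(2.0 * np.pi * var) + resid @ resid / var)

# toy forward model: 'modal frequencies' from two stiffness parameters
model = lambda k: np.sqrt(k)
data = np.array([1.0, 1.4])
print(log_likelihood([1.02, 1.98, np.log(1e-4)], data, model))
```

Because the variance appears in both the normalization and the data-fit term, sampling it jointly with the stiffness parameters realizes the trade-off between average data-fit and extracted information that the abstract describes.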
Assessing the performance of eight real-time updating models and procedures for the Brosna River
NASA Astrophysics Data System (ADS)
Goswami, M.; O'Connor, K. M.; Bhattarai, K. P.; Shamseldin, A. Y.
2005-10-01
The flow forecasting performance of eight updating models, incorporated in the Galway River Flow Modelling and Forecasting System (GFMFS), was assessed using daily data (rainfall, evaporation, and discharge) of the Irish Brosna catchment (1207 km2), considering their one- to six-day lead-time discharge forecasts. The Perfect Forecast of Input over the Forecast Lead-time scenario was adopted, where required, in place of actual rainfall forecasts. The eight updating models were: (i) the standard linear Auto-Regressive (AR) model, applied to the forecast errors (residuals) of a simulation (non-updating) rainfall-runoff model; (ii) the Neural Network Updating (NNU) model, also using such residuals as input; (iii) the Linear Transfer Function (LTF) model, applied to the simulated and the recently observed discharges; (iv) the Non-linear Auto-Regressive eXogenous-Input Model (NARXM), also a neural network-type structure, but with wide options for using recently observed values of one or more of the three data series, together with non-updated simulated outflows, as inputs; (v) the Parametric Simple Linear Model (PSLM), of LTF type, using recent rainfall and observed discharge data; (vi) the Parametric Linear Perturbation Model (PLPM), also of LTF type, using recent rainfall and observed discharge data; (vii) n-AR, an AR model applied to the observed discharge series only, as a naïve updating model; and (viii) n-NARXM, a naïve form of the NARXM, using only the observed discharge data and excluding exogenous inputs. The five GFMFS simulation (non-updating) models used were the non-parametric and parametric forms of the Simple Linear Model and of the Linear Perturbation Model, the Linearly-Varying Gain Factor Model, the Artificial Neural Network Model, and the conceptual Soil Moisture Accounting and Routing (SMAR) model. As the SMAR model performed best among these models, in terms of the Nash-Sutcliffe R2 value, both in calibration and in verification, only its simulated outflows were used in the subsequent exercise of producing updated discharge forecasts. All eight updating models were found capable of producing relatively good lead-1 (1-day-ahead) forecasts, with R2 values of almost 90% or above. For longer lead times, however, only three updating models, viz. NARXM, LTF, and NNU, were found suitable, with lead-6 R2 values of about 90% or higher. Graphical comparisons were made of the lead-time forecasts for the two largest floods, one in the calibration period and the other in the verification period.
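As an illustration of updating model (i), the following sketch fits an AR model to simulation residuals and propagates it over the forecast lead time; the order and data are placeholders, not the GFMFS configuration.

```python
import numpy as np

def fit_ar(errors, p=2):
    """Least-squares AR(p) fit to a residual series; coeffs for lags 1..p."""
    X = np.column_stack([errors[p - 1 - i : len(errors) - 1 - i] for i in range(p)])
    y = errors[p:]
    return np.linalg.lstsq(X, y, rcond=None)[0]

def forecast_errors(coeffs, recent, lead=6):
    """Propagate the AR recursion 'lead' steps ahead; 'recent' is oldest-first."""
    state = list(recent)
    preds = []
    for _ in range(lead):
        e = float(np.dot(coeffs, state[::-1][:len(coeffs)]))
        preds.append(e)
        state.append(e)
    return np.array(preds)

rng = np.random.default_rng(0)
errors = np.sin(np.linspace(0.0, 20.0, 200)) + 0.1 * rng.standard_normal(200)
coeffs = fit_ar(errors, p=2)
corrections = forecast_errors(coeffs, errors[-2:], lead=6)
# updated forecast = simulated discharge + correction at each lead time
```

The correction decays toward zero at longer lead times, which is consistent with the finding that simple error-updating schemes lose their edge beyond a few days.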
An Online Risk Monitor System (ORMS) to Increase Safety and Security Levels in Industry
NASA Astrophysics Data System (ADS)
Zubair, M.; Rahman, Khalil Ur; Hassan, Mehmood Ul
2013-12-01
The main idea of this research is to develop an Online Risk Monitor System (ORMS) based on Living Probabilistic Safety Assessment (LPSA). The article highlights the essential features and functions of ORMS. The basic models and modules, such as the Reliability Data Update Model (RDUM), running time update, redundant system unavailability update, Engineered Safety Features (ESF) unavailability update, and general system update, are described in this study. ORMS not only provides quantitative analysis but also highlights qualitative aspects of risk measures. ORMS is capable of automatically updating the online risk models and reliability parameters of equipment, and can support the decision-making process of operators and managers in nuclear power plants.
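The abstract does not spell out the RDUM's mathematics; as one plausible sketch, a conjugate gamma-Poisson Bayesian update of a component failure rate, of the kind common in living PSA, is shown below with invented prior values.

```python
# Gamma prior on the failure rate: alpha pseudo-failures over beta pseudo-hours.
# All numbers are illustrative placeholders, not values from the ORMS paper.
alpha, beta = 1.5, 3.0e4

def update_failure_rate(alpha, beta, failures, hours):
    """Posterior after observing 'failures' in 'hours' of operating experience."""
    return alpha + failures, beta + hours

alpha, beta = update_failure_rate(alpha, beta, failures=1, hours=8760.0)
posterior_mean = alpha / beta          # updated point estimate [1/h]
print(f"updated failure rate: {posterior_mean:.2e} per hour")
```

Feeding such updated parameters back into the plant risk model at each step is what makes the monitored risk picture "living".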
NASA Astrophysics Data System (ADS)
Wyngaard, J.
2017-12-01
A key challenge in the burgeoning sector of the IoT (Internet of Things) is ensuring device and communication security. Ubuntu Core's approach to this is the use of 'snaps'. Alongside this growth, scientists are increasingly utilising the many new low-cost sensors now available. This work prototypes the use of snaps as a possible avenue to reducing the barrier to entry for scientific use of these low-cost sensors while also ensuring proper meta-data is captured. Snaps are contained applications that have been signed. This means that a snap application is unable to read or write to any area of the system beyond its assigned reach, thereby significantly limiting the possible impact of any break in security higher up the stack. Further, application and system updates are automatically verified as authentic before being applied. Additionally, on an embedded system running Ubuntu Core, the hardware interface (Gadget), kernel, and OS (Core) are all also snaps and therefore acquire these same gains. The result is an architecture that enables: (1) secure, robust, remote automatic updates of both the OS and applications; (2) a user-friendly deployment mechanism; and (3) an easy-to-maintain means of supporting multiple platforms. The above is primarily targeted at non-academic domains; however, it is proposed that the scientific community can benefit from it too. This work therefore prototypes a snap for sensors on board a small Unmanned Aircraft System (sUAS). For demonstration purposes this snap specifically targets connecting a popular low-cost CO2 meter to a Raspberry Pi 3 and the popular open-source sUAS autopilot Arducopter.
Impact of electron-captures on nuclei near N = 50 on core-collapse supernovae
NASA Astrophysics Data System (ADS)
Titus, R.; Sullivan, C.; Zegers, R. G. T.; Brown, B. A.; Gao, B.
2018-01-01
The sensitivity of the late stages of stellar core collapse to electron-capture rates on nuclei is investigated, with a focus on electron-capture rates for 74 nuclei with neutron numbers close to 50, just above doubly magic 78Ni. It is demonstrated that variations of about 50% in key characteristics of the evolution, such as the lepton fraction, electron fraction, entropy, stellar density, and in-fall velocity, are due to uncertainties in the electron-capture rates on nuclei in this region, even though thousands of nuclei are included in the simulations. The present electron-capture rate estimates used for the nuclei in this high-sensitivity region of the chart of isotopes are primarily based on a simple approximation, and it is shown that the estimated rates are likely too high, by an order of magnitude or more. Electron-capture rates based on Gamow-Teller strength distributions calculated in microscopic theoretical models will be required to obtain better estimates. Gamow-Teller distributions extracted from charge-exchange experiments performed at intermediate energies serve to guide the development of, and benchmark, the models. A previously compiled weak-rate library used in the astrophysical simulations was updated as part of the work presented here, by adding rate tables for nuclei near stability with mass numbers between 60 and 110.
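For context, the "simple approximation" in question is, to our reading, the single-transition parametrization of Langanke et al. (2003), in which each nucleus is assigned one effective Gamow-Teller transition:

```latex
\lambda_{\mathrm{EC}} \approx \frac{\ln 2 \cdot B}{K}
\left(\frac{T}{m_e c^2}\right)^{5}
\left[ F_4(\eta) - 2\chi\, F_3(\eta) + \chi^2 F_2(\eta) \right],
\qquad
\chi = \frac{Q - \Delta E}{T}, \quad \eta = \chi + \frac{\mu_e}{T},
```

with K = 6146 s, fitted constants B ≈ 4.6 and ΔE ≈ 2.5 MeV, μ_e the electron chemical potential, Q the reaction Q-value, T the temperature, and F_k the Fermi integrals of order k. Because B and ΔE are the same for every nucleus, the parametrization cannot capture the Pauli blocking near N = 50 that the measured Gamow-Teller distributions reveal.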
Faunt, Claudia C.; Hanson, Randall T.; Martin, Peter; Schmid, Wolfgang
2011-01-01
California's Central Valley has been one of the most productive agricultural regions in the world for more than 50 years. To better understand the groundwater availability in the valley, the U.S. Geological Survey (USGS) developed the Central Valley hydrologic model (CVHM). Because of recent water-level declines and renewed subsidence, the CVHM is being updated to better simulate the geohydrologic system. The CVHM updates and refinements can be grouped into two general categories: (1) model code changes and (2) data updates. The CVHM updates and refinements will require that the model be recalibrated. The updated CVHM will provide a detailed transient analysis of changes in groundwater availability and flow paths in relation to climatic variability, urbanization, stream flow, and changes in irrigated agricultural practices and crops. The updated CVHM is particularly focused on more accurately simulating the locations and magnitudes of land subsidence. The intent of the updated CVHM is to help scientists better understand the availability and sustainability of water resources and the interaction of groundwater levels with land subsidence.
About Non-Line-Of-Sight Satellite Detection and Exclusion in a 3D Map-Aided Localization Algorithm
Peyraud, Sébastien; Bétaille, David; Renault, Stéphane; Ortiz, Miguel; Mougel, Florian; Meizel, Dominique; Peyret, François
2013-01-01
Reliable GPS positioning in city environments is a key issue: signals are prone to multipath, and satellite geometry is poor in many streets. Using a 3D urban model to forecast satellite visibility in urban contexts, in order to improve GPS localization, is the main topic of the present article. The core of the method is a virtual image processing step that detects and eliminates possibly faulty measurements. This image is generated using the position estimated a priori by the navigation process itself, under road constraints. This position is then updated by measurements to line-of-sight satellites only. This closed-loop real-time processing has shown promising first full-scale test results. PMID:23344379
Advanced Fuels for LWRs: Fully-Ceramic Microencapsulated and Related Concepts FY 2012 Interim Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Sonat Sen; Brian Boer; John D. Bess
2012-03-01
This report summarizes the progress in the Deep Burn project at Idaho National Laboratory during the first half of fiscal year 2012 (FY2012). The current focus of this work is on Fully-Ceramic Microencapsulated (FCM) fuel containing low-enriched uranium (LEU) uranium nitride (UN) fuel kernels. UO2 fuel kernels have not been ruled out and will be examined later in FY2012. Reactor physics calculations confirmed that FCM fuel containing 500 µm diameter kernels of UN fuel has a positive MTC with a conventional fuel pellet radius of 4.1 mm. The methodology was put into place and validated against MCNP to perform whole-core calculations using DONJON, which can interpolate cross sections from a library generated using DRAGON. Comparisons to MCNP were performed on the whole core to confirm the accuracy of the DRAGON/DONJON schemes. A thermal-fluid coupling scheme was also developed and implemented with DONJON. This is currently able to iterate between diffusion calculations and thermal-fluid calculations in order to update fuel temperatures and cross sections in whole-core calculations. Now that the DRAGON/DONJON calculation capability is in place and has been validated against MCNP results, and a thermal-hydraulic capability has been implemented in the DONJON methodology, the work will proceed to more realistic reactor calculations. MTC calculations at the lattice level without the correct burnable poison are inadequate to guarantee zero or negative values in a realistic mode of operation. Using the DONJON calculation methodology described in this report, a startup core with enrichment zoning and burnable poisons will be designed. Larger fuel pins will be evaluated for their ability to (1) alleviate the problem of positive MTC and (2) increase reactivity-limited burnup. Once the critical boron concentration of the startup core is determined, the MTC will be calculated to verify a non-positive value. If the value is positive, the design will be changed to require less soluble boron by, for example, increasing the reactivity hold-down by burnable poisons. Then, the whole-core analysis will be repeated until an acceptable design is found. Calculations of the departure from nucleate boiling ratio (DNBR) will be included in the safety evaluation as well. Once a startup core is shown to be viable, subsequent reloads will be simulated by shuffling fuel and introducing fresh fuel. The PASTA code has been updated with material properties of UN fuel from the literature and a model for the diffusion and release of volatile fission products from the SiC matrix material. Preliminary simulations have been performed for both normal conditions and elevated temperatures. These results indicate that the fuel performs well and that the SiC matrix has good retention of the fission products. The path forward for fuel performance work includes improvement of the modeling of metallic fission product release from the kernel. Results should be considered preliminary and further validation is required.
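As an illustration of the diffusion/thermal-fluid iteration described above, the following Python sketch alternates two placeholder solvers until the fuel temperatures converge; neither function stands in for real DRAGON/DONJON calls, and all coefficients are invented.

```python
import numpy as np

def solve_neutronics(T_fuel):
    """Placeholder neutronics solve: power shape with weak Doppler feedback."""
    shape = 1.0 + 0.1 * np.sin(np.linspace(0.0, np.pi, T_fuel.size))
    return 1.0e3 * shape * (1.0 - 1.0e-4 * (T_fuel - 600.0))

def solve_thermal(power):
    """Placeholder thermal-fluid solve: fuel temperature from local power."""
    return 560.0 + 0.05 * power

T_fuel = np.full(20, 600.0)               # initial fuel temperatures [K]
for it in range(50):
    power = solve_neutronics(T_fuel)      # cross sections at current temperatures
    T_new = solve_thermal(power)          # temperatures at the updated power
    if np.max(np.abs(T_new - T_fuel)) < 0.01:
        break
    T_fuel = T_new
print(f"coupled iteration converged in {it + 1} passes")
```

In the real workflow, "solve_neutronics" would interpolate DRAGON-generated cross sections at the current temperatures before the DONJON diffusion solve, and the loop would wrap the full core model.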
NASA Astrophysics Data System (ADS)
Balla, Vamsi Krishna; Coox, Laurens; Deckers, Elke; Pluymers, Bert; Desmet, Wim; Marudachalam, Kannan
2018-01-01
The vibration response of a component or system can be predicted using the finite element method after ensuring that the numerical model represents the realistic behaviour of the actual system under study. One of the methods to build high-fidelity finite element models is a model updating procedure. In this work, a novel model updating method for deep-drawn components is demonstrated. Since the component is manufactured with a high draw ratio, significant deviations in both profile and thickness distributions occurred in the manufacturing process. A conventional model update, involving Young's modulus, density, and damping ratios, does not lead to a satisfactory match between simulated and experimental results. Hence a new model updating process is proposed in which geometric shape variables are incorporated by morphing the finite element model. This morphing process imitates the changes that occurred during the deep drawing process. An optimization procedure that uses the Global Response Surface Method (GRSM) algorithm to maximize the diagonal terms of the Modal Assurance Criterion (MAC) matrix is presented. This optimization results in a more accurate finite element model. The advantage of the proposed methodology is that the CAD surface of the updated finite element model can be readily obtained after optimization. This CAD model can be used for carrying out analysis, as it represents the manufactured part more accurately. Simulations performed using this updated model, with its accurate geometry, will therefore yield more reliable results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heath, Jason E.; Bauer, Stephen J.; Broome, Scott Thomas
The Iowa Stored Energy Plant Agency selected a geologic structure at Dallas Center, Iowa, for evaluation of subsurface compressed air energy storage. The site was rejected due to lower-than-expected and heterogeneous permeability of the target reservoir, lower-than-desired porosity, and small reservoir volume. In an initial feasibility study, permeability and porosity distributions of flow units for the nearby Redfield gas storage field were applied as analogue values for numerical modeling of the Dallas Center Structure. These reservoir data, coupled with an optimistic reservoir volume, produced favorable results. However, it was determined that the Dallas Center Structure cannot be simplified to four zones of high, uniform permeabilities. Updated modeling using field and core data for the site produced unfavorable results for air fill-up. This report presents Sandia National Laboratories' petrologic and petrophysical analysis of the Dallas Center Structure, which aids in understanding why the site was not suitable for gas storage.
Peters, Sanne A E; Dunford, Elizabeth; Jones, Alexandra; Ni Mhurchu, Cliona; Crino, Michelle; Taylor, Fraser; Woodward, Mark; Neal, Bruce
2017-07-05
The Health Star Rating (HSR) is an interpretive front-of-pack labelling system that rates the overall nutritional profile of packaged foods. The algorithm underpinning the HSR includes total sugar content as one of the components. This has been criticised because intrinsic sugars naturally present in dairy, fruits, and vegetables are treated the same as sugars added during food processing. We assessed whether the HSR could better discriminate between core and discretionary foods by including added sugar in the underlying algorithm. Nutrition information was extracted for 34,135 packaged foods available in The George Institute's Australian FoodSwitch database. Added sugar levels were imputed from food composition databases. Products were classified as 'core' or 'discretionary' based on the Australian Dietary Guidelines. The ability of each of the nutrients included in the HSR algorithm, as well as added sugar, to discriminate between core and discretionary foods was estimated using the area under the curve (AUC). 15,965 core and 18,350 discretionary foods were included. Of these, 8230 (52%) core foods and 15,947 (87%) discretionary foods contained added sugar. Median (Q1, Q3) HSRs were 4.0 (3.0, 4.5) for core foods and 2.0 (1.0, 3.0) for discretionary foods. Median added sugar contents (g/100 g) were 3.3 (1.5, 5.5) for core foods and 14.6 (1.8, 37.2) for discretionary foods. Of all the nutrients used in the current HSR algorithm, total sugar had the greatest individual capacity to discriminate between core and discretionary foods; AUC 0.692 (0.686; 0.697). Added sugar alone achieved an AUC of 0.777 (0.772; 0.782). A model with all nutrients in the current HSR algorithm had an AUC of 0.817 (0.812; 0.821), which increased to 0.871 (0.867; 0.874) with inclusion of added sugar. The HSR nutrients discriminate well between core and discretionary packaged foods. However, discrimination was improved when added sugar was also included. These data argue for inclusion of added sugar in an updated HSR algorithm and declaration of added sugar as part of mandatory nutrient declarations.
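The discrimination analysis itself is straightforward to reproduce in outline; below is a sketch with randomly generated placeholder data standing in for the FoodSwitch extract, comparing the AUC of total versus added sugar as lone discriminators.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_core, n_disc = 500, 600
labels = np.r_[np.zeros(n_core), np.ones(n_disc)]   # 1 = discretionary food
# placeholder sugar contents [g/100 g]; discretionary foods skew higher
total_sugar = np.r_[rng.gamma(2.0, 3.0, n_core), rng.gamma(2.5, 6.0, n_disc)]
added_sugar = np.r_[rng.gamma(1.2, 3.0, n_core), rng.gamma(2.5, 7.0, n_disc)]

# AUC of each nutrient on its own, as in the paper's analysis
print(f"total sugar AUC: {roc_auc_score(labels, total_sugar):.3f}")
print(f"added sugar AUC: {roc_auc_score(labels, added_sugar):.3f}")
```

The paper's multi-nutrient models correspond to replacing the single score with the output of a classifier fitted on all HSR nutrients, with and without added sugar.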
NASA Astrophysics Data System (ADS)
Machado, M. R.; Adhikari, S.; Dos Santos, J. M. C.; Arruda, J. R. F.
2018-03-01
Structural parameter estimation is affected not only by measurement noise but also by unknown uncertainties which are present in the system. Deterministic structural model updating methods minimise the difference between experimentally measured data and computational prediction. Sensitivity-based methods are very efficient in solving structural model updating problems. Material and geometrical parameters of the structure such as Poisson's ratio, Young's modulus, mass density, modal damping, etc. are usually considered deterministic and homogeneous. In this paper, the distributed and non-homogeneous characteristics of these parameters are considered in the model updating. The parameters are taken as spatially correlated random fields and are expanded in a spectral Karhunen-Loève (KL) decomposition. Using the KL expansion, the spectral dynamic stiffness matrix of the beam is expanded as a series in terms of discretized parameters, which can be estimated using sensitivity-based model updating techniques. Numerical and experimental tests involving a beam with distributed bending rigidity and mass density are used to verify the proposed method. This extension of standard model updating procedures can enhance the dynamic description of structural dynamic models.
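A minimal sketch of the KL step described above, discretizing an exponential covariance kernel along a beam; the correlation length and variance are assumed values, not the paper's.

```python
import numpy as np

n, L, corr_len, sigma = 100, 1.0, 0.3, 0.05   # grid, beam length, correlation, std
x = np.linspace(0.0, L, n)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)  # exp. kernel

w, v = np.linalg.eigh(C)          # eigenpairs of the discretized kernel, ascending
w, v = w[::-1], v[:, ::-1]        # largest modes first
m = 8                             # truncation order of the KL series

xi = np.random.default_rng(0).standard_normal(m)   # independent KL coefficients
field = 1.0 + v[:, :m] @ (np.sqrt(w[:m]) * xi)     # one realization of EI(x)/EI0
```

In the paper's scheme, the discretized KL coefficients (rather than a single homogeneous modulus) become the unknowns of the sensitivity-based updating, so the identified rigidity and density can vary along the beam.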
Summary of Expansions, Updates, and Results in GREET 2017 Suite of Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Michael; Elgowainy, Amgad; Han, Jeongwoo
This report provides a technical summary of the expansions and updates to the 2017 release of Argonne National Laboratory's Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET®) model, including references and links to key technical documents related to these expansions and updates. The GREET 2017 release includes updated versions of GREET1 (the fuel-cycle GREET model) and GREET2 (the vehicle-cycle GREET model), both in the Microsoft Excel platform and in the GREET.net modeling platform. Figure 1 shows the structure of the GREET Excel modeling platform. The .net platform integrates all GREET modules together seamlessly.
Updates to the Demographic and Spatial Allocation Models to ...
EPA announced the availability of the final report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Version 2). This update furthered land change modeling by providing nationwide housing development scenarios up to 2100. This newest version includes updated population and land use data sets and addresses limitations identified in ICLUS v1 in both the migration and spatial allocation models. The companion user guide (Final Report) describes the development of ICLUS v2 and the updates that were made to the original data sets and the demographic and spatial allocation models. The GIS tool enables users to run SERGoM with the population projections developed for the ICLUS project and allows users to modify the spatial allocation of housing density across the landscape.
Model Update of a Micro Air Vehicle (MAV) Flexible Wing Frame with Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Reaves, Mercedes C.; Horta, Lucas G.; Waszak, Martin R.; Morgan, Benjamin G.
2004-01-01
This paper describes a procedure to update parameters in the finite element model of a Micro Air Vehicle (MAV) to improve displacement predictions under aerodynamic loads. Because of fabrication, material, and geometric uncertainties, a statistical approach combined with Multidisciplinary Design Optimization (MDO) is used to modify key model parameters. Static test data collected using photogrammetry are used to correlate with model predictions. Results show significant improvements in model predictions after the parameters are updated; however, the computed probability values indicate low confidence in the updated values and/or model structure errors. Lessons learned in the areas of wing design, test procedures, modeling approaches with geometric nonlinearities, and uncertainty quantification are all documented.
NASA Astrophysics Data System (ADS)
Desconnets, Jean-Christophe; Giuliani, Gregory; Guigoz, Yaniss; Lacroix, Pierre; Mlisa, Andiswa; Noort, Mark; Ray, Nicolas; Searby, Nancy D.
2017-02-01
The discovery of and access to capacity building resources are often essential to conduct environmental projects based on Earth Observation (EO) resources, whether these are EO products, methodological tools, techniques, organizations that impart training in these techniques, or even projects that have shown practical achievements. Recognizing this opportunity and need, the European Commission, through two FP7 projects, jointly with the Group on Earth Observations (GEO), teamed up with the Committee on Earth Observation Satellites (CEOS). The Global Earth Observation CApacity Building (GEOCAB) portal aims at compiling all current capacity building efforts on the use of EO data for societal benefits into an easily updateable and user-friendly portal. GEOCAB offers a faceted search to improve the user discovery experience, with a fully interactive world map of all inventoried projects and activities. This paper focuses on the conceptual framework used to implement the underlying platform. An ISO 19115 metadata model and an associated terminological repository are the core elements that provide a semantic search application and an interoperable discovery service. The organization of, and contributions from, different user communities to ensure the management and updating of GEOCAB's content are also addressed.
Revolution…Now The Future Arrives for Five Clean Energy Technologies – 2015 Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
In 2013, the U.S. Department of Energy (DOE) released the Revolution Now report, highlighting four transformational technologies: land-based wind power, silicon photovoltaic (PV) solar modules, light-emitting diodes (LEDs), and electric vehicles (EVs). That study and its 2014 update showed how dramatic reductions in cost are driving a surge in consumer, industrial, and commercial adoption of these clean energy technologies, as well as yearly progress. In addition to presenting the continued progress made over the last year in these areas, this year's update goes further. Two separate sections now cover large, central, utility-scale PV plants and smaller, rooftop, distributed PV systems to highlight how both have achieved significant deployment nationwide, and have done so through different innovations, such as easier access to capital for utility-scale PV and reductions of non-hardware costs and third-party ownership for distributed PV. Along with these core technologies…
An Approach to Stable Gradient-Descent Adaptation of Higher Order Neural Units.
Bukovsky, Ivo; Homma, Noriyasu
2017-09-01
Stability evaluation of the weight-update system of higher-order neural units (HONUs) with polynomial aggregation of neural inputs (also known as classes of polynomial neural networks), for adaptation of both feedforward and recurrent HONUs by a gradient descent method, is introduced. The core of the approach is the spectral radius of the weight-update system, which allows stability to be monitored and maintained at every individual adaptation step. Assuring the stability of the weight-update system at every single adaptation step naturally results in the adaptation stability of the whole neural architecture that adapts to the target data. As an aside, the approach highlights the fact that the weight optimization of a HONU is a linear problem, so the proposed approach can be generally extended to any neural architecture that is linear in its adaptable parameters.
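Because a HONU is linear in its weights, one gradient-descent step has a closed-form update matrix whose spectral radius can be monitored directly; the sketch below shows this for a second-order HONU with placeholder data (the learning-rate adjustment rule here is our own simple choice, not necessarily the paper's).

```python
import numpy as np

def stable_step(w, x, y, mu):
    """One gradient step; shrink mu if the update matrix is expansive."""
    # w_next = (I - mu * x x^T) w + mu * y * x, so the weight-update system
    # matrix is M = I - mu * outer(x, x); non-expansive when rho(M) <= 1.
    M = np.eye(x.size) - mu * np.outer(x, x)
    rho = np.max(np.abs(np.linalg.eigvals(M)))   # spectral radius
    if rho > 1.0:                                # maintain stability:
        mu /= rho**2                             # conservative shrink (assumed rule)
    e = y - w @ x                                # output error
    return w + mu * e * x, rho

rng = np.random.default_rng(0)
w = np.zeros(6)
for _ in range(200):
    u = rng.standard_normal(2)
    x = np.array([1.0, u[0], u[1], u[0]**2, u[0]*u[1], u[1]**2])  # 2nd-order terms
    y = 0.5 + u[0] - 0.3 * u[0] * u[1]                            # target mapping
    w, rho = stable_step(w, x, y, mu=0.05)
```

The vector x collects the polynomial input terms (the "polynomial aggregation of neural inputs"), which is exactly why the optimization is linear in w.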
Kim, Seung-Nam; Park, Taewon; Lee, Sang-Hyun
2014-01-01
Damage to a 5-story framed structure was identified from two types of measured data, frequency response functions (FRFs) and natural frequencies, using a finite element (FE) model updating procedure. In this study, a procedure to determine the appropriate weightings for different groups of observations was proposed. In addition, a modified frame element that includes rotational springs was used to construct the FE model for updating, to represent concentrated damage at the member ends (a formulation for plastic hinges in framed structures subjected to strong earthquakes). The results of the model updating and subsequent damage detection when the rotational springs (RS model) were used were compared with those obtained using the conventional frame elements (FS model). Comparisons indicated that the RS model gave more accurate results than the FS model: the errors in the natural frequencies of the updated models were smaller, and the identified damage showed clearer distinctions between damaged and undamaged members and was more consistent with observed damage. PMID:24574888
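Here is a sketch of combining the two observation groups with weights; the simple inverse-baseline-norm weighting used below is an assumption for illustration, not necessarily the weighting procedure the paper proposes, and the models and data are toys.

```python
import numpy as np
from scipy.optimize import least_squares

freq_meas = np.array([2.1, 6.3, 10.2])              # placeholder measured data
frf_meas = np.array([0.8, 0.5, 0.3, 0.2])

model_freq = lambda k: np.sqrt(k[0]) * np.array([1.0, 3.0, 5.0])  # toy models
model_frf = lambda k: k[1] / (1.0 + np.arange(4.0))

k0 = np.array([5.0, 1.0])                           # starting parameters
w_frq = 1.0 / np.linalg.norm(model_freq(k0) - freq_meas)  # balance the groups
w_frf = 1.0 / np.linalg.norm(model_frf(k0) - frf_meas)    # at the initial model

resid = lambda k: np.concatenate([w_frq * (model_freq(k) - freq_meas),
                                  w_frf * (model_frf(k) - frf_meas)])
sol = least_squares(resid, k0)
print(sol.x)
```

Without some such scaling, whichever group happens to have larger numerical residuals dominates the update, which is the problem the paper's weighting procedure addresses.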
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fung, Inez
The project aims to investigate the feasibility of advancing our understanding of the carbon cycle, using a carbon-weather data assimilation system that updates the modeled carbon dioxide concentration and atmospheric circulation every six hours using CO2 data (from the OCO-2 satellite) and weather data. At the core of the system is the DOE-NCAR-CAM5fv global circulation model coupled to the National Center for Atmospheric Research's Data Assimilation Testbed, running an ensemble of 30 models. This combination provides realistic vertical carbon dioxide gradients and conservation of dry air mass. A global four-dimensional distribution of atmospheric CO2 concentration is produced. Our results show (1) that OCO-2 total precipitable water data are reliable and provide valuable uncertainty information for the OCO-2 data assimilation; and (2) that our approach is a promising method for monitoring national carbon dioxide emissions.
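The NCAR testbed implements ensemble filtering; as a simpler stand-in for its analysis step, here is a generic stochastic ensemble Kalman filter update for an ensemble of 30 states, with placeholder dimensions, observation operator, and data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, n_ens, n_obs = 200, 30, 10
X = rng.standard_normal((n_state, n_ens))          # forecast ensemble (placeholder)
obs_idx = np.arange(0, n_state, n_state // n_obs)  # observe every 20th element
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), obs_idx] = 1.0                 # linear observation operator
R = 0.25 * np.eye(n_obs)                           # obs error covariance (assumed)
y = rng.standard_normal(n_obs)                     # observations (placeholder)

A = X - X.mean(axis=1, keepdims=True)              # ensemble anomalies
P_HT = A @ (H @ A).T / (n_ens - 1)                 # Pf H^T from the ensemble
K = P_HT @ np.linalg.inv(H @ P_HT + R)             # Kalman gain
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T  # perturbed obs
X = X + K @ (Y - H @ X)                            # analysis ensemble
```

In the actual system, the state would hold the coupled CO2 and meteorological fields and this update would run every six hours as new OCO-2 and weather observations arrive.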
Connecticut's forest resources, 2010
Brett J. Butler; Cassandra Kurtz; Christopher Martin; W. Keith Moser
2011-01-01
This publication provides an overview of forest resource attributes for Connecticut based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
Connecticut's forest resources, 2009
Brett J. Butler; Christopher Martin
2011-01-01
This publication provides an overview of forest resource attributes for Connecticut based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
Maine's forest resources, 2012
G.L. McCaskill; K.M. Laustsen; W.H. McWilliams
2013-01-01
This publication provides an overview of forest resource attributes for Maine based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Minnesota's forest resources, 2006
P.D. Miles; D. Heinzen
2007-01-01
This publication provides an overview of forest resource attributes for Minnesota based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for...
Present State of CAD Teaching in Spanish Universities
ERIC Educational Resources Information Center
Garcia, Ramon Rubio; Santos, Ramon Gallego; Quiros, Javier Suarez; Penin, Pedro I. Alvarez
2005-01-01
During the 1990s, all Spanish Universities updated the syllabuses of their courses as a result of the entry into force of the new Organic Law of Universities ("Ley Organica de Universidades") and, for the first time, "Computer Assisted Design" (CAD) appears in the list of core subjects (compulsory teaching content set by the…
Nebraska's forest resources, 2009
D.M. Meneguzzo
2011-01-01
This publication provides an overview of forest resource attributes for Nebraska based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report...
Minnesota's forest resources, 2010
P.D. Miles; T. Aunan
2011-01-01
This publication provides an overview of forest resource attributes for Minnesota based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
Minnesota's forest resources, 2012
P.D. Miles; C.L. VanderSchaaf
2012-01-01
This publication provides an overview of forest resource attributes for Minnesota based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
Michigan's forest resources, 2011
S.A. Pugh
2012-01-01
This publication provides an overview of forest resource attributes for Michigan based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program of the Northern Research Station, U.S. Forest Service. These estimates, along with web-posted core tables, are updated annually. For more information please refer to page 4 of this report or visit our...
Pennsylvania's forest resources, 2007
G.L. McCaskill; W.H. McWilliams; B.J. Butler; D.M. Meneguzzo; C.J. Barnett; M.H. Hansen
2011-01-01
This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 6 of...
Pennsylvania's forest resources, 2011
G.L. McCaskill; W.H. McWilliams; C.J. Barnett
2012-01-01
This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of...
Pennsylvania's Forest Resources, 2006
William H. McWilliams
2008-01-01
This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the U.S. Forest Service (NRS-FIA). These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory...
Delaware's forest resources, 2007
T.W. Lister; G. Gladders; W. McWilliams; D. Meneguzzo; C. Barnett; B. O'Connell
2010-01-01
This publication provides an overview of forest resource attributes for Delaware based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to the last page of this...
Maryland's forest resources, 2008
T.W. Lister; J. Perdue; B. Butler; C. Barnett; B. O'Connell
2010-01-01
This publication provides an overview of forest resource attributes for Maryland based on an annual inventory (2004-2008) conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to the last...
Massachusetts' forest resources, 2011
Brett J. Butler; Randall S. Morin; Mark D. Nelson
2012-01-01
This publication provides an overview of forest resource attributes for Massachusetts based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
Vermont's forest resources, 2010
R.S. Morin; M. Nelson; R. De Geus
2011-01-01
This publication provides an overview of forest resource attributes for Vermont based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report....
M.D. Nelson; M. Brewer
2009-01-01
This publication provides an overview of forest resource attributes for Iowa based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, are updated annually. For more information please refer to page 4 of this report.
M.D. Nelson; M. Brewer; S.A. Pugh
2013-01-01
This publication provides an overview of forest resource attributes for Iowa based on an annual inventory (2008-2012) conducted by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station. These estimates, along with Web-posted core tables, are updated annually. For more information please refer to page 4 of this report....
M.D. Nelson; M. Brewer; S.J. Crocker
2010-01-01
This publication provides an overview of forest resource attributes for Iowa based on an annual inventory (2005-2009) conducted by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, are updated annually. For more information, please refer to page 4 of this report....
M.D. Nelson; M. Brewer
2011-01-01
This publication provides an overview of forest resource attributes for Iowa based on an annual inventory (2006-2010) conducted by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, are updated annually. For more information please refer to page 4 of this report....
M.D. Nelson; M. Brewer; G. Domke
2012-01-01
This publication provides an overview of forest resource attributes for Iowa based on an annual inventory (2007-2011) conducted by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, are updated annually. For more information please refer to page 4 of this report....
Connecticut's forest resources, 2011
Brett J. Butler; Randall S. Morin; Mark D. Nelson
2012-01-01
This publication provides an overview of forest resource attributes for Connecticut based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
Maine's forest resources, 2007
G.L. McCaskill; W.H. McWilliams; B.J. Butler; D.M. Meneguzzo; C.J. Barnett; M.H. Hansen
2010-01-01
This publication provides an overview of forest resource attributes for Maine based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
Michigan's forest resources, 2012
S.A. Pugh
2013-01-01
This publication provides an overview of forest resource attributes for Michigan based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program of the Northern Research Station, U.S. Forest Service. These estimates, along with Web-posted core tables, are updated annually. For more information please refer to page 4 of this report or visit our...
Maryland's forest resources, 2011
Tonya Lister; J. Perdue
2012-01-01
This publication provides an overview of forest resource attributes for Maryland based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
ERIC Educational Resources Information Center
McNamara, Julie
2017-01-01
Long before the release of the Common Core State Standards (CCSSI 2010), the Mathematical Tug-of-War was engaging students in the type of reasoning and problem solving described by the Standards for Mathematical Practice (SMP). In this updated version of a Marilyn Burns task, students use algebraic reasoning to determine the outcome of a contest…
Michigan's forest resources, 2010
S.A. Pugh
2011-01-01
This publication provides an overview of forest resource attributes for Michigan based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program of the Northern Research Station, U.S. Forest Service. These estimates, along with web-posted core tables, are updated annually. For more information please refer to page 4 of this report or visit our...
Michigan's Forest Resources, 2007
S.A. Pugh
2008-01-01
This publication provides an overview of forest resource attributes for Michigan based on an annual inventory (2003-2007) conducted by the Forest Inventory and Analysis (FIA) program of the Northern Research Station, U.S. Forest Service. These estimates, along with web-posted core tables, are updated annually. For more information please refer to page 4 of this report...
Wisconsin's forest resources, 2011
C.H. Perry
2012-01-01
This publication provides an overview of forest resource attributes for Wisconsin based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
Indiana's forest resources, 2011
C.W. Woodall; J. Gallion
2012-01-01
This publication provides an overview of forest resource attributes for Indiana based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Missouri's forest resources, 2011
W.K. Moser; R.J. Piva; T.B. Treiman
2012-01-01
This publication provides an overview of forest resource attributes for Missouri based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report...
Minnesota's forest resources, 2011
P.D. Miles; C.L. VanderSchaaf
2012-01-01
This publication provides an overview of forest resource attributes for Minnesota based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
Maine's forest resources, 2011
G.L. McCaskill; W.H. McWilliams
2012-01-01
This publication provides an overview of forest resource attributes for Maine based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
New Jersey's forest resources, 2007
Susan J. Crocker; William H. McWilliams
2010-01-01
This publication provides an overview of forest resource attributes for New Jersey based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) Program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, will be updated annually. For more information, refer to page 4 of this report.
Maryland's forest resources, 2007
T.W. Lister; J. Perdue; W. McWilliams; D. Meneguzzo; C. Barnett; B. O’Connell
2010-01-01
This publication provides an overview of forest resource attributes for Maryland based on an annual inventory (2004-2007) conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to the last...
Illinois' Forest Resources, 2007
S.J. Crocker
2009-01-01
This publication provides an overview of forest resource attributes for Illinois based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) Program of the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report...
New Jersey's Forest Resources, 2006
R.H. Widmann
2008-01-01
This publication provides an overview of forest resource attributes for New Jersey based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for...
Exercise and Fluid Balance Update
ERIC Educational Resources Information Center
Schlicht, Jeff
2005-01-01
One common piece of advice that exercise professionals give their clients is to drink water before, during, and after exercise. During exercise people can lose as much as three liters of water per hour (about 100 ounces) through sweat. Dehydration alters normal sweat patterns, which can lead to an increased core body temperature. Since most of the…
Nebraska's forest resources, 2010
D.M. Meneguzzo
2011-01-01
This publication provides an overview of forest resource attributes for Nebraska based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
DOT National Transportation Integrated Search
2011-01-01
This report provides a summary of a peer exchange sponsored by the Vermont Agency of Transportation (VTrans). The peer exchange convened Vermont's Strategic Highway Safety Plan (SHSP) Core Group to discuss the strengths and weaknesses of Vermont...
S.J. Crocker
2007-01-01
This publication provides an overview of forest resource attributes for Iowa based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for Iowa,...
North Dakota's forest resources, 2006
D.E. Haugen; M. Kangas
2007-01-01
This publication provides an overview of forest resource attributes for this state based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the USDA Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for...
Delaware's Forest Resources, 2006
T.W. Lister; G. Gladders
2008-01-01
This publication provides an overview of forest resource attributes for this state based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory...
Massachusetts' forest resources, 2012
Brett J. Butler
2013-01-01
This publication provides an overview of forest resource attributes for Massachusetts based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 3 of this...
Maryland's Forest Resources, 2006
T.W. Lister; J. Perdue
2008-01-01
This publication provides an overview of forest resource attributes for this state based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory...
Connecticut's forest resources, 2012
Brett J. Butler
2013-01-01
This publication provides an overview of forest resource attributes for Connecticut based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 3 of this report...
Illinois' forest resources, 2006
S.J. Crocker; D.C. Little
2007-01-01
This publication provides an overview of forest resource attributes for Illinois based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for Illinois...
Pennsylvania's forest resources, 2012
G.L. McCaskill; W.H. McWilliams; C.J. Barnett
2013-01-01
This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of...
Delaware's forest resources, 2010
T.W. Lister; G. Gladders
2010-01-01
This publication provides an overview of forest resource attributes for Delaware based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Pennsylvania's forest resources, 2010
G.L. McCaskill; W.H. McWilliams; C.J. Barnett
2011-01-01
This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of...
R.H. Widmann; B.J. Butler; D. Balser
2010-01-01
This publication provides an overview of forest resource attributes for Ohio based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report.
Massachusetts' forest resources, 2009
Brett J. Butler; Gordon Boyce
2011-01-01
This publication provides an overview of forest resource attributes for Massachusetts based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
Wisconsin's Forest Resources, 2007
C.H. Perry; V.A. Everson
2008-01-01
This publication provides an overview of forest resource attributes for Wisconsin based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, are updated annually. For more information please refer to page 4 of this report.
Wisconsin's forest resources, 2010
C.H. Perry
2011-01-01
This publication provides an overview of forest resource attributes for Wisconsin based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
Nebraska's forest resources, 2011
D.M. Meneguzzo; B. Walters
2012-01-01
This publication provides an overview of forest resource attributes for Nebraska based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Vermont's forest resources, 2007
R.S. Morin; G.L. McCaskill; W. McWilliams; R. De Geus
2010-01-01
This publication provides an overview of forest resource attributes for Vermont based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 5 of this report....
Delaware's forest resources, 2008
T.W. Lister; G. Gladders; B. Butler; C. Barnett; B. O' Connell
2010-01-01
This publication provides an overview of forest resource attributes for Delaware based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to the last page of this...
Vermont's forest resources, 2012
R.S. Morin
2013-01-01
This publication provides an overview of forest resource attributes for Vermont based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report....
R.H. Widmann
2011-01-01
This publication provides an overview of forest resource attributes for Ohio based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report.
Maryland's forest resources, 2012
T.W. Lister; J. Perdue
2013-01-01
This publication provides an overview of forest resource attributes for Maryland based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Delaware's forest resources, 2012
T.W. Lister
2013-01-01
This publication provides an overview of forest resource attributes for Delaware based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Pennsylvania's forest resources, 2008
G.L. McCaskill; W.H. McWilliams; B.J. Butler; D.M. Meneguzzo; C.J. Barnett; M.H. Hansen
2011-01-01
This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of...
New Jersey's forest resources, 2008
Susan J. Crocker
2010-01-01
This publication provides an overview of forest resource attributes for New Jersey based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) Program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, will be updated annually. For more information, refer to page 4 of this report.
Maryland's forest resources, 2010
T.W. Lister; J. Perdue
2011-01-01
This publication provides an overview of forest resource attributes for Maryland based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Minnesota's forest resources, 2009
P.D. Miles; D. Heinzen
2010-01-01
This publication provides an overview of forest resource attributes for Minnesota based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
Illinois' forest resources, 2009
S.J. Crocker; C.W. Woodall
2011-01-01
This publication provides an overview of forest resource attributes for Illinois based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) Program of the Northern Research Station (NRS) of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
Maine's forest resources, 2008
G.L. McCaskill; W.H. McWilliams; B.J. Butler; D.M. Meneguzzo; C.J. Barnett; M.H. Hansen
2010-01-01
This publication provides an overview of forest resource attributes for this state based upon an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of...
Illinois' forest resources, 2011
S.J. Crocker
2012-01-01
This publication provides an overview of forest resource attributes for Illinois based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) Program of the Northern Research Station (NRS) of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
Indiana's forest resources, 2010
C.W. Woodall; M.N. Webb
2011-01-01
This publication provides an overview of forest resource attributes for Indiana based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Maryland's forest resources, 2009
T.W. Lister; J. Perdue; A. Lister
2011-01-01
This publication provides an overview of forest resource attributes for Maryland based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Vermont's forest resources, 2011
R.S. Morin; C.W. Woodall
2012-01-01
This publication provides an overview of forest resource attributes for Vermont based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report....
Massachusetts' forest resources, 2010
Brett J. Butler; William N. Hill; Cassandra Kurtz; W. Keith Moser
2011-01-01
This publication provides an overview of forest resource attributes for Massachusetts based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
Wisconsin's forest resources, 2009
C.H. Perry
2011-01-01
This publication provides an overview of forest resource attributes for Wisconsin based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report...
Indiana's Forest Resources, 2007
C.W. Woodall; J. Gallion
2008-01-01
This publication provides an overview of forest resource attributes for Indiana based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Wisconsin's forest resources, 2012
C.H. Perry
2013-01-01
This publication provides an overview of forest resource attributes for Wisconsin based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
Minnesota's Forest Resources, 2007
P.D. Miles; D. Heinzen
2008-01-01
This publication provides an overview of forest resource attributes for Minnesota based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Maine's forest resources, 2009
G.L. McCaskill; W.H. McWilliams
2011-01-01
This publication provides an overview of forest resource attributes for Maine based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report....
Indiana's forest resources, 2009
C.W. Woodall; M.N. Webb; S.J. Crocker
2010-01-01
This publication provides an overview of forest resource attributes for Indiana based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Susan J. Crocker
2014-01-01
This publication provides an overview of forest resource attributes in New Jersey based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) Program of the U.S. Forest Service, Northern Research Station (NRS). These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to inventory citations on...
R.H. Widmann; R.S. Morin
2013-01-01
This publication provides an overview of forest resource attributes for Ohio based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report.
New Jersey's forest resources, 2010
S. J. Crocker
2011-01-01
This publication provides an overview of forest resource attributes for New Jersey based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) Program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, will be updated annually. For more information, refer to page 4 of this report.
Nebraska's Forest Resources, 2007
D.M. Meneguzzo
2009-01-01
This publication provides an overview of forest resource attributes for Nebraska based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
Missouri's forest resources, 2009
W.K. Moser; C.H. Barnett; M.H. Hansen; T.B. Treiman
2010-01-01
This publication provides an overview of forest resource attributes for Missouri based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report...
Indiana's Forest Resources, 2008
C.W. Woodall; M.N. Webb; J. Gallion
2009-01-01
This publication provides an overview of forest resource attributes for Indiana based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Illinois' forest resources, 2008
S.J. Crocker
2010-01-01
This publication provides an overview of forest resource attributes for Illinois based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) Program of the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report...
Maine's forest resources, 2006
G.L. McCaskill; W.H. McWilliams; B.J. Butler; C.J. Barnett; M.H. Hansen
2010-01-01
This publication provides an overview of forest resource attributes for Maine based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
Indiana's forest resources, 2012
C.W. Woodall; J. Gallion
2013-01-01
This publication provides an overview of forest resource attributes for Indiana based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
West Virginia's Forest Resources, 2006
Richard H. Widmann; Gregory W. Cook
2008-01-01
This publication provides an overview of forest resource attributes for this state based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory...
New Hampshire's Forest Resources, 2006
R.S. Morin; M. Tansey
2008-01-01
This publication provides an overview of forest resource attributes for New Hampshire based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory...
R.H. Widmann
2008-01-01
This publication provides an overview of forest resource attributes for Ohio based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for...
Nebraska's forest resources, 2006
D.M. Meneguzzo
2007-01-01
This publication provides an overview of forest resource attributes for Nebraska based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for...
Vermont's forest resources, 2008
R.S. Morin; B.J. Butler; R. De Geus
2010-01-01
This publication provides an overview of forest resource attributes for Vermont based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Content-Based Curriculum for High-Ability Learners, Second Edition
ERIC Educational Resources Information Center
VanTassel-Baska, Joyce, Ed.; Little, Catherine A., Ed.
2011-01-01
The newly updated "Content-Based Curriculum for High-Ability Learners" provides a solid introduction to curriculum development in gifted and talented education. Written by experts in the field of gifted education, this text uses cutting-edge design techniques and aligns the core content with national and state standards. In addition to a revision…
Pennsylvania's forest resources, 2009
G.L. McCaskill; W.H. McWilliams; B.J. Butler; D.M. Meneguzzo; C.J. Barnett; M.H. Hansen
2011-01-01
This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of...
R.H. Widmann; D. Balser
2011-01-01
This publication provides an overview of forest resource attributes for Ohio based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report....
Susan J. Crocker
2014-01-01
This publication provides an overview of forest resource attributes for Illinois based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) Program of the Northern Research Station (NRS) of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to inventory...
Kansas' forest resources, 2006
W.K. Moser; M.H. Hansen; R.L. Atchison
2007-01-01
This publication provides an overview of forest resource attributes for this state based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for...
South Dakota's forest resources, 2008
Ronald J. Piva
2010-01-01
This publication provides an overview of forest resource attributes for South Dakota based on an annual inventory conducted by the Forest Inventory and Analysis program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for South...
South Dakota's Forest Resources, 2007
Ronald J. Piva; Andrew J. Lister; Douglas Haugan
2009-01-01
This publication provides an overview of forest resource attributes for South Dakota based on an annual inventory conducted by the Forest Inventory and Analysis program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for South...
South Dakota's forest resources, 2010
Brian F. Walters; Ronald J. Piva
2011-01-01
This publication provides an overview of forest resource attributes for South Dakota based on an annual inventory conducted by the Forest Inventory and Analysis program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for South...
Kansas' forest resources, 2010
W.K. Moser; C.H. Barnett; C.M. Kurtz; R.A. Atchison
2011-01-01
This publication provides an overview of forest resource attributes for Kansas based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
South Dakota's forest resources, 2012
Brian F. Walters
2013-01-01
This publication provides an overview of forest resource attributes for South Dakota based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station. These estimates, along with Web-posted core tables, will be updated annually. For more information regarding past inventory reports for South...
South Dakota's forest resources, 2011
Brian F. Walters
2012-01-01
This publication provides an overview of forest resource attributes for South Dakota based on an annual inventory conducted by the Forest Inventory and Analysis program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for South...
South Dakota's forest resources, 2009
Ronald J. Piva
2010-01-01
This publication provides an overview of forest resource attributes for South Dakota based on an annual inventory conducted by the Forest Inventory and Analysis program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for South...
South Dakota's Forest Resources, 2006
Ronald J. Piva; Douglas Haugan; Gregory J. Josten
2007-01-01
This publication provides an overview of forest resource attributes for South Dakota based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports...
Kansas' forest resources, 2009
W.K. Moser; M.H. Hansen; C.H. Barnett; R.A. Atchison
2010-01-01
This publication provides an overview of forest resource attributes for Kansas based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Vermont's Forest Resources, 2006
R.S. Morin; R. De Geus
2008-01-01
This publication provides an overview of forest resource attributes for Vermont based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports...
Delaware's forest resources, 2009
T.W. Lister; G. Gladders
2011-01-01
This publication provides an overview of forest resource attributes for Delaware based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Illinois' forest resources, 2010
S.J. Crocker
2011-01-01
This publication provides an overview of forest resource attributes for Illinois based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) Program of the Northern Research Station (NRS) of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
Missouri's forest resources, 2006
W.K. Moser; M.H. Hansen; T.B. Treiman
2007-01-01
This publication provides an overview of forest resource attributes for Missouri based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information regarding past inventory reports for...
A Comprehensive Guide to Intellectual and Developmental Disabilities. Second Edition
ERIC Educational Resources Information Center
Wehmeyer, Michael L., Ed.; Brown, Ivan, Ed.; Percy, Maire, Ed.; Fung, W. L. Alan, Ed.; Shogren, Karrie A., Ed.
2017-01-01
The trusted core disability textbook gets a comprehensive update in this second edition, now thoroughly revised to include all the critical topics today's professionals need to know about as they work with people who have intellectual and developmental disabilities. Brought to you by a new team of world-renowned experts and contributors, this…
Kansas' forest resources, 2012
W.K. Moser; P.D. Miles; R.A. Atchison
2013-01-01
This publication provides an overview of forest resource attributes for Kansas based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Kansas' forest resources, 2011
W.K. Moser; D.E. Haugen; R.A. Atchison
2012-01-01
This publication provides an overview of forest resource attributes for Kansas based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Missouri's Forest Resources, 2007
W.K. Moser; M.H. Hansen; S.J. Crocker; T.B. Treiman
2008-01-01
This publication provides an overview of forest resource attributes for Missouri based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
Kansas' Forest Resources, 2007
W.K. Moser; M.H. Hansen; R.L. Atchison
2008-01-01
This publication provides an overview of forest resource attributes for Kansas based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
MBGD update 2013: the microbial genome database for exploring the diversity of microbial world.
Uchiyama, Ikuo; Mihara, Motohiro; Nishide, Hiroyo; Chiba, Hirokazu
2013-01-01
The microbial genome database for comparative analysis (MBGD, available at http://mbgd.genome.ad.jp/) is a platform for microbial genome comparison based on orthology analysis. As its unique feature, MBGD allows users to conduct orthology analysis among any specified set of organisms; this flexibility allows MBGD to adapt to a variety of microbial genomic studies. Reflecting the huge diversity of the microbial world, the number of microbial genome projects has now grown to several thousand. To efficiently explore this entire body of microbial genomic data, MBGD now provides summary pages for pre-calculated ortholog tables among various taxonomic groups. For some closely related taxa, MBGD also provides conserved synteny information (core genome alignments) pre-calculated using the CoreAligner program. In addition, an efficient incremental updating procedure can create an extended ortholog table by adding additional genomes to the default ortholog table generated from the representative set of genomes. Combined with the functionality of dynamic orthology calculation for any specified set of organisms, MBGD is an efficient and flexible tool for exploring microbial genome diversity.
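To make the incremental-updating idea concrete, here is a toy sketch of extending an existing ortholog table with the genes of one additional genome using a best-bidirectional-hit rule. The data structures, the assignment rule, and all gene names are illustrative assumptions, not MBGD's actual algorithm or interface.

```python
# Illustrative sketch only: a toy incremental ortholog-table update.
# The best-bidirectional-hit rule and all data structures are assumptions
# for illustration, not MBGD's actual procedure or API.

def best_hit(scores, query):
    """Return the subject with the highest similarity score for a query gene."""
    hits = scores.get(query, {})
    return max(hits, key=hits.get) if hits else None

def extend_ortholog_table(table, new_genes, fwd_scores, rev_scores):
    """Add genes from a newly added genome to an existing ortholog table.

    table      : dict cluster_id -> set of member gene IDs
    new_genes  : iterable of gene IDs from the new genome
    fwd_scores : dict new_gene -> {existing_gene: score}
    rev_scores : dict existing_gene -> {new_gene: score}
    """
    gene_to_cluster = {g: cid for cid, members in table.items() for g in members}
    next_id = max(table) + 1 if table else 0
    for gene in new_genes:
        hit = best_hit(fwd_scores, gene)
        # Bidirectional best hit: the existing gene must also point back.
        if hit is not None and best_hit(rev_scores, hit) == gene:
            table[gene_to_cluster[hit]].add(gene)
        else:
            table[next_id] = {gene}   # no reliable ortholog: start a new cluster
            next_id += 1
    return table

# Toy usage: one existing cluster, one new genome with two genes.
table = {0: {"ecoli_dnaA"}}
fwd = {"newg_dnaA": {"ecoli_dnaA": 250.0}, "newg_orphan": {}}
rev = {"ecoli_dnaA": {"newg_dnaA": 245.0}}
print(extend_ortholog_table(table, ["newg_dnaA", "newg_orphan"], fwd, rev))
```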
Update of the ERS international Adult Respiratory Medicine syllabus for postgraduate training.
Tabin, Nathalie; Mitchell, Sharon; O'Connell, Elaine; Stolz, Daiana; Rohde, Gernot
2018-03-01
First published in 2006, the first European core syllabus in Adult Respiratory Medicine was developed with the intention of harmonising education and training throughout Europe. Internationally recognised by the European Union of Medical Specialists and identified as the first document of its kind in respiratory medicine, it has provided a comprehensive guide for both local and national institutions in the development of adult respiratory training programmes. Like all fields in education, respiratory medicine is an ever-changing area and as such, respective syllabi, curricula and training programmes must adapt and diversify in line with the evolution of core medical concepts. Given the proven importance of the Adult Respiratory Medicine syllabus from both a national and international standpoint, it is of equal importance that said syllabus remains abreast of emerging trends so as to sustain the synchronisation of respiratory medicine in Europe. In order to develop an updated programme, a comprehensive review process of the current syllabus is a necessary endeavour and a step that the European Respiratory Society (ERS) has undertaken through the process of a needs assessment.
2015-08-05
This final rule updates the prospective payment rates for Medicare inpatient hospital services provided by inpatient psychiatric facilities (IPFs) (which are freestanding IPFs and psychiatric units of an acute care hospital or critical access hospital). These changes are applicable to IPF discharges occurring during fiscal year (FY) 2016 (October 1, 2015 through September 30, 2016). This final rule also implements: a new 2012-based IPF market basket; an updated IPF labor-related share; a transition to new Core Based Statistical Area (CBSA) designations in the FY 2016 IPF Prospective Payment System (PPS) wage index; a phase-out of the rural adjustment for IPF providers whose status changes from rural to urban as a result of the wage index CBSA changes; and new quality measures and reporting requirements under the IPF quality reporting program. This final rule also reminds IPFs of the October 1, 2015 implementation of the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM), and updates providers on the status of IPF PPS refinements.
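As a rough illustration of how the pieces named in the rule interact, the sketch below applies a market-basket update and then adjusts the labor-related share of a per diem rate by an area wage index. The formula shape follows the usual prospective-payment pattern, but every number is a placeholder, not an actual FY 2016 IPF PPS value.

```python
# Illustrative sketch of how a labor-related share and wage index typically
# enter a prospective payment rate. All numbers are placeholders, not the
# actual FY 2016 IPF PPS values.

def adjusted_per_diem(base_rate, market_basket_update, labor_share, wage_index):
    """Apply a market-basket update, then adjust the labor-related portion
    of the rate by the area wage index."""
    updated = base_rate * (1.0 + market_basket_update)
    return updated * (labor_share * wage_index + (1.0 - labor_share))

# Example: $750 base per diem, 2.4% update, 75% labor share, wage index 1.05.
print(f"${adjusted_per_diem(750.00, 0.024, 0.75, 1.05):.2f}")
```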
NASA Astrophysics Data System (ADS)
Wang, Xing; Hill, Thomas L.; Neild, Simon A.; Shaw, Alexander D.; Haddad Khodaparast, Hamed; Friswell, Michael I.
2018-02-01
This paper proposes a model updating strategy for localised nonlinear structures. It utilises an initial finite-element (FE) model of the structure and primary harmonic response data taken from low and high amplitude excitations. The underlying linear part of the FE model is first updated using low-amplitude test data with established techniques. Then, using this linear FE model, the nonlinear elements are localised, characterised, and quantified with primary harmonic response data measured under stepped-sine or swept-sine excitations. Finally, the resulting model is validated by comparing the analytical predictions with both the measured responses used in the updating and with additional test data. The proposed strategy is applied to a clamped beam with a nonlinear mechanism and good agreements between the analytical predictions and measured responses are achieved. Discussions on issues of damping estimation and dealing with data from amplitude-varying force input in the updating process are also provided.
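A minimal numerical sketch of the quantification step follows, assuming the localised nonlinearity is a single cubic (Duffing-type) stiffness and that the linear parameters have already been updated from low-amplitude tests. Harmonic balance of the primary response gives an amplitude equation that is linear in the cubic coefficient, so stepped-sine data can be fitted by least squares; the model, force level, and data here are synthetic, not the paper's test case.

```python
# Hedged sketch (not the authors' code): quantifying a cubic stiffness k3
# from primary harmonic (stepped-sine) response amplitudes A(w), using the
# harmonic-balance amplitude equation of a Duffing element:
#   ((k - m*w**2)*A + 0.75*k3*A**3)**2 + (c*w*A)**2 = F**2
import numpy as np

m, c, k, F = 1.0, 0.05, 100.0, 1.0      # updated linear model and force level
k3_true = 5.0e4                          # "unknown" cubic stiffness

# Synthesize stepped-sine amplitudes below resonance (single positive branch).
w = np.linspace(2.0, 8.0, 25)
A = np.empty_like(w)
for i, wi in enumerate(w):
    # The amplitude equation is a cubic in A**2; solve it at each frequency.
    coeffs = [(0.75 * k3_true) ** 2, 0.0,
              2.0 * (k - m * wi**2) * 0.75 * k3_true, 0.0,
              (k - m * wi**2) ** 2 + (c * wi) ** 2, 0.0, -F**2]
    roots = np.roots(coeffs)
    A[i] = min(r.real for r in roots if abs(r.imag) < 1e-8 and r.real > 0)

# Below resonance: 0.75*k3*A**3 = sqrt(F**2 - (c*w*A)**2) - (k - m*w**2)*A,
# which is linear in k3, so a one-column least-squares fit recovers it.
rhs = np.sqrt(F**2 - (c * w * A) ** 2) - (k - m * w**2) * A
k3_est = np.linalg.lstsq((0.75 * A**3)[:, None], rhs, rcond=None)[0][0]
print(f"estimated k3 = {k3_est:.3e} (true {k3_true:.3e})")
```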
Utilizing Flight Data to Update Aeroelastic Stability Estimates
NASA Technical Reports Server (NTRS)
Lind, Rick; Brenner, Marty
1997-01-01
Stability analysis of high performance aircraft must account for errors in the system model. A method for computing flutter margins that incorporates flight data has been developed using robust stability theory. This paper considers applying this method to update flutter margins during a post-flight or on-line analysis. Areas of modeling uncertainty that arise when using flight data with this method are investigated. The amount of conservatism in the resulting flutter margins depends on the flight data sets used to update the model. Post-flight updates of flutter margins for an F/A-18 are presented along with a simulation of on-line updates during a flight test.
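The sketch below illustrates the robust-margin idea in miniature: the aeroelastic instability boundary is computed for every model in an uncertainty set whose size would be chosen from model-versus-flight residuals, and the reported margin is the worst case over that set. The 2-DOF section matrices, the single uncertain aerodynamic term, and the residual-derived bound are invented for illustration and are not the paper's robust stability machinery.

```python
# Hedged illustration (not the paper's method): a "robust margin" as the
# worst-case instability dynamic pressure over a model-error set. All
# matrices and the uncertainty mapping are invented for the sketch.
import numpy as np

def instability_q(delta, qs=np.linspace(0.0, 60.0, 601)):
    """Smallest q at which the perturbed aeroelastic model loses stability."""
    M = np.diag([1.0, 0.5])                        # mass (plunge, pitch)
    K0 = np.array([[50.0, 0.0], [0.0, 20.0]])      # structural stiffness
    Ka = np.array([[0.0, -1.0],                    # aero stiffness, with an
                   [0.0, -1.5 * (1 + delta)]])     # uncertain pitch term
    C = np.diag([0.4, 0.2])                        # damping
    for q in qs:
        K = K0 + q * Ka
        A = np.block([[np.zeros((2, 2)), np.eye(2)],
                      [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
        if np.max(np.linalg.eigvals(A).real) > 0.0:
            return q
    return np.inf

# Flight data would set the uncertainty level on the pitch aero term.
delta_bound = 0.15   # e.g. inferred from model-vs-flight residuals
worst = min(instability_q(d) for d in np.linspace(-delta_bound, delta_bound, 21))
print(f"nominal q = {instability_q(0.0):.1f}, robust margin q = {worst:.1f}")
```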
EPA announced the availability of the final report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Version 2). This update furthered land change modeling by providing nationwide housing developmen...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burmeister, Jay, E-mail: burmeist@karmanos.org; Chen, Zhe; Chetty, Indrin J.
Purpose: The American Society for Radiation Oncology (ASTRO) Physics Core Curriculum Subcommittee (PCCSC) has updated the recommended physics curriculum for radiation oncology resident education to improve consistency in teaching, intensity, and subject matter. Methods and Materials: The ASTRO PCCSC is composed of physicists and physicians involved in radiation oncology residency education. The PCCSC updated existing sections within the curriculum, created new sections, and attempted to provide additional clinical context to the curricular material through creation of practical clinical experiences. Finally, we reviewed the American Board of Radiology (ABR) blueprint of examination topics for correlation with this curriculum. Results: The new curriculum represents 56 hours of resident physics didactic education, including a 4-hour initial orientation. The committee recommends completion of this curriculum at least twice to assure both timely presentation of material and re-emphasis after clinical experience. In addition, practical clinical physics and treatment planning modules were created as a supplement to the didactic training. Major changes to the curriculum include addition of Fundamental Physics, Stereotactic Radiosurgery/Stereotactic Body Radiation Therapy, and Safety and Incidents sections, and elimination of the Radiopharmaceutical Physics and Dosimetry and Hyperthermia sections. Simulation and Treatment Verification and optional Research and Development in Radiation Oncology sections were also added. A feedback loop was established with the ABR to help assure that the physics component of the ABR radiation oncology initial certification examination remains consistent with this curriculum. Conclusions: The ASTRO physics core curriculum for radiation oncology residents has been updated in an effort to identify the most important physics topics for preparing residents for careers in radiation oncology, to reflect changes in technology and practice since the publication of previous recommended curricula, and to provide practical training modules in clinical radiation oncology physics and treatment planning. The PCCSC is committed to keeping the curriculum current and consistent with the ABR examination blueprint.
Application of Artificial Intelligence for Bridge Deterioration Model.
Chen, Zhang; Wu, Yangyang; Li, Li; Sun, Lijun
2015-01-01
The deterministic bridge deterioration model updating problem is well established in bridge management, but traditional methods and approaches to it require manual intervention. This paper presents an artificial-intelligence-based approach that updates the parameters of the bridge deterioration model automatically. When new information and data are collected, a posterior distribution describing the integrated result of the historical information and the newly gained information is constructed according to Bayes' theorem and used to update the model parameters. This AI-based approach was applied to updating the parameters of a bridge deterioration model using data collected from bridges in 12 districts of Shanghai from 2004 to 2013, and the results showed that it is an accurate, effective, and satisfactory way to handle parameter updating without manual intervention.
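As a concrete illustration of the Bayesian update the abstract describes, the sketch below applies Bayes' theorem to a single deterioration-model parameter. The Beta-Bernoulli form, the prior values, and the inspection data are all assumptions chosen for illustration; the paper does not specify its distributional choices.

# Hypothetical sketch of the Bayesian parameter update described above.
# We assume a Beta prior on an annual deterioration probability and
# Bernoulli inspection data; all numbers are invented.
from scipy import stats

# Prior encoding historical inspection information (assumed values)
a_prior, b_prior = 4.0, 16.0          # prior mean 0.2

# Newly collected inspections: 1 = element dropped a condition state
new_data = [0, 1, 0, 0, 1, 0, 0, 0, 1, 0]

# Conjugate posterior: Beta(a + successes, b + failures)
a_post = a_prior + sum(new_data)
b_post = b_prior + len(new_data) - sum(new_data)

posterior = stats.beta(a_post, b_post)
print(f"updated deterioration rate: {posterior.mean():.3f} "
      f"(95% CI {posterior.ppf(0.025):.3f}-{posterior.ppf(0.975):.3f})")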
NASA Astrophysics Data System (ADS)
Karriem, Veronica V.
Nuclear reactor design incorporates the study and application of nuclear physics, nuclear thermal hydraulics and nuclear safety. Theoretical models and numerical methods implemented in computer programs are utilized to analyze and design nuclear reactors. The focus of this PhD study is the development of an advanced high-fidelity multi-physics code system to perform reactor core analysis for design and safety evaluations of research TRIGA-type reactors. The fuel management and design code system TRIGSIMS was further developed to fulfill the function of a reactor design and analysis code system for the Pennsylvania State Breazeale Reactor (PSBR). TRIGSIMS, which is currently in use at the PSBR, is a fuel management tool that incorporates the depletion code ORIGEN-S (part of the SCALE system) and the Monte Carlo neutronics solver MCNP. The diffusion theory code ADMARC-H is used within TRIGSIMS to accelerate the MCNP calculations. It manages the data and fuel isotopic content and stores it for future burnup calculations. The contribution of this work is the development of an improved version of TRIGSIMS, named TRIGSIMS-TH. TRIGSIMS-TH incorporates a thermal hydraulic module based on the advanced sub-channel code COBRA-TF (CTF). CTF provides the temperature feedback needed in the multi-physics calculations as well as the thermal hydraulics modeling capability for the reactor core. The temperature feedback model uses the CTF-provided local moderator and fuel temperatures in the cross-section modeling for the ADMARC-H and MCNP calculations. To perform efficient critical control rod calculations, a methodology for applying a control rod position was implemented in TRIGSIMS-TH, making this code system a modeling and design tool for future core loadings. The new TRIGSIMS-TH is a computer program that interlinks various other functional reactor analysis tools. It consists of MCNP5, ADMARC-H, ORIGEN-S, and CTF. CTF was coupled with both MCNP and ADMARC-H to provide the heterogeneous temperature distribution throughout the core. Each of these codes is written in its own computer language, performs its function, and outputs a set of data. TRIGSIMS-TH provides effective data manipulation and transfer between the different codes. With the implementation of the feedback and control-rod-position modeling methodologies, the TRIGSIMS-TH calculations are more accurate and in better agreement with measured data. The PSBR is unique in many ways, and there are no "off-the-shelf" codes that can model this design in its entirety. In particular, the PSBR has an open core design that is cooled by natural convection. Combining several codes into a unique system brings many challenges. It also requires substantial knowledge of both the operation and the core design of the PSBR. This reactor has been in operation for decades, and there is a fair amount of prior study and development in both PSBR thermal hydraulics and neutronics. Measured data are also available for various core loadings and can be used for validation activities. The previous studies and developments in PSBR modeling also serve as a guide for assessing the findings of the work herein. In order to incorporate new methods and codes into the existing TRIGSIMS, a re-evaluation of various components of the code was performed to assure the accuracy and efficiency of the existing CTF/MCNP5/ADMARC-H multi-physics coupling. A new set of ADMARC-H diffusion coefficients and cross sections was generated using the SERPENT code.
This was needed because the previous data had not been generated with thermal hydraulic feedback and the all-rods-out (ARO) position had been used as the critical rod position. The B4C data were re-evaluated for this update. The data exchange between ADMARC-H and MCNP5 was modified. The basic core model is given flexibility to allow for various changes within the core model, and this feature was implemented in TRIGSIMS-TH. The PSBR core in the new code model can be expanded and changed, which allows the new code to be used as a modeling tool for the design and analysis of future core loadings.
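The temperature-feedback coupling described above amounts to a fixed-point iteration between the neutronics and thermal-hydraulics solvers. The sketch below is a generic Picard loop under that assumption, not the actual TRIGSIMS-TH implementation; solve_neutronics, solve_thermal_hydraulics, and the toy feedback functions are hypothetical stand-ins for the MCNP/ADMARC-H and CTF solvers.

# Conceptual sketch of a fixed-point (Picard) coupling iteration, the
# generic pattern behind neutronics <-> thermal-hydraulics feedback.
def couple(solve_neutronics, solve_thermal_hydraulics,
           t_fuel0, tol=1.0, max_iter=50):
    """Iterate power <-> temperature feedback until fuel temps converge."""
    t_fuel = t_fuel0
    for it in range(max_iter):
        power = solve_neutronics(t_fuel)         # cross sections at t_fuel
        t_new = solve_thermal_hydraulics(power)  # heat-up from power shape
        if max(abs(a - b) for a, b in zip(t_new, t_fuel)) < tol:
            return power, t_new, it + 1
        t_fuel = t_new
    raise RuntimeError("coupling did not converge")

# Toy usage with analytic stand-ins (not actual reactor physics):
power, temps, iters = couple(
    lambda t: [1.0 / (1.0 + 1e-4 * ti) for ti in t],   # Doppler-like feedback
    lambda p: [300.0 + 200.0 * pi for pi in p],        # heat-up with power
    t_fuel0=[400.0, 400.0])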
Overview and Evaluation of the Community Multiscale Air ...
The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Protection Agency develops the CMAQ model and periodically releases new versions that include bug fixes and various other improvements to the modeling system. In late 2016 or early 2017, CMAQ version 5.2 will be released. This new version of CMAQ will contain important updates to the current CMAQv5.1 modeling system, along with several instrumented versions of the model (e.g., decoupled direct method and sulfur tracking). Specific model updates include the implementation of a new wind-blown dust treatment in CMAQv5.2, a significant improvement over the treatment in v5.1, which can severely overestimate wind-blown dust under certain conditions. Several other major updates to the modeling system include an update to the calculation of aerosols; implementation of full halogen chemistry (CMAQv5.1 contains a partial implementation of halogen chemistry); the new carbon bond 6 (CB6) chemical mechanism; updates to the cloud model in CMAQ; and a new lightning assimilation scheme for the WRF model, which significantly improves the placement and timing of convective precipitation in the WRF precipitation fields. Numerous other updates to the modeling system will also be available in v5.2.
Kingswood, John C; Bruzzi, Paolo; Curatolo, Paolo; de Vries, Petrus J; Fladrowski, Carla; Hertzberg, Christoph; Jansen, Anna C; Jozwiak, Sergiusz; Nabbout, Rima; Sauter, Matthias; Touraine, Renaud; O'Callaghan, Finbar; Zonnenberg, Bernard; Crippa, Stefania; Comis, Silvia; d'Augères, Guillaume Beaure; Belousova, Elena; Carter, Tom; Cottin, Vincent; Dahlin, Maria; Ferreira, José Carlos; Macaya, Alfons; Benedik, Mirjana Perkovic; Sander, Valentin; Youroukos, Sotirios; Castellana, Ramon; Ulker, Bulent; Feucht, Martha
2014-11-26
Tuberous sclerosis complex (TSC) is a rare, multisystem, genetic disorder with an estimated prevalence between 1/6800 and 1/15000. Although recent years have seen huge progress in understanding the pathophysiology and in the management of TSC, several questions remain unanswered. A disease registry could be an effective tool to gain more insights into TSC and thus help in the development of improved management strategies. TuberOus SClerosis registry to increase disease Awareness (TOSCA) is a multicentre, international disease registry to assess manifestations, interventions, and outcomes in patients with TSC. Patients of any age diagnosed with TSC, having a documented visit for TSC within the preceding 12 months, or newly diagnosed individuals are eligible. Objectives include mapping the course of TSC manifestations and their effects on prognosis, identifying patients with rare symptoms and co-morbidities, recording interventions and their outcomes, contributing to the creation of an evidence base for disease assessment and therapy, informing further research on TSC, and evaluating the quality of life of patients with TSC. The registry includes a 'core' section and subsections or 'petals'. The 'core' section is designed to record general information on patients' background, collected at baseline and updated annually. Subsections will be developed over time to record additional data related to specific disease manifestations and will be updated annually. The registry aimed to enrol approximately 2000 patients from about 250 sites in 31 countries. The initial enrolment period was 24 months. A follow-up observation period of up to 5 years is planned. A pre-planned administrative analysis of 'core' data from the first 100 patients was performed to evaluate the feasibility of the registry. Results showed a high degree of accuracy of the data collection procedure. Annual interim analyses are scheduled. Results of the first interim analysis will be presented once data become available in 2014. The results of TOSCA will assist in filling the gaps in understanding the natural history of TSC and help in planning better management and surveillance strategies. This large-scale international registry to study TSC could serve as a model to encourage planning of similar registries for other rare diseases.
SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating
Lee, Young-Joo; Cho, Soojin
2016-01-01
Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125
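A minimal sketch of step (2) above, tuning FE stiffness parameters so that model natural frequencies match the identified ones, is given below. The 2-DOF system, mass matrix, and "measured" frequencies are invented for illustration and are not from the paper.

# Toy FE model updating: fit stiffnesses to identified natural frequencies.
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import least_squares

M = np.diag([2.0, 1.0])                       # mass matrix (assumed known)

def freqs(k):
    k1, k2 = k
    K = np.array([[k1 + k2, -k2], [-k2, k2]])
    lam = eigh(K, M, eigvals_only=True)       # generalized eigenvalues w^2
    return np.sqrt(lam) / (2 * np.pi)         # natural frequencies [Hz]

f_measured = np.array([0.85, 2.30])           # identified from ambient vibration (toy)

res = least_squares(lambda k: freqs(k) - f_measured, x0=[100.0, 50.0])
print("updated stiffnesses:", res.x)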
Normal response function method for mass and stiffness matrix updating using complex FRFs
NASA Astrophysics Data System (ADS)
Pradhan, S.; Modak, S. V.
2012-10-01
Quite often a structural dynamic finite element model needs to be updated so as to accurately predict dynamic characteristics such as the natural frequencies and mode shapes. Since in many situations undamped natural frequencies and mode shapes need to be predicted, it has generally been the practice in these situations to seek updating of only the mass and stiffness matrices so as to obtain a reliable prediction model. Updating using frequency response functions (FRFs) has been one of the widely used approaches, including for updating of the mass and stiffness matrices. However, the problem with FRF-based methods for updating mass and stiffness matrices is that they rely on complex FRFs. Using complex FRFs to update the mass and stiffness matrices is not theoretically correct, as complex FRFs are affected not only by these two matrices but also by the damping matrix. Therefore, in situations where updating of only the mass and stiffness matrices using FRFs is required, a complex-FRF-based updating formulation is not fully justified and would lead to inaccurate updated models. This paper addresses this difficulty and proposes an improved FRF-based finite element model updating procedure using the concept of normal FRFs. The proposed method is a modified version of the existing response function method, which is based on complex FRFs. The effectiveness of the proposed method is validated through a numerical study of a simple but representative beam structure. The effects of coordinate incompleteness and the robustness of the method in the presence of noise are investigated. The results of updating obtained with the improved method are compared with the existing response function method. The performance of the two approaches is compared for cases of lightly, moderately and heavily damped structures. It is found that the proposed improved method is effective in updating the mass and stiffness matrices in all cases of complete and incomplete data and with all levels and types of damping.
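The distinction the paper draws can be stated compactly. Assuming viscous damping (the paper's damping model may differ), the measured receptance FRF depends on the damping matrix C, whereas the normal FRF depends only on M and K:

H(\omega) = \left(K - \omega^2 M + i\,\omega C\right)^{-1}, \qquad H_N(\omega) = \left(K - \omega^2 M\right)^{-1}

Updating M and K against the measured H(\omega) therefore absorbs damping effects into the wrong matrices, which is why the method works from H_N(\omega) instead.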
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. S. Chang
2007-09-01
The Advanced Test Reactor (ATR) is a high power density and high neutron flux research reactor operating in the United States. Powered with highly enriched uranium (HEU), the ATR has a maximum thermal power rating of 250 MWth. Because of the large test volumes located in high flux areas, the ATR is an ideal candidate for assessing the feasibility of converting an HEU-driven reactor to a low-enriched core. The present work investigates the necessary modifications and evaluates the subsequent operating effects of this conversion. A detailed plate-by-plate MCNP ATR 1/8th core model was developed and validated for a fuel cycle burnup comparison analysis. Using the current HEU U-235 enrichment of 93.0% as a baseline, an analysis can be performed to determine the low-enriched uranium (LEU) density and U-235 enrichment required in the fuel meat to yield an equivalent K-eff between the HEU core and the LEU core versus effective full power days (EFPD). The MCNP ATR 1/8th core model will be used to optimize the U-235 loading in the LEU core, such that the differences in K-eff and heat flux profile between the HEU and LEU cores can be minimized. The depletion methodology MCWO was used to calculate K-eff versus EFPDs in this paper. The MCWO-calculated results for the LEU cases with foil (U-10Mo) fuel types demonstrated adequate excess reactivity such that the K-eff versus EFPDs plot is similar to the reference ATR HEU case. Each HEU fuel element contains 19 fuel plates with a fuel meat thickness of 0.508 mm. In this work, the proposed LEU (U-10Mo) core conversion case with a nominal fuel meat thickness of 0.508 mm and the same U-235 enrichment (15.5 wt%) can be used to optimize the radial heat flux profile by varying the fuel plate thickness from 0.254 to 0.457 mm at the inner 4 fuel plates (1-4) and outer 4 fuel plates (16-19). In addition, 0.7 g of the burnable absorber Boron-10 was added in the inner and outer plates to reduce the initial excess reactivity and to flatten the inner/outer heat flux profile more effectively. The optimized LEU relative radial fission heat flux profile is bounded by the reference ATR HEU case. However, to demonstrate that the LEU core fuel cycle performance can meet the Updated Final Safety Analysis Report (UFSAR) safety requirements, additional studies will be necessary to evaluate and compare safety parameters such as void reactivity and Doppler coefficients, control components worth (outer shim control cylinders, safety rods and regulating rod), and shutdown margins between the HEU and LEU cores.
NASA Astrophysics Data System (ADS)
Turnbull, Heather; Omenzetter, Piotr
2018-03-01
Difficulties associated with current health monitoring and inspection practices, combined with the harsh, often remote, operational environments of wind turbines, highlight the requirement for a non-destructive evaluation system capable of remotely monitoring the current structural state of turbine blades. This research adopted a physics-based structural health monitoring methodology through calibration of a finite element model using inverse techniques. A 2.36 m blade from a 5 kW turbine was used as the experimental specimen, with operational modal analysis techniques utilised to obtain the modal properties of the system. By modelling the experimental responses as fuzzy numbers using the sub-level technique, uncertainty in the response parameters was propagated back through the model and into the updating parameters. Initially, experimental responses of the blade were obtained, and a numerical model of the blade was created and updated. Deterministic updating was carried out through formulation and minimisation of a deterministic objective function using both the firefly algorithm and the virus optimisation algorithm. Uncertainty in the experimental responses was modelled using triangular membership functions, allowing membership functions of the updating parameters (Young's modulus and shear modulus) to be obtained. The firefly algorithm and virus optimisation algorithm were again utilised, this time in the solution of fuzzy objective functions, enabling the uncertainty associated with the updating parameters to be quantified. Varying damage location and severity were simulated experimentally through the addition of small masses to the structure intended to cause a structural alteration. A damaged model, representing four variable-magnitude nonstructural masses at predefined points, was created and updated to provide a deterministic damage prediction and information on the parameters' uncertainty via fuzzy updating.
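The sub-level (alpha-cut) technique mentioned above can be illustrated with a one-parameter toy problem. The triangular membership values, the cantilever-beam relation, and the constant c below are assumptions for illustration, not values from the blade study.

# Alpha-cut propagation of a triangular fuzzy measured frequency to an
# updating parameter, assuming a monotone model f1 = c*sqrt(E), so that
# interval endpoints map directly to endpoints.
def alpha_cut(lo, peak, hi, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return lo + alpha * (peak - lo), hi - alpha * (hi - peak)

c = 0.004                                             # assumed model constant
for alpha in (0.0, 0.5, 1.0):
    f_lo, f_hi = alpha_cut(24.0, 25.0, 26.5, alpha)   # Hz, assumed data
    E_lo, E_hi = (f_lo / c) ** 2, (f_hi / c) ** 2     # propagate interval
    print(f"alpha={alpha:.1f}: E in [{E_lo:.3e}, {E_hi:.3e}] Pa")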
Adapting to change: The role of the right hemisphere in mental model building and updating.
Filipowicz, Alex; Anderson, Britt; Danckert, James
2016-09-01
We recently proposed that the right hemisphere plays a crucial role in the processes underlying mental model building and updating. Here, we review the evidence we and others have garnered to support this novel account of right hemisphere function. We begin by presenting evidence from patient work that suggests a critical role for the right hemisphere in the ability to learn from the statistics in the environment (model building) and adapt to environmental change (model updating). We then provide a review of neuroimaging research that highlights a network of brain regions involved in mental model updating. Next, we outline specific roles for particular regions within the network such that the anterior insula is purported to maintain the current model of the environment, the medial prefrontal cortex determines when to explore new or alternative models, and the inferior parietal lobule represents salient and surprising information with respect to the current model. We conclude by proposing some future directions that address some of the outstanding questions in the field of mental model building and updating. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
NASA Technical Reports Server (NTRS)
Burns, Lee; Merry, Carl; Decker, Ryan; Harrington, Brian
2008-01-01
The 2006 Cape Canaveral Air Force Station (CCAFS) Range Reference Atmosphere (RRA) is a statistical model summarizing the wind and thermodynamic atmospheric variability from the surface to 70 km. Launches of the National Aeronautics and Space Administration's (NASA) Space Shuttle from Kennedy Space Center utilize CCAFS RRA data to evaluate environmental constraints on various aspects of the vehicle during ascent. An update to the CCAFS RRA was recently completed. As part of the update, a validation study of the 2006 version was conducted, as well as a comparison analysis of the 2006 version against the existing 1983 version of the CCAFS RRA database. Assessments of the Space Shuttle vehicle ascent profile characteristics were performed to determine the impacts of the updated model on vehicle performance. Details on the model updates and the vehicle sensitivity analyses with the updated model are presented.
Artificial Boundary Conditions for Finite Element Model Update and Damage Detection
2017-03-01
Master's thesis by Emmanouil Damanakis, March 2017; thesis advisor: Joshua H. Gordis. Approved for public release; distribution is unlimited. Abstract (excerpt): In structural engineering, a finite element model is often...
An Investigation Into the Effects of Frequency Response Function Estimators on Model Updating
NASA Astrophysics Data System (ADS)
Ratcliffe, M. J.; Lieven, N. A. J.
1999-03-01
Model updating is a very active research field, in which significant effort has been invested in recent years. Model updating methodologies are invariably successful when used on noise-free simulated data, but tend to be unpredictable when presented with real experimental data that are, unavoidably, corrupted with uncorrelated noise content. In the development and validation of model-updating strategies, a random zero-mean Gaussian variable is added to simulated test data to tax the updating routines more fully. This paper proposes a more sophisticated model for experimental measurement noise, and this is used in conjunction with several different frequency response function estimators, from the classical H1 and H2 to more refined estimators that purport to be unbiased. Finite-element model case studies, in conjunction with a genuine experimental test, suggest that the proposed noise model is a more realistic representation of experimental noise phenomena. The choice of estimator is shown to have a significant influence on the viability of the FRF sensitivity method. These test cases find that the use of the H2 estimator for model updating purposes is contraindicated, and that there is no advantage to be gained by using the sophisticated estimators over the classical H1 estimator.
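For reference, the two classical estimators compared above are ratios of cross- and auto-spectral densities; the sketch below computes both from simulated input/output records. The SDOF system, noise levels, and record lengths are illustrative only.

# H1 and H2 FRF estimators from simulated noisy measurements.
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs, n = 1024, 2**16
x = rng.standard_normal(n)                      # broadband excitation
# Response of an SDOF oscillator (fn ~ 100 Hz, 2% damping)
wn, zeta = 2 * np.pi * 100, 0.02
b, a = signal.bilinear([wn**2], [1, 2 * zeta * wn, wn**2], fs=fs)
y = signal.lfilter(b, a, x)
x_meas = x + 0.1 * rng.standard_normal(n)       # noise on input channel
y_meas = y + 0.1 * rng.standard_normal(n)       # noise on output channel

f, Gxy = signal.csd(x_meas, y_meas, fs=fs, nperseg=2048)
_, Gxx = signal.welch(x_meas, fs=fs, nperseg=2048)
_, Gyy = signal.welch(y_meas, fs=fs, nperseg=2048)

H1 = Gxy / Gxx                 # biased low by input-channel noise
H2 = Gyy / np.conj(Gxy)        # biased high by output-channel noise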
New York's forest resources, 2007
R.H. Widmann; S. Crawford
2010-01-01
This publication provides an overview of forest resource attributes for this state based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...
North Dakota's forest resources, 2010
D.E. Haugen; R.A. Harsel
2011-01-01
This publication provides an overview of forest resource attributes for North Dakota based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
Rhode Island's forest resources, 2011
Brett J. Butler; Randall S. Morin; Mark D. Nelson
2012-01-01
This publication provides an overview of forest resource attributes for Rhode Island based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
New Hampshire's forest resources, 2010
R.S. Morin; M. Nelson
2011-01-01
This publication provides an overview of forest resource attributes for New Hampshire based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
Rhode Island's forest resources, 2010
Brett J. Butler; Cassandra Kurtz; W. Keith Moser; Bruce. Payton
2011-01-01
This publication provides an overview of forest resource attributes for Rhode Island based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
Michigan's forest resources, 2006
S.A. Pugh
2007-01-01
Figure 2 was revised by the author on August 20, 2008. This publication provides an overview of forest resource attributes for Michigan based on an annual inventory conducted by the Forest Inventory and Analysis program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. Note...
North Dakota's forest resources, 2008
D.E. Haugen; A.J. Lister
2010-01-01
This publication provides an overview of forest resource attributes for North Dakota based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
Susan J. Crocker
2015-01-01
This publication provides an overview of forest resource attributes for Illinois based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program of the Northern Research Station (NRS) of the U.S. Forest Service. These estimates, along with web-posted core tables, are updated annually. In 2014, NRS-FIA changed from a 5- to a 7-year inventory...
Susan J. Crocker
2015-01-01
This publication provides an overview of forest resource attributes for New Jersey based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program of the Northern Research Station (NRS) of the U.S. Forest Service. These estimates, along with web-posted core tables, are updated annually. In 2014, NRS-FIA changed from a 5- to a 7-year inventory...
ERIC Educational Resources Information Center
Ivanova, Tamara N.; Gross, Christina; Mappus, Rudolph C.; Kwon, Yong Jun; Bassell, Gary J.; Liu, Robert C.
2017-01-01
Learning to recognize a stimulus category requires experience with its many natural variations. However, the mechanisms that allow a category's sensorineural representation to be updated after experiencing new exemplars are not well understood, particularly at the molecular level. Here we investigate how a natural vocal category induces expression…
Financial Accounting for Local and State School Systems: 2014 Edition. NCES 2015-347
ERIC Educational Resources Information Center
Allison, Gregory S.
2015-01-01
The 2014 edition of "Financial Accounting for Local and State School Systems" updates the 2009 (see ED505993) and 2003 editions of the handbook. The 2003 edition was the work of the NCES National Forum on Education Statistics, Core Finance Data Task Force. That task force systematically rewrote nearly the entire text, incorporating new…
Rhode Island's forest resources, 2012
Brett J. Butler
2013-01-01
This publication provides an overview of forest resource attributes for Rhode Island based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 3 of this...
New Hampshire's forest resources, 2011
R.S. Morin; C.W. Woodall
2012-01-01
This publication provides an overview of forest resource attributes for New Hampshire based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
New York's forest resources, 2012
R.H. Widmann
2013-01-01
This publication provides an overview of forest resource attributes for New York based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report...
New York's forest resources, 2008
R.H. Widmann; B.J. Butler; S. Crawford
2010-01-01
This publication provides an overview of forest resource attributes for New York based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report....
New Hampshire's forest resources, 2012
R.S. Morin; K. Lombard
2013-01-01
This publication provides an overview of forest resource attributes for New Hampshire based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
West Virginia's forest resources, 2012
R.H. Widmann
2013-01-01
This publication provides an overview of forest resource attributes for West Virginia based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
New Hampshire's forest resources, 2007
R.S. Morin; G.M. McCaskill; W. McWilliams; M. Tansey
2010-01-01
This publication provides an overview of forest resource attributes for New Hampshire based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 5 of this...
R.H. Widmann; G.M. McCaskill; W. McWilliams; D. Balser
2010-01-01
This publication provides an overview of forest resource attributes for this state based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 5 of this report...
Rhode Island's forest resources, 2009
Brett J. Butler; Bruce. Payton
2011-01-01
This publication provides an overview of forest resource attributes for Rhode Island based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
North Dakota's forest resources, 2011
D.E. Haugen; R.A. Harsel
2012-01-01
This publication provides an overview of forest resource attributes for North Dakota based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
North Dakota's Forest Resources, 2007
D.E. Haugen; M. Kangas
2008-01-01
This publication provides an overview of forest resource attributes for North Dakota based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
New Hampshire's forest resources, 2009
R.S. Morin
2011-01-01
This publication provides an overview of forest resource attributes for New Hampshire based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
New Hampshire's forest resources, 2008
R.S. Morin; B.J. Butler; M. Tansey
2010-01-01
This publication provides an overview of forest resource attributes for New Hampshire based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
Forest resources of the Shawnee National Forest, 2007
C.M. Kurtz; S.J. Crocker
2010-01-01
This publication provides an overview of forest resource attributes for the Shawnee National Forest based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of...
New York's forest resources, 2009
R.H. Widmann; S. Crawford
2011-01-01
This publication provides an overview of forest resource attributes for New York based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report...
New York's forest resources, 2010
R.H. Widmann; S. Crawford
2011-01-01
This publication provides an overview of forest resource attributes for New York based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report...
West Virginia's forest resources, 2009
R.H. Widmann; G.W. Cook
2011-01-01
This publication provides an overview of forest resource attributes for West Virginia based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this...
West Virginia's forest resources, 2007
R.H. Widmann; G.M. McCaskill; W. McWilliams; G.W. Cook
2010-01-01
This publication provides an overview of forest resource attributes for this state based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 5 of this report...
West Virginia's forest resources, 2010
R.H. Widmann; G.W. Cook
2011-01-01
This publication provides an overview of forest resource attributes for West Virginia based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
West Virginia's forest resources, 2008
R.H. Widmann; B.J. Butler; G.W. Cook
2010-01-01
This publication provides an overview of forest resource attributes for West Virginia based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this...
Invasive Species Science Update (No. 9)
Justin Runyon
2017-01-01
This newsletter is designed to keep managers and other users up-to-date with recently completed and ongoing research by RMRS scientists, as well as to highlight breaking news related to invasive species issues. The newsletter is produced by the RMRS Invasive Species Working Group (ISWG), a core group of scientists who volunteer to disseminate RMRS invasive species...
Adaptation of clinical prediction models for application in local settings.
Kappen, Teus H; Vergouwe, Yvonne; van Klei, Wilton A; van Wolfswinkel, Leo; Kalkman, Cor J; Moons, Karel G M
2012-01-01
When planning to use a validated prediction model in new patients, adequate performance is not guaranteed. For example, changes in clinical practice over time or a different case mix than the original validation population may result in inaccurate risk predictions. The aim was to demonstrate how clinical information can direct the updating of a prediction model and the development of a strategy for handling missing predictor values in clinical practice. A previously derived and validated prediction model for postoperative nausea and vomiting was updated using a data set of 1847 patients. The update consisted of (1) changing the definition of an existing predictor, (2) re-estimating the regression coefficient of a predictor, and (3) adding a new predictor to the model. The updated model was then validated in a new series of 3822 patients. Furthermore, several imputation models were considered to handle real-time missing values, so that possible missing predictor values could be anticipated during actual model use. Differences in clinical practice between our local population and the original derivation population guided the update strategy of the prediction model. The predictive accuracy of the updated model was better (c statistic, 0.68; calibration slope, 1.0) than that of the original model (c statistic, 0.62; calibration slope, 0.57). Inclusion of logistical variables in the imputation models, besides observed patient characteristics, contributed to a strategy for dealing with missing predictor values at the time of risk calculation. Extensive knowledge of local clinical processes provides crucial information to guide the process of adapting a prediction model to new clinical practices.
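The update steps described (re-estimating a coefficient, adding a predictor) correspond to standard logistic-regression recalibration. The sketch below uses simulated data and hypothetical variable names; it is not the paper's postoperative nausea model.

# Recalibrating and extending an existing model's linear predictor ("lp").
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
lp = rng.normal(-1.0, 1.0, n)                   # original model's linear predictor
new_x = rng.binomial(1, 0.3, n)                 # candidate new predictor
y = rng.binomial(1, 1 / (1 + np.exp(-(0.6 * lp + 0.8 * new_x))))

# (1) recalibrate intercept and slope of the old linear predictor
fit_recal = sm.Logit(y, sm.add_constant(lp)).fit(disp=0)
# (2) additionally estimate a coefficient for the new predictor
X = sm.add_constant(np.column_stack([lp, new_x]))
fit_extended = sm.Logit(y, X).fit(disp=0)
print(fit_recal.params, fit_extended.params)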
Coates, Peter S.; Casazza, Michael L.; Brussee, Brianne E.; Ricca, Mark A.; Gustafson, K. Benjamin; Sanchez-Chopitea, Erika; Mauch, Kimberly; Niell, Lara; Gardner, Scott; Espinosa, Shawn; Delehanty, David J.
2016-05-20
Successful adaptive management hinges largely upon integrating new and improved sources of information as they become available. As a timely example of this tenet, we updated a management decision support tool that was previously developed for greater sage-grouse (Centrocercus urophasianus, hereinafter referred to as “sage-grouse”) populations in Nevada and California. Specifically, recently developed spatially explicit habitat maps derived from empirical data played a key role in the conservation of this species facing listing under the Endangered Species Act. This report provides an updated process for mapping relative habitat suitability and management categories for sage-grouse in Nevada and northeastern California (Coates and others, 2014, 2016). These updates include: (1) adding radio and GPS telemetry locations from sage-grouse monitored at multiple sites during 2014 to the original location dataset beginning in 1998; (2) integrating output from high resolution maps (1–2 m2) of sagebrush and pinyon-juniper cover as covariates in resource selection models; (3) modifying the spatial extent of the analyses to match newly available vegetation layers; (4) explicit modeling of relative habitat suitability during three seasons (spring, summer, winter) that corresponded to critical life history periods for sage-grouse (breeding, brood-rearing, over-wintering); (5) accounting for differences in habitat availability between more mesic sagebrush steppe communities in the northern part of the study area and drier Great Basin sagebrush in more southerly regions by categorizing continuous region-wide surfaces of habitat suitability index (HSI) with independent locations falling within two hydrological zones; (6) integrating the three seasonal maps into a composite map of annual relative habitat suitability; (7) deriving updated land management categories based on previously determined cut-points for intersections of habitat suitability and an updated index of sage-grouse abundance and space-use (AUI); and (8) masking urban footprints and major roadways out of the final map products.Seasonal habitat maps were generated based on model-averaged resource selection functions (RSF) derived for 10 project areas (813 sage-grouse; 14,085 locations) during the spring season, 10 during the summer season (591 sage-grouse, 11,743 locations), and 7 during the winter season (288 sage-grouse, 4,862 locations). RSF surfaces were transformed to HSIs and averaged in a GIS framework for every pixel for each season. Validation analyses of categorized HSI surfaces using a suite of independent datasets resulted in an agreement of 93–97 percent for habitat versus non-habitat on an annual basis. Spring and summer maps validated similarly well at 94–97 percent, while winter maps validated slightly less accurately at 87–93 percent.We then provide an updated example of how space use models can be integrated with habitat models to help inform conservation planning. We used updated lek count data to calculate a composite abundance and space use index (AUI) that comprised the combination of probabilistic breeding density with a non-linear probability of occurrence relative to distance to nearest lek. The AUI was then classified into two categories of use (high and low-to-no) and intersected with the HSI categories to create potential management prioritization scenarios based on information about sage-grouse occupancy coupled with habitat suitability. 
Compared to Coates and others (2014, 2016), the amount of area classified as habitat across the region increased by 6.5 percent (approximately 1,700,000 acres). For management categories, core increased by 7.2 percent (approximately 865,000 acres), priority increased by 9.6 percent (approximately 855,000 acres), and general increased by 9.2 percent (approximately 768,000 acres), while non-habitat decreased (that is, classified non-habitat occurring outside of areas of concentrated use) by 11.9 percent (approximately 2,500,000 acres). Importantly, seasonal and annual maps represent habitat for all age and sex classes of sage-grouse (that is, sample sizes of marked grouse were insufficient to only construct models for reproductive females). This revised sage-grouse habitat mapping product helps improve adaptive application of conservation planning tools based on intersections of spatially explicit habitat suitability, abundance, and space use indices.
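A toy sketch of the HSI-by-AUI intersection that produces the management categories named above follows; the class encodings and arrays are invented for illustration and do not reproduce the actual cut-points or decision rules of Coates and others (2014, 2016).

# Intersecting habitat suitability classes with abundance/space-use classes
# to assign pixel-level management categories.
import numpy as np

hsi_class = np.array([[2, 2, 1], [1, 0, 0]])   # 0 = non-habitat, 1 = low, 2 = high suitability
aui_high  = np.array([[1, 0, 1], [1, 0, 0]])   # 1 = high use, 0 = low-to-no use

category = np.select(
    [(hsi_class == 2) & (aui_high == 1),       # core
     (hsi_class >= 1) & (aui_high == 1),       # priority
     (hsi_class >= 1) & (aui_high == 0)],      # general
    ["core", "priority", "general"],
    default="non-habitat",
)
print(category)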
A review and update of the Virginia Department of Transportation cash flow forecasting model.
DOT National Transportation Integrated Search
1996-01-01
This report details the research done to review and update components of the VDOT cash flow forecasting model. Specifically, the study updated the monthly factors submodel used to predict payments on construction contracts. For the other submodel rev...
An Update on Improvements to NiCE Support for PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, Andrew; McCaskey, Alexander J.; Billings, Jay Jay
2015-09-01
The Department of Energy Office of Nuclear Energy's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program has supported the development of the NEAMS Integrated Computational Environment (NiCE), a modeling and simulation workflow environment that provides services and plugins to facilitate tasks such as code execution, model input construction, visualization, and data analysis. This report details the development of workflows for the reactor core neutronics application, PROTEUS. This advanced neutronics application (primarily developed at Argonne National Laboratory) aims to improve nuclear reactor design and analysis by providing an extensible and massively parallel, finite-element solver for current and advanced reactor fuel neutronics modeling. The integration of PROTEUS-specific tools into NiCE is intended to make the advanced capabilities that PROTEUS provides more accessible to the nuclear energy research and development community. This report will detail the work done to improve existing PROTEUS workflow support in NiCE. We will demonstrate and discuss these improvements, including the development of flexible IO services, an improved interface for input generation, and the addition of advanced Fortran development tools natively in the platform.
General Separations Area (GSA) Groundwater Flow Model Update: Hydrostratigraphic Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bagwell, L.; Bennett, P.; Flach, G.
2017-02-21
This document describes the assembly, selection, and interpretation of hydrostratigraphic data for input to an updated groundwater flow model for the General Separations Area (GSA; Figure 1) at the Department of Energy’s (DOE) Savannah River Site (SRS). This report is one of several discrete but interrelated tasks that support development of an updated groundwater model (Bagwell and Flach, 2016).
Dynamic analysis of I cross beam section dissimilar plate joined by TIG welding
NASA Astrophysics Data System (ADS)
Sani, M. S. M.; Nazri, N. A.; Rani, M. N. Abdul; Yunus, M. A.
2018-04-01
In this paper, a finite element (FE) joint modelling technique for the prediction of the dynamic properties of sheet metal joined by tungsten inert gas (TIG) welding is presented. I cross-section dissimilar flat plates of two aluminium alloy series, AA7075 and AA6061, joined by TIG welding are used. To find the optimum model of the TIG-welded dissimilar plates, finite element models with three types of joint modelling were employed in this study: bar element (CBAR), beam element, and spot weld element connector (CWELD). Experimental modal analysis (EMA) was carried out by impact hammer excitation on the dissimilar plates welded by the TIG method. The modal properties of the FE models with joints were compared and validated against the modal test. The CWELD element was chosen to represent the weld model for the TIG joints because of its accurate prediction of the mode shapes and because, compared to the other joint models, it contains an updating parameter for weld modelling. Model updating was performed to improve the correlation between EMA and FEA; before proceeding to updating, a sensitivity analysis was carried out to select the most sensitive updating parameters. After model updating, the average percentage error of the natural frequencies for the CWELD model improved significantly.
Test and analysis procedures for updating math models of Space Shuttle payloads
NASA Technical Reports Server (NTRS)
Craig, Roy R., Jr.
1991-01-01
Over the next decade or more, the Space Shuttle will continue to be the primary transportation system for delivering payloads to Earth orbit. Although a number of payloads have already been successfully carried by the Space Shuttle in the payload bay of the Orbiter vehicle, there continues to be a need for evaluation of the procedures used for verifying and updating the math models of the payloads. The verified payload math model is combined with an Orbiter math model for the coupled-loads analysis, which is required before any payload can fly. Several test procedures were employed to obtain data for verifying payload math models and for carrying out the updating of those models. Research was directed at the evaluation of test/update procedures for use in the verification of Space Shuttle payload math models. The following research tasks are summarized: (1) a study of free-interface test procedures; (2) a literature survey and evaluation of model update procedures; and (3) the design and construction of a laboratory payload simulator.
Build-up Approach to Updating the Mock Quiet Spike(TradeMark) Beam Model
NASA Technical Reports Server (NTRS)
Herrera, Claudia Y.; Pak, Chan-gi
2007-01-01
A crucial part of aircraft design is ensuring that the required margin for flutter is satisfied. A trustworthy flutter analysis, which begins by possessing an accurate dynamics model, is necessary for this task. Traditionally, a model was updated manually by fine tuning specific stiffness parameters until the analytical results matched test data. This is a time consuming iterative process. NASA Dryden Flight Research Center has developed a mode matching code to execute this process in a more efficient manner. Recently, this code was implemented in the F-15B/Quiet Spike(TradeMark) (Gulfstream Aerospace Corporation, Savannah, Georgia) model update. A build-up approach requiring several ground vibration test configurations and a series of model updates was implemented in order to determine the connection stiffness between aircraft and test article. The mode matching code successfully updated various models for the F-15B/Quiet Spike(TradeMark) project to within 1 percent error in frequency and the modal assurance criteria values ranged from 88.51-99.42 percent.
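The modal assurance criterion (MAC) cited above is a standard measure of test/analysis mode-shape correlation; a minimal sketch follows, with placeholder mode-shape matrices rather than F-15B/Quiet Spike data.

# Modal assurance criterion between two mode-shape sets
# (rows = DOFs, columns = modes).
import numpy as np

def mac(phi_a, phi_b):
    """MAC matrix between two mode-shape sets."""
    num = np.abs(phi_a.conj().T @ phi_b) ** 2
    den = np.outer(np.einsum('ij,ij->j', phi_a.conj(), phi_a).real,
                   np.einsum('ij,ij->j', phi_b.conj(), phi_b).real)
    return num / den

phi_test = np.random.default_rng(2).standard_normal((20, 3))
phi_fem = phi_test + 0.05 * np.random.default_rng(3).standard_normal((20, 3))
print(np.diag(mac(phi_test, phi_fem)))        # values near 1 indicate agreement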
Updates to the Demographic and Spatial Allocation Models to ...
EPA announced the availability of the draft report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS), for a 30-day public comment period. The ICLUS version 2 (v2) modeling tool furthered land change modeling by providing nationwide housing development scenarios up to 2100. ICLUS v2 includes updated population and land use data sets and addresses limitations identified in ICLUS v1 in both the migration and spatial allocation models. The companion user guide describes the development of ICLUS v2 and the updates that were made to the original data sets and the demographic and spatial allocation models. The GIS tool enables users to run SERGoM with the population projections developed for the ICLUS project and allows users to modify the spatial allocation of housing density across the landscape.
Lee, A J; Cunningham, A P; Kuchenbaecker, K B; Mavaddat, N; Easton, D F; Antoniou, A C
2014-01-01
Background: The Breast and Ovarian Analysis of Disease Incidence and Carrier Estimation Algorithm (BOADICEA) is a risk prediction model that is used to compute probabilities of carrying mutations in the high-risk breast and ovarian cancer susceptibility genes BRCA1 and BRCA2, and to estimate the future risks of developing breast or ovarian cancer. In this paper, we describe updates to the BOADICEA model that extend its capabilities, make it easier to use in a clinical setting and yield more accurate predictions. Methods: We describe: (1) updates to the statistical model to include cancer incidences from multiple populations; (2) updates to the distributions of tumour pathology characteristics using new data on BRCA1 and BRCA2 mutation carriers and women with breast cancer from the general population; (3) improvements to the computational efficiency of the algorithm so that risk calculations now run substantially faster; and (4) updates to the model's web interface to accommodate these new features and to make it easier to use in a clinical setting. Results: We present results derived using the updated model, and demonstrate that the changes have a significant impact on risk predictions. Conclusion: All updates have been implemented in a new version of the BOADICEA web interface that is now available for general use: http://ccge.medschl.cam.ac.uk/boadicea/. PMID:24346285
Updating the OMERACT filter: core areas as a basis for defining core outcome sets.
Kirwan, John R; Boers, Maarten; Hewlett, Sarah; Beaton, Dorcas; Bingham, Clifton O; Choy, Ernest; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Guillemin, Francis; Gossec, Laure; van der Heijde, Désirée M; Kloppenburg, Margreet; Kvien, Tore K; Landewé, Robert B M; Mackie, Sarah L; Matteson, Eric L; Mease, Philip J; Merkel, Peter A; Ostergaard, Mikkel; Saketkoo, Lesley Ann; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; Tugwell, Peter
2014-05-01
The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter presupposes an explicit framework for identifying the relevant core outcomes that are universal to all studies of the effects of intervention effects. There is no published outline for instrument choice or development that is aimed at measuring outcome, was derived from broad consensus over its underlying philosophy, or includes a structured and documented critique. Therefore, a new proposal for defining core areas of measurement ("Filter 2.0 Core Areas of Measurement") was presented at OMERACT 11 to explore areas of consensus and to consider whether already endorsed core outcome sets fit into this newly proposed framework. Discussion groups critically reviewed the extent to which case studies of current OMERACT Working Groups complied with or negated the proposed framework, whether these observations had a more general application, and what issues remained to be resolved. Although there was broad acceptance of the framework in general, several important areas of construction, presentation, and clarity of the framework were questioned. The discussion groups and subsequent feedback highlighted 20 such issues. These issues will require resolution to reach consensus on accepting the proposed Filter 2.0 framework of Core Areas as the basis for the selection of Core Outcome Domains and hence appropriate Core Outcome Sets for clinical trials.
Dissociable effects of surprise and model update in parietal and anterior cingulate cortex
O’Reilly, Jill X.; Schüffelgen, Urs; Cuell, Steven F.; Behrens, Timothy E. J.; Mars, Rogier B.; Rushworth, Matthew F. S.
2013-01-01
Brains use predictive models to facilitate the processing of expected stimuli or planned actions. Under a predictive model, surprising (low probability) stimuli or actions necessitate the immediate reallocation of processing resources, but they can also signal the need to update the underlying predictive model to reflect changes in the environment. Surprise and updating are often correlated in experimental paradigms but are, in fact, distinct constructs that can be formally defined as the Shannon information (I_S) and Kullback–Leibler divergence (D_KL) associated with an observation. In a saccadic planning task, we observed that distinct behaviors and brain regions are associated with surprise/I_S and updating/D_KL. Although surprise/I_S was associated with behavioral reprogramming as indexed by slower reaction times, as well as with activity in the posterior parietal cortex [human lateral intraparietal area (LIP)], the anterior cingulate cortex (ACC) was specifically activated during updating of the predictive model (D_KL). A second saccade-sensitive region in the inferior posterior parietal cortex (human 7a), which has connections to both LIP and ACC, was activated by surprise and modulated by updating. Pupillometry revealed a further dissociation between surprise and updating with an early positive effect of surprise and late negative effect of updating on pupil area. These results give a computational account of the roles of the ACC and two parietal saccade regions, LIP and 7a, by which their involvement in diverse tasks can be understood mechanistically. The dissociation of functional roles between regions within the reorienting/reprogramming network may also inform models of neurological phenomena, such as extinction and Balint syndrome, and neglect. PMID:23986499
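The two constructs can be made concrete with a toy Bayesian observer. The Beta-Bernoulli setup and prior below are assumptions for illustration; the paper's saccadic task used a different generative model.

# Shannon surprise I_S of an observation versus KL divergence D_KL between
# updated and prior beliefs, for a Beta-Bernoulli observer.
import numpy as np
from scipy.special import betaln, digamma

def kl_beta(a1, b1, a0, b0):
    """D_KL( Beta(a1,b1) || Beta(a0,b0) )."""
    return (betaln(a0, b0) - betaln(a1, b1)
            + (a1 - a0) * digamma(a1) + (b1 - b0) * digamma(b1)
            + (a0 - a1 + b0 - b1) * digamma(a1 + b1))

a, b = 8.0, 2.0                 # prior belief: outcome 1 is ~80% likely
obs = 0                         # a surprising observation
p_obs = a / (a + b) if obs else b / (a + b)     # predictive probability
surprise = -np.log(p_obs)                       # I_S, in nats
a_new, b_new = a + obs, b + (1 - obs)
update = kl_beta(a_new, b_new, a, b)            # D_KL of the belief update
print(f"I_S = {surprise:.3f} nats, D_KL = {update:.3f} nats")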
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epiney, A.; Canepa, S.; Zerkak, O.
The STARS project at the Paul Scherrer Institut (PSI) has adopted the TRACE thermal-hydraulic (T-H) code for best-estimate system transient simulations of the Swiss Light Water Reactors (LWRs). For analyses involving interactions between system and core, a coupling of TRACE with the SIMULATE-3K (S3K) LWR core simulator has also been developed. In this configuration, the TRACE code and associated nuclear power reactor simulation models play a central role in achieving a comprehensive safety analysis capability. Thus, efforts have now been undertaken to consolidate the validation strategy by implementing a more rigorous and structured assessment approach for TRACE applications involving either only system T-H evaluations or requiring interfaces to e.g. detailed core or fuel behavior models. The first part of this paper presents the preliminary concepts of this validation strategy. The principle is to systematically track the evolution of a given set of predicted physical Quantities of Interest (QoIs) over a multidimensional parametric space where each of the dimensions represents the evolution of specific analysis aspects, including e.g. code version, transient-specific simulation methodology, and model "nodalisation". If properly set up, such an environment should provide code developers and code users with persistent (less affected by user effect) and quantified information (sensitivity of QoIs) on the applicability of a simulation scheme (codes, input models, methodology) for steady-state and transient analysis of full LWR systems. Through this, for each given transient/accident, critical paths of the validation process can be identified that could then translate into defining reference schemes to be applied for downstream predictive simulations. In order to illustrate this approach, the second part of this paper presents a first application of this validation strategy to an inadvertent blowdown event that occurred in a Swiss BWR/6. The transient was initiated by the spurious actuation of the Automatic Depressurization System (ADS). The validation approach progresses through a number of dimensions here: First, the same BWR system simulation model is assessed for different versions of the TRACE code, up to the most recent one. The second dimension is the "nodalisation" dimension, where changes to the input model are assessed. The third dimension is the "methodology" dimension; in this case, imposed power and an updated TRACE core model are investigated. For each step in each validation dimension, a common set of QoIs is investigated. For the steady-state results, these include fuel temperature distributions. For the transient part of the present study, the evaluated QoIs include the system pressure evolution and water carry-over into the steam line.
Chemical transport model simulations of organic aerosol in ...
Gasoline- and diesel-fueled engines are ubiquitous sources of air pollution in urban environments. They emit both primary particulate matter and precursor gases that react to form secondary particulate matter in the atmosphere. In this work, we updated the organic aerosol module and organic emissions inventory of a three-dimensional chemical transport model, the Community Multiscale Air Quality Model (CMAQ), using recent, experimentally derived inputs and parameterizations for mobile sources. The updated model included a revised volatile organic compound (VOC) speciation for mobile sources and secondary organic aerosol (SOA) formation from unspeciated intermediate volatility organic compounds (IVOCs). The updated model was used to simulate air quality in southern California during May and June 2010, when the California Research at the Nexus of Air Quality and Climate Change (CalNex) study was conducted. Compared to the Traditional version of CMAQ, which is commonly used for regulatory applications, the updated model did not significantly alter the predicted organic aerosol (OA) mass concentrations but did substantially improve predictions of OA sources and composition (e.g., POA–SOA split), as well as ambient IVOC concentrations. The updated model, despite substantial differences in emissions and chemistry, performed similarly to a recently released research version of CMAQ (Woody et al., 2016) that did not include the updated VOC and IVOC emissions and SOA data.
Brusaferro, S; Cookson, B; Kalenic, S; Cooper, T; Fabry, J; Gallagher, R; Hartemann, P; Mannerquist, K; Popp, W; Privitera, G; Ruef, C; Viale, P; Coiz, F; Fabbro, E; Suetens, C; Varela Santos, C
2014-12-11
The harmonisation of training programmes for infection control and hospital hygiene (IC/HH) professionals in Europe is a requirement of the Council recommendation on patient safety. The European Centre for Disease Prevention and Control commissioned the 'Training Infection Control in Europe' project to develop a consensus on core competencies for IC/HH professionals in the European Union (EU). Core competencies were drafted on the basis of the Improving Patient Safety in Europe (IPSE) project's core curriculum (CC), evaluated by questionnaire and approved by National Representatives (NRs) for IC/HH training. NRs also re-assessed the status of IC/HH training in European countries in 2010 in comparison with the situation before the IPSE CC in 2006. The IPSE CC had been used to develop or update 28 of 51 IC/HH courses. Only 10 of 33 countries offered training and qualification for IC/HH doctors and nurses. The proposed core competencies are structured in four areas and 16 professional tasks at junior and senior level. They form a reference for standardisation of IC/HH professional competencies and support recognition of training initiatives.
NASA Astrophysics Data System (ADS)
Yu, Liuqian; Fennel, Katja; Bertino, Laurent; Gharamti, Mohamad El; Thompson, Keith R.
2018-06-01
Effective data assimilation methods for incorporating observations into marine biogeochemical models are required to improve hindcasts, nowcasts and forecasts of the ocean's biogeochemical state. Recent assimilation efforts have shown that updating model physics alone can degrade biogeochemical fields while only updating biogeochemical variables may not improve a model's predictive skill when the physical fields are inaccurate. Here we systematically investigate whether multivariate updates of physical and biogeochemical model states are superior to only updating either physical or biogeochemical variables. We conducted a series of twin experiments in an idealized ocean channel that experiences wind-driven upwelling. The forecast model was forced with biased wind stress and perturbed biogeochemical model parameters compared to the model run representing the "truth". Taking advantage of the multivariate nature of the deterministic Ensemble Kalman Filter (DEnKF), we assimilated different combinations of synthetic physical (sea surface height, sea surface temperature and temperature profiles) and biogeochemical (surface chlorophyll and nitrate profiles) observations. We show that when biogeochemical and physical properties are highly correlated (e.g., thermocline and nutricline), multivariate updates of both are essential for improving model skill and can be accomplished by assimilating either physical (e.g., temperature profiles) or biogeochemical (e.g., nutrient profiles) observations. In our idealized domain, the improvement is largely due to a better representation of nutrient upwelling, which results in a more accurate nutrient input into the euphotic zone. In contrast, assimilating surface chlorophyll improves the model state only slightly, because surface chlorophyll contains little information about the vertical density structure. We also show that a degradation of the correlation between observed subsurface temperature and nutrient fields, which has been an issue in several previous assimilation studies, can be reduced by multivariate updates of physical and biogeochemical fields.
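As a rough illustration of the multivariate update described above, the following is a minimal deterministic Ensemble Kalman Filter (DEnKF) analysis step in which a joint state vector stacks a physical and a biogeochemical variable, so that assimilating a physical observation also updates the biogeochemical field through the ensemble cross-covariances. All dimensions and values are hypothetical; this follows the standard DEnKF form (mean updated with the full gain, anomalies with half the gain), not the authors' exact implementation. numpy assumed.

import numpy as np

def denkf_analysis(E, H, y, R):
    """One DEnKF analysis step.
    E: (n, m) ensemble of joint [physics; biogeochemistry] states
    H: (p, n) observation operator, y: (p,) observations, R: (p, p) obs errors
    """
    n, m = E.shape
    xm = E.mean(axis=1, keepdims=True)
    A = E - xm                                    # ensemble anomalies
    Pf_Ht = A @ (H @ A).T / (m - 1)               # cross-covariance with obs space
    S = H @ Pf_Ht + R                             # innovation covariance
    K = Pf_Ht @ np.linalg.inv(S)                  # Kalman gain
    xm_a = xm + K @ (y.reshape(-1, 1) - H @ xm)   # update mean with full gain
    A_a = A - 0.5 * K @ (H @ A)                   # update anomalies with half gain
    return xm_a + A_a

# Toy joint state: [temperature, nutrient]; only temperature is observed.
rng = np.random.default_rng(0)
E = np.vstack([rng.normal(10, 1, 30), rng.normal(5, 0.5, 30)])
E[1] += 0.4 * (E[0] - 10)                         # correlated nutrient field
H = np.array([[1.0, 0.0]])
Ea = denkf_analysis(E, H, y=np.array([11.0]), R=np.eye(1) * 0.1)
print(Ea.mean(axis=1))  # nutrient mean shifts too, via the cross-covariance

The toy run makes the paper's central point visible: because temperature and nutrient anomalies covary across the ensemble, a temperature observation alone moves the nutrient estimate.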
Model updating in flexible-link multibody systems
NASA Astrophysics Data System (ADS)
Belotti, R.; Caneva, G.; Palomba, I.; Richiedei, D.; Trevisani, A.
2016-09-01
The dynamic response of flexible-link multibody systems (FLMSs) can be predicted through nonlinear models based on finite elements, to describe the coupling between rigid-body and elastic behaviour. Their accuracy should be as high as possible to synthesize controllers and observers. Model updating based on experimental measurements is hence necessary. By taking advantage of experimental modal analysis, this work proposes a model updating procedure for FLMSs and applies it experimentally to a planar robot. Indeed, several peculiarities of the model of an FLMS should be carefully tackled. On the one hand, nonlinear models of an FLMS should be linearized about static equilibrium configurations. On the other, the experimental mode shapes should be corrected to be consistent with the elastic displacements represented in the model, which are defined with respect to a fictitious moving reference (the equivalent rigid link system). Then, since rotational degrees of freedom are also represented in the model, interpolation of the experimental data should be performed to match the model displacement vector. Model updating has finally been cast as an optimization problem in the presence of bounds on the feasible values, by also adopting methods to improve the numerical conditioning and to compute meaningful updated inertial and elastic parameters.
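The closing step above, casting model updating as a bounded optimization over inertial and elastic parameters, can be sketched generically. The residual below is a hypothetical misfit between measured and model natural frequencies, not the authors' actual formulation; scipy is assumed, and the closed-form model is a stand-in for the linearized FLMS model.

import numpy as np
from scipy.optimize import minimize

f_exp = np.array([12.1, 33.8, 71.5])   # hypothetical measured natural frequencies (Hz)

def model_frequencies(theta):
    # Placeholder for the linearized model evaluated at parameters theta
    # (e.g., a stiffness k and a lumped mass m); hypothetical closed form.
    k, m = theta
    return np.sqrt(k / m) * np.array([1.0, 2.8, 5.9]) / (2 * np.pi)

def residual(theta):
    # Relative misfit between experimental and model modal data.
    return float(np.sum(((model_frequencies(theta) - f_exp) / f_exp) ** 2))

# Bounds keep the updated parameters physically meaningful.
res = minimize(residual, x0=[5e3, 1.0], bounds=[(1e3, 1e5), (0.1, 10.0)],
               method="L-BFGS-B")
print(res.x, res.fun)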
DISPATCH: a numerical simulation framework for the exa-scale era - I. Fundamentals
NASA Astrophysics Data System (ADS)
Nordlund, Åke; Ramsey, Jon P.; Popovas, Andrius; Küffmeier, Michael
2018-06-01
We introduce a high-performance simulation framework that permits the semi-independent, task-based solution of sets of partial differential equations, typically manifesting as updates to a collection of `patches' in space-time. A hybrid MPI/OpenMP execution model is adopted, where work tasks are controlled by a rank-local `dispatcher' which selects, from a set of tasks generally much larger than the number of physical cores (or hardware threads), tasks that are ready for updating. The definition of a task can vary, for example, with some solving the equations of ideal magnetohydrodynamics (MHD), others non-ideal MHD, radiative transfer, or particle motion, and yet others applying particle-in-cell (PIC) methods. Tasks do not have to be grid-based; those that are may use either Cartesian or orthogonal curvilinear meshes. Patches may be stationary or moving. Mesh refinement can be static or dynamic. A feature of decisive importance for the overall performance of the framework is that time-steps are determined and applied locally; this allows potentially large reductions in the total number of updates required in cases when the signal speed varies greatly across the computational domain, and therefore a corresponding reduction in computing time. Another feature is a load balancing algorithm that operates `locally' and aims to simultaneously minimize load and communication imbalance. The framework generally relies on already existing solvers, whose performance is augmented when run under the framework, due to more efficient cache usage, vectorization, local time-stepping, plus near-linear and, in principle, unlimited OpenMP and MPI scaling.
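The dispatcher-plus-local-time-stepping idea can be illustrated schematically: repeatedly advance the task that lags furthest behind in simulation time, each by its own time-step. This is a single-threaded sketch with hypothetical Task fields, not the actual hybrid MPI/OpenMP implementation, and it omits readiness checks against neighbouring patches.

import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Task:
    t: float                             # local simulation time of this patch
    name: str = field(compare=False)
    dt: float = field(compare=False)     # local time-step (signal-speed dependent)

def run(tasks, t_end):
    # Always advance the task that lags furthest behind; with per-task dt,
    # fast regions take many small steps, quiet regions few large ones.
    heap = list(tasks)
    heapq.heapify(heap)
    updates = 0
    while heap[0].t < t_end:
        task = heapq.heappop(heap)
        task.t += task.dt                # stand-in for the actual PDE solver update
        updates += 1
        heapq.heappush(heap, task)
    return updates

tasks = [Task(0.0, "quiet patch", 0.1), Task(0.0, "shocked patch", 0.01)]
print(run(tasks, 1.0))  # ~110 updates, vs. 200 with a single global dt of 0.01

This is the source of the claimed savings: a global time-step would be set by the fastest signal speed anywhere in the domain, forcing quiet regions to take needlessly small steps.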
NASA Astrophysics Data System (ADS)
Wolff, J.; Jankov, I.; Beck, J.; Carson, L.; Frimel, J.; Harrold, M.; Jiang, H.
2016-12-01
It is well known that global and regional numerical weather prediction ensemble systems are under-dispersive, producing unreliable and overconfident ensemble forecasts. Typical approaches to alleviate this problem include the use of multiple dynamic cores, multiple physics suite configurations, or a combination of the two. While these approaches may produce desirable results, they have practical and theoretical deficiencies and are more difficult and costly to maintain. An active area of research that promotes a more unified and sustainable system for addressing the deficiencies in ensemble modeling is the use of stochastic physics to represent model-related uncertainty. Stochastic approaches include Stochastic Parameter Perturbations (SPP), Stochastic Kinetic Energy Backscatter (SKEB), Stochastic Perturbation of Physics Tendencies (SPPT), or some combination of all three. The focus of this study is to assess the model performance within a convection-permitting ensemble at 3-km grid spacing across the Contiguous United States (CONUS) when using stochastic approaches. For this purpose, the test utilized a single physics suite configuration based on the operational High-Resolution Rapid Refresh (HRRR) model, with ensemble members produced by employing stochastic methods. Parameter perturbations were employed in the Rapid Update Cycle (RUC) land surface model and Mellor-Yamada-Nakanishi-Niino (MYNN) planetary boundary layer scheme. Results will be presented in terms of bias, error, spread, skill, accuracy, reliability, and sharpness using the Model Evaluation Tools (MET) verification package. Due to the high level of complexity of running a frequently updating (hourly), high spatial resolution (3 km), large domain (CONUS) ensemble system, extensive high performance computing (HPC) resources were needed to meet this objective. Supercomputing resources were provided through the National Center for Atmospheric Research (NCAR) Strategic Capability (NSC) project support, allowing for a more extensive set of tests over multiple seasons, consequently leading to more robust results. Through the use of these stochastic innovations and powerful supercomputing at NCAR, further insights and advancements in ensemble forecasting at convection-permitting scales will be possible.
A Low-Complexity and High-Performance 2D Look-Up Table for LDPC Hardware Implementation
NASA Astrophysics Data System (ADS)
Chen, Jung-Chieh; Yang, Po-Hui; Lain, Jenn-Kaie; Chung, Tzu-Wen
In this paper, we propose a low-complexity, high-efficiency two-dimensional look-up table (2D LUT) for carrying out the sum-product algorithm in the decoding of low-density parity-check (LDPC) codes. Instead of employing adders for the core operation when updating check node messages, in the proposed scheme, the main term and correction factor of the core operation are successfully merged into a compact 2D LUT. Simulation results indicate that the proposed 2D LUT not only attains close-to-optimal bit error rate performance but also enjoys a low complexity advantage that is suitable for hardware implementation.
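To make the idea concrete: the core check-node operation in sum-product decoding is the "box-plus" combination of two log-likelihood ratios, whose exact form pairs a min term with correction factors. A 2D look-up table quantizes the pair of input magnitudes and stores the combined result, so the decoder replaces adders and transcendental evaluations with one table read. Below is a minimal sketch with hypothetical quantizer parameters (numpy assumed), illustrating the principle rather than the authors' specific table design.

import numpy as np

STEP, LIMIT = 0.125, 8.0                 # hypothetical quantizer step and range
grid = np.arange(0.0, LIMIT, STEP)

def boxplus_exact(a, b):
    # Exact check-node core: min term plus two correction factors.
    return (np.sign(a) * np.sign(b) * np.minimum(abs(a), abs(b))
            + np.log1p(np.exp(-abs(a + b))) - np.log1p(np.exp(-abs(a - b))))

# Precompute the 2D LUT over quantized magnitude pairs (done offline).
LUT = boxplus_exact(grid[:, None], grid[None, :])

def boxplus_lut(a, b):
    # Runtime: quantize the magnitudes, read the table, restore the sign.
    i = min(int(abs(a) / STEP), len(grid) - 1)
    j = min(int(abs(b) / STEP), len(grid) - 1)
    return np.sign(a) * np.sign(b) * LUT[i, j]

print(boxplus_exact(2.3, -1.7), boxplus_lut(2.3, -1.7))  # close agreement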
The four-dimensional data assimilation (FDDA) technique in the Weather Research and Forecasting (WRF) meteorological model has recently undergone an important update from the original version. Previous evaluation results have demonstrated that the updated FDDA approach in WRF pr...
A Diversity 3.0 Update: Are We Moving the Needle Enough?
Nivet, Marc A
2015-12-01
Five years ago, in a previous Academic Medicine Commentary, the author asserted that the move toward health reform and a more equitable health system required a transformation of more than how we finance, deliver, and evaluate health care. It also required a new role for diversity and inclusion as a solution to our problems, rather than continuing to see it as just another problem to be fixed. In this update, the author assesses the collective progress made by the nation's medical schools and teaching hospitals in integrating diversity into their core strategic activities, as well as highlighting areas for continued improvement. The author identifies five new trends in diversity and inclusion within academic medicine: broader definitions of diversity to include lesbian, gay, bisexual, and transgender people and those who have disabilities; elevated roles for diversity leaders in medical school administration; growing use of a holistic approach to evaluating medical school applicants; recognition of diversity and inclusion as a core marker of excellence; and appreciation of the significance of subpopulations within minority and underrepresented groups. More work remains to be done, but institutional initiatives to foster and prioritize diversity and inclusion coupled with national efforts by organizations such as the Association of American Medical Colleges are working to build the capacity of U.S. medical schools and teaching hospitals to move diversity from a peripheral initiative to a core strategy for improving the education of medical students and, ultimately, the care delivered to all of our nation's people.
Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model
NASA Technical Reports Server (NTRS)
Boone, Spencer
2017-01-01
This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.
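A minimal illustration of the statistical-regression idea described above: fit a long-term linear trend to a maneuver parameter and extrapolate it for the next planning cycle. The data and the parameter are entirely hypothetical; numpy assumed.

import numpy as np

# Hypothetical history: maneuver epoch (years) vs. achieved delta-v error (%).
epochs = np.array([2009.5, 2011.4, 2013.2, 2015.1, 2017.0])
dv_err = np.array([1.8, 1.5, 1.4, 1.0, 0.9])

slope, intercept = np.polyfit(epochs, dv_err, deg=1)   # linear trend
predicted = slope * 2019.3 + intercept
print(f"trend: {slope:+.3f} %/yr, predicted error in 2019: {predicted:.2f} %")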
Valence-Dependent Belief Updating: Computational Validation
Kuzmanovic, Bojana; Rigoux, Lionel
2017-01-01
People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on reinforcement learning was superior to the Bayesian approach. The computational validation of valence-dependent belief updating represents a novel support for a genuine optimism bias in human belief formation. Moreover, the precise control of relevant cognitive variables justifies the conclusion that the motivation to adopt the most favorable self-referential conclusions biases human judgments. PMID:28706499
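The reinforcement-learning variant described above can be sketched directly: the belief update is scaled by a learning rate that depends on the valence of the estimation error, with a higher rate for better-than-expected news. A minimal sketch with hypothetical trial values and learning rates, not the authors' fitted model:

def update_risk_estimate(self_risk, base_rate_seen, prior_base_rate,
                         lr_good=0.7, lr_bad=0.4):
    # Estimation error: how the actual base rate differs from the expected one.
    error = base_rate_seen - prior_base_rate
    # Good news (lower-than-expected risk) is incorporated more strongly than
    # bad news -- the valence-dependent asymmetry (lr_good > lr_bad).
    lr = lr_good if error < 0 else lr_bad
    return self_risk + lr * error

# One trial: participant expected a 30% base rate, is shown 20% (good news).
print(update_risk_estimate(self_risk=25.0, base_rate_seen=20.0,
                           prior_base_rate=30.0))   # 18.0: large downward update
# Bad news of the same magnitude produces a smaller upward update:
print(update_risk_estimate(self_risk=25.0, base_rate_seen=40.0,
                           prior_base_rate=30.0))   # 29.0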
Attentional focus affects how events are segmented and updated in narrative reading.
Bailey, Heather R; Kurby, Christopher A; Sargent, Jesse Q; Zacks, Jeffrey M
2017-08-01
Readers generate situation models representing described events, but the nature of these representations may differ depending on the reading goals. We assessed whether instructions to pay attention to different situational dimensions affect how individuals structure their situation models (Exp. 1) and how they update these models when situations change (Exp. 2). In Experiment 1, participants read and segmented narrative texts into events. Some readers were oriented to pay specific attention to characters or space. Sentences containing character or spatial-location changes were perceived as event boundaries-particularly if the reader was oriented to characters or space, respectively. In Experiment 2, participants read narratives and responded to recognition probes throughout the texts. Readers who were oriented to the spatial dimension were more likely to update their situation models at spatial changes; all readers tracked the character dimension. The results from both experiments indicated that attention to individual situational dimensions influences how readers segment and update their situation models. More broadly, the results provide evidence for a global situation model updating mechanism that serves to set up new models at important narrative changes.
Unthank, Michael D.
2013-01-01
The Ohio River alluvial aquifer near Carrollton, Ky., is an important water resource for the cities of Carrollton and Ghent, as well as for several industries in the area. The groundwater of the aquifer is the primary source of drinking water in the region and a highly valued natural resource that attracts various water-dependent industries because of its quantity and quality. This report evaluates the performance of a numerical model of the groundwater-flow system in the Ohio River alluvial aquifer near Carrollton, Ky., published by the U.S. Geological Survey in 1999. The original model simulated conditions in November 1995 and was updated to simulate groundwater conditions estimated for September 2010. The files from the calibrated steady-state model of November 1995 conditions were imported into MODFLOW-2005 to update the model to conditions in September 2010. The model input files modified as part of this update were the well and recharge files. The design of the updated model and the other input files are the same as in the original model. The ability of the updated model to match hydrologic conditions for September 2010 was evaluated by comparing water levels measured in wells to those computed by the model. Water-level measurements were available for 48 wells in September 2010. Overall, the updated model underestimated the water levels at 36 of the 48 measured wells. The average difference between measured water levels and model-computed water levels was 3.4 feet and the maximum difference was 10.9 feet. The root-mean-square error of the simulation was 4.45 feet for all 48 measured water levels. The updated steady-state model could be improved by introducing more accurate and site-specific estimates of selected field parameters, refined model geometry, and additional numerical methods. Collection of field data to better estimate hydraulic parameters, together with continued review of available data and information from area well operators, could provide the model with revised estimates of conductance values for the riverbed and valley wall, hydraulic conductivities for the model layer, and target water levels for future simulations. Additional model layers, a redesigned model grid, and revised boundary conditions could provide a better framework for more accurate simulations. Additional numerical methods would identify possible parameter estimates and determine parameter sensitivities.
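The skill statistics quoted above (mean difference, maximum absolute difference, and root-mean-square error over the measured wells) are straightforward to reproduce for any set of paired levels. A minimal sketch with hypothetical water levels; numpy assumed.

import numpy as np

def model_skill(measured, simulated):
    diff = measured - simulated
    return {"mean_diff_ft": float(np.mean(diff)),
            "max_abs_diff_ft": float(np.max(np.abs(diff))),
            "rmse_ft": float(np.sqrt(np.mean(diff ** 2)))}

# Hypothetical measured vs. model-computed levels at a few wells (feet).
measured = np.array([428.3, 431.0, 425.7, 429.9])
simulated = np.array([425.1, 429.8, 424.0, 426.2])
print(model_skill(measured, simulated))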
Strosberg, David S; Quinn, Kristen M; Abdel-Misih, Sherif R; Harzman, Alan E
2018-04-01
Our objective was to investigate the number of, and classify, surgical operations performed by general surgery residents and compare these with the updated Surgical Council on Resident Education (SCORE) curriculum. We performed a retrospective review of logged surgical cases from general surgical residents who completed training at a single center from 2011 to 2015. The logged cases were correlated with the operations extracted from the SCORE curriculum. One hundred fifty-one procedures were examined; there were 98 "core" and 53 "advanced" cases as determined by the SCORE. Twenty-eight residents graduated with an average of 1017 major cases. Each resident completed 66 (67%) core cases and 17 (32%) advanced cases an average of one or more times, with 39 (40%) core cases and 6 (11%) advanced cases completed five or more times. Core procedures that are infrequently or never performed by residents should be identified in each program to focus resident education.
Dekmezian, Mhair; Beal, Stacy G; Damashek, Mary Jane; Benavides, Raul; Dhiman, Neelam
2015-04-01
Successful performance and execution of rapid diagnostics in a clinical laboratory hinges heavily on careful validation, accurate and timely communication of results, and real-time quality monitoring. Laboratories must develop strategies to integrate diagnostics with stewardship and evidence-based clinical practice guidelines. We present a collaborative SUCCESS model for execution and monitoring of rapid sepsis diagnostics to facilitate timely treatment. Six months after execution of the Verigene Gram-Positive Blood Culture (BC-GP) and the AdvanDx PNA-FISH assays, data were collected on 579 and 28 episodes of bacteremia and fungemia, respectively. Clinical testing was executed using a SUCCESS model comprising the following components: stewardship, utilization of resources, core strategies, concierge services, education, support, and surveillance. Stewardship needs were identified by evaluating the specialty services benefiting from new testing. Utilization of resources was optimized by reviewing current treatment strategies and antibiogram and formulary options. Core strategies consisted of input from infectious disease leadership, pharmacy, and laboratory staff. Concierge services included automated Micro-eUpdate and physician-friendly actionable reports. Education modules were user-specific, and support was provided through a dedicated 24/7 microbiology hotline. Surveillance was performed by daily audit by the director. Using the SUCCESS model, the turnaround time for the detailed report with actionable guidelines to the physician was ∼3 hours from the time of culture positivity. The overall correlation between rapid methods and culture was 94% (546/579). Discrepant results were predominantly contaminants such as a coagulase-negative staphylococci or viridans streptococci in mixed cultures. SUCCESS is a cost-effective and easily adaptable model for clinical laboratories with limited stewardship resources.
Updating national standards for drinking-water: a Philippine experience.
Lomboy, M; Riego de Dios, J; Magtibay, B; Quizon, R; Molina, V; Fadrilan-Camacho, V; See, J; Enoveso, A; Barbosa, L; Agravante, A
2017-04-01
The latest version of the Philippine National Standards for Drinking-Water (PNSDW) was issued in 2007 by the Department of Health (DOH). Due to several issues and concerns, the DOH decided to make an update which is relevant and necessary to meet the needs of the stakeholders. As an output, the water quality parameters are now categorized into mandatory, primary, and secondary. The ten mandatory parameters are core parameters which all water service providers nationwide are obligated to test. These include thermotolerant coliforms or Escherichia coli, arsenic, cadmium, lead, nitrate, color, turbidity, pH, total dissolved solids, and disinfectant residual. The 55 primary parameters are site-specific and can be adopted as enforceable parameters when developing new water sources or when the existing source is at high risk of contamination. The 11 secondary parameters include operational parameters and those that affect the esthetic quality of drinking-water. In addition, the updated PNSDW include new sections: (1) reporting and interpretation of results and corrective actions; (2) emergency drinking-water parameters; (3) proposed Sustainable Development Goal parameters; and (4) standards for other drinking-water sources. The lessons learned and insights gained from the updating of standards are likewise incorporated in this paper.
Delayed Slater determinant update algorithms for high efficiency quantum Monte Carlo
McDaniel, Tyler; D’Azevedo, Ed F.; Li, Ying Wai; ...
2017-11-07
Within ab initio Quantum Monte Carlo simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunction. Each Monte Carlo step requires finding the determinant of a dense matrix. This is most commonly iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. The overall computational cost is therefore formally cubic in the number of electrons or matrix size. To improve the numerical efficiency of this procedure, we propose a novel multiple rank delayed update scheme. This strategy enables probability evaluation with application of accepted moves to the matrices delayed until after a predetermined number of moves, K. The accepted events are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency via matrix-matrix operations instead of matrix-vector operations. This procedure does not change the underlying Monte Carlo sampling or its statistical efficiency. For calculations on large systems and algorithms such as diffusion Monte Carlo where the acceptance ratio is high, order of magnitude improvements in the update time can be obtained on both multi-core CPUs and GPUs.
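A minimal sketch of the delayed-update idea: accumulate K accepted rank-1 row replacements and apply them to the inverse en bloc with a Woodbury-type matrix-matrix update, instead of K separate Sherman-Morrison matrix-vector updates. This illustrates the linear algebra only; the acceptance logic, determinant ratios, and the actual QMC sampling are omitted, and the matrix values are random placeholders. numpy assumed.

import numpy as np

rng = np.random.default_rng(1)
N, K = 8, 3
A = rng.normal(size=(N, N))
Ainv = np.linalg.inv(A)

# Pending block of K accepted single-row replacements (distinct rows here).
rows = [0, 3, 5]
new_rows = rng.normal(size=(K, N))
U = np.eye(N)[:, rows]                        # e_i columns selecting the rows
V = (new_rows - A[rows]).T                    # (new row - old row) columns

# En-bloc Woodbury update of (A + U V^T)^{-1}: matrix-matrix operations only,
# plus one small K x K inverse.
AinvU = Ainv @ U
C = np.linalg.inv(np.eye(K) + V.T @ AinvU)
Ainv_updated = Ainv - AinvU @ C @ (V.T @ Ainv)

A_new = A.copy()
A_new[rows] = new_rows
print(np.allclose(Ainv_updated, np.linalg.inv(A_new)))   # True

The payoff is arithmetic intensity: the K delayed updates become a few dense matrix-matrix products, which use caches and wide execution units far better than K dependent matrix-vector sweeps.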
UEDGE Simulations for Power and Particle Flow Analysis of FRC Rocket
NASA Astrophysics Data System (ADS)
Zheng, Fred; Evans, Eugene S.; McGreivy, Nick; Kaptanoglu, Alan; Izacard, Olivier; Cohen, Samuel A.
2017-10-01
The field-reversed configuration (FRC) is under consideration for use in a direct fusion drive (DFD) rocket propulsion system for future space missions. To achieve a rocket configuration, the FRC is embedded within an asymmetric magnetic mirror, in which one end is closed and contains a gas box, and the other end is open and incorporates a magnetic nozzle. Neutral deuterium is injected into the gas box, and flows through the scrape-off layer (SOL) around the core plasma and out the magnetic nozzle, both cooling the core and serving as propellant. Previous studies have examined a range of operating conditions for the SOL of a DFD using UEDGE, a 2D fluid code; discrepancies on the order of 5% were found during the analysis of overall power balance. This work extends the analysis of the previously-studied SOL geometry by updating boundary conditions and conducting a detailed study of power and particle flows within the simulation with the goals of modeling electrical power generation instead of thrust and achieving higher specific impulse. This work was supported, in part, by DOE Contract Number DE-AC02-09CH11466 and Princeton Environmental Institute.
NASA Astrophysics Data System (ADS)
Goswami, M.; O'Connor, K. M.; Shamseldin, A. Y.
The "Galway Real-Time River Flow Forecasting System" (GFFS) is a software pack- age developed at the Department of Engineering Hydrology, of the National University of Ireland, Galway, Ireland. It is based on a selection of lumped black-box and con- ceptual rainfall-runoff models, all developed in Galway, consisting primarily of both the non-parametric (NP) and parametric (P) forms of two black-box-type rainfall- runoff models, namely, the Simple Linear Model (SLM-NP and SLM-P) and the seasonally-based Linear Perturbation Model (LPM-NP and LPM-P), together with the non-parametric wetness-index-based Linearly Varying Gain Factor Model (LVGFM), the black-box Artificial Neural Network (ANN) Model, and the conceptual Soil Mois- ture Accounting and Routing (SMAR) Model. Comprised of the above suite of mod- els, the system enables the user to calibrate each model individually, initially without updating, and it is capable also of producing combined (i.e. consensus) forecasts us- ing the Simple Average Method (SAM), the Weighted Average Method (WAM), or the Artificial Neural Network Method (NNM). The updating of each model output is achieved using one of four different techniques, namely, simple Auto-Regressive (AR) updating, Linear Transfer Function (LTF) updating, Artificial Neural Network updating (NNU), and updating by the Non-linear Auto-Regressive Exogenous-input method (NARXM). The models exhibit a considerable range of variation in degree of complexity of structure, with corresponding degrees of complication in objective func- tion evaluation. Operating in continuous river-flow simulation and updating modes, these models and techniques have been applied to two Irish catchments, namely, the Fergus and the Brosna. A number of performance evaluation criteria have been used to comparatively assess the model discharge forecast efficiency.
NASA Technical Reports Server (NTRS)
Burns, Lee; Decker, Ryan; Harrington, Brian; Merry, Carl
2008-01-01
The Kennedy Space Center (KSC) Range Reference Atmosphere (RRA) is a statistical model that summarizes wind and thermodynamic atmospheric variability from the surface to 70 km. The National Aeronautics and Space Administration's (NASA) Space Shuttle program, which launches from KSC, utilizes the KSC RRA data to evaluate environmental constraints on various aspects of the vehicle during ascent. An update to the KSC RRA was recently completed. As part of the update, the Natural Environments Branch at NASA's Marshall Space Flight Center (MSFC) conducted a validation study and a comparison analysis against the existing KSC RRA database version 1983. Assessments of the Space Shuttle vehicle ascent profile characteristics were performed by the JSC/Ascent Flight Design Division to determine impacts of the updated model on vehicle performance. Details on the model updates and the vehicle sensitivity analyses with the updated model are presented.
ERIC Educational Resources Information Center
Silver, Edward A., Ed.; Kenney, Patricia Ann, Ed.
2015-01-01
This book's 28 chapters are adapted and updated from articles published in NCTM's "Journal for Research in Mathematics Education" between 2000 and 2010. The authors have rewritten and revised their work to make it clear, understandable, and--most of all--useful for mathematics teachers today. To help teachers even more, these articles…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
... not listed on the Web site, but should note that the NRC's E-Filing system does not support unlisted... (COLR), to update the methodology reference list to support the core design with the new AREVA fuel... methodologies listed in Technical Specification 5.7.1.5 has no impact on any plant configuration or system...
The Internet Resource Directory for K-12 Teachers and Librarians, 95/96 Edition.
ERIC Educational Resources Information Center
Miller, Elizabeth B.
This directory is the second in an annual series of Internet guides for educators and librarians, and provides tips on access to, as well as addresses for, online resources that support the K-12 curriculum and supplement school library core collections. The listings in the catalog are limited to free and frequently updated resources; over 300 new…
Annual Report on Our Call to Action: Strategic Plan for the Montgomery County Public Schools
ERIC Educational Resources Information Center
Montgomery County Public Schools, 2004
2004-01-01
In June 2003 the Board of Education adopted "Our Call to Action, Pursuit of Excellence," the second edition of the school system's strategic plan. This update of the original November 1999 Our Call to Action, while remaining focused on the core mission of providing every student with a high-quality, world-class education, strengthened the…
SIGMA Release v1.2 - Capabilities, Enhancements and Fixes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahadevan, Vijay; Grindeanu, Iulian R.; Ray, Navamita
In this report, we present details on the SIGMA toolkit, including its component structure, capabilities, feature additions in FY15, release cycles, and continuous integration process. These software processes, along with updated documentation, are imperative for successfully integrating and utilizing the toolkit in several applications, including the SHARP coupled analysis toolkit for reactor core systems funded under the DOE-NE NEAMS program.
ERIC Educational Resources Information Center
North Carolina State Dept. of Public Instruction, Raleigh.
This document is designed for use by teachers of Agricultural Production and Management courses in North Carolina. It updates the competencies and content outlines from the previous guide. It lists core and optional competencies for two courses in seven areas as follows: leadership; supervised agricultural experience programs; animal science;…
Singer, Susanne; Araújo, Cláudia; Arraras, Juan Ignacio; Baumann, Ingo; Boehm, Andreas; Brokstad Herlofson, Bente; Castro Silva, Joaquim; Chie, Wei-Chu; Fisher, Sheila; Guntinas-Lichius, Orlando; Hammerlid, Eva; Irarrázaval, María Elisa; Jensen Hjermstad, Marianne; Jensen, Kenneth; Kiyota, Naomi; Licitra, Lisa; Nicolatou-Galitis, Ourania; Pinto, Monica; Santos, Marcos; Schmalz, Claudia; Sherman, Allen C; Tomaszewska, Iwona M; Verdonck de Leeuw, Irma; Yarom, Noam; Zotti, Paola; Hofmeister, Dirk
2015-09-01
The objective of this study was to pilot test an updated version of the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire Head and Neck Module (EORTC QLQ-H&N60). Patients with head and neck cancer were asked to complete a list of 60 head and neck cancer-specific items comprising the updated EORTC head and neck module and the core questionnaire EORTC QLQ-C30. Debriefing interviews were conducted to identify any irrelevant items and confusing or upsetting wording. Interviews were performed with 330 patients from 17 countries, representing different head and neck cancer sites and treatments. Forty-one of the 60 items were retained according to the predefined EORTC criteria for module development, for another 2 items the wording was refined, and 17 items were removed. The preliminary EORTC QLQ-H&N43 can now be used in academic research. Psychometrics will be tested in a larger field study. © 2014 Wiley Periodicals, Inc.
Key algorithms used in GR02: A computer simulation model for predicting tree and stand growth
Garrett A. Hughes; Paul E. Sendak; Paul E. Sendak
1985-01-01
GR02 is an individual tree, distance-independent simulation model for predicting tree and stand growth over time. It performs five major functions during each run: (1) updates diameter at breast height, (2) updates total height, (3) estimates mortality, (4) determines regeneration, and (5) updates crown class.
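The five functions listed above suggest the shape of the main loop of a GR02-style simulator. The following is a schematic sketch only, with hypothetical growth and mortality rules, showing how one pass over the tree list performs the five updates:

import random
from dataclasses import dataclass

@dataclass
class Tree:
    dbh: float          # diameter at breast height (in)
    height: float       # total height (ft)
    crown_class: str    # e.g. "dominant" or "intermediate"

def grow_one_period(stand):
    survivors = []
    for tree in stand:
        tree.dbh += 0.3 if tree.crown_class == "dominant" else 0.15   # (1) update dbh
        tree.height += 1.2                                            # (2) update height
        if random.random() < 0.02:                                    # (3) mortality
            continue
        tree.crown_class = ("dominant" if tree.dbh > 12.0             # (5) crown class
                            else "intermediate")
        survivors.append(tree)
    # (4) regeneration: hypothetical fixed ingrowth each period
    survivors += [Tree(1.0, 6.0, "intermediate") for _ in range(2)]
    return survivors

stand = [Tree(10.0, 55.0, "dominant"), Tree(6.0, 40.0, "intermediate")]
stand = grow_one_period(stand)
print(len(stand), stand[0])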
NASA Astrophysics Data System (ADS)
Lifton, N. A.
2014-12-01
A recently published cosmogenic nuclide production rate scaling model based on analytical fits to Monte Carlo simulations of atmospheric cosmic ray flux spectra (both of which agree well with measured spectra) (Lifton et al., 2014, Earth Planet. Sci. Lett. 386, 149-160; termed the LSD model) provides two main advantages over previous scaling models: identification and quantification of potential sources of bias in the earlier models, and the ability to generate nuclide-specific scaling factors easily for a wide range of input parameters. The new model also provides a flexible framework for exploring the implications of advances in model inputs. In this work, the scaling implications of two recent time-dependent spherical harmonic geomagnetic models spanning the Holocene will be explored. Korte and Constable (2011, Phys. Earth Planet. Int. 188, 247-259) and Korte et al. (2011, Earth Planet. Sci. Lett. 312, 497-505) recently updated earlier spherical harmonic paleomagnetic models used by Lifton et al. (2014) with paleomagnetic measurements from sediment cores in addition to archeomagnetic and volcanic data. These updated models offer improved accuracy over the previous versions, in part due to increased temporal and spatial data coverage. With the new models as input, trajectory-traced estimates of effective vertical cutoff rigidity (RC, the standard method for ordering cosmic ray data) yield significantly different time-integrated scaling predictions when compared to the earlier models. These results will be compared to scaling predictions using another recent time-dependent spherical harmonic model of the Holocene geomagnetic field by Pavón-Carrasco et al. (2014, Earth Planet. Sci. Lett. 388, 98-109), based solely on archeomagnetic and volcanic paleomagnetic data, but extending to 14 ka. In addition, the potential effects of time-dependent atmospheric models on LSD scaling predictions will be presented. Given the typical dominance of altitudinal over latitudinal scaling effects on cosmogenic nuclide production, incorporating transient global simulations of atmospheric structure (e.g., Liu et al., 2009, Science 325, 310-314) into scaling frameworks may contribute to improved understanding of long-term production rate variations.
User's guide to the MESOI diffusion model and to the utility programs UPDATE and LOGRVU
DOE Office of Scientific and Technical Information (OSTI.GOV)
Athey, G.F.; Allwine, K.J.; Ramsdell, J.V.
MESOI is an interactive, Lagrangian puff trajectory diffusion model. The model is documented separately (Ramsdell and Athey, 1981); this report is intended to provide MESOI users with the information needed to successfully conduct model simulations. The user is also provided with guidance in the use of the data file maintenance and review programs, UPDATE and LOGRVU. Complete examples are given for the operation of all three programs, and an appendix documents UPDATE and LOGRVU.
Basis for the ICRP’s updated biokinetic model for carbon inhaled as CO2
Leggett, Richard W.
2017-03-02
Here, the International Commission on Radiological Protection (ICRP) is updating its biokinetic and dosimetric models for occupational intake of radionuclides (OIR) in a series of reports called the OIR series. This paper describes the basis for the ICRP's updated biokinetic model for inhalation of radiocarbon as carbon dioxide (CO2) gas. The updated model is based on biokinetic data for carbon isotopes inhaled as carbon dioxide or injected or ingested as bicarbonate (HCO3-). The data from these studies are expected to apply equally to internally deposited (or internally produced) carbon dioxide and bicarbonate, based on comparison of excretion rates for the two administered forms and the fact that carbon dioxide and bicarbonate are largely carried in a common form (CO2-HCO3-) in blood. Compared with dose estimates based on current ICRP biokinetic models for inhaled carbon dioxide or ingested carbon, the updated model will result in a somewhat higher dose estimate for 14C inhaled as CO2 and a much lower dose estimate for 14C ingested as bicarbonate.
Barnes, Marcia A.; Raghubar, Kimberly P.; Faulkner, Heather; Denton, Carolyn A.
2014-01-01
Readers construct mental models of situations described by text to comprehend what they read, updating these situation models based on explicitly described and inferred information about causal, temporal, and spatial relations. Fluent adult readers update their situation models while reading narrative text based in part on spatial location information that is consistent with the perspective of the protagonist. The current study investigates whether children update spatial situation models in a similar way, whether there are age-related changes in children's formation of spatial situation models during reading, and whether measures of the ability to construct and update spatial situation models are predictive of reading comprehension. Typically-developing children from ages 9 through 16 years (n=81) were familiarized with a physical model of a marketplace. Then the model was covered, and children read stories that described the movement of a protagonist through the marketplace and were administered items requiring memory for both explicitly stated and inferred information about the character's movements. Accuracy of responses and response times were evaluated. Results indicated that: (a) location and object information during reading appeared to be activated and updated not simply from explicit text-based information but from a mental model of the real world situation described by the text; (b) this pattern showed no age-related differences; and (c) the ability to update the situation model of the text based on inferred information, but not explicitly stated information, was uniquely predictive of reading comprehension after accounting for word decoding. PMID:24315376
Human Ageing Genomic Resources: new and updated databases
Tacutu, Robi; Thornton, Daniel; Johnson, Emily; Budovsky, Arie; Barardo, Diogo; Craig, Thomas; Diana, Eugene; Lehmann, Gilad; Toren, Dmitri; Wang, Jingwei; Fraifeld, Vadim E
2018-01-01
In spite of a growing body of research and data, human ageing remains a poorly understood process. Over 10 years ago we developed the Human Ageing Genomic Resources (HAGR), a collection of databases and tools for studying the biology and genetics of ageing. Here, we present HAGR’s main functionalities, highlighting new additions and improvements. HAGR consists of six core databases: (i) the GenAge database of ageing-related genes, in turn composed of a dataset of >300 human ageing-related genes and a dataset with >2000 genes associated with ageing or longevity in model organisms; (ii) the AnAge database of animal ageing and longevity, featuring >4000 species; (iii) the GenDR database with >200 genes associated with the life-extending effects of dietary restriction; (iv) the LongevityMap database of human genetic association studies of longevity with >500 entries; (v) the DrugAge database with >400 ageing or longevity-associated drugs or compounds; (vi) the CellAge database with >200 genes associated with cell senescence. All our databases are manually curated by experts and regularly updated to ensure high-quality data. Cross-links across our databases and to external resources help researchers locate and integrate relevant information. HAGR is freely available online (http://genomics.senescence.info/). PMID:29121237
A groundwater data assimilation application study in the Heihe mid-reach
NASA Astrophysics Data System (ADS)
Ragettli, S.; Marti, B. S.; Wolfgang, K.; Li, N.
2017-12-01
The present work focuses on modelling of the groundwater flow in the mid-reach of the endorheic river Heihe in the Zhangye oasis (Gansu province) in arid north-west China. In order to optimise water resources management in the oasis, reliable forecasts of groundwater level development under different management options and environmental boundary conditions have to be produced. To this end, groundwater flow is modelled with Modflow and coupled to an Ensemble Kalman Filter programmed in Matlab. The model is updated with monthly time steps, featuring perturbed boundary conditions to account for uncertainty in model forcing. Constant biases between model and observations were corrected prior to updating and compared to model runs without bias correction. Different options for data assimilation (states and/or parameters), updating frequency, and measures against filter inbreeding (damping factor, covariance inflation, spatial localization) have been tested against each other. Results show a high dependency of the Ensemble Kalman Filter performance on the selection of observations for data assimilation. For the present regional model, bias correction is necessary for good filter performance. A combination of spatial localization and covariance inflation is further advisable to reduce filter inbreeding problems. Best performance is achieved if parameter updates are not large, an indication of good prior model calibration. Asynchronous updating of parameter values once every five years (with data of the past five years) and synchronous updating of the groundwater levels is better suited for this groundwater system, whose parameter values change slowly if at all, than synchronous updating of both groundwater levels and parameters at every time step with a damping factor. The filter is not able to correct time lags of signals.
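The filter safeguards mentioned above, a damping factor on parameter updates and multiplicative covariance inflation on the ensemble, are simple to express. A minimal sketch with hypothetical values, independent of the Modflow/Matlab implementation described; numpy assumed.

import numpy as np

def inflate(ensemble, factor=1.05):
    # Multiplicative covariance inflation: stretch anomalies about the mean
    # to counteract the variance loss that drives filter inbreeding.
    mean = ensemble.mean(axis=1, keepdims=True)
    return mean + factor * (ensemble - mean)

def damped_parameter_update(params, kalman_increment, damping=0.3):
    # Apply only a fraction of the analysis increment to the parameters,
    # keeping parameter updates small when the prior calibration is good.
    return params + damping * kalman_increment

rng = np.random.default_rng(2)
ens = rng.normal(0.0, 1.0, size=(4, 50))                    # 4 states x 50 members
print(np.var(inflate(ens), axis=1) / np.var(ens, axis=1))   # ~1.05**2 each
print(damped_parameter_update(np.array([1e-4]), np.array([2e-5])))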
The uploaded data consist of the BRACE Na aerosol observations paired with CMAQ model output, the updated model's parameterization of sea salt aerosol emission size distribution, and the model's parameterization of the sea salt emission factor as a function of sea surface temperature. This dataset is associated with the following publication: Gantt, B., J. Kelly, and J. Bash. Updating sea spray aerosol emissions in the Community Multiscale Air Quality (CMAQ) model version 5.0.2. Geoscientific Model Development. Copernicus Publications, Katlenburg-Lindau, GERMANY, 8: 3733-3746, (2015).
A functional model of sensemaking in a neurocognitive architecture.
Lebiere, Christian; Pirolli, Peter; Thomson, Robert; Paik, Jaehyon; Rutledge-Taylor, Matthew; Staszewski, James; Anderson, John R
2013-01-01
Sensemaking is the active process of constructing a meaningful representation (i.e., making sense) of some complex aspect of the world. In relation to intelligence analysis, sensemaking is the act of finding and interpreting relevant facts amongst the sea of incoming reports, images, and intelligence. We present a cognitive model of core information-foraging and hypothesis-updating sensemaking processes applied to complex spatial probability estimation and decision-making tasks. While the model was developed in a hybrid symbolic-statistical cognitive architecture, its correspondence to neural frameworks in terms of both structure and mechanisms provided a direct bridge between rational and neural levels of description. Compared against data from two participant groups, the model correctly predicted both the presence and degree of four biases: confirmation, anchoring and adjustment, representativeness, and probability matching. It also favorably predicted human performance in generating probability distributions across categories, assigning resources based on these distributions, and selecting relevant features given a prior probability distribution. This model provides a constrained theoretical framework describing cognitive biases as arising from three interacting factors: the structure of the task environment, the mechanisms and limitations of the cognitive architecture, and the use of strategies to adapt to the dual constraints of cognition and the environment.
Rusin; Hall; Nichol; Marlow; Richards; Myers
2000-04-20
We present adaptive optics imaging of the CLASS gravitational lens system B1359+154 obtained with the Canada-France-Hawaii Telescope (CFHT) in the infrared K band. The observations show at least three brightness peaks within the ring of lensed images, which we identify as emission from multiple lensing galaxies. The results confirm the suspected compound nature of the lens, as deduced from preliminary mass modeling. The detection of several additional nearby galaxies suggests that B1359+154 is lensed by the compact core of a small galaxy group. We attempted to produce an updated lens model based on the CFHT observations and new 5 GHz radio data obtained with the MERLIN array, but there are too few constraints to construct a realistic model at this time. The uncertainties inherent with modeling compound lenses make B1359+154 a challenging target for Hubble constant determination through the measurement of differential time delays. However, time delays will offer additional constraints to help pin down the mass model. This lens system therefore presents a unique opportunity to directly measure the mass distribution of a galaxy group at intermediate redshift.
DTU candidate field models for IGRF-12 and the CHAOS-5 geomagnetic field model
NASA Astrophysics Data System (ADS)
Finlay, Christopher C.; Olsen, Nils; Tøffner-Clausen, Lars
2015-07-01
We present DTU's candidate field models for IGRF-12 and the parent field model from which they were derived, CHAOS-5. Ten months of magnetic field observations from ESA's Swarm mission, together with up-to-date ground observatory monthly means, were used to supplement the data sources previously used to construct CHAOS-4. The internal field part of CHAOS-5, from which our IGRF-12 candidate models were extracted, is time-dependent up to spherical harmonic degree 20 and involves sixth-order splines with a 0.5 year knot spacing. In CHAOS-5, compared with CHAOS-4, we update only the low-degree internal field model (degrees 1 to 24) and the associated external field model. The high-degree internal field (degrees 25 to 90) is taken from the same model CHAOS-4h, based on low-altitude CHAMP data, which was used in CHAOS-4. We find that CHAOS-5 is able to consistently fit magnetic field data from six independent low Earth orbit satellites: Ørsted, CHAMP, SAC-C and the three Swarm satellites (A, B and C). It also adequately describes the secular variation measured at ground observatories. CHAOS-5 thus contributes to an initial validation of the quality of the Swarm magnetic data, in particular demonstrating that Huber weighted rms model residuals to Swarm vector field data are lower than those to Ørsted and CHAMP vector data (when either one or two star cameras were operating). CHAOS-5 shows three pulses of secular acceleration at the core surface over the past decade; the 2006 and 2009 pulses have previously been documented, but the 2013 pulse has only recently been identified. The spatial signature of the 2013 pulse at the core surface, under the Atlantic sector where it is strongest, is well correlated with the 2006 pulse, but anti-correlated with the 2009 pulse.
Identifying western yellow-billed cuckoo breeding habitat with a dual modelling approach
Johnson, Matthew J.; Hatten, James R.; Holmes, Jennifer A.; Shafroth, Patrick B.
2017-01-01
The western population of the yellow-billed cuckoo (Coccyzus americanus) was recently listed as threatened under the federal Endangered Species Act. Yellow-billed cuckoo conservation efforts require the identification of features and area requirements associated with high quality, riparian forest habitat at spatial scales that range from nest microhabitat to landscape, as well as lower-suitability areas that can be enhanced or restored. Spatially explicit models inform conservation efforts by increasing ecological understanding of a target species, especially at landscape scales. Previous yellow-billed cuckoo modelling efforts derived plant-community maps from aerial photography, an expensive and oftentimes inconsistent approach. Satellite models can remotely map vegetation features (e.g., vegetation density, heterogeneity in vegetation density or structure) across large areas with near perfect repeatability, but they usually cannot identify plant communities. We used aerial photos and satellite imagery, and a hierarchical spatial scale approach, to identify yellow-billed cuckoo breeding habitat along the Lower Colorado River and its tributaries. Aerial-photo and satellite models identified several key features associated with yellow-billed cuckoo breeding locations: (1) a 4.5 ha core area of dense cottonwood-willow vegetation, (2) a large native, heterogeneously dense forest (72 ha) around the core area, and (3) moderately rough topography. The odds of yellow-billed cuckoo occurrence decreased rapidly as the amount of tamarisk cover increased or when cottonwood-willow vegetation was limited. We achieved model accuracies of 75–80% in the project area in the following year, after updating the imagery and location data. The two model types had very similar probability maps, largely predicting the same areas as high quality habitat. While each model provided unique information, a dual-modelling approach provided a more complete picture of yellow-billed cuckoo habitat requirements and will be useful for management and conservation activities.
NASA Astrophysics Data System (ADS)
Hipp, J. R.; Encarnacao, A.; Ballard, S.; Young, C. J.; Phillips, W. S.; Begnaud, M. L.
2011-12-01
Recently, our combined SNL-LANL research team has succeeded in developing a global, seamless 3D tomographic P-velocity model (SALSA3D) that provides superior first-P travel time predictions at both regional and teleseismic distances. However, given the variable data quality and uneven data sampling associated with this type of model, it is essential that there be a means to calculate high-quality estimates of the path-dependent variance and covariance associated with the predicted travel times of ray paths through the model. In this paper, we show a methodology for accomplishing this by exploiting the full model covariance matrix. Our model has on the order of 1/2 million nodes, so the challenge in calculating the covariance matrix is formidable: 0.9 TB storage for 1/2 of a symmetric matrix, necessitating an Out-Of-Core (OOC) blocked matrix solution technique. With our approach the tomography matrix (G, which includes Tikhonov regularization terms) is multiplied by its transpose (G^T G) and written in a blocked sub-matrix fashion. We employ a distributed parallel solution paradigm that solves for (G^T G)^{-1} by assigning blocks to individual processing nodes for matrix decomposition, update, and scaling operations. We first find the Cholesky decomposition of G^T G, which is subsequently inverted. Next, we employ OOC matrix multiply methods to calculate the model covariance matrix from (G^T G)^{-1} and an assumed data covariance matrix. Given the model covariance matrix we solve for the travel-time covariance associated with arbitrary ray paths by integrating the model covariance along both ray paths. Setting the two paths equal gives the variance for that path. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
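Once the model covariance matrix is available, the path-dependent travel-time (co)variance described above reduces, after discretization, to a quadratic form over the two rays' sensitivity vectors. A minimal sketch under that assumption (hypothetical names; the real computation is blocked and out-of-core, which this ignores):

```python
import numpy as np

def traveltime_covariance(C_model, w_a, w_b):
    # w_a, w_b hold each ray path's segment lengths (slowness sensitivities)
    # through the model nodes; integrating the model covariance along both
    # paths is then w_a^T C w_b, and setting w_a = w_b gives the variance.
    return w_a @ C_model @ w_b
```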
Zarriello, Phillip J.; Olson, Scott A.; Flynn, Robert H.; Strauch, Kellan R.; Murphy, Elizabeth A.
2014-01-01
Heavy, persistent rains from late February through March 2010 caused severe flooding that set, or nearly set, peaks of record for streamflows and water levels at many long-term streamgages in Rhode Island. In response to this event, hydraulic models were updated for selected reaches covering about 56 river miles in the Pawtuxet River Basin to simulate water-surface elevations (WSEs) at specified flows and boundary conditions. Reaches modeled included the main stem of the Pawtuxet River, the North and South Branches of the Pawtuxet River, Pocasset River, Simmons Brook, Dry Brook, Meshanticut Brook, Furnace Hill Brook, Flat River, Quidneck Brook, and two unnamed tributaries referred to as South Branch Pawtuxet River Tributary A1 and Tributary A2. All the hydraulic models were updated to Hydrologic Engineering Center-River Analysis System (HEC-RAS) version 4.1.0 using steady-state simulations. Updates to the models included incorporation of new field-survey data at structures, high resolution land-surface elevation data, and updated flood flows from a related study. The models were assessed using high-water marks (HWMs) obtained in a related study following the March–April 2010 flood and the simulated water levels at the 0.2-percent annual exceedance probability (AEP), which is the estimated AEP of the 2010 flood in the basin. HWMs were obtained at 110 sites along the main stem of the Pawtuxet River, the North and South Branches of the Pawtuxet River, Pocasset River, Simmons Brook, Furnace Hill Brook, Flat River, and Quidneck Brook. Differences between the 2010 HWM elevations and the simulated 0.2-percent AEP WSEs from flood insurance studies (FISs) and the updated models developed in this study varied, with most differences attributed to the magnitude of the 0.2-percent AEP flows. WSEs from the updated models generally are in closer agreement with the observed 2010 HWMs than with the FIS WSEs. The improved agreement of the updated simulated water elevations with observed 2010 HWMs provides a measure of hydraulic model performance, which indicates the updated models better represent flooding at other AEPs than the existing FIS models.
Imputation and Model-Based Updating Techniques for Annual Forest Inventories
Ronald E. McRoberts
2001-01-01
The USDA Forest Service is developing an annual inventory system to establish the capability of producing annual estimates of timber volume and related variables. The inventory system features measurement of an annual sample of field plots with options for updating data for plots measured in previous years. One imputation and two model-based updating techniques are...
A National Disturbance Modeling System to Support Ecological Carbon Sequestration Assessments
NASA Astrophysics Data System (ADS)
Hawbaker, T. J.; Rollins, M. G.; Vogelmann, J. E.; Shi, H.; Sohl, T. L.
2009-12-01
The U.S. Geological Survey (USGS) is prototyping a methodology to fulfill requirements of Section 712 of the Energy Independence and Security Act (EISA) of 2007. At the core of the EISA requirements is the development of a methodology to complete a two-year assessment of current carbon stocks and other greenhouse gas (GHG) fluxes, and potential increases for ecological carbon sequestration under a range of future climate changes, land-use / land-cover configurations, and policy, economic and management scenarios. Disturbances, especially fire, affect vegetation dynamics and ecosystem processes, and can also introduce substantial uncertainty and risk to the efficacy of long-term carbon sequestration strategies. Thus, the potential impacts of disturbances need to be considered under different scenarios. As part of USGS efforts to meet EISA requirements, we developed the National Disturbance Modeling System (NDMS) using a series of statistical and process-based simulation models. NDMS produces spatially-explicit forecasts of future disturbance locations and severity, and the resulting effects on vegetation dynamics. NDMS is embedded within the Forecasting Scenarios of Future Land Cover (FORE-SCE) model and informs the General Ensemble Biogeochemical Modeling System (GEMS) for quantifying carbon stocks and GHG fluxes. For fires, NDMS relies on existing disturbance histories, such as the Landsat derived Monitoring Trends in Burn Severity (MTBS) and Vegetation Change Tracker (VCT) data being used to update LANDFIRE fuels data. The MTBS and VCT data are used to parameterize models predicting the number and size of fires in relation to climate, land-use/land-cover change, and socioeconomic variables. The locations of individual fire ignitions are determined by an ignition probability surface and then FARSITE is used to simulate fire spread in response to weather, fuels, and topography. Following the fire spread simulations, a burn severity model is used to determine annual changes in biomass pools. Vegetation succession among LANDFIRE vegetation types is initiated using burn perimeter and severity data at the end of each annual simulation. Results from NDMS are used to update land-use/land-cover layers used by FORE-SCE and also transferred to GEMS for quantifying and updating carbon stocks and greenhouse gas fluxes. In this presentation, we present: 1) an overview of NDMS and its role in USGS's national ecological carbon sequestration assessment; 2) validation of NDMS using historic data; and 3) initial forecasts of disturbances for the southeastern United States and their impacts on greenhouse gas emissions, and post-fire carbon stocks and fluxes.
Quantitative prediction of cellular metabolism with constraint-based models: the COBRA Toolbox v2.0
Schellenberger, Jan; Que, Richard; Fleming, Ronan M. T.; Thiele, Ines; Orth, Jeffrey D.; Feist, Adam M.; Zielinski, Daniel C.; Bordbar, Aarash; Lewis, Nathan E.; Rahmanian, Sorena; Kang, Joseph; Hyduke, Daniel R.; Palsson, Bernhard Ø.
2012-01-01
Over the past decade, a growing community of researchers has emerged around the use of COnstraint-Based Reconstruction and Analysis (COBRA) methods to simulate, analyze and predict a variety of metabolic phenotypes using genome-scale models. The COBRA Toolbox, a MATLAB package for implementing COBRA methods, was presented earlier. Here we present a significant update of this in silico ToolBox. Version 2.0 of the COBRA Toolbox expands the scope of computations by including in silico analysis methods developed since its original release. New functions include: (1) network gap filling, (2) 13C analysis, (3) metabolic engineering, (4) omics-guided analysis, and (5) visualization. As with the first version, the COBRA Toolbox reads and writes Systems Biology Markup Language formatted models. In version 2.0, we improved performance, usability, and the level of documentation. A suite of test scripts can now be used to learn the core functionality of the Toolbox and validate results. This Toolbox lowers the barrier of entry to use powerful COBRA methods. PMID:21886097
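The COBRA Toolbox itself is a MATLAB package; purely to illustrate the core constraint-based calculation it automates (flux balance analysis, a linear program over the steady-state constraint S v = 0), here is a self-contained Python sketch on a toy three-reaction network. The network and bounds are invented for demonstration:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometry: v0 -> A, A -> B (v1), B -> out (v2); maximize v2.
S = np.array([[1, -1,  0],    # metabolite A balance
              [0,  1, -1]])   # metabolite B balance
bounds = [(0, 10), (0, 10), (0, 10)]   # flux capacity constraints
c = np.array([0.0, 0.0, -1.0])         # linprog minimizes, so negate

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)   # optimal flux distribution, here [10, 10, 10]
```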
Modifications and Modelling of the Fission Surface Power Primary Test Circuit (FSP-PTC)
NASA Technical Reports Server (NTRS)
Garber, Ann E.
2008-01-01
An actively pumped alkali metal flow circuit, designed and fabricated at the NASA Marshall Space Flight Center, underwent a range of tests at MSFC in early 2007. During this period, system transient responses and the performance of the liquid metal pump were evaluated. In May of 2007, the circuit was drained and cleaned to prepare for multiple modifications: the addition of larger upper and lower reservoirs, the installation of an annular linear induction pump (ALIP), and the inclusion of the Single Flow Cell Test Apparatus (SFCTA) in the test section. Performance of the ALIP, provided by Idaho National Laboratory (INL), will be evaluated when testing resumes. The SFCTA, which will be tested simultaneously, will provide data on alkali metal flow behavior through the simulated core channels and assist in the development of a second generation thermal simulator. Additionally, data from the first round of testing has been used to refine the working system model, developed using the Generalized Fluid System Simulation Program (GFSSP). This paper covers the modifications of the FSP-PTC and the updated GFSSP system model.
Machine learning in updating predictive models of planning and scheduling transportation projects
DOT National Transportation Integrated Search
1997-01-01
A method combining machine learning and regression analysis to automatically and intelligently update predictive models used in the Kansas Department of Transportation's (KDOT's) internal management system is presented. The predictive models used...
Benefits of Model Updating: A Case Study Using the Micro-Precision Interferometer Testbed
NASA Technical Reports Server (NTRS)
Neat, Gregory W.; Kissil, Andrew; Joshi, Sanjay S.
1997-01-01
This paper presents a case study on the benefits of model updating using the Micro-Precision Interferometer (MPI) testbed, a full-scale model of a future spaceborne optical interferometer located at JPL.
Preliminary Model of Porphyry Copper Deposits
Berger, Byron R.; Ayuso, Robert A.; Wynn, Jeffrey C.; Seal, Robert R.
2008-01-01
The U.S. Geological Survey (USGS) Mineral Resources Program develops mineral-deposit models for application in USGS mineral-resource assessments and other mineral resource-related activities within the USGS as well as for nongovernmental applications. Periodic updates of models are published in order to incorporate new concepts and findings on the occurrence, nature, and origin of specific mineral deposit types. This update is a preliminary model of porphyry copper deposits that begins an update process of porphyry copper models published in USGS Bulletin 1693 in 1986. This update includes a greater variety of deposit attributes than were included in the 1986 model as well as more information about each attribute. It also includes an expanded discussion of geophysical and remote sensing attributes and tools useful in resource evaluations, a summary of current theoretical concepts of porphyry copper deposit genesis, and a summary of the environmental attributes of unmined and mined deposits.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, G.S.
2008-07-15
The Advanced Test Reactor (ATR) is a high power density and high neutron flux research reactor operating in the United States. Powered with highly enriched uranium (HEU), the ATR has a maximum thermal power rating of 250 MWth. Because of the large test volumes located in high flux areas, the ATR is an ideal candidate for assessing the feasibility of converting an HEU driven reactor to a low-enriched core. The present work investigates the necessary modifications and evaluates the subsequent operating effects of this conversion. A detailed plate-by-plate MCNP ATR 1/8th core model was developed and validated for a fuel cycle burnup comparison analysis. Using the current HEU U-235 enrichment of 93.0 % as a baseline, an analysis can be performed to determine the low-enriched uranium (LEU) density and U-235 enrichment required in the fuel meat to yield an equivalent K-eff between the HEU core and the LEU core versus effective full power days (EFPD). The MCNP ATR 1/8th core model will be used to optimize the U-235 loading in the LEU core, such that the differences in K-eff and heat flux profile between the HEU and LEU core can be minimized. The depletion methodology MCWO was used to calculate K-eff versus EFPDs in this paper. The MCWO-calculated results for the LEU cases with foil (U-10Mo) types demonstrated adequate excess reactivity such that the K-eff versus EFPDs plot is similar to the reference ATR HEU case. Each HEU fuel element contains 19 fuel plates with a fuel meat thickness of 0.508 mm. In this work, the proposed LEU (U-10Mo) core conversion case with a nominal fuel meat thickness of 0.381 mm and the same U-235 enrichment (19.7 wt%) can be used to optimize the radial heat flux profile by varying the fuel meat thickness from 0.191 mm (7.5 mil) to 0.343 mm (13.5 mil) at the inner 4 fuel plates (1-4) and outer 4 fuel plates (16-19). In addition, 0.8 g of a burnable absorber, boron-10, was added in the inner and outer plates to reduce the initial excess reactivity and the inner/outer heat flux more effectively. The optimized LEU relative radial fission heat flux profile is bounded by the reference ATR HEU case. However, to demonstrate that the LEU core fuel cycle performance can meet the Updated Final Safety Analysis Report (UFSAR) safety requirements, additional studies will be necessary to evaluate and compare safety parameters such as void reactivity and Doppler coefficients, control components worth (outer shim control cylinders, safety rods and regulating rod), and shutdown margins between the HEU and LEU cores. (author)
Medendorp, W. P.
2015-01-01
It is known that the brain uses multiple reference frames to code spatial information, including eye-centered and body-centered frames. When we move our body in space, these internal representations are no longer in register with external space, unless they are actively updated. Whether the brain updates multiple spatial representations in parallel, or whether it restricts its updating mechanisms to a single reference frame from which other representations are constructed, remains an open question. We developed an optimal integration model to simulate the updating of visual space across body motion in multiple or single reference frames. To test this model, we designed an experiment in which participants had to remember the location of a briefly presented target while being translated sideways. The behavioral responses were in agreement with a model that uses a combination of eye- and body-centered representations, weighted according to the reliability in which the target location is stored and updated in each reference frame. Our findings suggest that the brain simultaneously updates multiple spatial representations across body motion. Because both representations are kept in sync, they can be optimally combined to provide a more precise estimate of visual locations in space than based on single-frame updating mechanisms. PMID:26490289
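The reliability-weighted combination the model relies on is ordinary inverse-variance weighting of the two single-frame estimates. A minimal scalar sketch (names illustrative):

```python
def fuse_estimates(x_eye, var_eye, x_body, var_body):
    # Inverse-variance weighting: the less reliable (higher-variance)
    # representation contributes less, and the fused variance is never
    # larger than either single-frame estimate alone.
    w_eye = (1.0 / var_eye) / (1.0 / var_eye + 1.0 / var_body)
    x = w_eye * x_eye + (1.0 - w_eye) * x_body
    var = 1.0 / (1.0 / var_eye + 1.0 / var_body)
    return x, var
```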
Validity, reliability, and generalizability in qualitative research
Leung, Lawrence
2015-01-01
In general practice, qualitative research contributes as significantly as quantitative research, in particular regarding psycho-social aspects of patient-care, health services provision, policy setting, and health administrations. In contrast to quantitative research, qualitative research as a whole has been constantly critiqued, if not disparaged, by the lack of consensus for assessing its quality and robustness. This article illustrates with five published studies how qualitative research can impact and reshape the discipline of primary care, spiraling out from clinic-based health screening to community-based disease monitoring, evaluation of out-of-hours triage services to provincial psychiatric care pathways model and finally, national legislation of core measures for children's healthcare insurance. Fundamental concepts of validity, reliability, and generalizability as applicable to qualitative research are then addressed with an update on the current views and controversies. PMID:26288766
NASA Astrophysics Data System (ADS)
Maskal, Alan B.
Spacer grids maintain the structural integrity of the fuel rods within fuel bundles of nuclear power plants. They can also improve flow characteristics within the nuclear reactor core. However, spacer grids add reactor coolant pressure losses, which require estimation and engineering into the design. Several mathematical models and computer codes were developed over decades to predict spacer grid pressure loss. Most models use generalized characteristics, measured by older, less precise equipment. The study of OECD/US-NRC BWR Full-Size Fine Mesh Bundle Tests (BFBT) provides updated and detailed experimental single and two-phase results, using technically advanced flow measurements for a wide range of boundary conditions. This thesis compares the predictions from the mathematical models to the BFBT experimental data by utilizing statistical formulae for accuracy and precision. This thesis also analyzes the effects of BFBT flow characteristics on spacer grids. No single model has been identified as valid for all flow conditions. However, some models' predictions perform better than others within a range of flow conditions, based on the accuracy and precision of the models' predictions. This study also demonstrates that pressure and flow quality have a significant effect on two-phase flow spacer grid models' biases.
[Purity Detection Model Update of Maize Seeds Based on Active Learning].
Tang, Jin-ya; Huang, Min; Zhu, Qi-bing
2015-08-01
Seed purity reflects the degree to which seed varieties show their typical, consistent characteristics, so improving the reliability and accuracy of seed purity detection is of great importance for guaranteeing seed quality. Hyperspectral imaging can reflect the internal and external characteristics of seeds at the same time, and it has been widely used in nondestructive detection of agricultural products. The essence of nondestructive detection of agricultural products using hyperspectral imaging is to establish a mathematical model between the spectral information and the quality of the agricultural products. Since the spectral information is easily affected by the sample growth environment, the stability and generalization of such a model weaken when test samples are harvested from a different origin or year. An active learning algorithm was investigated to add representative samples that expand the sample space of the original model, so as to enable rapid updating of the model. Random selection (RS) and the Kennard-Stone algorithm (KS) were used to compare the model-updating effect with the active learning algorithm. The experimental results indicated that, for sample-set splits of 1:1, 3:1, and 4:1, the 2010 purity detection model for maize seeds, updated with 40 samples from 2011 selected by the active learning algorithm, increased its prediction accuracy for new 2011 samples from 47%, 33.75%, and 49% to 98.89%, 98.33%, and 98.33%, respectively. For the updated 2011 model, prediction accuracy for new 2010 samples increased by 50.83%, 54.58%, and 53.75%, reaching 94.57%, 94.02%, and 94.57%, after adding 56 new samples from 2010. In both cases the active learning algorithm outperformed RS and KS. Updating the maize seed purity detection model by active learning is therefore feasible.
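Generically, the update strategy amounts to selecting the most informative new-season samples and refitting. The sketch below shows least-confidence active learning with a scikit-learn-style classifier; the interface and all names are assumptions, not the authors' code:

```python
import numpy as np

def update_model_actively(model, X_old, y_old, X_new, y_new, n_add=40):
    """Add the n_add most uncertain new-season samples to the
    training set, then refit the purity-detection model."""
    proba = model.predict_proba(X_new)
    uncertainty = 1.0 - proba.max(axis=1)        # least-confidence score
    idx = np.argsort(uncertainty)[-n_add:]       # most uncertain samples
    X_aug = np.vstack([X_old, X_new[idx]])
    y_aug = np.concatenate([y_old, y_new[idx]])
    return model.fit(X_aug, y_aug)
```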
Build-Up Approach to Updating the Mock Quiet Spike Beam Model
NASA Technical Reports Server (NTRS)
Herrera, Claudia Y.; Pak, Chan-gi
2007-01-01
When a new aircraft is designed or a modification is made to an existing aircraft, the aeroelastic properties of the aircraft should be examined to ensure the aircraft is flight worthy. Evaluating the aeroelastic properties of a new or modified aircraft can include performing a variety of analyses, such as modal and flutter analyses. In order to produce accurate results from these analyses, it is imperative to work with finite element models (FEMs) that have been validated by or correlated to ground vibration test (GVT) data. Updating an analytical model using measured data is a challenge in the area of structural dynamics. The analytical model update process encompasses a series of optimizations that match analytical frequencies and mode shapes to the measured modal characteristics of the structure. In the past, the method used to update a model to test data was trial and error. This is an inefficient method: running a modal analysis, comparing the analytical results to the GVT data, manually modifying one or more structural parameters (mass, CG, inertia, area, etc.), rerunning the analysis, and comparing the new analytical modal characteristics to the GVT modal data. If the match is close enough (where close enough is defined by the analyst's updating requirements), then the updating process is complete. If the match does not meet the updating requirements, then the parameters are changed again and the process is repeated. Clearly, this manual optimization process is highly inefficient for large FEMs and/or a large number of structural parameters. NASA Dryden Flight Research Center (DFRC) has developed, in-house, a Mode Matching Code that automates the above-mentioned optimization process. DFRC's in-house Mode Matching Code reads mode shapes and frequencies acquired from GVT to create the target model. It also reads the current analytical model, as well as the design variables and their upper and lower limits. It performs a modal analysis on this model and modifies it to create an updated model that has similar mode shapes and frequencies to those of the target model. The Mode Matching Code outputs frequencies and modal assurance criterion (MAC) values that allow for a quantified comparison of the updated model versus the target model. A recent application of this code is the F-15B supersonic flight testing platform. NASA DFRC possesses a modified F-15B that is used as a test bed aircraft for supersonic flight experiments. Traditionally, the finite element model of the test article is generated, and a GVT is done on the test article to validate and update its FEM. This FEM is then mated to the F-15B model, which was correlated to GVT data in fall of 2004. A GVT is conducted with the test article mated to the aircraft, and this mated F-15B/test-article FEM is correlated to this final GVT.
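The quantified comparison hinges on the modal assurance criterion, a normalized correlation between two mode-shape vectors that equals 1 for perfectly matched shapes and 0 for orthogonal ones. A minimal sketch:

```python
import numpy as np

def mac(phi_analytical, phi_experimental):
    # Modal assurance criterion between an analytical and a measured
    # (GVT) mode shape, as output by mode-matching codes.
    num = np.abs(phi_analytical @ phi_experimental) ** 2
    return num / ((phi_analytical @ phi_analytical)
                  * (phi_experimental @ phi_experimental))
```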
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berz, Martin; Makino, Kyoko
The ARRA funds were utilized to acquire a cluster of high performance computers, consisting of one Altus 2804 Server based on a Quad AMD Opteron 6174 12C with 4 2.2 GHz nodes of 12 cores each, resulting in 48 directly usable cores, as well as a Relion 1751 Server using an Intel Xeon X5677 consisting of 4 3.46 GHz cores supporting 8 threads. Both systems run the Unix flavor CentOS, which is designed to run without frequent updates, greatly enhancing reliability. The systems are used to operate our COSY INFINITY environment, which supports MPI parallelization. The units arrived at MSU in September 2010 and were taken into operation shortly thereafter.
2013-01-01
In 2003, the International Patient Decision Aid Standards (IPDAS) Collaboration was established to enhance the quality and effectiveness of patient decision aids by establishing an evidence-informed framework for improving their content, development, implementation, and evaluation. Over this 10 year period, the Collaboration has established: a) the background document on 12 core dimensions to inform the original modified Delphi process to establish the IPDAS checklist (74 items); b) the valid and reliable IPDAS instrument (47 items); and c) the IPDAS qualifying (6 items), certifying (6 items + 4 items for screening), and quality criteria (28 items). The objective of this paper is to describe the evolution of the IPDAS Collaboration and discuss the standardized process used to update the background documents on the theoretical rationales, evidence and emerging issues underlying the 12 core dimensions for assessing the quality of patient decision aids. PMID:24624947
The search for failed supernovae with the Large Binocular Telescope: constraints from 7 yr of data
NASA Astrophysics Data System (ADS)
Adams, S. M.; Kochanek, C. S.; Gerke, J. R.; Stanek, K. Z.
2017-08-01
We report updated results for the first 7 yr of our programme to monitor 27 galaxies within 10 Mpc using the Large Binocular Telescope to search for failed supernovae (SNe) - core collapses of massive stars that form black holes without luminous SNe. In the new data, we identify no new compelling candidates and confirm the existing candidate. Given the six successful core-collapse SNe in the sample and one likely failed SN, the implied fraction of core collapses that result in failed SNe is f=0.14^{+0.33}_{-0.10} at 90 per cent confidence. If the current candidate is a failed SN, this failed-SN fraction naturally explains the missing high-mass red supergiant SN progenitors and the black hole mass function. If the current candidate is ultimately rejected, the data imply a 90 per cent confidence upper limit on the failed SN fraction of f < 0.35.
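As a rough check on the quoted interval, a flat-prior binomial calculation over one likely failed SN out of seven core collapses gives numbers of the right size; this equal-tailed sketch is illustrative and not necessarily the authors' exact construction:

```python
from scipy.stats import beta

failed, total = 1, 7                          # one candidate among 7 core collapses
post = beta(failed + 1, total - failed + 1)   # Beta posterior with a flat prior
print(post.ppf([0.05, 0.95]))                 # ~[0.05, 0.52]; posterior mode 1/7 ~ 0.14
```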
NASA Astrophysics Data System (ADS)
Wang, Zuo-Cai; Xin, Yu; Ren, Wei-Xin
2016-08-01
This paper proposes a new nonlinear joint model updating method for shear-type structures based on the instantaneous characteristics of the decomposed structural dynamic responses. To obtain an accurate representation of a nonlinear system's dynamics, the nonlinear joint model is described as a nonlinear spring element with bilinear stiffness. The instantaneous frequencies and amplitudes of the decomposed mono-components are first extracted by the analytical mode decomposition (AMD) method. Then, an objective function based on the residuals of the instantaneous frequencies and amplitudes between the experimental structure and the nonlinear model is created for the nonlinear joint model updating. The optimal values of the nonlinear joint model parameters are obtained by minimizing the objective function using the simulated annealing global optimization method. To validate the effectiveness of the proposed method, a single-story shear-type structure subjected to earthquake and harmonic excitations is simulated as a numerical example. Then, a beam structure with multiple local nonlinear elements subjected to earthquake excitation is also simulated. The nonlinear beam structure is updated based on the global and local model using the proposed method. The results show that the proposed local nonlinear model updating method is more effective for structures with multiple local nonlinear elements. Finally, the proposed method is verified by the shake table test of a real high voltage switch structure. The accuracy of the proposed method is quantified in both numerical and experimental applications using the defined error indices. Both the numerical and experimental results show that the proposed method can effectively update the nonlinear joint model.
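The updating loop described above can be sketched generically: build an objective from the residuals of the instantaneous frequencies and amplitudes, then hand it to a simulated-annealing optimizer. Everything here (names, bounds, and the simulate callback standing in for the bilinear-stiffness model plus AMD decomposition) is an assumption for illustration:

```python
import numpy as np
from scipy.optimize import dual_annealing

def objective(theta, f_exp, a_exp, simulate):
    # Residuals between measured and simulated instantaneous frequencies
    # and amplitudes of the decomposed mono-components.
    f_sim, a_sim = simulate(theta)
    return np.sum((f_exp - f_sim) ** 2) + np.sum((a_exp - a_sim) ** 2)

# Hypothetical bounds on the bilinear stiffness parameters (k1, k2):
# result = dual_annealing(objective, bounds=[(1e3, 1e6), (1e3, 1e6)],
#                         args=(f_exp, a_exp, simulate))
```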
Simulated and observed 2010 floodwater elevations in the Pawcatuck and Wood Rivers, Rhode Island
Zarriello, Phillip J.; Straub, David E.; Smith, Thor E.
2014-01-01
Heavy, persistent rains from late February through March 2010 caused severe flooding that set, or nearly set, peaks of record for streamflows and water levels at many long-term U.S. Geological Survey streamgages in Rhode Island. In response to this flood, hydraulic models of the Pawcatuck River (26.9 miles) and Wood River (11.6 miles) were updated from the most recent approved U.S. Department of Homeland Security-Federal Emergency Management Agency flood insurance study (FIS) to simulate water-surface elevations (WSEs) for specified flows and boundary conditions. The hydraulic models were updated to Hydrologic Engineering Center-River Analysis System (HEC-RAS) using steady-state simulations and incorporate new field-survey data at structures, high resolution land-surface elevation data, and updated flood flows from a related study. The models were used to simulate the 0.2-percent annual exceedance probability (AEP) flood, which is the AEP determined for the 2010 flood in the Pawcatuck and Wood Rivers. The simulated WSEs were compared to high-water mark (HWM) elevation data obtained in a related study following the March–April 2010 flood, which included 39 HWMs along the Pawcatuck River and 11 HWMs along the Wood River. The 2010 peak flow generally was larger than the 0.2-percent AEP flow, which, in part, resulted in the FIS and updated-model WSEs being lower than the 2010 HWMs. The 2010 HWMs for the Pawcatuck River averaged about 1.6 feet (ft) higher than the 0.2-percent AEP WSEs simulated in the updated model and 2.5 ft higher than the WSEs in the FIS. The 2010 HWMs for the Wood River averaged about 1.3 ft higher than the WSEs simulated in the updated model and 2.5 ft higher than the WSEs in the FIS. The improved agreement of the updated simulated water elevations with observed 2010 HWMs provides a measure of hydraulic model performance, which indicates the updated models better represent flooding at other AEPs than the existing FIS models.
Yi, Wei; Sheng-de, Wu; Lian-Ju, Shen; Tao, Lin; Da-Wei, He; Guang-Hui, Wei
2018-05-24
To investigate whether management of undescended testis (UDT) may be improved with educational updates and a new transferring model among referring providers (RPs). The ages at which orchidopexies were performed in the Children's Hospital of Chongqing Medical University were reviewed. We then proposed educational updates and a new transferring model among RPs. The ages at orchidopexy after our intervention were then collected. Data were represented graphically, and the chi-square test for trend was used for statistical analysis. A total of 1543 orchidopexies were performed. The median age at orchidopexy did not match the target age of 6-12 months in any subsequent year. A survey of the RPs showed that 48.85% recommended an age below 12 months; however, only 25.50% of them would directly make a surgical referral to pediatric surgery at this point. After we proposed the educational updates, tracking the age at orchidopexy revealed a statistically significant downward trend. The management of undescended testis may be improved with educational updates and a new transferring model among primary healthcare practitioners.
2007-06-01
data repository that will create a metadata card for each message for use by the federated search catalog as a reference. c. Joint DMS Core Product...yet. Once resolved, NREMS can move forward afloat. The AMHS, in concert with NCES, will be updated with the federated search capability.
A Comparison of Combustor-Noise Models
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.
2012-01-01
The present status of combustor-noise prediction in the NASA Aircraft Noise Prediction Program (ANOPP) [1] for current-generation (N) turbofan engines is summarized. Several semi-empirical models for turbofan combustor noise are discussed, including best methods for near-term updates to ANOPP. An alternate turbine-transmission factor [2] will appear as a user selectable option in the combustor-noise module GECOR in the next release. The three-spectrum model proposed by Stone et al. [3] for GE turbofan-engine combustor noise is discussed and compared with ANOPP predictions for several relevant cases. Based on the results presented herein and in their report [3], it is recommended that the application of this fully empirical combustor-noise prediction method be limited to situations involving only General-Electric turbofan engines. Long-term needs and challenges for the N+1 through N+3 time frame are discussed. Because the impact of other propulsion-noise sources continues to be reduced due to turbofan design trends, advances in noise-mitigation techniques, and expected aircraft configuration changes, the relative importance of core noise is expected to greatly increase in the future. The noise-source structure in the combustor, including the indirect one, and the effects of the propagation path through the engine and exhaust nozzle need to be better understood. In particular, the acoustic consequences of the expected trends toward smaller, highly efficient gas-generator cores and low-emission fuel-flexible combustors need to be fully investigated since future designs are quite likely to fall outside of the parameter space of existing (semi-empirical) prediction tools.
Sonuga-Barke, Edmund J S
2014-08-01
In the U.S. the National Institute of Mental Health (NIMH), the main funder of mental health research in the world, has recently changed its funding model to promote a radically new perspective for mental health science. This bold, and for some controversial, initiative, termed the Research Domain Criteria (or RDoC for short), intends to shift the focus of research, and eventually clinical practice, away from existing diagnostic categories, as recently updated in the DSM-5, towards 'new ways of classifying psychopathology based on dimensions of observable behavior and neurobiological measures.' This reorientation from discrete categorical disorder manifestations to underlying cross-cutting dimensions of individual functioning has generated considerable debate across the community of mental health researchers and clinicians (with strong views voiced both pro and con). Given its pivotal role in defining the research agenda globally, there is little doubt that this US science funding initiative will also have ramifications for researchers and clinicians worldwide. In this Editorial we focus specifically on the translational potential of the dimensional RDoC approach, properly extended to developmental models of early risk, in terms of its value as a potential driver of early intervention/prevention models; in the current issue of the JCPP this is exemplified by a number of papers that address the mapping of underlying dimensions of core functioning to disorder risk, providing evidence for their potential predictive power as early markers of later disorder processes. © 2014 The Author. Journal of Child Psychology and Psychiatry. © 2014 Association for Child and Adolescent Mental Health.
NASA Technical Reports Server (NTRS)
Newman, C. M.
1977-01-01
The updated consumables flight planning worksheet (CFPWS) is documented. The update includes: (1) additional consumables: ECLSS ammonia, APU propellant, HYD water; (2) additional on-orbit activity for development flight instrumentation (DFI); (3) updated use factors for all consumables; and (4) sources and derivations of the use factors.
Towards a Consolidated Approach for the Assessment of Evaluation Models of Nuclear Power Reactors
Epiney, A.; Canepa, S.; Zerkak, O.; ...
2016-11-02
The STARS project at the Paul Scherrer Institut (PSI) has adopted the TRACE thermal-hydraulic (T-H) code for best-estimate system transient simulations of the Swiss Light Water Reactors (LWRs). For analyses involving interactions between system and core, a coupling of TRACE with the SIMULATE-3K (S3K) LWR core simulator has also been developed. In this configuration, the TRACE code and associated nuclear power reactor simulation models play a central role to achieve a comprehensive safety analysis capability. Thus, efforts have now been undertaken to consolidate the validation strategy by implementing a more rigorous and structured assessment approach for TRACE applications involving either only system T-H evaluations or requiring interfaces to e.g. detailed core or fuel behavior models. The first part of this paper presents the preliminary concepts of this validation strategy. The principle is to systematically track the evolution of a given set of predicted physical Quantities of Interest (QoIs) over a multidimensional parametric space where each of the dimensions represents the evolution of specific analysis aspects, including e.g. code version, transient-specific simulation methodology, and model "nodalisation". If properly set up, such an environment should provide code developers and code users with persistent (less affected by user effect) and quantified information (sensitivity of QoIs) on the applicability of a simulation scheme (codes, input models, methodology) for steady state and transient analysis of full LWR systems. Through this, for each given transient/accident, critical paths of the validation process can be identified that could then translate into defining reference schemes to be applied for downstream predictive simulations. In order to illustrate this approach, the second part of this paper presents a first application of this validation strategy to an inadvertent blowdown event that occurred in a Swiss BWR/6. The transient was initiated by the spurious actuation of the Automatic Depressurization System (ADS). The validation approach progresses through a number of dimensions here: first, the same BWR system simulation model is assessed for different versions of the TRACE code, up to the most recent one. The second dimension is the "nodalisation" dimension, where changes to the input model are assessed. The third dimension is the "methodology" dimension; in this case imposed power and an updated TRACE core model are investigated. For each step in each validation dimension, a common set of QoIs is investigated. For the steady-state results, these include fuel temperature distributions. For the transient part of the present study, the evaluated QoIs include the system pressure evolution and water carry-over into the steam line.
Nonequivalence of updating rules in evolutionary games under high mutation rates.
Kaiping, G A; Jacobs, G S; Cox, S J; Sluckin, T J
2014-10-01
Moran processes are often used to model selection in evolutionary simulations. The updating rule in Moran processes is a birth-death process, i.e., selection according to fitness of an individual to give birth, followed by the death of a random individual. For well-mixed populations with only two strategies this updating rule is known to be equivalent to selecting unfit individuals for death and then selecting randomly for procreation (biased death-birth process). It is, however, known that this equivalence does not hold when considering structured populations. Here we study whether changing the updating rule can also have an effect in well-mixed populations in the presence of more than two strategies and high mutation rates. We find, using three models from different areas of evolutionary simulation, that the choice of updating rule can change model results. We show, e.g., that going from the birth-death process to the death-birth process can change a public goods game with punishment from containing mostly defectors to having a majority of cooperative strategies. From the examples given we derive guidelines indicating when the choice of the updating rule can be expected to have an impact on the results of the model.
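A minimal simulation makes the two updating rules concrete; the helper below (illustrative, not the authors' code) performs one step of either rule for a well-mixed population with mutation:

```python
import numpy as np

rng = np.random.default_rng(0)

def moran_step(strategies, fitness, rule="birth-death", mu=0.05, n_strat=3):
    """strategies: int array of length N; fitness: payoff per strategy."""
    N = len(strategies)
    f = fitness[strategies]
    if rule == "birth-death":
        parent = rng.choice(N, p=f / f.sum())    # fitness-proportional birth
        dead = rng.integers(N)                   # uniformly random death
    else:                                        # (biased) death-birth
        unfit = 1.0 / f
        dead = rng.choice(N, p=unfit / unfit.sum())
        parent = rng.integers(N)
    child = rng.integers(n_strat) if rng.random() < mu else strategies[parent]
    strategies[dead] = child
    return strategies
```

Running many steps of each rule at a high mutation rate mu and comparing the long-run strategy frequencies reproduces the kind of divergence the paper reports.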
Klein-Fedyshin, Michele; Ketchum, Andrea M; Arnold, Robert M; Fedyshin, Peter J
2014-12-01
MEDLINE offers the Core Clinical Journals filter to limit searches to clinically useful journals. To determine its effectiveness for searching and patient-centric decision making, this study compared the literature used for Morning Report in Internal Medicine with the journals in the filter. An EndNote library with references answering 327 patient-related questions during Morning Report from 2007 to 2012 was exported to a file listing variables including designated Core Clinical Journal, Impact Factor, date used, and medical subject. Bradford's law of scattering was applied, ranking the journals and reflecting their clinical utility. Recall (sensitivity) and precision of the Core Morning Report journals and the non-Core set were calculated. This study applied bibliometrics to compare the 628 articles used against these criteria to determine the journals impacting decision making. Analysis shows 30% of clinically used articles are from the Core Clinical Journals filter and 16% of the journals represented are Core titles. When Bradford-ranked, 55% of the top 20 journals are Core. Articles <5 years old furnish 63% of sources used. Among the 63 Morning Report subjects, 55 have <50% precision and 41 have <50% recall, including 37 subjects with 0% precision and 0% recall. Low usage of publications within the Core Clinical Journals filter indicates less relevance for hospital-based care. The divergence from high-impact medicine titles suggests clinically valuable journals differ from academically important titles. With few subjects demonstrating high recall or precision, the MEDLINE Core Clinical Journals filter may require a review and update to better align with current clinical needs. © 2014 John Wiley & Sons, Ltd.
Incremental k-core decomposition: Algorithms and evaluation
Sariyuce, Ahmet Erdem; Gedik, Bugra; Jacques-SIlva, Gabriela; ...
2016-02-01
A k-core of a graph is a maximal connected subgraph in which every vertex is connected to at least k vertices in the subgraph. k-core decomposition is often used in large-scale network analysis, such as community detection, protein function prediction, visualization, and solving NP-hard problems on real networks efficiently, like maximal clique finding. In many real-world applications, networks change over time. As a result, it is essential to develop efficient incremental algorithms for dynamic graph data. In this paper, we propose a suite of incremental k-core decomposition algorithms for dynamic graph data. These algorithms locate a small subgraph that is guaranteed to contain the list of vertices whose maximum k-core values have changed and efficiently process this subgraph to update the k-core decomposition. We present incremental algorithms for both insertion and deletion operations, and propose auxiliary vertex state maintenance techniques that can further accelerate these operations. Our results show a significant reduction in runtime compared to non-incremental alternatives. We illustrate the efficiency of our algorithms on different types of real and synthetic graphs, at varying scales. Furthermore, for a graph of 16 million vertices, we observe relative throughputs reaching a factor of a million compared with the non-incremental algorithms.
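For reference, the static k-core decomposition that these algorithms incrementalize can be written as a simple peeling procedure. This sketch favors clarity (it is O(V^2)); linear-time implementations use bucketed degree queues instead:

```python
from collections import defaultdict

def core_numbers(edges):
    """Peeling: repeatedly remove a minimum-degree vertex; the running
    maximum of removal degrees is that vertex's core number."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    core, k = {}, 0
    while deg:
        v = min(deg, key=deg.get)        # lowest remaining degree
        k = max(k, deg[v])
        core[v] = k
        for u in adj[v]:
            if u in deg:
                deg[u] -= 1              # peel v away from its neighbors
        del deg[v]
    return core

# A triangle with a pendant vertex: the triangle forms the 2-core.
print(core_numbers([(0, 1), (1, 2), (2, 0), (2, 3)]))  # {3: 1, 0: 2, 1: 2, 2: 2}
```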
OSATE Overview & Community Updates
2015-02-15
Delange, Julien. Covers OSATE's main language capabilities, modeling patterns and model samples for beginners, Error-Model examples, EMV2 model constructs, demonstration of tools, and case...
Systematic meta-review of supported self-management for asthma: a healthcare perspective.
Pinnock, Hilary; Parke, Hannah L; Panagioti, Maria; Daines, Luke; Pearce, Gemma; Epiphaniou, Eleni; Bower, Peter; Sheikh, Aziz; Griffiths, Chris J; Taylor, Stephanie J C
2017-03-17
Supported self-management has been recommended by asthma guidelines for three decades; improving current suboptimal implementation will require commitment from professionals, patients and healthcare organisations. The Practical Systematic Review of Self-Management Support (PRISMS) meta-review and Reducing Care Utilisation through Self-management Interventions (RECURSIVE) health economic review were commissioned to provide a systematic overview of supported self-management to inform implementation. We sought to investigate whether supported asthma self-management reduces use of healthcare resources and improves asthma control; for which target groups it works; and which components and contextual factors contribute to effectiveness. Finally, we investigated the costs to healthcare services of providing supported self-management. We undertook a meta-review (systematic overview) of systematic reviews updated with randomised controlled trials (RCTs) published since the review search dates, and health economic meta-analysis of RCTs. Twelve electronic databases were searched in 2012 (updated in 2015; pre-publication update January 2017) for systematic reviews reporting RCTs (and update RCTs) evaluating supported asthma self-management. We assessed the quality of included studies and undertook a meta-analysis and narrative synthesis. A total of 27 systematic reviews (n = 244 RCTs) and 13 update RCTs revealed that supported self-management can reduce hospitalisations, accident and emergency attendances and unscheduled consultations, and improve markers of control and quality of life for people with asthma across a range of cultural, demographic and healthcare settings. Core components are patient education, provision of an action plan and regular professional review. Self-management is most effective when delivered in the context of proactive long-term condition management. The total cost (n = 24 RCTs) of providing self-management support is offset by a reduction in hospitalisations and accident and emergency visits (standardised mean difference 0.13, 95% confidence interval -0.09 to 0.34). Evidence from a total of 270 RCTs confirms that supported self-management for asthma can reduce unscheduled care and improve asthma control, can be delivered effectively for diverse demographic and cultural groups, is applicable in a broad range of clinical settings, and does not significantly increase total healthcare costs. Informed by this comprehensive synthesis of the literature, clinicians, patient-interest groups, policy-makers and providers of healthcare services should prioritise provision of supported self-management for people with asthma as a core component of routine care. RECURSIVE: PROSPERO CRD42012002694; PRISMS: PROSPERO does not register meta-reviews.
Klemans, Rob J B; Otte, Dianne; Knol, Mirjam; Knol, Edward F; Meijer, Yolanda; Gmelig-Meyling, Frits H J; Bruijnzeel-Koomen, Carla A F M; Knulst, André C; Pasmans, Suzanne G M A
2013-01-01
A diagnostic prediction model for peanut allergy in children was recently published, using 6 predictors: sex, age, history, skin prick test, peanut specific immunoglobulin E (sIgE), and total IgE minus peanut sIgE. We aimed to validate this model and update it by adding allergic rhinitis, atopic dermatitis, and sIgE to peanut components Ara h 1, 2, 3, and 8 as candidate predictors, and to develop a new model based only on sIgE to peanut components. Validation was performed by testing discrimination (diagnostic value) with an area under the receiver operating characteristic curve and calibration (agreement between predicted and observed frequencies of peanut allergy) with the Hosmer-Lemeshow test and a calibration plot. The performance of the (updated) models was similarly analyzed. Validation of the model in 100 patients showed good discrimination (88%) but poor calibration (P < .001). In the updating process, age, history, and the additional candidate predictors did not significantly increase discrimination, which was 94%, leaving only 4 predictors of the original model: sex, skin prick test, peanut sIgE, and total IgE minus sIgE. When building a model with sIgE to peanut components, Ara h 2 was the only predictor, with a discriminative ability of 90%. Cutoff values with 100% positive and negative predictive values could be calculated for both the updated model and sIgE to Ara h 2. In this way, the outcome of the food challenge could be predicted with 100% accuracy in 59% (updated model) and 50% (Ara h 2) of the patients. Discrimination of the validated model was good; however, calibration was poor. The discriminative ability of Ara h 2 was almost comparable to that of the updated model, containing 4 predictors. With both models, the need for peanut challenges could be reduced by at least 50%. Copyright © 2012 American Academy of Allergy, Asthma & Immunology. Published by Mosby, Inc. All rights reserved.
Description and evaluation of the Community Multiscale Air ...
The Community Multiscale Air Quality (CMAQ) model is a comprehensive multipollutant air quality modeling system developed and maintained by the US Environmental Protection Agency's (EPA) Office of Research and Development (ORD). Recently, version 5.1 of the CMAQ model (v5.1) was released to the public, incorporating a large number of science updates and extended capabilities over the previous release version of the model (v5.0.2). These updates include the following: improvements in the meteorological calculations in both CMAQ and the Weather Research and Forecast (WRF) model used to provide meteorological fields to CMAQ, updates to the gas and aerosol chemistry, revisions to the calculations of clouds and photolysis, and improvements to the dry and wet deposition in the model. Sensitivity simulations isolating several of the major updates to the modeling system show that changes to the meteorological calculations result in enhanced afternoon and early evening mixing in the model, periods when the model historically underestimates mixing. This enhanced mixing results in higher ozone (O3) mixing ratios on average due to reduced NO titration, and lower fine particulate matter (PM2.5) concentrations due to greater dilution of primary pollutants (e.g., elemental and organic carbon). Updates to the clouds and photolysis calculations greatly improve consistency between the WRF and CMAQ models and result in generally higher O3 mixing ratios, primarily due to reduced
Enhancement of ELDA Tracker Based on CNN Features and Adaptive Model Update.
Gao, Changxin; Shi, Huizhang; Yu, Jin-Gang; Sang, Nong
2016-04-15
Appearance representation and the observation model are the most important components in designing a robust visual tracking algorithm for video-based sensors. Additionally, the exemplar-based linear discriminant analysis (ELDA) model has shown good performance in object tracking. Based on that, we improve the ELDA tracking algorithm by deep convolutional neural network (CNN) features and adaptive model update. Deep CNN features have been successfully used in various computer vision tasks. Extracting CNN features on all of the candidate windows is time consuming. To address this problem, a two-step CNN feature extraction method is proposed by separately computing convolutional layers and fully-connected layers. Due to the strong discriminative ability of CNN features and the exemplar-based model, we update both object and background models to improve their adaptivity and to deal with the tradeoff between discriminative ability and adaptivity. An object updating method is proposed to select the "good" models (detectors), which are quite discriminative and uncorrelated to other selected models. Meanwhile, we build the background model as a Gaussian mixture model (GMM) to adapt to complex scenes, which is initialized offline and updated online. The proposed tracker is evaluated on a benchmark dataset of 50 video sequences with various challenges. It achieves the best overall performance among the compared state-of-the-art trackers, which demonstrates the effectiveness and robustness of our tracking algorithm.
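The two-step extraction the abstract describes can be illustrated with a toy PyTorch sketch; the network shape, window size and downsampling factor below are invented for the example and are not the paper's CNN:

    import torch
    import torch.nn as nn

    # Backbone split into a convolutional stage (run once per frame)
    # and a fully-connected stage (run once per candidate window).
    conv = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
    fc = nn.Sequential(nn.Flatten(), nn.Linear(16 * 8 * 8, 64))

    def window_features(frame, windows, win=16, stride=2):
        # frame: (1, 3, H, W); windows: (y, x) top-left corners of win x win candidates.
        fmap = conv(frame)                     # expensive part, computed once
        s = win // stride                      # window size in feature-map cells
        return torch.cat([fc(fmap[:, :, y // stride:y // stride + s,
                                         x // stride:x // stride + s])
                          for y, x in windows])

    print(window_features(torch.randn(1, 3, 64, 64), [(0, 0), (16, 32)]).shape)
    # torch.Size([2, 64])

Because the convolutional layers see each frame pixel once instead of once per candidate window, the per-window cost reduces to a feature-map crop plus the fully-connected layers.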
An interval model updating strategy using interval response surface models
NASA Astrophysics Data System (ADS)
Fang, Sheng-En; Zhang, Qiu-Hu; Ren, Wei-Xin
2015-08-01
Stochastic model updating provides an effective way of handling uncertainties existing in real-world structures. In general, probabilistic theories, fuzzy mathematics or interval analyses are involved in the solution of inverse problems. In practice, however, probability distributions or membership functions of structural parameters are often unavailable due to insufficient information about a structure. In this situation an interval model updating procedure shows its superiority in terms of problem simplification, since only the upper and lower bounds of parameters and responses are sought. To this end, this study develops a new concept of interval response surface models for the purpose of efficiently implementing the interval model updating procedure. The frequent interval overestimation caused by interval arithmetic can be largely avoided, leading to accurate estimation of parameter intervals. Meanwhile, the establishment of the interval inverse problem is greatly simplified, with a corresponding saving in computational cost. By this means a relatively simple and cost-efficient interval updating process can be achieved. Lastly, the feasibility and reliability of the developed method have been verified against a numerical mass-spring system and against a set of experimentally tested steel plates.
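A generic way to pose the interval updating problem (a textbook-style formulation under a monotonicity assumption, not necessarily the authors' exact statement) is as two deterministic inverse problems on the bounds:

    \min_{\underline{\theta}} \sum_j \big(\underline{y}_j^{\mathrm{exp}} - \hat{y}_j(\underline{\theta})\big)^2, \qquad \min_{\overline{\theta}} \sum_j \big(\overline{y}_j^{\mathrm{exp}} - \hat{y}_j(\overline{\theta})\big)^2

where \hat{y}_j is the response surface standing in for the finite element model and [\underline{y}_j^{\mathrm{exp}}, \overline{y}_j^{\mathrm{exp}}] are the measured response bounds. Fitting the surrogate directly to interval data, rather than propagating intervals through arithmetic operations, is what avoids the overestimation mentioned above.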
Peters, Sanne A. E.; Jones, Alexandra; Crino, Michelle; Taylor, Fraser; Woodward, Mark; Neal, Bruce
2017-01-01
Background: The Health Star Rating (HSR) is an interpretive front-of-pack labelling system that rates the overall nutritional profile of packaged foods. The algorithm underpinning the HSR includes total sugar content as one of the components. This has been criticised because intrinsic sugars naturally present in dairy, fruits, and vegetables are treated the same as sugars added during food processing. We assessed whether the HSR could better discriminate between core and discretionary foods by including added sugar in the underlying algorithm. Methods: Nutrition information was extracted for 34,135 packaged foods available in The George Institute’s Australian FoodSwitch database. Added sugar levels were imputed from food composition databases. Products were classified as ‘core’ or ‘discretionary’ based on the Australian Dietary Guidelines. The ability of each of the nutrients included in the HSR algorithm, as well as added sugar, to discriminate between core and discretionary foods was estimated using the area under the curve (AUC). Results: 15,965 core and 18,350 discretionary foods were included. Of these, 8230 (52%) core foods and 15,947 (87%) discretionary foods contained added sugar. Median (Q1, Q3) HSRs were 4.0 (3.0, 4.5) for core foods and 2.0 (1.0, 3.0) for discretionary foods. Median added sugar contents (g/100 g) were 3.3 (1.5, 5.5) for core foods and 14.6 (1.8, 37.2) for discretionary foods. Of all the nutrients used in the current HSR algorithm, total sugar had the greatest individual capacity to discriminate between core and discretionary foods; AUC 0.692 (0.686; 0.697). Added sugar alone achieved an AUC of 0.777 (0.772; 0.782). A model with all nutrients in the current HSR algorithm had an AUC of 0.817 (0.812; 0.821), which increased to 0.871 (0.867; 0.874) with inclusion of added sugar. Conclusion: The HSR nutrients discriminate well between core and discretionary packaged foods. However, discrimination was improved when added sugar was also included. These data argue for inclusion of added sugar in an updated HSR algorithm and declaration of added sugar as part of mandatory nutrient declarations. PMID:28678187
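The discrimination statistic reported here is the area under the ROC curve. A brief sketch of how such AUCs are computed, using fabricated data in place of the FoodSwitch database (scikit-learn assumed):

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 1000
    y = rng.integers(0, 2, n)                           # 0 = core food, 1 = discretionary
    added_sugar = rng.gamma(2.0, 2.0, n) + 12 * y       # fabricated signal, g/100 g
    total_sugar = added_sugar + rng.gamma(2.0, 2.0, n)  # intrinsic + added

    # AUC of a single nutrient: rank foods by that nutrient alone.
    print("added sugar alone:", roc_auc_score(y, added_sugar))

    # AUC of a multi-nutrient model: rank foods by predicted probability.
    X = np.column_stack([total_sugar, added_sugar])
    p = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
    print("two-nutrient model:", roc_auc_score(y, p))

An AUC of 0.5 means a variable cannot separate core from discretionary foods at all; the study's jump from 0.817 to 0.871 when added sugar joins the model is measured in exactly this way, though on real nutrient panels.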
Malinowski, Kathleen; McAvoy, Thomas J; George, Rohini; Dieterich, Sonja; D'Souza, Warren D
2013-07-01
We sought to determine how best to time respiratory surrogate-based tumor motion model updates by comparing a novel technique based on external measurements alone to three direct measurement methods. Concurrently measured tumor and respiratory surrogate positions from 166 treatment fractions for lung or pancreas lesions were analyzed. Partial-least-squares regression models of tumor position from marker motion were created from the first six measurements in each dataset. Successive tumor localizations were obtained at a rate of once per minute on average. Model updates were timed according to four methods: never, respiratory surrogate-based (when metrics based on respiratory surrogate measurements exceeded confidence limits), error-based (when localization error ≥ 3 mm), and always (approximately once per minute). Radial tumor displacement prediction errors (mean ± standard deviation) for the four schema described above were 2.4 ± 1.2, 1.9 ± 0.9, 1.9 ± 0.8, and 1.7 ± 0.8 mm, respectively. The never-update error was significantly larger than the errors of the other methods. Mean update counts over 20 min were 0, 4, 9, and 24, respectively. The same improvement in tumor localization accuracy could be achieved through any of the three update methods, but significantly fewer updates were required when the respiratory surrogate method was utilized. This study establishes the feasibility of timing image acquisitions for updating respiratory surrogate models without direct tumor localization.
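A rough sketch of the error-based scheme from this abstract, with the six-sample initialisation and 3 mm threshold taken from the text; the data shapes and the two-component PLS are assumptions for illustration (scikit-learn's PLSRegression):

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def error_based_updates(stream, n_init=6, threshold_mm=3.0):
        # stream yields (surrogate_vector, tumour_xyz) pairs, about one per minute;
        # the model is built from the first n_init pairs and refit only when the
        # radial prediction error reaches threshold_mm.
        X, Y, errors = [], [], []
        model = None
        for x, y in stream:
            if model is None:
                X.append(x); Y.append(y)
                if len(X) == n_init:
                    model = PLSRegression(n_components=2).fit(np.array(X), np.array(Y))
                continue
            err = float(np.linalg.norm(model.predict(np.array([x]))[0] - y))
            errors.append(err)
            if err >= threshold_mm:          # update: absorb the new measurement
                X.append(x); Y.append(y)
                model = PLSRegression(n_components=2).fit(np.array(X), np.array(Y))
        return errors

The respiratory-surrogate scheme in the abstract replaces the err >= threshold_mm test with confidence limits computed from the surrogate signals alone, which is why it can decide when to refit without direct tumor localization.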
Updating the Behavior Engineering Model.
ERIC Educational Resources Information Center
Chevalier, Roger
2003-01-01
Considers Thomas Gilbert's Behavior Engineering Model as a tool for systematically identifying barriers to individual and organizational performance. Includes a detailed case study and a performance aid that incorporates gap analysis, cause analysis, and force field analysis to update the original model. (Author/LRW)
Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model
Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua
2015-01-01
We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology that was developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components—incorporation of slip rates from combined geodetic-geologic inversions and the incorporation of adaptively smoothed seismicity models—have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.
Li, Yan; Wang, Dejun; Zhang, Shaoyi
2014-01-01
Updating the structural model of complex structures is time-consuming due to the large size of the finite element model (FEM). Using conventional methods for these cases is computationally expensive or even impossible. A two-level method, which combines the Kriging predictor and the component mode synthesis (CMS) technique, was proposed to ensure the successful implementation of FEM updating for large-scale structures. In the first level, CMS was applied to build a reasonable condensed FEM of the complex structure. In the second level, a Kriging predictor, serving as a surrogate FEM in structural dynamics, was generated based on the condensed FEM. Some key issues in applying the metamodel (surrogate FEM) to FEM updating are also discussed. Finally, the effectiveness of the proposed method was demonstrated by updating the FEM of a real arch bridge with measured modal parameters. PMID:24634612
ADAS Update and Maintainability
NASA Technical Reports Server (NTRS)
Watson, Leela R.
2010-01-01
Since 2000, both the National Weather Service Melbourne (NWS MLB) and the Spaceflight Meteorology Group (SMG) have used a local data integration system (LDIS) as part of their forecast and warning operations. The original LDIS was developed by the Applied Meteorology Unit (AMU) in 1998 (Manobianco and Case 1998) and has undergone subsequent improvements. Each has benefited from three-dimensional (3-D) analyses that are delivered to forecasters every 15 minutes across the peninsula of Florida. The intent is to generate products that enhance short-range weather forecasts issued in support of NWS MLB and SMG operational requirements within East Central Florida. The current LDIS uses the Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS) package as its core, which integrates a wide variety of national, regional, and local observational data sets. It assimilates all available real-time data within its domain and is run at a finer spatial and temporal resolution than current national or regional-scale analysis packages. As such, it provides local forecasters with a more comprehensive understanding of evolving fine-scale weather features. Over the years, the LDIS has become problematic to maintain since it depends on AMU-developed shell scripts that were written for an earlier version of the ADAS software. The goals of this task were to update the NWS MLB/SMG LDIS with the latest version of ADAS, incorporate new sources of observational data, and upgrade and modify the AMU-developed shell scripts written to govern the system. In addition, the previously developed ADAS graphical user interface (GUI) was updated. Operationally, these upgrades will result in more accurate depictions of the current local environment to help with short-range weather forecasting applications, while also offering an improved initialization for local versions of the Weather Research and Forecasting (WRF) model used by both groups.
Information dissemination model for social media with constant updates
NASA Astrophysics Data System (ADS)
Zhu, Hui; Wu, Heng; Cao, Jin; Fu, Gang; Li, Hui
2018-07-01
With the development of social media tools and the pervasiveness of smart terminals, social media has become a significant source of information for many individuals. However, false information can spread rapidly, which may result in negative social impacts and serious economic losses. Thus, reducing the unfavorable effects of false information has become an urgent challenge. In this paper, a new competitive model called DMCU is proposed to describe the dissemination of information with constant updates in social media. In the model, we focus on the competitive relationship between the original false information and the updated information, and then propose the priority of related information. To better evaluate the effectiveness of the proposed model, data sets containing actual social media activity are utilized in experiments. Simulation results demonstrate that the DMCU model can precisely describe the process of information dissemination with constant updates, and that it can be used to forecast information dissemination trends on social media.
Seismic hazard in the eastern United States
Mueller, Charles; Boyd, Oliver; Petersen, Mark D.; Moschetti, Morgan P.; Rezaeian, Sanaz; Shumway, Allison
2015-01-01
The U.S. Geological Survey seismic hazard maps for the central and eastern United States were updated in 2014. We analyze results and changes for the eastern part of the region. Ratio maps are presented, along with tables of ground motions and deaggregations for selected cities. The Charleston fault model was revised, and a new fault source for Charlevoix was added. Background seismicity sources utilized an updated catalog, revised completeness and recurrence models, and a new adaptive smoothing procedure. Maximum-magnitude models and ground motion models were also updated. Broad, regional hazard reductions of 5%–20% are mostly attributed to new ground motion models with stronger near-source attenuation. The revised Charleston fault geometry redistributes local hazard, and the new Charlevoix source increases hazard in northern New England. Strong increases in mid- to high-frequency hazard at some locations—for example, southern New Hampshire, central Virginia, and eastern Tennessee—are attributed to updated catalogs and/or smoothing.
Core outcome domains for clinical trials in non-specific low back pain.
Chiarotto, Alessandro; Deyo, Richard A; Terwee, Caroline B; Boers, Maarten; Buchbinder, Rachelle; Corbin, Terry P; Costa, Leonardo O P; Foster, Nadine E; Grotle, Margreth; Koes, Bart W; Kovacs, Francisco M; Lin, Chung-Wei Christine; Maher, Chris G; Pearson, Adam M; Peul, Wilco C; Schoene, Mark L; Turk, Dennis C; van Tulder, Maurits W; Ostelo, Raymond W
2015-06-01
Inconsistent reporting of outcomes in clinical trials of patients with non-specific low back pain (NSLBP) hinders comparison of findings and the reliability of systematic reviews. A core outcome set (COS) can address this issue as it defines a minimum set of outcomes that should be reported in all clinical trials. In 1998, Deyo et al. recommended a standardized set of outcomes for LBP clinical research. The aim of this study was to update these recommendations by determining which outcome domains should be included in a COS for clinical trials in NSLBP. An International Steering Committee established the methodology to develop this COS. The OMERACT Filter 2.0 framework was used to draw a list of potential core domains that were presented in a Delphi study. Researchers, care providers and patients were invited to participate in three Delphi rounds and were asked to judge which domains were core. A priori criteria for consensus were established before each round and were analysed together with arguments provided by panellists on importance, overlap, aggregation and/or addition of potential core domains. The Steering Committee discussed the final results and made final decisions. A set of 280 experts was invited to participate in the Delphi; response rates in the three rounds were 52, 50 and 45%. Of 41 potential core domains presented in the first round, 13 had sufficient support to be presented for rating in the third round. Overall consensus was reached for the inclusion of three domains in this COS: 'physical functioning', 'pain intensity' and 'health-related quality of life'. Consensus on 'physical functioning' and 'pain intensity' was consistent across all stakeholders, 'health-related quality of life' was not supported by the patients, and all the other domains were not supported by two or more groups of stakeholders. Weighting all possible argumentations, the Steering Committee decided to include in the COS the three domains that reached overall consensus and the domain 'number of deaths'. The following outcome domains were included in this updated COS: 'physical functioning', 'pain intensity', 'health-related quality of life' and 'number of deaths'. The next step for the development of this COS will be to determine which measurement instruments best measure these domains.
An Update on NASA's Arctic Boreal Vulnerability Experiment
NASA Astrophysics Data System (ADS)
Goetz, S. J.; Miller, C. E.; Griffith, P. C.; Larson, E. K.; Kasischke, E. S.; Margolis, H. A.
2016-12-01
ABoVE is a NASA-led field campaign taking place in Alaska and western Canada over the next 8-10 years, with a wide range of interdisciplinary science objectives designed to address the extent to which ecosystems and society are vulnerable, or resilient, to environmental changes underway and expected. The first phase of ABoVE is underway, with a focus on ecosystem dynamics and ecosystem services objectives. Some 45 core and affiliated projects are currently included, and another 10-20 will be added in late 2016 with initiation of the airborne science component. The ABoVE leadership is fostering partnerships with several other major arctic and boreal research, management and policy initiatives. The Science Team is organized around science themes, with Working Groups (WGs) on vegetation, permafrost and hydrology, disturbance, carbon dynamics, wildlife and ecosystem services, and modeling. Despite the disciplinary science WGs, ABoVE research broadly focuses on the complex interdependencies and feedbacks across disciplines. Additional WGs focus on airborne science, geospatial products, core variables and standards, and stakeholder engagement - all supplemented by a range of infrastructure activities such as data management, cloud computing, laboratory and field support. Ultimately ABoVE research will improve our understanding of the consequences of environmental changes occurring across the study domain, as well as increase our confidence in making projections of the ecosystem responses and vulnerability to changes taking place both within and outside the domain. ABoVE will also build a lasting legacy of research through an expanded knowledge base, the provision of key datasets archived for a broader network of researchers and resource managers, and the development of data products and knowledge designed to foster decision support and applied research partnerships with broad societal relevance. We will provide a brief status update of ABoVE activities and plans, including the upcoming airborne campaigns, science team meetings, and the potential for partnerships and engagement.
Imaging The Shallow Velocity Structure Of The Hikurangi Megathrust Using Full-Waveform Inversion
NASA Astrophysics Data System (ADS)
Gray, M.; Bell, R. E.; Morgan, J. V.
2017-12-01
The Hikurangi margin, offshore North Island, New Zealand, exhibits a number of different slip behaviours, including shallow (<2 km to 15 km) slow slip events (SSEs). There is also a strong contrast in geodetic coupling along the margin. While reflection data provide an image of the structure, they carry no information about physical properties. Full-waveform inversion (FWI) is an imaging technique that incorporates the full seismic wavelet rather than just the first arrivals used in traditional tomography. By propagating synthetic seismic waves through a velocity model and comparing the synthetic wavelets to the field data, we update the velocity model until the real and synthetic wavelets match. In this way, we can resolve high-resolution physical property variations that influence the seismic wavefield. In our study, FWI was used to resolve the P-wave velocity structure of the Hikurangi megathrust down to 2 km, enabling investigation of how upper-plate structure may influence plate-boundary slip behaviour. In 2005, a seismic survey was carried out over the Hikurangi megathrust. The data were acquired with a 12 km streamer, allowing FWI analysis up to 2 km below the seabed. The results show low-velocity zones correlating with faults interpreted from reflection seismic imaging. We believe these low-velocity zones, particularly near the frontal thrust, resolve faulting in the area, and present these faults as possible fluid conduits. As the dataset was not collected specifically for FWI, the results show promise for resolving more information at depth. As such, both a 3D seismic survey and two drilling expeditions have been approved for the period November 2017 - May 2018. The seismic survey will be carried out with parameters optimal for FWI, allowing imaging of the fault boundary, which is not possible with the current 2D data. The cores will provide direct geological evidence that can be used in conjunction with velocity models to discern lithology and structure. The current result identifies the existence of overpressure and aids in drilling safety when collecting these cores. In conjunction with the new IODP cores, the FWI model will improve understanding of the properties of the shallow structure of the megathrust.
Disruption of the Right Temporoparietal Junction Impairs Probabilistic Belief Updating.
Mengotti, Paola; Dombert, Pascasie L; Fink, Gereon R; Vossel, Simone
2017-05-31
Generating and updating probabilistic models of the environment is a fundamental modus operandi of the human brain. Although crucial for various cognitive functions, the neural mechanisms of these inference processes remain to be elucidated. Here, we show the causal involvement of the right temporoparietal junction (rTPJ) in updating probabilistic beliefs and we provide new insights into the chronometry of the process by combining online transcranial magnetic stimulation (TMS) with computational modeling of behavioral responses. Female and male participants performed a modified location-cueing paradigm, where false information about the percentage of cue validity (%CV) was provided in half of the experimental blocks to prompt updating of prior expectations. Online double-pulse TMS over rTPJ 300 ms (but not 50 ms) after target appearance selectively decreased participants' updating of false prior beliefs concerning %CV, reflected in a decreased learning rate of a Rescorla-Wagner model. Online TMS over rTPJ also impacted on participants' explicit beliefs, causing them to overestimate %CV. These results confirm the involvement of rTPJ in updating of probabilistic beliefs, thereby advancing our understanding of this area's function during cognitive processing. SIGNIFICANCE STATEMENT Contemporary views propose that the brain maintains probabilistic models of the world to minimize surprise about sensory inputs. Here, we provide evidence that the right temporoparietal junction (rTPJ) is causally involved in this process. Because neuroimaging has suggested that rTPJ is implicated in divergent cognitive domains, the demonstration of an involvement in updating internal models provides a novel unifying explanation for these findings. We used computational modeling to characterize how participants change their beliefs after new observations. By interfering with rTPJ activity through online transcranial magnetic stimulation, we showed that participants were less able to update prior beliefs with TMS delivered at 300 ms after target onset. Copyright © 2017 the authors.
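The learning rate in question is the α of the delta-rule update used by Rescorla-Wagner-type models, written here in its standard textbook form (the paper's exact parameterisation may differ):

    V_{t+1} = V_t + \alpha (\lambda_t - V_t)

Here V_t is the current belief about cue validity, \lambda_t the observed outcome, and \lambda_t - V_t the prediction error. A TMS-induced reduction in the fitted \alpha means each new observation moved participants' beliefs less, which is how the behavioral effect is expressed in the model.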
FORCARB2: An updated version of the U.S. Forest Carbon Budget Model
Linda S. Heath; Michael C. Nichols; James E. Smith; John R. Mills
2010-01-01
FORCARB2, an updated version of the U.S. FORest CARBon Budget Model (FORCARB), produces estimates of carbon stocks and stock changes for forest ecosystems and forest products at 5-year intervals. FORCARB2 includes a new methodology for carbon in harvested wood products, updated initial inventory data, a revised algorithm for dead wood, and now includes public forest...
Status Report on NEAMS System Analysis Module Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, R.; Fanning, T. H.; Sumner, T.
2015-12-01
Under the Reactor Product Line (RPL) of DOE-NE's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, an advanced SFR System Analysis Module (SAM) is being developed at Argonne National Laboratory. The goal of the SAM development is to provide fast-running, improved-fidelity, whole-plant transient analysis capabilities. SAM utilizes an object-oriented application framework (MOOSE), its underlying meshing and finite-element library (libMesh), and linear and non-linear solvers (PETSc) to leverage modern advanced software environments and numerical methods. It also incorporates advances in physical and empirical models and seeks closure models based on information from high-fidelity simulations and experiments. This report provides an update on the SAM development and summarizes the activities performed in FY15 and the first quarter of FY16. The tasks include: (1) implement support for 2nd-order finite elements in SAM components for improved accuracy and computational efficiency; (2) improve the conjugate heat transfer modeling and develop pseudo 3-D full-core reactor heat transfer capabilities; (3) perform verification and validation tests as well as demonstration simulations; (4) develop the coupling requirements for SAS4A/SASSYS-1 and SAM integration.
NASA Technical Reports Server (NTRS)
Foster, Lucas E.; Britcher, Colin P.
1995-01-01
The Large Angle Magnetic Suspension Test Fixture (LAMSTF) is a laboratory scale proof-of-concept system. The configuration is unique in that the electromagnets are mounted in a circular planar array. A mathematical model of the system had previously been developed, but was shown to have inaccuracies. These inaccuracies showed up in the step responses. Eddy currents were found to be the major cause of the modeling errors. In the original system, eddy currents existed in the aluminum baseplate, iron cores, and the sensor support frame. An attempt to include the eddy current dynamics in the system model is presented. The dynamics of a dummy sensor ring were added to the system. Adding the eddy current dynamics to the simulation improves the way it compares to the actual experiment. Also presented is a new method of determining the yaw angle of the suspended element. From the coil currents the yaw angle can be determined and the controller can be updated to suspend at the new current. This method has been used to demonstrate a 360 degree yaw angle rotation.
Discretely Integrated Condition Event (DICE) Simulation for Pharmacoeconomics.
Caro, J Jaime
2016-07-01
Several decision-analytic modeling techniques are in use for pharmacoeconomic analyses. Discretely integrated condition event (DICE) simulation is proposed as a unifying approach that has been deliberately designed to meet the modeling requirements in a straightforward transparent way, without forcing assumptions (e.g., only one transition per cycle) or unnecessary complexity. At the core of DICE are conditions that represent aspects that persist over time. They have levels that can change and many may coexist. Events reflect instantaneous occurrences that may modify some conditions or the timing of other events. The conditions are discretely integrated with events by updating their levels at those times. Profiles of determinant values allow for differences among patients in the predictors of the disease course. Any number of valuations (e.g., utility, cost, willingness-to-pay) of conditions and events can be applied concurrently in a single run. A DICE model is conveniently specified in a series of tables that follow a consistent format and the simulation can be implemented fully in MS Excel, facilitating review and validation. DICE incorporates both state-transition (Markov) models and non-resource-constrained discrete event simulation in a single formulation; it can be executed as a cohort or a microsimulation; and deterministically or stochastically.
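The conditions-and-events core of DICE can be conveyed with a minimal Python event loop; the condition names, values and the single "progression" event below are invented for illustration and are far simpler than the tabular specification the article proposes:

    import heapq

    conditions = {"alive": 1, "severity": 0.0, "cost": 0.0}   # levels that persist
    events = []                                               # queue of (time, name)

    def schedule(t, name):
        heapq.heappush(events, (t, name))

    def progression(t):
        # an event discretely updates condition levels at the instant it fires
        conditions["severity"] += 0.1
        conditions["cost"] += 500.0          # a valuation applied at the event
        if conditions["severity"] < 0.5:
            schedule(t + 1.0, "progression") # events may also reschedule events

    handlers = {"progression": progression}
    schedule(0.0, "progression")
    while events:                            # discrete integration over time
        t, name = heapq.heappop(events)
        handlers[name](t)
    print(conditions)

The claim that DICE runs as either a cohort model or a microsimulation corresponds, roughly, to executing such a loop once on aggregate condition levels versus repeatedly over sampled patient profiles.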
Sequential updating of a new dynamic pharmacokinetic model for caffeine in premature neonates.
Micallef, Sandrine; Amzal, Billy; Bach, Véronique; Chardon, Karen; Tourneux, Pierre; Bois, Frédéric Y
2007-01-01
Caffeine treatment is widely used in nursing care to reduce the risk of apnoea in premature neonates. Caffeine concentration in blood is an important indicator for checking the therapeutic efficacy of the treatment against apnoea. The present study was aimed at building a pharmacokinetic model as a basis for a medical decision support tool. In the proposed model, time dependence of physiological parameters is introduced to describe the rapid growth of neonates. To take into account the large variability in the population, the pharmacokinetic model is embedded in a population structure. The whole model is inferred within a Bayesian framework. To update caffeine concentration predictions as data from an incoming patient are collected, we propose a fast method that can be used in a medical context. This involves the sequential updating of model parameters (at individual and population levels) via a stochastic particle algorithm. Our model provides better predictions than those obtained with previously published models. We show, through an example, that sequential updating improves predictions of caffeine concentration in blood (reduced bias and shorter credibility intervals). The updating of the pharmacokinetic model using body mass and caffeine concentration data is studied, showing how informative caffeine concentration data are in contrast to body mass data. This study provides the methodological basis to predict caffeine concentration in blood after a given treatment, when data are collected on the treated neonate.
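The sequential particle update can be sketched as follows; the one-compartment model, priors and noise level are invented stand-ins, not the published caffeine model:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000
    # particles for the elimination rate k of a toy model C(t) = (D/V) exp(-k t)
    k = rng.lognormal(mean=np.log(0.1), sigma=0.5, size=n)
    w = np.full(n, 1.0 / n)

    def update(t, c_obs, dose_over_v=10.0, sigma=0.5):
        # re-weight particles by the likelihood of the new concentration,
        # then resample: the sequential step that sharpens the predictions
        global k, w
        c_pred = dose_over_v * np.exp(-k * t)
        w = w * np.exp(-0.5 * ((c_obs - c_pred) / sigma) ** 2)
        w = w / w.sum()
        idx = rng.choice(n, size=n, p=w)     # simple multinomial resampling
        k, w = k[idx], np.full(n, 1.0 / n)

    update(t=5.0, c_obs=6.0)                 # one incoming blood sample
    print("posterior mean elimination rate:", k.mean())

Each new assay narrows the particle cloud around the individual patient's parameters, which is why the abstract reports reduced bias and shorter credibility intervals as data accumulate.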
Highly efficient model updating for structural condition assessment of large-scale bridges.
DOT National Transportation Integrated Search
2015-02-01
For efficiently updating models of large-scale structures, the response surface (RS) method based on radial basis functions (RBFs) is proposed to model the input-output relationship of structures. The key issues for applying the proposed method a...
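A sketch of the general idea, assuming SciPy's RBFInterpolator and a fabricated two-parameter example in place of the bridge finite element model:

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(2)
    theta = rng.uniform(0.5, 1.5, size=(50, 2))      # design-of-experiments samples
    freqs = np.column_stack([2.0 * theta[:, 0],      # stand-ins for expensive FE runs
                             theta.sum(axis=1)])
    surface = RBFInterpolator(theta, freqs)          # cheap input-output surrogate

    # updating: search the surrogate, not the FE model, for the best parameters
    measured = np.array([2.1, 2.4])
    grid = rng.uniform(0.5, 1.5, size=(10000, 2))
    best = grid[np.argmin(np.linalg.norm(surface(grid) - measured, axis=1))]
    print("updated parameters:", best)

The saving comes from calling the finite element solver only to train the surface; every iteration of the updating search then costs a function evaluation rather than a full structural analysis.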
UPDATE ON EPA'S URBAN WATERSHED MANAGEMENT BRANCH MODELING ACTIVITIES
This paper provides the Stormwater Management Model (SWMM) user community with a description of the Environmental Protection Agency's (EPA) Office of Research and Development (ORD) approach to urban watershed modeling research and provides an update on current ORD SWMM-related pr...
ASTRO's 2007 core physics curriculum for radiation oncology residents.
Klein, Eric E; Gerbi, Bruce J; Price, Robert A; Balter, James M; Paliwal, Bhudatt; Hughes, Lesley; Huang, Eugene
2007-08-01
In 2004, the American Society for Therapeutic Radiology and Oncology (ASTRO) published a curriculum for physics education. The document described a 54-hour course. In 2006, the committee reconvened to update the curriculum. The committee is composed of physicists and physicians from various residency program teaching institutions. Simultaneously, members have associations with the American Association of Physicists in Medicine, ASTRO, Association of Residents in Radiation Oncology, American Board of Radiology, and American College of Radiology. Representatives from the latter two organizations are key to provide feedback between the examining organizations and ASTRO. Subjects are based on Accreditation Council for Graduate Medical Education requirements (particles and hyperthermia), whereas the majority of subjects and appropriated hours/subject were developed by consensus. The new curriculum is 55 hours, containing new subjects, redistribution of subjects with updates, and reorganization of core topics. For each subject, learning objectives are provided, and for each lecture hour, a detailed outline of material to be covered is provided. Some changes include a decrease in basic radiologic physics, addition of informatics as a subject, increase in intensity-modulated radiotherapy, and migration of some brachytherapy hours to radiopharmaceuticals. The new curriculum was approved by the ASTRO board in late 2006. It is hoped that physicists will adopt the curriculum for structuring their didactic teaching program, and simultaneously, the American Board of Radiology, for its written examination. The American College of Radiology uses the ASTRO curriculum for their training examination topics. In addition to the curriculum, the committee added suggested references, a glossary, and a condensed version of lectures for a Postgraduate Year 2 resident physics orientation. To ensure continued commitment to a current and relevant curriculum, subject matter will be updated again in 2 years.
Update schemes of multi-velocity floor field cellular automaton for pedestrian dynamics
NASA Astrophysics Data System (ADS)
Luo, Lin; Fu, Zhijian; Cheng, Han; Yang, Lizhong
2018-02-01
Modeling pedestrian movement is an interesting problem in both statistical and computational physics. Update schemes of cellular automaton (CA) models for pedestrian dynamics govern the schedule of pedestrian movement. Different update schemes generally make the models behave in different ways, so a model must be carefully recalibrated for each scheme. In this paper, we therefore investigated the influence of four different update schemes, namely the parallel/synchronous scheme, the random scheme, the order-sequential scheme and the shuffled scheme, on pedestrian dynamics. We used a multi-velocity floor field cellular automaton (FFCA) that accounts for changes in pedestrians' moving properties along walking paths and for heterogeneity in pedestrians' walking abilities. Only the parallel scheme requires collision detection and resolution, which makes it behave quite differently from the other update schemes. For pedestrian evacuation under the parallel scheme, the evacuation time is enlarged and differences in pedestrians' walking abilities are better reflected. In the face of a bottleneck, for example an exit, the parallel scheme leads to a longer congestion period and a more dispersive density distribution. The exit flow and the space-time distributions of density and velocity show significant discrepancies across the four update schemes when simulating pedestrian flow with high desired velocity. Update schemes may have no influence on simulated pedestrians' tendency to follow others, but the sequential and shuffled update schemes may enhance the effect of pedestrians' familiarity with their environment.
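The four schedules differ only in who moves when, which a toy single-file corridor makes concrete; the movement rule below is deliberately trivial and is not the paper's multi-velocity FFCA:

    import random

    def step(positions, scheme, length=20):
        # positions: cell index of each pedestrian, all trying to move right one cell
        n = len(positions)
        order = list(range(n))
        if scheme == "random":                # n sub-steps, agents drawn with replacement
            order = [random.randrange(n) for _ in range(n)]
        elif scheme == "shuffled":            # every agent once, fresh order each step
            random.shuffle(order)
        elif scheme == "parallel":
            occupied = set(positions)         # all agents decide on the same snapshot
            targets = [min(p + 1, length) for p in positions]
            for i, t in enumerate(targets):
                # conflicts (two agents, one cell) are simply cancelled here
                if targets.count(t) == 1 and t not in occupied:
                    positions[i] = t
            return positions
        for i in order:                       # unshuffled order = order-sequential
            t = min(positions[i] + 1, length)
            if t not in positions:
                positions[i] = t              # later agents see this move at once
        return positions

    print(step([2, 1, 0], "parallel"), step([2, 1, 0], "shuffled"))

Under the parallel scheme a compact queue advances only at its front because conflicts must be resolved, whereas a sequential sweep can move the whole line in one step; this scheduling difference is the mechanism behind the longer congestion periods reported above.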
Selective updating of working memory content modulates meso-cortico-striatal activity.
Murty, Vishnu P; Sambataro, Fabio; Radulescu, Eugenia; Altamura, Mario; Iudicello, Jennifer; Zoltick, Bradley; Weinberger, Daniel R; Goldberg, Terry E; Mattay, Venkata S
2011-08-01
Accumulating evidence from non-human primates and computational modeling suggests that dopaminergic signals arising from the midbrain (substantia nigra/ventral tegmental area) mediate striatal gating of the prefrontal cortex during the selective updating of working memory. Using event-related functional magnetic resonance imaging, we explored the neural mechanisms underlying the selective updating of information stored in working memory. Participants were scanned during a novel working memory task that parses the neurophysiology underlying working memory maintenance, overwriting, and selective updating. Analyses revealed a functionally coupled network consisting of a midbrain region encompassing the substantia nigra/ventral tegmental area, caudate, and dorsolateral prefrontal cortex that was selectively engaged during working memory updating compared to the overwriting and maintenance of working memory content. Further analysis revealed differential midbrain-dorsolateral prefrontal interactions during selective updating between low-performing and high-performing individuals. These findings highlight the role of this meso-cortico-striatal circuitry during the selective updating of working memory in humans, which complements previous research in behavioral neuroscience and computational modeling. Published by Elsevier Inc.
Orbai, Ana-Maria; de Wit, Maarten; Mease, Philip; Shea, Judy A; Gossec, Laure; Leung, Ying Ying; Tillett, William; Elmamoun, Musaab; Callis Duffin, Kristina; Campbell, Willemina; Christensen, Robin; Coates, Laura; Dures, Emma; Eder, Lihi; FitzGerald, Oliver; Gladman, Dafna; Goel, Niti; Grieb, Suzanne Dolwick; Hewlett, Sarah; Hoejgaard, Pil; Kalyoncu, Umut; Lindsay, Chris; McHugh, Neil; Shea, Bev; Steinkoenig, Ingrid; Strand, Vibeke; Ogdie, Alexis
2017-04-01
To identify a core set of domains (outcomes) to be measured in psoriatic arthritis (PsA) clinical trials that represent both patients' and physicians' priorities. We conducted (1) a systematic literature review (SLR) of domains assessed in PsA; (2) international focus groups to identify domains important to people with PsA; (3) two international surveys with patients and physicians to prioritise domains; (4) an international face-to-face meeting with patients and physicians using the nominal group technique method to agree on the most important domains; and (5) presentation and votes at the Outcome Measures in Rheumatology (OMERACT) conference in May 2016. All phases were performed in collaboration with patient research partners. We identified 39 unique domains through the SLR (24 domains) and international focus groups (34 domains). 50 patients and 75 physicians rated domain importance. During the March 2016 consensus meeting, 12 patients and 12 physicians agreed on 10 candidate domains. Then, 49 patients and 71 physicians rated these domains' importance. Five were important to >70% of both groups: musculoskeletal disease activity, skin disease activity, structural damage, pain and physical function. Fatigue and participation were important to >70% of patients. Patient global and systemic inflammation were important to >70% of physicians. The updated PsA core domain set endorsed by 90% of OMERACT 2016 participants includes musculoskeletal disease activity, skin disease activity, pain, patient global, physical function, health-related quality of life, fatigue and systemic inflammation. The updated PsA core domain set incorporates patients' and physicians' priorities and evolving PsA research. Next steps include identifying outcome measures that adequately assess these domains. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
openBEB: open biological experiment browser for correlative measurements
2014-01-01
Background New experimental methods must be developed to study interaction networks in systems biology. To reduce biological noise, individual subjects, such as single cells, should be analyzed using high throughput approaches. The measurement of several correlative physical properties would further improve data consistency. Accordingly, a considerable quantity of data must be acquired, correlated, catalogued and stored in a database for subsequent analysis. Results We have developed openBEB (open Biological Experiment Browser), a software framework for data acquisition, coordination, annotation and synchronization with database solutions such as openBIS. OpenBEB consists of two main parts: a core program and a plug-in manager. Whereas the data-type independent core of openBEB maintains a local container of raw data and metadata and provides annotation and data management tools, all data-specific tasks are performed by plug-ins. The open architecture of openBEB enables the fast integration of plug-ins, e.g., for data acquisition or visualization. A macro-interpreter allows the automation and coordination of the different modules. An update and deployment mechanism keeps the core program, the plug-ins and the metadata definition files in sync with a central repository. Conclusions The versatility, the simple deployment and update mechanism, and the scalability in terms of module integration offered by openBEB make this software interesting for a large scientific community. OpenBEB targets three types of researcher, ideally working closely together: (i) engineers and scientists developing new methods and instruments, e.g., for systems biology, (ii) scientists performing biological experiments, and (iii) theoreticians and mathematicians analyzing data. The design of openBEB enables the rapid development of plug-ins, which inherently benefit from the “housekeeping” abilities of the core program. We report the use of openBEB to combine live cell microscopy, microfluidic control and visual proteomics. In this example, measurements from diverse complementary techniques are combined and correlated. PMID:24666611
Zdanowicz, Christian; Kruemmel, Eva; Lean, David; Poulain, Alexandre; Kinnard, Christophe; Yumvihoze, Emmanuel; Chen, JiuBin; Hintelmann, Holger
2015-03-15
Sulfate (SO4(2-)) and mercury (Hg) are airborne pollutants transported to the Arctic where they can affect properties of the atmosphere and the health of marine or terrestrial ecosystems. Detecting trends in Arctic Hg pollution is challenging because of the short period of direct observations, particularly of actual deposition. Here, we present an updated proxy record of atmospheric SO4(2-) and a new 40-year record of total Hg (THg) and monomethyl Hg (MeHg) deposition developed from a firn core (P2010) drilled from Penny Ice Cap, Baffin Island, Canada. The updated P2010 record shows stable mean SO4(2-) levels over the past 40 years, which is inconsistent with observations of declining atmospheric SO4(2-) or snow acidity in the Arctic during the same period. A sharp THg enhancement in the P2010 core ca 1991 is tentatively attributed to the fallout from the eruption of the Icelandic volcano Hekla. Although MeHg accumulation on Penny Ice Cap had remained constant since 1970, THg accumulation increased after the 1980s. This increase is not easily explained by changes in snow accumulation, marine aerosol inputs or air mass trajectories; however, a causal link may exist with the declining sea-ice cover conditions in the Baffin Bay sector. The ratio of THg accumulation between pre-industrial times (reconstructed from archived ice cores) and the modern industrial era is estimated at between 4- and 16-fold, which is consistent with estimates from Arctic lake sediment cores. The new P2010 THg record is the first of its kind developed from the Baffin Island region of the eastern Canadian Arctic and one of very few such records presently available in the Arctic. As such, it may help to bridge the knowledge gap linking direct observation of gaseous Hg in the Arctic atmosphere and actual net deposition and accumulation in various terrestrial media. Copyright © 2014 Elsevier B.V. All rights reserved.
Compendium of animal rabies prevention and control, 2011.
2011-11-04
Rabies has one of the highest case-fatality ratios of any infectious disease. This report provides recommendations for public health officials, veterinarians, animal control officials, and other parties engaged in rabies prevention and control activities and should serve as the basis for standardizing procedures among jurisdictions. The recommendations regarding domestic animal vaccination, management of animals exposed to rabies, and management of animals that bite humans are the core elements of animal rabies control and human rabies prevention. These updated 2011 guidelines include the national case definition for animal rabies and clarify the role of the CDC rabies laboratory in providing confirmatory testing of suspect animals. The table of rabies vaccines licensed and marketed in the United States has been updated, and additional references have been included to provide scientific support for information in this report.
NASA Astrophysics Data System (ADS)
Hadade, Ioan; di Mare, Luca
2016-08-01
Modern multicore and manycore processors exhibit multiple levels of parallelism through a wide range of architectural features such as SIMD for data parallel execution or threads for core parallelism. The exploitation of multi-level parallelism is therefore crucial for achieving superior performance on current and future processors. This paper presents the performance tuning of a multiblock CFD solver on Intel Sandy Bridge and Haswell multicore CPUs and the Intel Xeon Phi Knights Corner coprocessor. Code optimisations have been applied on two computational kernels exhibiting different computational patterns: the update of flow variables and the evaluation of the Roe numerical fluxes. We discuss at great length the code transformations required for achieving efficient SIMD computations for both kernels across the selected devices, including SIMD shuffles and transpositions for flux stencil computations and global memory transformations. Core parallelism is expressed through threading based on a number of domain decomposition techniques together with optimisations pertaining to alleviating NUMA effects found in multi-socket compute nodes. Results are correlated with the Roofline performance model in order to assert their efficiency for each distinct architecture. We report significant speedups for single thread execution across both kernels: 2-5X on the multicore CPUs and 14-23X on the Xeon Phi coprocessor. Computations at full node and chip concurrency deliver a factor of three speedup on the multicore processors and up to 24X on the Xeon Phi manycore coprocessor.
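The Roofline model referenced here bounds attainable performance P, in its standard form, by

    P = min( P_peak , I × B_mem )

where P_peak is the peak floating-point rate, B_mem the memory bandwidth, and I the kernel's arithmetic intensity in flops per byte. Kernels on the sloped part of the roof are memory-bound, so transformations that raise I or improve SIMD utilisation, such as the shuffles and transpositions described above, move a kernel toward the architecture's compute bound.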
Samman, Samir; McCarthur, Jennifer O; Peat, Mary
2006-01-01
Benchmarking has been adopted by educational institutions as a potentially sensitive tool for improving learning and teaching. To date there has been limited application of benchmarking methodology in the Discipline of Nutritional Science. The aim of this survey was to define core elements and outstanding practice in Nutritional Science through collaborative benchmarking. Questionnaires that aimed to establish proposed core elements for Nutritional Science, and inquired about definitions of "good" and "outstanding" practice, were posted to named representatives at eight Australian universities. Seven respondents identified core elements that included knowledge of nutrient metabolism and requirement, food production and processing, modern biomedical techniques that could be applied to understanding nutrition, and social and environmental issues as related to Nutritional Science. Four of the eight institutions who agreed to participate in the present survey identified the integration of teaching with research as an indicator of outstanding practice. Nutritional Science is a rapidly evolving discipline. Further and more comprehensive surveys are required to consolidate and update the definition of the discipline, and to identify the optimal way of teaching it. Global ideas and specific regional requirements also need to be considered.
Petersen, M.D.; Mueller, C.S.
2011-01-01
The USGS National Seismic Hazard Maps are updated about every six years by incorporating newly vetted science on earthquakes and ground motions. The 2008 hazard maps for the central and eastern United States region (CEUS) were updated by using revised New Madrid and Charleston source models, an updated seismicity catalog and an estimate of magnitude uncertainties, a distribution of maximum magnitudes, and several new ground-motion prediction equations. The new models resulted in significant ground-motion changes at 5 Hz and 1 Hz spectral acceleration with 5% damping compared to the 2002 version of the hazard maps. The 2008 maps have now been incorporated into the 2009 NEHRP Recommended Provisions, the 2010 ASCE-7 Standard, and the 2012 International Building Code. The USGS is now planning the next update of the seismic hazard maps, which will be provided to the code committees in December 2013. Science issues that will be considered for introduction into the CEUS maps include: 1) updated recurrence models for New Madrid sources, including new geodetic models and magnitude estimates; 2) new earthquake sources and techniques considered in the 2010 model developed by the nuclear industry; 3) new NGA-East ground-motion models (currently under development); and 4) updated earthquake catalogs. We will hold a regional workshop in late 2011 or early 2012 to discuss these and other issues that will affect the seismic hazard evaluation in the CEUS.
SysML model of exoplanet archive functionality and activities
NASA Astrophysics Data System (ADS)
Ramirez, Solange
2016-08-01
The NASA Exoplanet Archive is an online service that serves data and information on exoplanets and their host stars to support astronomical research related to the search for and characterization of extra-solar planetary systems. In order to provide the most up-to-date data sets to users, the archive performs weekly updates that include additions to the database and updates to the services as needed. These weekly updates are complex due to the interfaces within the archive. I will present a SysML model that helps us perform these update activities on a weekly basis.
NASA Astrophysics Data System (ADS)
Barton, E.; Middleton, C.; Koo, K.; Crocker, L.; Brownjohn, J.
2011-07-01
This paper presents the results from collaboration between the National Physical Laboratory (NPL) and the University of Sheffield on an ongoing research project at NPL. A 50-year-old reinforced concrete footbridge has been converted into a full-scale structural health monitoring (SHM) demonstrator. The structure is monitored using a variety of techniques; however, interrelating results and converting data to knowledge are not possible without a reliable numerical model. During the first stage of the project, the work concentrated on static loading, and an FE model of the undamaged bridge was created and updated under specified static loading and temperature conditions. This model was found to accurately represent the response under static loading, and it was used to identify locations for sensor installation. The next stage involves the evaluation of repair/strengthening patches under both static and dynamic loading. Therefore, before deliberately introducing significant damage, the first set of dynamic tests was conducted and modal properties were estimated. The measured modal properties did not match the modal analysis from the statically updated FE model; it was clear that the existing model required updating. This paper introduces the results of the dynamic testing and model updating. It is shown that the structure exhibits large non-linear, amplitude-dependent characteristics. This makes updating difficult, but we attempt to produce the best linear representation of the structure. A sensitivity analysis is performed to determine the most sensitive locations for the planned damage/repair scenarios and is used to decide whether additional sensors will be necessary.
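For reference, sensitivity-based FE model updating of the kind described here is usually iterated in the generic form

    θ_{k+1} = θ_k + S_k^+ ( z_m − z(θ_k) )

where θ collects the updating parameters, z(θ) the model's modal properties, z_m their measured counterparts, and S_k^+ the pseudo-inverse of the sensitivity matrix S_k = ∂z/∂θ at the current estimate (a textbook formulation, not necessarily the exact scheme used at NPL). The amplitude-dependent behaviour reported above violates the linearity this iteration assumes, which is why the authors aim only for the best linear representation of the structure.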