Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Version 2)
EPA announced the availability of the final report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Version 2). This update furthered land change modeling by providing nationwide housing development scenarios up to 2100. This newest version includes updated population and land use data sets and addresses limitations identified in ICLUS v1 in both the migration and spatial allocation models. The companion user guide (Final Report) describes the development of ICLUS v2 and the updates that were made to the original data sets and the demographic and spatial allocation models. The GIS tool enables users to run SERGoM with the population projections developed for the ICLUS project and allows users to modify the spatial allocation of housing density across the landscape.
Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS)
EPA announced the availability of the draft report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS), for a 30-day public comment period. The ICLUS version 2 (v2) modeling tool furthered land change modeling by providing nationwide housing development scenarios up to 2100. ICLUS v2 includes updated population and land use data sets and addresses limitations identified in ICLUS v1 in both the migration and spatial allocation models. The companion user guide describes the development of ICLUS v2 and the updates that were made to the original data sets and the demographic and spatial allocation models. The GIS tool enables users to run SERGoM with the population projections developed for the ICLUS project and allows users to modify the spatial allocation of housing density across the landscape.
Machine learning in updating predictive models of planning and scheduling transportation projects
DOT National Transportation Integrated Search
1997-01-01
A method combining machine learning and regression analysis to automatically and intelligently update predictive models used in the Kansas Department of Transportation's (KDOT's) internal management system is presented. The predictive models used...
OSATE Overview & Community Updates
2015-02-15
Delange, Julien. Topics: OSATE main language capabilities; modeling patterns and model samples for beginners; Error-Model examples; EMV2 model constructs; demonstration of tools; case ...
Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model
NASA Technical Reports Server (NTRS)
Boone, Spencer
2017-01-01
This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-05
... projections models, as well as changes to future vehicle mix assumptions, that influence the emission... methodology that may occur in the future such as updated socioeconomic data, new models, and other factors... updated mobile emissions model, the Motor Vehicle Emissions Simulator (also known as MOVES2010a), and to...
Updating of states in operational hydrological models
NASA Astrophysics Data System (ADS)
Bruland, O.; Kolberg, S.; Engeland, K.; Gragne, A. S.; Liston, G.; Sand, K.; Tøfte, L.; Alfredsen, K.
2012-04-01
Operationally, the main purpose of hydrological models is to provide runoff forecasts. The quality of the model state and the accuracy of the weather forecast, together with the model quality, define the runoff forecast quality. Input and model errors accumulate over time and may leave the model in a poor state. Usually model states can be related to observable conditions in the catchment, and updating these states, given their relation to observable catchment conditions, directly influences forecast quality. Norway is at the international forefront of hydropower scheduling on both short and long time horizons. Inflow forecasts are fundamental to this scheduling; their quality directly influences producers' profit as they optimize hydropower production to market demand while minimizing spill of water and maximizing available hydraulic head. The quality of the inflow forecasts strongly depends on the quality of the models applied and the quality of the information they use. In this project the focus has been on improving the quality of the model states on which the forecast is based. Runoff and snow storage are two observable quantities that reflect the model state and are used in this project for updating. The methods used can generally be divided into three groups: the first re-estimates the forcing data in the updating period; the second alters the weights in the forecast ensemble; and the third directly changes the model states. The uncertainty in the forcing data through the updating period stems both from uncertainty in the observations themselves and from how well the gauging stations represent the catchment with respect to temperature and precipitation. The project examines methodologies that automatically re-estimate the forcing data and tests the results against the observed response. Model uncertainty is reflected in a joint distribution of model parameters estimated using the DREAM algorithm.
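As an illustration of the third group of methods above (directly changing model states), the following is a minimal sketch for a hypothetical one-bucket, degree-day model; the model equations, the nudging gains, and the numbers are illustrative assumptions, not the operational models used in the project.

```python
def simulate_day(snow, soil, precip, temp, ddf=3.0, k=0.2):
    """One daily step of a toy conceptual model (illustrative only):
    degree-day snowmelt feeding a linear-reservoir soil store."""
    if temp <= 0.0:
        snow += precip           # precipitation accumulates as snow
        liquid = 0.0
    else:
        melt = min(snow, ddf * temp)
        snow -= melt
        liquid = melt + precip   # melt plus rain reach the soil store
    soil += liquid
    runoff = k * soil            # linear-reservoir outflow
    soil -= runoff
    return snow, soil, runoff

def nudge_states(snow, soil, obs_snow, obs_runoff, gain=0.5, k=0.2):
    """Direct state updating: pull the model states toward values implied by
    observed snow storage and observed runoff. The gain of 0.5 is arbitrary."""
    snow += gain * (obs_snow - snow)
    implied_soil = obs_runoff / k        # storage that would yield the observed runoff
    soil += gain * (implied_soil - soil)
    return max(snow, 0.0), max(soil, 0.0)

snow, soil = 120.0, 40.0                                   # initial states (mm)
snow, soil, q_sim = simulate_day(snow, soil, precip=5.0, temp=2.0)
snow, soil = nudge_states(snow, soil, obs_snow=110.0, obs_runoff=9.5)
print(f"simulated runoff {q_sim:.1f} mm; updated states: snow {snow:.1f} mm, soil {soil:.1f} mm")
```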
NASA Astrophysics Data System (ADS)
Barton, E.; Middleton, C.; Koo, K.; Crocker, L.; Brownjohn, J.
2011-07-01
This paper presents the results from collaboration between the National Physical Laboratory (NPL) and the University of Sheffield on an ongoing research project at NPL. A 50-year-old reinforced concrete footbridge has been converted into a full-scale structural health monitoring (SHM) demonstrator. The structure is monitored using a variety of techniques; however, interrelating results and converting data to knowledge are not possible without a reliable numerical model. During the first stage of the project, the work concentrated on static loading, and an FE model of the undamaged bridge was created and updated under specified static loading and temperature conditions. This model was found to accurately represent the response under static loading and was used to identify locations for sensor installation. The next stage involves the evaluation of repair/strengthening patches under both static and dynamic loading. Therefore, before deliberately introducing significant damage, the first set of dynamic tests was conducted and modal properties were estimated. The measured modal properties did not match the modal analysis from the statically updated FE model; it was clear that the existing model required updating. This paper introduces the results of the dynamic testing and model updating. It is shown that the structure exhibits large non-linear, amplitude-dependent characteristics. This complicates the updating process, but we attempt to produce the best linear representation of the structure. A sensitivity analysis is performed to determine the most sensitive locations for planned damage/repair scenarios and is used to decide whether additional sensors will be necessary.
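As a sketch of what sensitivity-based updating of a linear dynamic model can look like, the following example adjusts the stiffnesses of a hypothetical two-degree-of-freedom system so that its natural frequencies approach measured values; the system, the parameter set, and the Gauss-Newton step are illustrative assumptions, not the NPL bridge model.

```python
import numpy as np

def natural_freqs(k1, k2, m1=1.0, m2=1.0):
    """Natural frequencies (Hz) of a two-DOF chain m1--k1--m2--k2--ground (illustrative)."""
    K = np.array([[k1, -k1], [-k1, k1 + k2]])
    M = np.diag([m1, m2])
    omega_sq = np.sort(np.linalg.eigvals(np.linalg.solve(M, K)).real)
    return np.sqrt(omega_sq) / (2.0 * np.pi)

def update_stiffness(theta, f_measured, iterations=10, h=1e-4):
    """Gauss-Newton sensitivity updating: adjust the stiffness parameters so that
    the model natural frequencies approach the measured ones."""
    theta = np.asarray(theta, dtype=float)
    for _ in range(iterations):
        f_model = natural_freqs(*theta)
        residual = f_measured - f_model
        # finite-difference sensitivity matrix d(frequency)/d(parameter)
        S = np.zeros((f_model.size, theta.size))
        for j in range(theta.size):
            pert = theta.copy()
            pert[j] += h
            S[:, j] = (natural_freqs(*pert) - f_model) / h
        delta = np.linalg.lstsq(S, residual, rcond=None)[0]
        theta = np.maximum(theta + delta, 1.0)   # keep stiffnesses positive
    return theta

f_test = np.array([0.9, 2.4])                    # "measured" frequencies in Hz (assumed)
k_updated = update_stiffness([50.0, 120.0], f_test)
print("updated stiffnesses:", k_updated)
```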
Water Erosion Prediction Project (WEPP) model status and updates
USDA-ARS?s Scientific Manuscript database
This presentation will provide current information on the USDA-ARS Water Erosion Prediction Project (WEPP) model, and its implementation by the USDA-Forest Service (FS), USDA-Natural Resources Conservation Service (NRCS), and other agencies and universities. Most recently, the USDA-NRCS has begun ef...
Build-up Approach to Updating the Mock Quiet Spike(TradeMark) Beam Model
NASA Technical Reports Server (NTRS)
Herrera, Claudia Y.; Pak, Chan-gi
2007-01-01
A crucial part of aircraft design is ensuring that the required margin for flutter is satisfied. A trustworthy flutter analysis, which begins with an accurate dynamics model, is necessary for this task. Traditionally, a model was updated manually by fine-tuning specific stiffness parameters until the analytical results matched test data, a time-consuming iterative process. NASA Dryden Flight Research Center has developed a mode matching code to execute this process in a more efficient manner. Recently, this code was implemented in the F-15B/Quiet Spike(TradeMark) (Gulfstream Aerospace Corporation, Savannah, Georgia) model update. A build-up approach requiring several ground vibration test configurations and a series of model updates was implemented in order to determine the connection stiffness between aircraft and test article. The mode matching code successfully updated various models for the F-15B/Quiet Spike(TradeMark) project to within 1 percent error in frequency, with modal assurance criterion values ranging from 88.51 to 99.42 percent.
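The modal assurance criterion (MAC) quoted above compares analytical and test mode shapes; a minimal sketch follows, using made-up mode-shape vectors rather than Quiet Spike data.

```python
import numpy as np

def mac(phi_analytical, phi_test):
    """Modal assurance criterion between two real mode-shape vectors (0 to 1)."""
    num = np.abs(phi_analytical @ phi_test) ** 2
    den = (phi_analytical @ phi_analytical) * (phi_test @ phi_test)
    return num / den

# Illustrative mode shapes sampled at four sensor locations (assumed values)
phi_fem = np.array([0.10, 0.45, 0.80, 1.00])   # analytical (FE) shape
phi_gvt = np.array([0.12, 0.43, 0.78, 1.00])   # ground vibration test shape
print(f"MAC = {mac(phi_fem, phi_gvt):.4f}")    # values near 1 indicate a good match
```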
Resource-Constrained Project Scheduling Under Uncertainty: Models, Algorithms and Applications
2014-11-10
Related publications: "Make-to-Order (MTO) Production Planning using Bayesian Updating," International Journal of Production Economics (April 2014), Norman Keith Womer, Haitao ...; "Made-to-Order Production Scheduling using Bayesian Updating," working paper, under second-round review in International Journal of Production Economics.
What's New: Update on GASB and Accounting Standards.
ERIC Educational Resources Information Center
Marrone, Robert S.; Scharle, Robert E.
1996-01-01
Updates the Governmental Accounting Standards Board (GASB) statements, which pronounce upon and provide guidance in accounting and financial reporting for state and local governmental entities. Describes the development of GASB's governmental finance-reporting model project and identifies five components of internal control. One figure and two…
Title V in South Carolina -- An Update.
ERIC Educational Resources Information Center
Jacob, Nelson L.
Since South Carolina's Title V Community and Resource Development (CRD) project is limited to one small rural county (Williamsburg) affording careful documentation, this paper explicates South Carolina's CRD process via a social action model. This project, then, is described in terms of the following model components: (1) community initiative…
NASA Astrophysics Data System (ADS)
Cook, L. M.; Samaras, C.; McGinnis, S. A.
2017-12-01
Intensity-duration-frequency (IDF) curves are a common input to urban drainage design and are used to represent extreme rainfall in a region. As rainfall patterns shift into a non-stationary regime as a result of climate change, these curves will need to be updated with future projections of extreme precipitation. Many regions have begun to update these curves to reflect the trends from downscaled climate models; however, few studies have compared the methods for doing so, or the uncertainty that results from the selection of the native grid scale and temporal resolution of the climate model. This study examines the variability in updated IDF curves for Pittsburgh using four different methods for adjusting gridded regional climate model (RCM) outputs into station-scale precipitation extremes: (1) a simple change factor applied to observed return levels, (2) a naïve adjustment of stationary and non-stationary Generalized Extreme Value (GEV) distribution parameters, (3) a transfer function of the GEV parameters from the annual maximum series, and (4) kernel density distribution mapping bias correction of the RCM time series. Return level estimates (rainfall intensities) and confidence intervals from these methods for the 1-hour to 48-hour durations are tested for sensitivity to the underlying spatial and temporal resolution of the climate ensemble from the NA-CORDEX project, as well as the future time period used for updating. The first goal is to determine whether uncertainty is highest for: (i) the downscaling method, (ii) the climate model resolution, (iii) the climate model simulation, (iv) the GEV parameters, or (v) the future time period examined. Initial results for the 6-hour, 10-year return level adjusted with the simple change factor method, using four climate model simulations at two different spatial resolutions, show that uncertainty is highest in the estimation of the GEV parameters. The second goal is to determine whether complex downscaling methods and high-resolution climate models are necessary for updating, or whether simpler methods and lower-resolution climate models will suffice. The final results can be used to inform the most appropriate method and climate model resolutions to use for updating IDF curves for urban drainage design.
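A minimal sketch of the simple change-factor approach (method 1 above): fit a GEV distribution to annual-maximum series with scipy, compute return levels, and scale the observed return level by the ratio of future to historical RCM return levels. The synthetic annual maxima, the Gumbel generator, and the 10-year return period are illustrative assumptions.

```python
import numpy as np
from scipy.stats import genextreme

def return_level(annual_max, T):
    """T-year return level from a stationary GEV fit to an annual-maximum series."""
    c, loc, scale = genextreme.fit(annual_max)
    return genextreme.isf(1.0 / T, c, loc=loc, scale=scale)

rng = np.random.default_rng(0)
obs_amax      = rng.gumbel(30.0, 8.0, size=40)   # observed 6-h annual maxima (mm), synthetic
rcm_hist_amax = rng.gumbel(28.0, 7.0, size=40)   # RCM historical run, synthetic
rcm_fut_amax  = rng.gumbel(33.0, 9.0, size=40)   # RCM future run, synthetic

T = 10  # return period in years
change_factor = return_level(rcm_fut_amax, T) / return_level(rcm_hist_amax, T)
updated_level = change_factor * return_level(obs_amax, T)
print(f"Updated {T}-year return level: {updated_level:.1f} mm")
```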
AG Channel Measurement and Modeling Results for Over-Water and Hilly Terrain Conditions
NASA Technical Reports Server (NTRS)
Matolak, David W.; Sun, Ruoyu
2015-01-01
This report describes work completed over the past year on our project, entitled "Unmanned Aircraft Systems (UAS) Research: The AG Channel, Robust Waveforms, and Aeronautical Network Simulations." This project is funded under the NASA project "Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS)." In this report we provide the following: an update on project progress; a description of the over-freshwater and hilly terrain initial results on path loss, delay spread, small-scale fading, and correlations; complete path loss models for the over-water AG channels; analysis for obtaining parameter statistics required for development of accurate wideband AG channel models; and analysis of an atypical AG channel in which the aircraft flies out of the ground site antenna main beam. We have modeled the small-scale fading of these channels with Ricean statistics and have quantified the behavior of the Ricean K-factor. We also provide some results for correlations of signal components, both intra-band and inter-band. An updated literature review and a summary describing future work are also included.
ERIC Educational Resources Information Center
Zylinski, Doris; And Others
In 1991-92, a project was undertaken at Napa Valley College to update the college's 1990 Comparative Study of Vocational Nursing Curriculum and Employer Requirements, to develop a model articulation program for licensed nurses pursuing associate degrees, and to produce a guide for recruiting and retaining underrepresented groups in vocational…
ERIC Educational Resources Information Center
Banta, Trudy W., Ed.
2014-01-01
This issue of "Assessment Update" presents the following articles: (1) Effective Leadership Assessment: A 360-Degree Process; (2) Editor's Notes: Accentuating the Positive in Our Work; (3) The Broadcast Education Association's Model Rubrics Project: Building Consensus One Rubric at a Time; (4) Building a Better…
Distribution factors for construction loads and girder capacity equations [project summary].
DOT National Transportation Integrated Search
2017-03-01
This project focused on the use of Florida I-beams (FIBs) in bridge construction. University of Florida researchers used analytical models and finite element analysis to update equations used in the design of bridges using FIBs. They were particularl...
NASA Astrophysics Data System (ADS)
Sani, M. S. M.; Nazri, N. A.; Alawi, D. A. J.
2017-09-01
Resistance spot welding (RSW) is a proficient joining method commonly used for sheet metal joining and is one of the oldest spot welding processes used in industry, especially in the automotive sector. RSW applies heat and pressure over a controlled time to join two or more metal sheets at a localized area, and is regarded as one of the most efficient welding processes in metal fabrication. The purpose of this project is to perform model updating of an RSW plate structure joining mild steel 1010 and stainless steel 304. To carry out the updating, normal-mode finite element analysis (FEA) and experimental modal analysis (EMA) have been performed. Results show that the discrepancies in natural frequency between FEA and EMA are below 10%. Sensitivity-based model updating is evaluated to determine which parameters most influence this structural dynamic modification. Young's modulus and density of both materials are identified as significant parameters for model updating. In conclusion, after model updating, the total average error of the dissimilar RSW plate model is significantly improved.
NASA Astrophysics Data System (ADS)
Keil, M.; Esch, T.; Feigenspan, S.; Marconcini, M.; Metz, A.; Ottinger, M.; Zeidler, J.
2015-04-01
For the 2012 update of CORINE Land Cover, a new approach was developed in Germany in order to benefit from the higher accuracy of the national topographic database. By agreement between the Federal Environment Agency (UBA) and the Federal Agency for Cartography and Geodesy (BKG), CLC2012 has been derived from an updated digital landscape model, DLM-DE, which is based on the Official Topographical Cartographic Information System ATKIS of the land survey authorities. The DLM-DE 2009 created by the BKG served as the base for the 2012 update in the national and EU context, both under the responsibility of the BKG. In addition to the updated CLC2012, a second product, the layer "CLC_Change" (2006-2012), was also requested by the European Environment Agency. The objective of DLR-DFD's part of the project was to contribute the primary change areas from 2006 to 2009 during the phase of method change, using the refined 2009 geometry of the DLM-DE 2009 for a retrospective view back to 2006. A semiautomatic approach was developed for this task, in which AWiFS time series data from 2005/2006 played an important role in separating grassland from arable land. Other valuable datasets for the project were the already available GMES land monitoring products of 2006, such as the soil sealing layer 2006. The paper describes the developed method and discusses exemplary results of the CORINE backdating part of the project.
2007-02-01
... responsible to the Government for certifying these technical risks [4] and [5]. The current funding model for Project S&T Plans is: Pre-First ... the new costing spreadsheets at Annexes C-E. ... A complete set of S&T Activities: the 6th July 2006 DCIC decisions to change the funding model increase ... changes to the funding model mean that the set of S&T Activities in the Project S&T Plans will need to be categorised in new ways to fit in with the ...
NASA Astrophysics Data System (ADS)
Díaz, Verónica; Poblete, Alvaro
2017-07-01
This paper describes part of a research and development project carried out in public elementary schools. Its objective was to update the mathematical and didactic knowledge of teachers at two consecutive levels in urban and rural public schools of Region de Los Lagos and Region de Los Rios in southern Chile. To that end, by means of an advanced training project based on a professional competences model, didactic interventions based on types of problems and types of mathematical competences, with content analysis and learning assessment, were designed. The teachers' competence regarding the didactic strategy used and its results, as well as the students' learning achievements, are specified. The project made it possible to validate a strategy of lifelong improvement in mathematics, based on the professional competences of teachers and their didactic transposition in the classroom, as an alternative for consolidating learning in areas considered vulnerable in two regions of the country.
NASA Technical Reports Server (NTRS)
1998-01-01
Under an SBIR (Small Business Innovative Research) contract with Johnson Space Center, Knowledge Based Systems Inc. (KBSI) developed an intelligent software environment for modeling and analyzing mission planning activities, simulating behavior, and, using a unique constraint propagation mechanism, updating plans with each change in mission planning activities. KBSI developed this technology into a commercial product, PROJECTLINK, a two-way bridge between PROSIm, KBSI's process modeling and simulation software, and leading project management software like Microsoft Project and Primavera's SureTrak Project Manager.
Continuous Improvement of a Groundwater Model over a 20-Year Period: Lessons Learned.
Andersen, Peter F; Ross, James L; Fenske, Jon P
2018-04-17
Groundwater models developed for specific sites generally become obsolete within a few years due to changes in: (1) modeling technology; (2) site/project personnel; (3) project funding; and (4) modeling objectives. Consequently, new models are sometimes developed for the same sites using the latest technology and data, but without potential knowledge gained from the prior models. When it occurs, this practice is particularly problematic because, although technology, data, and observed conditions change, development of the new numerical model may not consider the conceptual model's underpinnings. As a contrary situation, we present the unique case of a numerical flow and trichloroethylene (TCE) transport model that was first developed in 1993 and since revised and updated annually by the same personnel. The updates are prompted by an increase in the amount of data, exposure to a wider range of hydrologic conditions over increasingly longer timeframes, technological advances, evolving modeling objectives, and revised modeling methodologies. The history of updates shows smooth, incremental changes in the conceptual model and modeled aquifer parameters that result from both increase and decrease in complexity. Myriad modeling objectives have included demonstrating the ineffectiveness of a groundwater extraction/injection system, evaluating potential TCE degradation, locating new monitoring points, and predicting likelihood of exceedance of groundwater standards. The application emphasizes an original tenet of successful groundwater modeling: iterative adjustment of the conceptual model based on observations of actual vs. model response. © 2018, National Ground Water Association.
Technical Update for Vocational Agriculture Teachers in Secondary Schools. Final Report.
ERIC Educational Resources Information Center
Iowa State Univ. of Science and Technology, Ames. Dept. of Agricultural Education.
A project provided ongoing opportunities for teachers in Iowa to upgrade their expertise in agribusiness management using new technology; production, processing, and marketing agricultural products; biotechnology in agriculture; and conservation of natural resources. The project also modeled effective teaching methods and strategies. Project…
NASA Technical Reports Server (NTRS)
Bopp, Genie; Somers, Jeff; Granderson, Brad; Gernhardt, Mike; Currie, Nancy; Lawrence, Chuck
2010-01-01
Topics include occupant protection overview with a focus on crew protection during dynamic phases of flight; occupant protection collaboration; modeling occupant protection; occupant protection considerations; project approach encompassing analysis tools, injury criteria, and testing program development; injury criteria update methodology, unique effects of pressure suits and other factors; and a summary.
Brooks, Lynette E.
2013-01-01
The U.S. Geological Survey (USGS), in cooperation with the Southern Utah Valley Municipal Water Association, updated an existing USGS model of southern Utah and Goshen Valleys for hydrologic and climatic conditions from 1991 to 2011 and used the model for projection and groundwater management simulations. All model files used in the transient model were updated to be compatible with MODFLOW-2005 and with the additional stress periods. The well and recharge files had the most extensive changes. Discharge to pumping wells in southern Utah and Goshen Valleys was estimated and simulated on an annual basis from 1991 to 2011. Recharge estimates for 1991 to 2011 were included in the updated model by using precipitation, streamflow, canal diversions, and irrigation groundwater withdrawals for each year. The model was evaluated to determine how well it simulates groundwater conditions during recent increased withdrawals and drought, and to determine if the model is adequate for use in future planning. In southern Utah Valley, the magnitude and direction of annual water-level fluctuation simulated by the updated model reasonably match measured water-level changes, but they do not simulate as much decline as was measured in some locations from 2000 to 2002. Both the rapid increase in groundwater withdrawals and the total groundwater withdrawals in southern Utah Valley during this period exceed the variations and magnitudes simulated during the 1949 to 1990 calibration period. It is possible that hydraulic properties may be locally incorrect or that changes, such as land use or irrigation diversions, occurred that are not simulated. In the northern part of Goshen Valley, simulated water-level changes reasonably match measured changes. Farther south, however, simulated declines are much less than measured declines. Land-use changes indicate that groundwater withdrawals in Goshen Valley are possibly greater than estimated and simulated. It is also possible that irrigation methods, amount of diversions, or other factors have changed that are not simulated or that aquifer properties are incorrectly simulated. The model can be used for projections about the effects of future groundwater withdrawals and managed aquifer recharge in southern Utah Valley, but rapid changes in withdrawals and increasing withdrawals dramatically may reduce the accuracy of the predicted water-level and groundwater-budget changes. The model should not be used for projections in Goshen Valley until additional withdrawal and discharge data are collected and the model is recalibrated if necessary. Model projections indicate large drawdowns of up to 400 feet and complete cessation of natural discharge in some areas with potential future increases in water use. Simulated managed aquifer recharge counteracts those effects. Groundwater management examples indicate that drawdown could be less, and discharge at selected springs could be greater, with optimized groundwater withdrawals and managed aquifer recharge than without optimization. Recalibration to more recent stresses and seasonal stress periods, and collection of new withdrawal, stream, land-use, and discharge data could improve the model fit to water-level changes and the accuracy of predictions.
Environmental Control Systems for Exploration Missions One and Two
NASA Technical Reports Server (NTRS)
Falcone, Mark A.
2017-01-01
In preparing for Exploration Missions One and Two (EM-1 & EM-2), the Ground Systems Development and Operations Program has significant updates to make to nearly all facilities. This work is being done to accommodate the Space Launch System, which, once complete, will be the largest rocket ever built. Facilitating the launch of such a rocket requires an updated Vehicle Assembly Building, an upgraded launch pad, a payload processing facility, and more. This project involved Environmental Control Systems across several facilities, with a focus on the Mobile Launcher and launch pad. Parts were ordered, analysis models were updated, design drawings were updated, and more.
The Design and Product of National 1:1000000 Cartographic Data of Topographic Map
NASA Astrophysics Data System (ADS)
Wang, Guizhi
2016-06-01
The National Administration of Surveying, Mapping and Geoinformation launched the project of dynamic updating of the national fundamental geographic information database in 2012. Within this project, the 1:50000 database is updated once a year, and the 1:250000 database is downsized and linkage-updated on that basis. In 2014, using the latest achievements of the 1:250000 database, the 1:1000000 digital line graph database was comprehensively updated, and cartographic data of the topographic map and digital elevation model data were generated at the same time. This article mainly introduces the national 1:1000000 cartographic data of the topographic map, including feature content, database structure, database-driven mapping technology, workflow, and so on.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weiss, J.O.; Lapham, E.V.
1996-12-31
This meeting was held June 10, 1996 at Georgetown University. The purpose of this meeting was to provide a multidisciplinary forum for exchange of state-of-the-art information on the human genome education model. Topics of discussion include the following: psychosocial issues; ethical issues for professionals; legislative issues and update; and education issues.
Evaluation of Lower East Fork Poplar Creek Mercury Sources - Model Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ketelle, Richard; Brandt, Craig C.; Peterson, Mark J.
The purpose of this report is to assess new data that has become available and provide an update to the evaluations and modeling presented in the Oak Ridge National Laboratory (ORNL) Technical Manuscript Evaluation of lower East Fork Poplar Creek (LEFPC) Mercury Sources (Watson et al., 2016). Primary sources of field and laboratory data for this update include multiple US Department of Energy (DOE) programs including Environmental Management (EM; e.g., Biological Monitoring and Abatement Program, Mercury Remediation Technology Development [TD], and Applied Field Research Initiative), Office of Science (Mercury Science Focus Areas [SFA] project), and the Y-12 National Security Complex (Y-12) Compliance Department.
SMART-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Hodge, Bri-Mathias
This presentation provides a Smart-DS project overview and status update for the 2017 ARPA-E GRID DATA program meeting, including distribution systems, models, and scenarios, as well as opportunities for GRID DATA collaborations.
Career Education Community Resource Guide.
ERIC Educational Resources Information Center
D'Lugin, Victor; And Others
This guide, developed by the State Project to Implement Career Education (SPICE) in New York, is intended to serve as a model to assist teachers, guidance counselors, administrators, and project staff in using business and community resources in career education programs. The first section of the guide contains information on ways of updating and…
DOT National Transportation Integrated Search
2010-09-15
This project develops a foundation for analysis of the effects of U.S. biofuel energy policy on domestic : and international grain flows and patterns. The primary deliverable of this project is an updated and : expanded spatial equilibrium model of w...
EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than tr...
Employing Retired Military Personnel as Vocational Education Teachers.
ERIC Educational Resources Information Center
Chase, Shirley A.; Tennant, John E.
A project was undertaken to facilitate the employment of retired military personnel as vocational education teachers. The specific objectives of the project were as follows: update the issues, barriers, and incentives involved in employing retired military personnel to fill teaching positions in civilian vocational education, develop a model for…
ERIC Educational Resources Information Center
Díaz, Verónica; Poblete, Alvaro
2017-01-01
This paper describes part of a research and development project carried out in public elementary schools. Its objective was to update the mathematical and didactic knowledge of teachers in two consecutive levels in urban and rural public schools of Region de Los Lagos and Region de Los Rios of southern Chile. To that effect, and by means of an…
Improving prediction of fall risk among nursing home residents using electronic medical records.
Marier, Allison; Olsho, Lauren E W; Rhodes, William; Spector, William D
2016-03-01
Falls are physically and financially costly, but may be preventable with targeted intervention. The Minimum Data Set (MDS) is one potential source of information on fall risk factors among nursing home residents, but its limited breadth and relatively infrequent updates may limit its practical utility. Richer, more frequently updated data from electronic medical records (EMRs) may improve ability to identify individuals at highest risk for falls. The authors applied a repeated events survival model to analyze MDS 3.0 and EMR data for 5129 residents in 13 nursing homes within a single large California chain that uses a centralized EMR system from a leading vendor. Estimated regression parameters were used to project resident fall probability. The authors examined the proportion of observed falls within each projected fall risk decile to assess improvements in predictive power from including EMR data. In a model incorporating fall risk factors from the MDS only, 28.6% of observed falls occurred among residents in the highest projected risk decile. In an alternative specification incorporating more frequently updated measures for the same risk factors from the EMR data, 32.3% of observed falls occurred among residents in the highest projected risk decile, a 13% increase over the base MDS-only specification. Incorporating EMR data improves ability to identify those at highest risk for falls relative to prediction using MDS data alone. These improvements stem chiefly from the greater frequency with which EMR data are updated, with minimal additional gains from availability of additional risk factor variables. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
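A minimal sketch of the decile check described above: rank residents by projected fall probability and report the share of observed falls captured in each decile. The column names and synthetic probabilities are illustrative assumptions, not the study's actual model outputs.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    # projected fall probability from a fitted risk model (synthetic here)
    "projected_risk": rng.beta(2, 8, size=n),
})
# simulate observed falls so that higher projected risk means more falls
df["fell"] = rng.random(n) < df["projected_risk"]

# assign risk deciles (10 = highest projected risk)
df["decile"] = pd.qcut(df["projected_risk"], 10, labels=range(1, 11))

capture = df.groupby("decile", observed=True)["fell"].sum() / df["fell"].sum()
print(capture.sort_index(ascending=False))   # share of all observed falls per decile
```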
Long-distance travel modeling: proof of concept : research results digest.
DOT National Transportation Integrated Search
2016-04-01
This research project was established to provide ADOT with direction on the best sources of data and best practices for updating its long-distance personal travel models to better reflect observed travel behavior. Its original intent was to recommend...
Orientation Modeling for Amateur Cameras by Matching Image Line Features and Building Vector Data
NASA Astrophysics Data System (ADS)
Hung, C. H.; Chang, W. C.; Chen, L. C.
2016-06-01
With the popularity of geospatial applications, database updating is becoming important because of environmental changes over time. Imagery provides a lower-cost and efficient way to update the database. Three-dimensional objects can be measured by space intersection using conjugate image points and orientation parameters of cameras. However, precise orientation parameters of light amateur cameras are not always available, because precision GPS and IMU units are costly and heavy. To automate data updating, a correspondence between object vector data and the image can be established to improve the accuracy of direct georeferencing. This study contains four major parts: (1) back-projection of object vector data, (2) extraction of image feature lines, (3) object-image feature line matching, and (4) line-based orientation modeling. In order to construct the correspondence of features between an image and a building model, the building vector features were back-projected onto the image using the initial camera orientation from GPS and IMU. Image line features were extracted from the imagery. Afterwards, the matching procedure was done by assessing the similarity between the extracted image features and the back-projected ones. The fourth part then utilized line features in orientation modeling. The line-based orientation modeling was performed by integrating line parametric equations into the collinearity condition equations. The experiment data included images with 0.06 m resolution acquired by a Canon EOS 5D Mark II camera on a Microdrones MD4-1000 UAV. Experimental results indicate that 2.1 pixel accuracy may be reached, which is equivalent to 0.12 m in the object space.
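A minimal sketch of the back-projection step: project an object point into the image with the collinearity equations, using an approximate exterior orientation from GPS/IMU. The omega-phi-kappa rotation convention, the camera focal length, and the example coordinates are illustrative assumptions.

```python
import numpy as np

def rotation_opk(omega, phi, kappa):
    """Rotation matrix from omega-phi-kappa angles (radians): R = Rz(kappa) @ Ry(phi) @ Rx(omega)."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def backproject(X, cam_pos, omega, phi, kappa, focal_mm):
    """Collinearity equations: image coordinates (mm) of object point X (same units as cam_pos)."""
    R = rotation_opk(omega, phi, kappa)
    d = R.T @ (np.asarray(X, float) - np.asarray(cam_pos, float))   # point in camera frame
    x = -focal_mm * d[0] / d[2]
    y = -focal_mm * d[1] / d[2]
    return x, y

# Approximate exterior orientation from GPS/IMU (assumed values)
x_img, y_img = backproject(X=[100.0, 50.0, 10.0],
                           cam_pos=[90.0, 40.0, 120.0],
                           omega=0.01, phi=-0.02, kappa=1.57,
                           focal_mm=24.0)
print(f"image coordinates: ({x_img:.3f} mm, {y_img:.3f} mm)")
```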
Daucourt, Mia C; Schatschneider, Christopher; Connor, Carol M; Al Otaiba, Stephanie; Hart, Sara A
2018-01-01
Recent achievement research suggests that executive function (EF), a set of regulatory processes that control both thought and action necessary for goal-directed behavior, is related to typical and atypical reading performance. This project examines the relation of EF, as measured by its components, Inhibition, Updating Working Memory, and Shifting, with a hybrid model of reading disability (RD). Our sample included 420 children who participated in a broader intervention project when they were in KG-third grade (age M = 6.63 years, SD = 1.04 years, range = 4.79-10.40 years). At the time their EF was assessed, using a parent-report Behavior Rating Inventory of Executive Function (BRIEF), they had a mean age of 13.21 years ( SD = 1.54 years; range = 10.47-16.63 years). The hybrid model of RD was operationalized as a composite consisting of four symptoms, and set so that any child could have any one, any two, any three, any four, or none of the symptoms included in the hybrid model. The four symptoms include low word reading achievement, unexpected low word reading achievement, poorer reading comprehension compared to listening comprehension, and dual-discrepancy response-to-intervention, requiring both low achievement and low growth in word reading. The results of our multilevel ordinal logistic regression analyses showed a significant relation between all three components of EF (Inhibition, Updating Working Memory, and Shifting) and the hybrid model of RD, and that the strength of EF's predictive power for RD classification was the highest when RD was modeled as having at least one or more symptoms. Importantly, the chances of being classified as having RD increased as EF performance worsened and decreased as EF performance improved. The question of whether any one EF component would emerge as a superior predictor was also examined and results showed that Inhibition, Updating Working Memory, and Shifting were equally valuable as predictors of the hybrid model of RD. In total, all EF components were significant and equally effective predictors of RD when RD was operationalized using the hybrid model.
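A minimal sketch of an ordinal logistic regression relating the three EF components to an ordinal RD-symptom count (0-4), in the spirit of the analysis above; it assumes statsmodels' OrderedModel is available (statsmodels 0.12+), uses synthetic data, and omits the study's multilevel structure.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(42)
n = 420
ef = pd.DataFrame({
    # BRIEF-style T-scores (higher = worse EF); synthetic values
    "inhibition": rng.normal(50, 10, n),
    "updating_wm": rng.normal(50, 10, n),
    "shifting": rng.normal(50, 10, n),
})

# synthetic ordinal outcome: number of RD symptoms (0-4), loosely tied to worse EF
ef_burden = (ef - 50).mean(axis=1) / 10
symptoms = rng.binomial(4, 1.0 / (1.0 + np.exp(-(ef_burden - 0.5))))
outcome = pd.Series(pd.Categorical(symptoms, categories=range(5), ordered=True))

# standardize predictors (helps the optimizer; does not change the model's logic)
exog = (ef - 50) / 10

model = OrderedModel(outcome, exog, distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```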
Vocational Training and Placement of Severely Disabled Persons. Project Employability--Volume III.
ERIC Educational Resources Information Center
Wehman, Paul, Ed.; Hill, Mark, Ed.
The document contains nine papers reporting the effectiveness of Project Employability, a program to demonstrate and evaluate a training model providing job site training, advocacy, and long term followup for severely disabled individuals. In "Job Placement and Follow-Up of Moderately and Severely Handicapped Individuals--An Update After…
Multimedia Projects in Education: Designing, Producing, and Assessing, Third Edition
ERIC Educational Resources Information Center
Ivers, Karen S.; Barron, Ann E.
2005-01-01
Building on the materials in the two previous successful editions, this book features approximately 40% all new material and updates the previous information. The authors use the DDD-E model (Decide, Design, Develop--Evaluate) to show how to select and plan multimedia projects, use presentation and development tools, manage graphics, audio, and…
NASA Technical Reports Server (NTRS)
Mathews, William S.; Liu, Ning; Francis, Laurie K.; OReilly, Taifun L.; Schrock, Mitchell; Page, Dennis N.; Morris, John R.; Joswig, Joseph C.; Crockett, Thomas M.; Shams, Khawaja S.
2011-01-01
Previously, it was time-consuming to hand-edit data and then set up simulation runs to find the effect and impact of the input data on a spacecraft. MPS Editor provides the user the capability to create/edit/update models and sequences, and immediately try them out using what appears to the user as one piece of software. MPS Editor provides an integrated sequencing environment for users. It provides them with software that can be utilized during development as well as actual operations. In addition, it provides them with a single, consistent, user friendly interface. MPS Editor uses the Eclipse Rich Client Platform to provide an environment that can be tailored to specific missions. It provides the capability to create and edit, and includes an Activity Dictionary to build the simulation spacecraft models, build and edit sequences of commands, and model the effects of those commands on the spacecraft. MPS Editor is written in Java using the Eclipse Rich Client Platform. It is currently built with four perspectives: the Activity Dictionary Perspective, the Project Adaptation Perspective, the Sequence Building Perspective, and the Sequence Modeling Perspective. Each perspective performs a given task. If a mission doesn't require that task, the unneeded perspective is not added to that project's delivery. In the Activity Dictionary Perspective, the user builds the project-specific activities, observations, calibrations, etc. Typically, this is used during the development phases of the mission, although it can be used later to make changes and updates to the Project Activity Dictionary. In the Adaptation Perspective, the user creates the spacecraft models such as power, data store, etc. Again, this is typically used during development, but will be used to update or add models of the spacecraft. The Sequence Building Perspective allows the user to create a sequence of activities or commands that go to the spacecraft. It provides a simulation of the activities and commands that have been created.
A vision and strategy for the virtual physiological human: 2012 update.
Hunter, Peter; Chapman, Tara; Coveney, Peter V; de Bono, Bernard; Diaz, Vanessa; Fenner, John; Frangi, Alejandro F; Harris, Peter; Hose, Rod; Kohl, Peter; Lawford, Pat; McCormack, Keith; Mendes, Miriam; Omholt, Stig; Quarteroni, Alfio; Shublaq, Nour; Skår, John; Stroetmann, Karl; Tegner, Jesper; Thomas, S Randall; Tollis, Ioannis; Tsamardinos, Ioannis; van Beek, Johannes H G M; Viceconti, Marco
2013-04-06
European funding under Framework 7 (FP7) for the virtual physiological human (VPH) project has been in place now for 5 years. The VPH Network of Excellence (NoE) has been set up to help develop common standards, open source software, freely accessible data and model repositories, and various training and dissemination activities for the project. It is also working to coordinate the many clinically targeted projects that have been funded under the FP7 calls. An initial vision for the VPH was defined by the FP6 STEP project in 2006. In 2010, we wrote an assessment of the accomplishments of the first two years of the VPH in which we considered the biomedical science, healthcare and information and communications technology challenges facing the project (Hunter et al. 2010 Phil. Trans. R. Soc. A 368, 2595-2614 (doi:10.1098/rsta.2010.0048)). We proposed that a not-for-profit professional umbrella organization, the VPH Institute, should be established as a means of sustaining the VPH vision beyond the time-frame of the NoE. Here, we update and extend this assessment and in particular address the following issues raised in response to Hunter et al.: (i) a vision for the VPH updated in the light of progress made so far, (ii) biomedical science and healthcare challenges that the VPH initiative can address while also providing innovation opportunities for the European industry, and (iii) external changes needed in regulatory policy and business models to realize the full potential that the VPH has to offer to industry, clinics and society generally.
Updating known distribution models for forecasting climate change impact on endangered species.
Muñoz, Antonio-Román; Márquez, Ana Luz; Real, Raimundo
2013-01-01
To plan endangered species conservation and to design adequate management programmes, it is necessary to predict their distributional response to climate change, especially under the current situation of rapid change. However, these predictions are customarily done by relating de novo the distribution of the species with climatic conditions, with no regard for previously available knowledge about the factors affecting the species distribution. We propose to take advantage of known species distribution models, updating them with the variables yielded by climatic models before projecting them to the future. To exemplify our proposal, the availability of suitable habitat across Spain for the endangered Bonelli's Eagle (Aquila fasciata) was modelled by updating a pre-existing model based on current climate and topography to a combination of different general circulation models and Special Report on Emissions Scenarios. Our results suggested that the main threat for this endangered species would not be climate change, since all forecasting models show that its distribution will be maintained and will increase in mainland Spain throughout the 21st century. We remark on the importance of linking conservation biology with distribution modelling by updating existing models, frequently available for endangered species, considering all the known factors conditioning the species' distribution, instead of building new models that are based on climate change variables only.
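A minimal sketch of the general workflow of updating an existing distribution model with climate-model variables and projecting it to a future scenario; the logistic-regression form, the variable names, and the synthetic data are illustrative assumptions, not the favourability model actually used for Bonelli's Eagle.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 2000  # grid cells, synthetic

# Step 1: variables of the pre-existing model (current climate + topography)
current = pd.DataFrame({
    "annual_temp": rng.normal(14, 3, n),        # degrees C
    "annual_precip": rng.normal(600, 150, n),   # mm
    "slope": rng.gamma(2, 4, n),                # degrees
})
logit = 0.3 * (current["annual_temp"] - 14) + 0.1 * (current["slope"] - 8)
presence = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# Step 2: update the model using the climate variables as supplied by a GCM
model = LogisticRegression(max_iter=1000).fit(current, presence)

# Step 3: project habitat suitability under a future scenario from the same GCM
future = current.assign(annual_temp=current["annual_temp"] + 2.5,
                        annual_precip=current["annual_precip"] * 0.9)
future_suitability = model.predict_proba(future)[:, 1]
print(f"Mean suitability now: {model.predict_proba(current)[:, 1].mean():.2f}, "
      f"future: {future_suitability.mean():.2f}")
```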
The 1986-87 atomic mass predictions
NASA Astrophysics Data System (ADS)
Haustein, P. E.
1987-12-01
A project to perform a comprehensive update of the atomic mass predictions has recently been concluded and will be published shortly in Atomic Data and Nuclear Data Tables. The project evolved from an ongoing comparison between available mass predictions and reports of newly measured masses of isotopes throughout the mass surface. These comparisons have highlighted a variety of features in current mass models which are responsible for predictions that diverge from masses determined experimentally. The need for a comprehensive update of the atomic mass predictions was therefore apparent and the project was organized and began at the last mass conference (AMCO-VII). Project participants included: Pape and Anthony; Dussel, Caurier and Zuker; Möller and Nix; Möller, Myers, Swiatecki and Treiner; Comay, Kelson, and Zidon; Satpathy and Nayak; Tachibana, Uno, Yamada and Yamada; Spanier and Johansson; Jänecke and Masson; and Wapstra, Audi and Hoekstra. An overview of the new atomic mass predictions may be obtained by written request.
Soller, Jeffrey A; Eftim, Sorina E; Nappier, Sharon P
2018-01-01
Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension to our published microbial risk assessment methodology to estimate infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein, we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. Sensitivity analyses illustrated changes in cumulative annual risks estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach, capturing the full range of uncertainty, increased risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log credit allocation comparisons will be useful to regulators considering DPR projects and design engineers as they consider which unit treatment processes should be employed for particular projects. Published by Elsevier Ltd.
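A minimal sketch of the kind of static QMRA calculation underlying such assessments: Monte Carlo over raw-wastewater virus densities, a treatment-train log-reduction credit, an exponential dose-response model, and accumulation of daily risks into an annual infection risk. All distributions and parameter values here (the 12-log reduction, the dose-response parameter, the 2 L/day consumption) are illustrative assumptions, not the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(2024)
n_sims, days_per_year = 10_000, 365

# Raw wastewater virus density (genome copies per liter), lognormal -- assumed
raw_density = rng.lognormal(mean=np.log(1e5), sigma=1.0, size=(n_sims, days_per_year))

log_reduction = 12.0          # treatment-train virus log reduction credit (assumed)
consumption_l = 2.0           # liters of finished water ingested per day (assumed)
r = 0.1                       # exponential dose-response parameter (assumed)

dose = raw_density * 10.0 ** (-log_reduction) * consumption_l
p_daily = 1.0 - np.exp(-r * dose)                    # daily probability of infection
p_annual = 1.0 - np.prod(1.0 - p_daily, axis=1)      # accumulate over the year

print(f"Median annual infection risk: {np.median(p_annual):.2e}")
print(f"95th percentile:              {np.percentile(p_annual, 95):.2e}")
```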
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gettelman, Andrew
2015-10-01
In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.
GRAPPA 2015 Research and Education Project Reports.
Mease, Philip J; Helliwell, Philip S; Boehncke, Wolf-Henning; Coates, Laura C; FitzGerald, Oliver; Gladman, Dafna D; Deodhar, Atul A; Callis Duffin, Kristina
2016-05-01
At the 2015 annual meeting of the Group for Research and Assessment of Psoriasis and Psoriatic Arthritis (GRAPPA), attendees were presented with brief updates on several ongoing initiatives, including educational projects. Updates were presented on the treatment recommendations project, the development of simple criteria to identify inflammatory musculoskeletal disease, new patient/physician Delphi exercises, and BIODAM (identifying biomarkers that predict progressive structural joint damage). The publication committee also gave a report. Herein we summarize those project updates.
NREL: International Activities - U.S.-China Renewable Energy Partnership
Working groups: Solar PV and TC88 Wind. Current projects: collaboration on innovative business models and financing solutions for solar PV deployment; micrositing and O&M development; and recommendations for photovoltaic (PV) and wind grid code updates.
Timber products output and timber harvests in Alaska: an addendum
Allen M. Brackley; Richard W. Haynes
2008-01-01
Updated projections of demand for Alaska timber were published July 2006. Their application in land management planning for the Tongass National Forest has resulted in numerous questions and requests for clarification. This note discusses a broad range of these questions from the context of why we do projections, the model we used, the assumptions that determine the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gragg, Evan James; Middleton, Richard Stephen
This report describes the benefits of the BECCUS screening tools. The goals of this project are to utilize the NATCARB database for site screening, enhance the NATCARB database, and run CO2-EOR simulations and economic models using updated reservoir data sets (SCO2T-EOR).
Seismic hazard in the Nation's breadbasket
Boyd, Oliver; Haller, Kathleen; Luco, Nicolas; Moschetti, Morgan P.; Mueller, Charles; Petersen, Mark D.; Rezaeian, Sanaz; Rubinstein, Justin L.
2015-01-01
The USGS National Seismic Hazard Maps were updated in 2014 and included several important changes for the central United States (CUS). Background seismicity sources were improved using a new moment-magnitude-based catalog; a new adaptive, nearest-neighbor smoothing kernel was implemented; and maximum magnitudes for background sources were updated. Areal source zones developed by the Central and Eastern United States Seismic Source Characterization for Nuclear Facilities project were simplified and adopted. The weighting scheme for ground motion models was updated, giving more weight to models with a faster attenuation with distance compared to the previous maps. Overall, hazard changes (2% probability of exceedance in 50 years, across a range of ground-motion frequencies) were smaller than 10% in most of the CUS relative to the 2008 USGS maps despite new ground motion models and their assigned logic tree weights that reduced the probabilistic ground motions by 5–20%.
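As an illustration of how logic-tree weights on ground motion models propagate to a hazard value, here is a hedged sketch with entirely synthetic hazard curves and placeholder weights (not the USGS branches or values):

```python
import numpy as np

# Hypothetical hazard curves: annual exceedance rate vs. PGA, one per GMM branch.
pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])               # g
curve_fast_atten = np.array([2e-2, 8e-3, 2e-3, 4e-4, 5e-5])
curve_slow_atten = np.array([3e-2, 1.3e-2, 4e-3, 9e-4, 1.2e-4])

# Logic-tree weights (placeholders); shifting weight toward faster-attenuating
# models lowers the combined curve, as described above.
weights = {"fast": 0.7, "slow": 0.3}
combined = weights["fast"] * curve_fast_atten + weights["slow"] * curve_slow_atten

# Ground motion with 2% probability of exceedance in 50 years: annual rate ~ 4.04e-4.
target_rate = -np.log(1 - 0.02) / 50.0
log_pga = np.interp(np.log(target_rate), np.log(combined[::-1]), np.log(pga[::-1]))
print("PGA at 2% in 50 yr:", round(float(np.exp(log_pga)), 3), "g")
```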
NASA Astrophysics Data System (ADS)
Noël, Brice; van de Berg, Willem Jan; Melchior van Wessem, J.; van Meijgaard, Erik; van As, Dirk; Lenaerts, Jan T. M.; Lhermitte, Stef; Kuipers Munneke, Peter; Smeets, C. J. P. Paul; van Ulft, Lambertus H.; van de Wal, Roderik S. W.; van den Broeke, Michiel R.
2018-03-01
We evaluate modelled Greenland ice sheet (GrIS) near-surface climate, surface energy balance (SEB) and surface mass balance (SMB) from the updated regional climate model RACMO2 (1958-2016). The new model version, referred to as RACMO2.3p2, incorporates updated glacier outlines, topography and ice albedo fields. Parameters in the cloud scheme governing the conversion of cloud condensate into precipitation have been tuned to correct inland snowfall underestimation: snow properties are modified to reduce drifting snow and melt production in the ice sheet percolation zone. The ice albedo prescribed in the updated model is lower at the ice sheet margins, increasing ice melt locally. RACMO2.3p2 shows good agreement compared to in situ meteorological data and point SEB/SMB measurements, and better resolves the spatial patterns and temporal variability of SMB compared with the previous model version, notably in the north-east, south-east and along the K-transect in south-western Greenland. This new model version provides updated, high-resolution gridded fields of the GrIS present-day climate and SMB, and will be used for projections of the GrIS climate and SMB in response to a future climate scenario in a forthcoming study.
Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2011
DOE Office of Scientific and Technical Information (OSTI.GOV)
David W. Nigg; Devin A. Steuhm
2011-09-01
Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance and, to some extent, experiment management are obsolete, inconsistent with the state of modern nuclear engineering practice, and are becoming increasingly difficult to properly verify and validate (V&V). Furthermore, the legacy staff knowledge required for application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In 2009 the Idaho National Laboratory (INL) initiated a focused effort to address this situation through the introduction of modern high-fidelity computational software and protocols, with appropriate V&V, within the next 3-4 years via the ATR Core Modeling and Simulation and V&V Update (or 'Core Modeling Update') Project. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF). The ATR Core Modeling Update Project, targeted for full implementation in phase with the anticipated ATR Core Internals Changeout (CIC) in the 2014 time frame, began during the last quarter of Fiscal Year 2009, and has just completed its first full year. Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (SCALE, KENO-6, HELIOS, NEWT, and ATTILA) have been installed at the INL under various permanent sitewide license agreements, and corresponding baseline models of the ATR and ATRC are now operational, demonstrating the basic feasibility of these code packages for their intended purpose. Furthermore, a capability for rigorous sensitivity analysis and uncertainty quantification based on the TSUNAMI system is being implemented and initial computational results have been obtained. This capability will have many applications in 2011 and beyond as a tool for understanding the margins of uncertainty in the new models as well as for validation experiment design and interpretation. Finally, we note that although full implementation of the new computational models and protocols will extend over a period of 3-4 years as noted above, interim applications in the much nearer term have already been demonstrated. In particular, these demonstrations included an analysis that was useful for understanding the cause of some issues in December 2009 that were triggered by a larger than acceptable discrepancy between the measured excess core reactivity and a calculated value that was based on the legacy computational methods. As the Modeling Update Project proceeds, we anticipate further such interim, informal applications in parallel with formal qualification of the system under the applicable INL Quality Assurance procedures and standards.
Demonstration PreFab Solar Heated Vacation Home
ERIC Educational Resources Information Center
Ariola, Frank; Walencik, Vincent J.
1978-01-01
To update a traditional construction shop program, students at Passaic Valley High School, New Jersey, developed a mock-up model of a solar-heated A-frame vacation house using prefab construction. The article describes the project and illustrates it with photographs of the model and a drawing of the solar collector. (MF)
Civilian Agency Industry Working Group EVM World Update
NASA Technical Reports Server (NTRS)
Kerby, Jerald
2013-01-01
Objectives include: Promote the use of standards-based, objective, and quantitative systems for managing projects and programs in the federal government. Understand how civilian agencies, in general, manage their projects and programs; a project management survey is expected to go out soon to civilian agencies. Describe how EVM and other best practices can be applied by the government to better manage its projects and programs, irrespective of whether work is contracted out or the types of contracts employed. Develop model policies aimed at project and program managers that are transportable across the government.
Global Modeling and Assimilation Office Annual Report and Research Highlights 2011-2012
NASA Technical Reports Server (NTRS)
Rienecker, Michele M.
2012-01-01
Over the last year, the Global Modeling and Assimilation Office (GMAO) has continued to advance our GEOS-5-based systems, updating products for both weather and climate applications. We contributed hindcasts and forecasts to the National Multi-Model Ensemble (NMME) of seasonal forecasts and the suite of decadal predictions to the Coupled Model Intercomparison Project (CMIP5).
Annual updating of plantation inventory estimates using hybrid models
Peter Snowdon
2000-01-01
Data for Pinus radiata D. Don grown in the Australian Capital Territory (ACT) are used to show that annual indices of growth potential can be successfully incorporated into Schumacher projection models of stand basal area growth. Significant reductions in the error mean squares of the models can be obtained by including an annual growth index derived...
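A standard Schumacher-type projection form, with a hypothetical additive growth-index term, gives the flavor of such a model; the paper's hybrid specification and coefficients may differ, and the values below are placeholders:

```python
import math

def schumacher_projection(ba1, age1, age2, b1, growth_indices=None, c=0.0):
    """Project stand basal area with a Schumacher-type equation.

    ln(BA2) = ln(BA1) + b1 * (1/age2 - 1/age1) [+ c * sum of annual growth indices]

    b1 is typically negative, so basal area increases with age; the optional
    growth-index term is one plausible way to inject year-to-year climate
    variation (the paper's hybrid formulation may differ).
    """
    ln_ba2 = math.log(ba1) + b1 * (1.0 / age2 - 1.0 / age1)
    if growth_indices:
        ln_ba2 += c * sum(growth_indices)
    return math.exp(ln_ba2)

# Hypothetical example: project basal area from age 15 to age 18 with three annual indices.
print(round(schumacher_projection(28.0, 15, 18, b1=-9.0,
                                  growth_indices=[0.1, -0.05, 0.2], c=0.5), 1))
```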
Development and Implementation of a Curriculum for Nurse Refresher Course. Final Report.
ERIC Educational Resources Information Center
Eggland, Carol
The project reported here was done to design and implement a model curriculum for a nurse refresher course to update the unemployed nurse's knowledge and skills in preparation for a return to employment. This report begins with an abstract of the project, a course evaluation based on the second of two field tests, and a brief course syllabus. The…
The USGS national geothermal resource assessment: An update
Williams, C.F.; Reed, M.J.; Galanis, S.P.; DeAngelo, J.
2007-01-01
The U. S. Geological Survey (USGS) is working with the Department of Energy's (DOE) Geothermal Technologies Program and other geothermal organizations on a three-year effort to produce an updated assessment of available geothermal resources. The new assessment will introduce significant changes in the models for geothermal energy recovery factors, estimates of reservoir volumes, and limits to temperatures and depths for electric power production. It will also include the potential impact of evolving Enhanced Geothermal Systems (EGS) technology. An important focus in the assessment project is on the development of geothermal resource models consistent with the production histories and observed characteristics of exploited geothermal fields. New models for the recovery of heat from heterogeneous, fractured reservoirs provide a physically realistic basis for evaluating the production potential of both natural geothermal reservoirs and reservoirs that may be created through the application of EGS technology. Project investigators have also made substantial progress studying geothermal systems and the factors responsible for their formation through studies in the Great Basin-Modoc Plateau region, Coso, Long Valley, the Imperial Valley, and central Alaska. Project personnel are also entering the supporting data and resulting analyses into geospatial databases that will be produced as part of the resource assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCright, R D
1998-06-30
This Engineered Materials Characterization Report (EMCR), Volume 3, discusses in considerable detail the work of the past 18 months on testing the candidate materials proposed for the waste-package (WP) container and on modeling the performance of those materials in the Yucca Mountain (YM) repository setting. This report was prepared as an update of information and serves as one of the supporting documents to the Viability Assessment (VA) of the Yucca Mountain Project. Previous versions of the EMCR have provided a history and background of container-materials selection and evaluation (Volume 1), a compilation of physical and mechanical properties for the WP design effort (Volume 2), and corrosion-test data and performance-modeling activities (Volume 3). Because the information in Volumes 1 and 2 is still largely current, those volumes are not being revised. As new information becomes available in the testing and modeling efforts, Volume 3 is periodically updated to include that information.
Lepton flavor violating radiative decays in EW-scale νR model: an update
Hung, P. Q.; Le, Trinh; Tran, Van Que; ...
2015-12-28
Here, we perform an updated analysis for the one-loop induced charged lepton flavor violating radiative decays l_i → l_j γ in an extended mirror model. Mixing effects of the neutrinos and charged leptons constructed with a horizontal A_4 symmetry are also taken into account. Current experimental limit and projected sensitivity on the branching ratio of μ → eγ are used to constrain the parameter space of the model. Calculations of two related observables, the electric and magnetic dipole moments of the leptons, are included. Implications concerning the possible detection of mirror leptons at the LHC and the ILC are also discussed.
Landfill Gas Energy Cost Model Version 3.0 (LFGcost-Web V3.0)
To help stakeholders estimate the costs of a landfill gas (LFG) energy project, in 2002, LMOP developed a cost tool (LFGcost). Since then, LMOP has routinely updated the tool to reflect changes in the LFG energy industry. Initially the model was designed for EPA to assist landfil...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-19
... benchmarks, projects future population conditions, and recommends research and monitoring needs. Participants....--4 p.m. Assessment panelists will discuss data inputs to the stock assessment model and make recommendations for additional years of data to be updated in the model. New information on black sea bass life...
Bozorgnia, Yousef; Abrahamson, Norman A.; Al Atik, Linda; Ancheta, Timothy D.; Atkinson, Gail M.; Baker, Jack W.; Baltay, Annemarie S.; Boore, David M.; Campbell, Kenneth W.; Chiou, Brian S.J.; Darragh, Robert B.; Day, Steve; Donahue, Jennifer; Graves, Robert W.; Gregor, Nick; Hanks, Thomas C.; Idriss, I. M.; Kamai, Ronnie; Kishida, Tadahiro; Kottke, Albert; Mahin, Stephen A.; Rezaeian, Sanaz; Rowshandel, Badie; Seyhan, Emel; Shahi, Shrey; Shantz, Tom; Silva, Walter; Spudich, Paul A.; Stewart, Jonathan P.; Watson-Lamprey, Jennie; Wooddell, Kathryn; Youngs, Robert
2014-01-01
The NGA-West2 project is a large multidisciplinary, multi-year research program on the Next Generation Attenuation (NGA) models for shallow crustal earthquakes in active tectonic regions. The research project has been coordinated by the Pacific Earthquake Engineering Research Center (PEER), with extensive technical interactions among many individuals and organizations. NGA-West2 addresses several key issues in ground-motion seismic hazard, including updating the NGA database for a magnitude range of 3.0–7.9; updating NGA ground-motion prediction equations (GMPEs) for the “average” horizontal component; scaling response spectra for damping values other than 5%; quantifying the effects of directivity and directionality for horizontal ground motion; resolving discrepancies between the NGA and the National Earthquake Hazards Reduction Program (NEHRP) site amplification factors; analysis of epistemic uncertainty for NGA GMPEs; and developing GMPEs for vertical ground motion. This paper presents an overview of the NGA-West2 research program and its subprojects.
The VEPSY UPDATED Project: clinical rationale and technical approach.
Riva, G; Alcãniz, M; Anolli, L; Bacchetta, M; Baños, R; Buselli, C; Beltrame, F; Botella, C; Castelnuovo, G; Cesa, G; Conti, S; Galimberti, C; Gamberini, L; Gaggioli, A; Klinger, E; Legeron, P; Mantovani, F; Mantovani, G; Molinari, E; Optale, G; Ricciardiello, L; Perpiñá, C; Roy, S; Spagnolli, A; Troiani, R; Weddle, C
2003-08-01
More than 10 years ago, Tart (1990) described virtual reality (VR) as a technological model of consciousness offering intriguing possibilities for developing diagnostic, inductive, psychotherapeutic, and training techniques that can extend and supplement current ones. To exploit and understand this potential is the overall goal of the "Telemedicine and Portable Virtual Environment in Clinical Psychology"--VEPSY UPDATED--a European Community-funded research project (IST-2000-25323, www.cybertherapy.info). Particularly, its specific goal is the development of different PC-based virtual reality modules to be used in clinical assessment and treatment of social phobia, panic disorders, male sexual disorders, obesity, and eating disorders. The paper describes the clinical and technical rationale behind the clinical applications developed by the project. Moreover, the paper focuses its analysis on the possible role of VR in clinical psychology and how it can be used for therapeutic change.
Obs4MIPS: Satellite Observations for Model Evaluation
NASA Astrophysics Data System (ADS)
Ferraro, R.; Waliser, D. E.; Gleckler, P. J.
2017-12-01
This poster will review the current status of the obs4MIPs project, whose purpose is to provide a limited collection of well-established and documented datasets for comparison with Earth system models (https://www.earthsystemcog.org/projects/obs4mips/). These datasets have been reformatted to correspond with the CMIP5 model output requirements, and include technical documentation specifically targeted for their use in model output evaluation. The project holdings now exceed 120 datasets with observations that directly correspond to CMIP5 model output variables, with new additions in response to the CMIP6 experiments. With the growth in climate model output data volume, it is increasingly difficult to bring the model output and the observations together to do evaluations. The positioning of the obs4MIPs datasets within the Earth System Grid Federation (ESGF) allows for the use of currently available and planned online tools within the ESGF to perform analysis using model output and observational datasets without necessarily downloading everything to a local workstation. This past year, obs4MIPs has updated its submission guidelines to closely align with changes in the CMIP6 experiments, and is implementing additional indicators and ancillary data to allow users to more easily determine the efficacy of an obs4MIPs dataset for specific evaluation purposes. This poster will present the new guidelines and indicators, and update the list of current obs4MIPs holdings and their connection to the ESGF evaluation and analysis tools currently available and being developed for the CMIP6 experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William
The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
Ashley E. Askew; J.M. Bowker; Donald B.K. English; Stanley J. Zarnoch; Gary T. Green
2017-01-01
The outdoor recreation component of the 2010 Resources Planning Act (RPA) Assessment provided projections and modeling of participation and intensity by activity. Results provided insight into the future of multiple outdoor recreation activities through projections of participation rates, numbers of participants, days per participant, and total activity days. These...
Systems Thinking and Simulation Modeling to Inform Childhood Obesity Policy and Practice.
Powell, Kenneth E; Kibbe, Debra L; Ferencik, Rachel; Soderquist, Chris; Phillips, Mary Ann; Vall, Emily Anne; Minyard, Karen J
In 2007, 31.7% of Georgia adolescents in grades 9-12 were overweight or obese. Understanding the impact of policies and interventions on obesity prevalence among young people can help determine statewide public health and policy strategies. This article describes a systems model, originally launched in 2008 and updated in 2014, that simulates the impact of policy interventions on the prevalence of childhood obesity in Georgia through 2034. In 2008, using information from peer-reviewed reports and quantitative estimates by experts in childhood obesity, physical activity, nutrition, and health economics and policy, a group of legislators, legislative staff members, and experts trained in systems thinking and system dynamics modeling constructed a model simulating the impact of policy interventions on the prevalence of childhood obesity in Georgia through 2034. Use of the 2008 model contributed to passage of a bill requiring annual fitness testing of schoolchildren and stricter enforcement of physical education requirements. We updated the model in 2014. With no policy change, the updated model projects that the prevalence of obesity among children and adolescents aged ≤18 in Georgia would hold at 18% from 2014 through 2034. Mandating daily school physical education (which would reduce prevalence to 12%) and integrating moderate to vigorous physical activity into elementary classrooms (which would reduce prevalence to 10%) would have the largest projected impact. Enacting all policies simultaneously would lower the prevalence of childhood obesity from 18% to 3%. Systems thinking, especially with simulation models, facilitates understanding of complex health policy problems. Using a simulation model to educate legislators, educators, and health experts about the policies that have the greatest short- and long-term impact should encourage strategic investment in low-cost, high-return policies.
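A minimal stock-and-flow sketch of this kind of prevalence simulation is shown below; the onset and remission rates are placeholders chosen only so the no-policy run holds near 18%, not the calibrated Georgia model values:

```python
# Minimal stock-and-flow sketch of a childhood-obesity prevalence simulation.
# All rate constants are hypothetical placeholders, not the calibrated Georgia model.

def simulate(prevalence0=0.18, years=20, onset=0.020, remission=0.091,
             policy_effect=0.0):
    """Euler integration of a single prevalence stock.

    onset         -- annual fraction of non-obese children becoming obese
    remission     -- annual fraction of obese children returning to healthy weight
    policy_effect -- fractional reduction in onset from an intervention bundle
    """
    p = prevalence0
    trajectory = [p]
    for _ in range(years):
        inflow = onset * (1.0 - policy_effect) * (1.0 - p)
        outflow = remission * p
        p += inflow - outflow
        trajectory.append(p)
    return trajectory

baseline = simulate()                       # holds near 18% by construction
with_pe = simulate(policy_effect=0.4)       # e.g. a hypothetical physical-education policy
print(f"end-of-horizon prevalence: baseline {baseline[-1]:.2%}, with policy {with_pe[-1]:.2%}")
```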
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalantari, F; Wang, J; Li, T
2015-06-15
Purpose: In conventional 4D-PET, images from different frames are reconstructed individually and aligned by registration methods. Two issues with these approaches are: 1) reconstruction algorithms do not make full use of all projection statistics; and 2) image registration between noisy images can result in poor alignment. In this study we investigated the use of the simultaneous motion estimation and image reconstruction (SMEIR) method, originally developed for cone beam CT, for motion estimation/correction in 4D-PET. Methods: A modified ordered-subset expectation maximization algorithm coupled with total variation minimization (OSEM-TV) is used to obtain a primary motion-compensated PET (pmc-PET) from all projection data, using Demons-derived deformation vector fields (DVFs) as the initial motion model. A motion model update is then performed to obtain an optimal set of DVFs between the pmc-PET and the other phases by matching the forward projection of the deformed pmc-PET with the measured projections of those phases. Using the updated DVFs, OSEM-TV image reconstruction is repeated and new DVFs are estimated based on the updated images. A 4D XCAT phantom with typical FDG biodistribution and a 10 mm diameter tumor was used to evaluate the performance of the SMEIR algorithm. Results: Image quality of 4D-PET is greatly improved by the SMEIR algorithm. When all projections are used to reconstruct a 3D-PET, motion blurring artifacts are present, leading to a more than 5 times overestimation of the tumor size and a 54% underestimation of the tumor-to-lung contrast ratio. This error was reduced to 37% and 20% for post-reconstruction registration methods and SMEIR, respectively. Conclusion: The SMEIR method can be used for motion estimation/correction in 4D-PET. The statistics are greatly improved since all projection data are combined together to update the image. The performance of the SMEIR algorithm for 4D-PET is sensitive to the smoothness control parameters in the DVF estimation step.
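The alternating structure of such an approach can be illustrated with a heavily simplified toy: a 1-D object, a circulant blur standing in for the projector, integer shifts standing in for DVFs, and plain MLEM in place of OSEM-TV. This is a sketch of the alternation between image update and motion-model update, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 48                                     # 1-D "image" size (stands in for a 3-D volume)
idx = np.arange(n)
d = np.abs(idx[:, None] - idx[None, :])
A = np.exp(-np.minimum(d, n - d) ** 2 / (2 * 1.5 ** 2))   # circulant blur as a toy projector

x_true = np.maximum(0.0, 12 - np.abs(idx - 18))            # body-like background profile
x_true[30:33] += 6.0                                       # small "tumor"
true_shifts = [0, 2, 4, 6]                                 # per-gate motion (integer shifts)
gates = [rng.poisson(A @ np.roll(x_true, s)) for s in true_shifts]

def mlem(x, data, shifts, n_iter=20):
    """MLEM update of the reference image using every gate and the current motion model."""
    for _ in range(n_iter):
        for y, s in zip(data, shifts):
            xs = np.roll(x, s)                             # deform reference into this gate
            xs *= (A.T @ (y / np.maximum(A @ xs, 1e-9))) / A.sum(axis=0)
            x = np.roll(xs, -s)                            # pull the update back to reference
    return x

def estimate_shifts(x, data, search=range(-10, 11)):
    """Re-estimate motion by matching forward projections of the deformed reference image."""
    best = [min(search, key=lambda s: np.sum((A @ np.roll(x, s) - y) ** 2)) for y in data]
    return [s - best[0] for s in best]                     # gate 0 defines the reference frame

x, shifts = np.ones(n), [0] * len(gates)
for _ in range(3):                                         # alternate image and motion updates
    x = mlem(x, gates, shifts)
    shifts = estimate_shifts(x, gates)
print("estimated per-gate shifts:", shifts, "| true:", true_shifts)
```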
Projecting U.S. climate forcing and criteria pollutant emissions through 2050
Presentation highlighting a method for translating emission scenarios to model-ready emission inventories. The presentation highlights new features for spatially allocating emissions to counties and grid cells and identifies areas of potential improvement, such as updating tempor...
Recent Updates to the System Advisor Model (SAM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
DiOrio, Nicholas A
The System Advisor Model (SAM) is a mature suite of techno-economic models for many renewable energy technologies that can be downloaded for free as a desktop application or software development kit. SAM is used for system-level modeling, including generating performance pro... the release of the code as an open source project on GitHub. Other additions that will be covered include the ability to download data directly into SAM from the National Solar Radiation Database (NSRDB) and updates to a user-interface macro that assists with PV system sizing. A brief update on SAM's battery model and its integration with the detailed photovoltaic model will also be discussed. Finally, an outline of planned work for the next year will be presented, including the addition of a bifacial model, support for multiple MPPT inputs for detailed inverter modeling, and the addition of a model for inverter thermal behavior.
Power Management and Distribution (PMAD) Model Development: Final Report
NASA Technical Reports Server (NTRS)
Metcalf, Kenneth J.
2011-01-01
Power management and distribution (PMAD) models were developed in the early 1990's to model candidate architectures for various Space Exploration Initiative (SEI) missions. They were used to generate "ballpark" component mass estimates to support conceptual PMAD system design studies. The initial set of models was provided to NASA Lewis Research Center (since renamed Glenn Research Center) in 1992. They were developed to estimate the characteristics of power conditioning components predicted to be available in the 2005 timeframe. Early 90's component and device designs and material technologies were projected forward to the 2005 timeframe, and algorithms reflecting those design and material improvements were incorporated into the models to generate mass, volume, and efficiency estimates for circa 2005 components. The models are about ten years old now and NASA GRC requested a review of them to determine if they should be updated to bring them into agreement with current performance projections or to incorporate unforeseen design or technology advances. This report documents the results of this review and the updated power conditioning models and new transmission line models generated to estimate post 2005 PMAD system masses and sizes. This effort continues the expansion and enhancement of a library of PMAD models developed to allow system designers to assess future power system architectures and distribution techniques quickly and consistently.
Resource Tracking Model Updates and Trade Studies
NASA Technical Reports Server (NTRS)
Chambliss, Joe; Stambaugh, Imelda; Moore, Michael
2016-01-01
The Resource Tracking Model has been updated to capture system manager and project manager inputs. Both the Trick/General Use Nodal Network Solver Resource Tracking Model (RTM) simulator and the RTM mass balance spreadsheet have been revised to address inputs from system managers and to refine the way mass balance is illustrated. The revisions to the RTM included the addition of a Plasma Pyrolysis Assembly (PPA) to recover hydrogen from Sabatier Reactor methane, which was vented in the prior version of the RTM. The effect of the PPA on the overall balance of resources in an exploration vehicle is illustrated in the increased recycle of vehicle oxygen. Case studies have been run to show the relative effect of performance changes on vehicle resources.
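A small mole-balance sketch shows why adding a PPA increases resource recycle; the Sabatier stoichiometry is standard, the methane-to-acetylene pathway is the one commonly cited for the PPA, and the conversion fraction is a placeholder rather than an RTM value:

```python
# Mole-balance sketch: effect of adding a Plasma Pyrolysis Assembly (PPA) downstream
# of a Sabatier reactor. Stoichiometry: Sabatier CO2 + 4 H2 -> CH4 + 2 H2O;
# PPA (commonly cited pathway) 2 CH4 -> C2H2 + 3 H2. The conversion fraction below
# is a placeholder, not a value taken from the RTM.

def hydrogen_recycle(co2_moles, ppa_conversion=0.0):
    h2_in = 4.0 * co2_moles                 # H2 consumed by the Sabatier reactor
    ch4 = co2_moles                         # CH4 produced (vented when no PPA is present)
    h2o = 2.0 * co2_moles                   # water returned for electrolysis
    ch4_processed = ppa_conversion * ch4
    h2_recovered = 1.5 * ch4_processed      # 3 H2 per 2 CH4 processed
    return {"H2 consumed": h2_in, "H2O produced": h2o,
            "CH4 vented": ch4 - ch4_processed, "H2 recovered": h2_recovered}

print("no PPA:  ", hydrogen_recycle(1.0))
print("with PPA:", hydrogen_recycle(1.0, ppa_conversion=0.9))
```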
Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan
2015-10-29
This report provides the broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model within Task 4, the analysis code was updated and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of the insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency; and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: status of analysis model development; improvements made to older simulations; and comparison to experimental data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Middleton, Richard Stephen
2017-05-22
This presentation is part of the US-China Clean Coal project and describes the impact of power plant cycling, techno-economic modeling of combined IGCC and CCS, integrated capacity generation decision making for power utilities, and a new decision support tool for integrated assessment of CCUS.
Mattson, Marifran; Basu, Ambar
2010-07-01
The Centers for Disease Control and Prevention's (CDC) Diethylstilbestrol (DES) Update, a campaign to educate people who may have been exposed to the drug DES, is framed on the premises of the social marketing model, namely formative research, audience segmentation, product, price, placement, promotion, and campaign evaluation. More than that, the campaign takes a critical step in extending the social marketing paradigm by highlighting the need to situate the messaging process at the heart of any health communication campaign. This article uses CDC's DES Update as a case study to illustrate an application of a message development tool within social marketing. This tool promotes the operationalization of messaging within health campaigns. Ultimately, the goal of this project is to extend the social marketing model and provide useful theoretical guidance to health campaign practitioners on how to accomplish stellar communication within a social marketing campaign.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacFarlane, Joseph J.; Golovkin, I. E.; Woodruff, P. R.
2009-08-07
This Final Report summarizes work performed under DOE STTR Phase II Grant No. DE-FG02-05ER86258 during the project period from August 2006 to August 2009. The project, “Development of Spectral and Atomic Models for Diagnosing Energetic Particle Characteristics in Fast Ignition Experiments,” was led by Prism Computational Sciences (Madison, WI), and involved collaboration with subcontractors University of Nevada-Reno and Voss Scientific (Albuquerque, NM). In this project, we have: Developed and implemented a multi-dimensional, multi-frequency radiation transport model in the LSP hybrid fluid-PIC (particle-in-cell) code [1,2]. Updated the LSP code to support the use of accurate equation-of-state (EOS) tables generated by Prism’s PROPACEOS [3] code to compute more accurate temperatures in high energy density physics (HEDP) plasmas. Updated LSP to support the use of Prism’s multi-frequency opacity tables. Generated equation of state and opacity data for LSP simulations for several materials being used in plasma jet experimental studies. Developed and implemented parallel processing techniques for the radiation physics algorithms in LSP. Benchmarked the new radiation transport and radiation physics algorithms in LSP and compared simulation results with analytic solutions and results from numerical radiation-hydrodynamics calculations. Performed simulations using Prism radiation physics codes to address issues related to radiative cooling and ionization dynamics in plasma jet experiments. Performed simulations to study the effects of radiation transport and radiation losses due to electrode contaminants in plasma jet experiments. Updated the LSP code to generate output using NetCDF to provide a better, more flexible interface to SPECT3D [4] in order to post-process LSP output. Updated the SPECT3D code to better support the post-processing of large-scale 2-D and 3-D datasets generated by simulation codes such as LSP. Updated atomic physics modeling to provide for more comprehensive and accurate atomic databases that feed into the radiation physics modeling (spectral simulations and opacity tables). Developed polarization spectroscopy modeling techniques suitable for diagnosing energetic particle characteristics in HEDP experiments. A description of these items is provided in this report. The above efforts lay the groundwork for utilizing the LSP and SPECT3D codes in providing simulation support for DOE-sponsored HEDP experiments, such as plasma jet and fast ignition physics experiments. We believe that taken together, the LSP and SPECT3D codes have unique capabilities for advancing our understanding of the physics of these HEDP plasmas. Based on conversations early in this project with our DOE program manager, Dr. Francis Thio, our efforts emphasized developing radiation physics and atomic modeling capabilities that can be utilized in the LSP PIC code, and performing radiation physics studies for plasma jets. A relatively minor component focused on the development of methods to diagnose energetic particle characteristics in short-pulse laser experiments related to fast ignition physics. The period of performance for the grant was extended by one year to August 2009 with a one-year no-cost extension, at the request of subcontractor University of Nevada-Reno.
The EPA Children’s Environmental Health Yearbook Supplement (2000)
New projects and updates to some ongoing projects already described in the 1998 Yearbook, including sections on asthma, childhood cancer, developmental/neurological toxicity, pesticides, contaminated water, and updated list of Children's Health Resources.
Kernodle, J.M.
1998-01-01
The ground-water-flow model of the Albuquerque Basin (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) was updated to include new information on the hydrogeologic framework (Hawley, J.W., Haase, C.S., and Lozinsky, R.P., 1995, An underground view of the Albuquerque Basin: Proceedings of the 39th Annual New Mexico Water Conference, November 3-4, 1994, p. 37-55). An additional year of ground-water-withdrawal data was appended to the simulation of the historical period and incorporated into the base for future projections to the year 2020. The revised model projects the simulated ground-water levels associated with an areally enlarged occurrence of the relatively high hydraulic conductivity in the upper part of the Santa Fe Group east and west of the Rio Grande in the Albuquerque area and north to Bernalillo. Although the differences between the two model versions are substantial, the revised model does not contradict any previous conclusions about the effect of City of Albuquerque ground-water withdrawals on flow in the Rio Grande or the net benefits of an effort to conserve ground water. Recent revisions to the hydrogeologic model (Hawley, J.W., Haneberg, W.C., and Whitworth, P.M., in press, Hydrogeologic investigations in the Albuquerque Basin, central New Mexico, 1992-1995: Socorro, New Mexico Bureau of Mines and Mineral Resources Open-File Report 402) of the Albuquerque Basin eventually will require that this model version also be revised and updated.
Apollo Lunar Sample Photograph Digitization Project Update
NASA Technical Reports Server (NTRS)
Todd, N. S.; Lofgren, G. E.
2012-01-01
This is an update on the progress of a 4-year data restoration project, funded by the LASER program, to digitize photographs of the Apollo lunar rock samples and create high resolution digital images, undertaken by the Astromaterials Acquisition and Curation Office at JSC [1]. The project is currently in its last year of funding. We also provide an update on the derived products that make use of the digitized photos, including the Lunar Sample Catalog and Photo Database [2] and the Apollo Sample data files for GoogleMoon [3].
Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VVandC)
NASA Technical Reports Server (NTRS)
Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.
2015-01-01
The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VVC updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of the IMM Project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the HRP NASA-STD-7009 Guidance Document working group and the NASA-HDBK-7009 [2]. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and simulation tool, and completion of service requests from a broad end user consortium including operations, science and technology planning, and exploration planning. IMM v4.0 is slated for operational release in FY15, and current VVC assessments illustrate the expected VVC status prior to the completion of customer-led external review efforts. CONCLUSIONS: The VVC approach established by the IMM Project of incorporating Project-specific recommended practices and guidelines for implementing the 7009 requirements is comprehensive and includes the involvement of end users at every stage in IMM evolution. Methods and techniques used to quantify the VVC status of the IMM Project represented a critical communication tool in providing clear and concise suitability assessments to IMM customers. These processes have not only received approval from the local NASA community but have also garnered recognition by other federal agencies seeking to develop similar guidelines in the medical modeling community.
NASA Technical Reports Server (NTRS)
Liou, J.-C.; Fitz-Coy, N.; Werremeyer, M.; Huynh, T.; Voelker, M.; Opiela, J.
2012-01-01
DebriSat is a planned laboratory-based satellite hypervelocity impact experiment. The goal of the project is to characterize the orbital debris that would be generated by a hypervelocity collision involving a modern satellite in low Earth orbit (LEO). The DebriSat project will update and expand upon the information obtained in the 1992 Satellite Orbital Debris Characterization Impact Test (SOCIT), which characterized the breakup of a 1960's US Navy Transit satellite. There are three phases to this project: the design and fabrication of an engineering model representing a modern, 50-cm/50-kg class LEO satellite known as DebriSat; conduct of a laboratory-based hypervelocity impact to catastrophically break up the satellite; and characterization of the properties of breakup fragments down to 2 mm in size. The data obtained, including fragment size, area-to-mass ratio, density, shape, material composition, optical properties, and radar cross-section distributions, will be used to supplement the DoD's and NASA's satellite breakup models to better describe the breakup outcome of a modern satellite. Updated breakup models will improve mission planning, environmental models, and event response. The DebriSat project is sponsored by the Air Force's Space and Missile Systems Center and the NASA Orbital Debris Program Office. The design and fabrication of DebriSat is led by the University of Florida with subject matter experts' support from The Aerospace Corporation. The major milestones of the project include the complete fabrication of DebriSat by September 2013, the hypervelocity impact of DebriSat at the Air Force's Arnold Engineering Development Complex in early 2014, and fragment characterization and data analyses in late 2014.
A knowledge management platform for infrastructure performance modeling
DOT National Transportation Integrated Search
2011-05-10
The ITS/JPO Evaluation Program is requesting ITS costs information in order to update the ITS Costs database with current data and account for new/emerging services and technologies. If you have ITS Costs on recent ITS projects, or if you have ITS co...
Ocean regional circulation model sensitivity to resolution of the lateral boundary conditions
NASA Astrophysics Data System (ADS)
Pham, Van Sy; Hwang, Jin Hwan
2017-04-01
Dynamical downscaling with nested regional oceanographic models is an effective approach for operational coastal weather forecasting and for long-term climate projection on the ocean. Nesting procedures introduce unwanted errors in dynamic downscaling because of differences in numerical grid sizes and updating steps. Such unavoidable errors restrict the application of Ocean Regional Circulation Models (ORCMs) in both short-term forecasts and long-term projections. The current work identifies the effects of errors induced by computational limitations during nesting procedures on the downscaled results of the ORCMs. The errors are quantitatively evaluated, for each error source and its characteristics, by the Big-Brother Experiment (BBE). The BBE separates the identified errors from each other and quantitatively assesses the associated uncertainties by employing the same model for both the nesting and the nested simulations. Here, we focus on errors resulting from two main aspects of nesting procedures: the differences in spatial grids and the temporal updating steps. After running the diverse BBE cases separately, a Taylor diagram was adopted to analyze the results and suggest an optimization in terms of grid size, updating period, and domain size. Key words: lateral boundary condition, error, ocean regional circulation model, Big-Brother Experiment. Acknowledgement: This research was supported by grants from the Korean Ministry of Oceans and Fisheries entitled "Development of integrated estuarine management system" and a National Research Foundation of Korea (NRF) Grant (No. 2015R1A5A 7037372) funded by MSIP of Korea. The authors thank the Integrated Research Institute of Construction and Environmental Engineering of Seoul National University for administrative support.
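The quantities summarized on a Taylor diagram in such an analysis can be computed as in the following sketch, shown here with synthetic fields rather than the ORCM output:

```python
import numpy as np

def taylor_stats(field, reference):
    """Statistics summarized on a Taylor diagram.

    Returns (correlation, ratio of standard deviations, centered RMS difference),
    which satisfy E'^2 = sigma_f^2 + sigma_r^2 - 2*sigma_f*sigma_r*R.
    """
    f = field - field.mean()
    r = reference - reference.mean()
    sigma_f, sigma_r = f.std(), r.std()
    corr = (f * r).mean() / (sigma_f * sigma_r)
    centered_rmsd = np.sqrt(((f - r) ** 2).mean())
    return corr, sigma_f / sigma_r, centered_rmsd

# Synthetic example: a "Big Brother" reference field and a degraded nested run.
rng = np.random.default_rng(0)
big_brother = rng.normal(size=(60, 60))
little_brother = 0.9 * big_brother + 0.4 * rng.normal(size=(60, 60))  # imperfect downscaling
print(taylor_stats(little_brother, big_brother))
```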
Updated Intensity - Duration - Frequency Curves Under Different Future Climate Scenarios
NASA Astrophysics Data System (ADS)
Ragno, E.; AghaKouchak, A.
2016-12-01
Current infrastructure design procedures rely on the use of Intensity - Duration - Frequency (IDF) curves retrieved under the assumption of temporal stationarity, meaning that occurrences of extreme events are expected to be time invariant. However, numerous studies have observed more severe extreme events over time. Hence, the stationarity assumption for extreme analysis may not be appropriate in a warming climate. This issue raises concerns regarding the safety and resilience of the existing and future infrastructures. Here we employ historical and projected (RCP 8.5) CMIP5 runs to investigate IDF curves of 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on climatic model projections. This presentation summarizes the projected changes in statistics of extremes. We show that, based on CMIP5 simulations, extreme precipitation events in some urban areas can be 20% more severe in the future, even when projected annual mean precipitation is expected to remain similar to the ground-based climatology.
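A minimal sketch of non-stationary extreme-value fitting, with the GEV location parameter made linear in time and fitted by maximum likelihood on synthetic annual maxima (the study itself uses a Bayesian inference approach and CMIP5-driven covariates), is shown below; all numeric values are placeholders:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
years = np.arange(60)
# Synthetic annual-maximum 1-hour intensities with a weak upward trend (mm/hr).
annmax = stats.genextreme.rvs(c=-0.1, loc=20 + 0.08 * years, scale=5,
                              size=years.size, random_state=rng)

def nll(params):
    """Negative log-likelihood of a GEV with location mu(t) = mu0 + mu1 * t."""
    mu0, mu1, log_scale, c = params
    return -np.sum(stats.genextreme.logpdf(annmax, c, loc=mu0 + mu1 * years,
                                           scale=np.exp(log_scale)))

fit = optimize.minimize(nll, x0=[20.0, 0.0, np.log(5.0), -0.1], method="Nelder-Mead")
mu0, mu1, log_scale, c = fit.x
# 100-year intensity evaluated at the start vs. the end of the record.
for t in (0, years[-1]):
    level = stats.genextreme.ppf(0.99, c, loc=mu0 + mu1 * t, scale=np.exp(log_scale))
    print(f"year {t:2d}: 100-yr 1-hr intensity ~ {level:.1f} mm/hr")
```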
Online Sequential Projection Vector Machine with Adaptive Data Mean Update
Chen, Lin; Jia, Ji-Ting; Zhang, Qiong; Deng, Wan-Yu; Wei, Wei
2016-01-01
We propose a simple online learning algorithm especially suited for high-dimensional data. The algorithm is referred to as the online sequential projection vector machine (OSPVM), which derives from the projection vector machine and can learn from data in one-by-one or chunk-by-chunk mode. In OSPVM, data centering, dimension reduction, and neural network training are integrated seamlessly. In particular, the model parameters, including (1) the projection vectors for dimension reduction, (2) the input weights, biases, and output weights, and (3) the number of hidden nodes, can be updated simultaneously. Moreover, only one parameter, the number of hidden nodes, needs to be determined manually, and this makes it easy to use in real applications. Performance comparison was made on various high-dimensional classification problems for OSPVM against other fast online algorithms, including the budgeted stochastic gradient descent (BSGD) approach, the adaptive multihyperplane machine (AMM), the primal estimated subgradient solver (Pegasos), the online sequential extreme learning machine (OSELM), and SVD + OSELM (feature selection based on SVD is performed before OSELM). The results obtained demonstrated the superior generalization performance and efficiency of the OSPVM. PMID:27143958
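Two of the ingredients described, chunk-by-chunk updating of output weights and an adaptive data-mean update, can be sketched with an OS-ELM-style recursive least-squares recursion; the projection-vector (dimension reduction) update that distinguishes OSPVM is not reproduced here, and the initialization and scaling choices below are placeholders:

```python
import numpy as np

class OnlineSequentialModel:
    """OS-ELM-style sketch: running data mean + recursive least-squares output weights."""

    def __init__(self, n_features, n_hidden, n_outputs, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W = 0.3 * rng.normal(size=(n_features, n_hidden))  # small scale keeps tanh active
        self.b = rng.normal(size=n_hidden)
        self.beta = np.zeros((n_hidden, n_outputs))              # output weights
        self.P = np.eye(n_hidden) * 1e3    # large initial P stands in for the usual init batch
        self.mean = np.zeros(n_features)                         # running data mean
        self.count = 0

    def _hidden(self, X):
        return np.tanh((X - self.mean) @ self.W + self.b)

    def partial_fit(self, X, T):
        # Adaptive data-mean update (makes the recursion slightly approximate), then RLS.
        m = len(X)
        self.mean = (self.count * self.mean + X.sum(axis=0)) / (self.count + m)
        self.count += m
        H = self._hidden(X)
        K = self.P @ H.T @ np.linalg.inv(np.eye(m) + H @ self.P @ H.T)
        self.P -= K @ H @ self.P
        self.beta += self.P @ H.T @ (T - H @ self.beta)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Toy usage: learn y = sum(x) from 50 chunks of 20 samples each.
rng = np.random.default_rng(1)
model = OnlineSequentialModel(n_features=5, n_hidden=60, n_outputs=1)
for _ in range(50):
    X = rng.normal(size=(20, 5))
    model.partial_fit(X, X.sum(axis=1, keepdims=True))
X_test = rng.normal(size=(5, 5))
print(np.c_[model.predict(X_test), X_test.sum(axis=1, keepdims=True)].round(2))
```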
ERIC Educational Resources Information Center
Herreid, Charlene H.; Miller, Thomas E.
2009-01-01
This article is the fourth in a series of articles describing an attrition prediction and intervention project at the University of South Florida (USF) in Tampa. In this article, the researchers describe the updated version of the prediction model. The original model was developed from a sample of about 900 First Time in College (FTIC) students…
Rubbertown NGEM Demonstration Project - Community Update Meeting 2, July 25, 2017
This is a slide deck for part of a community webinar held by LMAPCD to provide an update on the project. The slides will be delivered by team lead E. Thoma remotely by webinar. A draft version of the QAPP for the project will be provided to the community as part of this communi...
Cost and schedule estimation study report
NASA Technical Reports Server (NTRS)
Condon, Steve; Regardie, Myrna; Stark, Mike; Waligora, Sharon
1993-01-01
This report describes the analysis performed and the findings of a study of the software development cost and schedule estimation models used by the Flight Dynamics Division (FDD), Goddard Space Flight Center. The study analyzes typical FDD projects, focusing primarily on those developed since 1982. The study reconfirms the standard SEL effort estimation model that is based on size adjusted for reuse; however, guidelines for the productivity and growth parameters in the baseline effort model have been updated. The study also produced a schedule prediction model based on empirical data that varies depending on application type. Models for the distribution of effort and schedule by life-cycle phase are also presented. Finally, this report explains how to use these models to plan SEL projects.
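The general shape of such an effort model, size adjusted for reuse feeding a power-law effort relation, can be sketched as follows; the reuse weight and coefficients are illustrative placeholders to be calibrated from an organization's own completed projects, not the SEL's published values:

```python
def adjusted_size(new_sloc, reused_sloc, reuse_weight=0.2):
    """Developed size with reused code discounted (reuse_weight is a placeholder)."""
    return new_sloc + reuse_weight * reused_sloc

def effort_staff_months(size_sloc, productivity=3.2, exponent=1.05):
    """Power-law effort model: effort = a * (KSLOC)^b.

    'productivity' and 'exponent' are illustrative placeholders; in practice they
    are calibrated by regression against an organization's completed projects.
    """
    return productivity * (size_sloc / 1000.0) ** exponent

size = adjusted_size(new_sloc=42_000, reused_sloc=18_000)
print(f"adjusted size: {size:,.0f} SLOC, estimated effort: "
      f"{effort_staff_months(size):.1f} staff-months")
```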
HFE Process Guidance and Standards for potential application to updating NRC guidance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo; J. J. Persensky
2012-07-01
The U.S. Nuclear Regulatory Commission (NRC) reviews and evaluates the human factors engineering (HFE) programs of applicants for nuclear power plant construction permits, operating licenses, standard design certifications, and combined operating licenses. The purpose of these safety reviews is to help ensure that personnel performance and reliability are appropriately supported. Detailed design review procedures and guidance for the evaluations are provided in three key documents: the Standard Review Plan (NUREG-0800), the HFE Program Review Model (NUREG-0711), and the Human-System Interface Design Review Guidelines (NUREG-0700). These documents were last revised in 2007, 2004 and 2002, respectively. The NRC is committed to the periodic update and improvement of these guidance documents to ensure that they remain state-of-the-art design evaluation tools. Thus, the NRC has initiated a project with BNL to update the NRC guidance to remain current with recent research on human performance, advances in HFE methods and tools, and new technology. INL supported Brookhaven National Lab (BNL) to update the detailed HFE review criteria contained in NUREG-0711 and NUREG-0700 based on (1) feedback obtained from end users, (2) the results of NRC research and development efforts supporting the NRC staff’s HFE safety reviews, and (3) other material the project staff identify as applicable to the update effort. INL submitted comments on development plans and sections of NUREGs 0800, 0711, and 0700. The contractor prepared the report attached here as the deliverable for this work.
Key points in this presentation are: (1) How and why hydroclimatic province can help precipitation projection for water program engineering and management, (2) Implications of initial research results and planned further monitoring / research activities, (3) Five adaptation t...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-01
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 2778-062] Idaho Power; Notice of Availability of Land Management Plan Update for the Shoshone Falls Project and Soliciting Comments, Motions To Intervene, and Protests Take notice that the following hydroelectric application has been filed with the Commission and is available...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-01
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 2055-087] Idaho Power; Notice of Availability of Land Management Plan Update for the Shoshone Falls Project and Soliciting Comments, Motions To Intervene, and Protests Take notice that the following hydroelectric application has been filed with the Commission and is available...
Capital planning for operating theatres based on projecting future theatre requirements.
Sheehan, Jennifer A; Tyler, Peter; Jayasinha, Hirani; Meleady, Kathleen T; Jones, Neill
2011-05-01
During 2006, NSW and ACT Health Departments jointly engaged KPMG to develop an Operating Theatre Requirements' Projection Model and an accompanying planning guideline. A research scan was carried out to identify drivers of surgical demand, theatre capacity and theatre performance, as well as locating existing approaches to modelling operating theatre requirements for planning purposes. The project delivered a Microsoft Excel-based model for projecting future operating theatre requirements, together with an accompanying guideline for use of the model and interpretation of its outputs. It provides a valuable addition to the suite of tools available to Health staff for service and capital planning. The model operates with several limitations, largely due to being data dependent, and the state and completeness of available theatre activity data. However, the operational flexibility built into the model allows users to compensate for these limitations, on a case by case basis, when the user has access to suitable, local data. The design flexibility of the model means that updating the model as improved data become available is not difficult; resulting in revisions being able to be made quickly, and disseminated to users rapidly.
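A generic operating-theatre capacity calculation of the kind such a model performs, converting projected caseload into theatre-equivalents at an assumed utilization, is sketched below; the session structure and parameters are illustrative and are not taken from the KPMG model:

```python
def theatres_required(annual_cases, avg_case_minutes, hours_per_session=4.0,
                      sessions_per_week=10, weeks_per_year=48, target_utilisation=0.85):
    """Generic operating-theatre capacity calculation (illustrative, not the KPMG model)."""
    demand_hours = annual_cases * avg_case_minutes / 60.0
    supply_hours_per_theatre = hours_per_session * sessions_per_week * weeks_per_year
    return demand_hours / (supply_hours_per_theatre * target_utilisation)

# Hypothetical example: 12,000 projected cases averaging 90 minutes each.
print(round(theatres_required(annual_cases=12_000, avg_case_minutes=90), 1), "theatres")
```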
DOT National Transportation Integrated Search
2016-05-01
Florida International University researchers examined the existing performance measures and the project prioritization method in the CMP and updated them to better reflect the current conditions and strategic goals of FDOT. They also developed visual...
Email Updates about ADOPT | Transportation Research | NREL
Subscription page for email updates about ADOPT, the Automotive Deployment Options Projection Tool.
Kurtz, Steven M; Ong, Kevin L; Lau, Edmund; Bozic, Kevin J
2014-04-16
Few studies have explored the role of the National Health Expenditure and macroeconomics in the utilization of total joint replacement. The economic downturn has raised questions about the sustainability of growth for total joint replacement in the future. Previous projections of total joint replacement demand in the United States were based on data up to 2003 using a statistical methodology that neglected macroeconomic factors, such as the National Health Expenditure. Data from the Nationwide Inpatient Sample (1993 to 2010) were used with United States Census and National Health Expenditure data to quantify historical trends in total joint replacement rates, including the two economic downturns in the 2000s. Primary and revision hip and knee arthroplasty were identified using codes from the International Classification of Diseases, Ninth Revision, Clinical Modification. Projections of total joint replacement were estimated using a regression model incorporating the growth in population and the rate of arthroplasties from 1993 to 2010 as a function of age, sex, race, and census region, with the National Health Expenditure as the independent variable. The regression model was used in conjunction with government projections of National Health Expenditure from 2011 to 2021 to estimate future arthroplasty rates in subpopulations of the United States and to derive national estimates. The growth trend for the incidence of joint arthroplasty, for the overall United States population as well as for the United States workforce, was insensitive to economic downturns. From 2009 to 2010, the total number of procedures increased by 6.0% for primary total hip arthroplasty, 6.1% for primary total knee arthroplasty, 10.8% for revision total hip arthroplasty, and 13.5% for revision total knee arthroplasty. The National Health Expenditure model projections for primary hip replacement in 2020 were higher than those of a previously published projection model, whereas the current model estimates for total knee arthroplasty were lower. Economic downturns in the 2000s did not substantially influence the national growth trends for hip and knee arthroplasty in the United States. These latest updated projections provide a basis for surgeons, hospitals, payers, and policy makers to plan for the future demand for total joint replacement surgery.
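As a rough illustration of this kind of projection approach (not the study's actual model or data), the sketch below fits a procedure rate to a single National Health Expenditure predictor by ordinary least squares and extrapolates with projected NHE values; all numbers, names and units are hypothetical.

```python
import numpy as np

# Illustrative (not actual) history: national health expenditure (NHE, $T)
# and a primary arthroplasty rate per 100,000 population.
nhe_hist = np.array([0.9, 1.0, 1.1, 1.3, 1.4, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6])
rate_hist = np.array([130, 138, 147, 160, 170, 185, 200, 215, 228, 240, 252])

# Ordinary least squares fit of procedure rate as a function of NHE.
slope, intercept = np.polyfit(nhe_hist, rate_hist, 1)

# Government-style NHE projections (again illustrative) drive future rates.
nhe_proj = np.array([2.8, 3.1, 3.4, 3.8])
rate_proj = intercept + slope * nhe_proj

# Convert rates to national counts using projected population (millions).
pop_proj = np.array([315, 320, 325, 330])
counts = rate_proj / 1e5 * pop_proj * 1e6
for nhe, n in zip(nhe_proj, counts):
    print(f"NHE ${nhe:.1f}T -> ~{n:,.0f} projected procedures")
```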
NASA Technical Reports Server (NTRS)
Dunn, Michael R.
2014-01-01
Over the course of my internship in the Flight Projects Office of NASA's Launch Services Program (LSP), I worked on two major projects, both of which dealt with updating current systems to make them more accurate and to allow them to operate more efficiently. The first project dealt with the Mission Integration Reporting System (MIRS), a web-accessible database application used to manage and provide mission status reporting for the LSP portfolio of awarded missions. MIRS had not gone through any major updates since its implementation in 2005, and it was my job to formulate a recommendation for the improvement of the system. The second project I worked on dealt with the Mission Plan, a document that contains an overview of the general life cycle that is followed by every LSP mission. My job on this project was to update the information currently in the mission plan and to add certain features in order to increase the accuracy and thoroughness of the document. The outcomes of these projects have implications in the orderly and efficient operation of the Flight Projects Office, and the process of Mission Management in the Launch Services Program as a whole.
A Meteorological Model's Dependence on Radiation Update Frequency
NASA Technical Reports Server (NTRS)
Eastman, Joseph L.; Peters-Lidard, Christa; Tao, Wei-Kuo; Kumar, Sujay; Tian, Yudong; Lang, Stephen E.; Zeng, Xiping
2004-01-01
Numerical weather models are used to simulate circulations in the atmosphere, including clouds and precipitation, by applying a set of mathematical equations over a three-dimensional grid. The grid is composed of discrete points at which the meteorological variables are defined. As computing power continues to rise, these models are being used at finer grid spacing, but they must still cover a wide range of scales. Some of the physics that must be accounted for in the model cannot be explicitly resolved, and their effects, therefore, must be estimated or "parameterized". Some of these parameterizations are computationally expensive. To alleviate the problem, they are not always updated at the time resolution of the model, with the assumption that the impact will be small. In this study, a coupled land-atmosphere model is used to assess the impact of less frequent updates of the computationally expensive radiation physics for a case on June 6, 2002, that occurred during a field experiment over the central plains known as the International H2O Project (IHOP). The model was tested using both the original conditions, which were dry, and with modified conditions wherein moisture was added to the lower part of the atmosphere to produce clouds and precipitation (i.e., a wet case). For each of the conditions (i.e., dry and wet), four sets of experiments were conducted wherein the model was run for a period of 24 hours and the radiation fields (including both incoming solar and outgoing longwave) were updated every 1, 3, 10, and 100 time steps. Statistical tests indicated that average quantities of surface variables for both the dry and wet cases were the same for the various update frequencies. However, spatially the results could be quite different, especially in the wet case after it began to rain. The near-surface wind field was found to be different most of the time, even for the dry case. In the wet case, rain intensities and average vertical profiles of heating associated with cloudy areas were found to differ for the various radiation update frequencies. The latter implies that the mean state of the model could be different as a result of not updating the radiation fields every time step and has important implications for longer-term climate studies.
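The caching pattern behind these experiments can be illustrated with a toy integrator that recomputes an expensive "radiation" tendency only every N steps and reuses the stored value in between; the function, variables and numbers below are stand-ins, not the coupled model's code.

```python
def expensive_radiation(temperature):
    """Stand-in for a costly radiation parameterization (illustrative only)."""
    return -1e-4 * (temperature - 250.0)   # relax toward 250 K

def integrate(n_steps, dt, update_every):
    """Advance a scalar 'temperature' while refreshing the radiative tendency
    only every `update_every` steps, reusing the cached value otherwise."""
    temperature = 300.0
    cached_tendency = expensive_radiation(temperature)
    for step in range(n_steps):
        if step % update_every == 0:            # refresh radiation this step
            cached_tendency = expensive_radiation(temperature)
        temperature += dt * cached_tendency     # otherwise reuse cached value
    return temperature

# Compare the update frequencies used in the study: every 1, 3, 10, 100 steps.
for freq in (1, 3, 10, 100):
    print(freq, round(integrate(n_steps=1000, dt=60.0, update_every=freq), 3))
```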
This presentation provides a brief update on activities of ISO (International Organization for Standardization) TC (Technical Committee) 285, Clean Cooking Solutions. Slides include information on: (1) Working Group project status updates, (2) background on laboratory and field ...
NASA Astrophysics Data System (ADS)
Melchior van Wessem, Jan; van de Berg, Willem Jan; Noël, Brice P. Y.; van Meijgaard, Erik; Amory, Charles; Birnbaum, Gerit; Jakobs, Constantijn L.; Krüger, Konstantin; Lenaerts, Jan T. M.; Lhermitte, Stef; Ligtenberg, Stefan R. M.; Medley, Brooke; Reijmer, Carleen H.; van Tricht, Kristof; Trusel, Luke D.; van Ulft, Lambertus H.; Wouters, Bert; Wuite, Jan; van den Broeke, Michiel R.
2018-04-01
We evaluate modelled Antarctic ice sheet (AIS) near-surface climate, surface mass balance (SMB) and surface energy balance (SEB) from the updated polar version of the regional atmospheric climate model, RACMO2 (1979-2016). The updated model, referred to as RACMO2.3p2, incorporates upper-air relaxation, a revised topography, tuned parameters in the cloud scheme to generate more precipitation towards the AIS interior and modified snow properties reducing drifting snow sublimation and increasing surface snowmelt. Comparisons of RACMO2 model output with several independent observational data show that the existing biases in AIS temperature, radiative fluxes and SMB components are further reduced with respect to the previous model version. The model-integrated annual average SMB for the ice sheet including ice shelves (minus the Antarctic Peninsula, AP) now amounts to 2229 Gt y-1, with an interannual variability of 109 Gt y-1. The largest improvement is found in modelled surface snowmelt, which now compares well with satellite and weather station observations. For the high-resolution ( ˜ 5.5 km) AP simulation, results remain comparable to earlier studies. The updated model provides a new, high-resolution data set of the contemporary near-surface climate and SMB of the AIS; this model version will be used for future climate scenario projections in a forthcoming study.
Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment
NASA Astrophysics Data System (ADS)
Zeigler, Bernard P.; Lee, J. S.
1998-08-01
In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user- friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
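A minimal sketch of the quantization idea, assuming a simple scalar trajectory: an update message is emitted only when the state crosses a quantum level, so a densely sampled continuous signal is reduced to a short sequence of discrete updates. This illustrates the concept, not the DEVS/HLA implementation itself.

```python
import math

def quantized_updates(times_states, quantum):
    """Emit (time, quantized_state) only when the continuous state crosses
    a quantum level; all other samples are suppressed."""
    messages = []
    last_level = None
    for t, x in times_states:
        level = math.floor(x / quantum)
        if level != last_level:            # boundary crossing -> send update
            messages.append((t, level * quantum))
            last_level = level
    return messages

# A densely sampled sine trajectory stands in for a continuous sender model.
trajectory = [(i * 0.01, math.sin(i * 0.01)) for i in range(1000)]
msgs = quantized_updates(trajectory, quantum=0.1)
print(f"{len(trajectory)} samples reduced to {len(msgs)} quantized updates")
```

Because updates are only sent at level crossings, slowly varying stretches of the trajectory generate almost no traffic, which is the message-reduction effect discussed above.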
RAEGE Project Update: Yebes Observatory Broadband Receiver Ready for VGOS
NASA Astrophysics Data System (ADS)
IGN Yebes Observatory staff
2016-12-01
An update on the deployment and activities of the Spanish/Portuguese RAEGE project ("Atlantic Network of Geodynamical and Space Stations") is presented. While regular observations with the Yebes radio telescope are on-going, technological development of receivers for VGOS is progressing at the Yebes laboratories.
Industry Panel Curriculum Project. Final Report.
ERIC Educational Resources Information Center
Kenneke, Larry J.
This project was conducted to develop and disseminate updated curriculum guides for nine selected cluster areas of vocational education programs in Oregon; only five were developed. The updates were based on recommendations made by industry review panels (IRPs). Specialists from the Oregon Department of Education and Oregon State University…
Program Updates - San Antonio River Basin
This page will house updates for this urban waters partnership location. As projects progress, status updates can be posted here to reflect the ongoing work by partners in San Antonio working on the San Antonio River Basin.
NASA Astrophysics Data System (ADS)
Sofieva, V. F.; Liu, C.; Huang, F.; Kyrola, E.; Liu, Y.; Ialongo, I.; Hakkarainen, J.; Zhang, Y.
2016-08-01
The DRAGON-3 cooperation study on the upper troposphere and the lower stratosphere (UTLS) is based on new satellite data and modern atmospheric models. The objectives of the project are: (i) assessment of satellite data on chemical composition in UTLS, (ii) dynamical and chemical structures of the UTLS and its variability, (iii) multi-scale variability of stratospheric ozone, (iv) climatology of the stratospheric aerosol layer and its variability, and (v) updated ozone climatology and its relation to tropopause/multiple tropopauses. In this paper, we present the main results of the project.
The Gypsy Database (GyDB) of mobile genetic elements: release 2.0
Llorens, Carlos; Futami, Ricardo; Covelli, Laura; Domínguez-Escribá, Laura; Viu, Jose M.; Tamarit, Daniel; Aguilar-Rodríguez, Jose; Vicente-Ripolles, Miguel; Fuster, Gonzalo; Bernet, Guillermo P.; Maumus, Florian; Munoz-Pomer, Alfonso; Sempere, Jose M.; Latorre, Amparo; Moya, Andres
2011-01-01
This article introduces the second release of the Gypsy Database of Mobile Genetic Elements (GyDB 2.0): a research project devoted to the evolutionary dynamics of viruses and transposable elements based on their phylogenetic classification (per lineage and protein domain). The Gypsy Database (GyDB) is a long-term project that is continuously progressing and that, owing to the high molecular diversity of mobile elements, needs to be completed in several stages. GyDB 2.0 has been powered with a wiki to allow other researchers to participate in the project. The current database stage and scope are long terminal repeat (LTR) retroelements and relatives. GyDB 2.0 is an update based on the analysis of Ty3/Gypsy, Retroviridae, Ty1/Copia and Bel/Pao LTR retroelements and the Caulimoviridae pararetroviruses of plants. Among other features, in terms of the aforementioned topics, this update adds: (i) a variety of descriptions and reviews distributed in multiple web pages; (ii) protein-based phylogenies, where phylogenetic levels are assigned to distinct classified elements; (iii) a collection of multiple alignments, lineage-specific hidden Markov models and consensus sequences, called the GyDB collection; (iv) updated RefSeq databases and BLAST and HMM servers to facilitate sequence characterization of new LTR retroelement and caulimovirus queries; and (v) a bibliographic server. GyDB 2.0 is available at http://gydb.org. PMID:21036865
The Gypsy Database (GyDB) of mobile genetic elements: release 2.0.
Llorens, Carlos; Futami, Ricardo; Covelli, Laura; Domínguez-Escribá, Laura; Viu, Jose M; Tamarit, Daniel; Aguilar-Rodríguez, Jose; Vicente-Ripolles, Miguel; Fuster, Gonzalo; Bernet, Guillermo P; Maumus, Florian; Munoz-Pomer, Alfonso; Sempere, Jose M; Latorre, Amparo; Moya, Andres
2011-01-01
This article introduces the second release of the Gypsy Database of Mobile Genetic Elements (GyDB 2.0): a research project devoted to the evolutionary dynamics of viruses and transposable elements based on their phylogenetic classification (per lineage and protein domain). The Gypsy Database (GyDB) is a long-term project that is continuously progressing and that, owing to the high molecular diversity of mobile elements, needs to be completed in several stages. GyDB 2.0 has been powered with a wiki to allow other researchers to participate in the project. The current database stage and scope are long terminal repeat (LTR) retroelements and relatives. GyDB 2.0 is an update based on the analysis of Ty3/Gypsy, Retroviridae, Ty1/Copia and Bel/Pao LTR retroelements and the Caulimoviridae pararetroviruses of plants. Among other features, in terms of the aforementioned topics, this update adds: (i) a variety of descriptions and reviews distributed in multiple web pages; (ii) protein-based phylogenies, where phylogenetic levels are assigned to distinct classified elements; (iii) a collection of multiple alignments, lineage-specific hidden Markov models and consensus sequences, called the GyDB collection; (iv) updated RefSeq databases and BLAST and HMM servers to facilitate sequence characterization of new LTR retroelement and caulimovirus queries; and (v) a bibliographic server. GyDB 2.0 is available at http://gydb.org.
Kalantari, Faraz; Li, Tianfang; Jin, Mingwu; Wang, Jing
2016-01-01
In conventional 4D positron emission tomography (4D-PET), images from different frames are reconstructed individually and aligned by registration methods. Two issues that arise with this approach are as follows: 1) the reconstruction algorithms do not make full use of projection statistics; and 2) the registration between noisy images can result in poor alignment. In this study, we investigated the use of simultaneous motion estimation and image reconstruction (SMEIR) methods for motion estimation/correction in 4D-PET. A modified ordered-subset expectation maximization algorithm coupled with total variation minimization (OSEM-TV) was used to obtain a primary motion-compensated PET (pmc-PET) from all projection data, using Demons derived deformation vector fields (DVFs) as initial motion vectors. A motion model update was performed to obtain an optimal set of DVFs in the pmc-PET and other phases, by matching the forward projection of the deformed pmc-PET with measured projections from other phases. The OSEM-TV image reconstruction was repeated using updated DVFs, and new DVFs were estimated based on updated images. A 4D-XCAT phantom with typical FDG biodistribution was generated to evaluate the performance of the SMEIR algorithm in lung and liver tumors with different contrasts and different diameters (10 to 40 mm). The image quality of the 4D-PET was greatly improved by the SMEIR algorithm. When all projections were used to reconstruct 3D-PET without motion compensation, motion blurring artifacts were present, leading up to 150% tumor size overestimation and significant quantitative errors, including 50% underestimation of tumor contrast and 59% underestimation of tumor uptake. Errors were reduced to less than 10% in most images by using the SMEIR algorithm, showing its potential in motion estimation/correction in 4D-PET. PMID:27385378
NASA Astrophysics Data System (ADS)
Kalantari, Faraz; Li, Tianfang; Jin, Mingwu; Wang, Jing
2016-08-01
In conventional 4D positron emission tomography (4D-PET), images from different frames are reconstructed individually and aligned by registration methods. Two issues that arise with this approach are as follows: (1) the reconstruction algorithms do not make full use of projection statistics; and (2) the registration between noisy images can result in poor alignment. In this study, we investigated the use of simultaneous motion estimation and image reconstruction (SMEIR) methods for motion estimation/correction in 4D-PET. A modified ordered-subset expectation maximization algorithm coupled with total variation minimization (OSEM-TV) was used to obtain a primary motion-compensated PET (pmc-PET) from all projection data, using Demons derived deformation vector fields (DVFs) as initial motion vectors. A motion model update was performed to obtain an optimal set of DVFs in the pmc-PET and other phases, by matching the forward projection of the deformed pmc-PET with measured projections from other phases. The OSEM-TV image reconstruction was repeated using updated DVFs, and new DVFs were estimated based on updated images. A 4D-XCAT phantom with typical FDG biodistribution was generated to evaluate the performance of the SMEIR algorithm in lung and liver tumors with different contrasts and different diameters (10-40 mm). The image quality of the 4D-PET was greatly improved by the SMEIR algorithm. When all projections were used to reconstruct 3D-PET without motion compensation, motion blurring artifacts were present, leading up to 150% tumor size overestimation and significant quantitative errors, including 50% underestimation of tumor contrast and 59% underestimation of tumor uptake. Errors were reduced to less than 10% in most images by using the SMEIR algorithm, showing its potential in motion estimation/correction in 4D-PET.
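The alternating structure of a SMEIR-style reconstruction can be sketched with a toy one-dimensional problem: reconstruct a reference image, estimate per-phase motion by matching forward projections, rebuild a motion-compensated reference from all phases, and repeat. The random projection operator, plain MLEM reconstruction and integer shifts below are deliberate simplifications of the OSEM-TV/Demons machinery described above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (illustrative only): a 1-D "activity" profile, integer shifts
# as the deformation model, and a random matrix as the projection operator.
A = rng.random((40, 20))
truth = np.exp(-0.5 * ((np.arange(20) - 8.0) / 2.0) ** 2)
true_shifts = [0, 1, 2]                                  # motion of 3 phases
projections = [A @ np.roll(truth, s) for s in true_shifts]

def mlem(proj, iters=200):
    """Tiny MLEM reconstruction, standing in for OSEM-TV."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])
    for _ in range(iters):
        x *= (A.T @ (proj / (A @ x + 1e-9))) / (sens + 1e-9)
    return x

def update_shift(reference, proj):
    """Motion-model update: the shift whose forward projection of the
    reference best matches the measured projection of this phase."""
    errs = [np.linalg.norm(A @ np.roll(reference, s) - proj) for s in range(-4, 5)]
    return int(np.argmin(errs)) - 4

# SMEIR-like alternation: start from the reference-phase reconstruction, then
# repeatedly (1) update per-phase shifts and (2) rebuild a motion-compensated
# reference from all phases warped back to phase 0.
reference = mlem(projections[0])
for _ in range(3):
    shifts = [update_shift(reference, p) for p in projections]
    reference = np.mean(
        [np.roll(mlem(p), -s) for p, s in zip(projections, shifts)], axis=0)
print("estimated shifts:", shifts, "true:", true_shifts)
```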
1994-07-01
[Fragment of a scanned 1994 workshop document; recoverable content: a workshop agenda for 27 April 1994 including a NAWC-WD (China Lake) SERDP update and requirements coordination session and the Yuma Proving Ground Environmental Simulation Model Project, plus guidance noting that planting projects must clearly identify the vegetation to be planted.]
Curriculum Bank for Individualized Electronic Instruction. Final Report.
ERIC Educational Resources Information Center
Williamson, Bert; Pedersen, Joe F.
Objectives of this project were to update and convert to disk storage appropriate handout materials for courses for the electronic technology open classroom. Project activities were an ERIC search for computer-managed instructional materials; updating of the course outline, lesson outlines, information handouts, and unit tests; and storage of the…
77 FR 20006 - Reports and Updates on Programs and Research Projects
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-03
... ARCTIC RESEARCH COMMISSION Reports and Updates on Programs and Research Projects Notice is hereby given that the U.S. Arctic Research Commission will hold its 97th meeting in Montreal, Quebec, Canada... meeting. (3) Commissioners and staff reports. (4) Discussion and presentations concerning Arctic research...
A selective-update affine projection algorithm with selective input vectors
NASA Astrophysics Data System (ADS)
Kong, NamWoong; Shin, JaeWook; Park, PooGyeon
2011-10-01
This paper proposes an affine projection algorithm (APA) with selective input vectors, based on the concept of selective update, in order to reduce estimation errors and computations. The algorithm consists of two procedures: input-vector-selection and state-decision. The input-vector-selection procedure determines the number of input vectors by checking, via the mean square error (MSE), whether the input vectors contain enough information for an update. The state-decision procedure determines the current state of the adaptive filter by using the state-decision criterion. While the adaptive filter is in the transient state, the algorithm updates the filter coefficients with the selected input vectors. On the other hand, as soon as the adaptive filter reaches the steady state, the update procedure is no longer performed. Through these two procedures, the proposed algorithm achieves small steady-state estimation errors, low computational complexity and low update complexity for colored input signals.
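A compact sketch of the selective-update idea, assuming a standard affine projection update that is simply skipped once a smoothed error measure indicates the steady state; the selection and state-decision rules here are simplified placeholders rather than the paper's exact criteria.

```python
import numpy as np

rng = np.random.default_rng(1)

def selective_apa(x, d, order=8, K=4, mu=0.5, delta=1e-3, err_thresh=1e-3):
    """Affine projection adaptive filter that freezes its coefficients
    once a smoothed squared a-priori error falls below err_thresh."""
    w = np.zeros(order)
    mse_smooth, updates = 1.0, 0
    for n in range(order + K - 1, len(x)):
        # K most recent input vectors as columns of X, and matching desired samples.
        X = np.column_stack([x[n - k - order + 1:n - k + 1][::-1] for k in range(K)])
        dk = d[n - K + 1:n + 1][::-1]
        e = dk - X.T @ w                           # a-priori error vector
        mse_smooth = 0.95 * mse_smooth + 0.05 * e[0] ** 2
        if mse_smooth > err_thresh:                # transient state: update
            w += mu * X @ np.linalg.solve(X.T @ X + delta * np.eye(K), e)
            updates += 1
        # else: steady state reached, the coefficient update is skipped
    return w, updates

# System-identification demo: an unknown FIR filter plus measurement noise.
h = rng.standard_normal(8)
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 1e-3 * rng.standard_normal(len(x))
w, updates = selective_apa(x, d)
print("coefficient error:", np.linalg.norm(w - h), "| updates performed:", updates)
```

The delta term regularizes the K-by-K system solved at each update, a common choice in affine projection implementations.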
A Review of Recent Updates of Sea-Level Projections at Global and Regional Scales
NASA Technical Reports Server (NTRS)
Slangen, A. B. A.; Adloff, F.; Jevrejeva, S.; Leclercq, P. W.; Marzeion, B.; Wada, Yoshihide; Winkelmann, R.
2016-01-01
Sea-level change (SLC) is a much-studied topic in the area of climate research, integrating a range of climate science disciplines, and is expected to impact coastal communities around the world. As a result, this field is rapidly moving, and the knowledge and understanding of processes contributing to SLC is increasing. Here, we discuss noteworthy recent developments in the projection of SLC contributions and in the global mean and regional sea-level projections. For the Greenland Ice Sheet contribution to SLC, earlier estimates have been confirmed in recent research, but part of the source of this contribution has shifted from dynamics to surface melting. New insights into dynamic discharge processes and the onset of marine ice sheet instability increase the projected range for the Antarctic contribution by the end of the century. The contribution from both ice sheets is projected to increase further in the coming centuries to millennia. Recent updates of the global glacier outline database and new global glacier models have led to slightly lower projections for the glacier contribution to SLC (7-17 cm by 2100), but still project the glaciers to be an important contribution. For global mean sea-level projections, the focus has shifted to better estimating the uncertainty distributions of the projection time series, which may not necessarily follow a normal distribution. Instead, recent studies use skewed distributions with longer tails to higher uncertainties. Regional projections have been used to study regional uncertainty distributions, and regional projections are increasingly being applied to specific regions, countries, and coastal areas.
Hu, Yanhui; Comjean, Aram; Roesel, Charles; Vinayagam, Arunachalam; Flockhart, Ian; Zirin, Jonathan; Perkins, Lizabeth; Perrimon, Norbert; Mohr, Stephanie E.
2017-01-01
The FlyRNAi database of the Drosophila RNAi Screening Center (DRSC) and Transgenic RNAi Project (TRiP) at Harvard Medical School and associated DRSC/TRiP Functional Genomics Resources website (http://fgr.hms.harvard.edu) serve as a reagent production tracking system, screen data repository, and portal to the community. Through this portal, we make available protocols, online tools, and other resources useful to researchers at all stages of high-throughput functional genomics screening, from assay design and reagent identification to data analysis and interpretation. In this update, we describe recent changes and additions to our website, database and suite of online tools. Recent changes reflect a shift in our focus from a single technology (RNAi) and model species (Drosophila) to the application of additional technologies (e.g. CRISPR) and support of integrated, cross-species approaches to uncovering gene function using functional genomics and other approaches. PMID:27924039
NASA Astrophysics Data System (ADS)
Desconnets, Jean-Christophe; Giuliani, Gregory; Guigoz, Yaniss; Lacroix, Pierre; Mlisa, Andiswa; Noort, Mark; Ray, Nicolas; Searby, Nancy D.
2017-02-01
The discovery of and access to capacity building resources are often essential to conduct environmental projects based on Earth Observation (EO) resources, whether they are Earth Observation products, methodological tools, techniques, organizations that impart training in these techniques, or even projects that have shown practical achievements. Recognizing this opportunity and need, the European Commission, through two FP7 projects and jointly with the Group on Earth Observations (GEO), teamed up with the Committee on Earth Observation Satellites (CEOS). The Global Earth Observation CApacity Building (GEOCAB) portal aims at compiling all current capacity building efforts on the use of EO data for societal benefits into an easily updateable and user-friendly portal. GEOCAB offers a faceted search to improve the user discovery experience, together with a fully interactive world map of all inventoried projects and activities. This paper focuses on the conceptual framework used to implement the underlying platform. An ISO19115 metadata model and an associated terminological repository are the core elements that provide a semantic search application and an interoperable discovery service. The organization and contribution of different user communities to ensure the management and updating of GEOCAB content are also addressed.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-22
... familiar with the updated closing documents, the existing closing documents, which these updated closing... assures the industry and the public that sufficient staff will be available and thoroughly familiar with... documents that are updated for current commercial legal standards, balanced with the public policy role that...
NASA Astrophysics Data System (ADS)
O'Dea, Enda; Furner, Rachel; Wakelin, Sarah; Siddorn, John; While, James; Sykes, Peter; King, Robert; Holt, Jason; Hewitt, Helene
2017-08-01
We describe the physical model component of the standard Coastal Ocean version 5 configuration (CO5) of the European north-west shelf (NWS). CO5 was developed jointly between the Met Office and the National Oceanography Centre. CO5 is designed with the seamless approach in mind, which allows for modelling of multiple timescales for a variety of applications from short-range ocean forecasting to climate projections. The configuration constitutes the basis of the latest update to the ocean and data assimilation components of the Met Office's operational Forecast Ocean Assimilation Model (FOAM) for the NWS. A 30.5-year non-assimilating control hindcast of CO5 was integrated from January 1981 to June 2012. Sensitivity simulations were conducted with reference to the control run. The control run is compared against a previous non-assimilating Proudman Oceanographic Laboratory Coastal Ocean Modelling System (POLCOMS) hindcast of the NWS. The CO5 control hindcast is shown to have much reduced biases compared to POLCOMS. Emphasis in the system description is weighted to updates in CO5 over previous versions. Updates include an increase in vertical resolution, a new vertical coordinate stretching function, the replacement of climatological riverine sources with the pan-European hydrological model E-HYPE, a new Baltic boundary condition and switching from directly imposed atmospheric model boundary fluxes to calculating the fluxes within the model using a bulk formula. Sensitivity tests of the updates are detailed with a view toward attributing observed changes in the new system from the previous system and suggesting future directions of research to further improve the system.
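The abstract notes that air-sea fluxes are now computed inside the model using a bulk formula. The sketch below is a generic bulk-aerodynamic calculation with illustrative constant exchange coefficients; the specific algorithm and coefficient values used in CO5 are not given in the abstract, so everything here is a textbook-style assumption.

```python
# Generic bulk-aerodynamic surface fluxes (illustrative constants only; the
# actual CO5 bulk algorithm and exchange coefficients are not specified above).
RHO_AIR = 1.22      # air density [kg m-3]
CP_AIR = 1004.0     # specific heat of air [J kg-1 K-1]
LV = 2.5e6          # latent heat of vaporization [J kg-1]
CD, CH, CE = 1.2e-3, 1.1e-3, 1.2e-3   # exchange coefficients [-]

def bulk_fluxes(wind_speed, t_sea, t_air, q_sea, q_air):
    """Wind stress [N m-2] and sensible/latent heat fluxes [W m-2]
    from near-surface meteorology and sea-surface conditions."""
    tau = RHO_AIR * CD * wind_speed ** 2
    q_sensible = RHO_AIR * CP_AIR * CH * wind_speed * (t_sea - t_air)
    q_latent = RHO_AIR * LV * CE * wind_speed * (q_sea - q_air)
    return tau, q_sensible, q_latent

print(bulk_fluxes(wind_speed=8.0, t_sea=285.0, t_air=283.0,
                  q_sea=0.008, q_air=0.006))
```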
76 FR 61074 - Reports and Updates on Arctic Research Programs and Projects; Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-03
... UNITED STATES ARCTIC RESEARCH COMMISSION Reports and Updates on Arctic Research Programs and Projects; Meetings Notice is hereby given that the US Arctic Research Commission will hold its 96th meeting... about topics of interest related to research activities in the Arctic. 96th Meeting Schedule: Wed., Oct...
The Southern Ocean in the Coupled Model Intercomparison Project phase 5
Meijers, A. J. S.
2014-01-01
The Southern Ocean is an important part of the global climate system, but its complex coupled nature makes both its present state and its response to projected future climate forcing difficult to model. Clear trends in wind, sea-ice extent and ocean properties emerged from multi-model intercomparison in the Coupled Model Intercomparison Project phase 3 (CMIP3). Here, we review recent analyses of the historical and projected wind, sea ice, circulation and bulk properties of the Southern Ocean in the updated Coupled Model Intercomparison Project phase 5 (CMIP5) ensemble. Improvements to the models include higher resolutions, more complex and better-tuned parametrizations of ocean mixing, and improved biogeochemical cycles and atmospheric chemistry. CMIP5 largely reproduces the findings of CMIP3, but with smaller inter-model spreads and biases. By the end of the twenty-first century, mid-latitude wind stresses increase and shift polewards. All water masses warm, and intermediate waters freshen, while bottom waters increase in salinity. Surface mixed layers shallow, warm and freshen, whereas sea ice decreases. The upper overturning circulation intensifies, whereas bottom water formation is reduced. Significant disagreement exists between models for the response of the Antarctic Circumpolar Current strength, for reasons that are as yet unclear. PMID:24891395
State Renewable Energy Requirements and Goals: Update through 2009 (Update) (released in AEO2010)
2010-01-01
To the extent possible, Annual Energy Outlook 2010 (AEO) incorporates the impacts of state laws requiring the addition of renewable generation or capacity by utilities doing business in the states. Currently, 30 states and the District of Columbia have enforceable renewable portfolio standards (RPS) or similar laws. Under such standards, each state determines its own levels of generation, eligible technologies, and noncompliance penalties. AEO2010 includes the impacts of all laws in effect as of September 2009 (with the exception of Hawaii, because the National Energy Modeling System provides electricity market projections for the continental United States only).
Alligator, Alligator mississippiensis, habitat suitability index model
Waddle, J. Hardin
2017-01-01
The 2012 Coastal Master Plan utilized Habitat Suitability Indices (HSIs) to evaluate potential project effects on wildlife species. Even though HSIs quantify habitat condition, which may not directly correlate to species abundance, they remain a practical and tractable way to assess changes in habitat quality from various restoration actions. As part of the legislatively mandated five year update to the 2012 plan, the wildlife habitat suitability indices were updated and revised using literature and existing field data where available. The outcome of these efforts resulted in improved, or in some cases entirely new suitability indices. This report describes the development of the habitat suitability indices for the American alligator, Alligator mississippiensis.
Intraseasonal Variability in the Atmosphere-Ocean Climate System. Second Edition
NASA Technical Reports Server (NTRS)
Lau, William K. M.; Waliser, Duane E.
2011-01-01
Understanding and predicting the intraseasonal variability (ISV) of the ocean and atmosphere is crucial to improving long-range environmental forecasts and the reliability of climate change projections through climate models. This updated, comprehensive and authoritative second edition has a balance of observation, theory and modeling and provides a single source of reference for all those interested in this important multi-faceted natural phenomenon and its relation to major short-term climatic variations.
The Cosmetics Europe strategy for animal-free genotoxicity testing: project status up-date.
Pfuhler, S; Fautz, R; Ouedraogo, G; Latil, A; Kenny, J; Moore, C; Diembeck, W; Hewitt, N J; Reisinger, K; Barroso, J
2014-02-01
The Cosmetics Europe (formerly COLIPA) Genotoxicity Task Force has driven and funded three projects to help address the high rate of misleading positives in in vitro genotoxicity tests: The completed "False Positives" project optimized current mammalian cell assays and showed that the predictive capacity of the in vitro micronucleus assay was improved dramatically by selecting more relevant cells and more sensitive toxicity measures. The on-going "3D skin model" project has been developed and is now validating the use of human reconstructed skin (RS) models in combination with the micronucleus (MN) and Comet assays. These models better reflect the in use conditions of dermally applied products, such as cosmetics. Both assays have demonstrated good inter- and intra-laboratory reproducibility and are entering validation stages. The completed "Metabolism" project investigated enzyme capacities of human skin and RS models. The RS models were shown to have comparable metabolic capacity to native human skin, confirming their usefulness for testing of compounds with dermal exposure. The program has already helped to improve the initial test battery predictivity and the RS projects have provided sound support for their use as a follow-up test in the assessment of the genotoxic hazard of cosmetic ingredients in the absence of in vivo data.
Optimizing dynamic downscaling in one-way nesting using a regional ocean model
NASA Astrophysics Data System (ADS)
Pham, Van Sy; Hwang, Jin Hwan; Ku, Hyeyun
2016-10-01
Dynamical downscaling with nested regional oceanographic models has been demonstrated to be an effective approach both for operational forecasting of marine weather on regional scales and for projecting future climate change and its impact on the ocean. However, when nesting procedures are carried out in dynamic downscaling from a larger-scale model or set of observations to a smaller scale, errors are unavoidable due to the differences in grid sizes and updating intervals. The present work assesses the impact of errors produced by nesting procedures on the downscaled results from Ocean Regional Circulation Models (ORCMs). Errors are identified and evaluated based on their sources and characteristics by employing the Big-Brother Experiment (BBE). The BBE uses the same model to produce both the nesting and nested simulations, so it addresses those error sources separately (i.e., without combining the contributions of errors from different sources). Here, we focus on errors resulting from differences in spatial grids, updating intervals and domain sizes. After the BBE was separately run for diverse cases, a Taylor diagram was used to analyze the results and recommend an optimal combination of grid size, updating period and domain size. Finally, the suggested setups for the downscaling were evaluated by examining the spatial correlations of variables and the relative magnitudes of variances between the nested model and the original data.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-19
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 2056-049] Xcel Energy; Notice of Application of Recreational Resources Management Plan Update for the St. Anthony Falls Project and Soliciting Comments, Motions To Intervene, and Protests Take notice that the following hydroelectric application has been filed with the...
ERIC Educational Resources Information Center
Brown, Nicola
2017-01-01
While teaching methods tend to be updated frequently, the implementation of new, innovative assessment tools is much slower. For example, project-based learning has become popular as a teaching technique; however, the assessment tends to be via traditional reports. This paper reports on the implementation and evaluation of using website development…
Overview of the National Health Educator Competencies Update Project, 1998-2004
ERIC Educational Resources Information Center
Gilmore, Gary David; Olsen, Larry K.; Taub, Alyson; Connell, David
2005-01-01
The National Health Educator Competencies Update Project (CUP), conducted during 1998-2004, addressed what health educators currently do in practice, the degree to which the role definition of the entry-level health educator is still up-to-date, and the validation of advanced-level competencies. A 19-page questionnaire was sent to a representative…
Online coupled camera pose estimation and dense reconstruction from video
Medioni, Gerard; Kang, Zhuoliang
2016-11-01
A product may receive each image in a stream of video images of a scene and, before processing the next image, generate information indicative of the position and orientation of an image capture device that captured the image at the time of capturing the image. The product may do so by identifying distinguishable image feature points in the image; determining a coordinate for each identified image feature point; and for each identified image feature point, attempting to identify one or more distinguishable model feature points in a three dimensional (3D) model of at least a portion of the scene that appears likely to correspond to the identified image feature point. Thereafter, the product may find each of the following that, in combination, produce a consistent projection transformation of the 3D model onto the image: a subset of the identified image feature points for which one or more corresponding model feature points were identified; and, for each image feature point that has multiple likely corresponding model feature points, one of the corresponding model feature points. The product may update a 3D model of at least a portion of the scene following the receipt of each video image and before processing the next video image, based on the generated information indicative of the position and orientation of the image capture device at the time of capturing the received image. The product may display the updated 3D model after each update to the model.
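As a rough analogue of the core step described, finding a camera pose consistent with 2D-3D feature correspondences while tolerating bad matches, the sketch below uses OpenCV's RANSAC PnP solver on synthetic data. This is a standard technique chosen for illustration, not necessarily the patented product's method, and the intrinsics, pose and points are made up.

```python
import numpy as np
import cv2

rng = np.random.default_rng(2)

# Synthetic scene: 3D model feature points and camera intrinsics (illustrative).
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
dist = np.zeros(5)
pts3d = rng.uniform([-1, -1, 4], [1, 1, 8], size=(60, 3))

# Ground-truth pose used only to fabricate the "observed" image feature points.
rvec_true = np.array([0.1, -0.2, 0.05])
tvec_true = np.array([0.2, -0.1, 0.5])
pts2d, _ = cv2.projectPoints(pts3d, rvec_true, tvec_true, K, dist)
pts2d = pts2d.reshape(-1, 2) + rng.normal(0, 0.5, (60, 2))   # pixel noise
pts2d[:5] += rng.uniform(40, 80, (5, 2))                      # 5 bad matches

# Estimate the pose that makes the 3D-to-2D projection consistent, while
# RANSAC discards the inconsistent (outlier) correspondences.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(
    pts3d, pts2d, K, dist, reprojectionError=3.0)
print("pose found:", ok, "| inliers:", len(inliers), "of", len(pts3d))
print("tvec estimate:", tvec.ravel(), "true:", tvec_true)
```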
The updated billion-ton resource assessment
Anthony Turhollow; Robert Perlack; Laurence Eaton; Matthew Langholtz; Craig Brandt; Mark Downing; Lynn Wright; Kenneth Skog; Chad Hellwinckel; Bryce Stokes; Patricia Lebow
2014-01-01
This paper summarizes the results of an update to a resource assessment, published in 2005, commonly referred to as the Billion-Ton Study (BTS). The updated results are consistent with the 2005 BTS in terms of overall magnitude. The 2005 BTS projected between 860 and 1240 Tg of biomass available in the 2050 timeframe, while the Billion-Ton Update (BT2), for a price of...
Updated polychlorinated biphenyl mass budget for Lake Michigan
This study revisits and updates the Lake Michigan Mass Balance Project (LMMBP) for polychlorinated biphenyls (PCBs) that was conducted in 1994-1995. This work uses recent concentrations of PCBs in tributary and open lake water, air, and sediment to calculate an updated mass budg...
NASA Astrophysics Data System (ADS)
DeLorme, D.; Lea, K.; Hagen, S. C.
2016-12-01
As coastal Louisiana evolves morphologically, ecologically, and from engineering advancements, there is a crucial need to continually adjust real-time forecasting and coastal restoration planning models. This presentation discusses planning, conducting, and evaluating stakeholder workshops to support such an endeavor. The workshops are part of an ongoing Louisiana Sea Grant-sponsored project. The project involves updating an ADCIRC (Advanced Circulation) mesh representation of topography, including levees and other flood control structures, by applying previously collected elevation data and new data acquired during the project. The workshops are designed to educate, solicit input, and ensure that incorporation of topographic features into the framework is accomplished in the best interests of stakeholders. During this project's first year, three one-day workshops directed to levee managers and other local officials were convened at agricultural extension facilities in Hammond, Houma, and Lake Charles, Louisiana. The objectives were to provide a forum for participants to learn about the ADCIRC framework, understand the importance of accurate elevations for a robust surge model, discuss and identify additional data sources, and become familiar with the CERA (Coastal Emergency Risks Assessment) visualization tool. The workshop structure consisted of several scientific presentations with question/answer time (ADCIRC simulation inputs and outputs; the ADCIRC framework elevation component; description and examples of topographic features such as levees, roadways, railroads, etc. currently utilized in the mesh; ADCIRC model validation demonstration through historic event simulations; CERA demonstration), a breakout activity for participant groups to identify and discuss raised features not currently in the mesh and document them on provided worksheets, and a closing session for debriefing and discussion of future model improvements. Evaluation involved developing and analyzing a written survey administered to participants at the conclusion of each workshop. The survey measured satisfaction with the workshop's content, format, and utility and gathered future recommendations. Results showed the workshops were successful; further feedback will be shared in this presentation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mobrand, Lars Erik; Lestelle, Lawrence C.
In the spring of 1994 a technical planning support project was initiated by the Grande Ronde Model Watershed Board of Directors (Board) with funding from the Bonneville Power Administration. The project was motivated by a need for a science based method for prioritizing restoration actions in the basin that would promote effectiveness and accountability. In this section the authors recall the premises for the project. The authors also present a set of recommendations for implementing a watershed planning process that incorporates a science-based framework to help guide decision making. This process is intended to assist the Grande Ronde Model Watershed Board in its effort to plan and implement watershed improvement measures. The process would also assist the Board in coordinating its efforts with other entities in the region. The planning process is based on an approach for developing an ecosystem management strategy referred to as the Ecosystem Diagnosis and Treatment (EDT) method (Lichatowich et al. 1995, Lestelle et al. 1996). The process consists of an on-going planning cycle. Included in this cycle is an assessment of the ability of the watershed to support and sustain natural resources and other economic and societal values. This step in the process, which the authors refer to as the diagnosis, helps guide the development of actions (also referred to as treatments) aimed at improving the conditions of the watershed to achieve long-term objectives. The planning cycle calls for routinely reviewing and updating, as necessary, the basis for the diagnosis and other analyses used by the Board in adopting actions for implementation. The recommendations offered here address this critical need to habitually update the information used in setting priorities for action.
Kramer, Kirsten E; Small, Gary W
2009-02-01
Fourier transform near-infrared (NIR) transmission spectra are used for quantitative analysis of glucose for 17 sets of prediction data sampled as much as six months outside the timeframe of the corresponding calibration data. Aqueous samples containing physiological levels of glucose in a matrix of bovine serum albumin and triacetin are used to simulate clinical samples such as blood plasma. Background spectra of a single analyte-free matrix sample acquired during the instrumental warm-up period on the prediction day are used for calibration updating and for determining the optimal frequency response of a preprocessing infinite impulse response time-domain digital filter. By tuning the filter and the calibration model to the specific instrumental response associated with the prediction day, the calibration model is given enhanced ability to operate over time. This methodology is demonstrated in conjunction with partial least squares calibration models built with a spectral range of 4700-4300 cm(-1). By using a subset of the background spectra to evaluate the prediction performance of the updated model, projections can be made regarding the success of subsequent glucose predictions. If a threshold standard error of prediction (SEP) of 1.5 mM is used to establish successful model performance with the glucose samples, the corresponding threshold for the SEP of the background spectra is found to be 1.3 mM. For calibration updating in conjunction with digital filtering, SEP values of all 17 prediction sets collected over 3-178 days displaced from the calibration data are below 1.5 mM. In addition, the diagnostic based on the background spectra correctly assesses the prediction performance in 16 of the 17 cases.
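The pass/fail diagnostic described can be sketched as a simple threshold check: compute the prediction error of the updated model on the background spectra and compare it with the 1.3 mM threshold to judge whether glucose predictions are likely to meet the 1.5 mM criterion. The arrays and noise levels below are placeholders, not the paper's spectra or model output.

```python
import numpy as np

SEP_BACKGROUND_THRESHOLD = 1.3   # mM, diagnostic threshold from the study
SEP_GLUCOSE_TARGET = 1.5         # mM, acceptance criterion for predictions

def sep(predicted, reference):
    """Root-mean-square prediction error (used here as the SEP, in mM)."""
    residuals = np.asarray(predicted) - np.asarray(reference)
    return float(np.sqrt(np.mean(residuals ** 2)))

def prediction_day_ok(background_pred, background_ref):
    """Decide from background-spectrum performance whether the updated
    calibration is expected to meet the glucose SEP target."""
    return sep(background_pred, background_ref) <= SEP_BACKGROUND_THRESHOLD

# Placeholder numbers standing in for model output on background spectra.
bg_ref = np.zeros(20)                           # analyte-free: 0 mM glucose
bg_pred = np.random.default_rng(3).normal(0.2, 1.0, 20)
print("background SEP:", round(sep(bg_pred, bg_ref), 2), "mM ->",
      "proceed" if prediction_day_ok(bg_pred, bg_ref) else "re-calibrate")
```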
The Silicon Trypanosome: a test case of iterative model extension in systems biology
Achcar, Fiona; Fadda, Abeer; Haanstra, Jurgen R.; Kerkhoven, Eduard J.; Kim, Dong-Hyun; Leroux, Alejandro E.; Papamarkou, Theodore; Rojas, Federico; Bakker, Barbara M.; Barrett, Michael P.; Clayton, Christine; Girolami, Mark; Luise Krauth-Siegel, R.; Matthews, Keith R.; Breitling, Rainer
2016-01-01
The African trypanosome, Trypanosoma brucei, is a unicellular parasite causing African Trypanosomiasis (sleeping sickness in humans and nagana in animals). Due to some of its unique properties, it has emerged as a popular model organism in systems biology. A predictive quantitative model of glycolysis in the bloodstream form of the parasite has been constructed and updated several times. The Silicon Trypanosome (SilicoTryp) is a project that brings together modellers and experimentalists to improve and extend this core model with new pathways and additional levels of regulation. These new extensions and analyses use computational methods that explicitly take different levels of uncertainty into account. During this project, numerous tools and techniques have been developed for this purpose, which can now be used for a wide range of different studies in systems biology. PMID:24797926
Integrated Medical Model Verification, Validation, and Credibility
NASA Technical Reports Server (NTRS)
Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry
2014-01-01
The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. RESULTS: IMM VVC updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM VV Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and simulation tool, and completion of service requests from a broad end user consortium including Operations, Science and Technology Planning, and Exploration Planning. CONCLUSIONS: The VVC approach established by the IMM project of combining the IMM VV Plan with 7009 requirements is comprehensive and includes the involvement of end users at every stage in IMM evolution. Methods and techniques used to quantify the VVC status of the IMM have not only received approval from the local NASA community but have also garnered recognition by other federal agencies seeking to develop similar guidelines in the medical modeling community.
Evaluation of grid generation technologies from an applied perspective
NASA Technical Reports Server (NTRS)
Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.
1995-01-01
An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation and use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turn around time for specific grid generation and CFD projects. The conclusion was made that a single grid generation methodology is not universally suited for all CFD applications due to both limitations in grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process to the various grid generation methodologies including structured, unstructured, and hybrid procedures. The full integration of the geometric modeling and grid generation allows implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.
Solar heating and cooling technical data and systems analysis
NASA Technical Reports Server (NTRS)
Christensen, D. L.
1976-01-01
The accomplishments of a project to study solar heating and air conditioning are outlined. Presentation materials (data packages, slides, charts, and visual aids) were developed. Bibliographies and source materials on materials and coatings, solar water heaters, systems analysis computer models, solar collectors and solar projects were developed. Detailed MIRADS computer formats for primary data parameters were developed and updated. The following data were included: climatic, architectural, topography, heating and cooling equipment, thermal loads, and economics. Data sources in each of these areas were identified as well as solar radiation data stations and instruments.
The Development of a New Model of Solar EUV Irradiance Variability
NASA Technical Reports Server (NTRS)
Warren, Harry; Wagner, William J. (Technical Monitor)
2002-01-01
The goal of this research project is the development of a new model of solar EUV (Extreme Ultraviolet) irradiance variability. The model is based on combining differential emission measure distributions derived from spatially and spectrally resolved observations of active regions, coronal holes, and the quiet Sun with full-disk solar images. An initial version of this model was developed with earlier funding from NASA. The new version of the model developed with this research grant will incorporate observations from SoHO as well as updated compilations of atomic data. These improvements will make the model calculations much more accurate.
NASA Astrophysics Data System (ADS)
Pankatz, Klaus; Kerkweg, Astrid
2015-04-01
The work presented is part of the joint project "DecReg" ("Regional decadal predictability"), which is in turn part of the project "MiKlip" ("Decadal predictions"), an effort funded by the German Federal Ministry of Education and Research to improve decadal predictions on global and regional scales. One central question in MiKlip is whether regional climate modeling shows "added value", i.e. whether regional climate models (RCMs) produce better results than the driving models. The scope of this study, however, is to look more closely at the setup-specific details of regional climate modeling. As regional models only simulate a small domain, they have to inherit information about the state of the atmosphere at their lateral boundaries from external data sets. There are many unresolved questions concerning the setup of lateral boundary conditions (LBCs). External data sets come from global models or from global reanalysis data sets. A temporal resolution of six hours is common for this kind of data, mainly because storage space is a limiting factor, especially for climate simulations. Theoretically, however, the coupling frequency could be as high as the time step of the driving model, and it is unclear whether a more frequent update of the LBCs has a significant effect on the climate in the domain of the RCM. The first study examines how the RCM reacts to a higher update frequency. It is based on a 30-year time-slice experiment for three update frequencies of the LBCs, namely six hours, one hour and six minutes. The evaluation of means, standard deviations and statistics of the climate in the regional domain shows only small, though in some cases statistically significant, deviations in 2 m temperature, sea level pressure and precipitation. The second part of the first study assesses parameters linked to cyclone activity; differences in track density and strength are found when comparing the simulations, indicating that cyclone activity is affected by the LBC update frequency. Theoretically, regional downscaling should act like a magnifying glass: it should reveal details on small scales which a global model cannot resolve, but it should not affect the large-scale flow. As the development of the small-scale features takes some time, it is important that the air stays long enough within the regional domain. The spin-up time of the small-scale features depends, of course, on the resolution of the LBCs and the resolution of the RCM. The second study examines the quality of decadal hindcasts of the decade 2001-2010 over Europe when the horizontal resolution of the driving model from which the LBCs are calculated is altered (namely 2.8°, 1.8°, 1.4° and 1.1°). The study shows that a smaller gap between the LBC resolution and the RCM resolution might be beneficial.
Scale-adaptive compressive tracking with feature integration
NASA Astrophysics Data System (ADS)
Liu, Wei; Li, Jicheng; Chen, Xiao; Li, Shuxin
2016-05-01
Numerous tracking-by-detection methods have been proposed for robust visual tracking, among which compressive tracking (CT) has obtained some promising results. A scale-adaptive CT method based on multifeature integration is presented to improve the robustness and accuracy of CT. We introduce a keypoint-based model to achieve accurate scale estimation, which can additionally give a prior location of the target. Furthermore, exploiting the high efficiency of a data-independent random projection matrix, multiple features are integrated into an effective appearance model used to construct the naïve Bayes classifier. Finally, an adaptive update scheme is proposed to update the classifier conservatively. Experiments on various challenging sequences demonstrate substantial improvements by our proposed tracker over CT and other state-of-the-art trackers in terms of dealing with scale variation, abrupt motion, deformation, and illumination changes.
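Two ingredients the abstract relies on, a sparse data-independent random projection and a conservatively updated naive Bayes appearance model, are sketched below with synthetic features; the sparsity pattern, learning rate and data are illustrative assumptions, not the authors' tracker.

```python
import numpy as np

rng = np.random.default_rng(4)

D, d = 5000, 50                        # high-dimensional features -> compressed
# Sparse, data-independent random projection matrix with entries in {-1, 0, +1}.
R = rng.choice([-1.0, 0.0, 1.0], size=(d, D), p=[1 / 6, 2 / 3, 1 / 6])

def compress(features):
    return R @ features

class NaiveBayes:
    """Gaussian naive Bayes over compressed features with conservative
    (running-average) updates of the class means and spreads."""
    def __init__(self, dim, lr=0.15):
        self.mu = np.zeros((2, dim))           # class 0: background, 1: target
        self.sigma = np.ones((2, dim))
        self.lr = lr

    def update(self, samples, label):
        mu, sigma = samples.mean(axis=0), samples.std(axis=0) + 1e-6
        self.mu[label] = (1 - self.lr) * self.mu[label] + self.lr * mu
        self.sigma[label] = (1 - self.lr) * self.sigma[label] + self.lr * sigma

    def score(self, sample):                   # log-likelihood ratio target/background
        def loglik(k):
            return np.sum(-0.5 * ((sample - self.mu[k]) / self.sigma[k]) ** 2
                          - np.log(self.sigma[k]))
        return loglik(1) - loglik(0)

# Synthetic positive (target) and negative (background) feature vectors.
pos = rng.normal(1.0, 1.0, (20, D))
neg = rng.normal(0.0, 1.0, (200, D))
clf = NaiveBayes(d)
clf.update(np.array([compress(p) for p in pos]), label=1)
clf.update(np.array([compress(n) for n in neg]), label=0)
print("target score:", round(clf.score(compress(rng.normal(1.0, 1.0, D))), 1),
      "| background score:", round(clf.score(compress(rng.normal(0.0, 1.0, D))), 1))
```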
U.S. Crude Oil Production to 2025: Updated Projection of Crude Types
2015-01-01
This report updates and extends a May 2014 EIA report, U.S. crude oil production forecast – analysis of crude types. It provides a projection of domestic crude oil production by crude type through 2025, supplementing the overall production projection provided in the AEO2015. Projections of production by crude type matter for several reasons. First, U.S. crude streams vary widely in quality. Second, the economics surrounding various options for the domestic use of additional domestic oil production are directly dependent on crude quality characteristics. Third, actual or potential export values also vary significantly with quality characteristics.
The maturing of the quality improvement paradigm in the SEL
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1993-01-01
The Software Engineering Laboratory uses a paradigm for improving the software process and product, called the quality improvement paradigm. This paradigm has evolved over the past 18 years, along with our software development processes and product. Since 1976, when we first began the SEL, we have learned a great deal about improving the software process and product, making a great many mistakes along the way. The quality improvement paradigm, as it is currently defined, can be broken down into six steps: characterize the current project and its environment with respect to the appropriate models and metrics; set the quantifiable goals for successful project performance and improvement; choose the appropriate process model and supporting methods and tools for this project; execute the processes, construct the products, and collect, validate, and analyze the data to provide real-time feedback for corrective action; analyze the data to evaluate the current practices, determine problems, record findings, and make recommendations for future project improvements; and package the experience gained in the form of updated and refined models and other forms of structured knowledge gained from this and prior projects, and save it in an experience base to be reused on future projects.
A Comparison of Combustor-Noise Models
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.
2012-01-01
The current status of combustor-noise prediction in the NASA Aircraft Noise Prediction Program (ANOPP) for current-generation (N) turbofan engines is summarized. Best methods for near-term updates are reviewed. Long-term needs and challenges for the N+1 through N+3 timeframe are discussed. This work was carried out under the NASA Fundamental Aeronautics Program, Subsonic Fixed Wing Project, Quiet Aircraft Subproject.
John Hof; Curtis Flather; Tony Baltic; Rudy King
2006-01-01
The 2005 Forest and Rangeland Condition Indicator Model is a set of classification trees for forest and rangeland condition indicators at the national scale. This report documents the development of the database and the nonparametric statistical estimation for this analytical structure, with emphasis on three special characteristics of condition indicator production...
NASA Astrophysics Data System (ADS)
Bertholet, Jenny; Toftegaard, Jakob; Hansen, Rune; Worm, Esben S.; Wan, Hanlin; Parikh, Parag J.; Weber, Britta; Høyer, Morten; Poulsen, Per R.
2018-03-01
The purpose of this study was to develop, validate and clinically demonstrate fully automatic tumour motion monitoring on a conventional linear accelerator by combined optical and sparse monoscopic imaging with kilovoltage x-rays (COSMIK). COSMIK combines auto-segmentation of implanted fiducial markers in cone-beam computed tomography (CBCT) projections and intra-treatment kV images with simultaneous streaming of an external motion signal. A pre-treatment CBCT is acquired with simultaneous recording of the motion of an external marker block on the abdomen. The 3-dimensional (3D) marker motion during the CBCT is estimated from the auto-segmented positions in the projections and used to optimize an external correlation model (ECM) of internal motion as a function of external motion. During treatment, the ECM estimates the internal motion from the external motion at 20 Hz. kV images are acquired every 3 s, auto-segmented, and used to update the ECM for baseline shifts between internal and external motion. The COSMIK method was validated using Calypso-recorded internal tumour motion with simultaneous camera-recorded external motion for 15 liver stereotactic body radiotherapy (SBRT) patients. The validation included phantom experiments and simulations thereof for 12 fractions and further simulations for 42 fractions. The simulations compared the accuracy of COSMIK with ECM-based monitoring without model updates, with model updates based on stereoscopic imaging, and with continuous kilovoltage intrafraction monitoring (KIM) at 10 Hz without an external signal. Clinical real-time tumour motion monitoring with COSMIK was performed offline for 14 liver SBRT patients (41 fractions) and online for one patient (two fractions). The mean 3D root-mean-square error for the four monitoring methods was 1.61 mm (COSMIK), 2.31 mm (ECM without updates), 1.49 mm (ECM with stereoscopic updates) and 0.75 mm (KIM). COSMIK is the first combined kV/optical real-time motion monitoring method used clinically online on a conventional accelerator. COSMIK delivers a lower imaging dose than KIM and is, in addition, applicable when the kV imager cannot be deployed, such as during non-coplanar fields.
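A minimal sketch of an external correlation model of the kind described, assuming a simple linear relation between external and internal motion and an offset-only baseline update from occasional kV observations (the clinical ECM and its update rule may differ; the smoothing weight is an assumption):

    import numpy as np

    class ExternalCorrelationModel:
        def __init__(self, external, internal):
            # fit internal = slope * external + offset from pre-treatment CBCT data
            self.slope, self.offset = np.polyfit(external, internal, deg=1)

        def estimate(self, external_signal):
            # estimate internal motion from the 20 Hz external signal
            return self.slope * external_signal + self.offset

        def baseline_update(self, external_now, internal_kv, weight=0.3):
            # correct baseline drift with a kV-derived internal position (every ~3 s)
            residual = internal_kv - self.estimate(external_now)
            self.offset += weight * residual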
Dawn Orbit Determination Team: Trajectory Modeling and Reconstruction Processes at Vesta
NASA Technical Reports Server (NTRS)
Abrahamson, Matthew J.; Ardito, Alessandro; Han, Dongsuk; Haw, Robert; Kennedy, Brian; Mastrodemos, Nick; Nandi, Sumita; Park, Ryan; Rush, Brian; Vaughan, Andrew
2013-01-01
The Dawn spacecraft spent over a year in orbit around Vesta from July 2011 through August 2012. In order to maintain the designated science reference orbits and enable the transfers between those orbits, precise and timely orbit determination was required. Challenges included low-thrust ion propulsion modeling, estimation of relatively unknown Vesta gravity and rotation models, tracking data limitations, incorporation of real-time telemetry into dynamics model updates, and rapid maneuver design cycles during transfers. This paper discusses the dynamics models, filter configuration, and data processing implemented to deliver a rapid orbit determination capability to the Dawn project.
The advantage of flexible neuronal tunings in neural network models for motor learning
Marongelli, Ellisha N.; Thoroughman, Kurt A.
2013-01-01
Human motor adaptation to novel environments is often modeled by a basis function network that transforms desired movement properties into estimated forces. This network employs a layer of nodes that have fixed broad tunings that generalize across the input domain. Learning is achieved by updating the weights of these nodes in response to training experience. This conventional model is unable to account for rapid flexibility observed in human spatial generalization during motor adaptation. However, added plasticity in the widths of the basis function tunings can achieve this flexibility, and several neurophysiological experiments have revealed flexibility in tunings of sensorimotor neurons. We found a model, Locally Weighted Projection Regression (LWPR), which uniquely possesses the structure of a basis function network in which both the weights and tuning widths of the nodes are updated incrementally during adaptation. We presented this LWPR model with training functions of different spatial complexities and monitored incremental updates to receptive field widths. An inverse pattern of dependence of receptive field adaptation on experienced error became evident, underlying both a relationship between generalization and complexity, and a unique behavior in which generalization always narrows after a sudden switch in environmental complexity. These results implicate a model that is flexible in both basis function widths and weights, like LWPR, as a viable alternative model for human motor adaptation that can account for previously observed plasticity in spatial generalization. This theory can be tested by using the behaviors observed in our experiments as novel hypotheses in human studies. PMID:23888141
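To make the "flexible tunings" idea concrete, here is a simplified basis-function network, not the LWPR algorithm itself, in which both the output weights and the Gaussian tuning widths are adapted incrementally by gradient descent; centers, learning rates, and clipping bounds are illustrative assumptions.

    import numpy as np

    class FlexibleBasisNetwork:
        def __init__(self, centers, width=1.0, lr_w=0.05, lr_s=0.01):
            self.c = np.asarray(centers, dtype=float)
            self.s = np.full_like(self.c, width)   # tuning widths (adaptable)
            self.w = np.zeros_like(self.c)         # output weights (adaptable)
            self.lr_w, self.lr_s = lr_w, lr_s

        def _phi(self, x):
            return np.exp(-0.5 * ((x - self.c) / self.s) ** 2)

        def predict(self, x):
            return float(self.w @ self._phi(x))

        def update(self, x, target):
            phi = self._phi(x)
            err = target - self.w @ phi
            # gradient descent on 0.5*err^2 with respect to weights and widths
            self.w += self.lr_w * err * phi
            self.s += self.lr_s * err * self.w * phi * ((x - self.c) ** 2) / self.s ** 3
            self.s = np.clip(self.s, 0.1, 10.0)    # keep tunings in a sensible range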
Fuentes, Màrius V
2006-11-01
Fasciolosis caused by Fasciola hepatica in various South American countries located on the slopes of the Andes has been recognized as an important public health problem. However, the importance of this zoonotic hepatic parasite was neglected until the last decade. Countries such as Peru and Bolivia are considered to be hyperendemic areas for human and animal fasciolosis, and other countries such as Chile, Ecuador, Colombia and Venezuela are also affected. At the beginning of the 1990s a multidisciplinary project was launched with the aim to shed light on the problems related to this parasitic disease in the Northern Bolivian Altiplano. A few years later, a geographic information system (GIS) was incorporated into this multidisciplinary project analysing the epidemiology of human and animal fasciolosis in this South American Andean region. Various GIS projects were developed in some Andean regions using climatic data, climatic forecast indices and remote sensing data. Step by step, all these GIS projects concerning the forecast of the fasciolosis transmission risk in the Andean mountain range were revised and in some cases updated taking into account new data. The first of these projects was developed on a regional scale for the central Chilean regions and the proposed model was validated on a local scale in the Northern Bolivian Altiplano. This validated mixed model, based on both fasciolosis climatic forecast indices and normalized difference vegetation index values from Advanced Very High Resolution Radiometer satellite sensor, was extrapolated to other human and/or animal endemic areas of Peru and Ecuador. The resulting fasciolosis risk maps make it possible to show the known human endemic areas of, mainly, the Peruvian Altiplano, Cajamarca and Mantaro Peruvian valleys, and some valleys of the Ecuadorian Cotopaxi province. Nevertheless, more climate and remote sensing data, as well as more accurate epidemiological reports, have to be incorporated into these GIS projects, which should be considered the key in understanding fasciolosis transmission in the Andes.
Share Your Solar Project Experience
The Local Government Solar Project Portal's Share Your Solar Project Experience page includes details on how municipalities can be listed in the Portal and update EPA with their solar project development achievements and progress.
Customizing WRF-Hydro for the Laurentian Great Lakes Basin
NASA Astrophysics Data System (ADS)
Gronewold, A.; Pei, L.; Gochis, D.; Mason, L.; Sampson, K. M.; Dugger, A. L.; Read, L.; McCreight, J. L.; Xiao, C.; Lofgren, B. M.; Anderson, E. J.; Chu, P. Y.
2017-12-01
To advance the state of the art in regional hydrological forecasting, and to align with operational deployment of the National Water Model, a team of scientists has been customizing WRF-Hydro (the Weather Research and Forecasting model - Hydrological modeling extension package) to the entirety (including binational land and lake surfaces) of the Laurentian Great Lakes basin. Objectives of this customization project include operational simulation and forecasting of the Great Lakes water balance and, in the short term, research-oriented insights into modeling one- and two-way coupled lake-atmosphere and near-shore processes. Initial steps in this project have focused on overcoming inconsistencies in land surface hydrographic datasets between the United States and Canada. Improvements in the model's current representation of lake physics and stream routing are also critical components of this effort. Here, we present an update on the status of this project, including a synthesis of offline tests with WRF-Hydro based on the newly developed Great Lakes hydrographic data, and an assessment of the model's ability to simulate seasonal and multi-decadal hydrological response across the Great Lakes.
Danielson, Jeffrey J.; Poppenga, Sandra K.; Brock, John C.; Evans, Gayla A.; Tyler, Dean; Gesch, Dean B.; Thatcher, Cindy A.; Barras, John
2016-01-01
During the coming decades, coastlines will respond to widely predicted sea-level rise, storm surge, and coastal inundation flooding from disastrous events. Because physical processes in coastal environments are controlled by the geomorphology of over-the-land topography and underwater bathymetry, many applications of geospatial data in coastal environments require detailed knowledge of the near-shore topography and bathymetry. In this paper, an updated methodology used by the U.S. Geological Survey Coastal National Elevation Database (CoNED) Applications Project is presented for developing coastal topobathymetric elevation models (TBDEMs) from multiple topographic data sources with adjacent intertidal topobathymetric and offshore bathymetric sources to generate seamlessly integrated TBDEMs. This repeatable, updatable, and logically consistent methodology assimilates topographic data (land elevation) and bathymetry (water depth) into a seamless coastal elevation model. Within the overarching framework, vertical datum transformations are standardized in a workflow that interweaves spatially consistent interpolation (gridding) techniques with a land/water boundary mask delineation approach. Output gridded raster TBDEMs are stacked into a file storage system of mosaic datasets within an Esri ArcGIS geodatabase for efficient updating while maintaining current and updated spatially referenced metadata. Topobathymetric data provide a required seamless elevation product for several science application studies, such as shoreline delineation, coastal inundation mapping, sediment-transport, sea-level rise, storm surge models, and tsunami impact assessment. These detailed coastal elevation data are critical to depict regions prone to climate change impacts and are essential to planners and managers responsible for mitigating the associated risks and costs to both human communities and ecosystems. The CoNED methodology approach has been used to construct integrated TBDEM models in Mobile Bay, the northern Gulf of Mexico, San Francisco Bay, the Hurricane Sandy region, and southern California.
NASA Astrophysics Data System (ADS)
Lauer, Axel; Jones, Colin; Eyring, Veronika; Evaldsson, Martin; Hagemann, Stefan; Mäkelä, Jarmo; Martin, Gill; Roehrig, Romain; Wang, Shiyu
2018-01-01
The performance of updated versions of the four earth system models (ESMs) CNRM, EC-Earth, HadGEM, and MPI-ESM is assessed in comparison to their predecessor versions used in Phase 5 of the Coupled Model Intercomparison Project. The Earth System Model Evaluation Tool (ESMValTool) is applied to evaluate selected climate phenomena in the models against observations. This is the first systematic application of the ESMValTool to assess and document the progress made during an extensive model development and improvement project. This study focuses on the South Asian monsoon (SAM) and the West African monsoon (WAM), the coupled equatorial climate, and Southern Ocean clouds and radiation, which are known to exhibit systematic biases in present-day ESMs. The analysis shows that the tropical precipitation in three out of four models is clearly improved. Two of three updated coupled models show an improved representation of tropical sea surface temperatures with one coupled model not exhibiting a double Intertropical Convergence Zone (ITCZ). Simulated cloud amounts and cloud-radiation interactions are improved over the Southern Ocean. Improvements are also seen in the simulation of the SAM and WAM, although systematic biases remain in regional details and the timing of monsoon rainfall. Analysis of simulations with EC-Earth at different horizontal resolutions from T159 up to T1279 shows that the synoptic-scale variability in precipitation over the SAM and WAM regions improves with higher model resolution. The results suggest that the reasonably good agreement of modeled and observed mean WAM and SAM rainfall in lower-resolution models may be a result of unrealistic intensity distributions.
The Potential for Predicting Precipitation on Seasonal-to-Interannual Timescales
NASA Technical Reports Server (NTRS)
Koster, R. D.
1999-01-01
The ability to predict precipitation several months in advance would have a significant impact on water resource management. This talk provides an overview of a project aimed at developing this prediction capability. NASA's Seasonal-to-Interannual Prediction Project (NSIPP) will generate seasonal-to-interannual sea surface temperature predictions through detailed ocean circulation modeling and will then translate these SST forecasts into forecasts of continental precipitation through the application of an atmospheric general circulation model and a "SVAT"-type land surface model. As part of the process, ocean variables (e.g., height) and land variables (e.g., soil moisture) will be updated regularly via data assimilation. The overview will include a discussion of the variability inherent in such a modeling system and will provide some quantitative estimates of the absolute upper limits of seasonal-to-interannual precipitation predictability.
Mitra, Ayan; Politte, David G; Whiting, Bruce R; Williamson, Jeffrey F; O'Sullivan, Joseph A
2017-01-01
Model-based image reconstruction (MBIR) techniques have the potential to generate high-quality images from noisy measurements and a small number of projections, which can reduce the x-ray dose to patients. These MBIR techniques rely on projection and backprojection to refine an image estimate. One of the widely used projectors for these modern MBIR-based techniques is the branchless distance-driven (DD) projector and backprojector. While this method produces superior-quality images, the computational cost of iterative updates keeps it from being ubiquitous in clinical applications. In this paper, we provide several new parallelization ideas for concurrent execution of the DD projectors in multi-GPU systems using CUDA programming tools. We have introduced some novel schemes for dividing the projection data and image voxels over multiple GPUs to avoid runtime overhead and inter-device synchronization issues. We have also reduced the complexity of the overlap calculation of the algorithm by eliminating the common projection plane and directly projecting the detector boundaries onto image voxel boundaries. To reduce the time required for calculating the overlap between the detector edges and image voxel boundaries, we have proposed a pre-accumulation technique that accumulates image intensities in perpendicular 2D image slabs (from a 3D image) before projection and after backprojection to ensure our DD kernels run faster in parallel GPU threads. For the implementation of our iterative MBIR technique, we use a parallel multi-GPU version of the alternating minimization (AM) algorithm with a penalized likelihood update. The time performance of our proposed reconstruction method with Siemens Sensation 16 patient scan data shows an average speedup of 24 times using a single TITAN X GPU and 74 times using 3 TITAN X GPUs in parallel for combined projection and backprojection.
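The pre-accumulation idea can be illustrated in one dimension: after a cumulative sum of intensities along the slab direction, the intensity falling between two projected detector-cell boundaries reduces to a difference of two interpolated values of the cumulative profile. This NumPy sketch is only a conceptual illustration, not the CUDA kernels described in the paper.

    import numpy as np

    def preaccumulate(slab_intensity):
        # prefix sum with a leading zero, so cum[j] = sum of the first j voxels
        return np.concatenate(([0.0], np.cumsum(slab_intensity)))

    def detector_cell_sums(cum, boundaries):
        # boundaries: detector-cell edges projected onto voxel-index coordinates
        vals = np.interp(boundaries, np.arange(cum.size), cum)
        return np.diff(vals)  # integrated intensity per detector cell

    # example: 8 voxels, 3 detector cells whose projected edges straddle voxel edges
    voxels = np.array([1., 2., 3., 4., 5., 6., 7., 8.])
    edges = np.array([0.0, 2.5, 5.25, 8.0])
    print(detector_cell_sums(preaccumulate(voxels), edges))  # sums to the total intensity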
Revenues: Where Does the Money Come from? A Delta Data Update, 2000-2010
ERIC Educational Resources Information Center
Kirshstein, Rita J.; Hurlburt, Steven
2012-01-01
This is one in a series of data briefs developed by the Delta Cost Project at AIR using data from the "IPEDS Analytics: Delta Cost Project Database 1987-2010," which was released on August 14, 2012, by the U.S. Department of Education, National Center for Education Statistics. The intent of these briefs is to update key tables and figures from…
Spending: Where Does the Money Go? A Delta Data Update, 2000-2010
ERIC Educational Resources Information Center
Hurlburt, Steven; Kirshstein, Rita J.
2012-01-01
This is one in a series of data briefs developed by the Delta Cost Project at AIR using data from the "IPEDS Analytics: Delta Cost Project Database 1987-2010," which was released on August 14, 2012, by the U.S. Department of Education, National Center for Education Statistics. The intent of these briefs is to update key tables and figures from…
Jovian Plasma Modeling for Mission Design
NASA Technical Reports Server (NTRS)
Garrett, Henry B.; Kim, Wousik; Belland, Brent; Evans, Robin
2015-01-01
The purpose of this report is to address uncertainties in the plasma models at Jupiter responsible for surface charging and to update the jovian plasma models using the most recent data available. The updated plasma environment models were then used to evaluate two proposed Europa mission designs for spacecraft charging effects using the Nascap-2k code. The original Divine/Garrett jovian plasma model (or "DG1", T. N. Divine and H. B. Garrett, "Charged particle distributions in Jupiter's magnetosphere," J. Geophys. Res., vol. 88, pp. 6889-6903, 1983) has not been updated in 30 years, and there are known errors in the model. As an example, the cold ion plasma temperatures between approx. 5 and 10 Jupiter radii (Rj) were found by the experimenters who originally published the data to have been underestimated by approx. 2 shortly after publication of the original DG1 model. As knowledge of the plasma environment is critical to any evaluation of the surface charging at Jupiter, the original DG1 model needed to be updated to correct for this and other changes in our interpretation of the data so that charging levels could be properly estimated using the Nascap-2k charging code. As an additional task, the Nascap-2k spacecraft charging tool has been adapted to incorporate the so-called Kappa plasma distribution function, an important component of the plasma model necessary to compute the particle fluxes between approx. 5 keV and 100 keV (at the outset of this study, Nascap-2k did not directly incorporate this common representation of the plasma, thus limiting the accuracy of our charging estimates). The updating of the DG1 model and its integration into the Nascap-2k design tool means that charging concerns can now be more efficiently evaluated and mitigated. (We note that, given the subsequent decision by the Europa project to utilize solar arrays for its baseline design, surface charging effects have become even more of an issue for its mission design). The modifications and results of those modifications to the DG1 model to produce the new DG2 model presented here and the steps taken to integrate the DG2 predictions into Nascap-2k are described in this report.
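For reference, one common convention for the isotropic kappa velocity distribution mentioned above is sketched below; the normalization and parameterization follow a textbook form chosen for illustration and are not taken from the Nascap-2k implementation (valid for kappa > 3/2).

    import numpy as np
    from scipy.special import gamma

    def kappa_distribution(v, n, kT_joule, mass, kappa):
        # phase-space density f(v) [s^3 m^-6] for an isotropic kappa distribution
        # (one common convention); requires kappa > 1.5
        theta2 = (2.0 * kappa - 3.0) / kappa * kT_joule / mass   # effective thermal speed squared
        norm = n / (np.pi * kappa * theta2) ** 1.5 * gamma(kappa + 1.0) / gamma(kappa - 0.5)
        return norm * (1.0 + v**2 / (kappa * theta2)) ** (-(kappa + 1.0))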
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shonder, John A; Hughes, Patrick; Atkin, Erica
2006-11-01
A study was sponsored by FEMP in 2001-2002 to develop methods to compare life-cycle costs of federal energy conservation projects carried out through energy savings performance contracts (ESPCs) and projects that are directly funded by appropriations. The study described in this report follows up on the original work, taking advantage of new pricing data on equipment and on $500 million worth of Super ESPC projects awarded since the end of FY 2001. The methods developed to compare life-cycle costs of ESPCs and directly funded energy projects are based on the following tasks: (1) Verify the parity of equipment prices in ESPC vs. directly funded projects; (2) Develop a representative energy conservation project; (3) Determine representative cycle times for both ESPCs and appropriations-funded projects; (4) Model the representative energy project implemented through an ESPC and through appropriations funding; and (5) Calculate the life-cycle costs for each project.
Trends in College Spending: 2001-2011. A Delta Data Update
ERIC Educational Resources Information Center
Desrochers, Donna M.; Hurlburt, Steven
2014-01-01
This "Trends in College Spending" update presents national-level estimates for the "Delta Cost Project" data metrics during the period 2001-11. To accelerate the release of more current trend data, however, this update includes only a brief summary of the financial patterns and trends observed during the decade 2001-11, with…
2001-01-31
• ...function of Jini, UPnP, SLP, Bluetooth, and HAVi • Projected specific UML models for Jini, UPnP, and SLP • Developed a Rapide model of Jini, including the SCM_Discovery module, which is used by all Jini entities in directed-discovery mode, sends unicast messages to the SCMs on a list of SCMs to be discovered until all SCMs are found, and removes SCMs according to updates received from the SCM database of discovered SCMs.
Implementation of NGA-West2 ground motion models in the 2014 U.S. National Seismic Hazard Maps
Rezaeian, Sanaz; Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Harmsen, Stephen C.; Frankel, Arthur D.
2014-01-01
The U.S. National Seismic Hazard Maps (NSHMs) have been an important component of seismic design regulations in the United States for the past several decades. These maps present earthquake ground shaking intensities at specified probabilities of being exceeded over a 50-year time period. The previous version of the NSHMs was developed in 2008; during 2012 and 2013, scientists at the U.S. Geological Survey have been updating the maps based on their assessment of the “best available science,” resulting in the 2014 NSHMs. The update includes modifications to the seismic source models and the ground motion models (GMMs) for sites across the conterminous United States. This paper focuses on updates in the Western United States (WUS) due to the use of new GMMs for shallow crustal earthquakes in active tectonic regions developed by the Next Generation Attenuation (NGA-West2) project. Individual GMMs, their weighted combination, and their impact on the hazard maps relative to 2008 are discussed. In general, the combined effects of lower medians and increased standard deviations in the new GMMs have caused only small changes, within 5–20%, in the probabilistic ground motions for most sites across the WUS compared to the 2008 NSHMs.
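As a hedged illustration of how a weighted set of ground motion models enters such hazard calculations, the sketch below combines lognormal exceedance probabilities from several GMMs with logic-tree weights for a single earthquake scenario; the medians, sigmas, and weights are placeholders, not NGA-West2 values.

    import numpy as np
    from scipy.stats import norm

    def prob_exceed(x, medians, sigmas, weights):
        # medians: GMM median spectral accelerations (g); sigmas: log-space standard
        # deviations; weights: logic-tree weights summing to 1
        eps = (np.log(x) - np.log(np.asarray(medians))) / np.asarray(sigmas)
        return float(np.sum(np.asarray(weights) * norm.sf(eps)))

    # example with three hypothetical GMMs for one scenario
    print(prob_exceed(0.3, medians=[0.20, 0.25, 0.22], sigmas=[0.65, 0.70, 0.60],
                      weights=[0.4, 0.3, 0.3]))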
News and Updates from Proctor Creek
This page contains news and updates from the Proctor Creek Urban Waters Partnership location. They span ongoing projects, programs, and initiatives that this Atlanta-based partnership is taking on in its work plan.
NASA Astrophysics Data System (ADS)
Son, Seok-Woo; Han, Bo-Reum; Garfinkel, Chaim I.; Kim, Seo-Yeon; Park, Rokjin; Abraham, N. Luke; Akiyoshi, Hideharu; Archibald, Alexander T.; Butchart, N.; Chipperfield, Martyn P.; Dameris, Martin; Deushi, Makoto; Dhomse, Sandip S.; Hardiman, Steven C.; Jöckel, Patrick; Kinnison, Douglas; Michou, Martine; Morgenstern, Olaf; O’Connor, Fiona M.; Oman, Luke D.; Plummer, David A.; Pozzer, Andrea; Revell, Laura E.; Rozanov, Eugene; Stenke, Andrea; Stone, Kane; Tilmes, Simone; Yamashita, Yousuke; Zeng, Guang
2018-05-01
The Southern Hemisphere (SH) zonal-mean circulation change in response to Antarctic ozone depletion is re-visited by examining a set of the latest model simulations archived for the Chemistry-Climate Model Initiative (CCMI) project. All models reasonably well reproduce Antarctic ozone depletion in the late 20th century. The related SH-summer circulation changes, such as a poleward intensification of westerly jet and a poleward expansion of the Hadley cell, are also well captured. All experiments exhibit quantitatively the same multi-model mean trend, irrespective of whether the ocean is coupled or prescribed. Results are also quantitatively similar to those derived from the Coupled Model Intercomparison Project phase 5 (CMIP5) high-top model simulations in which the stratospheric ozone is mostly prescribed with monthly- and zonally-averaged values. These results suggest that the ozone-hole-induced SH-summer circulation changes are robust across the models irrespective of the specific chemistry-atmosphere-ocean coupling.
Draft project management update to the Iowa DOT Project Development Manual : final report.
DOT National Transportation Integrated Search
2016-08-01
This work supported drafting project management guidance for the Iowa Department of Transportation (DOT). The goal is to incorporate a greater focus on project management in their project development process. A technical advisory committee (TAC) ...
ERIC Educational Resources Information Center
Smith, Mike U.; Adkison, Linda R.
2010-01-01
Gericke and Hagberg (G & H, "Sci Educ" 16:849-881, 2007) recently published in this journal a thoughtful analysis of the historical progression of our understanding of the nature of the gene for use in instruction. This analysis, however, did not include the findings of the Human Genome Project (HGP), which must be included in any introductory…
Updated constraints on the dark matter interpretation of CDMS-II-Si data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witte, Samuel J.; Gelmini, Graciela B., E-mail: switte@physics.ucla.edu, E-mail: gelmini@physics.ucla.edu
2017-05-01
We present an updated halo-dependent and halo-independent analysis of viable light WIMP dark matter candidates which could account for the excess observed in CDMS-II-Si. We include recent constraints from LUX, PandaX-II, and PICO-60, as well as projected sensitivities for XENON1T, SuperCDMS SNOLAB, LZ, DARWIN, DarkSide-20k, and PICO-250, on candidates with spin-independent isospin-conserving and isospin-violating interactions, and either elastic or exothermic scattering. We show that there exist dark matter candidates which can explain the CDMS-II-Si data and remain very marginally consistent with the null results of all current experiments; however, such models are highly tuned, making a dark matter interpretation of CDMS-II-Si very unlikely. We find that these models can only be ruled out in the future by an experiment comparable to LZ or PICO-250.
NASA Technical Reports Server (NTRS)
Jiang, Jian-Ping; Murphy, Elizabeth D.; Bailin, Sidney C.; Truszkowski, Walter F.
1993-01-01
Capturing human factors knowledge about the design of graphical user interfaces (GUI's) and applying this knowledge on-line are the primary objectives of the Computer-Human Interaction Models (CHIMES) project. The current CHIMES prototype is designed to check a GUI's compliance with industry-standard guidelines, general human factors guidelines, and human factors recommendations on color usage. Following the evaluation, CHIMES presents human factors feedback and advice to the GUI designer. The paper describes the approach to modeling human factors guidelines, the system architecture, a new method developed to convert quantitative RGB primaries into qualitative color representations, and the potential for integrating CHIMES with user interface management systems (UIMS). Both the conceptual approach and its implementation are discussed. This paper updates the presentation on CHIMES at the first International Symposium on Ground Data Systems for Spacecraft Control.
NASA Technical Reports Server (NTRS)
Schubert, Siegfried
2011-01-01
The Global Modeling and Assimilation Office at NASA's Goddard Space Flight Center is developing a number of experimental prediction and analysis products suitable for research and applications. The prediction products include a large suite of subseasonal and seasonal hindcasts and forecasts (as a contribution to the US National MME), a suite of decadal (10-year) hindcasts (as a contribution to the IPCC decadal prediction project), and a series of large ensemble and high resolution simulations of selected extreme events, including the 2010 Russian and 2011 US heat waves. The analysis products include an experimental atlas of climate (in particular drought) and weather extremes. This talk will provide an update on those activities, and discuss recent efforts by WCRP to leverage off these and similar efforts at other institutions throughout the world to develop an experimental global drought early warning system.
Resource Tracking Model Updates and Trade Studies
NASA Technical Reports Server (NTRS)
Chambliss, Joe; Stambaugh, Imelda; Moore, Michael
2016-01-01
The Resource Tracking Model (RTM) has been updated to capture system manager and project manager inputs. Both the Trick/GUNNS RTM simulator and the RTM mass balance spreadsheet have been revised to address inputs from system managers and to refine the way mass balance is illustrated. The revisions to the RTM included the addition of a Plasma Pyrolysis Assembly (PPA) to recover hydrogen from Sabatier reactor methane, which was vented in the prior version of the RTM. The effect of the PPA on the overall balance of resources in an exploration vehicle is illustrated by the increased recycling of vehicle oxygen. Additionally, simulation of EVAs conducted from the exploration module was added. Since the focus of the exploration module is to provide a habitat during deep space operations, the EVA simulation is based on ISS EVA protocols and processes. Case studies have been run to show the relative effect of performance changes on vehicle resources.
2017 update of the discoveries of nuclides
NASA Astrophysics Data System (ADS)
Thoennessen, M.
The 2017 update of the discoveries of nuclides project is presented. In 2017, 34 new nuclides were observed for the first time; however, the assignment of six previously identified nuclides had to be retracted.
Rosa, Sarah N.; Hay, Lauren E.
2017-12-01
In 2014, the U.S. Geological Survey, in cooperation with the U.S. Department of Defense’s Strategic Environmental Research and Development Program, initiated a project to evaluate the potential impacts of projected climate change on Department of Defense installations that rely on Guam’s water resources. A major task of that project was to develop a watershed model of southern Guam and a water-balance model for the Fena Valley Reservoir. The southern Guam watershed model provides a physically based tool to estimate surface-water availability in southern Guam. The U.S. Geological Survey’s Precipitation Runoff Modeling System, PRMS-IV, was used to construct the watershed model. The PRMS-IV code simulates different parts of the hydrologic cycle based on a set of user-defined modules. The southern Guam watershed model was constructed by updating a watershed model for the Fena Valley watersheds, and expanding the modeled area to include all of southern Guam. The Fena Valley watershed model was combined with a previously developed, but recently updated and recalibrated Fena Valley Reservoir water-balance model. Two important surface-water resources for the U.S. Navy and the citizens of Guam were modeled in this study; the extended model now includes the Ugum River watershed and improves upon the previous model of the Fena Valley watersheds. Surface water from the Ugum River watershed is diverted and treated for drinking water, and the Fena Valley watersheds feed the largest surface-water reservoir on Guam. The southern Guam watershed model performed “very good,” according to the criteria of Moriasi and others (2007), in the Ugum River watershed above Talofofo Falls with monthly Nash-Sutcliffe efficiency statistic values of 0.97 for the calibration period and 0.93 for the verification period (a value of 1.0 represents perfect model fit). In the Fena Valley watershed, monthly simulated streamflow volumes from the watershed model compared reasonably well with the measured values for the gaging stations on the Almagosa, Maulap, and Imong Rivers (tributaries to the Fena Valley Reservoir), with Nash-Sutcliffe efficiency values of 0.87 or higher. The southern Guam watershed model simulated the total volume of the critical dry season (January to May) streamflow for the entire simulation period within –0.54 percent at the Almagosa River, within 6.39 percent at the Maulap River, and within 6.06 percent at the Imong River. The recalibrated water-balance model of the Fena Valley Reservoir generally simulated monthly reservoir storage volume with reasonable accuracy. For the calibration and verification periods, errors in end-of-month reservoir-storage volume ranged from 6.04 percent (284.6 acre-feet or 92.7 million gallons) to –5.70 percent (–240.8 acre-feet or –78.5 million gallons). Monthly simulation bias ranged from –0.48 percent for the calibration period to 0.87 percent for the verification period; relative error ranged from –0.60 to 0.88 percent for the calibration and verification periods, respectively. The small bias indicated that the model did not consistently overestimate or underestimate reservoir storage volume. In the entirety of southern Guam, the watershed model has a “satisfactory” to “very good” rating when simulating monthly mean streamflow for all but one of the gaged watersheds during the verification period.
The southern Guam watershed model uses a more sophisticated climate-distribution scheme than the older model to make use of the sparse climate data, and includes updated land-cover parameters and the capability to simulate closed depression areas. The new Fena Valley Reservoir water-balance model is useful as an updated tool to forecast short-term changes in the surface-water resources of Guam. Furthermore, the now spatially complete southern Guam watershed model can be used to evaluate changes in streamflow and recharge owing to climate or land-cover changes. These are substantial improvements to the previous models of the Fena Valley watershed and Reservoir. Datasets associated with this report are available as a U.S. Geological Survey data release (Rosa and Hay, 2017; DOI:10.5066/F7HH6HV4).
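For readers unfamiliar with the Nash-Sutcliffe efficiency statistic cited above, a minimal sketch of its computation follows; a value of 1.0 indicates a perfect fit, and values at or below zero indicate a model no better than the mean of the observations. The example flows are synthetic, not Guam data.

    import numpy as np

    def nash_sutcliffe(simulated, observed):
        simulated, observed = np.asarray(simulated, float), np.asarray(observed, float)
        return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

    # example with made-up monthly flows
    print(nash_sutcliffe([10.2, 8.1, 5.5, 4.0], [10.0, 8.0, 6.0, 4.2]))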
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-07-01
This report provides an update of the New York State Energy Research and Development Authority (NYSERDA) program. The NYSERDA research and development program has five major areas: industry, buildings, energy resources, transportation, and environment. NYSERDA organizes projects within these five major areas based on energy use and supply, and end-use sectors. Therefore, issues such as waste management, energy products and renewable energy technologies are addressed in several areas of the program. The project descriptions presented are organized within the five program areas. Descriptions of projects completed during the period April 1, 1996, through March 31, 1997, including technology-transfer activities, are at the end of each subprogram section.
Regional groundwater flow model for C, K, L, and P reactor areas, Savannah River Site, Aiken, SC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G.P.
2000-02-11
A regional groundwater flow model encompassing approximately 100 mi2 surrounding the C, K, L, and P reactor areas has been developed. The reactor flow model is designed to meet the planning objectives outlined in the General Groundwater Strategy for Reactor Area Projects by providing a common framework for analyzing groundwater flow, contaminant migration and remedial alternatives within the Reactor Projects team of the Environmental Restoration Department. The model provides a quantitative understanding of groundwater flow on a regional scale within the near surface aquifers and deeper semi-confined to confined aquifers. The model incorporates historical and current field characterization data up through Spring 1999. Model preprocessing is automated so that future updates and modifications can be performed quickly and efficiently. The CKLP regional reactor model can be used to guide characterization, perform scoping analyses of contaminant transport, and serve as a common base for subsequent finer-scale transport and remedial/feasibility models for each reactor area.
NASA Astrophysics Data System (ADS)
Thirel, Guillaume; de Lavenne, Alban; Wagner, Jean-Pierre; Perrin, Charles; Gerlinger, Kai; Drogue, Gilles; Renard, Benjamin
2016-04-01
In recent years, several projects (Explore2070, FLOW MS, RheinBlick2050, VULNAR) studied the impact of climate change on the Rhine basin using CMIP3 projections, on either the French or the German side. These studies showed a likely decrease of low flows and a high uncertainty regarding the evolution of high flows. This may have substantial impacts on several aspects related to discharge, including pollution, flood protection, irrigation, river ecosystems and drinking water. Although these projects focused on the same basin (or parts of it), differences in the climate scenarios and models, the hydrological models and the study periods make their outcomes difficult to compare rigorously. The MOSARH21 project (MOselle-SArre-RHine discharge in the 21st century) was therefore built to update and homogenise discharge projections for the French tributaries of the Rhine basin. Two types of models were used: the physically oriented LARSIM model, which is widely used in Germany and was used in one of the previous projects (FLOW MS), and the semi-distributed conceptual GRSD model, tested on French catchments for various objectives. Through the use of these two hydrological models and multiple parameter sets obtained from various calibration runs, the structural and parametric uncertainties in the hydrological projections were quantified, as these tend to be neglected in climate change impact studies. The impact analysis focuses on low flows, high flows and regime. Although this study considers only the French tributaries of the Rhine, it should foster further cooperation on transboundary basins across Europe and contribute to better bases for the future definition of adaptation strategies between riverine countries.
NASA Astrophysics Data System (ADS)
Perna, L.; Pezzopane, M.; Pietrella, M.; Zolesi, B.; Cander, L. R.
2017-09-01
The SIRM model proposed by Zolesi et al. (1993, 1996) is an ionospheric regional model for predicting the vertical-sounding characteristics that has been frequently used in developing ionospheric web prediction services (Zolesi and Cander, 2014). Recently the model and its outputs were implemented in the framework of two European projects: DIAS (DIgital upper Atmosphere Server; http://www.iono.noa.gr/DIAS/ (Belehaki et al., 2005, 2015) and ESPAS (Near-Earth Space Data Infrastructure for e-Science; http://www.espas-fp7.eu/) (Belehaki et al., 2016). In this paper an updated version of the SIRM model, called SIRMPol, is described and corresponding outputs in terms of the F2-layer critical frequency (foF2) are compared with values recorded at the mid-latitude station of Rome (41.8°N, 12.5°E), for extremely high (year 1958) and low (years 2008 and 2009) solar activity. The main novelties introduced in the SIRMPol model are: (1) an extension of the Rome ionosonde input dataset that, besides data from 1957 to 1987, includes also data from 1988 to 2007; (2) the use of second order polynomial regressions, instead of linear ones, to fit the relation foF2 vs. solar activity index R12; (3) the use of polynomial relations, instead of linear ones, to fit the relations A0 vs. R12, An vs. R12 and Yn vs. R12, where A0, An and Yn are the coefficients of the Fourier analysis performed by the SIRM model to reproduce the values calculated by using relations in (2). The obtained results show that SIRMPol outputs are better than those of the SIRM model. As the SIRMPol model represents only a partial updating of the SIRM model based on inputs from only Rome ionosonde data, it can be considered a particular case of a single-station model. Nevertheless, the development of the SIRMPol model allowed getting some useful guidelines for a future complete and more accurate updating of the SIRM model, of which both DIAS and ESPAS could benefit.
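The kind of change described in point (2), replacing a linear regression of foF2 on R12 with a second-order polynomial, can be sketched as follows; the data points are synthetic, not Rome ionosonde values.

    import numpy as np

    R12 = np.array([5.0, 20.0, 50.0, 80.0, 110.0, 150.0, 180.0])
    foF2 = np.array([4.1, 4.9, 6.3, 7.6, 8.6, 9.8, 10.3])   # MHz, synthetic values

    linear = np.polynomial.Polynomial.fit(R12, foF2, deg=1)
    quadratic = np.polynomial.Polynomial.fit(R12, foF2, deg=2)

    for label, model in (("linear", linear), ("second-order", quadratic)):
        rms = np.sqrt(np.mean((foF2 - model(R12)) ** 2))
        print(f"{label}: RMS residual = {rms:.3f} MHz")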
NASA Astrophysics Data System (ADS)
Leskiw, Donald M.; Zhau, Junmei
2000-06-01
This paper reports on results from an ongoing project to develop methodologies for representing and managing multiple, concurrent levels of detail and enabling high performance computing using parallel arrays within distributed object-based simulation frameworks. At this time we present the methodology for representing and managing multiple, concurrent levels of detail and modeling accuracy by using a representation based on the Kalman approach for estimation. The Kalman System Model equations are used to represent model accuracy, Kalman Measurement Model equations provide transformations between heterogeneous levels of detail, and interoperability among disparate abstractions is provided using a form of the Kalman Update equations.
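Since the representation rests on the standard Kalman equations, a generic predict/update sketch is given below; the matrices are placeholders rather than the paper's system or measurement models.

    import numpy as np

    def kalman_predict(x, P, F, Q):
        # propagate state estimate x and covariance P through the system model F, Q
        return F @ x, F @ P @ F.T + Q

    def kalman_update(x, P, z, H, R):
        S = H @ P @ H.T + R                      # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        x = x + K @ (z - H @ x)                  # state update from measurement z
        P = (np.eye(P.shape[0]) - K @ H) @ P     # covariance update
        return x, P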
Update of the DTM thermosphere model in the framework of the H2020 project `SWAMI'
NASA Astrophysics Data System (ADS)
Bruinsma, S.; Jackson, D.; Stolle, C.; Negrin, S.
2017-12-01
In the framework of the H2020 project SWAMI (Space Weather Atmosphere Model and Indices), which is expected to start in January 2018, the CIRA thermosphere specification model DTM2013 will be improved through the combination of assimilating more density data to drive down remaining biases and a new high cadence kp geomagnetic index in order to improve storm-time performance. Five more years of GRACE high-resolution densities from 2012-2016, densities from the last year of the GOCE mission, Swarm mean densities, and mean densities from 2010-2017 inferred from the geodetic satellites at about 800 km are available now. The DTM2013 model will be compared with the new density data in order to detect possible systematic errors or other kinds of deficiencies and a first analysis will be presented. Also, a more detailed analysis of model performance under storm conditions will be provided, which will then be the benchmark to quantify model improvement expected with the higher cadence kp indices. In the SWAMI project, the DTM model will be coupled in the 120-160 km altitude region to the Met Office Unified Model in order to create a whole atmosphere model. It can be used for launch operations, re-entry computations, orbit prediction, and aeronomy and space weather studies. The project objectives and time line will be given.
NASA Astrophysics Data System (ADS)
Sutanudjaja, Edwin; van Beek, Rens; Winsemius, Hessel; Ward, Philip; Bierkens, Marc
2017-04-01
The Aqueduct Global Flood Analyzer, launched in 2015, is an open-access and free-of-charge web-based interactive platform which assesses and visualises current and future projections of river flood impacts across the globe. One of the key components in the Analyzer is a set of river flood inundation hazard maps derived from the global hydrological model simulation of PCR-GLOBWB. For the current version of the Analyzer, accessible on http://floods.wri.org/#/, the early generation of PCR-GLOBWB 1.0 was used and simulated at 30 arc-minute (~50 km at the equator) resolution. In this presentation, we will show the new version of these hazard maps. This new version is based on the latest version of PCR-GLOBWB 2.0 (https://github.com/UU-Hydro/PCR-GLOBWB_model, Sutanudjaja et al., 2016, doi:10.5281/zenodo.60764) simulated at 5 arc-minute (~10 km at the equator) resolution. The model simulates daily hydrological and water resource fluxes and storages, including the simulation of overbank volume that ends up on the floodplain (if flooding occurs). The simulation was performed for the present day situation (from 1960) and future climate projections (until 2099) using the climate forcing created in the ISI-MIP project. From the simulated flood inundation volume time series, we then extract annual maxima for each cell, and fit these maxima to a Gumbel extreme value distribution. This allows us to derive flood volume maps of any hazard magnitude (ranging from 2-year to 1000-year flood events) and for any time period (e.g. 1960-1999, 2010-2049, 2030-2069, and 2060-2099). The derived flood volumes (at 5 arc-minute resolution) are then spread over the high resolution terrain model using an updated GLOFRIS downscaling module (Winsemius et al., 2013, doi:10.5194/hess-17-1871-2013). The updated version performs a volume spreading sequentially from more upstream basins to downstream basins, hence enabling a better inclusion of smaller streams, and takes into account spreading of water over diverging deltaic regions. This results in a set of high resolution hazard maps of flood inundation depth at 30 arc-second (~1 km at the equator) resolution. Together with many other updates and new features, the resulting flood hazard maps will be used in the next generation of the Aqueduct Global Flood Analyzer.
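The per-cell extreme-value step described above (fitting annual maxima to a Gumbel distribution and reading off return-period levels) might look roughly like the following; the annual maxima here are synthetic, not PCR-GLOBWB output.

    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(42)
    annual_maxima = gumbel_r.rvs(loc=2.0e6, scale=5.0e5, size=40, random_state=rng)  # m^3, synthetic

    loc, scale = gumbel_r.fit(annual_maxima)
    for T in (2, 10, 100, 1000):                      # return periods in years
        level = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
        print(f"{T:>4}-year flood volume: {level:.3e} m^3")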
Venus Global Reference Atmospheric Model Status and Planned Updates
NASA Technical Reports Server (NTRS)
Justh, H. L.; Dwyer Cianciolo, A. M.
2017-01-01
The Venus Global Reference Atmospheric Model (Venus-GRAM) was originally developed in 2004 under funding from NASA's In Space Propulsion (ISP) Aerocapture Project to support mission studies at the planet. Many proposals, including NASA New Frontiers and Discovery, as well as other studies have used Venus-GRAM to design missions and assess system robustness. After Venus-GRAM's release in 2005, several missions to Venus have generated a wealth of additional atmospheric data, yet few model updates have been made to Venus-GRAM. This paper serves to address three areas: (1) to present the current status of Venus-GRAM, (2) to identify new sources of data and other upgrades that need to be incorporated to maintain Venus-GRAM credibility and (3) to identify additional Venus-GRAM options and features that could be included to increase its capability. This effort will depend on understanding the needs of the user community, obtaining new modeling data and establishing a dedicated funding source to support continual upgrades. This paper is intended to initiate discussion that can result in an upgraded and validated Venus-GRAM being available to future studies and NASA proposals.
Characterization of Orbital Debris via Hyper-Velocity Laboratory-Based Tests
NASA Technical Reports Server (NTRS)
Cowardin, Heather; Liou, J.-C.; Anz-Meador, Phillip; Sorge, Marlon; Opiela, John; Fitz-Coy, Norman; Huynh, Tom; Krisko, Paula
2017-01-01
Existing DOD and NASA satellite breakup models are based on a key laboratory test, Satellite Orbital debris Characterization Impact Test (SOCIT), which has supported many applications and matched on-orbit events involving older satellite designs reasonably well over the years. In order to update and improve these models, the NASA Orbital Debris Program Office, in collaboration with the Air Force Space and Missile Systems Center, The Aerospace Corporation, and the University of Florida, replicated a hypervelocity impact using a mock-up satellite, DebriSat, in controlled laboratory conditions. DebriSat is representative of present-day LEO satellites, built with modern spacecraft materials and construction techniques. Fragments down to 2 mm in size will be characterized by their physical and derived properties. A subset of fragments will be further analyzed in laboratory radar and optical facilities to update the existing radar-based NASA Size Estimation Model (SEM) and develop a comparable optical-based SEM. A historical overview of the project, status of the characterization process, and plans for integrating the data into various models will be discussed herein.
Characterization of Orbital Debris via Hyper-Velocity Laboratory-Based Tests
NASA Technical Reports Server (NTRS)
Cowardin, Heather; Liou, J.-C.; Krisko, Paula; Opiela, John; Fitz-Coy, Norman; Sorge, Marlon; Huynh, Tom
2017-01-01
Existing DoD and NASA satellite breakup models are based on a key laboratory test, Satellite Orbital debris Characterization Impact Test (SOCIT), which has supported many applications and matched on-orbit events involving older satellite designs reasonably well over the years. In order to update and improve these models, the NASA Orbital Debris Program Office, in collaboration with the Air Force Space and Missile Systems Center, The Aerospace Corporation, and the University of Florida, replicated a hypervelocity impact using a mock-up satellite, DebriSat, in controlled laboratory conditions. DebriSat is representative of present-day LEO satellites, built with modern spacecraft materials and construction techniques. Fragments down to 2 mm in size will be characterized by their physical and derived properties. A subset of fragments will be further analyzed in laboratory radar and optical facilities to update the existing radar-based NASA Size Estimation Model (SEM) and develop a comparable optical-based SEM. A historical overview of the project, status of the characterization process, and plans for integrating the data into various models will be discussed herein.
An affine projection algorithm using grouping selection of input vectors
NASA Astrophysics Data System (ADS)
Shin, JaeWook; Kong, NamWoong; Park, PooGyeon
2011-10-01
This paper presents an affine projection algorithm (APA) that uses grouping selection of input vectors. To improve the performance of the conventional APA, the proposed algorithm adjusts the number of input vectors using two procedures: a grouping procedure and a selection procedure. In the grouping procedure, input vectors that carry overlapping information for the update are grouped using the normalized inner product. Then, in the selection procedure, the few input vectors that carry enough information for the coefficient update are selected using the steady-state mean square error (MSE). Finally, the filter coefficients are updated using the selected input vectors. The experimental results show that the proposed algorithm has smaller steady-state estimation errors compared with existing algorithms.
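For context, a standard regularized affine projection update over the K most recent input vectors is sketched below; the paper's grouping and selection procedures are not reproduced, and the step size and regularization constant are illustrative.

    import numpy as np

    def apa_update(w, X, d, mu=0.5, delta=1e-3):
        # w: filter coefficients (L,); X: (L, K) matrix of the K latest input vectors;
        # d: (K,) desired outputs; returns the updated coefficients
        e = d - X.T @ w                                        # a priori errors
        gain = X @ np.linalg.solve(X.T @ X + delta * np.eye(X.shape[1]), e)
        return w + mu * gain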
Environmental Regulatory Update Table, January/February 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.
1995-03-01
The Environmental Regulatory Update Table provides information on regulatory initiatives impacting environmental, health, and safety management responsibilities. The table is updated bimonthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative, with an abstract and a projection of further action.
2009-04-13
Angelo Vourlidas, project scientist, Sun Earth Connection Coronal and Heliospheric Investigation, at the Naval Research Laboratory makes a comment during a Science Update on the STEREO mission at NASA Headquarters in Washington, Tuesday, April 14, 2009. Photo Credit: (NASA/Paul E. Alers)
Update on Supersonic Jet Noise Research at NASA
NASA Technical Reports Server (NTRS)
Henderson, Brenda
2010-01-01
An update on jet noise research conducted in the Fundamental Aeronautics and Integrated Systems Research Programs was presented. Highlighted research projects included those focused on the development of prediction tools, diagnostic tools, and noise reduction concepts.
ODOT research news : winter quarter 2003.
DOT National Transportation Integrated Search
2003-01-01
The newsletter includes: 1) Cracked Bridges; 2) Research Outreach; 3) LTPP Update: A Long Shot Pays Off; 4) Railroad Crossing Intrusion Detection Update; 5) Guiding Drivers through Work Zones; 6) New Projects to start in July; and other...
A sensitivity analysis of "Forests on the Edge: Housing Development on America's Private Forests."
Eric M. White; Ralph J. Alig; Lisa G. Mahal; David M. Theobald
2009-01-01
The original Forests on the Edge report (FOTE 1) indicated that 44.2 million acres of private forest land was projected to experience substantial increases in residential development in the coming decades. In this study, we examined the sensitivity of the FOTE 1 results to four factors: (1) use of updated private land and forest cover spatial data and a revised model...
Optimization and Control of Burning Plasmas Through High Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pankin, Alexei
This project has revived the FACETS code, which was developed under SciDAC funding in 2008-2012. The code had been dormant for a number of years after the SciDAC funding stopped. FACETS depends on external packages. The external packages and libraries such as PETSc, FFTW, HDF5 and NETCDF that are included in FACETS have evolved during these years. Some packages in FACETS are also parts of other codes such as PlasmaState, NUBEAM, GACODES, and UEDGE. These packages have also evolved together with their host codes, which include TRANSP, TGYRO and XPTOR. Finally, there is also a set of packages in FACETS that are being developed and maintained by Tech-X. These packages include BILDER, SciMake, and FcioWrappers. Many of these packages evolved significantly during the last several years, and FACETS had to be updated to synchronize with the recent progress in the external packages. The PI has introduced new changes to the BILDER package to support the updated interfaces to the external modules. During the last year of the project, the FACETS version of the UEDGE code was extracted from FACETS as a standalone package. The PI collaborates with the scientists from LLNL on the updated UEDGE model in FACETS. Drs. T. Rognlien, M. Umansky and A. Dimits from LLNL are contributing to this task.
NASA Technical Reports Server (NTRS)
Cox, T. H.; Gilyard, G. B.
1986-01-01
The drones for aerodynamic and structural testing (DAST) project was designed to control flutter actively at high subsonic speeds. Accurate knowledge of the structural model was critical for the successful design of the control system. A ground vibration test was conducted on the DAST vehicle to determine the structural model characteristics. This report presents and discusses the vibration and test equipment, the test setup and procedures, and the antisymmetric and symmetric mode shape results. The modal characteristics were subsequently used to update the structural model employed in the control law design process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
BEVINS, R.R.
This study is a requirements document that presents the analysis for the functional description of the master pump shutdown system. This document identifies the sources of the requirements and/or how these were derived. Each requirement is validated either by quoting the source or through an analysis process involving the required functionality, performance characteristics, operations input or engineering judgment. The requirements in this study apply to the first phase of the W314 Project. This document has been updated during the definitive design portion of the first phase of the W314 Project to capture additional software requirements and is planned to be updated during the second phase of the W314 Project to cover the second phase of the project's scope.
Improved annotation with de novo transcriptome assembly in four social amoeba species.
Singh, Reema; Lawal, Hajara M; Schilde, Christina; Glöckner, Gernot; Barton, Geoffrey J; Schaap, Pauline; Cole, Christian
2017-01-31
Annotation of gene models and transcripts is a fundamental step in genome sequencing projects. Often this is performed with automated prediction pipelines, which can miss complex and atypical genes or transcripts. RNA sequencing (RNA-seq) data can aid the annotation with empirical data. Here we present de novo transcriptome assemblies generated from RNA-seq data in four Dictyostelid species: D. discoideum, P. pallidum, D. fasciculatum and D. lacteum. The assemblies were incorporated with existing gene models to determine corrections and improvements on a whole-genome scale. This is the first time this has been performed in these eukaryotic species. An initial de novo transcriptome assembly was generated by Trinity for each species and then refined with the Program to Assemble Spliced Alignments (PASA). The completeness and quality were assessed with the Benchmarking Universal Single-Copy Orthologs (BUSCO) and Transrate tools at each stage of the assemblies. The final datasets of 11,315-12,849 transcripts contained 5,610-7,712 updates and corrections to >50% of existing gene models, including changes to hundreds or thousands of protein products. Putative novel genes were also identified, and alternative splice isoforms were observed for the first time in P. pallidum, D. lacteum and D. fasciculatum. In taking a whole-transcriptome approach to genome annotation with empirical data, we have been able to enrich the annotations of four existing genome sequencing projects. In doing so we have identified updates to the majority of the gene annotations across all four species under study and found putative novel genes and transcripts that could be worthy of follow-up. The new transcriptome data we present here will be a valuable resource for genome curators in the Dictyostelia, and we propose this effective methodology for use in other genome annotation projects.
NASA Technical Reports Server (NTRS)
Butler, Doug; Bauman, David; Johnson-Throop, Kathy
2011-01-01
The Integrated Medical Model (IMM) Project has been developing a probabilistic risk assessment tool, the IMM, to help evaluate in-flight crew health needs and impacts to the mission due to medical events. This package is a follow-up to a data package provided in June 2009. The IMM currently represents 83 medical conditions and associated ISS resources required to mitigate medical events. IMM end state forecasts relevant to the ISS PRA model include evacuation (EVAC) and loss of crew life (LOCL). The current version of the IMM provides the basis for the operational version of IMM expected in the January 2011 timeframe. The objectives of this data package are: 1. To provide a preliminary understanding of medical risk data used to update the ISS PRA Model. The IMM has had limited validation and an initial characterization of maturity has been completed using NASA STD 7009 Standard for Models and Simulation. The IMM has been internally validated by IMM personnel but has not been validated by an independent body external to the IMM Project. 2. To support a continued dialogue between the ISS PRA and IMM teams. To ensure accurate data interpretation, and that IMM output format and content meets the needs of the ISS Risk Management Office and ISS PRA Model, periodic discussions are anticipated between the risk teams. 3. To help assess the differences between the current ISS PRA and IMM medical risk forecasts of EVAC and LOCL. Follow-on activities are anticipated based on the differences between the current ISS PRA medical risk data and the latest medical risk data produced by IMM.
Recent Updates of A Multi-Phase Transport (AMPT) Model
NASA Astrophysics Data System (ADS)
Lin, Zi-Wei
2008-10-01
We will present recent updates to the AMPT model, a Monte Carlo transport model for high energy heavy ion collisions, since its first public release in 2004 and the corresponding detailed descriptions in Phys. Rev. C 72, 064901 (2005). The updates often result from user requests. Some of these updates expand the physics processes or descriptions in the model, while some updates improve the usability of the model such as providing the initial parton distributions or help avoid crashes on some operating systems. We will also explain how the AMPT model is being maintained and updated.
DNAPL Remediation: Selected Projects Approaching Regulatory Closure
This paper is a status update on the use of DNAPL source reduction remedial technologies, and provides information about recent projects where regulatory closure has been reached or projects are approaching regulatory closure, following source reduction.
2009-04-13
Michael Kaiser, project scientist, Solar Terrestrial Relations Observatory (STEREO) at Goddard Space Flight Center, left, makes a point during a Science Update on the STEREO mission at NASA Headquarters in Washington, Tuesday, April 14, 2009, as Angelo Vourlidas, project scientist, Sun Earth Connection Coronal and Heliospheric Investigation, at the Naval Research Laboratory, Toni Galvin, principal investigator, Plasma and Superthermal Ion Composition instrument at the University of New Hampshire and Madhulika Guhathakurta, STEREO program scientist, right, look on. Photo Credit: (NASA/Paul E. Alers)
Updating National Topographic Data Base Using Change Detection Methods
NASA Astrophysics Data System (ADS)
Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.
2016-06-01
The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects that are based on community volunteers, such as OSM, update their databases every day based on crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major areas of interest while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing algorithms and computer vision, together with digital aerial cameras with a NIR band and Very High Resolution satellites, have allowed the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step towards updating the NTDB at the Survey of Israel.
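As a minimal sketch of the Digital Surface Model analysis step mentioned above (and only that step; the multispectral classification, segmentation and shape-forming stages are not shown), the snippet below flags candidate change regions by differencing two co-registered DSM rasters. The height and minimum-area thresholds are illustrative placeholders, not values from the Survey of Israel workflow.

```python
import numpy as np
from scipy import ndimage

def candidate_changes(dsm_old, dsm_new, dz_thresh=2.5, min_pixels=20):
    """Boolean mask of candidate change regions from two co-registered DSMs.
    A pixel is anomalous if its height changed by more than dz_thresh metres;
    isolated blobs smaller than min_pixels are discarded as likely noise."""
    dz = dsm_new - dsm_old
    mask = np.abs(dz) > dz_thresh                    # per-pixel height anomaly
    labels, n = ndimage.label(mask)                  # group pixels into blobs
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    big = 1 + np.flatnonzero(sizes >= min_pixels)    # labels of blobs to keep
    return np.isin(labels, big)
```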
Guidelines and Recommendations to Accommodate Older Drivers and Pedestrians
DOT National Transportation Integrated Search
2001-05-01
This project updated, revised, and expanded the scope of the "Older Driver Highway Design Handbook" published by the Federal Highway Administration (FHWA) in 1998. Development of the updated Handbook (FHWA-RD-01-103) was complemented by a technology ...
Update LADOTD policy on pile driving vibration management.
DOT National Transportation Integrated Search
2012-02-01
The main objective of this project was to update the current Louisiana Department of Transportation and Development (LADOTD) policy on pile driving vibration risk management with a focus on how to determine an appropriate vibration monitoring area. T...
Guidelines and Recommendations to Accommodate Older Drivers and Pedestrians
DOT National Transportation Integrated Search
2001-10-01
This project updated, revised, and expanded the scope of the "Older Driver Highway Design Handbook" published by the Federal Highway Administration (FHWA) in 1998. Development of the updated Handbook (FHWA-RD-01-103) was complemented by a technology ...
Progress Update: Stack Project Complete
Cody, Tom
2017-12-12
Progress update from the Savannah River Site. The 75 foot 293 F Stack, built for plutonium production, was cut down to size in order to prevent injury or release of toxic material if the structure were to collapse due to harsh weather.
Some safe and sensible shortcuts for efficiently upscaled updates of existing elevation models.
NASA Astrophysics Data System (ADS)
Knudsen, Thomas; Aasbjerg Nielsen, Allan
2013-04-01
The Danish national elevation model, DK-DEM, was introduced in 2009 and is based on LiDAR data collected in the time frame 2005-2007. Hence, DK-DEM is aging, and it is time to consider how to integrate new data with the current model in a way that improves the representation of new landscape features, while still preserving the overall (very high) quality of the model. In LiDAR terms, 2005 is equivalent to some time between the palaeolithic and the neolithic. So evidently, when (and if) an update project is launched, we may expect some notable improvements due to the technical and scientific developments from the last half decade. To estimate the magnitude of these potential improvements, and to devise efficient and effective ways of integrating the new and old data, we currently carry out a number of case studies based on comparisons between the current terrain model (with a ground sample distance, GSD, of 1.6 m), and a number of new high resolution point clouds (10-70 points/m2). Not knowing anything about the terms of a potential update project, we consider multiple scenarios ranging from business as usual: A new model with the same GSD, but improved precision, to aggressive upscaling: A new model with 4 times better GSD, i.e. a 16-fold increase in the amount of data. Especially in the latter case speeding up the gridding process is important. Luckily recent results from one of our case studies reveal that for very high resolution data in smooth terrain (which is the common case in Denmark), using local mean (LM) as grid value estimator is only negligibly worse than using the theoretically "best" estimator, i.e. ordinary kriging (OK) with rigorous modelling of the semivariogram. The bias in a leave one out cross validation differs on the micrometer level, while the RMSE differs on the 0.1 mm level. This is fortunate, since a LM estimator can be implemented in plain stream mode, letting the points from the unstructured point cloud (i.e. no TIN generation) stream through the processor, individually contributing to the nearest grid posts in a memory mapped grid file. Algorithmically this is very efficient, but it would be even more efficient if we did not have to handle so much data. Another of our recent case studies focuses on this. The basic idea is to ignore data that does not tell us anything new. We do this by looking at anomalies between the current height model and the new point cloud, then computing a correction grid for the current model. Points with insignificant anomalies are simply removed from the point cloud, and the correction grid is computed using the remaining point anomalies only. Hence, we only compute updates in areas of significant change, speeding up the process, and giving us new insight of the precision of the current model which in turn results in improved metadata for both the current and the new model. Currently we focus on simple approaches for creating a smooth update process for integration of heterogeneous data sets. On the other hand, as years go by and multiple generations of data become available, more advanced approaches will probably become necessary (e.g. a multi campaign bundle adjustment, improving the oldest data using cross-over adjustment with newer campaigns). But to prepare for such approaches, it is important already now to organize and evaluate the ancillary (GPS, INS) and engineering level data for the current data sets. 
This is essential if future generations of DEM users are to benefit from future conceptions of "some safe and sensible shortcuts for efficiently upscaled updates of existing elevation models".
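A rough sketch of the two ideas described above is given below: single-pass local-mean (LM) gridding of an unstructured point stream, and discarding points whose anomaly against the current DEM is insignificant so that only a sparse correction grid needs to be computed. The grid geometry, the anomaly tolerance and the cell-assignment rule are simplifying assumptions, not the project's actual parameters.

```python
import numpy as np

def stream_local_mean(points, x0, y0, gsd, nx, ny):
    """Single-pass local-mean gridding: each point contributes to the cell it
    falls in (a simplification of nearest-grid-post assignment); no TIN is
    built. points is any iterable of (x, y, z). Empty cells return NaN."""
    zsum = np.zeros((ny, nx))
    count = np.zeros((ny, nx), dtype=int)
    for x, y, z in points:
        j = int((x - x0) // gsd)
        i = int((y - y0) // gsd)
        if 0 <= i < ny and 0 <= j < nx:
            zsum[i, j] += z
            count[i, j] += 1
    return np.where(count > 0, zsum / np.maximum(count, 1), np.nan)

def significant_anomalies(points, dem, x0, y0, gsd, tol=0.10):
    """Keep only points that differ from the current DEM by more than tol
    metres; these alone feed the correction grid, the rest are discarded."""
    kept = []
    for x, y, z in points:
        j = int((x - x0) // gsd)
        i = int((y - y0) // gsd)
        if (0 <= i < dem.shape[0] and 0 <= j < dem.shape[1]
                and abs(z - dem[i, j]) > tol):
            kept.append((x, y, z))
    return kept
```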
Linking_Learning: Migrant Education Technology Projects, 1999.
ERIC Educational Resources Information Center
Carson, Nancy
1999-01-01
The two issues of Linking_Learning published in 1999 update the education community and others regarding six migrant education technology projects funded by the U.S. Department of Education. The projects are the Anchor School Project, InTime (Integrating Technology into Migrant Education), MECHA, KMTP (Kentucky Migrant Technology Project),…
Production and Uses of Multi-Decade Geodetic Earth Science Data Records
NASA Astrophysics Data System (ADS)
Bock, Y.; Kedar, S.; Moore, A. W.; Fang, P.; Liu, Z.; Sullivan, A.; Argus, D. F.; Jiang, S.; Marshall, S. T.
2017-12-01
The Solid Earth Science ESDR System (SESES) project funded under the NASA MEaSUREs program produces and disseminates mature, long-term, calibrated and validated, GNSS-based Earth Science Data Records (ESDRs) that encompass multiple diverse areas of interest in Earth Science, such as tectonic motion, transient slip and earthquake dynamics, as well as meteorology, climate, and hydrology. The ESDRs now span twenty-five years for the earliest stations and today are available for thousands of global and regional stations. Using a unified metadata database and a combination of GNSS solutions generated by two independent analysis centers, the project currently produces four long-term ESDRs: Geodetic Displacement Time Series: daily, combined, cleaned and filtered GIPSY and GAMIT long-term time series of continuous GPS station positions (global and regional) in the latest version of ITRF, automatically updated weekly. Geodetic Velocities: weekly updated velocity field and velocity field histories in various reference frames; a compendium of all model parameters including the earthquake catalog, coseismic offsets, and postseismic model parameters (exponential or logarithmic). Troposphere Delay Time Series: long-term time series of troposphere delay (30-min resolution) at geodetic stations, necessarily estimated during position time series production and automatically updated weekly. Seismogeodetic records for historic earthquakes: high-rate broadband displacement and seismic velocity time series combining 1 Hz GPS displacements and 100 Hz accelerometer data for select large earthquakes and collocated cGPS and seismic instruments from regional networks. We present several recent notable examples of the ESDRs' usage: a transient slip study that uses the combined position time series to unravel "tremor-less" slow tectonic transient events; fault geometry determination from geodetic slip rates; changes in water resources across California's physiographic provinces at a spatial resolution of 75 km; and a retrospective study of a southern California summer monsoon event.
Kozar, Mark D.; Kahle, Sue C.
2013-01-01
This report documents the standard procedures, policies, and field methods used by the U.S. Geological Survey’s (USGS) Washington Water Science Center staff for activities related to the collection, processing, analysis, storage, and publication of groundwater data. This groundwater quality-assurance plan changes through time to accommodate new methods and requirements developed by the Washington Water Science Center and the USGS Office of Groundwater. The plan is based largely on requirements and guidelines provided by the USGS Office of Groundwater, or the USGS Water Mission Area. Regular updates to this plan represent an integral part of the quality-assurance process. Because numerous policy memoranda have been issued by the Office of Groundwater since the previous groundwater quality assurance plan was written, this report is a substantial revision of the previous report, supplants it, and contains significant additional policies not covered in the previous report. This updated plan includes information related to the organization and responsibilities of USGS Washington Water Science Center staff, training, safety, project proposal development, project review procedures, data collection activities, data processing activities, report review procedures, and archiving of field data and interpretative information pertaining to groundwater flow models, borehole aquifer tests, and aquifer tests. Important updates from the previous groundwater quality assurance plan include: (1) procedures for documenting and archiving of groundwater flow models; (2) revisions to procedures and policies for the creation of sites in the Groundwater Site Inventory database; (3) adoption of new water-level forms to be used within the USGS Washington Water Science Center; (4) procedures for future creation of borehole geophysics, surface geophysics, and aquifer-test archives; and (5) use of the USGS Multi Optional Network Key Entry System software for entry of routine water-level data collected as part of long-term water-level monitoring networks.
Environmental Regulatory Update Table, October 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1991-11-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, December 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1992-01-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, January--February 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1993-03-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental regulatory update table, March--April 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.
1994-03-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental regulatory update table, September--October 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Lewis, E.B.
1992-11-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, September/October 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1993-11-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, January--February 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1994-03-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental regulatory update table November--December 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.
1995-01-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, November--December 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Lewis, E.B.
1993-01-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, July--August 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Lewis, E.B.
1992-09-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, September 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1991-10-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table July/August 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1993-09-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, August 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1991-09-01
This Environmental Regulatory Update Table (August 1991) provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, November 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1991-12-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, March/April 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1993-05-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bimonthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental regulatory update table, July/August 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.
1994-09-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, November--December 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1994-01-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, March/April 1993. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1993-05-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bimonthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, May/June 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1993-07-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bimonthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, May--June 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.
1994-07-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bimonthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
NASA Astrophysics Data System (ADS)
Adamson, E. T.; Pizzo, V. J.; Biesecker, D. A.; Mays, M. L.; MacNeice, P. J.; Taktakishvili, A.; Viereck, R. A.
2017-12-01
In 2011, NOAA's Space Weather Prediction Center (SWPC) transitioned the world's first operational space weather model into use at the National Weather Service's Weather and Climate Operational Supercomputing System (WCOSS). This operational forecasting tool comprises the Wang-Sheeley-Arge (WSA) solar wind model coupled with the Enlil heliospheric MHD model. Relying on daily-updated photospheric magnetograms produced by the National Solar Observatory's Global Oscillation Network Group (GONG), this tool provides critical predictive knowledge of heliospheric dynamics such as high speed streams and coronal mass ejections. With the goal of advancing this predictive model and quantifying progress, SWPC and NASA's Community Coordinated Modeling Center (CCMC) have initiated a collaborative effort to assess improvements in space weather forecasts at Earth by moving from a single daily-updated magnetogram to a sequence of time-dependent magnetograms to drive the ambient inputs for the WSA-Enlil model, as well as incorporating the newly developed Air Force Data Assimilative Photospheric Flux Transport (ADAPT) model. We will provide a detailed overview of the scope of this effort and discuss preliminary results from the first phase, focusing on the impact of time-dependent magnetogram inputs to the WSA-Enlil model.
Flanagan, Meghan R; Foster, Carolyn C; Schleyer, Anneliese; Peterson, Gene N; Mandell, Samuel P; Rudd, Kristina E; Joyner, Byron D; Payne, Thomas H
2016-02-01
House staff quality improvement projects are often not aligned with training institution priorities. House staff are the primary users of inpatient problem lists in academic medical centers, and list maintenance has significant patient safety and financial implications. Improvement of the problem list is an important objective for hospitals with electronic health records under the Meaningful Use program. House staff surveys were used to create an electronic problem list manager (PLM) tool enabling efficient problem list updating. Number of new problems added and house staff perceptions of the problem list were compared before and after PLM intervention. The PLM was used by 654 house staff after release. Surveys demonstrated increased problem list updating (P = .002; response rate 47%). Mean new problems added per day increased from 64 pre-PLM to 125 post-PLM (P < .001). This innovative project serves as a model for successful engagement of house staff in institutional quality and safety initiatives with tangible institutional benefits. Copyright © 2016 Elsevier Inc. All rights reserved.
Jones, Michael L.; Brenden, Travis O.; Irwin, Brian J.
2015-01-01
The St. Marys River (SMR) historically has been a major producer of sea lampreys (Petromyzon marinus) in the Laurentian Great Lakes. In the early 2000s, a decision analysis (DA) project was conducted to evaluate sea lamprey control policies for the SMR; this project suggested that an integrated policy of trapping, sterile male releases, and Bayluscide treatment was the most cost-effective policy. Further, it concluded that formal assessment of larval sea lamprey abundance and distribution in the SMR would be valuable for future evaluation of control strategies. We updated this earlier analysis, adding information from annual larval assessments conducted since 1999 and evaluating additional control policies. Bayluscide treatments continued to be critical for sea lamprey control, but high recruitment compensation minimized the effectiveness of trapping and sterile male release under current feasible ranges. Because Bayluscide control is costly, development of strategies to enhance trapping success remains a priority. This study illustrates benefits of an adaptive management cycle, wherein models inform decisions, are updated based on learning achieved from those decisions, and ultimately inform future decisions.
Gradient optimization of finite projected entangled pair states
NASA Astrophysics Data System (ADS)
Liu, Wen-Yuan; Dong, Shao-Jun; Han, Yong-Jian; Guo, Guang-Can; He, Lixin
2017-05-01
Projected entangled pair states (PEPS) methods have been proven to be powerful tools for solving strongly correlated quantum many-body problems in two dimensions. However, due to the high computational scaling with the virtual bond dimension D, in practical applications PEPS are often limited to rather small bond dimensions, which may not be large enough for some highly entangled systems, for instance, frustrated systems. Optimizing the ground state with the imaginary time evolution method and a simple update scheme can reach larger bond dimensions; however, the accuracy of the rough approximation to the environment of the local tensors is questionable. Here, we demonstrate that combining the imaginary time evolution method with a simple update, Monte Carlo sampling techniques, and gradient optimization offers an efficient method to calculate the PEPS ground state. By taking advantage of massive parallel computing, we can study quantum systems with larger bond dimensions up to D = 10 without resorting to any symmetry. Benchmark tests of the method on the J1-J2 model give impressive accuracy compared with exact results.
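The gradient step in such schemes is usually built from Monte Carlo estimates of the energy and of its derivative with respect to the variational parameters. As a deliberately tiny, hedged illustration of that idea (a one-parameter Gaussian trial state for the 1D harmonic oscillator, not a PEPS), the sketch below estimates E(a) and dE/da by sampling |psi_a|^2 and then performs plain gradient descent; the step size and sample count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def energy_and_grad(a, nsamples=50_000):
    """Monte Carlo estimate of E(a) = <psi_a|H|psi_a>/<psi_a|psi_a> and dE/da
    for psi_a(x) = exp(-a x^2) with H = -1/2 d^2/dx^2 + 1/2 x^2.
    |psi_a|^2 is Gaussian, so it is sampled directly (no Metropolis needed)."""
    x = rng.normal(0.0, np.sqrt(1.0 / (4.0 * a)), nsamples)
    e_loc = a + (0.5 - 2.0 * a**2) * x**2            # local energy E_L(x)
    o = -x**2                                        # d ln(psi_a) / da
    energy = e_loc.mean()
    grad = 2.0 * ((e_loc * o).mean() - energy * o.mean())
    return energy, grad

a, lr = 0.3, 0.2
for _ in range(30):                                  # gradient descent on E(a)
    energy, grad = energy_and_grad(a)
    a -= lr * grad
print(a, energy)   # a -> 0.5 and E -> 0.5, the exact ground state
```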
Situation Model Updating in Young and Older Adults: Global versus Incremental Mechanisms
Bailey, Heather R.; Zacks, Jeffrey M.
2015-01-01
Readers construct mental models of situations described by text. Activity in narrative text is dynamic, so readers must frequently update their situation models when dimensions of the situation change. Updating can be incremental, such that a change leads to updating just the dimension that changed, or global, such that the entire model is updated. Here, we asked whether older and young adults make differential use of incremental and global updating. Participants read narratives containing changes in characters and spatial location and responded to recognition probes throughout the texts. Responses were slower when probes followed a change, suggesting that situation models were updated at changes. When either dimension changed, responses to probes for both dimensions were slowed; this provides evidence for global updating. Moreover, older adults showed stronger evidence of global updating than did young adults. One possibility is that older adults perform more global updating to offset reduced ability to manipulate information in working memory. PMID:25938248
DOE Office of Scientific and Technical Information (OSTI.GOV)
BEVINS, R.R.
This document has been updated during the definitive design portion of the first phase of the W-314 Project to capture additional software requirements and is planned to be updated during the second phase of the W-314 Project to cover the second phase of the Project's scope. The objective is to provide requirement traceability by recording the analysis/basis for the functional descriptions of the master pump shutdown system. This document identifies the sources of the requirements and/or how these were derived. Each requirement is validated either by quoting the source or through an analysis process involving the required functionality, performance characteristics, operations input or engineering judgment.
2015 Project Portfolio: Solid-State Lighting
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
2015-01-30
Overview of SSL projects currently funded by DOE, and those previously funded but since completed. Each profile includes a brief technical description, as well as information about project partners, funding, and the research period. This report is updated annually.
2017 Project Portfolio: Solid-State Lighting
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2017-01-02
Overview of SSL projects currently funded by DOE, and those previously funded but since completed. Each profile includes a brief technical description, as well as information about project partners, funding, and the research period. This report is updated annually.
Guidelines and recommendations to accommodate older drivers and pedestrians
DOT National Transportation Integrated Search
2001-05-01
This project updated, revised, and expanded the scope of the Older Driver Highway Design Handbook published by FHWA in 1998. Development of the updated Handbook (FHWA-RD-01-103) was complemented by a technology transfer initiative to make practitione...
AN EVALUATION OF HANFORD SITE TANK FARM SUBSURFACE CONTAMINATION FY2007
DOE Office of Scientific and Technical Information (OSTI.GOV)
MANN, F.M.
2007-07-10
The Tank Farm Vadose Zone (TFVZ) Project conducts activities to characterize and analyze the long-term environmental and human health impacts from tank waste releases to the vadose zone. The project also implements interim measures to mitigate impacts, and plans the remediation of waste releases from tank farms and associated facilities. The scope of this document is to report data needs that are important to estimating long-term human health and environmental risks. The scope does not include technologies needed to remediate contaminated soils and facilities, technologies needed to close tank farms, or management and regulatory decisions that will impact remediation and closure. This document is an update of "A Summary and Evaluation of Hanford Site Tank Farm Subsurface Contamination". That 1998 document summarized knowledge of subsurface contamination beneath the tank farms at the time. It included a preliminary conceptual model for migration of tank wastes through the vadose zone and an assessment of data and analysis gaps needed to update the conceptual model. This document provides a status of the data and analysis gaps previously defined and a discussion of the gaps and needs that currently exist to support the stated mission of the TFVZ Project. The first data-gaps document provided the basis for TFVZ Project activities over the previous eight years. Fourteen of the nineteen knowledge gaps identified in the previous document have been investigated to the point that the project defines the current status as acceptable. In the process of filling these gaps, significant accomplishments were made in field work and characterization, laboratory investigations, modeling, and implementation of interim measures. The current data gaps are organized in groups that reflect components of the tank farm vadose zone conceptual model: inventory, release, recharge, geohydrology, geochemistry, and modeling. The inventory and release components address residual wastes that will remain in the tanks and tank-farm infrastructure after closure and potential losses from leaks during waste retrieval. Recharge addresses the impacts of current conditions in the tank farms (i.e., gravel covers that affect infiltration and recharge) as well as the impacts of surface barriers. The geohydrology and geochemistry components address the extent of the existing subsurface contaminant inventory and the drivers and pathways for contaminants to be transported through the vadose zone and groundwater. Geochemistry addresses the mobility of key reactive contaminants such as uranium. Modeling addresses conceptual models and how they are simulated in computers. The data gaps will be used to provide input to planning (including the upcoming C Farm Data Quality Objective meetings scheduled this year).
The Lunar Mapping and Modeling Project Update
NASA Technical Reports Server (NTRS)
Noble, S.; French, R.; Nall, M.; Muery, K.
2010-01-01
The Lunar Mapping and Modeling Project (LMMP) is managing the development of a suite of lunar mapping and modeling tools and data products that support lunar exploration activities, including the planning, design, development, test, and operations associated with crewed and/or robotic operations on the lunar surface. In addition, LMMP should prove to be a convenient and useful tool for scientific analysis and for education and public outreach (E/PO) activities. LMMP will utilize data predominately from the Lunar Reconnaissance Orbiter, but also historical and international lunar mission data (e.g. Lunar Prospector, Clementine, Apollo, Lunar Orbiter, Kaguya, and Chandrayaan-1) as available and appropriate. LMMP will provide such products as image mosaics, DEMs, hazard assessment maps, temperature maps, lighting maps and models, gravity models, and resource maps. We are working closely with the LRO team to prevent duplication of efforts and ensure the highest quality data products. A beta version of the LMMP software was released for limited distribution in December 2009, with the public release of version 1 expected in the Fall of 2010.
2009-04-13
Michael Kaiser, project scientist, Solar Terrestrial Relations Observatory (STEREO) at Goddard Space Flight Center, left, makes a comment during a Science Update on the STEREO mission at NASA Headquarters in Washington, Tuesday, April 14, 2009, as Angelo Vourlidas, project scientist, Sun Earth Connection Coronal and Heliospheric Investigation, at the Naval Research Laboratory, second from left, Toni Galvin, principal investigator, Plasma and Superthermal Ion Composition instrument at the University of New Hampshire and Madhulika Guhathakurta, STEREO program scientist, right, look on. Photo Credit: (NASA/Paul E. Alers)
2009-04-13
Angelo Vourlidas, project scientist, Sun Earth Connection Coronal and Heliospheric Investigation, at the Naval Research Laboratory, second from left, makes a comment during a Science Update on the STEREO mission at NASA Headquarters in Washington, Tuesday, April 14, 2009, as Michael Kaiser, project scientist, Solar Terrestrial Relations Observatory (STEREO) at Goddard Space Flight Center, left, Toni Galvin, principal investigator, Plasma and Superthermal Ion Composition instrument at the University of New Hampshire and Madhulika Guhathakurta, STEREO program scientist, right, look on. Photo Credit: (NASA/Paul E. Alers)
SAM Photovoltaic Model Technical Reference 2016 Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilman, Paul; DiOrio, Nicholas A; Freeman, Janine M
This manual describes the photovoltaic performance model in the System Advisor Model (SAM) software, Version 2016.3.14 Revision 4 (SSC Version 160). It is an update to the 2015 edition of the manual, which describes the photovoltaic model in SAM 2015.1.30 (SSC 41). This new edition includes corrections of errors in the 2015 edition and descriptions of new features introduced in SAM 2016.3.14, including: a 3D shade calculator; a battery storage model; DC power optimizer loss inputs; a snow loss model; a plane-of-array irradiance input option from the weather file; support for sub-hourly simulations; self-shading that works with all four subarrays and uses the same algorithm for fixed arrays and one-axis tracking; a linear self-shading algorithm for thin-film modules; and loss percentages that replace derate factors. The photovoltaic performance model is one of the modules in the SAM Simulation Core (SSC), which is part of both SAM and the SAM SDK. SAM is a user-friendly desktop application for analysis of renewable energy projects. The SAM SDK (Software Development Kit) is for developers writing their own renewable energy analysis software based on SSC. This manual is written for users of both SAM and the SAM SDK who want to learn more about the details of SAM's photovoltaic model.
A Geo-referenced 3D model of the Juan de Fuca Slab and associated seismicity
Blair, J.L.; McCrory, P.A.; Oppenheimer, D.H.; Waldhauser, F.
2011-01-01
We present a Geographic Information System (GIS) of a new 3-dimensional (3D) model of the subducted Juan de Fuca Plate beneath western North America and associated seismicity of the Cascadia subduction system. The geo-referenced 3D model was constructed from weighted control points that integrate depth information from hypocenter locations and regional seismic velocity studies. We used the 3D model to differentiate earthquakes that occur above the Juan de Fuca Plate surface from earthquakes that occur below the plate surface. This GIS project of the Cascadia subduction system supersedes the one previously published by McCrory and others (2006). Our new slab model updates the model with new constraints. The most significant updates to the model include: (1) weighted control points to incorporate spatial uncertainty, (2) an additional gridded slab surface based on the Generic Mapping Tools (GMT) Surface program which constructs surfaces based on splines in tension (see expanded description below), (3) double-differenced hypocenter locations in northern California to better constrain slab location there, and (4) revised slab shape based on new hypocenter profiles that incorporate routine depth uncertainties as well as data from new seismic-reflection and seismic-refraction studies. We also provide a 3D fly-through animation of the model for use as a visualization tool.
Characterizing DebriSat Fragments: So Many Fragments, So Much Data, and So Little Time
NASA Technical Reports Server (NTRS)
Shiotani, B.; Rivero, M.; Carrasquilla, M.; Allen, S.; Fitz-Coy, N.; Liou, J.-C.; Huynh, T.; Sorge, M.; Cowardin, H.; Opiela, J.;
2017-01-01
To improve prediction accuracy, the DebriSat project was conceived by NASA and DoD to update existing standard break-up models. Updating standard break-up models require detailed fragment characteristics such as physical size, material properties, bulk density, and ballistic coefficient. For the DebriSat project, a representative modern LEO spacecraft was developed and subjected to a laboratory hypervelocity impact test and all generated fragments with at least one dimension greater than 2 mm are collected, characterized and archived. Since the beginning of the characterization phase of the DebriSat project, over 130,000 fragments have been collected and approximately 250,000 fragments are expected to be collected in total, a three-fold increase over the 85,000 fragments predicted by the current break-up model. The challenge throughout the project has been to ensure the integrity and accuracy of the characteristics of each fragment. To this end, the post hypervelocity-impact test activities, which include fragment collection, extraction, and characterization, have been designed to minimize handling of the fragments. The procedures for fragment collection, extraction, and characterization were painstakingly designed and implemented to maintain the post-impact state of the fragments, thus ensuring the integrity and accuracy of the characterization data. Each process is designed to expedite the accumulation of data, however, the need for speed is restrained by the need to protect the fragments. Methods to expedite the process such as parallel processing have been explored and implemented while continuing to maintain the highest integrity and value of the data. To minimize fragment handling, automated systems have been developed and implemented. Errors due to human inputs are also minimized by the use of these automated systems. This paper discusses the processes and challenges involved in the collection, extraction, and characterization of the fragments as well as the time required to complete the processes. The objective is to provide the orbital debris community an understanding of the scale of the effort required to generate and archive high quality data and metadata for each debris fragment 2 mm or larger generated by the DebriSat project.
Anderson, Dan; Hubble, William; Press, Bret A; Hall, Scott K; Michels, Ann D; Koenen, Roxanne; Vespie, Alan W
2010-12-01
The American Registry of Radiologic Technologists (ARRT) conducts periodic job analysis projects to update the content and eligibility requirements for all certification examinations. In 2009, the ARRT conducted a comprehensive job analysis project to update the content specifications and clinical competency requirements for the nuclear medicine technology examination. ARRT staff and a committee of volunteer nuclear medicine technologists designed a job analysis survey that was sent to a random sample of 1,000 entry-level staff nuclear medicine technologists. Through analysis of the survey data and judgments of the committee, the project resulted in changes to the nuclear medicine technology examination task list, content specifications, and clinical competency requirements. The primary changes inspired by the project were the introduction of CT content to the examination and the expansion of the content covering cardiac procedures.
NASA Astrophysics Data System (ADS)
Hawkins, Ed; Day, Jonny; Tietsche, Steffen
2016-04-01
Recent years have seen significant developments in seasonal-to-interannual timescale climate prediction capabilities. However, until recently the potential of such systems to predict Arctic climate had not been assessed. We describe a multi-model predictability experiment which was run as part of the Arctic Predictability and Prediction On Seasonal to Inter-annual TimEscales (APPOSITE) project. The main goal of APPOSITE was to quantify the timescales on which Arctic climate is predictable. In order to achieve this, a coordinated set of idealised initial-value predictability experiments, with seven general circulation models, was conducted. This was the first model intercomparison project designed to quantify the predictability of Arctic climate on seasonal to inter-annual timescales. Here we provide a summary and update of the project's results which include: (1) quantifying the predictability of Arctic climate, especially sea ice; (2) the state-dependence of this predictability, finding that extreme years are potentially more predictable than neutral years; (3) analysing a spring 'predictability barrier' to skillful forecasts; (4) initial sea ice thickness information provides much of the skill for summer forecasts; (5) quantifying the sources of error growth and uncertainty in Arctic predictions. The dataset is now publicly available.
Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory
NASA Technical Reports Server (NTRS)
Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.
1994-01-01
As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.
Peddie, N.W.
1992-01-01
The secular variation of the main geomagnetic field during the periods 1980-1985 and 1985-1990 was analyzed in terms of spherical harmonics up to the eighth degree and order. Data from worldwide magnetic observatories and the Navy's Project MAGNET aerial surveys were used. The resulting pair of secular-variation models was used to update the Definitive Geomagnetic Reference Field (DGRF) model for 1980, resulting in new main-field models for 1985.0 and 1990.0. These, along with the secular-variation model for 1985-1990, were proposed for the 1991 revision of the International Geomagnetic Reference Field (IGRF). -Author
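For readers unfamiliar with the notation, such an analysis is conventionally expressed through a scalar potential with time-dependent Gauss coefficients. The form below is the standard IGRF-style statement of that expansion; the symbols follow the usual convention and are not taken from the report itself, with N = 8 for the secular-variation terms discussed.

```latex
% Standard spherical-harmonic expansion of the main-field potential with
% linear (secular-variation) extrapolation of the Gauss coefficients.
\begin{align}
V(r,\theta,\phi,t) &= a \sum_{n=1}^{N} \left(\frac{a}{r}\right)^{n+1}
  \sum_{m=0}^{n} \bigl[ g_n^m(t)\cos m\phi + h_n^m(t)\sin m\phi \bigr]
  P_n^m(\cos\theta), \\
g_n^m(t) &= g_n^m(t_0) + \dot{g}_n^m\,(t - t_0), \qquad
h_n^m(t) = h_n^m(t_0) + \dot{h}_n^m\,(t - t_0),
\end{align}
```

Here a is the Earth's reference radius and the P_n^m are Schmidt quasi-normalized associated Legendre functions; the rate coefficients are what a secular-variation analysis of the kind described above estimates.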
Finite element modelling and updating of a lively footbridge: The complete process
NASA Astrophysics Data System (ADS)
Živanović, Stana; Pavic, Aleksandar; Reynolds, Paul
2007-03-01
The finite element (FE) model updating technology was originally developed in the aerospace and mechanical engineering disciplines to automatically update numerical models of structures to match their experimentally measured counterparts. The process of updating identifies the drawbacks in the FE modelling, and the updated FE model can be used to produce more reliable results in further dynamic analysis. In the last decade, the updating technology has been introduced into civil structural engineering, where it can serve as an advanced tool for obtaining reliable modal properties of large structures. The updating process has four key phases: initial FE modelling, modal testing, manual model tuning and automatic updating (conducted using specialist software). However, the published literature does not link these phases well, although such linking is crucial when implementing the updating technology. This paper therefore aims to clarify the importance of this linking and to describe the complete model updating process as applicable in civil structural engineering. The complete process consisting of the four phases is outlined and brief theory is presented as appropriate. The procedure is then implemented on a lively steel box girder footbridge. It was found that even a very detailed initial FE model underestimated the natural frequencies of all seven experimentally identified modes of vibration, with the maximum error being almost 30%. Manual FE model tuning by trial and error found that flexible supports in the longitudinal direction should be introduced at the girder ends to improve correlation between the measured and FE-calculated modes. This significantly reduced the maximum frequency error to only 4%. It was demonstrated that only then could the FE model be automatically updated in a meaningful way. The automatic updating was successfully conducted by updating 22 uncertain structural parameters. Finally, a physical interpretation of all parameter changes is discussed; this interpretation is often missing in the published literature. It was found that the composite slabs were less stiff than originally assumed and that the asphalt layer contributed considerably to the deck stiffness.
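The automatic-updating phase amounts to adjusting uncertain structural parameters until FE-predicted modal properties match the measured ones. The sketch below illustrates that idea on a two-degree-of-freedom spring-mass surrogate rather than a detailed footbridge FE model; the masses, target frequencies, and least-squares objective are illustrative assumptions, not the specialist updating software used in the paper.

```python
# Minimal illustration of automatic model updating: tune uncertain stiffness
# parameters so that model natural frequencies match "measured" ones.
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import minimize

M = np.diag([1000.0, 1200.0])          # lumped masses (kg), illustrative

def natural_frequencies(k):
    """Natural frequencies (Hz) of a 2-DOF chain with stiffnesses k1, k2."""
    k1, k2 = k
    K = np.array([[k1 + k2, -k2],
                  [-k2,      k2]])
    eigvals = eigh(K, M, eigvals_only=True)   # generalized eigenproblem K v = w^2 M v
    return np.sqrt(eigvals) / (2.0 * np.pi)

f_measured = np.array([2.1, 5.4])      # "measured" modal frequencies (Hz), illustrative

def objective(k):
    # Sum of squared relative frequency errors, a common updating objective.
    f_model = natural_frequencies(k)
    return np.sum(((f_model - f_measured) / f_measured) ** 2)

k0 = np.array([2.0e6, 1.5e6])          # initial stiffness estimates (N/m)
result = minimize(objective, k0, method="Nelder-Mead")
print("updated stiffnesses:", result.x)
print("updated frequencies:", natural_frequencies(result.x))
```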
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osborn, C.S.; Osborn, D.E.
Through a small grant from the Center for Renewable Resources and a matching grant from the Arizona Solar Energy Commission, the Arizona Solar Energy Association has produced a state-wide catalogue of model solar projects. This catalogue presents some of the best solar and conservation projects in the state. It includes solar buildings, educational programs, community development programs, agricultural and industrial projects, state and legislative efforts, and commercial and business programs. Project selection was based on five main considerations: (1) cost-effectiveness, (2) valuable use of resources, (3) generation of jobs and transfer of skills, (4) replicability, and (5) scope. Shorter descriptions of significant projects not meeting the selection criteria were also included. The development of the catalogue program, its use and impact as a networking tool, and the development and implementation of a regular updating program are described. The impact of this type of program on information exchange, public education, and cross-fertilization is explored. Special emphasis projects from the catalogue are also described.
Use of indexing to update United States annual timber harvest by state
James Howard; Enrique Quevedo; Andrew Kramp
2009-01-01
This report provides an index method that can be used to update recent estimates of timber harvest by state to a common current year and to make 5-year projections. The Forest Service Forest Inventory and Analysis (FIA) program makes estimates of harvest for each state in differing years. The purpose of this updating method is to bring each state-level estimate up to a...
NASA Update for Unidata Stratcomm
NASA Technical Reports Server (NTRS)
Lynnes, Chris
2017-01-01
The NASA representative to the Unidata Strategic Committee presented a semiannual update on NASA's work with and use of Unidata technologies. The talk updated Unidata on the program of cloud computing prototypes underway for the Earth Observing System Data and Information System (EOSDIS). Also discussed was a trade study on the use of the Open-source Project for a Network Data Access Protocol (OPeNDAP) with Web Object Storage in the cloud.
The Methane to Markets Coal Mine Methane Subcommittee meeting
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2008-07-01
The presentations (overheads/viewgraphs) include: a report from the Administrative Support Group; strategy updates from Australia, India, Italy, Mexico, Nigeria, Poland and the USA; a coal mine methane update and the IEA's strategy and activities; the power of VAM - a technology application update; the emissions trading market; the voluntary emissions reduction market - creating profitable CMM projects in the USA; an Italian perspective on zero-emission strategies; and the wrap-up and summary.
Real-time projections of cholera outbreaks through data assimilation and rainfall forecasting
NASA Astrophysics Data System (ADS)
Pasetto, Damiano; Finger, Flavio; Rinaldo, Andrea; Bertuzzo, Enrico
2017-10-01
Although treatment for cholera is well-known and cheap, outbreaks in epidemic regions still exact high death tolls, mostly due to the unpreparedness of health care infrastructures to face unforeseen emergencies. In this context, mathematical models for predicting the evolution of an ongoing outbreak are of paramount importance. Here, we test a real-time forecasting framework that readily integrates new information as soon as it becomes available and periodically issues an updated forecast. The spread of cholera is modeled by a spatially explicit scheme that accounts for the dynamics of susceptible, infected and recovered individuals hosted in different local communities connected through hydrologic and human mobility networks. The framework presents two major innovations for cholera modeling: the use of a data assimilation technique, specifically an ensemble Kalman filter, to update both state variables and parameters based on the observations, and the use of rainfall forecasts to force the model. As a benchmark, we simulate the state of the system and test the predictive capabilities of the novel tools at the initial phase of the 2010 Haitian cholera outbreak, using only information that was available at that time. Our results suggest that the assimilation procedure with the sequential update of the parameters outperforms calibration schemes based on Markov chain Monte Carlo. Moreover, in forecasting mode the model usefully predicts the spatial incidence of cholera at least one month ahead. The performance decreases for longer time horizons, yet still allows sufficient time to plan for deployment of medical supplies and staff, and to evaluate alternative strategies of emergency management.
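The core of such an assimilation step is a stochastic ensemble Kalman filter update applied to an augmented state vector that stacks the epidemiological state variables and the uncertain parameters. Below is a minimal, generic sketch of that update; the observation operator, ensemble size, and error covariance are illustrative assumptions, not the configuration used in the study.

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """Stochastic EnKF analysis step.

    X : (n, N) forecast ensemble of augmented states (state variables + parameters)
    y : (m,)   observations (e.g. reported weekly cases per community)
    H : (m, n) linear observation operator
    R : (m, m) observation-error covariance
    """
    n, N = X.shape
    A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
    Pf = A @ A.T / (N - 1)                           # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
    # Perturb the observations so the analysis ensemble keeps the right spread.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X + K @ (Y - H @ X)                       # analysis ensemble

# Toy usage: 3 state variables + 1 parameter, 50 members, 2 observed quantities.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 50))
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])
R = 0.1 * np.eye(2)
y = np.array([0.5, -0.2])
Xa = enkf_update(X, y, H, R, rng)
```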
ENGINEERED BARRIER SYSTEM: PHYSICAL AND CHEMICAL ENVIRONMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Jarek
2005-08-29
The purpose of this model report is to describe the evolution of the physical and chemical environmental conditions within the waste emplacement drifts of the repository, including the drip shield and waste package surfaces. The resulting seepage evaporation and gas abstraction models are used in the total system performance assessment for the license application (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. This report develops and documents a set of abstraction-level models that describe the engineered barrier system physical and chemical environment. Where possible, these models use information directly from other reports as input, which promotes integration among process models used for TSPA-LA. Specific tasks and activities of modeling the physical and chemical environment are included in ''Technical Work Plan for: Near-Field Environment and Transport In-Drift Geochemistry Model Report Integration'' (BSC 2005 [DIRS 173782], Section 1.2.2). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system reports. To be consistent with other project documents that address features, events, and processes (FEPs), Table 6.14.1 of the current report includes updates to FEP numbers and FEP subjects for two FEPs identified in the technical work plan (TWP) governing this report (BSC 2005 [DIRS 173782]). FEP 2.1.09.06.0A (Reduction-oxidation potential in EBS), as listed in Table 2 of the TWP (BSC 2005 [DIRS 173782]), has been updated in the current report to FEP 2.1.09.06.0B (Reduction-oxidation potential in Drifts; see Table 6.14-1). FEP 2.1.09.07.0A (Reaction kinetics in EBS), as listed in Table 2 of the TWP (BSC 2005 [DIRS 173782]), has been updated in the current report to FEP 2.1.09.07.0B (Reaction kinetics in Drifts; see Table 6.14-1). These deviations from the TWP are justified because they improve integration with FEPs documents. The updates have no impact on the model developed in this report.
Validation of New Wind Resource Maps
NASA Astrophysics Data System (ADS)
Elliott, D.; Schwartz, M.
2002-05-01
The National Renewable Energy Laboratory (NREL) recently led a project to validate updated state wind resource maps for the northwestern United States produced by a private U.S. company, TrueWind Solutions (TWS). The independent validation project was a cooperative activity among NREL, TWS, and meteorological consultants. The independent validation concept originated at a May 2001 technical workshop held at NREL to discuss updating the Wind Energy Resource Atlas of the United States. Part of the workshop, which included more than 20 attendees from the wind resource mapping and consulting community, was dedicated to reviewing the latest techniques for wind resource assessment. It became clear that the numerical modeling approach to wind resource mapping was rapidly gaining ground as a preferred technique and, if the trend continues, will soon become the most widely used technique around the world. The numerical modeling approach is relatively fast compared to older mapping methods and, in theory, should be quite accurate because it directly estimates the magnitude of the boundary-layer processes that affect the wind resource of a particular location. Numerical modeling output combined with high-resolution terrain data can produce useful wind resource information at a resolution of 1 km or finer. However, because the use of the numerical modeling approach is new (within the last 3-5 years) and relatively unproven, meteorological consultants question its accuracy. It was clear that new state or regional wind maps produced by this method would have to undergo independent validation before the results would be accepted by the wind energy community and developers.
Numerical model updating technique for structures using firefly algorithm
NASA Astrophysics Data System (ADS)
Sai Kubair, K.; Mohan, S. C.
2018-03-01
Numerical model updating is a technique for revising numerical models of structures in civil, mechanical, automotive, marine, aerospace and related engineering disciplines so that they closely match experimental data obtained from real or prototype test structures. The present work involves the development of a numerical model using MATLAB as a computational tool, with mathematical equations that define the experimental model. The firefly algorithm is used as the optimization tool in this study. In this updating process a response parameter of the structure has to be chosen, which is used to correlate the numerical model with the experimental results. The variables for the updating can be either material or geometrical properties of the model, or both. In this study, to verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame is analyzed for its natural frequencies. Both models are updated with their respective response values obtained from experimental results. The numerical results after updating show that close agreement between the experimental and numerical models can be achieved.
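As a rough illustration of how the firefly algorithm drives such an updating problem, the sketch below identifies a single stiffness-like parameter (the flexural rigidity of a cantilever) from a "measured" tip deflection. The load, geometry, bounds, and algorithm constants are invented for illustration and are not the values used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy updating problem: identify flexural rigidity EI of a cantilever from a
# "measured" tip deflection under a known end load (delta = P L^3 / (3 EI)).
P, L = 5.0e3, 2.0                        # load (N) and length (m), illustrative
delta_measured = 0.012                   # "measured" tip deflection (m)

def objective(EI):
    delta_model = P * L**3 / (3.0 * EI)
    return (delta_model - delta_measured) ** 2

def firefly(obj, lo, hi, n_fireflies=20, n_iter=100,
            beta0=1.0, gamma=1.0, alpha=0.1):
    # Work in a normalized [0, 1] search space so gamma and alpha are scale-free.
    scale = lambda u: lo + u * (hi - lo)
    x = rng.uniform(0.0, 1.0, size=n_fireflies)      # candidate positions
    f = np.array([obj(scale(v)) for v in x])         # lower objective = brighter firefly
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if f[j] < f[i]:                      # j is brighter: move i toward j
                    beta = beta0 * np.exp(-gamma * (x[i] - x[j]) ** 2)
                    x[i] += beta * (x[j] - x[i]) + alpha * (rng.random() - 0.5)
                    x[i] = np.clip(x[i], 0.0, 1.0)
                    f[i] = obj(scale(x[i]))
        alpha *= 0.97                                # gradually damp the random walk
    best = int(np.argmin(f))
    return scale(x[best]), f[best]

EI_best, err = firefly(objective, lo=1.0e5, hi=1.0e7)
print(f"updated EI = {EI_best:.3e} N*m^2 (squared error {err:.2e})")
```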
Wang, Yingwen; Kong, Meijing; Ge, Youhong
2016-12-01
Extravasation in a pediatric patient can cause a serious adverse event, but many nurses have insufficient experience to deal with it during intravenous administration. Our division implemented a best practice project, which included extravasation kit instruction preparation, staff education and an update of institutional policy and procedures. The project focused on auditing the extent to which the protocol was implemented and promoting its implementation. The objective of the project was to establish an evidence-based policy and procedure for extravasation management, improve knowledge regarding best practice of extravasation management among staff and formalize the documentation template for extravasation events. The Joanna Briggs Institute's Practical Application of Clinical Evidence System and Getting Research into Practice were used to examine compliance with criteria based on the best available evidence before and after the implementation of strategies to promote the use of the evidence-based practice protocol. Four criteria showed a noticeable improvement in compliance: increased use of the extravasation kit (0-100%), updated policies and procedures (0-94%), staff education (19-94%) and documented outcomes (13-88%). The project successfully established effective strategies for establishing an extravasation kit instruction sheet, updating policies and procedures, continuous staff education and nursing documentation to ensure best practice and improve patient outcomes.
ADS-33C related handling qualities research performed using the NRC Bell 205 airborne simulator
NASA Technical Reports Server (NTRS)
Morgan, J. Murray; Baillie, Stewart W.
1993-01-01
Over 10 years ago a project was initiated by the U.S. Army AVSCOM to update the military helicopter flying qualities specification MIL-H-8501A. While not yet complete, the project reached a major milestone in 1989 with the publication of an Airworthiness Design Standard, ADS-33C. The 8501 update project initially set out to identify critical gaps in the requisite database and then proceeded to fill them using a variety of directed research studies. The magnitude of the task required that it become an international effort: appropriate research studies were conducted in Germany, the UK and Canada as well as in the USA. Canadian participation was supported by the Department of National Defence (DND) through the Chief of Research and Development. Both ground-based and in-flight simulation were used to study the defined areas, and the Canadian Bell 205-A1 variable stability helicopter was used extensively as one of the primary research tools available for this effort. This paper reviews the involvement of the Flight Research Laboratory of the National Research Council of Canada in the update project, describes the various experiments conducted on the Airborne Simulator, notes significant results obtained, and describes ongoing research associated with the project.
Changes in Sea Levels around the British Isles Revisited (Invited)
NASA Astrophysics Data System (ADS)
Teferle, F. N.; Hansen, D. N.; Bingley, R. M.; Williams, S. D.; Woodworth, P. L.; Gehrels, W. R.; Bradley, S. L.; Stocchi, P.
2009-12-01
Recently a number of new and/or updated sources of estimates of vertical land movements for the British Isles have become available, allowing the relative and average changes in sea levels for this region to be revisited. The geodetic data set stems from a combination of re-processed continuous Global Positioning System (GPS) measurements from stations in the British Isles and from a global reference frame network, and absolute gravity (AG) measurements from two stations in the British Isles. The geologic data set of late Holocene sea level indicators has recently been updated, now applying corrections for the 20th century sea level rise, the syphoning effect and late Holocene global ice melt, and expanded to Northern Ireland and Ireland. Several new model predictions of the glacial isostatic adjustment (GIA) process active in this region form the modelling data set of vertical land movements for the British Isles. Correcting the updated revised local reference (RLR) trends from the Permanent Service for Mean Sea Level (PSMSL) with these vertical land movement data sets, regional and averaged changes in sea levels around the British Isles have been investigated. Special focus is given to the coastal areas that have recently been identified within the UK Climate Projections 2009.
Real-time model learning using Incremental Sparse Spectrum Gaussian Process Regression.
Gijsberts, Arjan; Metta, Giorgio
2013-05-01
Novel applications in unstructured and non-stationary human environments require robots that learn from experience and adapt autonomously to changing conditions. Predictive models therefore not only need to be accurate, but should also be updated incrementally in real-time and require minimal human intervention. Incremental Sparse Spectrum Gaussian Process Regression is an algorithm that is targeted specifically for use in this context. Rather than developing a novel algorithm from the ground up, the method is based on the thoroughly studied Gaussian Process Regression algorithm, therefore ensuring a solid theoretical foundation. Non-linearity and a bounded update complexity are achieved simultaneously by means of a finite-dimensional random feature mapping that approximates a kernel function. As a result, the computational cost for each update remains constant over time. Finally, algorithmic simplicity and support for automated hyperparameter optimization ensure convenience when employed in practice. Empirical validation on a number of synthetic and real-life learning problems confirms that the performance of Incremental Sparse Spectrum Gaussian Process Regression is superior to that of the popular Locally Weighted Projection Regression, while computational requirements are found to be significantly lower. The method is therefore particularly suited for learning with real-time constraints or when computational resources are limited. Copyright © 2012 Elsevier Ltd. All rights reserved.
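The essential trick is to replace the kernel with a finite random feature map, so that the GP posterior mean reduces to linear regression whose sufficient statistics can be updated at constant cost per sample. The sketch below gives a minimal version of that interpretation; the feature dimension, length-scale, and noise level are illustrative assumptions, and the published algorithm maintains an incremental Cholesky factorization rather than the plain linear solve used here.

```python
import numpy as np

class SparseSpectrumRegressor:
    """Incremental regression with random Fourier features approximating an RBF kernel."""

    def __init__(self, input_dim, n_features=100, lengthscale=1.0, noise=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # Spectral frequencies and phases for a squared-exponential kernel.
        self.W = rng.normal(scale=1.0 / lengthscale, size=(n_features, input_dim))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
        self.D = n_features
        self.A = (noise ** 2) * np.eye(n_features)   # regularized feature Gram matrix
        self.r = np.zeros(n_features)                # feature-target correlations

    def _phi(self, x):
        return np.sqrt(2.0 / self.D) * np.cos(self.W @ x + self.b)

    def update(self, x, y):
        """O(D^2) incremental update with one (x, y) sample."""
        phi = self._phi(x)
        self.A += np.outer(phi, phi)
        self.r += phi * y

    def predict(self, x):
        w = np.linalg.solve(self.A, self.r)          # posterior-mean weights
        return self._phi(x) @ w

# Toy usage: learn y = sin(x) online from a stream of noisy samples.
model = SparseSpectrumRegressor(input_dim=1)
rng = np.random.default_rng(1)
for _ in range(500):
    x = rng.uniform(-3.0, 3.0, size=1)
    model.update(x, np.sin(x[0]) + 0.05 * rng.normal())
print(model.predict(np.array([1.0])), np.sin(1.0))
```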
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-20
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 13124-002] Copper Valley...: Original License Application. b. Project No.: 13124-002. c. Applicant: Copper Valley Electric Association (Copper Valley). d. Name of Project: Allison Creek Project. e. Location: On the south side of Port Valdez...
75 FR 52799 - Notice of Public Hearing and Commission Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-27
... certain water resources projects; (2) compliance matters involving three projects; (3) action on a project... the following items: (1) Update on the SRBC Remote Water Quality Monitoring Network; (2) hydrologic..., Susquehanna County, Pa. 3. Project Sponsor: Seneca Resources Corporation. Pad ID: M. Pino H (ABR-20090933...
Space-Based Sensorweb Monitoring of Wildfires in Thailand
NASA Technical Reports Server (NTRS)
Chien, Steve; Doubleday, Joshua; Mclaren, David; Davies, Ashley; Tran, Daniel; Tanpipat, Veerachai; Akaakara, Siri; Ratanasuwan, Anuchit; Mandl, Daniel
2011-01-01
We describe efforts to apply sensorweb technologies to the monitoring of forest fires in Thailand. In this approach, satellite data and ground reports are assimilated to assess the current state of the forest system in terms of forest fire risk, active fires, and the likely progression of fires and smoke plumes. This current and projected assessment can then be used to actively direct sensors and assets to best acquire further information. This process operates continually, with new data updating models of fire activity and leading to further sensing and updating of models. As the fire activity is tracked, products such as active fire maps, burn scar severity maps, and alerts are automatically delivered to relevant parties. We describe the current state of the Thailand Fire Sensorweb, which utilizes the MODIS-based FIRMS system to track active fires and trigger Earth Observing One / Advanced Land Imager to acquire imagery and produce active fire maps, burn scar severity maps, and alerts. We describe ongoing work to integrate additional sensor sources and generate additional products.
Who will have health insurance in the future? An updated projection.
Young, Richard A; DeVoe, Jennifer E
2012-01-01
The passage of the 2010 Patient Protection and Affordable Care Act (PPACA) in the United States put the issues of health care reform and health care costs back in the national spotlight. DeVoe and colleagues previously estimated that the cost of a family health insurance premium would equal the median household income by the year 2025. A slowdown in health care spending tied to the recent economic downturn and the passage of the PPACA occurred after this model was published. In this updated model, we estimate that this threshold will be crossed in 2033, and under favorable assumptions the PPACA may extend this date only to 2037. Continuing to make incremental changes in US health policy will likely not bend the cost curve, a goal that has eluded policy makers for the past 50 years. Private health insurance will become increasingly unaffordable to low-to-middle-income Americans unless major changes are made in the US health care system.
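The projection logic behind such an estimate is simple compound growth: extrapolate the average family premium and the median household income at assumed annual growth rates and find the year the curves cross. The toy calculation below illustrates only the mechanics; the starting values and growth rates are round-number assumptions, not the figures used by Young and DeVoe.

```python
# Toy crossover-year calculation: compound an insurance premium and a median
# household income forward until the premium exceeds the income.
premium, income = 15_000.0, 50_000.0         # illustrative 2012 starting values (USD)
premium_growth, income_growth = 0.07, 0.02   # illustrative annual growth rates

year = 2012
while premium < income:
    premium *= 1 + premium_growth
    income *= 1 + income_growth
    year += 1

print(f"With these assumptions the premium overtakes income in {year}.")
```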
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reedlunn, Benjamin
Room D was an in-situ, isothermal, underground experiment conducted at the Waste Isolation Pilot Plant between 1984 and 1991. The room was carefully instrumented to measure the horizontal and vertical closure immediately upon excavation and for several years thereafter. Early finite element simulations of salt creep around Room D under-predicted the vertical closure by 4.5×, causing investigators to explore a series of changes to the way Room D was modeled. Discrepancies between simulations and measurements were resolved through a series of adjustments to model parameters, which were openly acknowledged in published reports. Interest in Room D has been rekindled recently by the U.S./German Joint Project III and Project WEIMOS, which seek to improve the predictions of rock salt constitutive models. Joint Project participants calibrate their models solely against laboratory tests, and benchmark the models against underground experiments, such as Room D. This report describes updating legacy Room D simulations to today's computational standards by rectifying several numerical issues. Subsequently, the constitutive model used in previous modeling is recalibrated two different ways against a suite of new laboratory creep experiments on salt extracted from the repository horizon of the Waste Isolation Pilot Plant. Simulations with the new, laboratory-based calibrations under-predict Room D vertical closure by 3.1×. A list of potential improvements is discussed.
33 CFR 385.30 - Master Implementation Sequencing Plan.
Code of Federal Regulations, 2010 CFR
2010-07-01
... projects of the Plan, including pilot projects and operational elements, based on the best scientific... Florida Water Management District shall also consult with the South Florida Ecosystem Restoration Task...; (ii) Information obtained from pilot projects; (iii) Updated funding information; (iv) Approved...
The Arctic Predictability and Prediction on Seasonal-to-Interannual TimEscales (APPOSITE) data set
NASA Astrophysics Data System (ADS)
Day, J. J.; Tietsche, S.; Collins, M.; Goessling, H. F.; Guemas, V.; Guillory, A.; Hurlin, W. J.; Ishii, M.; Keeley, S. P. E.; Matei, D.; Msadek, R.; Sigmond, M.; Tatebe, H.; Hawkins, E.
2015-10-01
Recent decades have seen significant developments in seasonal-to-interannual timescale climate prediction capabilities. However, until recently the potential of such systems to predict Arctic climate had not been assessed. This paper describes a multi-model predictability experiment which was run as part of the Arctic Predictability and Prediction On Seasonal to Inter-annual Timescales (APPOSITE) project. The main goal of APPOSITE was to quantify the timescales on which Arctic climate is predictable. In order to achieve this, a coordinated set of idealised initial-value predictability experiments, with seven general circulation models, was conducted. This was the first model intercomparison project designed to quantify the predictability of Arctic climate on seasonal to inter-annual timescales. Here we present a description of the archived data set (which is available at the British Atmospheric Data Centre) and an update of the project's results. Although designed to address Arctic predictability, this data set could also be used to assess the predictability of other regions and modes of climate variability on these timescales, such as the El Niño Southern Oscillation.
Harmonization of global land-use scenarios for the period 850-2100
NASA Astrophysics Data System (ADS)
Hurtt, G. C.; Chini, L. P.; Sahajpal, R.; Frolking, S. E.; Fisk, J.; Bodirsky, B.; Calvin, K. V.; Fujimori, S.; Goldewijk, K.; Hasegawa, T.; Havlik, P.; Heinimann, A.; Humpenöder, F.; Kaplan, J. O.; Krisztin, T.; Lawrence, D. M.; Lawrence, P.; Mertz, O.; Popp, A.; Riahi, K.; Stehfest, E.; van Vuuren, D.; de Waal, L.; Zhang, X.
2016-12-01
Human land-use activities have resulted in large changes to the biogeochemical and biophysical properties of the Earth's surface, with resulting implications for climate. In the future, land-use activities are likely to expand and/or intensify further to meet growing demands for food, fiber, and energy. As part of the World Climate Research Programme Coupled Model Intercomparison Project (CMIP6), the international community is developing the next generation of advanced Earth System Models (ESMs) able to estimate the combined effects of human activities (e.g. land use and fossil fuel emissions) on the carbon-climate system. In addition, a new set of historical data based on HYDE, and multiple alternative scenarios of the future (2015-2100) from Integrated Assessment Model (IAM) teams, are being developed as input for these models. Here we present results from the Land-use Harmonization 2 (LUH2) project, whose goal is to smoothly connect updated historical reconstructions of land use with new future projections in the format required for ESMs. The harmonization strategy estimates the fractional land-use patterns, underlying land-use transitions, key agricultural management information, and resulting secondary lands annually, while minimizing the differences between the end of the historical reconstruction and the IAM initial conditions and working to preserve the changes depicted by the IAMs in the future. The new approach builds on the CMIP5 approach and is provided at higher resolution (0.25x0.25 degree), over a longer time domain (850-2100), with more detail (including multiple crop and pasture types and associated management), using more inputs (including Landsat data) and updated algorithms (wood harvest and shifting cultivation), and is assessed via a new diagnostic package. The new LUH2 products contain >50 times the information content of the datasets used in CMIP5, and are designed to enable new and improved estimates of the combined effects of land use on the global carbon-climate system.
Update of GRASP/Ada reverse engineering tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1993-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional pretty printed Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype CSD generator (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Windows System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, two update phases were completed. Update'92 focused on the initial analysis of evaluation data collected from software engineering students at Auburn University and the addition of significant enhancements to the user interface. Update'93 (the current update) focused on the statistical analysis of the data collected in the previous update and preparation of Version 3.4 of the prototype for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical application. An overview of the GRASP/Ada project with an emphasis on the current update is provided.
NASA Astrophysics Data System (ADS)
Callahan, P. S.; Wilson, B. D.; Xing, Z.; Raskin, R. G.
2010-12-01
We have developed a web-based system to allow updating and subsetting of TOPEX data. The Altimeter Service will be operated by PODAAC alongside its other oceanographic data services, and could easily be expanded to other mission data. An Altimeter Service is crucial to the improvement and expanded use of altimeter data. A service is necessary for altimetry because the result of most interest - sea surface height anomaly (SSHA) - is composed of several components that are updated individually and irregularly by specialized experts, which makes it difficult for projects to provide the most up-to-date products. Some components are the subject of ongoing research, so the ability for investigators to make products for comparison or sharing is important. The service will allow investigators/producers to get their component models or processing into widespread use much more quickly. For coastal altimetry, the ability to subset the data to the area of interest and insert specialized models (e.g., tides) or data processing results is crucial. A key part of the Altimeter Service is having data producers provide updated or local models and data. For this to succeed, producers need to register their products with the Altimeter Service and provide them in a form consistent with the service update methods. We will describe the capabilities of the web service and the methods for providing new components. Currently the Service provides TOPEX GDRs with Retracking (RGDRs) in a netCDF format that has been coordinated with Jason data. Users can add new orbits, tide models, gridded geophysical fields such as mean sea surface, and along-track corrections as they become available and are installed by PODAAC. The updated fields are inserted into the netCDF files while the previous values are retained for comparison. The Service will also generate SSH and SSHA. In addition, the Service showcases a feature that plots any variable from netCDF files. The research described here was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.
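A sketch of the kind of file update such a service performs is shown below: a revised correction field is appended to an existing altimeter granule while the original values are retained, and the anomaly is recomputed. The variable names, the single 'time' dimension, and the simplified SSHA formula are assumptions about a generic altimeter product, not the actual RGDR schema or the PODAAC service interface.

```python
import numpy as np
from netCDF4 import Dataset

def insert_updated_tide(path, new_tide):
    """Add an updated ocean-tide correction alongside the original and rebuild SSHA."""
    with Dataset(path, "r+") as nc:
        ssh = nc.variables["sea_surface_height"][:]
        mss = nc.variables["mean_sea_surface"][:]
        old_tide = nc.variables["ocean_tide"][:]

        # Keep the previous correction; store the update as a new variable.
        if "ocean_tide_updated" not in nc.variables:
            v = nc.createVariable("ocean_tide_updated", "f8", ("time",))
            v.comment = "revised tide model supplied by a data producer"
        nc.variables["ocean_tide_updated"][:] = new_tide

        # Recompute the anomaly with the new correction (simplified formula).
        ssha = ssh - mss - new_tide
        if "ssha_updated" not in nc.variables:
            nc.createVariable("ssha_updated", "f8", ("time",))
        nc.variables["ssha_updated"][:] = ssha

        # Return the change introduced by the updated correction, for comparison.
        return np.asarray(ssha - (ssh - mss - old_tide))
```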
Genomes OnLine Database (GOLD) v.6: data updates and feature enhancements
Mukherjee, Supratim; Stamatis, Dimitri; Bertsch, Jon; Ovchinnikova, Galina; Verezemska, Olena; Isbandi, Michelle; Thomas, Alex D.; Ali, Rida; Sharma, Kaushal; Kyrpides, Nikos C.; Reddy, T. B. K.
2017-01-01
The Genomes Online Database (GOLD) (https://gold.jgi.doe.gov) is a manually curated data management system that catalogs sequencing projects with associated metadata from around the world. In the current version of GOLD (v.6), all projects are organized based on a four level classification system in the form of a Study, Organism (for isolates) or Biosample (for environmental samples), Sequencing Project and Analysis Project. Currently, GOLD provides information for 26 117 Studies, 239 100 Organisms, 15 887 Biosamples, 97 212 Sequencing Projects and 78 579 Analysis Projects. These are integrated with over 312 metadata fields from which 58 are controlled vocabularies with 2067 terms. The web interface facilitates submission of a diverse range of Sequencing Projects (such as isolate genome, single-cell genome, metagenome, metatranscriptome) and complex Analysis Projects (such as genome from metagenome, or combined assembly from multiple Sequencing Projects). GOLD provides a seamless interface with the Integrated Microbial Genomes (IMG) system and supports and promotes the Genomic Standards Consortium (GSC) Minimum Information standards. This paper describes the data updates and additional features added during the last two years. PMID:27794040
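The four-level organization described above can be pictured as a simple containment hierarchy. The rendering below is purely schematic; the class and field names are illustrative and do not reflect GOLD's actual schema or API.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AnalysisProject:
    name: str                      # e.g. "genome from metagenome"

@dataclass
class SequencingProject:
    name: str                      # e.g. "isolate genome", "metatranscriptome"
    analysis_projects: List[AnalysisProject] = field(default_factory=list)

@dataclass
class OrganismOrBiosample:
    name: str                      # Organism for isolates, Biosample for environmental samples
    sequencing_projects: List[SequencingProject] = field(default_factory=list)

@dataclass
class Study:
    name: str
    entries: List[OrganismOrBiosample] = field(default_factory=list)

# Example: a Study containing one Biosample with a metagenome Sequencing Project
# whose reads feed a combined-assembly Analysis Project.
study = Study("hot spring survey", [
    OrganismOrBiosample("sediment biosample", [
        SequencingProject("metagenome", [AnalysisProject("combined assembly")])
    ])
])
```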
Update of correlations between cone penetration and boring log data : technical summary report.
DOT National Transportation Integrated Search
2009-12-01
The main objective of this project is to update the correlations that are currently used to interpret : Cone Penetration Test (CPT) data for engineering design purposes and to assess the reliability of : using CPT data to predict soil shear strength....
75 FR 3443 - Sunshine Act Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-21
... COMMISSION ON CIVIL RIGHTS Sunshine Act Notice AGENCY: United States Commission on Civil Rights... Recommendations for The Impact of Illegal Immigration on the Wages and Employment Opportunities of Black Workers Report. Update on Status of Title IX Project. Update on Status of 2010 Enforcement Report. III. State...
Legislative Update: Georgia School Funding Update.
ERIC Educational Resources Information Center
Holmes, C. Thomas; Sielke, Catherine C.
2000-01-01
Fully 40 percent ($5 billion) of Georgia's FY 2000 general funds budget is for K-12 education. There is increased funding for a homestead exemption, expansion of the HOPE (higher education) Scholarship Program, capital outlay projects, remedial assistance programs, and instruction of limited-English speaking students. (MLH)
Project of Near-Real-Time Generation of ShakeMaps and a New Hazard Map in Austria
NASA Astrophysics Data System (ADS)
Jia, Yan; Weginger, Stefan; Horn, Nikolaus; Hausmann, Helmut; Lenhardt, Wolfgang
2016-04-01
Target-orientated prevention and effective crisis management can reduce or avoid damage and save lives in the case of a strong earthquake. To achieve this goal, a project for automatically generated ShakeMaps (maps of ground motion and shaking intensity) and for updating the Austrian hazard map was started at ZAMG (Zentralanstalt für Meteorologie und Geodynamik) in 2015. The first goal of the project is the near-real-time generation of ShakeMaps following strong earthquakes in Austria, to provide rapid, accurate and official information in support of governmental crisis management. Using methods and software newly developed by SHARE (Seismic Hazard Harmonization in Europe) and GEM (Global Earthquake Model), which allow a transnational analysis at the European level, a new generation of Austrian hazard maps will ultimately be calculated. This presentation will give more information and the current status of the project.
LDSD POST2 Modeling Enhancements in Support of SFDT-2 Flight Operations
NASA Technical Reports Server (NTRS)
White, Joseph; Bowes, Angela L.; Dutta, Soumyo; Ivanov, Mark C.; Queen, Eric M.
2016-01-01
Program to Optimize Simulated Trajectories II (POST2) was utilized to develop trajectory simulations characterizing all flight phases from drop to splashdown for the Low-Density Supersonic Decelerator (LDSD) project's first and second Supersonic Flight Dynamics Tests (SFDT-1 and SFDT-2) which took place June 28, 2014 and June 8, 2015, respectively. This paper describes the modeling improvements incorporated into the LDSD POST2 simulations since SFDT-1 and presents how these modeling updates affected the predicted SFDT-2 performance and sensitivity to the mission design. The POST2 simulation flight dynamics support during the SFDT-2 launch, operations, and recovery is also provided.
Abu Dhabi Basemap Update Using the LiDAR Mobile Mapping Technology
NASA Astrophysics Data System (ADS)
Alshaiba, Omar; Amparo Núñez-Andrés, M.; Lantada, Nieves
2016-04-01
Mobile LiDAR systems provide a new technology which can be used to update geospatial information through direct and rapid data collection. This technology is faster than traditional survey methods and has a lower cost. The Abu Dhabi Municipal System aims to update its geospatial system frequently, as government entities have invested heavily in GIS technology and geospatial data to meet the rapid growth in infrastructure and construction projects in recent years. The Emirate of Abu Dhabi has witnessed a huge growth in infrastructure and construction projects in recent years; it is therefore necessary to develop and update its basemap system frequently to meet organizational needs. Currently, traditional methods are used to update the basemap system, such as human surveyors, GPS receivers and controllers (GPS-assigned computers). The surveyed data are then downloaded, edited and reviewed manually before being merged into the basemap system. Traditional surveying methods may not be applicable in some conditions, such as bad weather, difficult topographic areas and boundary areas. This paper presents a proposed methodology which uses the Mobile LiDAR system to update the basemap in Abu Dhabi through daily transaction services. It aims to use and integrate mobile LiDAR technology into the municipality's daily workflow such that it becomes the new standard, cost-efficient operating procedure for updating the basemap in the Abu Dhabi Municipal System. The paper also demonstrates the results of the new workflow for the basemap update using the mobile LiDAR point cloud and a few processing algorithms.
Impact of climate change on water resources status: A case study for Crete Island, Greece
NASA Astrophysics Data System (ADS)
Koutroulis, Aristeidis G.; Tsanis, Ioannis K.; Daliakopoulos, Ioannis N.; Jacob, Daniela
2013-02-01
An assessment of the impact of global climate change on the water resources status of the island of Crete, for a range of 24 different scenarios of projected hydro-climatological regime, is presented. Three "state of the art" Global Climate Models (GCMs) and an ensemble of Regional Climate Models (RCMs) under emission scenarios B1, A2 and A1B provide future precipitation (P) and temperature (T) estimates that are bias-adjusted against observations. The ensemble of RCMs for the A1B scenario projects a higher P reduction compared to the GCM projections under the A2 and B1 scenarios. Among the GCM results, the ECHAM model projects a higher P reduction compared to IPSL and CNCM. Water availability for the whole island at basin scale until 2100 is estimated using the SAC-SMA rainfall-runoff model, and a set of demand and infrastructure scenarios is adopted to simulate potential water use. While the predicted reduction of water availability under the B1 emission scenario can be handled with water demand stabilized at present values and full implementation of planned infrastructure, other scenarios require additional measures, and a robust signal of water insufficiency is projected. Despite inherent uncertainties, the quantitative impact of the projected changes on water availability indicates that climate change plays an important role in water use and management in controlling future water status in a Mediterranean island like Crete. The results of the study reinforce the necessity to improve and update local water management planning and adaptation strategies in order to attain future water security.
NASA Technical Reports Server (NTRS)
Liou, Jer-Chyi; Clark, S.; Fitz-Coy, N.; Huynh, T.; Opiela, J.; Polk, M.; Roebuck, B.; Rushing, R.; Sorge, M.; Werremeyer, M.
2013-01-01
The goal of the DebriSat project is to characterize fragments generated by a hypervelocity collision involving a modern satellite in low Earth orbit (LEO). The DebriSat project will update and expand upon the information obtained in the 1992 Satellite Orbital Debris Characterization Impact Test (SOCIT), which characterized the breakup of a 1960s US Navy Transit satellite. There are three phases to this project: the design and fabrication of DebriSat - an engineering model representing a modern, 60-cm/50-kg class LEO satellite; the conduct of a laboratory-based hypervelocity impact test to catastrophically break up the satellite; and characterization of the properties of breakup fragments down to 2 mm in size. The data obtained, including fragment size, area-to-mass ratio, density, shape, material composition, optical properties, and radar cross-section distributions, will be used to supplement the DoD's and NASA's satellite breakup models to better describe the breakup outcome of a modern satellite.
TA 55 Reinvestment Project II Phase C Update Project Status May 23, 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giordano, Anthony P.
The TA-55 Reinvestment Project (TRP) II Phase C is a critical infrastructure project focused on improving safety and reliability of the Los Alamos National Laboratory (LANL) TA-55 Complex. The Project recapitalizes and revitalizes aging and obsolete facility and safety systems providing a sustainable nuclear facility for National Security Missions.
ERIC Educational Resources Information Center
Florida State Univ., Tallahassee. Learning Systems Inst.
This publication contains the first two of three training workshop manuals designed to be used in conducting an update of the Indonesian Education and Human Resources Sector Assessment. Workshop I covers the basic concepts, skills, and methods needed to design subsector updates and develop a draft plan for update activities. Workshops II and III…
E-Area Vault Concrete Material Property And Vault Durability/Degradation Projection Recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phifer, M. A.
2014-03-11
Subsequent to the 2008 E-Area Low-Level Waste Facility (ELLWF) Performance Assessment (PA) (WSRC 2008), two additional E-Area vault concrete property testing programs have been conducted (Dixon and Phifer 2010 and SIMCO 2011a) and two additional E-Area vault concrete durability modeling projections have been made (Langton 2009 and SIMCO 2012). All the information/data from these reports has been evaluated and consolidated herein by the Savannah River National Laboratory (SRNL) at the request of Solid Waste Management (SWM) to produce E-Area vault concrete hydraulic and physical property data and vault durability/degradation projection recommendations that are adequately justified for use within associated Special Analyses (SAs) and future PA updates. The Low Activity Waste (LAW) and Intermediate Level (IL) Vaults structural degradation predictions produced by Carey 2006 and Peregoy 2006, respectively, which were used as the basis for the 2008 ELLWF PA, remain valid based upon the results of the E-Area vault concrete durability simulations reported by Langton 2009 and those reported by SIMCO 2012. Therefore revised structural degradation predictions are not required so long as the mean thickness of the closure cap overlying the vaults is no greater than that assumed within Carey 2006 and Peregoy 2006. For the LAW Vault structural degradation prediction (Carey 2006), the mean thickness of the overlying closure cap was taken as nine feet. For the IL Vault structural degradation prediction (Peregoy 2006), the mean thickness of the overlying closure cap was taken as eight feet. The mean closure cap thicknesses as described here for both E-Area Vaults will be included as a key input and assumption (I&A) in the next revision to the closure plan for the ELLWF (Phifer et al. 2009). In addition, it has been identified as new input to the PA model to be assessed in the ongoing update to the new PA Information UDQE (Flach 2013). Once the UDQE is approved, the SWM Key I&A database will be updated with this new information.
77 FR 69447 - Inland Waterways Users Board
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-19
... inland navigation projects and studies and the status of the Inland Waterways Trust Fund, the funding... for 2012 and the project investment recommendations, along with updates of the Inland Marine...
Predictive Modeling of Cardiac Ischemia
NASA Technical Reports Server (NTRS)
Anderson, Gary T.
1996-01-01
The goal of the Contextual Alarms Management System (CALMS) project is to develop sophisticated models to predict the onset of clinical cardiac ischemia before it occurs. The system will continuously monitor cardiac patients and set off an alarm when they appear about to suffer an ischemic episode. The models take as inputs information from patient history and combine it with continuously updated information extracted from blood pressure, oxygen saturation and ECG lines. Expert system, statistical, neural network and rough set methodologies are then used to forecast the onset of clinical ischemia before it transpires, thus allowing early intervention aimed at preventing morbid complications from occurring. The models will differ from previous attempts by including combinations of continuous and discrete inputs. A commercial medical instrumentation and software company has invested funds in the project with a goal of commercialization of the technology. The end product will be a system that analyzes physiologic parameters and produces an alarm when myocardial ischemia is present. If proven feasible, a CALMS-based system will be added to existing heart monitoring hardware.
Timing Interactions in Social Simulations: The Voter Model
NASA Astrophysics Data System (ADS)
Fernández-Gracia, Juan; Eguíluz, Víctor M.; Miguel, Maxi San
The recent availability of huge high-resolution datasets on human activities has revealed the heavy-tailed nature of interevent time distributions. In social simulations of interacting agents the standard approach has been to use Poisson processes to update the state of the agents, which gives rise to very homogeneous activity patterns with a well-defined characteristic interevent time. As a paradigmatic opinion model we investigate the voter model, review the standard update rules, and propose two new update rules which are able to account for heterogeneous activity patterns. For the new update rules each node gets updated with a probability that depends on the time since the last event of the node, where an event can be an update attempt (exogenous update) or a change of state (endogenous update). We find that both update rules can give rise to power-law interevent time distributions, although the endogenous one does so more robustly. Moreover, for the exogenous update rule and the standard update rules the voter model does not reach consensus in the infinite-size limit, while for the endogenous update there exists a coarsening process that drives the system toward consensus configurations.
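A minimal simulation sketch contrasting a standard constant-rate (Poisson-like) update with an endogenous, age-dependent activation probability is given below. The ring-lattice topology, system size, and the specific 1/(1+age) form of the activation probability are illustrative assumptions, not the exact rules analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
N, T = 200, 5000
neighbours = [((i - 1) % N, (i + 1) % N) for i in range(N)]   # ring lattice

def run(endogenous):
    """Voter model; activation probability may depend on the node's 'age'."""
    state = rng.choice([-1, 1], size=N)
    last_event = np.zeros(N)                    # time of each node's last event
    for t in range(1, T + 1):
        if endogenous:
            # Endogenous rule: update probability decays with time since the
            # node's last change of state, producing broad interevent times.
            p = 1.0 / (1.0 + (t - last_event))
        else:
            # Standard rule: constant rate, i.e. Poisson-like update statistics.
            p = np.full(N, 0.05)
        for i in np.where(rng.random(N) < p)[0]:
            j = neighbours[i][rng.integers(2)]  # pick a random neighbour
            if state[i] != state[j]:
                state[i] = state[j]             # adopt the neighbour's opinion
                last_event[i] = t               # event = change of state
    return abs(state.mean())                    # |magnetisation|; 1.0 means consensus

print("standard  :", run(endogenous=False))
print("endogenous:", run(endogenous=True))
```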
75 FR 11162 - Combined Notice of Filings #1
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-10
...; Wildorado Wind, LLC. Description: San Juan Mesa Wind Project, LLC et al. submits the Updated Market Power... Power Marketing, LLC; High Majestic Wind Energy Center, LLC. Description: NextEra Companies submits.... Description: Golden Spread Electric Cooperative, Inc et al. submits an Updated Market Power Analysis. Filed...
DOT National Transportation Integrated Search
2008-05-01
Center for Advanced Transportation Infrastructure (CAIT) of Rutgers University is mandated to conduct Ground Penetrating Radar (GPR) surveys to update the NJDOT's pavement management system with GPR measured pavement layer thicknesses. Based on the r...
2017-12-01
people eagerly anticipated failed to materialize. Instead, the country fractured into a collection of well-organized Islamic militias armed with...is continuously updated in near real time. When applied to certain models, ICEWS can be a powerful predictive tool to “forecast select events of...project would be incomplete without you. It is possible that through your work, and the work of dedicated people like you, Libya will see better days
2017-09-01
Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions, by Matthew D. Bouwense.
NASA Astrophysics Data System (ADS)
Lu, M.; Lall, U.
2013-12-01
In order to mitigate the impacts of climate change, proactive management strategies to operate reservoirs and dams are needed. A multi-time-scale, climate-informed stochastic model is developed to optimize the operations of a multi-purpose single reservoir by simulating decadal, interannual, seasonal and sub-seasonal variability. We apply the model to a setting motivated by the largest multi-purpose dam in northern India, the Bhakhra reservoir on the Sutlej River, a tributary of the Indus. This leads to a focus on the timing and amplitude of the flows for the monsoon and snowmelt periods. The flow simulations are constrained by multiple sources of historical data and GCM future projections that are being developed through an NSF-funded project titled 'Decadal Prediction and Stochastic Simulation of Hydroclimate Over Monsoon Asia'. The model presented is a multilevel, nonlinear programming model that aims to optimize the reservoir operating policy on a decadal horizon and the operation strategy on an annually updated basis. The model is hierarchical: two optimization models designated for different time scales are nested like matryoshka dolls. The two optimization models have similar mathematical formulations, with some modifications to meet the constraints within each time frame. The first level of the model provides an optimization solution for policy makers to determine contracted annual releases to different uses with a prescribed reliability; the second level is a within-the-period (e.g., year) operation optimization scheme that allocates the contracted annual releases on a subperiod (e.g., monthly) basis, with an additional benefit for extra release and a penalty for failure. The model maximizes the net benefit of irrigation, hydropower generation and flood control in each of the periods. The model design thus facilitates the consistent application of weather and climate forecasts to improve the operation of reservoir systems. The decadal flow simulations are re-initialized every year with updated climate projections to improve the reliability of the operation rules for the next year, within which the seasonal operation strategies are nested. The multi-level structure can be repeated for monthly operation with weekly subperiods to take advantage of evolving weather forecasts and seasonal climate forecasts; as a result of the hierarchical structure, sub-seasonal and even weather-time-scale updates and adjustments can be achieved. Given an ensemble of these scenarios, the McISH reservoir simulation-optimization model is able to derive the desired reservoir storage levels, including minimum and maximum, as a function of calendar date, and the associated release patterns. The multi-time-scale approach allows adaptive management of water supplies acknowledging the changing risks, meeting the objectives over the decade in expected value and controlling the near-term and planning-period risk through probabilistic reliability constraints. For the applications presented, the target season is the monsoon season from June to September. The model also includes a monthly flood volume forecast model, based on a copula density fitted to the monthly flow and the flood volume. This is used to guide dynamic allocation of the flood control volume given the forecasts.
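Stripped of its multi-level and stochastic structure, the within-year allocation subproblem has the flavour of a small mathematical program: choose monthly releases to maximize weighted benefits subject to storage continuity and capacity limits. The single-year linear-programming toy below is only meant to convey that inner structure; the inflows, capacities, and benefit weights are invented, and the actual McISH model is a nested, nonlinear formulation with reliability constraints.

```python
import numpy as np
from scipy.optimize import linprog

T = 12
inflow = np.array([30, 25, 20, 15, 10, 60, 90, 80, 50, 35, 30, 28], float)        # monthly inflow
benefit = np.array([1.0, 1.0, 1.2, 1.5, 1.8, 1.0, 0.8, 0.8, 1.0, 1.2, 1.1, 1.0])  # value per unit release
S0, S_min, S_max, R_max = 100.0, 50.0, 300.0, 70.0

# Decision vector x = [R_1..R_12, S_1..S_12]; maximize benefit.R  ->  minimize -benefit.R.
c = np.concatenate([-benefit, np.zeros(T)])

# Storage continuity: S_t - S_{t-1} + R_t = inflow_t  (S_0 is a known constant).
A_eq = np.zeros((T, 2 * T))
b_eq = inflow.copy()
for t in range(T):
    A_eq[t, t] = 1.0               # R_t
    A_eq[t, T + t] = 1.0           # S_t
    if t > 0:
        A_eq[t, T + t - 1] = -1.0  # -S_{t-1}
b_eq[0] += S0

bounds = [(0.0, R_max)] * T + [(S_min, S_max)] * T
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
releases, storages = res.x[:T], res.x[T:]
print("optimal monthly releases:", np.round(releases, 1))
print("end-of-month storage    :", np.round(storages, 1))
```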
NASA Technical Reports Server (NTRS)
Schubert, Siegfried; Kang, In-Sik; Reale, Oreste
2009-01-01
This talk gives an update on the progress and further plans for a coordinated project to carry out and analyze high-resolution simulations of tropical storm activity with a number of state-of-the-art global climate models. Issues addressed include the mechanisms by which SSTs control tropical storm activity on inter-annual and longer time scales, the modulation of that activity by the Madden-Julian Oscillation on sub-seasonal time scales, as well as the sensitivity of the results to model formulation. The project also encourages companion coarser-resolution runs to help assess resolution dependence, and the ability of the models to capture the large-scale and long-term changes in the parameters important for hurricane development. Addressing the above science questions is critical to understanding the nature of the variability of the Asian-Australian monsoon and its regional impacts, and thus CLIVAR RAMP fully endorses the proposed tropical storm simulation activity. The project is open to all interested organizations and investigators, and the results from the runs will be shared among the participants, as well as made available to the broader scientific community for analysis.
Reviews and syntheses: Field data to benchmark the carbon cycle models for tropical forests
Clark, Deborah A.; Asao, Shinichi; Fisher, Rosie A.; Reed, Sasha C.; Reich, Peter B.; Ryan, Michael G.; Wood, Tana E.; Yang, Xiaojuan
2017-01-01
For more accurate projections of both the global carbon (C) cycle and the changing climate, a critical current need is to improve the representation of tropical forests in Earth system models. Tropical forests exchange more C, energy, and water with the atmosphere than any other class of land ecosystems. Further, tropical-forest C cycling is likely responding to the rapid global warming, intensifying water stress, and increasing atmospheric CO2 levels. Projections of the future C balance of the tropics vary widely among global models. A current effort of the modeling community, the ILAMB (International Land Model Benchmarking) project, is to compile robust observations that can be used to improve the accuracy and realism of the land models for all major biomes. Our goal with this paper is to identify field observations of tropical-forest ecosystem C stocks and fluxes, and of their long-term trends and climatic and CO2 sensitivities, that can serve this effort. We propose criteria for reference-level field data from this biome and present a set of documented examples from old-growth lowland tropical forests. We offer these as a starting point towards the goal of a regularly updated consensus set of benchmark field observations of C cycling in tropical forests.
Reviews and syntheses: Field data to benchmark the carbon cycle models for tropical forests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Deborah A.; Asao, Shinichi; Fisher, Rosie
For more accurate projections of both the global carbon (C) cycle and the changing climate, a critical current need is to improve the representation of tropical forests in Earth system models. Tropical forests exchange more C, energy, and water with the atmosphere than any other class of land ecosystems. Further, tropical-forest C cycling is likely responding to the rapid global warming, intensifying water stress, and increasing atmospheric CO2 levels. Projections of the future C balance of the tropics vary widely among global models. A current effort of the modeling community, the ILAMB (International Land Model Benchmarking) project, is to compile robust observations that can be used to improve the accuracy and realism of the land models for all major biomes. Our goal with this paper is to identify field observations of tropical-forest ecosystem C stocks and fluxes, and of their long-term trends and climatic and CO2 sensitivities, that can serve this effort. We propose criteria for reference-level field data from this biome and present a set of documented examples from old-growth lowland tropical forests. We offer these as a starting point towards the goal of a regularly updated consensus set of benchmark field observations of C cycling in tropical forests.
Reviews and syntheses: Field data to benchmark the carbon cycle models for tropical forests
NASA Astrophysics Data System (ADS)
Clark, Deborah A.; Asao, Shinichi; Fisher, Rosie; Reed, Sasha; Reich, Peter B.; Ryan, Michael G.; Wood, Tana E.; Yang, Xiaojuan
2017-10-01
For more accurate projections of both the global carbon (C) cycle and the changing climate, a critical current need is to improve the representation of tropical forests in Earth system models. Tropical forests exchange more C, energy, and water with the atmosphere than any other class of land ecosystems. Further, tropical-forest C cycling is likely responding to the rapid global warming, intensifying water stress, and increasing atmospheric CO2 levels. Projections of the future C balance of the tropics vary widely among global models. A current effort of the modeling community, the ILAMB (International Land Model Benchmarking) project, is to compile robust observations that can be used to improve the accuracy and realism of the land models for all major biomes. Our goal with this paper is to identify field observations of tropical-forest ecosystem C stocks and fluxes, and of their long-term trends and climatic and CO2 sensitivities, that can serve this effort. We propose criteria for reference-level field data from this biome and present a set of documented examples from old-growth lowland tropical forests. We offer these as a starting point towards the goal of a regularly updated consensus set of benchmark field observations of C cycling in tropical forests.
Reviews and syntheses: Field data to benchmark the carbon cycle models for tropical forests
Clark, Deborah A.; Asao, Shinichi; Fisher, Rosie; ...
2017-10-23
For more accurate projections of both the global carbon (C) cycle and the changing climate, a critical current need is to improve the representation of tropical forests in Earth system models. Tropical forests exchange more C, energy, and water with the atmosphere than any other class of land ecosystems. Further, tropical-forest C cycling is likely responding to the rapid global warming, intensifying water stress, and increasing atmospheric CO2 levels. Projections of the future C balance of the tropics vary widely among global models. A current effort of the modeling community, the ILAMB (International Land Model Benchmarking) project, is to compile robust observations that can be used to improve the accuracy and realism of the land models for all major biomes. Our goal with this paper is to identify field observations of tropical-forest ecosystem C stocks and fluxes, and of their long-term trends and climatic and CO2 sensitivities, that can serve this effort. We propose criteria for reference-level field data from this biome and present a set of documented examples from old-growth lowland tropical forests. We offer these as a starting point towards the goal of a regularly updated consensus set of benchmark field observations of C cycling in tropical forests.
ERM model analysis for adaptation to hydrological model errors
NASA Astrophysics Data System (ADS)
Baymani-Nezhad, M.; Han, D.
2018-05-01
Hydrological conditions change continuously, and these changes introduce errors into flood forecasting models that can lead to unrealistic results. To overcome these difficulties, a concept called model updating has been proposed in hydrological studies. Real-time model updating is one of the challenging processes in hydrological sciences and has not been entirely solved due to lack of knowledge about the future state of the catchment under study. In the flood forecasting process, errors propagated from the rainfall-runoff model are considered the main source of uncertainty in the forecasting model. Hence, to control these errors, several methods have been proposed by researchers to update rainfall-runoff models, such as parameter updating, model state updating, and correction of input data. The current study investigates the ability of rainfall-runoff model parameters to cope with three common types of error in hydrological modelling: timing, shape and volume. A new lumped model, the ERM model, has been selected for this study to evaluate whether its parameters can be updated to cope with the stated errors. Investigation of ten events shows that the ERM model parameters can be updated to cope with the errors without the need to recalibrate the model.
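As a toy illustration of parameter updating of this kind (not the ERM model itself), the sketch below re-estimates two hypothetical correction parameters, a volume gain and an integer timing shift, against a synthetic observed hydrograph instead of recalibrating the full model; the search grid and event are illustrative assumptions.

```python
# Toy parameter-updating sketch under assumed correction parameters, not the ERM code.
import numpy as np

def update_parameters(simulated, observed, gains=np.linspace(0.5, 1.5, 21),
                      shifts=range(-3, 4)):
    """Grid-search the gain/shift pair minimising RMSE against observations."""
    best = (None, None, np.inf)
    for g in gains:
        for s in shifts:
            corrected = g * np.roll(simulated, s)
            rmse = np.sqrt(np.mean((corrected - observed) ** 2))
            if rmse < best[2]:
                best = (g, s, rmse)
    return best

# Synthetic event: the "model" underestimates volume by 20% and peaks one step early.
t = np.arange(48)
observed = np.exp(-0.5 * ((t - 20) / 4.0) ** 2)
simulated = 0.8 * np.exp(-0.5 * ((t - 19) / 4.0) ** 2)
gain, shift, rmse = update_parameters(simulated, observed)
print(f"updated gain={gain:.2f}, timing shift={shift}, rmse={rmse:.3f}")
```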
2010-02-01
interdependencies, and then modifying plans according to updated projections. This is currently an immature area where further research is required. The...crosscutting.html. [7] Zeigler, B.P. and Hammonds, P. (2007). "Modelling and Simulation-Based Data Engineering: Introducing Pragmatics and Ontologies for...the optimum benefit to be obtained and while immature, ongoing research needs to be maintained. 20) Use of M&S to support complex operations needs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, Ray Alden; Zou, Ling; Zhao, Haihua
This document summarizes the physical models and mathematical formulations used in the RELAP-7 code. In summary, the MOOSE-based RELAP-7 code development is an ongoing effort. The MOOSE framework enables rapid development of the RELAP-7 code. The developmental efforts and results demonstrate that the RELAP-7 project is on a path to success. This theory manual documents the main features implemented in the RELAP-7 code. Because the code is an ongoing development effort, this RELAP-7 Theory Manual will evolve with periodic updates to keep it current with the state of the development, implementation, and model additions/revisions.
Project Tune-Up: A New Look at Project Prepare. Final Report. 1994-1995.
ERIC Educational Resources Information Center
Central Intermediate Unit 10, Pleasant Gap, PA.
This document includes a report on Project Tune-Up, which was conducted to update Project Prepare, a program to help Pennsylvania adult learners prepare for business school entrance examinations, the PSB [Pennsylvania State Board]-Aptitude for Practical Nursing examination, and college placement examinations. The report describes how the Project…
49 CFR 611.207 - Overall New Starts project ratings.
Code of Federal Regulations, 2013 CFR
2013-10-01
... evaluation. (2) Ratings for individual projects will be developed upon entry into engineering and prior to an FFGA. Additionally, ratings may be updated while a project is in engineering if the project scope and cost have changed materially since the most recent rating was assigned. (c) These ratings will be used...
A review of statistical updating methods for clinical prediction models.
Su, Ting-Li; Jaki, Thomas; Hickey, Graeme L; Buchan, Iain; Sperrin, Matthew
2018-01-01
A clinical prediction model is a tool for predicting healthcare outcomes, usually within a specific population and context. A common approach is to develop a new clinical prediction model for each population and context; however, this wastes potentially useful historical information. A better approach is to update or incorporate the existing clinical prediction models already developed for use in similar contexts or populations. In addition, clinical prediction models commonly become miscalibrated over time, and need replacing or updating. In this article, we review a range of approaches for re-using and updating clinical prediction models; these fall into three main categories: simple coefficient updating, combining multiple previous clinical prediction models in a meta-model, and dynamic updating of models. We evaluated the performance (discrimination and calibration) of the different strategies using data on mortality following cardiac surgery in the United Kingdom. We found that no single strategy performed sufficiently well to be used to the exclusion of the others. In conclusion, useful tools exist for updating existing clinical prediction models to a new population or context, and these should be implemented, using a breadth of complementary statistical methods, rather than developing a new clinical prediction model from scratch.
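As a concrete illustration of the simplest of these families, the sketch below shows logistic recalibration: the existing model's linear predictor is kept and only a calibration intercept and slope are re-estimated on the new population. The data, coefficients and the updated_risk helper are illustrative assumptions, not taken from the study above.

```python
# Minimal logistic-recalibration sketch on synthetic data (assumed values throughout).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Existing clinical prediction model (assumed): coefficients from the original population.
old_intercept, old_coefs = -2.0, np.array([0.8, 0.5, 1.2])

# New population where the model is miscalibrated (different baseline risk and effects).
X_new = rng.normal(size=(2000, 3))
true_lp = -1.2 + X_new @ np.array([0.9, 0.4, 1.0])
y_new = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_lp)))

# Step 1: linear predictor of the *old* model on the new data.
lp_old = old_intercept + X_new @ old_coefs

# Step 2: recalibrate -- fit intercept and slope for lp_old on the new outcomes
# (large C makes the fit effectively unpenalised).
recal = LogisticRegression(C=1e6).fit(lp_old.reshape(-1, 1), y_new)
print("calibration intercept:", recal.intercept_[0])
print("calibration slope:", recal.coef_[0, 0])

def updated_risk(x):
    """Updated risk prediction: old coefficients, rescaled by the recalibration fit."""
    lp = old_intercept + x @ old_coefs
    z = recal.intercept_[0] + recal.coef_[0, 0] * lp
    return 1.0 / (1.0 + np.exp(-z))

print("example updated risk:", updated_risk(np.array([0.5, -0.2, 1.0])))
```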
NASA Human Health and Performance Center: Open innovation successes and collaborative projects
NASA Astrophysics Data System (ADS)
Richard, Elizabeth E.; Davis, Jeffrey R.
2014-11-01
In May 2007, what was then the Space Life Sciences Directorate published the 2007 Space Life Sciences Strategy for Human Space Exploration, setting the course for development and implementation of new business models and significant advances in external collaboration over the next five years. The strategy was updated on the basis of these accomplishments and reissued as the NASA Human Health and Performance Strategy in 2012, and continues to drive new approaches to innovation for the directorate. This short paper describes the successful execution of the strategy, driving organizational change through open innovation efforts and collaborative projects, including efforts of the NASA Human Health and Performance Center (NHHPC).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koning, A.J.; Bersillon, O.; Forrest, R. A.
The status of the Joint Evaluated Fission and Fusion file (JEFF) is described. The next version of the library, JEFF-3.1, comprises a significant update of actinide evaluations, evaluations emerging from European nuclear data projects, the activation library JEFF-3/A, the decay data and fission yield library, and fusion-related data files from the EFF project. The revisions were motivated by the availability of new measurements, modelling capabilities, or trends from integral experiments. Various pre-release validation efforts are underway, mainly for criticality and shielding of thermal and fast systems. This JEFF-3.1 library is expected to provide improved performance with respect to previous releases for a variety of scientific and industrial applications.
77 FR 19281 - Environmental Impacts Statements; Notice of Availability
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-30
..., FL, Central and Southern Florida Project, Broward County Water Preserve Areas, Updates Resulting from Policy Changes that occurred since 2007 Civil Works Board Approval, South Florida Water Management... for this project. EIS No. 20120089, Final EIS, USFS, CA, Greys Mountain Ecological Restoration Project...
Report on all ARRA Funded Technical Work
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
2013-10-05
The main focus of this American Recovery and Reinvestment Act of 2009 (ARRA) funded project was to design an energy efficient carbon capture and storage (CCS) process using the Recipient's membrane system for H2 separation and CO2 capture. In the ARRA-funded project, the Recipient accelerated development and scale-up of ongoing hydrogen membrane technology research and development (R&D). Specifically, this project focused on accelerating the current R&D work scope of the base program-funded project, involving lab scale tests, detail design of a 250 lb/day H2 process development unit (PDU), and scale-up of membrane tube and coating manufacturing. This project scope included the site selection and a Front End Engineering Design (FEED) study of a nominally 4 to 10 ton-per-day (TPD) Pre-Commercial Module (PCM) hydrogen separation membrane system. Process models and techno-economic analysis were updated to include studies on integration of this technology into an Integrated Gasification Combined Cycle (IGCC) power generation system with CCS.
A vision and strategy for the virtual physiological human in 2010 and beyond.
Hunter, Peter; Coveney, Peter V; de Bono, Bernard; Diaz, Vanessa; Fenner, John; Frangi, Alejandro F; Harris, Peter; Hose, Rod; Kohl, Peter; Lawford, Pat; McCormack, Keith; Mendes, Miriam; Omholt, Stig; Quarteroni, Alfio; Skår, John; Tegner, Jesper; Randall Thomas, S; Tollis, Ioannis; Tsamardinos, Ioannis; van Beek, Johannes H G M; Viceconti, Marco
2010-06-13
European funding under framework 7 (FP7) for the virtual physiological human (VPH) project has been in place now for nearly 2 years. The VPH network of excellence (NoE) is helping in the development of common standards, open-source software, freely accessible data and model repositories, and various training and dissemination activities for the project. It is also helping to coordinate the many clinically targeted projects that have been funded under the FP7 calls. An initial vision for the VPH was defined by framework 6 strategy for a European physiome (STEP) project in 2006. It is now time to assess the accomplishments of the last 2 years and update the STEP vision for the VPH. We consider the biomedical science, healthcare and information and communications technology challenges facing the project and we propose the VPH Institute as a means of sustaining the vision of VPH beyond the time frame of the NoE.
A vision and strategy for the virtual physiological human in 2010 and beyond
Hunter, Peter; Coveney, Peter V.; de Bono, Bernard; Diaz, Vanessa; Fenner, John; Frangi, Alejandro F.; Harris, Peter; Hose, Rod; Kohl, Peter; Lawford, Pat; McCormack, Keith; Mendes, Miriam; Omholt, Stig; Quarteroni, Alfio; Skår, John; Tegner, Jesper; Randall Thomas, S.; Tollis, Ioannis; Tsamardinos, Ioannis; van Beek, Johannes H. G. M.; Viceconti, Marco
2010-01-01
European funding under framework 7 (FP7) for the virtual physiological human (VPH) project has been in place now for nearly 2 years. The VPH network of excellence (NoE) is helping in the development of common standards, open-source software, freely accessible data and model repositories, and various training and dissemination activities for the project. It is also helping to coordinate the many clinically targeted projects that have been funded under the FP7 calls. An initial vision for the VPH was defined by framework 6 strategy for a European physiome (STEP) project in 2006. It is now time to assess the accomplishments of the last 2 years and update the STEP vision for the VPH. We consider the biomedical science, healthcare and information and communications technology challenges facing the project and we propose the VPH Institute as a means of sustaining the vision of VPH beyond the time frame of the NoE. PMID:20439264
Invasive Species Science Update (No. 1)
Mee-Sook Kim; Jack Butler
2008-01-01
This electronic newsletter (Invasive Species Science Update) is published by the Rocky Mountain Research Station (RMRS) Cross-Program, Interdisciplinary Project team on Invasive Species. This newsletter will be published 3 times per year and is intended to enhance communication among RMRS scientists, wildland managers, other partners, stakeholders, and customers about...
Updating of the Curriculum for Industrial Refrigeration Course. Final Report.
ERIC Educational Resources Information Center
Eley, Robert H.
A project was conducted at Wenatchee Valley College (Washington) to update the curriculum for the industrial refrigeration technician course. First, resource groups and information networks were contacted in order to obtain a wide range of available resources. In addition, contractors, manufacturers, and government agencies that serve the Pacific…
Net merit as a measure of lifetime profit: 2010 revision
USDA-ARS?s Scientific Manuscript database
The 2010 revision of net merit (NM$) updates a number of key economic values as well as milk utilization statistics. Members of Project S-1040, Genetic Selection and Crossbreeding To Enhance Reproduction and Survival of Dairy Cattle, provided updated incomes and expenses used to estimate lifetime pr...
AN UPDATE ON SOME ARSENIC PROGRAMS AT THE US EPA
An Update on Some Arsenic Projects at the United States Environmental Protection Agency
Charles O. Abernathy, Michael Beringer, Rebecca L. Calderon, Timothy McMahon and Erik Winchester
Offices of Science and Technology, Solid Waste...
Women and Geography, 1993: A Bibliography.
ERIC Educational Resources Information Center
Lee, David
This bibliography is an update of "Women and Geography: A Comprehensive Bibliography" which is published under the auspices of the Specialty Group on Geographic Perspectives on Women of the Association of American Geographers. Updates of the bibliography are issued annually. This bibliography project culminates more than a decade of research on…
Space Mission Human Reliability Analysis (HRA) Project
NASA Technical Reports Server (NTRS)
Boyer, Roger
2014-01-01
The purpose of the Space Mission Human Reliability Analysis (HRA) Project is to extend current ground-based HRA risk prediction techniques to a long-duration, space-based tool. Ground-based HRA methodology has been shown to be a reasonable tool for short-duration space missions, such as Space Shuttle and lunar fly-bys. However, longer-duration deep-space missions, such as asteroid and Mars missions, will require the crew to be in space for missions as long as 400 to 900 days, with periods of extended autonomy and self-sufficiency. Current indications are that higher risk due to fatigue, physiological effects of extended low-gravity environments, and other factors may impact HRA predictions. For this project, Safety & Mission Assurance (S&MA) will work with Human Health & Performance (HH&P) to establish what is currently used to assess human reliability for human space programs, identify human performance factors that may be sensitive to long duration space flight, collect available historical data, and update current tools to account for performance shaping factors believed to be important to such missions. This effort will also contribute data to the Human Performance Data Repository and influence the Space Human Factors Engineering research risks and gaps (part of the HRP Program). An accurate risk predictor mitigates Loss of Crew (LOC) and Loss of Mission (LOM). The end result will be an updated HRA model that can effectively predict risk on long-duration missions.
NASA Astrophysics Data System (ADS)
Fu, Y.; Yang, W.; Xu, O.; Zhou, L.; Wang, J.
2017-04-01
To investigate time-variant and nonlinear characteristics in industrial processes, a soft sensor modelling method based on time difference, moving-window recursive partial least squares (PLS) and adaptive model updating is proposed. In this method, time difference values of input and output variables are used as training samples to construct the model, which can reduce the effects of the nonlinear characteristic on modelling accuracy and retain the advantages of the recursive PLS algorithm. To avoid excessively frequent model updating, a confidence value is introduced, which can be updated adaptively according to the results of the model performance assessment. Once the confidence value is updated, the model can be updated. The proposed method has been used to predict the 4-carboxy-benz-aldehyde (CBA) content in the purified terephthalic acid (PTA) oxidation reaction process. The results show that the proposed soft sensor modelling method can reduce computation effectively, improve prediction accuracy by making use of process information and reflect the process characteristics accurately.
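The following sketch illustrates, on synthetic data, the combination of time-difference features, a windowed PLS model and a simple error-triggered update that refits only when prediction performance degrades. It is an assumption-laden stand-in for the authors' recursive PLS and confidence-value scheme: the window length, threshold and data are arbitrary, and a moving-window refit replaces the true recursive update.

```python
# Time-difference soft sensor sketch with a moving-window PLS refit (assumed data/thresholds).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
n, p = 600, 5
X = np.cumsum(rng.normal(size=(n, p)), axis=0)          # slowly drifting process inputs
y = X @ np.array([0.5, -0.2, 0.1, 0.3, -0.4]) + 0.1 * rng.normal(size=n)

# Time-difference samples reduce the impact of slow drift / nonlinearity.
dX, dy = np.diff(X, axis=0), np.diff(y)

window, threshold = 100, 0.2           # moving-window length and update trigger (assumed)
model = PLSRegression(n_components=3).fit(dX[:window], dy[:window])
errors, updates = [], 0

for t in range(window, len(dy)):
    pred = model.predict(dX[t].reshape(1, -1)).ravel()[0]
    err = abs(pred - dy[t])
    errors.append(err)
    if err > threshold:                # refit only when performance degrades
        model = PLSRegression(n_components=3).fit(dX[t - window + 1:t + 1],
                                                  dy[t - window + 1:t + 1])
        updates += 1

print(f"mean abs error: {np.mean(errors):.3f}, window refits: {updates}")
```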
Integrated Main Propulsion System Performance Reconstruction Process/Models
NASA Technical Reports Server (NTRS)
Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael
2013-01-01
The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.
78 FR 20359 - NASA Advisory Council; Technology and Innovation Committee; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-04
... NASA Robotics Technologies project and NASA's work with the National Robotics Initiative; and an annual... Sail project --Update on NASA's Robotic Technologies and the National Robotics Initiative It is...
Update rules and interevent time distributions: slow ordering versus no ordering in the voter model.
Fernández-Gracia, J; Eguíluz, V M; San Miguel, M
2011-07-01
We introduce a general methodology of update rules accounting for arbitrary interevent time (IET) distributions in simulations of interacting agents. We consider in particular update rules that depend on the state of the agent, so that the update becomes part of the dynamical model. As an illustration we consider the voter model in fully connected, random, and scale-free networks with an activation probability inversely proportional to the time since the last action, where an action can be an update attempt (an exogenous update) or a change of state (an endogenous update). We find that in the thermodynamic limit, at variance with standard updates and the exogenous update, the system orders slowly for the endogenous update. The approach to the absorbing state is characterized by a power-law decay of the density of interfaces, and we observe that the mean time to reach the absorbing state might not be well defined. The IET distributions resulting from both update schemes show power-law tails.
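A minimal simulation of the endogenous-update variant on a fully connected network is sketched below; the network size, time horizon and the interface-density proxy are illustrative choices rather than the authors' setup, and each node's activation probability decays inversely with the time since its last change of state.

```python
# Endogenous-update voter model on a complete graph (illustrative parameters only).
import numpy as np

rng = np.random.default_rng(3)
N, T = 200, 2000
state = rng.integers(0, 2, size=N)        # two opinions, 0 / 1
last_change = np.ones(N)                  # time since last change of state (endogenous clock)

density_interfaces = []
for t in range(T):
    # Activate node i with probability ~ 1 / tau_i (inverse of time since last change).
    active = rng.random(N) < 1.0 / last_change
    for i in np.flatnonzero(active):
        j = rng.integers(N)               # random neighbour on the complete graph
        if j != i and state[j] != state[i]:
            state[i] = state[j]
            last_change[i] = 1.0          # reset the clock only on a change of state
    last_change += 1.0
    x = np.mean(state)
    rho = 2.0 * x * (1.0 - x)             # approximate density of active links
    density_interfaces.append(rho)
    if rho == 0:
        break

print(f"steps simulated: {t + 1}, final interface density: {density_interfaces[-1]:.4f}")
```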
NASA Astrophysics Data System (ADS)
Berckmans, Julie; Hamdi, Rafiq; De Troch, Rozemien; Giot, Olivier
2015-04-01
At the Royal Meteorological Institute of Belgium (RMI), climate simulations are performed with the regional climate model (RCM) ALARO, a version of the ALADIN model with improved physical parameterizations. In order to obtain high-resolution information on the regional climate, lateral boundary conditions (LBCs) are prescribed from the global climate model (GCM) ARPEGE. Dynamical downscaling is commonly done as a continuous long-term simulation, with the model initialised at the start and driven by the regularly updated LBCs of the GCM. Recently, there has been growing interest in a dynamical downscaling approach with frequent reinitialisation of the climate simulations. In these experiments, the model is initialised daily and driven for 24 hours by the GCM. The surface is either initialised daily together with the atmosphere or left free to evolve continuously. The surface scheme implemented in ALARO is SURFEX, which can be run either in coupled mode or in stand-alone mode. The regional climate is simulated on different domains, at 20 km horizontal resolution over Western Europe and 4 km horizontal resolution over Belgium. In addition, SURFEX allows a stand-alone (offline) simulation at 1 km horizontal resolution over Belgium. This research is in the framework of the project MASC: "Modelling and Assessing Surface Change Impacts on Belgian and Western European Climate", a 4-year project funded by the Belgian Federal Government. The overall aim of the project is to study the feedbacks between climate changes and land surface changes in order to improve regional climate model projections at the decadal scale over Belgium and Western Europe and thus to provide better climate projections and climate change evaluation tools to policy makers, stakeholders and the scientific community.
An integrated knowledge system for wind tunnel testing - Project Engineers' Intelligent Assistant
NASA Technical Reports Server (NTRS)
Lo, Ching F.; Shi, George Z.; Hoyt, W. A.; Steinle, Frank W., Jr.
1993-01-01
The Project Engineers' Intelligent Assistant (PEIA) is an integrated knowledge system developed using artificial intelligence technology, including hypertext, expert systems, and dynamic user interfaces. This system integrates documents, engineering codes, databases, and knowledge from domain experts into an enriched hypermedia environment and was designed to assist project engineers in planning and conducting wind tunnel tests. PEIA is a modular system which consists of an intelligent user-interface, seven modules and an integrated tool facility. Hypermedia technology is discussed and the seven PEIA modules are described. System maintenance and updating is very easy due to the modular structure and the integrated tool facility provides user access to commercial software shells for documentation, reporting, or database updating. PEIA is expected to provide project engineers with technical information, increase efficiency and productivity, and provide a realistic tool for personnel training.
Technical Education Outreach in Materials Science and Technology Based on NASA's Materials Research
NASA Technical Reports Server (NTRS)
Jacobs, James A.
2003-01-01
The grant NAG-1-2125, Technical Education Outreach in Materials Science and Technology, based on NASA's Materials Research, involves collaborative effort among the National Aeronautics and Space Administration's Langley Research Center (NASA-LaRC), Norfolk State University (NSU), national research centers, private industry, technical societies, colleges and universities. The collaboration aims to strengthen math, science and technology education by providing outreach related to materials science and technology (MST). The goal of the project is to transfer new developments from LaRC's Center for Excellence for Structures and Materials and other NASA materials research into technical education across the nation to provide educational outreach and strengthen technical education. To achieve this goal we are employing two main strategies: 1) development of the gateway website
The Site-Scale Saturated Zone Flow Model for Yucca Mountain
NASA Astrophysics Data System (ADS)
Al-Aziz, E.; James, S. C.; Arnold, B. W.; Zyvoloski, G. A.
2006-12-01
This presentation provides a reinterpreted conceptual model of the Yucca Mountain site-scale flow system subject to all quality assurance procedures. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain, which is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. This effort started from the ground up with a revised and updated hydrogeologic framework model, which incorporates the latest lithology data, and increased grid resolution that better resolves the hydrogeologic framework, which was updated throughout the model domain. In addition, faults are much better represented using the 250 × 250 m2 spacing (compared to the previous model's 500 × 500 m2 spacing). Data collected since the previous model calibration effort have been included; they comprise all Nye County water-level data through Phase IV of their Early Warning Drilling Program. Target boundary fluxes are derived from the newest (2004) Death Valley Regional Flow System model from the U.S. Geological Survey. A consistent weighting scheme assigns importance to each measured water-level datum and boundary flux extracted from the regional model. The numerical model is calibrated by matching these weighted water level measurements and boundary fluxes using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (hydrologic simulation code FEHM v2.24 and parameter estimation software PEST v5.5) and model setup facilitate efficient calibration of multiple conceptual models. Analyses evaluate the impact of these updates and additional data on the modeled potentiometric surface and the flowpaths emanating from below the repository. After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the proposed repository and compare them to those from the previous model calibration. Specific discharge at a point 5 km from the repository is also examined and found to be within acceptable uncertainty. The results show that the updated model yields a calibration with smaller residuals than the previous model revision while ensuring that flowpaths follow measured gradients and paths derived from hydrochemical analyses. This work was supported by the Yucca Mountain Site Characterization Office as part of the Civilian Radioactive Waste Management Program, which is managed by the U.S. Department of Energy, Yucca Mountain Site Characterization Project. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
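The weighted-calibration idea can be illustrated generically. The sketch below is not FEHM or PEST: it fits two hydraulic conductivities and a boundary flux of a toy one-dimensional, two-zone aquifer to weighted head observations and a target flux, with all observation values, weights and bounds chosen for illustration only.

```python
# Generic weighted least-squares calibration sketch (illustrative 1-D aquifer, not FEHM/PEST).
import numpy as np
from scipy.optimize import least_squares

x_obs = np.array([100.0, 300.0, 600.0, 900.0])        # observation locations (m)
h_obs = np.array([99.0, 97.0, 93.0, 87.0])            # observed heads (m)
q_target, h0 = 1.0e-6, 100.0                          # target boundary flux, fixed head at x=0
w_head, w_flux = 1.0, 5.0e5                           # weights for the two datum types

def heads(params, x):
    """Steady 1-D heads for two conductivity zones, [0, 500) m and [500, 1000] m."""
    K1, K2, q = params
    travel = np.where(x < 500.0, x / K1, 500.0 / K1 + (x - 500.0) / K2)
    return h0 - q * travel

def weighted_residuals(params):
    r_head = w_head * (heads(params, x_obs) - h_obs)
    r_flux = w_flux * (params[2] - q_target)
    return np.concatenate([r_head, [r_flux]])

fit = least_squares(weighted_residuals, x0=[1e-4, 1e-4, 1e-6],
                    bounds=([1e-7, 1e-7, 1e-8], [1e-2, 1e-2, 1e-4]))
print("calibrated K1, K2, q:", fit.x)
```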
NASA Astrophysics Data System (ADS)
Ghosh, S.; Lopez-Coto, I.; Prasad, K.; Karion, A.; Mueller, K.; Gourdji, S.; Martin, C.; Whetstone, J. R.
2017-12-01
The National Institute of Standards and Technology (NIST) supports the North-East Corridor Baltimore Washington (NEC-B/W) project and the Indianapolis Flux Experiment (INFLUX), which aim to quantify sources of greenhouse gas (GHG) emissions as well as their uncertainties. These projects employ different flux estimation methods, including top-down inversion approaches. The traditional Bayesian inversion method estimates emission distributions by updating prior information using atmospheric GHG observations coupled to an atmospheric transport and dispersion model. The magnitude of the update depends upon the observed enhancement along with the assumed errors, such as those associated with prior information and the atmospheric transport and dispersion model. These errors are specified within the inversion covariance matrices. The assumed structure and magnitude of the specified errors can have a large impact on the emission estimates from the inversion. The main objective of this work is to build a data-adaptive model for these covariance matrices. We construct a synthetic data experiment using a Kalman filter inversion framework (Lopez et al., 2017) employing different configurations of the transport and dispersion model and an assumed prior. Unlike previous traditional Bayesian approaches, we estimate posterior emissions using regularized sample covariance matrices associated with prior errors to investigate whether the structure of the matrices helps to better recover our hypothetical true emissions. To incorporate transport model error, we use an ensemble of transport models combined with a space-time analytical covariance to construct a covariance that accounts for errors in space and time. A Kalman filter is then run using these covariances along with maximum likelihood estimates (MLE) of the involved parameters. Preliminary results indicate that specifying spatio-temporally varying errors in the error covariances can improve the flux estimates and uncertainties. We also demonstrate that differences between the modeled and observed meteorology can be used to predict uncertainties associated with atmospheric transport and dispersion modeling, which can help improve the skill of an inversion at urban scales.
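The Bayesian/Kalman update at the core of such inversions can be written compactly. The sketch below uses entirely synthetic matrices (the prior covariance B, transport operator H and model-data mismatch covariance R are placeholders, not NIST products) to show how the specified covariances shape the posterior emission estimate and its covariance.

```python
# Compact Bayesian inversion (Kalman-type) update with synthetic covariances.
import numpy as np

rng = np.random.default_rng(4)
n_flux, n_obs = 20, 12

x_prior = np.full(n_flux, 1.0)                         # prior emissions (arbitrary units)
B = 0.25 * np.exp(-np.abs(np.subtract.outer(np.arange(n_flux),
                                            np.arange(n_flux))) / 3.0)   # correlated prior errors
H = rng.uniform(0.0, 0.2, size=(n_obs, n_flux))        # stand-in transport/footprint operator
R = 0.05 * np.eye(n_obs)                               # model-data mismatch covariance

x_true = x_prior + rng.multivariate_normal(np.zeros(n_flux), B)
y = H @ x_true + rng.multivariate_normal(np.zeros(n_obs), R)

# Kalman-type update: gain K, posterior mean and posterior covariance.
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_post = x_prior + K @ (y - H @ x_prior)
A_post = (np.eye(n_flux) - K @ H) @ B

print("prior RMSE:", np.sqrt(np.mean((x_prior - x_true) ** 2)))
print("posterior RMSE:", np.sqrt(np.mean((x_post - x_true) ** 2)))
```

Re-running such an update with different assumed B and R (for example, sample-based versus analytical covariances) is one way to see how strongly the specified error structure controls the recovered fluxes.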
Projections of Education Statistics to 2001: An Update.
ERIC Educational Resources Information Center
Gerald, Debra E.; Hussar, William J.
Statistical projections for elementary and secondary schools and institutions of higher education are provided at the national and state levels through the year 2001. National projection tables cover enrollment, high school graduates, earned degrees conferred, classroom teachers, and expenditures of public elementary and secondary schools.…
The Middle Eastern Regional Irrigation Management Information Systems project-update
USDA-ARS?s Scientific Manuscript database
The Middle Eastern Regional Irrigation Management Information Systems Project (MERIMIS) was formulated at a meeting of experts from the region in Jordan in 2003. Funded by the U.S. Department of State, it is a cooperative regional project bringing together participants from Israel, Jordan, Palestini...
US Gateway to SIMBAD Astronomical Database
NASA Technical Reports Server (NTRS)
Eichhorn, G.
1998-01-01
During the last year the US SIMBAD Gateway Project continued to provide services like user registration to the US users of the SIMBAD database in France. User registration is required by the SIMBAD project in France. Currently, there are almost 3000 US users registered. We also provide user support by answering questions from users and handling requests for lost passwords. We have worked with the CDS SIMBAD project to provide access to the SIMBAD database to US users on an Internet address basis. This will allow most US users to access SIMBAD without having to enter passwords. This new system was installed in August, 1998. The SIMBAD mirror database at SAO is fully operational. We worked with the CDS to adapt it to our computer system. We implemented automatic updating procedures that update the database and password files daily. This mirror database provides much better access to the US astronomical community. We also supported a demonstration of the SIMBAD database at the meeting of the American Astronomical Society in January. We shipped computer equipment to the meeting and provided support for the demonstration activities at the SIMBAD booth. We continued to improve the cross-linking between the SIMBAD project and the Astrophysics Data System. This cross-linking between these systems is very much appreciated by the users of both the SIMBAD database and the ADS Abstract Service. The mirror of the SIMBAD database at SAO makes this connection faster for the US astronomers. The close cooperation between the CDS in Strasbourg and SAO, facilitated by this project, is an important part of the astronomy-wide digital library initiative called Urania. It has proven to be a model in how different data centers can collaborate and enhance the value of their products by linking with other data centers.
Comparing Methods for Dynamic Airspace Configuration
NASA Technical Reports Server (NTRS)
Zelinski, Shannon; Lai, Chok Fung
2011-01-01
This paper compares airspace design solutions for dynamically reconfiguring airspace in response to nominal daily traffic volume fluctuation. Airspace designs from seven algorithmic methods and a representation of current day operations in Kansas City Center were simulated with two times today's traffic demand. A three-configuration scenario was used to represent current day operations. Algorithms used projected unimpeded flight tracks to design initial 24-hour plans to switch between three configurations at predetermined reconfiguration times. At each reconfiguration time, algorithms used updated projected flight tracks to update the subsequent planned configurations. Compared to the baseline, most airspace design methods reduced delay and increased reconfiguration complexity, with similar traffic pattern complexity results. Design updates enabled several methods to reduce delay by as much as half relative to their original designs. Freeform design methods reduced delay and increased reconfiguration complexity the most.
The mouse-human anatomy ontology mapping project.
Hayamizu, Terry F; de Coronado, Sherri; Fragoso, Gilberto; Sioutos, Nicholas; Kadin, James A; Ringwald, Martin
2012-01-01
The overall objective of the Mouse-Human Anatomy Project (MHAP) was to facilitate the mapping and harmonization of anatomical terms used for mouse and human models by Mouse Genome Informatics (MGI) and the National Cancer Institute (NCI). The anatomy resources designated for this study were the Adult Mouse Anatomy (MA) ontology and the set of anatomy concepts contained in the NCI Thesaurus (NCIt). Several methods and software tools were identified and evaluated, then used to conduct an in-depth comparative analysis of the anatomy ontologies. Matches between mouse and human anatomy terms were determined and validated, resulting in a highly curated set of mappings between the two ontologies that has been used by other resources. These mappings will enable linking of data from mouse and human. As the anatomy ontologies have been expanded and refined, the mappings have been updated accordingly. Insights are presented into the overall process of comparing and mapping between ontologies, which may prove useful for further comparative analyses and ontology mapping efforts, especially those involving anatomy ontologies. Finally, issues concerning further development of the ontologies, updates to the mapping files, and possible additional applications and significance were considered. DATABASE URL: http://obofoundry.org/cgi-bin/detail.cgi?id=ma2ncit.
McFarland, Forrest S.; Lienkaemper, James J.; Caskey, S. John; Grove, Karen
2007-01-01
Introduction Our purpose is to update, with six additional years of data, our creep data archive on San Francisco Bay region active faults for use by the scientific research community. Earlier data (1979-2001) were reported in Galehouse (2002) and were analyzed and described in detail in a summary report (Galehouse and Lienkaemper, 2003). A complete analysis of our earlier results obtained on the Hayward fault was presented in Lienkaemper, Galehouse and Simpson (2001). Jon Galehouse of San Francisco State University (SFSU) and many student research assistants measured creep (aseismic slip) rates on these faults from 1979 until his retirement from the project in 2001. The creep measurement project, which was initiated by Galehouse, has continued through the Geosciences Department at SFSU from 2001-2006 under the direction of Co-P.I.'s Karen Grove and John Caskey (Grove and Caskey, 2005), and by Caskey since 2006. Forrest McFarland has managed most of the technical and logistical project operations as well as data processing and compilation since 2001. We plan to publish detailed analyses of these updated creep data in future publications. We maintain a project web site (http://funnel.sfsu.edu/creep/) that includes the following information: project description, project personnel, creep characteristics and measurement, map of creep measurement sites, creep measurement site information, and data plots for each measurement site. Our most current, annually updated results are therefore accessible to the scientific community and to the general public. Information about the project can currently be requested by the public via an email link (fltcreep@sfsu.edu) found on our project website.
NASA Technical Reports Server (NTRS)
Avermann, M.; Bischoff, L.; Brockmeyer, P.; Buhl, D.; Deutsch, A.; Dressler, B. O.; Lakomy, R.; Mueller-Mohr, V.; Stoeffler, D.
1992-01-01
In 1984 the Ontario Geological Survey initiated a research project on the Sudbury structure (SS) in cooperation with the University of Muenster. The project included field mapping (1984-1989) and petrographic, chemical, and isotope analyses of the major stratigraphic units of the SS. Four diploma theses and four doctoral theses were completed during the project (1984-1992). Specific results of the various investigations are reported. Selected areas of the SS were mapped and sampled: Footwall rocks; Footwall breccia and parts of the sublayer and lower section of the Sudbury Igneous Complex (SIC); Onaping Formation and the upper section of the SIC; and Sudbury breccia and adjacent Footwall rocks along extended profiles up to 55 km from the SIC. All these stratigraphic units of the SS were studied in substantial detail by previous workers. The most important characteristic of the previous research is that it was based either on a volcanic model or on a mixed volcanic-impact model for the origin of the SS. The present project was clearly directed toward a test of the impact origin of the SS without invoking an endogenic component. In general, our results confirm the most widely accepted stratigraphic division of the SS. However, our interpretation of some of the major stratigraphic units is different from most views expressed. The stratigraphy of the SS and its new interpretation are given as a basis for discussion.
NASA Astrophysics Data System (ADS)
Vassena, G.; Clerici, A.
2018-05-01
State-of-the-art 3D surveying technologies, if correctly applied, allow 3D coloured models of large open pit mines to be obtained using different techniques such as terrestrial laser scanning (TLS) with images, combined with UAV-based digital photogrammetry. GNSS and/or total stations are also currently used to georeference the model. The University of Brescia has carried out a project to map in 3D an open pit mine located in Botticino, a well-known marble extraction site close to Brescia in northern Italy. Terrestrial laser scanner 3D point clouds combined with RGB images and digital photogrammetry from UAVs have been used to map a large part of the quarry. Following rigorous and well-known procedures, a 3D point cloud and mesh model have been obtained with a straightforward approach. After the description of the combined mapping process, the paper describes the innovative process proposed for the daily/weekly update of the model itself. To realize this task, a SLAM-based approach is described, using an instrument capable of running an automatic localization process and real-time, in-the-field change detection analysis.
A Probabilistic Approach to Model Update
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Reaves, Mercedes C.; Voracek, David F.
2001-01-01
Finite element models are often developed for load validation, structural certification, response predictions, and to study alternate design concepts. On rare occasions, models developed with a nominal set of parameters agree with experimental data without the need to update parameter values. Today, model updating is generally heuristic and often performed by a skilled analyst with in-depth understanding of the model assumptions. Parameter uncertainties play a key role in understanding the model update problem and therefore probabilistic analysis tools, developed for reliability and risk analysis, may be used to incorporate uncertainty in the analysis. In this work, probability analysis (PA) tools are used to aid the parameter update task using experimental data and some basic knowledge of potential error sources. Discussed here is the first application of PA tools to update parameters of a finite element model for a composite wing structure. Static deflection data at six locations are used to update five parameters. It is shown that while prediction of individual response values may not be matched identically, the system response is significantly improved with moderate changes in parameter values.
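The flavour of such a probabilistic update can be conveyed with a toy example. The sketch below replaces the wing finite element model with a one-parameter cantilever deflection formula and weights prior samples of a stiffness parameter by the likelihood of six assumed deflection measurements; the formula, numbers and priors are all illustrative, not the paper's model.

```python
# Toy probabilistic parameter update via likelihood-weighted prior samples (assumed values).
import numpy as np

rng = np.random.default_rng(5)

def deflection(E, loads, L=2.0, I=4e-6):
    """Tip deflection of a cantilever, delta = P L^3 / (3 E I), for several load cases."""
    return loads * L**3 / (3.0 * E * I)

loads = np.array([100.0, 200.0, 300.0, 400.0, 500.0, 600.0])   # six assumed load cases
E_true, sigma_meas = 70e9, 1e-4
measured = deflection(E_true, loads) + rng.normal(0.0, sigma_meas, size=loads.size)

# Prior belief about the stiffness parameter, deliberately biased high.
E_samples = rng.normal(80e9, 10e9, size=20000)
E_samples = E_samples[E_samples > 1e9]

# Weight each sample by the Gaussian likelihood of the measured deflections.
resid = measured - deflection(E_samples[:, None], loads[None, :])
log_w = -0.5 * np.sum((resid / sigma_meas) ** 2, axis=1)
w = np.exp(log_w - log_w.max())
w /= w.sum()

E_updated = np.sum(w * E_samples)
print(f"prior mean E: 80.0 GPa, updated E: {E_updated / 1e9:.1f} GPa (true 70.0 GPa)")
```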
Human-System Integration Scorecard Update to VB.Net
NASA Technical Reports Server (NTRS)
Sanders, Blaze D.
2009-01-01
The purpose of this project was to create Human-System Integration (HSI) scorecard software, which could be utilized to validate that human factors have been considered early in hardware/system specifications and design. The HSI scorecard is partially based upon the revised Human Rating Requirements (HRR) intended for NASA's Constellation program. This software scorecard will allow for quick appraisal of HSI factors, by using visual aids to highlight low and rapidly changing scores. This project consisted of creating a user-friendly Visual Basic program that could be easily distributed to, and updated by, fellow colleagues. Updating the Microsoft Word version of the HSI scorecard to a computer application will allow for the addition of useful features, improved ease of use, and decreased completion time for the user. One significant addition is the ability to create Microsoft Excel graphs automatically from scorecard data, to allow for clear presentation of problematic areas. The purpose of this paper is to describe the rationale and benefits of creating the HSI scorecard software, the problems and goals of the project, and future work that could be done.
2011-03-01
Center 1261 Duck Rd. Kitty Hawk, NC 27949 Lisa Stillwell, Margaret Blanchard-Montgomery, Brian Blanton Renaissance Computing Institute 100 Europa...Insurance Studies in the study area, and serve as the basis for new coastal hazard analysis and ultimately updated Flood Insurance Rate Maps (FIRMs). Study... hazard zones in coastal areas of the United States. Under Task Order HSFE03-06-X-0023, the U.S. Army Corps of Engineers (USACE) and project partners are
Change management methodologies trained for automotive infotainment projects
NASA Astrophysics Data System (ADS)
Prostean, G.; Volker, S.; Hutanu, A.
2017-01-01
An automotive Electronic Control Unit (ECU) development project embedded within a car environment is constantly subject to a continuous flow of specification modifications throughout the life cycle. Root causes of these modifications include, for instance, software or hardware implementation errors or requirement changes to satisfy the forthcoming demands of the market and ensure later commercial success. It is unavoidable that from the very beginning until the end of the project "requirement changes" will "expose" the agreed objectives defined by contract specifications, which are product features, budget, schedule and quality. The key discussions will focus upon an automotive radio-navigation (infotainment) unit, which competes with aftermarket devices such as smartphones. This competition especially stresses currently used automotive development processes, which fit a four-year car development (introduction) cycle against a one-year smartphone update cycle. The research will focus on the investigation of possible impacts of changes during all phases of the project: the concept-validation, development and debugging phases. Building a thorough understanding of prospective threats is of paramount importance in order to establish an adequate project management process to handle requirement changes. Personal automotive development experience and a literature review of change- and configuration-management software development methodologies led the authors to new conceptual models, which integrate into the structure of traditional development models used in automotive projects, more concretely of radio-navigation projects.
Integrated Technology Assessment Center (ITAC) Update
NASA Technical Reports Server (NTRS)
Taylor, J. L.; Neely, M. A.; Curran, F. M.; Christensen, E. R.; Escher, D.; Lovell, N.; Morris, Charles (Technical Monitor)
2002-01-01
The Integrated Technology Assessment Center (ITAC) has developed a flexible systems analysis framework to identify long-term technology needs, quantify payoffs for technology investments, and assess the progress of ASTP-sponsored technology programs in the hypersonics area. For this, ITAC has assembled an experienced team representing a broad sector of the aerospace community and developed a systematic assessment process complete with supporting tools. Concepts for transportation systems are selected based on relevance to the ASTP and integrated concept models (ICM) of these concepts are developed. Key technologies of interest are identified and projections are made of their characteristics with respect to their impacts on key aspects of the specific concepts of interest. Both the models and technology projections are then fed into the ITAC's probabilistic systems analysis framework in ModelCenter. This framework permits rapid sensitivity analysis, single point design assessment, and a full probabilistic assessment of each concept with respect to both embedded and enhancing technologies. Probabilistic outputs are weighed against metrics of interest to ASTP using a multivariate decision making process to provide inputs for technology prioritization within the ASTP. ITAC program is currently finishing the assessment of a two-stage-to-orbit (TSTO), rocket-based combined cycle (RBCC) concept and a TSTO turbine-based combined cycle (TBCC) concept developed by the team with inputs from NASA. A baseline all rocket TSTO concept is also being developed for comparison. Boeing has recently submitted a performance model for their Flexible Aerospace System Solution for Tomorrow (FASST) concept and the ISAT program will provide inputs for a single-stage-to-orbit (SSTO) TBCC based concept in the near-term. Both of these latter concepts will be analyzed within the ITAC framework over the summer. This paper provides a status update of the ITAC program.
2008-03-01
computational version of the CASIE architecture serves to demonstrate the functionality of our primary theories. However, implementation of several other...following facts. First, based on Theorem 3 and Theorem 5, the objective function is non-increasing under updating rule (6); second, by the criteria for...reassignment in updating rule (7), it is trivial to show that the objective function is non-increasing under updating rule (7). A Unified View to Graph
A high resolution spatial population database of Somalia for disease risk mapping.
Linard, Catherine; Alegana, Victor A; Noor, Abdisalan M; Snow, Robert W; Tatem, Andrew J
2010-09-14
Millions of Somalis have been deprived of basic health services due to the unstable political situation of their country. Attempts are being made to reconstruct the health sector, in particular to estimate the extent of infectious disease burden. However, any approach that requires the use of modelled disease rates requires reasonable information on population distribution. In a low-income country such as Somalia, population data are lacking, are of poor quality, or become outdated rapidly. Modelling methods are therefore needed for the production of contemporary and spatially detailed population data. Here land cover information derived from satellite imagery and existing settlement point datasets were used for the spatial reallocation of populations within census units. We used simple and semi-automated methods that can be implemented with free image processing software to produce an easily updatable gridded population dataset at 100 × 100 meters spatial resolution. The 2010 population dataset was matched to administrative population totals projected by the UN. Comparison tests between the new dataset and existing population datasets revealed important differences in population size distributions, and in population at risk of malaria estimates. These differences are particularly important in more densely populated areas and strongly depend on the settlement data used in the modelling approach. The results show that it is possible to produce detailed, contemporary and easily updatable settlement and population distribution datasets of Somalia using existing data. The 2010 population dataset produced is freely available as a product of the AfriPop Project and can be downloaded from: http://www.afripop.org.
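The reallocation step can be sketched as a basic dasymetric calculation. The snippet below redistributes assumed census-unit totals to grid cells in proportion to illustrative land-cover weights; the classes, weights and unit totals are invented for illustration and simplify the AfriPop modelling approach considerably.

```python
# Simplified dasymetric reallocation of census-unit totals to grid cells (assumed inputs).
import numpy as np

rng = np.random.default_rng(6)

# 20 x 20 grid of cells, each labelled with a land-cover class and a census-unit id.
landcover = rng.integers(0, 4, size=(20, 20))             # 0=water, 1=sparse, 2=rural, 3=urban
unit_id = np.repeat(np.arange(4), 100).reshape(20, 20)    # four census units of 100 cells
unit_pop = {0: 12000, 1: 5000, 2: 30000, 3: 800}          # census/UN-adjusted unit totals

weights_by_class = np.array([0.0, 0.1, 1.0, 10.0])        # relative population density weights
cell_weight = weights_by_class[landcover]

gridded_pop = np.zeros_like(cell_weight)
for uid, total in unit_pop.items():
    mask = unit_id == uid
    w = cell_weight[mask]
    if w.sum() == 0:                 # guard: unit entirely covered by uninhabited classes
        continue
    gridded_pop[mask] = total * w / w.sum()

# Each unit's cells sum back to the unit total (mass-preserving reallocation).
print({uid: round(gridded_pop[unit_id == uid].sum()) for uid in unit_pop})
```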
A high resolution spatial population database of Somalia for disease risk mapping
2010-01-01
Background Millions of Somali have been deprived of basic health services due to the unstable political situation of their country. Attempts are being made to reconstruct the health sector, in particular to estimate the extent of infectious disease burden. However, any approach that requires the use of modelled disease rates requires reasonable information on population distribution. In a low-income country such as Somalia, population data are lacking, are of poor quality, or become outdated rapidly. Modelling methods are therefore needed for the production of contemporary and spatially detailed population data. Results Here land cover information derived from satellite imagery and existing settlement point datasets were used for the spatial reallocation of populations within census units. We used simple and semi-automated methods that can be implemented with free image processing software to produce an easily updatable gridded population dataset at 100 × 100 meters spatial resolution. The 2010 population dataset was matched to administrative population totals projected by the UN. Comparison tests between the new dataset and existing population datasets revealed important differences in population size distributions, and in population at risk of malaria estimates. These differences are particularly important in more densely populated areas and strongly depend on the settlement data used in the modelling approach. Conclusions The results show that it is possible to produce detailed, contemporary and easily updatable settlement and population distribution datasets of Somalia using existing data. The 2010 population dataset produced is freely available as a product of the AfriPop Project and can be downloaded from: http://www.afripop.org. PMID:20840751
Incorporating HYPR de-noising within iterative PET reconstruction (HYPR-OSEM)
NASA Astrophysics Data System (ADS)
Cheng, Ju-Chieh (Kevin); Matthews, Julian; Sossi, Vesna; Anton-Rodriguez, Jose; Salomon, André; Boellaard, Ronald
2017-08-01
HighlY constrained back-PRojection (HYPR) is a post-processing de-noising technique originally developed for time-resolved magnetic resonance imaging. It has been recently applied to dynamic imaging for positron emission tomography and shown promising results. In this work, we have developed an iterative reconstruction algorithm (HYPR-OSEM) which improves the signal-to-noise ratio (SNR) in static imaging (i.e. single frame reconstruction) by incorporating HYPR de-noising directly within the ordered subsets expectation maximization (OSEM) algorithm. The proposed HYPR operator in this work operates on the target image(s) from each subset of OSEM and uses the sum of the preceding subset images as the composite which is updated every iteration. Three strategies were used to apply the HYPR operator in OSEM: (i) within the image space modeling component of the system matrix in forward-projection only, (ii) within the image space modeling component in both forward-projection and back-projection, and (iii) on the image estimate after the OSEM update for each subset thus generating three forms: (i) HYPR-F-OSEM, (ii) HYPR-FB-OSEM, and (iii) HYPR-AU-OSEM. Resolution and contrast phantom simulations with various sizes of hot and cold regions as well as experimental phantom and patient data were used to evaluate the performance of the three forms of HYPR-OSEM, and the results were compared to OSEM with and without a post reconstruction filter. It was observed that the convergence in contrast recovery coefficients (CRC) obtained from all forms of HYPR-OSEM was slower than that obtained from OSEM. Nevertheless, HYPR-OSEM improved SNR without degrading accuracy in terms of resolution and contrast. It achieved better accuracy in CRC at equivalent noise level and better precision than OSEM and better accuracy than filtered OSEM in general. In addition, HYPR-AU-OSEM has been determined to be the more effective form of HYPR-OSEM in terms of accuracy and precision based on the studies conducted in this work.
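A toy one-dimensional sketch of the HYPR-AU-OSEM idea is given below: a standard OSEM subset update followed by a HYPR operator built from a smoothed ratio against a running composite image. The geometry, boxcar filter, composite scheme and all constants are simplifications chosen for illustration, not the authors' implementation.

```python
# Toy 1-D OSEM with a HYPR-style operator applied after each subset update (assumed setup).
import numpy as np

rng = np.random.default_rng(7)
n_pix, n_bins, n_subsets, n_iter = 64, 128, 4, 10

A = rng.uniform(0.0, 1.0, size=(n_bins, n_pix))            # toy nonnegative system matrix
x_true = np.zeros(n_pix); x_true[20:30] = 4.0; x_true[40:44] = 8.0
y = rng.poisson(A @ x_true).astype(float)                  # noisy projection data

def smooth(v, width=5):
    """Simple boxcar low-pass filter F used inside the HYPR operator."""
    kernel = np.ones(width) / width
    return np.convolve(v, kernel, mode="same")

def hypr(image, composite, eps=1e-8):
    """HYPR operator: composite times the smoothed ratio of image to composite."""
    return composite * (smooth(image) / (smooth(composite) + eps))

subsets = [np.arange(s, n_bins, n_subsets) for s in range(n_subsets)]
x = np.ones(n_pix)
composite = x.copy()

for it in range(n_iter):
    for rows in subsets:
        As, ys = A[rows], y[rows]
        x = x / (As.sum(axis=0) + 1e-8) * (As.T @ (ys / (As @ x + 1e-8)))   # OSEM subset update
        x = hypr(x, composite)                        # HYPR de-noising after the update
        composite = composite + x                     # running composite of subset images

print("RMSE vs truth:", np.sqrt(np.mean((x - x_true) ** 2)))
```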
The C20C+ Detection and Attribution Project
NASA Astrophysics Data System (ADS)
Stone, D. A.; Angélil, O. M.; Cholia, S.; Christidis, N.; Dittus, A. J.; Folland, C. K.; King, A.; Kinter, J. L.; Krishnan, H.; Min, S. K.; Shiogama, H.; Wehner, M. F.; Wolski, P.
2015-12-01
Over the past decade there has been a remarkable growth in interest concerning the effects of anthropogenic emissions on extreme weather. However, research has been constrained by the lack of a public climate-model-based data product optimised for investigation of extreme weather in the context of climate change, relying instead on products designed for other purposes or on bespoke simulations designed for the particular study and not generally applicable to other extremes. The international Climate of the 20th Century Plus (C20C+) Detection and Attribution Project is filling this gap by producing the first large ensemble, multi-model, multi-year, and multi-scenario historical climate data product, specifically designed for resolving variations in the occurrence and characteristics of extreme weather from year to year and their differences from what might have been in the absence of anthropogenic emissions. Updates on project status and tens of terabytes of simulation output are available at http://portal.nersc.gov/c20c. Here we describe the experimental design of the first phase of the project, conducted with six atmospheric climate models, and discuss its various strengths and weaknesses with respect to various types of extreme weather. We also present analyses of the relative importance of climate model, estimate of anthropogenic ocean warming, spatial and temporal scale, and aspects of experimental design on estimates of how much emissions have affected extreme weather.
Clean Coal Technology Demonstration Program: Program Update 1998
DOE Office of Scientific and Technical Information (OSTI.GOV)
Assistant Secretary for Fossil Energy
1999-03-01
Annual report on the Clean Coal Technology Demonstration Program (CCT Program). The report addresses the role of the CCT Program, implementation, funding and costs, accomplishments, project descriptions, legislative history, program history, environmental aspects, and project contacts. The project descriptions describe the technology and provide a brief summary of the demonstration results.
DOT National Transportation Integrated Search
1979-08-01
The report is part of a study to update the historical and projected cost/revenue analysis of the U.S. domestic automobile manufacturers. It includes the evaluation of the historical and projected financial data to assess the corporate financial posi...
Early Restoration Projects Atlas | NOAA Gulf Spill Restoration
An interactive atlas of the early restoration projects that the trustees are implementing, with a map and list view giving the location and details of each project. This atlas will be updated as
New Jersey's Segregated Schools: Trends and Paths Forward
ERIC Educational Resources Information Center
Orfield, Gary; Ee, Jongyeon; Coughlan, Ryan
2017-01-01
This report updates earlier research published by the Civil Rights Project in 2013. That report detailed troubling racial and economic segregation trends and patterns from 1989-2010. The latest report includes new data from 2010-2015. The research updates public school enrollment trends and details segregation in the state's schools by race and…
Establishing an In-House Wind Maintenance Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2011-12-01
Update to the 2008 guidebook titled “Establishing an In-house Wind Maintenance Program”, which was developed to support utilities in developing O&M strategies. This update includes significant contributions from utilities and other stakeholders around the country, representing all perspectives, regardless of whether or not they own wind turbines or projects.
In March 2010, the 22nd meeting of the Working Group of National Coordinators of the OECD Test Guidelines Programme (WNT) approved a project for updating the Test Guidelines on genotoxicity, with Canada, the Netherlands, France and the USA identified as lead countries for this wo...
A review of methods for updating forest monitoring system estimates
Hector Franco-Lopez; Alan R. Ek; Andrew P. Robinson
2000-01-01
Intensifying interest in forests and the development of new monitoring technologies have induced major changes in forest monitoring systems in the last few years, including major revisions in the methods used for updating. This paper describes the methods available for projecting stand- and plot-level information, emphasizing advantages and disadvantages, and the...
NASA Astrophysics Data System (ADS)
Stockton, D.
2009-12-01
In this presentation, the NPOESS Integrated Program Office will provide a status update on the NPOESS Preparatory Project (NPP) and the NPOESS program. This update will include information on sensors, data products, and the spacecraft as well as the current schedules for NPP and NPOESS. The presentation will also touch on cooperation with EUMETSAT, both current and future.
42 CFR 495.340 - As-needed HIT PAPD update and as-needed HIT IAPD update requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... HEALTH RECORD TECHNOLOGY INCENTIVE PROGRAM Requirements Specific to the Medicaid Program § 495.340 As... document or the HIT implementation advance planning document. (d) A change in implementation concept or a change to the scope of the project. (e) A change to the approved cost allocation methodology. ...
42 CFR 495.340 - As-needed HIT PAPD update and as-needed HIT IAPD update requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
... HEALTH RECORD TECHNOLOGY INCENTIVE PROGRAM Requirements Specific to the Medicaid Program § 495.340 As... document or the HIT implementation advance planning document. (d) A change in implementation concept or a change to the scope of the project. (e) A change to the approved cost allocation methodology. ...
Business and Office Curriculum Update. Final Report.
ERIC Educational Resources Information Center
Kjosnes, Iva S.
A project was conducted to update an existing high school business and office occupations education curriculum to include instruction in the use of computers and word processing equipment. The existing curriculum was assessed and revised in order to provide students with training in the following areas: the impact of computers on employment; the…
Update on the Common Core State Standards Initiative
ERIC Educational Resources Information Center
Ritter, Bill, Jr.
2009-01-01
In this update the National Governors Association presents the testimony of Honorable Bill Ritter, Jr., as submitted to the U.S. House Education and Labor Committee. Ritter speaks about the Common Core State Standards Initiative, a joint project by the National Governors Association (NGA) and Council of Chief State School Officers (CCSSO) to…
42 CFR 413.40 - Ceiling on the rate of increase in hospital inpatient costs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... October 1, 2002, is the percentage increase projected by the hospital market basket index. (4) Target... target amount for the previous cost reporting period, updated by the market basket percentage increase... each cost reporting period, the ceiling is determined by multiplying the updated target amount, as...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-21
... Management and Budget (OMB) numbers under the Paperwork Reductions Act); 4. Updates to revised documents that.... 8. Removed section 17. 9. Added provision to section 23 (now designated section 22), ``Management Agreement,'' that a management agreement cannot be assigned without prior written HUD approval. 10. Changed...
The 1993 RPA timber assessment update
Richard W. Haynes; Darius M. Adams; John R. Mills
1995-01-01
This update reports changes in the Nation's timber resource since the 1989 RPA timber assessment. The timber resource situation is analyzed to provide projections for future cost and availability of timber products to meet demands. Prospective trends in demands for and supplies of timber, and the factors that affect these trends are examined. These include changes...
Update on the development of cotton gin PM10 emission factors for EPA's AP-42
USDA-ARS?s Scientific Manuscript database
A cotton ginning industry-supported project was initiated in 2008 to update the U.S. Environmental Protection Agency’s (EPA) Compilation of Air Pollution Emission Factors (AP-42) to include PM10 emission factors. This study develops emission factors from the PM10 emission factor data collected from ...
NASA Astrophysics Data System (ADS)
Chegwidden, O.; Nijssen, B.; Rupp, D. E.; Kao, S. C.; Clark, M. P.
2017-12-01
We describe results from a large hydrologic climate change dataset developed across the Pacific Northwestern United States and discuss how the analysis of those results can be seen as a framework for other large hydrologic ensemble investigations. This investigation will better inform future modeling efforts and large ensemble analyses across domains within and beyond the Pacific Northwest. Using outputs from the Coupled Model Intercomparison Project Phase 5 (CMIP5), we provide projections of hydrologic change for the domain through the end of the 21st century. The dataset is based upon permutations of four methodological choices: (1) ten global climate models, (2) two representative concentration pathways, (3) three meteorological downscaling methods, and (4) four unique hydrologic model set-ups (three of which entail the same hydrologic model using independently calibrated parameter sets). All simulations were conducted across the Columbia River Basin and Pacific coastal drainages at a 1/16th-degree (~6 km) resolution and at a daily timestep. In total, the 172 distinct simulations offer an updated, comprehensive view of climate change projections through the end of the 21st century. The results consist of routed streamflow at 400 sites throughout the domain as well as distributed spatial fields of relevant hydrologic variables like snow water equivalent and soil moisture. In this presentation, we discuss the level of agreement with previous hydrologic projections for the study area and how these projections differ with specific methodological choices. By controlling for some methodological choices we can show how each choice affects key climatic change metrics. We discuss how the spread in results varies across hydroclimatic regimes. We will use this large dataset as a case study for distilling a wide range of hydroclimatological projections into useful climate change assessments.
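As a concrete illustration of how the four methodological axes span the ensemble, the snippet below enumerates the full factorial combination of choices. The member labels are placeholders, and the full factorial gives 240 permutations, of which the abstract reports 172 distinct simulations (not every combination was run).

```python
# Illustrative enumeration of the four methodological axes described above.
# All labels are placeholders; the full factorial gives 240 combinations, of which
# the study reports 172 distinct simulations (not every combination was completed).
from itertools import product

gcms        = [f"GCM{i:02d}" for i in range(1, 11)]          # ten global climate models
rcps        = ["RCP-low", "RCP-high"]                         # two concentration pathways (assumed labels)
downscaling = ["DS-A", "DS-B", "DS-C"]                        # three downscaling methods
hydro       = ["HYDRO-1", "HYDRO-2", "HYDRO-3", "HYDRO-4"]    # four hydrologic model set-ups

members = [dict(gcm=g, rcp=r, downscaling=d, hydrology=h)
           for g, r, d, h in product(gcms, rcps, downscaling, hydro)]
print(len(members))  # 240 possible permutations
```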
NASA Astrophysics Data System (ADS)
Fritsch, A.; Ayyad, Y.; Bazin, D.; Beceiro-Novo, S.; Bradt, J.; Carpenter, L.; Cortesi, M.; Mittig, W.; Suzuki, D.; Ahn, T.; Kolata, J. J.; Howard, A. M.; Becchetti, F. D.; Wolff, M.
Some exotic nuclei appear to exhibit α-cluster structure, which may impact nucleosynthesis reaction rates. While various theoretical models currently describe such clustering, more experimental data are needed to constrain model predictions. The Prototype Active-Target Time-Projection Chamber (PAT-TPC) has low-energy thresholds for charged-particle decay and a high detection efficiency due to its thick gaseous active target volume, making it well-suited to search for low-energy α-cluster reactions. Radioactive-ion beams produced by the TwinSol facility at the University of Notre Dame were delivered to the PAT-TPC to study 14C via α-resonant scattering. Differential cross sections and excitation functions were measured and show evidence of three-body exit channels. Additional data were measured with an updated Micromegas detector more sensitive to three-body decay. Preliminary results are presented.
India Solar Resource Data: Enhanced Data for Accelerated Deployment (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Identifying potential locations for solar photovoltaic (PV) and concentrating solar power (CSP) projects requires an understanding of the underlying solar resource. Under a bilateral partnership between the United States and India - the U.S.-India Energy Dialogue - the National Renewable Energy Laboratory has updated Indian solar data and maps using data provided by the Ministry of New and Renewable Energy (MNRE) and the National Institute for Solar Energy (NISE). This fact sheet overviews the updated maps and data, which help identify high-quality solar energy projects. This can help accelerate the deployment of solar energy in India.
NASA Astrophysics Data System (ADS)
Guo, Ning; Yang, Zhichun; Wang, Le; Ouyang, Yan; Zhang, Xinping
2018-05-01
Aiming at providing a precise dynamic structural finite element (FE) model for dynamic strength evaluation in addition to dynamic analysis, a dynamic FE model updating method is presented to correct the uncertain parameters of the FE model of a structure using strain mode shapes and natural frequencies. The strain mode shape, which is sensitive to local changes in the structure, is used instead of the displacement mode shape to enhance model updating. The coordinate strain modal assurance criterion is developed to evaluate the correlation level at each coordinate over the experimental and the analytical strain mode shapes. Moreover, the natural frequencies, which provide global information about the structure, are used to guarantee the accuracy of modal properties of the global model. Then, the weighted summation of the natural frequency residual and the coordinate strain modal assurance criterion residual is used as the objective function in the proposed dynamic FE model updating procedure. The hybrid genetic/pattern-search optimization algorithm is adopted to perform the dynamic FE model updating procedure. A numerical simulation and a model updating experiment for a clamped-clamped beam are performed to validate the feasibility and effectiveness of the present method. The results show that the proposed method can be used to update the uncertain parameters with good robustness. The updated dynamic FE model of the beam structure, which can correctly predict both the natural frequencies and the local dynamic strains, is reliable for the following dynamic analysis and dynamic strength evaluation.
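The sketch below illustrates the kind of weighted objective described above, combining natural-frequency residuals with a strain-mode-shape correlation term. A plain modal assurance criterion on strain mode shapes stands in for the coordinate strain modal assurance criterion, scipy's differential evolution stands in for the paper's hybrid genetic/pattern-search optimizer, and `analytical_modes(params)` is a hypothetical FE solver; none of these are the authors' implementation.

```python
# Sketch of a weighted updating objective: natural frequency residual plus a strain
# mode shape correlation residual. The plain MAC on strain modes is a simplified
# stand-in for the coordinate strain modal assurance criterion, and differential
# evolution stands in for the hybrid genetic/pattern-search optimizer of the paper.
# `analytical_modes(params)` is a hypothetical FE solver, not a real library call.
import numpy as np
from scipy.optimize import differential_evolution

def mac(phi_a, phi_e):
    """Modal assurance criterion between two (strain) mode shape vectors."""
    return np.abs(phi_a @ phi_e) ** 2 / ((phi_a @ phi_a) * (phi_e @ phi_e))

def updating_objective(params, analytical_modes, f_exp, strain_exp, w_f=1.0, w_s=1.0):
    f_ana, strain_ana = analytical_modes(params)           # frequencies, strain mode shapes
    freq_res = np.sum(((f_ana - f_exp) / f_exp) ** 2)      # natural frequency residual
    shape_res = np.sum([1.0 - mac(strain_ana[:, k], strain_exp[:, k])
                        for k in range(strain_exp.shape[1])])  # strain-shape correlation residual
    return w_f * freq_res + w_s * shape_res

# usage sketch: update two uncertain stiffness parameters of a hypothetical beam model
# result = differential_evolution(
#     updating_objective, bounds=[(0.5, 1.5), (0.5, 1.5)],
#     args=(analytical_modes, f_measured, strain_modes_measured))
```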
Static and Dynamic Model Update of an Inflatable/Rigidizable Torus Structure
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Reaves, Mercedes C.
2006-01-01
The present work addresses the development of an experimental and computational procedure for validating finite element models. A torus structure, part of an inflatable/rigidizable Hexapod, is used to demonstrate the approach. Because of fabrication, materials, and geometric uncertainties, a statistical approach combined with optimization is used to modify key model parameters. Static test results are used to update stiffness parameters and dynamic test results are used to update the mass distribution. Updated parameters are computed using gradient and non-gradient based optimization algorithms. Results show significant improvements in model predictions after parameters are updated. Lessons learned in the areas of test procedures, modeling approaches, and uncertainties quantification are presented.
NASA Astrophysics Data System (ADS)
Harris, Andrew; Latutrie, Benjamin; Andredakis, Ioannis; De Groeve, Tom; Langlois, Eric; van Wyk de Vries, Benjamin; Del Negro, Ciro; Favalli, Massimiliano; Fujita, Eisuke; Kelfoun, Karim; Rongo, Rocco
2016-04-01
RED-SEED stands for Risk Evaluation, Detection and Simulation during Effusive Eruption Disasters, and combines stakeholders from the remote sensing, modeling and response communities with experience in tracking volcanic effusive events. It is an informal working group that has evolved around the philosophy of combining global scientific resources, in the realm of physical volcanology, remote sensing and modeling, to better define and limit uncertainty. The group first met during a three day-long workshop held in Clermont Ferrand (France) between 28 and 30 May 2013. The main recommendation of the workshop in terms of modeling was that there is a pressing need for "real-time input of reliable Time-Averaged Discharge Rate (TADR) data with regular up-dates of Digital Elevation Models (DEMs) if modeling is to be effective; the DEMs can be provided by the radar/photogrammetry community." We thus set up a test to explore (i) which model source terms are needed, (ii) how they can be provided and updated, and (iii) how can models be run and applied in an ensemble approach. The test used two hypothetical effusive events in the Chaîne des Puys (Auvergne, France), for which a prototype Geographical Information System (GIS) was set up to allow loss assessment during an effusive crisis. This system drew on all immediately available data for population, land use, communications, utility and building-type. After defining lava flow model source terms (vent location, effusion rate, lava chemistry, temperature, crystallinity and vesicularity), five operational lava flow emplacement models were run (DOWNFLOW, FLOWGO, LAVASIM, MAGFLOW and VOLCFLOW) to produce a projection for likelihood of impact for all pixels within the area covered by the GIS, based on agreement between models. The test thus aimed not to assess the model output, but instead to examine overlapping output. Next, inundation maps and damage reports for impacted zones were produced. The exercise identified several shortcomings of the modeling systems, but indicates that generation of a global response system for effusive crises that uses rapid-response model projections for lava inundation driven by real-time satellite hot spot detection - and open access data sets - is within the current capabilities of the community.
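The ensemble step described above, basing likelihood of impact on agreement between models, can be illustrated with a minimal sketch: each lava flow model contributes a boolean inundation footprint on a common grid, and the likelihood for a pixel is the fraction of models that predict inundation there. The random footprints below are placeholders, not output from the five models named in the abstract.

```python
# Minimal sketch of an ensemble-agreement map: per-pixel likelihood of impact is the
# fraction of lava flow models predicting inundation of that pixel. The footprints
# here are random placeholders standing in for real model output on a shared grid.
import numpy as np

def agreement_map(footprints):
    """Stack boolean inundation rasters (one per model) and return the
    per-pixel fraction of models in agreement (0..1)."""
    stack = np.stack([np.asarray(f, dtype=float) for f in footprints])
    return stack.mean(axis=0)

rng = np.random.default_rng(1)
models = ["DOWNFLOW", "FLOWGO", "LAVASIM", "MAGFLOW", "VOLCFLOW"]
footprints = [rng.random((100, 100)) > 0.6 for _ in models]   # placeholder outputs
likelihood = agreement_map(footprints)
high_risk = likelihood >= 0.8   # e.g. pixels where at least 4 of 5 models agree
```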
NASA Astrophysics Data System (ADS)
Appel, W.; Gilliam, R. C.; Pouliot, G. A.; Godowitch, J. M.; Pleim, J.; Hogrefe, C.; Kang, D.; Roselle, S. J.; Mathur, R.
2013-12-01
The DISCOVER-AQ project (Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality), is a joint collaboration between NASA, U.S. EPA and a number of other local organizations with the goal of characterizing air quality in urban areas using satellite, aircraft, vertical profiler and ground based measurements (http://discover-aq.larc.nasa.gov). In July 2011, the DISCOVER-AQ project conducted intensive air quality measurements in the Baltimore, MD and Washington, D.C. area in the eastern U.S. To take advantage of these unique data, the Community Multiscale Air Quality (CMAQ) model, coupled with the Weather Research and Forecasting (WRF) model is used to simulate the meteorology and air quality in the same region using 12-km, 4-km and 1-km horizontal grid spacings. The goal of the modeling exercise is to demonstrate the capability of the coupled WRF-CMAQ modeling system to simulate air quality at fine grid spacings in an urban area. Development of new data assimilation techniques and the use of higher resolution input data for the WRF model have been implemented to improve the meteorological results, particularly at the 4-km and 1-km grid resolutions. In addition, a number of updates to the CMAQ model were made to enhance the capability of the modeling system to accurately represent the magnitude and spatial distribution of pollutants at fine model resolutions. Data collected during the 2011 DISCOVER-AQ campaign, which include aircraft transects and spirals, ship measurements in the Chesapeake Bay, ozonesondes, tethered balloon measurements, DRAGON aerosol optical depth measurements, LIDAR measurements, and intensive ground-based site measurements, are used to evaluate results from the WRF-CMAQ modeling system for July 2011 at the three model grid resolutions. The results of the comparisons of the model results to these measurements will be presented, along with results from the various sensitivity simulations examining the impact the various updates to the modeling system have on the model estimates.
Rubbertown NGEM Demonstration Project - Update to Industry
Follow-up communication to the Rubbertown industry group as part of the planning process for the Rubbertown NGEM demonstration study. These slides are for discussion purposes and will not be presented publicly beyond the project team and industry group.
Railway project design and construction (CEE 411) course updates.
DOT National Transportation Integrated Search
2017-01-20
Course CEE 411 "Railway Project Design and Construction" is a cornerstone of the railway engineering education program developed by the Rail Transportation and Engineering Center (RailTEC) at the University of Illinois at Urbana-Champaign (UIUC)....
Site systems engineering fiscal year 1999 multi-year work plan (MYWP) update for WBS 1.8.2.2
DOE Office of Scientific and Technical Information (OSTI.GOV)
GRYGIEL, M.L.
1998-10-08
Manage the Site Systems Engineering process to provide a traceable, integrated, requirements-driven, and technically defensible baseline. Through the Site Integration Group (SIG), Systems Engineering ensures integration of technical activities across all site projects. Systems Engineering's primary interfaces are with the RL Project Managers, the Project Direction Office and with the Project Major Subcontractors, as well as with the Site Planning organization. Systems Implementation: (1) Develops, maintains, and controls the site integrated technical baseline, ensures the Systems Engineering interfaces between projects are documented, and maintains the Site Environmental Management Specification. (2) Develops and uses dynamic simulation models for verification of the baseline and analysis of alternatives. (3) Performs and documents functional and requirements analyses. (4) Works with projects, technology management, and the SIG to identify and resolve technical issues. (5) Supports technical baseline information for the planning and budgeting of the Accelerated Cleanup Plan, Multi-Year Work Plans, Project Baseline Summaries as well as performance measure reporting. (6) Works with projects to ensure the quality of data in the technical baseline. (7) Develops, maintains and implements the site configuration management system.
New Mexico Small Business Assistance (NMSBA) September 2016 Advisory Council Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larkin, Ariana Kayla
This is an update on two projects headed by Los Alamos National Laboratory and Sandia National Laboratories. The two projects are: The Electrochemical Based Gas Analyzer for Automotive Diagnostic and Maximizing the Production of High Value and High Demand Guar Gum on Marginal Lands in New Mexico. The Electrochemical Based Gas Analyzer for Automotive Diagnostic NMSBA leveraged project is made up of Albuquerque companies, Automotive Test Solutions, Inc. (ATS), ATS Mobile Diagnostics and Thoma Technologies and Los Alamos small business, VI Control Systems, to develop a new sensor system for the automotive industry. The Guar Gum NMSBA Leveraged Project began in January 2016 with the goal to develop biotechnology to enable a genetic modification of prairie cordgrass, a renewable feedstock for bioenergy and bio-manufacturing. In the long term, the companies hope to use the technology to bio-manufacture high value products in the stem of the plant. This document describes the laboratories' cooperation with small businesses on these projects.
Projections of California Teacher Retirements: A County and Regional Perspective. REL 2017-181
ERIC Educational Resources Information Center
Fong, Anthony B.; Makkonen, Reino; Jaquet, Karina
2016-01-01
This report projects California teacher retirements at the state and county levels for 2014/15-2023/24, updating a previously published report that projected California teacher retirements for 2006/07-2015/16. The current study finds that 25 percent of California teachers who were teaching in 2013/14 are projected to retire over 2014/15-2023/24.…
NASA Astrophysics Data System (ADS)
Baek, H.; Park, E.; Kwon, W.
2009-12-01
Water balance calculations are becoming increasingly important for earth-system studies, because humans require water for their survival. In particular, the relationship between climate change and freshwater resources is of primary concern to human society and also has implications for all living species. The goal of this study is to assess the closure and annual variations of the water cycles based on the multi-model ensemble approach. In this study, the projection results of the previous works focusing on the global domain and six sub-regions are updated using sixteen atmosphere-ocean general circulation model (AOGCM) simulations based on the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) A1B scenario. Before projecting future climate, model performances are evaluated on the simulation of the present-day climate. From the results, we construct and mainly use a multi-model ensemble (MME), referred to as MME9, defined from nine selected AOGCMs of higher performance. Analyzed variables include annual and seasonal precipitation, evaporation, and runoff. The overall projection results from MME9 show that most regions will experience a warmer and wetter climate at the end of the 21st century. Evaporation shows a trend very similar to that of precipitation, but runoff does not. The internal and inter-model variabilities are larger for runoff than for both precipitation and evaporation. Moreover, runoff is notably reduced in Europe at the end of the 21st century.
NASA Technical Reports Server (NTRS)
Wu, Sherman S. C.; Howington, Annie-Elpis
1987-01-01
The Mars Digital Terrain Model (DTM) is the result of a new project to: (1) digitize the series of 1:2,000,000-scale topographic maps of Mars, which are being derived photogrammetically under a separate project, and (2) reformat the digital contour information into rasters of elevation that can be readily registered with the Digital Image Model (DIM) of Mars. Derivation of DTM's involves interpolation of elevation values into 1/64-degree resolution and transformation of them to a sinusoidal equal-area projection. Digital data are produced in blocks corresponding with the coordinates of the original 1:2,000,000-scale maps, i.e., the dimensions of each block in the equatorial belt are 22.5 deg of longitude and 15 deg of latitude. This DTM is not only compatible with the DIM, but it can also be registered with other data such as geologic units or gravity. It will be the most comprehensive record of topographic information yet compiled for the Martian surface. Once the DTM's are established, any enhancement of Mars topographic information made with updated data, such as data from the planned Mars Observer Mission, will be by mathematical transformation of the DTM's, eliminating the need for recompilation.
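The coordinate transformation described above can be illustrated with a short sketch: elevations interpolated onto a 1/64-degree grid are mapped into a sinusoidal equal-area projection. The Mars radius, block bounds, and central meridian below are illustrative values, not the project's exact parameters.

```python
# Sketch of the raster transform described above: a 1/64-degree lat/lon grid is mapped
# into a sinusoidal equal-area projection, x = R * (lon - lon0) * cos(lat), y = R * lat.
# The Mars radius, block bounds, and central meridian are illustrative assumptions.
import numpy as np

R_MARS_KM = 3389.5          # assumed mean radius
STEP = 1.0 / 64.0           # 1/64-degree raster spacing

def sinusoidal_xy(lon_deg, lat_deg, lon0_deg=0.0, radius=R_MARS_KM):
    lon, lat, lon0 = map(np.radians, (lon_deg, lat_deg, lon0_deg))
    x = radius * (lon - lon0) * np.cos(lat)   # equal-area: x shrinks toward the poles
    y = radius * lat
    return x, y

# one 22.5 x 15 degree equatorial-belt block sampled at 1/64-degree spacing
lons = np.arange(0.0, 22.5, STEP)
lats = np.arange(0.0, 15.0, STEP)
lon_g, lat_g = np.meshgrid(lons, lats)
x, y = sinusoidal_xy(lon_g, lat_g, lon0_deg=11.25)
```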
Meris, Ronald G; Barbera, Joseph A
2014-01-01
In a large-scale outdoor, airborne, hazardous materials (HAZMAT) incident, such as ruptured chlorine rail cars during a train derailment, the local Incident Commanders and HAZMAT emergency responders must obtain accurate information quickly to assess the situation and act promptly and appropriately. HAZMAT responders must have a clear understanding of key information and how to integrate it into timely and effective decisions for action planning. This study examined the use of HAZMAT plume modeling as a decision support tool during incident action planning in this type of extreme HAZMAT incident. The concept of situation awareness as presented by Endsley's dynamic situation awareness model contains three levels: perception, comprehension, and projection. It was used to examine the actions of incident managers related to adequate data acquisition, current situational understanding, and accurate situation projection. Scientists and engineers have created software to simulate and predict HAZMAT plume behavior, the projected hazard impact areas, and the associated health effects. Incorporating the use of HAZMAT plume projection modeling into an incident action plan may be a complex process. The present analysis used a mixed qualitative and quantitative methodological approach and examined the use and limitations of a "HAZMAT Plume Modeling Cycle" process that can be integrated into the incident action planning cycle. HAZMAT response experts were interviewed using a computer-based simulation. One of the research conclusions indicated the "HAZMAT Plume Modeling Cycle" is a critical function so that an individual/team can be tasked with continually updating the hazard plume model with evolving data, promoting more accurate situation awareness.
Nasserie, Tahmina; Tuite, Ashleigh R; Whitmore, Lindsay; Hatchette, Todd; Drews, Steven J; Peci, Adriana; Kwong, Jeffrey C; Friedman, Dara; Garber, Gary; Gubbay, Jonathan; Fisman, David N
2017-01-01
Seasonal influenza epidemics occur frequently. Rapid characterization of seasonal dynamics and forecasting of epidemic peaks and final sizes could help support real-time decision-making related to vaccination and other control measures. Real-time forecasting remains challenging. We used the previously described "incidence decay with exponential adjustment" (IDEA) model, a 2-parameter phenomenological model, to evaluate the characteristics of the 2015-2016 influenza season in 4 Canadian jurisdictions: the Provinces of Alberta, Nova Scotia and Ontario, and the City of Ottawa. Model fits were updated weekly with receipt of incident virologically confirmed case counts. Best-fit models were used to project seasonal influenza peaks and epidemic final sizes. The 2015-2016 influenza season was mild and late-peaking. Parameter estimates generated through fitting were consistent in the 2 largest jurisdictions (Ontario and Alberta) and with pooled data including Nova Scotia counts (R0 approximately 1.4 for all fits). Lower R0 estimates were generated in Nova Scotia and Ottawa. Final size projections that made use of complete time series were accurate to within 6% of true final sizes, but final size projections were less accurate when based only on pre-peak data. Projections of epidemic peaks stabilized before the true epidemic peak, but these were persistently early (~2 weeks) relative to the true peak. A simple, 2-parameter influenza model provided reasonably accurate real-time projections of influenza seasonal dynamics in an atypically late, mild influenza season. Challenges are similar to those seen with more complex forecasting methodologies. Future work includes identification of seasonal characteristics associated with variability in model performance.
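To make the weekly fitting concrete, the sketch below fits the two IDEA parameters to incident counts and projects the peak and final size. The IDEA form used, I(t) = (R0 / (1 + d)^t)^t with t in serial intervals, follows the published description of the model, but the synthetic counts and the simple least-squares fit are illustrative, not the study's surveillance data or code.

```python
# Sketch of a 2-parameter IDEA fit: incident cases in generation t are modelled as
# I(t) = (R0 / (1 + d)^t)^t, and (R0, d) are re-fitted as each new week of confirmed
# counts arrives. Synthetic counts and scipy curve_fit are illustrative stand-ins.
import numpy as np
from scipy.optimize import curve_fit

def idea(t, r0, d):
    return (r0 / (1.0 + d) ** t) ** t

# synthetic "observed" weekly counts (t counted in serial intervals)
t_obs = np.arange(1, 16)
true_counts = idea(t_obs, 1.4, 0.02)
counts = np.random.default_rng(2).poisson(true_counts * 50) / 50.0

(r0_hat, d_hat), _ = curve_fit(idea, t_obs, counts, p0=(1.3, 0.01),
                               bounds=([1.0, 0.0], [3.0, 0.2]))

t_proj = np.arange(1, 31)
projection = idea(t_proj, r0_hat, d_hat)
peak_week = t_proj[np.argmax(projection)]    # projected epidemic peak
final_size = projection.sum()                # projected epidemic final size
```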
Gibbs, Ann E.; Richmond, Bruce M.
2017-09-25
Long-term rates of shoreline change for the north coast of Alaska, from the U.S.-Canadian border to the Icy Cape region of northern Alaska, have been updated as part of the U.S. Geological Survey’s National Assessment of Shoreline Change Project. Short-term shoreline change rates are reported for the first time. Additional shoreline position data were used to compute rates where the previous rate-of-change assessment only included two shoreline positions at a given location. The calculation of uncertainty associated with the long-term average rates has also been updated to match refined methods used in other study regions of the National Assessment of Shoreline Change Project. The average rates of this report have a reduced amount of uncertainty compared to those presented in the first assessment for this region.
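The sketch below illustrates a generic linear-regression rate of shoreline change at a single shore-perpendicular transect, with a confidence interval on the slope. The dates and positions are hypothetical, and this is only a generic illustration of a rate-of-change calculation, not the project's exact uncertainty method or data.

```python
# Generic sketch of a long-term shoreline change rate at one transect: positions along a
# shore-perpendicular transect are regressed against survey date; the slope is the rate
# in m/yr, with a confidence interval on the slope. Dates and positions are hypothetical.
import numpy as np
from scipy import stats

years     = np.array([1947.5, 1976.3, 2003.7, 2016.6])   # hypothetical survey dates
positions = np.array([0.0, -35.0, -78.0, -99.0])          # hypothetical positions (m, negative = erosion)

fit = stats.linregress(years, positions)
rate = fit.slope                                  # long-term rate of change (m/yr)
t95 = stats.t.ppf(0.975, df=len(years) - 2)
rate_ci = t95 * fit.stderr                        # 95% confidence half-width on the rate
print(f"rate = {rate:.2f} ± {rate_ci:.2f} m/yr")
```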
NASA Technical Reports Server (NTRS)
Corey, Stephen; Carnahan, Richard S., Jr.
1990-01-01
A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the large amounts of data entering the IIMS on a daily basis, information updates will need to be performed automatically, with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends, first, on the design and implementation of information structures that are easily modified and expanded and, second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability, and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, C; Zhang, H; Chen, Y
Purpose: Recently, compressed sensing (CS) based iterative reconstruction (IR) methods are receiving attention for reconstructing high quality cone beam computed tomography (CBCT) images using sparsely sampled or noisy projections. The aim of this study is to develop a novel baseline algorithm called Mask Guided Image Reconstruction (MGIR), which can provide superior image quality for both low-dose 3DCBCT and 4DCBCT under a single mathematical framework. Methods: In MGIR, the unknown CBCT volume was mathematically modeled as a combination of two regions where anatomical structures are 1) within the a priori defined mask and 2) outside the mask. Then we update each part of the image alternately through solving minimization problems based on CS-type IR. For low-dose 3DCBCT, the former region is defined as the anatomically complex region, where the focus is on preserving edge information, while the latter region is defined as contrast-uniform and hence is aggressively updated to remove noise/artifacts. In 4DCBCT, the regions are separated as the common static part and moving part. Then, the static volume and moving volumes were updated with global and phase-sorted projections, respectively, to optimize the image quality of both the moving and static parts simultaneously. Results: Examination of the MGIR algorithm showed that high quality low-dose 3DCBCT and 4DCBCT images can be reconstructed without compromising image resolution, imaging dose, or scanning time, respectively. For low-dose 3DCBCT, a clinically viable, high-resolution head-and-neck image can be obtained while cutting the dose by 83%. In 4DCBCT, excellent quality 4DCBCT images could be reconstructed while requiring no more projection data and imaging dose than a typical clinical 3DCBCT scan. Conclusion: The results showed that the image quality of MGIR was superior to that of other published CS-based IR algorithms for both 4DCBCT and low-dose 3DCBCT. This makes our MGIR algorithm potentially useful in various on-line clinical applications. Provisional Patent: UF#15476; WGS Ref. No. U1198.70067US00.
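A highly simplified sketch of the split-region idea is given below: the volume is divided by an a priori mask into an anatomically complex region updated gently (edge-preserving) and a contrast-uniform region smoothed aggressively. A SIRT-style data-fidelity step and generic filters stand in for the CS-type minimization of the actual MGIR algorithm; A, y, and the mask are toy placeholders, and this is not the authors' method.

```python
# Simplified sketch of a mask-guided, two-region update: gentle edge-preserving
# filtering inside the a priori mask, aggressive smoothing outside it, wrapped around a
# SIRT-style data-fidelity step that stands in for the CS-type iterative update of MGIR.
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def mask_guided_recon(A, y, shape, mask, n_iter=50, step=1.0, eps=1e-8):
    x = np.zeros(A.shape[1])
    row_sum = A.sum(axis=1) + eps
    col_sum = A.sum(axis=0) + eps
    for _ in range(n_iter):
        # SIRT-style data-fidelity update (stand-in for the CS-type minimization)
        x = x + step * (A.T @ ((y - A @ x) / row_sum)) / col_sum
        img = x.reshape(shape)
        inside  = median_filter(img, size=3)          # gentle, edge-preserving
        outside = gaussian_filter(img, sigma=3.0)     # aggressive noise/artifact removal
        x = np.where(mask, inside, outside).ravel()   # recombine the two regions
    return x.reshape(shape)

# toy usage
rng = np.random.default_rng(3)
shape = (32, 32)
mask = np.zeros(shape, dtype=bool); mask[8:24, 8:24] = True   # a priori "complex" region
A = rng.random((512, shape[0] * shape[1]))
truth = np.where(mask, 1.0, 0.2)
y = A @ truth.ravel() + rng.normal(0, 1.0, 512)
recon = mask_guided_recon(A, y, shape, mask)
```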
Costs And Savings Associated With Community Water Fluoridation In The United States.
O'Connell, Joan; Rockell, Jennifer; Ouellet, Judith; Tomar, Scott L; Maas, William
2016-12-01
The most comprehensive study of US community water fluoridation program benefits and costs was published in 2001. This study provides updated estimates using an economic model that includes recent data on program costs, dental caries increments, and dental treatments. In 2013 more than 211 million people had access to fluoridated water through community water systems serving 1,000 or more people. Savings associated with dental caries averted in 2013 as a result of fluoridation were estimated to be $32.19 per capita for this population. Based on 2013 estimated costs ($324 million), net savings (savings minus costs) from fluoridation systems were estimated to be $6,469 million and the estimated return on investment, 20.0. While communities should assess their specific costs for continuing or implementing a fluoridation program, these updated findings indicate that program savings are likely to exceed costs. Project HOPE—The People-to-People Health Foundation, Inc.
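The back-of-envelope arithmetic behind the headline figures above is shown below. The published numbers were computed from unrounded population and cost inputs, so the rounded inputs here reproduce the reported $6,469 million net savings and 20.0 return on investment only approximately.

```python
# Back-of-envelope reproduction of the savings arithmetic quoted above; rounded inputs
# only approximately reproduce the reported $6,469 million net savings and ROI of 20.0.
population_served  = 211e6      # people on fluoridated community systems, 2013 (approx.)
savings_per_capita = 32.19      # $ per person, dental caries averted in 2013
program_cost       = 324e6      # $ estimated 2013 program cost

gross_savings = population_served * savings_per_capita
net_savings = gross_savings - program_cost        # roughly $6.47 billion
roi = net_savings / program_cost                  # net savings per dollar spent, roughly 20
print(f"net savings ≈ ${net_savings/1e6:,.0f} M, ROI ≈ {roi:.1f}")
```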
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hwang, Ho-Ling; Davis, Stacy Cagle
2009-12-01
This report is designed to document the analysis process and estimation models currently used by the Federal Highway Administration (FHWA) to estimate the off-highway gasoline consumption and public sector fuel consumption. An overview of the entire FHWA attribution process is provided along with specifics related to the latest update (2008) on the Off-Highway Gasoline Use Model and the Public Use of Gasoline Model. The Off-Highway Gasoline Use Model is made up of five individual modules, one for each of the off-highway categories: agricultural, industrial and commercial, construction, aviation, and marine. This 2008 update of the off-highway models was the second major update (the first model update was conducted during 2002-2003) after they were originally developed in the mid-1990s. The agricultural model methodology, specifically, underwent a significant revision because of changes in data availability since 2003. Some revision to the model was necessary due to removal of certain data elements used in the original estimation method. The revised agricultural model also made use of some newly available information, published by the data source agency in recent years. The other model methodologies were not drastically changed, though many data elements were updated to improve the accuracy of these models. Note that components in the Public Use of Gasoline Model were not updated in 2008. A major challenge in updating estimation methods applied by the public-use model is that they would have to rely on significant new data collection efforts. In addition, due to resource limitations, several components of the models (both off-highway and public-use models) that utilized regression modeling approaches were not recalibrated under the 2008 study. An investigation of the Environmental Protection Agency's NONROAD2005 model was also carried out under the 2008 model update. Results generated from the NONROAD2005 model were analyzed, examined, and compared, to the extent that is possible on the overall totals, to the current FHWA estimates. Because the NONROAD2005 model was designed for emission estimation purposes (i.e., not for measuring fuel consumption), it covers different equipment populations from those the FHWA models were based on. Thus, a direct comparison generally was not possible in most sectors. As a result, NONROAD2005 data were not used in the 2008 update of the FHWA off-highway models. The quality of fuel use estimates directly affects the data quality in many tables published in the Highway Statistics. Although updates have been made to the Off-Highway Gasoline Use Model and the Public Use Gasoline Model, some challenges remain due to aging model equations and discontinuation of data sources.
Prediction-error variance in Bayesian model updating: a comparative study
NASA Astrophysics Data System (ADS)
Asadollahi, Parisa; Li, Jian; Huang, Yong
2017-04-01
In Bayesian model updating, the likelihood function is commonly formulated by stochastic embedding in which the maximum information entropy probability model of prediction error variances plays an important role, and it is a Gaussian distribution subject to the first two moments as constraints. The selection of prediction error variances can be formulated as a model class selection problem, which automatically involves a trade-off between the average data-fit of the model class and the information it extracts from the data. Therefore, it is critical for the robustness in the updating of the structural model especially in the presence of modeling errors. To date, three ways of considering prediction error variances have been seen in the literature: 1) setting constant values empirically, 2) estimating them based on the goodness-of-fit of the measured data, and 3) updating them as uncertain parameters by applying Bayes' Theorem at the model class level. In this paper, the effect of different strategies to deal with the prediction error variances on the model updating performance is investigated explicitly. A six-story shear building model with six uncertain stiffness parameters is employed as an illustrative example. Transitional Markov Chain Monte Carlo is used to draw samples of the posterior probability density function of the structure model parameters as well as the uncertain prediction variances. The different levels of modeling uncertainty and complexity are modeled through three FE models, including a true model, a model with more complexity, and a model with modeling error. Bayesian updating is performed for the three FE models considering the three aforementioned treatments of the prediction error variances. The effect of the number of measurements on the model updating performance is also examined in the study. The results are compared based on model class assessment and indicate that updating the prediction error variances as uncertain parameters at the model class level produces more robust results especially when the number of measurements is small.
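The third treatment above, updating the prediction-error variance as an uncertain parameter, can be sketched as below with a Gaussian likelihood in which both a stiffness-like parameter and the variance are sampled jointly. A random-walk Metropolis sampler and a trivial linear "structural model" stand in for the Transitional MCMC sampler and the shear building FE models of the paper.

```python
# Minimal sketch of treating the prediction-error variance as an uncertain parameter:
# a Gaussian likelihood (per the maximum-entropy argument above) with both theta and
# sigma^2 sampled jointly. Random-walk Metropolis and a linear toy model stand in for
# Transitional MCMC and the structural FE models of the paper.
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0, 1, 20)
data = 2.5 * x + rng.normal(0, 0.1, x.size)        # synthetic measurements

def model(theta):
    return theta * x                                # placeholder structural model

def log_posterior(theta, log_sigma2):
    sigma2 = np.exp(log_sigma2)
    resid = data - model(theta)
    # Gaussian likelihood with unknown prediction-error variance (flat priors assumed)
    return -0.5 * resid.size * np.log(2 * np.pi * sigma2) - resid @ resid / (2 * sigma2)

samples, state = [], np.array([1.0, np.log(0.05)])   # [theta, log sigma^2]
lp = log_posterior(*state)
for _ in range(20000):
    prop = state + rng.normal(0, [0.05, 0.2])        # random-walk proposal
    lp_prop = log_posterior(*prop)
    if np.log(rng.random()) < lp_prop - lp:          # Metropolis accept/reject
        state, lp = prop, lp_prop
    samples.append(state.copy())
samples = np.array(samples[5000:])                   # discard burn-in
theta_hat, sigma2_hat = samples[:, 0].mean(), np.exp(samples[:, 1]).mean()
```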
ERIC Educational Resources Information Center
Adams, Keith K.; Whiteman, Richard E.
This report describes an in-service industry exchange project conducted by Cerritos College during which five faculty members returned to the work place to update their skills and gather information useful in curriculum development. First, the project's objectives are delineated, covering: (1) formation of the Project Advisory Committee and…
Evaluating Quality in Educational Spaces: OECD/CELE Pilot Project
ERIC Educational Resources Information Center
von Ahlefeld, Hannah
2009-01-01
CELE's International Pilot Project on Evaluating Quality in Educational Spaces aims to assist education authorities, schools and others to maximise the use of and investment in learning environments. This article provides an update on the pilot project, which is currently being implemented in Brazil, Mexico, New Zealand, Portugal and the United…
Monitoring and Evaluating Nonpoint Source Watershed Projects
This guide is written primarily for those who develop and implement monitoring plans for watershed management projects. it can also be used evaluate the technical merits of monitoring proposals they might sponsor. It is an update to the 1997 Guide.
KY-CTDS : Kentucky contract time determination system
DOT National Transportation Integrated Search
2000-06-30
This research project was to update the KyTC's planning tool used for the determination of contract time allotted for contractors to complete highway construction projects in the Commonwealth of Kentucky. Although this issue was initially raised by t...
NASA Astrophysics Data System (ADS)
Berta-Thompson, Zachory K.; Irwin, Jonathan; Charbonneau, David; Newton, Elisabeth R.; Dittmann, Jason
2014-06-01
The MEarth Project is an ongoing all-sky survey for Earth-like planets transiting the closest, smallest M dwarfs. MEarth aims to find good targets for atmospheric characterization with JWST and the next generation of enormous ground-based telescopes. MEarth's yearly data releases, containing precise light curves of nearby mid-to-late M dwarfs, provide a unique window into the photometric variability of the stars that will forever be among the most interesting targets in the search for potentially habitable exoplanets. We present a status update on the MEarth Project, including a detailed map of the progress we’ve made so far with 8 telescopes in the Northern hemisphere and promising early results from our new installation of 8 more telescopes in the Southern hemisphere.
Assessing the performance of eight real-time updating models and procedures for the Brosna River
NASA Astrophysics Data System (ADS)
Goswami, M.; O'Connor, K. M.; Bhattarai, K. P.; Shamseldin, A. Y.
2005-10-01
The flow forecasting performance of eight updating models, incorporated in the Galway River Flow Modelling and Forecasting System (GFMFS), was assessed using daily data (rainfall, evaporation and discharge) of the Irish Brosna catchment (1207 km2), considering their one to six days lead-time discharge forecasts. The Perfect Forecast of Input over the Forecast Lead-time scenario was adopted, where required, in place of actual rainfall forecasts. The eight updating models were: (i) the standard linear Auto-Regressive (AR) model, applied to the forecast errors (residuals) of a simulation (non-updating) rainfall-runoff model; (ii) the Neural Network Updating (NNU) model, also using such residuals as input; (iii) the Linear Transfer Function (LTF) model, applied to the simulated and the recently observed discharges; (iv) the Non-linear Auto-Regressive eXogenous-Input Model (NARXM), also a neural network-type structure, but having wide options of using recently observed values of one or more of the three data series, together with non-updated simulated outflows, as inputs; (v) the Parametric Simple Linear Model (PSLM), of LTF-type, using recent rainfall and observed discharge data; (vi) the Parametric Linear perturbation Model (PLPM), also of LTF-type, using recent rainfall and observed discharge data, (vii) n-AR, an AR model applied to the observed discharge series only, as a naïve updating model; and (viii) n-NARXM, a naive form of the NARXM, using only the observed discharge data, excluding exogenous inputs. The five GFMFS simulation (non-updating) models used were the non-parametric and parametric forms of the Simple Linear Model and of the Linear Perturbation Model, the Linearly-Varying Gain Factor Model, the Artificial Neural Network Model, and the conceptual Soil Moisture Accounting and Routing (SMAR) model. As the SMAR model performance was found to be the best among these models, in terms of the Nash-Sutcliffe R2 value, both in calibration and in verification, the simulated outflows of this model only were selected for the subsequent exercise of producing updated discharge forecasts. All the eight forms of updating models for producing lead-time discharge forecasts were found to be capable of producing relatively good lead-1 (1-day ahead) forecasts, with R2 values almost 90% or above. However, for higher lead time forecasts, only three updating models, viz., NARXM, LTF, and NNU, were found to be suitable, with lead-6 values of R2 about 90% or higher. Graphical comparisons were made of the lead-time forecasts for the two largest floods, one in the calibration period and the other in the verification period.
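The simplest of the eight updating schemes above, an AR model fitted to the simulation residuals, is sketched below together with the Nash-Sutcliffe R² used for assessment. The synthetic series, the fixed AR order of 2, and the lead-1 setup are illustrative assumptions, not the GFMFS implementation or the Brosna data.

```python
# Sketch of AR error-updating: an AR model fitted to the residuals of the simulation
# (non-updating) model corrects the lead-time forecast, and skill is scored with the
# Nash-Sutcliffe R^2. Synthetic series and a fixed AR(2) order are illustrative only.
import numpy as np

def fit_ar(e, p=2):
    """Least-squares AR coefficients for the simulation-model residuals."""
    X = np.column_stack([e[p - k - 1 : len(e) - k - 1] for k in range(p)])
    coef, *_ = np.linalg.lstsq(X, e[p:], rcond=None)
    return coef                                   # coef[0] multiplies lag-1, coef[1] lag-2, ...

def ar_forecast(e_recent, coef, lead):
    """Iterate the AR recursion `lead` steps ahead of the latest observed residuals."""
    hist = list(e_recent)                          # chronological order, most recent last
    for _ in range(lead):
        hist.append(float(np.dot(coef, hist[-1:-len(coef) - 1:-1])))
    return hist[-1]

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# synthetic observed and simulated discharge with a slowly varying simulation bias
rng = np.random.default_rng(5)
n = 400
observed = 10 + 5 * np.sin(np.arange(n) / 20.0) + rng.normal(0, 0.5, n)
simulated = observed - (2 * np.sin(np.arange(n) / 60.0) + rng.normal(0, 0.3, n))
resid = observed - simulated

coef = fit_ar(resid[:300], p=2)                    # calibrate on the first 300 days
lead = 1
updated = np.array([simulated[t + lead] + ar_forecast(resid[t - 1:t + 1], coef, lead)
                    for t in range(300, n - lead)])
print(nash_sutcliffe(observed[301:n], updated))    # lead-1 updated-forecast skill
```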
ERIC Educational Resources Information Center
Hughes, Carroll W.; Emslie, Graham J.; Crismon, M. Lynn; Posner, Kelly; Birmaher, Boris; Ryan, Neal; Jensen, Peter; Curry, John; Vitiello, Benedetto; Lopez, Molly; Shon, Steve P.; Pliszka, Steven R.; Trivedi, Madhukar H.
2007-01-01
Objective: To revise and update consensus guidelines for medication treatment algorithms for childhood major depressive disorder based on new scientific evidence and expert clinical consensus when evidence is lacking. Method: A consensus conference was held January 13-14, 2005, that included academic clinicians and researchers, practicing…
Future of America's Forests and Rangelands: Update to the 2010 Resources Planning Act Assessment
Forest Service U.S. Department of Agriculture
2016-01-01
The Update to the 2010 Resources Planning Act (RPA) Assessment summarizes findings about the status, trends, and projected future of forests, rangelands, wildlife, biodiversity, water, outdoor recreation, and urban forests, as well as the effects of climate change upon these resources. Varying assumptions about population and economic growth, land use change, and...
Documentation of structures branch programs and program updates. Project 3200
NASA Technical Reports Server (NTRS)
Probe, D. G.
1975-01-01
Update programming of applications programs for the integrated structural analysis system is reported. An attempt is made to lay out a standard document format for the preparation of program documents. Documentation which involves changes, additions, and I/O capability revisions to existing programs includes a checklist which should be reviewed each time a programming effort is documented.
Update on the development of cotton gin PM2.5 emission factors for EPA's AP-42
USDA-ARS?s Scientific Manuscript database
A cotton ginning industry-supported project was initiated in 2008 to update the U.S. Environmental Protection Agency’s (EPA) Compilation of Air Pollution Emission Factors (AP-42) to include PM2.5 emission factors. This study develops emission factors from the PM2.5 emission factor data collected fro...
An Online Risk Monitor System (ORMS) to Increase Safety and Security Levels in Industry
NASA Astrophysics Data System (ADS)
Zubair, M.; Rahman, Khalil Ur; Hassan, Mehmood Ul
2013-12-01
The main idea of this research is to develop an Online Risk Monitor System (ORMS) based on Living Probabilistic Safety Assessment (LPSA). The article highlights the essential features and functions of ORMS. The basic models and modules, such as the Reliability Data Update Model (RDUM), running time update, redundant system unavailability update, Engineered Safety Features (ESF) unavailability update, and general system update, are described in this study. ORMS not only provides quantitative analysis but also highlights qualitative aspects of risk measures. ORMS is capable of automatically updating the online risk models and reliability parameters of equipment. ORMS can support the decision-making process of operators and managers in nuclear power plants.
NASA Astrophysics Data System (ADS)
Gonzalez-Nicolas, A.; Cihan, A.; Birkholzer, J. T.; Petrusak, R.; Zhou, Q.; Riestenberg, D. E.; Trautz, R. C.; Godec, M.
2016-12-01
Industrial-scale injection of CO2 into the subsurface can cause reservoir pressure increases that must be properly controlled to prevent any potential environmental impact. Excessive pressure buildup in the reservoir may result in groundwater contamination stemming from leakage through conductive pathways, such as improperly plugged abandoned wells or distant faults, and the potential for fault reactivation and possibly seal breaching. Brine extraction is a viable approach for managing formation pressure, effective stress, and plume movement during industrial-scale CO2 injection projects. The main objective of this study is to investigate different pressure management strategies involving active brine extraction and passive pressure relief wells. Adaptive optimized management of CO2 storage projects utilizes advanced automated optimization algorithms and suitable process models. The adaptive management integrates monitoring, forward modeling, inversion modeling and optimization through an iterative process. In this study, we employ an adaptive framework primarily to understand how initial site characterization and the frequency of model update (calibration) and optimization calculations for controlling extraction rates based on monitoring data affect the accuracy and success of the management, without violating pressure buildup constraints in the subsurface reservoir system. We will present results of applying the adaptive framework to test the appropriateness of different management strategies for a realistic field injection project.
NASA Astrophysics Data System (ADS)
Trout, Joseph; Manson, J. Russell; King, David; Decicco, Nicolas; Prince, Alyssa; di Mercurio, Alexis; Rios, Manual
2017-01-01
Wake Vortex Turbulence is the turbulence generated by an aircraft in flight. This turbulence is created by vortices at the tips of the wing that may decay slowly and persist for several minutes after creation. These vortices and turbulence are hazardous to other aircraft in the vicinity. The strength, formation and lifetime of the turbulence and vortices are affected by many factors, including the weather. Here we present the final results of the pilot project's investigation of low-level wind fields generated by the Weather Research and Forecasting Model and an analysis of historical data. The findings from the historical data and the data simulations were used as inputs for the computational fluid dynamics model (OpenFOAM) to show that the vortices could be simulated using OpenFOAM. Presented here are the updated results from a research grant, ``A Pilot Project to Investigate Wake Vortex Patterns and Weather Patterns at the Atlantic City Airport by the Stockton University and the FAA''.
Spatial fuel data products of the LANDFIRE Project
Reeves, M.C.; Ryan, K.C.; Rollins, M.G.; Thompson, T.G.
2009-01-01
The Landscape Fire and Resource Management Planning Tools (LANDFIRE) Project is mapping wildland fuels, vegetation, and fire regime characteristics across the United States. The LANDFIRE project is unique because of its national scope, creating an integrated product suite at 30-m spatial resolution and complete spatial coverage of all lands within the 50 states. Here we describe development of the LANDFIRE wildland fuels data layers for the conterminous 48 states: surface fire behavior fuel models, canopy bulk density, canopy base height, canopy cover, and canopy height. Surface fire behavior fuel models are mapped by developing crosswalks to vegetation structure and composition created by LANDFIRE. Canopy fuels are mapped using regression trees relating field-referenced estimates of canopy base height and canopy bulk density to satellite imagery, biophysical gradients and vegetation structure and composition data. Here we focus on the methods and data used to create the fuel data products, discuss problems encountered with the data, provide an accuracy assessment, demonstrate recent use of the data during the 2007 fire season, and discuss ideas for updating, maintaining and improving LANDFIRE fuel data products.
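The regression-tree step described above, relating field-referenced canopy fuel estimates to gridded predictors and then applying the fitted tree wall-to-wall, can be sketched as follows. The synthetic plot table and scikit-learn's CART are illustrative stand-ins for the LANDFIRE production workflow and its actual predictor layers.

```python
# Sketch of the regression-tree mapping step: field-referenced canopy bulk density plots
# are related to gridded predictors (spectral, biophysical, structural) and the fitted
# tree is applied to every 30-m pixel. Synthetic data and sklearn CART are stand-ins.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(6)
n_plots = 500
predictors = np.column_stack([
    rng.uniform(0, 1, n_plots),      # e.g. a spectral index
    rng.uniform(0, 3000, n_plots),   # e.g. elevation (m)
    rng.uniform(0, 100, n_plots),    # e.g. canopy cover (%)
])
canopy_bulk_density = (0.05 * predictors[:, 2] / 100
                       + 0.02 * predictors[:, 0]
                       + rng.normal(0, 0.01, n_plots))      # kg m^-3, synthetic

tree = DecisionTreeRegressor(max_depth=6, min_samples_leaf=20).fit(
    predictors, canopy_bulk_density)

# "wall-to-wall" application to new pixels' predictor stacks (here: random placeholders)
pixels = np.column_stack([rng.uniform(0, 1, 10),
                          rng.uniform(0, 3000, 10),
                          rng.uniform(0, 100, 10)])
cbd_map = tree.predict(pixels)
```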
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liegey, Lauren Rene; Wilcox, Trevor; Mckinney, Gregg Walter
2015-08-07
My internship program was the Domestic Nuclear Detection Office Summer Internship Program. I worked at Los Alamos National Laboratory with Trevor A. Wilcox and Gregg W. McKinney in the NEN-5 group. My project title was “MCNP Physical Model Interoperability & Validation”. The goal of my project was to write a program to predict the solar modulation parameter for dates in the future and then implement it into MCNP6. This update to MCNP6 can be used to calculate the background more precisely, which is an important factor in being able to detect Special Nuclear Material. We will share our work in a published American Nuclear Society (ANS) paper, an ANS presentation, and a LANL student poster session. Through this project, I gained skills in programming, computing, and using MCNP. I also gained experience that will help me decide on a career or perhaps obtain employment in the future.
Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2013)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David C.; Syamlal, Madhava; Cottrell, Roger
2013-09-30
The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools asmore » necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories’ core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI’s industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI’s academic participants (Carnegie Mellon University, Princeton University, West Virginia University, Boston University and the University of Texas at Austin) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 13, CCSI announced the initial release of its first set of computational tools and models during the October 2012 meeting of its Industry Advisory Board. This initial release led to five companies licensing the CCSI Toolset under a Test and Evaluation Agreement this year. By the end of FY13, the CCSI Technical Team had completed development of an updated suite of computational tools and models. The list below summarizes the new and enhanced toolset components that were released following comprehensive testing during October 2013. 1. FOQUS. Framework for Optimization and Quantification of Uncertainty and Sensitivity. Package includes: FOQUS Graphic User Interface (GUI), simulation-based optimization engine, Turbine Client, and heat integration capabilities. There is also an updated simulation interface and new configuration GUI for connecting Aspen Plus or Aspen Custom Modeler (ACM) simulations to FOQUS and the Turbine Science Gateway. 2. A new MFIX-based Computational Fluid Dynamics (CFD) model to predict particle attrition. 3. A new dynamic reduced model (RM) builder, which generates computationally efficient RMs of the behavior of a dynamic system. 4. A completely re-written version of the algebraic surrogate model builder for optimization (ALAMO). 
The new version is several orders of magnitude faster than the initial release and eliminates the MATLAB dependency. 5. A new suite of high resolution filtered models for the hydrodynamics associated with horizontal cylindrical objects in a flow path. 6. The new Turbine Science Gateway (Cluster), which supports FOQUS for running multiple simulations for optimization or UQ using a local computer or cluster. 7. A new statistical tool (BSS-ANOVA-UQ) for calibration and validation of CFD models. 8. A new basic data submodel in Aspen Plus format for a representative high viscosity capture solvent, 2-MPZ system. 9. An updated RM tool for CFD (REVEAL) that can create an RM from MFIX. A new lightweight, stand-alone version will be available in late 2013. 10. An updated RM integration tool to convert the RM from REVEAL into a CAPE-OPEN or ACM model for use in a process simulator. 11. An updated suite of unified steady-state and dynamic process models for solid sorbent carbon capture, including bubbling fluidized bed and moving bed reactors. 12. An updated and unified set of compressor models, including a steady-state design point model and a dynamic model with surge detection. 13. A new framework for the synthesis and optimization of coal oxycombustion power plants using advanced optimization algorithms. This release focuses on modeling and optimization of a cryogenic air separation unit (ASU). 14. A new technical risk model in spreadsheet format. 15. An updated version of the sorbent kinetic/equilibrium model for parameter estimation for the 1st generation sorbent model. 16. An updated process synthesis superstructure model to determine optimal process configurations utilizing surrogate models from ALAMO for adsorption and regeneration in a solid sorbent process. 17. Validation models for the NETL Carbon Capture Unit utilizing sorbent AX. Additional validation models will be available for sorbent 32D in 2014. 18. An updated hollow fiber membrane model and system example for carbon capture. 19. An updated reference power plant model in Thermoflex that includes additional steam extraction and reinjection points to enable the heat integration module. 20. An updated financial risk model in spreadsheet format.
Jacob, Mathews; Blu, Thierry; Vaillant, Cedric; Maddocks, John H; Unser, Michael
2006-01-01
We introduce a three-dimensional (3-D) parametric active contour algorithm for the shape estimation of DNA molecules from stereo cryo-electron micrographs. We estimate the shape by matching the projections of a 3-D global shape model with the micrographs; we choose the global model as a 3-D filament with a B-spline skeleton and a specified radial profile. The active contour algorithm iteratively updates the B-spline coefficients, which requires us to evaluate the projections and match them with the micrographs at every iteration. Since the evaluation of the projections of the global model is computationally expensive, we propose a fast algorithm based on locally approximating it by elongated blob-like templates. We introduce the concept of projection-steerability and derive a projection-steerable elongated template. Since the two-dimensional projections of such a blob at any 3-D orientation can be expressed as a linear combination of a few basis functions, matching the projections of such a 3-D template involves evaluating a weighted sum of inner products between the basis functions and the micrographs. The weights are simple functions of the 3-D orientation and the inner-products are evaluated efficiently by separable filtering. We choose an internal energy term that penalizes the average curvature magnitude. Since the exact length of the DNA molecule is known a priori, we introduce a constraint energy term that forces the curve to have this specified length. The sum of these energies along with the image energy derived from the matching process is minimized using the conjugate gradients algorithm. We validate the algorithm using real, as well as simulated, data and show that it performs well.
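The energy described above combines a curvature penalty, a fixed-length constraint, and an image-matching term, minimized with conjugate gradients. The following is a minimal sketch of that kind of minimization, not the authors' implementation: the curve is a simple polyline rather than a B-spline, the projection-matching image energy is left as a placeholder, and the length value and weights are made up.

```python
import numpy as np
from scipy.optimize import minimize

L_known = 100.0   # assumed known contour length (illustrative units)
n = 50            # number of curve samples

def energies(x):
    pts = x.reshape(n, 3)
    d = np.diff(pts, axis=0)                         # segment vectors
    seg = np.linalg.norm(d, axis=1)
    length = seg.sum()
    t = d / seg[:, None]                             # unit tangents
    cos_ang = np.clip((t[:-1] * t[1:]).sum(axis=1), -1.0, 1.0)
    curv = np.arccos(cos_ang) / (0.5 * (seg[:-1] + seg[1:]))
    e_int = np.mean(np.abs(curv))                    # average curvature magnitude
    e_len = (length - L_known) ** 2                  # known-length constraint
    e_img = 0.0                                      # placeholder for the projection-matching term
    return e_int + 10.0 * e_len + e_img

# noisy straight-line initialization, then conjugate-gradient minimization
rng = np.random.default_rng(0)
x0 = np.column_stack([np.linspace(0, L_known, n),
                      np.zeros(n), np.zeros(n)]).ravel()
x0 += 0.5 * rng.standard_normal(x0.shape)
res = minimize(energies, x0, method="CG")
print(res.fun)
```

In the actual algorithm the optimization variables would be B-spline coefficients and e_img would come from matching the model's projections against the two micrographs.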
Faunt, C.C.; Hanson, R.T.; Martin, P.; Schmid, W.
2011-01-01
California's Central Valley has been one of the most productive agricultural regions in the world for more than 50 years. To better understand the groundwater availability in the valley, the U.S. Geological Survey (USGS) developed the Central Valley hydrologic model (CVHM). Because of recent water-level declines and renewed subsidence, the CVHM is being updated to better simulate the geohydrologic system. The CVHM updates and refinements can be grouped into two general categories: (1) model code changes and (2) data updates. The CVHM updates and refinements will require that the model be recalibrated. The updated CVHM will provide a detailed transient analysis of changes in groundwater availability and flow paths in relation to climatic variability, urbanization, stream flow, and changes in irrigated agricultural practices and crops. The updated CVHM is particularly focused on more accurately simulating the locations and magnitudes of land subsidence. The intent of the updated CVHM is to help scientists better understand the availability and sustainability of water resources and the interaction of groundwater levels with land subsidence. © 2011 ASCE.
Faunt, Claudia C.; Hanson, Randall T.; Martin, Peter; Schmid, Wolfgang
2011-01-01
California's Central Valley has been one of the most productive agricultural regions in the world for more than 50 years. To better understand the groundwater availability in the valley, the U.S. Geological Survey (USGS) developed the Central Valley hydrologic model (CVHM). Because of recent water-level declines and renewed subsidence, the CVHM is being updated to better simulate the geohydrologic system. The CVHM updates and refinements can be grouped into two general categories: (1) model code changes and (2) data updates. The CVHM updates and refinements will require that the model be recalibrated. The updated CVHM will provide a detailed transient analysis of changes in groundwater availability and flow paths in relation to climatic variability, urbanization, stream flow, and changes in irrigated agricultural practices and crops. The updated CVHM is particularly focused on more accurately simulating the locations and magnitudes of land subsidence. The intent of the updated CVHM is to help scientists better understand the availability and sustainability of water resources and the interaction of groundwater levels with land subsidence.
Reality of using a model from local governments' perspective-How science community can help?
NASA Astrophysics Data System (ADS)
Mirzazad, S.
2016-12-01
Local governments across the US use historic data to approve capital improvement projects and update comprehensive/zoning plans. Due to the effects of climate change, historic data sets are no longer suitable, which requires communities to use climate models to project the future. However, the use of climate models also presents challenges for local governments, such as: Variations between models: Because model-development methodologies vary, different climate models provide different end results. A local government's decision concerning which climate model to use is tricky because the model drives policy direction and infrastructure investments that can be both expensive and controversial. Communicating the gaps of a model: There are always uncertainties associated with modeling. These gaps may range from the scale of a model to the type of data used in modeling. Effectively communicating this to a community is crucial to gain political support. Managing politics associated with using a model: In many cases, models project changes to the built environment that will detrimentally affect private property owners. This can result in strong pushback from the community and could threaten the local tax base. Scientists have important roles, from the development and delivery of models to assisting local governments in navigating these challenges. Bringing in entities with experience working with local governments can contribute to a successful outcome. In this proposed session, ICLEI-Local Governments for Sustainability will use the USGS CoSMoS as a case study for lessons learned in establishing a framework for effective collaboration between local governments and the science community.
MOS 2.0: Modeling the Next Revolutionary Mission Operations System
NASA Technical Reports Server (NTRS)
Delp, Christopher L.; Bindschadler, Duane; Wollaeger, Ryan; Carrion, Carlos; McCullar, Michelle; Jackson, Maddalena; Sarrel, Marc; Anderson, Louise; Lam, Doris
2011-01-01
Designed and implemented in the 1980s, the Advanced Multi-Mission Operations System (AMMOS) was a breakthrough for deep-space NASA missions, enabling significant reductions in the cost and risk of implementing ground systems. By designing a framework for use across multiple missions and adaptability to specific mission needs, AMMOS developers created a set of applications that have operated dozens of deep-space robotic missions over the past 30 years. We seek to leverage advances in technology and practice of architecting and systems engineering, using model-based approaches to update the AMMOS. We therefore revisit fundamental aspects of the AMMOS, resulting in a major update to the Mission Operations System (MOS): MOS 2.0. This update will ensure that the MOS can support an increasing range of mission types (such as orbiters, landers, rovers, penetrators and balloons) and that the operations systems for deep-space robotic missions can reap the benefits of an iterative multi-mission framework. This paper reports on the first phase of this major update. Here we describe the methods and formal semantics used to address MOS 2.0 architecture and some early results. Early benefits of this approach include improved stakeholder input and buy-in, the ability to articulate and focus effort on key, system-wide principles, and efficiency gains obtained by use of well-architected design patterns and the use of models to improve the quality of documentation and decrease the effort required to produce and maintain it. We find that such methods facilitate reasoning, simulation, and analysis of the system design in terms of design impacts, generation of products (e.g., project-review and software-delivery products), and use of formal process descriptions to enable goal-based operations. This initial phase yields a forward-looking and principled MOS 2.0 architectural vision, which considers both the mission-specific context and long-term system sustainability.
Calibration, Projection, and Final Image Products of MESSENGER's Mercury Dual Imaging System
NASA Astrophysics Data System (ADS)
Denevi, Brett W.; Chabot, Nancy L.; Murchie, Scott L.; Becker, Kris J.; Blewett, David T.; Domingue, Deborah L.; Ernst, Carolyn M.; Hash, Christopher D.; Hawkins, S. Edward; Keller, Mary R.; Laslo, Nori R.; Nair, Hari; Robinson, Mark S.; Seelos, Frank P.; Stephens, Grant K.; Turner, F. Scott; Solomon, Sean C.
2018-02-01
We present an overview of the operations, calibration, geodetic control, photometric standardization, and processing of images from the Mercury Dual Imaging System (MDIS) acquired during the orbital phase of the MESSENGER spacecraft's mission at Mercury (18 March 2011-30 April 2015). We also provide a summary of all of the MDIS products that are available in NASA's Planetary Data System (PDS). Updates to the radiometric calibration included slight modification of the frame-transfer smear correction, updates to the flat fields of some wide-angle camera (WAC) filters, a new model for the temperature dependence of narrow-angle camera (NAC) and WAC sensitivity, and an empirical correction for temporal changes in WAC responsivity. Further, efforts to characterize scattered light in the WAC system are described, along with a mosaic-dependent correction for scattered light that was derived for two regional mosaics. Updates to the geometric calibration focused on the focal lengths and distortions of the NAC and all WAC filters, NAC-WAC alignment, and calibration of the MDIS pivot angle and base. Additionally, two control networks were derived so that the majority of MDIS images can be co-registered with sub-pixel accuracy; the larger of the two control networks was also used to create a global digital elevation model. Finally, we describe the image processing and photometric standardization parameters used in the creation of the MDIS advanced products in the PDS, which include seven large-scale mosaics, numerous targeted local mosaics, and a set of digital elevation models ranging in scale from local to global.
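The corrections listed above (frame-transfer smear, flat fields, temperature-dependent sensitivity, temporal responsivity trend) form a radiometric calibration chain. The sketch below is only a generic illustration of such a chain under assumed functional forms; the function name, coefficients, and ordering are placeholders and do not reproduce the actual MDIS calibration.

```python
import numpy as np

def calibrate_frame(raw, dark, flat, t_exp, T_ccd, days_since_launch,
                    a_T=1e-3, b_resp=5e-5, t_transfer=0.004):
    """Generic radiometric calibration chain of the kind described above.
    All coefficients and functional forms are illustrative placeholders."""
    img = raw.astype(float) - dark                     # dark/bias removal
    # frame-transfer smear: remove signal accumulated during readout,
    # approximated by the column mean scaled by the transfer/exposure ratio
    img -= img.mean(axis=0, keepdims=True) * (t_transfer / t_exp)
    img /= flat                                        # flat-field correction
    img /= 1.0 + a_T * (T_ccd - 20.0)                  # temperature-dependent sensitivity
    img /= 1.0 - b_resp * days_since_launch            # temporal responsivity trend
    return img / t_exp                                 # normalize by exposure time

frame = calibrate_frame(np.random.randint(0, 4000, (512, 512)),
                        dark=50.0, flat=np.ones((512, 512)),
                        t_exp=0.05, T_ccd=25.0, days_since_launch=900)
```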
NASA Astrophysics Data System (ADS)
Balla, Vamsi Krishna; Coox, Laurens; Deckers, Elke; Plyumers, Bert; Desmet, Wim; Marudachalam, Kannan
2018-01-01
The vibration response of a component or system can be predicted using the finite element method after ensuring numerical models represent realistic behaviour of the actual system under study. One of the methods to build high-fidelity finite element models is through a model updating procedure. In this work, a novel model updating method of deep-drawn components is demonstrated. Since the component is manufactured with a high draw ratio, significant deviations in both profile and thickness distributions occurred in the manufacturing process. A conventional model updating, involving Young's modulus, density and damping ratios, does not lead to a satisfactory match between simulated and experimental results. Hence a new model updating process is proposed, where geometry shape variables are incorporated, by carrying out morphing of the finite element model. This morphing process imitates the changes that occurred during the deep drawing process. An optimization procedure that uses the Global Response Surface Method (GRSM) algorithm to maximize diagonal terms of the Modal Assurance Criterion (MAC) matrix is presented. This optimization results in a more accurate finite element model. The advantage of the proposed methodology is that the CAD surface of the updated finite element model can be readily obtained after optimization. This CAD model can be used for carrying out analysis, as it represents the manufactured part more accurately. Hence, simulations performed using this updated model with an accurate geometry, will therefore yield more reliable results.
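The optimization described above maximizes the diagonal terms of the Modal Assurance Criterion (MAC) matrix between measured and simulated mode shapes. Below is a minimal sketch of the MAC computation and the corresponding negated objective that a generic optimizer could minimize; it is not the authors' GRSM-based implementation, and the array shapes and toy data are assumptions.

```python
import numpy as np

def mac_matrix(phi_exp, phi_fe):
    """MAC between experimental and FE mode shapes
    (columns = modes, rows = matching measurement DOFs)."""
    num = np.abs(phi_exp.T @ phi_fe) ** 2
    den = np.outer((phi_exp ** 2).sum(axis=0), (phi_fe ** 2).sum(axis=0))
    return num / den

def updating_objective(phi_exp, phi_fe):
    # maximize the MAC diagonal  <=>  minimize its negative trace
    return -np.trace(mac_matrix(phi_exp, phi_fe))

# toy example: 3 modes measured at 20 DOFs, FE modes close to the test modes
rng = np.random.default_rng(0)
phi_exp = rng.standard_normal((20, 3))
phi_fe = phi_exp + 0.05 * rng.standard_normal((20, 3))
print(mac_matrix(phi_exp, phi_fe).diagonal())   # diagonal values near 1
```

In the procedure described, the morphing shape variables (plus material parameters) would be the design variables that the optimizer adjusts to drive these diagonal values toward unity.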
Kennedy Space Center Director Update
2014-03-06
CAPE CANAVERAL, Fla. - Community leaders, business executives, educators, and state and local government leaders were updated on NASA Kennedy Space Center programs and accomplishments during KSC Center Director Bob Cabana’s Center Director Update at the Debus Center at the Kennedy Space Center Visitor Complex in Florida. Attendees talked with Cabana and other senior Kennedy managers and visited displays featuring updates on Kennedy programs and projects, including International Space Station, Commercial Crew, Ground System Development and Operations, Launch Services, Center Planning and Development, Technology, KSC Swamp Works and NASA Education. The morning concluded with a tour of the new Space Shuttle Atlantis exhibit at the visitor complex. For more information, visit http://www.nasa.gov/kennedy. Photo credit: NASA/Daniel Casper
Implication of Agricultural Land Use Change on Regional Climate Projection
NASA Astrophysics Data System (ADS)
Wang, G.; Ahmed, K. F.; You, L.
2015-12-01
Agricultural land use plays an important role in land-atmosphere interaction. Agricultural activity is one of the most important processes driving human-induced land use land cover change (LULCC) in a region. In addition to future socioeconomic changes, climate-induced changes in crop yield represent another important factor shaping agricultural land use. As a feedback, the resulting LULCC influences the direction and magnitude of global, regional and local climate change by altering Earth's radiative equilibrium. Therefore, assessment of climate change impact on future agricultural land use and its feedback is of great importance in climate change studies. In this study, to evaluate the feedback of projected land use changes to the regional climate in West Africa, we employed an asynchronous coupling between a regional climate model (RegCM) and a prototype land use projection model (LandPro). The LandPro model, which was developed to project the future change in agricultural land use and the resulting shift in natural vegetation in West Africa, is a spatially explicit model that can account for both climate and socioeconomic changes in projecting future land use changes. In the asynchronously coupled modeling framework, LandPro was run every five years during the period 2005-2050, accounting for climate-induced change in crop yield and socioeconomic changes, to project the land use pattern by the mid-21st century. Climate data at 0.5° resolution were derived from RegCM to drive the crop model DSSAT for each of the five-year periods to simulate crop yields, which were then provided as input data to LandPro. Subsequently, the land use land cover map required to run RegCM was updated every five years using the outputs from the LandPro simulations. Results from the coupled model simulations improve the understanding of climate change impact on future land use and the resulting feedback to regional climate.
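The asynchronous coupling cycle described above (RegCM climate drives DSSAT yields, yields and socioeconomic assumptions drive LandPro, and the updated land cover map is fed back to RegCM every five years) can be summarized schematically. Every function below is a hypothetical stub introduced only to illustrate the loop structure, not a real model interface.

```python
# Schematic of the asynchronous RegCM-DSSAT-LandPro coupling cycle.
# All functions are hypothetical stubs, not real model interfaces.
def run_regcm(land_cover, start, end):        # regional climate at ~0.5 degrees
    return {"climate": f"{start}-{end}"}

def run_dssat(climate):                       # climate-driven crop yields
    return {"yield": 1.0}

def run_landpro(yields, socio, year):         # land-use projection
    return {"land_cover": f"LULC_{year}"}

land_cover = {"land_cover": "LULC_2005"}      # initial land use / land cover map
socio = {"population_growth": 0.025}          # assumed socioeconomic scenario

for start in range(2005, 2050, 5):            # five-year coupling interval
    climate = run_regcm(land_cover, start, start + 5)
    yields = run_dssat(climate)
    land_cover = run_landpro(yields, socio, start + 5)   # fed back to RegCM next cycle
```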
Deployment of a tool for measuring freeway safety performance.
DOT National Transportation Integrated Search
2011-12-01
This project updated and deployed a freeway safety performance measurement tool, building upon a previous project that developed the core methodology. The tool evaluates the cumulative risk over time of an accident or a particular kind of accident. T...
Payload Documentation Enhancement Project
NASA Technical Reports Server (NTRS)
Brown, Betty G.
1999-01-01
In late 1998, the Space Shuttle Program recognized a need to revitalize its payload accommodations documentation. As a result a payload documentation enhancement project was initiated to review and update payload documentation and improve the accessibility to that documentation by the Space Shuttle user community.
76 FR 14968 - Environmental Impacts Statements; Notice of Availability
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-18
.... 20110076, Draft Supplement, USFS, MT, Grizzly Vegetation and Transportation Management Project, Updated and... Management Actions, Three Rivers Ranger District, Kootenai National Forest, Lincoln County, MT, Comment..., Section 30 Limestone Mining Project, Proposal to Implement Mining Actions, Mystic Ranger District, Black...
NASA Astrophysics Data System (ADS)
Machado, M. R.; Adhikari, S.; Dos Santos, J. M. C.; Arruda, J. R. F.
2018-03-01
Structural parameter estimation is affected not only by measurement noise but also by unknown uncertainties which are present in the system. Deterministic structural model updating methods minimise the difference between experimentally measured data and computational prediction. Sensitivity-based methods are very efficient in solving structural model updating problems. Material and geometrical parameters of the structure such as Poisson's ratio, Young's modulus, mass density, modal damping, etc. are usually considered deterministic and homogeneous. In this paper, the distributed and non-homogeneous characteristics of these parameters are considered in the model updating. The parameters are taken as spatially correlated random fields and are expanded in a spectral Karhunen-Loève (KL) decomposition. Using the KL expansion, the spectral dynamic stiffness matrix of the beam is expanded as a series in terms of discretized parameters, which can be estimated using sensitivity-based model updating techniques. Numerical and experimental tests involving a beam with distributed bending rigidity and mass density are used to verify the proposed method. This extension of standard model updating procedures can enhance the dynamic description of structural dynamic models.
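The abstract treats material and geometric properties as spatially correlated random fields expanded in a Karhunen-Loève (KL) decomposition. The snippet below is a minimal discrete KL sketch under an assumed exponential covariance kernel; the kernel, correlation length, variance, and truncation order are illustrative rather than the paper's values.

```python
import numpy as np

# Discrete KL expansion of a spatially correlated random field,
# e.g. bending rigidity along a beam of unit length.
n, corr_len, sigma, mean = 200, 0.3, 0.1, 1.0
x = np.linspace(0.0, 1.0, n)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)  # covariance matrix

lam, phi = np.linalg.eigh(C)                  # eigen-decomposition of the covariance
idx = np.argsort(lam)[::-1][:10]              # keep the 10 dominant KL modes
lam, phi = lam[idx], phi[:, idx]

xi = np.random.standard_normal(10)            # uncorrelated random coefficients
field = mean + phi @ (np.sqrt(lam) * xi)      # one truncated KL realization
```

In a sensitivity-based updating scheme of the kind described, the retained KL coefficients (here xi) become the discretized parameters estimated from the measured response.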
Near real-time forecasting for cholera decision making in Haiti after Hurricane Matthew
Camacho, Anton; Grandesso, Francesco; Cohuet, Sandra; Lemaitre, Joseph C.; Rinaldo, Andrea
2018-01-01
Computational models of cholera transmission can provide objective insights into the course of an ongoing epidemic and aid decision making on allocation of health care resources. However, models are typically designed, calibrated and interpreted post-hoc. Here, we report the efforts of a team from academia, field research and humanitarian organizations to model in near real-time the Haitian cholera outbreak after Hurricane Matthew in October 2016, to assess risk and to quantitatively estimate the efficacy of a then ongoing vaccination campaign. A rainfall-driven, spatially-explicit meta-community model of cholera transmission was coupled to a data assimilation scheme for computing short-term projections of the epidemic in near real-time. The model was used to forecast cholera incidence for the months after the passage of the hurricane (October-December 2016) and to predict the impact of a planned oral cholera vaccination campaign. Our first projection, from October 29 to December 31, predicted the highest incidence in the departments of Grande Anse and Sud, accounting for about 45% of the total cases in Haiti. The projection included a second peak in cholera incidence in early December largely driven by heavy rainfall forecasts, confirming the urgency for rapid intervention. A second projection (from November 12 to December 31) used updated rainfall forecasts to estimate that 835 cases would be averted by vaccinations in Grande Anse (90% Prediction Interval [PI] 476-1284) and 995 in Sud (90% PI 508-2043). The experience gained by this modeling effort shows that state-of-the-art computational modeling and data-assimilation methods can produce informative near real-time projections of cholera incidence. Collaboration among modelers and field epidemiologists is indispensable to gain fast access to field data and to translate model results into operational recommendations for emergency management during an outbreak. Future efforts should thus draw together multi-disciplinary teams to ensure model outputs are appropriately based, interpreted and communicated. PMID:29768401
Near real-time forecasting for cholera decision making in Haiti after Hurricane Matthew.
Pasetto, Damiano; Finger, Flavio; Camacho, Anton; Grandesso, Francesco; Cohuet, Sandra; Lemaitre, Joseph C; Azman, Andrew S; Luquero, Francisco J; Bertuzzo, Enrico; Rinaldo, Andrea
2018-05-01
Computational models of cholera transmission can provide objective insights into the course of an ongoing epidemic and aid decision making on allocation of health care resources. However, models are typically designed, calibrated and interpreted post-hoc. Here, we report the efforts of a team from academia, field research and humanitarian organizations to model in near real-time the Haitian cholera outbreak after Hurricane Matthew in October 2016, to assess risk and to quantitatively estimate the efficacy of a then ongoing vaccination campaign. A rainfall-driven, spatially-explicit meta-community model of cholera transmission was coupled to a data assimilation scheme for computing short-term projections of the epidemic in near real-time. The model was used to forecast cholera incidence for the months after the passage of the hurricane (October-December 2016) and to predict the impact of a planned oral cholera vaccination campaign. Our first projection, from October 29 to December 31, predicted the highest incidence in the departments of Grande Anse and Sud, accounting for about 45% of the total cases in Haiti. The projection included a second peak in cholera incidence in early December largely driven by heavy rainfall forecasts, confirming the urgency for rapid intervention. A second projection (from November 12 to December 31) used updated rainfall forecasts to estimate that 835 cases would be averted by vaccinations in Grande Anse (90% Prediction Interval [PI] 476-1284) and 995 in Sud (90% PI 508-2043). The experience gained by this modeling effort shows that state-of-the-art computational modeling and data-assimilation methods can produce informative near real-time projections of cholera incidence. Collaboration among modelers and field epidemiologists is indispensable to gain fast access to field data and to translate model results into operational recommendations for emergency management during an outbreak. Future efforts should thus draw together multi-disciplinary teams to ensure model outputs are appropriately based, interpreted and communicated.
Summary of Expansions, Updates, and Results in GREET 2017 Suite of Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Michael; Elgowainy, Amgad; Han, Jeongwoo
This report provides a technical summary of the expansions and updates to the 2017 release of Argonne National Laboratory’s Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET®) model, including references and links to key technical documents related to these expansions and updates. The GREET 2017 release includes an updated version of the GREET1 (the fuel-cycle GREET model) and GREET2 (the vehicle-cycle GREET model), both in the Microsoft Excel platform and in the GREET.net modeling platform. Figure 1 shows the structure of the GREET Excel modeling platform. The .net platform integrates all GREET modules together seamlessly.
NASA Astrophysics Data System (ADS)
Serbin, S. P.; Dietze, M.; Desai, A. R.; LeBauer, D.; Viskari, T.; Kooper, R.; McHenry, K. G.; Townsend, P. A.
2013-12-01
The ability to seamlessly integrate information on vegetation structure and function across a continuum of scales, from field to satellite observations, greatly enhances our ability to understand how terrestrial vegetation-atmosphere interactions change over time and in response to disturbances. In particular, terrestrial ecosystem models require detailed information on ecosystem states and canopy properties in order to properly simulate the fluxes of carbon (C), water and energy from the land to the atmosphere as well as address the vulnerability of ecosystems to environmental and other perturbations. Over the last several decades the amount of available data to constrain ecological predictions has increased substantially, resulting in a progressively data-rich era for global change research. In particular, remote sensing data, specifically optical data (leaf and canopy), offers the potential for an important and direct data constraint on ecosystem model projections of C and energy fluxes. Here we highlight the utility of coupling information provided through the Ecosystem Spectral Information System (EcoSIS) with complex process models through the Predictive Ecosystem Analyzer (PEcAn; http://www.pecanproject.org/) eco-informatics framework as a means to improve the description of canopy optical properties, vegetation composition, and modeled radiation balance. We also present this as an efficient approach for understanding and correcting implicit assumptions and model structural deficiencies. We first illustrate the challenges and issues in adequately characterizing ecosystem fluxes with the Ecosystem Demography model (ED2, Medvigy et al., 2009) due to improper parameterization of leaf and canopy properties, as well as assumptions describing radiative transfer within the canopy. ED2 is especially relevant to these efforts because it contains a sophisticated structure for scaling ecological processes across a range of spatial scales: from the tree-level (demography, physiology) to the distribution of stands across a landscape, which allows for the direct use of remotely sensed data at the appropriate spatial scale. A sensitivity analysis is employed within PEcAn to illustrate the influence of ED2 parameterizations on modeled C and energy fluxes for a northern temperate forest ecosystem as an example of the need for more detailed information on leaf and canopy optical properties. We then demonstrate a data assimilation approach to synthesize spectral data contained within EcoSIS in order to update model parameterizations across key vegetation plant functional types, as well as a means to update vegetation state information (i.e. composition, LAI) and improve the description of radiation transfer through model structural updates. A better understanding of the radiation balance of ecosystems will improve regional and global scale C and energy balance projections.
Lord, J; Willis, S; Eatock, J; Tappenden, P; Trapero-Bertran, M; Miners, A; Crossan, C; Westby, M; Anagnostou, A; Taylor, S; Mavranezouli, I; Wonderling, D; Alderson, P; Ruiz, F
2013-12-01
National Institute for Health and Care Excellence (NICE) clinical guidelines (CGs) make recommendations across large, complex care pathways for broad groups of patients. They rely on cost-effectiveness evidence from the literature and from new analyses for selected high-priority topics. An alternative approach would be to build a model of the full care pathway and to use this as a platform to evaluate the cost-effectiveness of multiple topics across the guideline recommendations. In this project we aimed to test the feasibility of building full guideline models for NICE guidelines and to assess if, and how, such models can be used as a basis for cost-effectiveness analysis (CEA). A 'best evidence' approach was used to inform the model parameters. Data were drawn from the guideline documentation, advice from clinical experts and rapid literature reviews on selected topics. Where possible we relied on good-quality, recent UK systematic reviews and meta-analyses. Two published NICE guidelines were used as case studies: prostate cancer and atrial fibrillation (AF). Discrete event simulation (DES) was used to model the recommended care pathways and to estimate consequent costs and outcomes. For each guideline, researchers not involved in model development collated a shortlist of topics suggested for updating. The modelling teams then attempted to evaluate options related to these topics. Cost-effectiveness results were compared with opinions about the importance of the topics elicited in a survey of stakeholders. The modelling teams developed simulations of the guideline pathways and disease processes. Development took longer and required more analytical time than anticipated. Estimates of cost-effectiveness were produced for six of the nine prostate cancer topics considered, and for five of eight AF topics. The other topics were not evaluated owing to lack of data or time constraints. The modelled results suggested 'economic priorities' for an update that differed from priorities expressed in the stakeholder survey. We did not conduct systematic reviews to inform the model parameters, and so the results might not reflect all current evidence. Data limitations and time constraints restricted the number of analyses that we could conduct. We were also unable to obtain feedback from guideline stakeholders about the usefulness of the models within project time scales. Discrete event simulation can be used to model full guideline pathways for CEA, although this requires a substantial investment of clinical and analytic time and expertise. For some topics lack of data may limit the potential for modelling. There are also uncertainties over the accessibility and adaptability of full guideline models. However, full guideline modelling offers the potential to strengthen and extend the analytical basis of NICE's CGs. Further work is needed to extend the analysis of our case study models to estimate population-level budget and health impacts. The practical usefulness of our models to guideline developers and users should also be investigated, as should the feasibility and usefulness of whole guideline modelling alongside development of a new CG. This project was funded by the Medical Research Council and the National Institute for Health Research through the Methodology Research Programme [grant number G0901504] and will be published in full in Health Technology Assessment; Vol. 17, No. 58. See the NIHR Journals Library website for further project information.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Rhett; Marshall, Tim; Chavez, Adrian
The exe-Guard Project is an alliance between Dominion Virginia Power (DVP), Sandia National Laboratories (SNL), Dartmouth University, and Schweitzer Engineering Laboratories (SEL). SEL is the primary recipient on this project. The exe-Guard project was selected for award under DE-FOA-0000359 with CFDA number 81.122 to address Topic Area of Interest 4: Hardened Platforms and Systems. The exe-Guard project developed an antivirus solution for control system embedded devices to prevent the execution of unauthorized code and maintain settings and configuration integrity. This project created a white list antivirus solution for control systems capable of running on embedded Linux® operating systems. White list antivirus methods allow only credible programs to run through the use of digital signatures and hash functions. Once a system's secure state is baselined, white list antivirus software denies deviations from that state caused by the installation of malicious code, because such code changes the hash results. Black list antivirus software has been effective in traditional IT environments but has negative implications for control systems. Black list antivirus uses pattern matching and behavioral analysis to identify system threats while relying on regular updates to the signature file and recurrent system scanning. Black list antivirus is vulnerable to zero-day exploits which have not yet been incorporated into a signature file update. System scans hamper the performance of high availability applications, as revealed in NIST special publication 1058, which summarizes the impact of blacklist antivirus on control systems: manual or "on-demand" scanning has a major effect on control processes in that it takes CPU time needed by the control process (sometimes close to 100% of CPU time). Minimizing the antivirus software throttle setting will reduce but not eliminate this effect. Signature updates can also take up to 100% of CPU time, but for a much shorter period than a typical manual scanning process. Control systems are vulnerable to performance losses if off-the-shelf blacklist antivirus solutions aren't implemented with care. This investment in configuration, in addition to constant decommissioning to perform manual signature file updates, is unprecedented and impractical. Additionally, control systems are often disconnected or islanded from the network, making the delivery of signature updates difficult. The exe-Guard project developed a white list antivirus solution that mitigated the above drawbacks and allows control systems to cost-effectively apply malware protection while maintaining high reliability. The application of security patches can also be minimized, since white listing maintains constant defense against unauthorized code execution. Security patches can instead be applied at less frequent intervals when system decommissioning can be scheduled and planned for. Since control systems are less dynamic than IT environments, maintaining a secure baselined state is more practical. Because upgrades are performed at infrequent, calculated intervals, a new security baseline can be established before the system is returned to service. Exe-Guard built on the efforts of SNL under the Code Seal project. SNL demonstrated prototype Trust Anchors on that project, which are independent monitoring and control devices that can be integrated into untrustworthy components. 
The exe-Guard team started with the lessons learned under this project and then designed a commercial solution for white list malware protection. Malware is a real threat, even on islanded or un-networked installations, since operators can unintentionally install infected files, plug in infected mass storage devices, or infect a piece of equipment on the islanded local area network that can then spread to other connected equipment. Protection at the device level is one of the last layers of defense in a security-in-depth defense model before an asset becomes compromised. This project provided a non-destructive intrusion, isolation, and automated response solution, achieving a goal of the Department of Energy (DOE) Roadmap to Secure Control Systems. It also addressed CIP-007-R4, which requires asset owners to employ malicious software prevention tools on assets within the electronic security perimeter. In addition, the CIP-007-R3 requirement for security patch management is minimized because white listing narrows the impact of vulnerabilities and patch releases. The exe-Guard Project completed all tasks identified in the statement of project objectives and identified additional tasks within scope that were performed and completed within the original budget. The cost share was met and all deliverables were successfully completed and submitted on time. Most importantly, the technology developed and commercialized under this project has been adopted by the Energy sector, and thousands of devices with exe-Guard technology integrated in them have now been deployed and are protecting our power systems today.
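exe-Guard enforces whitelisting inside the embedded device itself; the user-space sketch below only illustrates the underlying idea of a hash-based whitelist (build a baseline manifest of digital fingerprints, then refuse or flag anything whose hash is not in that manifest). The file paths, manifest format, and function names are made up for illustration.

```python
import hashlib, json, os, sys

def sha256(path):
    # streaming SHA-256 of a file
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(bin_dir, manifest="whitelist.json"):
    # hash every executable once to form the baseline manifest
    baseline = {p: sha256(os.path.join(bin_dir, p))
                for p in os.listdir(bin_dir)
                if os.path.isfile(os.path.join(bin_dir, p))}
    with open(manifest, "w") as f:
        json.dump(baseline, f, indent=2)

def is_authorized(path, manifest="whitelist.json"):
    # a program is credible only if its hash matches the baseline
    with open(manifest) as f:
        baseline = json.load(f)
    return baseline.get(os.path.basename(path)) == sha256(path)

if __name__ == "__main__" and len(sys.argv) > 1:
    print("authorized" if is_authorized(sys.argv[1]) else "blocked")
```

Because the check compares against a fixed baseline rather than a signature database, it needs no recurring updates or full-system scans, which is the property the abstract highlights for control systems.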
Emission Data For Climate-Chemistry Interactions
NASA Astrophysics Data System (ADS)
Smith, S. J.
2012-12-01
Data on anthropogenic and natural emissions of reactive species are a critical input for studies of atmospheric chemistry and climate. The availability and characteristics of anthropogenic emissions data that can be used for such studies are reviewed and pathways for future work discussed. Global and regional datasets for historical and future emissions are available, but their characteristics and applicability for specific studies differ. For the first time, a coordinated set of historical emissions (Lamarque et al. 2010) and future projections (van Vuuren et al. 2011) have been developed for use in the CMIP5 and ACCMIP long-term simulation comparison projects. These data have decadal resolution and were designed for long-term, global simulations. These data, however, lack the finer-scale spatial and temporal detail that might be needed for some studies. Robust and timely updates of emissions data are generally lacking, although recent updates will be presented. While historical emission data are often treated as known, emissions are uncertain, even though this uncertainty is rarely quantified. Uncertainty varies by species and location. Inverse modeling is starting to indicate where emission data may be uncertain, which opens the way to improving these data overall. Further interaction between the chemistry modeling and inventory development communities is needed. Future projections are intrinsically uncertain, and while institutions and processes are in place to develop and review long-term, century-scale scenarios, a need has remained for a wider range of shorter-term (e.g., several-decade) projections. Emissions and scenario development communities have been working to fill this need. Communication across disciplines of the assumptions embedded in emissions projections remains a challenge. Atmospheric chemistry models are a central tool needed for studying chemistry-climate interactions. Simpler models, however, are also needed in order to examine interactions between different physical systems and also between the physical and human systems. Statistical models of system responses are particularly needed both to parameterize interactions in models that cannot simulate particular processes directly, and also to represent uncertainty. Coordinated model experiments are necessary to provide the information needed to develop these representations (e.g., Wild et al. 2012). Lamarque, J.-F., et al. (2010) Historical (1850-2000) gridded anthropogenic and biomass burning emissions of reactive gases and aerosols: methodology and application. Atmospheric Chemistry and Physics 10, 7017-7039. doi:10.5194/acp-10-7017-2010. Van Vuuren, D., JA Edmonds, M Kainuma, K Riahi, AM Thomson, KA Hibbard, G Hurtt, T Kram, V Krey, JF Lamarque, T Masui, M Meinshausen, N Nakicenovic, SJ Smith, and SK Rose. 2011. "The Representative Concentration Pathways: An Overview." Climatic Change 109 (1-2), 5-31. doi:10.1007/s10584-011-0148-z. Wild, O., et al. (2012) Modelling future changes in surface ozone: A parameterized approach. Atmos. Chem. Phys., 12, 2037-2054, doi:10.5194/acp-12-2037-2012.
Working with Consortia - Advanced Packaging Reliability
NASA Technical Reports Server (NTRS)
Blanche, Jim; Strickland, Mark
2010-01-01
Description: Support the responsible NASA official for lead-free solder evaluation. Serve as the NASA technical liaison to the NASA/DoD Pb-free Project. Assure NASA areas of interest are included in JG-PP follow-on work. Support NASA/DoD telcons and face-to-face meetings. Update MSFC lead-free solder lessons learned report. FY10 plans: - Reliability data on lead-free solder applications for various part lead finishes and board finishes. - Update lead-free solder risks and risk mitigation strategies for NASA. - Evaluate lead-free alloy/lead-free finish reliability in design application. - Status CAVE project on Pb-free solder aging effects. - Compile the LTESE flight and bench data.
Particle tracing modeling of ion fluxes at geosynchronous orbit during substorms
NASA Astrophysics Data System (ADS)
Brito, T. V.; Jordanova, V.; Woodroffe, J. R.; Henderson, M. G.; Morley, S.; Birn, J.
2016-12-01
The SHIELDS project aims to couple a host of different models for different regions of the magnetosphere using different numerical methods such as MHD, PIC and particle tracing, with the ultimate goal of having a more realistic model of the whole magnetospheric environment capturing, as much as possible, the different physics of the various plasma populations. In that context, we present a modeling framework that can be coupled with a global MHD model to calculate particle fluxes in the inner magnetosphere, which can in turn be used to constantly update the input for a ring current model. In that regard, one advantage of that approach over using spacecraft data is that it produces a much better spatial and temporal coverage of the nightside geosynchronous region and thus a possibly more complete input for the ring current model, which will likely produce more accurate global results for the ring current population. In this presentation, we will describe the particle tracing method in more detail, describe the method used to couple it to the BATS-R-US 3D global MHD code, and the method used to update the flux results to the RAM-SCB ring current model. We will also present the simulation results for the July 18, 2013 period, which showed significant substorm activity. We will compare simulated ion fluxes on the nightside magnetosphere with spacecraft observations to gauge how well our simulations are capturing substorm dynamics.
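The abstract does not specify the numerical integrator used for the particle tracing; a standard choice for pushing charged particles through electric and magnetic fields interpolated from an MHD solution is the Boris algorithm. The sketch below is a generic Boris pusher with uniform fields and illustrative values, not the SHIELDS implementation.

```python
import numpy as np

def boris_push(x, v, E, B, q_over_m, dt):
    """One step of the standard Boris particle pusher (generic sketch).
    E and B are the fields evaluated at the particle position."""
    v_minus = v + 0.5 * q_over_m * E * dt            # first half electric kick
    t = 0.5 * q_over_m * B * dt                      # magnetic rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)          # full magnetic rotation
    v_new = v_plus + 0.5 * q_over_m * E * dt         # second half electric kick
    return x + v_new * dt, v_new

# proton gyrating in a uniform 100 nT field (SI units, illustrative only)
x, v = np.zeros(3), np.array([1.0e5, 0.0, 0.0])
E, B = np.zeros(3), np.array([0.0, 0.0, 100e-9])
for _ in range(1000):
    x, v = boris_push(x, v, E, B, q_over_m=9.58e7, dt=1e-3)
```

In a coupled framework of the kind described, E and B would instead be interpolated from the MHD solution at each step, and the traced ions would be binned into fluxes at geosynchronous orbit to feed the ring current model.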
Landfill Gas Energy Cost Model Version 3.0 (LFGcost-Web V3 ...
To help stakeholders estimate the costs of a landfill gas (LFG) energy project, in 2002, LMOP developed a cost tool (LFGcost). Since then, LMOP has routinely updated the tool to reflect changes in the LFG energy industry. Initially the model was designed for EPA to assist landfills in evaluating the economic and financial feasibility of LFG energy project development. In 2014, LMOP developed a public version of the model, LFGcost-Web (Version 3.0), to allow landfill and industry stakeholders to evaluate project feasibility on their own. LFGcost-Web can analyze costs for 12 energy recovery project types. These project costs can be estimated with or without the costs of a gas collection and control system (GCCS). The EPA used select equations from LFGcost-Web to estimate costs of the regulatory options in the 2015 proposed revisions to the MSW Landfills Standards of Performance (also known as New Source Performance Standards) and the Emission Guidelines (hereinafter referred to collectively as the Landfill Rules). More specifically, equations derived from LFGcost-Web were applied to each landfill expected to be impacted by the Landfill Rules to estimate annualized installed capital costs and annual O&M costs of a gas collection and control system. In addition, after applying the LFGcost-Web equations to the list of landfills expected to require a GCCS in year 2025 as a result of the proposed Landfill Rules, the regulatory analysis evaluated whether electr
Model Update of a Micro Air Vehicle (MAV) Flexible Wing Frame with Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Reaves, Mercedes C.; Horta, Lucas G.; Waszak, Martin R.; Morgan, Benjamin G.
2004-01-01
This paper describes a procedure to update parameters in the finite element model of a Micro Air Vehicle (MAV) to improve displacement predictions under aerodynamic loads. Because of fabrication, materials, and geometric uncertainties, a statistical approach combined with Multidisciplinary Design Optimization (MDO) is used to modify key model parameters. Static test data collected using photogrammetry are used to correlate with model predictions. Results show significant improvements in model predictions after parameters are updated; however, computed probability values indicate low confidence in updated values and/or model structure errors. Lessons learned in the areas of wing design, test procedures, modeling approaches with geometric nonlinearities, and uncertainty quantification are all documented.
Development of Life Support System Technologies for Human Lunar Missions
NASA Technical Reports Server (NTRS)
Barta, Daniel J.; Ewert, Michael K.
2009-01-01
With the Preliminary Design Review (PDR) for the Orion Crew Exploration Vehicle planned to be completed in 2009, Exploration Life Support (ELS), a technology development project under the National Aeronautics and Space Administration's (NASA) Exploration Technology Development Program, is focusing its efforts on needs for human lunar missions. The ELS Project's goal is to develop and mature a suite of Environmental Control and Life Support System (ECLSS) technologies for potential use on human spacecraft under development in support of U.S. Space Exploration Policy. ELS technology development is directed at three major vehicle projects within NASA's Constellation Program (CxP): the Orion Crew Exploration Vehicle (CEV), the Altair Lunar Lander and Lunar Surface Systems, including habitats and pressurized rovers. The ELS Project includes four technical elements: Atmosphere Revitalization Systems, Water Recovery Systems, Waste Management Systems and Habitation Engineering, and two cross cutting elements, Systems Integration, Modeling and Analysis, and Validation and Testing. This paper will provide an overview of the ELS Project, connectivity with its customers and an update to content within its technology development portfolio with focus on human lunar missions.
Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.
2004-03-01
The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When there is minimal site-specific data the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four projections, and associated kriging variances, were averaged using the posterior model probabilities as weights. Finally, cross-validation was conducted by eliminating from consideration all data from one borehole at a time, repeating the above process, and comparing the predictive capability of the model-averaged result with that of each individual model. Using two quantitative measures of comparison, the model-averaged result was superior to any individual geostatistical model of log permeability considered.
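A minimal numerical sketch of the model-averaging step described above follows: prior model probabilities are updated using a calibration-based information criterion, and the retained models' predictions and variances are combined with the posterior weights. All numbers are illustrative, and the criterion difference here simply stands in for whatever measure (e.g., a likelihood-based criterion) the study actually used.

```python
import numpy as np

priors = np.array([0.25, 0.25, 0.25, 0.25])    # prior model probabilities
ic = np.array([210.3, 214.1, 209.8, 220.5])    # calibration criterion per model (lower = better)
w = priors * np.exp(-0.5 * (ic - ic.min()))
posterior = w / w.sum()                        # updated (posterior) model probabilities

pred = np.array([2.1, 2.4, 1.9, 2.8])          # each model's kriged log-permeability prediction
var = np.array([0.30, 0.25, 0.40, 0.35])       # each model's kriging variance
mean_avg = posterior @ pred                    # model-averaged prediction
# total variance = within-model variance + between-model spread
var_avg = posterior @ var + posterior @ (pred - mean_avg) ** 2
print(posterior, mean_avg, var_avg)
```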
Kim, Seung-Nam; Park, Taewon; Lee, Sang-Hyun
2014-01-01
Damage of a 5-story framed structure was identified from two types of measured data, which are frequency response functions (FRF) and natural frequencies, using a finite element (FE) model updating procedure. In this study, a procedure to determine the appropriate weightings for different groups of observations was proposed. In addition, a modified frame element which included rotational springs was used to construct the FE model for updating to represent concentrated damage at the member ends (a formulation for plastic hinges in framed structures subjected to strong earthquakes). The results of the model updating and subsequent damage detection when the rotational springs (RS model) were used were compared with those obtained using the conventional frame elements (FS model). Comparisons indicated that the RS model gave more accurate results than the FS model. That is, the errors in the natural frequencies of the updated models were smaller, and the identified damage showed clearer distinctions between damaged and undamaged members and was more consistent with observed damage. PMID:24574888
Projecting other public inventories for the 2005 RPA timber assessment update.
Xiaoping Zhou; John R. Mills; Richard W. Haynes
2007-01-01
This study gives an overview of the current inventory status and the projection of future forest inventories on other public timberland. Other public lands are lands administered by state, local, and federal government but excluding National Forest System lands. These projections were used as part of the 2005 USDA Forest Service Resource Planning Act timber assessment...
ERIC Educational Resources Information Center
Bachman, Jerald G.; Johnston, Lloyd D.; O'Malley, Patrick M.; Schulenberg, John E.
2006-01-01
This occasional paper updates and extends earlier papers in the Monitoring the Future project. It provides a detailed description of the project's design, including sampling design, data collection procedures, measurement content, and questionnaire format. It attempts to include sufficient information for others who wish to evaluate the results,…
Project WET Curriculum and Activity Guide 2.0
ERIC Educational Resources Information Center
Project WET Foundation, 2011
2011-01-01
The "Project WET Curriculum and Activity Guide 2.0" continues Project WET's dedication to 21st-century, cutting-edge water education. Now in full color, Guide 2.0 offers new activities on topics such as National Parks and storm water, fully revised and updated activities from the original Guide and the very best activities gathered from all of…
Projective Test Use among School Psychologists: A Survey and Critique
ERIC Educational Resources Information Center
Hojnoski, Robin L.; Morrison, Rhonda; Brown, Melissa; Matthews, William J.
2006-01-01
The use of projective techniques by school psychologists has been a point of interest and debate, with a number of survey studies documenting usage. The purpose of this study is to update the status of projective use among school psychologists, with a specific focus on their use in the social emotional assessment of children in schools. In…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-29
... measures based on updated scallop biomass projections. The proposed FY 2013 DAS allocations would be set at a precautionary level (i.e., 75 percent of what current biomass levels project would be the DAS... of what current biomass projections indicate could be allocated to each LA scallop vessel for the...
Validating induced seismicity forecast models—Induced Seismicity Test Bench
NASA Astrophysics Data System (ADS)
Király-Proag, Eszter; Zechar, J. Douglas; Gischig, Valentin; Wiemer, Stefan; Karvounis, Dimitrios; Doetsch, Joseph
2016-08-01
Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. In this study, we propose an Induced Seismicity Test Bench to test and rank such models; this test bench can be used for model development, model selection, and ensemble model building. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models: Shapiro and Smoothed Seismicity (SaSS) and Hydraulics and Seismics (HySei). These models incorporate a different mix of physics-based elements and stochastic representation of the induced sequences. Our results show that neither model is fully superior to the other. Generally, HySei forecasts the seismicity rate better after shut-in but is only mediocre at forecasting the spatial distribution. On the other hand, SaSS forecasts the spatial distribution better and gives better seismicity rate estimates before shut-in. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in.
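The abstract does not spell out the scoring rule used by the test bench; one common way to rank probabilistic seismicity forecasts is the Poisson log-likelihood of observed event counts per space-time bin, sketched below with made-up rates and counts purely for illustration.

```python
import numpy as np
from scipy.stats import poisson

# Made-up forecast rates (expected events per space-time bin) from two
# models and the observed counts; the higher joint log-likelihood ranks better.
rate_model_a = np.array([0.5, 2.0, 4.0, 1.0])
rate_model_b = np.array([0.8, 1.5, 5.0, 0.5])
observed     = np.array([1,   2,   3,   0  ])

ll_a = poisson.logpmf(observed, rate_model_a).sum()
ll_b = poisson.logpmf(observed, rate_model_b).sum()
print("preferred model:", "A" if ll_a > ll_b else "B", ll_a, ll_b)
```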
ERIC Educational Resources Information Center
Costrell, Robert M.
2009-01-01
In February 2008, the School Choice Demonstration Project (SCDP) issued its first report on the fiscal impact of the Milwaukee Parental Choice Program (MPCP) on taxpayers in Milwaukee and the state of Wisconsin. There are two reasons to update the 2008 report. First, the figures will naturally change with the continuing growth of the voucher…
ERIC Educational Resources Information Center
Gaines; Gale F.
2007-01-01
Teacher pay continues to be a hot issue for states, particularly since it is likely the largest expenditure in education budgets. This paper summarizes the latest on average salaries in the Southern Regional Education Board (SREB) states, including an update on recent incentive pay programs, pilot projects, and other legislative actions that…
ERIC Educational Resources Information Center
Lesch, Gerald E.
In 1972, the welding department personnel at Blackhawk Technical Institute in Wisconsin undertook the project of updating the curriculum of their one-year welding degree program. A study was conducted of local welding industries to determine hiring policies, the tools and equipment a beginning welder should purchase, the types of welding processes…
ERIC Educational Resources Information Center
British Columbia Council on Admissions and Transfer, 2010
2010-01-01
In 2008, a number of changes were identified that expanded the scope of the updating required for Block Transfer for tourism management as follows: a new core curriculum for diploma programs; the need for expanded information on diploma to diploma transfer; and, a growing need for an expanded system of transfer identified in Campus 2020…
ERIC Educational Resources Information Center
Montana State Dept. of Health and Environmental Sciences, Helena. Health Education Bureau.
This volume consists of updated information to be inserted into a Montana AIDS Project manual on providing services to persons with acquired immune deficiency syndrome/human immunodeficiency virus (AIDS/HIV), originally published in December 1985. The updates are mainly statistics and terminology, along with the addition of several new sections.…
ERIC Educational Resources Information Center
Harris, Robert; Phillips, Alan
A project sought to develop a means of updating and retraining those required to comply with Britain's 1985 Building Regulations, which are substantially different from the previous ones in regard to procedures and technical content. The training needs analysis conducted indicated that the new training should be flexible and use practical and…
NASA Technical Reports Server (NTRS)
Harrington, James L., Jr.
2000-01-01
The Minority University Space Interdisciplinary (MUSPIN) Network project is a comprehensive outreach and education initiative that focuses on the transfer of advanced computer networking technologies and relevant science to Historically Black Colleges and Universities (HBCUs) and Other Minority Universities (OMUs) to support multi-disciplinary education research.
Forecasting in an integrated surface water-ground water system: The Big Cypress Basin, South Florida
NASA Astrophysics Data System (ADS)
Butts, M. B.; Feng, K.; Klinting, A.; Stewart, K.; Nath, A.; Manning, P.; Hazlett, T.; Jacobsen, T.
2009-04-01
The South Florida Water Management District (SFWMD) manages and protects the state's water resources on behalf of 7.5 million South Floridians and is the lead agency in restoring America's Everglades - the largest environmental restoration project in US history. Many of the projects to restore and protect the Everglades ecosystem are part of the Comprehensive Everglades Restoration Plan (CERP). The region has a unique hydrological regime, with close connection between surface water and groundwater, and a complex managed drainage network with many structures. Added to the physical complexity are the conflicting needs of the ecosystem for protection and restoration, versus the substantial urban development with the accompanying water supply, water quality and flood control issues. In this paper a novel forecasting and real-time modelling system is presented for the Big Cypress Basin. The Big Cypress Basin includes 272 km of primary canals and 46 water control structures throughout the area that provide limited levels of flood protection, as well as water supply and environmental quality management. This system is linked to the South Florida Water Management District's extensive real-time (SCADA) data monitoring and collection system. Novel aspects of this system include the use of a fully distributed and integrated modeling approach and a new filter-based updating approach for accurately forecasting river levels. Because of the interaction between surface water and groundwater, a fully integrated forecast modeling approach is required. Indeed, results for Tropical Storm Fay in 2008 show that groundwater levels respond extremely rapidly to heavy rainfall. Analysis of this storm also shows that updating levels in the river system can have a direct impact on groundwater levels.
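The filter-based level updating mentioned above can be pictured with a scalar, Kalman-style correction that blends a forecast canal stage with a telemetered gauge reading; the sketch below uses made-up numbers and a single state, not the operational Big Cypress Basin implementation.

```python
# Minimal sketch of a scalar, Kalman-style update of a forecast water level
# using one real-time gauge observation. Numbers are hypothetical.
def kalman_update(level_forecast, var_forecast, observation, var_obs):
    """Blend forecast and observation; return updated level and error variance."""
    gain = var_forecast / (var_forecast + var_obs)
    level_updated = level_forecast + gain * (observation - level_forecast)
    var_updated = (1.0 - gain) * var_forecast
    return level_updated, var_updated

level, var = 2.10, 0.04        # forecast canal stage [m] and its error variance
obs, var_obs = 2.32, 0.01      # telemetered gauge reading and its error variance

level, var = kalman_update(level, var, obs, var_obs)
print(f"updated level = {level:.2f} m, error variance = {var:.3f}")
```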
NASA Astrophysics Data System (ADS)
Wang, Xing; Hill, Thomas L.; Neild, Simon A.; Shaw, Alexander D.; Haddad Khodaparast, Hamed; Friswell, Michael I.
2018-02-01
This paper proposes a model updating strategy for localised nonlinear structures. It utilises an initial finite-element (FE) model of the structure and primary harmonic response data taken from low and high amplitude excitations. The underlying linear part of the FE model is first updated using low-amplitude test data with established techniques. Then, using this linear FE model, the nonlinear elements are localised, characterised, and quantified with primary harmonic response data measured under stepped-sine or swept-sine excitations. Finally, the resulting model is validated by comparing the analytical predictions with both the measured responses used in the updating and with additional test data. The proposed strategy is applied to a clamped beam with a nonlinear mechanism and good agreements between the analytical predictions and measured responses are achieved. Discussions on issues of damping estimation and dealing with data from amplitude-varying force input in the updating process are also provided.
CONSULTATION ON UPDATED METHODOLOGY FOR ...
The National Academy of Sciences (NAS) expects to publish the Biological Effects of Ionizing Radiation (BEIR) committee's report (BEIR VII) on risks from ionizing radiation exposures in calendar year 2005. The committee is expected to have analyzed the most recent epidemiology from the important exposed cohorts and to have factored in any changes resulting from the updated analysis of dosimetry for the Japanese atomic bomb survivors. To the extent practical, the Committee will also consider any relevant radiobiological data, including those from the Department of Energy's low dose effects research program. Based on their evaluation of relevant information, the Committee is then expected to propose a set of models for estimating risks from low-dose ionizing radiation. ORIA will review the BEIR VII report and consider revisions to the Agency's methodology for estimating cancer risks from exposure to ionizing radiation in light of this report and other relevant information. This will be the subject of the Consultation. This project supports a major risk management initiative to improve the basis on which radiation risk decisions are made. This project, funded by several Federal Agencies, reflects an attempt to characterize risks where there are substantial uncertainties. The outcome will improve our ability to assess risks well into the future and will strengthen EPA's overall capability for assessing and managing radiation risks. The BEIR VII report is funded…
Utilizing Flight Data to Update Aeroelastic Stability Estimates
NASA Technical Reports Server (NTRS)
Lind, Rick; Brenner, Marty
1997-01-01
Stability analysis of high performance aircraft must account for errors in the system model. A method for computing flutter margins that incorporates flight data has been developed using robust stability theory. This paper considers applying this method to update flutter margins during a post-flight or on-line analysis. Areas of modeling uncertainty that arise when using flight data with this method are investigated. The amount of conservatism in the resulting flutter margins depends on the flight data sets used to update the model. Post-flight updates of flutter margins for an F/A-18 are presented along with a simulation of on-line updates during a flight test.
Application of GIS Technology for Town Planning Tasks Solving
NASA Astrophysics Data System (ADS)
Kiyashko, G. A.
2017-11-01
For developing territories, one of the most pressing town-planning tasks is to identify suitable sites for building projects. A geographic information system (GIS) allows one to model complex spatial processes and can provide the tools needed to solve these tasks effectively. We propose several GIS analysis models that can define suitable settlement allocations and select appropriate parcels for construction objects. We implement our models in the ArcGIS Desktop package and verify them by application to existing objects in Primorsky Region (Primorye Territory). These suitability models use several combinations of analysis methods and include various ways to resolve the suitability task using vector data and a raster data set. The suitability models created in this study can be combined, and one model can be integrated into another as a component. Our models can also be extended with other suitability models for further detailed planning.
Cost estimation model for advanced planetary programs, fourth edition
NASA Technical Reports Server (NTRS)
Spadoni, D. J.
1983-01-01
The development of the planetary program cost model is discussed. The model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to more accurately capture the information in the historical cost data base. This data base comprises the historical cost data for 13 unmanned lunar and planetary flight programs. The revision was made with a twofold objective: to increase the flexibility of the model in its ability to deal with the broad scope of scenarios under consideration for future missions, and to maintain and possibly improve upon the confidence in the model's capabilities, with an expected accuracy of 20%. The model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the model is discussed and two sample applications of the cost model are presented.
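To make the idea of an estimating relationship concrete, the sketch below fits a power-law cost estimating relationship by log-linear least squares; the mass and cost numbers are entirely hypothetical, and the actual model's functional forms and 13-program data base are not reproduced here.

```python
# Minimal sketch of fitting a power-law cost estimating relationship (CER),
# cost = a * mass**b, by least squares in log space. Data are hypothetical.
import numpy as np

mass = np.array([250.0, 400.0, 650.0, 900.0, 1300.0])   # spacecraft dry mass [kg]
cost = np.array([90.0, 130.0, 210.0, 260.0, 380.0])     # program cost [$M]

b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)    # slope, intercept in log space
a = np.exp(log_a)
print(f"CER: cost ~ {a:.2f} * mass^{b:.2f}")
print(f"predicted cost for an 800 kg spacecraft: {a * 800.0**b:.0f} $M")
```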
Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.
2015-05-01
The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.
GCIP water and energy budget synthesis (WEBS)
Roads, J.; Lawford, R.; Bainto, E.; Berbery, E.; Chen, S.; Fekete, B.; Gallo, K.; Grundstein, A.; Higgins, W.; Kanamitsu, M.; Krajewski, W.; Lakshmi, V.; Leathers, D.; Lettenmaier, D.; Luo, L.; Maurer, E.; Meyers, T.; Miller, D.; Mitchell, Ken; Mote, T.; Pinker, R.; Reichler, T.; Robinson, D.; Robock, A.; Smith, J.; Srinivasan, G.; Verdin, K.; Vinnikov, K.; Vonder Haar, T.; Vorosmarty, C.; Williams, S.; Yarosh, E.
2003-01-01
As part of the World Climate Research Program's (WCRP's) Global Energy and Water-Cycle Experiment (GEWEX) Continental-scale International Project (GCIP), a preliminary water and energy budget synthesis (WEBS) was developed for the period 1996-1999 from the "best available" observations and models. Besides this summary paper, a companion CD-ROM with more extensive discussion, figures, tables, and raw data is available to the interested researcher from the GEWEX project office, the GAPP project office, or the first author. An updated online version of the CD-ROM is also available at http://ecpc.ucsd.edu/gcip/webs.htm/. Observations cannot adequately characterize or "close" budgets since too many fundamental processes are missing. Models that properly represent the many complicated atmospheric and near-surface interactions are also required. This preliminary synthesis therefore included a representative global general circulation model, regional climate model, and a macroscale hydrologic model as well as a global reanalysis and a regional analysis. By the qualitative agreement among the models and available observations, it did appear that we now qualitatively understand water and energy budgets of the Mississippi River Basin. However, there is still much quantitative uncertainty. In that regard, there did appear to be a clear advantage to using a regional analysis over a global analysis or a regional simulation over a global simulation to describe the Mississippi River Basin water and energy budgets. There also appeared to be some advantage to using a macroscale hydrologic model for at least the surface water budgets. Copyright 2003 by the American Geophysical Union.
NASA Astrophysics Data System (ADS)
Sanderson, B. M.
2017-12-01
The CMIP ensembles represent the most comprehensive source of information available to decision-makers for climate adaptation, yet it is clear that there are fundamental limitations in our ability to treat the ensemble as an unbiased sample of possible future climate trajectories. There is considerable evidence that models are not independent, and increasing complexity and resolution combined with computational constraints prevent a thorough exploration of parametric uncertainty or internal variability. Although more data than ever is available for calibration, the optimization of each model is influenced by institutional priorities, historical precedent and available resources. The resulting ensemble thus represents a miscellany of climate simulators which defy traditional statistical interpretation. Models are in some cases interdependent, but are sufficiently complex that the degree of interdependency is conditional on the application. Configurations have been updated using available observations to some degree, but not in a consistent or easily identifiable fashion. This means that the ensemble cannot be viewed as a true posterior distribution updated by available data, but nor can observational data alone be used to assess individual model likelihood. We assess recent literature for combining projections from an imperfect ensemble of climate simulators. Beginning with our published methodology for addressing model interdependency and skill in the weighting scheme for the 4th US National Climate Assessment, we consider strategies for incorporating process-based constraints on future response, perturbed parameter experiments and multi-model output into an integrated framework. We focus on a number of guiding questions: Is the traditional framework of confidence in projections inferred from model agreement leading to biased or misleading conclusions? Can the benefits of upweighting skillful models be reconciled with the increased risk of truth lying outside the weighted ensemble distribution? If CMIP is an ensemble of partially informed best-guesses, can we infer anything about the parent distribution of all possible models of the climate system (and if not, are we implicitly under-representing the risk of a climate catastrophe outside of the envelope of CMIP simulations)?
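A skill-and-independence weighting of the kind referred to above can be sketched as follows; the distances to observations, the intermodel distance matrix, and the two radius parameters are invented for illustration and do not correspond to the National Climate Assessment implementation.

```python
# Minimal sketch of a skill-and-independence model weighting scheme: each
# model receives a skill weight from its distance to observations and an
# independence weight from its distances to the other models. All numbers
# are hypothetical.
import numpy as np

d_obs = np.array([0.4, 0.9, 0.5, 1.6])          # model-observation distances
d_mod = np.array([                               # intermodel distance matrix
    [0.0, 1.1, 0.2, 1.5],
    [1.1, 0.0, 1.0, 1.4],
    [0.2, 1.0, 0.0, 1.3],
    [1.5, 1.4, 1.3, 0.0],
])
D_q, D_u = 0.8, 0.5                              # skill and independence radii (assumed)

w_skill = np.exp(-(d_obs / D_q) ** 2)
similarity = np.exp(-(d_mod / D_u) ** 2)
w_indep = 1.0 / (1.0 + similarity.sum(axis=1) - 1.0)   # exclude self-similarity (= 1)
weights = w_skill * w_indep
weights /= weights.sum()
print("normalized model weights:", weights.round(3))
# Models 0 and 2 are near-duplicates here, so each is downweighted.
```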
Application of Artificial Intelligence for Bridge Deterioration Model.
Chen, Zhang; Wu, Yangyang; Li, Li; Sun, Lijun
2015-01-01
The deterministic bridge deterioration model updating problem is well established in bridge management, while the traditional methods and approaches for this problem require manual intervention. An artificial-intelligence-based approach is presented in this paper to self-update the parameters of the bridge deterioration model. When new information and data are collected, a posterior distribution is constructed according to Bayes' theorem to describe the integrated result of the historical information and the newly gained information, and this posterior is used to update the model parameters. This AI-based approach is applied to the case of updating the parameters of a bridge deterioration model using data collected from bridges in 12 districts of Shanghai from 2004 to 2013, and the results showed that it is an accurate, effective, and satisfactory approach for dealing with the parameter-updating problem without manual intervention.
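The Bayesian parameter update at the core of this approach can be illustrated with a conjugate normal prior and likelihood; the deterioration-rate numbers below are hypothetical, and the sketch is not the authors' implementation.

```python
# Minimal sketch of Bayesian updating of a single deterioration-rate parameter
# using a conjugate normal prior and a normal likelihood with known variance.
# All numbers are hypothetical stand-ins for inspection data.
import numpy as np

prior_mean, prior_var = 1.2, 0.09          # prior belief about yearly condition loss
obs = np.array([1.05, 1.30, 1.18, 1.22])   # newly observed yearly losses
obs_var = 0.04                             # assumed measurement variance

n = len(obs)
post_var = 1.0 / (1.0 / prior_var + n / obs_var)
post_mean = post_var * (prior_mean / prior_var + obs.sum() / obs_var)
print(f"posterior mean = {post_mean:.3f}, posterior sd = {post_var**0.5:.3f}")
```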
Zhang, Kaihua; Zhang, Lei; Yang, Ming-Hsuan
2014-10-01
It is a challenging task to develop effective and efficient appearance models for robust object tracking due to factors such as pose variation, illumination change, occlusion, and motion blur. Existing online tracking algorithms often update models with samples from observations in recent frames. Although much success has been demonstrated, numerous issues remain to be addressed. First, while these adaptive appearance models are data-dependent, there is not a sufficient amount of data for online algorithms to learn from at the outset. Second, online tracking algorithms often encounter drift problems. As a result of self-taught learning, misaligned samples are likely to be added and degrade the appearance models. In this paper, we propose a simple yet effective and efficient tracking algorithm with an appearance model based on features extracted from a multiscale image feature space with data-independent basis. The proposed appearance model employs non-adaptive random projections that preserve the structure of the image feature space of objects. A very sparse measurement matrix is constructed to efficiently extract the features for the appearance model. We compress sample images of the foreground target and the background using the same sparse measurement matrix. The tracking task is formulated as a binary classification via a naive Bayes classifier with online update in the compressed domain. A coarse-to-fine search strategy is adopted to further reduce the computational complexity in the detection procedure. The proposed compressive tracking algorithm runs in real-time and performs favorably against state-of-the-art methods on challenging sequences in terms of efficiency, accuracy and robustness.
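The compressive-feature idea can be sketched as follows: a very sparse random matrix projects a high-dimensional image feature vector to a low dimension, and a two-class Gaussian naive Bayes rule scores the result. The dimensions, sparsity level, and class models below are illustrative assumptions, not the authors' tracker.

```python
# Minimal sketch of compressive features: project a high-dimensional feature
# vector with a very sparse random matrix, then score it with a two-class
# Gaussian naive Bayes rule. Dimensions and class models are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d_high, d_low = 10_000, 50

# Very sparse measurement matrix with entries in {+1, 0, -1}; most are zero.
s = 3.0
R = rng.choice([1.0, 0.0, -1.0], size=(d_low, d_high),
               p=[1 / (2 * s), 1 - 1 / s, 1 / (2 * s)]) * np.sqrt(s)

def compress(x):
    return R @ x                                    # low-dimensional features

def naive_bayes_log_ratio(v, mu_pos, sig_pos, mu_neg, sig_neg):
    """Sum over features of the log ratio of class-conditional Gaussians."""
    def logpdf(v, mu, sig):
        return -0.5 * ((v - mu) / sig) ** 2 - np.log(sig)
    return np.sum(logpdf(v, mu_pos, sig_pos) - logpdf(v, mu_neg, sig_neg))

mu_pos, sig_pos = np.full(d_low, 0.5), np.ones(d_low)   # hypothetical target model
mu_neg, sig_neg = np.zeros(d_low), np.ones(d_low)       # hypothetical background model

patch = rng.random(d_high)                          # stand-in for a sampled image patch
v = compress(patch)
score = naive_bayes_log_ratio(v, mu_pos, sig_pos, mu_neg, sig_neg)
print(f"compressed to {v.size} features, classifier score = {score:.1f}")
```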
Overview and Evaluation of the Community Multiscale Air ...
The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Protection Agency develops the CMAQ model and periodically releases new versions of the model that include bug fixes and various other improvements to the modeling system. In late 2016 or early 2017, CMAQ version 5.2 will be released. This new version of CMAQ will contain important updates from the current CMAQv5.1 modeling system, along with several instrumented versions of the model (e.g. decoupled direct method and sulfur tracking). Some specific model updates include the implementation of a new wind-blown dust treatment in CMAQv5.2, a significant improvement over the treatment in v5.1, which can severely overestimate wind-blown dust under certain conditions. Several other major updates to the modeling system include an update to the calculation of aerosols; implementation of full halogen chemistry (CMAQv5.1 contains a partial implementation of halogen chemistry); the new carbon bond 6 (CB6) chemical mechanism; updates to the cloud model in CMAQ; and a new lightning assimilation scheme for the WRF model, which significantly improves the placement and timing of convective precipitation in the WRF precipitation fields. Numerous other updates to the modeling system will also be available in v5.2.
Nasserie, Tahmina; Tuite, Ashleigh R; Whitmore, Lindsay; Hatchette, Todd; Drews, Steven J; Peci, Adriana; Kwong, Jeffrey C; Friedman, Dara; Garber, Gary; Gubbay, Jonathan
2017-01-01
Background: Seasonal influenza epidemics occur frequently. Rapid characterization of seasonal dynamics and forecasting of epidemic peaks and final sizes could help support real-time decision-making related to vaccination and other control measures. Real-time forecasting remains challenging. Methods: We used the previously described “incidence decay with exponential adjustment” (IDEA) model, a 2-parameter phenomenological model, to evaluate the characteristics of the 2015–2016 influenza season in 4 Canadian jurisdictions: the Provinces of Alberta, Nova Scotia and Ontario, and the City of Ottawa. Model fits were updated weekly with receipt of incident virologically confirmed case counts. Best-fit models were used to project seasonal influenza peaks and epidemic final sizes. Results: The 2015–2016 influenza season was mild and late-peaking. Parameter estimates generated through fitting were consistent in the 2 largest jurisdictions (Ontario and Alberta) and with pooled data including Nova Scotia counts (R0 approximately 1.4 for all fits). Lower R0 estimates were generated in Nova Scotia and Ottawa. Final size projections that made use of complete time series were accurate to within 6% of true final sizes, but final size projections based on pre-peak data were less accurate. Projections of epidemic peaks stabilized before the true epidemic peak, but these were persistently early (~2 weeks) relative to the true peak. Conclusions: A simple, 2-parameter influenza model provided reasonably accurate real-time projections of influenza seasonal dynamics in an atypically late, mild influenza season. Challenges are similar to those seen with more complex forecasting methodologies. Future work includes identification of seasonal characteristics associated with variability in model performance. PMID:29497629
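The IDEA model referred to above is commonly written as I(t) = (R0 / (1 + d)^t)^t, with t the epidemic generation; the sketch below fits the two parameters to hypothetical case counts by least squares and uses neither the study's surveillance data nor its code.

```python
# Minimal sketch of fitting the two-parameter IDEA model,
# I(t) = (R0 / (1 + d)**t)**t, to case counts by least squares.
# The counts are hypothetical; t indexes epidemic generations.
import numpy as np
from scipy.optimize import curve_fit

def idea(t, r0, d):
    return (r0 / (1.0 + d) ** t) ** t

t = np.arange(1, 13, dtype=float)
cases = np.array([2, 3, 5, 8, 12, 17, 22, 26, 27, 25, 20, 14], dtype=float)

(r0_hat, d_hat), _ = curve_fit(idea, t, cases, p0=(1.5, 0.01), bounds=(0.0, [5.0, 1.0]))
print(f"R0 estimate: {r0_hat:.2f}, discount d: {d_hat:.3f}")
print(f"projected generation-13 count: {idea(13.0, r0_hat, d_hat):.0f}")
```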
Impact of edge lines on safety of rural two-lane highways.
DOT National Transportation Integrated Search
2005-10-01
This report documents the results of the project Impact of Edge Lines on Safety of Rural Two-Lane Highways. The research project was initiated in an effort to comply with the updated version of the Manual on Uniform Traffic Control Devices…
Baltimore applications project
NASA Technical Reports Server (NTRS)
Golden, T. S.; Yaffee, P.
1979-01-01
An update is presented for the following projects: (1) asphalt pavement recycling; (2) data collection platform/water quality monitoring; (3) digital emergency traffic routing; (4) fire department communications and dispatch system; (5) health department management information system; (6) hazardous materials; (7) coal gasification; and (8) emergency vehicle proximity sensing.
Synthesizing long-term sea level rise projections - the MAGICC sea level model v2.0
NASA Astrophysics Data System (ADS)
Nauels, Alexander; Meinshausen, Malte; Mengel, Matthias; Lorbacher, Katja; Wigley, Tom M. L.
2017-06-01
Sea level rise (SLR) is one of the major impacts of global warming; it will threaten coastal populations, infrastructure, and ecosystems around the globe in coming centuries. Well-constrained sea level projections are needed to estimate future losses from SLR and benefits of climate protection and adaptation. Process-based models that are designed to resolve the underlying physics of individual sea level drivers form the basis for state-of-the-art sea level projections. However, associated computational costs allow for only a small number of simulations based on selected scenarios that often vary for different sea level components. This approach does not sufficiently support sea level impact science and climate policy analysis, which require a sea level projection methodology that is flexible with regard to the climate scenario yet comprehensive and bound by the physical constraints provided by process-based models. To fill this gap, we present a sea level model that emulates global-mean long-term process-based model projections for all major sea level components. Thermal expansion estimates are calculated with the hemispheric upwelling-diffusion ocean component of the simple carbon-cycle climate model MAGICC, which has been updated and calibrated against CMIP5 ocean temperature profiles and thermal expansion data. Global glacier contributions are estimated based on a parameterization constrained by transient and equilibrium process-based projections. Sea level contribution estimates for Greenland and Antarctic ice sheets are derived from surface mass balance and solid ice discharge parameterizations reproducing current output from ice-sheet models. The land water storage component replicates recent hydrological modeling results. For 2100, we project 0.35 to 0.56 m (66 % range) total SLR based on the RCP2.6 scenario, 0.45 to 0.67 m for RCP4.5, 0.46 to 0.71 m for RCP6.0, and 0.65 to 0.97 m for RCP8.5. These projections lie within the range of the latest IPCC SLR estimates. SLR projections for 2300 yield median responses of 1.02 m for RCP2.6, 1.76 m for RCP4.5, 2.38 m for RCP6.0, and 4.73 m for RCP8.5. The MAGICC sea level model provides a flexible and efficient platform for the analysis of major scenario, model, and climate uncertainties underlying long-term SLR projections. It can be used as a tool to directly investigate the SLR implications of different mitigation pathways and may also serve as input for regional SLR assessments via component-wise sea level pattern scaling.
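As a simple illustration of how a component-based emulator composes a total projection, the sketch below sums hypothetical component contributions for a single year and scenario; the numbers are invented and are not MAGICC sea level model output.

```python
# Minimal sketch of composing a global-mean SLR projection from component
# contributions, as a component-based emulator does. Values are hypothetical.
components_2100 = {            # metres of global-mean sea level rise by 2100
    "thermal_expansion": 0.24,
    "glaciers":          0.13,
    "greenland":         0.08,
    "antarctica":        0.06,
    "land_water":        0.04,
}

total = sum(components_2100.values())
print(f"total projected SLR in 2100 ~ {total:.2f} m")
for name, value in sorted(components_2100.items(), key=lambda kv: -kv[1]):
    print(f"  {name:18s} {value:.2f} m  ({100 * value / total:.0f}%)")
```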
SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating
Lee, Young-Joo; Cho, Soojin
2016-01-01
Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125
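Step (2) of the procedure, adjusting structural parameters so that the model's natural frequencies match the identified ones, can be sketched on a toy two-degree-of-freedom surrogate; the masses, stiffness guesses, and "identified" frequencies below are assumptions, not the paper's bridge model.

```python
# Minimal sketch of tuning stiffness parameters of a small model so its
# natural frequencies match identified ones. A 2-DOF spring-mass chain stands
# in for the FE model; the identified frequencies are hypothetical.
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import least_squares

m = np.diag([1000.0, 1000.0])                     # masses [kg]

def natural_frequencies(k1, k2):
    k = np.array([[k1 + k2, -k2], [-k2, k2]])     # stiffness matrix [N/m]
    eigvals = eigh(k, m, eigvals_only=True)       # generalized eigenvalues (rad/s)^2
    return np.sqrt(eigvals) / (2 * np.pi)         # natural frequencies [Hz]

f_identified = np.array([2.1, 5.6])               # e.g. from ambient-vibration SHM data

def residual(theta):
    return natural_frequencies(*theta) - f_identified

theta0 = [1.5e6, 1.5e6]                           # initial stiffness guesses
sol = least_squares(residual, theta0, bounds=(1e5, 1e8))
print("updated stiffnesses [N/m]:", sol.x.round(0))
print("model frequencies [Hz]:", natural_frequencies(*sol.x).round(2))
```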
Predicting remaining life by fusing the physics of failure modeling with diagnostics
NASA Astrophysics Data System (ADS)
Kacprzynski, G. J.; Sarlashkar, A.; Roemer, M. J.; Hess, A.; Hardman, B.
2004-03-01
Technology that enables failure prediction of critical machine components (prognostics) has the potential to significantly reduce maintenance costs and increase availability and safety. This article summarizes a research effort funded through the U.S. Defense Advanced Research Projects Agency and Naval Air System Command aimed at enhancing prognostic accuracy through more advanced physics-of-failure modeling and intelligent utilization of relevant diagnostic information. H-60 helicopter gear is used as a case study to introduce both stochastic sub-zone crack initiation and three-dimensional fracture mechanics lifing models along with adaptive model updating techniques for tuning key failure mode variables at a local material/damage site based on fused vibration features. The overall prognostic scheme is aimed at minimizing inherent modeling and operational uncertainties via sensed system measurements that evolve as damage progresses.
Post2 End-to-End Descent and Landing Simulation for ALHAT Design Analysis Cycle 2
NASA Technical Reports Server (NTRS)
Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Johnson, Andrew E.; Paschall, Stephen C., II
2010-01-01
The ALHAT project is an agency-level program involving NASA centers, academia, and industry, with a primary goal to develop a safe, autonomous, precision-landing system for robotic and crew-piloted lunar and planetary descent vehicles. POST2 is used as the 6DOF descent and landing trajectory simulation for determining integrated system performance of ALHAT landing-system models and lunar environment models. This paper presents updates in the development of the ALHAT POST2 simulation, as well as preliminary system performance analysis for ALDAC-2 used for the testing and assessment of ALHAT system models. The ALDAC-2 POST2 Monte Carlo simulation results have been generated and focus on HRN model performance with the fully integrated system, as well as performance improvements of the AGNC and TSAR models since the previous design analysis cycle.
Normal response function method for mass and stiffness matrix updating using complex FRFs
NASA Astrophysics Data System (ADS)
Pradhan, S.; Modak, S. V.
2012-10-01
Quite often a structural dynamic finite element model is required to be updated so as to accurately predict dynamic characteristics like the natural frequencies and the mode shapes. Since in many situations undamped natural frequencies and mode shapes need to be predicted, it has generally been the practice in these situations to seek updating of only the mass and stiffness matrices so as to obtain a reliable prediction model. Updating using frequency response functions (FRFs) has been one of the widely used approaches for updating, including updating of mass and stiffness matrices. However, the problem with FRF based methods, for updating mass and stiffness matrices, is that these methods are based on the use of complex FRFs. Use of complex FRFs to update mass and stiffness matrices is not theoretically correct, as complex FRFs are not only affected by these two matrices but also by the damping matrix. Therefore, in situations where updating of only mass and stiffness matrices using FRFs is required, the use of a complex-FRF-based updating formulation is not fully justified and would lead to inaccurate updated models. This paper addresses this difficulty and proposes an improved FRF based finite element model updating procedure using the concept of normal FRFs. The proposed method is a modified version of the existing response function method that is based on the complex FRFs. The effectiveness of the proposed method is validated through a numerical study of a simple but representative beam structure. The effects of coordinate incompleteness and the robustness of the method in the presence of noise are investigated. The results of updating obtained by the improved method are compared with those of the existing response function method. The performance of the two approaches is compared for cases of lightly, moderately and heavily damped structures. It is found that the proposed improved method is effective in updating the mass and stiffness matrices in all cases of complete and incomplete data and with all levels and types of damping.
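The distinction the method rests on can be seen numerically: the complex FRF depends on the damping matrix as well as on mass and stiffness, while the normal (undamped) FRF is governed by mass and stiffness alone. The sketch below uses made-up 2-DOF matrices and an assumed proportional damping model, not the paper's beam structure.

```python
# Minimal sketch contrasting a complex FRF, H(w) = (K - w^2 M + i w C)^-1,
# with the corresponding normal (undamped) FRF, H_N(w) = (K - w^2 M)^-1,
# for a small 2-DOF system with made-up matrices.
import numpy as np

M = np.diag([2.0, 1.0])
K = np.array([[600.0, -200.0], [-200.0, 200.0]])
C = 0.02 * K                                  # assumed stiffness-proportional damping

def complex_frf(w):
    return np.linalg.inv(K - w**2 * M + 1j * w * C)

def normal_frf(w):
    return np.linalg.inv(K - w**2 * M)        # depends only on mass and stiffness

w = 12.0                                      # excitation frequency [rad/s]
print("|H(1,1)| complex:", abs(complex_frf(w)[0, 0]))
print("|H(1,1)| normal :", abs(normal_frf(w)[0, 0]))
```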
NASA Human Health and Performance Center: Open Innovation Successes and Collaborative Projects
NASA Technical Reports Server (NTRS)
Davis, Jeffrey R.; Richard, Elizabeth E.
2014-01-01
In May 2007, what was then the Space Life Sciences Directorate published the 2007 Space Life Sciences Strategy for Human Space Exploration, which resulted in the development and implementation of new business models and significant advances in external collaboration over the next five years. The strategy was updated on the basis of these accomplishments and reissued as the NASA Human Health and Performance Strategy in 2012, and continues to drive new approaches to innovation for the directorate. This short paper describes the open innovation successes and collaborative projects developed over this timeframe, including the efforts of the NASA Human Health and Performance Center (NHHPC), which was established to advance human health and performance innovations for spaceflight and societal benefit via collaboration in new markets.
CoMET: Cost and Mass Evaluation Tool for Spacecraft and Mission Design
NASA Technical Reports Server (NTRS)
Bieber, Ben S.
2005-01-01
New technology in space exploration is often developed without a complete knowledge of its impact. While the immediate benefits of a new technology are obvious, it is harder to understand its indirect consequences, which ripple through the entire system. CoMET is a technology evaluation tool designed to illuminate how specific technology choices affect a mission at each system level. CoMET uses simplified models for mass, power, and cost to analyze performance parameters of technologies of interest. The sensitivity analysis that CoMET provides shows whether developing a certain technology will greatly benefit the project or not. CoMET is an ongoing project approaching a web-based implementation phase. This year, development focused on the models for planetary daughter craft, such as atmospheric probes, blimps and balloons, and landers. These models are developed through research into historical data, well-established rules of thumb, and the engineering judgment of experts at JPL. The model is validated by corroboration with JPL advanced mission studies. Other enhancements to CoMET include adding launch vehicle analysis and integrating an updated cost model. When completed, CoMET will allow technological development to be focused on areas that will most drastically improve spacecraft performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMurray, L. and W. Templin-Branner
Training Manual updated for United Negro College Fund Special Programs Corporation/National Library of Medicine - HBCU ACCESS Project for Alcorn State University, Natchez, Mississippi, November 12, 2010
The In-Space Propulsion Technology Project Low-Thrust Trajectory Tool Suite
NASA Technical Reports Server (NTRS)
Dankanich, John W.
2008-01-01
The ISPT project released its low-thrust trajectory tool suite in March of 2006. The LTTT suite tools range in capabilities, but represent the state of the art in NASA low-thrust trajectory optimization tools. The tools have all received considerable updates following the initial release, and they are available through their respective development centers or the ISPT project website.
INEEL AIR MODELING PROTOCOL ext
DOE Office of Scientific and Technical Information (OSTI.GOV)
C. S. Staley; M. L. Abbott; P. D. Ritter
2004-12-01
Various laws stemming from the Clean Air Act of 1970 and the Clean Air Act amendments of 1990 require air emissions modeling. Modeling is used to ensure that air emissions from new projects and from modifications to existing facilities do not exceed certain standards. For radionuclides, any new airborne release must be modeled to show that downwind receptors do not receive exposures exceeding the dose limits and to determine the requirements for emissions monitoring. For criteria and toxic pollutants, emissions usually must first exceed threshold values before modeling of downwind concentrations is required. This document was prepared to provide guidance for performing environmental compliance-driven air modeling of emissions from Idaho National Engineering and Environmental Laboratory facilities. This document assumes that the user has experience in air modeling and dose and risk assessment. It is not intended to be a "cookbook," nor should all recommendations herein be construed as requirements. However, there are certain procedures that are required by law, and these are pointed out. It is also important to understand that air emissions modeling is a constantly evolving process. This document should, therefore, be reviewed periodically and revised as needed. The document is divided into two parts. Part A is the protocol for radiological assessments, and Part B is for nonradiological assessments. This document is an update of and supersedes document INEEL/INT-98-00236, Rev. 0, INEEL Air Modeling Protocol. This updated document incorporates changes in some of the rules, procedures, and air modeling codes that have occurred since the protocol was first published in 1998.
The Master Archive Collection Inventory (MACI)
NASA Astrophysics Data System (ADS)
Lief, C. J.; Arnfield, J.; Sprain, M.
2014-12-01
The Master Archive Collection Inventory (MACI) project at the NOAA National Climatic Data Center (NCDC) is an effort to re-inventory all digital holdings to streamline data set and product titles and update documentation to discovery-level ISO 19115-2. Subject Matter Experts (SMEs) are being identified for each of the holdings and will be responsible for creating and maintaining metadata records. New user-friendly tools are available for the SMEs to easily create and update this documentation. Updated metadata will be available for retrieval by other aggregators and discovery tools, increasing the usability of NCDC data and products.
Final Report: Update of the Glossary of Meteorology, September 1, 1994 - August 3, 1999
DOE Office of Scientific and Technical Information (OSTI.GOV)
American Meteorological Society
2000-01-24
The American Meteorological Society has updated the Glossary of Meteorology from the first edition, which was published in 1959. The second edition contains over 12,000 entries in meteorology and related fields. The glossary will be made available in both book and CD-ROM formats. DOE was one of six federal agencies that provided support for this project.
CSTT Update: Fuel Quality Analyzer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brosha, Eric L.; Lujan, Roger W.; Mukundan, Rangachary
These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.
Manufacturing Methods and Technology Program Plan. Update.
1981-11-01
Industrial Base Engineering Activity, Rock Island, Illinois 61299. Contents include an introduction to the MMT Program Plan Update and an industry guide. ...obtained from that Plan, extra copies of which are available upon request from the Industrial Base Engineering Activity. Other sources for this data are... the Major Subcommands (SUBMACOMs), which plan, formulate, budget, and execute individual projects. The Industrial Base Engineering Activity...
Himmelstoss, Emily A.; Kratzmann, Meredith G.; Thieler, E. Robert
2017-07-18
Long-term rates of shoreline change for the Gulf of Mexico and Southeast Atlantic regions of the United States have been updated as part of the U.S. Geological Survey’s National Assessment of Shoreline Change project. Additional shoreline position data were used to compute rates where the previous rate-of-change assessment only included four shoreline positions at a given location. The long-term shoreline change rates also incorporate the proxy-datum bias correction to account for the unidirectional onshore bias of the proxy-based high water line shorelines relative to the datum-based mean high water shorelines. The calculation of uncertainty associated with the long-term average rates has also been updated to match refined methods used in other study regions of the National Assessment project. The average rates reported here have a reduced amount of uncertainty relative to those presented in the previous assessments for these two regions.
NASA Astrophysics Data System (ADS)
Turnbull, Heather; Omenzetter, Piotr
2018-03-01
Difficulties associated with current health monitoring and inspection practices, combined with the harsh, often remote, operational environments of wind turbines, highlight the requirement for a non-destructive evaluation system capable of remotely monitoring the current structural state of turbine blades. This research adopted a physics-based structural health monitoring methodology through calibration of a finite element model using inverse techniques. A 2.36 m blade from a 5 kW turbine was used as an experimental specimen, with operational modal analysis techniques utilised to realize the modal properties of the system. By modelling the experimental responses as fuzzy numbers using the sub-level technique, uncertainty in the response parameters was propagated back through the model and into the updating parameters. Initially, experimental responses of the blade were obtained, and a numerical model of the blade was created and updated. Deterministic updating was carried out through formulation and minimisation of a deterministic objective function using both the firefly algorithm and the virus optimisation algorithm. Uncertainty in the experimental responses was modelled using triangular membership functions, allowing membership functions of the updating parameters (Young's modulus and shear modulus) to be obtained. The firefly algorithm and virus optimisation algorithm were again utilised, this time in the solution of fuzzy objective functions, enabling the uncertainty associated with the updating parameters to be quantified. Varying damage location and severity was simulated experimentally through the addition of small masses to the structure, intended to cause a structural alteration. A damaged model was created, modelling four variable-magnitude nonstructural masses at predefined points, and updated to provide a deterministic damage prediction and information on parameter uncertainty via fuzzy updating.
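The fuzzy propagation step can be illustrated with a one-parameter toy model and a triangular fuzzy "measured" frequency: at each alpha-cut the response interval is inverted to a parameter interval. The toy response function, the numbers, and the use of a simple root finder (rather than firefly or virus optimisation) are assumptions for illustration only.

```python
# Minimal sketch of alpha-cut fuzzy propagation for model updating: a toy
# one-parameter model maps a stiffness-like parameter E to a frequency, a
# triangular fuzzy number describes the measured frequency, and at each
# alpha-cut the response interval is inverted to bounds on E.
import numpy as np
from scipy.optimize import brentq

def model_freq(E):
    return 10.0 * np.sqrt(E)                    # toy response: frequency vs. E

# Triangular fuzzy measured frequency (left, peak, right) in Hz (hypothetical).
f_left, f_peak, f_right = 29.0, 30.0, 31.5

for alpha in (0.0, 0.5, 1.0):
    lo = f_left + alpha * (f_peak - f_left)     # response interval at this alpha-cut
    hi = f_right - alpha * (f_right - f_peak)
    # Invert the monotonic toy model over a bracketing interval for E.
    E_lo = brentq(lambda E: model_freq(E) - lo, 1.0, 100.0)
    E_hi = brentq(lambda E: model_freq(E) - hi, 1.0, 100.0)
    print(f"alpha={alpha:.1f}: f in [{lo:.2f}, {hi:.2f}] Hz -> E in [{E_lo:.2f}, {E_hi:.2f}]")
```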
Numerical modeling and model updating for smart laminated structures with viscoelastic damping
NASA Astrophysics Data System (ADS)
Lu, Jun; Zhan, Zhenfei; Liu, Xu; Wang, Pan
2018-07-01
This paper presents a numerical modeling method combined with model updating techniques for the analysis of smart laminated structures with viscoelastic damping. Starting with finite element formulation, the dynamics model with piezoelectric actuators is derived based on the constitutive law of the multilayer plate structure. The frequency-dependent characteristics of the viscoelastic core are represented utilizing the anelastic displacement fields (ADF) parametric model in the time domain. The analytical model is validated experimentally and used to analyze the influencing factors of kinetic parameters under parametric variations. Emphasis is placed upon model updating for smart laminated structures to improve the accuracy of the numerical model. Key design variables are selected through the smoothing spline ANOVA statistical technique to mitigate the computational cost. This updating strategy not only corrects the natural frequencies but also improves the accuracy of damping prediction. The effectiveness of the approach is examined through an application problem of a smart laminated plate. It is shown that a good consistency can be achieved between updated results and measurements. The proposed method is computationally efficient.
Adapting to change: The role of the right hemisphere in mental model building and updating.
Filipowicz, Alex; Anderson, Britt; Danckert, James
2016-09-01
We recently proposed that the right hemisphere plays a crucial role in the processes underlying mental model building and updating. Here, we review the evidence we and others have garnered to support this novel account of right hemisphere function. We begin by presenting evidence from patient work that suggests a critical role for the right hemisphere in the ability to learn from the statistics in the environment (model building) and adapt to environmental change (model updating). We then provide a review of neuroimaging research that highlights a network of brain regions involved in mental model updating. Next, we outline specific roles for particular regions within the network such that the anterior insula is purported to maintain the current model of the environment, the medial prefrontal cortex determines when to explore new or alternative models, and the inferior parietal lobule represents salient and surprising information with respect to the current model. We conclude by proposing some future directions that address some of the outstanding questions in the field of mental model building and updating. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
NASA Technical Reports Server (NTRS)
Burns, Lee; Merry, Carl; Decker, Ryan; Harrington, Brian
2008-01-01
The 2006 Cape Canaveral Air Force Station (CCAFS) Range Reference Atmosphere (RRA) is a statistical model summarizing the wind and thermodynamic atmospheric variability from the surface to 70 km. Launches of the National Aeronautics and Space Administration's (NASA) Space Shuttle from Kennedy Space Center utilize CCAFS RRA data to evaluate environmental constraints on various aspects of the vehicle during ascent. An update to the CCAFS RRA was recently completed. As part of the update, a validation study on the 2006 version was conducted, as well as a comparison analysis of the 2006 version against the existing 1983 CCAFS RRA database. Assessments of the Space Shuttle vehicle ascent profile characteristics were performed to determine the impacts of the updated model on vehicle performance. Details on the model updates and the vehicle sensitivity analyses with the updated model are presented.
Artificial Boundary Conditions for Finite Element Model Update and Damage Detection
2017-03-01
Artificial Boundary Conditions for Finite Element Model Update and Damage Detection, by Emmanouil Damanakis, March 2017. Thesis Advisor: Joshua H. Gordis. Master's thesis; approved for public release, distribution is unlimited. Abstract: In structural engineering, a finite element model is often…
An Investigation Into the Effects of Frequency Response Function Estimators on Model Updating
NASA Astrophysics Data System (ADS)
Ratcliffe, M. J.; Lieven, N. A. J.
1999-03-01
Model updating is a very active research field, in which significant effort has been invested in recent years. Model updating methodologies are invariably successful when used on noise-free simulated data, but tend to be unpredictable when presented with real experimental data that are unavoidably corrupted with uncorrelated noise content. In the development and validation of model-updating strategies, a random zero-mean Gaussian variable is added to simulated test data to tax the updating routines more fully. This paper proposes a more sophisticated model for experimental measurement noise, and this is used in conjunction with several different frequency response function estimators, from the classical H1 and H2 to more refined estimators that purport to be unbiased. Finite-element model case studies, in conjunction with a genuine experimental test, suggest that the proposed noise model is a more realistic representation of experimental noise phenomena. The choice of estimator is shown to have a significant influence on the viability of the FRF sensitivity method. These test cases find that the use of the H2 estimator for model updating purposes is contraindicated, and that there is no advantage to be gained by using the sophisticated estimators over the classical H1 estimator.
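The two classical estimators have simple spectral definitions, H1 = Gxy/Gxx and H2 = Gyy/Gyx; the sketch below computes both for a simulated single-degree-of-freedom system using Welch-type spectral estimates. The system parameters and noise level are illustrative assumptions, not the paper's test cases.

```python
# Minimal sketch of the classical H1 and H2 FRF estimators computed from
# simulated input/output records of a single-DOF system. Parameters and the
# tiny additive output noise are illustrative.
import numpy as np
from scipy import signal

fs, n = 1024, 2**16
rng = np.random.default_rng(1)
x = rng.standard_normal(n)                        # broadband random excitation

# Single-DOF system (resonance ~50 Hz, 2% damping), discretised by bilinear transform.
wn, zeta = 2 * np.pi * 50.0, 0.02
b, a = signal.bilinear([1.0], [1.0, 2 * zeta * wn, wn**2], fs)
y = signal.lfilter(b, a, x) + 1e-6 * rng.standard_normal(n)

f, Gxx = signal.welch(x, fs, nperseg=4096)
_, Gyy = signal.welch(y, fs, nperseg=4096)
_, Gxy = signal.csd(x, y, fs, nperseg=4096)

H1 = Gxy / Gxx              # insensitive to output noise, biased low by input noise
H2 = Gyy / np.conj(Gxy)     # insensitive to input noise, biased high by output noise
k = np.argmin(np.abs(f - 50.0))
print(f"|H1| at 50 Hz: {abs(H1[k]):.3e}   |H2| at 50 Hz: {abs(H2[k]):.3e}")
```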
Ground Source Heat Pump Sub-Slab Heat Exchange Loop Performance in a Cold Climate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittereder, N.; Poerschke, A.
2013-11-01
This report presents a cold-climate project that examines an alternative approach to ground source heat pump (GSHP) ground loop design. The innovative ground loop design is an attempt to reduce the installed cost of the ground loop heat exchange portion of the system by containing the entire ground loop within the excavated location beneath the basement slab. Prior to the installation and operation of the sub-slab heat exchanger, energy modeling using TRNSYS software and concurrent design efforts were performed to determine the size and orientation of the system. One key parameter in the design is the installation of the GSHP in a low-load home, which considerably reduces the needed capacity of the ground loop heat exchanger. This report analyzes data from two cooling seasons and one heating season. Upon completion of the monitoring phase, measurements revealed that the initial TRNSYS simulated horizontal sub-slab ground loop heat exchanger fluid temperatures and heat transfer rates differed from the measured values. To determine the cause of this discrepancy, an updated model was developed utilizing a new TRNSYS subroutine for simulating sub-slab heat exchangers. Measurements of fluid temperature, soil temperature, and heat transfer were used to validate the updated model.
Addressing forecast uncertainty impact on CSP annual performance
NASA Astrophysics Data System (ADS)
Ferretti, Fabio; Hogendijk, Christopher; Aga, Vipluv; Ehrsam, Andreas
2017-06-01
This work analyzes the impact of weather forecast uncertainty on the annual performance of a Concentrated Solar Power (CSP) plant. The forecast time series was produced by a commercial forecast provider using hindcasting for the full year 2011 in hourly resolution for Ouarzazate, Morocco. The impact of forecast uncertainty was measured on three case studies, representing typical tariff schemes observed in recent CSP projects plus a spot-market price scenario. The analysis was carried out using an annual performance model and a standard dispatch optimization algorithm based on dynamic programming. The dispatch optimizer is demonstrated to be a key requisite for maximizing annual revenues, depending on the price scenario, harvesting the maximum potential out of the CSP plant. Forecast uncertainty affects the revenue enhancement achieved by a dispatch optimizer, depending on the error level and the price function. Results show that forecasting accuracy of direct normal irradiance (DNI) is important for making best use of an optimized dispatch, but also that a higher number of calculation updates can partially compensate for this uncertainty. The improvement in revenues can be significant depending on the price profile and the optimal operation strategy. Pathways to better performance are presented: performing more frequent updates, both by repeatedly generating new optimized dispatch trajectories and by updating weather forecasts more often. This study shows the importance of working on DNI weather forecasting for revenue enhancement, as well as of selecting weather services that can provide multiple updates a day and probabilistic forecast information.
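The dynamic-programming dispatch idea can be sketched as a backward induction over hours, choosing how much stored thermal energy to convert to electricity at each hour to maximise revenue; the price profile, storage discretisation, and per-block energy below are hypothetical, and the real optimiser also models solar collection, efficiencies, and ramping.

```python
# Minimal sketch of price-driven dispatch of stored thermal energy by dynamic
# programming (backward induction over hours). All numbers are hypothetical.
import numpy as np

prices = [30, 28, 45, 90, 120, 60]      # $/MWh for the next hours
storage_steps = 6                        # storage discretised into 6 equal blocks
max_discharge = 2                        # blocks that can be discharged per hour
mwh_per_block = 10.0                     # electric energy per storage block

n_hours = len(prices)
value = np.zeros(storage_steps + 1)      # value-to-go as a function of storage level
policy = np.zeros((n_hours, storage_steps + 1), dtype=int)

for h in range(n_hours - 1, -1, -1):     # backward induction over hours
    new_value = np.zeros_like(value)
    for s in range(storage_steps + 1):
        best, best_u = -np.inf, 0
        for u in range(min(s, max_discharge) + 1):     # blocks discharged this hour
            candidate = prices[h] * u * mwh_per_block + value[s - u]
            if candidate > best:
                best, best_u = candidate, u
        new_value[s], policy[h, s] = best, best_u
    value = new_value

s = storage_steps                        # start the horizon with a full store
for h in range(n_hours):
    u = policy[h, s]
    print(f"hour {h}: discharge {u} block(s) at {prices[h]} $/MWh")
    s -= u
print(f"optimal revenue: {value[storage_steps]:.0f} $")
```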
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harpole, K.J.; Hill, C.J.
1983-02-01
A review of the performance of the North Stanley Polymer Demonstration Project has been completed. The objective of the cost-project was to evaluate the technical efficiency and economic feasibility of polymer-enhanced waterflooding as a tertiary recovery process in a highly heterogeneous and vertically fractured sandstone reservoir that has been successfully waterflooded and is approaching the economic limits of conventional waterflooding recovery. The ultimate incremental oil recovery from the project is estimated to be about 570,000 barrels (or approximately 1.4% of the original oil-in-place). This is significantly less than the original recovery predictions but does demonstrate that the project was technically successful. The lower-than-anticipated recovery is attributed principally to the extremely heterogeneous nature of the reservoir. One of the major objectives of this evaluation is to present an updated economic analysis of the North Stanley Polymer Demonstration Project. The updated economic analysis under current (mid-1982) economic conditions indicates that the North Stanley project would be commercially feasible if polymer injection had begun in 1982, rather than in 1976. Overall project operations were conducted efficiently, with a minimum of operational problems. The North Stanley polymer project provides a well-documented example of an actual field-scale tertiary application of polymer-augmented waterflooding in a highly heterogeneous reservoir.
EFFECT OF SELECTIVE CATALYTIC REDUCTION ON MERCURY, 2002 FIELD STUDIES UPDATE
The report documents the 2002 "Selective Catalytic Reduction Mercury Field Sampling Project." An overall evaluation of the results from both 2001 and 2002 testing is also provided. The project was sponsored by the Electric Power Research Institute (EPRI), the U.S. Department of...
Draft project management update to the Iowa DOT Project Development Manual : tech transfer summary.
DOT National Transportation Integrated Search
2016-08-01
The Iowa DOT applied and was selected to receive User Incentive funding from the U.S. DOT Federal Highway Administration (FHWA) for the SHRP 2 R10 Implementation Assistance Program. Through the program, the Iowa DOT plans to utilize the results...
The Market Linkage Project for Special Education: A Project Update.
ERIC Educational Resources Information Center
Bulford, Sally; Daniels, Carol
1982-01-01
The procedure for marketing special education materials developed with government funds is detailed, and the array of technical assistance activities offered by LINC Resources is described. Material criteria are considered in terms of such aspects as timeliness, target audiences, effectiveness data, and product format. (CL)
NASA Earned Value Management (EVM) Update
NASA Technical Reports Server (NTRS)
Kerby, Jerald
2013-01-01
Earned Value Management (EVM) is an integrated management control system for assessing, understanding, and quantifying what a project is achieving with its resources. EVM integrates technical, cost, and schedule performance with risk management. It allows objective assessment and quantification of current project performance, and helps predict future performance based on trends.
Annual Energy Outlook Retrospective Review
2015-01-01
The Annual Energy Outlook Retrospective Review provides a yearly comparison between realized energy outcomes and the Reference case projections included in previous Annual Energy Outlooks (AEO) beginning with 1982. This edition of the report adds the AEO 2012 projections and updates the historical data to incorporate the latest data revisions.
Projecting the aspen resource in the Lake States.
William A. Leuschner
1972-01-01
Aspen growing stock inventories for nine Lake States forest survey units were updated to the common base year of 1968. Cut and inventory were projected to the year 2000 under three sets of assumptions. Potential shortages were found in northeastern Wisconsin and Michigan if historical trends continue.
High-dose neutron detector project update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Menlove, Howard Olsen; Henzlova, Daniela
These are the slides for a progress review meeting by the sponsor. This is an update on the high-dose neutron detector project. In summary, improvements in both boron coating and signal amplification have been achieved; improved boron coating materials and procedures have increased efficiency by ~30-40% without a corresponding increase in the detector plate area; low dead time has been achieved via a thin cell design (~4 mm gas gaps) and fast amplifiers; a prototype PDT 8” pod has been received and testing is in progress; significant improvements in efficiency and stability have been verified; and the project will use commercial PDT 10B design and fabrication to obtain a faster path from research to a practical high-dose neutron detector.
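The emphasis on low dead time can be made concrete with the standard non-paralyzable dead-time correction, which relates the observed count rate to the true rate; this is generic detector physics rather than the project's own analysis, and the numbers below are illustrative.

# Standard non-paralyzable dead-time correction (generic formula,
# not taken from the project slides). Illustrative numbers only.
observed_rate = 5.0e5      # observed count rate (counts/s)
dead_time = 0.5e-6         # per-event dead time (s); thin gas gaps and fast
                           # amplifiers push this value down

true_rate = observed_rate / (1.0 - observed_rate * dead_time)
fractional_loss = 1.0 - observed_rate / true_rate
print(f"True rate ~ {true_rate:,.0f} counts/s, losses ~ {fractional_loss:.1%}")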
Recent Results from NASA's Morphing Project
NASA Technical Reports Server (NTRS)
McGowan, Anna-Maria R.; Washburn, Anthony E.; Horta, Lucas G.; Bryant, Robert G.; Cox, David E.; Siochi, Emilie J.; Padula, Sharon L.; Holloway, Nancy M.
2002-01-01
The NASA Morphing Project seeks to develop and assess advanced technologies and integrated component concepts to enable efficient, multi-point adaptability in air and space vehicles. In the context of the project, the word "morphing" is defined as "efficient, multi-point adaptability" and may include macro, micro, structural and/or fluidic approaches. The project includes research on smart materials, adaptive structures, micro flow control, biomimetic concepts, optimization and controls. This paper presents an updated overview of the content of the Morphing Project including highlights of recent research results.
Wuebbles, Donald J; Patten, Kenneth O
2009-05-01
HCFC-123 (C2HCl2F3) is used in large refrigeration systems and as a fire suppression agent blend. Like other hydrochlorofluorocarbons, production and consumption of HCFC-123 is limited under the Montreal Protocol on Substances that Deplete the Ozone Layer. The purpose of this study is to update the understanding of the current and projected impacts of HCFC-123 on stratospheric ozone and on climate and to discuss the potential environmental effects from continued use of this chemical for specific applications. For the first time, the Ozone Depletion Potential (ODP) of an HCFC is determined using a three-dimensional model (MOZART-3) of atmospheric physics and chemistry. All previous studies have relied on results from two-dimensional models. The derived HCFC-123 ODP of 0.0098 is smaller than previous values. Analysis of the projected uses and emissions of HCFC-123, assuming reasonable levels of projected growth and use in centrifugal chiller and fire suppressant applications, suggests an extremely small impact on the environment due to its short atmospheric lifetime, low ODP, low Global Warming Potential (GWP), and the small production and emission of its limited applications. The current contribution of HCFC-123 to stratospheric reactive chlorine is too small to be measurable.
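For context, earlier ODP estimates for compounds like HCFC-123 were typically obtained by a semi-empirical scaling against CFC-11 rather than from a 3-D model such as MOZART-3. The sketch below shows that classical scaling; the lifetimes, molar masses, and fractional-release ratio are illustrative assumptions, not values taken from the study.

# Classical semi-empirical ODP scaling relative to CFC-11 (not the MOZART-3
# calculation used in the study). Parameter values are illustrative only.
lifetime_x = 1.3           # assumed atmospheric lifetime of HCFC-123 (years)
lifetime_cfc11 = 52.0      # assumed atmospheric lifetime of CFC-11 (years)
molar_mass_x = 152.9       # g/mol, HCFC-123 (C2HCl2F3)
molar_mass_cfc11 = 137.4   # g/mol, CFC-11 (CCl3F)
n_cl_x, n_cl_cfc11 = 2, 3  # chlorine atoms per molecule
release_ratio = 1.0        # fractional-release ratio vs. CFC-11 (simplified)

odp_estimate = (release_ratio
                * (lifetime_x / lifetime_cfc11)
                * (molar_mass_cfc11 / molar_mass_x)
                * (n_cl_x / n_cl_cfc11))
print(f"Semi-empirical ODP estimate: {odp_estimate:.3f}")  # on the order of 0.01-0.02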
Adaptation of clinical prediction models for application in local settings.
Kappen, Teus H; Vergouwe, Yvonne; van Klei, Wilton A; van Wolfswinkel, Leo; Kalkman, Cor J; Moons, Karel G M
2012-01-01
When planning to use a validated prediction model in new patients, adequate performance is not guaranteed. For example, changes in clinical practice over time or a different case mix than the original validation population may result in inaccurate risk predictions. To demonstrate how clinical information can direct updating a prediction model and development of a strategy for handling missing predictor values in clinical practice. A previously derived and validated prediction model for postoperative nausea and vomiting was updated using a data set of 1847 patients. The update consisted of 1) changing the definition of an existing predictor, 2) reestimating the regression coefficient of a predictor, and 3) adding a new predictor to the model. The updated model was then validated in a new series of 3822 patients. Furthermore, several imputation models were considered to handle real-time missing values, so that possible missing predictor values could be anticipated during actual model use. Differences in clinical practice between our local population and the original derivation population guided the update strategy of the prediction model. The predictive accuracy of the updated model was better (c statistic, 0.68; calibration slope, 1.0) than the original model (c statistic, 0.62; calibration slope, 0.57). Inclusion of logistical variables in the imputation models, besides observed patient characteristics, contributed to a strategy to deal with missing predictor values at the time of risk calculation. Extensive knowledge of local, clinical processes provides crucial information to guide the process of adapting a prediction model to new clinical practices.
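The update steps described above (redefining a predictor, re-estimating coefficients, and adding a new predictor) and the reported validation metrics (c statistic and calibration slope) can be sketched for a logistic prediction model as follows; the predictors and simulated data are hypothetical stand-ins, not the postoperative nausea and vomiting data set.

# Hedged sketch of updating and re-validating a logistic prediction model.
# Predictor names and the simulated data are hypothetical; the study used
# 1847 patients for updating and 3822 for validation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def simulate(n):
    # Three original predictors plus one candidate new predictor.
    X = rng.normal(size=(n, 4))
    logit = -0.5 + 0.8 * X[:, 0] + 0.4 * X[:, 1] + 0.6 * X[:, 3]
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))
    return X, y

X_update, y_update = simulate(1847)   # local updating set
X_valid, y_valid = simulate(3822)     # local validation set

# Re-estimate the coefficients and include the new predictor (redefining an
# existing predictor would happen upstream, when the design matrix is built).
updated_model = LogisticRegression().fit(X_update, y_update)

# Validation: c statistic (AUC) and calibration slope on the new series.
p_valid = updated_model.predict_proba(X_valid)[:, 1]
c_statistic = roc_auc_score(y_valid, p_valid)

lp = np.log(p_valid / (1.0 - p_valid))                    # linear predictor
calib = LogisticRegression(C=1e6).fit(lp.reshape(-1, 1), y_valid)  # large C ~ unpenalized
calibration_slope = calib.coef_[0, 0]

print(f"c statistic: {c_statistic:.2f}, calibration slope: {calibration_slope:.2f}")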
Ares I-X Flight Evaluation Tasks in Support of Ares I Development
NASA Technical Reports Server (NTRS)
Huebner, Lawrence D.; Richards, James S.; Coates, Ralph H., III; Cruit, Wendy D.; Ramsey, Matthew N.
2010-01-01
NASA's Constellation Program successfully launched the Ares I-X Flight Test Vehicle on October 28, 2009. The Ares I-X flight was a development flight test that offered a unique opportunity for early engineering data to impact the design and development of the Ares I crew launch vehicle. As the primary customer for flight data from the Ares I-X mission, the Ares Projects Office established a set of 33 flight evaluation tasks to correlate flight results with prospective design assumptions and models. Included within these tasks were direct comparisons of flight data with pre-flight predictions and post-flight assessments utilizing models and modeling techniques being applied to design and develop Ares I. A discussion of the similarities and differences in those comparisons and the need for discipline-level model updates based upon those comparisons form the substance of this paper. The benefits of development flight testing were made evident by implementing these tasks, which used Ares I-X data to partially validate tools and methodologies in technical disciplines that will ultimately influence the design and development of Ares I and future launch vehicles. The areas in which partial validation from the flight test was most significant included flight control system algorithms to predict liftoff clearance, ascent, and stage separation; structural models from rollout to separation; thermal models that have been updated based on these data; pyroshock attenuation; and the ability to predict complex flow fields during time-varying conditions including plume interactions.
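At their simplest, the discipline-level comparisons mentioned above amount to computing residuals between flight measurements and pre-flight model predictions and checking them against an acceptance band; the sketch below is a generic illustration with invented time histories and tolerance, not an actual Ares I-X evaluation task.

# Generic sketch of a flight-data vs. pre-flight-prediction comparison.
# The time histories and tolerance are invented and do not represent any
# actual Ares I-X measurement or model output.
import numpy as np

time = np.linspace(0.0, 120.0, 121)                # s, invented ascent window
predicted = 1.0 + 0.02 * time                      # pre-flight model prediction
flight = predicted + np.random.default_rng(1).normal(0.0, 0.05, time.size)

residual = flight - predicted
rms_error = np.sqrt(np.mean(residual**2))
tolerance = 0.10                                   # acceptance band (same units)

print(f"RMS residual: {rms_error:.3f} "
      f"({'within' if rms_error <= tolerance else 'outside'} tolerance {tolerance})")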
A review and update of the Virginia Department of Transportation cash flow forecasting model.
DOT National Transportation Integrated Search
1996-01-01
This report details the research done to review and update components of the VDOT cash flow forecasting model. Specifically, the study updated the monthly factors submodel used to predict payments on construction contracts. For the other submodel rev...
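A monthly factors submodel of the sort mentioned above typically spreads a contract's total value over its duration using a fixed profile of monthly payout fractions; the sketch below illustrates the idea with invented factors and contract value, not VDOT's actual submodel or coefficients.

# Hedged sketch of a monthly-factors payout forecast for a construction
# contract. The factors and contract value are invented for illustration.
contract_value = 12_000_000.0   # total award amount ($)
monthly_factors = [0.02, 0.05, 0.09, 0.12, 0.14, 0.15,
                   0.14, 0.12, 0.08, 0.05, 0.03, 0.01]   # payout fractions, sum to 1.0

assert abs(sum(monthly_factors) - 1.0) < 1e-9

forecast_payments = [contract_value * f for f in monthly_factors]
cumulative = 0.0
for month, payment in enumerate(forecast_payments, start=1):
    cumulative += payment
    print(f"Month {month:2d}: payment ${payment:,.0f}, cumulative ${cumulative:,.0f}")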