Sample records for model projection technique

  1. Variances in the projections, resulting from CLIMEX, Boosted Regression Trees and Random Forests techniques

    NASA Astrophysics Data System (ADS)

    Shabani, Farzin; Kumar, Lalit; Solhjouy-fard, Samaneh

    2017-08-01

    The aim of this study was to comparatively investigate and evaluate the capabilities of correlative and mechanistic modeling processes applied to the projection of future distributions of date palm in novel environments, and to establish a method of minimizing uncertainty in the projections of differing techniques. The study area comprises the countries of the Middle East. We compared the mechanistic model CLIMEX (CL) with the correlative models MaxEnt (MX), Boosted Regression Trees (BRT), and Random Forests (RF) to project current and future distributions of date palm (Phoenix dactylifera L.). The Global Climate Model (GCM) CSIRO-Mk3.0 (CS), using the A2 emissions scenario, was selected for making projections. Both indigenous and alien distribution data of the species were utilized in the modeling process. The common areas predicted by MX, BRT, RF, and CL from the CS GCM were extracted and compared to ascertain the projection uncertainty levels of each individual technique. The common areas identified by all four modeling techniques were used to produce a map indicating suitable and unsuitable areas for date palm cultivation in Middle Eastern countries, for the present and for the year 2100. The four modeling approaches predict fairly different distributions. Projections from CL were more conservative than those from MX, and BRT and RF were the most conservative methods in terms of projections for the current time. The combination of the final CL and MX projections for the present and 2100 provides higher certainty concerning those areas that will become highly suitable for future date palm cultivation. According to the four models, cold, hot, and wet stress, with differences on a regional basis, appear to be the major restrictions on future date palm distribution. The results demonstrate variances in the projections resulting from the different techniques. The assessment and interpretation of model projections requires caution, especially for correlative models such as MX, BRT, and RF. Intersections between different techniques may decrease uncertainty in future distribution projections. Suggestions on methodology and processing for improving projections are included.
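
    A minimal sketch (in Python, with random placeholder grids) of the intersection idea used in this record: each technique's output is thresholded to a binary suitability map on a common raster, and only cells predicted suitable by all four techniques are kept as the lower-uncertainty consensus area.

      import numpy as np

      # Binary suitability grids (True = suitable) from the four techniques,
      # e.g. thresholded CLIMEX, MaxEnt, BRT and RF outputs on a shared raster.
      # Random placeholders stand in for real model output here.
      rng = np.random.default_rng(0)
      cl, mx, brt, rf = (rng.random((200, 300)) > 0.5 for _ in range(4))

      stack = np.stack([cl, mx, brt, rf])        # shape (4, ny, nx)
      agreement = stack.sum(axis=0)              # number of techniques agreeing, 0..4
      consensus = agreement == stack.shape[0]    # suitable under all four techniques
      print(f"cells suitable under all four techniques: {consensus.mean():.1%}")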

  2. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  3. Multi-Model Combination techniques for Hydrological Forecasting: Application to Distributed Model Intercomparison Project Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ajami, N K; Duan, Q; Gao, X

    2005-04-11

    This paper examines several multi-model combination techniques: the Simple Multi-model Average (SMA), the Multi-Model Super Ensemble (MMSE), the Modified Multi-Model Super Ensemble (M3SE) and the Weighted Average Method (WAM). These model combination techniques were evaluated using the results from the Distributed Model Intercomparison Project (DMIP), an international project sponsored by the National Weather Service (NWS) Office of Hydrologic Development (OHD). All of the multi-model combination results were obtained using uncalibrated DMIP model outputs and were compared against the best uncalibrated as well as the best calibrated individual model results. The purpose of this study is to understand how different combination techniques affect the skill levels of the multi-model predictions. This study revealed that the multi-model predictions obtained from uncalibrated single model predictions are generally better than any single member model predictions, even the best calibrated single model predictions. Furthermore, more sophisticated multi-model combination techniques that incorporate bias correction steps work better than simple multi-model average predictions or multi-model predictions without bias correction.
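
    A minimal sketch of the two simplest schemes named in this record, the Simple Multi-model Average and a least-squares Weighted Average Method; the "observations" and perturbed member simulations are synthetic placeholders, not DMIP data.

      import numpy as np

      rng = np.random.default_rng(1)
      n, m = 500, 5                      # time steps, member models
      obs = rng.gamma(2.0, 1.0, n)       # observed streamflow (placeholder data)
      sims = obs[None, :] * rng.uniform(0.6, 1.4, (m, 1)) + rng.normal(0, 0.3, (m, n))

      sma = sims.mean(axis=0)            # Simple Multi-model Average (SMA)

      # Weighted Average Method (WAM): least-squares weights on a training window
      train = slice(0, 300)
      A = sims[:, train].T               # (n_train, m)
      w, *_ = np.linalg.lstsq(A, obs[train], rcond=None)
      wam = sims.T @ w                   # combined prediction for all time steps

      def rmse(p): return np.sqrt(np.mean((p - obs) ** 2))
      print(f"best single member RMSE: {min(rmse(s) for s in sims):.3f}")
      print(f"SMA RMSE: {rmse(sma):.3f}   WAM RMSE: {rmse(wam):.3f}")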

  4. A numerical projection technique for large-scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang

    2011-10-01

    We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique used in strongly correlated quantum many-body systems, in which an effective approximate model of smaller complexity is first constructed by projecting out high-energy degrees of freedom, and the resulting model is then solved by some standard eigenvalue solver. Here we introduce a generalization of this idea in which both steps are performed numerically and which, in contrast to the standard projection technique, converges in principle to the exact eigenvalues. This approach is applicable not only to eigenvalue problems encountered in many-body systems but also to other areas of research that give rise to large-scale eigenvalue problems for matrices which have, roughly speaking, a pronounced dominant diagonal part. We present detailed studies of the approach guided by two many-body models.
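
    A minimal sketch of the basic projection step in the setting the abstract describes, a symmetric matrix with a pronounced dominant diagonal: the lowest-diagonal ("low-energy") basis states span a reduced model, which a standard dense solver then handles. The actual method refines this numerically rather than stopping at one projection; the matrix here is a random placeholder.

      import numpy as np

      rng = np.random.default_rng(2)
      N, k = 1000, 60
      # Symmetric matrix with a pronounced dominant diagonal
      H = np.diag(np.sort(rng.uniform(0.0, 100.0, N))) + 0.5 * rng.normal(0, 1, (N, N))
      H = (H + H.T) / 2

      # Project onto the k lowest-diagonal ("low-energy") basis states
      idx = np.argsort(np.diag(H))[:k]
      H_eff = H[np.ix_(idx, idx)]              # effective model of smaller complexity

      E_proj = np.linalg.eigvalsh(H_eff)[0]    # lowest eigenvalue of projected model
      E_full = np.linalg.eigvalsh(H)[0]        # exact lowest eigenvalue, for comparison
      print(f"projected: {E_proj:.4f}   exact: {E_full:.4f}")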

  5. The balance sheet technique. Volume I. The balance sheet analysis technique for preconstruction review of airports and highways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaBelle, S.J.; Smith, A.E.; Seymour, D.A.

    1977-02-01

    The technique applies equally well to new or existing airports. The importance of accurate accounting of emissions cannot be overstated. The regional oxidant modelling technique used in conjunction with a balance sheet review must be a proportional reduction technique. This type of emission balancing presumes equality of all sources in the analysis region. The technique can be applied successfully in the highway context, either in planning at the system level or in reviewing projects individually. The project-by-project reviews could be used to examine each project in the same way as the airport projects are examined for their impact on regional desired emission levels. The primary limitation of this technique is that it should not be used when simulation models have been used for regional oxidant air quality. In the case of highway projects, the balance sheet technique might appear to be limited; the real limitations are in the transportation planning process. That planning process is not well suited to the needs of air quality forecasting. If the transportation forecasting techniques are insensitive to changes in the variables that affect HC emissions, then no internal emission trade-offs can be identified, and the initial highway emission forecasts are themselves suspect. In general, the balance sheet technique is limited by the quality of the data used in the review. Additionally, the technique does not point out effective trade-off strategies, nor does it indicate when it might be worthwhile to ignore small amounts of excess emissions. Used in the context of regional air quality plans based on proportional reduction models, the balance sheet analysis technique shows promise as a useful method for state or regional reviewing agencies.

  6. Improvements in approaches to forecasting and evaluation techniques

    NASA Astrophysics Data System (ADS)

    Weatherhead, Elizabeth

    2014-05-01

    The US is embarking on an experiment to make significant and sustained improvements in weather forecasting. The effort stems from a series of community conversations that recognized the rapid advancements in observations, modeling and computing techniques in the academic, governmental and private sectors. The new directions and initial efforts will be summarized, including information on possibilities for international collaboration. Most new projects are scheduled to start in the second half of 2014. Advancements include ensemble forecasting with global models and new sharing of computing resources. Newly developed techniques for evaluating weather forecast models will be presented in detail. The approaches use statistical techniques that incorporate pair-wise comparisons of forecasts with observations and account for daily autocorrelation to assess appropriate uncertainty in forecast changes. Some of the new projects allow for international collaboration, particularly on the research components of the projects.
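
    A minimal sketch of the evaluation idea described in this record: pair-wise differences of daily forecast errors, with the uncertainty of the mean difference widened by a lag-1 (AR(1)) autocorrelation correction. The function name and the normal-quantile half-width are illustrative choices, not the presenter's code.

      import numpy as np

      def forecast_improvement(err_new, err_old):
          """Mean pair-wise error difference with an AR(1)-adjusted uncertainty."""
          d = np.asarray(err_old) - np.asarray(err_new)   # > 0 means the new model is better
          n = d.size
          r = np.corrcoef(d[:-1], d[1:])[0, 1]            # lag-1 (daily) autocorrelation
          n_eff = n * (1 - r) / (1 + r)                   # effective sample size
          se = d.std(ddof=1) / np.sqrt(n_eff)
          return d.mean(), 1.96 * se                      # estimate, ~95% half-width

      # usage: gain, half_width = forecast_improvement(daily_rmse_new, daily_rmse_old)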

  7. Early warning and crop condition assessment research

    NASA Technical Reports Server (NTRS)

    Boatwright, G. O.; Whitehead, V. S.

    1986-01-01

    The Early Warning Crop Condition Assessment Project of AgRISTARS was a multiagency and multidisciplinary effort. Its mission and objectives were centered around development and testing of remote-sensing techniques that enhance operational methodologies for global crop-condition assessments. The project developed crop stress indicator models that provide data filter and alert capabilities for monitoring global agricultural conditions. The project developed a technique for using NOAA-n satellite Advanced Very High Resolution Radiometer (AVHRR) data for operational crop-condition assessments. This technology was transferred to the Foreign Agricultural Service of the USDA. The project developed a U.S. Great Plains data base that contains various meteorological parameters and vegetative index numbers (VIN) derived from AVHRR satellite data. It developed cloud screening techniques and scan angle correction models for AVHRR data. It also developed technology for using remotely acquired thermal data for crop water stress indicator modeling. The project provided basic technology including spectral characteristics of soils, water, stressed and nonstressed crop and range vegetation, solar zenith angle, and atmospheric and canopy structure effects.

  8. Simulation of Neural Firing Dynamics: A Student Project.

    ERIC Educational Resources Information Center

    Kletsky, E. J.

    This paper describes a student project in digital simulation techniques that is part of a graduate systems analysis course entitled Biosimulation. The students chose different simulation techniques to solve a problem related to the neuron model. (MLH)

  9. A Comprehensive Careers Cluster Curriculum Model. Health Occupations Cluster Curriculum Project and Health-Care Aide Curriculum Project.

    ERIC Educational Resources Information Center

    Bortz, Richard F.

    To prepare learning materials for health careers programs at the secondary level, the developmental phase of two curriculum projects--the Health Occupations Cluster Curriculum Project and Health-Care Aide Curriculum Project--utilized a model which incorporated a key factor analysis technique. Entitled "A Comprehensive Careers Cluster Curriculum…

  10. Cacao Intensification in Sulawesi: A Green Prosperity Model Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moriarty, K.; Elchinger, M.; Hill, G.

    2014-09-01

    NREL conducted eight model projects for the Millennium Challenge Corporation's (MCC) Compact with Indonesia. Green Prosperity, the largest project of the Compact, seeks to address critical constraints to economic growth while supporting the Government of Indonesia's commitment to a more sustainable, less carbon-intensive future. This study evaluates techniques to improve cacao farming in Sulawesi, Indonesia, with an emphasis on Farmer Field Schools and Cocoa Development Centers to educate farmers and support train-the-trainer programs. The study estimates the economic viability of cacao farming if smallholders implement techniques to increase yield, as well as the social and environmental impacts of the project.

  11. World Energy Projection System Plus Model Documentation: Coal Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Coal Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  12. World Energy Projection System Plus Model Documentation: Transportation Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) International Transportation Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  13. World Energy Projection System Plus Model Documentation: Residential Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Residential Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  14. World Energy Projection System Plus Model Documentation: Refinery Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Refinery Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  15. World Energy Projection System Plus Model Documentation: Main Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Main Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  16. World Energy Projection System Plus Model Documentation: Electricity Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Electricity Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  17. A Three-Component Model for Magnetization Transfer. Solution by Projection-Operator Technique, and Application to Cartilage

    NASA Astrophysics Data System (ADS)

    Adler, Ronald S.; Swanson, Scott D.; Yeung, Hong N.

    1996-01-01

    A projection-operator technique is applied to a general three-component model for magnetization transfer, extending our previous two-component model [R. S. Adler and H. N. Yeung, J. Magn. Reson. A 104, 321 (1993), and H. N. Yeung, R. S. Adler, and S. D. Swanson, J. Magn. Reson. A 106, 37 (1994)]. The PO technique provides an elegant means of deriving a simple, effective rate equation in which there is a natural separation of relaxation and source terms, and allows incorporation of Redfield-Provotorov theory without any additional assumptions or restrictive conditions. The PO technique is extended to incorporate more general, multicomponent models. The three-component model is used to fit experimental data from samples of human hyaline cartilage and fibrocartilage. The fits of the three-component model are compared to the fits of the two-component model.

  18. Climatic Models Ensemble-based Mid-21st Century Runoff Projections: A Bayesian Framework

    NASA Astrophysics Data System (ADS)

    Achieng, K. O.; Zhu, J.

    2017-12-01

    There are a number of North American Regional Climate Change Assessment Program (NARCCAP) climate models that have been used to project surface runoff in the mid-21st century. Statistical model selection techniques are often used to select the model that best fits data; however, different selection techniques often lead to different conclusions. In this study, ten models are averaged in a Bayesian paradigm to project runoff. Bayesian Model Averaging (BMA) is used to project runoff and to identify the effect of model uncertainty on future runoff projections. Baseflow separation - a two-parameter recursive digital filter, also called the Eckhardt filter - is used to separate USGS streamflow (total runoff) into two components: baseflow and surface runoff. We use this surface runoff as the a priori runoff when conducting BMA of runoff simulated from the ten RCM models. The primary objective of this study is to evaluate how well RCM multi-model ensembles simulate surface runoff in a Bayesian framework. Specifically, we investigate and discuss the following questions: How well does the ten-model RCM ensemble jointly simulate surface runoff by averaging over all the models using BMA, given a priori surface runoff? What are the effects of model uncertainty on surface runoff simulation?
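
    A minimal sketch of the Eckhardt two-parameter recursive filter mentioned in this record; the default recession constant and maximum baseflow index are common literature values, not necessarily those used in the study.

      import numpy as np

      def eckhardt_baseflow(q, alpha=0.98, bfi_max=0.80):
          """Split total streamflow q into baseflow and surface runoff using the
          two-parameter Eckhardt recursive digital filter."""
          q = np.asarray(q, dtype=float)
          b = np.empty_like(q)
          b[0] = bfi_max * q[0]                     # simple initialisation
          for t in range(1, q.size):
              b[t] = ((1 - bfi_max) * alpha * b[t - 1]
                      + (1 - alpha) * bfi_max * q[t]) / (1 - alpha * bfi_max)
              b[t] = min(b[t], q[t])                # baseflow cannot exceed total flow
          return b, q - b                           # baseflow, surface runoff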

  19. Competency model for the project managers of technical projects

    NASA Astrophysics Data System (ADS)

    Duncan, William R.

    1992-05-01

    Traditional job description techniques were developed to support compensation decisions for hourly wage earners in a manufacturing environment. Their resultant focus on activities performed on the job works well in this environment where the ability to perform the activity adequately is objectively verifiable by testing and observation. Although many organizations have adapted these techniques for salaried employees and service environments, the focus on activities performed has never been satisfactory. For example, stating that a project manager 'prepares regular project status reports' tells us little about what to look for in a potential project manager or how to determine if a practicing project manager is ready for additional responsibilities. The concept of a 'competency model' has been developed within the last decade to address this shortcoming. Competency models focus on what skills are needed to perform the tasks defined by the job description. For example, a project manager must be able to communicate well both orally and in writing in order to 'prepare regular project status reports.'

  20. World Energy Projection System Plus Model Documentation: Greenhouse Gases Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Greenhouse Gases Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  1. World Energy Projection System Plus Model Documentation: Natural Gas Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Natural Gas Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  2. World Energy Projection System Plus Model Documentation: District Heat Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) District Heat Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  3. World Energy Projection System Plus Model Documentation: Industrial Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Industrial Model (WIM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  4. Dot Projection Photogrammetric Technique for Shape Measurements of Aerospace Test Articles

    NASA Technical Reports Server (NTRS)

    Jones, Thomas W.; Pappa, Richard S.

    2002-01-01

    Results from initial laboratory investigations with the dot projection photogrammetric technique are presented for three wind-tunnel test articles with a range of surface scattering and reflection properties. These test articles are a semispan model and a micro air vehicle with a latex wing that are both diffusely reflecting, and a highly polished specularly reflecting model used for high Reynolds number testing. Results using both white light and laser illumination are presented. Some of the advantages and limitations of the dot projection technique are discussed. Although a desirable final outcome of this research effort is the characterization of dynamic behavior, only static laboratory results are presented in this preliminary effort.

  5. 3D Modeling Techniques for Print and Digital Media

    NASA Astrophysics Data System (ADS)

    Stephens, Megan Ashley

    In developing my thesis, I sought to gain skills in creating 3D models with ZBrush, in 3D scanning, and in 3D printing. The models created compared the hearts of several vertebrates and were intended for students attending Comparative Vertebrate Anatomy. I used several resources to create a model of the human heart and was able to work from life while creating heart models of other vertebrates. I successfully learned ZBrush and 3D scanning, and successfully printed 3D heart models. ZBrush allowed me to create several intricate models for use in both animation and print media. The 3D scanning technique did not fit my needs for this project, but may be of use for later projects. I was able to 3D print using two different techniques as well.

  6. Assessment of Remote Sensing Technologies for Location of Hydrogen and Helium Leaks

    NASA Technical Reports Server (NTRS)

    Sellar, R. Glenn; Sohn, Yongho; Mathur, Varun; Reardon, Peter

    2001-01-01

    In Phase 1 of this project, a hierarchy of techniques for H2 and He leak location was developed. A total of twelve specific remote sensing techniques were evaluated; the results are summarized. A basic diffusion model was also developed to predict the concentration and distribution of H2 or He resulting from a leak. The objectives of Phase 2 of the project consisted of the following four tasks: Advance Rayleigh Doppler technique from TRL 1 to TRL 2; Plan to advance Rayleigh Doppler technique from TRL 2 to TRL 3; Advance researchers and resources for further advancement; Extend diffusion model.
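
    A minimal sketch of a basic diffusion model of the kind this record mentions, assuming an instantaneous point release spreading by Fickian diffusion in free space; the diffusion coefficient and release size below are illustrative, not from the project.

      import numpy as np

      def point_leak_concentration(q_mol, d_coeff, r, t):
          """Concentration (mol/m^3) at distance r (m) and time t (s) after an
          instantaneous release of q_mol (mol), for diffusivity d_coeff (m^2/s)."""
          return q_mol / (4 * np.pi * d_coeff * t) ** 1.5 * np.exp(-r**2 / (4 * d_coeff * t))

      # e.g. ~1 mmol of helium in air (D ~ 7e-5 m^2/s), 10 s later, 0.5 m away
      print(point_leak_concentration(1e-3, 7e-5, 0.5, 10.0))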

  7. Accuracy and performance of 3D mask models in optical projection lithography

    NASA Astrophysics Data System (ADS)

    Agudelo, Viviana; Evanschitzky, Peter; Erdmann, Andreas; Fühner, Tim; Shao, Feng; Limmer, Steffen; Fey, Dietmar

    2011-04-01

    Different mask models have been compared: rigorous electromagnetic field (EMF) modeling, rigorous EMF modeling with decomposition techniques and the thin mask approach (Kirchhoff approach) to simulate optical diffraction from different mask patterns in projection systems for lithography. In addition, each rigorous model was tested for two different formulations for partially coherent imaging: The Hopkins assumption and rigorous simulation of mask diffraction orders for multiple illumination angles. The aim of this work is to closely approximate results of the rigorous EMF method by the thin mask model enhanced with pupil filtering techniques. The validity of this approach for different feature sizes, shapes and illumination conditions is investigated.

  8. Project Simu-School Component Washington State University

    ERIC Educational Resources Information Center

    Glass, Thomas E.

    1976-01-01

    This component of the project attempts to facilitate planning by furnishing models that manage cumbersome and complex data, supply an objectivity that identifies all relationships between elements of the model, and provide a quantitative model allowing for various forecasting techniques that describe the long-range impact of decisions. (Author/IRT)

  9. Interface projection techniques for fluid-structure interaction modeling with moving-mesh methods

    NASA Astrophysics Data System (ADS)

    Tezduyar, Tayfun E.; Sathe, Sunil; Pausewang, Jason; Schwaab, Matthew; Christopher, Jason; Crabtree, Jason

    2008-12-01

    The stabilized space-time fluid-structure interaction (SSTFSI) technique developed by the Team for Advanced Flow Simulation and Modeling (T★AFSM) was applied to a number of 3D examples, including arterial fluid mechanics and parachute aerodynamics. Here we focus on the interface projection techniques that were developed as supplementary methods targeting the computational challenges associated with the geometric complexities of the fluid-structure interface. Although these supplementary techniques were developed in conjunction with the SSTFSI method and in the context of air-fabric interactions, they can also be used in conjunction with other moving-mesh methods, such as the Arbitrary Lagrangian-Eulerian (ALE) method, and in the context of other classes of FSI applications. The supplementary techniques currently consist of using split nodal values for pressure at the edges of the fabric and incompatible meshes at the air-fabric interfaces, the FSI Geometric Smoothing Technique (FSI-GST), and the Homogenized Modeling of Geometric Porosity (HMGP). Using split nodal values for pressure at the edges and incompatible meshes at the interfaces stabilizes the structural response at the edges of the membrane used in modeling the fabric. With the FSI-GST, the fluid mechanics mesh is sheltered from the consequences of the geometric complexity of the structure. With the HMGP, we bypass the intractable complexities of the geometric porosity by approximating it with an “equivalent”, locally-varying fabric porosity. As test cases demonstrating how the interface projection techniques work, we compute the air-fabric interactions of windsocks, sails and ringsail parachutes.

  10. Program and Project Management Framework

    NASA Technical Reports Server (NTRS)

    Butler, Cassandra D.

    2002-01-01

    The primary objective of this project was to develop a framework and system architecture for integrating program and project management tools that may be applied consistently throughout Kennedy Space Center (KSC) to optimize planning, cost estimating, risk management, and project control. Project management methodology used in building interactive systems to accommodate the needs of the project managers is applied as a key component in assessing the usefulness and applicability of the framework and tools developed. Research for the project included investigation and analysis of industrial practices, KSC standards, policies, and techniques, Systems Management Office (SMO) personnel, and other documented experiences of project management experts. In addition, this project documents best practices derived from the literature as well as new or developing project management models, practices, and techniques.

  11. Optimization of corrosion control for lead in drinking water using computational modeling techniques

    EPA Science Inventory

    Computational modeling techniques have been used to very good effect in the UK in the optimization of corrosion control for lead in drinking water. A “proof-of-concept” project with three US/CA case studies sought to demonstrate that such techniques could work equally well in the...

  12. Spatial regression methods capture prediction uncertainty in species distribution model projections through time

    Treesearch

    Alan K. Swanson; Solomon Z. Dobrowski; Andrew O. Finley; James H. Thorne; Michael K. Schwartz

    2013-01-01

    The uncertainty associated with species distribution model (SDM) projections is poorly characterized, despite its potential value to decision makers. Error estimates from most modelling techniques have been shown to be biased due to their failure to account for spatial autocorrelation (SAC) of residual error. Generalized linear mixed models (GLMM) have the ability to...

  13. Novel Plasmonic and Hyperbolic Optical Materials for Control of Quantum Nanoemitters

    DTIC Science & Technology

    2016-12-08

    properties, metal ion implantation techniques, and multi-physics modeling to produce hyperbolic quantum nanoemitters. During the course of this project we studied plasmonic

  14. Using Technology to Facilitate and Enhance Project-based Learning in Mathematical Physics

    NASA Astrophysics Data System (ADS)

    Duda, Gintaras

    2011-04-01

    Problem-based and project-based learning are two pedagogical techniques that have several clear advantages over traditional instructional methods: 1) both techniques are active and student centered, 2) students confront real-world and/or highly complex problems, and 3) such exercises model the way science and engineering are done professionally. This talk will present an experiment in project/problem-based learning in a mathematical physics course. The group project in the course involved modeling a zombie outbreak of the type seen in AMC's "The Walking Dead." Students researched, devised, and solved their mathematical models for the spread of zombie-like infection. Students used technology in all stages; in fact, since analytical solutions to the models were often impossible, technology was a necessary and critical component of the challenge. This talk will explore the use of technology in general in problem and project-based learning and will detail some specific examples of how technology was used to enhance student learning in this course. A larger issue of how students use the Internet to learn will also be explored.
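
    A minimal sketch of one plausible form of the students' model: a susceptible-zombie-removed (SZR) system of ordinary differential equations solved numerically, since analytical solutions are generally unavailable. The equations and parameter values are illustrative assumptions, not those from the course.

      import numpy as np
      from scipy.integrate import solve_ivp

      beta, alpha, zeta = 0.010, 0.008, 0.02   # transmission, destruction, resurrection

      def szr(t, y):
          s, z, r = y
          return [-beta * s * z,                            # susceptibles bitten
                  beta * s * z + zeta * r - alpha * s * z,  # zombies gained and destroyed
                  alpha * s * z - zeta * r]                 # removed (may rise again)

      sol = solve_ivp(szr, (0, 30), [500.0, 1.0, 0.0])      # 30-day outbreak
      print(f"survivors after 30 days: {sol.y[0, -1]:.1f}")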

  15. The Python Project: A Unique Model for Extending Research Opportunities to Undergraduate Students

    ERIC Educational Resources Information Center

    Harvey, Pamela A.; Wall, Christopher; Luckey, Stephen W.; Langer, Stephen; Leinwand, Leslie A.

    2014-01-01

    Undergraduate science education curricula are traditionally composed of didactic instruction with a small number of laboratory courses that provide introductory training in research techniques. Research on learning methodologies suggests this model is relatively ineffective, whereas participation in independent research projects promotes enhanced…

  16. Genetic Programming as Alternative for Predicting Development Effort of Individual Software Projects

    PubMed Central

    Chavoya, Arturo; Lopez-Martin, Cuauhtemoc; Andalon-Garcia, Irma R.; Meda-Campaña, M. E.

    2012-01-01

    Statistical and genetic programming techniques have been used to predict the software development effort of large software projects. In this paper, a genetic programming model was used for predicting the effort required in individually developed projects. Accuracy obtained from a genetic programming model was compared against one generated from the application of a statistical regression model. A sample of 219 projects developed by 71 practitioners was used for generating the two models, whereas another sample of 130 projects developed by 38 practitioners was used for validating them. The models used two kinds of lines of code as well as programming language experience as independent variables. Accuracy results from the model obtained with genetic programming suggest that it could be used to predict the software development effort of individual projects when these projects have been developed in a disciplined manner within a development-controlled environment. PMID:23226305
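
    A minimal sketch of the statistical-regression side of such a comparison: a linear effort model on two kinds of lines of code plus language experience, scored with the mean magnitude of relative error (MMRE). All data and coefficients are synthetic placeholders that only approximately mirror the study's variables.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 219                                   # size of a training sample
      new_loc = rng.uniform(50, 800, n)         # new lines of code
      mod_loc = rng.uniform(10, 300, n)         # modified lines of code
      exper = rng.integers(1, 10, n)            # language experience (years)
      effort = 30 + 0.2 * new_loc + 0.3 * mod_loc + 2.0 * exper + rng.normal(0, 10, n)

      X = np.column_stack([new_loc, mod_loc, exper, np.ones(n)])
      beta, *_ = np.linalg.lstsq(X, effort, rcond=None)   # least-squares fit

      mmre = np.mean(np.abs(effort - X @ beta) / effort)  # mean magnitude of relative error
      print(f"MMRE of the regression model: {mmre:.3f}")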

  17. Planning level assessment of greenhouse gas emissions for alternative transportation construction projects : carbon footprint estimator, phase II, volume I - GASCAP model.

    DOT National Transportation Integrated Search

    2014-03-01

    The GASCAP model was developed to provide a software tool for analysis of the life-cycle GHG emissions associated with the construction and maintenance of transportation projects. This phase of development included techniques for estimating emiss...

  18. Stabilizing a Bicycle: A Modeling Project

    ERIC Educational Resources Information Center

    Pennings, Timothy J.; Williams, Blair R.

    2010-01-01

    This article describes a project that takes students through the process of forming a mathematical model of bicycle dynamics. Beginning with basic ideas from Newtonian mechanics (forces and torques), students use techniques from calculus and differential equations to develop the equations of rotational motion for a bicycle-rider system as it tips from…
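
    A minimal sketch of a natural first modeling step for such a project: treating the tipping bicycle-rider system as an inverted pendulum about the line of wheel contact, with I*theta'' = m*g*h*sin(theta) integrated numerically. The equation and values are illustrative assumptions, not taken from the article.

      import numpy as np
      from scipy.integrate import solve_ivp

      m, h, g = 80.0, 1.1, 9.81        # mass (kg), centre-of-mass height (m), gravity
      I = m * h**2                     # point-mass moment of inertia about contact line

      def tip(t, y):
          theta, omega = y             # lean angle (rad) and lean rate
          return [omega, m * g * h * np.sin(theta) / I]

      sol = solve_ivp(tip, (0, 2), [np.radians(1.0), 0.0])   # 1 degree initial lean
      print(f"lean after 2 s: {np.degrees(sol.y[0, -1]):.1f} degrees")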

  19. Underwater 3D Surface Measurement Using Fringe Projection Based Scanning Devices

    PubMed Central

    Bräuer-Burchardt, Christian; Heinze, Matthias; Schmidt, Ingo; Kühmstedt, Peter; Notni, Gunther

    2015-01-01

    In this work we show the principle of optical 3D surface measurements based on the fringe projection technique for underwater applications. The challenges of underwater use of this technique are shown and discussed in comparison with the classical application. We describe an extended camera model which takes refraction effects into account as well as a proposal of an effective, low-effort calibration procedure for underwater optical stereo scanners. This calibration technique combines a classical air calibration based on the pinhole model with ray-based modeling and requires only a few underwater recordings of an object of known length and a planar surface. We demonstrate a new underwater 3D scanning device based on the fringe projection technique. It has a weight of about 10 kg and the maximal water depth for application of the scanner is 40 m. It covers an underwater measurement volume of 250 mm × 200 mm × 120 mm. The surface of the measurement objects is captured with a lateral resolution of 150 μm in a third of a second. Calibration evaluation results are presented and examples of first underwater measurements are given. PMID:26703624

  20. Advanced Reactors-Intermediate Heat Exchanger (IHX) Coupling: Theoretical Modeling and Experimental Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Utgikar, Vivek; Sun, Xiaodong; Christensen, Richard

    2016-12-29

    The overall goal of the research project was to model the behavior of the advanced reactor-intermediate heat exchanger system and to develop advanced control techniques for off-normal conditions. The specific objectives defined for the project were: 1. To develop the steady-state thermal hydraulic design of the intermediate heat exchanger (IHX); 2. To develop mathematical models to describe the advanced nuclear reactor-IHX-chemical process/power generation coupling during normal and off-normal operations, and to simulate the models using multiphysics software; 3. To develop control strategies using genetic algorithm or neural network techniques and couple these techniques with the multiphysics software; 4. To validate the models experimentally. The project objectives were accomplished by defining and executing four different tasks corresponding to these specific objectives. The first task involved selection of IHX candidates and development of steady-state designs for those. The second task involved modeling of the transient and off-normal operation of the reactor-IHX system. The subsequent task dealt with the development of control strategies and involved algorithm development and simulation. The last task involved experimental validation of the thermal hydraulic performance of the two prototype heat exchangers designed and fabricated for the project at steady-state and transient conditions to simulate the coupling of the reactor-IHX-process plant system. The experimental work utilized two test facilities at The Ohio State University (OSU): the existing High-Temperature Helium Test Facility (HTHF) and the newly developed high-temperature molten salt facility.

  1. MERINOVA: Meteorological risks as drivers of environmental innovation in agro-ecosystem management

    NASA Astrophysics Data System (ADS)

    Gobin, Anne; Oger, Robert; Marlier, Catherine; Van De Vijver, Hans; Vandermeulen, Valerie; Van Huylenbroeck, Guido; Zamani, Sepideh; Curnel, Yannick; Mettepenningen, Evi

    2013-04-01

    The BELSPO-funded project 'MERINOVA' deals with risks associated with extreme weather phenomena and with risks of biological origin such as pests and diseases. The major objectives of the proposed project are to characterise extreme meteorological events, assess their impact on Belgian agro-ecosystems, characterise the vulnerability and resilience of those agro-ecosystems to these events, and explore innovative adaptation options for agricultural risk management. The project comprises five major parts that reflect the chain of risks: (i) Hazard: assessing the likely frequency and magnitude of extreme meteorological events by means of probability density functions; (ii) Impact: analysing the potential bio-physical and socio-economic impact of extreme weather events on agro-ecosystems in Belgium using process-based modelling techniques commensurate with the regional scale; (iii) Vulnerability: identifying the most vulnerable agro-ecosystems using fuzzy multi-criteria and spatial analysis; (iv) Risk Management: uncovering innovative risk management and adaptation options using actor-network theory and fuzzy cognitive mapping techniques; and (v) Communication: communicating to research, policy and practitioner communities using web-based techniques. The different tasks of the MERINOVA project require expertise in several scientific disciplines: meteorology, statistics, spatial database management, agronomy, bio-physical impact modelling, socio-economic modelling, actor-network theory, and fuzzy cognitive mapping techniques. This expertise is shared by the four scientific partners, who each lead one work package. The MERINOVA project will concentrate on promoting a robust and flexible framework by demonstrating its performance across Belgian agro-ecosystems and by ensuring its relevance to policy makers and practitioners. Impacts developed from physically based models will not only provide information on the state of the damage at any given time, but will also assist in understanding the links between different factors causing damage and in determining bio-physical vulnerability. Socio-economic impacts will enlarge the basis for vulnerability mapping, risk management and adaptation options. A strong expert and end-user network will be established to help disseminate and exploit project results to meet user needs.

  2. Projective Identification in Common Couple Dances.

    ERIC Educational Resources Information Center

    Middelberg, Carol V.

    2001-01-01

    Integrates the object relations concept of projective identification and the systemic concept of marital dances to develop a more powerful model for working with more difficult and distressed couples. Suggests how object relations techniques can be used to interrupt projective identifications and resolve conflict on intrapsychic level so the…

  3. Using Spreadsheet Modeling Techniques for Capital Project Review. AIR 1985 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Kaynor, Robert K.

    The value of microcomputer modeling tools and spreadsheets to help college institutional researchers analyze proposed capital projects is discussed, along with strengths and weaknesses of different software packages. Capital budgeting is the analysis that supports decisions about the allocation and commitment of funds to long-term capital…

  4. Modular Bundle Adjustment for Photogrammetric Computations

    NASA Astrophysics Data System (ADS)

    Börlin, N.; Murtiyoso, A.; Grussenmeyer, P.; Menna, F.; Nocerino, E.

    2018-05-01

    In this paper we investigate how the residuals in bundle adjustment can be split into a composition of simple functions. According to the chain rule, the Jacobian (linearisation) of the residual can be formed as a product of the Jacobians of the individual steps. When implemented, this enables a modularisation of the computation of the bundle adjustment residuals and Jacobians in which each component has limited responsibility. This enables simple replacement of components, e.g. to implement different projection or rotation models by exchanging a module. The technique has previously been used to implement bundle adjustment in the open-source package DBAT (Börlin and Grussenmeyer, 2013), based on the Photogrammetric and Computer Vision interpretations of the Brown (1971) lens distortion model. In this paper, we applied the technique to investigate how affine distortions can be used to model the projection of a tilt-shift lens. Two extended distortion models were implemented to test the hypothesis that the ordering of the affine and lens distortion steps can be changed to reduce the size of the residuals of a tilt-shift lens calibration. Results on synthetic data confirm that the ordering of the affine and lens distortion steps matters and is detectable by DBAT. However, when applied to a real camera calibration data set of a tilt-shift lens, no difference between the extended models was seen. This suggests that the tested hypothesis is false and that other effects need to be modelled to better explain the projection. The relatively low implementation effort that was needed to generate the models suggests that the technique can be used to investigate other novel projection models in photogrammetry, including modelling changes in the 3D geometry to better understand the tilt-shift lens.
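
    A minimal sketch of the chain-rule modularisation described in this record: each module returns its value together with its local Jacobian, and the residual Jacobian is the product of the per-step Jacobians, so a rotation or projection model can be swapped without touching the other modules. The function names and the simple pinhole step are illustrative, not DBAT's API.

      import numpy as np

      def world_to_cam(x, r_mat, c):
          """Rigid-transform module: value and Jacobian w.r.t. the 3D point x."""
          return r_mat @ (x - c), r_mat

      def pinhole(x_cam, f):
          """Pinhole-projection module: value and Jacobian w.r.t. x_cam."""
          u = f * x_cam[:2] / x_cam[2]
          j = (f / x_cam[2]) * np.array([[1, 0, -x_cam[0] / x_cam[2]],
                                         [0, 1, -x_cam[1] / x_cam[2]]])
          return u, j

      def residual(x, r_mat, c, f, u_obs):
          x_cam, j1 = world_to_cam(x, r_mat, c)
          u, j2 = pinhole(x_cam, f)
          return u - u_obs, j2 @ j1     # chain rule: product of the step Jacobians

      # one image observation: residual and its Jacobian w.r.t. the 3D point
      r, j = residual(np.array([1.0, 2.0, 10.0]), np.eye(3),
                      np.zeros(3), 1000.0, np.array([96.0, 201.0]))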

  5. Regional climate models downscaling in the Alpine area with Multimodel SuperEnsemble

    NASA Astrophysics Data System (ADS)

    Cane, D.; Barbarino, S.; Renier, L. A.; Ronchi, C.

    2012-08-01

    Climate scenarios show a strong warming signal in the Alpine area already by the mid-21st century. Climate simulations, however, even when obtained with Regional Climate Models (RCMs), are affected by strong errors when compared with observations, owing to the models' difficulties in representing the complex orography of the Alps and to limitations in their physical parametrizations. The aim of this work is therefore to reduce these model biases using a specific post-processing statistical technique and so obtain more suitable projections of climate change scenarios in the Alpine area. For our purposes we use a selection of RCM runs from the ENSEMBLES project, carefully chosen to maximise the variety of leading Global Climate Models and of the RCMs themselves, calculated for the SRES A1B scenario. The reference observations for the Greater Alpine Area are extracted from the European dataset E-OBS, produced by the ENSEMBLES project with an available resolution of 25 km. For the study area of Piedmont, daily temperature and precipitation observations (1957-present) were carefully gridded on a 14-km grid over the Piedmont Region with an Optimal Interpolation technique. We applied the Multimodel SuperEnsemble technique to the temperature fields, reducing the large biases of the RCM temperature fields relative to observations in the control period. We also propose the first application to RCMs of a brand-new probabilistic Multimodel SuperEnsemble Dressing technique for estimating precipitation fields, already applied successfully to weather forecast models, with careful description of precipitation probability density functions conditioned on the model outputs. This technique reduces the strong overestimation of precipitation by RCMs over the Alpine chain and reproduces well the monthly behaviour of precipitation in the control period.
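
    A minimal sketch of a classical (Krishnamurti-style) Multimodel SuperEnsemble combination at one grid point, the simplest form of the approach applied here to temperature fields: member anomalies over a training (control) period are regressed onto observed anomalies, and the fitted weights are then applied in the projection period. Shapes and names are illustrative.

      import numpy as np

      def superensemble(train_f, train_o, test_f):
          """train_f/test_f: (n_models, n_time) member fields; train_o: observations."""
          fbar = train_f.mean(axis=1, keepdims=True)     # member climatologies
          obar = train_o.mean()                          # observed climatology
          a, *_ = np.linalg.lstsq((train_f - fbar).T, train_o - obar, rcond=None)
          return obar + a @ (test_f - fbar)              # bias-corrected combination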

  6. Optimisation of Critical Infrastructure Protection: The SiVe Project on Airport Security

    NASA Astrophysics Data System (ADS)

    Breiing, Marcus; Cole, Mara; D'Avanzo, John; Geiger, Gebhard; Goldner, Sascha; Kuhlmann, Andreas; Lorenz, Claudia; Papproth, Alf; Petzel, Erhard; Schwetje, Oliver

    This paper outlines the scientific goals, ongoing work and first results of the SiVe research project on critical infrastructure security. The methodology is generic, while the pilot studies are chosen from airport security. The outline proceeds in three major steps: (1) building a threat scenario, (2) developing simulation models as scenario refinements, and (3) assessing alternatives. Advanced techniques of systems analysis and simulation are employed to model relevant airport structures and processes as well as offences. Computer experiments are carried out to compare and optimise alternative solutions. The optimality analyses draw on approaches to quantitative risk assessment recently developed in the operational sciences. To exploit the advantages of the various techniques, an integrated simulation workbench is built up in the project.

  7. Results of the Greenland ice sheet model initialisation experiments: ISMIP6 - initMIP-Greenland

    NASA Astrophysics Data System (ADS)

    Goelzer, Heiko; Nowicki, Sophie; Edwards, Tamsin; Beckley, Matthew

    2017-04-01

    Ice sheet model initialisation has a large effect on projected future sea-level contributions and gives rise to important uncertainties. The goal of this intercomparison exercise for the continental-scale Greenland ice sheet is therefore to compare, evaluate and improve the initialisation techniques used in the ice sheet modelling community. The initMIP-Greenland project is the first in a series of ice sheet model intercomparison activities within ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6). The experimental set-up has been designed to allow comparison of the initial present-day state of the Greenland ice sheet between participating models and against observations. Furthermore, the initial states are tested with two schematic forward experiments to evaluate the initialisation in terms of model drift (forward run without any forcing) and response to a large perturbation (prescribed surface mass balance anomaly). We present and discuss results that highlight the wide diversity of data sets, boundary conditions and initialisation techniques used in the community to generate initial states of the Greenland ice sheet.

  8. The use of cluster analysis techniques in spaceflight project cost risk estimation

    NASA Technical Reports Server (NTRS)

    Fox, G.; Ebbeler, D.; Jorgensen, E.

    2003-01-01

    Project cost risk is the uncertainty in final project cost, contingent on initial budget, requirements and schedule. For a proposed mission, a dynamic simulation model relying for some of its input on a simple risk elicitation is used to identify and quantify systemic cost risk.

  9. A numerical study of different projection-based model reduction techniques applied to computational homogenisation

    NASA Astrophysics Data System (ADS)

    Soldner, Dominic; Brands, Benjamin; Zabihyan, Reza; Steinmann, Paul; Mergheim, Julia

    2017-10-01

    Computing the macroscopic material response of a continuum body commonly involves the formulation of a phenomenological constitutive model. However, the response is mainly influenced by the heterogeneous microstructure. Computational homogenisation can be used to determine the constitutive behaviour on the macro-scale by solving a boundary value problem at the micro-scale for every so-called macroscopic material point within a nested solution scheme. Hence, this procedure requires the repeated solution of similar microscopic boundary value problems. To reduce the computational cost, model order reduction techniques can be applied. An important aspect thereby is the robustness of the obtained reduced model. Within this study, reduced-order modelling (ROM) is applied to the boundary value problem on the micro-scale for the geometrically nonlinear case using hyperelastic materials. This involves the Proper Orthogonal Decomposition (POD) for the primary unknown and hyper-reduction methods for the arising nonlinearity. Three methods for hyper-reduction, differing in how the nonlinearity is approximated and in the subsequent projection, are compared in terms of accuracy and robustness. Introducing interpolation or Gappy-POD based approximations may not preserve the symmetry of the system tangent, rendering the widely used Galerkin projection sub-optimal. Hence, a different projection related to a Gauss-Newton scheme (Gauss-Newton with Approximated Tensors, GNAT) is favoured to obtain an optimal projection and a robust reduced model.
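
    A minimal sketch of the POD-plus-Galerkin-projection step underlying the methods compared in this record, shown for a linear stand-in problem; the paper's hyperelastic nonlinearity and the hyper-reduction of the nonlinear terms are omitted, and all data are placeholders.

      import numpy as np

      rng = np.random.default_rng(3)
      snapshots = rng.normal(size=(2000, 40))       # micro-scale dofs x snapshots
      phi, s, _ = np.linalg.svd(snapshots, full_matrices=False)
      V = phi[:, :10]                               # first 10 POD modes

      K = np.diag(np.linspace(1.0, 5.0, 2000))      # placeholder "stiffness" matrix
      f = rng.normal(size=2000)
      K_r, f_r = V.T @ K @ V, V.T @ f               # Galerkin projection
      u = V @ np.linalg.solve(K_r, f_r)             # reduced-order solution, full space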

  10. Climate downscaling effects on predictive ecological models: a case study for threatened and endangered vertebrates in the southeastern United States

    USGS Publications Warehouse

    Bucklin, David N.; Watling, James I.; Speroterra, Carolina; Brandt, Laura A.; Mazzotti, Frank J.; Romañach, Stephanie S.

    2013-01-01

    High-resolution (downscaled) projections of future climate conditions are critical inputs to a wide variety of ecological and socioeconomic models and are created using numerous different approaches. Here, we conduct a sensitivity analysis of spatial predictions from climate envelope models for threatened and endangered vertebrates in the southeastern United States to determine whether two different downscaling approaches (with and without the use of a regional climate model) affect climate envelope model predictions when all other sources of variation are held constant. We found that prediction maps differed spatially between downscaling approaches and that the variation attributable to downscaling technique was comparable to variation between maps generated using different general circulation models (GCMs). Precipitation variables tended to show greater discrepancies between downscaling techniques than temperature variables, and for one GCM, there was evidence that more poorly resolved precipitation variables contributed relatively more to model uncertainty than more well-resolved variables. Our work suggests that ecological modelers requiring high-resolution climate projections should carefully consider the type of downscaling applied to the climate projections prior to their use in predictive ecological modeling. The uncertainty associated with alternative downscaling methods may rival that of other, more widely appreciated sources of variation, such as the general circulation model or emissions scenario with which future climate projections are created.

  11. A technique for estimating 4D-CBCT using prior knowledge and limited-angle projections.

    PubMed

    Zhang, You; Yin, Fang-Fang; Segars, W Paul; Ren, Lei

    2013-12-01

    To develop a technique to estimate onboard 4D-CBCT using prior information and limited-angle projections for potential 4D target verification of lung radiotherapy. Each phase of onboard 4D-CBCT is considered as a deformation from one selected phase (prior volume) of the planning 4D-CT. The deformation field maps (DFMs) are solved using a motion modeling and free-form deformation (MM-FD) technique. In the MM-FD technique, the DFMs are estimated using a motion model which is extracted from planning 4D-CT based on principal component analysis (PCA). The motion model parameters are optimized by matching the digitally reconstructed radiographs of the deformed volumes to the limited-angle onboard projections (data fidelity constraint). Afterward, the estimated DFMs are fine-tuned using an FD model based on data fidelity constraint and deformation energy minimization. The 4D digital extended-cardiac-torso phantom was used to evaluate the MM-FD technique. A lung patient with a 30 mm diameter lesion was simulated with various anatomical and respirational changes from planning 4D-CT to onboard volume, including changes of respiration amplitude, lesion size and lesion average-position, and phase shift between lesion and body respiratory cycle. The lesions were contoured in both the estimated and "ground-truth" onboard 4D-CBCT for comparison. 3D volume percentage-difference (VPD) and center-of-mass shift (COMS) were calculated to evaluate the estimation accuracy of three techniques: MM-FD, MM-only, and FD-only. Different onboard projection acquisition scenarios and projection noise levels were simulated to investigate their effects on the estimation accuracy. For all simulated patient and projection acquisition scenarios, the mean VPD (±S.D.)/COMS (±S.D.) between lesions in prior images and "ground-truth" onboard images were 136.11% (±42.76%)/15.5 mm (±3.9 mm). Using orthogonal-view 15°-each scan angle, the mean VPD/COMS between the lesion in estimated and "ground-truth" onboard images for MM-only, FD-only, and MM-FD techniques were 60.10% (±27.17%)/4.9 mm (±3.0 mm), 96.07% (±31.48%)/12.1 mm (±3.9 mm) and 11.45% (±9.37%)/1.3 mm (±1.3 mm), respectively. For orthogonal-view 30°-each scan angle, the corresponding results were 59.16% (±26.66%)/4.9 mm (±3.0 mm), 75.98% (±27.21%)/9.9 mm (±4.0 mm), and 5.22% (±2.12%)/0.5 mm (±0.4 mm). For single-view scan angles of 3°, 30°, and 60°, the results for the MM-FD technique were 32.77% (±17.87%)/3.2 mm (±2.2 mm), 24.57% (±18.18%)/2.9 mm (±2.0 mm), and 10.48% (±9.50%)/1.1 mm (±1.3 mm), respectively. For projection angular-sampling-intervals of 0.6°, 1.2°, and 2.5° with the orthogonal-view 30°-each scan angle, the MM-FD technique generated similar VPD (maximum deviation 2.91%) and COMS (maximum deviation 0.6 mm), while sparser sampling yielded larger VPD/COMS. With an equal number of projections, the estimation results using a scattered 360° scan angle were slightly better than those using the orthogonal-view 30°-each scan angle. The estimation accuracy of the MM-FD technique declined as the noise level increased. The MM-FD technique substantially improves the estimation accuracy for onboard 4D-CBCT using prior planning 4D-CT and limited-angle projections, compared to the MM-only and FD-only techniques. It can potentially be used for inter/intrafractional 4D-localization verification.
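
    A minimal sketch of the PCA motion model underlying the MM step of this record: deformation field maps (DFMs) from the planning 4D-CT are reduced to a few principal components, and a new DFM is parameterised by a small weight vector. For simplicity the weights here are fitted directly to a target DFM, whereas the paper optimises them against limited-angle onboard projections; sizes and data are placeholders.

      import numpy as np

      rng = np.random.default_rng(4)
      dfms = rng.normal(size=(10, 3 * 32**3))   # 10 phases x flattened 3D vector field
      mean = dfms.mean(axis=0)
      _, _, vt = np.linalg.svd(dfms - mean, full_matrices=False)
      pcs = vt[:3]                              # leading principal components

      target = dfms[0]                          # stand-in for the onboard-phase DFM
      w = pcs @ (target - mean)                 # weights (orthonormal-basis projection)
      dfm_est = mean + w @ pcs                  # motion-model estimate of the DFM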

  13. EM Transition Sum Rules Within the Framework of sdg Proton-Neutron Interacting Boson Model, Nuclear Pair Shell Model and Fermion Dynamical Symmetry Model

    NASA Astrophysics Data System (ADS)

    Zhao, Yumin

    1997-07-01

    By the techniques of the Wick theorem for coupled clusters, non-energy-weighted electromagnetic sum-rule calculations are presented in the sdg proton-neutron interacting boson model, the nuclear pair shell model, and the fermion dynamical symmetry model. This project was supported by the Development Project Foundation of China, the National Natural Science Foundation of China, the Doctoral Education Fund of the National Education Committee, and the Fundamental Research Fund of Southeast University.

  14. Hard Copy to Digital Transfer: 3D Models that Match 2D Maps

    ERIC Educational Resources Information Center

    Kellie, Andrew C.

    2011-01-01

    This research describes technical drawing techniques applied in a project involving the digitization of existing hard-copy subsurface mapping for the preparation of three-dimensional graphic and mathematical models. The intent of this research was to identify work flows that would support the project, ensure the accuracy of the digital data obtained,…

  15. Predicting Flu Season Requirements: An Undergraduate Modeling Project

    ERIC Educational Resources Information Center

    Kramlich, Gary R., II; Braunstein Fierson, Janet L.; Wright, J. Adam

    2010-01-01

    This project was designed to be used in a freshman calculus class whose students had already been introduced to logistic functions and basic data modeling techniques. It need not be limited to such an audience, however; it has also been implemented in a topics in mathematics class for college upperclassmen. Originally intended to be presented in…

  16. World Ocean Circulation Experiment (WOCE) Young Investigator Workshops

    NASA Technical Reports Server (NTRS)

    Austin, Meg

    2004-01-01

    The World Ocean Circulation Experiment (WOCE) Young Investigator Workshops' goals and objectives are: a) to familiarize Young Investigators with WOCE models, datasets, and estimation procedures; b) to offer intensive hands-on exposure to these models and methods; c) to build collaborations among junior scientists and more senior WOCE investigators; and finally, d) to generate ideas and projects leading to fundable WOCE synthesis projects. To achieve these goals and objectives, the Workshop will offer a mixture of tutorial lectures on numerical models and estimation procedures, advanced seminars on current WOCE synthesis activities and related projects, and the opportunity to conduct small projects which put into practice the techniques advanced in the lectures.

  17. Myoglobin structure and function: A multiweek biochemistry laboratory project.

    PubMed

    Silverstein, Todd P; Kirk, Sarah R; Meyer, Scott C; Holman, Karen L McFarlane

    2015-01-01

    We have developed a multiweek laboratory project in which students isolate myoglobin and characterize its structure, function, and redox state. The important laboratory techniques covered in this project include size-exclusion chromatography, electrophoresis, spectrophotometric titration, and FTIR spectroscopy. Regarding protein structure, students work with computer modeling and visualization of myoglobin and its homologues, after which they spectroscopically characterize its thermal denaturation. Students also study protein function (ligand binding equilibrium) and are instructed on topics in data analysis (calibration curves, nonlinear vs. linear regression). This upper division biochemistry laboratory project is a challenging and rewarding one that not only exposes students to a wide variety of important biochemical laboratory techniques but also ties those techniques together to work with a single readily available and easily characterized protein, myoglobin. © 2015 International Union of Biochemistry and Molecular Biology.
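
    Since the abstract highlights ligand-binding equilibrium and nonlinear versus linear regression, here is a minimal sketch fitting a one-site binding curve, Y = [L]/(Kd + [L]), by nonlinear regression; the "true" Kd and the measurements are synthetic and purely illustrative.

      import numpy as np
      from scipy.optimize import curve_fit

      # One-site ligand binding: fractional saturation Y = [L] / (Kd + [L]).
      def binding(L, Kd):
          return L / (Kd + L)

      L = np.array([0.25, 0.5, 1, 2, 4, 8, 16, 32])       # ligand concentrations
      rng = np.random.default_rng(1)
      Y = binding(L, 2.0) + rng.normal(0, 0.02, L.size)   # noisy synthetic data, Kd = 2

      (Kd_fit,), pcov = curve_fit(binding, L, Y, p0=[1.0])
      print(f"fitted Kd = {Kd_fit:.2f} +/- {np.sqrt(pcov[0, 0]):.2f}")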

  18. Estimating 4D CBCT from prior information and extremely limited angle projections using structural PCA and weighted free-form deformation for lung radiotherapy

    PubMed Central

    Harris, Wendy; Zhang, You; Yin, Fang-Fang; Ren, Lei

    2017-01-01

    Purpose To investigate the feasibility of using structural-based principal component analysis (PCA) motion-modeling and weighted free-form deformation to estimate on-board 4D-CBCT using prior information and extremely limited angle projections for potential 4D target verification of lung radiotherapy. Methods A technique for lung 4D-CBCT reconstruction has been previously developed using a deformation field map (DFM)-based strategy. In the previous method, each phase of the 4D-CBCT was generated by deforming a prior CT volume. The DFM was solved by a motion-model extracted by global PCA and free-form deformation (GMM-FD) technique, using a data fidelity constraint and deformation energy minimization. In this study, a new structural-PCA method was developed to build a structural motion-model (SMM) by accounting for potential relative motion pattern changes between different anatomical structures from simulation to treatment. The motion model extracted from planning 4DCT was divided into two structures: tumor and body excluding tumor, and the parameters of both structures were optimized together. Weighted free-form deformation (WFD) was employed afterwards to introduce flexibility in adjusting the weightings of different structures in the data fidelity constraint based on clinical interests. An XCAT (computerized patient model) phantom with a 30 mm diameter lesion was simulated with various anatomical and respiratory changes from planning 4D-CT to onboard volume to evaluate the method. The estimation accuracy was evaluated by the Volume-Percent-Difference (VPD)/Center-of-Mass-Shift (COMS) between lesions in the estimated and “ground-truth” on-board 4D-CBCT. Different onboard projection acquisition scenarios and projection noise levels were simulated to investigate their effects on the estimation accuracy. The method was also evaluated against three lung patients. Results The SMM-WFD method achieved substantially better accuracy than the GMM-FD method for CBCT estimation using extremely small scan angles or projections. Using orthogonal 15° scanning angles, the VPD/COMS were 3.47±2.94% and 0.23±0.22mm for SMM-WFD and 25.23±19.01% and 2.58±2.54mm for GMM-FD among all 8 XCAT scenarios. Compared to GMM-FD, SMM-WFD was more robust against reduction of the scanning angles down to orthogonal 10° with VPD/COMS of 6.21±5.61% and 0.39±0.49mm, and more robust against reduction of projection numbers down to only 8 projections in total for both orthogonal-view 30° and orthogonal-view 15° scan angles. The SMM-WFD method was also more robust than the GMM-FD method against increasing levels of noise in the projection images. Additionally, the SMM-WFD technique provided better tumor estimation for all three lung patients compared to the GMM-FD technique. Conclusion Compared to the GMM-FD technique, the SMM-WFD technique can substantially improve the 4D-CBCT estimation accuracy using extremely small scan angles and low number of projections to provide fast low dose 4D target verification. PMID:28079267
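
    The abstract does not spell out the weighted data-fidelity term; the sketch below shows one plausible form in which projection residuals are weighted by structure (tumor versus remaining body). The mask and weight values are illustrative assumptions, not the published formulation.

      import numpy as np

      # Structure-weighted data fidelity in the spirit of WFD: residuals inside
      # the tumor mask receive a larger weight than the rest of the body, letting
      # the fit favor tumor accuracy. Weights and masks here are invented.
      def weighted_fidelity(drr, onboard, tumor_mask, w_tumor=5.0, w_body=1.0):
          weights = np.where(tumor_mask, w_tumor, w_body)
          return np.sum(weights * (drr - onboard) ** 2)

      drr = np.ones((8, 8))                      # toy simulated projection
      onboard = np.zeros((8, 8))                 # toy measured projection
      mask = np.zeros((8, 8), dtype=bool)
      mask[3:5, 3:5] = True                      # hypothetical tumor region
      print(weighted_fidelity(drr, onboard, mask))   # 4*5 + 60*1 = 80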

  19. The Woodworker's Website: A Project Management Case Study

    ERIC Educational Resources Information Center

    Jance, Marsha

    2014-01-01

    A case study that focuses on building a website for a woodworking business is discussed. Project management and linear programming techniques can be used to determine the time required to complete the website project discussed in the case. This case can be assigned to students in an undergraduate or graduate decision modeling or management science…

  20. Introducing Risk Management Techniques Within Project Based Software Engineering Courses

    NASA Astrophysics Data System (ADS)

    Port, Daniel; Boehm, Barry

    2002-03-01

    In 1996, USC switched its core two-semester software engineering course away from a hypothetical-project, homework-and-exam course based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation). The revised course is a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of pass/fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.

  1. A biomechanical modeling-guided simultaneous motion estimation and image reconstruction technique (SMEIR-Bio) for 4D-CBCT reconstruction

    NASA Astrophysics Data System (ADS)

    Huang, Xiaokun; Zhang, You; Wang, Jing

    2018-02-01

    Reconstructing four-dimensional cone-beam computed tomography (4D-CBCT) images directly from respiratory phase-sorted traditional 3D-CBCT projections can capture target motion trajectory, reduce motion artifacts, and reduce imaging dose and time. However, the limited number of projections in each phase after phase-sorting decreases CBCT image quality under traditional reconstruction techniques. To address this problem, we developed a simultaneous motion estimation and image reconstruction (SMEIR) algorithm, an iterative method that can reconstruct higher quality 4D-CBCT images from limited projections using an inter-phase intensity-driven motion model. However, the accuracy of the intensity-driven motion model is limited in regions with fine details whose image quality is degraded by the insufficient number of projections, which consequently degrades the reconstructed image quality in the corresponding regions. In this study, we developed a new 4D-CBCT reconstruction algorithm by introducing biomechanical modeling into SMEIR (SMEIR-Bio) to boost the accuracy of the motion model in regions with small, fine structures. The biomechanical modeling uses tetrahedral meshes to model organs of interest and solves internal organ motion using tissue elasticity parameters and mesh boundary conditions. This physics-driven approach enhances the accuracy of the solved motion in regions containing fine structures. This study used 11 lung patient cases to evaluate the performance of SMEIR-Bio, making both qualitative and quantitative comparisons among SMEIR-Bio, SMEIR, and the algebraic reconstruction technique with total variation regularization (ART-TV). The reconstruction results suggest that SMEIR-Bio improves the motion model's accuracy in regions containing small, fine details, which consequently enhances the accuracy and quality of the reconstructed 4D-CBCT images.
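
    The abstract gives no equations; a standard small-strain linear elasticity model of the kind typically solved on such tetrahedral organ meshes is shown below. The paper's exact constitutive model and boundary conditions may differ.

      % Linear elastic equilibrium for the displacement field u on the organ
      % mesh, with Lame parameters (lambda, mu) from tissue elasticity and
      % boundary conditions prescribed on the mesh surface (a standard form):
      \nabla \cdot \sigma + \mathbf{f} = \mathbf{0}, \qquad
      \sigma = \lambda \, \mathrm{tr}(\varepsilon)\, \mathbf{I} + 2\mu\, \varepsilon, \qquad
      \varepsilon = \tfrac{1}{2}\left(\nabla \mathbf{u} + (\nabla \mathbf{u})^{\mathsf{T}}\right)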

  2. Modelling climate impact on floods under future emission scenarios using an ensemble of climate model projections

    NASA Astrophysics Data System (ADS)

    Wetterhall, F.; Cloke, H. L.; He, Y.; Freer, J.; Pappenberger, F.

    2012-04-01

    Evidence provided by modelled assessments of climate change impact on flooding is fundamental to water resource and flood risk decision making. Impact models usually rely on climate projections from Global and Regional Climate Models, and there is no doubt that these provide a useful assessment of future climate change. However, cascading ensembles of climate projections into impact models is not straightforward because of the coarse resolution of Global and Regional Climate Models (GCM/RCM) and their deficiencies in modelling high-intensity precipitation events. Thus decisions must be made on how to appropriately pre-process the meteorological variables from GCM/RCMs, such as the selection of downscaling methods and the application of Model Output Statistics (MOS). In this paper a grand ensemble of projections from several GCM/RCMs is used to drive a hydrological model and analyse the resulting future flood projections for the Upper Severn, UK. The impact and implications of applying MOS techniques to precipitation, as well as hydrological model parameter uncertainty, are taken into account. The resultant grand ensemble of future river discharge projections from the RCM/GCM-hydrological model chain is evaluated against a response surface technique combined with a perturbed physics experiment creating a probabilistic ensemble of climate model outputs. The ensemble distribution of results shows that the future risk of flooding in the Upper Severn increases compared to present conditions; however, the study highlights that the uncertainties are large and that strong assumptions were made in using Model Output Statistics to produce the estimates of future discharge. The importance of analysing on a seasonal rather than just an annual basis is highlighted. The inability of the RCMs (and GCMs) to produce realistic precipitation patterns, even under present conditions, is a major caveat of local climate impact studies on flooding, and this should be a focus for future development.
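
    MOS pre-processing can take many forms; one common choice for precipitation is empirical quantile mapping, sketched below on synthetic data. This illustrates the class of technique, not necessarily the exact MOS applied in the study.

      import numpy as np

      # Empirical quantile mapping: correct model precipitation so its CDF in the
      # control period matches the observed CDF, then apply the same mapping to
      # the future projection. All data below are synthetic.
      def quantile_map(model_ctrl, obs_ctrl, model_fut):
          q = np.linspace(0.01, 0.99, 99)
          mq = np.quantile(model_ctrl, q)        # model quantiles (control period)
          oq = np.quantile(obs_ctrl, q)          # observed quantiles (control period)
          return np.interp(model_fut, mq, oq)    # map future values to obs scale

      rng = np.random.default_rng(2)
      obs = rng.gamma(2.0, 3.0, 5000)            # "observed" precipitation
      mod = rng.gamma(2.0, 4.5, 5000)            # biased model, control period
      fut = rng.gamma(2.0, 5.0, 5000)            # biased model, future period
      print("raw future mean:", fut.mean(), "corrected:", quantile_map(mod, obs, fut).mean())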

  3. Calibration of a COTS Integration Cost Model Using Local Project Data

    NASA Technical Reports Server (NTRS)

    Boland, Dillard; Coon, Richard; Byers, Kathryn; Levitt, David

    1997-01-01

    The software measures and estimation techniques appropriate to a Commercial Off the Shelf (COTS) integration project differ from those commonly used for custom software development. Labor and schedule estimation tools that model COTS integration are available. Like all estimation tools, they must be calibrated with the organization's local project data. This paper describes the calibration of a commercial model using data collected by the Flight Dynamics Division (FDD) of the NASA Goddard Space Flight Center (GSFC). The model calibrated is SLIM Release 4.0 from Quantitative Software Management (QSM). By adopting the SLIM reuse model and by treating configuration parameters as lines of code, we were able to establish a consistent calibration for COTS integration projects. The paper summarizes the metrics, the calibration process and results, and the validation of the calibration.
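
    SLIM itself is proprietary, but it is built around Putnam's published software equation, Size = C · K^(1/3) · t_d^(4/3), where K is total effort and t_d the schedule. The sketch below shows the flavor of a local calibration in which configuration parameters are counted as effective lines of code and the productivity parameter C is solved from completed projects; the parameter-to-SLOC rule and all numbers are invented.

      import numpy as np

      # Hypothetical local calibration of the Putnam productivity parameter C
      # from completed COTS integration projects. "Config params as lines of
      # code" is modeled by an assumed SLOC-equivalent per parameter.
      def effective_sloc(custom_sloc, n_config_params, sloc_per_param=1):
          return custom_sloc + n_config_params * sloc_per_param

      # (effective size, effort K in person-years, schedule t_d in years)
      projects = [(12000, 20.0, 1.2), (8000, 11.0, 1.0), (20000, 41.0, 1.5)]
      C = np.mean([s / (K ** (1 / 3) * td ** (4 / 3)) for s, K, td in projects])
      print(f"calibrated productivity parameter C ~ {C:.0f}")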

  4. Regional climate models downscaling in the Alpine area with multimodel superensemble

    NASA Astrophysics Data System (ADS)

    Cane, D.; Barbarino, S.; Renier, L. A.; Ronchi, C.

    2013-05-01

    The climatic scenarios show a strong warming signal in the Alpine area already by the mid-21st century. The climate simulations, however, even when obtained with regional climate models (RCMs), are affected by strong errors when compared with observations, due both to their difficulties in representing the complex orography of the Alps and to limitations in their physical parametrizations. Therefore, the aim of this work is to reduce these model biases by using a specific post-processing statistical technique, in order to obtain more suitable projections of climate change scenarios in the Alpine area. For our purposes we used a selection of regional climate model (RCM) runs developed in the framework of the ENSEMBLES project. They were carefully chosen with the aim of maximising the variety of leading global climate models and of the RCMs themselves, calculated under the SRES A1B scenario. The reference observations for the greater Alpine area were extracted from the European dataset E-OBS (produced by the ENSEMBLES project), which has an available resolution of 25 km. For the study area of Piedmont, daily temperature and precipitation observations (covering the period from 1957 to the present) were carefully gridded on a 14 km grid over the Piedmont region through the use of an optimal interpolation technique. Hence, we applied the multimodel superensemble technique to temperature fields, reducing the high biases of the RCM temperature fields compared to observations in the control period. We also propose the application of a brand-new probabilistic multimodel superensemble dressing technique, already applied successfully to weather forecast models, to RCMs: the aim is to estimate precipitation fields, with a careful description of precipitation probability density functions conditioned on the model outputs. This technique reduced the strong precipitation overestimation by the RCMs over the Alpine chain and reproduced well the monthly behaviour of precipitation in the control period.
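
    A minimal sketch of the superensemble step (in the spirit of Krishnamurti and co-workers): observed anomalies are regressed on the member-model anomalies over a training period, and the members are then combined with the fitted weights. The data here are synthetic, and the probabilistic dressing extension is not shown.

      import numpy as np

      # Multimodel superensemble: least-squares weights on model anomalies.
      rng = np.random.default_rng(3)
      t, m = 500, 5                              # training times, member models
      truth = rng.normal(size=t)                 # synthetic observations
      models = truth[:, None] + rng.normal(0.5, 1.0, (t, m))   # biased members

      A = models - models.mean(axis=0)           # model anomaly matrix
      b = truth - truth.mean()                   # observed anomalies
      w, *_ = np.linalg.lstsq(A, b, rcond=None)  # superensemble weights

      forecast = truth.mean() + A @ w
      print("RMSE ensemble mean :", np.sqrt(np.mean((models.mean(axis=1) - truth) ** 2)))
      print("RMSE superensemble :", np.sqrt(np.mean((forecast - truth) ** 2)))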

  5. Three-dimensional displacement measurement by fringe projection and speckle photography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrientos, B.; Garcia-Marquez, J.; Cerca, M.

    2008-04-15

    3D displacement fields are measured by the combination of two optical methods, fringe projection and speckle photography. The use of only one camera to record the necessary information implies that no calibration procedures are necessary, as is the case in techniques based on stereoscopy. The out-of-plane displacement is measured by fringe projection, whereas speckle photography yields the 2D in-plane component. To show the feasibility of the technique, we analyze in detail the morphological spatio-temporal evolution of a model of the Earth's crust subjected to compression forces. The results show that the combination of fringe projection and speckle photography is well suited for this type of study.
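
    The in-plane measurement rests on locating the peak of the cross-correlation between speckle images recorded before and after deformation; a minimal FFT-based sketch with a known integer-pixel shift is given below (the subpixel peak fitting a real system would use is omitted).

      import numpy as np

      # Speckle photography displacement: the cross-correlation peak between the
      # reference and deformed speckle patterns gives the in-plane shift.
      rng = np.random.default_rng(4)
      ref = rng.random((128, 128))                            # speckle pattern
      shifted = np.roll(np.roll(ref, 3, axis=0), -5, axis=1)  # known (3, -5) shift

      xcorr = np.fft.ifft2(np.fft.fft2(ref).conj() * np.fft.fft2(shifted)).real
      dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
      dy = dy - ref.shape[0] if dy > ref.shape[0] // 2 else dy   # unwrap
      dx = dx - ref.shape[1] if dx > ref.shape[1] // 2 else dx
      print("recovered in-plane shift:", (dy, dx))               # -> (3, -5)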

  6. The FORE-SCE model: a practical approach for projecting land cover change using scenario-based modeling

    USGS Publications Warehouse

    Sohl, Terry L.; Sayler, Kristi L.; Drummond, Mark A.; Loveland, Thomas R.

    2007-01-01

    A wide variety of ecological applications require spatially explicit, historic, current, and projected land use and land cover data. The U.S. Land Cover Trends project is analyzing contemporary (1973–2000) land-cover change in the conterminous United States. The newly developed FORE-SCE model used Land Cover Trends data and theoretical, statistical, and deterministic modeling techniques to project future land cover change through 2020 for multiple plausible scenarios. Projected proportions of future land use were initially developed, and then sited on the lands with the highest potential for supporting that land use and land cover using a statistically based stochastic allocation procedure. Three scenarios of 2020 land cover were mapped for the western Great Plains in the US. The model provided realistic, high-resolution, scenario-based land-cover products suitable for multiple applications, including studies of climate and weather variability, carbon dynamics, and regional hydrology.

  7. Regional Climate Change across North America in 2030 Projected from RCP6.0

    NASA Astrophysics Data System (ADS)

    Otte, T.; Nolte, C. G.; Faluvegi, G.; Shindell, D. T.

    2012-12-01

    Projecting climate change scenarios to local scales is important for understanding and mitigating the effects of climate change on society and the environment. Many of the general circulation models (GCMs) that are participating in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) do not fully resolve regional-scale processes and therefore cannot capture local changes in temperature and precipitation extremes. We seek to project the GCM's large-scale climate change signal to the local scale using a regional climate model (RCM) by applying dynamical downscaling techniques. The RCM will be used to better understand the local changes of temperature and precipitation extremes that may result from a changing climate. In this research, downscaling techniques that we developed with historical data are now applied to GCM fields. Results from downscaling NASA/GISS ModelE2 simulations of the IPCC AR5 Representative Concentration Pathway (RCP) scenario 6.0 will be shown. The Weather Research and Forecasting (WRF) model has been used as the RCM to downscale decadal time slices for ca. 2000 and ca. 2030 over North America and illustrate potential changes in regional climate that are projected by ModelE2 and WRF under RCP6.0. The analysis focuses on regional climate fields that most strongly influence the interactions between climate change and air quality. In particular, an analysis of extreme temperature and precipitation events will be presented.

  8. Computational Control of Flexible Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Sharpe, Lonnie, Jr.; Shen, Ji Yao

    1994-01-01

    The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years of the project. The main accomplishments can be summarized as follows. A new version of the PDEMOD code has been completed. A theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modeling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future modified versions of the PDEMOD code.

  9. Comparison of Three Optical Methods for Measuring Model Deformation

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Fleming, G. A.; Hoppe, J. C.

    2000-01-01

    The objective of this paper is to compare the current state-of-the-art of the following three optical techniques under study by NASA for measuring model deformation in wind tunnels: (1) video photogrammetry, (2) projection moire interferometry, and (3) the commercially available Optotrak system. An objective comparison of these three techniques should enable the selection of the best technique for a particular test undertaken at various NASA facilities. As might be expected, no one technique is best for all applications. The techniques are also not necessarily mutually exclusive and in some cases can be complementary to one another.

  10. Rupture Propagation Imaging of Fluid Induced Events at the Basel EGS Project

    NASA Astrophysics Data System (ADS)

    Folesky, Jonas; Kummerow, Jörn; Shapiro, Serge A.

    2014-05-01

    The analysis of rupture properties using rupture propagation imaging techniques is a fast-developing field of research in global seismology. Usually, rupture fronts of large to megathrust earthquakes are the subject of recent studies, e.g. the 2004 Sumatra-Andaman earthquake or the 2011 Tohoku, Japan earthquake. The back projection technique is the most prominent technique in this field. Here the seismograms recorded at an array or at a seismic network are back-shifted to a grid of possible source locations via a special stacking procedure. This can provide information on the energy release and energy distribution of the rupture, which can then be used to find estimates of event properties like location, rupture direction, rupture speed or length. The procedure is fast and direct, and it relies only on a reasonable velocity model. Thus it is a good way to rapidly estimate rupture properties, and it can be used to confirm independently obtained event information. We adopted the back projection technique and put it in a microseismic context. We demonstrated its usage for multiple synthetic ruptures within a reservoir model of microseismic scale in earlier works. Our motivation hereby is the occurrence of relatively large induced seismic events at a number of stimulated geothermal reservoirs or waste disposal sites, having magnitudes ML ≥ 3.4 and yielding rupture lengths of several hundred meters. We use the configuration of the seismic network and reservoir properties of the Basel Geothermal Site to build a synthetic model of a rupture by modeling the wave field of multiple spatio-temporally separated single sources using finite-difference modeling. The focus of this work is the application of the back projection technique and the demonstration of its feasibility for retrieving the rupture properties of real fluid-induced events. We take four microseismic events with magnitudes from ML 3.1 to 3.4 and reconstruct source parameters like location, orientation and length. By comparison with our synthetic results, as well as independent localization and source mechanism studies in this area, we can show that the obtained results are reasonable and that the application of back projection imaging is not only possible for microseismic datasets of the respective quality, but that it provides important additional insights into the rupture process.
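
    The stacking procedure itself is compact; the sketch below back-projects synthetic network waveforms by shifting each trace by the travel time to every trial grid point and picking the grid point with maximal stack energy. Geometry, velocity and waveforms are all invented for illustration.

      import numpy as np

      # Minimal back projection stack: align traces on each trial source location
      # and keep the location whose aligned stack has the largest energy.
      v, fs = 3.0, 100.0                                   # km/s, samples/s
      stations = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], dtype=float)
      src = np.array([4.0, 6.0])                           # "true" source (km)

      t = np.arange(0, 8, 1 / fs)
      def wavelet(t0):                                     # simple arrival pulse
          return np.exp(-((t - t0) * 20.0) ** 2)
      traces = [wavelet(1.0 + np.linalg.norm(s - src) / v) for s in stations]

      grid = [(x, y) for x in np.arange(0, 10.5, 0.5) for y in np.arange(0, 10.5, 0.5)]
      def stack_energy(p):
          shifted = [np.roll(tr, -int(round(np.linalg.norm(s - np.array(p)) / v * fs)))
                     for tr, s in zip(traces, stations)]
          return np.max(np.sum(shifted, axis=0) ** 2)

      print("back-projected source:", max(grid, key=stack_energy))   # ~ (4.0, 6.0)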

  11. Nonlinear models for estimating GSFC travel requirements

    NASA Technical Reports Server (NTRS)

    Buffalano, C.; Hagan, F. J.

    1974-01-01

    A methodology is presented for estimating travel requirements for a particular period of time. Travel models were generated using nonlinear regression analysis techniques on a data base of FY-72 and FY-73 information from 79 GSFC projects. Although the subject matter relates to GSFC activities, the type of analysis used and the manner of selecting the relevant variables would be of interest to other NASA centers, government agencies, private corporations and, in general, any organization with a significant travel budget. Models were developed for each of six types of activity: flight projects (in-house and out-of-house), experiments on non-GSFC projects, international projects, ART/SRT, data analysis, advanced studies, tracking and data, and indirects.
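
    The abstract does not give the fitted model forms; purely as an illustration of nonlinear regression for a travel model, the sketch below fits a hypothetical power-law relation between trips and project budget and staffing, with all data invented.

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical nonlinear travel model: trips = a * budget^b * staff^c.
      def travel_model(x, a, b, c):
          budget, staff = x
          return a * budget ** b * staff ** c

      budget = np.array([1.0, 2.5, 4.0, 6.0, 9.0, 12.0])   # $M, invented
      staff = np.array([5, 12, 15, 22, 30, 41])            # invented
      trips = np.array([14, 36, 48, 70, 98, 133])          # invented observations

      (a, b, c), _ = curve_fit(travel_model, (budget, staff), trips, p0=[10, 0.5, 0.5])
      print(f"trips ~ {a:.1f} * budget^{b:.2f} * staff^{c:.2f}")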

  12. Design and results of the ice sheet model initialisation experiments initMIP-Greenland: an ISMIP6 intercomparison

    NASA Astrophysics Data System (ADS)

    Goelzer, Heiko; Nowicki, Sophie; Edwards, Tamsin; Beckley, Matthew; Abe-Ouchi, Ayako; Aschwanden, Andy; Calov, Reinhard; Gagliardini, Olivier; Gillet-Chaulet, Fabien; Golledge, Nicholas R.; Gregory, Jonathan; Greve, Ralf; Humbert, Angelika; Huybrechts, Philippe; Kennedy, Joseph H.; Larour, Eric; Lipscomb, William H.; Le clec'h, Sébastien; Lee, Victoria; Morlighem, Mathieu; Pattyn, Frank; Payne, Antony J.; Rodehacke, Christian; Rückamp, Martin; Saito, Fuyuki; Schlegel, Nicole; Seroussi, Helene; Shepherd, Andrew; Sun, Sainan; van de Wal, Roderik; Ziemen, Florian A.

    2018-04-01

    Earlier large-scale Greenland ice sheet sea-level projections (e.g. those run during the ice2sea and SeaRISE initiatives) have shown that ice sheet initial conditions have a large effect on the projections and give rise to important uncertainties. The goal of this initMIP-Greenland intercomparison exercise is to compare, evaluate, and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties in modelled mass changes. initMIP-Greenland is the first in a series of ice sheet model intercomparison activities within ISMIP6 (the Ice Sheet Model Intercomparison Project for CMIP6), which is the primary activity within the Coupled Model Intercomparison Project Phase 6 (CMIP6) focusing on the ice sheets. Two experiments for the large-scale Greenland ice sheet have been designed to allow intercomparison between participating models of (1) the initial present-day state of the ice sheet and (2) the response in two idealised forward experiments. The forward experiments serve to evaluate the initialisation in terms of model drift (forward run without additional forcing) and in response to a large perturbation (prescribed surface mass balance anomaly); they should not be interpreted as sea-level projections. We present and discuss results that highlight the diversity of data sets, boundary conditions, and initialisation techniques used in the community to generate initial states of the Greenland ice sheet. We find good agreement across the ensemble for the dynamic response to surface mass balance changes in areas where the simulated ice sheets overlap, but differences arise from the initial size of the ice sheet. The model drift in the control experiment is reduced for models that participated in earlier intercomparison exercises.

  13. Design and results of the ice sheet model initialisation experiments initMIP-Greenland: an ISMIP6 intercomparison

    DOE PAGES

    Goelzer, Heiko; Nowicki, Sophie; Edwards, Tamsin; ...

    2018-04-19

    Earlier large-scale Greenland ice sheet sea-level projections (e.g. those run during the ice2sea and SeaRISE initiatives) have shown that ice sheet initial conditions have a large effect on the projections and give rise to important uncertainties. Here, the goal of this initMIP-Greenland intercomparison exercise is to compare, evaluate, and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties in modelled mass changes. initMIP-Greenland is the first in a series of ice sheet model intercomparison activities within ISMIP6 (the Ice Sheet Model Intercomparison Project for CMIP6), which is the primary activity within the Coupled Model Intercomparison Project Phase 6 (CMIP6) focusing on the ice sheets. Two experiments for the large-scale Greenland ice sheet have been designed to allow intercomparison between participating models of (1) the initial present-day state of the ice sheet and (2) the response in two idealised forward experiments. The forward experiments serve to evaluate the initialisation in terms of model drift (forward run without additional forcing) and in response to a large perturbation (prescribed surface mass balance anomaly); they should not be interpreted as sea-level projections. We present and discuss results that highlight the diversity of data sets, boundary conditions, and initialisation techniques used in the community to generate initial states of the Greenland ice sheet. We find good agreement across the ensemble for the dynamic response to surface mass balance changes in areas where the simulated ice sheets overlap, but differences arise from the initial size of the ice sheet. The model drift in the control experiment is reduced for models that participated in earlier intercomparison exercises.

  14. Design and results of the ice sheet model initialisation experiments initMIP-Greenland: an ISMIP6 intercomparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goelzer, Heiko; Nowicki, Sophie; Edwards, Tamsin

    Earlier large-scale Greenland ice sheet sea-level projections (e.g. those run during the ice2sea and SeaRISE initiatives) have shown that ice sheet initial conditions have a large effect on the projections and give rise to important uncertainties. Here, the goal of this initMIP-Greenland intercomparison exercise is to compare, evaluate, and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties in modelled mass changes. initMIP-Greenland is the first in a series of ice sheet model intercomparison activities within ISMIP6 (the Ice Sheet Model Intercomparison Project for CMIP6), which is the primary activity within the Coupled Model Intercomparison Project Phase 6 (CMIP6) focusing on the ice sheets. Two experiments for the large-scale Greenland ice sheet have been designed to allow intercomparison between participating models of (1) the initial present-day state of the ice sheet and (2) the response in two idealised forward experiments. The forward experiments serve to evaluate the initialisation in terms of model drift (forward run without additional forcing) and in response to a large perturbation (prescribed surface mass balance anomaly); they should not be interpreted as sea-level projections. We present and discuss results that highlight the diversity of data sets, boundary conditions, and initialisation techniques used in the community to generate initial states of the Greenland ice sheet. We find good agreement across the ensemble for the dynamic response to surface mass balance changes in areas where the simulated ice sheets overlap, but differences arise from the initial size of the ice sheet. The model drift in the control experiment is reduced for models that participated in earlier intercomparison exercises.

  15. Modelling Truck Camper Production

    ERIC Educational Resources Information Center

    Kramlich, G. R., II; Kobylski, G.; Ahner, D.

    2008-01-01

    This note describes an interdisciplinary project designed to enhance students' knowledge of the basic techniques taught in a multivariable calculus course. The note discusses the four main requirements of the project and then the solutions for each requirement. Concepts covered include differentials, gradients, Lagrange multipliers, constrained…

  16. Constructive Criticism.

    ERIC Educational Resources Information Center

    Lieberfeld, Lawrence

    1982-01-01

    Many crucial questions need to be answered before a college embarks on a construction project and makes a substantial financial commitment. Computer modeling techniques can be used to make even complex project feasibility analyses. Available from Peat, Marwick, Mitchell & Co., 345 Park Avenue, New York, NY 10154. (MSE)

  17. Multi-GPU Acceleration of Branchless Distance Driven Projection and Backprojection for Clinical Helical CT.

    PubMed

    Mitra, Ayan; Politte, David G; Whiting, Bruce R; Williamson, Jeffrey F; O'Sullivan, Joseph A

    2017-01-01

    Model-based image reconstruction (MBIR) techniques have the potential to generate high quality images from noisy measurements and a small number of projections, which can reduce the x-ray dose to patients. These MBIR techniques rely on projection and backprojection to refine an image estimate. One of the widely used projectors for these modern MBIR-based techniques is the branchless distance-driven (DD) projector and backprojector. While this method produces superior quality images, the computational cost of iterative updates keeps it from being ubiquitous in clinical applications. In this paper, we provide several new parallelization ideas for concurrent execution of the DD projectors in multi-GPU systems using CUDA programming tools. We have introduced some novel schemes for dividing the projection data and image voxels over multiple GPUs to avoid runtime overhead and inter-device synchronization issues. We have also reduced the complexity of the overlap calculation of the algorithm by eliminating the common projection plane and directly projecting the detector boundaries onto image voxel boundaries. To reduce the time required for calculating the overlap between the detector edges and image voxel boundaries, we have proposed a pre-accumulation technique to accumulate image intensities in perpendicular 2D image slabs (from a 3D image) before projection and after backprojection to ensure our DD kernels run faster in parallel GPU threads. For the implementation of our iterative MBIR technique, we use a parallel multi-GPU version of the alternating minimization (AM) algorithm with a penalized likelihood update. The timing results using our proposed reconstruction method with Siemens Sensation 16 patient scan data show an average speedup of 24 times using a single TITAN X GPU and 74 times using 3 TITAN X GPUs in parallel for combined projection and backprojection.
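
    The core of the distance-driven method is the overlap calculation between projected voxel boundaries and detector-cell boundaries; the 1D sketch below illustrates that kernel. The clinical implementation is 3D, branchless, and GPU-parallelized, none of which is reproduced here.

      import numpy as np

      # 1D distance-driven projection: voxel and detector-cell boundaries are
      # mapped to a common axis, and each detector cell accumulates voxel values
      # weighted by the boundary overlap length. Purely illustrative.
      def dd_project_1d(voxel_vals, voxel_edges, det_edges):
          out = np.zeros(len(det_edges) - 1)
          for i in range(len(voxel_vals)):
              for j in range(len(out)):
                  lo = max(voxel_edges[i], det_edges[j])
                  hi = min(voxel_edges[i + 1], det_edges[j + 1])
                  if hi > lo:                    # overlap on the common axis
                      out[j] += voxel_vals[i] * (hi - lo)
          return out

      vals = np.array([1.0, 2.0, 3.0, 4.0])
      print(dd_project_1d(vals, np.arange(5.0), np.array([0.0, 1.5, 3.0, 4.0])))
      # -> [2. 4. 4.]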

  18. SU-E-J-26: A Novel Technique for Markerless Self-Sorted 4D-CBCT Using Patient Motion Modeling: A Feasibility Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, L; Zhang, Y; Harris, W

    2015-06-15

    Purpose: To develop an automatic markerless 4D-CBCT projection sorting technique by using a patient respiratory motion model extracted from the planning 4D-CT images. Methods: Each phase of onboard 4D-CBCT is considered as a deformation of one phase of the prior planning 4D-CT. The deformation field map (DFM) is represented as a linear combination of three major deformation patterns extracted from the planning 4D-CT using principal component analysis (PCA). The coefficients of the PCA deformation patterns are solved by matching the digitally reconstructed radiograph (DRR) of the deformed volume to the acquired onboard projection. The PCA coefficients are solved for each single projection and are used for phase sorting. Projections at the peaks of the Z-direction coefficient are sorted as phase 1, and the other projections are assigned into 10 phase bins by dividing phases equally between peaks. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the proposed technique. Three scenarios were simulated, with different tumor motion amplitude (3 cm to 2 cm), tumor spatial shift (8 mm SI), and phase shift between tumor and body respiratory motion (2 phases) from prior to onboard images. Projections were simulated over a 180-degree scan angle for the 4D-XCAT. The percentage of accurately binned projections across the entire dataset was calculated to represent the phase sorting accuracy. Results: With a changed tumor motion amplitude from 3 cm to 2 cm, the markerless phase sorting accuracy was 100%. With a tumor phase shift of 2 phases w.r.t. body motion, the phase sorting accuracy was 100%. With a tumor spatial shift of 8 mm in the SI direction, the phase sorting accuracy was 86.1%. Conclusion: The XCAT phantom simulation results demonstrated that it is feasible to use prior knowledge and a motion modeling technique to achieve markerless 4D-CBCT phase sorting. Funding: National Institutes of Health Grant No. R01-CA184173; Varian Medical Systems.
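
    The phase-sorting step reduces to peak detection on the per-projection Z-direction PCA coefficient followed by equal binning between consecutive peaks; a minimal sketch on a synthetic coefficient trace is below (the peak-spacing parameter is an illustrative assumption).

      import numpy as np
      from scipy.signal import find_peaks

      # Markerless phase sorting: peaks of the Z-direction PCA coefficient mark
      # phase 1; projections between peaks are divided equally into 10 bins.
      rng = np.random.default_rng(5)
      i = np.arange(600)                                   # projection index
      z_coeff = np.sin(2 * np.pi * i / 100) + 0.05 * rng.normal(size=i.size)

      peaks, _ = find_peaks(z_coeff, distance=50)          # breathing-cycle peaks
      phase = np.zeros(i.size, dtype=int)
      for a, b in zip(peaks[:-1], peaks[1:]):
          idx = np.arange(a, b)
          phase[idx] = 10 * (idx - a) // (b - a)           # bins 0..9; 0 = phase 1
      print(np.bincount(phase[peaks[0]:peaks[-1]]))        # ~equal bin occupancy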

  19. Development of Millimeter-Wave Velocimetry and Acoustic Time-of-Flight Tomography for Measurements in Densely Loaded Gas-Solid Riser Flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fort, James A.; Pfund, David M.; Sheen, David M.

    2007-04-01

    The MFDRC was formed in 1998 to advance the state-of-the-art in simulating multiphase turbulent flows by developing advanced computational models for gas-solid flows that are experimentally validated over a wide range of industrially relevant conditions. The goal was to transfer the resulting validated models to interested US commercial CFD software vendors, who would then propagate the models as part of new code versions to their customers in the US chemical industry. Since the lack of detailed data sets at industrially relevant conditions is the major roadblock to developing and validating multiphase turbulence models, a significant component of the work involved flow measurements on an industrial-scale riser contributed by Westinghouse, which was subsequently installed at SNL. Model comparisons were performed against these datasets by LANL. A parallel Office of Industrial Technology (OIT) project within the consortium made similar comparisons between riser measurements and models at NETL. Measured flow quantities of interest included volume fraction, velocity, and velocity-fluctuation profiles for both gas and solid phases at various locations in the riser. Some additional techniques were required for these measurements beyond what was currently available. PNNL's role on the project was to work with the SNL experimental team to develop and test two new measurement techniques, acoustic tomography and millimeter-wave velocimetry. Acoustic tomography is a promising technique for gas-solid flow measurements in risers and PNNL has substantial related experience in this area. PNNL is also active in developing millimeter wave imaging techniques, and this technology presents an additional approach to make desired measurements. PNNL supported the advanced diagnostics development part of this project by evaluating these techniques and then by adapting and developing the selected technology to bulk gas-solids flows and by implementing them for testing in the SNL riser testbed.

  20. Project Super Heart--Year One.

    ERIC Educational Resources Information Center

    Bellardini, Harry; And Others

    1980-01-01

    A model cardiovascular disease prevention program for young children is described. Components include physical examinations, health education (anatomy and physiology of the cardiovascular system), nutrition instruction, first aid techniques, role modeling, and environmental engineering. (JN)

  1. Reduced order modeling of fluid/structure interaction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Matthew Franklin; Kalashnikova, Irina; Segalman, Daniel Joseph

    2009-11-01

    This report describes work performed from October 2007 through September 2009 under the Sandia Laboratory Directed Research and Development project titled 'Reduced Order Modeling of Fluid/Structure Interaction.' This project addresses fundamental aspects of techniques for construction of predictive Reduced Order Models (ROMs). A ROM is defined as a model, derived from a sequence of high-fidelity simulations, that preserves the essential physics and predictive capability of the original simulations but at a much lower computational cost. Techniques are developed for construction of provably stable linear Galerkin projection ROMs for compressible fluid flow, including a method for enforcing boundary conditions that preserves numerical stability. A convergence proof and error estimates are given for this class of ROM, and the method is demonstrated on a series of model problems. A reduced order method, based on the method of quadratic components, for solving the von Karman nonlinear plate equations is developed and tested. This method is applied to the problem of nonlinear limit cycle oscillations encountered when the plate interacts with an adjacent supersonic flow. A stability-preserving method for coupling the linear fluid ROM with the structural dynamics model for the elastic plate is constructed and tested. Methods for constructing efficient ROMs for nonlinear fluid equations are developed and tested on a one-dimensional convection-diffusion-reaction equation. These methods are combined with a symmetrization approach to construct a ROM technique for application to the compressible Navier-Stokes equations.
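
    A minimal sketch of a POD/Galerkin projection ROM for a linear system du/dt = A u: a basis is extracted from full-order snapshots by SVD and the operator is projected onto it. The stand-in operator and trajectory are synthetic, and the report's stability-preserving boundary-condition treatment is not reproduced.

      import numpy as np

      # Build a POD basis from snapshots, then Galerkin-project the operator.
      rng = np.random.default_rng(6)
      n, k, dt = 200, 5, 0.01
      A = -np.eye(n) + 0.01 * rng.normal(size=(n, n))      # stand-in full operator
      u0 = rng.normal(size=n)

      u, snaps = u0.copy(), []                             # full-order trajectory
      for _ in range(300):
          u = u + dt * (A @ u)                             # forward Euler step
          snaps.append(u.copy())

      Phi, _, _ = np.linalg.svd(np.array(snaps).T, full_matrices=False)
      Phi = Phi[:, :k]                                     # k POD modes

      Ar = Phi.T @ A @ Phi                                 # Galerkin-projected operator
      a = Phi.T @ u0                                       # reduced initial condition
      for _ in range(300):
          a = a + dt * (Ar @ a)
      err = np.linalg.norm(Phi @ a - snaps[-1]) / np.linalg.norm(snaps[-1])
      print("relative ROM error at final time:", err)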

  2. Estimating 4D-CBCT from prior information and extremely limited angle projections using structural PCA and weighted free-form deformation for lung radiotherapy.

    PubMed

    Harris, Wendy; Zhang, You; Yin, Fang-Fang; Ren, Lei

    2017-03-01

    To investigate the feasibility of using structural-based principal component analysis (PCA) motion-modeling and weighted free-form deformation to estimate on-board 4D-CBCT using prior information and extremely limited angle projections for potential 4D target verification of lung radiotherapy. A technique for lung 4D-CBCT reconstruction has been previously developed using a deformation field map (DFM)-based strategy. In the previous method, each phase of the 4D-CBCT was generated by deforming a prior CT volume. The DFM was solved by a motion model extracted by a global PCA and free-form deformation (GMM-FD) technique, using a data fidelity constraint and deformation energy minimization. In this study, a new structural PCA method was developed to build a structural motion model (SMM) by accounting for potential relative motion pattern changes between different anatomical structures from simulation to treatment. The motion model extracted from planning 4DCT was divided into two structures: tumor and body excluding tumor, and the parameters of both structures were optimized together. Weighted free-form deformation (WFD) was employed afterwards to introduce flexibility in adjusting the weightings of different structures in the data fidelity constraint based on clinical interests. An XCAT (computerized patient model) phantom with a 30 mm diameter lesion was simulated with various anatomical and respiratory changes from planning 4D-CT to on-board volume to evaluate the method. The estimation accuracy was evaluated by the volume percent difference (VPD)/center-of-mass-shift (COMS) between lesions in the estimated and "ground-truth" on-board 4D-CBCT. Different on-board projection acquisition scenarios and projection noise levels were simulated to investigate their effects on the estimation accuracy. The method was also evaluated against three lung patients. The SMM-WFD method achieved substantially better accuracy than the GMM-FD method for CBCT estimation using extremely small scan angles or projections. Using orthogonal 15° scanning angles, the VPD/COMS were 3.47 ± 2.94% and 0.23 ± 0.22 mm for SMM-WFD and 25.23 ± 19.01% and 2.58 ± 2.54 mm for GMM-FD among all eight XCAT scenarios. Compared to GMM-FD, SMM-WFD was more robust against reduction of the scanning angles down to orthogonal 10° with VPD/COMS of 6.21 ± 5.61% and 0.39 ± 0.49 mm, and more robust against reduction of projection numbers down to only 8 projections in total for both orthogonal-view 30° and orthogonal-view 15° scan angles. The SMM-WFD method was also more robust than the GMM-FD method against increasing levels of noise in the projection images. Additionally, the SMM-WFD technique provided better tumor estimation for all three lung patients compared to the GMM-FD technique. Compared to the GMM-FD technique, the SMM-WFD technique can substantially improve the 4D-CBCT estimation accuracy using extremely small scan angles and low number of projections to provide fast low dose 4D target verification. © 2017 American Association of Physicists in Medicine.

  3. Science to Improve Nutrient Management Practices, Metrics of Benefits, Accountability, and Communication (Project SSWR 4.03)

    EPA Science Inventory

    This project will demonstrate transferable modeling techniques and monitoring approaches to enable water resource professionals to make comparisons among nutrient reduction management scenarios across urban and agricultural areas. It will produce the applied science to allow bett...

  4. Kinematic Measurement of Knee Prosthesis from Single-Plane Projection Images

    NASA Astrophysics Data System (ADS)

    Hirokawa, Shunji; Ariyoshi, Shogo; Takahashi, Kenji; Maruyama, Koichi

    In this paper, the measurement of 3D motion from 2D perspective projections of a knee prosthesis is described. The technique reported by Banks and Hodge was further developed in this study. The estimation was performed in two steps. The first-step estimation was performed under the assumption of orthogonal projection. Then, the second-step estimation was carried out based upon the perspective projection to accomplish a more accurate estimation. The simulation results demonstrated that the technique achieved sufficient position/orientation estimation accuracy for prosthetic kinematics. We then applied our algorithm to CCD images, thereby examining the influences of various artifacts, possibly incorporated through the imaging process, on the estimation accuracies. We found that accuracies in the experiment were influenced mainly by the geometric discrepancies between the prosthesis component and the computer-generated model, and by the spatial inconsistencies between the coordinate axes of the positioner and those of the computer model. However, we verified that our algorithm could achieve proper and consistent estimation even for the CCD images.

  5. Smart climate ensemble exploring approaches: the example of climate impacts on air pollution in Europe.

    NASA Astrophysics Data System (ADS)

    Lemaire, Vincent; Colette, Augustin; Menut, Laurent

    2016-04-01

    Because of its sensitivity to weather patterns, climate change will have an impact on air pollution, so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. However, at present, such impact assessments lack multi-model ensemble approaches to address uncertainties because of the substantial computing cost. Therefore, as a preliminary step towards exploring large climate ensembles with air quality models, we developed an ensemble exploration technique in order to point out the climate models that should be investigated as a priority. Using a training dataset from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for 8 regions in Europe and developed statistical models that can be used to estimate future air pollutant concentrations. Applying these statistical models to the whole EuroCordex ensemble of climate projections, we find a climate penalty for six of the eight subregions, including Eastern Europe, France, the Iberian Peninsula, Mid Europe, and Northern Italy. On the contrary, a climate benefit for PM2.5 was identified for three regions (Eastern Europe, Mid Europe and Northern Italy). However, the uncertainty of this statistical model limits the confidence that can be attributed to the associated quantitative projections. The technique does, however, allow selection of a subset of relevant regional climate model members that should be used in priority for future deterministic projections, to propose an adequate coverage of uncertainties. We thereby propose a smart ensemble exploration strategy that can also be used for other impact studies beyond air quality.

  6. Model Independence in Downscaled Climate Projections: a Case Study in the Southeast United States

    NASA Astrophysics Data System (ADS)

    Gray, G. M. E.; Boyles, R.

    2016-12-01

    Downscaled climate projections are used to deduce how the climate will change in future decades at local and regional scales. It is important to use multiple models to characterize part of the future uncertainty given the impact on adaptation decision making. This is traditionally employed through an equally-weighted ensemble of multiple GCMs downscaled using one technique. Newer practices include several downscaling techniques in an effort to increase the ensemble's representation of future uncertainty. However, this practice may be adding statistically dependent models to the ensemble. Previous research has shown a dependence problem in the GCM ensemble across multiple generations, but this has not been shown in the downscaled ensemble. In this case study, seven downscaled climate projections on the daily time scale are considered: CLAREnCE10, SERAP, BCCA (CMIP5 and CMIP3 versions), Hostetler, CCR, and MACA-LIVNEH. These data represent 83 ensemble members, 44 GCMs, and two generations of GCMs. Baseline periods are compared against the University of Idaho's METDATA gridded observation dataset. Hierarchical agglomerative clustering is applied to the correlated errors to determine dependent clusters. Redundant GCMs across different downscaling techniques show the most dependence, while smaller dependence signals are detected within downscaling datasets and across generations of GCMs. These results indicate that using additional downscaled projections to increase the ensemble size must be done with care to avoid redundant GCMs, and that the process of downscaling may increase the dependence of those downscaled GCMs. Climate model generations do not appear dissimilar enough to be treated as two separate statistical populations for ensemble building at the local and regional scales.
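
    A minimal sketch of the dependence analysis: pairwise correlations between members' error series are converted to distances and fed to hierarchical agglomerative clustering; members that fall into one cluster share errors and signal redundancy. The error series below are synthetic.

      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      # Cluster ensemble members by the correlation of their error time series.
      rng = np.random.default_rng(7)
      shared = rng.normal(size=1000)                       # common error source
      errors = np.array([shared + 0.3 * rng.normal(size=1000) for _ in range(4)]
                        + [rng.normal(size=1000) for _ in range(3)])

      r = np.corrcoef(errors)                              # member-by-member correlation
      dist = 1 - r[np.triu_indices_from(r, k=1)]           # condensed distance vector
      Z = linkage(dist, method="average")
      print(fcluster(Z, t=0.5, criterion="distance"))      # e.g. [1 1 1 1 2 3 4]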

  7. Multi-scale model of the ionosphere from the combination of modern space-geodetic satellite techniques - project status and first results

    NASA Astrophysics Data System (ADS)

    Schmidt, M.; Hugentobler, U.; Jakowski, N.; Dettmering, D.; Liang, W.; Limberger, M.; Wilken, V.; Gerzen, T.; Hoque, M.; Berdermann, J.

    2012-04-01

    Near real-time high resolution and high precision ionosphere models are needed for a large number of applications, e.g. in navigation, positioning, telecommunications or astronautics. Today these ionosphere models are mostly empirical, i.e., based purely on mathematical approaches. In the DFG project 'Multi-scale model of the ionosphere from the combination of modern space-geodetic satellite techniques (MuSIK)' the complex phenomena within the ionosphere are described vertically by combining the Chapman electron density profile with a plasmasphere layer. In order to consider the horizontal and temporal behaviour, the fundamental target parameters of this physics-motivated approach are modelled by series expansions in terms of tensor products of localizing B-spline functions depending on longitude, latitude and time. For testing the procedure, the model will be applied to an appropriate region in South America, which covers relevant ionospheric processes and phenomena such as the Equatorial Anomaly. The project connects the expertise of the three project partners, namely Deutsches Geodätisches Forschungsinstitut (DGFI) Munich, the Institute of Astronomical and Physical Geodesy (IAPG) of the Technical University Munich (TUM) and the German Aerospace Center (DLR), Neustrelitz. In this presentation we focus on the current status of the project. In the first year of the project we studied the behaviour of the ionosphere in the test region, set up appropriate test periods covering high and low solar activity as well as winter and summer, and started the data collection, analysis, pre-processing and archiving. We partly developed the mathematical-physical modelling approach and performed first computations based on simulated input data. Here we present information on the data coverage for the area and the time periods of our investigations, and we outline challenges of the multi-dimensional mathematical-physical modelling approach. We show first results, discuss problems in modelling and possible solution strategies, and finally address open questions.
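
    The series expansion named in the abstract can be written explicitly; in the notation below (the coefficient symbols are our assumption), each target parameter of the Chapman/plasmasphere approach is expanded in tensor products of localizing B-splines in latitude, longitude and time:

      % Tensor-product B-spline expansion of a target parameter p, with
      % unknown series coefficients d_{ijk} estimated from the observations:
      p(\varphi, \lambda, t) \;=\; \sum_{i} \sum_{j} \sum_{k}
        d_{ijk} \, B_{i}(\varphi) \, B_{j}(\lambda) \, B_{k}(t)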

  8. Alignment issues, correlation techniques and their assessment for a visible light imaging-based 3D printer quality control system

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2016-05-01

    Quality control is critical to manufacturing. Frequently, techniques are used to define object conformity bounds, based on historical quality data. This paper considers techniques for bespoke and small batch jobs that are not statistical model based. These techniques also serve jobs where 100% validation is needed due to the mission or safety critical nature of particular parts. One issue with this type of system is alignment discrepancies between the generated model and the physical part. This paper discusses and evaluates techniques for characterizing and correcting alignment issues between the projected and perceived data sets to prevent errors attributable to misalignment.

  9. Final Report on the Multicultural/Diversity Assessment Project.

    ERIC Educational Resources Information Center

    Ambrosio, Anthony L.

    The Emporia State University Multicultural/Diversity Project developed a set of assessment instruments and a model evaluation plan to assess multicultural/diversity (MCD) outcomes in teacher education and general education programs. Assessment instruments and techniques were constructed to evaluate the impact of coursework on student attitudes,…

  10. C-LAMP Subproject Description:Climate Forcing by the Terrestrial Biosphere During the Second Half of the 20th Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Covey, Curt; Hoffman, Forrest

    2008-10-02

    This project will quantify selected components of climate forcing due to changes in the terrestrial biosphere over the period 1948-2004, as simulated by the climate/carbon-cycle models participating in C-LAMP (the Carbon-Land Model Intercomparison Project; see http://www.climatemodeling.org/c-lamp). Unlike other C-LAMP projects that attempt to close the carbon budget, this project will focus on the contributions of individual biomes in terms of the resulting climate forcing. Bala et al. (2007) used a similar (though more comprehensive) model-based technique to assess and compare different components of biospheric climate forcing, but their focus was on potential future deforestation rather than the historical period.

  11. A controlled field pilot for testing near surface CO2 detection techniques and transport models

    USGS Publications Warehouse

    Spangler, L.H.; Dobeck, L.M.; Repasky, K.; Nehrir, A.; Humphries, S.; Keith, C.; Shaw, J.; Rouse, J.; Cunningham, A.; Benson, S.; Oldenburg, C.M.; Lewicki, J.L.; Wells, A.; Diehl, R.; Strazisar, B.; Fessenden, J.; Rahn, Thomas; Amonette, J.; Barr, J.; Pickles, W.; Jacobson, J.; Silver, E.; Male, E.; Rauch, H.; Gullickson, K.; Trautz, R.; Kharaka, Y.; Birkholzer, J.; Wielopolski, L.

    2009-01-01

    A field facility has been developed to allow controlled studies of near surface CO2 transport and detection technologies. The key component of the facility is a shallow, slotted horizontal well divided into six zones. The scale and fluxes were designed to address large scale CO2 storage projects and desired retention rates for those projects. A wide variety of detection techniques were deployed by collaborators from 6 national labs, 2 universities, EPRI, and the USGS. Additionally, modeling of CO2 transport and concentrations in the saturated soil and in the vadose zone was conducted. An overview of these results will be presented. © 2009 Elsevier Ltd. All rights reserved.

  12. CityGML Modelling for Singapore 3D National Mapping

    NASA Astrophysics Data System (ADS)

    Soon, K. H.; Khoo, V. H. S.

    2017-10-01

    Since 2014, the Land Survey Division of the Singapore Land Authority (SLA) has spearheaded a Whole-of-Government (WOG) 3D mapping project to create and maintain a 3D national map for Singapore. The implementation of the project is divided into two phases. The first phase of the project, which was based on airborne data collection, produced 3D models for Relief, Building, Vegetation and Waterbody; this part of the work was completed in 2016. To complement the first phase, the second phase uses mobile imaging and scanning techniques. This phase is targeted for completion by mid-2017 and is creating 3D models for Transportation, CityFurniture, Bridge and Tunnel. The project has extensively adopted the Open Geospatial Consortium (OGC)'s CityGML standard: of the 10 thematic modules currently supported in CityGML 2.0, the project has implemented 8. The paper describes the adoption of CityGML in the project, and discusses challenges, data validation and management of the models.

  13. Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility

    NASA Technical Reports Server (NTRS)

    Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd

    1999-01-01

    We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.

  14. Development of the evaluation instrument use CIPP on the implementation of project assessment topic optik

    NASA Astrophysics Data System (ADS)

    Asfaroh, Jati Aurum; Rosana, Dadan; Supahar

    2017-08-01

    This research aims to develop a valid and reliable CIPP-model evaluation instrument and to determine its feasibility and practicality. The instrument evaluates the implementation of project assessment on the topic of optics, used to measure the problem-solving skills of junior high school class VIII students in the Yogyakarta region. The research follows the 4-D development model. The product was trialled with class VIII students at SMP N 1 Galur and SMP N 1 Sleman. Data were collected with non-test techniques, including interviews, questionnaires and observations. Validity was analyzed using Aiken's V and reliability using the intraclass correlation coefficient (ICC), based on seven raters: two expert lecturers (expert judgment), two practitioners (science teachers) and three colleagues. The result of this research is a CIPP evaluation instrument for evaluating the implementation of the project assessment. Aiken's V values for the instrument range from 0.86 to 1, indicating validity, and the reliability value of 0.836 falls into the good category, so the instrument is fit for use as an evaluation instrument.

  15. Projection model for flame chemiluminescence tomography based on lens imaging

    NASA Astrophysics Data System (ADS)

    Wan, Minggang; Zhuang, Jihui

    2018-04-01

    For flame chemiluminescence tomography (FCT) based on lens imaging, the projection model is essential because it formulates the mathematical relation between the flame projections captured by cameras and the chemiluminescence field, and through this relation the field is reconstructed. This work proposes the blurry-spot (BS) model, which makes more general assumptions and achieves higher accuracy than the widely applied line-of-sight (LoS) model. By combining the geometrical camera model and the thin-lens equation, the BS model takes into account the perspective effect of the camera lens; by combining a ray-tracing technique and Monte Carlo simulation, it also considers the inhomogeneous distribution of captured radiance on the image plane. The performance of the two models in FCT was numerically compared, and the results showed that the BS model can lead to better reconstruction quality over a wider range of applications.
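
    For orientation, the LoS baseline that the BS model improves upon fits in a few lines: each camera pixel integrates the chemiluminescence field along a single ray, and stacking views yields the linear system that tomographic reconstruction inverts. The sketch below (toy field, three invented camera angles) implements only that LoS baseline; the BS model additionally traces Monte Carlo ray bundles through the thin lens, so each pixel sees a blurred spot rather than a line.

    ```python
    import numpy as np
    from scipy.ndimage import rotate

    def los_projection(field, angle_deg):
        """One line-of-sight view: rotate the field to the camera angle and
        integrate along rays (here, image columns)."""
        return rotate(field, angle_deg, reshape=False, order=1).sum(axis=0)

    # Toy 'flame' field: a Gaussian blob on a 64x64 grid; three camera views.
    y, x = np.mgrid[0:64, 0:64]
    field = np.exp(-(((x - 32) / 8.0) ** 2 + ((y - 36) / 10.0) ** 2))
    sinogram = np.stack([los_projection(field, a) for a in (0.0, 45.0, 90.0)])
    print(sinogram.shape)  # (3, 64): one 64-pixel projection per camera
    ```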

  16. Closed-form breakdown voltage/specific on-resistance model using charge superposition technique for vertical power double-diffused metal–oxide–semiconductor device with high-κ insulator

    NASA Astrophysics Data System (ADS)

    Chen, Xue; Wang, Zhi-Gang; Wang, Xi; Kuo, James B.

    2018-04-01

    Abstract not available. Project supported by the National Natural Science Foundation of China (Grant No. 61404110) and the National Higher-Education Institution General Research and Development Project, China (Grant No. 2682014CX097).

  17. Carbon footprint estimator, phase II : volume I - GASCAP model.

    DOT National Transportation Integrated Search

    2014-03-01

    The GASCAP model was developed to provide a software tool for analysis of the life-cycle GHG emissions associated with the construction and maintenance of transportation projects. This phase of development included techniques for estimating emiss...

  18. EXPLORATION OF SIMULATION AS A RETIREMENT EDUCATION TECHNIQUE. FINAL REPORT.

    ERIC Educational Resources Information Center

    BOOCOCK, SARANE SPENCE; SPRAGUE, NORMAN

    A pilot project explored the adaptation of simulation techniques to four retirement problems--financial position, physical environment (housing choices), health, and social environment (planning and gaining skills before retirement). A preliminary model of a game in retirement finance presents players with three investment situations--savings…

  19. Special Section: A Debate on Research Techniques in Economic Education

    ERIC Educational Resources Information Center

    Dawson, George G.; And Others

    1976-01-01

    Dawson introduces three articles which debate merits of research techniques in undergraduate economic education. William E. Becker criticizes John C. Soper's models, multicollinearity argument, and student incentives in a research project; Soper replies; Robert Highsmith critically analyzes strengths and weaknesses of each argument. (AV)

  20. A fluidized bed technique for estimating soil critical shear stress

    USDA-ARS?s Scientific Manuscript database

    Soil erosion models, depending on how they are formulated, always have erodibility parameters in the erosion equations. For a process-based model like the Water Erosion Prediction Project (WEPP) model, the erodibility parameters include rill and interrill erodibility and critical shear stress. Thes...

  1. Personalized models of bones based on radiographic photogrammetry.

    PubMed

    Berthonnaud, E; Hilmi, R; Dimnet, J

    2009-07-01

    Radiographic photogrammetry is applied to locate anatomical landmarks in space from their two projected images. The goal of this paper is to define a personalized geometric model of bones based solely on photogrammetric reconstructions. The personalized bone models are obtained in two successive steps: the functional frameworks are first determined experimentally; then the 3D bone representation results from modeling techniques. Each bone's functional framework is derived from direct measurements on two radiographic images, which may be obtained using either perpendicular (spine and sacrum) or oblique incidences (pelvis and lower limb). Frameworks link together functional axes and punctual landmarks. Each global bone volume is decomposed into several elementary components, and each volumetric component is represented by simple geometric shapes articulated to the patient's bone structure. The volumetric personalization is obtained by best fitting the geometric model projections to their real images, using adjustable articulations. Examples are presented to illustrate the technique of personalizing bone volumes directly from the treatment of only two radiographic images. The chosen data-treatment techniques are then discussed. The 3D representation of bones completes, for clinical users, the information brought by radiographic images.

  2. Development of Powder Processing Models and Techniques for Meso-scale Devices: Perspirable Skin

    DTIC Science & Technology

    2008-03-31

    Report documentation page residue; recoverable details: Grant No. FA9550-05-1-0202; author Patrick Kwon, Michigan State University.

  3. Galerkin v. discrete-optimal projection in nonlinear model reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlberg, Kevin Thomas; Barone, Matthew Franklin; Antil, Harbir

    Discrete-optimal model-reduction techniques such as the Gauss-Newton with Approximated Tensors (GNAT) method have shown promise, as they have generated stable, accurate solutions for large-scale turbulent, compressible flow problems where standard Galerkin techniques have failed. However, there has been limited comparative analysis of the two approaches. This is due in part to difficulties arising from the fact that Galerkin techniques perform projection at the time-continuous level, while discrete-optimal techniques do so at the time-discrete level. This work provides a detailed theoretical and experimental comparison of the two techniques for two common classes of time integrators: linear multistep schemes and Runge-Kutta schemes. We present a number of new findings, including conditions under which the discrete-optimal ROM has a time-continuous representation, conditions under which the two techniques are equivalent, and time-discrete error bounds for the two approaches. Perhaps most surprisingly, we demonstrate both theoretically and experimentally that decreasing the time step does not necessarily decrease the error for the discrete-optimal ROM; instead, the time step should be 'matched' to the spectral content of the reduced basis. In numerical experiments carried out on a turbulent compressible-flow problem with over one million unknowns, we show that increasing the time step to an intermediate value decreases both the error and the simulation time of the discrete-optimal reduced-order model by an order of magnitude.
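
    To make the distinction concrete, here is a toy sketch (an invented low-dimensional system, not the paper's turbulent-flow problem) contrasting the two projections for backward Euler: the Galerkin ROM projects the time-continuous equation and then discretizes, while the discrete-optimal (LSPG/GNAT-style) ROM minimizes the time-discrete residual with Gauss-Newton.

    ```python
    import numpy as np

    # Toy full-order model dx/dt = f(x) with a cubic nonlinearity, plus a
    # random orthonormal reduced basis V (a placeholder, not a POD basis).
    rng = np.random.default_rng(2)
    N, n = 50, 4
    A = -np.eye(N) + 0.01 * rng.normal(size=(N, N))
    f = lambda x: A @ x - 0.1 * x**3
    V = np.linalg.qr(rng.normal(size=(N, n)))[0]

    def galerkin_step(q, dt):
        """Galerkin ROM: project first, then discretize.
        Solves q' = V^T f(V q) with backward Euler via fixed-point iteration."""
        q_new = q.copy()
        for _ in range(50):
            q_new = q + dt * (V.T @ f(V @ q_new))
        return q_new

    def lspg_step(q, dt):
        """Discrete-optimal ROM: discretize first, then minimize the
        time-discrete residual r(q_new) = V q_new - V q - dt f(V q_new)
        over reduced coordinates by Gauss-Newton (finite-difference Jacobian)."""
        q_new = q.copy()
        for _ in range(20):
            r = V @ q_new - V @ q - dt * f(V @ q_new)
            J = np.empty((N, n))
            eps = 1e-6
            for j in range(n):
                dq = np.zeros(n); dq[j] = eps
                J[:, j] = (V @ (q_new + dq) - V @ q - dt * f(V @ (q_new + dq)) - r) / eps
            q_new = q_new - np.linalg.lstsq(J, r, rcond=None)[0]
        return q_new

    q = V.T @ rng.normal(size=N)   # initial reduced state
    print(galerkin_step(q, 0.01), lspg_step(q, 0.01))
    ```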

  4. The US DOE A2e Mesoscale to Microscale Coupling Project: Nonstationary Modeling Techniques and Assessment

    NASA Astrophysics Data System (ADS)

    Haupt, Sue Ellen; Kosovic, Branko; Shaw, William

    2017-04-01

    The purpose of the US DOE's Mesoscale-Microscale Coupling (MMC) Project is to develop, verify, and validate physical models and modeling techniques that bridge the most important atmospheric scales that determine wind plant performance and reliability. As part of DOE's Atmosphere to Electrons (A2e) program, the MMC project seeks to create a new predictive numerical simulation capability that is able to represent the full range of atmospheric flow conditions impacting wind plant performance. The recent focus of MMC has been on nonstationary conditions over flat terrain. These nonstationary cases are critical for wind energy and represent a primary need for mesoscale meteorological forcing of the microscale models. The MMC team modeled two types of nonstationary cases: 1) diurnal cycles, in which the daytime convective boundary layer collapses with the setting of the sun as the surface heat flux changes from positive to negative, passing through a brief period of neutral stability before becoming stable, with smaller-scale turbulence and the potential for low-level jet (LLJ) formation; and 2) frontal passage, as an example of a synoptic weather event that may cause relatively rapid changes in wind speed and direction. The team compared and contrasted two primary techniques for nonstationary forcing of the microscale by the mesoscale model. The first is to use the tendencies from the mesoscale model to directly force the microscale model. The second method is to couple not only the microscale domain's internal forcing parameters, but also its lateral boundaries, to a mesoscale simulation. While the boundary-coupled approach provides the greatest generality, since the mesoscale flow information providing the lateral boundary information for the microscale domain contains no explicit turbulence information, the approach requires methods to accelerate turbulence production at the microscale domain's inflow boundaries. Forefront assessment strategies, including comparing spectra and cospectra, were used to assess the techniques. Methods to initialize turbulence at the microscale were also tested.
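
    The first (tendency-forcing) strategy is easy to caricature in a few lines: the microscale state is advanced with its own dynamics plus a time-interpolated tendency taken from the mesoscale run. Everything in the sketch below (the relaxation dynamics, the tendency values, the time step) is invented for illustration.

    ```python
    import numpy as np

    def step(u, t, dt, micro_rhs, meso_times, meso_tendencies):
        """Advance the microscale state with its own dynamics plus a
        time-interpolated mesoscale tendency (the first MMC coupling strategy)."""
        forcing = np.interp(t, meso_times, meso_tendencies)
        return u + dt * (micro_rhs(u) + forcing)

    micro_rhs = lambda u: -0.1 * (u - 8.0)         # toy relaxation toward 8 m/s
    meso_times = np.array([0.0, 3600.0, 7200.0])   # s
    meso_tend = np.array([0.0, -2e-3, 5e-4])       # m/s^2, e.g. an evening transition
    u, t, dt = 8.0, 0.0, 60.0
    for _ in range(120):                           # two hours of integration
        u = step(u, t, dt, micro_rhs, meso_times, meso_tend)
        t += dt
    print(f"wind speed after 2 h: {u:.2f} m/s")
    ```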

  5. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  6. EXAMPLE APPLICATION OF CFD SIMULATIONS FOR SHORT-RANGE ATMOSPHERIC DISPERSION OVER THE OPEN FIELDS OF PROJECT PRAIRIE GRASS

    EPA Science Inventory

    Computational Fluid Dynamics (CFD) techniques are increasingly being applied to air quality modeling of short-range dispersion, especially the flow and dispersion around buildings and other geometrically complex structures. The proper application and accuracy of such CFD techniqu...

  7. Hybrid Systems Diagnosis

    NASA Technical Reports Server (NTRS)

    McIlraith, Sheila; Biswas, Gautam; Clancy, Dan; Gupta, Vineet

    2005-01-01

    This paper reports on an ongoing project to investigate techniques to diagnose complex dynamical systems that are modeled as hybrid systems. In particular, we examine continuous systems with embedded supervisory controllers that experience abrupt, partial or full failure of component devices. We cast the diagnosis problem as a model selection problem. To reduce the space of potential models under consideration, we exploit techniques from qualitative reasoning to conjecture an initial set of qualitative candidate diagnoses, which induce a smaller set of models. We refine these diagnoses using parameter estimation and model-fitting techniques. As a motivating case study, we have examined the problem of diagnosing NASA's Sprint AERCam, a small spherical robotic camera unit with 12 thrusters that enable both linear and rotational motion.
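
    The diagnosis-as-model-selection idea reduces to a compact recipe: each fault hypothesis induces a candidate model, each candidate's parameters are estimated against the observations, and the best-fitting candidate is reported. The sketch below is a deliberately crude version of that recipe (toy first-order responses and grid-search parameter estimation; the paper's qualitative-reasoning pruning step is omitted).

    ```python
    import numpy as np

    def fit_and_score(model, observed, t, thetas):
        """Grid-search parameter estimation: return the best parameter and
        its mean-squared residual for one candidate model."""
        errs = [np.mean((model(t, th) - observed) ** 2) for th in thetas]
        i = int(np.argmin(errs))
        return thetas[i], errs[i]

    t = np.linspace(0, 5, 200)
    nominal = lambda t, k: np.exp(-k * t)            # healthy response (toy)
    degraded = lambda t, k: 0.5 * np.exp(-k * t)     # partial-failure candidate (toy)
    observed = 0.5 * np.exp(-1.2 * t) + 0.01 * np.random.default_rng(4).normal(size=t.size)

    candidates = {"nominal": nominal, "partial failure": degraded}
    grid = np.linspace(0.5, 2.0, 31)
    best = min(((name, *fit_and_score(m, observed, t, grid)) for name, m in candidates.items()),
               key=lambda r: r[2])
    print(best)   # -> ('partial failure', ~1.2, small residual)
    ```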

  8. HSR Model Deformation Measurements from Subsonic to Supersonic Speeds

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Erickson, G. E.; Goodman, W. L.; Fleming, G. A.

    1999-01-01

    This paper describes the video model deformation technique (VMD) used at five NASA facilities and the projection moire interferometry (PMI) technique used at two NASA facilities. Comparisons between the two techniques for model deformation measurements are provided. Facilities at NASA-Ames and NASA-Langley where deformation measurements have been made are presented. Examples of HSR model deformation measurements from the Langley Unitary Wind Tunnel, Langley 16-foot Transonic Wind Tunnel, and the Ames 12-foot Pressure Tunnel are presented. A study to improve and develop new targeting schemes at the National Transonic Facility is also described. The consideration of milled targets for future HSR models is recommended when deformation measurements are expected to be required. Finally, future development work for VMD and PMI is addressed.

  9. A decision-theoretic approach to the display of information for time-critical decisions: The Vista project

    NASA Technical Reports Server (NTRS)

    Horvitz, Eric; Ruokangas, Corinne; Srinivas, Sampath; Barry, Matthew

    1993-01-01

    We describe a collaborative research and development effort between the Palo Alto Laboratory of the Rockwell Science Center, Rockwell Space Operations Company, and the Propulsion Systems Section of NASA JSC to design computational tools that can manage the complexity of information displayed to human operators in high-stakes, time-critical decision contexts. We shall review an application from NASA Mission Control and describe how we integrated a probabilistic diagnostic model and a time-dependent utility model, with techniques for managing the complexity of computer displays. Then, we shall describe the behavior of VPROP, a system constructed to demonstrate promising display-management techniques. Finally, we shall describe our current research directions on the Vista 2 follow-on project.

  10. Projecting malaria hazard from climate change in eastern Africa using large ensembles to estimate uncertainty.

    PubMed

    Leedale, Joseph; Tompkins, Adrian M; Caminade, Cyril; Jones, Anne E; Nikulin, Grigory; Morse, Andrew P

    2016-03-31

    The effect of climate change on the spatiotemporal dynamics of malaria transmission is studied using an unprecedented ensemble of climate projections, employing three diverse bias correction and downscaling techniques, in order to partially account for uncertainty in climate-driven malaria projections. These large climate ensembles drive two dynamical and spatially explicit epidemiological malaria models to provide future hazard projections for the focus region of eastern Africa. While the two malaria models produce very distinct transmission patterns for the recent climate, their response to future climate change is similar in terms of sign and spatial distribution, with malaria transmission moving to higher altitudes in the East African Community (EAC) region, while transmission reduces in lowland, marginal transmission zones such as South Sudan. The climate model ensemble generally projects warmer and wetter conditions over EAC. The simulated malaria response appears to be driven by temperature rather than precipitation effects. This reduces the uncertainty due to the climate models, as precipitation trends in tropical regions are very diverse, projecting both drier and wetter conditions with the current state-of-the-art climate model ensemble. The magnitude of the projected changes differed considerably between the two dynamical malaria models, with one much more sensitive to climate change, highlighting that uncertainty in the malaria projections is also associated with the disease modelling approach.

  11. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J.

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  12. LANDSAT land cover analysis completed for CIRSS/San Bernardino County project

    NASA Technical Reports Server (NTRS)

    Likens, W.; Maw, K.; Sinnott, D. (Principal Investigator)

    1982-01-01

    The LANDSAT analysis carried out as part of Ames Research Center's San Bernardino County Project, one of four projects sponsored by NASA as part of the California Integrated Remote Sensing System (CIRSS) effort for generating and utilizing digital geographic data bases, is described. Topics explored include use of data-base modeling with spectral cluster data to improve LANDSAT data classification, and quantitative evaluation of several change techniques. Both 1976 and 1979 LANDSAT data were used in the project.

  13. Assessing the impacts of climate change in Mediterranean catchments under conditions of data scarcity

    NASA Astrophysics Data System (ADS)

    Meyer, Swen; Ludwig, Ralf

    2013-04-01

    According to current climate projections, Mediterranean countries are at high risk of pronounced changes in the hydrological budget and extremes. While there is scientific consensus that climate-induced changes in the hydrology of Mediterranean regions are presently occurring and are projected to amplify in the future, very little knowledge is available for quantifying these changes, an effort hampered by a lack of suitable and cost-effective hydrological monitoring and modeling systems. The European FP7 project CLIMB aims to analyze climate-induced changes in the hydrology of Mediterranean basins by investigating seven test sites located in Italy, France, Turkey, Tunisia, Gaza and Egypt. CLIMB employs a combination of novel geophysical field monitoring concepts, remote sensing techniques and integrated hydrologic modeling to improve process descriptions and understanding and to quantify existing uncertainties in climate change impact analysis. The Rio Mannu Basin, located in Sardinia, Italy, is one test site of the CLIMB project. The catchment has a size of 472.5 km2 and ranges from 62 to 946 m in elevation; at mean annual temperatures of 16°C and precipitation of about 700 mm, the annual runoff volume is about 200 mm. The physically based Water Simulation Model WaSiM Vers. 2 (Schulla & Jasper, 1999) was set up to model current and projected future hydrological conditions. The availability of measured meteorological and hydrological data is poor, as is common to many Mediterranean catchments, and this lack of measured input data hampers the calibration of the model setup and the validation of model outputs. State-of-the-art remote sensing and field measuring techniques were applied to improve the quality of hydrological input parameters. In a field campaign, about 250 soil samples were collected and lab-analyzed, and different geostatistical regionalization methods were tested to improve the model setup. The soil parameterization of the model was tested against publicly available soil data; results show a significant improvement of modeled soil moisture outputs. To validate WaSiM's evapotranspiration (ETact) outputs, Landsat TM images were used to calculate the actual monthly mean ETact rates using the triangle method (Jiang and Islam, 1999). Simulated spatial ETact patterns and those derived from remote sensing show a good fit, especially for the growing season. WaSiM was driven with the meteorological forcing taken from four different ENSEMBLES climate projections for a reference (1971-2000) and a future (2041-2070) time series. Output results were analyzed for climate-induced changes in selected hydrological variables. While the climate projections reveal increased precipitation rates in the spring season, first simulation results show an earlier onset and an increased duration of the dry season, imposing an increased irrigation demand and a higher vulnerability of agricultural productivity.

  14. Power in the Classroom VI: Verbal Control Strategies, Nonverbal Immediacy and Affective Learning.

    ERIC Educational Resources Information Center

    Plax, Timothy G.; And Others

    Recognizing that nonverbal behaviors typically provide the framework for interpreting verbal messages, this project (the sixth in a series of projects designed to examine teacher power in the classroom) proposed and sequentially tested a heuristic model of student affective learning as a function of behavior alteration techniques and teacher…

  15. The Hemophilia Games: An Experiment in Health Education Planning.

    ERIC Educational Resources Information Center

    National Heart and Lung Inst. (DHEW/PHS), Bethesda, MD.

    The Hemophilia Health Education Planning Project was designed to (1) create a set of tools useful in hemophilia planning and education, and (2) create a planning model for other diseases with similar factors. The project used the game-simulations technique which was felt to be particularly applicable to hemophilia health problems, since as a…

  16. Synthesis, Purification, and Characterization of a [mu]-(1,3-Propanedithiolato)-Hexacarbonyldiiron

    ERIC Educational Resources Information Center

    Works, Carmen F.

    2007-01-01

    A project which exposes students to biologically important transition-metal chemistry is illustrated by the example of the iron-carbonyl compound [mu]-(1,3-propanedithiolato)-hexacarbonyldiiron as a structural model for an iron-only hydrogenase. The project provides the students with experience of Schlenk line techniques, purification,…

  17. The QSPR-THESAURUS: the online platform of the CADASTER project.

    PubMed

    Brandmaier, Stefan; Peijnenburg, Willie; Durjava, Mojca K; Kolar, Boris; Gramatica, Paola; Papa, Ester; Bhhatarai, Barun; Kovarich, Simona; Cassani, Stefano; Roy, Partha Pratim; Rahmberg, Magnus; Öberg, Tomas; Jeliazkova, Nina; Golsteijn, Laura; Comber, Mike; Charochkina, Larisa; Novotarskyi, Sergii; Sushko, Iurii; Abdelaziz, Ahmed; D'Onofrio, Elisa; Kunwar, Prakash; Ruggiu, Fiorella; Tetko, Igor V

    2014-03-01

    The aim of the CADASTER project (CAse Studies on the Development and Application of in Silico Techniques for Environmental Hazard and Risk Assessment) was to exemplify REACH-related hazard assessments for four classes of chemical compound, namely, polybrominated diphenylethers, per- and polyfluorinated compounds, (benzo)triazoles, and musks and fragrances. The QSPR-THESAURUS website (http://qspr-thesaurus.eu) was established as the project's online platform to upload, store, apply, and also create, models within the project. We overview the main features of the website, such as model upload, experimental design and hazard assessment to support risk assessment, and integration with other web tools, all of which are essential parts of the QSPR-THESAURUS. © 2014 FRAME.

  18. Final Report: Biological and Synthetic Nanostructures Controlled at the Atomistic Level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, A; van Buuren, T

    2007-02-21

    Nanotechnology holds great promise for many application fields, ranging from the semiconductor industry to medical research and national security. Novel, nanostructured materials are the fundamental building blocks upon which all these future nanotechnologies will be based. In this Strategic Initiative (SI) we conducted a combined theoretical and experimental investigation of the modeling, synthesis, characterization, and design techniques which are required to fabricate semiconducting and metallic nanostructures with enhanced properties. We focused on developing capabilities that have broad applicability to a wide range of materials and can be applied both to nanomaterials that are currently being developed for nanotechnology applications and also to new, yet to be discovered, nanomaterials. During this 3-year SI project we have made excellent scientific progress in each of the components of this project. We have developed first-principles techniques for modeling the structural, electronic, optical, and transport properties of materials at the nanoscale. For the first time, we have simulated nanomaterials both in vacuum and in aqueous solution. These simulation capabilities harness the world-leading computational resources available at LLNL to model, at the quantum mechanical level, systems containing hundreds of atoms and thousands of electrons. Significant advances in the density functional and quantum Monte Carlo techniques employed in this project were developed to enable these techniques to scale up to simulating realistic-size nanostructured materials. We have developed the first successful techniques for chemically synthesizing crystalline silicon and germanium nanoparticles and nanowires. We grew the first macroscopic, faceted superlattice crystals from these nanoparticles. We have also advanced our capabilities to synthesize semiconductor nanoparticles using physical vapor deposition techniques so that we are now able to control the size, shape and surface structure of these nanoparticles. We have made advances in characterizing the surface of nanoparticles using x-ray absorption experiments. Throughout this SI a number of long-term, strategic external collaborations have been established. These collaborations have resulted in 30 joint publications, strategic hires of postdocs and graduate students from these groups into groups at LLNL and the submission of joint research grants. We have developed collaborations on the theory and modeling of nanomaterials with the groups of Profs. Ceder and Marzari (MIT), Crespi (Penn State), Freeman (Northwestern), Grossman and Lester (UC Berkeley), Mitas (North Carolina State), and Needs (Cambridge). We are collaborating with Dr. Alivisatos's group in the Molecular Foundry at Lawrence Berkeley Laboratory on the fabrication, characterization and modeling of inorganic nanomaterials. We are working with Prof. Majumdar's group at UC Berkeley on the characterization of nanomaterials. We are working with the molecular diamond group at Chevron-Texaco, which has developed a process for extracting mono-disperse samples of nano-scale diamonds from crude oil. We are collaborating with Dr. Chen at UCSF to develop CdSe nanoparticle biolabels. As a result of the outstanding scientific achievements and the long-term collaborations developed during this strategic initiative we have been extremely successful in obtaining external funding to continue and grow this research activity at LLNL.
We have received two DARPA grants to support the further development of our computational modeling techniques and to develop carbon nanotube based molecular separation devices. We have received two new Office of Science BES grants to support our nanomaterials modeling and synthesis projects. We have received funding from the NA22 office of DOE to develop the materials modeling capabilities begun in this SI for modeling detector materials. We have received funding from Intel Corporation to apply the modeling techniques developed in this initiative to examine silicon nanowires fabricated on computer chips. We are also pursuing several additional sources of funding from BES, the DHS, and NIH to support the continuation of the research programs developed in this SI. The remainder of this report and the attached publications describe the background to this SI research project and the details of the scientific achievements that have been made.

  19. Development of daily temperature scenarios and their impact on paddy crop evapotranspiration in Kangsabati command area

    NASA Astrophysics Data System (ADS)

    Dhage, P. M.; Raghuwanshi, N. S.; Singh, R.; Mishra, A.

    2017-05-01

    Production of the principal paddy crop in the West Bengal state of India is vulnerable to climate change due to limited water resources and a strong dependence on surface irrigation. Therefore, assessing the impact of temperature scenarios on crop evapotranspiration (ETc) is essential for irrigation management in the Kangsabati command (West Bengal). In the present study, the impact of projected temperatures on ETc was studied under climate change scenarios. Further, the performance of the bias correction and spatial downscaling (BCSD) technique was compared with two well-known downscaling techniques, namely multiple linear regression (MLR) and kernel regression (KR), for projecting daily maximum and minimum air temperatures at four stations: Purulia, Bankura, Jhargram, and Kharagpur. Fourteen predictors from the National Centers for Environmental Prediction (NCEP) reanalysis and a General Circulation Model (GCM) were used in the MLR and KR techniques, whereas the maximum and minimum surface air temperature predictors of the CanESM2 GCM were used in the BCSD technique. The comparison indicated that the BCSD technique performed better than the MLR and KR techniques. Therefore, the BCSD technique was used to project future temperatures at the study locations under three Representative Concentration Pathway (RCP) scenarios for the period 2006-2100. The warming tendencies of maximum and minimum temperatures over the Kangsabati command area were projected as 0.013 and 0.014 °C/year under RCP 2.6, 0.015 and 0.023 °C/year under RCP 4.5, and 0.056 and 0.061 °C/year under RCP 8.5 for the 2011-2100 period, respectively. As a result, the kharif (monsoon) crop evapotranspiration demand of the Kangsabati reservoir command (project area) will increase by approximately 10%, 8%, and 18% over historical demand under the RCP 2.6, 4.5, and 8.5 scenarios, respectively.
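
    The bias-correction half of BCSD is most often implemented as quantile mapping between the model's historical distribution and observations (the spatial-disaggregation half is omitted here). The sketch below shows that quantile-mapping step on invented temperature data; it is a generic textbook formulation, not necessarily the exact variant used in this study.

    ```python
    import numpy as np

    def quantile_map(obs_hist, mod_hist, mod_fut):
        """Empirical quantile mapping: replace each future model value by the
        observed value at the same quantile of the historical model distribution."""
        mod_hist_sorted = np.sort(mod_hist)
        obs_quantiles = np.sort(obs_hist)
        # Quantile of each future value within the historical model distribution
        ranks = np.searchsorted(mod_hist_sorted, mod_fut) / len(mod_hist_sorted)
        ranks = np.clip(ranks, 0.0, 1.0)
        # Map those quantiles onto the observed distribution
        q = np.linspace(0, 1, len(obs_quantiles))
        return np.interp(ranks, q, obs_quantiles)

    # Toy demo with invented numbers: a model that runs 2 °C too cold.
    rng = np.random.default_rng(3)
    obs = 30 + 4 * rng.standard_normal(5000)        # observed Tmax (°C)
    mod_h = 28 + 4 * rng.standard_normal(5000)      # model, historical period
    mod_f = mod_h + 1.5                             # model, future (+1.5 °C signal)
    corrected = quantile_map(obs, mod_h, mod_f)
    print(corrected.mean() - obs.mean())            # ≈ the +1.5 °C warming signal
    ```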

  20. Operationalising uncertainty in data and models for integrated water resources management.

    PubMed

    Blind, M W; Refsgaard, J C

    2007-01-01

    Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and to anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.

  1. A Semi-Automated Point Cloud Processing Methodology for 3D Cultural Heritage Documentation

    NASA Astrophysics Data System (ADS)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. Conventional measurement techniques, however, require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-relevant applications for historic building documentation has become an active area of research; however, fully automated cultural heritage documentation remains an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open-source software environment, using the example project of a 16th-century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  2. The materials processing research base of the Materials Processing Center. Report for FY 1982

    NASA Technical Reports Server (NTRS)

    Flemings, M. C.

    1983-01-01

    The work described, while involving research in the broad field of materials processing, has two common features: the problems are closely related to space processing of materials and have both practical and fundamental significance. An interesting and important feature of many of the projects is that the interdisciplinary nature of the problem mandates complementary analytical modeling/experimental approaches. Another important aspect of many of the projects is the increasing use of mathematical modeling techniques as one of the research tools. The predictive capability of these models, when tested against measurements, plays a very important role both in the planning of experimental programs and in the rational interpretation of the results. Many of the projects described have a space experiment as their ultimate objective. Mathematical models are proving to be extremely valuable in projecting the findings of ground-based experiments to microgravity conditions.

  3. Development of a model for predicting NASA/MSFC program success

    NASA Technical Reports Server (NTRS)

    Riggs, Jeffrey; Miller, Tracy; Finley, Rosemary

    1990-01-01

    Research conducted during the execution of a previous contract (NAS8-36955/0039) firmly established the feasibility of developing a tool to aid decision makers in predicting the potential success of proposed projects. The final report from that investigation contains an outline of the method to be applied in developing this Project Success Predictor Model. As a follow-on to the previous study, this report describes in detail the development of this model and includes full explanation of the data-gathering techniques used to poll expert opinion. The report includes the presentation of the model code itself.

  4. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections

    PubMed Central

    Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.

    2018-01-01

    Background Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Conclusions Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737
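
    The paper's comparison step (run parallel implementations, treat agreement as error-free, flag deviations beyond ±5% as material) is mechanical enough to automate. A minimal sketch with invented projection numbers, taking the consensus as the median across versions; the study itself compares versions against the concordant, error-free output rather than a median.

    ```python
    import numpy as np

    def material_errors(outputs, tol=0.05):
        """Compare projections from parallel model versions element-wise.
        `outputs` maps version name -> array of projections; cells where a
        version deviates from the consensus by more than `tol` (±5%, as in
        the study) are flagged as material unintentional errors."""
        names = list(outputs)
        stacked = np.stack([np.asarray(outputs[n], dtype=float) for n in names])
        consensus = np.median(stacked, axis=0)
        rel_diff = np.abs(stacked - consensus) / np.abs(consensus)
        flags = {}
        for i, name in enumerate(names):
            bad = np.flatnonzero(rel_diff[i] > tol)
            if bad.size:
                flags[name] = bad
        return flags

    # Hypothetical projections (e.g., persons in care per year) from three versions.
    out = {
        "named_single_cells": [100.0, 212.0, 330.0],
        "column_row_refs":    [100.0, 268.0, 330.0],   # one cell off by ~26%
        "named_matrices":     [100.0, 212.0, 330.0],
    }
    print(material_errors(out))   # -> {'column_row_refs': array([1])}
    ```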

  6. COLLABORATIVE RESEARCH:USING ARM OBSERVATIONS & ADVANCED STATISTICAL TECHNIQUES TO EVALUATE CAM3 CLOUDS FOR DEVELOPMENT OF STOCHASTIC CLOUD-RADIATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somerville, Richard

    2013-08-22

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen).

  7. The Effect of Learning Based on Technology Model and Assessment Technique toward Thermodynamic Learning Achievement

    NASA Astrophysics Data System (ADS)

    Makahinda, T.

    2018-02-01

    The purpose of this research is to determine the effect of a technology-based learning model and assessment technique on thermodynamics achievement, controlling for students' intelligence. This is experimental research; the sample was taken through cluster random sampling, with a total of 80 respondents. The results show that the thermodynamics achievement of students taught with the environment-utilization learning model is higher than that of students taught with animated simulations, after controlling for student intelligence. There is also an interaction effect between the technology-based learning model and the assessment technique on students' thermodynamics achievement, after controlling for student intelligence. Based on these findings, thermodynamics lectures should use the environment-based learning model together with project assessment techniques.

  8. Electromagnetic modelling, inversion and data-processing techniques for GPR: ongoing activities in Working Group 3 of COST Action TU1208

    NASA Astrophysics Data System (ADS)

    Pajewski, Lara; Giannopoulos, Antonis; van der Kruk, Jan

    2015-04-01

    This work presents the ongoing research activities carried out in Working Group 3 (WG3) 'EM methods for near-field scattering problems by buried structures; data processing techniques' of the COST (European COoperation in Science and Technology) Action TU1208 'Civil Engineering Applications of Ground Penetrating Radar' (www.GPRadar.eu). The principal goal of the COST Action TU1208 is to exchange and increase scientific-technical knowledge and experience of GPR techniques in civil engineering, simultaneously promoting throughout Europe the effective use of this safe and non-destructive technique in the monitoring of infrastructures and structures. WG3 is structured in four Projects. Project 3.1 deals with 'Electromagnetic modelling for GPR applications.' Project 3.2 is concerned with 'Inversion and imaging techniques for GPR applications.' The topic of Project 3.3 is the 'Development of intrinsic models for describing near-field antenna effects, including antenna-medium coupling, for improved radar data processing using full-wave inversion.' Project 3.4 focuses on 'Advanced GPR data-processing algorithms.' Electromagnetic modelling tools that are being developed and improved include the Finite-Difference Time-Domain (FDTD) technique and the spectral-domain Cylindrical-Wave Approach (CWA). One well-known and versatile freeware FDTD simulator is gprMax, which enables a realistic representation of the soil/material hosting the sought structures and of the GPR antennas; here, input/output tools are being developed to ease the definition of scenarios and the visualisation of numerical results. The CWA expresses the field scattered by subsurface two-dimensional targets with arbitrary cross-section as a sum of cylindrical waves. In this way, multiple scattering within the medium hosting the sought targets is taken into account. Recently, the method has been extended to deal with through-the-wall scenarios. One of the inversion techniques currently being improved is Full-Waveform Inversion (FWI) for on-ground, off-ground, and crosshole GPR configurations. In contrast to conventional inversion tools, which are often based on approximations and use only part of the available data, FWI uses the complete measured data and detailed modelling tools to obtain an improved estimation of medium properties. During the first year of the Action, information was collected and shared about the state of the art of the available modelling, imaging, inversion, and data-processing methods. Advancements achieved by WG3 Members were presented during the TU1208 Second General Meeting (April 30 - May 2, 2014, Vienna, Austria) and the 15th International Conference on Ground Penetrating Radar (June 30 - July 4, 2014, Brussels, Belgium). Currently, a database of numerical and experimental GPR responses from natural and manmade structures is being designed. A geometrical and physical description of the scenarios, together with the available synthetic and experimental data, will be at the disposal of the scientific community. Researchers will thus have a further opportunity of testing and validating, against reliable data, their electromagnetic forward- and inverse-scattering techniques, imaging methods and data-processing algorithms.
The motivation to start this database came out of TU1208 meetings and takes inspiration from successful past initiatives carried out in different areas, such as the Ipswich and Fresnel databases in the field of free-space electromagnetic scattering, and the Marmousi database in seismic science. Acknowledgement: The Authors thank COST for funding the Action TU1208 'Civil Engineering Applications of Ground Penetrating Radar.'
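
    To give a flavour of the FDTD machinery at the core of simulators such as gprMax, here is a minimal one-dimensional free-space leapfrog update; a full GPR simulator generalises this to 3-D with dispersive soils, antenna models and absorbing boundaries, and every number below is an arbitrary choice for the sketch.

    ```python
    import numpy as np

    # Minimal 1-D FDTD loop (free space) showing the staggered E/H updates.
    c0 = 299792458.0
    nz, nt = 400, 800
    dz = 0.01                       # 1 cm cells
    dt = dz / (2 * c0)              # satisfies the 1-D Courant limit (S = 0.5)
    Ex = np.zeros(nz)
    Hy = np.zeros(nz)

    for n in range(nt):
        # H update (staggered half a cell / half a step from E)
        Hy[:-1] += (dt / (4e-7 * np.pi * dz)) * (Ex[1:] - Ex[:-1])
        # E update (Ex[0] untouched: acts as a reflecting boundary)
        Ex[1:] += (dt / (8.854e-12 * dz)) * (Hy[1:] - Hy[:-1])
        # Soft source: differentiated-Gaussian pulse injected mid-grid
        t = (n + 0.5) * dt
        t0, spread = 60 * dt, 20 * dt
        Ex[nz // 2] += -(t - t0) / spread * np.exp(-((t - t0) / spread) ** 2)

    print(f"peak |Ex| after {nt} steps: {np.max(np.abs(Ex)):.3e}")
    ```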

  9. H2LIFT: global navigation simulation ship tracking and WMD detection in the maritime domain

    NASA Astrophysics Data System (ADS)

    Wyffels, Kevin

    2007-04-01

    This paper presents initial results for a tracking simulation of multiple maritime vehicles for use in a data fusion program detecting Weapons of Mass Destruction (WMD). This simulation supports a fusion algorithm (H2LIFT) for collecting and analyzing data providing a heuristic analysis tool for detecting weapons of mass destruction in the maritime domain. Tools required to develop a navigational simulation fitting a set of project objectives are introduced for integration into the H2LIFT algorithm. Emphasis is placed on the specific requirements of the H2LIFT project, however the basic equations, algorithms, and methodologies can be used as tools in a variety of scenario simulations. Discussion will be focused on track modeling (e.g. position tracking of ships), navigational techniques, WMD detection, and simulation of these models using Matlab and Simulink. Initial results provide absolute ship position data for a given multi-ship maritime scenario with random generation of a given ship containing a WMD. Required coordinate systems, conversions between coordinate systems, Earth modeling techniques, and navigational conventions and techniques are introduced for development of the simulations.

  10. An Investigative Graduate Laboratory Course for Teaching Modern DNA Techniques

    ERIC Educational Resources Information Center

    de Lencastre, Alexandre; Torello, A. Thomas; Keller, Lani C.

    2017-01-01

    This graduate-level DNA methods laboratory course is designed to model a discovery-based research project and engages students in both traditional DNA analysis methods and modern recombinant DNA cloning techniques. In the first part of the course, students clone the "Drosophila" ortholog of a human disease gene of their choosing using…

  11. Hardware Implementation of Multiple Fan Beam Projection Technique in Optical Fibre Process Tomography

    PubMed Central

    Rahim, Ruzairi Abdul; Fazalul Rahiman, Mohd Hafiz; Leong, Lai Chen; Chan, Kok San; Pang, Jon Fea

    2008-01-01

    The main objective of this project is to implement the multiple fan beam projection technique using optical fibre sensors, with the aim of achieving a high data acquisition rate. The multiple fan beam projection technique is defined here as allowing more than one emitter to transmit light at the same time using the switch-mode fan beam method. For the thirty-two pairs of sensors used, the 2-projection technique and the 4-projection technique are investigated. Sixteen sets of projections complete one frame of light emission for the 2-projection technique, while eight sets of projections complete one frame for the 4-projection technique. To facilitate the data acquisition process, a PIC microcontroller and a sample-and-hold circuit are used. This paper summarizes the hardware configuration and design for this project. PMID:27879885
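
    The frame arithmetic in the abstract (32 sensor pairs; 16 projection sets per frame when 2 emitters fire together, 8 sets when 4 fire together) pins down the firing schedule up to how the simultaneous emitters are grouped. A sketch, assuming evenly spaced groupings; the actual grouping is set by the project's microcontroller firmware and is not given in the record.

    ```python
    # Emitter firing schedule for the switch-mode multiple fan beam technique:
    # allowing p emitters to fire simultaneously completes a frame in 32/p sets.
    def firing_schedule(n_emitters=32, simultaneous=2):
        step = n_emitters // simultaneous
        # Simultaneous emitters are spaced evenly around the pipe so their
        # fan beams overlap as little as possible (an assumption).
        return [tuple(s + k * step for k in range(simultaneous)) for s in range(step)]

    for p in (2, 4):
        sets = firing_schedule(32, p)
        print(f"{p}-projection technique: {len(sets)} sets/frame, e.g. set 0 = {sets[0]}")
    # 2-projection: 16 sets/frame, set 0 = (0, 16)
    # 4-projection:  8 sets/frame, set 0 = (0, 8, 16, 24)
    ```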

  12. Cockpit System Situational Awareness Modeling Tool

    NASA Technical Reports Server (NTRS)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  13. High-Quality 3D Models and Their Use in a Cultural Heritage Conservation Project

    NASA Astrophysics Data System (ADS)

    Tucci, G.; Bonora, V.; Conti, A.; Fiorini, L.

    2017-08-01

    Cultural heritage digitization and 3D modelling processes are mainly based on laser scanning and digital photogrammetry techniques to produce complete, detailed and photorealistic three-dimensional surveys: geometric as well as chromatic aspects, in turn testimony of materials, work techniques, state of preservation, etc., are documented using digitization processes. The paper explores the topic of 3D documentation for conservation purposes; it analyses how geomatics contributes to different steps of a restoration process, and it presents an overview of different uses of 3D models for the conservation and enhancement of cultural heritage. The paper reports on the project to digitize the earthenware frieze of the Ospedale del Ceppo in Pistoia (Italy) for 3D documentation, restoration work support, and digital and physical reconstruction and integration purposes. The intent to design an exhibition area suggests new ways to take advantage of 3D data originally acquired for documentation and scientific purposes.

  14. Impact of Aquifer Heterogeneities on Autotrophic Denitrification.

    NASA Astrophysics Data System (ADS)

    McCarthy, A.; Roques, C.; Selker, J. S.; Istok, J. D.; Pett-Ridge, J. C.

    2015-12-01

    Nitrate contamination of groundwater is a major challenge that hydrogeologists throughout the world will need to address. With a drinking water standard of 10 mg/L NO3-, innovative techniques will need to be pursued to reduce nitrate concentrations in drinking water. At the pumping-site scale, the influence of and relationship between heterogeneous flow, mixing, and reactivity are not well understood. The purpose of this project is to incorporate both physical and chemical modeling techniques to better understand the effect of aquifer heterogeneities on autotrophic denitrification. We will investigate the link between heterogeneous hydraulic properties, transport, and the rate of autotrophic denitrification. Data collected in previous laboratory and pumping-site scale studies will be used to validate the models. The ultimate objective of this project is to develop a model in which such coupled processes are better understood, supporting best management practices for groundwater.

  15. Use of System Dynamics Modeling in Medical Education and Research Projects.

    PubMed

    Bozikov, Jadranka; Relic, Danko; Dezelic, Gjuro

    2018-01-01

    The paper reviews experiences and accomplishments in the application of system dynamics modeling in education, training and research projects at the Andrija Stampar School of Public Health, a branch of the Zagreb University School of Medicine, Croatia. A number of simulation models developed over the past 40 years are briefly described with regard to the real problems concerned and the objectives, modeling methods and techniques used. Many of them were developed as individual student projects as part of graduation, MSc or PhD theses and subsequently published in journals or conference proceedings. Some were later used in teaching and simulation training. System dynamics modeling proved to be not only a powerful method for research and decision making but also a useful tool in medical and nursing education, enabling a better understanding of the behavior of dynamic systems.

  16. Computational modelling of the impact of AIDS on business.

    PubMed

    Matthews, Alan P

    2007-07-01

    An overview of computational modelling of the impact of AIDS on business in South Africa, with a detailed description of the AIDS Projection Model (APM) for companies, developed by the author, and suggestions for further work. Computational modelling of the impact of AIDS on business in South Africa requires modelling of the epidemic as a whole and of its impact on a company. This paper gives an overview of epidemiological modelling, with an introduction to the Actuarial Society of South Africa (ASSA) model, the most widely used such model for South Africa. The APM produces projections of HIV prevalence, new infections, and AIDS mortality for a company, based on anonymous HIV testing of company employees and projections from the ASSA model. A smoothed statistical model of the prevalence test data is computed, and the ASSA model projection for each category of employees is then adjusted so that it matches the measured prevalence in the year of testing. FURTHER WORK: Techniques that could be developed include microsimulation (representing individuals in the computer), scenario planning for testing strategies, and models of the business environment, such as models of entire sectors and the mapping of HIV prevalence in time and space based on workplace and community data.
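
    The calibration step described above can be sketched with a simple multiplicative adjustment (an assumption for illustration; the APM's smoothing of the test data and per-category handling are not reproduced): the ASSA prevalence projection for one employee category is rescaled so that it passes through the prevalence measured in the testing year.

    ```python
    # Hedged sketch of an APM-style adjustment: rescale a year -> prevalence
    # projection so it matches the prevalence measured in the testing year.
    def adjust_projection(assa_prevalence: dict[int, float],
                          measured_prevalence: float,
                          test_year: int) -> dict[int, float]:
        scale = measured_prevalence / assa_prevalence[test_year]
        return {year: p * scale for year, p in assa_prevalence.items()}

    # Illustrative numbers only, for one employee category:
    assa = {2005: 0.12, 2006: 0.13, 2007: 0.14}
    print(adjust_projection(assa, measured_prevalence=0.10, test_year=2005))
    # -> {2005: 0.10, 2006: 0.108..., 2007: 0.116...}
    ```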

  17. Nondestructive analysis and development

    NASA Technical Reports Server (NTRS)

    Moslehy, Faissal A.

    1993-01-01

    This final report summarizes the achievements of project #4 of the NASA/UCF Cooperative Agreement from January 1990 to December 1992. The objectives of this project were to review NASA's NDE program at Kennedy Space Center (KSC) and to recommend means for enhancing the present testing capabilities through the use of improved or new technologies. During the project, extensive development of a reliable nondestructive, non-contact vibration technique to determine and quantify the bond condition of the thermal protection system (TPS) tiles of the Space Shuttle Orbiter was undertaken. Experimental modal analysis (EMA) is used as a non-destructive technique for the evaluation of TPS tile bond integrity. Finite element (FE) models of tile systems were developed and used to generate their vibration characteristics (i.e., natural frequencies and mode shapes). Various TPS tile assembly configurations as well as different bond conditions were analyzed. Results of the finite element analyses demonstrated a drop in natural frequencies and a change in mode shapes that correlate with both the size and location of a disbond. Results of experimental testing of tile panels correlated with the FE results and demonstrated the feasibility of EMA as a viable technique for tile bond verification. Finally, testing performed on the Space Shuttle Columbia using a laser Doppler velocimeter demonstrated the application of EMA, when combined with FE modeling, as a non-contact, non-destructive bond evaluation technique.

  18. Galerkin v. least-squares Petrov–Galerkin projection in nonlinear model reduction

    DOE PAGES

    Carlberg, Kevin Thomas; Barone, Matthew F.; Antil, Harbir

    2016-10-20

    Least-squares Petrov–Galerkin (LSPG) model-reduction techniques such as the Gauss–Newton with Approximated Tensors (GNAT) method have shown promise, as they have generated stable, accurate solutions for large-scale turbulent, compressible flow problems where standard Galerkin techniques have failed. However, there has been limited comparative analysis of the two approaches. This is due in part to difficulties arising from the fact that Galerkin techniques perform optimal projection associated with residual minimization at the time-continuous level, while LSPG techniques do so at the time-discrete level. This work provides a detailed theoretical and computational comparison of the two techniques for two common classes of time integrators: linear multistep schemes and Runge–Kutta schemes. We present a number of new findings, including conditions under which the LSPG ROM has a time-continuous representation, conditions under which the two techniques are equivalent, and time-discrete error bounds for the two approaches. Perhaps most surprisingly, we demonstrate both theoretically and computationally that decreasing the time step does not necessarily decrease the error for the LSPG ROM; instead, the time step should be 'matched' to the spectral content of the reduced basis. In numerical experiments carried out on a turbulent compressible-flow problem with over one million unknowns, we show that increasing the time step to an intermediate value decreases both the error and the simulation time of the LSPG reduced-order model by an order of magnitude.
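
    The contrast between the two projections can be seen in a toy steady linear setting (a sketch under stated assumptions, not the GNAT implementation): for a residual r(x) = Ax - b and reduced basis V, Galerkin enforces V^T r = 0, while LSPG minimizes ||r|| over the subspace, so its residual can never exceed Galerkin's on the same basis.

    ```python
    # Galerkin vs. least-squares Petrov-Galerkin on a steady linear residual.
    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 50, 5
    A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned full model
    b = rng.standard_normal(n)
    V, _ = np.linalg.qr(rng.standard_normal((n, k)))  # orthonormal reduced basis

    # Galerkin ROM: make the residual orthogonal to the basis.
    y_g = np.linalg.solve(V.T @ A @ V, V.T @ b)

    # LSPG ROM: minimize the discrete residual over the subspace.
    y_l, *_ = np.linalg.lstsq(A @ V, b, rcond=None)

    for name, y in (("Galerkin", y_g), ("LSPG    ", y_l)):
        print(name, "residual norm:", np.linalg.norm(A @ (V @ y) - b))
    ```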

  20. Vehicle System Management Modeling in UML for Ares I

    NASA Technical Reports Server (NTRS)

    Pearson, Newton W.; Biehn, Bradley A.; Curry, Tristan D.; Martinez, Mario R.

    2011-01-01

    The Spacecraft & Vehicle Systems Department of Marshall Space Flight Center is responsible for modeling the Vehicle System Management for the Ares I vehicle, which was part of the now-canceled Constellation Program. One approach to generating the requirements for the Vehicle System Management was to use the Unified Modeling Language (UML) technique to build and test a model that would fulfill the Vehicle System Management requirements. UML had been used on past projects (flight software) in the design phase, but this was the first attempt to use the UML technique from a top-down requirements perspective.

  1. Determining the Marker Configuration and Modeling Technique to Optimize the Biomechanical Analysis of Running-Specific Prostheses

    DTIC Science & Technology

    2011-08-01

    The approved Statement of Work proposed a project timeline (Table 1). Four running-specific prostheses (Figure 1) were tested for this project, including the 1E90 Sprinter (OttoBock Inc.), Flex-Run (Ossur), Cheetah® (Ossur), and Nitro Running Foot (Freedom…

  2. The New Hampshire High School Career Education Model. Final Report.

    ERIC Educational Resources Information Center

    Keene State Coll., NH.

    The purpose of this project was to improve the quality and demonstrate the most effective methods and techniques of career education in four high schools in the state of New Hampshire. The focus was to effect change at two points: the first was the academic curriculum, where committees in each of the project schools reviewed their existing…

  3. Generic precise augmented reality guiding system and its calibration method based on 3D virtual model.

    PubMed

    Liu, Miao; Yang, Shourui; Wang, Zhangying; Huang, Shujun; Liu, Yue; Niu, Zhenqi; Zhang, Xiaoxuan; Zhu, Jigui; Zhang, Zonghua

    2016-05-30

    Augmented reality systems can be applied to provide precise guidance for various kinds of manual work. The adaptability and guiding accuracy of such systems are determined by the computational model and the corresponding calibration method. In this paper, a novel type of augmented reality guiding system and the corresponding design scheme are proposed. Guided by external positioning equipment, the proposed system can achieve high relative indication accuracy in a large working space. The proposed system is realized with a digital projector, and the general back-projection model is derived from the geometric relationship between the digitized 3D model and the projector in free space. A corresponding calibration method is also designed to obtain the parameters of the projector. To validate the proposed back-projection model, coordinate data collected by 3D positioning equipment are used to calculate and optimize the extrinsic parameters. The final projection indication accuracy of the system is verified with a subpixel pattern projection technique.

  4. Tools and Techniques for Basin-Scale Climate Change Assessment

    NASA Astrophysics Data System (ADS)

    Zagona, E.; Rajagopalan, B.; Oakley, W.; Wilson, N.; Weinstein, P.; Verdin, A.; Jerla, C.; Prairie, J. R.

    2012-12-01

    The Department of the Interior's WaterSMART Program seeks to secure and stretch water supplies to benefit future generations and identify adaptive measures to address climate change. Under WaterSMART, Basin Studies are comprehensive water studies that explore options for meeting projected imbalances in water supply and demand in specific basins. Such studies benefit most from the application of recent scientific advances in climate projections, stochastic simulation, operational modeling and robust decision-making, as well as computational techniques to organize and analyze many alternatives. A new integrated set of tools and techniques to facilitate these studies includes the following components. Future supply scenarios are produced by the Hydrology Simulator, which uses non-parametric K-nearest neighbor resampling techniques to generate ensembles of hydrologic traces based on historical data, optionally conditioned on long paleo-reconstructed data using various Markov chain techniques. Resampling can also be conditioned on climate change projections, e.g., downscaled GCM projections, to capture increased variability; spatial and temporal disaggregation is also provided. The simulations produced are ensembles of hydrologic inputs to the RiverWare operations/infrastructure decision modeling software. Alternative demand scenarios can be produced with the Demand Input Tool (DIT), an Excel-based tool that allows future demands to be modified by groups such as states; sectors, e.g., agriculture, municipal, energy; and hydrologic basins. The demands can be scaled at future dates or the changes ramped over specified time periods. The resulting data are imported directly into the decision model. Different model files can represent infrastructure alternatives, and different Policy Sets represent alternative operating policies, including options for noticing when conditions point to unacceptable vulnerabilities, which trigger dynamically executed changes in operations or other options. The overarching Study Manager provides a graphical tool to create combinations of future supply scenarios, demand scenarios, and infrastructure and operating policy alternatives; each scenario is executed as an ensemble of RiverWare runs, driven by the hydrologic supply. The Study Manager sets up and manages multiple executions on multi-core hardware. The sizeable outputs are typically direct model outputs or post-processed indicators of performance based on model outputs. Post-processing statistical analysis of the outputs is possible using the Graphical Policy Analysis Tool or other statistical packages. Several Basin Studies undertaken have used RiverWare to evaluate future scenarios. The Colorado River Basin Study, the most complex and extensive to date, has taken advantage of these tools and techniques to generate supply scenarios, produce alternative demand scenarios, and set up and execute the many combinations of supplies, demands, policies, and infrastructure alternatives. The tools and techniques will be described with example applications.
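
    The Hydrology Simulator's resampling step can be illustrated with a bare-bones K-nearest-neighbor bootstrap (a sketch in the spirit of Lall-and-Sharma-style KNN resampling; the tool's paleo and climate-change conditioning options are omitted, and all names are hypothetical): each simulated value is drawn from the historical successors of the k years closest to the current value, with nearer neighbors weighted more heavily.

    ```python
    # Minimal KNN bootstrap for generating synthetic annual flow traces.
    import numpy as np

    def knn_trace(history, length, k=None, rng=np.random.default_rng(42)):
        k = k or int(np.sqrt(len(history) - 1))      # common heuristic for k
        weights = 1.0 / np.arange(1, k + 1)
        weights /= weights.sum()                     # nearer neighbours more likely
        trace = [rng.choice(history)]
        for _ in range(length - 1):
            # distance from the current value to every year that has a successor
            d = np.abs(history[:-1] - trace[-1])
            nearest = np.argsort(d)[:k]              # indices of the k nearest years
            pick = rng.choice(nearest, p=weights)
            trace.append(history[pick + 1])          # resample that year's successor
        return np.array(trace)

    annual_flow = np.array([12.1, 9.8, 14.3, 11.0, 8.7, 13.5, 10.2, 12.9, 9.1, 11.8])
    print(knn_trace(annual_flow, length=15))
    ```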

  5. Finite Dimensional Approximations for Continuum Multiscale Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berlyand, Leonid

    2017-01-24

    The completed research project concerns the development of novel computational techniques for modeling nonlinear multiscale physical and biological phenomena. Specifically, it addresses the theoretical development and applications of the homogenization theory (coarse graining) approach to calculation of the effective properties of highly heterogeneous biological and bio-inspired materials with many spatial scales and nonlinear behavior. This theory studies properties of strongly heterogeneous media in problems arising in materials science, geoscience, biology, etc. Modeling of such media raises fundamental mathematical questions, primarily in partial differential equations (PDEs) and calculus of variations, the subject of the PI's research. The focus of the completed research was on mathematical models of biological and bio-inspired materials with the common theme of multiscale analysis and coarse grain computational techniques. Biological and bio-inspired materials offer the unique ability to create environmentally clean functional materials used for energy conversion and storage. These materials are intrinsically complex, with hierarchical organization occurring on many nested length and time scales. The potential to rationally design and tailor the properties of these materials for broad energy applications has been hampered by the lack of computational techniques, which are able to bridge from the molecular to the macroscopic scale. The project addressed the challenge of computational treatments of such complex materials by the development of a synergistic approach that combines innovative multiscale modeling/analysis techniques with high performance computing.

  6. Focusing cosmic telescopes: systematics of strong lens modeling

    NASA Astrophysics Data System (ADS)

    Johnson, Traci Lin; Sharon, Keren

    2018-01-01

    The use of strong gravitational lensing by galaxy clusters has become a popular method for studying the high-redshift universe. While diverse in computational methods, lens modeling techniques have established means for determining statistical errors on cluster masses and magnifications. However, the systematic errors, arising from the number of constraints, the availability of spectroscopic redshifts, and various types of image configurations, have yet to be quantified. I will be presenting my dissertation work on quantifying systematic errors in parametric strong lensing techniques. I have participated in the Hubble Frontier Fields lens model comparison project, using simulated clusters to compare the accuracy of various modeling techniques. I have extended this project to understand how changing the quantity of constraints affects the mass and magnification. I will also present my recent work extending these studies to clusters in the Outer Rim Simulation. These clusters are typical of the clusters found in wide-field surveys in mass and lensing cross-section. They have fewer constraints than the HFF clusters and are thus more susceptible to systematic errors. With the wealth of strong lensing clusters discovered in surveys such as SDSS, SPT, and DES, and in the future LSST, this work will be influential in guiding lens modeling efforts and follow-up spectroscopic campaigns.

  7. What spatial scales are believable for climate model projections of sea surface temperature?

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Lester; Halloran, Paul R.; Mumby, Peter J.; Stephenson, David B.

    2014-09-01

    Earth system models (ESMs) provide high-resolution simulations of variables such as sea surface temperature (SST) that are often used in off-line biological impact models. Coral reef modellers have used such model outputs extensively to project both regional and global changes to coral growth and bleaching frequency. We assess model skill at capturing sub-regional climatologies and patterns of historical warming. This study uses an established wavelet-based spatial comparison technique to assess the skill of the Coupled Model Intercomparison Project phase 5 (CMIP5) models at capturing spatial SST patterns in coral regions. We show that models typically have medium to high skill at capturing climatological spatial patterns of SSTs within key coral regions, with model skill typically improving at larger spatial scales (≥4°). However, models have much lower skill at reproducing historical warming patterns and are shown to often perform no better than chance at regional scales (e.g. Southeast Asia) and worse than chance at finer scales (<8°). Our findings suggest that output from current-generation ESMs is not yet suitable for making sub-regional projections of change in coral bleaching frequency and other marine processes linked to SST warming.

  8. Reproduction of 20th century inter- to multi-decadal surface temperature variability in radiatively forced coupled climate models

    USDA-ARS?s Scientific Manuscript database

    Coupled Model Intercomparison Project 3 simulations of surface temperature were evaluated over the period 1902-1999 to assess their ability to reproduce historical temperature variability at 211 global locations. Model performance was evaluated using the running Mann Whitney-Z method, a technique th...

  9. Printing Space: Using 3D Printing of Digital Terrain Models in Geosciences Education and Research

    ERIC Educational Resources Information Center

    Horowitz, Seth S.; Schultz, Peter H.

    2014-01-01

    Data visualization is a core component of every scientific project; however, generation of physical models previously depended on expensive or labor-intensive molding, sculpting, or laser sintering techniques. Physical models have the advantage of providing not only visual but also tactile modes of inspection, thereby allowing easier visual…

  10. Forward model with space-variant of source size for reconstruction on X-ray radiographic image

    NASA Astrophysics Data System (ADS)

    Liu, Jin; Liu, Jun; Jing, Yue-feng; Xiao, Bo; Wei, Cai-hua; Guan, Yong-hong; Zhang, Xuan

    2018-03-01

    The Forward Imaging Technique is a method for solving the inverse problem of density reconstruction in radiographic imaging. In this paper, we introduce the forward projection equation (IFP model) for a radiographic system with areal source blur and detector blur. Our forward projection equation, based on X-ray tracing, is combined with the Constrained Conjugate Gradient method to form a new method for density reconstruction. We demonstrate the effectiveness of the new technique by reconstructing density distributions from simulated and experimental images. We show that for radiographic systems with source sizes larger than the pixel size, the effect of blur on the density reconstruction is reduced by our method and can be controlled within one or two pixels. The method is also suitable for the reconstruction of non-homogeneous objects.

  11. World Energy Projection System Plus Model Documentation: Commercial Module

    EIA Publications

    2016-01-01

    The Commercial Model of the World Energy Projection System Plus (WEPS+) is an energy demand modeling system of the world commercial end-use sector at a regional level. This report describes the version of the Commercial Model that was used to produce the commercial sector projections published in the International Energy Outlook 2016 (IEO2016). The Commercial Model is one of 13 components of the WEPS+ system. WEPS+ is a modular system, consisting of a number of separate energy models that communicate and work with each other through an integrated system model. The model components are each developed independently, but are designed with well-defined protocols for system communication and interactivity. The WEPS+ modeling system uses a shared database (the "restart" file) that allows all the models to communicate with each other when they are run in sequence over a number of iterations. The overall WEPS+ system uses an iterative solution technique that forces convergence of consumption and supply pressures to solve for an equilibrium price.
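
    The integrating model's equilibrium search can be illustrated with a toy fixed-point iteration (an illustrative sketch, not EIA code; the linear demand and supply modules below stand in for the 13 WEPS+ components): the shared price is nudged until consumption and supply pressures converge.

    ```python
    # Toy equilibrium-price iteration in the spirit of the WEPS+ integrating model.
    def solve_equilibrium(demand, supply, price=50.0, tol=1e-6, relax=0.1):
        """Adjust price until demand(price) and supply(price) converge."""
        for _ in range(10_000):
            gap = demand(price) - supply(price)   # excess consumption pressure
            if abs(gap) < tol:
                return price
            price += relax * gap                  # raise price when demand exceeds supply
        raise RuntimeError("no convergence")

    price = solve_equilibrium(demand=lambda p: 100 - 0.8 * p,
                              supply=lambda p: 20 + 0.5 * p)
    print(f"equilibrium price = {price:.3f}")      # analytic answer: 80/1.3 = 61.538
    ```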

  12. Locally adaptive, spatially explicit projection of US population for 2030 and 2050.

    PubMed

    McKee, Jacob J; Rose, Amy N; Bright, Edward A; Huynh, Timmy; Bhaduri, Budhendra L

    2015-02-03

    Localized adverse events, including natural hazards, epidemiological events, and human conflict, underscore the criticality of quantifying and mapping current population. Building on the spatial interpolation technique previously developed for high-resolution population distribution data (LandScan Global and LandScan USA), we have constructed an empirically informed spatial distribution of the projected population of the contiguous United States for 2030 and 2050, depicting one of many possible population futures. Whereas most current large-scale, spatially explicit population projections rely on a population gravity model to determine areas of future growth, our projection model departs from these by accounting for multiple components that affect population distribution. Modeled variables, which included land cover, slope, distances to larger cities, and a moving average of current population, were locally adaptive and geographically varying. The resulting weighted surface was used to determine which areas had the greatest likelihood of future population change. County-level population projections were developed using a modified version of the US Census's projection methodology, with the US Census's official projection as the benchmark. Applications of our model include incorporating various scenario-driven events to produce a range of spatially explicit population futures for suitability modeling, service area planning for governmental agencies, consequence assessment, mitigation planning and implementation, and assessment of spatially vulnerable populations.
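
    The locally adaptive allocation idea can be sketched for a single county (an illustrative toy; the actual covariates, their local calibration, and the census benchmarking are far richer): covariates are combined into a weight surface, and the county's projected growth is distributed across cells in proportion to their weights.

    ```python
    # Toy dasymetric allocation of county-level projected growth to grid cells.
    import numpy as np

    rng = np.random.default_rng(1)
    shape = (4, 4)                                   # toy 4x4 grid for one county
    current_pop = rng.integers(0, 200, shape).astype(float)
    slope = rng.random(shape)                        # steeper -> less suitable
    dist_to_city = rng.random(shape)                 # farther -> less suitable
    land_ok = rng.random(shape) > 0.2                # developable land-cover mask

    weights = land_ok * (1 - slope) * (1 - dist_to_city) \
              * (1 + current_pop / current_pop.max())
    weights /= weights.sum()

    county_growth = 5_000                            # from the county-level projection
    projected_pop = current_pop + county_growth * weights
    print(projected_pop.round(0))
    ```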

  13. A mathematical model approach toward combining information from multiple image projections of the same patient

    NASA Astrophysics Data System (ADS)

    Chawla, Amarpreet S.; Samei, Ehsan; Abbey, Craig

    2007-03-01

    In this study, we used a mathematical observer model to combine information obtained from multiple angular projections of the same breast to determine the overall detection performance of a multi-projection breast imaging system in detectability of a simulated mass. 82 subjects participated in the study and 25 angular projections of each breast were acquired. Projections from a simulated 3 mm 3-D lesion were added to the projection images. The lesion was assumed to be embedded in the compressed breast at a distance of 3 cm from the detector. Hotelling observer with Laguerre-Gauss channels (LG CHO) was applied to each image. Detectability was analyzed in terms of ROC curves and the area under ROC curves (AUC). The critical question studied is how to best integrate the individual decision variables across multiple (correlated) views. Towards that end, three different methods were investigated. Specifically, 1) ROCs from different projections were simply averaged; 2) the test statistics from different projections were averaged; and 3) a Bayesian decision fusion rule was used. Finally, AUC of the combined ROC was used as a parameter to optimize the acquisition parameters to maximize the performance of the system. It was found that the Bayesian decision fusion technique performs better than the other two techniques and likely offers the best approximation of the diagnostic process. Furthermore, if the total dose level is held constant at 1/25th of dual-view mammographic screening dose, the highest detectability performance is observed when considering only two projections spread along an angular span of 11.4°.
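
    The second and third combination rules can be sketched side by side (a simplified illustration assuming independent, equal-variance Gaussian test statistics, under which the two rules coincide; with the correlated, heterogeneous statistics of real angular projections the Bayesian rule can dominate, as the study found):

    ```python
    # Fusing per-projection decision variables: averaging vs. Bayesian fusion.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(7)
    n_cases, n_proj, mu = 1000, 25, 0.5
    signal = rng.normal(mu, 1.0, (n_cases, n_proj))   # lesion-present statistics
    noise = rng.normal(0.0, 1.0, (n_cases, n_proj))   # lesion-absent statistics

    def auc(pos, neg):
        """Area under the ROC curve via the Mann-Whitney statistic."""
        return (pos[:, None] > neg[None, :]).mean()

    # Rule 2: average the per-projection test statistics.
    print(f"AUC, averaged statistics: {auc(signal.mean(1), noise.mean(1)):.3f}")

    # Rule 3: Bayesian fusion as a sum of per-projection log-likelihood ratios.
    llr = lambda t: norm.logpdf(t, mu, 1.0) - norm.logpdf(t, 0.0, 1.0)
    print(f"AUC, Bayesian fusion:     {auc(llr(signal).sum(1), llr(noise).sum(1)):.3f}")
    ```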

  14. A Bayesian Ensemble Approach for Epidemiological Projections

    PubMed Central

    Lindström, Tom; Tildesley, Michael; Webb, Colleen

    2015-01-01

    Mathematical models are powerful tools for epidemiology and can be used to compare control actions. However, different models and model parameterizations may provide different prediction of outcomes. In other fields of research, ensemble modeling has been used to combine multiple projections. We explore the possibility of applying such methods to epidemiology by adapting Bayesian techniques developed for climate forecasting. We exemplify the implementation with single model ensembles based on different parameterizations of the Warwick model run for the 2001 United Kingdom foot and mouth disease outbreak and compare the efficacy of different control actions. This allows us to investigate the effect that discrepancy among projections based on different modeling assumptions has on the ensemble prediction. A sensitivity analysis showed that the choice of prior can have a pronounced effect on the posterior estimates of quantities of interest, in particular for ensembles with large discrepancy among projections. However, by using a hierarchical extension of the method we show that prior sensitivity can be circumvented. We further extend the method to include a priori beliefs about different modeling assumptions and demonstrate that the effect of this can have different consequences depending on the discrepancy among projections. We propose that the method is a promising analytical tool for ensemble modeling of disease outbreaks. PMID:25927892

  15. Discriminative Projection Selection Based Face Image Hashing

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
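
    Discriminative projection selection can be sketched as a per-row Fisher score (dimensions and the score's exact form are assumptions for illustration): each row of the user-independent random matrix is scored on how well it separates the user's samples from the background population, and the top-scoring rows form the user-dependent projection.

    ```python
    # User-dependent selection of random-projection rows via the Fisher criterion.
    import numpy as np

    rng = np.random.default_rng(3)
    d, n_rows, keep = 64, 100, 16
    P = rng.standard_normal((n_rows, d))        # user-independent random matrix
    user = rng.normal(0.5, 1.0, (20, d))        # the user's enrolment features
    others = rng.normal(0.0, 1.0, (200, d))     # background population features

    zu, zo = user @ P.T, others @ P.T           # projections along each row
    # Fisher criterion: between-class separation over within-class spread.
    fisher = (zu.mean(0) - zo.mean(0)) ** 2 / (zu.var(0) + zo.var(0))

    rows = np.argsort(fisher)[-keep:]           # best-separating rows for this user
    P_user = P[rows]                            # personalised projection matrix
    print("selected rows:", sorted(rows.tolist()))
    ```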

  16. Expansion of the On-line Archive "Statistically Downscaled WCRP CMIP3 Climate Projections"

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Das, T.; Duffy, P.; White, K.

    2009-12-01

    Presentation highlights status and plans for a public-access archive of downscaled CMIP3 climate projections. Incorporating climate projection information into long-term evaluations of water and energy resources requires analysts to have access to projections at "basin-relevant" resolution. Such projections would ideally be bias-corrected to account for climate model tendencies to systematically simulate historical conditions different from those observed. In 2007, the U.S. Bureau of Reclamation, Santa Clara University and Lawrence Livermore National Laboratory (LLNL) collaborated to develop an archive of 112 bias-corrected and spatially disaggregated (BCSD) CMIP3 temperature and precipitation projections. These projections were generated using 16 CMIP3 models to simulate three emissions pathways (A2, A1b, and B1) from one or more initializations (runs). Projections are specified on a monthly time step from 1950-2099 and at 0.125 degree spatial resolution within the North American Land Data Assimilation System domain (i.e., contiguous U.S., southern Canada and northern Mexico). Archive data are freely accessible at the LLNL Green Data Oasis (url). Since being launched, the archive has served over 3500 data requests by nearly 500 users in support of a range of planning, research and educational activities. Archive developers continue to look for ways to improve the archive and respond to user needs. One request has been to serve the intermediate datasets generated during the BCSD procedure, helping users to interpret the relative influences of the bias-correction and spatial disaggregation on the transformed CMIP3 output. This request has been addressed, with intermediate datasets now posted at the archive web-site. Another request relates closely to studying hydrologic and ecological impacts under climate change, where users are asking for projected diurnal temperature information (e.g., projected daily minimum and maximum temperature) and daily time step resolution. In response, archive developers are adding content in 2010, teaming with Scripps Institution of Oceanography (through their NOAA-RISA California-Nevada Applications Program and the California Climate Change Center) to apply a new daily downscaling technique to a sub-ensemble of the archive's CMIP3 projections. The new technique, Bias-Corrected Constructed Analogs (BCCA), combines the BC part of BCSD with a recently developed technique that preserves the daily sequencing structure of CMIP3 projections (Constructed Analogs, or CA). Such data will more easily serve hydrologic and ecological impact assessments, and offer an opportunity to evaluate the projection uncertainty associated with downscaling technique. Looking ahead to the arrival of CMIP5 projections, archive collaborators plan to apply both BCSD and BCCA over the contiguous U.S., consistent with the CMIP3 applications above, and also to apply BCSD globally at a 0.5 degree spatial resolution. The latter effort involves collaboration with the U.S. Army Corps of Engineers (USACE) and Climate Central.
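
    The "BC" step shared by BCSD and BCCA can be sketched as empirical quantile mapping (a simplified illustration; the archive's implementation pools values by month and treats distribution tails more carefully):

    ```python
    # Empirical quantile-mapping bias correction of model output against observations.
    import numpy as np

    def quantile_map(model_hist, obs_hist, model_future):
        """Map each future value to the observed value at the quantile the
        model assigns it over the common historical period."""
        q = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
        return np.quantile(obs_hist, np.clip(q, 0.0, 1.0))

    rng = np.random.default_rng(11)
    obs = rng.gamma(2.0, 30.0, 600)    # observed monthly precipitation (mm)
    mod = rng.gamma(2.0, 45.0, 600)    # model is systematically too wet
    fut = rng.gamma(2.2, 45.0, 240)    # raw future projection
    print("raw future mean:     ", round(float(fut.mean()), 1))
    print("bias-corrected mean: ", round(float(quantile_map(mod, obs, fut).mean()), 1))
    ```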

  17. Multiple Fan-Beam Optical Tomography: Modelling Techniques

    PubMed Central

    Rahim, Ruzairi Abdul; Chen, Leong Lai; San, Chan Kok; Rahiman, Mohd Hafiz Fazalul; Fea, Pang Jon

    2009-01-01

    This paper explains in detail the solutions to the forward and inverse problems faced in this research. In the forward problem section, the projection geometry and the sensor modelling are discussed. The dimensions, distributions and arrangements of the optical fibre sensors are determined based on the real hardware constructed, and these are explained in the projection geometry section. The general idea in sensor modelling is to simulate an artificial environment, with similar system properties, in order to predict the actual sensor values for various flow models in the hardware system. The sensitivity maps produced from the solution of the forward problems are important in reconstructing the tomographic image. PMID:22291523

  18. From statistical proofs of the Kochen-Specker theorem to noise-robust noncontextuality inequalities

    NASA Astrophysics Data System (ADS)

    Kunjwal, Ravi; Spekkens, Robert W.

    2018-05-01

    The Kochen-Specker theorem rules out models of quantum theory wherein projective measurements are assigned outcomes deterministically and independently of context. This notion of noncontextuality is not applicable to experimental measurements because these are never free of noise and thus never truly projective. For nonprojective measurements, therefore, one must drop the requirement that an outcome be assigned deterministically in the model and merely require that it be assigned a distribution over outcomes in a manner that is context-independent. By demanding context independence in the representation of preparations as well, one obtains a generalized principle of noncontextuality that also supports a quantum no-go theorem. Several recent works have shown how to derive inequalities on experimental data which, if violated, demonstrate the impossibility of finding a generalized-noncontextual model of this data. That is, these inequalities do not presume quantum theory and, in particular, they make sense without requiring an operational analog of the quantum notion of projectiveness. We here describe a technique for deriving such inequalities starting from arbitrary proofs of the Kochen-Specker theorem. It extends significantly previous techniques that worked only for logical proofs, which are based on sets of projective measurements that fail to admit of any deterministic noncontextual assignment, to the case of statistical proofs, which are based on sets of projective measurements that do admit of some deterministic noncontextual assignments, but not enough to explain the quantum statistics.

  19. Use of Advanced Spectroscopic Techniques for Predicting the Mechanical Properties of Wood Composites

    Treesearch

    Timothy G. Rials; Stephen S. Kelley; Chi-Leung So

    2002-01-01

    Near infrared (NIR) spectroscopy was used to characterize a set of medium-density fiberboard (MDF) samples. This spectroscopic technique, in combination with projection to latent structures (PLS) modeling, effectively predicted the mechanical strength of MDF samples with a wide range of physical properties. The stiffness, strength, and internal bond properties of the...
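
    The NIR-plus-PLS workflow can be sketched with scikit-learn's PLSRegression (synthetic "spectra" only; the study's samples, wavelength range, and number of latent components are not reproduced):

    ```python
    # Predicting a strength property from NIR-like spectra with PLS regression.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    n_samples, n_wavelengths = 120, 350
    X = rng.standard_normal((n_samples, n_wavelengths)).cumsum(axis=1)  # smooth curves
    strength = X[:, 180] - 0.5 * X[:, 60] + rng.normal(0, 0.5, n_samples)

    X_tr, X_te, y_tr, y_te = train_test_split(X, strength, random_state=0)
    pls = PLSRegression(n_components=6).fit(X_tr, y_tr)
    print(f"held-out R^2: {pls.score(X_te, y_te):.2f}")
    ```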

  20. All-Possible-Subsets for MANOVA and Factorial MANOVAs: Less than a Weekend Project

    ERIC Educational Resources Information Center

    Nimon, Kim; Zientek, Linda Reichwein; Kraha, Amanda

    2016-01-01

    Multivariate techniques are increasingly popular as researchers attempt to accurately model a complex world. MANOVA is a multivariate technique used to investigate the dimensions along which groups differ, and how these dimensions may be used to predict group membership. A concern in a MANOVA analysis is to determine if a smaller subset of…

  1. The Mind Research Network - Mental Illness Neuroscience Discovery Grant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, J.; Calhoun, V.

    The scientific and technological programs of the Mind Research Network (MRN) reflect DOE missions in basic science and associated instrumentation, computational modeling, and experimental techniques. MRN's technical goals over the course of this project have been to develop and apply integrated, multi-modality functional imaging techniques derived from a decade of DOE-supported research and technology development.

  2. Adaptive parametric model order reduction technique for optimization of vibro-acoustic models: Application to hearing aid design

    NASA Astrophysics Data System (ADS)

    Creixell-Mediante, Ester; Jensen, Jakob S.; Naets, Frank; Brunskog, Jonas; Larsen, Martin

    2018-06-01

    Finite Element (FE) models of complex structural-acoustic coupled systems can require a large number of degrees of freedom in order to capture their physical behaviour. This is the case in the hearing aid field, where acoustic-mechanical feedback paths are a key factor in the overall system performance and modelling them accurately requires a precise description of the strong interaction between the light-weight parts and the internal and surrounding air over a wide frequency range. Parametric optimization of the FE model can be used to reduce the vibroacoustic feedback in a device during the design phase; however, it requires solving the model iteratively for multiple frequencies at different parameter values, which becomes highly time-consuming when the system is large. Parametric Model Order Reduction (pMOR) techniques aim at reducing the computational cost associated with each analysis by projecting the full system into a reduced space. A drawback of most of the existing techniques is that the vector basis of the reduced space is built in an offline phase where the full system must be solved for a large sample of parameter values, which can also become highly time-consuming. In this work, we present an adaptive pMOR technique where the construction of the projection basis is embedded in the optimization process and requires fewer full system analyses, while the accuracy of the reduced system is monitored by a cheap error indicator. The performance of the proposed method is evaluated for a 4-parameter optimization of a frequency response for a hearing aid model, evaluated at 300 frequencies, where the objective function evaluations become more than one order of magnitude faster than for the full system.

  3. Regional Climate Models Downscaling in the Alpine Area with Multimodel SuperEnsemble

    NASA Astrophysics Data System (ADS)

    Cane, D.; Barbarino, S.; Renier, L.; Ronchi, C.

    2012-04-01

    The climatic scenarios show a strong signal of warming in the Alpine area already by the mid-21st century. The climate simulations, however, even when obtained with Regional Climate Models (RCMs), are affected by strong errors when compared with observations in the control period, owing to their difficulties in representing the complex orography of the Alps and to limitations in their physical parametrizations. In this work we use a selection of RCM runs from the ENSEMBLES project, carefully chosen to maximise the variety of driving Global Climate Models and of the RCMs themselves, calculated for the SRES A1B scenario. The reference observations for the Greater Alpine Area are extracted from the European dataset E-OBS produced by the ENSEMBLES project, with an available resolution of 25 km. For the study area of Piemonte, daily temperature and precipitation observations (1957-present) were carefully gridded on a 14-km grid over the Piemonte Region with an Optimal Interpolation technique. We applied the Multimodel SuperEnsemble technique to the temperature fields, reducing the high biases of the RCM temperature fields compared with observations in the control period. We also present the first application to RCMs of a new probabilistic Multimodel SuperEnsemble Dressing technique for estimating precipitation fields, already applied successfully to weather forecast models, with careful description of the precipitation probability density functions conditioned on the model outputs. This technique reduces the strong overestimation of precipitation by RCMs over the Alpine chain and reproduces the monthly behaviour of observed precipitation in the control period far better than the direct model outputs.
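
    The deterministic core of the Multimodel SuperEnsemble can be sketched as a training-period regression (Krishnamurti-style weights; the probabilistic dressing of precipitation PDFs described above is a further layer not shown, and all numbers below are synthetic):

    ```python
    # Multimodel SuperEnsemble: regress observations on model output in a
    # training period, then apply the fitted weights to new model output.
    import numpy as np

    rng = np.random.default_rng(9)
    t_train, n_models = 240, 6                       # 20 years of monthly anomalies
    truth = rng.normal(0.0, 1.0, t_train)
    models = truth[:, None] + rng.normal(0.5, 1.0, (t_train, n_models))  # biased RCMs

    A = np.column_stack([np.ones(t_train), models])  # intercept removes mean bias
    coef, *_ = np.linalg.lstsq(A, truth, rcond=None)

    new_models = truth[-1] + rng.normal(0.5, 1.0, n_models)   # one scenario month
    print("plain multimodel mean: ", round(float(new_models.mean()), 2))
    print("superensemble estimate:", round(float(coef[0] + new_models @ coef[1:]), 2))
    ```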

  4. Models of railroad passenger-car requirements in the northeast corridor : volume II user's guide

    DOT National Transportation Integrated Search

    1976-09-30

    Models and techniques for determining passenger-car requirements in railroad service were developed and applied by a research project of which this is the final report. The report is published in two volumes. The solution and analysis of the Northeas...

  5. Models of railroad passenger-car requirements in the northeast corridor : volume 1. formulation and results.

    DOT National Transportation Integrated Search

    1976-09-30

    Models and techniques for determining passenger-car requirements in railroad service were developed and applied by a research project of which this is the final report. The report is published in two volumes. This volume considers a general problem o...

  6. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique in use at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.

  7. Breaking Computational Barriers: Real-time Analysis and Optimization with Large-scale Nonlinear Models via Model Reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlberg, Kevin Thomas; Drohmann, Martin; Tuminaro, Raymond S.

    2014-10-01

    Model reduction for dynamical systems is a promising approach for reducing the computational cost of large-scale physics-based simulations to enable high-fidelity models to be used in many-query (e.g., Bayesian inference) and near-real-time (e.g., fast-turnaround simulation) contexts. While model reduction works well for specialized problems such as linear time-invariant systems, it is much more difficult to obtain accurate, stable, and efficient reduced-order models (ROMs) for systems with general nonlinearities. This report describes several advances that enable nonlinear reduced-order models (ROMs) to be deployed in a variety of time-critical settings. First, we present an error bound for the Gauss-Newton with Approximated Tensors (GNAT) nonlinear model reduction technique. This bound allows the state-space error for the GNAT method to be quantified when applied with the backward Euler time-integration scheme. Second, we present a methodology for preserving classical Lagrangian structure in nonlinear model reduction. This technique guarantees that important properties--such as energy conservation and symplectic time-evolution maps--are preserved when performing model reduction for models described by a Lagrangian formalism (e.g., molecular dynamics, structural dynamics). Third, we present a novel technique for decreasing the temporal complexity--defined as the number of Newton-like iterations performed over the course of the simulation--by exploiting time-domain data. Fourth, we describe a novel method for refining projection-based reduced-order models a posteriori using a goal-oriented framework similar to mesh-adaptive h-refinement in finite elements. The technique allows the ROM to generate arbitrarily accurate solutions, thereby providing the ROM with a 'failsafe' mechanism in the event of insufficient training data. Finally, we present the reduced-order model error surrogate (ROMES) method for statistically quantifying reduced-order-model errors. This enables ROMs to be rigorously incorporated in uncertainty-quantification settings, as the error model can be treated as a source of epistemic uncertainty. This work was completed as part of a Truman Fellowship appointment. We note that much additional work was performed as part of the Fellowship. One salient project is the development of the Trilinos-based model-reduction software module Razor, which is currently bundled with the Albany PDE code and allows nonlinear reduced-order models to be constructed for any application supported in Albany. Other important projects include the following: 1. ROMES-equipped ROMs for Bayesian inference: K. Carlberg, M. Drohmann, F. Lu (Lawrence Berkeley National Laboratory), M. Morzfeld (Lawrence Berkeley National Laboratory). 2. ROM-enabled Krylov-subspace recycling: K. Carlberg, V. Forstall (University of Maryland), P. Tsuji, R. Tuminaro. 3. A pseudo balanced POD method using only dual snapshots: K. Carlberg, M. Sarovar. 4. An analysis of discrete v. continuous optimality in nonlinear model reduction: K. Carlberg, M. Barone, H. Antil (George Mason University). Journal articles for these projects are in progress at the time of this writing.

  8. UML activity diagram swimlanes in logic controller design

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona

    2015-12-01

    Logic controller behavior can be specified using various techniques, including UML activity diagrams and control Petri nets. Each technique has its advantages and disadvantages. Applying both specification types in one project makes it possible to draw on the benefits of both. Additional elements of UML models make it possible to divide a specification into parts considered from different points of view (logic controller, user, or system). The paper introduces an idea to use UML activity diagrams with swimlanes to increase the understandability of design models.

  9. Development and comparison of projection and image space 3D nodule insertion techniques

    NASA Astrophysics Data System (ADS)

    Robins, Marthony; Solomon, Justin; Sahbaee, Pooyan; Samei, Ehsan

    2016-04-01

    This study aimed to develop and compare two methods of inserting computerized virtual lesions into CT datasets. 24 physical (synthetic) nodules of three sizes and four morphologies were inserted into an anthropomorphic chest phantom (LUNGMAN, KYOTO KAGAKU). The phantom was scanned (Somatom Definition Flash, Siemens Healthcare) with and without nodules present, and images were reconstructed with filtered back projection and iterative reconstruction (SAFIRE) at 0.6 mm slice thickness using a standard thoracic CT protocol at multiple dose settings. Virtual 3D CAD models based on the physical nodules were virtually inserted (accounting for the system MTF) into the nodule-free CT data using two techniques: projection-based and image-based insertion. Nodule volumes were estimated using a commercial segmentation tool (iNtuition, TeraRecon, Inc.). Differences were tested using paired t-tests and the R2 goodness of fit between the virtually and physically inserted nodules. Both insertion techniques resulted in nodule volumes very similar to those of the real nodules (<3% difference), and in most cases the differences were not statistically significant. Also, R2 values were all >0.97 for both insertion techniques. These data imply that the techniques can confidently be used as a means of inserting virtual nodules into CT datasets, and they can be instrumental in building hybrid CT datasets composed of patient images with virtually inserted nodules.
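
    The image-based insertion route can be sketched as follows (a simplified illustration in which an isotropic Gaussian stands in for the system MTF; the projection-based variant instead inserts the lesion into the raw projections before reconstruction):

    ```python
    # Image-space insertion of a virtual nodule into a nodule-free CT volume.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def insert_nodule(ct_volume, lesion_hu, center, mtf_sigma_vox=1.2):
        """Add a blurred lesion (HU contrast) into a CT volume at `center`."""
        canvas = np.zeros_like(ct_volume, dtype=float)
        z, y, x = center
        dz, dy, dx = (s // 2 for s in lesion_hu.shape)
        canvas[z - dz:z - dz + lesion_hu.shape[0],
               y - dy:y - dy + lesion_hu.shape[1],
               x - dx:x - dx + lesion_hu.shape[2]] = lesion_hu
        return ct_volume + gaussian_filter(canvas, mtf_sigma_vox)  # approximate MTF

    volume = np.full((64, 64, 64), -800.0)          # lung-like background (HU)
    g = np.mgrid[-4:5, -4:5, -4:5]
    sphere = np.where((g ** 2).sum(axis=0) <= 16, 700.0, 0.0)  # solid spherical nodule
    hybrid = insert_nodule(volume, sphere, center=(32, 32, 32))
    print("peak HU after insertion:", round(float(hybrid.max()), 1))
    ```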

  10. Development of a risk-based environmental management tool for drilling discharges. Summary of a four-year project.

    PubMed

    Singsaas, Ivar; Rye, Henrik; Frost, Tone Karin; Smit, Mathijs G D; Garpestad, Eimund; Skare, Ingvild; Bakke, Knut; Veiga, Leticia Falcao; Buffagni, Melania; Follum, Odd-Arne; Johnsen, Ståle; Moltu, Ulf-Einar; Reed, Mark

    2008-04-01

    This paper briefly summarizes the ERMS project and presents the developed model by showing results from environmental fate and risk calculations for a discharge from offshore drilling operations. The developed model calculates environmental risks for the water column and sediments resulting from exposure to toxic stressors (e.g., chemicals) and nontoxic stressors (e.g., suspended particles, sediment burial). The approach is based on existing risk assessment techniques described in the European Union technical guidance document on risk assessment and on species sensitivity distributions. The model calculates an environmental impact factor, which characterizes the overall potential impact on the marine environment in terms of potentially impacted water volume and sediment area. The ERMS project started in 2003 and was finalized in 2007. In total, 28 scientific reports and 9 scientific papers have been delivered from the ERMS project (http://www.sintef.no/erms).

  11. Pareto-Optimal Estimates of California Precipitation Change

    NASA Astrophysics Data System (ADS)

    Langenbrunner, Baird; Neelin, J. David

    2017-12-01

    In seeking constraints on global climate model projections under global warming, one commonly finds that different subsets of models perform well under different objective functions, and these trade-offs are difficult to weigh. Here a multiobjective approach is applied to a large set of subensembles generated from the Climate Model Intercomparison Project phase 5 ensemble. We use observations and reanalyses to constrain tropical Pacific sea surface temperatures, upper level zonal winds in the midlatitude Pacific, and California precipitation. An evolutionary algorithm identifies the set of Pareto-optimal subensembles across these three measures, and these subensembles are used to constrain end-of-century California wet season precipitation change. This methodology narrows the range of projections throughout California, increasing confidence in estimates of positive mean precipitation change. Finally, we show how this technique complements and generalizes emergent constraint approaches for restricting uncertainty in end-of-century projections within multimodel ensembles using multiple criteria for observational constraints.
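
    The Pareto step at the heart of the method can be sketched as a nondominated filter over the three error measures (a minimal illustration; the paper's evolutionary search over subensembles is not reproduced):

    ```python
    # Identify subensembles not dominated in all three observational error measures.
    import numpy as np

    def pareto_front(errors):
        """Indices of rows for which no other row is <= in every column
        and < in at least one (all objectives are minimized)."""
        keep = np.ones(len(errors), dtype=bool)
        for i in range(len(errors)):
            dominated_by = np.all(errors <= errors[i], axis=1) & \
                           np.any(errors < errors[i], axis=1)
            keep[i] = not dominated_by.any()
        return np.flatnonzero(keep)

    rng = np.random.default_rng(2)
    errs = rng.random((200, 3))   # SST, zonal-wind, precipitation error measures
    print(f"{len(pareto_front(errs))} Pareto-optimal subensembles out of 200")
    ```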

  12. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    Methods discussed include risk/uncertainty analysis, project scoring, utility scales, relevance tree techniques (reverse factor analysis), and computer simulation. Cited references include Souder, W.E., on the effectiveness of mathematical models for R&D project selection (Management Science, April 1973), and Souder, W.E., a scoring methodology for… Index terms: proficiency test scores (written); radiation effects on aircrew performance in radiation environments; reaction time…

  13. Rural Infant Stimulation Environment (RISE). Progress Report, July 1, 1974 to June 30, 1977. Handicapped Children's Early Education Program.

    ERIC Educational Resources Information Center

    Holder, Loreta

    This final report describes a federally-funded project that was designed to provide a model for service delivery to severely physically involved infants and their families living in the highly rural area of West Alabama. The project developed and refined an eclectic treatment approach known as Developmental Physical Management Techniques (DPMT).…

  14. Understanding Proximal-Distal Economic Projections of the Benefits of Childhood Preventive Interventions

    PubMed Central

    Slade, Eric P.; Becker, Kimberly D.

    2014-01-01

    This paper discusses the steps and decisions involved in proximal-distal economic modeling, in which social, behavioral, and academic outcomes data for children may be used to inform projections of the economic consequences of interventions. Economic projections based on proximal-distal modeling techniques may be used in cost-benefit analyses when certain long-term adult outcome data are unavailable, or to build entire cost-benefit analyses. Although examples of proximal-distal economic analyses of preventive interventions exist in policy reports prepared for governmental agencies, such analyses have rarely been completed in conjunction with research trials. The modeling decisions on which these prediction models are based are often opaque to policymakers and other end-users. This paper aims to illuminate some of the key steps and considerations involved in constructing proximal-distal prediction models and to provide examples and suggestions that may help guide future proximal-distal analyses. PMID:24337979

  15. Modeling the Hydrologic Effects of Large-Scale Green Infrastructure Projects with GIS

    NASA Astrophysics Data System (ADS)

    Bado, R. A.; Fekete, B. M.; Khanbilvardi, R.

    2015-12-01

    Impervious surfaces in urban areas generate excess runoff, which in turn causes flooding, combined sewer overflows, and degradation of adjacent surface waters. Municipal environmental protection agencies have shown a growing interest in mitigating these effects with 'green' infrastructure practices that partially restore the perviousness and water holding capacity of urban centers. Assessment of the performance of current and future green infrastructure projects is hindered by the lack of adequate hydrological modeling tools; conventional techniques fail to account for the complex flow pathways of urban environments, and detailed analyses are difficult to prepare for the very large domains in which green infrastructure projects are implemented. Currently, no standard toolset exists that can rapidly and conveniently predict runoff, consequent inundations, and sewer overflows at a city-wide scale. We demonstrate how streamlined modeling techniques can be used with open-source GIS software to efficiently model runoff in large urban catchments. Hydraulic parameters and flow paths through city blocks, roadways, and sewer drains are automatically generated from GIS layers, and ultimately urban flow simulations can be executed for a variety of rainfall conditions. With this methodology, users can understand the implications of large-scale land use changes and green/gray storm water retention systems on hydraulic loading, peak flow rates, and runoff volumes.

  16. ISMIP6 - initMIP: Greenland ice sheet model initialisation experiments

    NASA Astrophysics Data System (ADS)

    Goelzer, Heiko; Nowicki, Sophie; Payne, Tony; Larour, Eric; Abe Ouchi, Ayako; Gregory, Jonathan; Lipscomb, William; Seroussi, Helene; Shepherd, Andrew; Edwards, Tamsin

    2016-04-01

    Earlier large-scale Greenland ice sheet sea-level projections, e.g. those run during the ice2sea and SeaRISE initiatives, have shown that ice sheet initialisation can have a large effect on the projections and gives rise to important uncertainties. This intercomparison exercise (initMIP) aims at comparing, evaluating and improving the initialisation techniques used in the ice sheet modeling community and at estimating the associated uncertainties. It is the first in a series of ice sheet model intercomparison activities within ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6). The experiments are conceived for the large-scale Greenland ice sheet and are designed to allow intercomparison between participating models of (1) the initial present-day state of the ice sheet and (2) the response in two schematic forward experiments. The latter experiments serve to evaluate the initialisation in terms of model drift (a forward run without any forcing) and the response to a large perturbation (a prescribed surface mass balance anomaly). We present and discuss first results of the intercomparison and highlight important uncertainties with respect to projections of the Greenland ice sheet's sea-level contribution.

  17. Viscosity Measurement Technique for Metal Fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ban, Heng; Kennedy, Rory

    2015-02-09

    Metallic fuels have exceptional transient behavior, excellent thermal conductivity, and a more straightforward reprocessing path, which does not separate out pure plutonium from the process stream. Fabrication of fuel containing minor actinides and rare earth (RE) elements for irradiation tests, for instance, U-20Pu-3Am-2Np-1.0RE-15Zr samples at the Idaho National Laboratory, is generally done by melt casting in an inert atmosphere. For the design of a casting system and further scale-up development, computational modeling of the casting process is needed to provide information on melt flow and solidification for process optimization. Therefore, there is a need for melt viscosity data, the most important melt property controlling the melt flow. The goal of the project was to develop a measurement technique that uses a fully sealed melt sample, with no americium vapor loss, to determine the viscosity of metallic melts at temperatures relevant to the casting process. The specific objectives of the project were to: develop mathematical models to establish the principle of the measurement method, design and build a viscosity measurement prototype system based on the established principle, and calibrate the system and quantify the uncertainty range. The result of the project indicates that the oscillation cup technique is applicable for melt viscosity measurement. Detailed mathematical models of innovative sample ampoule designs were developed to determine not only melt viscosity but also, under certain designs, melt density. Measurement uncertainties were analyzed and quantified. The result of this project can be used as the initial step toward the eventual goal of establishing a viscosity measurement system for radioactive melts.

  18. Rain, Brains and Climate Change: improving understanding of regional precipitation with medical registration techniques (Invited)

    NASA Astrophysics Data System (ADS)

    Levy, A.; Ingram, W.; Allen, M. R.; Jenkinson, M.; Lambert, F. H.; Huntingford, C.

    2013-12-01

    Precipitation is one of the most important climate variables, but is extremely difficult for general circulation models (GCMs) to simulate accurately. Not only do GCMs disagree on projected and past changes, but they also struggle to get the mean present day distribution of precipitation correct. Some of this disagreement in changes, though, is due to errors in location in the GCMs' mean climate. For example, if the GCMs disagree about the location of the South Asian monsoon, they will disagree in their projections when compared locally, even if they all simulate an intensification of this feature. We have therefore implemented techniques to remove biases of location from GCMs' mean climates. Initially, we adapted medical registration software designed for the analysis of brain scans, using it to transform each GCM's mean climate onto observations. These transformations (or 'warps') were then applied to each GCM's projected changes under an extreme climate change scenario (1% CO2 experiment). We found that both the inter-model range and standard deviation were decreased by 15%, with many regions of the globe receiving a more than 50% reduction [Levy et al. 2013, GRL]. We have now developed a technique tailored to the spatial (longitudinal and latitudinal) and seasonal warping of precipitation fields, which is able to correct precipitation fields much more accurately, as well as providing the option to conserve total global rainfall upon warping. As well as allowing more confident projections of future climate change, this technique can be expected to improve the power and accuracy of the detection and attribution of past changes in precipitation. We have now begun investigating this application, and preliminary results will be presented here.
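
    The registration software itself is not described in enough detail to reproduce; the sketch below only illustrates the final step of applying a precomputed displacement ('warp') field to a projected-change field, using scipy.ndimage.map_coordinates. The displacement field and grid here are hypothetical stand-ins, not the authors' warps.

    ```python
    import numpy as np
    from scipy.ndimage import map_coordinates

    def apply_warp(field, dx, dy):
        """Resample `field` through a displacement field (dx, dy), i.e. a
        'warp' mapping model coordinates onto observation coordinates.
        dx/dy give, for each output pixel, the source offset in pixels."""
        ny, nx = field.shape
        yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
        coords = np.array([yy + dy, xx + dx])
        # Bilinear interpolation; 'wrap' is a crude stand-in for a periodic
        # longitude axis (a real implementation treats lat/lon separately).
        return map_coordinates(field, coords, order=1, mode="wrap")

    # Hypothetical example: shift a projected-change field 2 grid cells east,
    # standing in for a warp that corrects a displaced monsoon feature.
    change = np.random.default_rng(0).normal(size=(73, 144))   # 2.5-degree grid
    dx = np.full(change.shape, 2.0)    # constant eastward displacement
    dy = np.zeros(change.shape)
    corrected = apply_warp(change, dx, dy)
    ```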

  19. Comparison of Fast Neutron Detector Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stange, Sy; Mckigney, Edward Allen

    2015-02-09

    This report documents the work performed for the Department of Homeland Security Domestic Nuclear Detection Office as the project Fast Neutron Detection Evaluation under contract HSHQDC-14-X-00022. This study was performed as a follow-on to the project Study of Fast Neutron Signatures and Measurement Techniques for SNM Detection - DNDO CFP11-100 STA-01. That work compared various detector technologies in a portal monitor configuration, focusing on a comparison between a number of fast neutron detection techniques and two standard thermal neutron detection technologies. The conclusions of the earlier work are contained in the report Comparison of Fast Neutron Detector Technologies. This work is designed to address questions raised about assumptions underlying the models built for the earlier project. To that end, liquid scintillators of two different sizes, one a commercial, off-the-shelf (COTS) model of standard dimensions and the other a large, planar module, were characterized at Los Alamos National Laboratory. The results of those measurements were combined with the results of the earlier models to gain a more complete picture of the performance of liquid scintillator as a portal monitor technology.

  20. Design and Testing of a One-Third Scale Soyuz TM Descent Module Spartan Conversion Project Super Loki Instrumentation

    NASA Technical Reports Server (NTRS)

    Anderson, Loren A.; Armitage, Pamela Kay

    1993-01-01

    The 1992-1993 senior Aerospace Engineering Design class continued work on the post-landing configurations for the Assured Crew Return Vehicle. The Assured Crew Return Vehicle will be permanently docked to the space station, fulfilling NASA's commitment of Assured Crew Return Capability in the event of an accident or illness aboard the space station. The objective of the project was to give the Assured Crew Return Vehicle Project Office data to feed into their feasibility studies. Three design teams were given the task of developing models with dynamically and geometrically scaled characteristics. Groups one and two combined efforts to design a one-third scale model of the Russian Soyuz TM Descent Module and an on-board flotation system. This model was designed to determine the flotation characteristics and test the effects of a rigid flotation and orientation system. Group three designed a portable water wave test facility to be located on campus. Because of additional funding from Thiokol Corporation, testing of the Soyuz model and flotation systems took place at the Offshore Technology Research Center. Universities Space Research Association has been studying the use of small expendable launch vehicles for missions which cost less than 200 million dollars. The Crusader2B, which consists of the original Spartan first and second stages with an additional Spartan second stage and the Minuteman III upper stage, is being considered for this task. University of Central Florida project accomplishments include an analysis of launch techniques, a modeling technique to determine flight characteristics, and input into the redesign of an existing mobile rail launch platform.

  1. Getting expert systems off the ground: Lessons learned from integrating model-based diagnostics with prototype flight hardware

    NASA Technical Reports Server (NTRS)

    Stephan, Amy; Erikson, Carol A.

    1991-01-01

    As an initial attempt to introduce expert system technology into an onboard environment, a model-based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model-based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model-based diagnostics were limited. While this project met its objective of showing that model-based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. To develop expert systems that are ready for flight, developers must evaluate artificial intelligence techniques to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert system and which are better left to the procedural software, and work closely with both the hardware and the software developers from the beginning of a project to produce a well-designed and thoroughly integrated application.

  2. The effects of climate downscaling technique and observational data set on modeled ecological responses

    Treesearch

    Afshin Pourmokhtarian; Charles T. Driscoll; John L. Campbell; Katharine Hayhoe; Anne M. K. Stoner

    2016-01-01

    Assessments of future climate change impacts on ecosystems typically rely on multiple climate model projections, but often utilize only one downscaling approach trained on one set of observations. Here, we explore the extent to which modeled biogeochemical responses to changing climate are affected by the selection of the climate downscaling method and training...

  3. Declining Enrollment: Why? What Are the Trends and Implications?; and Enrollment Forecasting Techniques.

    ERIC Educational Resources Information Center

    Finch, Harold L.; Tatham, Elaine L.

    This document presents a modified cohort survival model which can be of use in making enrollment projections. The model begins by analytically profiling an area's residents. Each person's demographic characteristics--sex, age, place of residence--are recorded in the computer memory. Four major input variables are then incorporated into the model:…

  4. Aberration measurement of projection optics in lithographic tools based on two-beam interference theory.

    PubMed

    Ma, Mingying; Wang, Xiangzhao; Wang, Fan

    2006-11-10

    The degradation of image quality caused by aberrations of projection optics in lithographic tools is a serious problem in optical lithography. We propose what we believe to be a novel technique for measuring aberrations of projection optics based on two-beam interference theory. By utilizing partially coherent imaging theory, a novel model is derived that accurately characterizes the relative image displacement, induced by aberrations, of a fine grating pattern with respect to a large pattern. Both even and odd aberrations are extracted independently from the relative image displacements of the printed patterns by two-beam interference imaging of the zeroth and positive first orders. The simulation results show that by using this technique we can measure the aberrations present in the lithographic tool with higher accuracy.

  5. An object-oriented simulator for 3D digital breast tomosynthesis imaging system.

    PubMed

    Seyyedi, Saeed; Cengiz, Kubra; Kamasak, Mustafa; Yildirim, Isa

    2013-01-01

    Digital breast tomosynthesis (DBT) is an innovative imaging modality that provides 3D reconstructed images of breast to detect the breast cancer. Projections obtained with an X-ray source moving in a limited angle interval are used to reconstruct 3D image of breast. Several reconstruction algorithms are available for DBT imaging. Filtered back projection algorithm has traditionally been used to reconstruct images from projections. Iterative reconstruction algorithms such as algebraic reconstruction technique (ART) were later developed. Recently, compressed sensing based methods have been proposed in tomosynthesis imaging problem. We have developed an object-oriented simulator for 3D digital breast tomosynthesis (DBT) imaging system using C++ programming language. The simulator is capable of implementing different iterative and compressed sensing based reconstruction methods on 3D digital tomosynthesis data sets and phantom models. A user friendly graphical user interface (GUI) helps users to select and run the desired methods on the designed phantom models or real data sets. The simulator has been tested on a phantom study that simulates breast tomosynthesis imaging problem. Results obtained with various methods including algebraic reconstruction technique (ART) and total variation regularized reconstruction techniques (ART+TV) are presented. Reconstruction results of the methods are compared both visually and quantitatively by evaluating performances of the methods using mean structural similarity (MSSIM) values.
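
    The simulator itself is written in C++ and is not reproduced in the record; the following is a minimal NumPy sketch of the core ART (Kaczmarz) update named in the abstract, applied to a generic linear system standing in for the tomosynthesis projection geometry. All names are illustrative.

    ```python
    import numpy as np

    def art(A, b, n_iters=50, relax=0.5, x0=None):
        """Algebraic Reconstruction Technique (Kaczmarz): sweep over the
        projection equations a_i . x = b_i, each time moving x toward the
        hyperplane of the current equation by a relaxed orthogonal step."""
        x = np.zeros(A.shape[1]) if x0 is None else x0.copy()
        row_norms = np.einsum("ij,ij->i", A, A)          # ||a_i||^2 per row
        for _ in range(n_iters):
            for i in range(A.shape[0]):
                if row_norms[i] > 0:
                    residual = b[i] - A[i] @ x
                    x += relax * (residual / row_norms[i]) * A[i]
            x = np.clip(x, 0, None)     # enforce nonnegative attenuation
        return x

    # Tiny synthetic test: a random consistent system standing in for the
    # projection geometry of a tomosynthesis scan (a real A encodes ray paths).
    rng = np.random.default_rng(1)
    A = rng.random((200, 100))
    x_true = np.abs(rng.normal(size=100))
    b = A @ x_true
    x_rec = art(A, b)
    print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
    ```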

  6. An Object-Oriented Simulator for 3D Digital Breast Tomosynthesis Imaging System

    PubMed Central

    Cengiz, Kubra

    2013-01-01

    Digital breast tomosynthesis (DBT) is an innovative imaging modality that provides 3D reconstructed images of breast to detect the breast cancer. Projections obtained with an X-ray source moving in a limited angle interval are used to reconstruct 3D image of breast. Several reconstruction algorithms are available for DBT imaging. Filtered back projection algorithm has traditionally been used to reconstruct images from projections. Iterative reconstruction algorithms such as algebraic reconstruction technique (ART) were later developed. Recently, compressed sensing based methods have been proposed in tomosynthesis imaging problem. We have developed an object-oriented simulator for 3D digital breast tomosynthesis (DBT) imaging system using C++ programming language. The simulator is capable of implementing different iterative and compressed sensing based reconstruction methods on 3D digital tomosynthesis data sets and phantom models. A user friendly graphical user interface (GUI) helps users to select and run the desired methods on the designed phantom models or real data sets. The simulator has been tested on a phantom study that simulates breast tomosynthesis imaging problem. Results obtained with various methods including algebraic reconstruction technique (ART) and total variation regularized reconstruction techniques (ART+TV) are presented. Reconstruction results of the methods are compared both visually and quantitatively by evaluating performances of the methods using mean structural similarity (MSSIM) values. PMID:24371468

  7. Introducing quality improvement methods into local public health departments: structured evaluation of a statewide pilot project.

    PubMed

    Riley, William; Parsons, Helen; McCoy, Kim; Burns, Debra; Anderson, Donna; Lee, Suhna; Sainfort, François

    2009-10-01

    To test the feasibility and assess the preliminary impact of a unique statewide quality improvement (QI) training program designed for public health departments. One hundred and ninety-five public health employees/managers from 38 local health departments throughout Minnesota were selected to participate in a newly developed QI training program and 65 of those engaged in and completed eight expert-supported QI projects over a period of 10 months from June 2007 through March 2008. As part of the Minnesota Quality Improvement Initiative, a structured distance education QI training program was designed and deployed in a first large-scale pilot. To evaluate the preliminary impact of the program, a mixed-method evaluation design was used based on four dimensions: learner reaction, knowledge, intention to apply, and preliminary outcomes. Subjective ratings of three dimensions of training quality were collected from participants after each of the scheduled learning sessions. Pre- and post-QI project surveys were administered to collect participant reactions, knowledge, future intention to apply learning, and perceived outcomes. Monthly and final QI project reports were collected to further inform success and preliminary outcomes of the projects. The participants reported (1) high levels of satisfaction with the training sessions, (2) increased perception of the relevance of the QI techniques, (3) increased perceived knowledge of all specific QI methods and techniques, (4) increased confidence in applying QI techniques on future projects, (5) increased intention to apply techniques on future QI projects, and (6) high perceived success of, and satisfaction with, the projects. Finally, preliminary outcomes data show moderate to large improvements in quality and/or efficiency for six out of eight projects. QI methods and techniques can be successfully implemented in local public health agencies on a statewide basis using the collaborative model through distance training and expert facilitation. This unique training can improve both core and support processes and lead to favorable staff reactions, increased knowledge, and improved health outcomes. The program can be further improved and deployed and holds great promise to facilitate the successful dissemination of proven QI methods throughout local public health departments.

  8. Systems-Level Synthetic Biology for Advanced Biofuel Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruffing, Anne; Jensen, Travis J.; Strickland, Lucas Marshall

    2015-03-01

    Cyanobacteria have been shown to be capable of producing a variety of advanced biofuels; however, product yields remain well below those necessary for large scale production. New genetic tools and high throughput metabolic engineering techniques are needed to optimize cyanobacterial metabolisms for enhanced biofuel production. Towards this goal, this project advances the development of a multiple promoter replacement technique for systems-level optimization of gene expression in a model cyanobacterial host: Synechococcus sp. PCC 7002. To realize this multiple-target approach, key capabilities were developed, including a high throughput detection method for advanced biofuels, enhanced transformation efficiency, and genetic tools for Synechococcus sp. PCC 7002. Moreover, several additional obstacles were identified for realization of this multiple promoter replacement technique. The techniques and tools developed in this project will help to enable future efforts in the advancement of cyanobacterial biofuels.

  9. Model-Driven Design: Systematically Building Integrated Blended Learning Experiences

    ERIC Educational Resources Information Center

    Laster, Stephen

    2010-01-01

    Developing and delivering curricula that are integrated and that use blended learning techniques requires a highly orchestrated design. While institutions have demonstrated the ability to design complex curricula on an ad-hoc basis, these projects are generally successful at a great human and capital cost. Model-driven design provides a…

  10. Los Angeles County Parks and Recreation.

    ERIC Educational Resources Information Center

    Iowa Univ., Iowa City. Recreation Education Program.

    Presented are duplications of the responses given by the Los Angeles County Parks and Recreation Rehabilitation Unit (California) as part of a project to collect, share, and compile information about, and techniques in the operation of 18 community action models for recreation services to the disabled. Model programs are categorized as consumer,…

  11. Constructing an Urban Population Model for Medical Insurance Scheme Using Microsimulation Techniques

    PubMed Central

    Xiong, Linping; Zhang, Lulu; Tang, Weidong; Ma, Yuqin

    2012-01-01

    China launched a pilot project of medical insurance reform in 79 cities in 2007 to cover urban nonworking residents. An urban population model was created in this paper for China's medical insurance scheme using microsimulation techniques. The model made clear to policy makers the population distributions of the different groups of people who are potential urban entrants to the medical insurance scheme. The income trends of units of individuals and families were also obtained. These factors are essential in making the challenging policy decisions required to balance the long-term financial sustainability of the medical insurance scheme. PMID:22481973

  12. A Modified Particle Swarm Optimization Technique for Finding Optimal Designs for Mixture Models

    PubMed Central

    Wong, Weng Kee; Chen, Ray-Bing; Huang, Chien-Chih; Wang, Weichung

    2015-01-01

    Particle Swarm Optimization (PSO) is a meta-heuristic algorithm that has been shown to be successful in solving a wide variety of real and complicated optimization problems in engineering and computer science. This paper introduces a projection-based PSO technique, named ProjPSO, to efficiently find different types of optimal designs, or nearly optimal designs, for mixture models with and without constraints on the components, and also for related models, like the log contrast models. We also compare the modified PSO performance with Fedorov's algorithm, a popular algorithm used to generate optimal designs, the Cocktail algorithm, and the recent algorithm proposed by [1]. PMID:26091237
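
    The record does not include the algorithmic details of ProjPSO; the sketch below shows the general idea of a projection-based PSO under the assumption that 'projection' means mapping each candidate design back onto the mixture simplex (nonnegative components summing to one) after every velocity update. The objective used here is a stand-in, not a real design-optimality criterion.

    ```python
    import numpy as np

    def project_simplex(v):
        """Euclidean projection of v onto the simplex {x >= 0, sum x = 1},
        i.e. the feasible region for mixture-model component proportions."""
        u = np.sort(v)[::-1]
        css = np.cumsum(u)
        j = np.arange(1, len(v) + 1)
        rho = np.nonzero(u + (1.0 - css) / j > 0)[0][-1]
        theta = (1.0 - css[rho]) / (rho + 1)
        return np.maximum(v + theta, 0.0)

    def proj_pso(objective, dim, n_particles=30, n_iters=200, seed=0,
                 w=0.7, c1=1.5, c2=1.5):
        """Minimal projection-based PSO: plain global-best PSO, except every
        position update is followed by projection onto the mixture simplex."""
        rng = np.random.default_rng(seed)
        x = rng.dirichlet(np.ones(dim), size=n_particles)   # feasible start
        v = np.zeros_like(x)
        pbest = x.copy()
        pbest_f = np.array([objective(p) for p in x])
        g = pbest[np.argmin(pbest_f)].copy()
        for _ in range(n_iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.array([project_simplex(p) for p in x + v])
            f = np.array([objective(p) for p in x])
            better = f < pbest_f
            pbest[better], pbest_f[better] = x[better], f[better]
            g = pbest[np.argmin(pbest_f)].copy()
        return g, pbest_f.min()

    # Stand-in objective: distance to a known mixture; a design criterion
    # (e.g. D-optimality) would replace this in practice.
    target = np.array([0.5, 0.3, 0.2])
    best, fbest = proj_pso(lambda q: np.sum((q - target) ** 2), dim=3)
    print(best, fbest)   # should approach the target mixture
    ```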

  13. High Penetration Solar PV Deployment Sunshine State Solar Grid Initiative (SUNGRIN)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meeker, Rick; Steurer, Mischa; Faruque, MD Omar

    The report provides results from the Sunshine State Solar Grid Initiative (SUNGRIN) high penetration solar PV deployment project led by Florida State University’s (FSU) Center for Advanced Power Systems (CAPS). FSU CAPS and industry and university partners have completed a five-year effort aimed at enabling effective integration of high penetration levels of grid-connected solar PV generation. SUNGRIN has made significant contributions in the development of simulation-assisted techniques, tools, insight and understanding associated with solar PV effects on electric power system (EPS) operation and the evaluation of mitigation options for maintaining reliable operation. An important element of the project was the partnership and participation of six major Florida utilities and the Florida Reliability Coordinating Council (FRCC). Utilities provided details and data associated with actual distribution circuits having high-penetration PV to use as case studies. The project also conducted foundational work supporting future investigations of effects at the transmission / bulk power system level. In the final phase of the project, four open-use models with built-in case studies were developed and released, along with synthetic solar PV data sets, and tools and techniques for model reduction and in-depth parametric studies of solar PV impact on distribution circuits. Along with models and data, at least 70 supporting MATLAB functions have been developed and made available, with complete documentation.

  14. Flexible multibody simulation of automotive systems with non-modal model reduction techniques

    NASA Astrophysics Data System (ADS)

    Shiiba, Taichi; Fehr, Jörg; Eberhard, Peter

    2012-12-01

    The stiffness of the body structure of an automobile has a strong relationship with its noise, vibration, and harshness (NVH) characteristics. In this paper, the effect of the stiffness of the body structure upon ride quality is discussed with flexible multibody dynamics. In flexible multibody simulation, the local elastic deformation of the vehicle has traditionally been described with modal shape functions. Recently, linear model reduction techniques from system dynamics and mathematics came into focus as a way to find more sophisticated elastic shape functions. In this work, the NVH-relevant states of a racing kart are simulated, where the elastic shape functions are calculated with modern model reduction techniques such as moment matching by projection on Krylov subspaces, singular value decomposition-based reduction techniques, and combinations of those. The whole elastic multibody vehicle model consisting of tyres, steering, axle, etc. is considered, and an excitation with vibration characteristics in a wide frequency range is evaluated in this paper. The accuracy and the calculation performance of those modern model reduction techniques are investigated, including a comparison with the modal reduction approach.
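
    As a generic illustration of the moment-matching technique named in the abstract (not the authors' implementation), the following sketch reduces a linear time-invariant system by projecting it onto a Krylov subspace built at s = 0, then checks that the DC gain of the transfer function is preserved. The toy system is random and hypothetical.

    ```python
    import numpy as np
    from scipy.linalg import qr

    def krylov_reduction(A, B, C, r):
        """One-sided moment matching at s = 0: build an orthonormal basis V
        of the Krylov subspace span{A^-1 B, A^-2 B, ...} and project the LTI
        system (A, B, C) onto it; the reduced model matches the first r
        moments of the transfer function at s = 0."""
        n = A.shape[0]
        Ainv = np.linalg.inv(A)             # fine for a sketch; use a solver at scale
        K = np.zeros((n, r))
        v = Ainv @ B
        for k in range(r):
            K[:, k] = v.ravel()
            v = Ainv @ v
        V, _ = qr(K, mode="economic")       # orthonormalize the Krylov vectors
        return V.T @ A @ V, V.T @ B, C @ V  # reduced r x r system

    # Toy stable system: compare DC gains of the full and reduced models.
    rng = np.random.default_rng(2)
    n = 50
    A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))
    B = rng.standard_normal((n, 1))
    C = rng.standard_normal((1, n))
    Ar, Br, Cr = krylov_reduction(A, B, C, r=8)
    g_full = (C @ np.linalg.solve(-A, B)).item()
    g_red = (Cr @ np.linalg.solve(-Ar, Br)).item()
    print(g_full, g_red)    # DC gain preserved by moment matching at s = 0
    ```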

  15. Pavement Performance : Approaches Using Predictive Analytics

    DOT National Transportation Integrated Search

    2018-03-23

    Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indicators that include pavement distress,...

  16. Lumped Model Generation and Evaluation: Sensitivity and Lie Algebraic Techniques with Applications to Combustion

    DTIC Science & Technology

    1989-03-03

    address global parameter space mapping issues for first order differential equations. The rigorous criteria for the existence of exact lumping by linear projective transformations were also established.

  17. Stochastic modeling and control system designs of the NASA/MSFC Ground Facility for large space structures: The maximum entropy/optimal projection approach

    NASA Technical Reports Server (NTRS)

    Hsia, Wei-Shen

    1986-01-01

    In the Control Systems Division of the Systems Dynamics Laboratory of the NASA/MSFC, a Ground Facility (GF), in which the dynamics and control system concepts being considered for Large Space Structures (LSS) applications can be verified, was designed and built. One of the important aspects of the GF is to design an analytical model which will be as close to experimental data as possible so that a feasible control law can be generated. Using Hyland's Maximum Entropy/Optimal Projection Approach, a procedure was developed in which the maximum entropy principle is used for stochastic modeling and the optimal projection technique is used for a reduced-order dynamic compensator design for a high-order plant.

  18. Bayesian reconstruction of projection reconstruction NMR (PR-NMR).

    PubMed

    Yoon, Ji Won

    2014-11-01

    Projection reconstruction nuclear magnetic resonance (PR-NMR) is a technique for generating multidimensional NMR spectra. A small number of projections from lower-dimensional NMR spectra are used to reconstruct the multidimensional NMR spectra. In our previous work, it was shown that multidimensional NMR spectra are efficiently reconstructed using peak-by-peak based reversible jump Markov chain Monte Carlo (RJMCMC) algorithm. We propose an extended and generalized RJMCMC algorithm replacing a simple linear model with a linear mixed model to reconstruct close NMR spectra into true spectra. This statistical method generates samples in a Bayesian scheme. Our proposed algorithm is tested on a set of six projections derived from the three-dimensional 700 MHz HNCO spectrum of a protein HasA. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. SEM-PLS Analysis of Inhibiting Factors of Cost Performance for Large Construction Projects in Malaysia: Perspective of Clients and Consultants

    PubMed Central

    Memon, Aftab Hameed; Rahman, Ismail Abdul

    2014-01-01

    This study uncovered inhibiting factors to cost performance in large construction projects of Malaysia. Questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped in 7 categories were presented to the respondents for rating significant level of each factor. A total of 300 questionnaire forms were distributed. Only 144 completed sets were received and analysed using advanced multivariate statistical software of Structural Equation Modelling (SmartPLS v2). The analysis involved three iteration processes where several of the factors were deleted in order to make the model acceptable. The result of the analysis found that the R2 value of the model is 0.422, which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, contractor's site management category is the most prominent in exhibiting effect on cost performance of large construction projects. This finding is validated using advanced techniques of power analysis. This vigorous multivariate analysis has explicitly found the significant category which consists of several causative factors to poor cost performance in large construction projects. This will benefit all parties involved in construction projects for controlling cost overrun. PMID:24693227

  20. SEM-PLS analysis of inhibiting factors of cost performance for large construction projects in Malaysia: perspective of clients and consultants.

    PubMed

    Memon, Aftab Hameed; Rahman, Ismail Abdul

    2014-01-01

    This study uncovered inhibiting factors to cost performance in large construction projects of Malaysia. Questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped in 7 categories were presented to the respondents for rating significant level of each factor. A total of 300 questionnaire forms were distributed. Only 144 completed sets were received and analysed using advanced multivariate statistical software of Structural Equation Modelling (SmartPLS v2). The analysis involved three iteration processes where several of the factors were deleted in order to make the model acceptable. The result of the analysis found that the R(2) value of the model is 0.422, which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, contractor's site management category is the most prominent in exhibiting effect on cost performance of large construction projects. This finding is validated using advanced techniques of power analysis. This vigorous multivariate analysis has explicitly found the significant category which consists of several causative factors to poor cost performance in large construction projects. This will benefit all parties involved in construction projects for controlling cost overrun.

  1. SU-C-207-04: Reconstruction Artifact Reduction in X-Ray Cone Beam CT Using a Treatment Couch Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lasio, G; Hu, E; Zhou, J

    2015-06-15

    Purpose: To mitigate artifacts induced by the presence of the RT treatment couch in on-board CBCT and improve image quality. Methods: A model of a Varian IGRT couch is constructed using a CBCT scan of the couch in air. The model is used to generate a set of forward projections (FP) of the treatment couch at specified gantry angles. The model couch forward projections are then used to process CBCT scan projections which contain the couch in addition to the scan object (Catphan phantom), in order to remove the attenuation component of the couch at any given gantry angle. Prior to pre-processing with the model FP, the Catphan projection data is normalized to an air scan with bowtie filter. The filtered Catphan projections are used to reconstruct the CBCT with an in-house FDK algorithm. The artifact reduction in the processed CBCT scan is assessed visually, and the image quality improvement is measured with the CNR over a few selected ROIs of the Catphan modules. Results: Sufficient match between the forward projected data and the x-ray projections is achieved to allow filtering in attenuation space. Visual improvement of the couch induced artifacts is achieved, with a moderate expense of CNR. Conclusion: Couch model-based correction of CBCT projection data has a potential for qualitative improvement of clinical CBCT scans, without requiring position specific correction data. The technique could be used to produce models of other artifact inducing devices, such as immobilization boards, and reduce their impact on patient CBCT images.
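
    A minimal sketch of the attenuation-space subtraction step described above, assuming hypothetical array shapes and that the couch forward projections have already been computed from the couch model: because line integrals add along each ray, the couch component can be subtracted from the log-converted, air-normalized projections before FDK reconstruction.

    ```python
    import numpy as np

    def remove_couch(projections, air_scan, couch_forward_projections):
        """Subtract a known device's attenuation from CBCT projections.

        projections, air_scan: raw detector intensities, (n_angles, n_u, n_v)
        couch_forward_projections: line integrals of the couch model at the
        same gantry angles, produced by forward-projecting the couch scan.
        Returns couch-free line integrals ready for FDK reconstruction."""
        # Normalize to the air scan and convert to line integrals:
        # I = I0 * exp(-p)  =>  p = -ln(I / I0)
        p_total = -np.log(np.clip(projections / air_scan, 1e-8, None))
        # Line integrals are additive along each ray, so the couch component
        # can simply be subtracted where the model and measurement align.
        return np.clip(p_total - couch_forward_projections, 0.0, None)

    # Hypothetical check: 360 views on a 256x64 detector.
    rng = np.random.default_rng(3)
    I0 = np.full((360, 256, 64), 1e4)
    p_couch = 0.3 * rng.random((360, 256, 64))       # stand-in couch projections
    p_phantom = 0.8 * rng.random((360, 256, 64))     # stand-in object projections
    I = I0 * np.exp(-(p_couch + p_phantom))
    p_rec = remove_couch(I, I0, p_couch)
    print(np.allclose(p_rec, p_phantom, atol=1e-6))  # True: couch removed
    ```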

  2. An Analysis of a Comprehensive Evaluation Model for Guided Group Interaction Techniques with Juvenile Delinquents. Final Report.

    ERIC Educational Resources Information Center

    Silverman, Mitchell

    Reported are the first phase activities of a longitudinal project designed to evaluate the effectiveness of Guided Group Interaction (GGI) technique as a meaningful approach in the field of corrections. The main findings relate to the establishment of reliability for the main components of the Revised Behavior Scores System developed to assess the…

  3. Development of Improved Oil Field Waste Injection Disposal Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terralog Technologies

    2002-11-25

    The goals of this project were to: (1) assemble and analyze a comprehensive database of past waste injection operations; (2) develop improved diagnostic techniques for monitoring fracture growth and formation changes; (3) develop operating guidelines to optimize daily operations and ultimate storage capacity of the target formation; and (4) apply these improved models and guidelines in the field.

  4. Numerical model estimating the capabilities and limitations of the fast Fourier transform technique in absolute interferometry

    NASA Astrophysics Data System (ADS)

    Talamonti, James J.; Kay, Richard B.; Krebs, Danny J.

    1996-05-01

    A numerical model was developed to emulate the capabilities of systems performing noncontact absolute distance measurements. The model incorporates known methods to minimize signal processing and digital sampling errors and evaluates the accuracy limitations imposed by spectral peak isolation by using Hanning, Blackman, and Gaussian windows in the fast Fourier transform technique. We applied this model to the specific case of measuring the relative lengths of a compound Michelson interferometer. By processing computer-simulated data through our model, we project the ultimate precision for ideal data and for data containing AM-FM noise. The precision is shown to be limited by nonlinearities in the laser scan.
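
    As a small, generic illustration of the windowed-FFT peak isolation the abstract evaluates (not the authors' model), the following compares Hanning, Blackman, and Gaussian windows for estimating a beat frequency, refining the peak location by parabolic interpolation of the log magnitude. The signal and sample rate are invented.

    ```python
    import numpy as np
    from scipy.signal import get_window

    fs, n = 1.0e6, 4096                       # sample rate (Hz), record length
    t = np.arange(n) / fs
    # Two-tone test signal standing in for interferometer beat frequencies.
    sig = (np.sin(2 * np.pi * 52_347.0 * t)
           + 0.02 * np.sin(2 * np.pi * 61_000.0 * t))

    # 'hann' is scipy's name for the Hanning window.
    for name in ["hann", "blackman", ("gaussian", n / 8)]:
        w = get_window(name, n)
        spec = np.abs(np.fft.rfft(sig * w))
        k = np.argmax(spec)
        # Parabolic interpolation of the log-magnitude peak refines the
        # frequency estimate well below the FFT bin spacing (fs / n ~ 244 Hz).
        a, b, c = np.log(spec[k - 1 : k + 2])
        delta = 0.5 * (a - c) / (a - 2 * b + c)
        f_est = (k + delta) * fs / n
        print(name, f"estimated peak: {f_est:.1f} Hz")
    ```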

  5. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  6. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of model checking, the formal methods analysis technique that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  7. La Mancha Plus One--1969: Proceedings of the Annual Model Secondary School Conference (2nd, University of Vermont, May 23-24, 1969).

    ERIC Educational Resources Information Center

    Manchel, Frank, Ed.; Clark, Virginia, Ed.

    This overview of the first year of the La Mancha Project consists of papers on various aspects of the 5-year project to improve the composition instruction in Vermont schools, incorporating workshop and individual student conference techniques, and integrating writing with other academic studies. The papers include discussions of (1) ninth grade…

  8. Software Acquisition Process (SWAP) Model FY81

    DTIC Science & Technology

    1982-12-01

    experience. In addition, the manpower accounting techniques and the effects of resource limitation are described below. a. Contractor Personnel. Five job...developed are each oriented to a specific type of developmental activity. Between them, they account for all types of activities in the acquisition...manning levels and duration; Decision Box probability; and project staffing levels. They take into account the overall size of the project and the

  9. Research in nonlinear structural and solid mechanics

    NASA Technical Reports Server (NTRS)

    Mccomb, H. G., Jr. (Compiler); Noor, A. K. (Compiler)

    1981-01-01

    Recent and projected advances in applied mechanics, numerical analysis, computer hardware and engineering software, and their impact on modeling and solution techniques in nonlinear structural and solid mechanics are discussed. The fields covered are rapidly changing and are strongly impacted by current and projected advances in computer hardware. To foster effective development of the technology, perceptions on computing systems and nonlinear analysis software systems are presented.

  10. Use of Machine Learning Techniques for Identification of Robust Teleconnections to East African Rainfall Variability in Observations and Models

    NASA Technical Reports Server (NTRS)

    Roberts, J. Brent; Robertson, Franklin R.; Funk, Chris

    2014-01-01

    Providing advance warning of East African rainfall variations is a particular focus of several groups including those participating in the Famine Early Warning Systems Network. Both seasonal and long-term model projections of climate variability are being used to examine the societal impacts of hydrometeorological variability on seasonal to interannual and longer time scales. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of both seasonal and climate model projections to develop downscaled scenarios for use in impact modeling. The utility of these projections relies on the ability of current models to capture the embedded relationships between East African rainfall and evolving forcing within the coupled ocean-atmosphere-land climate system. Previous studies have posited relationships between variations in El Niño, the Walker circulation, Pacific decadal variability (PDV), and anthropogenic forcing. This study applies machine learning methods (e.g. clustering, probabilistic graphical models, nonlinear PCA) to observational datasets in an attempt to expose the importance of local and remote forcing mechanisms of East African rainfall variability. The ability of the NASA Goddard Earth Observing System (GEOS5) coupled model to capture the associated relationships will be evaluated using Coupled Model Intercomparison Project Phase 5 (CMIP5) simulations.

  11. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; et al.

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.

  12. Techniques for virtual lung nodule insertion: volumetric and morphometric comparison of projection-based and image-based methods for quantitative CT

    NASA Astrophysics Data System (ADS)

    Robins, Marthony; Solomon, Justin; Sahbaee, Pooyan; Sedlmair, Martin; Choudhury, Kingshuk Roy; Pezeshk, Aria; Sahiner, Berkman; Samei, Ehsan

    2017-09-01

    Virtual nodule insertion paves the way towards the development of standardized databases of hybrid CT images with known lesions. The purpose of this study was to assess three methods (an established and two newly developed techniques) for inserting virtual lung nodules into CT images. Assessment was done by comparing virtual nodule volume and shape to the CT-derived volume and shape of synthetic nodules. 24 synthetic nodules (three sizes, four morphologies, two repeats) were physically inserted into the lung cavity of an anthropomorphic chest phantom (KYOTO KAGAKU). The phantom was imaged with and without nodules on a commercial CT scanner (SOMATOM Definition Flash, Siemens) using a standard thoracic CT protocol at two dose levels (1.4 and 22 mGy CTDIvol). Raw projection data were saved and reconstructed with filtered back-projection and sinogram affirmed iterative reconstruction (SAFIRE, strength 5) at 0.6 mm slice thickness. Corresponding 3D idealized, virtual nodule models were co-registered with the CT images to determine each nodule’s location and orientation. Virtual nodules were voxelized, partial volume corrected, and inserted into nodule-free CT data (accounting for system imaging physics) using two methods: projection-based Technique A, and image-based Technique B. Also a third Technique C based on cropping a region of interest from the acquired image of the real nodule and blending it into the nodule-free image was tested. Nodule volumes were measured using a commercial segmentation tool (iNtuition, TeraRecon, Inc.) and deformation was assessed using the Hausdorff distance. Nodule volumes and deformations were compared between the idealized, CT-derived and virtual nodules using a linear mixed effects regression model which utilized the mean, standard deviation, and coefficient of variation (MeanRHD, STDRHD, and CVRHD) of the regional Hausdorff distance. Overall, there was a close concordance between the volumes of the CT-derived and virtual nodules. Percent differences between them were less than 3% for all insertion techniques and were not statistically significant in most cases. Correlation coefficient values were greater than 0.97. The deformation according to the Hausdorff distance was also similar between the CT-derived and virtual nodules, with minimal statistical significance in the CVRHD for Techniques A, B, and C. This study shows that both projection-based and image-based nodule insertion techniques yield realistic nodule renderings with statistical similarity to the synthetic nodules with respect to nodule volume and deformation. These techniques could be used to create a database of hybrid CT images containing nodules of known size, location and morphology.

  13. Techniques for virtual lung nodule insertion: volumetric and morphometric comparison of projection-based and image-based methods for quantitative CT

    PubMed Central

    Robins, Marthony; Solomon, Justin; Sahbaee, Pooyan; Sedlmair, Martin; Choudhury, Kingshuk Roy; Pezeshk, Aria; Sahiner, Berkman; Samei, Ehsan

    2017-01-01

    Virtual nodule insertion paves the way towards the development of standardized databases of hybrid CT images with known lesions. The purpose of this study was to assess three methods (an established and two newly developed techniques) for inserting virtual lung nodules into CT images. Assessment was done by comparing virtual nodule volume and shape to the CT-derived volume and shape of synthetic nodules. 24 synthetic nodules (three sizes, four morphologies, two repeats) were physically inserted into the lung cavity of an anthropomorphic chest phantom (KYOTO KAGAKU). The phantom was imaged with and without nodules on a commercial CT scanner (SOMATOM Definition Flash, Siemens) using a standard thoracic CT protocol at two dose levels (1.4 and 22 mGy CTDIvol). Raw projection data were saved and reconstructed with filtered back-projection and sinogram affirmed iterative reconstruction (SAFIRE, strength 5) at 0.6 mm slice thickness. Corresponding 3D idealized, virtual nodule models were co-registered with the CT images to determine each nodule’s location and orientation. Virtual nodules were voxelized, partial volume corrected, and inserted into nodule-free CT data (accounting for system imaging physics) using two methods: projection-based Technique A, and image-based Technique B. Also a third Technique C based on cropping a region of interest from the acquired image of the real nodule and blending it into the nodule-free image was tested. Nodule volumes were measured using a commercial segmentation tool (iNtuition, TeraRecon, Inc.) and deformation was assessed using the Hausdorff distance. Nodule volumes and deformations were compared between the idealized, CT-derived and virtual nodules using a linear mixed effects regression model which utilized the mean, standard deviation, and coefficient of variation (MeanRHD, STDRHD, and CVRHD) of the regional Hausdorff distance. Overall, there was a close concordance between the volumes of the CT-derived and virtual nodules. Percent differences between them were less than 3% for all insertion techniques and were not statistically significant in most cases. Correlation coefficient values were greater than 0.97. The deformation according to the Hausdorff distance was also similar between the CT-derived and virtual nodules, with minimal statistical significance in the CVRHD for Techniques A, B, and C. This study shows that both projection-based and image-based nodule insertion techniques yield realistic nodule renderings with statistical similarity to the synthetic nodules with respect to nodule volume and deformation. These techniques could be used to create a database of hybrid CT images containing nodules of known size, location and morphology. PMID:28786399

  14. Characterization of Orbital Debris via Hyper-Velocity Ground-Based Tests

    NASA Technical Reports Server (NTRS)

    Cowardin, Heather

    2016-01-01

    The purpose of the DebriSat project is to replicate a hyper-velocity fragmentation event using modern-day spacecraft materials and construction techniques to improve the existing DoD and NASA breakup models.

  15. Regional Climate Change across the Continental U.S. Projected from Downscaling IPCC AR5 Simulations

    NASA Astrophysics Data System (ADS)

    Otte, T. L.; Nolte, C. G.; Otte, M. J.; Pinder, R. W.; Faluvegi, G.; Shindell, D. T.

    2011-12-01

    Projecting climate change scenarios to local scales is important for understanding and mitigating the effects of climate change on society and the environment. Many of the general circulation models (GCMs) that are participating in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) do not fully resolve regional-scale processes and therefore cannot capture local changes in temperature and precipitation extremes. We seek to project the GCM's large-scale climate change signal to the local scale using a regional climate model (RCM) by applying dynamical downscaling techniques. The RCM will be used to better understand the local changes of temperature and precipitation extremes that may result from a changing climate. Preliminary results from downscaling NASA/GISS ModelE simulations of the IPCC AR5 Representative Concentration Pathway (RCP) scenario 6.0 will be shown. The Weather Research and Forecasting (WRF) model will be used as the RCM to downscale decadal time slices for ca. 2000 and ca. 2030 and illustrate potential changes in regional climate for the continental U.S. that are projected by ModelE and WRF under RCP6.0.

  16. Improved pattern scaling approaches for the use in climate impact studies

    NASA Astrophysics Data System (ADS)

    Herger, Nadja; Sanderson, Benjamin M.; Knutti, Reto

    2015-05-01

    Pattern scaling is a simple way to produce climate projections beyond the scenarios run with expensive global climate models (GCMs). The simplest technique has known limitations and assumes that a spatial climate anomaly pattern obtained from a GCM can be scaled by the global mean temperature (GMT) anomaly. We propose alternatives and assess their skills and limitations. One approach which avoids scaling is to consider a period in a different scenario with the same GMT change. It is attractive as it provides patterns of any temporal resolution that are consistent across variables, and it does not distort variability. Second, we extend the traditional approach with a land-sea contrast term, which provides the largest improvements over the traditional technique. When interpolating between known bounding scenarios, the proposed methods significantly improve the accuracy of the pattern scaled scenario with little computational cost. The remaining errors are much smaller than the Coupled Model Intercomparison Project Phase 5 model spread.
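
    A minimal sketch of the extended technique as described, assuming the extension amounts to a per-gridpoint least-squares fit local(x, t) ~ a(x) * dGMT(t) + b(x) * dLandSea(t); the traditional approach is the special case b(x) = 0. The data here are synthetic stand-ins for GCM output.

    ```python
    import numpy as np

    def fit_pattern(local_anom, gmt_anom, landsea_anom):
        """Per-gridpoint least-squares fit of the extended pattern-scaling
        model local(x, t) ~ a(x) * dGMT(t) + b(x) * dLandSea(t).
        local_anom: (n_times, n_points); gmt_anom, landsea_anom: (n_times,)"""
        X = np.column_stack([gmt_anom, landsea_anom])        # (n_times, 2)
        coef, *_ = np.linalg.lstsq(X, local_anom, rcond=None)
        return coef[0], coef[1]                              # a(x), b(x)

    def scale_pattern(a, b, gmt_target, landsea_target):
        """Emulate a new scenario from the fitted patterns."""
        return a * gmt_target + b * landsea_target

    # Synthetic stand-in: 100 grid points over 50 years of warming.
    rng = np.random.default_rng(4)
    gmt = np.linspace(0, 3, 50)                  # GMT anomaly trajectory (K)
    landsea = 0.4 * gmt + 0.05 * rng.standard_normal(50)
    a_true, b_true = rng.standard_normal((2, 100))
    field = np.outer(gmt, a_true) + np.outer(landsea, b_true)
    a, b = fit_pattern(field, gmt, landsea)
    print(np.allclose(a, a_true), np.allclose(b, b_true))   # True True
    ```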

  17. Advanced space power requirements and techniques. Task 1: Mission projections and requirements. Volume 3: Appendices. [cost estimates and computer programs

    NASA Technical Reports Server (NTRS)

    Wolfe, M. G.

    1978-01-01

    Contents: (1) general study guidelines and assumptions; (2) launch vehicle performance and cost assumptions; (3) satellite programs 1959 to 1979; (4) initiative mission and design characteristics; (5) satellite listing; (6) spacecraft design model; (7) spacecraft cost model; (8) mission cost model; and (9) nominal and optimistic budget program cost summaries.

  18. A Method for Calculating the Probability of Successfully Completing a Rocket Propulsion Ground Test

    NASA Technical Reports Server (NTRS)

    Messer, Bradley

    2007-01-01

    Propulsion ground test facilities face the daily challenge of scheduling multiple customers into limited facility space and successfully completing their propulsion test projects. Over the last decade NASA's propulsion test facilities have performed hundreds of tests, collected thousands of seconds of test data, and exceeded the capabilities of numerous test facility and test article components. A logistic regression mathematical modeling technique has been developed to predict the probability of successfully completing a rocket propulsion test. A logistic regression model is a mathematical modeling approach that can be used to describe the relationship of several independent predictor variables X1, X2, ..., Xk to a binary or dichotomous dependent variable Y, where Y can only be one of two possible outcomes, in this case Success or Failure of accomplishing a full duration test. The use of logistic regression modeling is not new; however, modeling propulsion ground test facilities using logistic regression is both a new and unique application of the statistical technique. Results from this type of model provide project managers with insight and confidence into the effectiveness of rocket propulsion ground testing.
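
    The record states the model form explicitly: P(Y = 1 | x) = 1 / (1 + exp(-(b0 + b1*x1 + ... + bk*xk))). The sketch below fits such a model by maximum likelihood (gradient ascent) on synthetic records; the predictor names are hypothetical, as the report does not list the actual variables used.

    ```python
    import numpy as np

    def fit_logistic(X, y, lr=0.1, n_iters=5000):
        """Fit P(Y=1 | x) = 1 / (1 + exp(-(b0 + x . b))) by gradient ascent
        on the log-likelihood. X: (n, k) predictors; y: (n,) binary outcomes
        (1 = full-duration test completed, 0 = not)."""
        Xb = np.column_stack([np.ones(len(X)), X])     # prepend intercept
        beta = np.zeros(Xb.shape[1])
        for _ in range(n_iters):
            p = 1 / (1 + np.exp(-Xb @ beta))
            beta += lr * Xb.T @ (y - p) / len(y)       # likelihood gradient
        return beta

    def predict_success(beta, x):
        return 1 / (1 + np.exp(-(beta[0] + x @ beta[1:])))

    # Hypothetical predictors (e.g. test duration, article maturity score,
    # facility utilization); outcomes drawn from a known model for checking.
    rng = np.random.default_rng(5)
    X = rng.standard_normal((500, 3))
    true_beta = np.array([0.5, 1.0, -2.0, 0.8])
    p_true = 1 / (1 + np.exp(-(true_beta[0] + X @ true_beta[1:])))
    y = (rng.random(500) < p_true).astype(float)
    beta = fit_logistic(X, y)
    print(beta)   # should land near [0.5, 1.0, -2.0, 0.8]
    print("P(success), nominal test:", predict_success(beta, np.zeros(3)))
    ```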

  19. A dynamic model-based approach to motion and deformation tracking of prosthetic valves from biplane x-ray images.

    PubMed

    Wagner, Martin G; Hatt, Charles R; Dunkerley, David A P; Bodart, Lindsay E; Raval, Amish N; Speidel, Michael A

    2018-04-16

    Transcatheter aortic valve replacement (TAVR) is a minimally invasive procedure in which a prosthetic heart valve is placed and expanded within a defective aortic valve. The device placement is commonly performed using two-dimensional (2D) fluoroscopic imaging. Within this work, we propose a novel technique to track the motion and deformation of the prosthetic valve in three dimensions based on biplane fluoroscopic image sequences. The tracking approach uses a parameterized point cloud model of the valve stent which can undergo rigid three-dimensional (3D) transformation and different modes of expansion. Rigid elements of the model are individually rotated and translated in three dimensions to approximate the motions of the stent. Tracking is performed using an iterative 2D-3D registration procedure which estimates the model parameters by minimizing the mean-squared image values at the positions of the forward-projected model points. Additionally, an initialization technique is proposed, which locates clusters of salient features to determine the initial position and orientation of the model. The proposed algorithms were evaluated based on simulations using a digital 4D CT phantom as well as experimentally acquired images of a prosthetic valve inside a chest phantom with anatomical background features. The target registration error was 0.12 ± 0.04 mm in the simulations and 0.64 ± 0.09 mm in the experimental data. The proposed algorithm could be used to generate 3D visualization of the prosthetic valve from two projections. In combination with soft-tissue sensitive-imaging techniques like transesophageal echocardiography, this technique could enable 3D image guidance during TAVR procedures. © 2018 American Association of Physicists in Medicine.
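
    A toy sketch of the 2D-3D registration objective described above: rigid parameters are optimized so that the forward-projected model points fall on dark (stent-like) image regions, by minimizing the mean-squared image values at the projected positions. The parallel-projection geometry and single-point model here are simplifications, not the paper's calibrated cone-beam setup.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.ndimage import map_coordinates

    def rigid_transform(points, params):
        """Rigid 3D transform with params = (tx, ty, tz, rx, ry, rz)."""
        tx, ty, tz, rx, ry, rz = params
        cx, sx, cy, sy = np.cos(rx), np.sin(rx), np.cos(ry), np.sin(ry)
        cz, sz = np.cos(rz), np.sin(rz)
        Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
        Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
        Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
        return points @ (Rz @ Ry @ Rx).T + np.array([tx, ty, tz])

    def cost(params, model_points, images, proj_mats):
        """Mean squared image value at the forward-projected model points,
        summed over both views; the stent is dark, so good alignment means
        low image values under the projected points."""
        moved = rigid_transform(model_points, params)
        pts_h = np.column_stack([moved, np.ones(len(moved))])
        total = 0.0
        for img, P in zip(images, proj_mats):
            uvw = pts_h @ P.T
            uv = uvw[:, :2] / uvw[:, 2:3]          # perspective divide
            # mode="nearest" keeps out-of-image points from scoring as dark
            vals = map_coordinates(img, [uv[:, 1], uv[:, 0]],
                                   order=1, mode="nearest")
            total += np.mean(vals ** 2)
        return total

    # Toy biplane setup: two orthogonal parallel projections and a dark
    # Gaussian blob standing in for the stent in each view.
    def make_view(cx, cy, size=128):
        yy, xx = np.mgrid[0:size, 0:size]
        return 1.0 - np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 50.0)

    P_ap  = np.array([[1.0, 0, 0, 64], [0, 1.0, 0, 64], [0, 0, 0, 1.0]])
    P_lat = np.array([[0, 0, 1.0, 64], [0, 1.0, 0, 64], [0, 0, 0, 1.0]])
    truth = np.array([5.0, -3.0, 2.0])             # true 3D displacement
    images = [make_view(64 + truth[0], 64 + truth[1]),
              make_view(64 + truth[2], 64 + truth[1])]
    model = np.zeros((1, 3))                       # single-point 'model'
    res = minimize(lambda t: cost(np.concatenate([t, np.zeros(3)]),
                                  model, images, [P_ap, P_lat]),
                   np.ones(3), method="Nelder-Mead")
    print(res.x)    # should approach (5, -3, 2)
    ```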

  20. Identifying misbehaving models using baseline climate variance

    NASA Astrophysics Data System (ADS)

    Schultz, Colin

    2011-06-01

    The majority of projections made using general circulation models (GCMs) are conducted to help tease out the effects on a region, or on the climate system as a whole, of changing climate dynamics. Sun et al., however, used model runs from 20 different coupled atmosphere-ocean GCMs to try to understand a different aspect of climate projections: how bias correction, model selection, and other statistical techniques might affect the estimated outcomes. As a case study, the authors focused on predicting the potential change in precipitation for the Murray-Darling Basin (MDB), a 1-million- square- kilometer area in southeastern Australia that suffered a recent decade of drought that left many wondering about the potential impacts of climate change on this important agricultural region. The authors first compared the precipitation predictions made by the models with 107 years of observations, and they then made bias corrections to adjust the model projections to have the same statistical properties as the observations. They found that while the spread of the projected values was reduced, the average precipitation projection for the end of the 21st century barely changed. Further, the authors determined that interannual variations in precipitation for the MDB could be explained by random chance, where the precipitation in a given year was independent of that in previous years.
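
    The record does not specify the bias-correction procedure; the sketch below shows one common simple choice, rescaling model output to the observed mean and standard deviation over the historical period and applying the same transfer to the projection. All data are synthetic stand-ins.

    ```python
    import numpy as np

    def bias_correct(model_hist, model_future, obs):
        """Mean-variance bias correction: map model values to the observed
        mean and standard deviation over the historical period, then apply
        the same transfer to the future series."""
        mu_m, sd_m = model_hist.mean(), model_hist.std()
        mu_o, sd_o = obs.mean(), obs.std()

        def correct(x):
            return mu_o + (x - mu_m) * (sd_o / sd_m)

        return correct(model_hist), correct(model_future)

    # Synthetic annual-precipitation stand-ins (mm/yr) for one GCM and a
    # 107-year observational record like the one mentioned in the abstract.
    rng = np.random.default_rng(6)
    obs = rng.gamma(shape=20, scale=25, size=107)
    model_hist = rng.gamma(shape=14, scale=30, size=107)     # biased model
    model_future = model_hist * 0.9                          # drier projection
    hist_bc, future_bc = bias_correct(model_hist, model_future, obs)
    print(round(hist_bc.mean()), round(obs.mean()))          # means now agree
    ```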

  1. A novel background field removal method for MRI using projection onto dipole fields (PDF).

    PubMed

    Liu, Tian; Khalidov, Ildar; de Rochefort, Ludovic; Spincemaille, Pascal; Liu, Jing; Tsiouris, A John; Wang, Yi

    2011-11-01

    For optimal image quality in susceptibility-weighted imaging and accurate quantification of susceptibility, it is necessary to isolate the local field generated by local magnetic sources (such as iron) from the background field that arises from imperfect shimming and variations in magnetic susceptibility of surrounding tissues (including air). Previous background removal techniques have limited effectiveness depending on the accuracy of model assumptions or information input. In this article, we report an observation that the magnetic field for a dipole outside a given region of interest (ROI) is approximately orthogonal to the magnetic field of a dipole inside the ROI. Accordingly, we propose a nonparametric background field removal technique based on projection onto dipole fields (PDF). In this PDF technique, the background field inside an ROI is decomposed into a field originating from dipoles outside the ROI using the projection theorem in Hilbert space. This novel PDF background removal technique was validated on a numerical simulation and a phantom experiment and was applied in human brain imaging, demonstrating substantial improvement in background field removal compared with the commonly used high-pass filtering method. Copyright © 2011 John Wiley & Sons, Ltd.
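
    A compact sketch of the PDF idea as described: fit a susceptibility distribution outside the ROI whose induced dipole field best matches the measured field inside the ROI (a least-squares projection), then subtract that fitted background. The FFT dipole kernel is standard, but the toy sources and unweighted solver below are simplifications; real data would need masking, noise weighting, and careful boundary handling.

    ```python
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, lsmr

    def dipole_kernel(shape):
        """Unit dipole field kernel in k-space, B0 along the first axis:
        D(k) = 1/3 - kz^2 / |k|^2."""
        kz, ky, kx = np.meshgrid(*(np.fft.fftfreq(n) for n in shape),
                                 indexing="ij")
        k2 = kx**2 + ky**2 + kz**2
        k2[0, 0, 0] = 1.0                   # avoid division by zero at DC
        D = 1.0 / 3.0 - kz**2 / k2
        D[0, 0, 0] = 0.0
        return D

    def pdf_background_removal(field, roi, maxiter=30):
        """Fit exterior susceptibility sources whose induced field matches
        the measured field inside the ROI, then subtract that background."""
        shape, D, ext = field.shape, dipole_kernel(field.shape), ~roi
        convolve = lambda chi: np.real(np.fft.ifftn(D * np.fft.fftn(chi)))

        def matvec(chi_ext):
            chi = np.zeros(shape); chi[ext] = chi_ext
            return convolve(chi)[roi]

        def rmatvec(res_roi):                 # kernel is real and symmetric,
            res = np.zeros(shape); res[roi] = res_roi
            return convolve(res)[ext]         # so the operator is self-adjoint

        A = LinearOperator((int(roi.sum()), int(ext.sum())),
                           matvec=matvec, rmatvec=rmatvec, dtype=float)
        chi = np.zeros(shape)
        chi[ext] = lsmr(A, field[roi], maxiter=maxiter)[0]
        local = np.zeros(shape)
        local[roi] = field[roi] - convolve(chi)[roi]   # remove fitted background
        return local

    # Toy check: background from one exterior source plus a local field from
    # one interior source; PDF should largely recover the interior part.
    shape = (32, 32, 32)
    roi = np.zeros(shape, bool); roi[8:24, 8:24, 8:24] = True
    src_out = np.zeros(shape); src_out[2, 16, 16] = 1.0     # outside ROI
    src_in = np.zeros(shape); src_in[16, 16, 16] = 0.1      # inside ROI
    D = dipole_kernel(shape)
    f = lambda s: np.real(np.fft.ifftn(D * np.fft.fftn(s)))
    local = pdf_background_removal(f(src_out) + f(src_in), roi)
    err = (np.linalg.norm(local[roi] - f(src_in)[roi])
           / np.linalg.norm(f(src_in)[roi]))
    print(f"relative error inside ROI: {err:.2f}")
    ```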

  2. Nassau County Department of Recreation and Parks.

    ERIC Educational Resources Information Center

    Iowa Univ., Iowa City. Recreation Education Program.

    Presented are duplications of the responses given by the Nassau County Department of Recreation and Parks (East Meadow, New York) as part of a project to collect, share, and compile information about, and techniques in the operation of 18 community action models for recreation services to the disabled. Model programs are categorized as consumer,…

  3. Boat Building Design and Construction Techniques in the Architectural Design Studio.

    ERIC Educational Resources Information Center

    Smith, Richard A.

    1982-01-01

    Describes a model boat building project for architectural design studios. Working from traditional sailboat designs, students study the "lines" drawings of boats, make full-size drawings from scale drawings, and then construct model wooden boats. Available from Carfax Publishing Company, P.O. Box 25, Abingdon, Oxfordshire OX14 1RW…

  4. A Method for Calculating the Probability of Successfully Completing a Rocket Propulsion Ground Test

    NASA Technical Reports Server (NTRS)

    Messer, Bradley P.

    2004-01-01

    Propulsion ground test facilities face the daily challenges of scheduling multiple customers into limited facility space and successfully completing their propulsion test projects. Due to budgetary and schedule constraints, NASA and industry customers are pushing to test more components, for less money, in a shorter period of time. As these new rocket engine component test programs are undertaken, the lack of technology maturity in the test articles, combined with pushing the test facilities' capabilities to their limits, tends to lead to an increase in facility breakdowns and unsuccessful tests. Over the last five years Stennis Space Center's propulsion test facilities have performed hundreds of tests, collected thousands of seconds of test data, and broken numerous test facility and test article parts. While various initiatives have been implemented to provide better propulsion test techniques and improve the quality, reliability, and maintainability of goods and parts used in the propulsion test facilities, unexpected failures during testing still occur quite regularly due to the harsh environment in which the propulsion test facilities operate. Previous attempts at modeling the lifecycle of a propulsion component test project have met with little success. Each of the attempts suffered from incomplete or inconsistent data on which to base the models. By focusing on the actual test phase of the test project rather than the formulation, design, or construction phases, the quality and quantity of available data increase dramatically. A logistic regression model has been developed from the data collected over the last five years, allowing the probability of successfully completing a rocket propulsion component test to be calculated. A logistic regression model is a mathematical modeling approach that can be used to describe the relationship of several independent predictor variables X_1, X_2, ..., X_k to a binary or dichotomous dependent variable Y, where Y can only be one of two possible outcomes, in this case Success or Failure. Logistic regression has primarily been used in the fields of epidemiology and biomedical research, but lends itself to many other applications. The use of logistic regression is not new; however, modeling propulsion ground test facilities using logistic regression is both a new and unique application of the statistical technique. Results from the models provide project managers with insight into, and confidence in, the effectiveness of rocket engine component ground test projects. The initial success in modeling rocket propulsion ground test projects clears the way for more complex models to be developed in this area.
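
    In code, the core of such a model is compact. A minimal sketch using scikit-learn, with invented placeholder predictors standing in for the facility and test-article variables; the report's actual predictors and data are not reproduced here.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    # Hypothetical predictors: test-article maturity, facility load, test duration.
    X = rng.normal(size=(200, 3))
    # Synthetic binary outcome: 1 = successful test, 0 = failure.
    y = (X @ np.array([1.5, -1.0, 0.5]) + rng.normal(size=200) > 0).astype(int)

    model = LogisticRegression().fit(X, y)
    p_success = model.predict_proba(X[:1])[0, 1]   # P(Y = success | X)
    print(f"probability of successful test: {p_success:.2f}")
    ```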

  5. Acceleration techniques for dependability simulation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Barnette, James David

    1995-01-01

    As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
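
    One of the building blocks named above, random variate generation, is easy to illustrate. A minimal sketch of the inverse-transform method used to draw exponential inter-event times for a discrete event simulation; the rate and the statistic gathered are illustrative assumptions.

    ```python
    import numpy as np

    def exponential_variate(rate, u):
        """Inverse-transform method: if U ~ Uniform(0,1), then
        -ln(1 - U)/rate has an Exponential(rate) distribution."""
        return -np.log(1.0 - u) / rate

    rng = np.random.default_rng(2)
    # Event times for a failure process with rate 0.01 per hour.
    failure_times = np.cumsum(exponential_variate(0.01, rng.random(10_000)))
    # Statistics gathering: mean time between failures vs. the theoretical 1/rate.
    print(np.diff(failure_times).mean())   # ~100.0
    ```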

  6. Investigation of air transportation technology at Massachusetts Institute of Technology, 1984

    NASA Technical Reports Server (NTRS)

    Simpson, Robert W.

    1987-01-01

    Three projects sponsored by the Joint University Program at MIT are summarized. Two projects were focused on the potential application of Loran-C in flying nonprecision approaches to general aviation runways, and the third project involved research on aircraft icing. In one Loran-C project, Aircraft Approach Guidance Using Relative Loran-C Navigation, the concept was flight tested. It used the difference in TDs from those of the touchdown point to simplify and speed navigation computer processing and took advantage of the short-term accuracy of less than 100 feet for Loran-C. The goal of the project, Probabilistic Modelling of Loran-C Error for Nonprecision Approaches, was to develop a mathematical model which would predict the probability that an approach flown to a runway with a particular Loran-C receiver would fall within a given standard. The Aircraft Icing project focused on measurement of droplet trajectories and droplet impingement/runback characteristics and measurement of real-time ice accretion using ultrasonic pulse echo techniques.

  7. Development of a volumetric projection technique for the digital evaluation of field of view.

    PubMed

    Marshall, Russell; Summerskill, Stephen; Cook, Sharon

    2013-01-01

    Current regulations for field of view requirements in road vehicles are defined by 2D areas projected on the ground plane. This paper discusses the development of a new software-based volumetric field of view projection tool and its implementation within an existing digital human modelling system. In addition, the exploitation of this new tool is highlighted through its use in a UK Department for Transport funded research project exploring the current concerns with driver vision. Focusing specifically on rearwards visibility in small and medium passenger vehicles, the volumetric approach is shown to provide a number of distinct advantages. The ability to explore multiple projections of both direct vision (through windows) and indirect vision (through mirrors) provides a greater understanding of the field of view environment afforded to the driver whilst still maintaining compatibility with the 2D projections of the regulatory standards. Field of view requirements for drivers of road vehicles are defined by simplified 2D areas projected onto the ground plane. However, driver vision is a complex 3D problem. This paper presents the development of a new software-based 3D volumetric projection technique and its implementation in the evaluation of driver vision in small- and medium-sized passenger vehicles.

  8. Developing a stochastic traffic volume prediction model for public-private partnership projects

    NASA Astrophysics Data System (ADS)

    Phong, Nguyen Thanh; Likhitruangsilp, Veerasak; Onishi, Masamitsu

    2017-11-01

    Transportation projects require an enormous amount of capital investment resulting from their tremendous size, complexity, and risk. Due to the limitation of public finances, the private sector is invited to participate in transportation project development. The private sector can entirely or partially invest in transportation projects in the form of the Public-Private Partnership (PPP) scheme, which has been an attractive option for several developing countries, including Vietnam. There are many factors affecting the success of PPP projects. The accurate prediction of traffic volume is considered one of the key success factors of PPP transportation projects. However, only a few studies have investigated how to predict traffic volume over a long period of time. Moreover, conventional traffic volume forecasting methods are usually based on deterministic models which predict a single value of traffic volume but do not consider risk and uncertainty. This knowledge gap makes it difficult for concessionaires to estimate PPP transportation project revenues accurately. The objective of this paper is to develop a probabilistic traffic volume prediction model. First, traffic volumes were estimated following the Geometric Brownian Motion (GBM) process. The Monte Carlo technique is then applied to simulate different scenarios. The results show that this stochastic approach can systematically analyze variations in the traffic volume and yield more reliable estimates for PPP projects.
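
    The GBM-plus-Monte-Carlo approach can be sketched directly. The toy example below, with invented drift, volatility, and horizon, simulates many traffic-volume paths and reads off a risk band rather than a single deterministic forecast.

    ```python
    import numpy as np

    mu, sigma = 0.04, 0.15                    # annual drift and volatility of growth
    v0, years, n_paths = 20_000, 25, 10_000   # initial daily traffic, horizon, paths

    rng = np.random.default_rng(3)
    dW = rng.normal(0.0, 1.0, size=(n_paths, years))
    # Exact GBM update: V_{t+1} = V_t * exp((mu - sigma^2/2) + sigma * dW)
    paths = v0 * np.exp(np.cumsum((mu - 0.5 * sigma**2) + sigma * dW, axis=1))

    final = paths[:, -1]
    print(np.percentile(final, [5, 50, 95]))  # risk band for year-25 traffic volume
    ```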

  9. Carbon footprint estimator, phase II : volume II - technical appendices.

    DOT National Transportation Integrated Search

    2014-03-01

    The GASCAP model was developed to provide a software tool for analysis of the life-cycle GHG emissions associated with the construction and maintenance of transportation projects. This phase of development included techniques for estimating emiss...

  10. Optimizing construction quality management of pavements using mechanistic performance analysis.

    DOT National Transportation Integrated Search

    2004-08-01

    This report presents a statistical-based algorithm that was developed to reconcile the results from several pavement performance models used in the state of practice with systematic process control techniques. These algorithms identify project-specif...

  11. A position and attitude vision measurement system for wind tunnel slender model

    NASA Astrophysics Data System (ADS)

    Cheng, Lei; Yang, Yinong; Xue, Bindang; Zhou, Fugen; Bai, Xiangzhi

    2014-11-01

    A position and attitude vision measurement system for a drop-test slender model in a wind tunnel is designed and developed. The system uses two high-speed cameras: one is placed to the side of the model, and the other is positioned so that it looks up at the model. Simple symbols are placed on the model. The main idea of the system is image matching between projection images of the 3D digital model and the images captured by the cameras. First, the pitch angle, roll angle, and centroid position of the model are estimated by recognizing the symbols in the images captured by the side camera. Then, based on the estimated attitude and a series of candidate yaw angles, a series of projection images of the 3D digital model is generated. Finally, these projection images are matched against the image captured by the look-up camera, and the yaw angle corresponding to the best-matching projection image is taken as the yaw angle of the model. Simulation experiments are conducted and the results show that the maximal error of attitude measurement is less than 0.05°, which can meet the demand of wind tunnel tests.
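
    The yaw-matching step lends itself to a compact sketch. The version below is illustrative only: it replaces the real rendering pipeline with an orthographic point projection and scores candidate yaw angles by normalized cross-correlation against the captured image; the point model and the scoring function are assumptions.

    ```python
    import numpy as np

    def render(points, yaw_deg, size=64):
        """Orthographic projection of 3D model points after a yaw rotation."""
        a = np.radians(yaw_deg)
        rot = np.array([[np.cos(a), -np.sin(a), 0.0],
                        [np.sin(a),  np.cos(a), 0.0],
                        [0.0,        0.0,       1.0]])
        p = points @ rot.T
        img = np.zeros((size, size))
        u = np.clip((p[:, 0] + 2.0) * size / 4.0, 0, size - 1).astype(int)
        v = np.clip((p[:, 1] + 2.0) * size / 4.0, 0, size - 1).astype(int)
        img[v, u] = 1.0
        return img

    def ncc(a, b):
        """Normalized cross-correlation between two images."""
        a, b = a - a.mean(), b - b.mean()
        return (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    model = np.random.default_rng(4).normal(size=(200, 3))  # stand-in slender model
    captured = render(model, 17.0)                          # 'image' from the camera
    scores = [(ncc(render(model, y), captured), y) for y in np.arange(0, 90, 0.5)]
    print(max(scores)[1])    # best-matching yaw angle, ~17.0 degrees
    ```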

  12. Measuring the free neutron lifetime to <= 0.3s via the beam method

    NASA Astrophysics Data System (ADS)

    Fomin, Nadia

    2017-09-01

    Neutron beta decay is an archetype for all semi-leptonic charged-current weak processes. While of interest as a fundamental particle property, a precise value for the neutron lifetime is also required for consistency tests of the Standard Model as well as to calculate the primordial 4He abundance in Big Bang Nucleosynthesis models. An effort has begun to develop an in-beam measurement of the neutron lifetime with a projected <= 0.3s uncertainty. This effort is part of a phased campaign of neutron lifetime measurements based at the NIST Center for Neutron Research, using the Sussex-ILL-NIST technique. Recent advances in neutron fluence measurement techniques as well as new large area silicon detector technology address the two largest sources of uncertainty of in-beam measurements, paving the way for a new measurement. The experimental design and projected uncertainties for the 0.3s measurement will be discussed.

  13. Advanced Avionics and Processor Systems for a Flexible Space Exploration Architecture

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.; Adams, James H.; Smith, Leigh M.; Johnson, Michael A.; Cressler, John D.

    2010-01-01

    The Advanced Avionics and Processor Systems (AAPS) project, formerly known as the Radiation Hardened Electronics for Space Environments (RHESE) project, endeavors to develop advanced avionic and processor technologies anticipated to be used by NASA's currently evolving space exploration architectures. The AAPS project is a part of the Exploration Technology Development Program, which funds an entire suite of technologies that are aimed at enabling NASA's ability to explore beyond low Earth orbit. NASA's Marshall Space Flight Center (MSFC) manages the AAPS project. AAPS uses a broad-scoped approach to developing avionic and processor systems. Investment areas include advanced electronic designs and technologies capable of providing environmental hardness, reconfigurable computing techniques, software tools for radiation effects assessment, and radiation environment modeling tools. Near-term emphasis within the multiple AAPS tasks focuses on developing prototype components using semiconductor processes and materials (such as Silicon-Germanium (SiGe)) to enhance a device's tolerance to radiation events and low-temperature environments. As the SiGe technology will culminate in a delivered prototype this fiscal year, the project shifts its focus to developing low-power, high-efficiency total processor hardening techniques. In addition to processor development, the project endeavors to demonstrate techniques applicable to reconfigurable computing and partially reconfigurable Field Programmable Gate Arrays (FPGAs). This capability enables avionic architectures the ability to develop FPGA-based, radiation-tolerant processor boards that can serve in multiple physical locations throughout the spacecraft and perform multiple functions during the course of the mission. The individual tasks that comprise AAPS are diverse, yet united in the common endeavor to develop electronics capable of operating within the harsh environment of space. Specifically, the AAPS tasks for the Federal fiscal year of 2010 are: Silicon-Germanium (SiGe) Integrated Electronics for Extreme Environments, Modeling of Radiation Effects on Electronics, Radiation Hardened High Performance Processors (HPP), and Reconfigurable Computing.

  14. Application of Boosting Regression Trees to Preliminary Cost Estimation in Building Construction Projects

    PubMed Central

    2015-01-01

    Among the recent data mining techniques available, the boosting approach has attracted a great deal of attention because of its effective learning algorithm and strong boundaries in terms of its generalization performance. However, the boosting approach has yet to be used in regression problems within the construction domain, including cost estimations, but has been actively utilized in other domains. Therefore, a boosting regression tree (BRT) is applied to cost estimations at the early stage of a construction project to examine the applicability of the boosting approach to a regression problem within the construction domain. To evaluate the performance of the BRT model, its performance was compared with that of a neural network (NN) model, which has been proven to have a high performance in cost estimation domains. The BRT model has shown results similar to those of the NN model using 234 actual cost datasets of a building construction project. In addition, the BRT model can provide additional information such as the importance plot and structure model, which can support estimators in comprehending the decision-making process. Consequently, the boosting approach has potential applicability in preliminary cost estimations in a building construction project. PMID:26339227

  15. Application of Boosting Regression Trees to Preliminary Cost Estimation in Building Construction Projects.

    PubMed

    Shin, Yoonseok

    2015-01-01

    Among the recent data mining techniques available, the boosting approach has attracted a great deal of attention because of its effective learning algorithm and strong boundaries in terms of its generalization performance. However, the boosting approach has yet to be used in regression problems within the construction domain, including cost estimations, but has been actively utilized in other domains. Therefore, a boosting regression tree (BRT) is applied to cost estimations at the early stage of a construction project to examine the applicability of the boosting approach to a regression problem within the construction domain. To evaluate the performance of the BRT model, its performance was compared with that of a neural network (NN) model, which has been proven to have a high performance in cost estimation domains. The BRT model has shown results similar to those of the NN model using 234 actual cost datasets of a building construction project. In addition, the BRT model can provide additional information such as the importance plot and structure model, which can support estimators in comprehending the decision-making process. Consequently, the boosting approach has potential applicability in preliminary cost estimations in a building construction project.
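
    For readers who want to reproduce the flavor of this approach, a minimal sketch with scikit-learn's gradient-boosted regression trees follows; the feature set and the synthetic 234-sample dataset are invented stand-ins for the study's actual cost data.

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(5)
    # Hypothetical early-stage design features: gross floor area, storeys, grade.
    X = rng.uniform([1000, 1, 1], [50000, 40, 5], size=(234, 3))
    cost = 900 * X[:, 0] + 2e5 * X[:, 1] + rng.normal(0, 1e6, 234)

    brt = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                    max_depth=3).fit(X, cost)
    # The 'importance plot' information mentioned above comes from:
    print(brt.feature_importances_)
    ```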

  16. Results of the Greenland Ice Sheet Model Initialisation Experiments ISMIP6 - initMIP-Greenland

    NASA Astrophysics Data System (ADS)

    Goelzer, H.; Nowicki, S.; Edwards, T.; Beckley, M.; Abe-Ouchi, A.; Aschwanden, A.; Calov, R.; Gagliardini, O.; Gillet-Chaulet, F.; Golledge, N. R.; Gregory, J. M.; Greve, R.; Humbert, A.; Huybrechts, P.; Larour, E. Y.; Lipscomb, W. H.; Le clec'h, S.; Lee, V.; Kennedy, J. H.; Pattyn, F.; Payne, A. J.; Rodehacke, C. B.; Rückamp, M.; Saito, F.; Schlegel, N.; Seroussi, H. L.; Shepherd, A.; Sun, S.; van de Wal, R.; Ziemen, F. A.

    2016-12-01

    Earlier large-scale Greenland ice sheet sea-level projections, e.g. those run during the ice2sea and SeaRISE initiatives, have shown that ice sheet initialisation can have a large effect on the projections and gives rise to important uncertainties. The goal of this intercomparison exercise (initMIP-Greenland) is to compare, evaluate and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties. It is the first in a series of ice sheet model intercomparison activities within ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6). Two experiments for the large-scale Greenland ice sheet have been designed to allow intercomparison between participating models of 1) the initial present-day state of the ice sheet and 2) the response in two schematic forward experiments. The forward experiments serve to evaluate the initialisation in terms of model drift (a forward run without any forcing) and response to a large perturbation (a prescribed surface mass balance anomaly). We present and discuss final results of the intercomparison and highlight important uncertainties with respect to projections of the Greenland ice sheet sea-level contribution.

  17. Techniques for the computation in demographic projections of health manpower.

    PubMed

    Horbach, L

    1979-01-01

    Some basic principles and algorithms are presented which can be used for projective calculations of medical staff on the basis of demographic data. The effects of modifications of the input data, such as by health policy measures concerning training capacity, can be demonstrated by repeated calculations under varied assumptions. Such models give a variety of results and may highlight the probable future balance between health manpower supply and requirements.
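
    A minimal sketch of such a projective calculation, assuming a simple age-banded stock-and-flow model in which training capacity enters as a policy input; all rates and age bands below are invented for illustration.

    ```python
    import numpy as np

    stock = np.array([5000.0, 8000.0, 6000.0])   # physicians aged <35, 35-54, 55+
    survive = np.array([0.96, 0.97, 0.90])       # annual stay-in-workforce rates
    ageing = np.array([0.05, 0.05, 0.0])         # share moving up to the next band
    graduates = 600.0                            # annual training output (policy lever)

    for year in range(10):
        moved = stock * ageing                   # outflow to the next age band
        # np.roll shifts each band's outflow into the band above; the oldest
        # band's ageing rate is zero, so the wrap-around term is zero too.
        stock = stock * survive - moved + np.roll(moved, 1)
        stock[0] += graduates
    print(round(stock.sum()))                    # projected total supply in 10 years
    ```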

  18. Computer architecture evaluation for structural dynamics computations: Project summary

    NASA Technical Reports Server (NTRS)

    Standley, Hilda M.

    1989-01-01

    The intent of the proposed effort is the examination of the impact of the elements of parallel architectures on the performance realized in a parallel computation. To this end, three major projects are developed: a language for the expression of high level parallelism, a statistical technique for the synthesis of multicomputer interconnection networks based upon performance prediction, and a queueing model for the analysis of shared memory hierarchies.

  19. Evaluation of Oil-Industry Stimulation Practices for Engineered Geothermal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peter Van Dyke; Leen Weijers; Ann Robertson-Tait

    Geothermal energy extraction is typically achieved by use of long open-hole intervals in an attempt to connect the well with the greatest possible rock mass. This presents a problem for the development of Enhanced (Engineered) Geothermal Systems (EGS), owing to the challenge of obtaining uniform stimulation throughout the open-hole interval. Fluids are often injected in only a fraction of that interval, reducing heat transfer efficiency and increasing energy cost. Pinnacle Technologies, Inc. and GeothermEx, Inc. evaluated a variety of techniques and methods that are commonly used for hydraulic fracturing of oil and gas wells to increase and evaluate stimulation effectiveness in EGS wells. Headed by Leen Weijers, formerly Manager of Technical Development at Pinnacle Technologies, Inc., the project ran from August 1, 2004 to July 31, 2006 in two one-year periods to address the following tasks and milestones: 1) Analyze stimulation results from the closest oil-field equivalents for EGS applications in the United States (e.g., the Barnett Shale in North Texas) (section 3 on page 8). Pinnacle Technologies, Inc. has collected fracture growth data from thousands of stimulations (section 3.1 on page 12). This data was further evaluated in the context of: a) Identifying techniques best suited to developing a stimulated EGS fracture network (section 3.2 on page 29), and b) quantifying the growth of the network under various conditions to develop a calibrated model for fracture network growth (section 3.3 on page 30). The developed model can be used to design optimized EGS fracture networks that maximize contact with the heat source and minimize short-circuiting (section 3.4 on page 38). 2) Evaluate methods used in oil field applications to improve fluid diversion and penetration and determine their applicability to EGS (section 4 on page 50). These methods include, but are not limited to: a) Stimulation strategies (propped fracturing versus water fracturing versus injecting fluid below fracturing gradients) (section 4.1 on page 50); b) zonal isolation methods (by use of perforated casing or packers) (section 4.2 on page 57); c) fracture re-orientation and fracture network growth techniques (e.g., by use of alternating high- and low-rate injections) (section 4.4 on page 74); and d) fluid diversion methods (by use of the SurgiFrac technique, the StimGun perforation technique, or stress shadowing). This project task is to be completed in the first project year, enabling the most promising techniques to be field tested and evaluated in the second project year. 3) Study the applicability of the methods listed above by utilizing several techniques (section 5 on page 75) including, but not limited to: a) Hydraulic Impedance Testing (HIT) to determine the location of open hydraulic fractures along an open-hole interval; b) pressure transient testing to determine reservoir permeability, pore pressure, and closure stress; and c) treatment well tilt mapping or microseismic mapping to evaluate fracture coverage. These techniques were reviewed for their potential application to EGS in the first project year (section 5.1 on page 75). This study also includes further analysis of any field testing that will be conducted in the Desert Peak area in Nevada for ORMAT Nevada, Inc. (section 5.2 on page 86), with the aim of closing the loop to provide reliable calibrated fracture model results. Developed through its hydraulic fracture consulting business, the techniques of Pinnacle Technologies, Inc. for stimulating and analyzing fracture growth have helped the oil and gas industry to improve hydraulic fracturing from both a technical and economic perspective. In addition to more than 30 years of experience in the development of geothermal energy for commercial power generation throughout the world, GeothermEx, Inc. brings to the project: 1) Detailed information about specific developed and potential EGS reservoirs, 2) experience with geothermal well design, completion, and testing practices, and 3) a direct connection to the Desert Peak EGS project.

  20. Using technology-enhanced, cooperative, group-project learning for student comprehension and academic performance

    NASA Astrophysics Data System (ADS)

    Tlhoaele, Malefyane; Suhre, Cor; Hofman, Adriaan

    2016-05-01

    Cooperative learning may improve students' motivation, understanding of course concepts, and academic performance. This study therefore enhanced a cooperative, group-project learning technique with technology resources to determine whether doing so improved students' deep learning and performance. A sample of 118 engineering students, randomly divided into two groups, participated in this study and provided data through questionnaires issued before and after the experiment. The results, obtained through analyses of variance and structural equation modelling, reveal that technology-enhanced, cooperative, group-project learning improves students' comprehension and academic performance.

  1. Validation of the Arabic Version of the Group Personality Projective Test among university students in Bahrain.

    PubMed

    Al-Musawi, Nu'man M

    2003-04-01

    Using confirmatory factor analytic techniques on data generated from 200 students enrolled at the University of Bahrain, we obtained some construct validity and reliability data for the Arabic Version of the 1961 Group Personality Projective Test by Cassel and Khan. In contrast to the 5-factor model proposed for the Group Personality Projective Test, a 6-factor solution appeared justified for the Arabic Version of this test, suggesting some variance between the cultural groups in the United States and in Bahrain.

  2. Stereoscopic display of 3D models for design visualization

    NASA Astrophysics Data System (ADS)

    Gilson, Kevin J.

    2006-02-01

    Advances in display technology and 3D design visualization applications have made real-time stereoscopic visualization of architectural and engineering projects a reality. Parsons Brinckerhoff (PB) is a transportation consulting firm that has used digital visualization tools from their inception and has helped pioneer the application of those tools to large-scale infrastructure projects. PB is one of the first Architecture/Engineering/Construction (AEC) firms to implement a CAVE, an immersive presentation environment that includes stereoscopic rear-projection capability. The firm also employs a portable stereoscopic front-projection system, and shutter-glass systems for smaller groups. PB is using commercial real-time 3D applications in combination with traditional 3D modeling programs to visualize and present large AEC projects to planners, clients and decision makers in stereo. These presentations create more immersive and spatially realistic presentations of the proposed designs. This paper will present the basic display tools and applications, and the 3D modeling techniques PB is using to produce interactive stereoscopic content. The paper will discuss several architectural and engineering design visualizations we have produced.

  3. Reducing the Complexity of an Agent-Based Local Heroin Market Model

    PubMed Central

    Heard, Daniel; Bobashev, Georgiy V.; Morris, Robert J.

    2014-01-01

    This project explores techniques for reducing the complexity of an agent-based model (ABM). The analysis involved a model developed from the ethnographic research of Dr. Lee Hoffer on the Larimer-area heroin market, comprising drug users, drug sellers, homeless individuals and police. The authors used statistical techniques to create a reduced version of the original model which maintained simulation fidelity while reducing computational complexity. This involved identifying key summary quantities of individual customer behavior as well as overall market activity, and replacing some agents with probability distributions and regressions. The model was then extended to allow external market interventions in the form of police busts. Extensions of this research perspective, as well as its strengths and limitations, are discussed. PMID:25025132
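
    One of the reduction moves described above, replacing a set of agents with a fitted probability distribution, can be sketched briefly. The example below is illustrative, not the authors' model: it fits a negative binomial (via the method of moments) to a summary quantity from a detailed run and samples from it thereafter.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(10)
    # Detailed-model output: daily purchase counts from 500 simulated customers
    # (a gamma-Poisson mixture, hence overdispersed relative to Poisson).
    detailed = rng.poisson(lam=rng.gamma(2.0, 1.5, size=500))

    # Method-of-moments negative binomial fit, then sample instead of simulating.
    m, v = detailed.mean(), detailed.var()
    p = m / v
    n = m * p / (1.0 - p)
    reduced_agents = stats.nbinom(n, p).rvs(size=500, random_state=0)
    print(detailed.mean(), reduced_agents.mean())  # similar market-level activity
    ```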

  4. MO-AB-BRA-09: Development and Evaluation of a Biomechanical Modeling-Assisted CBCT Reconstruction Technique (Bio-Recon)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Y; Nasehi Tehrani, J; Wang, J

    Purpose: To develop a Bio-recon technique by incorporating the biomechanical properties of anatomical structures into the deformation-based CBCT reconstruction process. Methods: Bio-recon reconstructs the CBCT by deforming a prior high-quality CT/CBCT using a deformation-vector-field (DVF). The DVF is solved through two alternating steps: 2D–3D deformation and finite-element-analysis based biomechanical modeling. 2D–3D deformation optimizes the DVF through an ‘intensity-driven’ approach, which updates the DVF to minimize intensity mismatches between the acquired projections and the simulated projections from the deformed CBCT. In contrast, biomechanical modeling optimizes the DVF through a ‘biomechanical-feature-driven’ approach, which updates the DVF based on the biophysical properties of anatomical structures. In general, Bio-recon extracts the 2D–3D deformation-optimized DVF at high-contrast structure boundaries, and uses it as the boundary condition to drive biomechanical modeling to optimize the overall DVF, especially at low-contrast regions. The optimized DVF is fed back into the 2D–3D deformation for further optimization, which forms an iterative loop. The efficacy of Bio-recon was evaluated on 11 lung patient cases, each with a prior CT and a new CT. Cone-beam projections were generated from the new CTs to reconstruct CBCTs, which were compared with the original new CTs for evaluation. 872 anatomical landmarks were also manually identified by a clinician on both the prior and new CTs to track the lung motion, which was used to evaluate the DVF accuracy. Results: Using 10 projections for reconstruction, the average (± s.d.) relative errors of reconstructed CBCTs by the clinical FDK technique, the 2D–3D deformation-only technique and Bio-recon were 46.5±5.9%, 12.0±2.3% and 10.4±1.3%, respectively. The average residual errors of DVF-tracked landmark motion by the 2D–3D deformation-only technique and Bio-recon were 5.6±4.3mm and 3.1±2.4mm, respectively. Conclusion: Bio-recon improved accuracy for both the reconstructed CBCT and the DVF. The accurate DVF can benefit multiple clinical practices, such as image-guided adaptive radiotherapy. We acknowledge funding support from the American Cancer Society (RSG-13-326-01-CCE), from the US National Institutes of Health (R01 EB020366), and from the Cancer Prevention and Research Institute of Texas (RP130109).

  5. Matching motivation enhancement treatment to client motivation: re-examining the Project MATCH motivation matching hypothesis.

    PubMed

    Witkiewitz, Katie; Hartzler, Bryan; Donovan, Dennis

    2010-08-01

    The current study was designed to re-examine the motivation matching hypothesis from Project MATCH using growth mixture modeling, an analytical technique that models variation in individual drinking patterns. Secondary analyses were conducted on data from Project MATCH (n = 1726), a large multi-site alcoholism treatment-matching study. Percentage of drinking days was the primary outcome measure, assessed from 1 month to 12 months following treatment. Treatment assignment, alcohol dependence symptoms and baseline percentage of drinking days were included as covariates. The results provided support for the motivation matching hypothesis in the out-patient sample and among females in the aftercare sample: the majority of individuals with lower baseline motivation had better outcomes if assigned to motivation enhancement treatment (MET) compared to those assigned to cognitive behavioral treatment (CBT). In the aftercare sample there was a moderating effect of gender and alcohol dependence severity, whereby males with lower baseline motivation and greater alcohol dependence drank more frequently if assigned to MET compared to those assigned to CBT. Results from the current study lend partial support to the motivation-matching hypothesis and also demonstrate the importance of moderating influences on treatment-matching effectiveness. Based upon these findings, individuals with low baseline motivation in out-patient settings, and males with low levels of alcohol dependence or females in aftercare settings, may benefit more from motivational enhancement techniques than from cognitive-behavioral techniques.
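
    Growth mixture models are usually fit with specialized software, but the underlying idea of modeling variation in individual drinking trajectories can be roughed out as follows: estimate per-person growth parameters (intercept and slope of percentage of drinking days over the follow-up) and cluster them with a Gaussian mixture. Everything below is synthetic and only a stand-in for the actual analysis.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(6)
    months = np.arange(1, 13)

    # Two synthetic latent classes: stable-low drinkers and escalating drinkers.
    low = 5 + 0.2 * months + rng.normal(0, 2, size=(800, 12))
    high = 20 + 3.0 * months + rng.normal(0, 5, size=(900, 12))
    y = np.vstack([low, high])            # percentage of drinking days, months 1-12

    # Per-person growth parameters (intercept, slope) via least squares ...
    X = np.column_stack([np.ones(12), months])
    coef = np.linalg.lstsq(X, y.T, rcond=None)[0].T

    # ... then cluster the growth parameters with a two-component mixture.
    classes = GaussianMixture(n_components=2, random_state=0).fit_predict(coef)
    print(np.bincount(classes))           # recovered class sizes, ~800 and ~900
    ```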

  6. Overview and highlights of Early Warning and Crop Condition Assessment project

    NASA Technical Reports Server (NTRS)

    Boatwright, G. O.; Whitehead, V. S.

    1985-01-01

    Work of the Early Warning and Crop Condition Assessment (EW/CCA) project, one of eight projects in the Agriculture and Resources Inventory Surveys Through Aerospace Remote Sensing (AgRISTARS), is reviewed. Its mission, to develop and test remote sensing techniques that enhance operational methodologies for crop condition assessment, was in response to initiatives issued by the Secretary of Agriculture. Meteorologically driven crop stress indicator models have been developed or modified for wheat, maize, grain sorghum, and soybeans. These models provide early warning alerts of potential or actual crop stresses due to water deficits, adverse temperatures, and water excess that could delay planting or harvesting operations. Recommendations are given for future research involving vegetative index numbers and the NOAA and Landsat satellites.

  7. Time to act: crossing borders in global AIDS prevention.

    PubMed

    Convisser, J; Thuermer, K

    1993-01-01

    After 9 months of market research and collaboration among local health officials, businesses, politicians, and teenagers, Population Services International (PSI) launched Project ACTION in Portland, Oregon on December 1, 1992. It is the first PSI project in the United States concerned with prevention of human immunodeficiency virus (HIV) and acquired immunodeficiency syndrome (AIDS) in American youth. PSI has conducted 23 projects worldwide over the last 20 years that were based on social marketing (the utilization of commercial marketing techniques to promote healthy behavior). The objective of the project is promotion of safe sex practices, especially the use of condoms, among sexually active youth, aged 12-21. The Mass Media and Condom Social Marketing project of PSI in Zaire was used as a model for Project ACTION. Techniques used include mass marketing campaigns, point of purchase promotion, improvement of access to key products among target populations, and adjustment of purchase price to create a market. The target populations include adolescents who use drugs, are involved with the juvenile justice system, are pregnant, have a problem home environment, are homeless or live on the street, are chronically absent from school, or have a history of sexually transmitted disease.

  8. Amplified plant turnover in response to climate change forecast by Late Quaternary records

    NASA Astrophysics Data System (ADS)

    Nogués-Bravo, D.; Veloz, S.; Holt, B. G.; Singarayer, J.; Valdes, P.; Davis, B.; Brewer, S. C.; Williams, J. W.; Rahbek, C.

    2016-12-01

    Conservation decisions are informed by twenty-first-century climate impact projections that typically predict high extinction risk. Conversely, the palaeorecord shows strong sensitivity of species abundances and distributions to past climate changes, but few clear instances of extinctions attributable to rising temperatures. However, few studies have incorporated palaeoecological data into projections of future distributions. Here we project changes in abundance and conservation status under a climate warming scenario for 187 European and North American plant taxa using niche-based models calibrated against taxa-climate relationships for the past 21,000 years. We find that incorporating long-term data into niche-based models increases the magnitude of projected future changes for plant abundances and community turnover. The larger projected changes in abundances and community turnover translate into different, and often more threatened, projected IUCN conservation status for declining tree taxa, compared with traditional approaches. An average of 18.4% (North America) and 15.5% (Europe) of taxa switch IUCN categories when compared with single-time model results. When taxa categorized as 'Least Concern' are excluded, the palaeo-calibrated models increase, on average, the conservation threat status of 33.2% and 56.8% of taxa. Notably, however, few models predict total disappearance of taxa, suggesting resilience for these taxa, if climate were the only extinction driver. Long-term studies linking palaeorecords and forecasting techniques have the potential to improve conservation assessments.

  9. Decision insight into stakeholder conflict for ERN.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siirola, John; Tidwell, Vincent Carroll; Benz, Zachary O.

    Participatory modeling has become an important tool in facilitating resource decision making and dispute resolution. Approaches to modeling that are commonly used in this context often do not adequately account for important human factors. Current techniques provide insights into how certain human activities and variables affect resource outcomes; however, they do not directly simulate the complex variables that shape how, why, and under what conditions different human agents behave in ways that affect resources and human interactions related to them. Current approaches also do not adequately reveal how the effects of individual decisions scale up to have systemic-level effects in complex resource systems. This lack of integration prevents the development of more robust models to support decision making and dispute resolution processes. Development of integrated tools is further hampered by the fact that collection of primary data for decision-making modeling is costly and time consuming. This project seeks to develop a new approach to resource modeling that incorporates both technical and behavioral modeling techniques into a single decision-making architecture. The modeling platform is enhanced by use of traditional and advanced processes and tools for expedited data capture. Specific objectives of the project are: (1) Develop a proof of concept for a new technical approach to resource modeling that combines the computational techniques of system dynamics and agent based modeling, (2) Develop an iterative, participatory modeling process supported with traditional and advanced data capture techniques that may be utilized to facilitate decision making, dispute resolution, and collaborative learning processes, and (3) Examine potential applications of this technology and process. The development of this decision support architecture included both the engineering of the technology and the development of a participatory method to build and apply the technology. Stakeholder interaction with the model and associated data capture was facilitated through two very different modes of engagement, one a standard interface involving radio buttons, slider bars, graphs and plots, while the other utilized an immersive serious gaming interface. The decision support architecture developed through this project was piloted in the Middle Rio Grande Basin to examine how these tools might be utilized to promote enhanced understanding and decision-making in the context of complex water resource management issues. Potential applications of this architecture and its capacity to lead to enhanced understanding and decision-making were assessed through qualitative interviews with study participants who represented key stakeholders in the basin.

  10. Modeling software systems by domains

    NASA Technical Reports Server (NTRS)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  11. Development of a nondestructive vibration technique for bond assessment of Space Shuttle tiles

    NASA Technical Reports Server (NTRS)

    Moslehy, Faissal A.

    1994-01-01

    This final report describes the achievements of the above-titled project, funded by NASA-KSC (Grant No. NAG 10-0117) for the period of 1 Jan. to 31 Dec. 1993. The purpose of this project was to develop a nondestructive, noncontact technique based on the 'vibration signature' of tile systems to quantify the bond conditions of the thermal protection system (TPS) tiles of Space Shuttle orbiters. The technique uses a laser rapid-scan system, modal measurements, and finite element modeling. Finite element models were developed for tiles bonded to both a clamped and a deformable integrated skin-stringer orbiter mid-fuselage. Results showed that the size and location of a disbonded tile can be determined from frequency and mode shape information. Moreover, a frequency response survey was used to quickly identify the disbonded tiles. The finite element results were compared with experimentally determined frequency responses of a 17-tile test panel, where a rapid-scan laser system was employed. An excellent degree of correlation between the mathematical simulation and experimental results was realized. An inverse solution for single-tile assemblies was also derived and is being implemented into a computer program that can interact with the modal testing software. The output of the program displays the size and location of the disbond. This program has been tested with simulated input (i.e., finite element data), and excellent agreement between predicted and simulated disbonds was shown. Finally, laser vibration imaging and acoustic emission techniques were shown to be well suited for detecting and monitoring progressive damage in Graphite/Epoxy composite materials.

  12. Locally-Adaptive, Spatially-Explicit Projection of U.S. Population for 2030 and 2050

    DOE PAGES

    McKee, Jacob J.; Rose, Amy N.; Bright, Eddie A.; ...

    2015-02-03

    Localized adverse events, including natural hazards, epidemiological events, and human conflict, underscore the criticality of quantifying and mapping current population. Moreover, knowing the spatial distribution of future population allows for increased preparation in the event of an emergency. Building on the spatial interpolation technique previously developed for high resolution population distribution data (LandScan Global and LandScan USA), we have constructed an empirically-informed spatial distribution of the projected population of the contiguous U.S. for 2030 and 2050. Whereas most current large-scale, spatially explicit population projections typically rely on a population gravity model to determine areas of future growth, our projection model departs from these by accounting for multiple components that affect population distribution. Modelled variables, which included land cover, slope, distances to larger cities, and a moving average of current population, were locally adaptive and geographically varying. The resulting weighted surface was used to determine which areas had the greatest likelihood for future population change. Population projections of county level numbers were developed using a modified version of the U.S. Census's projection methodology, with the U.S. Census's official projection as the benchmark. Applications of our model include, but are not limited to, suitability modelling, service area planning for governmental agencies, consequence assessment, mitigation planning and implementation, and assessment of spatially vulnerable populations.

  13. Locally-Adaptive, Spatially-Explicit Projection of U.S. Population for 2030 and 2050

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKee, Jacob J.; Rose, Amy N.; Bright, Eddie A.

    Localized adverse events, including natural hazards, epidemiological events, and human conflict, underscore the criticality of quantifying and mapping current population. Moreover, knowing the spatial distribution of future population allows for increased preparation in the event of an emergency. Building on the spatial interpolation technique previously developed for high resolution population distribution data (LandScan Global and LandScan USA), we have constructed an empirically-informed spatial distribution of the projected population of the contiguous U.S. for 2030 and 2050. Whereas most current large-scale, spatially explicit population projections typically rely on a population gravity model to determine areas of future growth, our projection model departs from these by accounting for multiple components that affect population distribution. Modelled variables, which included land cover, slope, distances to larger cities, and a moving average of current population, were locally adaptive and geographically varying. The resulting weighted surface was used to determine which areas had the greatest likelihood for future population change. Population projections of county level numbers were developed using a modified version of the U.S. Census's projection methodology, with the U.S. Census's official projection as the benchmark. Applications of our model include, but are not limited to, suitability modelling, service area planning for governmental agencies, consequence assessment, mitigation planning and implementation, and assessment of spatially vulnerable populations.
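
    The allocation step, distributing a county control total over grid cells according to the weighted surface, reduces to a proportional assignment. A minimal sketch with invented weight layers and totals standing in for the model's covariates:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    land_cover = rng.uniform(0, 1, size=(100, 100))    # developable-land weight
    pop_now = rng.gamma(1.0, 50.0, size=(100, 100))    # current population layer

    # Weighted surface: a blend of covariates (weights are illustrative).
    weights = 0.6 * pop_now / pop_now.max() + 0.4 * land_cover
    county_total_2030 = 1_250_000                      # benchmark county projection
    cells_2030 = county_total_2030 * weights / weights.sum()
    print(cells_2030.sum())                            # matches the control total
    ```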

  14. Laser-Induced Thermal Acoustic Measurements in a Highly Back-Pressured Scramjet Isolator Model: A Research Plan

    NASA Technical Reports Server (NTRS)

    Middleton, Troy F.; Balla, Robert J.; Baurle, Robert A.; Wilson, Lloyd G.

    2008-01-01

    Under the Propulsion Discipline of NASA's Fundamental Aeronautics Program's Hypersonics Project, a test apparatus for testing a scramjet isolator model is being constructed at NASA's Langley Research Center. The test apparatus will incorporate a 1-inch by 2-inch by 15-inch-long scramjet isolator model supplied with 2.1 lbm/sec of unheated dry air through a Mach 2.5 converging-diverging nozzle. The planned research will incorporate progressively more challenging measurement techniques to characterize the flow field within the isolator, concluding with the application of the Laser-Induced Thermal Acoustic (LITA) measurement technique. The primary goal of this research is to use the data acquired to validate Computational Fluid Dynamics (CFD) models employed to characterize the complex flow field of a scramjet isolator. This paper describes the test apparatus being constructed, pre-test CFD simulations, and the LITA measurement technique.

  15. Thermal management of VECSELs by front surface direct liquid cooling

    NASA Astrophysics Data System (ADS)

    Smyth, Conor J. C.; Mirkhanov, Shamil; Quarterman, Adrian H.; Wilcox, Keith G.

    2016-03-01

    Efficient thermal management is vital for VECSELs, affecting the output power and several other aspects of device performance. Presently there exist two distinct methods of effective thermal management, each with its own merits and disadvantages. Substrate removal of the VECSEL gain chip has proved a successful method in devices emitting at a wavelength near 1μm. However, for other wavelengths the substrate removal technique has proved less effective, primarily due to the thermal impedance of the distributed Bragg reflectors. The second method of thermal management involves the use of crystalline heat spreaders bonded to the gain chip surface. Although this is an effective thermal management scheme, the disadvantages are additional loss and the etalon effect that filters the gain spectrum, making mode locking more difficult and normally resulting in multiple peaks in the spectrum. There are considerable disadvantages associated with both methods, attributed to heat spreader cost and sample processing. It is for these reasons that a proposed alternative, front-surface liquid cooling, has been investigated in this project. Direct liquid cooling involves flowing a temperature-controlled liquid over the sample's surface. In this project COMSOL was used to model surface liquid cooling of a VECSEL sample in order to investigate and compare its potential thermal management with current standard thermal management techniques. Based on the modelling, experiments were carried out in order to evaluate the performance of the technique. While modelling suggests that this is potentially a mid-performance, low-cost alternative to existing techniques, experimental measurements to date do not reflect the performance predicted by the modelling.

  16. Effect of microstructure on the static and dynamic behavior of recycled asphalt material

    DOT National Transportation Integrated Search

    2002-07-01

    This report describes the research activities of a project dealing with theoretical/numerical modeling and experimental studies of the micromechanical behavior of recycled asphalt material. The theoretical work employed finite element techniques to d...

  17. Characterizing the deformation of reservoirs using interferometry, gravity, and seismic analyses

    NASA Astrophysics Data System (ADS)

    Schiek, Cara Gina

    In this dissertation, I characterize how reservoirs deform using surface and subsurface techniques. The surface technique I employ is radar interferometry, also known as InSAR (Interferometric Synthetic Aperture Radar). The subsurface analyses I explore include gravity modeling and seismic techniques consisting of determining earthquake locations from a small-temporary seismic network of six seismometers. These techniques were used in two different projects to determine how reservoirs deform in the subsurface and how this deformation relates to its remotely sensed surface deformation. The first project uses InSAR to determine land subsidence in the Mimbres basin near Deming, NM. The land subsidence measurements are visually compared to gravity models in order to determine the influence of near surface faults on the subsidence and the physical properties of the aquifers in these basins. Elastic storage coefficients were calculated for the Mimbres basin to aid in determining the stress regime of the aquifers. In the Mimbres basin, I determine that it is experiencing elastic deformation at differing compaction rates. The west side of the Mimbres basin is deforming faster, 17 mm/yr, while the east side of the basin is compacting at a rate of 11 mm/yr. The second project focuses on San Miguel volcano, El Salvador. Here, I integrate InSAR with earthquake locations using surface deformation forward modeling to investigate the explosive volcanism in this region. This investigation determined the areas around the volcano that are undergoing deformation, and that could lead to volcanic hazards such as slope failure from a fractured volcano interior. I use the earthquake epicenters with field data to define the subsurface geometry of the deformation source, which I forward model to produce synthetic interferograms. Residuals between the synthetic and observed interferograms demonstrate that the observed deformation is a direct result of the seismic activity along the San Miguel Fracture Zone. Based on the large number of earthquakes concentrated in this region and the fracturing suggested by the earthquake location results, I conclude that the southwestern slope of San Miguel is the most susceptible to volcanic hazards such as landsliding and flank lava flows. Together these projects explore the dynamics of reservoir systems, both hydrologic and magmatic. They show the utility of geodetic remote sensing to constrain the relative importance of various, complex, subsurface processes, including faulting, fluid migration, and compaction.

  18. Calibration of an arbitrarily arranged projection moiré system for 3D shape measurement

    NASA Astrophysics Data System (ADS)

    Tang, Ying; Yao, Jun; Zhou, Yihao; Sun, Chen; Yang, Peng; Miao, Hong; Chen, Jubing

    2018-05-01

    An arbitrarily arranged projection moiré system is presented for three-dimensional shape measurement. We develop a model for the projection moiré system and derive a universal formula expressing the relation between height and the phase variation before and after the object is placed on the reference plane. With so many system parameters involved, a system calibration technique is needed. In this work, we provide a robust and accurate calibration method for an arbitrarily arranged projection moiré system. The system no longer places restrictions on the configuration of the optical setup. Real experiments have been conducted to verify the validity of this method.

  19. Body as Echoes: Cyber Archiving of Dazu Rock Carvings

    NASA Astrophysics Data System (ADS)

    Chen, W.-W.

    2017-08-01

    "Body As Echoes: Cyber Archiving of Dazu Rock Carvings (BAE project in short)" strives to explore the tangible/intangible aspects of digital heritage conservation. Aiming at Dazu Rock Carvings - World Heritage Site of Sichuan Province, BAE project utilizes photogrammetry and digital sculpting technique to investigate digital narrative of cultural heritage conservation. It further provides collaborative opportunities to conduct the high-resolution site survey for scholars and institutions at local authorities. For preserving and making sustainable of the tangible cultural heritage at Dazu Rock Carvings, BAE project cyber-archives the selected niches and the caves at Dazu, and transform them into high-resolution, three-dimensional models. For extending the established results and making the digital resources available to broader audiences, BAE project will further develop interactive info-motion interface and apply the knowledge of digital heritage from BAE project to STEM education. BAE project expects to bridge the platform for archeology, computer graphics, and interactive info-motion design. Digital sculpting, projection mapping, interactive info-motion and VR will be the core techniques to explore the narrative of digital heritage conservation. For further protecting, educating and consolidating "building dwelling thinking" through digital heritage preservation, BAE project helps to preserve the digital humanity, and reach out to museum staffs and academia. By the joint effort of global institutions and local authorities, BAE project will also help to foster and enhance the mutual understanding through intercultural collaborations.

  20. A best-fit model for concept vectors in biomedical research grants.

    PubMed

    Johnson, Calvin; Lau, William; Bhandari, Archna; Hays, Timothy

    2008-11-06

    The Research, Condition, and Disease Categorization (RCDC) project was created to standardize budget reporting by research topic. Text mining techniques have been implemented to classify NIH grant applications into proper research and disease categories. A best-fit model is shown to achieve classification performance rivaling that of concept vectors produced by human experts.

  1. The Collinearity Free and Bias Reduced Regression Estimation Project: The Theory of Normalization Ridge Regression. Report No. 2.

    ERIC Educational Resources Information Center

    Bulcock, J. W.; And Others

    Multicollinearity refers to the presence of highly intercorrelated independent variables in structural equation models, that is, models estimated by using techniques such as least squares regression and maximum likelihood. There is a problem of multicollinearity in both the natural and social sciences where theory formulation and estimation is in…

  2. Modelling the Penetration of Salicylates through Skin Using a Silicone Membrane

    ERIC Educational Resources Information Center

    Wilkins, Andrew; Parmenter, Emily

    2012-01-01

    A diffusion cell to model the permeation of salicylate drugs through the skin using low-cost materials and a sensitive colorimetric analytical technique is described. The diffusion apparatus has been used at a further education college by a student for her AS-level Extended Project to investigate the permeation rates of salicylic acid…

  3. Improving Teacher Attitude and Morale through Maintaining Teacher Effectiveness: An Indiana Staff Development Model.

    ERIC Educational Resources Information Center

    Gilman, David A.; And Others

    The purpose of this study was to determine the effects of Maintaining Teaching Effectiveness, a staff development model, upon public school educators' attitudes toward various professional and personal factors. The techniques used for the project included a collegial support network and peer coaching. A total of 24 educators participated from…

  4. Model and controller reduction of large-scale structures based on projection methods

    NASA Astrophysics Data System (ADS)

    Gildin, Eduardo

    The design of low-order controllers for high-order plants is a challenging problem, theoretically as well as from a computational point of view. Frequently, robust controller design techniques result in high-order controllers. It is then interesting to achieve reduced-order models and controllers while maintaining robustness properties. Controllers designed for large structures, based on models obtained by finite element techniques, yield large state-space dimensions. In this case, problems related to storage, accuracy and computational speed may arise. Thus, model reduction methods capable of addressing controller reduction problems are of primary importance in making advanced controller design methods practically applicable to high-order systems. A challenging large-scale control problem that has emerged recently is the protection of civil structures, such as high-rise buildings and long-span bridges, from dynamic loadings such as earthquakes, high wind, heavy traffic, and deliberate attacks. Even though significant effort has been spent applying control theory to the design of civil structures in order to increase their safety and reliability, several challenging issues remain open problems for real-time implementation. This dissertation addresses the development of methodologies for controller reduction, using projection methods, for real-time implementation in seismic protection of civil structures. Three classes of schemes are analyzed for model and controller reduction: modal truncation, singular value decomposition methods, and Krylov-based methods. A family of benchmark problems for structural control is used as a framework for a comparative study of model and controller reduction techniques. It is shown that classical model and controller reduction techniques, such as balanced truncation, modal truncation and moment matching by Krylov techniques, yield reduced-order controllers that do not guarantee stability of the closed-loop system, that is, of the reduced-order controller implemented with the full-order plant. A controller reduction approach that guarantees closed-loop stability is therefore proposed. It is based on the concept of dissipativity (or positivity) of linear dynamical systems. Utilizing passivity-preserving model reduction together with dissipative-LQG controllers, effective low-order optimal controllers are obtained. Results are shown through simulations.
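
    As a concrete illustration of one of the reduction schemes named above, the following is a minimal sketch of square-root balanced truncation for a stable LTI system, using SciPy's Lyapunov solver; the toy system and the truncation order r are assumptions for illustration. Note that, as the abstract points out, truncating a controller this way does not by itself guarantee closed-loop stability.

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

    def balanced_truncation(A, B, C, r):
        """Square-root balanced truncation of a stable LTI system (A, B, C)."""
        # Gramians: A Wc + Wc A^T + B B^T = 0 and A^T Wo + Wo A + C^T C = 0
        Wc = solve_continuous_lyapunov(A, -B @ B.T)
        Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
        Lc = cholesky(Wc, lower=True)          # Wc = Lc Lc^T
        Lo = cholesky(Wo, lower=True)          # Wo = Lo Lo^T
        U, s, Vt = svd(Lo.T @ Lc)              # Hankel singular values in s
        S = np.diag(s[:r] ** -0.5)
        T = Lc @ Vt[:r].T @ S                  # projection onto dominant states
        Tinv = S @ U[:, :r].T @ Lo.T
        return Tinv @ A @ T, Tinv @ B, C @ T, s

    # Toy stable system (assumed for illustration)
    rng = np.random.default_rng(0)
    n = 10
    A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))
    B = rng.standard_normal((n, 1))
    C = rng.standard_normal((1, n))
    Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=3)
    print("Hankel singular values:", np.round(hsv, 4))
    ```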

  5. Observational and Modeling-based Study of Corsica Thunderstorms: Preparation of the EXAEDRE Airborne Campaign

    NASA Astrophysics Data System (ADS)

    Defer, E.; Coquillat, S.; Lambert, D.; Pinty, J. P.; Prieur, S.; Caumont, O.; Labatut, L.; Nuret, M.; Blanchet, P.; Buguet, M.; Lalande, P.; Labrouche, G.; Pedeboy, S.; Lojou, J. Y.; Schwarzenboeck, A.; Delanoë, J.; Bourdon, A.; Guiraud, L.

    2017-12-01

    The 4-year EXAEDRE (EXploiting new Atmospheric Electricity Data for Research and the Environment; Oct 2016-Sept 2020) project is sponsored by the French science foundation ANR (Agence Nationale de la Recherche). This project is a French contribution to the HyMeX (HYdrological cycle in the Mediterranean EXperiment) program. The EXAEDRE activities rely on innovative, multi-disciplinary, state-of-the-art instrumentation and modeling tools to provide a comprehensive description of the electrical activity in thunderstorms. The EXAEDRE observational part is based on i) existing lightning records collected during the HyMeX Special Observation Period (SOP1; Sept-Nov 2012), as well as permanent lightning observations provided by the SAETTA research Lightning Mapping Array and the operational Météorage lightning locating system, ii) additional lightning observations mapped with a new VHF interferometer developed especially within the EXAEDRE project, and iii) a dedicated airborne campaign over Corsica. The modeling part of the EXAEDRE project exploits the electrification and lightning schemes developed in the cloud-resolving model MesoNH and promotes an innovative technique of flash data assimilation in AROME, the French operational model of Météo-France. An overview of the EXAEDRE project will be given, with emphasis on the instrumental, observational and modeling activities performed during the first year of the project. The preparation of the EXAEDRE airborne campaign, scheduled for September 2018 over Corsica, will then be discussed. Acknowledgements: The EXAEDRE project is sponsored by grant ANR-16-CE04-0005 with support from the MISTRALS/HyMeX meta-program.

  6. Cost and schedule analytical techniques development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period of 3 Aug. 1991 - 30 Nov. 1994. Accomplishments summarized cover the REDSTAR data base, NASCOM hard copy data base, NASCOM automated data base, NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.

  7. A CMMI-based approach for medical software project life cycle study.

    PubMed

    Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi

    2013-01-01

    In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries of other nations. In addition, systematic development processes are indispensable elements of software development: they can help developers increase their productivity and efficiency and avoid unnecessary risks arising during the development process. Thus, this paper presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. This application was intended to integrate the user requirements, system design and testing of software development processes into a three-layer (Domain, Concept and Instance) model, expressed in structural System Modeling Language (SysML) diagrams, and to convert part of the manual effort necessary for project management maintenance into computational effort, for example through (semi-)automatic delivery of traceability management. In this application, LW-CMMI supports establishing the artifacts "requirement specification document", "project execution plan document", "system design document" and "system test document", and can deliver a prototype of a lightweight project management tool for the nuclear medicine software project. The results of this application can serve as a reference for other medical institutions in developing medical information systems and supporting project management, with the aim of improving patient safety.

  8. The effects of climate downscaling technique and observational data set on modeled ecological responses.

    PubMed

    Pourmokhtarian, Afshin; Driscoll, Charles T; Campbell, John L; Hayhoe, Katharine; Stoner, Anne M K

    2016-07-01

    Assessments of future climate change impacts on ecosystems typically rely on multiple climate model projections, but often utilize only one downscaling approach trained on one set of observations. Here, we explore the extent to which modeled biogeochemical responses to changing climate are affected by the selection of the climate downscaling method and training observations used at the montane landscape of the Hubbard Brook Experimental Forest, New Hampshire, USA. We evaluated three downscaling methods: the delta method (or the change factor method), monthly quantile mapping (Bias Correction-Spatial Disaggregation, or BCSD), and daily quantile regression (Asynchronous Regional Regression Model, or ARRM). Additionally, we trained outputs from four atmosphere-ocean general circulation models (AOGCMs) (CCSM3, HadCM3, PCM, and GFDL-CM2.1) driven by higher (A1fi) and lower (B1) future emissions scenarios on two sets of observations (1/8° resolution grid vs. individual weather station) to generate the high-resolution climate input for the forest biogeochemical model PnET-BGC (eight ensembles of six runs). The choice of downscaling approach and spatial resolution of the observations used to train the downscaling model impacted modeled soil moisture and streamflow, which in turn affected forest growth, net N mineralization, net soil nitrification, and stream chemistry. All three downscaling methods were highly sensitive to the observations used, resulting in projections that were significantly different between station-based and grid-based observations. The choice of downscaling method also slightly affected the results, however not as much as the choice of observations. Using spatially smoothed gridded observations and/or methods that do not resolve sub-monthly shifts in the distribution of temperature and/or precipitation can produce biased results in model applications run at greater temporal and/or spatial resolutions. These results underscore the importance of carefully considering field observations used for training, as well as the downscaling method used to generate climate change projections, for smaller-scale modeling studies. Different sources of variability, including selection of AOGCM, emissions scenario, downscaling technique, and data used for training downscaling models, result in a wide range of projected forest ecosystem responses to future climate change. © 2016 by the Ecological Society of America.
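
    Two of the downscaling methods evaluated above are simple enough to sketch directly. The following is a minimal NumPy illustration of the delta (change factor) method and of empirical quantile mapping (the bias-correction part of BCSD, shown without the spatial disaggregation step); the synthetic observation and model series are assumptions for illustration.

    ```python
    import numpy as np

    def delta_method(obs_hist, mod_hist, mod_fut):
        """Delta/change-factor downscaling: apply the model's mean change
        to the observed record (additive form, e.g. for temperature)."""
        return obs_hist + (mod_fut.mean() - mod_hist.mean())

    def quantile_mapping(obs_hist, mod_hist, mod_fut):
        """Empirical quantile mapping: replace each future model value with
        the observed value at the same quantile of the historical model CDF."""
        quantiles = np.interp(mod_fut, np.sort(mod_hist),
                              np.linspace(0.0, 1.0, mod_hist.size))
        return np.quantile(obs_hist, quantiles)

    rng = np.random.default_rng(1)
    obs = rng.normal(10.0, 3.0, 5000)      # station observations (synthetic)
    mod_h = rng.normal(12.0, 2.0, 5000)    # biased model, historical period
    mod_f = rng.normal(14.0, 2.5, 5000)    # model, future scenario
    print(delta_method(obs, mod_h, mod_f).mean(),
          quantile_mapping(obs, mod_h, mod_f).mean())
    ```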

  9. The German collaborative project SUGAR Utilization of a natural treasure - Developing innovative techniques for the exploration and production of natural gas from hydrate-bearing sediments

    NASA Astrophysics Data System (ADS)

    Haeckel, M.; Bialas, J.; Wallmann, K. J.

    2009-12-01

    Gas hydrates occur in nature at all active and passive continental margins as well as in permafrost regions, and vast amounts of natural gas are bound in those deposits. Geologists estimate that twice as much carbon is bound in gas hydrates as in any other fossil fuel reservoir, such as gas, oil or coal. Hence, natural gas hydrates represent a huge potential energy resource that, in addition, could be utilized in a CO2-neutral and therefore environmentally friendly manner. However, the utilization of this natural treasure is not as easy as the conventional production of oil or natural gas and calls for new and innovative techniques. In the framework of the large-scale collaborative research project SUGAR (Submarine Deposits of Gas Hydrates - Exploration, Production and Transportation), we aim to produce gas from methane hydrates and to sequester carbon dioxide from power plants and other industrial sources as CO2 hydrates in the same host sediments. Thus, the SUGAR project addresses two of the most pressing and challenging topics of our time: the development of alternative energy strategies and greenhouse gas mitigation techniques. The SUGAR project is funded by two federal German ministries and by German industry for an initial period of three years. In the framework of this project, new technologies ranging from gas hydrate exploration techniques and drilling technologies to innovative gas production methods, CO2 storage in gas hydrates, and gas transportation technologies will be developed and tested. Besides the experiments, numerical simulation studies will generate data regarding methane production and CO2 sequestration in the natural environment. Reservoir modelling with respect to gas hydrate formation and the development of migration pathways completes the project. This contribution will give detailed information about the planned project parts and first results, with a focus on the production methods.

  10. Modelling obesity trends in Australia: unravelling the past and predicting the future.

    PubMed

    Hayes, A J; Lung, T W C; Bauman, A; Howard, K

    2017-01-01

    Modelling is increasingly being used to predict the epidemiology of obesity progression and its consequences. The aims of this study were: (a) to present and validate a model for prediction of obesity among Australian adults and (b) to use the model to project the prevalence of obesity and severe obesity by 2025. Individual-level simulation was combined with survey estimation techniques to model the changing population body mass index (BMI) distribution over time. The model input population was derived from a nationally representative survey in 1995, representing over 12 million adults. Simulations were run for 30 years. The model was validated retrospectively and then used to predict obesity and severe obesity by 2025 among different aged cohorts and at a whole-population level. The changing BMI distribution over time was well predicted by the model, and projected prevalences of weight status groups agreed with population-level data in 2008, 2012 and 2014. The model predicts more growth in obesity among younger than older adult cohorts. Projections at a whole-population level were that healthy weight will decline, overweight will remain steady, but obesity and severe obesity prevalence will continue to increase beyond 2016. Adult obesity prevalence was projected to increase from 19% in 1995 to 35% by 2025. Severe obesity (BMI>35), which was only around 5% in 1995, was projected to be 13% by 2025, two to three times the 1995 level. The projected rise in obesity and severe obesity will have more substantial cost and healthcare system implications than in previous decades. Having a robust epidemiological model is key to predicting these long-term costs and health outcomes into the future.
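
    A minimal sketch of the individual-level simulation idea, not the paper's calibrated model: each simulated adult's BMI receives an assumed annual drift plus random variation, and prevalence is read off the final distribution. The starting distribution, drift and noise parameters below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    bmi = rng.lognormal(mean=np.log(25.0), sigma=0.18, size=n)  # 1995-like BMI

    drift, noise = 0.10, 0.25      # assumed annual BMI change (kg/m^2)
    for year in range(30):         # simulate 1995 -> 2025
        bmi += drift + noise * rng.standard_normal(n)

    print("obese (BMI>=30): %.1f%%" % (100 * np.mean(bmi >= 30)))
    print("severe (BMI>35): %.1f%%" % (100 * np.mean(bmi > 35)))
    ```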

  11. Portable Wireless LAN Device and Two-Way Radio Threat Assessment for Aircraft VHF Communication Radio Band

    NASA Technical Reports Server (NTRS)

    Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Williams, Reuben A.; Smith, Laura J.; Salud, Maria Theresa P.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  12. Scalable and Accurate SMT-based Model Checking of Data Flow Systems

    DTIC Science & Technology

    2013-10-30

    guided by the semantics of the description language. In this project we developed instead a complementary and novel approach based on a somewhat brute... believe that our approach could help considerably in expanding the reach of abstract interpretation techniques to a variety of target languages, as... project. We worked on developing a framework for compositional verification that capitalizes on the fact that data-flow languages, such as Lustre, have

  13. Computational modeling of optical projection tomographic microscopy using the finite difference time domain method.

    PubMed

    Coe, Ryan L; Seibel, Eric J

    2012-12-01

    We present a method for modeling image formation in optical projection tomographic microscopy (OPTM) using high numerical aperture (NA) condensers and objectives. Similar to techniques used in computed tomography, OPTM produces three-dimensional, reconstructed images of single cells from two-dimensional projections. The model is capable of simulating axial scanning of a microscope objective to produce projections, which are reconstructed using filtered backprojection. Simulation of optical scattering in transmission optical microscopy is designed to analyze all aspects of OPTM image formation, such as degree of specimen staining, refractive-index matching, and objective scanning. In this preliminary work, a set of simulations is performed to examine the effect of changing the condenser NA, objective scan range, and complex refractive index on the final reconstruction of a microshell with an outer radius of 1.5 μm and an inner radius of 0.9 μm. The model lays the groundwork for optimizing OPTM imaging parameters and triaging efforts to further improve the overall system design. As the model is expanded in the future, it will be used to simulate a more realistic cell, which could lead to even greater impact.
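
    The reconstruction step described above, filtered backprojection from a set of projections, can be sketched with scikit-image's Radon transform utilities (assuming a recent scikit-image where the keyword is filter_name). The microshell below is a crude 2-D stand-in for the phantom described in the abstract, with radii in pixels rather than micrometres.

    ```python
    import numpy as np
    from skimage.transform import radon, iradon

    # Synthetic "cell": a disc plus a denser shell, standing in for the
    # microshell phantom (outer and shell radii are arbitrary here).
    n = 128
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    r = np.hypot(x, y)
    phantom = (r < 0.5).astype(float) + (np.abs(r - 0.35) < 0.05)

    angles = np.linspace(0.0, 180.0, 180, endpoint=False)
    sinogram = radon(phantom, theta=angles)             # forward projections
    recon = iradon(sinogram, theta=angles, filter_name="ramp")
    print("mean reconstruction error:", np.abs(recon - phantom).mean())
    ```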

  14. From immunology to MRI data analysis: Problems in mathematical biology

    NASA Astrophysics Data System (ADS)

    Waters, Ryan Samuel

    This thesis represents a collection of four distinct biological projects, arising from immunology and metabolomics, that required unique and creative mathematical approaches. One project focuses on understanding the role IL-2 plays in immune response regulation and on exploring how these effects can be altered. We developed several dynamic models of the receptor signaling network, which we analyze analytically and numerically. In a second project, also focused on multiple sclerosis (MS), we sought to create a system for grading magnetic resonance images (MRI) that correlates well with disability. The goal is for these MRI scores to provide a better standard for large-scale clinical drug trials, limiting the bias associated with differences in available MRI technology and general grader/participant variability. The third project involves the study of the CRISPR adaptive immune system in bacteria. Bacterial cells recognize and acquire snippets of exogenous genetic material, which they incorporate into their DNA. In this project we explore the optimal design of the CRISPR system, given a viral distribution, to maximize the probability of survival. The final project involves the study of the benefits of colocalization of coupled enzymes in metabolic pathways. The hypothesized kinetic advantage, known as 'channeling', of putting coupled enzymes closer together has been used as justification for the colocalization of coupled enzymes in biological systems. We developed and analyzed a simple partial differential equation model of the diffusion of the intermediate substrate between coupled enzymes to explore the phenomenon of channeling. The four projects of my thesis represent very distinct biological problems that required a variety of techniques from diverse areas of mathematics, ranging from dynamical modeling to statistics, Fourier series and calculus of variations. In each case, quantitative techniques were used to address biological questions from a mathematical perspective, ultimately providing insight back into the biological problems which motivated them.

  15. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in models and data. Uncertainty in hydroclimatic projections arises from uncertainty in the hydrologic model as well as from the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multi-variate post-processing method for historical simulations and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods, Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that application of the post-processing technique leads to considerably higher accuracy in historical simulations and also reduces model uncertainty in future streamflow projections.
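
    A minimal sketch of the copula idea underlying such a post-processor, not the authors' full Bayesian framework: observations and simulations are mapped to normal scores through their empirical CDFs, a Gaussian-copula correlation is fitted, and a new simulated flow is replaced by the conditional median of the observations. The synthetic flow data below are assumptions for illustration.

    ```python
    import numpy as np
    from scipy.stats import norm, rankdata

    def fit_gaussian_copula(obs, sim):
        """Normal scores plus correlation: the bivariate Gaussian-copula
        part of a copula-based post-processor (simplified stand-in)."""
        u = rankdata(obs) / (obs.size + 1.0)
        v = rankdata(sim) / (sim.size + 1.0)
        return np.corrcoef(norm.ppf(u), norm.ppf(v))[0, 1]

    def post_process(sim_new, obs_hist, sim_hist, rho):
        """Map a new simulated flow to the conditional median of observations:
        Z_obs | Z_sim = z is N(rho*z, 1 - rho^2), so the median is rho*z."""
        v = np.interp(sim_new, np.sort(sim_hist),
                      np.arange(1, sim_hist.size + 1) / (sim_hist.size + 1.0))
        z_cond = rho * norm.ppf(v)
        return np.quantile(obs_hist, norm.cdf(z_cond))

    rng = np.random.default_rng(3)
    obs = rng.gamma(2.0, 50.0, 2000)                 # "observed" flows
    sim = 0.7 * obs + rng.gamma(2.0, 20.0, 2000)     # biased "simulated" flows
    rho = fit_gaussian_copula(obs, sim)
    print(rho, post_process(np.array([100.0, 200.0]), obs, sim, rho))
    ```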

  16. Utilization of advanced calibration techniques in stochastic rock fall analysis of quarry slopes

    NASA Astrophysics Data System (ADS)

    Preh, Alexander; Ahmadabadi, Morteza; Kolenprat, Bernd

    2016-04-01

    In order to study rock fall dynamics, a research project was conducted by the Vienna University of Technology and the Austrian Central Labour Inspectorate (Federal Ministry of Labour, Social Affairs and Consumer Protection). Part of this project comprised 277 full-scale drop tests at three different quarries in Austria, with key parameters of the rock fall trajectories recorded. The tests involved a total of 277 boulders ranging from 0.18 to 1.8 m in diameter and from 0.009 to 8.1 Mg in mass. The geology of these sites included strong rock belonging to igneous, metamorphic and volcanic types. In this paper the results of the tests are used for calibration and validation of a new stochastic computer model. It is demonstrated that the error of the model (i.e. the difference between observed and simulated results) has a lognormal distribution. Selecting two parameters, advanced calibration techniques, including the Markov Chain Monte Carlo technique, Maximum Likelihood and Root Mean Square Error (RMSE), are utilized to minimize the error. Validation of the model based on the cross-validation technique reveals that, in general, reasonable stochastic approximations of the rock fall trajectories are obtained in all dimensions, including runout, bounce heights and velocities. The approximations are compared to the measured data in terms of median, 95% and maximum values. The results of the comparisons indicate that approximate first-order predictions, using a single set of input parameters, are possible and can be used to aid practical hazard and risk assessment.
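
    A minimal sketch of the RMSE-based part of such a calibration, exploiting the lognormal-error finding reported above: because the model error is lognormal, the two parameters are fitted by minimizing the RMSE of log-residuals. The stand-in simulator, parameter names and data below are invented for illustration; a real application would call the stochastic trajectory model.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def simulate_runout(params, n=277, seed=0):
        """Stand-in stochastic rock fall simulator: 'params' represent the two
        calibration parameters (e.g. restitution and friction coefficients).
        A real application would run the trajectory model here."""
        rn, mu = params
        rng = np.random.default_rng(seed)
        return 40.0 * rn / mu * rng.lognormal(0.0, 0.3, n)

    observed = simulate_runout((0.6, 0.55), seed=1)   # pretend field data

    def log_rmse(params):
        # Lognormal model error -> calibrate on log-residuals
        if np.any(np.asarray(params) <= 0):
            return np.inf                 # keep the search physical
        sim = simulate_runout(params)
        return np.sqrt(np.mean((np.log(sim) - np.log(observed)) ** 2))

    res = minimize(log_rmse, x0=[0.5, 0.5], method="Nelder-Mead")
    print("calibrated parameters:", res.x, "log-RMSE:", res.fun)
    ```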

  17. Final report for “Extreme-scale Algorithms and Solver Resilience”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gropp, William Douglas

    2017-06-30

    This is a joint project with principal investigators at Oak Ridge National Laboratory, Sandia National Laboratories, the University of California at Berkeley, and the University of Tennessee. Our part of the project involves developing performance models for highly scalable algorithms and the development of latency tolerant iterative methods. During this project, we extended our performance models for the Multigrid method for solving large systems of linear equations and conducted experiments with highly scalable variants of conjugate gradient methods that avoid blocking synchronization. In addition, we worked with the other members of the project on alternative techniques for resilience and reproducibility. We also presented an alternative approach for reproducible dot-products in parallel computations that performs almost as well as the conventional approach by separating the order of computation from the details of the decomposition of vectors across the processes.
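
    The reproducible dot-product idea, decoupling the order of summation from the decomposition of the vectors, can be sketched as follows. This illustrates the general principle, not the project's actual algorithm: partial sums are always formed over the same fixed-size blocks and combined in a fixed binary-tree order, so the result never depends on how the vector happens to be split across processes.

    ```python
    import numpy as np

    def reproducible_dot(x, y, block=256):
        """Dot product with a fixed, decomposition-independent summation
        order: products are summed in fixed-size blocks, then the block
        sums are combined by pairwise (binary-tree) reduction. Every run,
        at any process count, yields the bitwise-identical result."""
        p = x * y
        sums = [np.sum(p[i:i + block]) for i in range(0, p.size, block)]
        while len(sums) > 1:                 # pairwise tree reduction
            if len(sums) % 2:
                sums.append(0.0)             # pad to an even count
            sums = [sums[i] + sums[i + 1] for i in range(0, len(sums), 2)]
        return sums[0]

    rng = np.random.default_rng(7)
    x, y = rng.standard_normal(10_000), rng.standard_normal(10_000)
    print(reproducible_dot(x, y), np.dot(x, y))   # agree to rounding
    ```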

  18. Investigation of advanced phase-shifting projected fringe profilometry techniques

    NASA Astrophysics Data System (ADS)

    Liu, Hongyu

    1999-11-01

    The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle, with some new approaches, three important problems which severely limit the capability and the accuracy of the PSPFP technique. Chapter 1 briefly introduces background information on the PSPFP technique, including the measurement principles, basic features, and related techniques; the objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of absolute PSPFP measurement. The mathematical formulations and basic requirements of the absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the previous theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced, together with some brief comments. Chapter 4 presents a theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on system design are given to improve measurement accuracy. Chapter 5 discusses a new technique for combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. Techniques coping with two major effects of surface reflectivity variations are then introduced, and some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.
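
    The phase-extraction step at the heart of PSPFP can be sketched with the standard four-step phase-shifting algorithm (the textbook algorithm, not necessarily the exact variant used in the dissertation); the synthetic fringe patterns below are assumptions for illustration.

    ```python
    import numpy as np

    def four_step_phase(I1, I2, I3, I4):
        """Standard 4-step phase-shifting algorithm (shifts of 0, 90, 180,
        270 degrees): recovers the wrapped fringe phase independently of
        the background intensity and the fringe modulation."""
        return np.arctan2(I4 - I2, I1 - I3)

    # Synthetic fringe images carrying an unknown phase map 'phi'
    x = np.linspace(0, 4 * np.pi, 512)        # carrier phase
    phi = 0.8 * np.sin(x / 3.0)               # "object" phase to recover
    a, b = 0.5, 0.4                           # background, modulation
    frames = [a + b * np.cos(x + phi + k * np.pi / 2) for k in range(4)]

    phi_rec = four_step_phase(*frames) - x    # subtract the carrier
    phi_rec = np.angle(np.exp(1j * phi_rec))  # re-wrap to (-pi, pi]
    print(np.allclose(phi_rec, phi, atol=1e-6))
    ```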

  19. The Baselines Project: Establishing Reference Environmental Conditions for Marine Habitats in the Gulf of Mexico using Forecast Models and Satellite Data

    NASA Astrophysics Data System (ADS)

    Jolliff, J. K.; Gould, R. W.; deRada, S.; Teague, W. J.; Wijesekera, H. W.

    2012-12-01

    We provide an overview of the NASA-funded project, "High-Resolution Subsurface Physical and Optical Property Fields in the Gulf of Mexico: Establishing Baselines and Assessment Tools for Resource Managers." Data assimilative models, analysis fields, and multiple satellite data streams were used to construct temperature and photon flux climatologies for the Flower Garden Banks National Marine Sanctuary (FGBNMS) and similar habitats in the northwestern Gulf of Mexico, where geologic features provide a platform for unique coral reef ecosystems. Comparison metrics of the products against in situ data collected during complementary projects are also examined. Similarly, high-resolution satellite data streams and advanced processing techniques were used to establish baseline suspended sediment load and turbidity conditions in selected northern Gulf of Mexico estuaries. The results demonstrate the feasibility of blending models and data into accessible web-based analysis products for resource managers, policy makers, and the public.

  20. The effect of Missouri mathematics project learning model on students’ mathematical problem solving ability

    NASA Astrophysics Data System (ADS)

    Handayani, I.; Januar, R. L.; Purwanto, S. E.

    2018-01-01

    This research aims to determine the influence of the Missouri Mathematics Project learning model on the mathematical problem-solving ability of students at junior high school. This is a quantitative study using the experimental method with a quasi-experimental design. The research population includes all students of grade VII of a junior high school enrolled in the even semester of the academic year 2016/2017. The sample comprises 76 students from experimental and control groups. The sampling technique used is the cluster sampling method. The instrument consists of 7 essay questions whose validity, reliability, difficulty level and discriminating power have been tested. Before the data were analyzed using a t-test, they were checked for normality and homogeneity. The results show that there is an influence of the Missouri Mathematics Project learning model on the mathematical problem-solving ability of junior high school students, with a medium effect size.
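
    The analysis pipeline described above (normality check, homogeneity check, then an independent-samples t-test) can be sketched with SciPy; the score data below are invented for illustration and are not the study's data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    experimental = rng.normal(75, 10, 38)   # illustrative scores, n=38 per group
    control = rng.normal(68, 10, 38)

    # Prerequisites checked in the study: normality and homogeneity of variance
    _, p_exp = stats.shapiro(experimental)
    _, p_ctl = stats.shapiro(control)
    _, p_lev = stats.levene(experimental, control)
    print(f"Shapiro p (exp, ctl): {p_exp:.3f}, {p_ctl:.3f}; Levene p: {p_lev:.3f}")

    # Independent two-sample t-test (equal variances, if Levene's test passes)
    t, p = stats.ttest_ind(experimental, control)
    print(f"t = {t:.3f}, p = {p:.4f}")
    ```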

  1. SU-G-JeP3-04: Estimating 4D CBCT from Prior Information and Extremely Limited Angle Projections Using Structural PCA and Weighted Free-Form Deformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, W; Yin, F; Zhang, Y

    Purpose: To investigate the feasibility of using structure-based principal component analysis (PCA) motion-modeling and weighted free-form deformation to estimate on-board 4D-CBCT, using prior information and extremely limited-angle projections, for potential 4D target verification of lung radiotherapy. Methods: A technique for lung 4D-CBCT reconstruction has been previously developed using a deformation field map (DFM)-based strategy. In the previous method, each phase of the 4D-CBCT was generated by deforming a prior CT volume. The DFM was solved by a motion model extracted by global PCA and a free-form deformation (GMM-FD) technique, using a data fidelity constraint and deformation energy minimization. In this study, a new structural-PCA method was developed to build a structural motion model (SMM) by accounting for potential relative motion pattern changes between different anatomical structures from simulation to treatment. The motion model extracted from the planning 4D-CT was divided into two structures, tumor and body excluding tumor, and the parameters of both structures were optimized together. Weighted free-form deformation (WFD) was employed afterwards to introduce flexibility in adjusting the weightings of different structures in the data fidelity constraint based on clinical interests. An XCAT (computerized patient model) simulation with a 30 mm diameter lesion was performed with various anatomical and respirational changes from planning 4D-CT to onboard volume. The estimation accuracy was evaluated by the Volume-Percent-Difference (VPD) and Center-of-Mass-Shift (COMS) between lesions in the estimated and "ground-truth" on-board 4D-CBCT. Results: Among 6 different XCAT scenarios corresponding to respirational and anatomical changes from planning CT to on-board imaging, using single 30° on-board projections, the VPD/COMS was reduced from 21.72±9.24%/1.80±0.53mm for GMM-FD to 10.64±3.04%/1.20±0.45mm for SMM-WFD. Using 15° orthogonal projections, the VPD/COMS was further reduced to 1.91±0.86%/0.31±0.42mm based on SMM-WFD. Conclusion: Compared to the GMM-FD technique, the SMM-WFD technique can substantially improve 4D-CBCT estimation accuracy using extremely small scan angles to provide ultra-fast 4D verification. This work was supported by the National Institutes of Health under Grant No. R01-CA184173 and a research grant from Varian Medical Systems.
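
    The global-PCA part of the motion modeling can be sketched in a few lines of NumPy: deformation field maps (DFMs) from the planning 4D-CT are decomposed into a mean field plus principal components, and any new DFM is parameterized by a small weight vector. The random "DFMs" below are placeholders for registration outputs; the structural variant described above additionally partitions voxels into tumor and body and optimizes the weightings jointly.

    ```python
    import numpy as np

    # Toy deformation field maps: one flattened DFM per breathing phase.
    # Real DFMs come from deformable registration of the planning 4D-CT.
    rng = np.random.default_rng(11)
    n_phases, n_voxels = 10, 3000
    dfms = rng.standard_normal((n_phases, n_voxels))

    mean_dfm = dfms.mean(axis=0)
    U, s, Vt = np.linalg.svd(dfms - mean_dfm, full_matrices=False)
    pcs = Vt[:3]                  # first 3 principal motion components

    def dfm_from_weights(w):
        """Motion model: a deformation field is the mean DFM plus a weighted
        sum of principal components; solving for w against projection data
        (the data fidelity constraint) is what the (G/S)MM steps do."""
        return mean_dfm + w @ pcs

    print(dfm_from_weights(np.array([1.0, -0.5, 0.2])).shape)
    ```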

  2. MoSeS: Modelling and Simulation for e-Social Science.

    PubMed

    Townend, Paul; Xu, Jie; Birkin, Mark; Turner, Andy; Wu, Belinda

    2009-07-13

    MoSeS (Modelling and Simulation for e-Social Science) is a research node of the National Centre for e-Social Science. MoSeS uses e-Science techniques to execute an events-driven model that simulates discrete demographic processes; this allows us to project the UK population 25 years into the future. This paper describes the architecture, simulation methodology and latest results obtained by MoSeS.

  3. Stereoscopic applications for design visualization

    NASA Astrophysics Data System (ADS)

    Gilson, Kevin J.

    2007-02-01

    Advances in display technology and 3D design visualization applications have made real-time stereoscopic visualization of architectural and engineering projects a reality. Parsons Brinkerhoff (PB) is a transportation consulting firm that has used digital visualization tools from their inception and has helped pioneer the application of those tools to large-scale infrastructure projects. PB is one of the first Architecture/Engineering/Construction (AEC) firms to implement a CAVE, an immersive presentation environment that includes stereoscopic rear-projection capability. The firm also employs a portable stereoscopic front-projection system and shutter-glass systems for smaller groups. PB is using commercial real-time 3D applications in combination with traditional 3D modeling programs to visualize and present large AEC projects to planners, clients and decision makers in stereo. These presentations create more immersive and spatially realistic views of the proposed designs. This paper presents the basic display tools and applications, and the 3D modeling techniques PB uses to produce interactive stereoscopic content. The paper also discusses several architectural and engineering design visualizations we have produced.

  4. Sustainable Hydro Assessment and Groundwater Recharge Projects (SHARP) in Germany - Water Balance Models

    NASA Astrophysics Data System (ADS)

    Niemand, C.; Kuhn, K.; Schwarze, R.

    2010-12-01

    SHARP is a European INTERREG IVc program. It focuses on the exchange of innovative technologies to protect groundwater resources for future generations, taking into account climate change and differing geological and geographical conditions. The regions involved are Austria, the United Kingdom, Poland, Italy, Macedonia, Malta, Greece and Germany. They will exchange practical know-how and also determine know-how demands concerning SHARP's key contents: general groundwater management tools, artificial groundwater recharge technologies, groundwater monitoring systems, strategic use of groundwater resources for drinking water, irrigation and industry, techniques to safeguard water quality and quantity, drinking water safety plans, risk management tools and water balance models. SHARP outputs and results will influence regional policy in the frame of sustainable groundwater management, to protect and improve the quality and quantity of groundwater reservoirs for future generations. The main focus of the Saxon State Office for Environment, Agriculture and Landscape in this project is the enhancement and purposive use of water balance models. Since 1992, scientists have compared different existing water balance models on different scales, including models coupled with groundwater models. For example, in the KLIWEP project (Assessment of Impacts of Climate Change Projections on Water and Matter Balance for the Catchment of the River Parthe in Saxony), the coupled model WaSiM-ETH - PCGEOFIM® has been used to study the impact of climate change on water balance and water supplies. The KliWES project (Assessment of the Impacts of Climate Change Projections on Water and Matter Balance for Catchment Areas in Saxony), still running, comprises studies of the fundamental effects of climate change on catchments in Saxony. The project objective is to assess Saxon catchments according to the vulnerability of their water resources to climate change projections, in order to derive region-specific recommendations for management actions. The model comparisons within reference areas showed significant differences in outcome: the values of water balance components calculated with different models partially fluctuate by a multiple of their value. The SHARP project was prepared in several previous projects that tested suitable water balance models and is now able to assist in the knowledge transfer.

  5. A rapid parallelization of cone-beam projection and back-projection operator based on texture fetching interpolation

    NASA Astrophysics Data System (ADS)

    Xie, Lizhe; Hu, Yining; Chen, Yang; Shi, Luyao

    2015-03-01

    Projection and back-projection are the most computationally consuming parts of Computed Tomography (CT) reconstruction. Parallelization strategies using GPU computing techniques have been introduced. In this paper we present a new parallelization scheme for both projection and back-projection. The proposed method is based on the CUDA technology provided by the NVIDIA Corporation. Instead of building a complex model, we aimed at optimizing the existing algorithm and making it suitable for CUDA implementation so as to gain fast computation speed. Besides making use of the texture fetching operation, which helps gain faster interpolation speed, we fixed the number of samples in the computation of each projection to ensure the synchronization of blocks and threads, thus preventing the latency caused by inconsistent computation complexity. Experimental results demonstrate the computational efficiency and imaging quality of the proposed method.
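
    The fixed-sampling idea can be illustrated in NumPy as a CPU stand-in for the CUDA kernel: every ray is integrated with the same fixed number of interpolated samples, which on the GPU keeps all threads on the same loop trip count, and map_coordinates plays the role of the hardware texture fetch. The geometry and array sizes below are assumptions for illustration.

    ```python
    import numpy as np
    from scipy.ndimage import map_coordinates

    def forward_project(volume_slice, angle, n_det=128, n_samples=256):
        """Ray-driven projection with a FIXED number of samples per ray;
        map_coordinates provides the bilinear interpolation that texture
        fetching supplies in hardware on the GPU."""
        n = volume_slice.shape[0]
        det = np.linspace(-n / 2, n / 2, n_det)        # detector positions
        t = np.linspace(-n / 2, n / 2, n_samples)      # samples along each ray
        c, s = np.cos(angle), np.sin(angle)
        ys = det[:, None] * c - t[None, :] * s + n / 2  # sample row coords
        xs = det[:, None] * s + t[None, :] * c + n / 2  # sample column coords
        vals = map_coordinates(volume_slice, [ys, xs], order=1, cval=0.0)
        return vals.sum(axis=1) * (t[1] - t[0])         # line integrals

    img = np.zeros((128, 128))
    img[40:90, 50:80] = 1.0
    print(forward_project(img, np.pi / 4).shape)        # (128,)
    ```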

  6. Assessment of Medical Risks and Optimization of their Management using Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Fitts, Mary A.; Madurai, Siram; Butler, Doug; Kerstman, Eric; Risin, Diana

    2008-01-01

    The Integrated Medical Model (IMM) Project is a software-based technique that will identify and quantify the medical needs and health risks of exploration crew members during space flight and evaluate the effectiveness of potential mitigation strategies. The IMM Project employs an evidence-based approach that will quantify the probability and consequences of defined in-flight medical risks, mitigation strategies, and tactics to optimize crew member health. Using stochastic techniques, the IMM will ultimately inform decision makers at both programmatic and institutional levels and will enable objective assessment of crew health and optimization of mission success, using data from relevant cohort populations and from the astronaut population. The objectives of the project include: 1) identification and documentation of conditions that may occur during exploration missions (Baseline Medical Conditions List [BMCL]), 2) assessment of the likelihood of conditions in the BMCL occurring during exploration missions (incidence rate), 3) determination of the risk associated with these conditions, quantified in terms of end states (Loss of Crew, Loss of Mission, Evacuation), 4) optimization of in-flight hardware mass, volume, power, bandwidth and cost for a given level of risk or uncertainty, and 5) validation of the methodologies used.

  7. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  8. Refining climate change projections for organisms with low dispersal abilities: a case study of the Caspian whip snake.

    PubMed

    Sahlean, Tiberiu C; Gherghel, Iulian; Papeş, Monica; Strugariu, Alexandru; Zamfirescu, Ştefan R

    2014-01-01

    Climate warming is one of the most important threats to biodiversity. Ectothermic organisms such as amphibians and reptiles are especially vulnerable, as climatic conditions affect them directly. Ecological niche models (ENMs) are increasingly popular in ecological studies, but several drawbacks exist, including their limited ability to account for the dispersal potential of a species. In this study, we use ENMs to explore the impact of global climate change on the Caspian whip snake (Dolichophis caspius), as a model for organisms with low dispersal abilities, and we quantify dispersal to novel areas using GIS techniques. Models generated using Maxent 3.3.3k and GARP for the current distribution were projected onto future climatic scenarios. A cost-distance analysis was run in ArcGIS 10, using geomorphological features, ecological conditions, and human footprint as "costs" to dispersal of the species, to obtain a Maximum Dispersal Range (MDR) estimate. All models developed were statistically significant (P<0.05) and recovered the currently known distribution of D. caspius. Models projected onto future climatic conditions using Maxent predicted a doubling of the suitable climatic area, while GARP predicted a more conservative expansion. Both models agreed on an expansion of suitable area northwards, with minor decreases at the southern distribution limit. The MDR area calculated using the Maxent model represented a third of the total area of the projected model; the MDR based on GARP models recovered only about 20% of the total area of the projected model. Thus, incorporating measures of species' dispersal abilities greatly reduced the estimated area of potential future distributions.
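
    The cost-distance step can be sketched outside ArcGIS with scikit-image's minimum-cost-path utilities; the cost surface, source cells and dispersal budget below are invented stand-ins for the rasterized geomorphology, ecology and human-footprint layers used in the study.

    ```python
    import numpy as np
    from skimage.graph import MCP_Geometric

    # Toy cost surface: 1 = easy terrain, larger values = barriers
    # (relief, rivers, human footprint). A real analysis would
    # rasterize and combine those layers.
    rng = np.random.default_rng(13)
    cost = 1.0 + 4.0 * (rng.random((200, 200)) > 0.8)

    occupied = [(100, 100)]                  # current occurrence cell(s)
    mcp = MCP_Geometric(cost)
    cumulative_cost, _ = mcp.find_costs(occupied)

    max_cost = 60.0                          # assumed dispersal budget
    mdr = cumulative_cost <= max_cost        # Maximum Dispersal Range mask
    print("reachable cells:", int(mdr.sum()))
    ```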

  9. Hydrological impacts of climate change on the Tejo and Guadiana Rivers

    NASA Astrophysics Data System (ADS)

    Kilsby, C. G.; Tellier, S. S.; Fowler, H. J.; Howels, T. R.

    2007-05-01

    A distributed daily rainfall-runoff model is applied to the Tejo and Guadiana river basins in Spain and Portugal to simulate the effects of climate change on runoff production, river flows and water resource availability, with results aggregated to the monthly level. The model is calibrated, validated and then used for a series of climate change impact assessments for the period 2070-2100. Future scenarios are derived from the HadRM3H regional climate model (RCM) using two techniques: first, bias-corrected RCM output, with monthly mean correction factors calculated from observed rainfall records; and second, a circulation-pattern-based stochastic rainfall model. Major reductions in rainfall and streamflow are projected throughout the year; these results differ from those of previous studies, in which winter increases were projected. Despite uncertainties in the representation of heavily managed river systems, the projected impacts are serious and pose major threats to the maintenance of bipartite water treaties between Spain and Portugal and to the supply of water to urban and rural regions of Portugal.

  10. Joint Applications Pilot of the National Climate Predictions and Projections Platform and the North Central Climate Science Center: Delivering climate projections on regional scales to support adaptation planning

    NASA Astrophysics Data System (ADS)

    Ray, A. J.; Ojima, D. S.; Morisette, J. T.

    2012-12-01

    The DOI North Central Climate Science Center (NC CSC) and the NOAA/NCAR National Climate Predictions and Projections (NCPP) Platform have initiated a joint pilot study to collaboratively explore the "best available climate information" to support key land management questions, and how to provide this information. NCPP's mission is to support state-of-the-art approaches to develop and deliver comprehensive regional climate information and facilitate its use in decision making and adaptation planning. This presentation will describe the evolving joint pilot as a tangible, real-world demonstration of linkages between climate science, ecosystem science and resource management. Our joint pilot is developing a deliberate, ongoing interaction to prototype how NCPP will work with CSCs to develop and deliver needed climate information products, including translational information to support climate data understanding and use. This pilot will also build capacity in the NC CSC by working with NCPP to use climate information as input to ecological modeling. We will discuss lessons to date on developing and delivering needed climate information products based on this strategic partnership. Four projects have been funded to incorporate climate information into ecological modeling, which in turn will address key DOI stakeholder priorities in the region: (1) Riparian Corridors: projecting climate change effects on cottonwood and willow seed dispersal phenology, flood timing, and seedling recruitment in western riparian forests; (2) Sage Grouse & Habitats: integrating climate and biological data into land management decision models to assess species and habitat vulnerability; (3) Grasslands & Forests: projecting future effects of land management, natural disturbance, and CO2 on woody encroachment in the Northern Great Plains; and (4) The Value of Climate Information: supporting management decisions in the Plains and Prairie Potholes LCC. The NC CSC's role in these projects is to provide the connections between climate data and running ecological models, and to prototype these for future work. NCPP will develop capacities to provide enhanced climate information at relevant spatial and temporal scales, both for historical climate and projections of future climate, and will work to link expert guidance and understanding of modeling processes and evaluation of modeling with the use of numerical climate data. Translational information is thus a suite of information that aids in the translation of numerical climate information into usable knowledge for applications, e.g. ecological response models and hydrologic risk studies. This information includes technical and scientific aspects such as: 1) results of objective, quantitative evaluation of climate models and downscaling techniques; 2) guidance on appropriate uses and interpretation, i.e., understanding the advantages and limitations of various downscaling techniques for specific user applications; 3) characterization and interpretation of uncertainty; and 4) descriptions meaningful to applications, e.g. narratives. NCPP believes that translational information is best co-developed between climate scientists and applications scientists, as in the NC CSC pilot.

  11. Scientific Communication and the Unified Laboratory Sequence1

    NASA Astrophysics Data System (ADS)

    Silverstein, Todd P.; Hudak, Norman J.; Chapple, Frances H.; Goodney, David E.; Brink, Christina P.; Whitehead, Joyce P.

    1997-02-01

    The "Temperature Dependent Relaxation Kinetics" lab was first implemented in 1987; it uses stopped-flow pH jump techniques to determine rate constants and activation parameters (H, S, G) for a reaction mechanism. Two new experiments (Monoamine Oxidase, and Molecular Modeling) will be implemented in the fall of 1997. The "Monoamine Oxidase" project uses chromatography and spectrophotometry to purify and characterize the enzyme. Subsequent photometric assays explore the enzyme's substrate specificity, activation energy, and denaturation. Finally, in the "Molecular Modeling"project, students characterize enzyme - substrate and drug - receptor interactions. Energy minimization protocols are used to make predictions about protein structure and ligand binding, and to explore pharmacological and biomedical implications. With these additions, the twelve Unified Laboratory projects introduce our chemistry majors to nearly all of the instrumental methods commonly encountered in modern chemistry.

  12. The POD Model: Using Communities of Practice Theory to Conceptualise Student Teachers' Professional Learning Online

    ERIC Educational Resources Information Center

    Clarke, Linda

    2009-01-01

    This paper focuses on the broad outcomes of a research project which aimed to analyse and model student teachers' learning in the online components of an initial teacher education course. It begins with discussion of the methodological approach adopted for the case study, which combined conventional data gathering techniques with those which are…

  13. Dipy, a library for the analysis of diffusion MRI data.

    PubMed

    Garyfallidis, Eleftherios; Brett, Matthew; Amirbekian, Bagrat; Rokem, Ariel; van der Walt, Stefan; Descoteaux, Maxime; Nimmo-Smith, Ian

    2014-01-01

    Diffusion Imaging in Python (Dipy) is a free and open source software project for the analysis of data from diffusion magnetic resonance imaging (dMRI) experiments. dMRI is an application of MRI that can be used to measure structural features of brain white matter. Many methods have been developed to use dMRI data to model the local configuration of white matter nerve fiber bundles and infer the trajectory of bundles connecting different parts of the brain. Dipy gathers implementations of many different methods in dMRI, including: diffusion signal pre-processing; reconstruction of diffusion distributions in individual voxels; fiber tractography and fiber track post-processing, analysis and visualization. Dipy aims to provide transparent implementations for all the different steps of dMRI analysis with a uniform programming interface. We have implemented classical signal reconstruction techniques, such as the diffusion tensor model and deterministic fiber tractography. In addition, cutting edge novel reconstruction techniques are implemented, such as constrained spherical deconvolution and diffusion spectrum imaging (DSI) with deconvolution, as well as methods for probabilistic tracking and original methods for tractography clustering. Many additional utility functions are provided to calculate various statistics, informative visualizations, as well as file-handling routines to assist in the development and use of novel techniques. In contrast to many other scientific software projects, Dipy is not being developed by a single research group. Rather, it is an open project that encourages contributions from any scientist/developer through GitHub and open discussions on the project mailing list. Consequently, Dipy today has an international team of contributors, spanning seven different academic institutions in five countries and three continents, which is still growing.

  14. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    NASA Astrophysics Data System (ADS)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  15. Dipy, a library for the analysis of diffusion MRI data

    PubMed Central

    Garyfallidis, Eleftherios; Brett, Matthew; Amirbekian, Bagrat; Rokem, Ariel; van der Walt, Stefan; Descoteaux, Maxime; Nimmo-Smith, Ian

    2014-01-01

    Diffusion Imaging in Python (Dipy) is a free and open source software project for the analysis of data from diffusion magnetic resonance imaging (dMRI) experiments. dMRI is an application of MRI that can be used to measure structural features of brain white matter. Many methods have been developed to use dMRI data to model the local configuration of white matter nerve fiber bundles and infer the trajectory of bundles connecting different parts of the brain. Dipy gathers implementations of many different methods in dMRI, including: diffusion signal pre-processing; reconstruction of diffusion distributions in individual voxels; fiber tractography and fiber track post-processing, analysis and visualization. Dipy aims to provide transparent implementations for all the different steps of dMRI analysis with a uniform programming interface. We have implemented classical signal reconstruction techniques, such as the diffusion tensor model and deterministic fiber tractography. In addition, cutting edge novel reconstruction techniques are implemented, such as constrained spherical deconvolution and diffusion spectrum imaging (DSI) with deconvolution, as well as methods for probabilistic tracking and original methods for tractography clustering. Many additional utility functions are provided to calculate various statistics, informative visualizations, as well as file-handling routines to assist in the development and use of novel techniques. In contrast to many other scientific software projects, Dipy is not being developed by a single research group. Rather, it is an open project that encourages contributions from any scientist/developer through GitHub and open discussions on the project mailing list. Consequently, Dipy today has an international team of contributors, spanning seven different academic institutions in five countries and three continents, which is still growing. PMID:24600385

  16. A Modeling and Data Analysis of Laser Beam Propagation in the Maritime Domain

    DTIC Science & Technology

    2015-05-18

    approach to computing pdfs is the Kernel Density Method (Reference [9] has an introduction to the method), which we will apply to compute the pdf of our... The project has two parts to it: 1) we present a computational analysis of different probability density function approximation techniques; and 2) we introduce preliminary steps towards developing a
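
    Since the snippet centers on the Kernel Density Method, a brief illustration may help; this uses SciPy's Gaussian KDE on stand-in data and is our sketch, not the report's code.

    ```python
    # Illustrative kernel density estimate of a pdf from sampled data.
    import numpy as np
    from scipy.stats import gaussian_kde

    samples = np.random.lognormal(mean=0.0, sigma=0.5, size=5000)  # stand-in data
    kde = gaussian_kde(samples)            # bandwidth via Scott's rule by default
    x = np.linspace(samples.min(), samples.max(), 200)
    pdf = kde(x)                           # smoothed pdf estimate on a grid
    ```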

  17. Cloudnet Project

    DOE Data Explorer

    Hogan, Robin

    2008-01-15

    Cloudnet is a research project supported by the European Commission. This project aims to use data obtained quasi-continuously for the development and implementation of cloud remote sensing synergy algorithms. The use of active instruments (lidar and radar) results in detailed vertical profiles of important cloud parameters which cannot be derived from current satellite sensing techniques. A network of three already existing cloud remote sensing stations (CRS-stations) will be operated for a two-year period; activities will be co-ordinated, data formats harmonised, and analysis of the data performed to evaluate the representation of clouds in four major European weather forecast models.

  18. Estimation and prediction of origin-destination matrices for I-66.

    DOT National Transportation Integrated Search

    2011-09-01

    This project uses the Box-Jenkins time-series technique to model and forecast the traffic flows and then uses the flow forecasts to predict the origin-destination matrices. First, a detailed analysis was conducted to investigate the best data cor...
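
    For readers unfamiliar with the Box-Jenkins technique, a minimal sketch using statsmodels is shown below; the file name and the (p, d, q) order are assumptions, and in practice the order would be identified from ACF/PACF diagnostics.

    ```python
    # Hedged Box-Jenkins sketch: fit an ARIMA model and forecast flows.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    flows = np.loadtxt("i66_flows.csv")            # hypothetical detector counts
    fitted = ARIMA(flows, order=(2, 1, 1)).fit()   # order chosen for illustration
    forecast = fitted.forecast(steps=12)           # next 12 intervals of traffic flow
    ```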

  19. Modeling Multidisciplinary Science: Incorporating a Common Research Theme into Biology and Chemistry Courses.

    ERIC Educational Resources Information Center

    Reed, Kelynne E.; Stewart, Betty H.; Redshaw, Peggy A.

    2003-01-01

    Describes a project using a multidisciplinary approach for the simultaneous integration of a theme into several disciplines in which participating students apply techniques they learned during the semester and report their findings with a poster presentation. (YDS)

  20. Development of a Decision Model for Selection of Appropriate Timely Delivery Techniques for Highway Projects

    DOT National Transportation Integrated Search

    2009-04-01

    "The primary umbrella method used by the Oregon Department of Transportation (ODOT) to ensure on-time performance in standard construction contracting is liquidated damages. The assessment value is usually a matter of some judgment. In practice...

  1. Combining model based and data based techniques in a robust bridge health monitoring algorithm.

    DOT National Transportation Integrated Search

    2014-09-01

    Structural Health Monitoring (SHM) aims to analyze civil, mechanical and aerospace systems in order to assess incipient damage occurrence. In this project, we are concerned with the development of an algorithm within the SHM paradigm for applicat...

  2. Accurate Monitoring and Fault Detection in Wind Measuring Devices through Wireless Sensor Networks

    PubMed Central

    Khan, Komal Saifullah; Tariq, Muhammad

    2014-01-01

    Many wind energy projects report poor performance as low as 60% of the predicted performance. The reason for this is poor resource assessment and the use of new untested technologies and systems in remote locations. Predictions about the potential of an area for wind energy projects (through simulated models) may vary from the actual potential of the area. Hence, introducing accurate site assessment techniques will lead to accurate predictions of energy production from a particular area. We solve this problem by installing a Wireless Sensor Network (WSN) to periodically analyze the data from anemometers installed in that area. After comparative analysis of the acquired data, the anemometers transmit their readings through a WSN to the sink node for analysis. The sink node uses an iterative algorithm which sequentially detects any faulty anemometer and passes the details of the fault to the central system or main station. We apply the proposed technique in simulation as well as in practical implementation and study its accuracy by comparing the simulation results with the experimental results to analyze the variation between the simulated and implemented models. Simulation results show that the algorithm indicates faulty anemometers with high accuracy and a low false alarm rate when as many as 25% of the anemometers become faulty. Experimental analysis shows that anemometers incorporating this solution are better assessed and that the performance level of implemented projects increases to above 86% of the simulated models. PMID:25421739
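
    The paper's iterative detection algorithm is not reproduced here, but the comparative idea can be sketched with a robust outlier test: flag any anemometer whose reading deviates strongly from the median of its peers. The threshold and the helper name are our assumptions.

    ```python
    # Our reconstruction of the comparative check, not the paper's algorithm.
    import numpy as np

    def faulty_sensors(readings, threshold=3.5):
        """Return indices of suspect anemometers via a robust z-score."""
        readings = np.asarray(readings, dtype=float)
        med = np.median(readings)
        mad = np.median(np.abs(readings - med)) or 1e-9  # guard against zero MAD
        robust_z = 0.6745 * (readings - med) / mad
        return np.nonzero(np.abs(robust_z) > threshold)[0]

    print(faulty_sensors([7.1, 7.3, 6.9, 7.2, 12.8]))    # -> [4]
    ```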

  3. Visualizing projected Climate Changes - the CMIP5 Multi-Model Ensemble

    NASA Astrophysics Data System (ADS)

    Böttinger, Michael; Eyring, Veronika; Lauer, Axel; Meier-Fleischer, Karin

    2017-04-01

    Large ensembles add an additional dimension to climate model simulations. Internal variability of the climate system can be assessed for example by multiple climate model simulations with small variations in the initial conditions or by analyzing the spread in large ensembles made by multiple climate models under common protocols. This spread is often used as a measure of uncertainty in climate projections. In the context of the fifth phase of the WCRP's Coupled Model Intercomparison Project (CMIP5), more than 40 different coupled climate models were employed to carry out a coordinated set of experiments. Time series of the development of integral quantities such as the global mean temperature change for all models visualize the spread in the multi-model ensemble. A similar approach can be applied to 2D-visualizations of projected climate changes such as latitude-longitude maps showing the multi-model mean of the ensemble by adding a graphical representation of the uncertainty information. This has been demonstrated for example with static figures in chapter 12 of the last IPCC report (AR5) using different so-called stippling and hatching techniques. In this work, we focus on animated visualizations of multi-model ensemble climate projections carried out within CMIP5 as a way of communicating climate change results to the scientific community as well as to the public. We take a closer look at measures of robustness or uncertainty used in recent publications suitable for animated visualizations. Specifically, we use the ESMValTool [1] to process and prepare the CMIP5 multi-model data in combination with standard visualization tools such as NCL and the commercial 3D visualization software Avizo to create the animations. We compare different visualization techniques such as height fields or shading with transparency for creating animated visualization of ensemble mean changes in temperature and precipitation including corresponding robustness measures. [1] Eyring, V., Righi, M., Lauer, A., Evaldsson, M., Wenzel, S., Jones, C., Anav, A., Andrews, O., Cionni, I., Davin, E. L., Deser, C., Ehbrecht, C., Friedlingstein, P., Gleckler, P., Gottschaldt, K.-D., Hagemann, S., Juckes, M., Kindermann, S., Krasting, J., Kunert, D., Levine, R., Loew, A., Mäkelä, J., Martin, G., Mason, E., Phillips, A. S., Read, S., Rio, C., Roehrig, R., Senftleben, D., Sterl, A., van Ulft, L. H., Walton, J., Wang, S., and Williams, K. D.: ESMValTool (v1.0) - a community diagnostic and performance metrics tool for routine evaluation of Earth system models in CMIP, Geosci. Model Dev., 9, 1747-1802, doi:10.5194/gmd-9-1747-2016, 2016.
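
    The stippling idea reduces to two arrays: the ensemble mean change and a per-cell measure of model agreement. A minimal sketch follows; the array shapes, random stand-in data and the 80% agreement threshold are assumptions, not values from the study.

    ```python
    # Multi-model mean change plus a sign-agreement robustness mask.
    import numpy as np

    # changes[model, lat, lon]: projected change per climate model (stand-in data)
    changes = np.random.normal(loc=1.5, scale=1.0, size=(40, 72, 144))

    mean_change = changes.mean(axis=0)                          # ensemble mean
    sign_agree = (np.sign(changes) == np.sign(mean_change)).mean(axis=0)
    robust = sign_agree >= 0.8    # stipple/hatch cells where >=80% of models agree
    ```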

  4. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  6. Process simulation for advanced composites production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allendorf, M.D.; Ferko, S.M.; Griffiths, S.

    1997-04-01

    The objective of this project is to improve the efficiency and lower the cost of chemical vapor deposition (CVD) processes used to manufacture advanced ceramics by providing the physical and chemical understanding necessary to optimize and control these processes. Project deliverables include: numerical process models; databases of thermodynamic and kinetic information related to the deposition process; and process sensors and software algorithms that can be used for process control. Target manufacturing techniques include CVD fiber coating technologies (used to deposit interfacial coatings on continuous fiber ceramic preforms), chemical vapor infiltration, thin-film deposition processes used in the glass industry, and coating techniques used to deposit wear-, abrasion-, and corrosion-resistant coatings for use in the pulp and paper, metals processing, and aluminum industries.

  7. CALM: Complex Adaptive System (CAS)-Based Decision Support for Enabling Organizational Change

    NASA Astrophysics Data System (ADS)

    Adler, Richard M.; Koehn, David J.

    Guiding organizations through transformational changes such as restructuring or adopting new technologies is a daunting task. Such changes generate workforce uncertainty, fear, and resistance, reducing morale, focus and performance. Conventional project management techniques fail to mitigate these disruptive effects, because social and individual changes are non-mechanistic, organic phenomena. CALM (for Change, Adaptation, Learning Model) is an innovative decision support system for enabling change based on CAS principles. CALM provides a low risk method for validating and refining change strategies that combines scenario planning techniques with "what-if" behavioral simulation. In essence, CALM "test drives" change strategies before rolling them out, allowing organizations to practice and learn from virtual rather than actual mistakes. This paper describes the CALM modeling methodology, including our metrics for measuring organizational readiness to respond to change and other major CALM scenario elements: prospective change strategies; alternate futures; and key situational dynamics. We then describe CALM's simulation engine for projecting scenario outcomes and its associated analytics. CALM's simulator unifies diverse behavioral simulation paradigms including: adaptive agents; system dynamics; Monte Carlo; event- and process-based techniques. CALM's embodiment of CAS dynamics helps organizations reduce risk and improve confidence and consistency in critical strategies for enabling transformations.

  8. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald

    2015-11-30

    This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.
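
    For orientation, the core identity behind such goal-oriented, adjoint-based error estimation can be written in generic dual-weighted-residual form; the notation below is ours, not the report's.

    ```latex
    % u solves the weak problem a(u, v) = f(v) for all v; u_h is the discrete
    % solution; the adjoint z solves a(v, z) = J(v) for the goal functional J.
    \[
      J(u) - J(u_h) \;=\; a(u - u_h,\, z) \;=\; f(z) - a(u_h, z) \;=\; r(u_h)(z),
    \]
    % i.e. the error in the quantity of interest equals (for linear problems)
    % the residual of the discrete solution weighted by the adjoint solution z.
    ```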

  9. The Silicon Trypanosome: a test case of iterative model extension in systems biology

    PubMed Central

    Achcar, Fiona; Fadda, Abeer; Haanstra, Jurgen R.; Kerkhoven, Eduard J.; Kim, Dong-Hyun; Leroux, Alejandro E.; Papamarkou, Theodore; Rojas, Federico; Bakker, Barbara M.; Barrett, Michael P.; Clayton, Christine; Girolami, Mark; Luise Krauth-Siegel, R.; Matthews, Keith R.; Breitling, Rainer

    2016-01-01

    The African trypanosome, Trypanosoma brucei, is a unicellular parasite causing African Trypanosomiasis (sleeping sickness in humans and nagana in animals). Due to some of its unique properties, it has emerged as a popular model organism in systems biology. A predictive quantitative model of glycolysis in the bloodstream form of the parasite has been constructed and updated several times. The Silicon Trypanosome (SilicoTryp) is a project that brings together modellers and experimentalists to improve and extend this core model with new pathways and additional levels of regulation. These new extensions and analyses use computational methods that explicitly take different levels of uncertainty into account. During this project, numerous tools and techniques have been developed for this purpose, which can now be used for a wide range of different studies in systems biology. PMID:24797926

  10. Results and Error Estimates from GRACE Forward Modeling over Greenland, Canada, and Alaska

    NASA Astrophysics Data System (ADS)

    Bonin, J. A.; Chambers, D. P.

    2012-12-01

    Forward modeling using a weighted least squares technique allows GRACE information to be projected onto a pre-determined collection of local basins. This decreases the impact of spatial leakage, allowing estimates of mass change to be better localized. The technique is especially valuable where models of current-day mass change are poor, such as over Greenland and Antarctica. However, the accuracy of the forward model technique has not been determined, nor is it known how the distribution of the local basins affects the results. We use a "truth" model composed of hydrology and ice-melt slopes as an example case, to estimate the uncertainties of this forward modeling method and expose those design parameters which may result in an incorrect high-resolution mass distribution. We then apply these optimal parameters in a forward model estimate created from RL05 GRACE data. We compare the resulting mass slopes with the expected systematic errors from the simulation, as well as GIA and basic trend-fitting uncertainties. We also consider whether specific regions (such as Ellesmere Island and Baffin Island) can be estimated reliably using our optimal basin layout.
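
    The forward-modeling step described above amounts to a weighted least squares projection. A conceptual sketch (our illustration, with hypothetical names) is:

    ```python
    # Solve min_x (Gx - d)^T W (Gx - d): project observations d onto basin
    # patterns G with observation weights W (all arrays are hypothetical).
    import numpy as np

    def weighted_lsq(G, d, W):
        """Return basin mass-change estimates x."""
        A = G.T @ W @ G              # normal-equation matrix
        b = G.T @ W @ d
        return np.linalg.solve(A, b)
    ```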

  11. The accuracy of general practitioner workforce projections

    PubMed Central

    2013-01-01

    Background Health workforce projections are important instruments to prevent imbalances in the health workforce. For both the tenability and further development of these projections, it is important to evaluate the accuracy of workforce projections. In the Netherlands, health workforce projections have been done since 2000 to support health workforce planning. What is the accuracy of the techniques of these Dutch general practitioner workforce projections? Methods We backtested the workforce projection model by comparing the ex-post projected number of general practitioners with the observed number of general practitioners between 1998 and 2011. Averages of historical data were used for all elements except for inflow in training. As the required training inflow is the key result of the workforce planning model, and has actually determined past adjustments of training inflow, the accuracy of the model was backtested using the observed training inflow and not an average of historical data, to avoid the interference of past policy decisions. The accuracy of projections with different lengths of projection horizon and base period (on which the projections are based) was tested. Results The workforce projection model underestimated the number of active Dutch general practitioners in most years. The mean absolute percentage errors range from 1.9% to 14.9%, with the projections being more accurate in more recent years. Furthermore, projections with a shorter projection horizon have a higher accuracy than those with a longer horizon. Unexpectedly, projections with a shorter base period have a higher accuracy than those with a longer base period. Conclusions According to the results of the present study, forecasting the size of the future workforce did not become more difficult between 1998 and 2011, contrary to our original expectation. Furthermore, the projections with a short projection horizon and a short base period are more accurate than projections with a longer projection horizon and base period. We can cautiously conclude that health workforce projections can be made with data based on relatively short base periods, although detailed data are still required to monitor and evaluate the health workforce. PMID:23866676
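
    The headline accuracy metric is the mean absolute percentage error, which is simple to state explicitly; the numbers below are invented for illustration.

    ```python
    # Mean absolute percentage error between observed and projected counts.
    import numpy as np

    def mape(observed, projected):
        observed, projected = np.asarray(observed), np.asarray(projected)
        return 100.0 * np.mean(np.abs((projected - observed) / observed))

    print(mape([8000, 8200, 8400], [7900, 8100, 8200]))  # ~1.6%
    ```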

  12. Advanced Ground Systems Maintenance Physics Models For Diagnostics Project

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M.

    2015-01-01

    The project will use high-fidelity physics models and simulations to simulate real-time operations of cryogenic and other fluid systems and calculate the status/health of the systems. The project enables the delivery of system health advisories to ground system operators. The capability will also be used to conduct planning and analysis of cryogenic system operations. This project will develop and implement high-fidelity physics-based modeling techniques to simulate the real-time operation of cryogenics and other fluids systems and, when compared to the real-time operation of the actual systems, provide assessment of their state. Physics-model-calculated measurements (called "pseudo-sensors") will be compared to the system real-time data. Comparison results will be utilized to provide systems operators with enhanced monitoring of systems' health and status, identify off-nominal trends, and diagnose system/component failures. This capability can also be used to conduct planning and analysis of cryogenics and other fluid systems designs. This capability will be interfaced with the ground operations command and control system as a part of the Advanced Ground Systems Maintenance (AGSM) project to help assure system availability and mission success. The initial capability will be developed for the Liquid Oxygen (LO2) ground loading systems.
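
    The pseudo-sensor comparison reduces to a residual check against a tolerance. A minimal sketch follows; the function name, channel values and tolerance are assumptions, not project code.

    ```python
    # Compare a measured channel against its physics-model "pseudo-sensor".
    def check_channel(measured, pseudo_sensor_value, tolerance):
        """Return a simple advisory string for one telemetry channel."""
        residual = measured - pseudo_sensor_value
        if abs(residual) > tolerance:
            return f"off-nominal (residual {residual:+.2f})"
        return "nominal"

    print(check_channel(measured=92.4, pseudo_sensor_value=90.1, tolerance=1.5))
    ```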

  13. Soil moisture needs in earth sciences

    NASA Technical Reports Server (NTRS)

    Engman, Edwin T.

    1992-01-01

    The author reviews the development of passive and active microwave techniques for measuring soil moisture with respect to how the data may be used. New science programs such as EOS, the GEWEX Continental-Scale International Project (GCIP) and STORM, a mesoscale meteorology and hydrology project, will have to account for soil moisture either as a storage in water balance computations or as a state variable in process modeling. The author discusses future soil moisture needs such as frequency of measurement, accuracy, depth, and spatial resolution, as well as the model development that must proceed concurrently if the development in microwave technology is to have a major impact in these areas.

  14. Self-Organizing Hidden Markov Model Map (SOHMMM): Biological Sequence Clustering and Cluster Visualization.

    PubMed

    Ferles, Christos; Beaufort, William-Scott; Ferle, Vanessa

    2017-01-01

    The present study devises mapping methodologies and projection techniques that visualize and demonstrate biological sequence data clustering results. The Sequence Data Density Display (SDDD) and Sequence Likelihood Projection (SLP) visualizations represent the input symbolical sequences in a lower-dimensional space in such a way that the clusters and relations of data elements are depicted graphically. Both operate in combination/synergy with the Self-Organizing Hidden Markov Model Map (SOHMMM). The resulting unified framework is in a position to analyze raw sequence data automatically and directly. This analysis can be carried out with little or no prior information/domain knowledge.

  15. Generation of future potential scenarios in an Alpine Catchment by applying bias-correction techniques, delta-change approaches and stochastic Weather Generators at different spatial scale. Analysis of their influence on basic and drought statistics.

    NASA Astrophysics Data System (ADS)

    Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio

    2017-04-01

    Assessing the impacts of potential future climate change scenarios on precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the CORDEX EU project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climatic models in the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different Regional Climate Models (RCMs) nested to four different Global Climate Models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) by the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first moment correction; first and second moment correction; regression functions; quantile mapping using distribution derived transformation; and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series are proposed to obtain more representative potential future climate scenarios to be employed to study potential impacts. In this work we propose an unequally weighted combination of the future series, giving more weight to those models (delta-change approach) or combinations of model and transformation technique (bias-correction approach) that better approximate the basic and drought statistics of the historical data. A multi-objective analysis using basic statistics (mean, standard deviation and asymmetry coefficient) and drought statistics (duration, magnitude and intensity) has been performed to identify which models are better in terms of goodness of fit to the historical series. The drought statistics have been obtained from the Standard Precipitation Index (SPI) series using the Theory of Runs. This analysis allows us to discriminate the best RCM and the best combination of model and correction technique in the bias-correction method. We have also analyzed the possibilities of using different stochastic weather generators to approximate the basic and drought statistics of the historical series. These analyses have been performed in our case study in a lumped and in a distributed way in order to assess their sensitivity to the spatial scale. The statistics of the future temperature series obtained with different ensemble options are quite homogeneous, but the precipitation shows a higher sensitivity to the adopted method and spatial scale. The global increments in the mean temperature values are 31.79 %, 31.79 %, 31.03 % and 31.74 % for the distributed bias-correction, distributed delta-change, lumped bias-correction and lumped delta-change ensembles respectively, and in the precipitation they are -25.48 %, -28.49 %, -26.42 % and -27.35 % respectively.
Acknowledgments: This research work has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds. We would also like to thank Spain02 and CORDEX projects for the data provided for this study and the R package qmap.
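
    Of the transformation techniques listed above, empirical quantile mapping is the easiest to sketch; the snippet below is our illustration (the study itself used the R package qmap) and assumes continuous data so the model quantiles are strictly increasing.

    ```python
    # Empirical quantile mapping: map future model values through the
    # observed climatology's quantiles.
    import numpy as np

    def quantile_map(model_hist, obs_hist, model_future):
        quantiles = np.linspace(0.01, 0.99, 99)
        mq = np.quantile(model_hist, quantiles)   # model climatology quantiles
        oq = np.quantile(obs_hist, quantiles)     # observed quantiles
        return np.interp(model_future, mq, oq)    # bias-corrected series
    ```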

  16. Integration of a zebrafish research project into a molecular biology course to support critical thinking and course content goals.

    PubMed

    Felzien, Lisa K

    2016-11-12

    Engaging undergraduates in research is essential for teaching them to think like scientists, and it has become a desired component of classroom and laboratory instruction. Research projects that span an entire semester expose students to a variety of concepts and techniques and allow students to use experiments to learn scientific principles, understand why specific techniques are applicable, critically analyze varied data, and examine how experimentation leads to acquiring knowledge. To provide an experience with these features, a semester-long research project was integrated into a combined lecture and laboratory course, Molecular Biology. The project utilized the zebrafish model to examine gene expression during embryonic development and required students to develop and test hypotheses about the timing of expression of previously uncharacterized genes. The main goals for the project were to provide opportunities for students to develop critical thinking skills required for conducting research and to support the content goals of the course. To determine whether these goals were met, student performance on the steps of the project and related pre-test and post-test questions was examined. © 2016 The International Union of Biochemistry and Molecular Biology, 44(6):565-573, 2016.

  17. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    USGS Publications Warehouse

    ,

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.

  18. Supporting Multi-view User Ontology to Understand Company Value Chains

    NASA Astrophysics Data System (ADS)

    Zuo, Landong; Salvadores, Manuel; Imtiaz, Sm Hazzaz; Darlington, John; Gibbins, Nicholas; Shadbolt, Nigel R.; Dobree, James

    The objective of the Market Blended Insight (MBI) project is to develop web based techniques to improve the performance of UK Business to Business (B2B) marketing activities. The analysis of company value chains is a fundamental task within MBI because it is an important model for understanding the market place and the company interactions within it. The project has aggregated rich data profiles of 3.7 million companies that form the active UK business community. The profiles are augmented by Web extractions from heterogeneous sources to provide unparalleled business insight. Advances by the Semantic Web in knowledge representation and logic reasoning allow flexible integration of data from heterogeneous sources, transformation between different representations and reasoning about their meaning. The MBI project has identified that the market insight and analysis interests of different types of users are difficult to maintain using a single domain ontology. Therefore, the project has developed a technique to undertake a plurality of analyses of value chains by deploying a distributed multi-view ontology to capture different user views over the classification of companies and their various relationships.

  19. Zooniverse - A Platform for Data-Driven Citizen Science

    NASA Astrophysics Data System (ADS)

    Smith, A.; Lintott, C.; Bamford, S.; Fortson, L.

    2011-12-01

    In July 2007 a team of astrophysicists created a web-based astronomy project called Galaxy Zoo in which members of the public were asked to classify galaxies from the Sloan Digital Sky Survey by their shape. Over the following year a community of more than 150,000 people classified each of the 1 million galaxies more than 50 times. Four years later this community of 'citizen scientists' is more than 450,000 strong and is contributing their time and efforts to more than 10 Zooniverse projects, each with its own science team and research case. With projects ranging from transcribing ancient Greek texts (ancientlives.org) to lunar science (moonzoo.org), the challenges to the Zooniverse community have gone well beyond the relatively simple original Galaxy Zoo interface. Delivering a range of citizen science projects to a large web-based audience presents challenges on a number of fronts, including interface design, data architecture/modelling and reduction techniques, web infrastructure and software design. In this paper we will describe how the Zooniverse team (a collaboration of scientists, software developers and educators) have developed tools and techniques to solve some of these issues.

  20. Application of Rapid Prototyping to the Investment Casting of Test Hardware (MSFC Center Director's Discretionary Fund Final Report, Project No. 98-08)

    NASA Technical Reports Server (NTRS)

    Cooper, K. G.; Wells, D.

    2000-01-01

    Investment casting masters of a selected propulsion hardware component, a fuel pump housing, were rapid prototyped using several in-house processes, along with the new Z-Corp process acquired through this project. Tensile samples were also prototyped and cast using the same significant parameters. The models were then shelled in-house using a commercial-grade zircon-based slurry and stucco technique. Next, the shelled models were fired and cast by our in-house foundry contractor (IITRI) with NASA-23, a commonly used test hardware metal. The cast models are compared by their surface finish and overall appearance (i.e., the occurrence of pitting, warping, etc.), as well as dimensional accuracy.

  1. Documentation and virtual reconstruction of historical objects in Peru damaged by an earthquake and climatic events

    NASA Astrophysics Data System (ADS)

    Hanzalová, K.; Pavelka, K.

    2013-07-01

    This paper deals with the possibilities of creating a 3-D model and a visualization technique for the presentation of historical buildings and sites in Peru. The Nasca/CTU project is documenting historical objects by using several techniques. This paper describes the documentation and visualization of two historical churches (the San Jose and San Xavier Churches) and the pre-Hispanic archaeological site La Ciudad Perdida de Huayuri (an abandoned town near Huayuri) in the Nasca region by using photogrammetry and remote sensing. Both churches were damaged by an earthquake. We used different processes for the documentation of these objects. First, PhotoModeler software was used for the photogrammetric processing of the acquired images. The subsequent modelling of the two churches also differed: Google SketchUp software was used for the San Jose Church, while the 3-D model of the San Xavier Church was created in MicroStation software. For the modelling of the abandoned town near Huayuri, which was destroyed by a climatic event (El Niño), terrestrial photogrammetry, satellite data and GNSS measurements were applied. The general output of the project is a thematic map of this archaeological site; the C14 method was used for dating.

  2. Physics and Process Modeling (PPM) and Other Propulsion R and T. Volume 1; Materials Processing, Characterization, and Modeling; Lifting Models

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This CP contains the extended abstracts and presentation figures of 36 papers presented at the PPM and Other Propulsion R&T Conference. The focus of the research described in these presentations is on materials and structures technologies that are parts of the various projects within the NASA Aeronautics Propulsion Systems Research and Technology Base Program. These projects include Physics and Process Modeling; Smart, Green Engine; Fast, Quiet Engine; High Temperature Engine Materials Program; and Hybrid Hyperspeed Propulsion. Also presented were research results from the Rotorcraft Systems Program and work supported by the NASA Lewis Director's Discretionary Fund. Authors from NASA Lewis Research Center, industry, and universities conducted research in the following areas: material processing, material characterization, modeling, life, applied life models, design techniques, vibration control, mechanical components, and tribology. Key issues, research accomplishments, and future directions are summarized in this publication.

  3. Hydrological modeling as an evaluation tool of EURO-CORDEX climate projections and bias correction methods

    NASA Astrophysics Data System (ADS)

    Hakala, Kirsti; Addor, Nans; Seibert, Jan

    2017-04-01

    Streamflow stemming from Switzerland's mountainous landscape will be influenced by climate change, which will pose significant challenges to the water management and policy sector. In climate change impact research, the determination of future streamflow is impeded by different sources of uncertainty, which propagate through the model chain. In this research, we explicitly considered the following sources of uncertainty: (1) climate models, (2) downscaling of the climate projections to the catchment scale, (3) bias correction method and (4) parameterization of the hydrological model. We utilize climate projections at the 0.11 degree (~12.5 km) resolution from the EURO-CORDEX project, which are the most recent climate projections for the European domain. EURO-CORDEX comprises regional climate model (RCM) simulations, which have been downscaled from global climate models (GCMs) from the CMIP5 archive, using both dynamical and statistical techniques. Uncertainties are explored by applying a modeling chain involving 14 GCM-RCMs to ten Swiss catchments. We utilize the rainfall-runoff model HBV Light, which has been widely used in operational hydrological forecasting. The Lindström measure, a combination of model efficiency and volume error, was used as an objective function to calibrate HBV Light. The ten best parameter sets are then obtained by calibrating using the genetic algorithm and Powell optimization (GAP) method. The GAP optimization method is based on the evolution of parameter sets, which works by selecting and recombining high-performing parameter sets with each other. Once HBV is calibrated, we then perform a quantitative comparison of the influence of biases inherited from climate model simulations with the biases stemming from the hydrological model. The evaluation is conducted over two time periods: i) 1980-2009 to characterize the simulation realism under the current climate, and ii) 2070-2099 to identify the magnitude of the projected change of streamflow under the climate scenarios RCP4.5 and RCP8.5. We utilize two techniques for correcting biases in the climate model output: quantile mapping and a new method, frequency bias correction (FBC). The FBC method matches the frequencies between observed and GCM-RCM data. In this way, it can be used to correct for all time scales, which is a known limitation of quantile mapping. A novel approach for the evaluation of the climate simulations and bias correction methods was then applied. Streamflow can be thought of as the "great integrator" of uncertainties. The ability, or the lack thereof, to correctly simulate streamflow is a way to assess the realism of the bias-corrected climate simulations. Long-term monthly means as well as high- and low-flow metrics are used to evaluate the realism of the simulations under the current climate and to gauge the impacts of climate change on streamflow. Preliminary results show that under the present climate, calibration of the hydrological model contributes a much smaller band of uncertainty to the modeling chain than the bias correction of the GCM-RCMs. Therefore, for future time periods, we expect the bias correction of climate model data to have a greater influence on projected changes in streamflow than the calibration of the hydrological model.
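
    As a reference point, the Lindström measure combines the Nash-Sutcliffe efficiency with a volume-error penalty; the sketch below uses the weight w = 0.1 common in HBV practice, which is our assumption rather than a value stated in the abstract.

    ```python
    # Lindström measure: NSE minus a weighted relative volume error.
    import numpy as np

    def lindstrom_measure(obs, sim, w=0.1):
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
        volume_error = abs(np.sum(sim - obs)) / np.sum(obs)
        return nse - w * volume_error
    ```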

  4. Drivers and uncertainties of forecasted range shifts for warm-water fishes under climate and land cover change

    USGS Publications Warehouse

    Bouska, Kristen; Whitledge, Gregory W.; Lant, Christopher; Schoof, Justin

    2018-01-01

    Land cover is an important determinant of aquatic habitat and is projected to shift with climate changes, yet climate-driven land cover changes are rarely factored into climate assessments. To quantify impacts and uncertainty of coupled climate and land cover change on warm-water fish species’ distributions, we used an ensemble model approach to project distributions of 14 species. For each species, current range projections were compared to 27 scenario-based projections and aggregated to visualize uncertainty. Multiple regression and model selection techniques were used to identify drivers of range change. Novel, or no-analogue, climates were assessed to evaluate transferability of models. Changes in total probability of occurrence ranged widely across species, from a 63% increase to a 65% decrease. Distributional gains and losses were largely driven by temperature and flow variables and underscore the importance of habitat heterogeneity and connectivity to facilitate adaptation to changing conditions. Finally, novel climate conditions were driven by mean annual maximum temperature, which stresses the importance of understanding the role of temperature on fish physiology and the role of temperature-mitigating management practices.

  5. 2D Affine and Projective Shape Analysis.

    PubMed

    Bryner, Darshan; Klassen, Eric; Huiling Le; Srivastava, Anuj

    2014-05-01

    Current techniques for shape analysis tend to seek invariance to similarity transformations (rotation, translation, and scale), but certain imaging situations require invariance to larger groups, such as affine or projective groups. Here we present a general Riemannian framework for shape analysis of planar objects where metrics and related quantities are invariant to affine and projective groups. Highlighting two possibilities for representing object boundaries, ordered points (or landmarks) and parameterized curves, we study different combinations of these representations (points and curves) and transformations (affine and projective). Specifically, we provide solutions to three out of four situations and develop algorithms for computing geodesics and intrinsic sample statistics, leading up to Gaussian-type statistical models, and for classifying test shapes using such models learned from training data. In the case of parameterized curves, we also achieve the desired goal of invariance to re-parameterizations. The geodesics are constructed by particularizing the path-straightening algorithm to the geometries of the current manifolds and are used, in turn, to compute shape statistics and Gaussian-type shape models. We demonstrate these ideas using a number of examples from shape and activity recognition.

  6. Regional hydrologic response to climate change in the conterminous United States using high-resolution hydroclimate simulations

    DOE PAGES

    Kao, Shih -Chieh; Ashfaq, Moetasim; Mei, Rui; ...

    2016-06-16

    Despite the fact that Global Climate Model (GCM) outputs have been used to project hydrologic impacts of climate change using off-line hydrologic models for two decades, many of these efforts have been disjointed applications, with calibrations focused on individual river basins and a few of the available GCMs. This study improves upon earlier attempts by systematically projecting hydrologic impacts for the entire conterminous United States (US), using outputs from ten GCMs from the latest Coupled Model Intercomparison Project phase 5 (CMIP5) archive, with seamless hydrologic model calibration and validation techniques to produce a spatially and temporally consistent set of current hydrologic projections. The Variable Infiltration Capacity (VIC) model was forced with ten-member ensemble projections of precipitation and air temperature that were dynamically downscaled using a regional climate model (RegCM4) and bias-corrected to 1/24 degree (~4 km) grid resolution for the baseline (1966-2005) and future (2011-2050) periods under the Representative Concentration Pathway 8.5. Based on regional analysis, the VIC model projections indicate an increase in winter and spring total runoff due to increases in winter precipitation of up to 20% in most regions of the US. However, decreases in snow water equivalent (SWE) and snow-covered days will lead to significant decreases in summer runoff, with more pronounced shifts in the time of occurrence of annual peak runoff projected over the eastern and western US. In contrast, the central US will experience year-round increases in total runoff, mostly associated with increases in both extreme high and low runoff. Furthermore, the projected hydrological changes described in this study have implications for various aspects of future water resource management, including water supply, flood and drought preparation, and reservoir operation.

  7. The End-to-end Demonstrator for improved decision making in the water sector in Europe (EDgE)

    NASA Astrophysics Data System (ADS)

    Wood, Eric; Wanders, Niko; Pan, Ming; Sheffield, Justin; Samaniego, Luis; Thober, Stephan; Kumar, Rohinni; Prudhomme, Christel; Houghton-Carr, Helen

    2017-04-01

    High-resolution simulations of water resources from hydrological models are vital to supporting important climate services. Apart from a high level of detail, both spatially and temporally, it is important to provide simulations that consistently cover a range of timescales, from historical reanalysis to seasonal forecast and future projections. In the new EDgE project commissioned by the ECMWF (C3S) we try to fulfill these requirements. EDgE is a proof-of-concept project which combines climate data and state-of-the-art hydrological modelling to demonstrate a water-oriented information system implemented through a web application. EDgE is working with key European stakeholders representative of private and public sectors to jointly develop and tailor approaches and techniques. With these tools, stakeholders are assisted in using improved climate information in decision-making, and supported in the development of climate change adaptation and mitigation policies. Here, we present the first results of the EDgE modelling chain, which is divided into three main processes: 1) pre-processing and downscaling; 2) hydrological modelling; 3) post-processing. Consistent downscaling and bias corrections for historical simulations, seasonal forecasts and climate projections ensure that the results across scales are robust. The daily temporal resolution and 5km spatial resolution ensure locally relevant simulations. With the use of four hydrological models (PCR-GLOBWB, VIC, mHM, Noah-MP), uncertainty between models is properly addressed, while consistency is guaranteed by using identical input data for static land surface parameterizations. The forecast results are communicated to stakeholders via Sectoral Climate Impact Indicators (SCIIs) that have been created in collaboration with the end-user community of the EDgE project. The final product of this project is composed of 15 years of seasonal forecast and 10 climate change projections, all combined with four hydrological models. These unique high-resolution climate information simulations in the EDgE project provide an unprecedented information system for decision-making over Europe.

  8. SIMRAND I- SIMULATION OF RESEARCH AND DEVELOPMENT PROJECTS

    NASA Technical Reports Server (NTRS)

    Miles, R. F.

    1994-01-01

    The Simulation of Research and Development Projects program (SIMRAND) aids in the optimal allocation of R&D resources needed to achieve project goals. SIMRAND models the system subsets or project tasks as various network paths to a final goal. Each path is described in terms of task variables such as cost per hour, cost per unit, availability of resources, etc. Uncertainty is incorporated by treating task variables as probabilistic random variables. SIMRAND calculates the measure of preference for each alternative network. The networks yielding the highest utility function (or certainty equivalence) are then ranked as the optimal network paths. SIMRAND has been used in several economic potential studies at NASA's Jet Propulsion Laboratory involving solar dish power systems and photovoltaic array construction. However, any project having tasks which can be reduced to equations and related by measures of preference can be modeled. SIMRAND analysis consists of three phases: reduction, simulation, and evaluation. In the reduction phase, analytical techniques from probability theory and simulation techniques are used to reduce the complexity of the alternative networks. In the simulation phase, a Monte Carlo simulation is used to derive statistics on the variables of interest for each alternative network path. In the evaluation phase, the simulation statistics are compared and the networks are ranked in preference by a selected decision rule. The user must supply project subsystems in terms of equations based on variables (for example, parallel and series assembly line tasks in terms of number of items, cost factors, time limits, etc). The associated cumulative distribution functions and utility functions for each variable must also be provided (allowable upper and lower limits, group decision factors, etc). SIMRAND is written in Microsoft FORTRAN 77 for batch execution and has been implemented on an IBM PC series computer operating under DOS.
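
    The simulation and evaluation phases can be compressed into a few lines of illustrative code; the sketch below is a Python re-statement of the idea (the original program is FORTRAN 77), with invented cost models and a risk-neutral utility.

    ```python
    # Monte Carlo ranking of alternative networks by expected utility.
    import random

    def expected_utility(cost_sampler, utility, trials=10_000):
        return sum(utility(cost_sampler()) for _ in range(trials)) / trials

    networks = {
        "path_A": lambda: random.gauss(100, 15),  # invented cost model
        "path_B": lambda: random.gauss(90, 30),
    }
    utility = lambda cost: -cost                   # risk-neutral example
    ranked = sorted(networks,
                    key=lambda n: expected_utility(networks[n], utility),
                    reverse=True)
    print(ranked)                                  # best alternative first
    ```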

  9. Geologic CO2 Sequestration: Predicting and Confirming Performance in Oil Reservoirs and Saline Aquifers

    NASA Astrophysics Data System (ADS)

    Johnson, J. W.; Nitao, J. J.; Newmark, R. L.; Kirkendall, B. A.; Nimz, G. J.; Knauss, K. G.; Ziagos, J. P.

    2002-05-01

    Reducing anthropogenic CO2 emissions ranks high among the grand scientific challenges of this century. In the near-term, significant reductions can only be achieved through innovative sequestration strategies that prevent atmospheric release of large-scale CO2 waste streams. Among such strategies, injection into confined geologic formations represents arguably the most promising alternative; and among potential geologic storage sites, oil reservoirs and saline aquifers represent the most attractive targets. Oil reservoirs offer a unique "win-win" approach because CO2 flooding is an effective technique of enhanced oil recovery (EOR), while saline aquifers offer immense storage capacity and widespread distribution. Although CO2-flood EOR has been widely used in the Permian Basin and elsewhere since the 1980s, the oil industry has just recently become concerned with the significant fraction of injected CO2 that eludes recycling and is therefore sequestered. This "lost" CO2 now has potential economic value in the growing emissions credit market; hence, the industry's emerging interest in recasting CO2 floods as co-optimized EOR/sequestration projects. The world's first saline aquifer storage project was also catalyzed in part by economics: Norway's newly imposed atmospheric emissions tax, which spurred development of Statoil's unique North Sea Sleipner facility in 1996. Successful implementation of geologic sequestration projects hinges on development of advanced predictive models and a diverse set of remote sensing, in situ sampling, and experimental techniques. The models are needed to design and forecast long-term sequestration performance; the monitoring techniques are required to confirm and refine model predictions and to ensure compliance with environmental regulations. We have developed a unique reactive transport modeling capability for predicting sequestration performance in saline aquifers, and used it to simulate CO2 injection at Sleipner; we are now extending this capability to address CO2-flood EOR/sequestration in oil reservoirs. We have also developed a suite of innovative geophysical and geochemical techniques for monitoring sequestration performance in both settings. These include electromagnetic induction imaging and electrical resistance tomography for tracking migration of immiscible CO2, noble gas isotopes for assessing trace CO2 leakage through the cap rock, and integrated geochemical sampling, analytical, and experimental methods for determining sequestration partitioning among solubility and mineral trapping mechanisms. We have proposed to demonstrate feasibility of the co-optimized EOR/sequestration concept and utility of our modeling and monitoring technologies to design and evaluate its implementation by conducting a demonstration project in the Livermore Oil Field. This small, mature, shallow field, located less than a mile east of Lawrence Livermore National Laboratory, is representative of many potential EOR/sequestration sites in California. In approach, this proposed demonstration is analogous to the Weyburn EOR/CO2 monitoring project, to which it will provide an important complement by virtue of its contrasting depth (immiscible versus Weyburn's miscible CO2 flood) and geologic setting (clay-capped sand versus Weyburn's anhydrite-capped carbonate reservoir).

  10. The joint US/UK 1990 epoch world magnetic model

    NASA Technical Reports Server (NTRS)

    Quinn, John M.; Coleman, Rachel J.; Peck, Michael R.; Lauber, Stephen E.

    1991-01-01

    A detailed summary of the data used, analyses performed, modeling techniques employed, and results obtained in the course of the 1990 Epoch World Magnetic Modeling effort are given. Also, use and limitations of the GEOMAG algorithm are presented. Charts and tables related to the 1990 World Magnetic Model (WMM-90) for the Earth's main field and secular variation in Mercator and polar stereographic projections are presented along with useful tables of several magnetic field components and their secular variation on a 5-degree worldwide grid.

  11. A Fourier-based compressed sensing technique for accelerated CT image reconstruction using first-order methods.

    PubMed

    Choi, Kihwan; Li, Ruijiang; Nam, Haewon; Xing, Lei

    2014-06-21

    As a solution to iterative CT image reconstruction, first-order methods are prominent for their large-scale capability and fast convergence rate O(1/k²). In practice, the CT system matrix with a large condition number may lead to slow convergence speed despite the theoretically promising upper bound. The aim of this study is to develop a Fourier-based scaling technique to enhance the convergence speed of first-order methods applied to CT image reconstruction. Instead of working in the projection domain, we transform the projection data and construct a data fidelity model in Fourier space. Inspired by the filtered backprojection formalism, the data are appropriately weighted in Fourier space. We formulate an optimization problem based on weighted least-squares in Fourier space and total-variation (TV) regularization in image space for parallel-beam, fan-beam and cone-beam CT geometry. To achieve the maximum computational speed, the optimization problem is solved using a fast iterative shrinkage-thresholding algorithm with backtracking line search and a GPU implementation of projection/backprojection. The performance of the proposed algorithm is demonstrated through a series of digital simulation and experimental phantom studies. The results are compared with existing TV-regularized techniques based on statistics-based weighted least-squares as well as the basic algebraic reconstruction technique. The proposed Fourier-based compressed sensing (CS) method significantly improves both the image quality and the convergence rate compared to the existing CS techniques.
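
    At the heart of the shrinkage-thresholding iteration is the soft-threshold (proximal) operator; the sketch below shows the generic operation, not the authors' GPU implementation.

    ```python
    # Soft thresholding: proximal operator of tau * ||x||_1, applied
    # once per FISTA iteration.
    import numpy as np

    def soft_threshold(x, tau):
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)
    ```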

  12. Strategies for fitting nonlinear ecological models in R, AD Model Builder, and BUGS

    USGS Publications Warehouse

    Bolker, Benjamin M.; Gardner, Beth; Maunder, Mark; Berg, Casper W.; Brooks, Mollie; Comita, Liza; Crone, Elizabeth; Cubaynes, Sarah; Davies, Trevor; de Valpine, Perry; Ford, Jessica; Gimenez, Olivier; Kéry, Marc; Kim, Eun Jung; Lennert-Cody, Cleridy; Magunsson, Arni; Martell, Steve; Nash, John; Nielson, Anders; Regentz, Jim; Skaug, Hans; Zipkin, Elise

    2013-01-01

    1. Ecologists often use nonlinear fitting techniques to estimate the parameters of complex ecological models, with attendant frustration. This paper compares three open-source model fitting tools and discusses general strategies for defining and fitting models. 2. R is convenient and (relatively) easy to learn, AD Model Builder is fast and robust but comes with a steep learning curve, while BUGS provides the greatest flexibility at the price of speed. 3. Our model-fitting suggestions range from general cultural advice (where possible, use the tools and models that are most common in your subfield) to specific suggestions about how to change the mathematical description of models to make them more amenable to parameter estimation. 4. A companion web site (https://groups.nceas.ucsb.edu/nonlinear-modeling/projects) presents detailed examples of application of the three tools to a variety of typical ecological estimation problems; each example links both to a detailed project report and to full source code and data.
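
    A comparable workflow can be sketched in Python (the paper itself compares R, AD Model Builder and BUGS; SciPy is used here purely as an illustrative analogue), fitting a saturating Michaelis-Menten response to simulated data.

    ```python
    # Nonlinear least squares fit of a saturating ecological response.
    import numpy as np
    from scipy.optimize import curve_fit

    def michaelis_menten(x, vmax, km):
        return vmax * x / (km + x)

    x = np.linspace(0.1, 10, 50)
    y = michaelis_menten(x, 2.0, 1.5) + np.random.normal(0, 0.05, x.size)
    params, cov = curve_fit(michaelis_menten, x, y, p0=(1.0, 1.0))
    ```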

  13. Extraction of Iodine from Source Rock and Oil for Radioiodine Dating Final Report CRADA No. TC-1550-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moran, J. E.; Summa, L.

    This was a collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL) and Exxon Production Research Company (EPR) to develop improved techniques for extracting, concentrating, and measuring iodine from large volumes of source rock and oil. The purpose of this project was to develop a technique for measuring total iodine extracted from rock, obtain isotopic ratios, and develop age models for samples provided by EPR.

  14. High-Performance, Radiation-Hardened Electronics for Space Environments

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.; Watson, Michael D.; Frazier, Donald O.; Adams, James H.; Johnson, Michael A.; Kolawa, Elizabeth A.

    2007-01-01

    The Radiation Hardened Electronics for Space Environments (RHESE) project endeavors to advance the current state-of-the-art in high-performance, radiation-hardened electronics and processors, ensuring successful performance of space systems required to operate within extreme radiation and temperature environments. Because RHESE is a project within the Exploration Technology Development Program (ETDP), RHESE's primary customers will be the human and robotic missions being developed by NASA's Exploration Systems Mission Directorate (ESMD) in partial fulfillment of the Vision for Space Exploration. Benefits are also anticipated for NASA's science missions to planetary and deep-space destinations. As a technology development effort, RHESE provides a broad-scoped, full spectrum of approaches to environmentally harden space electronics, including new materials, advanced design processes, reconfigurable hardware techniques, and software modeling of the radiation environment. The RHESE sub-project tasks are: Self-Reconfigurable Electronics for Extreme Environments, Radiation Effects Predictive Modeling, Radiation Hardened Memory, Single Event Effects (SEE) Immune Reconfigurable Field Programmable Gate Array (FPGA) (SIRF), Radiation Hardening by Software, Radiation Hardened High Performance Processors (HPP), Reconfigurable Computing, Low Temperature Tolerant MEMS by Design, and Silicon-Germanium (SiGe) Integrated Electronics for Extreme Environments. These nine sub-project tasks are managed by technical leads located across five NASA field centers: Ames Research Center, Goddard Space Flight Center, the Jet Propulsion Laboratory, Langley Research Center, and Marshall Space Flight Center. The overall RHESE integrated project management responsibility resides with NASA's Marshall Space Flight Center (MSFC). Initial technology development emphasis within RHESE focuses on the hardening of Field Programmable Gate Arrays (FPGAs) and Field Programmable Analog Arrays (FPAAs) for use in reconfigurable architectures. As these component/chip-level technologies mature, the RHESE project emphasis shifts to efforts encompassing total processor hardening techniques and board-level electronic reconfiguration techniques featuring spare and interface modularity. This phased approach to distributing emphasis between technology developments provides hardened FPGAs/FPAAs for early mission infusion, then migrates to hardened, board-level, high-speed processors with associated memory elements and high-density storage for the longer-duration missions encountered for Lunar Outpost and Mars Exploration later in the Constellation schedule.

  15. DYNARIP: A technique for regional forest inventory projection and policy analysis

    Treesearch

    William A. Bechtold

    1984-01-01

    DYNARIP is a policy-oriented model capable of tracking all of the treatments and disturbances experienced by the forest resources of an entire State or regional area. It can also isolate the impact of any one of the 27 man-caused or natural disturbances (including natural succession and forest land-base changes). The model is driven by empirical rates of change as...

  16. The Development and Implementation of a Model for Teaching Astronomy to Deaf Students

    ERIC Educational Resources Information Center

    Bolat, Mualla

    2016-01-01

    The purpose of this study is to develop and assess a model aimed at teaching astronomy to deaf students. Thus, a 7-day project of indoor and outdoor activities was developed. Since the purpose of our study was to teach astronomy to deaf students, our sample was determined by using a purposeful sampling technique. The sample of this study…

  17. Random forests and stochastic gradient boosting for predicting tree canopy cover: Comparing tuning processes and model performance

    Treesearch

    Elizabeth A. Freeman; Gretchen G. Moisen; John W. Coulston; Barry T. (Ty) Wilson

    2015-01-01

    As part of the development of the 2011 National Land Cover Database (NLCD) tree canopy cover layer, a pilot project was launched to test the use of high-resolution photography coupled with extensive ancillary data to map the distribution of tree canopy cover over four study regions in the conterminous US. Two stochastic modeling techniques, random forests (RF...

  18. Market potential of solar thermal enhanced oil recovery-a techno-economic model for Issaran oil field in Egypt

    NASA Astrophysics Data System (ADS)

    Gupta, Sunay; Guédez, Rafael; Laumert, Björn

    2017-06-01

    Solar thermal enhanced oil recovery (S-EOR) is an advanced technique that uses concentrated solar power (CSP) technology to generate steam and recover oil from maturing oil reservoirs. The generated steam is injected at high pressure and temperature into the reservoir wells to facilitate oil production. There are three common methods of steam injection in enhanced oil recovery: continuous steam injection, cyclic steam stimulation (CSS) and steam-assisted gravity drainage (SAGD). Conventionally, this steam is generated through natural gas (NG) fired boilers with associated greenhouse gas emissions. However, pilot projects in the USA (Coalinga, California) and Oman (Miraah, Amal) have demonstrated that S-EOR can meet steam requirements despite the intermittent nature of solar irradiation. Hence, conventional steam-based EOR projects in the Sunbelt region can benefit from S-EOR with reduced operational expenditure (OPEX) and increased profitability in the long term, even with the initial investment required for solar equipment. S-EOR represents an opportunity for countries without natural gas resources to become less energy dependent and less sensitive to gas price fluctuations, and for countries that own natural gas resources to reduce their gas consumption and export it at a higher margin. In this study, firstly, the market potential of S-EOR was investigated worldwide by covering some of the major ongoing steam-based EOR projects as well as future projects in the pipeline. A multi-criteria analysis was performed to compare local conditions and requirements of all the oil fields based on a defined set of parameters. Secondly, a modelling approach for S-EOR was designed to identify cost reduction opportunities and optimum solar integration techniques, and the Issaran oil field in Egypt was selected as a case study to substantiate the approach. This modelling approach can be consulted to develop S-EOR projects for any steam-flooding-based oil field. The model was developed for steam flooding requirements in the Issaran oil field using DYESOPT, KTH's in-house tool for techno-economic modelling of CSP.

  19. An Algebra-Based Introductory Computational Neuroscience Course with Lab.

    PubMed

    Fink, Christian G

    2017-01-01

    A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.

  20. Assess, Map and Predict the Integrity, Resilience, and ...

    EPA Pesticide Factsheets

    This project will provide knowledge and adaptive management techniques both to maintain healthy waters and to improve degraded systems. It will provide scientific support for the National Aquatic Resource Surveys. Results will provide a basis for informed decision making and tools applicable to EPA program office and regional needs at national, regional, and local scales. The research products, tools, models, and maps produced will be an excellent means to communicate management options with stakeholders.

  1. Implications of climate change for wetland-dependent birds in the Prairie Pothole Region

    USGS Publications Warehouse

    Steen, Valerie; Skagen, Susan K.; Melcher, Cynthia P.

    2016-01-01

    The habitats and food resources required to support breeding and migrant birds dependent on North American prairie wetlands are threatened by impending climate change. The North American Prairie Pothole Region (PPR) hosts nearly 120 species of wetland-dependent birds representing 21 families. Strategic management requires knowledge of avian habitat requirements and assessment of the species most vulnerable to future threats. We applied bioclimatic species distribution models (SDMs) to project range changes of 29 wetland-dependent bird species using ensemble modeling techniques, a large number of General Circulation Models (GCMs), and hydrological climate covariates. For the U.S. PPR, mean projected range change, expressed as a proportion of currently occupied range, was −0.31 (±0.22 SD; range −0.75 to 0.16), and all but two species were projected to lose habitat. Species associated with deeper water were expected to experience smaller negative impacts of climate change. The magnitude of climate change impacts was somewhat lower in this study than in earlier efforts, most likely due to the use of different focal species, varying methodologies, different modeling decisions, or alternative GCMs. Quantification of the projected species-specific impacts of climate change using species distribution modeling offers valuable information for vulnerability assessments within the conservation planning process.
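
    The proportional range-change metric reported above is simple to reproduce; the sketch below, with entirely hypothetical cell counts, shows how such per-species proportions and their ensemble summary statistics might be computed.

    ```python
    # Illustrative calculation (hypothetical numbers): projected range
    # change expressed as a proportion of currently occupied range,
    # summarized across species as in the abstract's mean and SD.
    import numpy as np

    current_cells = np.array([1200, 800, 450])    # occupied grid cells now
    future_cells = np.array([700, 850, 200])      # ensemble-projected cells

    prop_change = (future_cells - current_cells) / current_cells
    print(prop_change.round(2))                   # [-0.42  0.06 -0.56]
    print("mean:", prop_change.mean().round(2),
          "SD:", prop_change.std(ddof=1).round(2))
    ```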

  2. The Bologna Process Implementation and its Consequent Changes in the Teaching/Learning Model—the Industrial Management and Engineering Degree Case

    NASA Astrophysics Data System (ADS)

    Luísa Soares, Ana; Costa, Elga; Ferreira, Luís Pinto

    2009-11-01

    The present paper describes a project, included in a diversified programme, and the consequent implementation of a new Teaching/Learning model adapted to the Industrial Management and Engineering Degree (IMED) of the Management and Industrial Studies School (O'Porto Polytechnic Institute). Having its own particular characteristics, this model is based on the graduates' professional profile as well as on labour-market dynamics, placing the student at the centre of the learning process, in opposition to the `teacher-centred' method (as conceived by the Bologna Treaty). The model differs in its approach and includes differentiating factors when compared with the traditional project-based model. Through the conception and development of practical interdisciplinary projects, drawing together knowledge and techniques from the different Industrial Management and Engineering areas, we seek a new way of implementing the `Project Led Education' (PLE) bases, according to the Active Learning paradigm. This teaching/learning model aims to contribute to the education of Industrial Management and Engineering graduates focused on a high level of performance and professional rectitude, to foster students' enthusiasm and motivation for acquiring scientific and technical knowledge, and to satisfy the expectations of the diverse interest groups while promoting regional development.

  3. Evaluation of smartphone-based interaction techniques in a CAVE in the context of immersive digital project review

    NASA Astrophysics Data System (ADS)

    George, Paul; Kemeny, Andras; Colombet, Florent; Merienne, Frédéric; Chardonnet, Jean-Rémy; Thouvenin, Indira Mouttapa

    2014-02-01

    Immersive digital project reviews consist in using virtual reality (VR) as a tool for discussion between the various stakeholders of a project. In the automotive industry, the digital car prototype model is the common thread that binds them. It is used during immersive digital project reviews between designers, engineers, ergonomists, etc. The digital mockup is also used to assess future car architecture, habitability or perceived quality requirements, with the aim of reducing the use of physical mockups for optimized cost, delay and quality efficiency. Among the difficulties identified by users, handling the mockup is a major one. Inspired by current uses of nomad devices (multi-touch gestures, iPhone UI look'n'feel and AR applications), we designed a navigation technique taking advantage of these popular input devices: space scrolling allows moving around the mockup. In this paper, we present the results of a study we conducted on the usability and acceptability of the proposed smartphone-based interaction metaphor compared to a traditional technique, and we provide indications of the most efficient choices for different use-cases accordingly. The study was carried out in a traditional 4-sided CAVE, and its purpose was to assess a chosen set of interaction techniques to be implemented in Renault's new 5-sided 4K x 4K wall high-performance CAVE. The proposed new metaphor using nomad devices is well accepted by novice VR users, and future implementation should allow efficient industrial use. Such devices offer an easy and user-friendly alternative to existing traditional control devices such as joysticks.

  4. ULTRASONIC STUDIES OF THE FUNDAMENTAL MECHANISMS OF RECRYSTALLIZATION AND SINTERING OF METALS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TURNER, JOSEPH A.

    2005-11-30

    The purpose of this project was to develop a fundamental understanding of the interaction of an ultrasonic wave with complex media, with specific emphases on recrystallization and sintering of metals. A combined analytical, numerical, and experimental research program was implemented. Theoretical models of elastic wave propagation through these complex materials were developed using stochastic wave field techniques. The numerical simulations focused on finite element wave propagation solutions through complex media. The experimental efforts were focused on corroboration of the models developed and on the development of new experimental techniques. The analytical and numerical research allows the experimental results to be interpreted quantitatively.

  5. Face Hallucination with Linear Regression Model in Semi-Orthogonal Multilinear PCA Method

    NASA Astrophysics Data System (ADS)

    Asavaskulkiet, Krissada

    2018-04-01

    In this paper, we propose a new face hallucination technique: face image reconstruction in HSV color space with a semi-orthogonal multilinear principal component analysis method. This novel hallucination technique can perform directly from tensors via tensor-to-vector projection by imposing the orthogonality constraint in only one mode. In our experiments, we use facial images from the FERET database to test our hallucination approach, which is demonstrated by extensive experiments with high-quality hallucinated color faces. The experimental results clearly demonstrate that we can generate photorealistic color face images by using the SO-MPCA subspace with a linear regression model.

  6. Control of free-flying space robot manipulator systems

    NASA Technical Reports Server (NTRS)

    Cannon, Robert H., Jr.

    1990-01-01

    New control techniques for self-contained, autonomous free-flying space robots were developed and tested experimentally. Free-flying robots are envisioned as a key element of any successful long-term presence in space. These robots must be capable of performing the assembly, maintenance, inspection, and repair tasks that currently require human extravehicular activity (EVA). A set of research projects were developed and carried out using lab models of satellite robots and a flexible manipulator. The second generation space robot models use air cushion vehicle (ACV) technology to simulate in 2-D the drag-free, zero-g conditions of space. The current work is divided into 5 major projects: Global Navigation and Control of a Free Floating Robot, Cooperative Manipulation from a Free Flying Robot, Multiple Robot Cooperation, Thrusterless Robotic Locomotion, and Dynamic Payload Manipulation. These projects are examined in detail.

  7. [The GIPSY-RECPAM model: a versatile approach for integrated evaluation in cardiologic care].

    PubMed

    Carinci, F

    2009-01-01

    The tree-structured methodology applied in the GISSI-PSICOLOGIA project, although developed in the framework of the earliest GISSI studies, represents a powerful tool for analyzing different aspects of cardiologic care. The GISSI-PSICOLOGIA project has delivered a novel methodology based on the joint application of psychometric tools and sophisticated statistical techniques. Its prospective use could allow the building of effective epidemiological models relevant to the prognosis of the cardiologic patient. The various features of the RECPAM method allow versatile use in the framework of modern e-health projects. The study used the Cognitive Behavioral Assessment H Form (CBA-H) psychometric scales. The potential for its future application in the framework of Italian cardiology is significant, and it is particularly indicated to assist the planning of systems for integrated care and routine evaluation of the cardiologic patient.

  8. NE Ohio Urban Growth Monitoring and Modeling Prototype. Revised

    NASA Technical Reports Server (NTRS)

    Siebert, Loren; Klosterman, Richard E.

    2001-01-01

    At the University of Akron, Dr. Loren Siebert, Dr. Richard Klosterman, and their graduate research assistants (Jung-Wook Kim, Mohammed Hoque, Aziza Parveen, and Ben Stabler) worked on the integration of remote sensing and GIS-based planning support systems. The primary goal of the project was to develop methods that use remote sensing land cover mapping and GIS-based modeling to monitor and project urban growth and farmland loss in northeast Ohio. Another research goal has been to use only GIS data that are accessible via the World Wide Web, to determine whether Ohio's small counties and townships that do not currently have parcel-level GIS systems can apply these techniques. The project was jointly funded by NASA and USGS OhioView grants during the 2000-2001 academic year; the work is now being continued under a USGS grant.

  9. Convection in containerless processing.

    PubMed

    Hyers, Robert W; Matson, Douglas M; Kelton, Kenneth F; Rogers, Jan R

    2004-11-01

    Different containerless processing techniques have different strengths and weaknesses. Applying more than one technique allows various parts of a problem to be solved separately. For two research projects, one on phase selection in steels and the other on nucleation and growth of quasicrystals, a combination of experiments using electrostatic levitation (ESL) and electromagnetic levitation (EML) is appropriate. In both experiments, convection is an important variable. The convective conditions achievable with each method are compared for two very different materials: a low-viscosity, high-temperature stainless steel, and a high-viscosity, low-temperature quasicrystal-forming alloy. It is clear that the techniques are complementary when convection is a parameter to be explored in the experiments. For a number of reasons, including the sample size, temperature, and reactivity, direct measurement of the convective velocity is not feasible. Therefore, we must rely on computational techniques to estimate convection in these experiments. These models are an essential part of almost any microgravity investigation. The methods employed and results obtained for the projects Levitation Observation of Dendrite Evolution in Steel Ternary Alloy Rapid Solidification (LODESTARS) and Quasicrystalline Undercooled Alloys for Space Investigation (QUASI) are explained.

  10. Continuous statistical modelling for rapid detection of adulteration of extra virgin olive oil using mid infrared and Raman spectroscopic data.

    PubMed

    Georgouli, Konstantia; Martinez Del Rincon, Jesus; Koidis, Anastasios

    2017-02-15

    The main objective of this work was to develop a novel dimensionality reduction technique as part of an integrated pattern recognition solution capable of identifying adulterants such as hazelnut oil in extra virgin olive oil at low percentages based on spectroscopic chemical fingerprints. A novel Continuous Locality Preserving Projections (CLPP) technique is proposed which allows the modelling of the continuous nature of the produced in-house admixtures as data series instead of discrete points. The maintenance of the continuous structure of the data manifold enables better visualisation of the examined classification problem and facilitates more accurate utilisation of the manifold for detecting the adulterants. The performance of the proposed technique is validated with two different spectroscopic techniques (Raman and Fourier transform infrared, FT-IR). In all cases studied, CLPP accompanied by the k-Nearest Neighbors (kNN) algorithm was found to outperform other state-of-the-art pattern recognition techniques.
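
    CLPP itself is the authors' novel method and is not available in standard libraries; the following hedged sketch shows only the surrounding pattern-recognition pipeline (dimensionality reduction on spectral fingerprints followed by kNN), with PCA as a stand-in reducer and synthetic spectra in place of real Raman or FT-IR data.

    ```python
    # Sketch of a reduce-then-classify pipeline; PCA stands in for CLPP
    # and the spectra are random noise, so the accuracy printed is only
    # a placeholder for the workflow, not a meaningful result.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(2)
    X = rng.normal(size=(120, 600))        # 120 spectra x 600 wavenumbers
    y = rng.integers(0, 2, size=120)       # 0 = pure EVOO, 1 = adulterated

    clf = make_pipeline(PCA(n_components=10),
                        KNeighborsClassifier(n_neighbors=3))
    clf.fit(X[:90], y[:90])
    print("held-out accuracy:", clf.score(X[90:], y[90:]))
    ```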

  11. Development of a decision model for selection of appropriate timely delivery techniques for highway projects : final report, April 2009.

    DOT National Transportation Integrated Search

    2009-04-01

    The primary umbrella method used by the Oregon Department of Transportation (ODOT) to ensure on-time performance in standard construction contracting is liquidated damages. The assessment value is usually a matter of some judgment. In practice,...

  12. Development of a Graduate Course in Computer-Aided Geometric Design.

    ERIC Educational Resources Information Center

    Ault, Holly K.

    1991-01-01

    Described is a course that focuses on theory and techniques for ideation and refinement of geometric models used in mechanical engineering design applications. The course objectives, course outline, a description of the facilities, sample exercises, and a discussion of final projects are included. (KR)

  13. A Grade 6 Project in the Social Studies: The Wall of Old Jerusalem.

    ERIC Educational Resources Information Center

    Ediger, Marlow

    1993-01-01

    Presents a classroom lesson based on the walls of old Jerusalem. Maintains that cooperative-learning techniques used to build a model of the wall helped students understand the meaning of the original wall and the division of modern-day Jerusalem. (CFR)

  14. Quantifying uncertainty in partially specified biological models: how can optimal control theory help us?

    PubMed

    Adamson, M W; Morozov, A Y; Kuzenkov, O A

    2016-09-01

    Mathematical models in biology are highly simplified representations of a complex underlying reality, and there is always a high degree of uncertainty with regard to model function specification. This uncertainty becomes critical for models in which the use of different functions fitting the same dataset can yield substantially different predictions, a property known as structural sensitivity. Thus, even if the model is purely deterministic, the uncertainty in the model functions carries through into uncertainty in model predictions, and new frameworks are required to tackle this fundamental problem. Here, we consider a framework that uses partially specified models in which some functions are not represented by a specific form. The main idea is to project the infinite-dimensional function space into a low-dimensional space taking into account biological constraints. The key question of how to carry out this projection has so far remained a serious mathematical challenge and has hindered the use of partially specified models. Here, we propose and demonstrate a potentially powerful technique to perform such a projection by using optimal control theory to construct functions with the specified global properties. This approach opens up the prospect of a flexible and easy-to-use method to perform uncertainty analysis of biological models.
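
    The core idea of projecting a function space onto a low-dimensional basis can be illustrated simply; the sketch below projects a hypothetical saturating response onto a few Legendre polynomials by least squares, whereas the paper constructs such projections rigorously via optimal control theory with biological constraints.

    ```python
    # Minimal sketch: least-squares projection of an unknown model
    # function onto a low-dimensional polynomial basis; the response
    # function and basis size are illustrative assumptions.
    import numpy as np

    x = np.linspace(0, 1, 200)
    f_samples = x / (0.3 + x)              # hypothetical saturating response

    # Project onto the first 4 Legendre polynomials (degree 0..3)
    coeffs = np.polynomial.legendre.legfit(x, f_samples, deg=3)
    f_proj = np.polynomial.legendre.legval(x, coeffs)
    print("max projection error:", np.abs(f_proj - f_samples).max().round(4))
    ```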

  15. The EGS Collab Project: Stimulation Investigations for Geothermal Modeling Analysis and Validation

    NASA Astrophysics Data System (ADS)

    Blankenship, D.; Kneafsey, T. J.

    2017-12-01

    The US DOE's EGS Collab project team is establishing a suite of intermediate-scale (10-20 m) field test beds for coupled stimulation and interwell flow tests. The multiple national laboratory and university team is designing the tests to compare measured data to models to improve measurement and modeling toolsets available for use in field sites and investigations such as DOE's Frontier Observatory for Research in Geothermal Energy (FORGE) Project. Our tests will be well-controlled in situ experiments focused on rock fracture behavior, seismicity, and permeability enhancement. Pre- and post-test modeling will allow for model prediction and validation. High-quality, high-resolution geophysical and other fracture characterization data will be collected, analyzed, and compared with models and field observations to further elucidate the basic relationships between stress, induced seismicity, and permeability enhancement. Coring through the stimulated zone after tests will provide fracture characteristics that can be compared to monitoring data and model predictions. We will also observe and quantify other key governing parameters that impact permeability, and attempt to understand how these parameters might change throughout the development and operation of an Enhanced Geothermal System (EGS) project with the goal of enabling commercial viability of EGS. The Collab team will perform three major experiments over the three-year project duration. Experiment 1, intended to investigate hydraulic fracturing, will be performed in the Sanford Underground Research Facility (SURF) at a depth of 4,850 feet and will build on kISMET Project findings. Experiment 2 will be designed to investigate hydroshearing. Experiment 3 will investigate changes in fracturing strategies and will be further specified as the project proceeds. The tests will provide quantitative insights into the nature of stimulation (e.g., hydraulic fracturing, hydroshearing, mixed-mode fracturing, thermal fracturing) in crystalline rock under reservoir-like stress conditions and generate high-quality, high-resolution, diverse data sets to be simulated, allowing model validation. Monitoring techniques will also be evaluated under controlled conditions, identifying technologies appropriate for deeper full-scale EGS sites.

  16. Performance modeling for large database systems

    NASA Astrophysics Data System (ADS)

    Schaar, Stephen; Hum, Frank; Romano, Joe

    1997-02-01

    One of the unique approaches Science Applications International Corporation took to meet performance requirements was to start the modeling effort during the proposal phase of the Interstate Identification Index/Federal Bureau of Investigations (III/FBI) project. The III/FBI Performance Model uses analytical modeling techniques to represent the III/FBI system. Inputs to the model include workloads for each transaction type, record size for each record type, number of records for each file, hardware envelope characteristics, engineering margins and estimates for software instructions, memory, and I/O for each transaction type. The model uses queuing theory to calculate the average transaction queue length. The model calculates a response time and the resources needed for each transaction type. Outputs of the model include the total resources needed for the system, a hardware configuration, and projected inherent and operational availability. The III/FBI Performance Model is used to evaluate what-if scenarios and allows a rapid response to engineering change proposals and technical enhancements.
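
    The abstract does not give the model's equations, but the flavor of such queueing calculations can be shown with the classical M/M/1 results for utilization, queue length, and response time; the arrival and service rates below are hypothetical.

    ```python
    # Hedged sketch of an analytical queueing calculation of the kind
    # the III/FBI model performs; a single M/M/1 queue is assumed here,
    # not the model's actual multi-resource formulation.
    def mm1_metrics(arrival_rate, service_rate):
        """Return utilization, mean queue length, mean response time."""
        rho = arrival_rate / service_rate
        assert rho < 1.0, "queue is unstable"
        l_q = rho**2 / (1.0 - rho)                 # mean number waiting
        w = 1.0 / (service_rate - arrival_rate)    # mean time in system
        return rho, l_q, w

    rho, l_q, w = mm1_metrics(arrival_rate=8.0, service_rate=10.0)
    print(f"utilization={rho:.0%}, queue length={l_q:.1f}, "
          f"response time={w:.2f}s")
    ```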

  17. A Rigorous Investigation on the Ground State of the Penson-Kolb Model

    NASA Astrophysics Data System (ADS)

    Yang, Kai-Hua; Tian, Guang-Shan; Han, Ru-Qi

    2003-05-01

    By using either numerical calculations or analytical methods, such as the bosonization technique, the ground state of the Penson-Kolb model has been previously studied by several groups. Some physicists argued that, as far as the existence of superconductivity in this model is concerned, it is canonically equivalent to the negative-U Hubbard model. However, others did not agree. In the present paper, we investigate this model by an independent and rigorous approach. We show that the ground state of the Penson-Kolb model is nondegenerate and has a nonvanishing overlap with the ground state of the negative-U Hubbard model. Furthermore, we also show that the ground states of both models have the same good quantum numbers and may have superconducting long-range order at the same momentum q = 0. Our results support the equivalence between these models. The project was partially supported by the Special Funds for Major State Basic Research Projects (G20000365) and the National Natural Science Foundation of China under Grant No. 10174002.

  18. Assessing the impacts of climate change in Mediterranean catchments under conditions of data scarcity - The Gaza case study

    NASA Astrophysics Data System (ADS)

    Gampe, David; Ludwig, Ralf

    2013-04-01

    According to current climate projections, Mediterranean countries are at high risk of pronounced changes in the hydrological budget and extremes. While there is scientific consensus that climate-induced changes to the hydrology of Mediterranean regions are presently occurring and are projected to amplify in the future, very little knowledge is available to quantify these changes, an effort hampered by a lack of suitable and cost-effective hydrological monitoring and modeling systems. The European FP7 project CLIMB aims to analyze climate-induced changes to the hydrology of Mediterranean basins by investigating seven test sites located in Italy, France, Turkey, Tunisia, Gaza and Egypt. CLIMB employs a combination of novel geophysical field monitoring concepts, remote sensing techniques and integrated hydrologic modeling to improve process descriptions and understanding and to quantify existing uncertainties in climate change impact analysis. One of these seven sites, the Gaza Strip, is located in the Eastern Mediterranean and is part of the Palestinian Autonomous Area; it covers an area of 365 km², with a length of 35 km and a width of 6 to 12 km. Elevation ranges from sea level up to 104 m in the east of the test site. Mean annual precipitation varies from 235 mm in the south to 420 mm in the north of the area. The interannual variability of rainfall and rapid population growth in a heavily agricultural area represent the major challenges in this region. The physically based water simulation model WaSiM Vers. 2 (Schulla & Jasper, 1999) is set up to model current and projected future hydrological conditions. The availability of measured meteorological and hydrological data is poor, as is common in many Mediterranean catchments; this lack of measured input data hampers the calibration of the model setup and the validation of model outputs. WaSiM was driven with meteorological forcing taken from 4 different ENSEMBLES climate projections for a reference (1971-2000) and a future (2041-2070) time series. State-of-the-art remote sensing and field measurement techniques were applied to improve the quality of hydrological input parameters. For the parameterization of the vegetation, the Leaf Area Index (LAI) is a crucial component. However, the LAI is difficult to assess at field scale, hence a simple remote sensing approach, using the Normalized Difference Vegetation Index (NDVI) and MODIS LAI information, was applied for the parameterization in WaSiM. As no permanent streams, hence no discharge measurements, exist in the Gaza Strip, the actual evapotranspiration (ETact) outputs of the model were used for model validation. Landsat TM images were used to calculate the actual monthly mean ETact rates using the triangle method (Jiang and Islam, 1999). Simulated spatial ETact patterns and those derived from remote sensing show a good fit, especially for the growing season.
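
    The NDVI step mentioned above is straightforward to illustrate; in the sketch below the reflectances are hypothetical and the NDVI-to-LAI scaling is an assumed empirical form, not the parameterization actually used in WaSiM.

    ```python
    # NDVI from red and near-infrared reflectance, then an assumed
    # (illustrative, not WaSiM's) empirical NDVI-to-LAI scaling.
    import numpy as np

    red = np.array([0.08, 0.10, 0.20])    # hypothetical surface reflectances
    nir = np.array([0.45, 0.40, 0.25])

    ndvi = (nir - red) / (nir + red)
    print("NDVI:", ndvi.round(2))          # [0.7  0.6  0.11]

    # LAI grows roughly log-linearly with NDVI up to saturation;
    # the coefficient 1.5 and the cap at 6 are illustrative only.
    lai = np.clip(-np.log(1.0 - ndvi) * 1.5, 0.0, 6.0)
    print("LAI estimate:", lai.round(2))
    ```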

  19. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

    The speed with which the processes used in the software development field have changed makes forecasting the overall costs for a software project very difficult. Many researchers have considered this task unachievable, but others maintain that it can be addressed using well-known mathematical methods (e.g. multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building cost estimation models for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. In the first part of the paper, a summary of the major achievements in the research area of estimating overall project costs is presented, together with a description of the existing software development process models. In the last part, a basic mathematical model based on genetic programming is proposed, including a description of the chosen fitness function and chromosome representation. The perspective of the described model is linked with the current reality of software development, taking as its basis the software product life cycle and the current challenges and innovations in the software development area. Based on the author's experience and the analysis of existing models and product life cycles, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
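
    A minimal sketch of the genetic-algorithm idea is shown below: a tiny real-coded GA searches the coefficients of a COCOMO-style effort model against synthetic data standing in for the PROMISE datasets; the operators and parameters are illustrative assumptions, not the paper's configuration.

    ```python
    # Toy real-coded GA fitting effort = a * KLOC**b to synthetic
    # project data; fitness is mean squared error, selection is
    # truncation, and mutation is Gaussian noise (all assumed choices).
    import numpy as np

    rng = np.random.default_rng(3)
    kloc = rng.uniform(5, 300, 40)
    effort = 2.8 * kloc**1.05 * rng.lognormal(0, 0.1, 40)   # synthetic data

    def fitness(pop):
        pred = pop[:, 0:1] * kloc**pop[:, 1:2]   # each row: (a, b)
        return ((pred - effort) ** 2).mean(axis=1)

    pop = np.column_stack([rng.uniform(0.5, 6, 60), rng.uniform(0.8, 1.4, 60)])
    for _ in range(100):
        f = fitness(pop)
        parents = pop[np.argsort(f)[:20]]                    # keep best 20
        children = parents[rng.integers(0, 20, 40)] \
            + rng.normal(0, 0.02, (40, 2))                   # mutate copies
        pop = np.vstack([parents, children])                 # elitism
    a, b = pop[np.argmin(fitness(pop))]
    print(f"estimated a={a:.2f}, b={b:.2f}")
    ```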

  20. Methodological Ambiguities of the Projective Technique: An Overview and Attempt to Clarify.

    ERIC Educational Resources Information Center

    Veiel, H.; Coles, E. M.

    1982-01-01

    Definitions of projective tests are critiqued. A distinction is made between projective tests and projective techniques. The unique feature of the latter is its scoring process: response categories are intensionally defined and comprise infinite sets of responses. A continuity from psychometric to projective tests is argued. Statistical…

  1. The Validity of Projective Techniques and Their Clinical and Research Contributions

    ERIC Educational Resources Information Center

    Blatt, Sidney J.

    1975-01-01

    Questions about the limitations and potential contributions of projective techniques in research are considered and issues which limit the contributions of diagnostic assessment and projective techniques in clinical practice are examined. A proposal is made for conceptualizing diagnostic assessment as a more integral part of the therapeutic…

  2. FY08 LDRD Final Report A New Method for Wave Propagation in Elastic Media LDRD Project Tracking Code: 05-ERD-079

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petersson, A

    The LDRD project 'A New Method for Wave Propagation in Elastic Media' developed several improvements to the traditional finite difference technique for seismic wave propagation, including a summation-by-parts discretization which is provably stable for arbitrary heterogeneous materials, an accurate treatment of non-planar topography, local mesh refinement, and stable outflow boundary conditions. This project also implemented these techniques in a parallel open source computer code called WPP, and participated in several seismic modeling efforts to simulate ground motion due to earthquakes in Northern California. This research has been documented in six individual publications which are summarized in this report. Of these publications, four are published refereed journal articles, one is an accepted refereed journal article which has not yet been published, and one is a non-refereed software manual. The report concludes with a discussion of future research directions and an exit plan.

  3. Earth Sciences Data and Information System (ESDIS) program planning and evaluation methodology development

    NASA Technical Reports Server (NTRS)

    Dickinson, William B.

    1995-01-01

    An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements developments, management and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    GLASS, S. JILL; LOEHMAN, RONALD E.; HOSKING, F. MICHAEL

    The main objective of this project was to develop reliable, low-cost techniques for joining silicon nitride (Si3N4) to itself and to metals. For Si3N4 to be widely used in advanced turbomachinery applications, joining techniques must be developed that are reliable, cost-effective, and manufacturable. This project addressed those needs by developing and testing two Si3N4 joining systems: oxynitride glass joining materials and high temperature braze alloys. Extensive measurements were also made of the mechanical properties and oxidation resistance of the braze materials. Finite element models were used to predict the magnitudes and positions of the stresses in the ceramic regions of ceramic-to-metal sleeve and butt joints, similar to the geometries used for stator assemblies.

  5. A review of surrogate models and their application to groundwater modeling

    NASA Astrophysics Data System (ADS)

    Asher, M. J.; Croke, B. F. W.; Jakeman, A. J.; Peeters, L. J. M.

    2015-08-01

    The spatially and temporally variable parameters and inputs to complex groundwater models typically result in long runtimes which hinder comprehensive calibration, sensitivity, and uncertainty analysis. Surrogate modeling aims to provide a simpler, and hence faster, model which emulates the specified output of a more complex model as a function of its inputs and parameters. In this review paper, we summarize surrogate modeling techniques in three categories: data-driven, projection, and hierarchical-based approaches. Data-driven surrogates approximate a groundwater model through an empirical model that captures the input-output mapping of the original model. Projection-based models reduce the dimensionality of the parameter space by projecting the governing equations onto a basis of orthonormal vectors. In hierarchical or multifidelity methods the surrogate is created by simplifying the representation of the physical system, such as by ignoring certain processes, or reducing the numerical resolution. In discussing the application of these methods to groundwater modeling, we note several imbalances in the existing literature: a large body of work on data-driven approaches seemingly ignores major drawbacks to the methods; only a fraction of the literature focuses on creating surrogates to reproduce outputs of fully distributed groundwater models, despite these being ubiquitous in practice; and a number of the more advanced surrogate modeling methods are yet to be fully applied in a groundwater modeling context.
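
    The data-driven category is easy to illustrate: the sketch below fits a cheap polynomial emulator to input-output samples of a stand-in analytic function playing the role of an expensive groundwater model; real applications would use actual simulator runs and more capable regressors.

    ```python
    # Data-driven surrogate sketch: a polynomial emulator trained on a
    # handful of "runs" of a stand-in function (not a real simulator).
    import numpy as np

    def expensive_model(x):
        """Stand-in for a slow simulator: a drawdown-like response."""
        return np.exp(-0.5 * x) * np.sin(2.0 * x) + 0.1 * x

    x_train = np.linspace(0, 5, 30)        # 30 "runs" of the full model
    y_train = expensive_model(x_train)

    surrogate = np.polynomial.Polynomial.fit(x_train, y_train, deg=8)

    x_test = np.linspace(0, 5, 200)
    err = np.abs(surrogate(x_test) - expensive_model(x_test)).max()
    print(f"max emulation error: {err:.4f}")
    ```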

  6. The MERMAID project

    NASA Technical Reports Server (NTRS)

    Cowderoy, A. J. C.; Jenkins, John O.; Poulymenakou, A

    1992-01-01

    The tendency for software development projects to be completed over schedule and over budget has been documented extensively. Additionally, many projects are completed within budgetary and schedule targets only as a result of the customer agreeing to accept reduced functionality. In his classic book, The Mythical Man Month, Fred Brooks exposes the fallacy that effort and schedule are freely interchangeable. All current cost models are produced on the assumption that there is very limited scope for schedule compression unless there is a corresponding reduction in delivered functionality. The Metrication and Resources Modeling Aid (MERMAID) project, partially financed by the Commission of the European Communities (CEC) as Project 2046, began in Oct. 1988, and its goals were as follows: (1) improvement of understanding of the relationships between software development productivity and product and process metrics; (2) to facilitate widespread technology transfer from the Consortium to the European Software Industry; and (3) to facilitate the widespread uptake of cost estimation techniques by the provision of prototype cost estimation tools. MERMAID developed a family of methods for cost estimation, many of which have been implemented in prototype tools. These prototypes are best considered as toolkits or workbenches.

  7. CI-WATER HPC Model: Cyberinfrastructure to Advance High Performance Water Resources Modeling in the Intermountain Western U.S

    NASA Astrophysics Data System (ADS)

    Ogden, F. L.; Lai, W.; Douglas, C. C.; Miller, S. N.; Zhang, Y.

    2012-12-01

    The CI-WATER project is a cooperative effort between the Utah and Wyoming EPSCoR jurisdictions, funded through a cooperative agreement with the U.S. National Science Foundation EPSCoR. The CI-WATER project is acquiring hardware and developing software cyberinfrastructure (CI) to enhance the accessibility of high performance computing for water resources modeling in the Western U.S. One component of the project is the development of a large-scale, high-resolution, physically-based, data-driven, integrated computational water resources model, which we call the CI-WATER HPC model. The objective of this model development is to enable evaluation of integrated system behavior to guide and support water system planning and management by individual users, cities, or states. The model is first being tested in the Green River basin of Wyoming, which is the largest tributary to the Colorado River. The model will ultimately be applied to simulate the entire Upper Colorado River basin for hydrological studies, watershed management, and economic analysis, as well as evaluation of potential changes in environmental policy and law, population, land use, and climate. In addition to hydrologically important processes simulated in many hydrological models, the CI-WATER HPC model will emphasize anthropogenic influences such as land use change, water resources infrastructure, irrigation practices, trans-basin diversions, and urban/suburban development. The model operates on an unstructured mesh, employing an adaptive mesh with grid sizes as small as 10 m where needed, particularly in high-elevation snowmelt regions. Data for the model are derived from remote sensing sources, atmospheric models and geophysical techniques. Monte-Carlo techniques and ensemble Kalman filtering methodologies are employed for data assimilation. The model includes application programming interface (API) standards to allow easy substitution of alternative process-level simulation routines, and provides post-processing, visualization, and communication of massive amounts of output. The open-source CI-WATER model represents a significant advance in water resources modeling, and will be useful to water managers, planners, resource economists, and the hydrologic research community in general.

  8. Proceedings of the Advanced Seminar on one-dimensional, open-channel Flow and transport modeling

    USGS Publications Warehouse

    Schaffranek, Raymond W.

    1989-01-01

    In view of the increased use of mathematical/numerical simulation models, of the diversity of both model investigations and informational project objectives, and of the technical demands of complex model applications by U.S. Geological Survey personnel, an advanced seminar on one-dimensional open-channel flow and transport modeling was organized and held on June 15-18, 1987, at the National Space Technology Laboratory, Bay St. Louis, Mississippi. Principal emphasis in the Seminar was on one-dimensional flow and transport model-implementation techniques, operational practices, and application considerations. The purposes of the Seminar were to provide a forum for the exchange of information, knowledge, and experience among model users, as well as to identify immediate and future needs with respect to model development and enhancement, user support, training requirements, and technology transfer. The Seminar program consisted of a mix of topical and project presentations by Geological Survey personnel. This report is a compilation of short papers that summarize the presentations made at the Seminar.

  9. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    NASA Technical Reports Server (NTRS)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VV&C) project plan to meet 7009 requirements and include 7009 tools in reporting VV&C status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VV&C updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of the IMM Project to better communicate VV&C status. This has included refining original 7009 methodology with augmentation from the HRP NASA-STD-7009 Guidance Document working group and the NASA-HDBK-7009 [2]. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and simulation tool, and completion of service requests from a broad end user consortium including operations, science and technology planning, and exploration planning. IMM v4.0 is slated for operational release in FY15, and current VV&C assessments illustrate the expected VV&C status prior to the completion of customer-led external review efforts. CONCLUSIONS: The VV&C approach established by the IMM Project of incorporating Project-specific recommended practices and guidelines for implementing the 7009 requirements is comprehensive and includes the involvement of end users at every stage in IMM evolution. Methods and techniques used to quantify the VV&C status of the IMM Project represented a critical communication tool in providing clear and concise suitability assessments to IMM customers. These processes have not only received approval from the local NASA community but have also garnered recognition by other federal agencies seeking to develop similar guidelines in the medical modeling community.

  10. Longitudinal Evaluation of Fatty Acid Metabolism in Normal and Spontaneously Hypertensive Rat Hearts with Dynamic MicroSPECT Imaging

    DOE PAGES

    Reutter, Bryan W.; Huesman, Ronald H.; Brennan, Kathleen M.; ...

    2011-01-01

    The goal of this project is to develop radionuclide molecular imaging technologies using a clinical pinhole SPECT/CT scanner to quantify changes in cardiac metabolism using the spontaneously hypertensive rat (SHR) as a model of hypertensive-related pathophysiology. This paper quantitatively compares fatty acid metabolism in hearts of SHR and Wistar-Kyoto normal rats as a function of age and thereby tracks physiological changes associated with the onset and progression of heart failure in the SHR model. The fatty acid analog 123I-labeled BMIPP was used in longitudinal metabolic pinhole SPECT imaging studies performed every seven months for 21 months. The uniqueness of this project is the development of techniques for estimating the blood input function from projection data acquired by a slowly rotating camera that is imaging fast circulation, and the quantification of the kinetics of 123I-BMIPP by fitting compartmental models to the blood and tissue time-activity curves.

  11. Yielding physically-interpretable emulators - A Sparse PCA approach

    NASA Astrophysics Data System (ADS)

    Galelli, S.; Alsahaf, A.; Giuliani, M.; Castelletti, A.

    2015-12-01

    Projection-based techniques, such as Proper Orthogonal Decomposition (POD), are a common approach to surrogate high-fidelity process-based models with lower-order dynamic emulators. With POD, the dimensionality reduction is achieved by using observations, or 'snapshots' (generated with the high-fidelity model), to project the entire set of input and state variables of the model onto a smaller set of basis functions that account for most of the variability in the data. While the reduction efficiency and variance control of POD techniques are usually very high, the resulting emulators are structurally complex and can hardly be given a physically meaningful interpretation, as each basis is a projection of the entire set of inputs and states. In this work, we propose a novel approach based on Sparse Principal Component Analysis (SPCA) that combines the several assets of POD methods with the potential for ex-post interpretation of the emulator structure. SPCA reduces the number of non-zero coefficients in the basis functions by identifying a sparse matrix of coefficients. While the resulting set of basis functions may retain less of the variance of the snapshots, the presence of a few non-zero coefficients assists in the interpretation of the underlying physical processes. The SPCA approach is tested on the reduction of a 1D hydro-ecological model (DYRESM-CAEDYM) used to describe the main ecological and hydrodynamic processes in Tono Dam, Japan. An experimental comparison against a standard POD approach shows that SPCA achieves the same accuracy in emulating a given output variable, for the same level of dimensionality reduction, while yielding better insight into the main process dynamics.
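
    The contrast between dense and sparse loadings can be demonstrated with scikit-learn's SparsePCA; the data below are synthetic and the penalty setting is an arbitrary assumption, so this illustrates the interpretability argument rather than the paper's DYRESM-CAEDYM experiment.

    ```python
    # Dense PCA components mix every variable; Sparse PCA zeroes most
    # loadings, leaving a few interpretable contributors per component.
    import numpy as np
    from sklearn.decomposition import PCA, SparsePCA

    rng = np.random.default_rng(4)
    X = rng.normal(size=(200, 12))
    X[:, 0] += 3.0 * rng.normal(size=200)    # one dominant "process" variable

    pca = PCA(n_components=3).fit(X)
    spca = SparsePCA(n_components=3, alpha=2.0, random_state=0).fit(X)

    print("dense loadings (comp 0):", pca.components_[0].round(2))
    print("sparse loadings (comp 0):", spca.components_[0].round(2))
    print("nonzero loadings per comp:",
          np.count_nonzero(spca.components_, axis=1))
    ```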

  12. Net present value analysis: appropriate for public utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, W.N. III

    1980-08-28

    The net-present-value technique widely used by unregulated companies for capital budgeting can also apply to regulated public utilities. Used to decide whether an investment is worthwhile, the NPV technique discounts an investment's future cash flows and compares them with its initial outlay or cost. The type of project most appropriate for an NPV analysis is one designed to lower costs. Efficiency-improving investments can be adequately evaluated by the NPV method, which in certain cases is easier to use than some of the more complicated revenue-requirement computer models.
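
    A minimal NPV calculation of the kind described is shown below; the discount rate, outlay, and savings stream are hypothetical.

    ```python
    # Net present value: discount a cost-lowering project's annual
    # savings at the utility's cost of capital, subtract the outlay.
    def npv(rate, initial_outlay, cash_flows):
        """Sum of discounted cash flows minus the initial cost."""
        pv = sum(cf / (1.0 + rate) ** t
                 for t, cf in enumerate(cash_flows, start=1))
        return pv - initial_outlay

    savings = [30_000] * 10               # annual O&M savings, years 1-10
    print(f"NPV = ${npv(0.08, 180_000, savings):,.0f}")   # positive -> worthwhile
    ```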

  13. Setting analyst: A practical harvest planning technique

    Treesearch

    Olivier R.M. Halleux; W. Dale Greene

    2001-01-01

    Setting Analyst is an ArcView extension that facilitates practical harvest planning for ground-based systems. By modeling the travel patterns of ground-based machines, it compares different harvesting settings based on projected average skidding distance, logging costs, and site disturbance levels. Setting Analyst uses information commonly available to consulting...

  14. LAND USE CHANGE DUE TO URBANIZATION FOR THE NEUSE RIVER BASIN

    EPA Science Inventory

    The Urban Growth Model (UGM) was applied to analysis of land use change in the Neuse River Basin as part of a larger project for estimating the regional and broader impact of urbanization. UGM is based on cellular automata (CA) simulation techniques developed at the University...
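
    A hedged illustration of a cellular-automaton growth step of the kind UGM builds on is sketched below; the neighbor threshold and growth probability are invented for illustration, not UGM's actual rules.

    ```python
    # Toy CA urbanization step: a non-urban cell urbanizes with some
    # probability when enough of its 8 neighbors are already urban.
    import numpy as np

    rng = np.random.default_rng(6)
    grid = (rng.random((50, 50)) < 0.05).astype(int)   # sparse urban seed

    def step(grid, p=0.3, threshold=2):
        # Count urban cells in the 8-neighborhood of every cell
        neighbors = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
                        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                        if (dy, dx) != (0, 0))
        grow = ((neighbors >= threshold) & (grid == 0)
                & (rng.random(grid.shape) < p))
        return grid | grow.astype(int)

    for _ in range(10):
        grid = step(grid)
    print("urban fraction after 10 steps:", grid.mean().round(3))
    ```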

  15. Test Design Project: Studies in Test Adequacy. Annual Report.

    ERIC Educational Resources Information Center

    Wilcox, Rand R.

    These studies in test adequacy focus on two problems: procedures for estimating reliability, and techniques for identifying ineffective distractors. Fourteen papers are presented on recent advances in measuring achievement (a response to Molenaar); "an extension of the Dirichlet-multinomial model that allows true score and guessing to be…

  16. Connected vehicle impacts on transportation planning : analysis of the need for new and enhanced analysis tools, techniques and data, briefing for traffic simulation models.

    DOT National Transportation Integrated Search

    2016-03-11

    The principal objective of this project, Connected Vehicle Impacts on Transportation Planning, is to comprehensively assess how connected vehicles should be considered across the range of transportation planning processes and products developed...

  18. Jacobian projection reduced-order models for dynamic systems with contact nonlinearities

    NASA Astrophysics Data System (ADS)

    Gastaldi, Chiara; Zucca, Stefano; Epureanu, Bogdan I.

    2018-02-01

    In structural dynamics, the prediction of the response of systems with localized nonlinearities, such as friction dampers, is of particular interest. This task becomes especially cumbersome when high-resolution finite element models are used. While state-of-the-art techniques such as Craig-Bampton component mode synthesis are employed to generate reduced order models, the interface (nonlinear) degrees of freedom must still be solved in full. For this reason, a new generation of specialized techniques capable of reducing linear and nonlinear degrees of freedom alike is emerging. This paper proposes a new technique that exploits spatial correlations in the dynamics to compute a reduction basis. The basis is composed of a set of vectors obtained using the Jacobian of partial derivatives of the contact forces with respect to nodal displacements. These basis vectors correspond to specifically chosen boundary conditions at the contacts over one cycle of vibration. The technique is shown to be effective in the reduction of several models studied using multiple harmonics with a coupled static solution. In addition, this paper addresses another challenge common to all reduction techniques: it presents and validates a novel a posteriori error estimate capable of evaluating the quality of the reduced-order solution without involving a comparison with the full-order solution.
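
    The paper's exact Jacobian-based construction is more involved, but the general recipe of orthonormalizing response vectors into a reduction basis can be sketched as follows; the snapshot matrix here is random stand-in data and the energy threshold is an assumed choice.

    ```python
    # Generic reduction-basis sketch: orthonormalize snapshot responses
    # via SVD and keep the leading vectors; stand-in data, not the
    # paper's Jacobian construction.
    import numpy as np

    rng = np.random.default_rng(5)
    n_dof, n_snapshots = 500, 12
    snapshots = rng.normal(size=(n_dof, n_snapshots))   # stand-in responses

    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(energy, 0.999) + 1)         # assumed threshold
    basis = U[:, :k]                                    # reduction basis Phi

    # Reduced coordinates q = Phi^T u, with the approximation u ~ Phi q
    u = rng.normal(size=n_dof)
    q = basis.T @ u
    rel_err = np.linalg.norm(basis @ q - u) / np.linalg.norm(u)
    print("kept", k, "of", n_snapshots, "vectors; recon error:",
          round(float(rel_err), 3))
    ```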

  18. Optical interferometry and Gaia parallaxes for a robust calibration of the Cepheid distance scale

    NASA Astrophysics Data System (ADS)

    Kervella, Pierre; Mérand, Antoine; Gallenne, Alexandre; Trahin, Boris; Borgniet, Simon; Pietrzynski, Grzegorz; Nardetto, Nicolas; Gieren, Wolfgang

    2018-04-01

    We present the modeling tool we developed to incorporate multi-technique observations of Cepheids in a single pulsation model: the Spectro-Photo-Interferometry of Pulsating Stars (SPIPS). The combination of angular diameters from optical interferometry, radial velocities and photometry with the coming Gaia DR2 parallaxes of nearby Galactic Cepheids will soon enable us to calibrate the projection factor of the classical Parallax-of-Pulsation method. This will extend its applicability to Cepheids too distant for accurate Gaia parallax measurements, and allow us to precisely calibrate the Leavitt law's zero point. As an example application, we present the SPIPS model of the long-period Cepheid RS Pup that provides a measurement of its projection factor, using the independent distance estimated from its light echoes.
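
    The Parallax-of-Pulsation idea the abstract refers to can be sketched compactly (an illustrative toy, not SPIPS: a fixed projection factor p and noise-free, well-sampled inputs are assumed):

    ```python
    import numpy as np

    def fit_distance(t_days, v_rad_kms, theta_mas, p=1.27):
        """Distance [pc] from matching the linear radius variation
        (integral of -p * radial velocity) to the angular-diameter
        variation theta(t) = 2 R(t) / d."""
        km_per_au = 1.495978707e8
        # Linear radius variation around its mean, in AU.
        dR_au = -p * np.cumsum(v_rad_kms * np.gradient(t_days) * 86400.0) / km_per_au
        dR_au -= dR_au.mean()
        # theta variation [mas] = 2000 * dR [AU] / d [pc]; the
        # least-squares slope of d_theta vs. dR gives the distance.
        d_theta = theta_mas - theta_mas.mean()
        return 2000.0 * np.sum(dR_au**2) / np.sum(d_theta * dR_au)
    ```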

  19. Modeling the Volcanic Source at Long Valley, CA, Using a Genetic Algorithm Technique

    NASA Technical Reports Server (NTRS)

    Tiampo, Kristy F.

    1999-01-01

    In this project, we attempted to model the deformation pattern due to the magmatic source at Long Valley caldera using a real-value coded genetic algorithm (GA) inversion similar to that found in Michalewicz, 1992. The project has been both successful and rewarding. The genetic algorithm, coded in the C programming language, performs stable inversions over repeated trials, with varying initial and boundary conditions. The original model used a GA in which the geophysical information was coded into the fitness function through the computation of surface displacements for a Mogi point source in an elastic half-space. The program was designed to invert for a spherical magmatic source - its depth, horizontal location and volume - using the known surface deformations. It also included the capability of inverting for multiple sources.
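
    A condensed sketch of the approach (illustrative only: the Mogi formula is written in one common form, and the GA operators and hyperparameters are simple stand-ins for the real-value coded GA described above):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def mogi_uz(x, y, xs, ys, depth, dV, nu=0.25):
        """Vertical surface displacement of a Mogi point source in an
        elastic half-space (one common form; conventions vary)."""
        r2 = (x - xs)**2 + (y - ys)**2
        return (1.0 - nu) * dV / np.pi * depth / (r2 + depth**2)**1.5

    def ga_invert(x, y, uz_obs, bounds, pop=60, gens=200):
        """Real-coded GA: tournament selection, blend crossover,
        Gaussian mutation, elitism. bounds: (4, 2) array of limits for
        (xs, ys, depth, dV)."""
        lo, hi = bounds[:, 0], bounds[:, 1]
        P = lo + rng.random((pop, 4)) * (hi - lo)
        misfit = lambda m: np.sum((mogi_uz(x, y, *m) - uz_obs)**2)
        for _ in range(gens):
            f = np.array([misfit(m) for m in P])
            best = P[f.argmin()].copy()
            i, j = rng.integers(0, pop, (2, pop))      # tournament pairs
            parents = P[np.where(f[i] < f[j], i, j)]
            a = rng.random((pop, 1))                   # blend crossover
            P = a * parents + (1 - a) * parents[rng.permutation(pop)]
            P += rng.normal(0.0, 0.01, P.shape) * (hi - lo)  # mutation
            P = np.clip(P, lo, hi)
            P[0] = best                                # elitism
        f = np.array([misfit(m) for m in P])
        return P[f.argmin()]
    ```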

  1. Software-Enabled Project Management Techniques and Their Relationship to the Triple Constraints

    ERIC Educational Resources Information Center

    Elleh, Festus U.

    2013-01-01

    This study investigated the relationship between software-enabled project management techniques and the triple constraints (time, cost, and scope). There was a dearth of academic literature focused on the relationship between software-enabled project management techniques and the triple constraints (time, cost, and scope). Based on the gap…

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  3. Leaving No Stone Unturned in the Pursuit of New Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, Timothy

    The major goal of this project was to investigate a variety of topics in theoretical particle physics, with an emphasis on beyond the Standard Model phenomena. A particular emphasis is placed on making a connection to ongoing experimental efforts designed to extend our knowledge of the fundamental physics frontiers. The principal investigator aimed to play a leading role in theoretical research that complements this impressive experimental endeavor. Progress requires a strong synergy between the theoretical and experimental communities to design and interpret the data that is produced. Thus, this project's main goal was to improve our understanding of models, signatures, and techniques as we continue the hunt for new physics.

  4. Modeling the temperature-dependent peptide vibrational spectra based on an implicit-solvent model and an enhanced sampling technique

    NASA Astrophysics Data System (ADS)

    Wu, Tianmin; Wang, Tianjun; Chen, Xian; Fang, Bin; Zhang, Ruiting; Zhuang, Wei

    2016-01-01

    We herein review our studies on simulating the thermal unfolding Fourier transform infrared and two-dimensional infrared spectra of peptides. The peptide-water configuration ensembles, required for spectrum modeling, are generated at a series of temperatures using the GBOBC implicit solvent model and the integrated tempering sampling technique. The fluctuating vibrational Hamiltonians of the amide I vibrational band are constructed using the Frenkel exciton model. The signals are calculated using nonlinear exciton propagation. The simulated spectral features such as the intensity and ellipticity are consistent with the experimental observations. Comparing the signals for two beta-hairpin polypeptides with similar structures suggests that this technique is sensitive to peptide folding landscapes. Project supported by the National Natural Science Foundation of China (Grant Nos. 21203178, 21373201, and 21433014), the Science and Technological Ministry of China (Grant No. 2011YQ09000505), and the "Strategic Priority Research Program" of the Chinese Academy of Sciences (Grant Nos. XDB10040304 and XDB100202002).

  5. Intraspecific variation buffers projected climate change impacts on Pinus contorta

    PubMed Central

    Oney, Brian; Reineking, Björn; O'Neill, Gregory; Kreyling, Juergen

    2013-01-01

    Species distribution modeling (SDM) is an important tool to assess the impact of global environmental change. Many species exhibit ecologically relevant intraspecific variation, but few studies have analyzed its relevance for SDM. Here, we compared three SDM techniques for the highly variable species Pinus contorta. First, applying a conventional SDM approach, we used MaxEnt to model the subject as a single species (species model), based on presence–absence observations. Second, we used MaxEnt to model each of the three most prevalent subspecies independently and combined their projected distributions (subspecies model). Finally, we used a universal growth transfer function (UTF), an approach to incorporate intraspecific variation utilizing provenance trial tree growth data. Different model approaches performed similarly when predicting current distributions. MaxEnt model discrimination was greater (AUC – species model: 0.94, subspecies model: 0.95, UTF: 0.89), but the UTF was better calibrated (slope and bias – species model: 1.31 and −0.58, subspecies model: 1.44 and −0.43, UTF: 1.01 and 0.04, respectively). In contrast, for future climatic conditions, projections of lodgepole pine habitat suitability diverged. In particular, when the species' intraspecific variability was acknowledged, the species was projected to better tolerate climatic change as related to suitable habitat without migration (subspecies model: 26% habitat loss or UTF: 24% habitat loss vs. species model: 60% habitat loss), and given unlimited migration may increase the amount of suitable habitat (subspecies model: 8% habitat gain or UTF: 12% habitat gain vs. species model: 51% habitat loss) in the climatic period 2070–2100 (SRES A2 scenario, HADCM3). We conclude that models derived from within-species data produce different and better projections that coincide with ecological theory. Furthermore, we conclude that intraspecific variation may buffer against adverse effects of climate change. A key future research challenge lies in assessing the extent to which species can utilize intraspecific variation under rapid environmental change. PMID:23467191

  6. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
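
    The core loop of such a methodology is easy to sketch (a toy stand-in for SIMRAND: the task-cost distributions, the exponential utility, and its risk coefficient are illustrative assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def expected_utility(tasks, utility, n=100_000):
        """Monte Carlo estimate of E[u(total cost)] for one network,
        i.e., one candidate subset of tasks."""
        total = sum(sample(n) for sample in tasks)
        return utility(total).mean()

    # Two alternative networks, each a list of task-cost samplers
    # (triangular distributions as placeholders for assessed CDFs).
    networks = {
        "A": [lambda n: rng.triangular(8, 10, 15, n),
              lambda n: rng.triangular(4, 6, 12, n)],
        "B": [lambda n: rng.triangular(10, 14, 16, n)],
    }
    utility = lambda cost: -np.expm1(0.05 * cost)   # risk-averse in cost
    best = max(networks, key=lambda k: expected_utility(networks[k], utility))
    print("preferred network:", best)
    ```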

  7. Recent Experience Using Active Love Wave Techniques to Characterize Seismographic Station Sites

    NASA Astrophysics Data System (ADS)

    Martin, A. J.; Yong, A.; Salomone, L.

    2014-12-01

    Active-source Love waves recorded by the multi-channel analysis of surface waves (MASLW) technique were recently analyzed in two site characterization projects. Between 2010 and 2011, the 2009 American Recovery and Reinvestment Act (ARRA) funded GEOVision to conduct geophysical investigations at 189 seismographic stations—185 in California and 4 in the Central Eastern U.S. (CEUS). The original project plan was to utilize active and passive Rayleigh wave-based techniques to obtain shear-wave velocity (VS) profiles to a minimum depth of 30 m and the time-averaged VS of the upper 30 meters (VS30). Early in the investigation it became evident that Rayleigh wave techniques, such as multi-channel analysis of surface waves (MASRW), were not effective at characterizing all sites. Shear-wave seismic refraction and MASLW techniques were therefore applied. The MASLW technique was deployed at a total of 38 sites, in addition to other methods, and used as the primary technique to characterize 22 sites, 5 of which were also characterized using Rayleigh wave techniques. In 2012, the Electric Power Research Institute funded characterization of 33 CEUS station sites. Based on experience from the ARRA investigation, both MASRW and MASLW data were acquired by GEOVision at 24 CEUS sites—the remaining 9 sites and 2 overlapping sites were characterized by the University of Texas, Austin. Of the 24 sites characterized by GEOVision, 16 were characterized using MASLW data, 4 using both MASLW and MASRW data, and 4 using MASRW data. Love wave techniques were often found to perform better, or at least yield phase velocity data that could be more readily modeled using the fundamental mode assumption, at shallow rock sites, sites with steep velocity gradients, and sites with a thin, low-velocity, surficial soil layer overlying stiffer sediments. These types of velocity structure often excite dominant higher modes in Rayleigh wave data, but not in Love wave data. At such sites, it may be possible to model Rayleigh wave data using multi- or effective-mode techniques; however, in many cases extraction of adequate Rayleigh wave dispersion data for modeling was difficult. These results imply that field procedures should include careful scrutiny of Rayleigh wave-based dispersion data in order to collect Love wave data when warranted.

  8. Growth and Yield Predictions for Thinned and Unthinned Slash Pine Plantations on Cutover Sites in the West Gulf Region

    Treesearch

    Stanley J. Zarnoch; Donald P. Feduccia; V. Clark Baldwin; Tommy R. Dell

    1991-01-01

    A growth and yield model has been developed for slash pine plantations on problem-free cutover sites in the west gulf region. The model was based on the moment-percentile method using the Weibull distribution for tree diameters. This technique was applied to unthinned and thinned stand projections and, subsequently, to the prediction of residual stands immediately...

  9. Process Engineering with the Evolutionary Spiral Process Model. Version 01.00.06

    DTIC Science & Technology

    1994-01-01

    The Process Definition and Modeling Guidebook (SPC-92041-CMC) provides methods for defining and documenting processes so they can be analyzed and modified. Techniques such as the Program Evaluation and Review Technique (PERT), together with a variety of automated tools, support the activity of developing a project schedule. Analyzing and documenting how processes work helps keep the organization from becoming disoriented during an improvement program (Curtis, Kellner, and Over 1992).

  10. Preliminary survey on site-adaptation techniques for satellite-derived and reanalysis solar radiation datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polo, J.; Wilbert, S.; Ruiz-Arias, J. A.

    2016-07-01

    At any site, the bankability of a projected solar power plant largely depends on the accuracy and general quality of the solar radiation data generated during the solar resource assessment phase. The term 'site adaptation' has recently started to be used in the framework of solar energy projects to refer to the improvement that can be achieved in satellite-derived solar irradiance and model data when short-term local ground measurements are used to correct systematic errors and bias in the original dataset. This contribution presents a preliminary survey of different possible techniques that can improve long-term satellite-derived and model-derived solar radiation data through the use of short-term on-site ground measurements. The possible approaches that are reported here may be applied in different ways, depending on the origin and characteristics of the uncertainties in the modeled data. This work, which is the first step of a forthcoming in-depth assessment of methodologies for site adaptation, has been done within the framework of the International Energy Agency Solar Heating and Cooling Programme Task 46 'Solar Resource Assessment and Forecasting.'

  11. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
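
    The caching idea generalizes to a few lines (a hedged sketch, not the NCCV implementation; the key scheme, the pickle storage, and the render callback are assumptions):

    ```python
    import hashlib, os, pickle

    CACHE_DIR = "map_cache"

    def cached_map(model, scenario, variable, period, render):
        """Return a rendered map, generating it at most once per unique
        (model, scenario, variable, period) request."""
        key = hashlib.sha1(
            f"{model}|{scenario}|{variable}|{period}".encode()).hexdigest()
        path = os.path.join(CACHE_DIR, key + ".pkl")
        if os.path.exists(path):
            with open(path, "rb") as fh:      # cache hit: no rendering
                return pickle.load(fh)
        result = render(model, scenario, variable, period)  # slow path
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(path, "wb") as fh:
            pickle.dump(result, fh)
        return result
    ```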

  12. New optical tomographic & topographic techniques for biomedical applications

    NASA Astrophysics Data System (ADS)

    Buytaert, Jan

    The mammalian middle ear contains the eardrum and the three auditory ossicles, and forms an impedance match between sound in air and pressure waves in the fluid of the inner ear. Without this intermediate system, with its unsurpassed efficiency and dynamic range, we would be practically deaf. Physics-based modeling of this extremely complex mechanical system is necessary to help our basic understanding of the functioning of hearing. Highly realistic models will make it possible to predict the outcome of surgical interventions and to optimize design of ossicle prostheses and active middle ear implants. To obtain such models with realistic output, basic input data is still missing. In this dissertation I developed and used two new optical techniques to obtain two essential sets of data: accurate three-dimensional morphology of the middle ear structures, and elasticity parameters of the eardrum. The first technique is a new method for optical tomography of macroscopic biomedical objects, which makes it possible to measure the three-dimensional geometry of the middle ear ossicles and the soft tissues connecting and suspending them. I made a new and high-resolution version of this orthogonal-plane fluorescence optical sectioning method, to obtain micrometer resolution in macroscopic specimens. The result is thus a complete 3-D model of the middle (and inner) ear of gerbil in unprecedented quality. On top of high-resolution morphological models of the middle ear structures, I applied the technique in other fields of research as well. The second device works according to a new optical profilometry technique which allows the shape and deformations of the eardrum and other membranes or objects to be measured. The approach is called projection moiré profilometry, and creates moiré interference fringes which contain the height information. I developed a setup which uses liquid crystal panels for grid projection and optical demodulation. Hence no moving parts are present and the setup is entirely digitally controlled. This measurement method was developed to determine the elasticity parameters of the eardrum in situ. Other surface shapes, however, can also be measured.

  13. Multi-criterion model ensemble of CMIP5 surface air temperature over China

    NASA Astrophysics Data System (ADS)

    Yang, Tiantian; Tao, Yumeng; Li, Jingjing; Zhu, Qian; Su, Lu; He, Xiaojia; Zhang, Xiaoming

    2018-05-01

    The global circulation models (GCMs) are useful tools for simulating climate change, projecting future temperature changes, and therefore supporting the preparation of national climate adaptation plans. However, different GCMs are not always in agreement with each other over various regions. The reason is that GCMs' configurations, module characteristics, and dynamic forcings vary from one to another. Model ensemble techniques are extensively used to post-process the outputs from GCMs and improve the variability of model outputs. Root-mean-square error (RMSE), correlation coefficient (CC, or R) and uncertainty are commonly used statistics for evaluating the performances of GCMs. However, many model ensemble techniques cannot guarantee satisfactory values of all these statistics simultaneously. In this paper, we propose a multi-model ensemble framework, using a state-of-the-art evolutionary multi-objective optimization algorithm (termed MOSPD), to evaluate different characteristics of ensemble candidates and to provide comprehensive trade-off information for different model ensemble solutions. A case study of optimizing the surface air temperature (SAT) ensemble solutions over different geographical regions of China is carried out. The data cover the period from 1900 to 2100, and the projections of SAT are analyzed with regard to three different statistical indices (i.e., RMSE, CC, and uncertainty). Among the derived ensemble solutions, the trade-off information is further analyzed with a robust Pareto front with respect to the different statistics. Comparison over the historical period (1900-2005) shows that the optimized solutions are superior to those obtained from a simple model average, as well as to any single GCM output. The improvements in the statistics vary across the climatic regions of China. Future projections (2006-2100) with the proposed ensemble method identify that the largest (smallest) temperature changes will happen in the South Central China (the Inner Mongolia), the North Eastern China (the South Central China), and the North Western China (the South Central China), under RCP 2.6, RCP 4.5, and RCP 8.5 scenarios, respectively.
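
    The trade-off analysis can be illustrated with a random-search stand-in (this is not the MOSPD algorithm; Dirichlet-sampled weights and a two-objective Pareto filter over RMSE and correlation are illustrative assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def pareto_ensembles(gcm_fields, obs, n_try=5000):
        """gcm_fields: (n_models, n_points); obs: (n_points,).
        Returns the non-dominated (RMSE, CC, weights) candidates."""
        m = gcm_fields.shape[0]
        scored = []
        for _ in range(n_try):
            w = rng.dirichlet(np.ones(m))        # convex model weighting
            pred = w @ gcm_fields
            rmse = np.sqrt(np.mean((pred - obs)**2))
            cc = np.corrcoef(pred, obs)[0, 1]
            scored.append((rmse, -cc, w))
        scored.sort(key=lambda s: (s[0], s[1]))  # RMSE ascending
        front, best_neg_cc = [], np.inf
        for rmse, neg_cc, w in scored:
            if neg_cc < best_neg_cc:             # strictly better CC
                front.append((rmse, -neg_cc, w))
                best_neg_cc = neg_cc
        return front
    ```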

  14. Sludge Settling Rate Observations and Projections at the Savannah River Site - 13238

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillam, Jeffrey M.; Shah, Hasmukh B.; Keefer, Mark T.

    2013-07-01

    Since 2004, sludge batches have included a high percentage of stored sludge generated from the H-modified (HM) process. The slow-settling nature of HM sludge means that the settling is often the major part of the washing tank quiescent period between required pump runs to maintain flammability control. Reasonable settling projections are needed to wash soluble salts from sludge in an efficient manner, to determine how much sludge can be washed in a batch within flammability limits, and to provide composition projections for batch qualification work done in parallel with field preparation. Challenges to providing reasonably accurate settling projections include (1) large variations in settling behavior from tank to tank, (2) accounting for changing initial concentrations, sludge masses, and combinations of different sludge types, (3) changes in settling behavior upon dissolving some sludge compounds, and (4) sludge preparation schedules that do not allow for much data collection for a particular sludge before washing begins. Scaling from laboratory settling tests has provided inconsistent results. Several techniques have been employed to improve settling projections and therefore the overall batch preparation efficiency. Before any observations can be made on a particular sludge mixture, projections can only be made based on historical experience with similar sludge types. However, scaling techniques can be applied to historical settling models to account for different sludge masses, concentrations, and even combinations of types of sludge. After sludge washing/settling cycles begin, the direct measurement of the sludge height, once generally limited to a single turbidity meter measurement per settle period, is now augmented by examining the temperature profile in the settling tank to help determine the settled sludge height over time. Recently, a settling model examined at PNNL [1,2,3] has been applied to observed thermocouple and turbidity meter readings to quickly provide settling correlations to project settled heights for other conditions. These tools improve the accuracy and adaptability of short and mid-range planning for sludge batch preparation. (authors)

  15. Space construction system analysis study: Project systems and missions descriptions

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Three project systems are defined and summarized. The systems are: (1) a Solar Power Satellite (SPS) Development Flight Test Vehicle configured for fabrication and compatible with solar electric propulsion orbit transfer; (2) an Advanced Communications Platform configured for space fabrication and compatible with low thrust chemical orbit transfer propulsion; and (3) the same Platform, configured to be space erectable but still compatible with low thrust chemical orbit transfer propulsion. These project systems are intended to serve as configuration models for use in detailed analyses of space construction techniques and processes. They represent feasible concepts for real projects; real in the sense that they are realistic contenders on the list of candidate missions currently projected for the national space program. Thus, they represent reasonable configurations upon which to base early studies of alternative space construction processes.

  16. The Earth System Documentation (ES-DOC) Software Process

    NASA Astrophysics Data System (ADS)

    Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high-quality tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable, standards-based documentation ecosystem that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation ecosystem and currently supports the following projects: * Coupled Model Inter-comparison Project Phase 5 (CMIP5); * Dynamical Core Model Inter-comparison Project (DCMIP); * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC: * Iteratively develops and releases working software; * Captures user requirements via a narrative-based approach; * Uses online collaboration tools (e.g. Earth System CoG) to manage progress; * Prototypes applications to validate their feasibility; * Leverages meta-programming techniques where appropriate; * Automates testing whenever sensibly feasible; * Streamlines complex deployments to a single command; * Extensively leverages GitHub and Pivotal Tracker; * Enforces strict separation of the UI from underlying APIs; * Conducts code reviews.

  17. Using proxies to explore ensemble uncertainty in climate impact studies: the example of air pollution

    NASA Astrophysics Data System (ADS)

    Lemaire, V. E. P.; Colette, A.; Menut, L.

    2015-10-01

    Because of its sensitivity to unfavorable weather patterns, air pollution is sensitive to climate change, so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. However, the computing cost of such a method requires optimizing ensemble exploration techniques. By using a training dataset of deterministic projections of climate and air quality over Europe, we identified the main meteorological drivers of air quality for 8 regions in Europe and developed simple statistical models that could be used to predict air pollutant concentrations. The evolution of the key climate variables driving either particulate or gaseous pollution allows concluding on the robustness of the climate impact on air quality. The climate benefit for PM2.5 was confirmed: -0.96 (±0.18), -1.00 (±0.37), and -1.16 (±0.23) μg m-3 for Eastern Europe, Mid Europe, and Northern Italy, respectively. For the Eastern Europe, France, Iberian Peninsula, Mid Europe, and Northern Italy regions, a climate penalty on ozone was identified: 10.11 (±3.22), 8.23 (±2.06), 9.23 (±1.13), 6.41 (±2.14), and 7.43 (±2.02) μg m-3, respectively. This technique also allows selecting a subset of relevant regional climate model members that should be used in priority for future deterministic projections.
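
    A minimal version of such a proxy is an ordinary least-squares fit (a sketch under stated assumptions: the choice of meteorological drivers and the linear form are illustrative, not the paper's statistical models):

    ```python
    import numpy as np

    def fit_proxy(drivers, conc):
        """drivers: (n_days, n_vars) regional meteorology (e.g.,
        temperature, wind speed, boundary-layer height) from a training
        chemistry-transport run; conc: (n_days,) simulated pollutant."""
        X = np.hstack([drivers, np.ones((len(drivers), 1))])  # + intercept
        beta, *_ = np.linalg.lstsq(X, conc, rcond=None)
        return beta

    def apply_proxy(beta, drivers):
        """Predict concentrations from any climate member's drivers."""
        X = np.hstack([drivers, np.ones((len(drivers), 1))])
        return X @ beta
    ```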

  18. Parametric Cost Models for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtney

    2010-01-01

    Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model. In fact, published models vary greatly [1]. Thus, there is a need for parametric space telescope cost models. An effort is underway to develop single variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data and applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; technology development as a function of time reduces cost at the rate of 50% per 17 years; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and increasing mass reduces cost.
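
    The shape of such a CER is simple to write down (the leading constant and the diameter exponent below are illustrative placeholders, not the paper's fitted values; only the 50%-per-17-years factor is taken from the abstract):

    ```python
    def telescope_cost(aperture_m, year, k=100.0, b=1.3, ref_year=2010):
        """Rough cost [$M]: a power law in aperture diameter, halved for
        every 17 years of technology development after ref_year."""
        return k * aperture_m**b * 0.5 ** ((year - ref_year) / 17.0)

    print(telescope_cost(2.4, 2020))   # e.g., a 2.4 m telescope in 2020
    ```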

  19. Validation of New Wind Resource Maps

    NASA Astrophysics Data System (ADS)

    Elliott, D.; Schwartz, M.

    2002-05-01

    The National Renewable Energy Laboratory (NREL) recently led a project to validate updated state wind resource maps for the northwestern United States produced by a private U.S. company, TrueWind Solutions (TWS). The independent validation project was a cooperative activity among NREL, TWS, and meteorological consultants. The independent validation concept originated at a May 2001 technical workshop held at NREL to discuss updating the Wind Energy Resource Atlas of the United States. Part of the workshop, which included more than 20 attendees from the wind resource mapping and consulting community, was dedicated to reviewing the latest techniques for wind resource assessment. It became clear that using a numerical modeling approach for wind resource mapping was rapidly gaining ground as a preferred technique; if the trend continues, it will soon become the most widely used technique around the world. The numerical modeling approach is a relatively fast application compared to older mapping methods and, in theory, should be quite accurate because it directly estimates the magnitude of boundary-layer processes that affect the wind resource of a particular location. Numerical modeling output combined with high resolution terrain data can produce useful wind resource information at a resolution of 1 km or finer. However, because the use of the numerical modeling approach is relatively new (the last 3-5 years) and relatively unproven, meteorological consultants question the accuracy of the approach. It was clear that new state or regional wind maps produced by this method would have to undergo independent validation before the results would be accepted by the wind energy community and developers.

  20. Explicit treatment for Dirichlet, Neumann and Cauchy boundary conditions in POD-based reduction of groundwater models

    NASA Astrophysics Data System (ADS)

    Gosses, Moritz; Nowak, Wolfgang; Wöhling, Thomas

    2018-05-01

    In recent years, proper orthogonal decomposition (POD) has become a popular model reduction method in the field of groundwater modeling. It is used to mitigate the problem of long run times that are often associated with physically-based modeling of natural systems, especially for parameter estimation and uncertainty analysis. POD-based techniques reproduce groundwater head fields with sufficient accuracy for a variety of applications. However, no study has investigated how POD techniques affect the accuracy of different boundary conditions found in groundwater models. We show that the current treatment of boundary conditions in POD causes inaccuracies for these boundaries in the reduced models. We provide an improved method that splits the POD projection space into a subspace orthogonal to the boundary conditions and a separate subspace that enforces the boundary conditions. To test the method for Dirichlet, Neumann and Cauchy boundary conditions, four simple transient 1D-groundwater models, as well as a more complex 3D model, are set up and reduced both by standard POD and POD with the new extension. We show that, in contrast to standard POD, the new method satisfies both Dirichlet and Neumann boundary conditions. It can also be applied to Cauchy boundaries, where the flux error of standard POD is reduced by its head-independent contribution. The extension essentially shifts the focus of the projection towards the boundary conditions. Therefore, we see a slight trade-off between errors at model boundaries and overall accuracy of the reduced model. The proposed POD extension is recommended where exact treatment of boundary conditions is required.
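
    The proposed splitting can be sketched in a few lines of numpy (a hedged illustration of the idea, not the authors' code: B stands for head patterns spanning the boundary conditions, and the snapshot matrix is assumed given):

    ```python
    import numpy as np

    def split_pod_basis(snapshots, B, n_modes):
        """snapshots: (n_nodes, n_snaps) head fields; B: (n_nodes, n_bc)
        boundary-condition subspace. Returns [boundary | interior] basis."""
        Q, _ = np.linalg.qr(B)              # orthonormal boundary subspace
        # POD on the part of the snapshots orthogonal to the boundaries.
        residual = snapshots - Q @ (Q.T @ snapshots)
        U, s, _ = np.linalg.svd(residual, full_matrices=False)
        return np.hstack([Q, U[:, :n_modes]])
    ```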

  1. Projective techniques and the detection of child sexual abuse.

    PubMed

    Garb, H N; Wood, J M; Nezworski, M T

    2000-05-01

    Projective techniques (e.g., the Rorschach, Human Figure Drawings) are sometimes used to detect child sexual abuse. West recently conducted a meta-analysis on this topic, but she systematically excluded nonsignificant results. In this article, a reanalysis of her data is presented. The authors conclude that projective techniques should not be used to detect child sexual abuse. Many of the studies purportedly demonstrating validity are flawed, and none of the projective test scores have been well replicated.

  2. 3D Modelling with the Samsung Gear 360

    NASA Astrophysics Data System (ADS)

    Barazzetti, L.; Previtali, M.; Roncoroni, F.

    2017-02-01

    The Samsung Gear 360 is a consumer grade spherical camera able to capture photos and videos. The aim of this work is to test the metric accuracy and the level of detail achievable with the Samsung Gear 360 coupled with digital modelling techniques based on photogrammetry/computer vision algorithms. Results demonstrate that the direct use of the projection generated inside the mobile phone or with Gear 360 Action Director (the desktop software for post-processing) has a relatively low metric accuracy. As these results were in contrast with the accuracy achieved by using the original fisheye images (front and rear facing images) in photogrammetric reconstructions, an alternative solution to generate the equirectangular projections was developed. A calibration aimed at understanding the intrinsic parameters of the two-lens camera, as well as their relative orientation, allowed one to generate new equirectangular projections from which a significant improvement of geometric accuracy has been achieved.

  3. Near term climate projections for invasive species distributions

    USGS Publications Warehouse

    Jarnevich, C.S.; Stohlgren, T.J.

    2009-01-01

    Climate change and invasive species pose important conservation issues separately, and should be examined together. We used existing long term climate datasets for the US to project potential climate change into the future at a finer spatial and temporal resolution than the climate change scenarios generally available. These fine scale projections, along with new species distribution modeling techniques to forecast the potential extent of invasive species, can provide useful information to aid conservation and invasive species management efforts. We created habitat suitability maps for Pueraria montana (kudzu) under current climatic conditions and potential average conditions up to 30 years in the future. We examined how the potential distribution of this species will be affected by changing climate, and the management implications associated with these changes. Our models indicated that P. montana may increase its distribution particularly in the Northeast with climate change and may decrease in other areas. © 2008 Springer Science+Business Media B.V.

  4. Projecting climate change impacts on hydrology: the potential role of daily GCM output

    NASA Astrophysics Data System (ADS)

    Maurer, E. P.; Hidalgo, H. G.; Das, T.; Dettinger, M. D.; Cayan, D.

    2008-12-01

    A primary challenge facing resource managers in accommodating climate change is determining the range and uncertainty in regional and local climate projections. This is especially important for assessing changes in extreme events, which will drive many of the more severe impacts of a changed climate. Since global climate models (GCMs) produce output at a spatial scale incompatible with local impact assessment, different techniques have evolved to downscale GCM output so locally important climate features are expressed in the projections. We compared skill and hydrologic projections using two statistical downscaling methods and a distributed hydrology model. The downscaling methods are the constructed analogues (CA) and the bias correction and spatial downscaling (BCSD). CA uses daily GCM output, and can thus capture GCM projections for changing extreme event occurrence, while BCSD uses monthly output and statistically generates historical daily sequences. We evaluate the hydrologic impacts projected using downscaled climate (from the NCEP/NCAR reanalysis as a surrogate GCM) for the late 20th century with both methods, comparing skill in projecting soil moisture, snow pack, and streamflow at key locations in the Western United States. We include an assessment of a new method for correcting for GCM biases in a hybrid method combining the most important characteristics of both methods.
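
    The bias-correction step at the heart of BCSD is empirical quantile mapping, which fits in a few lines (a sketch only; the full BCSD method adds spatial downscaling and, for monthly output, daily disaggregation):

    ```python
    import numpy as np

    def quantile_map(sim_future, sim_hist, obs_hist):
        """Replace each simulated value by the observed value at the
        same empirical quantile of the historical distributions."""
        q = np.linspace(0.0, 1.0, 101)
        sim_q = np.quantile(sim_hist, q)       # simulated climatology
        obs_q = np.quantile(obs_hist, q)       # observed climatology
        ranks = np.interp(sim_future, sim_q, q)
        return np.interp(ranks, q, obs_q)
    ```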

  5. Certification of a hybrid parameter model of the fully flexible Shuttle Remote Manipulator System

    NASA Technical Reports Server (NTRS)

    Barhorst, Alan A.

    1995-01-01

    The development of high fidelity models of mechanical systems with flexible components is in flux. Many working models of these devices assume the elastic motion is small and can be superimposed on the overall rigid body motion. A drawback associated with this type of modeling technique is that the linear modal model of the device must be regenerated if the elastic motion departs sufficiently far from the base rigid motion. An advantage of this type of modeling is that it uses NASTRAN modal data, which is the NASA standard means of modal information exchange. A disadvantage of the linear modeling is that it fails to accurately represent large motion of the system unless constant modal updates are performed. In this study, which is a continuation of a project started last year, the drawback of the currently used modal snapshot modeling technique is addressed in a rigorous fashion by novel and easily applied means.

  6. A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data

    DOE PAGES

    Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...

    2016-01-01

    Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft2 for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
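
    The flavor of the Bayesian fusion can be conveyed with a toy conjugate update (this is not the PDT model: Gaussian likelihoods, a Gaussian prior, and the example numbers are all illustrative):

    ```python
    def fuse(prior_mean, prior_sd, estimates, sds):
        """Precision-weighted fusion of independent occupancy estimates
        (survey, expert judgment, remote sensing) with a prior; returns
        the posterior mean and standard deviation."""
        weights = [1.0 / prior_sd**2] + [1.0 / s**2 for s in sds]
        values = [prior_mean] + list(estimates)
        precision = sum(weights)
        mean = sum(w * v for w, v in zip(weights, values)) / precision
        return mean, precision ** -0.5

    # e.g., ambient occupancy in people per 1000 ft^2 for one building type
    print(fuse(2.0, 1.0, [2.8, 1.9], [0.5, 0.8]))
    ```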

  7. Multiscale Mathematics for Biomass Conversion to Renewable Hydrogen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plechac, Petr

    2016-03-01

    The overall objective of this project was to develop multiscale models for understanding and eventually designing complex processes for renewables. To the best of our knowledge, our work is the first attempt at modeling complex reacting systems whose performance relies on underlying multiscale mathematics, and at developing rigorous mathematical techniques and computational algorithms to study such models. Our specific application lies at the heart of the biofuels initiatives of DOE and entails modeling of catalytic systems, to enable economic, environmentally benign, and efficient conversion of biomass into either hydrogen or valuable chemicals.

  8. GEOCAB Portal: A gateway for discovering and accessing capacity building resources in Earth Observation

    NASA Astrophysics Data System (ADS)

    Desconnets, Jean-Christophe; Giuliani, Gregory; Guigoz, Yaniss; Lacroix, Pierre; Mlisa, Andiswa; Noort, Mark; Ray, Nicolas; Searby, Nancy D.

    2017-02-01

    The discovery of and access to capacity building resources are often essential to conduct environmental projects based on Earth Observation (EO) resources, whether they are Earth Observation products, methodological tools, techniques, organizations that impart training in these techniques, or even projects that have shown practical achievements. Recognizing this opportunity and need, the European Commission, through two FP7 projects, jointly with the Group on Earth Observations (GEO), teamed up with the Committee on Earth Observation Satellites (CEOS). The Global Earth Observation CApacity Building (GEOCAB) portal aims at compiling all current capacity building efforts on the use of EO data for societal benefits into an easily updateable and user-friendly portal. GEOCAB offers a faceted search to improve the user discovery experience, with a fully interactive world map of all inventoried projects and activities. This paper focuses on the conceptual framework used to implement the underlying platform. An ISO 19115 metadata model and an associated terminological repository are the core elements that provide a semantic search application and an interoperable discovery service. The organization and the contribution of different user communities to ensure the management and the update of the content of GEOCAB are addressed.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robert P. Lucht

    Laser-induced polarization spectroscopy (LIPS), degenerate four-wave mixing (DFWM), and electronic-resonance-enhanced (ERE) coherent anti-Stokes Raman scattering (CARS) are techniques that show great promise for sensitive measurements of transient gas-phase species, and diagnostic applications of these techniques are being pursued actively at laboratories throughout the world. However, significant questions remain regarding strategies for quantitative concentration measurements using these techniques. The primary objective of this research program is to develop and test strategies for quantitative concentration measurements in flames and plasmas using these nonlinear optical techniques. Theoretically, we are investigating the physics of these processes by direct numerical integration (DNI) of the time-dependent density matrix equations that describe the wave-mixing interaction. Significantly fewer restrictive assumptions are required when the density matrix equations are solved using this DNI approach compared with the assumptions required to obtain analytical solutions. For example, for LIPS calculations, the Zeeman state structure and hyperfine structure of the resonance and effects such as Doppler broadening can be included. There is no restriction on the intensity of the pump and probe beams in these nonperturbative calculations, and both the pump and probe beam intensities can be high enough to saturate the resonance. As computer processing speeds have increased, we have incorporated more complicated physical models into our DNI codes. During the last project period we developed numerical methods for nonperturbative calculations of the two-photon absorption process. Experimentally, diagnostic techniques are developed and demonstrated in gas cells and/or well-characterized flames for ease of comparison with model results. The techniques of two-photon, two-color H-atom LIPS and three-laser ERE CARS for NO and C2H2 were demonstrated during the project period, and nonperturbative numerical models of both of these techniques were developed. In addition, we developed new single-mode, injection-seeded optical parametric laser sources (OPLSs) that will be used to replace multi-mode commercial dye lasers in our experimental measurements. The use of single-mode laser radiation in our experiments will increase significantly the rigor with which theory and experiment are compared.

  10. COSP: Satellite simulation software for model assessment

    DOE PAGES

    Bodas-Salcedo, A.; Webb, M. J.; Bony, S.; ...

    2011-08-01

    Errors in the simulation of clouds in general circulation models (GCMs) remain a long-standing issue in climate projections, as discussed in the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report. This highlights the need for developing new analysis techniques to improve our knowledge of the physical processes at the root of these errors. The Cloud Feedback Model Intercomparison Project (CFMIP) pursues this objective, and under that framework the CFMIP Observation Simulator Package (COSP) has been developed. COSP is a flexible software tool that enables the simulation of several satellite-borne active and passive sensor observations from model variables. The flexibility of COSP and a common interface for all sensors facilitate its use in any type of numerical model, from high-resolution cloud-resolving models to the coarser-resolution GCMs assessed by the IPCC, and the scales in between used in weather forecast and regional models. The diversity of model parameterization techniques makes the comparison between model and observations difficult, as some parameterized variables (e.g., cloud fraction) do not have the same meaning in all models. The approach followed in COSP permits models to be evaluated against observations and compared against each other in a more consistent manner. This permits a more detailed diagnosis of the physical processes that govern the behavior of clouds and precipitation in numerical models. The World Climate Research Programme (WCRP) Working Group on Coupled Modelling has recommended the use of COSP in a subset of climate experiments that will be assessed by the next IPCC report. Here we describe COSP, present some results from its application to numerical models, and discuss future work that will expand its capabilities.

  11. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is: what new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods? Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based in hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. The parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.
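
    One distribution-free tool of the kind contemplated here is the Kaplan-Meier estimator (a generic sketch, not the report's method; one event per recorded time is assumed for simplicity):

    ```python
    import numpy as np

    def kaplan_meier(times, failed):
        """times: run durations; failed: 1 if the run ended in a failure,
        0 if it was censored (ended failure-free). Returns (t, S(t)),
        the nonparametric survival (no-failure) curve."""
        order = np.argsort(times)
        t = np.asarray(times, float)[order]
        d = np.asarray(failed, float)[order]
        at_risk = len(t) - np.arange(len(t))   # runs still under test
        return t, np.cumprod(1.0 - d / at_risk)
    ```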

  12. The URban Greenhouse gas Emissions assessment through inverse modeling (URGE) project: a pilot study in the Oslo area

    NASA Astrophysics Data System (ADS)

    Pisso, I. J.; Lopez-Aparicio, S.; Schneider, P.; Schmidbauer, N.; Vogt, M.

    2017-12-01

    Norway has set the target of cutting greenhouse gas (GHG) emissions by at least 40% compared to 1990 levels by 2030. This goal will require the implementation of policy measures aiming at strong reductions of GHG emissions, especially in the urban environment. The implementation of urban policy measures is still a challenging task and it requires control and verification for success. The URGE project aims at assessing the emission flux of GHGs including comprehensive uncertainty estimates based on inverse transport modelling techniques and optimized use of measurements. The final goal is to establish a coherent and consistent GHG urban emission inventory. This will be carried out in a case study in Oslo (Norway), where CO2 will be the priority compound. The overall outcome of the project will provide support in the development of strategies to effectively reduce GHG emissions in the urban environment. The overall goal will be reached through establishing the baseline urban CO2 emission inventory for Oslo; determining the optimal measurement locations based on transport modelling (with FLEXPART-WRF); designing and carrying out a pilot measurement campaign of the CO2-rich air downwind of the city plume combining state-of-the-art instruments (Picarro) and small sensors; assessing the feasibility of determining the background concentration surrounding the city with satellite measurements (OCO-2); and providing optimised estimates of the emissions and their uncertainties via inverse modelling (source-receptor relationship). One of our main interests is the interoperability and exchange of information with similar activities in other urban areas. We will present the overall project and the preliminary results of the network design. We will discuss the data exchange formats, the algorithms and data structures that could be used for results and methodology intercomparisons as well as the suitability to apply the same techniques to other atmospheric compounds.

  13. A Biomechanical Modeling Guided CBCT Estimation Technique

    PubMed Central

    Zhang, You; Tehrani, Joubin Nasehi; Wang, Jing

    2017-01-01

    Two-dimensional-to-three-dimensional (2D-3D) deformation has emerged as a new technique to estimate cone-beam computed tomography (CBCT) images. The technique is based on deforming a prior high-quality 3D CT/CBCT image to form a new CBCT image, guided by limited-view 2D projections. The accuracy of this intensity-based technique, however, is often limited in low-contrast image regions with subtle intensity differences. The solved deformation vector fields (DVFs) can also be biomechanically unrealistic. To address these problems, we have developed a biomechanical modeling guided CBCT estimation technique (Bio-CBCT-est) by combining 2D-3D deformation with finite element analysis (FEA)-based biomechanical modeling of anatomical structures. Specifically, Bio-CBCT-est first extracts the 2D-3D deformation-generated displacement vectors at the high-contrast anatomical structure boundaries. The extracted surface deformation fields are subsequently used as the boundary conditions to drive structure-based FEA to correct and fine-tune the overall deformation fields, especially those at low-contrast regions within the structure. The resulting FEA-corrected deformation fields are then fed back into 2D-3D deformation to form an iterative loop, combining the benefits of intensity-based deformation and biomechanical modeling for CBCT estimation. Using eleven lung cancer patient cases, the accuracy of the Bio-CBCT-est technique has been compared to that of the 2D-3D deformation technique and the traditional CBCT reconstruction techniques. The accuracy was evaluated in the image domain, and also in the DVF domain through clinician-tracked lung landmarks. PMID:27831866

  14. The role of global cloud climatologies in validating numerical models

    NASA Technical Reports Server (NTRS)

    HARSHVARDHAN

    1991-01-01

    The net upward longwave surface radiation is exceedingly difficult to measure from space. A hybrid method using General Circulation Model (GCM) simulations and satellite data from the Earth Radiation Budget Experiment (ERBE) and the International Satellite Cloud Climatology Project (ISCCP) was used to produce global maps of this quantity over oceanic areas. An advantage of this technique is that no independent knowledge or assumptions regarding cloud cover for a particular month are required. The only information required is a relationship between the cloud radiation forcing (CRF) at the top of the atmosphere and that at the surface, which is obtained from the GCM simulation. A flow diagram of the technique and results are given.
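
    The hybrid step itself reduces to fitting and applying a CRF-to-CRF relationship (a sketch assuming a linear relation for illustration; the GCM-derived relationship in the study need not be linear):

    ```python
    import numpy as np

    def surface_lw_crf(crf_toa_obs, crf_toa_gcm, crf_sfc_gcm):
        """Fit surface-vs-TOA longwave cloud radiative forcing in the
        GCM, then apply the relation to satellite (ERBE) TOA CRF."""
        a, b = np.polyfit(crf_toa_gcm, crf_sfc_gcm, 1)
        return a * crf_toa_obs + b
    ```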

  15. Creation of Synthetic Surface Temperature and Precipitation Ensembles Through A Computationally Efficient, Mixed Method Approach

    NASA Astrophysics Data System (ADS)

    Hartin, C.; Lynch, C.; Kravitz, B.; Link, R. P.; Bond-Lamberty, B. P.

    2017-12-01

    Typically, uncertainty quantification of internal variability relies on large ensembles of climate model runs under multiple forcing scenarios or perturbations in a parameter space. Computationally efficient, standard pattern scaling techniques only generate one realization and do not capture the complicated dynamics of the climate system (i.e., stochastic variations with a frequency-domain structure). In this study, we generate large ensembles of climate data with spatially and temporally coherent variability across a subselection of Coupled Model Intercomparison Project Phase 5 (CMIP5) models. First, for each CMIP5 model we apply a pattern emulation approach to derive the model response to external forcing. We take all the spatial and temporal variability that isn't explained by the emulator and decompose it into non-physically based structures through use of empirical orthogonal functions (EOFs). Then, we perform a Fourier decomposition of the EOF projection coefficients to capture the input fields' temporal autocorrelation so that our new emulated patterns reproduce the proper timescales of climate response and "memory" in the climate system. Through this 3-step process, we derive computationally efficient climate projections consistent with CMIP5 model trends and modes of variability, which address a number of deficiencies inherent in the ability of pattern scaling to reproduce complex climate model behavior.
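
    A compact sketch of steps (2) and (3) of this mixed method (illustrative only: the forced response is assumed to have been removed already, and the EOF truncation is an arbitrary choice):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def synthetic_members(residual, n_members, n_eofs=10):
        """residual: (n_time, n_space) variability left after removing
        the emulated forced response. Each synthetic member keeps the
        EOF spatial patterns and the Fourier amplitudes (hence the
        timescales) of the PC series, but randomizes the phases."""
        U, s, Vt = np.linalg.svd(residual, full_matrices=False)
        pcs = U[:, :n_eofs] * s[:n_eofs]            # PC time series
        members = []
        for _ in range(n_members):
            spec = np.fft.rfft(pcs, axis=0)
            phase = np.exp(2j * np.pi * rng.random(spec.shape))
            phase[0] = 1.0                          # keep the mean
            new_pcs = np.fft.irfft(spec * phase, n=pcs.shape[0], axis=0)
            members.append(new_pcs @ Vt[:n_eofs])
        return np.stack(members)                    # (members, time, space)
    ```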

  16. Using the scenario method in the context of health and health care--a scoping review.

    PubMed

    Vollmar, Horst Christian; Ostermann, Thomas; Redaèlli, Marcus

    2015-10-16

    The scenario technique is a method for futures research and for strategic planning. Today, it includes both qualitative and quantitative elements. The aims of this scoping review are to give an overview of the application of the scenario method in the field of health care and to make suggestions for better reporting in future scenario projects. Between January 2013 and October 2013 we conducted a systematic search in the databases Medline, Embase, PsycInfo, Eric, The Cochrane Library, Scopus, Web of Science, and Cinahl, from inception, for the term 'scenario(s)' in combination with other terms, e.g. method, model, and technique. Our search was not restricted by date or language. In addition, we screened the reference lists of the included articles. A total of 576 bibliographical records were screened. After removing duplicates and three rounds of screening, 41 articles covering 38 different scenario projects were included in the final analysis. Nine of the included articles addressed disease-related issues, led by mental health and dementia (n = 4), followed by cancer (n = 3). Five scenario projects focused on public health issues at an organizational level and five focused on the labor market for different health care professionals. In addition, four projects dealt with health care 'in general', four with the field of biotechnology and personalized medicine, and a further four with other technology developments. Some of the scenario projects suffered from poor reporting of methodological aspects. Despite its potential, the scenario method appears to be rarely reported in the literature in comparison with other methods such as the Delphi technique, at least in the field of health care. This might be due to the complexity of the methodological approach. Individual project methods and activities vary widely and are poorly reported. Improved criteria are required for the reporting of scenario project methods. With improved standards and greater transparency, the scenario method will be a good tool for scientific health care planning and strategic decision-making in public health.

  17. Modeling to Improve the Risk Reduction Process for Command File Errors

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Bryant, Larry; Waggoner, Bruce

    2013-01-01

    The Jet Propulsion Laboratory has learned that even innocuous errors in the spacecraft command process can have significantly detrimental effects on a space mission. Consequently, such Command File Errors (CFE), regardless of their effect on the spacecraft, are treated as significant events for which a root cause is identified and corrected. A CFE during space mission operations is often the symptom of imbalance or inadequacy within the system that encompasses the hardware and software used for command generation as well as the human experts and processes involved in this endeavor. As we move into an era of increased collaboration with other NASA centers and commercial partners, these systems become more and more complex. Consequently, the ability to thoroughly model and analyze CFEs formally in order to reduce the risk they pose is increasingly important. In this paper, we summarize the results of applying previously developed modeling techniques to the DAWN flight project. The original models were built with the input of subject matter experts from several flight projects. We have now customized these models to address specific questions for the DAWN flight project and formulated use cases to address its unique mission needs. The goal of this effort is to enhance the project's ability to meet commanding reliability requirements for operations and to assist it in managing its Command File Errors.

  18. Forecasting the effects of coastal protection and restoration projects on wetland morphology in coastal Louisiana under multiple environmental uncertainty scenarios

    USGS Publications Warehouse

    Couvillion, Brady R.; Steyer, Gregory D.; Wang, Hongqing; Beck, Holly J.; Rybczyk, John M.

    2013-01-01

    Few landscape scale models have assessed the effects of coastal protection and restoration projects on wetland morphology while taking into account important uncertainties in environmental factors such as sea-level rise (SLR) and subsidence. In support of Louisiana's 2012 Coastal Master Plan, we developed a spatially explicit wetland morphology model and coupled it with other predictive models. The model is capable of predicting effects of protection and restoration projects on wetland area, landscape configuration, surface elevation, and soil organic carbon (SOC) storage under multiple environmental uncertainty scenarios. These uncertainty scenarios included variability in parameters such as eustatic SLR (ESLR), subsidence rate, and Mississippi River discharge. Models were run for a 2010–2060 simulation period. Model results suggest that under a “future-without-action” condition (FWOA), coastal Louisiana is at risk of losing between 2118 and 4677 km2 of land over the next 50 years, but with protection and restoration projects proposed in the Master Plan, between 40% and 75% of that loss could be mitigated. Moreover, model results indicate that under a FWOA condition, SOC storage (to a depth of 1 m) could decrease by between 108 and 250 million metric tons, a loss of 12% to 30% of the total coastwide SOC, but with the Master Plan implemented, between 35% and 74% of the SOC loss could be offset. Long-term maintenance of project effects was best attained in areas of low SLR and subsidence, with a sediment source to support marsh accretion. Our findings suggest that despite the efficacy of restoration projects in mitigating losses in certain areas, net loss of wetlands in coastal Louisiana is likely to continue. Model results suggest certain areas may eventually be lost regardless of proposed restoration investment, and, as such, other techniques and strategies of adaptation may have to be utilized in these areas.

  19. Spatial analysis on future housing markets: economic development and housing implications.

    PubMed

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heroux, Michael Allen; Marker, Bryan

    This report summarizes the progress made as part of a one-year Laboratory Directed Research and Development (LDRD) project to fund the research efforts of Bryan Marker at the University of Texas at Austin. The goal of the project was to develop new techniques for automatically tuning the performance of dense linear algebra kernels. These kernels often account for the majority of computational time in an application. The primary outcome of this work is a demonstration of the value of model-driven engineering as an approach to accurately predict and study performance trade-offs for dense linear algebra computations.

  1. A general software reliability process simulation technique

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.
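
    As a hedged toy illustration of what such a simulator produces (parameters and structure invented here, far simpler than the generalized process described above), one can inject anomalies while work products are generated and then draw discovery events during testing to obtain a status timeline:

        import random

        random.seed(0)
        INJECT_RATE = 0.02   # anomalies injected per unit of work product (assumed)
        WORK_UNITS = 5000
        DETECT_P = 0.15      # chance one test interval catches a live anomaly (assumed)
        TESTS = 60

        injected = sum(random.random() < INJECT_RATE for _ in range(WORK_UNITS))
        open_anomalies, timeline = injected, []
        for t in range(TESTS):
            found = sum(random.random() < DETECT_P for _ in range(open_anomalies))
            open_anomalies -= found           # found anomalies move to repair/retest
            timeline.append((t, found, open_anomalies))
        print(injected, timeline[-1])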

  2. Spatial Analysis on Future Housing Markets: Economic Development and Housing Implications

    PubMed Central

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand. PMID:24892097

  3. Finite state projection based bounds to compare chemical master equation models using single-cell data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, Zachary; Neuert, Gregor; Department of Pharmacology, School of Medicine, Vanderbilt University, Nashville, Tennessee 37232

    2016-08-21

    Emerging techniques now allow for precise quantification of distributions of biological molecules in single cells. These rapidly advancing experimental methods have created a need for more rigorous and efficient modeling tools. Here, we derive new bounds on the likelihood that observations of single-cell, single-molecule responses come from a discrete stochastic model, posed in the form of the chemical master equation. These strict upper and lower bounds are based on a finite state projection approach, and they converge monotonically to the exact likelihood value. These bounds allow one to discriminate rigorously between models with a minimum level of computational effort. In practice, these bounds can be incorporated into stochastic model identification and parameter inference routines, which improve the accuracy and efficiency of endeavors to analyze and predict single-cell behavior. We demonstrate the applicability of our approach using simulated data for three example models as well as for experimental measurements of a time-varying stochastic transcriptional response in yeast.
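
    The finite state projection idea itself can be illustrated with a toy birth-death (production/degradation) model: truncate the chemical master equation's state space at N molecules and integrate dp/dt = Ap, where the probability mass that leaks out of the truncation certifies the approximation error. This generic sketch (arbitrary rates, not the paper's bounds derivation) runs with NumPy and SciPy:

        import numpy as np
        from scipy.integrate import solve_ivp

        k_prod, k_deg, N = 10.0, 1.0, 60       # rates and truncation size (arbitrary)
        A = np.zeros((N + 1, N + 1))           # truncated CME generator
        for n in range(N + 1):
            A[n, n] -= k_prod                  # production leaves state n...
            if n < N:
                A[n + 1, n] += k_prod          # ...landing in n+1 if inside the projection
            if n > 0:
                A[n - 1, n] += k_deg * n       # degradation: n -> n-1
                A[n, n] -= k_deg * n

        p0 = np.zeros(N + 1)
        p0[0] = 1.0                            # start with zero molecules
        sol = solve_ivp(lambda t, p: A @ p, (0.0, 5.0), p0, rtol=1e-8, atol=1e-12)
        p = sol.y[:, -1]
        print(p.sum(), 1.0 - p.sum())          # 1 - sum(p): FSP truncation error bound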

  4. Applications of remote sensing to estuarine management. [environmental surveys of the Chesapeake Bay (U.S.)

    NASA Technical Reports Server (NTRS)

    Munday, J. C., Jr.; Gordon, H. H.; Welch, C. S.; Williams, G.

    1976-01-01

    Projects for sewage outfall siting for pollution control in the lower Chesapeake Bay wetlands are reported. A dye-buoy/photogrammetry and remote sensing technique was employed to gather circulation data used in outfall siting. This technique is greatly favored over alternate methods because it is inexpensive, produces results quickly, and reveals Lagrangian current paths which are preferred in making siting decisions. Wetlands data were obtained by interpretation of color and color infrared photographic imagery from several altitudes. Historical sequences of photographs are shown that were used to document wetlands changes. Sequential infrared photography of inlet basins was employed to determine tidal prisms, which were input to mathematical models to be used by state agencies in pollution control. A direct and crucial link between remote sensing and management decisions was demonstrated in the various projects.

  5. Simulation of generation of new ideas for new product development and IT services

    NASA Astrophysics Data System (ADS)

    Nasiopoulos, Dimitrios K.; Sakas, Damianos P.; Vlachos, D. S.; Mavrogianni, Amanda

    2015-02-01

    This paper describes a dynamic model of the New Product Development (NPD) process. The model emerged from best practices observed in our research across a range of situations. It helps to identify an IT company's NPD activities and place them within the frame of the overall NPD process[1]. It has been found to be a useful tool for organizing data on an IT company's NPD activities without imposing an excessively restrictive research methodology on the NPD model. The framework that underpins the model helps promote research into the methods undertaken within an IT company's NPD process, thus promoting understanding and improvement of the simulation process[2]. IT companies have tested many techniques and several different practices designed to improve the validity and efficacy of their NPD process[3]. Supported by the model, this research examines how widely accepted the stated tactics are and what impact these tactics have on NPD performance. The main assumption of this study is that simulating the generation of new ideas[4] will lead to greater NPD effectiveness and more successful products in IT companies. In the model's implementation, practices concerning NPD implementation strategies (product selection, objectives, leadership, marketing strategy and customer satisfaction) are all more widely accepted than best practices related to controlling the application of NPD (process control, measurements, results). In linking simulation with impact, our results state that product success depends on developing strong products and ensuring organizational emphasis through proper project selection. Project activities strengthen both product and project success. The success of IT products and services also depends on monitoring the NPD procedure through project management and ensuring team consistency through group rewards. Sharing experiences between projects can positively influence the NPD process.

  6. Advanced Modeling Techniques to Study Anthropogenic Influences on Atmospheric Chemical Budgets

    NASA Technical Reports Server (NTRS)

    Mathur, Rohit

    1997-01-01

    This research work is a collaborative effort between research groups at MCNC and the University of North Carolina at Chapel Hill. The overall objective of this research is to improve the level of understanding of the processes that determine the budgets of chemically and radiatively active compounds in the atmosphere through development and application of advanced methods for calculating the chemical change in atmospheric models. The research performed during the second year of this project focused on four major aspects: (1) The continued development and refinement of multiscale modeling techniques to address the issue of the disparate scales of the physico-chemical processes that govern the fate of atmospheric pollutants; (2) Development and application of analysis methods utilizing process and mass balance techniques to increase the interpretive powers of atmospheric models and to aid in complementary analysis of model predictions and observations; (3) Development of meteorological and emission inputs for initial application of the chemistry/transport model over the north Atlantic region; and, (4) The continued development and implementation of a totally new adaptive chemistry representation that changes the details of what is represented as the underlying conditions change.

  7. The Effects of Climate Model Similarity on Local, Risk-Based Adaptation Planning

    NASA Astrophysics Data System (ADS)

    Steinschneider, S.; Brown, C. M.

    2014-12-01

    The climate science community has recently proposed techniques to develop probabilistic projections of climate change from ensemble climate model output. These methods provide a means to incorporate the formal concept of risk, i.e., the product of impact and probability, into long-term planning assessments for local systems under climate change. However, approaches for developing probability density functions (pdfs) often assume that different climate models provide independent information for the estimation of probabilities, despite model similarities that stem from a common genealogy. Here we utilize an ensemble of projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5) to develop probabilistic climate information, with and without an accounting of inter-model correlations, and use it to estimate climate-related risks to a local water utility in Colorado, U.S. We show that the tail risk of extreme climate changes in both mean precipitation and temperature is underestimated if model correlations are ignored. When coupled with impact models of the hydrology and infrastructure of the water utility, the underestimation of extreme climate changes substantially alters the quantification of risk for water supply shortages by mid-century. We argue that progress in climate change adaptation for local systems requires the recognition that there is less information in multi-model climate ensembles than previously thought. Importantly, adaptation decisions cannot be limited to the spread in one generation of climate models.

  8. The Monitoring, Detection, Isolation and Assessment of Information Warfare Attacks Through Multi-Level, Multi-Scale System Modeling and Model Based Technology

    DTIC Science & Technology

    2004-01-01

    [Fragmentary indexing excerpt; the full report is not reproduced. Recoverable fragments: matching the login identity to the one under which a system call is executed; the parameters of the system call execution, including file names with full paths...; a comparison of intrusion detection systems listing COAST-EIMDT (distributed on target hosts) and EMERALD (distributed on target hosts and security servers), with signature recognition and anomaly detection approaches...; one surveyed system uses a centralized architecture and employs an anomaly detection technique for intrusion detection; the EMERALD project [80] proposes a...]

  9. Applications of Evolutionary Technology to Manufacturing and Logistics Systems : State-of-the Art Survey

    NASA Astrophysics Data System (ADS)

    Gen, Mitsuo; Lin, Lin

    Many combinatorial optimization problems arising in real-world industrial engineering and operations research are very complex in nature and quite hard to solve with conventional techniques. Since the 1960s, there has been increasing interest in imitating living beings to solve such hard combinatorial optimization problems. Simulating the natural evolutionary process produces stochastic optimization techniques called evolutionary algorithms (EAs), which can often outperform conventional optimization methods when applied to difficult real-world problems. In this survey paper, we provide a comprehensive survey of the current state-of-the-art in the use of EAs in manufacturing and logistics systems. To demonstrate that EAs are powerful and broadly applicable stochastic search and optimization techniques, we deal with the following engineering design problems: transportation planning models, layout design models and two-stage logistics models in logistics systems, and job-shop scheduling and resource-constrained project scheduling in manufacturing systems.
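
    For readers unfamiliar with EAs, the following minimal example (ours, on a toy 'one-max' objective) shows the selection-crossover-mutation loop common to the surveyed methods; real manufacturing and logistics applications would replace the bit string and fitness function with schedule, route or layout encodings:

        import random

        random.seed(0)
        L, POP, GENS = 40, 30, 100
        MUT = 1.0 / L                        # per-gene mutation probability

        def fitness(ind):                    # toy objective: count of ones
            return sum(ind)

        pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(POP)]
        for _ in range(GENS):
            # binary tournament selection
            parents = [max(random.sample(pop, 2), key=fitness) for _ in range(POP)]
            nxt = []
            for a, b in zip(parents[::2], parents[1::2]):
                cut = random.randrange(1, L)     # one-point crossover
                for child in (a[:cut] + b[cut:], b[:cut] + a[cut:]):
                    nxt.append([g ^ (random.random() < MUT) for g in child])
            pop = nxt
        print(fitness(max(pop, key=fitness)))    # best fitness after evolution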

  10. Evaluation of near surface ozone and particulate matter in air quality simulations driven by dynamically downscaled historical meteorological fields

    EPA Science Inventory

    In this study, techniques typically used for future air quality projections are applied to a historical 11-year period to assess the performance of the modeling system when the driving meteorological conditions are obtained using dynamical downscaling of coarse-scale fields witho...

  11. Using Aluminum Foil to Record Structures in Sedimentary Rock.

    ERIC Educational Resources Information Center

    Metz, Robert

    1982-01-01

    Aluminum foil can be used to make impressions of structures preserved in sedimentary rock. The impressions can be projected onto a screen or photographed, or a plaster of Paris model can be made from them. Impressions of ripple marks, mudcracks, and raindrops are provided in photographs illustrating the technique. (Author/JN)

  12. Project Eagle: Techniques for Multi-Family Psycho-Educational Group Therapy with Gifted American Indian Adolescents and Their Parents.

    ERIC Educational Resources Information Center

    Robbins, Rockey; Tonemah, Stuart; Robbins, Sharla

    2002-01-01

    A culturally relevant group therapy model for gifted American Indian students and their parents uses non-didactic facilitation to focus on cultural identity, play, self-disclosure, parental involvement, silence, cognitive processing, emotional expression, and social responsibility. Evaluation results indicate the program builds self-esteem, pride…

  13. An Application of Fuzzy Analytic Hierarchy Process (FAHP) for Evaluating Students' Project

    ERIC Educational Resources Information Center

    Çebi, Ayça; Karal, Hasan

    2017-01-01

    In recent years, artificial intelligence applications for understanding the human thinking process and transferring it to virtual environments come into prominence. The fuzzy logic which paves the way for modeling human behaviors and expressing even vague concepts mathematically, and is also regarded as an artificial intelligence technique has…

  14. World-Wide Web: The Information Universe.

    ERIC Educational Resources Information Center

    Berners-Lee, Tim; And Others

    1992-01-01

    Describes the World-Wide Web (W3) project, which is designed to create a global information universe using techniques of hypertext, information retrieval, and wide area networking. Discussion covers the W3 data model, W3 architecture, the document naming scheme, protocols, document formats, comparison with other systems, experience with the W3…

  15. A tomographic technique for the simultaneous imaging of temperature, chemical species, and pressure in reactive flows using absorption spectroscopy with frequency-agile lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Weiwei; Kaminski, Clemens F., E-mail: cfk23@cam.ac.uk

    2014-01-20

    This paper proposes a technique that can simultaneously retrieve distributions of temperature, concentration of chemical species, and pressure based on broad bandwidth, frequency-agile tomographic absorption spectroscopy. The technique holds particular promise for the study of dynamic combusting flows. A proof-of-concept numerical demonstration is presented, using representative phantoms to model conditions typically prevailing in near-atmospheric or high pressure flames. The simulations reveal both the feasibility of the proposed technique and its robustness. Our calculations indicate precisions of ∼70 K at flame temperatures and ∼0.05 bars at high pressure from reconstructions featuring as much as 5% Gaussian noise in the projections.
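
    A generic linear-tomography sketch conveys the inversion step: once absorbance measurements are written as path integrals y = Ax, a regularized least-squares solve recovers the unknown field. Everything below (matrix, phantom, 5% noise level, Tikhonov weight) is an assumed toy setup, not the paper's frequency-agile retrieval:

        import numpy as np

        rng = np.random.default_rng(5)
        n_pix, n_beams = 100, 60
        A = rng.random((n_beams, n_pix))        # toy path-length (projection) matrix
        x_true = rng.random(n_pix)              # toy phantom (e.g., absorbance field)
        y = A @ x_true
        y += 0.05 * np.abs(y) * rng.standard_normal(n_beams)   # ~5% Gaussian noise

        lam = 1.0                               # Tikhonov regularization weight
        x_rec = np.linalg.solve(A.T @ A + lam * np.eye(n_pix), A.T @ y)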

  16. Rainfall assimilation in RAMS by means of the Kuo parameterisation inversion: method and preliminary results

    NASA Astrophysics Data System (ADS)

    Orlandi, A.; Ortolani, A.; Meneguzzo, F.; Levizzani, V.; Torricella, F.; Turk, F. J.

    2004-03-01

    In order to improve high-resolution forecasts, a specific method for assimilating rainfall rates into the Regional Atmospheric Modelling System (RAMS) model has been developed. It is based on the inversion of the Kuo convective parameterisation scheme. A nudging technique is applied to 'gently' increase the weight of the estimated precipitation in the assimilation process over time. A rough but manageable technique for estimating the partition between convective and stratiform precipitation, without requiring any ancillary measurements, is explained. The method is general purpose, but it is tuned for the assimilation of geostationary satellite rainfall estimates. Preliminary results are presented and discussed, both from fully simulated experiments and from experiments assimilating real satellite-based precipitation observations. For every case study, rainfall data are computed with a rapid-update satellite precipitation estimation algorithm based on IR and MW satellite observations. This research was carried out in the framework of the EURAINSAT project (an EC research project co-funded by the Energy, Environment and Sustainable Development Programme within the topic 'Development of generic Earth observation technologies', contract number EVG1-2000-00030).
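
    The nudging idea can be reduced to a one-line relaxation term. In this toy sketch (all numbers illustrative, and a scalar standing in for a model field), the weight on the precipitation-derived target is ramped up 'gently' with time:

        dt, tau = 60.0, 3600.0              # time step and relaxation timescale (s)
        t_ramp = 6 * 3600.0                 # ramp-up period for the nudging weight (s)
        x, x_obs = 0.0, 1.0                 # model state and observation-derived target

        for step in range(int(t_ramp / dt)):
            w = min(step * dt / t_ramp, 1.0)    # weight increases gently with time
            x += -w * (x - x_obs) * dt / tau    # Newtonian relaxation (nudging) term
        print(x)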

  17. Statistical bias correction method applied on CMIP5 datasets over the Indian region during the summer monsoon season for climate change applications

    NASA Astrophysics Data System (ADS)

    Prasanna, V.

    2018-01-01

    This study makes use of temperature and precipitation from CMIP5 climate model output for climate change application studies over the Indian region during the summer monsoon season (JJAS). Bias correction of temperature and precipitation from CMIP5 GCM simulation results with respect to observations is discussed in detail. Non-linear statistical bias correction is a suitable bias correction method for climate change data because it is simple and does not add artificial uncertainty to the impact assessment of climate change scenarios for climate change application studies (agricultural production changes) in the future. The simple statistical bias correction uses observational constraints on the GCM baseline, and the projected results are scaled with respect to the magnitude of change in future scenarios, which varies from one model to another. Two types of bias correction techniques are shown here: (1) a simple bias correction using a percentile-based quantile-mapping algorithm and (2) a simple but improved bias correction method, a cumulative distribution function (CDF; Weibull distribution function)-based quantile-mapping algorithm. This study shows that the percentile-based quantile-mapping method gives results similar to those of the CDF (Weibull)-based quantile-mapping method, and the two methods are comparable. The bias correction is applied to temperature and precipitation variables for the present climate and for future projected data, for use in a simple statistical model to understand future changes in crop production over the Indian region during the summer monsoon season. In total, 12 CMIP5 models are used for the Historical (1901-2005), RCP4.5 (2005-2100), and RCP8.5 (2005-2100) scenarios. The climate index from each CMIP5 model and the observed agricultural yield index over the Indian region are used in a regression model to project changes in agricultural yield over India under the RCP4.5 and RCP8.5 scenarios. The results revealed better convergence of model projections in the bias-corrected data compared with the uncorrected data. The study can be extended to localized regional domains aimed at understanding future changes in agricultural productivity with an agro-economic or a simple statistical model. The statistical model indicated that total food grain yield is going to increase over the Indian region in the future: the increase is approximately 50 kg/ha under the RCP4.5 scenario from 2001 until the end of 2100, and approximately 90 kg/ha under the RCP8.5 scenario over the same period. Many studies have used bias correction techniques, but this study applies bias correction to future climate scenario data from CMIP5 models and uses the result with crop statistics to estimate future crop yield changes over the Indian region.
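
    The percentile-based variant can be stated in a few lines: a model value is assigned its quantile under the model's own climatology and then mapped to the same quantile of the observed climatology. The sketch below uses toy gamma-distributed rainfall; a CDF (Weibull)-based variant would instead fit parametric distributions to both samples and compose the fitted model CDF with the observed inverse CDF.

        import numpy as np

        rng = np.random.default_rng(1)
        obs = rng.gamma(2.0, 3.0, 3000)         # observed JJAS rainfall (toy)
        gcm_hist = rng.gamma(2.0, 4.5, 3000)    # biased GCM baseline (toy)
        gcm_future = rng.gamma(2.2, 4.5, 3000)  # GCM projection to correct (toy)

        def quantile_map(x, model_ref, obs_ref):
            # empirical quantile of x under the model climatology...
            q = np.searchsorted(np.sort(model_ref), x) / model_ref.size
            q = np.clip(q, 0.0, 1.0)
            # ...mapped onto the same quantile of the observed climatology
            return np.quantile(obs_ref, q)

        corrected = quantile_map(gcm_future, gcm_hist, obs)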

  18. Behind the fence forum theater: an arts performance partnership to address lupus and environmental justice.

    PubMed

    Williams, Edith Marie; Anderson, Judith; Lee, Rhonda; White, Janice; Hahn-Baker, David

    2009-01-01

    Community-based participatory research (CBPR) is a method to improve environmental quality in communities primarily inhabited by minorities or low-income families. The Buffalo Lupus Project was a CBPR partnership formed to explore the relationship between a local waste site and high rates of lupus. The "Behind the Fence" Community Environmental Forum Theater project was able to successfully funnel the results of scientific research and ongoing activities to the community by utilizing a Forum Theater approach, image-making techniques, an interactive workshop, and energetic public performance. Filming of project activities will expand the reach of that original performance and provide other communities with a potential model for similar efforts.

  19. Development of hybrid computer plasma models for different pressure regimes

    NASA Astrophysics Data System (ADS)

    Hromadka, Jakub; Ibehej, Tomas; Hrach, Rudolf

    2016-09-01

    With the increased performance of contemporary computers over the last decades, numerical simulations have become a very powerful tool, applicable also in plasma physics research. Plasma is generally an ensemble of mutually interacting particles out of thermodynamic equilibrium, and for this reason fluid computer plasma models give results of only limited accuracy. On the other hand, much more precise particle models are often limited to 2D problems because of their huge demands on computer resources. Our contribution is devoted to hybrid modelling techniques that combine the advantages of both approaches, particularly to their so-called iterative version. The study focuses on the mutual relations between fluid and particle models, demonstrated through calculations of the sheath structures of low-temperature argon plasma near a cylindrical Langmuir probe at medium and higher pressures. Results of a simple iterative hybrid plasma computer model are also given. The authors acknowledge the support of the Grant Agency of Charles University in Prague (project 220215).

  20. Modelling a single phase voltage controlled rectifier using Laplace transforms

    NASA Technical Reports Server (NTRS)

    Kraft, L. Alan; Kankam, M. David

    1992-01-01

    The development of a 20 kHz AC power system by NASA for large space projects has spurred a need to develop models for the equipment which will be used on these single phase systems. To date, models for the AC source (i.e., inverters) have been developed. It is the intent of this paper to develop a method to model the single phase voltage controlled rectifiers which will be attached to the AC power grid as an interface for connected loads. A modified version of EPRI's HARMFLO program is used as the shell for these models. The results obtained from the model developed in this paper are quite adequate for the analysis of problems such as voltage resonance. The unique technique presented in this paper uses Laplace transforms to determine the harmonic content of the load current of the rectifier rather than a curve-fitting technique. The Laplace transforms directly yield the coefficients of the differential equations which model the line current to the rectifier.

  1. ANEMOS: Development of a next generation wind power forecasting system for the large-scale integration of onshore and offshore wind farms.

    NASA Astrophysics Data System (ADS)

    Kariniotakis, G.; Anemos Team

    2003-04-01

    Objectives: Accurate forecasting of wind energy production up to two days ahead is recognized as a major contribution to reliable large-scale wind power integration. Especially in a liberalized electricity market, prediction tools enhance the position of wind energy compared with other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project supported by the European Commission that assembles research organizations and end-users with substantial experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations such as complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models emphasizes techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics, or high-resolution meteorological information. Statistical models (e.g., based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy. Appropriate physical and statistical prediction models are also developed for offshore wind farms, taking into account advances in marine meteorology (interaction between wind and waves, coastal effects). The benefits of using satellite radar images for modeling local weather patterns are investigated. A next-generation forecasting software platform, ANEMOS, will be developed to integrate the various models. The tool is enhanced by advanced Information and Communication Technology (ICT) functionality and can operate in stand-alone or remote mode, or be interfaced with standard Energy or Distribution Management Systems (EMS/DMS). Contribution: The project provides an advanced technology for wind resource forecasting applicable on a large scale: at a single wind farm, at regional or national level, and for both interconnected and island systems. A major milestone is the online operation of the developed software by the participating utilities for onshore and offshore wind farms and the demonstration of the economic benefits. The outcome of the ANEMOS project will help increase wind integration on two levels: at an operational level, through better management of wind farms, and by contributing to an increase in the installed capacity of wind farms, because accurate prediction of the resource reduces the risk to wind farm developers, who are then more willing to undertake new wind farm installations, especially in a liberalized electricity market environment.

  2. Human genome and open source: balancing ethics and business.

    PubMed

    Marturano, Antonio

    2011-01-01

    The Human Genome Project has been completed thanks to a massive use of computer techniques, as well as the adoption of the open-source business and research model by the scientists involved. This model won out over the proprietary model and allowed quick propagation of research results, and feedback on them, among peers. In this paper, the author analyses some ethical and legal issues that emerge from the use of this computer model in relation to Human Genome property rights. The author argues that Open Source is the best business model, as it is able to balance business and human rights perspectives.

  3. Creating computer aided 3D model of spleen and kidney based on Visible Human Project.

    PubMed

    Aldur, Muhammet M

    2005-01-01

    To investigate the efficacy of a computer-aided 3-dimensional (3D) reconstruction technique for the visualization and modeling of gross anatomical structures, with an affordable methodology applied to the spleen and kidney. From the Visible Human Project Dataset cryosection images developed by the National Library of Medicine, the spleen and kidney sections were chosen because of their highly distinct contours. The software used for the reconstruction was SurfDriver 3.5.3 for Mac and Cinema 4D XL version 7.1 for Mac OS X. This study was carried out in May 2004 at the Department of Anatomy, Hacettepe University, Ankara, Turkey. As a result of this study, it was determined that these 2 programs can be used effectively both for 3D modeling of the mentioned organs and for volumetric analyses on these models. It was also seen that physical models of these digital gross anatomical reconstructions can be produced with the stereolithography technique by means of the data exchange file format provided by the program, and that such images can be presented as anaglyphs. SurfDriver 3.5.3 for Mac OS and Cinema 4D XL version 7.1 for Mac OS X can be used effectively for the reconstruction of gross anatomical structures from serial parallel sections with distinct contours, such as the spleen and kidney, and for the animation of the models. This software constitutes a highly effective way of obtaining volumetric calculations, spatial relations and morphometric measurements of reconstructed structures.

  4. The Scientific Status of Projective Techniques.

    PubMed

    Lilienfeld, S O; Wood, J M; Garb, H N

    2000-11-01

    Although projective techniques continue to be widely used in clinical and forensic settings, their scientific status remains highly controversial. In this monograph, we review the current state of the literature concerning the psychometric properties (norms, reliability, validity, incremental validity, treatment utility) of three major projective instruments: Rorschach Inkblot Test, Thematic Apperception Test (TAT), and human figure drawings. We conclude that there is empirical support for the validity of a small number of indexes derived from the Rorschach and TAT. However, the substantial majority of Rorschach and TAT indexes are not empirically supported. The validity evidence for human figure drawings is even more limited. With a few exceptions, projective indexes have not consistently demonstrated incremental validity above and beyond other psychometric data. In addition, we summarize the results of a new meta-analysis intended to examine the capacity of these three instruments to detect child sexual abuse. Although some projective instruments were better than chance at detecting child sexual abuse, there were virtually no replicated findings across independent investigative teams. This meta-analysis also provides the first clear evidence of substantial file drawer effects in the projectives literature, as the effect sizes from published studies markedly exceeded those from unpublished studies. We conclude with recommendations regarding the (a) construction of projective techniques with adequate validity, (b) forensic and clinical use of projective techniques, and (c) education and training of future psychologists regarding projective techniques. © 2000 Association for Psychological Science.

  5. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of a program to transform natural language specifications into formal notation; specifically, to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification and design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, and the high learning curve for specification languages and associated tools, together with increased schedule and budget pressure on projects, reduces training opportunities for engineers; and (4) the formulation of correctness properties for system models can be a difficult problem. This is relevant to NASA in that it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technology transfer potential, and next steps.
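
    As an illustration of the kind of transformation targeted (this example is ours, not taken from the presentation): the natural language requirement "whenever a command is received, an acknowledgment shall eventually be sent" corresponds to the LTL correctness property G (command_received -> F ack_sent), where G is the "globally/always" operator and F is "eventually".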

  6. Web-based Data Visualization of the MGClimDeX Climate Model Output: An Integrated Perspective of Climate Change Impact on Natural Resources in Highly Vulnerable Regions.

    NASA Astrophysics Data System (ADS)

    Martinez-Rey, J.; Brockmann, P.; Cadule, P.; Nangini, C.

    2016-12-01

    Earth System Models allow us to understand the interactions between climate and biogeological processes. These models generate very large amounts of data, which are usually reduced to a small number of static figures shown in highly specialized scientific publications. However, the potential impacts of climate change demand a broader perspective on the ways in which climate model results of this kind are disseminated, particularly regarding the amount and variety of data and the target audience. This issue is especially important for scientific projects that seek broad dissemination of their key results to different audiences. The MGClimDeX project, which assesses the climate change impact on La Martinique island in the Lesser Antilles, will provide tools and means to help the key stakeholders responsible for addressing the critical social, economic, and environmental issues to take the appropriate adaptation and mitigation measures, in order to prevent future risks associated with climate variability and change and its role in human activities. The MGClimDeX project will do so by using model output and data visualization techniques within the next year, showing the cross-connected impacts of climate change on various sectors (agriculture, forestry, ecosystems, water resources and fisheries). To address the challenge of representing large sets of model output, we use back-end data processing and front-end web-based visualization techniques, going from conventional netCDF model output stored on hub servers to highly interactive, data-driven visualizations in the browser. We use the well-known JavaScript library D3.js, extended with DC.js (a dimensional charting library) for the front-end interactive filtering, in combination with Bokeh, a Python library, to synthesize the data, all framed in the essential HTML+CSS scripts. The resulting websites exist as standalone information units or are embedded into journals or science-related information hubs. These visualizations encompass all the relevant findings, allowing individual model intercomparisons in the context of observations and socioeconomic references. In this way, the full spectrum of results of the MGClimDeX project is available to the public in general and to policymakers in particular.
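
    A minimal sketch of the Python side of such a pipeline, using Bokeh to turn (toy) model output into a standalone, embeddable interactive HTML view; the file name, variable, and data below are illustrative only, not the project's actual code:

        import numpy as np
        from bokeh.plotting import figure
        from bokeh.io import output_file, save

        years = np.arange(2006, 2101)
        tas = 26.5 + 0.02 * (years - 2006) \
              + np.random.default_rng(3).normal(0.0, 0.2, years.size)

        p = figure(title="Projected near-surface temperature (illustrative)",
                   x_axis_label="year", y_axis_label="tas (degC)")
        p.line(years, tas, line_width=2)
        output_file("mgclimdex_tas.html")    # standalone HTML, embeddable in a site
        save(p)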

  7. Environmental mapping and monitoring of Iceland by remote sensing (EMMIRS)

    NASA Astrophysics Data System (ADS)

    Pedersen, Gro B. M.; Vilmundardóttir, Olga K.; Falco, Nicola; Sigurmundsson, Friðþór S.; Rustowicz, Rose; Belart, Joaquin M.-C.; Gísladóttir, Gudrun; Benediktsson, Jón A.

    2016-04-01

    Iceland is exposed to rapid and dynamic landscape changes caused by natural processes and man-made activities, which impact and challenge the country. Fast and reliable mapping and monitoring techniques are needed on a large spatial scale. However, there is currently a lack of the operational, advanced information processing techniques that end-users need to incorporate remote sensing (RS) data from multiple data sources. Hence, the full potential of the recent explosion of RS data is not being exploited. The project Environmental Mapping and Monitoring of Iceland by Remote Sensing (EMMIRS) bridges the gap between advanced information processing capabilities and end-user mapping of the Icelandic environment. This is done through a multidisciplinary assessment of two selected remote sensing supersites, Hekla and Öræfajökull, which encompass many of the rapid natural and man-made landscape changes to which Iceland is exposed. An open-access benchmark repository for the two supersites is under construction, providing high-resolution LIDAR topography and hyperspectral data for land-cover and landform classification. Furthermore, a multi-temporal and multi-source archive stretching back to 1945 allows a decadal evaluation of landscape and ecological changes at the two supersites through the development of automated change detection techniques. The development of innovative pattern recognition and machine learning approaches to image classification and change detection is one of the main tasks of the EMMIRS project, aiming to extract and compute earth observation variables as automatically as possible. Ground reference data collected through a field campaign will be used to validate the implemented methods, whose outputs are then combined with geological and vegetation models. Here, preliminary results of an automatic land-cover classification based on hyperspectral image analysis are reported. Furthermore, the EMMIRS project investigates the complex landscape dynamics between geological and ecological processes through cross-correlation of mapping results and the implementation of modelling techniques that simulate geological and ecological processes in order to extrapolate the landscape evolution.

  8. Advanced Ground Systems Maintenance Prognostics Project

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M.

    2015-01-01

    The project implements prognostics capabilities to predict when a component, system, or subsystem will no longer meet desired functional or performance criteria, called the end of life. The capability also provides an assessment of the remaining useful life of a hardware component. The project enables the delivery of system health advisories to ground system operators. This project will use modeling techniques and algorithms to assess components' health and predict remaining life for such components. The prognostics capability being developed will be used: during the design phase and during pre/post operations, to conduct planning and analysis of system design, maintenance and logistics plans, and system/mission operations plans; and during real-time operations, to monitor changes to components' health and assess their impact on operations. This capability will be interfaced to Ground Operations' command and control system as a part of the AGSM project to help assure system availability and mission success. The initial modeling effort for this capability will be developed for Liquid Oxygen ground loading applications.

  9. Development of a national system for prevention and mitigation of earthquake damages to people and properties, and the reduction of costs related to earthquakes for the Italian Government

    NASA Astrophysics Data System (ADS)

    Console, R.; Greco, M.; Colangelo, A.; Cioè, A.; Trivigno, L.; Chiappini, M.; Ponzo, F.

    2015-12-01

    Recognizing that the Italian territory is prone to disasters connected with seismic and hydro-geological risk, it has become necessary to define novel regulations and viable solutions aimed at redirecting the economic resources of the Italian Government, too often spent on the management of post-event situations, towards prevention activities. This work synthetically presents the project developed by the CGIAM together with the INGV, open to collaboration with other Italian and international partners. The project is aimed at the development of a National System for the prevention and mitigation of earthquake damage, through the definition of a model that mitigates the risk of building collapse and consequently reduces casualties. Such a model is based on two main issues: (a) a correct evaluation of risk, defined as a reliable assessment of the hazard expected at a given site and of the vulnerability of civil and industrial buildings, and (b) the setting up of novel strategies for the safety of buildings. The hazard assessment is pursued through the application of innovative multidisciplinary geophysical methodologies and of a physically based earthquake simulator. The structural vulnerability of buildings is estimated by means of simplified techniques based on a few representative parameters (such as different structural typologies, dynamic soil-structure interaction, etc.) and, for detailed studies, standard protocols for model-updating techniques. We analyze, through numerical and experimental approaches, new solutions for the use of innovative materials and new techniques, including low-cost ones, for the reduction of the seismic vulnerability of structural, non-structural and accessory elements. The project activities are initially implemented in a study area in Southern Italy (Calabria), selected because of its tectonic complexity. The results are expected to be applicable to other seismically hazardous areas of Italy.

  10. Back-Projection Cortical Potential Imaging: Theory and Results.

    PubMed

    Haor, Dror; Shavit, Reuven; Shapiro, Moshe; Geva, Amir B

    2017-07-01

    Electroencephalography (EEG) is the single brain monitoring technique that is non-invasive, portable, passive, exhibits high temporal resolution, and gives a direct measurement of the scalp electrical potential. A major disadvantage of the EEG is its low spatial resolution, which is the result of the low-conductive skull that "smears" the currents coming from within the brain. Recording brain activity with both high temporal and spatial resolution is crucial for the localization of confined brain activations and the study of brain mechanism functionality, which is then followed by diagnosis of brain-related diseases. In this paper, a new cortical potential imaging (CPI) method is presented. The new method gives an estimation of the electrical activity on the cortex surface and thus removes the "smearing effect" caused by the skull. The scalp potentials are back-projected onto the cortex surface (back-projection CPI, BP-CPI) by posing a well-posed problem for the Laplace equation that is solved by means of the finite element method on a realistic head model. A unique solution to the CPI problem is obtained by introducing a cortical normal current estimation technique. The technique is based on the same mechanism used in the well-known surface Laplacian calculation, followed by a scalp-cortex back-projection routine. The BP-CPI passed four stages of validation, including validation on spherical and realistic head models, probabilistic analysis (Monte Carlo simulation), and noise sensitivity tests. In addition, the BP-CPI was compared with the minimum norm estimate CPI approach and found superior for multi-source cortical potential distributions, with very good estimation results (CC > 0.97) on a realistic head model in the regions of interest for two representative cases. The BP-CPI can easily be incorporated in different monitoring tools and help researchers by maintaining an accurate estimation of the cortical potential of ongoing or event-related activity, in order to draw better neurological inferences from the EEG.

  11. Internal variability of a dynamically downscaled climate over North America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jiali; Bessac, Julie; Kotamarthi, Rao

    This study investigates the internal variability (IV) of a regional climate model, and considers the impacts of horizontal resolution and spectral nudging on the IV. A 16-member simulation ensemble was conducted using the Weather Research Forecasting model for three model configurations. Ensemble members included simulations at spatial resolutions of 50 km and 12 km without spectral nudging and simulations at a spatial resolution of 12 km with spectral nudging. All the simulations were generated over the same domain, which covered much of North America. The degree of IV was measured as the spread between the individual members of the ensemble during the integration period. The IV of the 12 km simulation with spectral nudging was also compared with a future climate change simulation projected by the same model configuration. The variables investigated focus on precipitation and near-surface air temperature. While the IVs show a clear annual cycle with larger values in summer and smaller values in winter, the seasonal IV is smaller for a 50-km spatial resolution than for a 12-km resolution when nudging is not applied. Applying a nudging technique to the 12-km simulation reduces the IV by a factor of two, and produces smaller IV than the simulation at 50 km without nudging. Applying a nudging technique also changes the geographic distributions of IV in all examined variables. The IV is much smaller than the inter-annual variability at seasonal scales for regionally averaged temperature and precipitation. The IV is also smaller than the projected changes in air temperature for the mid- and late 21st century. However, the IV is larger than the projected changes in precipitation for the mid- and late 21st century.
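
    The spread measure itself is simple to state: internal variability at each grid point and time is the standard deviation across ensemble members. A toy NumPy sketch (random data standing in for the 16 WRF members):

        import numpy as np

        rng = np.random.default_rng(2)
        ens = rng.standard_normal((16, 120, 50, 60))   # member x month x lat x lon (toy)
        iv = ens.std(axis=0, ddof=1)                   # inter-member spread field
        seasonal_iv = iv.reshape(10, 12, 50, 60).mean(axis=0)   # mean annual cycle of IV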

  12. Internal variability of a dynamically downscaled climate over North America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jiali; Bessac, Julie; Kotamarthi, Rao

    This study investigates the internal variability (IV) of a regional climate model, and considers the impacts of horizontal resolution and spectral nudging on the IV. A 16-member simulation ensemble was conducted using the Weather Research Forecasting model for three model configurations. Ensemble members included simulations at spatial resolutions of 50 and 12 km without spectral nudging and simulations at a spatial resolution of 12 km with spectral nudging. All the simulations were generated over the same domain, which covered much of North America. The degree of IV was measured as the spread between the individual members of the ensemble duringmore » the integration period. The IV of the 12 km simulation with spectral nudging was also compared with a future climate change simulation projected by the same model configuration. The variables investigated focus on precipitation and near-surface air temperature. While the IVs show a clear annual cycle with larger values in summer and smaller values in winter, the seasonal IV is smaller for a 50-km spatial resolution than for a 12-km resolution when nudging is not applied. Applying a nudging technique to the 12-km simulation reduces the IV by a factor of two, and produces smaller IV than the simulation at 50 km without nudging. Applying a nudging technique also changes the geographic distributions of IV in all examined variables. The IV is much smaller than the inter-annual variability at seasonal scales for regionally averaged temperature and precipitation. The IV is also smaller than the projected changes in air-temperature for the mid- and late twenty-first century. However, the IV is larger than the projected changes in precipitation for the mid- and late twenty-first century.« less

  13. Internal variability of a dynamically downscaled climate over North America

    NASA Astrophysics Data System (ADS)

    Wang, Jiali; Bessac, Julie; Kotamarthi, Rao; Constantinescu, Emil; Drewniak, Beth

    2018-06-01

    This study investigates the internal variability (IV) of a regional climate model, and considers the impacts of horizontal resolution and spectral nudging on the IV. A 16-member simulation ensemble was conducted using the Weather Research Forecasting model for three model configurations. Ensemble members included simulations at spatial resolutions of 50 and 12 km without spectral nudging and simulations at a spatial resolution of 12 km with spectral nudging. All the simulations were generated over the same domain, which covered much of North America. The degree of IV was measured as the spread between the individual members of the ensemble during the integration period. The IV of the 12 km simulation with spectral nudging was also compared with a future climate change simulation projected by the same model configuration. The variables investigated focus on precipitation and near-surface air temperature. While the IVs show a clear annual cycle with larger values in summer and smaller values in winter, the seasonal IV is smaller for a 50-km spatial resolution than for a 12-km resolution when nudging is not applied. Applying a nudging technique to the 12-km simulation reduces the IV by a factor of two, and produces smaller IV than the simulation at 50 km without nudging. Applying a nudging technique also changes the geographic distributions of IV in all examined variables. The IV is much smaller than the inter-annual variability at seasonal scales for regionally averaged temperature and precipitation. The IV is also smaller than the projected changes in air-temperature for the mid- and late twenty-first century. However, the IV is larger than the projected changes in precipitation for the mid- and late twenty-first century.

  14. Internal variability of a dynamically downscaled climate over North America

    NASA Astrophysics Data System (ADS)

    Wang, Jiali; Bessac, Julie; Kotamarthi, Rao; Constantinescu, Emil; Drewniak, Beth

    2017-09-01

    This study investigates the internal variability (IV) of a regional climate model, and considers the impacts of horizontal resolution and spectral nudging on the IV. A 16-member simulation ensemble was conducted using the Weather Research and Forecasting (WRF) model for three model configurations. Ensemble members included simulations at spatial resolutions of 50 and 12 km without spectral nudging and simulations at a spatial resolution of 12 km with spectral nudging. All the simulations were generated over the same domain, which covered much of North America. The degree of IV was measured as the spread between the individual members of the ensemble during the integration period. The IV of the 12 km simulation with spectral nudging was also compared with a future climate change simulation projected by the same model configuration. The variables investigated focus on precipitation and near-surface air temperature. While the IVs show a clear annual cycle with larger values in summer and smaller values in winter, the seasonal IV is smaller for a 50-km spatial resolution than for a 12-km resolution when nudging is not applied. Applying a nudging technique to the 12-km simulation reduces the IV by a factor of two, and produces smaller IV than the simulation at 50 km without nudging. Applying a nudging technique also changes the geographic distributions of IV in all examined variables. The IV is much smaller than the inter-annual variability at seasonal scales for regionally averaged temperature and precipitation. The IV is also smaller than the projected changes in air temperature for the mid- and late twenty-first century. However, the IV is larger than the projected changes in precipitation for the mid- and late twenty-first century.

  15. Advantages of active love wave techniques in geophysical characterizations of seismographic station - Case studies in California and the central and eastern United States

    USGS Publications Warehouse

    Martin, Antony; Yong, Alan K.; Salomone, Larry A.

    2014-01-01

    Active-source Love waves, recorded by the multi-channel analysis of surface wave (MASLW) technique, were recently analyzed in two site characterization projects. Between 2010 and 2012, the 2009 American Recovery and Reinvestment Act (ARRA) funded GEOVision to conduct geophysical investigations at 191 seismographic stations in California and the Central Eastern U.S. (CEUS). The original project plan was to utilize active and passive Rayleigh wave-based techniques to obtain shear-wave velocity (VS) profiles to a minimum depth of 30 m and the time-averaged VS of the upper 30 meters (VS30). Early in this investigation it became clear that Rayleigh wave techniques, such as multi-channel analysis of surface waves (MASRW), were not suited for characterizing all sites. Shear-wave seismic refraction and MASLW techniques were therefore applied. In 2012, the Electric Power Research Institute funded characterization of 33 CEUS station sites. Based on experience from the ARRA investigation, both MASRW and MASLW data were acquired by GEOVision at 24 CEUS sites. At shallow rock sites, sites with steep velocity gradients, and sites with a thin, low-velocity surficial soil layer overlying stiffer sediments, Love wave techniques generally were found to be easier to interpret; that is, Love wave data typically yielded unambiguous fundamental-mode dispersion curves and thus reduced uncertainty in the resultant VS model. These types of velocity structure often excite dominant higher modes in Rayleigh wave data, but not in the Love wave data. It is possible to model Rayleigh wave data using multi- or effective-mode techniques; however, extraction of Rayleigh wave dispersion data was found to be difficult in many cases. These results imply that field procedures should include careful scrutiny of Rayleigh wave-based dispersion data in order to also collect Love wave data when warranted.

  16. Preliminary evaluation of spectral, normal and meteorological crop stage estimation approaches

    NASA Technical Reports Server (NTRS)

    Cate, R. B.; Artley, J. A.; Doraiswamy, P. C.; Hodges, T.; Kinsler, M. C.; Phinney, D. E.; Sestak, M. L. (Principal Investigator)

    1980-01-01

    Several of the projects in the AgRISTARS program require crop phenology information, including classification, acreage and yield estimation, and detection of episodal events. This study evaluates several crop calendar estimation techniques for their potential use in the program. The techniques, although generic in approach, were developed and tested on spring wheat data collected in 1978. There are three basic approaches to crop stage estimation: historical averages for an area (normal crop calendars), agrometeorological modeling of known crop-weather relationships (agrometeorological, or agromet, crop calendars), and interpretation of spectral signatures (spectral crop calendars). In all, 10 combinations of planting and biostage estimation models were evaluated. Dates of stage occurrence are estimated with biases between -4 and +4 days, while root mean square errors range from 10 to 15 days. Results are inconclusive as to the superiority of any of the models, and further evaluation of the models with the 1979 data set is recommended.
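
    The bias and root-mean-square error figures quoted above are simple to compute once stage dates are expressed as day-of-year values. A minimal sketch (the numbers are hypothetical, not the 1978 spring wheat data):

        import numpy as np

        def bias_and_rmse(predicted_doy, observed_doy):
            """Mean error (bias) and RMSE, in days, of estimated stage dates."""
            err = np.asarray(predicted_doy, float) - np.asarray(observed_doy, float)
            return err.mean(), np.sqrt((err ** 2).mean())

        pred = [140, 152, 160, 171]   # hypothetical day-of-year estimates
        obs = [138, 155, 158, 176]    # hypothetical observed stage dates
        b, r = bias_and_rmse(pred, obs)
        print(f"bias = {b:+.1f} d, RMSE = {r:.1f} d")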

  17. The influence of the free space environment on the superlight-weight thermal protection system: conception, methods, and risk analysis

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy; Falchenko, Iurii; Fedorchuk, Viktor; Petrushynets, Lidiia

    2016-07-01

    This report focuses on the results of the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)". The bottom line is an analysis of the influence of the free space environment on the superlight-weight thermal protection system (TPS). The report concentrates on new methods based on synergetic, physical, and computational models, organized around four approaches. The first is the synergetic approach: the solution of problems of self-controlled synthesis of structures and the creation of self-organizing technologies are considered in connection with the larger problem of creating materials with new functional properties, and synergetics methods and mathematical design are treated in relation to current problems of materials science. The second approach describes how optimization methods can be used to determine material microstructures with optimized or targeted properties; this technique enables one to find unexpected microstructures with exotic behavior (e.g., negative thermal expansion coefficients). Many such tasks in computational materials science can be posed as optimization problems and solved with deterministic and stochastic techniques, including the intriguing inverse problem, of both fundamental and practical importance, of generating realizations of materials with specified but limited microstructural information; computational models based upon the theories of molecular dynamics or quantum mechanics would enable the prediction and modification of fundamental materials properties. The third approach concerns the dynamic probabilistic risk analysis of TPS elements with complex damage characterizations, using a physical model of the TPS and a predicted level of ionizing radiation and space weather. Focus is given mainly to the TPS model, mathematical models for dynamic probabilistic risk assessment, and software for modeling and predicting the influence of the free space environment; the probabilistic risk assessment method for the TPS is presented considering both deterministic and stochastic factors. The last approach presents results of experimental research on the temperature distribution on the surface of a honeycomb sandwich panel of size 150 x 150 x 20 mm during diffusion welding in vacuum, together with equipment that equalizes temperature fields in a product to form welded joints of uniform strength. The main optimization approaches in the frame of the project are discussed, and optimization of alloys for obtaining materials with required properties, using modeling techniques and experimental data, is also considered. This report is supported by the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)".

  18. Multi-model ensemble projections of future extreme heat stress on rice across southern China

    NASA Astrophysics Data System (ADS)

    He, Liang; Cleverly, James; Wang, Bin; Jin, Ning; Mi, Chunrong; Liu, De Li; Yu, Qiang

    2017-08-01

    Extreme heat events have become more frequent and intense with climate warming, and these heatwaves are a threat to rice production in southern China. Projected changes in heat stress in rice provide an assessment of the potential impact on crop production and can direct measures for adaptation to climate change. In this study, we calculated heat stress indices using statistical scaling techniques, which can efficiently downscale output from general circulation models (GCMs). Data across the rice belt in southern China were obtained from 28 GCMs in the Coupled Model Intercomparison Project phase 5 (CMIP5) with two emissions scenarios (RCP4.5 for current emissions and RCP8.5 for increasing emissions). Multi-model ensemble projections over the historical period (1960-2010) reproduced the trend of observations in heat stress indices (root-mean-square error, RMSE, of 6.5 days) better than the multi-model arithmetic mean (RMSE of 8.9 days) or any individual GCM (RMSE of 11.4 days). The frequency of heat stress events was projected to increase by 2061-2100 in both scenarios (by up to 185% and 319% for RCP4.5 and RCP8.5, respectively), especially in the middle and lower reaches of the Yangtze River. This increasing risk of exposure to heat stress above 30 °C during flowering and grain filling is predicted to impact rice production. The results of our study suggest the importance of specific adaptation or mitigation strategies, such as selection of heat-tolerant cultivars and adjustment of planting date in a warmer future world.
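
    The key comparison above is between a multi-model ensemble projection and a plain arithmetic mean, each scored by RMSE against observations. The paper's exact ensemble technique is not spelled out here, so the sketch below substitutes a simple skill-weighted mean (weights proportional to inverse historical RMSE) on synthetic data, purely to illustrate why a weighted ensemble can beat both the arithmetic mean and any single GCM:

        import numpy as np

        def rmse(a, b):
            return np.sqrt(np.mean((a - b) ** 2))

        rng = np.random.default_rng(1)
        obs = rng.normal(20.0, 5.0, size=50)                 # heat-stress days, stand-in
        models = obs + rng.normal(0.0, 6.0, size=(28, 50))   # 28 GCM historical runs

        w = 1.0 / np.array([rmse(m, obs) for m in models])   # skill weights
        w /= w.sum()
        weighted = (w[:, None] * models).sum(axis=0)

        print("best single GCM RMSE:", min(rmse(m, obs) for m in models))
        print("arithmetic mean RMSE:", rmse(models.mean(axis=0), obs))
        print("weighted mean RMSE  :", rmse(weighted, obs))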

  19. NASA/ASEE Summer Faculty Fellowship Program, 1990, Volume 1

    NASA Technical Reports Server (NTRS)

    Bannerot, Richard B. (Editor); Goldstein, Stanley H. (Editor)

    1990-01-01

    The 1990 Johnson Space Center (JSC) NASA/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program was conducted by the University of Houston-University Park and JSC. A compilation of the final reports on the research projects are presented. The topics covered include: the Space Station; the Space Shuttle; exobiology; cell biology; culture techniques; control systems design; laser induced fluorescence; spacecraft reliability analysis; reduced gravity; biotechnology; microgravity applications; regenerative life support systems; imaging techniques; cardiovascular system; physiological effects; extravehicular mobility units; mathematical models; bioreactors; computerized simulation; microgravity simulation; and dynamic structural analysis.

  20. New generation of exploration tools: interactive modeling software and microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  1. Multiple element isotope probes, NanoSIMS, and the functional genomics of microbial carbon cycling in soils in response to chronic climatic change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hungate, Bruce; Pett-Ridge, Jennifer; Blazewicz, Steven

    In this project, we developed an innovative and ground-breaking technique, quantitative stable isotope probing (qSIP), which uses density separation of nucleic acids as a quantitative measurement. This work is substantial because it advances SIP beyond the qualitative technique that has dominated the field for years. The first methods paper was published in Applied and Environmental Microbiology (Hungate et al. 2015), and this paper describes the mathematical model underlying the quantitative interpretation. A second methods paper (Schwartz et al. 2015) provides a conceptual overview of the method and its application to research problems. A third methods paper was just published (Koch et al. 2018), in which we develop the quantitative model combining sequencing and isotope data to estimate actual rates of microbial growth and death in natural populations. This work has met much enthusiasm in scientific presentations around the world. It has met with equally enthusiastic resistance in the peer-review process, though our record of publication to date argues that people are accepting the merits of the approach. The skepticism and resistance are also potentially signs that this technique is pushing the field forward, albeit with some of the discomfort that accompanies extrapolation. Part of this is a cultural element in the field – the field of microbiology is not accustomed to the assumptions of ecosystem science. Research conducted in this project has pushed the philosophical perspective that major advances can occur when we advocate a sound merger between the traditions of strong inference in microbiology and those of grounded scaling in ecosystem science.
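
    The core idea of qSIP, reading isotope incorporation off a shift in buoyant density, can be caricatured in a few lines. The real model (Hungate et al. 2015) accounts for taxon GC content and nucleic-acid molecular weights; the sketch below is a deliberately simplified proportional version with hypothetical numbers, shown only to convey the shape of the calculation:

        import numpy as np

        def atom_fraction_excess(wad_labeled, wad_control, heavy_max_shift=0.036):
            """Simplified qSIP-style index: shift in a taxon's weighted average
            density (WAD, g/mL) between labeled and control gradients, scaled by
            an illustrative maximum shift at 100% isotope incorporation."""
            return (wad_labeled - wad_control) / heavy_max_shift

        control = np.array([1.705, 1.710, 1.698])   # hypothetical WADs, 3 taxa
        labeled = np.array([1.717, 1.712, 1.699])
        print(np.round(atom_fraction_excess(labeled, control), 3))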

  2. Multiple element isotope probes, NanoSIMS, and the functional genomics of microbial carbon cycling in soils in response to chronic climatic change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hungate, Bruce; Pett-Ridge, Jennifer; Blazewicz, Steven

    In this project, we developed an innovative and ground-breaking technique, quantitative stable isotope probing (qSIP), which uses density separation of nucleic acids as a quantitative measurement. This work is substantial because it advances SIP beyond the qualitative technique that has dominated the field for years. The first methods paper was published in Applied and Environmental Microbiology (Hungate et al. 2015), and this paper describes the mathematical model underlying the quantitative interpretation. A second methods paper (Schwartz et al. 2015) provides a conceptual overview of the method and its application to research problems. A third methods paper was just published (Koch et al. 2018), in which we develop the quantitative model combining sequencing and isotope data to estimate actual rates of microbial growth and death in natural populations. This work has met much enthusiasm in scientific presentations around the world. It has met with equally enthusiastic resistance in the peer-review process, though our record of publication to date argues that people are accepting the merits of the approach. The skepticism and resistance are also potentially signs that this technique is pushing the field forward, albeit with some of the discomfort that accompanies extrapolation. Part of this is a cultural element in the field – the field of microbiology is not accustomed to the assumptions of ecosystem science. Research conducted in this project has pushed the philosophical perspective that major advances can occur when we advocate a sound merger between the traditions of strong inference in microbiology and those of grounded scaling in ecosystem science.

  3. Improved methods for distribution loss evaluation. Volume 1: analytic and evaluative techniques. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flinn, D.G.; Hall, S.; Morris, J.

    This volume describes the background research, the application of the proposed loss evaluation techniques, and the results. The research identified present loss calculation methods as appropriate, provided care was taken to represent the various system elements in sufficient detail. The literature search of past methods and typical data revealed that extreme caution should be taken when using typical values (load factor, etc.) to ensure that all factors are referred to the same time base (daily, weekly, etc.). The performance of the method (and computer program) proposed in this project was determined by comparison of results with a rigorous evaluation of losses on the Salt River Project system. This rigorous evaluation used statistical modeling of the entire system as well as explicit enumeration of all substation and distribution transformers. Further tests were conducted at Public Service Electric and Gas of New Jersey to check the appropriateness of the methods in a northern environment. Finally, sensitivity tests identified the data elements whose inaccuracy would most affect the determination of losses using the method developed in this project.
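
    One place where the time-base caution above bites is the classical empirical bridge from load factor to loss factor, often used when only typical values are available. The sketch below uses the common approximation loss factor = k*LF + (1-k)*LF^2 with k = 0.3; the coefficient and all input numbers are illustrative, not taken from this report:

        def loss_factor(load_factor, k=0.3):
            """Classical empirical load-factor-to-loss-factor approximation."""
            return k * load_factor + (1 - k) * load_factor ** 2

        peak_loss_kw = 180.0          # hypothetical demand loss at peak load
        lf = 0.62                     # hypothetical monthly load factor
        energy_loss_kwh = peak_loss_kw * loss_factor(lf) * 730  # hours per month
        print(f"loss factor = {loss_factor(lf):.3f}, "
              f"energy loss = {energy_loss_kwh:,.0f} kWh")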

  4. Downscaling global land-use/land-cover projections for use in region-level state-and-transition simulation modeling

    USGS Publications Warehouse

    Sherba, Jason T.; Sleeter, Benjamin M.; Davis, Adam W.; Parker, Owen P.

    2015-01-01

    Global land-use/land-cover (LULC) change projections and historical datasets are typically available at coarse grid resolutions and are often incompatible with modeling applications at local to regional scales. The difficulty of downscaling and reapportioning global gridded LULC change projections to regional boundaries is a barrier to the use of these datasets in a state-and-transition simulation model (STSM) framework. Here we compare three downscaling techniques to transform gridded LULC transitions into spatial scales and thematic LULC classes appropriate for use in a regional STSM. For each downscaling approach, Intergovernmental Panel on Climate Change (IPCC) Representative Concentration Pathway (RCP) LULC projections, at the 0.5° × 0.5° cell resolution, were downscaled to seven Level III ecoregions in the Pacific Northwest, United States. RCP transition values at each cell were downscaled based on the proportional distribution between ecoregions of (1) cell area, (2) land-cover composition derived from remotely-sensed imagery, and (3) historic LULC transition values from a LULC history database. Resulting downscaled LULC transition values were aggregated according to their bounding ecoregion and “cross-walked” to relevant LULC classes. Ecoregion-level LULC transition values were applied in a STSM projecting LULC change between 2005 and 2100. While each downscaling method had advantages and disadvantages, downscaling using the historical land-use history dataset consistently apportioned RCP LULC transitions in agreement with historical observations. Regardless of the downscaling method, some LULC projections remain improbable and require further investigation.
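
    The proportional-apportionment step common to the three downscaling methods reduces to a weighted redistribution of each cell's transition amount. A minimal sketch (cells and weights are hypothetical; the weight matrix would come from cell area, land-cover composition, or the land-use history database, depending on the method):

        import numpy as np

        def downscale_transitions(grid_values, weights):
            """Apportion gridded LULC transition amounts to ecoregions.

            grid_values: (cells,) transition area in each grid cell
            weights: (cells, regions) share of each cell assigned to each
                     region (rows sum to 1)
            returns: (regions,) transition area per ecoregion"""
            return grid_values @ weights

        cells = np.array([100.0, 40.0, 75.0])   # e.g. km2 of forest-to-crop per cell
        w = np.array([[0.6, 0.4],               # cell 1 split 60/40 across 2 regions
                      [1.0, 0.0],
                      [0.2, 0.8]])
        print(downscale_transitions(cells, w))  # per-ecoregion totals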

  5. Use of hydrologic and hydrodynamic modeling for ecosystem restoration

    USGS Publications Warehouse

    Obeysekera, J.; Kuebler, L.; Ahmed, S.; Chang, M.-L.; Engel, V.; Langevin, C.; Swain, E.; Wan, Y.

    2011-01-01

    Planning and implementation of unprecedented projects for restoring the greater Everglades ecosystem are underway, and the hydrologic and hydrodynamic modeling of restoration alternatives has become essential for the success of restoration efforts. In view of the complex nature of the South Florida water resources system, regional-scale (system-wide) hydrologic models have been developed and used extensively for the development of the Comprehensive Everglades Restoration Plan. In addition, numerous subregional-scale hydrologic and hydrodynamic models have been developed and are being used for evaluating project-scale water management plans associated with urban, agricultural, and inland coastal ecosystems. The authors provide a comprehensive summary of models of all scales, as well as the next generation of models under development to meet the future needs of ecosystem restoration efforts in South Florida. The multiagency efforts to develop and apply models have allowed the agencies to understand the complex hydrologic interactions, quantify appropriate performance measures, and use new technologies in simulation algorithms, software development, and GIS/database techniques to meet the future modeling needs of the ecosystem restoration programs. © 2011 Taylor & Francis Group, LLC.

  6. The Dynamical Core Model Intercomparison Project (DCMIP-2016): Results of the Supercell Test Case

    NASA Astrophysics Data System (ADS)

    Zarzycki, C. M.; Reed, K. A.; Jablonowski, C.; Ullrich, P. A.; Kent, J.; Lauritzen, P. H.; Nair, R. D.

    2016-12-01

    The 2016 Dynamical Core Model Intercomparison Project (DCMIP-2016) assesses the modeling techniques for global climate and weather models and was recently held at the National Center for Atmospheric Research (NCAR) in conjunction with a two-week summer school. Over 12 different international modeling groups participated in DCMIP-2016 and focused on the evaluation of the newest non-hydrostatic dynamical core designs for future high-resolution weather and climate models. The paper highlights the results of the third DCMIP-2016 test case, which is an idealized supercell storm on a reduced-radius Earth. The supercell storm test permits the study of a non-hydrostatic moist flow field with strong vertical velocities and associated precipitation. This test assesses the behavior of global modeling systems at extremely high spatial resolution and is used in the development of next-generation numerical weather prediction capabilities. In this regime the effective grid spacing is very similar to the horizontal scale of convective plumes, emphasizing resolved non-hydrostatic dynamics. The supercell test case sheds light on the physics-dynamics interplay and highlights the impact of diffusion on model solutions.

  7. A Novel Nipple Reconstruction Technique for Maintaining Nipple Projection: The Boomerang Flap

    PubMed Central

    Kim, Young-Eun; Hong, Ki Yong; Minn, Kyung Won

    2016-01-01

    Nipple-areolar complex (NAC) reconstruction is the final step in the long journey of breast reconstruction for mastectomy patients. Successful NAC reconstruction depends on the use of appropriate surgical techniques that are simple and reliable. To date, numerous techniques have been used for nipple reconstruction, including contralateral nipple sharing and various local flaps. Recently, it has been common to utilize local flaps. However, the most common nipple reconstruction problem encountered with local flaps is the loss of nipple projection; there can be approximately 50% projection loss in reconstructed nipples over long-term follow-up. Several factors might contribute to nipple projection loss, and we tried to overcome these factors by performing nipple reconstructions using a boomerang flap technique, which is a modified C–V flap that utilizes the previous mastectomy scar to maintain long-term nipple projection. PMID:27689057

  8. A Novel Nipple Reconstruction Technique for Maintaining Nipple Projection: The Boomerang Flap.

    PubMed

    Kim, Young-Eun; Hong, Ki Yong; Minn, Kyung Won; Jin, Ung Sik

    2016-09-01

    Nipple-areolar complex (NAC) reconstruction is the final step in the long journey of breast reconstruction for mastectomy patients. Successful NAC reconstruction depends on the use of appropriate surgical techniques that are simple and reliable. To date, numerous techniques have been used for nipple reconstruction, including contralateral nipple sharing and various local flaps. Recently, it has been common to utilize local flaps. However, the most common nipple reconstruction problem encountered with local flaps is the loss of nipple projection; there can be approximately 50% projection loss in reconstructed nipples over long-term follow-up. Several factors might contribute to nipple projection loss, and we tried to overcome these factors by performing nipple reconstructions using a boomerang flap technique, which is a modified C-V flap that utilizes the previous mastectomy scar to maintain long-term nipple projection.

  9. A heuristic method for consumable resource allocation in multi-class dynamic PERT networks

    NASA Astrophysics Data System (ADS)

    Yaghoubi, Saeed; Noori, Siamak; Mazdeh, Mohammad Mahdavi

    2013-06-01

    This investigation presents a heuristic method for the consumable resource allocation problem in multi-class dynamic Project Evaluation and Review Technique (PERT) networks, where new projects from different classes (types) arrive at the system according to independent Poisson processes with different arrival rates. Each activity of any project is operated at a dedicated service station located in a node of the network, with exponentially distributed service times according to its class. Indeed, each project arrives at the first service station and continues its routing according to the precedence network of its class. Such a system can be represented as a queuing network, where the discipline of the queues is first come, first served. In the presented method, a multi-class system is decomposed into several single-class dynamic PERT networks, and each class is considered separately as a minisystem. In modeling each single-class dynamic PERT network, we use a Markov process and a multi-objective model investigated by Azaron and Tavakkoli-Moghaddam in 2007. Then, after obtaining the resources allocated to service stations in every minisystem, the final resources allocated to activities are calculated by the proposed method.
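
    As a toy illustration of the decomposition, each class can be treated as its own stream of Poisson arrivals feeding dedicated exponential-service stations; the sketch below checks station utilization and mean sojourn time with textbook M/M/1 formulas. The rates are hypothetical, and this is far simpler than the paper's heuristic:

        # Hypothetical per-class Poisson arrival rates (projects/day) and
        # exponential service rates at each class's dedicated stations.
        arrival_rates = {"classA": 0.8, "classB": 0.5}
        service_rates = {"classA": [2.0, 1.5], "classB": [1.2, 2.5]}

        for cls, lam in arrival_rates.items():
            for i, mu in enumerate(service_rates[cls]):
                rho = lam / mu                    # utilization, must stay below 1
                sojourn = 1.0 / (mu - lam)        # mean time in an M/M/1 station
                print(f"{cls} station {i}: rho={rho:.2f}, "
                      f"mean sojourn={sojourn:.2f} d")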

  10. Modelling gas transport in the shallow subsurface in the Maguelone field experiment

    NASA Astrophysics Data System (ADS)

    Basirat, Farzad; Niemi, Auli; Perroud, Hervé; Lofi, Johanna; Denchik, Nataliya; Lods, Gérard; Pezard, Philippe; Sharma, Prabhakar; Fagerlund, Fritjof

    2013-04-01

    Developing reliable monitoring techniques to detect and characterize CO2 leakage in the shallow subsurface is necessary for the safety of any GCS project. To test different monitoring techniques, shallow injection-monitoring experiments have been and are being carried out at Maguelone, along the Mediterranean lido of the Gulf of Lions, near Montpellier, France. This experimental site was developed in the context of the EU FP7 project MUSTANG and is documented in Lofi et al. (2012). Gas injection experiments are being carried out, and three monitoring techniques (pressure, electrical resistivity, and seismic) have been used to detect the nitrogen and CO2 released in the near-surface environment. In the present work we use the multiphase and multicomponent TOUGH2/EOS7CA model to simulate the gaseous nitrogen and CO2 transport of the experiments carried out so far. The objective is both to gain understanding of the system performance based on the model analysis and to further develop and validate modelling approaches for gas transport in the shallow subsurface against the well-controlled data sets. Numerical simulation can also be used to predict the limitations of the experimental setup. We expect the simulations to represent the breakthrough times for the different tested injection rates. Based on the hydrogeological formation data beneath the lido, we also expect the vertical heterogeneities in grain size distribution to create an effective capillary barrier against upward gas transport in the numerical simulations. Reference: Lofi, J., Pezard, P.A., Bouchette, F., Raynal, O., Sabatier, P., Denchik, N., Levannier, A., Dezileau, L., and Certain, R., Integrated onshore-offshore geophysical investigation of a layered coastal aquifer, NW Mediterranean, Ground Water (2012).

  11. Development and Testing of a Simple Calibration Technique for Long-Term Hydrological Impact Assessment (L-THIA) Model

    NASA Astrophysics Data System (ADS)

    Muthukrishnan, S.; Harbor, J.

    2001-12-01

    Hydrological studies are a significant part of every engineering and development project, and of the geological studies conducted to assess and understand the interactions between hydrology and the environment. Such studies are generally conducted before the beginning of a project as well as after the project is completed, so that a comprehensive analysis can be made of the impact of such projects on the local and regional hydrology of the area. A good understanding of the chain of relationships that form the hydro-eco-biological and environmental cycle can be of immense help in maintaining the natural balance as we work towards exploration and exploitation of natural resources as well as urbanization of undeveloped land. Rainfall-runoff modeling techniques have been of great use here for decades, since they provide fast and efficient means of analyzing the vast amount of data that is gathered. Though process-based, detailed models are better than simple models, the latter are used more often due to their simplicity, ease of use, and the easy availability of the data needed to run them. The Curve Number (CN) method developed by the United States Department of Agriculture (USDA) is one of the most widely used hydrologic modeling tools in the US, and has earned worldwide acceptance as a practical method for evaluating the effects of land use changes on the hydrology of an area. The Long-Term Hydrological Impact Assessment (L-THIA) model is a basic, CN-based, user-oriented model that has gained popularity amongst watershed planners because of its reliance on readily available data, and because the model is easy to use (http://www.ecn.purdue.edu/runoff) and produces results geared to the general information needs of planners. The L-THIA model was initially developed to study the relative long-term hydrologic impacts of different land use (past/current/future) scenarios, and it has been successful in meeting this goal. However, one of the weaknesses of L-THIA, as well as other models that focus strictly on surface runoff, is that many users are interested in predictions of runoff that match observations of flow in streams and rivers. To make L-THIA more useful for planners and engineers alike, a simple, long-term calibration method based on linear regression of L-THIA-predicted and observed surface runoff has been developed and tested here. The results from Little Eagle Creek (LEC) in Indiana show that such calibrations are successful and valuable. This method can be used to calibrate other simple rainfall-runoff models too.
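
    For readers unfamiliar with the CN method, the runoff equation and the regression-based calibration idea described above fit in a few lines. This sketch uses the standard SCS relations Q = (P - 0.2S)^2 / (P + 0.8S) and S = 1000/CN - 10 (inch units), with made-up rainfall and runoff values rather than the Little Eagle Creek data:

        import numpy as np

        def cn_runoff(p_in, cn):
            """SCS Curve Number runoff (inches) for rainfall p_in (inches)."""
            s = 1000.0 / cn - 10.0                 # potential maximum retention
            ia = 0.2 * s                           # initial abstraction
            return np.where(p_in > ia, (p_in - ia) ** 2 / (p_in + 0.8 * s), 0.0)

        # Long-term calibration in the spirit described above: regress observed
        # on predicted runoff, then apply the fitted line as a correction.
        p = np.array([0.3, 1.2, 2.5, 0.8, 3.1])        # daily rainfall (in)
        obs = np.array([0.0, 0.25, 1.1, 0.1, 1.6])     # observed runoff (in)
        pred = cn_runoff(p, cn=78)
        a, b = np.polyfit(pred, obs, 1)                # obs ~ a*pred + b
        print(np.round(a * pred + b, 2))               # calibrated predictions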

  12. The Role of Inflation and Price Escalation Adjustments in Properly Estimating Program Costs: F-35 Case Study

    DTIC Science & Technology

    2016-03-01

    …regression models that yield hedonic price indexes are closely related to standard techniques for developing cost estimating relationships (CERs) … and derive a price index from the coefficients on variables reflecting the year of purchase. In CER development, the … index. The relevant cost metric in both cases is unit recurring flyaway (URF) costs. For the current project, we develop a "Baseline" CER model, taking …
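
    The mechanics behind such inflation adjustments are simple even though index construction is not: a then-year cost is deflated to base-year dollars by the ratio of index values. A minimal sketch with hypothetical numbers:

        def to_base_year_dollars(then_year_cost, index_then, index_base):
            """Deflate a then-year cost to base-year dollars with a price index."""
            return then_year_cost * (index_base / index_then)

        # Hypothetical: $95M spent when the index stood at 108.2, expressed in
        # base-year dollars (base-year index = 100.0)
        print(to_base_year_dollars(95.0e6, index_then=108.2, index_base=100.0))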

  13. Geophysical techniques applied to urban planning in complex near surface environments. Examples of Zaragoza, NE Spain

    NASA Astrophysics Data System (ADS)

    Pueyo-Anchuela, Ó.; Casas-Sainz, A. M.; Soriano, M. A.; Pocoví-Juan, A.

    Complex geological shallow subsurface environments represent an important handicap in urban and building projects. The geological features of the Central Ebro Basin, with sharp lateral changes in Quaternary deposits, alluvial karst phenomena, and anthropic activity, can preclude the characterization of future urban areas from isolated geomechanical tests alone or from incorrectly dimensioned geophysical surveys. This complexity is here analyzed in two different test fields: (i) one linked to flat-bottomed valleys with an irregular distribution of Quaternary deposits related to sharp lateral facies changes and an irregular position of the preconsolidated substratum, and (ii) a second one with similar complexities in the alluvial deposits and karst activity linked to solution of the underlying evaporite substratum. The results show that different geophysical techniques yield similar geological models in the first case (flat-bottomed valleys), whereas only the application of several geophysical techniques makes it possible to correctly evaluate the geological model complexities in the second case (alluvial karst). In this second case, the geological and surface information makes it possible to refine the sensitivity of the applied geophysical techniques to different indicators of karst activity. In both cases 3D models are needed to correctly distinguish alluvial lateral sedimentary changes from superimposed karst activity.

  14. Designing for Change: Minimizing the Impact of Changing Requirements in the Later Stages of a Spaceflight Software Project

    NASA Technical Reports Server (NTRS)

    Allen, B. Danette

    1998-01-01

    In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.

    Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information, including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft2 for over 50 building types at the national and sub-national level, with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics, and visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
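
    A flavor of the Bayesian updating that PDT-style systems rely on can be given with a conjugate gamma-Poisson model: an expert prior on the occupancy rate (people per 1000 ft2) is updated by survey counts from buildings of known area. This is an illustrative sketch with invented numbers, not the PDT implementation:

        # Prior Gamma(alpha, beta) on occupancy rate lambda (people/1000 ft2);
        # Poisson survey counts with known exposure (area) update it in closed form.
        alpha, beta = 4.0, 2.0        # expert prior: mean alpha/beta = 2.0

        surveys = [(6, 2.5), (3, 1.8), (9, 4.0)]   # (people counted, area/1000 ft2)
        for people, area in surveys:
            alpha += people           # gamma-Poisson conjugacy
            beta += area

        print(f"posterior mean = {alpha / beta:.2f} people/1000 ft2, "
              f"variance = {alpha / beta ** 2:.3f}")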

  16. Three Dimensional Reconstruction Workflows for Lost Cultural Heritage Monuments Exploiting Public Domain and Professional Photogrammetric Imagery

    NASA Astrophysics Data System (ADS)

    Wahbeh, W.; Nebiker, S.

    2017-08-01

    In our paper, we document experiments and results of image-based 3d reconstructions of famous heritage monuments which were recently damaged or completely destroyed by the so-called Islamic state in Syria and Iraq. The specific focus of our research is on the combined use of professional photogrammetric imagery and of publicly available imagery from the web for optimal 3d reconstruction of those monuments. The investigated photogrammetric reconstruction techniques include automated bundle adjustment and dense multi-view 3d reconstruction using public domain and professional imagery on the one hand, and interactive polygonal modelling based on projected panoramas on the other. Our investigations show that the combination of these two image-based modelling techniques delivers better results in terms of model completeness, level of detail and appearance.

  17. Using patient data similarities to predict radiation pneumonitis via a self-organizing map

    NASA Astrophysics Data System (ADS)

    Chen, Shifeng; Zhou, Sumin; Yin, Fang-Fang; Marks, Lawrence B.; Das, Shiva K.

    2008-01-01

    This work investigates the use of the self-organizing map (SOM) technique for predicting lung radiation pneumonitis (RP) risk. SOM is an effective method for projecting and visualizing high-dimensional data in a low-dimensional space (map). By projecting patients with similar data (dose and non-dose factors) onto the same region of the map, commonalities in their outcomes can be visualized and categorized. Once built, the SOM may be used to predict pneumonitis risk by identifying the region of the map that is most similar to a patient's characteristics. Two SOM models were developed from a database of 219 lung cancer patients treated with radiation therapy (34 clinically diagnosed with Grade 2+ pneumonitis). The models were: SOMall built from all dose and non-dose factors and, for comparison, SOMdose built from dose factors alone. Both models were tested using ten-fold cross validation and Receiver Operating Characteristics (ROC) analysis. Models SOMall and SOMdose yielded ten-fold cross-validated ROC areas of 0.73 (sensitivity/specificity = 71%/68%) and 0.67 (sensitivity/specificity = 63%/66%), respectively. The significant difference between the cross-validated ROC areas of these two models (p < 0.05) implies that non-dose features add important information toward predicting RP risk. Among the input features selected by model SOMall, the two with highest impact for increasing RP risk were: (a) higher mean lung dose and (b) chemotherapy prior to radiation therapy. The SOM model developed here may not be extrapolated to treatment techniques outside that used in our database, such as several-field lung intensity modulated radiation therapy or gated radiation therapy.
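
    For orientation, a self-organizing map is trained by repeatedly moving the best-matching unit and its neighbors toward each sample while the learning rate and neighborhood radius shrink. The sketch below is a generic minimal SOM on random stand-in features (219 patients by 5 features echoes the study's scale, but the data and map size here are arbitrary):

        import numpy as np

        def train_som(data, rows=6, cols=6, epochs=200, lr0=0.5, sigma0=3.0, seed=0):
            """Minimal SOM: pull the best-matching unit and its neighbors toward
            each sample; learning rate and neighborhood shrink over time."""
            rng = np.random.default_rng(seed)
            w = rng.normal(size=(rows, cols, data.shape[1]))
            grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                        indexing="ij"), axis=-1)
            for t in range(epochs):
                frac = t / epochs
                lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
                for x in data[rng.permutation(len(data))]:
                    bmu = np.unravel_index(
                        np.argmin(((w - x) ** 2).sum(-1)), (rows, cols))
                    d2 = ((grid - np.array(bmu)) ** 2).sum(-1)
                    h = np.exp(-d2 / (2 * sigma ** 2))       # neighborhood kernel
                    w += lr * h[..., None] * (x - w)
            return w

        features = np.random.default_rng(1).normal(size=(219, 5))  # stand-in factors
        som = train_som(features)
        print(som.shape)

    Prediction then amounts to locating the map region most similar to a new patient's features and reading off the outcome rate of the training patients projected there.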

  18. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  19. A partitioned model order reduction approach to rationalise computational expenses in nonlinear fracture mechanics

    PubMed Central

    Kerfriden, P.; Goury, O.; Rabczuk, T.; Bordas, S.P.A.

    2013-01-01

    We propose in this paper a reduced order modelling technique based on domain partitioning for parametric problems of fracture. We show that coupling domain decomposition and projection-based model order reduction makes it possible to focus the numerical effort where it is most needed: around the zones where damage propagates. No a priori knowledge of the damage pattern is required, the extraction of the corresponding spatial regions being based solely on algebra. The efficiency of the proposed approach is demonstrated numerically with an example relevant to engineering fracture. PMID:23750055
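
    The projection-based reduction at the heart of such methods can be illustrated on a linear parametric system: compress training snapshots with an SVD (POD) basis, then solve a small Galerkin-projected system for new parameters. The operator, loads, and sizes below are synthetic stand-ins, not the fracture problem:

        import numpy as np

        rng = np.random.default_rng(0)
        n, k = 400, 4
        A = -np.eye(n) + 0.01 * rng.normal(size=(n, n))  # stand-in full operator
        f = rng.normal(size=(n, 3))                      # parametric load basis

        def rhs(mu):
            return f @ np.array([1.0, mu, mu ** 2])

        # Offline: snapshots at training parameters, POD basis via SVD
        snapshots = np.column_stack([np.linalg.solve(A, rhs(mu))
                                     for mu in np.linspace(0.0, 1.0, 20)])
        U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
        V = U[:, :k]                                     # reduced basis

        # Online: Galerkin-projected k-by-k system for a new parameter
        mu_new = 0.37
        Ar, br = V.T @ A @ V, V.T @ rhs(mu_new)
        x_rom = V @ np.linalg.solve(Ar, br)              # lift back to full space
        x_fom = np.linalg.solve(A, rhs(mu_new))
        print("relative error:",
              np.linalg.norm(x_rom - x_fom) / np.linalg.norm(x_fom))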

  20. Characterization of Model-Based Reasoning Strategies for Use in IVHM Architectures

    NASA Technical Reports Server (NTRS)

    Poll, Scott; Iverson, David; Patterson-Hine, Ann

    2003-01-01

    Open architectures are gaining popularity for Integrated Vehicle Health Management (IVHM) applications due to the diversity of subsystem health monitoring strategies in use and the need to integrate a variety of techniques at the system health management level. The basic concept of an open architecture suggests that whatever monitoring or reasoning strategy a subsystem wishes to deploy, the system architecture will support the needs of that subsystem and will be capable of transmitting subsystem health status across subsystem boundaries and up to the system level for system-wide fault identification and diagnosis. There is a need to understand the capabilities of various reasoning engines and how they, coupled with intelligent monitoring techniques, can support fault detection and system level fault management. Researchers in IVHM at NASA Ames Research Center are supporting the development of an IVHM system for liquefying-fuel hybrid rockets. In the initial stage of this project, a few readily available reasoning engines were studied to assess candidate technologies for application in next generation launch systems. Three tools representing the spectrum of model-based reasoning approaches, from a quantitative simulation based approach to a graph-based fault propagation technique, were applied to model the behavior of the Hybrid Combustion Facility testbed at Ames. This paper summarizes the characterization of the modeling process for each of the techniques.

  1. Diverse Short-Term Dynamics of Inhibitory Synapses Converging on Striatal Projection Neurons: Differential Changes in a Rodent Model of Parkinson's Disease

    PubMed Central

    Herrera-Valdez, Marco A.; Lopez-Huerta, Violeta Gisselle; Galarraga, Elvira

    2015-01-01

    Most neurons in the striatum are projection neurons (SPNs), which make synapses with each other within distances of approximately 100 µm. About 5% of striatal neurons are GABAergic interneurons, whose axons extend hundreds of microns. Short-term synaptic plasticity (STSP) between fast-spiking (FS) interneurons and SPNs and between SPNs has been described with electrophysiological and optogenetic techniques. It is difficult to obtain pair recordings from some classes of interneurons, and due to limitations of current techniques, no other types of STSP have been described on SPNs. Diverse STSPs may reflect differences in presynaptic release machineries. Therefore, we focused the present work on answering two questions: Are there different identifiable classes of STSP between GABAergic synapses on SPNs? And, if so, are synapses exhibiting different classes of STSP differentially affected by dopamine depletion? Whole-cell voltage-clamp recordings on SPNs revealed three classes of STSPs: depressing, facilitating, and biphasic (facilitating-depressing), in response to stimulation trains at 20 Hz in a constant ionic environment. We then used the 6-hydroxydopamine (6-OHDA) rodent model of Parkinson's disease to show that synapses with different STSPs are differentially affected by dopamine depletion. We propose a general model of STSP that fits all the dynamics found in our recordings. PMID:26167304
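
    A common way to capture depressing, facilitating, and biphasic dynamics in one model is the Tsodyks-Markram formulation, in which each spike releases a fraction u of a recovering resource x. The abstract does not specify the authors' model, so the sketch below is that standard formulation with illustrative parameters; different (U, tau_rec, tau_fac) choices reproduce the three classes:

        import numpy as np

        def tm_synapse(spike_times, U=0.45, tau_rec=0.5, tau_fac=0.05):
            """Tsodyks-Markram short-term plasticity: each spike releases a
            fraction u of resource x; x recovers with tau_rec (depression) and
            u decays back to U with tau_fac (facilitation). Returns response
            amplitudes normalized to the first response."""
            x, u, last = 1.0, U, None
            amps = []
            for t in spike_times:
                if last is not None:
                    dt = t - last
                    x = 1.0 - (1.0 - x) * np.exp(-dt / tau_rec)  # recovery
                    u = U + (u - U) * np.exp(-dt / tau_fac)      # facil. decay
                amps.append(u * x)
                x -= u * x                 # deplete resources
                u += U * (1.0 - u)         # facilitate release probability
                last = t
            return np.array(amps) / amps[0]

        train = np.arange(10) * 0.05       # 20 Hz train, as in the recordings
        print(np.round(tm_synapse(train), 2))                      # depressing
        print(np.round(tm_synapse(train, U=0.1, tau_fac=0.5), 2))  # facilitating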

  2. Influence of three common calibration metrics on the diagnosis of climate change impacts on water resources

    NASA Astrophysics Data System (ADS)

    Seiller, G.; Roy, R.; Anctil, F.

    2017-04-01

    Uncertainties associated with the evaluation of the impacts of climate change on water resources are broad, come from multiple sources, and lead to diagnoses that are sometimes difficult to interpret. Quantification of these uncertainties is a key element to yield confidence in the analyses and to provide water managers with valuable information. This work specifically evaluates the influence of hydrological modeling calibration metrics on future water resources projections, on thirty-seven watersheds in the Province of Québec, Canada. Twelve lumped hydrologic models, representing a wide range of operational options, are calibrated with three common objective functions derived from the Nash-Sutcliffe efficiency. The hydrologic models are forced with climate simulations corresponding to two RCPs, twenty-nine GCMs from CMIP5 (Coupled Model Intercomparison Project phase 5) and two post-treatment techniques, leading to future projections for the 2041-2070 period. Results show that the diagnosis of the impacts of climate change on water resources is quite affected by the hydrologic model selection and calibration metrics. Indeed, for the four selected hydrological indicators, dedicated to water management, parameters from the three objective functions can provide different interpretations in terms of absolute and relative changes, as well as projected change direction and climatic ensemble consensus. The GR4J model and a multimodel approach offer the best modeling options, based on calibration performance and robustness. Overall, these results illustrate the need to provide water managers with detailed information on relative change analyses, but also absolute change values, especially for hydrological indicators acting as security policy thresholds.
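
    The three calibration metrics are NSE-derived; a common family applies the same efficiency formula to transformed flows (e.g., square-root or log transforms to emphasize mid or low flows). Whether these are the paper's exact three variants is an assumption; the sketch below shows the pattern with made-up flows:

        import numpy as np

        def nse(sim, obs, transform=lambda q: q):
            """Nash-Sutcliffe efficiency, optionally on transformed flows."""
            s = transform(np.asarray(sim, float))
            o = transform(np.asarray(obs, float))
            return 1.0 - np.sum((s - o) ** 2) / np.sum((o - o.mean()) ** 2)

        sim = np.array([10.0, 14.0, 30.0, 22.0, 8.0])   # hypothetical flows
        obs = np.array([12.0, 15.0, 26.0, 25.0, 7.0])
        print("NSE     :", round(nse(sim, obs), 3))
        print("NSE sqrt:", round(nse(sim, obs, np.sqrt), 3))             # mid flows
        print("NSE log :", round(nse(sim, obs, lambda q: np.log(q)), 3)) # low flows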

  3. Reliability ensemble averaging of 21st century projections of terrestrial net primary productivity reduces global and regional uncertainties

    NASA Astrophysics Data System (ADS)

    Exbrayat, Jean-François; Bloom, A. Anthony; Falloon, Pete; Ito, Akihiko; Smallman, T. Luke; Williams, Mathew

    2018-02-01

    Multi-model averaging techniques provide opportunities to extract additional information from large ensembles of simulations. In particular, present-day model skill can be used to evaluate their potential performance in future climate simulations. Multi-model averaging methods have been used extensively in climate and hydrological sciences, but they have not been used to constrain projected plant productivity responses to climate change, which is a major uncertainty in Earth system modelling. Here, we use three global observationally orientated estimates of current net primary productivity (NPP) to perform a reliability ensemble averaging (REA) method using 30 global simulations of the 21st century change in NPP based on the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) business as usual emissions scenario. We find that the three REA methods support an increase in global NPP by the end of the 21st century (2095-2099) compared to 2001-2005, which is 2-3 % stronger than the ensemble ISIMIP mean value of 24.2 Pg C y-1. Using REA also leads to a 45-68 % reduction in the global uncertainty of 21st century NPP projection, which strengthens confidence in the resilience of the CO2 fertilization effect to climate change. This reduction in uncertainty is especially clear for boreal ecosystems although it may be an artefact due to the lack of representation of nutrient limitations on NPP in most models. Conversely, the large uncertainty that remains on the sign of the response of NPP in semi-arid regions points to the need for better observations and model development in these regions.
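
    The essence of reliability ensemble averaging is to weight each model's projected change by its present-day skill. The sketch below uses a simple inverse-bias weight with synthetic numbers (the published REA method also folds in a model-convergence criterion, omitted here):

        import numpy as np

        rng = np.random.default_rng(2)
        obs_npp = 56.0                                   # stand-in observed NPP, Pg C/yr
        hist = obs_npp + rng.normal(0, 6, size=30)       # 30 models, historical NPP
        change = rng.normal(13.0, 8.0, size=30)          # projected delta-NPP per model

        skill = 1.0 / np.abs(hist - obs_npp)             # inverse-bias reliability
        w = skill / skill.sum()

        rea_change = np.sum(w * change)
        rea_spread = np.sqrt(np.sum(w * (change - rea_change) ** 2))
        print(f"unweighted mean: {change.mean():.1f}, "
              f"REA: {rea_change:.1f} +/- {rea_spread:.1f} Pg C/yr")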

  4. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablonowski, Christiane

    The research investigates and advances strategies to bridge the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes were chosen: (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.

  5. Determining the Marker Configuration and Modeling Technique to Optimize the Biomechanical Analysis of Running-Specific Prostheses

    DTIC Science & Technology

    2012-03-01

    … The approved Statement of Work proposed the following timeline (Table 1). … prosthesis designs (Figure 1) were tested for this project, including the 1E90 Sprinter (OttoBock Inc.), Flex-Run (Ossur), Cheetah® (Ossur) and Nitro …

  6. Organizational Analysis and Career Projections Based on a Level-of-Responsibility/Equitable Payment Model. Technical Report.

    ERIC Educational Resources Information Center

    Laner, Stephen; And Others

    Following an explanation of the Level of Responsibility/Equitable Pay Function, its applicability is demonstrated to the analysis and to the design and redesign of organizational hierarchies. It is shown how certain common dysfunctional anomalies can be avoided by structuring an organization along the principles outlined. A technique is then…

  7. The Transient Excitation and Oscillation Testing Technique Applied to a Captive Model.

    DTIC Science & Technology

    1981-06-01

    … David W. Taylor Naval Ship Research and Development Center … corrective tare terms (C11, C35, C53 and C55) used in calculating the nondimensional oscillation coefficients. Figures 7-14 contain comparisons of …

  8. Teaching Record-Keeping Skills to 4-H Youths through Experiential Learning Techniques

    ERIC Educational Resources Information Center

    Roland, Tyanne J.; Fisher, Meredith

    2016-01-01

    Teaching record keeping for breeding projects in a way that keeps youths engaged is a difficult task. The activity discussed in this article was used to teach 4-H participants the importance of record keeping by implementing the experiential learning model and without lecturing. A description of the activity, instructions and materials for the…

  9. Model Solar Energy Training Program II. Final Report, July 1, 1981-June 30, 1982.

    ERIC Educational Resources Information Center

    Talcott Mountain Science Center, Avon, CT.

    Trained personnel will be needed in the future to install solar energy heating and hot water systems, and public school vocational education teachers will be needed to train these technicians. A project to train high school vocational teachers so that they can teach their students about solar energy concepts, manufacturing techniques, testing, and…

  10. AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, D.; Alfonsi, A.; Talbot, P.

    2016-10-01

    The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactor's primary and secondary systems, but also external event temporal evolution and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, seconds-hours-years). As part of the RISMC CRA approach, a large number of computationally expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable for certain cases. A solution being evaluated to address the computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs; for this analysis improvement we used surrogate models instead of the actual simulation codes. This article focuses on the use of reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (microseconds instead of hours/days).
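
    The surrogate idea above (replace an expensive code with a cheap approximation fitted to a few runs) can be shown with a quadratic response surface fitted by least squares; the "simulation" here is a toy analytic function standing in for a thermal-hydraulics code:

        import numpy as np

        def expensive_simulation(x):
            """Stand-in for a long-running physics code (scalar response)."""
            return np.sin(3 * x[..., 0]) + 0.5 * x[..., 1] ** 2

        rng = np.random.default_rng(3)
        X_train = rng.uniform(-1, 1, size=(60, 2))   # a handful of costly runs
        y_train = expensive_simulation(X_train)

        def features(X):
            """Quadratic polynomial basis in two inputs."""
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones_like(x1), x1, x2,
                                    x1 * x2, x1 ** 2, x2 ** 2])

        coef, *_ = np.linalg.lstsq(features(X_train), y_train, rcond=None)

        X_new = rng.uniform(-1, 1, size=(5, 2))
        print("surrogate:", np.round(features(X_new) @ coef, 3))   # microseconds
        print("truth    :", np.round(expensive_simulation(X_new), 3))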

  11. [Parallel virtual reality visualization of extremely large medical datasets].

    PubMed

    Tang, Min

    2010-04-01

    On the basis of a brief description of grid computing, the essence and critical techniques of parallel visualization of extremely large medical datasets are discussed in connection with the Intranet and commonly configured computers of hospitals. Several kernel techniques are introduced, including the hardware structure, software framework, load balancing and virtual reality visualization. The Maximum Intensity Projection algorithm is parallelized on a common PC cluster. In the virtual reality world, three-dimensional models can be rotated, zoomed, translated and cut interactively and conveniently through a control panel built on the Virtual Reality Modeling Language (VRML). Experimental results demonstrate that this method provides promising, real-time results and can serve as a good assistant in making clinical diagnoses.
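
    For readers unfamiliar with the projection step, the Python sketch below shows Maximum Intensity Projection parallelized over slabs of a volume with a process pool. It illustrates the algorithm only; the paper's PC-cluster implementation is not reproduced here, and the volume is synthetic.

        # A minimal sketch of parallel Maximum Intensity Projection: each worker
        # projects one slab of the volume, and the partial images are combined
        # with an element-wise maximum. Synthetic volume, not clinical data.
        import numpy as np
        from multiprocessing import Pool

        def mip_slab(slab):
            # Keep, for each ray (voxel column), the maximum intensity.
            return slab.max(axis=0)

        def parallel_mip(volume, n_workers=4):
            slabs = np.array_split(volume, n_workers, axis=0)  # split by depth
            with Pool(n_workers) as pool:
                partials = pool.map(mip_slab, slabs)
            return np.maximum.reduce(partials)

        if __name__ == "__main__":
            volume = np.random.rand(64, 256, 256)  # toy volume (depth, h, w)
            print(parallel_mip(volume).shape)      # (256, 256) projected image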

  12. Methods for studying the zebrafish brain: past, present and future.

    PubMed

    Wyatt, Cameron; Bartoszek, Ewelina M; Yaksi, Emre

    2015-07-01

    The zebrafish (Danio rerio) is one of the most promising new model organisms. The increasing popularity of this amazing small vertebrate is evident from the exponentially growing numbers of research articles, funded projects and new discoveries associated with the use of zebrafish for studying development, brain function, human diseases and screening for new drugs. Thanks to the development of novel technologies, the range of zebrafish research is constantly expanding with new tools synergistically enhancing traditional techniques. In this review we will highlight the past and present techniques which have made, and continue to make, zebrafish an attractive model organism for various fields of biology, with a specific focus on neuroscience. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  13. Lower bounds for the ground state energy for the PPP and Hubbard models of the benzene molecule

    NASA Astrophysics Data System (ADS)

    Čížek, J.; Vinette, F.

    1988-09-01

    The optimized inner projection (OIP) technique, which is equivalent to the method of intermediate Hamiltonians (MIH), is applied to the PPP and Hubbard models of the benzene molecule. Both these methods are applicable since the electrostatic part of the PPP and Hubbard Hamiltonians is positive definite. Lower energy bounds are calculated using OIP and MIH for all values of the resonance integral β. In this study, β plays the role of a coupling constant. The deviation of the OIP results from exact ones is smaller than 7% for all values of β. The OIP results are also compared with the correlation energies obtained by other techniques. The OIP method gives surprisingly good results even for small |β| values.
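
    As a concrete illustration of how a resonance integral enters such models, the Python sketch below exactly diagonalizes the smallest Hubbard system, the two-site dimer at half filling, and checks the result against the closed-form ground-state energy E0 = (U - sqrt(U^2 + 16 t^2))/2. It illustrates only the model, with the hopping t playing the role of the coupling, not the OIP/MIH lower-bound machinery applied in the paper.

        # A toy check, not the OIP/MIH method: exact diagonalization of the
        # two-site Hubbard model at half filling versus the closed form
        # E0 = (U - sqrt(U**2 + 16*t**2)) / 2.
        import numpy as np

        def hubbard_dimer_gs(t, U):
            # Sz = 0 basis: |up,dn>, |dn,up>, |updn,0>, |0,updn>
            H = np.array([[0.0, 0.0,  -t,  -t],
                          [0.0, 0.0,   t,   t],
                          [ -t,   t,   U, 0.0],
                          [ -t,   t, 0.0,   U]])
            return np.linalg.eigvalsh(H).min()

        U = 4.0
        for t in (0.5, 1.0, 2.0):
            closed_form = 0.5 * (U - np.sqrt(U**2 + 16 * t**2))
            print(t, hubbard_dimer_gs(t, U), closed_form)  # values agree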

  14. NASA. Lewis Research Center Advanced Modulation and Coding Project: Introduction and overview

    NASA Technical Reports Server (NTRS)

    Budinger, James M.

    1992-01-01

    The Advanced Modulation and Coding Project at LeRC is sponsored by the Office of Space Science and Applications, Communications Division, Code EC, at NASA Headquarters and conducted by the Digital Systems Technology Branch of the Space Electronics Division. Advanced Modulation and Coding is one of three focused technology development projects within the branch's overall Processing and Switching Program. The program consists of industry contracts for developing proof-of-concept (POC) and demonstration model hardware, university grants for analyzing advanced techniques, and in-house integration and testing of performance verification and systems evaluation. The Advanced Modulation and Coding Project is broken into five elements: (1) bandwidth- and power-efficient modems; (2) high-speed codecs; (3) digital modems; (4) multichannel demodulators; and (5) very high-data-rate modems. At least one contract and one grant were awarded for each element.

  15. Present-value analysis: A systems approach to public decisionmaking for cost effectiveness

    NASA Technical Reports Server (NTRS)

    Herbert, T. T.

    1971-01-01

    Decision makers within Governmental agencies and Congress must evaluate competing (and sometimes conflicting) proposals which seek funding and implementation. Present-value analysis can be an effective decision-making tool by enabling the formal evaluation of the effects of competing proposals on efficient national resource utilization. A project's costs are not only its direct disbursements, but its social costs as well: how much does it cost to have those funds diverted from their use and economic benefit by the private sector to the public project? Comparisons of competing projects' social costs allow decision makers to expand their decision bases by quantifying the projects' impacts upon the economy and the efficient utilization of the country's limited national resources. A conceptual model is established for choosing the appropriate discount rate to be used in evaluations made with this technique.
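
    The mechanics of present-value analysis reduce to discounting each year's net cash flow back to the present and summing. The Python sketch below compares two hypothetical projects; the cash flows and the 7% discount rate are invented for illustration.

        # A minimal sketch of present-value analysis; the projects, cash flows
        # and 7% discount rate are hypothetical.
        def net_present_value(cash_flows, discount_rate):
            # cash_flows[t] is the net benefit (benefits minus costs, social
            # costs included) in year t; year 0 is the present.
            return sum(cf / (1.0 + discount_rate) ** t
                       for t, cf in enumerate(cash_flows))

        project_a = [-100.0, 30.0, 30.0, 30.0, 30.0]  # heavy upfront cost
        project_b = [-60.0, 15.0, 15.0, 15.0, 35.0]   # back-loaded returns

        rate = 0.07
        print(net_present_value(project_a, rate))  # ~1.6
        print(net_present_value(project_b, rate))  # ~6.1, preferred here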

  16. Results and Error Estimates from GRACE Forward Modeling over Antarctica

    NASA Astrophysics Data System (ADS)

    Bonin, Jennifer; Chambers, Don

    2013-04-01

    Forward modeling using a weighted least squares technique allows GRACE information to be projected onto a pre-determined collection of local basins. This decreases the impact of spatial leakage, allowing estimates of mass change to be better localized. The technique is especially valuable where models of current-day mass change are poor, such as over Antarctica. However, when tested previously, the least squares technique has required constraints in the form of added process noise in order to be reliable. Poor choice of local basin layout has also adversely affected results, as has the choice of spatial smoothing used with GRACE. To develop design parameters which will result in correct high-resolution mass detection, and to estimate the systematic errors of the method over Antarctica, we use a "truth" simulation of the Antarctic signal. We apply the optimal parameters found from the simulation to RL05 GRACE data across Antarctica and the surrounding ocean. We particularly focus on separating the Antarctic peninsula's mass signal from that of the rest of western Antarctica. Additionally, we characterize how well the technique works for removing land leakage signal from the nearby ocean, particularly near the Drake Passage.
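
    The projection step itself is ordinary weighted least squares: solve the normal equations m = (A^T W A)^{-1} A^T W d, where the columns of A are basin footprints and W holds inverse-variance weights. A minimal Python sketch with synthetic basins and data (not GRACE products) follows.

        # A minimal sketch of the weighted least squares projection; the basin
        # "footprints", weights, and observations are synthetic placeholders.
        import numpy as np

        rng = np.random.default_rng(1)
        n_points, n_basins = 500, 6

        A = rng.random((n_points, n_basins))                  # basin patterns
        m_true = np.array([3.0, -1.0, 0.5, 0.0, 2.0, -0.5])   # per-basin mass
        d = A @ m_true + rng.normal(0.0, 0.1, n_points)       # noisy obs

        w = np.full(n_points, 1.0 / 0.1**2)   # inverse-variance weights
        WA = A * w[:, None]                   # W applied row-wise

        # Normal equations: m = (A^T W A)^{-1} A^T W d
        m_hat = np.linalg.solve(WA.T @ A, WA.T @ d)
        print(m_hat)                          # recovers m_true within noise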

  17. Cognitive measure on different profiles.

    PubMed

    Spindola, Marilda; Carra, Giovani; Balbinot, Alexandre; Zaro, Milton A

    2010-01-01

    Many studies grounded in neurology and cognitive science have been developed to understand the human mental model, that is, how human cognition works, especially in learning processes that involve complex content and spatial-logical reasoning. The Event-Related Potential (ERP) is a basic, non-invasive method of electrophysiological investigation. It can be used to assess aspects of human cognitive processing, since changes in the rhythm of the brain's frequency bands indicate some type of processing or neuronal behavior. This paper uses the ERP technique to help understand the cognitive pathways of subjects from different areas of knowledge when they are exposed to an external visual stimulus. In the experiment we used 2D and 3D visual stimuli in the same picture. The signals were captured using a 10-channel electroencephalogram (EEG) system developed for this project and interfaced to an analog-to-digital converter (ADC) board with the LabVIEW system (National Instruments). The research was performed using the design of experiments (DOE) technique. Signal processing (mathematical and statistical techniques) was performed, showing the relationship between cognitive pathways within and between groups.
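
    The heart of the ERP method is averaging many stimulus-locked EEG epochs so that activity time-locked to the stimulus survives while background rhythms cancel. The Python sketch below demonstrates this on synthetic data; it is not the authors' LabVIEW pipeline, and all parameters are assumed.

        # A minimal sketch of the ERP averaging idea on synthetic data: a small
        # evoked response near 300 ms is buried in noise on every trial and is
        # recovered by averaging stimulus-locked epochs. Parameters are assumed.
        import numpy as np

        fs = 250                          # sampling rate in Hz (assumed)
        n_trials, n_samples = 200, fs     # 200 one-second epochs
        t = np.arange(n_samples) / fs

        rng = np.random.default_rng(2)
        evoked = 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
        epochs = evoked + rng.normal(0.0, 5.0, (n_trials, n_samples))

        erp = epochs.mean(axis=0)         # averaging reveals the component
        print(t[np.argmax(erp)])          # ~0.3 s, the simulated peak latency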

  18. Entanglement entropy of critical spin liquids.

    PubMed

    Zhang, Yi; Grover, Tarun; Vishwanath, Ashvin

    2011-08-05

    Quantum spin liquids are phases of matter whose internal structure is not captured by a local order parameter. Particularly intriguing are critical spin liquids, where strongly interacting excitations control low energy properties. Here we calculate their bipartite entanglement entropy that characterizes their quantum structure. In particular we calculate the Renyi entropy S(2) on model wave functions obtained by Gutzwiller projection of a Fermi sea. Although the wave functions are not sign positive, S(2) can be calculated on relatively large systems (>324 spins) using the variational Monte Carlo technique. On the triangular lattice we find that entanglement entropy of the projected Fermi sea state violates the boundary law, with S(2) enhanced by a logarithmic factor. This is an unusual result for a bosonic wave function reflecting the presence of emergent fermions. These techniques can be extended to study a wide class of other phases.
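
    For small systems the quantity being estimated can be written down directly: S2 = -ln Tr(rho_A^2) for a bipartition into A and B. The Python sketch below evaluates that definition by brute force on a random pure state; the paper's variational Monte Carlo approach is what makes systems of hundreds of spins tractable.

        # A brute-force sketch of the definition S2 = -ln Tr(rho_A^2) on a
        # small random pure state; only a Monte Carlo estimator like the
        # paper's scales to hundreds of spins.
        import numpy as np

        n_a, n_b = 4, 4                   # qubits in subsystems A and B
        dim_a, dim_b = 2**n_a, 2**n_b

        rng = np.random.default_rng(3)
        psi = rng.normal(size=(dim_a, dim_b)) \
            + 1j * rng.normal(size=(dim_a, dim_b))
        psi /= np.linalg.norm(psi)        # random pure state on A (x) B

        rho_a = psi @ psi.conj().T        # reduced density matrix of A
        purity = np.trace(rho_a @ rho_a).real
        print(-np.log(purity))            # S2, of order ln(dim_a) here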

  19. Measuring the free neutron lifetime to <= 0.3s via the beam method

    NASA Astrophysics Data System (ADS)

    Mulholland, Jonathan; Fomin, Nadia; BL3 Collaboration

    2015-10-01

    Neutron beta decay is an archetype for all semi-leptonic charged-current weak processes. A precise value for the neutron lifetime is required for consistency tests of the Standard Model and is needed to predict the primordial 4He abundance from the theory of Big Bang Nucleosynthesis. An effort has begun for an in-beam measurement of the neutron lifetime with a projected <=0.3 s uncertainty. This effort is part of a phased campaign of neutron lifetime measurements based at the NIST Center for Neutron Research, using the Sussex-ILL-NIST technique. Recent advances in neutron fluence measurement techniques, as well as new large-area silicon detector technology, address the two largest sources of uncertainty of in-beam measurements, paving the way for a new measurement. The experimental design and projected uncertainties for the 0.3 s measurement will be discussed.

  20. Simple Assessment Techniques for Soil and Water. Environmental Factors in Small Scale Development Projects. Workshops.

    ERIC Educational Resources Information Center

    Coordination in Development, New York, NY.

    This booklet was produced in response to the growing need for reliable environmental assessment techniques that can be applied to small-scale development projects. The suggested techniques emphasize low-technology environmental analysis. Although these techniques may lack precision, they can be extremely valuable in helping to assure the success…
