Science.gov

Sample records for scenario based performance

  1. Mars base buildup scenarios

    NASA Technical Reports Server (NTRS)

    Blacic, J. D.

    1986-01-01

    Two Mars surface base build-up scenarios are presented in order to help visualize the mission and to serve as a basis for trade studies. In the first scenario, direct manned landings on the Martian surface occur early in the missions and scientific investigation is the main driver and rationale. In the second scenario, early development of an infrastructure to exploit the volatile resources of the Martian moons for economic purposes is emphasized. Scientific exploration of the surface is delayed at first in this scenario relative to the first, but once begun develops rapidly, aided by the presence of a permanently manned orbital station.

  2. Mars base buildup scenarios

    SciTech Connect

    Blacic, J.D.

    1985-01-01

    Two surface base build-up scenarios are presented in order to help visualize the mission and to serve as a basis for trade studies. In the first scenario, direct manned landings on the Martian surface occur early in the missions and scientific investigation is the main driver and rationale. In the second scenario, early development of an infrastructure to exploit the volatile resources of the Martian moons for economic purposes is emphasized. Scientific exploration of the surface is delayed at first, but once begun develops rapidly, aided by the presence of a permanently manned orbital station.

  3. Web Based Tool for Mission Operations Scenarios

    NASA Technical Reports Server (NTRS)

    Boyles, Carole A.; Bindschadler, Duane L.

    2008-01-01

    A conventional practice for spaceflight projects is to document scenarios in a monolithic Operations Concept document. Such documents can be hundreds of pages long and may require laborious updates. Software development practice utilizes scenarios in the form of smaller, individual use cases, which are often structured and managed using UML. We have developed a process and a web-based scenario tool that utilizes a similar philosophy of smaller, more compact scenarios (but avoids the formality of UML). The need for a scenario process and tool became apparent during the authors' work on a large astrophysics mission. It was noted that every phase of the Mission (e.g., formulation, design, verification and validation, and operations) looked back to scenarios to assess completeness of requirements and design. It was also noted that terminology needed to be clarified and structured to assure communication across all levels of the project. Attempts to manage, communicate, and evolve scenarios at all levels of a project using conventional tools (e.g., Excel) and methods (Scenario Working Group meetings) were not effective given limitations on budget and staffing. The objective of this paper is to document the scenario process and tool created to offer projects a low-cost capability to create, communicate, manage, and evolve scenarios throughout project development. The process and tool have the further benefit of allowing the association of requirements with particular scenarios, establishing and viewing relationships between higher- and lower-level scenarios, and the ability to place all scenarios in a shared context. The resulting structured set of scenarios is widely visible (using a web browser), easily updated, and can be searched according to various criteria including the level (e.g., Project, System, and Team) and Mission Phase. Scenarios are maintained in a web-accessible environment that provides a structured set of scenario fields and allows for maximum
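    The abstract describes scenarios as small structured records (level, mission phase, linked requirements, relationships to higher- and lower-level scenarios) kept in a searchable web repository. Below is a minimal sketch of such a record and a search over it; all field names are assumptions made for illustration, since the tool's actual schema is not given.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Scenario:
    """One compact scenario record (field names are illustrative only)."""
    scenario_id: str
    title: str
    level: str                  # e.g. "Project", "System", "Team"
    mission_phase: str          # e.g. "Formulation", "Operations"
    narrative: str
    requirement_ids: List[str] = field(default_factory=list)  # traced requirements
    parent_ids: List[str] = field(default_factory=list)       # higher-level scenarios

def search(scenarios: List[Scenario],
           level: Optional[str] = None,
           phase: Optional[str] = None) -> List[Scenario]:
    """Filter a shared scenario repository by level and/or mission phase."""
    return [s for s in scenarios
            if (level is None or s.level == level)
            and (phase is None or s.mission_phase == phase)]

# Example: all System-level scenarios relevant to the Operations phase
repo = [Scenario("S-001", "Downlink pass", "System", "Operations", "Nominal pass timeline")]
print([s.title for s in search(repo, level="System", phase="Operations")])
```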

  4. The Impact of Collegiate Aviation Student Learning Styles on Flight Performance: A Scenario-Based Training Approach

    ERIC Educational Resources Information Center

    Harriman, Stanley L.

    2011-01-01

    The introduction of the glass cockpit, as well as a whole new generation of high performance general aviation aircraft, highlights the need for a comprehensive overhaul of the traditional approach to training pilots. Collegiate aviation institutions that are interested in upgrading their training aircraft fleets will need to design new curricula…

  5. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    SciTech Connect

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie; Mandelli, Diego; Smith, Curtis Lee

    2015-09-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  6. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...
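    The record's list of metrics is cut off above; as a hedged illustration of the kind of statistical goodness-of-fit measures such a tool typically reports (not necessarily MPESA's exact set), here are two common ones, root-mean-square error and Nash-Sutcliffe efficiency.

```python
import numpy as np

def rmse(obs, sim):
    """Root-mean-square error between observed and simulated series."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, <= 0 is no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2))

# Example: compare a model run against observations
obs, sim = [1.0, 2.0, 3.0], [1.1, 1.9, 3.2]
print(rmse(obs, sim), nse(obs, sim))
```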

  7. Scenario-Based E-Learning Design

    ERIC Educational Resources Information Center

    Iverson, Kathleen; Colkey, Deborah

    2004-01-01

    As it was initially implemented, e-learning did little other than supply facts and information, offering limited opportunity for interactivity and problem-solving. Designers need to find ways to address past limitations and bring the engagement of classroom training to the web. One method that merits attention is scenario-based learning. The…

  8. Mars Scenario-Based Visioning: Logistical Optimization of Transportation Architectures

    NASA Astrophysics Data System (ADS)

    1999-01-01

    The purpose of this conceptual design investigation is to examine transportation forecasts for future human missions to Mars. Scenario-Based Visioning is used to generate possible future demand projections. These scenarios are then coupled with availability, cost, and capacity parameters for indigenously designed Mars Transfer Vehicles (solar electric, nuclear thermal, and chemical propulsion types) and Earth-to-Orbit launch vehicles (current, future, and indigenous) to provide a cost-conscious dual-phase launch manifest to meet such future demand. A simulator named M-SAT (Mars Scenario Analysis Tool) is developed using this method. This simulation is used to examine three specific transportation scenarios to Mars: a limited "flags and footprints" mission, a more ambitious scientific expedition similar to an expanded version of the Design Reference Mission from NASA, and a long-term colonization scenario. Initial results from the simulation indicate that chemical propulsion systems might be the architecture of choice for all three scenarios. With this in mind, "what if" analyses were performed which indicated that if nuclear production costs were reduced by 30% for the colonization scenario, then the nuclear architecture would have a lower life cycle cost than the chemical. Results indicate that the most cost-effective solution to the Mars transportation problem is to plan for segmented development: development of one vehicle at one opportunity and derivatives of that vehicle at subsequent opportunities.

  9. Reliable Freestanding Position-Based Routing in Highway Scenarios

    PubMed Central

    Galaviz-Mosqueda, Gabriel A.; Aquino-Santos, Raúl; Villarreal-Reyes, Salvador; Rivera-Rodríguez, Raúl; Villaseñor-González, Luis; Edwards, Arthur

    2012-01-01

    Vehicular Ad Hoc Networks (VANETs) are considered by car manufacturers and the research community as the enabling technology to radically improve the safety, efficiency and comfort of everyday driving. However, before VANET technology can fulfill all its expected potential, several difficulties must be addressed. One key issue arising when working with VANETs is the complexity of the networking protocols compared to those used by traditional infrastructure networks. Therefore, proper design of the routing strategy becomes a main issue for the effective deployment of VANETs. In this paper, a reliable freestanding position-based routing algorithm (FPBR) for highway scenarios is proposed. For this scenario, several important issues such as the high mobility of vehicles and the propagation conditions may affect the performance of the routing strategy. These constraints have only been partially addressed in previous proposals. In contrast, the design approach used for developing FPBR considered the constraints imposed by a highway scenario and implements mechanisms to overcome them. FPBR performance is compared to one of the leading protocols for highway scenarios. Performance metrics show that FPBR yields similar results when considering freespace propagation conditions, and outperforms the leading protocol when considering a realistic highway path loss model. PMID:23202159
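    The abstract does not spell out FPBR's forwarding rules, so the sketch below shows only the generic position-based (greedy geographic) next-hop selection that this family of VANET protocols builds on; the node positions, radio range, and selection rule are illustrative assumptions, not FPBR itself.

```python
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_next_hop(current, destination, neighbors, radio_range=250.0):
    """Generic position-based forwarding: pick the reachable neighbor that makes
    the most progress toward the destination (returns None at a local maximum)."""
    best, best_d = None, distance(current, destination)
    for node_id, pos in neighbors.items():
        if distance(current, pos) <= radio_range and distance(pos, destination) < best_d:
            best, best_d = node_id, distance(pos, destination)
    return best

# Example: vehicles on a straight highway segment (positions in metres)
neighbors = {"v1": (120.0, 0.0), "v2": (230.0, 3.5), "v3": (40.0, 0.0)}
print(greedy_next_hop((0.0, 0.0), (1000.0, 0.0), neighbors))  # -> "v2"
```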

  10. Reliable freestanding position-based routing in highway scenarios.

    PubMed

    Galaviz-Mosqueda, Gabriel A; Aquino-Santos, Raúl; Villarreal-Reyes, Salvador; Rivera-Rodríguez, Raúl; Villaseñor-González, Luis; Edwards, Arthur

    2012-01-01

    Vehicular Ad Hoc Networks (VANETs) are considered by car manufacturers and the research community as the enabling technology to radically improve the safety, efficiency and comfort of everyday driving. However, before VANET technology can fulfill all its expected potential, several difficulties must be addressed. One key issue arising when working with VANETs is the complexity of the networking protocols compared to those used by traditional infrastructure networks. Therefore, proper design of the routing strategy becomes a main issue for the effective deployment of VANETs. In this paper, a reliable freestanding position-based routing algorithm (FPBR) for highway scenarios is proposed. For this scenario, several important issues such as the high mobility of vehicles and the propagation conditions may affect the performance of the routing strategy. These constraints have only been partially addressed in previous proposals. In contrast, the design approach used for developing FPBR considered the constraints imposed by a highway scenario and implements mechanisms to overcome them. FPBR performance is compared to one of the leading protocols for highway scenarios. Performance metrics show that FPBR yields similar results when considering freespace propagation conditions, and outperforms the leading protocol when considering a realistic highway path loss model. PMID:23202159

  11. Future Scenarios for Fission Based Reactors

    NASA Astrophysics Data System (ADS)

    David, S.

    2005-04-01

    The coming century will see the exhaustion of standard fossil fuels, coal, gas and oil, which today represent 75% of the world energy production. Moreover, their use will have caused large-scale emission of greenhouse gases (GEG), and induced global climate change. This problem is exacerbated by a growing world energy demand. In this context, nuclear power is the only GEG-free energy source available today capable of responding significantly to this demand. Some scenarios consider a nuclear energy production of around 5 Gtoe in 2050, which would represent a 20% share of the world energy supply. Present reactors generate energy from the fission of U-235 and require around 200 tons of natural Uranium to produce 1 GWe·y of energy, equivalent to the fission of one ton of fissile material. In a scenario of a significant increase in nuclear energy generation, these standard reactors will consume the whole of the world's estimated Uranium reserves in a few decades. However, natural Uranium or Thorium ore, which are not themselves fissile, can produce a fissile material after a neutron capture (239Pu and 233U, respectively). In a breeder reactor, the mass of fissile material remains constant, and the fertile ore is the only material to be consumed. In this case, only 1 ton of natural ore is needed to produce 1 GWe·y. Thus, the breeding concept allows optimal use of fertile ore and development of sustainable nuclear energy production for several thousand years into the future. Different sustainable nuclear reactor concepts are studied in the international forum "Generation IV". Different types of coolant (Na, Pb and He) are studied for fast breeder reactors based on the Uranium cycle. The thermal Thorium cycle requires the use of a liquid fuel, which can be reprocessed online in order to extract the neutron poisons. This paper presents these different sustainable reactors, based on the Uranium or Thorium fuel cycles and will compare the different options in terms of fissile
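    The figures quoted above (roughly one ton of heavy metal fissioned, or about 200 tons of natural uranium in a once-through reactor, per GWe·y) can be checked with a back-of-envelope calculation, assuming about 200 MeV released per fission and a thermal-to-electric efficiency near 33%:

```latex
% Back-of-envelope check (assumes ~200 MeV per fission, ~33% thermal-to-electric efficiency)
\[
  1\ \text{g fissioned} \approx \frac{6.0\times10^{23}}{235}\ \text{fissions}\times 200\ \text{MeV}
  \approx 8.2\times10^{10}\ \text{J} \approx 0.95\ \text{MW}_{th}\,\text{d},
\]
\[
  \Rightarrow\ 1\ \text{t fissioned} \approx 10^{6}\ \text{MW}_{th}\,\text{d}
  \approx 2.7\ \text{GW}_{th}\,\text{y} \approx 0.9\ \text{GW}_{e}\,\text{y}
  \quad (\eta \approx 0.33).
\]
```

    Since natural uranium is only about 0.7% 235U and once-through reactors end up fissioning well under 1% of the mined heavy metal, the figure of roughly 200 t of natural uranium per GWe·y follows, whereas a breeder that converts fertile 238U or 232Th can in principle fission nearly all of a single ton of ore.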

  12. Wiki Based Collaborative Learning in Interuniversity Scenarios

    ERIC Educational Resources Information Center

    Katzlinger, Elisabeth; Herzog, Michael A.

    2014-01-01

    In business education advanced collaboration skills and media literacy are important for surviving in a globalized business where virtual communication between enterprises is part of the day-by-day business. To transform these global working situations into higher education, a learning scenario between two universities in Germany and Austria was…

  13. Flooding Capability for River-based Scenarios

    SciTech Connect

    Smith, Curtis L.; Prescott, Steven; Ryan, Emerald; Calhoun, Donna; Sampath, Ramprasad; Anderson, S. Danielle; Casteneda, Cody

    2015-10-01

    This report describes the initial investigation into modeling and simulation tools for application of riverine flooding representation as part of the Risk-Informed Safety Margin Characterization (RISMC) Pathway external hazards evaluations. The report provides examples of different flooding conditions and scenarios that could impact river and watershed systems. Both 2D and 3D modeling approaches are described.

  14. Scenarios Based on Shared Socioeconomic Pathway Assumptions

    NASA Astrophysics Data System (ADS)

    Edmonds, J.

    2013-12-01

    A set of new scenarios is being developed by the international scientific community as part of a larger program that was articulated in Moss et al. (2009), published in Nature. A long series of meetings including climate researchers drawn from the climate modeling, impacts, adaptation and vulnerability (IAV) and integrated assessment modeling (IAM) communities has led to the development of a set of five Shared Socioeconomic Pathways (SSPs), which define the state of human and natural societies at a macro scale over the course of the 21st century without regard to climate mitigation or change. SSPs were designed to explore a range of possible futures consistent with greater or lesser challenges to mitigation and challenges to adaptation. They include a narrative storyline and a set of quantified measures--e.g. demographic and economic profiles--that define the high-level state of society as it evolves over the 21st century under the assumption of no significant climate feedback. SSPs can be used to develop quantitative scenarios of human Earth systems using IAMs. IAMs produce information about greenhouse gas emissions, energy systems, the economy, agriculture and land use. Each set of SSPs will have a different human Earth system realization for each IAM. Five groups from the IAM community have begun to explore the implications of SSP assumptions for emissions, energy, economy, agriculture and land use. We report the quantitative results of initial experiments from those groups. A major goal of the Moss et al. strategy was to enable the use of CMIP5 climate model ensemble products for IAV research. CMIP5 climate scenarios used four Representative Concentration Pathway (RCP) scenarios, defined in terms of radiative forcing in the year 2100: 2.6, 4.5, 6.0, and 8.5 W m-2. There is no reason to believe that the SSPs will generate year 2100 levels of radiative forcing that correspond to the four RCP levels, though it is important that at least one SSP produce a

  15. Space mission scenario development and performance analysis tool

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Baker, John; Gilbert, John; Hanks, David

    2004-01-01

    This paper discusses a new and innovative approach for a rapid spacecraft multi-disciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during the formulation phase of a mission. Generally speaking, spacecraft resources are highly constrained on deep space missions and this approach makes it possible to maximize the use of existing resources to attain the best possible science return. This approach also has the potential benefit of reducing the risk of costly design changes made later in the design cycle necessary to meet the mission requirements by understanding system design sensitivities early and adding appropriate margins. This paper will describe the approach used by the Mars Science Laboratory Project to accomplish this result.

  16. The scenario-based generalization of radiation therapy margins

    NASA Astrophysics Data System (ADS)

    Fredriksson, Albin; Bokrantz, Rasmus

    2016-03-01

    We give a scenario-based treatment plan optimization formulation that is equivalent to planning with geometric margins if the scenario doses are calculated using the static dose cloud approximation. If the scenario doses are instead calculated more accurately, then our formulation provides a novel robust planning method that overcomes many of the difficulties associated with previous scenario-based robust planning methods. In particular, our method protects only against uncertainties that can occur in practice, it gives a sharp dose fall-off outside high dose regions, and it avoids underdosage of the target in ‘easy’ scenarios. The method shares the benefits of the previous scenario-based robust planning methods over geometric margins for applications where the static dose cloud approximation is inaccurate, such as irradiation with few fields and irradiation with ion beams. These properties are demonstrated on a suite of phantom cases planned for treatment with scanned proton beams subject to systematic setup uncertainty.
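    The paper's exact objective function is not reproduced in the abstract; as a hedged reminder of the general shape of the scenario-based robust planning that it generalizes, the optimization is typically posed over a discrete scenario set S, either as a worst-case or as a probability-weighted composite:

```latex
% Generic scenario-based robust formulations (illustrative; not the paper's exact objective)
\[
  \min_{x}\ \max_{s \in S} f\bigl(d_s(x)\bigr)
  \qquad\text{or}\qquad
  \min_{x}\ \sum_{s \in S} p_s\, f\bigl(d_s(x)\bigr),
\]
% where x are the machine parameters, d_s(x) the dose distribution under scenario s
% (e.g. a setup shift or range error), f a planning objective, and p_s the scenario probability.
```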

  17. An Example Implementation of Schank's Goal-Based Scenarios

    ERIC Educational Resources Information Center

    Hsu, Chung-Yuan; Moore, David Richard

    2010-01-01

    The Goal-based Scenario method is a design model for applying simulations to instruction. This portfolio item describes an implementation of Goal-based Scenarios for the teaching of statistics. The application demonstrates how simulations can be contextualized and how they can allow learners to engage in legitimate inquiry in the pursuit of their…

  18. "The Strawberry Caper": Using Scenario-Based Problem Solving to Integrate Middle School Science Topics

    ERIC Educational Resources Information Center

    Gonda, Rebecca L.; DeHart, Kyle; Ashman, Tia-Lynn; Legg, Alison Slinskey

    2015-01-01

    Achieving a deep understanding of the many topics covered in middle school biology classes is difficult for many students. One way to help students learn these topics is through scenario-based learning, which enhances students' performance. The scenario-based problem-solving module presented here, "The Strawberry Caper," not only…

  19. Aerosol cloud interaction: a multiplatform-scenario-based methodology

    NASA Astrophysics Data System (ADS)

    Landulfo, Eduardo; Lopes, Fabío. J. S.; Guerrero-Rascado, Juan Luis; Alados-Arboledas, Lucas

    2015-10-01

    Suspended atmospheric particles, i.e. aerosol particles, go through many chemical and physical processes, and those interactions and transformations may cause particles to change in size, structure and composition, regulated by mechanisms that are also present in clouds. These interactions play a great role in radiative transfer in the atmosphere and are not completely understood, as competing effects, known as indirect aerosol effects, might occur. Performing remote sensing measurements and experiments to improve the knowledge of these processes is also a challenge. In the face of that, we propose a multi-platform approach based on lidar, sun photometry and satellite observations, characterized from a scenario perspective in which the cloud height and the geometric and optical conditions, on a diurnal/nocturnal basis, make it possible to apply different analytical tools and obtain a set of products that specify the aerosol present in the vicinity of clouds and its optical and physical properties. These scenarios are meant to aid in tagging the expected products and help in creating a robust database to systematically study the aerosol-cloud interaction. In total we will present 6 scenarios: 3 under daylight conditions and 3 at nighttime. Each scenario and its counterpart should be able to provide the cloud base/top height, the aerosol backscattering profile and the cloud optical/geometric thickness. In each instance we rely on a 5-wavelength Raman lidar measurement, a collocated sun photometer and CALIPSO/MODIS observations from the AQUA/TERRA platforms. To further characterize the aerosol-cloud interaction, the Raman lidar system should have a water vapor channel or, moreover, a liquid water channel. We will present a two-day case study to show the feasibility of the methodology and its potential application.

  20. E-maintenance Scenarios Based on Augmented Reality Software Architecture

    NASA Astrophysics Data System (ADS)

    Benbelkacem, S.; Zenati-Henda, N.; Belhocine, M.

    2008-06-01

    This paper presents an augmented reality architecture for e-maintenance applications. In our case, the aim is not to develop a vision system based on the augmented reality concept, but to show the relationship between the different actors in the proposed architecture and to facilitate maintenance of the machine. This architecture allows implementing different scenarios which give the technician the possibility to intervene on a broken-down device with the help of a distant expert. Each scenario is established according to machine parameters and technician competences. In our case, a hardware platform is designed to carry out e-maintenance scenarios. An example of an e-maintenance scenario is then presented.

  1. Modeling and Composing Scenario-Based Requirements with Aspects

    NASA Technical Reports Server (NTRS)

    Araujo, Joao; Whittle, Jon; Kim, Dae-Kyoo

    2004-01-01

    There has been significant recent interest, within the Aspect-Oriented Software Development (AOSD) community, in representing crosscutting concerns at various stages of the software lifecycle. However, most of these efforts have concentrated on the design and implementation phases. We focus in this paper on representing aspects during use case modeling. In particular, we focus on scenario-based requirements and show how to compose aspectual and non-aspectual scenarios so that they can be simulated as a whole. Non-aspectual scenarios are modeled as UML sequence diagrams. Aspectual scenarios are modeled as Interaction Pattern Specifications (IPS). In order to simulate them, the scenarios are transformed into a set of executable state machines using an existing state machine synthesis algorithm. Previous work composed aspectual and non-aspectual scenarios at the sequence diagram level. In this paper, the composition is done at the state machine level.

  2. Usability standards meet scenario-based design: challenges and opportunities.

    PubMed

    Vincent, Christopher J; Blandford, Ann

    2015-02-01

    The focus of this paper is on the challenges and opportunities presented by developing scenarios of use for interactive medical devices. Scenarios are integral to the international standard for usability engineering of medical devices (IEC 62366:2007), and are also applied to the development of health software (draft standard IEC 82304-1). The 62366 standard lays out a process for mitigating risk during normal use (i.e. use as per the instructions, or accepted medical practice). However, this begs the question of whether "real use" (that which occurs in practice) matches "normal use". In this paper, we present an overview of the product lifecycle and how it impacts on the type of scenario that can be practically applied. We report on the development and testing of a set of scenarios intended to inform the design of infusion pumps based on "real use". The scenarios were validated by researchers and practitioners experienced in clinical practice, and their utility was assessed by developers and practitioners representing different stages of the product lifecycle. These evaluations highlighted previously unreported challenges and opportunities for the use of scenarios in this context. Challenges include: integrating scenario-based design with usability engineering practice; covering the breadth of uses of infusion devices; and managing contradictory evidence. Opportunities included scenario use beyond design to guide marketing, to inform purchasing and as resources for training staff. This study exemplifies one empirically grounded approach to communicating and negotiating the realities of practice. PMID:25460202

  3. Security Analysis of Selected AMI Failure Scenarios Using Agent Based Game Theoretic Simulation

    SciTech Connect

    Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T

    2014-01-01

    Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the Advanced Metering Infrastructure (AMI) functional domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) working group has currently documented 29 failure scenarios. The strategy for the game was developed by analyzing five electric sector representative failure scenarios contained in the AMI functional domain. We characterize these five selected scenarios into three specific threat categories affecting confidentiality, integrity and availability (CIA). The analysis using our ABGT simulation demonstrates how to model the AMI functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the AMI network with respect to CIA.
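    As a toy illustration of the kind of game-theoretic result an ABGT simulation can be checked against, the sketch below sets up a two-player attacker/defender bimatrix for one hypothetical AMI failure scenario and enumerates its pure-strategy Nash equilibria; the actions and payoff numbers are invented for illustration and are not taken from the NESCOR scenarios.

```python
import itertools
import numpy as np

# Hypothetical attacker/defender payoffs for one AMI failure scenario
# (rows: defender actions, cols: attacker actions).
defender = np.array([[ 2, -3],    # harden meters  vs {no attack, tamper}
                     [ 0, -8]])   # do nothing     vs {no attack, tamper}
attacker = np.array([[-1,  1],
                     [ 0,  6]])

def pure_nash(defender, attacker):
    """Return all pure-strategy Nash equilibria (row, col) of a bimatrix game."""
    eq = []
    for r, c in itertools.product(range(defender.shape[0]), range(defender.shape[1])):
        if defender[r, c] == defender[:, c].max() and attacker[r, c] == attacker[r, :].max():
            eq.append((r, c))
    return eq

print(pure_nash(defender, attacker))  # -> [(0, 1)]: harden / tamper in this toy example
```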

  4. OBEST: The Object-Based Event Scenario Tree Methodology

    SciTech Connect

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-03-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenario Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology, the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies.
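    A minimal sketch of the core idea, assuming two toy objects with invented branch probabilities: recursively expanding every probabilistic branch yields the exhaustive scenario list with an exact likelihood attached to each path, which is why no Monte Carlo sampling is needed and why even very low-probability scenarios still appear explicitly. OBEST's actual object model is considerably richer than this.

```python
# Toy objects: each reacts to the scenario by branching probabilistically.
pump = {"name": "pump",   "branches": [("runs", 0.98), ("fails", 0.02)]}
valve = {"name": "valve", "branches": [("opens", 0.95), ("stuck", 0.05)]}

def enumerate_scenarios(objects, prefix=(), prob=1.0):
    """Recursively expand probabilistic branches into (scenario, probability) pairs."""
    if not objects:
        return [(prefix, prob)]
    head, rest = objects[0], objects[1:]
    scenarios = []
    for outcome, p in head["branches"]:
        scenarios += enumerate_scenarios(rest, prefix + ((head["name"], outcome),), prob * p)
    return scenarios

for scenario, p in enumerate_scenarios([pump, valve]):
    print(f"{p:.4f}  {scenario}")
# The probabilities sum to 1.0, and the rare (fails, stuck) path is listed explicitly.
```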

  5. Technical Feasibility Assessment of Lunar Base Mission Scenarios

    NASA Astrophysics Data System (ADS)

    Magelssen, Trygve "Spike"; Sadeh, Eligar

    2005-02-01

    Investigation of the literature pertaining to lunar base (LB) missions and the technologies required for LB development has revealed an information gap that hinders technical feasibility assessment. This information gap is the absence of technical readiness levels (TRL) (Mankins, 1995) and information pertaining to the criticality of the critical enabling technologies (CETs) that enable mission success. TRL is a means of identifying technical readiness stages of a technology. Criticality is defined as the level of influence the CET has on the mission scenario. The hypothesis of this research study is that technical feasibility is a function of technical readiness and technical readiness is a function of criticality. A newly developed research analysis method is used to identify the technical feasibility of LB mission scenarios. A Delphi is used to ascertain technical readiness levels and CET criticality-to-mission. The research analysis method is applied to the Delphi results to determine the technical feasibility of the LB mission scenarios, which include: observatory, science research, lunar settlement, space exploration gateway, space resource utilization, and space tourism. The CETs identified encompass four major system-level technologies: transportation, life support, structures, and power systems. Results of the technical feasibility assessment show the observatory and science research LB mission scenarios to be the most technically ready of all the scenarios, but all mission scenarios are in very close proximity to each other in regard to criticality and TRL, and no one mission scenario stands out as being absolutely more technically ready than any of the others. What is significant and of value are the Delphi results concerning CET criticality-to-mission and the TRL values evidenced in the tables, which can be used by anyone assessing the technical feasibility of LB missions.

  6. Active, Collaborative and Case-Based Learning with Computer-Based Case Scenarios.

    ERIC Educational Resources Information Center

    Ward, Robert

    1998-01-01

    Discusses ideas and observations about the development, use, and pedagogy of computer-based case scenarios. Outlines two large computer-based case scenarios written to help students develop their skills and knowledge in business information systems. Considers factors in the design of computer-based case scenarios and related activities that might…

  7. Dual Mission Scenarios for the Human Lunar Campaign - Performance, Cost and Risk Benefits

    NASA Technical Reports Server (NTRS)

    Saucillo, Rudolph J.; Reeves, David M.; Chrone, Jonathan D.; Stromgren, Chel; Reeves, John D.; North, David D.

    2008-01-01

    Scenarios for human lunar operations with capabilities significantly beyond Constellation Program baseline missions are potentially feasible based on the concept of dual, sequential missions utilizing a common crew and a single Ares I/CEV (Crew Exploration Vehicle). For example, scenarios possible within the scope of baseline technology planning include outpost-based sortie missions and dual sortie missions. Top level cost benefits of these dual sortie scenarios may be estimated by comparison to the Constellation Program reference two-mission-per-year lunar campaign. The primary cost benefit is the accomplishment of Mission B with a "single launch solution" since no Ares I launch is required. Cumulative risk to the crew is lowered since crew exposure to launch risks and Earth return risks is reduced versus comparable Constellation Program reference two-mission-per-year scenarios. Payload-to-the-lunar-surface capability is substantially increased in the Mission B sortie as a result of additional propellant available for Lunar Lander #2 descent. This additional propellant is a result of EDS #2 transferring a smaller stack through trans-lunar injection and using remaining propellant to perform a portion of the lunar orbit insertion (LOI) maneuver. This paper describes these dual mission concepts, including cost, risk and performance benefits per lunar sortie site, and provides an initial feasibility assessment.

  8. Scenario-Based Training at the F.B.I.

    ERIC Educational Resources Information Center

    Whitcomb, Chris

    1999-01-01

    The 16-week training program offered by the FBI Academy for all new agents is a scenario-based curriculum that includes a range of subjects from the rules of evidence to defensive tactics and provides agents with a clear understanding of how to conduct a full investigation from start to finish. (JOW)

  9. Improving learning performance with happiness by interactive scenarios.

    PubMed

    Chuang, Chi-Hung; Chen, Ying-Nong; Tsai, Luo-Wei; Lee, Chun-Chieh; Tsai, Hsin-Chun

    2014-01-01

    Recently, digital learning has attracted many researchers aiming to address the problems of learning carelessness, low learning ability, lack of concentration, and difficulties in comprehending the logic of math. In this study, a digital learning system based on the Kinect somatosensory system is proposed to make children and teenagers learn happily in the course of the games and improve their learning performance. We propose two interactive geometry and puzzle games. The proposed somatosensory games can make learners feel curious and raise their motivation to find solutions for boring problems via abundant physical expressions and interactive operations. The players are asked to select a particular operation by gestures and physical expressions within a certain time. By doing so, the learners can feel the fun of game playing and train their logic ability before they are aware of it. Experimental results demonstrate that the proposed somatosensory system can effectively improve the students' learning performance. PMID:24558331

  10. Improving Learning Performance with Happiness by Interactive Scenarios

    PubMed Central

    Chuang, Chi-Hung; Chen, Ying-Nong; Tsai, Luo-Wei; Lee, Chun-Chieh; Tsai, Hsin-Chun

    2014-01-01

    Recently, digital learning has attracted many researchers aiming to address the problems of learning carelessness, low learning ability, lack of concentration, and difficulties in comprehending the logic of math. In this study, a digital learning system based on the Kinect somatosensory system is proposed to make children and teenagers learn happily in the course of the games and improve their learning performance. We propose two interactive geometry and puzzle games. The proposed somatosensory games can make learners feel curious and raise their motivation to find solutions for boring problems via abundant physical expressions and interactive operations. The players are asked to select a particular operation by gestures and physical expressions within a certain time. By doing so, the learners can feel the fun of game playing and train their logic ability before they are aware of it. Experimental results demonstrate that the proposed somatosensory system can effectively improve the students' learning performance. PMID:24558331

  11. Scenario-based Storm Surge Vulnerability Assessment of Catanduanes

    NASA Astrophysics Data System (ADS)

    Suarez, J. K. B.

    2015-12-01

    After the devastating storm surge effects of Typhoon Haiyan, the public recognized the need for improved communication about risks, vulnerabilities and what is threatened by storm surge. This can be provided by vulnerability maps, which allow better visual presentation and understanding of the risks and vulnerabilities. Local implementers can direct the resources needed for protection of these areas. Moreover, vulnerability and hazard maps are relevant in all phases of disaster management as designed by the National Disaster Risk Reduction and Management Council (NDRRMC) - disaster preparedness, prevention and mitigation, and response, recovery and rehabilitation. This paper aims to analyze the vulnerability of Catanduanes, a coastal province in the Philippines, to storm surges in terms of four parameters: population, built environment, natural environment and agricultural production. The vulnerability study relies on storm surge inundation maps based on the four Storm Surge Advisory (SSA) scenarios (1-2, 3, 4, and 5 meters) proposed by the Department of Science and Technology Nationwide Operational Assessment of Hazards (DOST-Project NOAH) for predicting storm surge heights. To determine the total percent affected for each parameter's elements, an overlay analysis was performed in ArcGIS Desktop. Moreover, vulnerability and hazard maps are generated as a final output and a tool for visualizing the impacts of a storm surge event at different surge heights. The results of this study would help the selected province to know its present condition and adopt strategies to strengthen areas found to be most vulnerable in order to prepare better for the future.
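    The overlay step can be reproduced outside ArcGIS; the sketch below uses geopandas as a stand-in, with hypothetical shapefile names and only land area as the exposure measure, to intersect an exposure layer with each SSA inundation footprint and report the percent affected.

```python
# Illustrative re-creation of the overlay step with geopandas rather than ArcGIS Desktop
# (file names, scenario selection, and the exposure measure are hypothetical).
import geopandas as gpd

barangays = gpd.read_file("catanduanes_barangays.shp").to_crs(epsg=32651)  # UTM zone 51N

for ssa, surge_file in {1: "ssa1_inundation.shp", 4: "ssa4_inundation.shp"}.items():
    surge = gpd.read_file(surge_file).to_crs(barangays.crs)
    affected = gpd.overlay(barangays, surge, how="intersection")   # clipped exposure polygons
    pct = 100.0 * affected.geometry.area.sum() / barangays.geometry.area.sum()
    print(f"SSA {ssa}: {pct:.1f}% of land area inside the inundation zone")
```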

  12. Performance analysis of seismocardiography for heart sound signal recording in noisy scenarios.

    PubMed

    Jain, Puneet Kumar; Tiwari, Anil Kumar; Chourasia, Vijay S

    2016-01-01

    This paper presents a system based on Seismocardiography (SCG) to monitor the heart sound signal for the long term. It uses an accelerometer, which is of small size and low weight and, thus, convenient to wear. Such a system should also be robust to the various noises that occur in real-life scenarios. Therefore, a detailed analysis of the proposed system is provided and its performance is compared to the performance of the Phonocardiography (PCG) system. For this purpose, both signals of five subjects were simultaneously recorded in clinical and different real-life noisy scenarios. For the quantitative analysis, the detection rate of the fundamental heart sound components, S1 and S2, is obtained. Furthermore, a quality index based on the energy of the fundamental components is also proposed and obtained. Results show that both techniques are able to acquire S1 and S2 in a clinical set-up. However, in real-life scenarios, we observed many features of the proposed system that are favourable, compared to PCG, for its use in long-term monitoring. PMID:26860039
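    The quality index is only described as being based on the energy of the fundamental components; one plausible form, assumed here purely for illustration rather than taken from the paper, is the fraction of total signal energy falling inside the detected S1/S2 windows.

```python
import numpy as np

def quality_index(signal, s1_s2_segments):
    """Hypothetical energy-based quality index: fraction of total signal energy
    contained in the detected S1/S2 segments (1.0 = all energy in heart sounds)."""
    signal = np.asarray(signal, float)
    total = np.sum(signal ** 2)
    fundamental = sum(np.sum(signal[a:b] ** 2) for a, b in s1_s2_segments)
    return fundamental / total if total > 0 else 0.0

# Example: a toy 1 s recording at 1 kHz with two detected heart-sound windows
fs = 1000
x = 0.05 * np.random.randn(fs)
x[100:150] += np.sin(2 * np.pi * 50 * np.arange(50) / fs)   # S1-like burst
x[500:540] += np.sin(2 * np.pi * 60 * np.arange(40) / fs)   # S2-like burst
print(round(quality_index(x, [(100, 150), (500, 540)]), 2))
```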

  13. Assessing magnitude probability distribution through physics-based rupture scenarios

    NASA Astrophysics Data System (ADS)

    Hok, Sébastien; Durand, Virginie; Bernard, Pascal; Scotti, Oona

    2016-04-01

    When faced with a complex network of faults in a seismic hazard assessment study, the first question raised is to what extent the fault network is connected and what the probability is that an earthquake simultaneously ruptures a series of neighboring segments. Physics-based dynamic rupture models can provide useful insight as to which rupture scenario is most probable, provided that an exhaustive exploration of the variability of the input parameters necessary for the dynamic rupture modeling is accounted for. Given the random nature of some parameters (e.g. hypocenter location) and the limitation of our knowledge, we used a logic-tree approach in order to build the different scenarios and to be able to associate them with a probability. The methodology is applied to the three main faults located along the southern coast of the West Corinth rift. Our logic tree takes into account different hypotheses for: fault geometry, location of hypocenter, seismic cycle position, and fracture energy on the fault plane. The variability of these parameters is discussed, and the different values tested are weighted accordingly. 64 scenarios resulting from 64 parameter combinations were included. Sensitivity studies were done to illustrate which parameters control the variability of the results. Given the weights of the input parameters, we evaluated the probability of obtaining a full network break to be 15%, while single-segment ruptures represent 50% of the scenarios. This rupture scenario probability distribution along the three faults of the West Corinth rift fault network can then be used as input to a seismic hazard calculation.
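    A compact way to picture the logic-tree bookkeeping is to take the product of weighted branches and sum the weights of the combinations whose dynamic rupture breaks the full network; the branches, weights, and rupture rule below are illustrative stand-ins, not the study's actual values or models.

```python
from itertools import product

# Illustrative logic-tree branches (weights sum to 1 per parameter). Each combination
# stands for one dynamic-rupture scenario.
branches = {
    "hypocenter":      [("west", 0.5), ("east", 0.5)],
    "cycle_position":  [("early", 0.5), ("late", 0.5)],
    "fracture_energy": [("low", 0.25), ("medium", 0.5), ("high", 0.25)],
}

def rupture_extent(combo):
    """Stand-in for a dynamic rupture simulation: decide how many segments break."""
    return 3 if combo["fracture_energy"] == "low" and combo["cycle_position"] == "late" else 1

p_full = 0.0
for values in product(*branches.values()):
    combo = dict(zip(branches.keys(), [v for v, _ in values]))
    weight = 1.0
    for _, w in values:
        weight *= w
    if rupture_extent(combo) == 3:          # full network break
        p_full += weight
print(f"P(full network rupture) = {p_full:.3f}")
```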

  14. The real world and lunar base activation scenarios

    NASA Technical Reports Server (NTRS)

    Schmitt, Harrison H.

    1992-01-01

    A lunar base or a network of lunar bases may have highly desirable support functions in a national or international program to explore and settle Mars. In addition, He-3 exported from the Moon could be the basis for providing much of the energy needs of humankind in the twenty-first century. Both technical and managerial issues must be addressed when considering the establishment of a lunar base that can serve the needs of human civilization in space. Many of the technical issues become evident in the consideration of hypothetical scenarios for the activation of a network of lunar bases. Specific and realistic assumptions must be made about the conduct of various types of activities in addition to the general assumptions given above. These activities include landings, crew consumables, power production, crew selection, risk management, habitation, science station placement, base planning, science, agriculture, resource evaluation, readaptation, plant activation and test, storage module landings, resource transport module landings, integrated operations, maintenance, Base 2 activation, and management. The development of scenarios for the activation of a lunar base or network of bases will require close attention to the 'real world' of space operations. That world is defined by the natural environment, available technology, realistic objectives, and common sense.

  15. Improvement of nursing students' learning outcomes through scenario-based skills training

    PubMed Central

    Uysal, Nurcan

    2016-01-01

    Objective: this study analyzed the influence of scenario-based skills training on students' learning skills. Method: the author evaluated the nursing skills laboratory exam papers of 605 sophomores in nursing programs for seven years. The study determined the common mistakes of students and the laboratory work was designed in a scenario-based format. The effectiveness of this method was evaluated by assessing the number of errors the students committed and their achievement scores in laboratory examinations. This study presents the students' common mistakes in intramuscular and subcutaneous injection and their development of intravenous access skills, included in the nursing skills laboratory examination. Results: an analysis of the students' most common mistakes revealed that the most common was not following the principles of asepsis for all three skills (intramuscular, subcutaneous injection, intravenous access) in the first year of the scenario-based training. The students' exam achievement scores increased gradually, except in the fall semester of the academic year 2009-2010. The study found that the scenario-based skills training reduced students' common mistakes in examinations and enhanced their performance on exams. Conclusion: this method received a positive response from both students and instructors. The scenario-based training is available for use in addition to other skills training methods. PMID:27508922

  16. Scenario-based approach to risk analysis in support of cyber security

    SciTech Connect

    Gertman, D. I.; Folkers, R.; Roberts, J.

    2006-07-01

    control systems, perpetrators will attempt to control and defeat automation systems, engineering access, control systems and protective systems implemented in today's critical infrastructures. Major systems such as supervisory control and data acquisition (SCADA) systems are likely targets for attack. Not all attack scenarios have the same expected frequency or consequence. The attacks will be directed and structured and thus are not to be characterized as random events when one considers failure probabilities. Attack types differ in their consequence as a function of the probability associated with various sub-events in the presence of specific system configurations. Ideally, a series of generic scenarios can be identified for each of the major critical infrastructure (CI) sectors. A scenario-based approach to risk assessment allows decision makers to place financial and personnel resources in place for attacks that truly matter: e.g. attacks that generate physical and economic damage. The use of scenario-based analysis allows risk reduction goals to be informed by more than consequence analysis alone. The key CI targets used in the present study were identified previously as part of a mid-level consequence analysis performed at INL by the Control System Security Program (CSSP) for the National Cyber Security Div. (NCSD) of the Dept. of Homeland Security (DHS). This paper discusses the process for and results associated with the development of scenario-based cyber attacks upon control systems, including the information and personnel requirements for scenario development. Challenges to scenario development, including completeness and uncertainty characterization, are discussed as well. Further, the scenario discussed herein is one of a number of scenarios for infrastructures currently under review. (authors)

  17. Raman resonance in iron-based superconductors: The magnetic scenario

    NASA Astrophysics Data System (ADS)

    Hinojosa, Alberto; Cai, Jiashen; Chubukov, Andrey V.

    2016-02-01

    We perform a theoretical analysis of polarization-sensitive Raman spectroscopy on NaFe1-xCoxAs, EuFe2As2, SrFe2As2, and Ba(Fe1-xCox)2As2, focusing on two features seen in the B1g symmetry channel (in one-Fe unit cell notation): the strong temperature dependence of the static, uniform Raman response in the normal state and the existence of a collective mode in the superconducting state. We show that both features can be explained by the coupling of fermions to pairs of magnetic fluctuations via the Aslamazov-Larkin process. We first analyze the magnetically mediated Raman intensity at the leading two-loop order and then include interactions between pairs of magnetic fluctuations. We show that the full Raman intensity in the B1g channel can be viewed as the result of the coupling of light to the Ising-nematic susceptibility via the Aslamazov-Larkin process. We argue that the singular temperature dependence in the normal state is the combination of the temperature dependencies of the Aslamazov-Larkin vertex and of the Ising-nematic susceptibility. We discuss two scenarios for the resonance below Tc. One is a resonance due to the development of a pole in the fully renormalized Ising-nematic susceptibility. The other is an orbital excitonic scenario, in which spin fluctuations generate an attractive interaction between low-energy fermions.

  18. Supply Chain Vulnerability Analysis Using Scenario-Based Input-Output Modeling: Application to Port Operations.

    PubMed

    Thekdi, Shital A; Santos, Joost R

    2016-05-01

    Disruptive events such as natural disasters, loss or reduction of resources, work stoppages, and emergent conditions have potential to propagate economic losses across trade networks. In particular, disruptions to the operation of container port activity can be detrimental for international trade and commerce. Risk assessment should anticipate the impact of port operation disruptions with consideration of how priorities change due to uncertain scenarios and guide investments that are effective and feasible for implementation. Priorities for protective measures and continuity of operations planning must consider the economic impact of such disruptions across a variety of scenarios. This article introduces new performance metrics to characterize resiliency in interdependency modeling and also integrates scenario-based methods to measure economic sensitivity to sudden-onset disruptions. The methods will be demonstrated on a U.S. port responsible for handling $36.1 billion of cargo annually. The methods will be useful to port management, private industry supply chain planning, and transportation infrastructure management. PMID:26271771
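    The article's specific model and resilience metrics are not given in the abstract; a common scenario-based formulation for this kind of interdependency analysis is the inoperability input-output model, sketched here with an invented three-sector example in which a port disruption propagates through an interdependency matrix.

```python
import numpy as np

# Hedged sketch of an inoperability input-output calculation: a disruption scenario
# imposes a direct inoperability c* on one sector, and interdependencies A* propagate
# it to the rest of the economy: q = A* q + c*  =>  q = (I - A*)^-1 c*.
# The 3-sector matrix and scenario values below are illustrative only.
A_star = np.array([[0.0, 0.2, 0.1],    # port operations
                   [0.3, 0.0, 0.2],    # regional manufacturing
                   [0.1, 0.3, 0.0]])   # wholesale trade

c_star = np.array([0.40, 0.0, 0.0])    # scenario: port loses 40% of its capacity

q = np.linalg.solve(np.eye(3) - A_star, c_star)
print(np.round(q, 3))                  # total inoperability per sector, direct + indirect
```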

  19. Intelligent scenario generation for simulation-based training

    NASA Technical Reports Server (NTRS)

    Loftin, R. Bowen; Wang, Lui; Baffes, Paul

    1989-01-01

    A training scenario generator object database was developed to serve as a general-purpose mechanism for constructing the context needed to define a simulation scenario. It is found that the ability to automate the development of the input parameters required to produce a challenging simulation scenario targeted at a specific trainee can greatly enhance the efficiency of intelligent training systems. The approach described was used successfully in the payload-assist module deploy/intelligent computer-aided training system.

  20. Using Crash Data to Develop Simulator Scenarios for Assessing Novice Driver Performance

    PubMed Central

    McDonald, Catherine C.; Tanenbaum, Jason B.; Lee, Yi-Ching; Fisher, Donald L.; Mayhew, Daniel R.; Winston, Flaura K.

    2013-01-01

    Teenage drivers are at their highest crash risk in their first 6 months or first 1,000 mi of driving. Driver training, adult-supervised practice driving, and other interventions are aimed at improving driving performance in novice drivers. Previous driver training programs have enumerated thousands of scenarios, with each scenario requiring one or more skills. Although there is general agreement about the broad set of skills needed to become a competent driver, there is no consensus set of scenarios and skills to assess whether novice drivers are likely to crash or to assess the effects of novice driver training programs on the likelihood of a crash. The authors propose that a much narrower, common set of scenarios can be used to focus on the high-risk crashes of young drivers. Until recently, it was not possible to identify the detailed set of scenarios that were specific to high-risk crashes. However, an integration of police crash reports from previous research, a number of critical simulator studies, and a nationally representative database of serious teen crashes (the National Motor Vehicle Crash Causation Survey) now make identification of these scenarios possible. In this paper, the authors propose this novel approach and discuss how to create a common set of simulated scenarios and skills to assess novice driver performance and the effects of training and interventions as they relate to high-risk crashes. PMID:23543947

  1. Late Pleistocene ice age scenarios based on observational evidence

    SciTech Connect

    DeBlonde, G.; Peltier, W.R.

    1993-04-01

    Ice age scenarios for the last glacial interglacial cycle, based on observations of Boyle and Keigwin concerning the North Atlantic thermohaline circulation and of Barnola et al. concerning atmospheric CO2 variations derived from the Vostok ice cores, are herein analyzed. Northern Hemisphere continental ice sheets are simulated with an energy balance model (EBM) that is asynchronously coupled to vertically integrated ice sheet models based on the Glen flow law. The EBM includes both a realistic land-sea distribution and temperature-albedo feedback and is driven with orbital variations of effective solar insolation. With the addition of atmospheric CO2 and ocean heat flux variations, but not in their absence, a complete collapse is obtained for the Eurasian ice sheet but not for the North American ice sheet. We therefore suggest that further feedback mechanisms, perhaps involving more accurate modeling of the dynamics of the mostly marine-based Laurentide complex, appear necessary to explain Termination I. 96 refs., 12 figs., 2 tabs.
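    For reference, the Glen flow law used by such vertically integrated ice sheet models is commonly written as:

```latex
% Standard form of the Glen flow law (creep exponent n ~ 3)
\[
  \dot{\varepsilon}_{ij} = A(T)\,\tau_e^{\,n-1}\,\tau_{ij},
  \qquad \tau_e = \sqrt{\tfrac{1}{2}\,\tau_{ij}\tau_{ij}}, \qquad n \approx 3,
\]
% where \dot{\varepsilon}_{ij} is the strain rate, \tau_{ij} the deviatoric stress,
% \tau_e the effective stress, and A(T) a temperature-dependent rate factor.
```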

  2. Performance of aged cement-polymer composite immobilizing borate waste simulates during flooding scenarios

    NASA Astrophysics Data System (ADS)

    Eskander, S. B.; Bayoumi, T. A.; Saleh, H. M.

    2012-01-01

    An advanced composite of cement and water-extended polyester based on recycled poly(ethylene terephthalate) waste was developed to incorporate the borate waste. Previous studies have reported the characterization of the waste form (cement-polymer composite immobilizing borate waste simulates) after 28 days of curing time. The current work studied the performance of the waste form aged for 7 years and subjected to a flooding scenario over 260 days using three types of water. The state of the waste form was assessed at the end of each defined interval of water infiltration through visual examination and mechanical measurements. Scanning electron microscopy, infrared spectroscopy, X-ray diffraction and thermal analyses were used to investigate the changes that may occur in the microstructure of the waste form under aging and flooding effects. The experimental results provided reasonable evidence for a durable waste form. Acceptable consistency was confirmed for the waste form even after 7 years of aging and exposure to the flooding scenario for 260 days.

  3. Treatment of hypogonadotropic male hypogonadism: Case-based scenarios

    PubMed Central

    Crosnoe-Shipley, Lindsey E; Elkelany, Osama O; Rahnema, Cyrus D; Kim, Edward D

    2015-01-01

    The aim of this study is to review four case-based scenarios regarding the treatment of symptomatic hypogonadism in men. The article is designed as a review of published literature. We conducted a PubMed literature search for the time period of 1989-2014, concentrating on 26 studies investigating the efficacy of various therapeutic options on semen analysis, pregnancy outcomes, time to recovery of spermatogenesis, as well as serum and intratesticular testosterone levels. Our results demonstrated that exogenous testosterone suppresses intratesticular testosterone production, which is an absolute prerequisite for normal spermatogenesis. Cessation of exogenous testosterone should be recommended for men desiring to maintain their fertility. Therapies that protect the testis involve human chorionic gonadotropin (hCG) therapy or selective estrogen receptor modulators (SERMs), but may also include low-dose hCG with exogenous testosterone. Off-label use of SERMs, such as clomiphene citrate, is effective for maintaining testosterone production long-term and offers the convenience of representing a safe, oral therapy. At present, routine use of aromatase inhibitors is not recommended based on a lack of long-term data. We concluded that exogenous testosterone supplementation decreases sperm production. It was determined that clomiphene citrate is a safe and effective therapy for men who desire to maintain fertility. Although less frequently used in the general population, hCG therapy with or without testosterone supplementation represents an alternative treatment. PMID:25949938

  4. Diminished Wastewater Treatment: Evaluation of Septic System Performance Under a Climate Change Scenario

    NASA Astrophysics Data System (ADS)

    Cooper, J.; Loomis, G.; Kalen, D.; Boving, T. B.; Morales, I.; Amador, J.

    2015-12-01

    The effects of climate change are expected to reduce the ability of soil-based onsite wastewater treatment systems (OWTS) to treat domestic wastewater. In the northeastern U.S., the projected increase in atmospheric temperature, elevation of water tables from rising sea levels, and heightened precipitation will reduce the volume of unsaturated soil and oxygen available for treatment. Incomplete removal of contaminants may lead to transport of pathogens, nutrients, and biochemical oxygen demand (BOD) to groundwater, increasing the risk to public health and the likelihood of eutrophying aquatic ecosystems. Advanced OWTS, which include pre-treatment steps and provide unsaturated drainfields of greater volume relative to conventional OWTS, are expected to be more resilient to climate change. We used intact soil mesocosms to quantify water quality functions for two advanced shallow narrow drainfield types and a conventional drainfield under a current climate scenario and a moderate climate change scenario of 30 cm rise in water table and 5°C increase in soil temperature. While no fecal coliform bacteria (FCB) were released under the current climate scenario, up to 109 CFU FCB/mL (conventional) and up to 20 CFU FCB/mL (shallow narrow) were released under the climate change scenario. Total P removal rates dropped from 100% to 54% (conventional) and 71% (shallow narrow) under the climate change scenario. Total N removal averaged 17% under both climate scenarios in the conventional, but dropped from 5.4% to 0% in the shallow narrow under the climate change scenario, with additional leaching of N in excess of inputs indicating release of previously held N. No significant difference was observed between scenarios for BOD removal. The initial data indicate that while advanced OWTS retain more function under the climate change scenario, all three drainfield types experience some diminished treatment capacity.
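
    The removal rates quoted above follow from comparing influent and effluent concentrations. A minimal sketch of that arithmetic, using hypothetical concentrations rather than the study's data:

    ```python
    def percent_removal(influent_mg_per_l: float, effluent_mg_per_l: float) -> float:
        """Percent of a constituent removed between mesocosm influent and effluent."""
        if influent_mg_per_l <= 0:
            raise ValueError("influent concentration must be positive")
        return 100.0 * (influent_mg_per_l - effluent_mg_per_l) / influent_mg_per_l

    # Hypothetical example: 8.0 mg/L total P in, 3.7 mg/L out -> roughly 54% removal.
    print(round(percent_removal(8.0, 3.7), 1))
    ```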

  5. Scenario details of NPE 2012 - Independent performance assessment by simulated CTBT violation

    NASA Astrophysics Data System (ADS)

    Gestermann, N.; Bönnemann, C.; Ceranna, L.; Ross, O.; Schlosser, C.

    2012-04-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature on 24 September 1996. The technical preparations for monitoring CTBT compliance are moving ahead rapidly and many efforts have been made since then to establish the verification system. In that regard, the two underground nuclear explosions conducted by the Democratic People's Republic of Korea in 2006 and 2009 were the first real performance tests of the system. In the light of these events, National Data Centres (NDCs) realized the need to become more familiar with the details of the verification regime. The idea of an independent annual exercise to evaluate the processing and analysis procedures applied at the National Data Centres of the CTBT was born at the NDC Evaluation Workshop in Kiev, Ukraine, in 2006. The exercises simulate a fictitious violation of the CTBT, and all NDCs are invited to clarify the nature of the selected event. The exercise should help to evaluate the effectiveness of procedures applied at NDCs, as well as the quality, completeness, and usefulness of IDC products. Moreover, the National Data Centres Preparedness Exercise (NPE) is a measure of the readiness of the NDCs to fulfill their duties with regard to CTBT verification, that is, treaty-compliance judgments about the nature of events as natural or artificial and chemical or nuclear, respectively. NPEs have proved to be an efficient indicative tool for testing the performance of the verification system and its elements. In 2007 and 2008 the exercises focused on seismic waveform data analysis. Since 2009, the analysis of infrasound data has been included and additional attention has been given to the radionuclide component. In 2010 a realistic noble gas release scenario, of the kind that could be expected after an underground nuclear test, was selected as the trigger event. The epicenter location of an event from the Reviewed Event Bulletin (REB), unknown to participants of the exercise, was selected as the source of the noble gas

  6. Tracking Systems for Virtual Rehabilitation: Objective Performance vs. Subjective Experience. A Practical Scenario

    PubMed Central

    Lloréns, Roberto; Noé, Enrique; Naranjo, Valery; Borrego, Adrián; Latorre, Jorge; Alcañiz, Mariano

    2015-01-01

    Motion tracking systems are commonly used in virtual reality-based interventions to detect movements in the real world and transfer them to the virtual environment. There are different tracking solutions based on different physical principles, which mainly define their performance parameters. However, special requirements have to be considered for rehabilitation purposes. This paper studies and compares the accuracy and jitter of three tracking solutions (optical, electromagnetic, and skeleton tracking) in a practical scenario and analyzes the subjective perceptions of 19 healthy subjects, 22 stroke survivors, and 14 physical therapists. The optical tracking system provided the best accuracy (1.074 ± 0.417 cm) while the electromagnetic device provided the most inaccurate results (11.027 ± 2.364 cm). However, this tracking solution provided the best jitter values (0.324 ± 0.093 cm), in contrast to the skeleton tracking, which had the worst results (1.522 ± 0.858 cm). Healthy individuals and professionals preferred the skeleton tracking solution rather than the optical and electromagnetic solution (in that order). Individuals with stroke chose the optical solution over the other options. Our results show that subjective perceptions and preferences are far from being constant among different populations, thus suggesting that these considerations, together with the performance parameters, should be also taken into account when designing a rehabilitation system. PMID:25808765
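
    Accuracy and jitter as reported above are commonly computed as the mean error against known reference positions and as the spread of samples recorded from a static target. A minimal sketch under those common definitions (which may differ in detail from the study's actual protocol; all values are made up):

    ```python
    import math

    def mean_error(measured, reference):
        """Accuracy: mean Euclidean distance between measured and reference 3-D points (cm)."""
        return sum(math.dist(m, r) for m, r in zip(measured, reference)) / len(measured)

    def jitter(samples):
        """Jitter: RMS deviation of samples about their centroid while the target is static (cm)."""
        n = len(samples)
        centroid = tuple(sum(c) / n for c in zip(*samples))
        return math.sqrt(sum(math.dist(s, centroid) ** 2 for s in samples) / n)

    # Illustrative static recording and accuracy check (cm); not study data.
    static = [(0.1, 0.0, 0.2), (0.0, 0.1, 0.1), (-0.1, 0.0, 0.0), (0.0, -0.1, 0.1)]
    print(round(jitter(static), 3))
    print(round(mean_error([(10.2, 0.0, 5.1), (20.1, 0.3, 4.9)],
                           [(10.0, 0.0, 5.0), (20.0, 0.0, 5.0)]), 3))
    ```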

  7. 77 FR 48107 - Workshop on Performance Assessments of Near-Surface Disposal Facilities: FEPs Analysis, Scenario...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ... Radioactive Waste.'' These regulations were published in the Federal Register on December 27, 1982 (47 FR..., Environmental Protection and Performance Assessment Directorate, Division of Waste Management and Environmental...-Surface Disposal Facilities: FEPs Analysis, Scenario and Conceptual Model Development, and Code...

  8. Long Pulse High Performance Plasma Scenario Development for the National Spherical Torus Experiment

    SciTech Connect

    Kessel, C.E.; Bell, R.E.; Bell, M.G.; Gates, D.A.; Harvey, R.W.

    2006-01-01

    The National Spherical Torus Experiment [Ono et al., Nucl. Fusion, 44, 452 (2004)] is targeting long pulse high performance, noninductive sustained operations at low aspect ratio, and the demonstration of nonsolenoidal startup and current rampup. The modeling of these plasmas provides a framework for experimental planning and identifies the tools to access these regimes. Simulations based on neutral beam injection (NBI)-heated plasmas are made to understand the impact of various modifications and identify the requirements for (1) high elongation and triangularity, (2) density control to optimize the current drive, (3) plasma rotation and/or feedback stabilization to operate above the no-wall limit, and (4) electron Bernstein waves (EBW) for off-axis heating/current drive (H/CD). Integrated scenarios are constructed to provide the transport evolution and H/CD source modeling, supported by rf and stability analyses. Important factors include the energy confinement, Zeff, early heating/H mode, broadening of the NBI-driven current profile, and maintaining q(0) and qmin>1.0. Simulations show that noninductive sustained plasmas can be reached at IP=800 kA, BT=0.5 T, κ=2.5, βN=5, βT=15%, fNI=92%, and q(0)>1.0 with NBI H/CD, density control, and similar global energy confinement to experiments. The noninductive sustained high-β plasmas can be reached at IP=1.0 MA, BT=0.35 T, κ=2.5, βN=9, βT=43%, fNI=100%, and q(0)>1.5 with NBI H/CD and 3.0 MW of EBW H/CD, density control, and 25% higher global energy confinement than experiments. A scenario for nonsolenoidal plasma current rampup is developed using high harmonic fast wave H/CD in the early low IP and low Te phase, followed by NBI H/CD to continue the current ramp, reaching a maximum of 480 kA after 3.4 s.
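
    For readers unfamiliar with the symbols, the toroidal and normalized beta values quoted above follow the standard tokamak definitions (given here for orientation, not reproduced from the paper):

    ```latex
    \beta_T = \frac{2\mu_0 \langle p \rangle}{B_T^{2}}, \qquad
    \beta_N = \beta_T(\%)\,\frac{a\,[\mathrm{m}]\;B_T\,[\mathrm{T}]}{I_P\,[\mathrm{MA}]},
    ```

    where the angle brackets denote the volume-averaged plasma pressure and a is the plasma minor radius.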

  9. Virtual screening applications: a study of ligand-based methods and different structure representations in four different scenarios.

    PubMed

    Hristozov, Dimitar P; Oprea, Tudor I; Gasteiger, Johann

    2007-01-01

    Four different ligand-based virtual screening scenarios are studied: (1) prioritizing compounds for subsequent high-throughput screening (HTS); (2) selecting a predefined (small) number of potentially active compounds from a large chemical database; (3) assessing the probability that a given structure will exhibit a given activity; (4) selecting the most active structure(s) for a biological assay. Each of the four scenarios is exemplified by performing retrospective ligand-based virtual screening for eight different biological targets using two large databases, MDDR and WOMBAT. A comparison between the chemical spaces covered by these two databases is presented. The performance of two techniques for ligand-based virtual screening, similarity search with subsequent data fusion (SSDF) and novelty detection with self-organizing maps (ndSOM), is investigated. Three different structure representations are compared: 2,048-dimensional Daylight fingerprints, topological autocorrelation weighted by atomic physicochemical properties (sigma electronegativity, polarizability, partial charge, and identity), and radial distribution functions weighted by the same atomic physicochemical properties. Both methods were found applicable in scenario one. The similarity search was found to perform slightly better in scenario two while the SOM novelty detection is preferred in scenario three. No method/descriptor combination achieved significant success in scenario four. PMID:18008169
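
    As an illustration of the similarity-search side of the comparison, the sketch below ranks database compounds by their best Tanimoto similarity to a set of known actives (MAX-rule data fusion). It is a simplified stand-in for the SSDF procedure; the fingerprints and compound names are invented, not Daylight fingerprints or MDDR/WOMBAT entries:

    ```python
    def tanimoto(fp_a: set, fp_b: set) -> float:
        """Tanimoto coefficient between fingerprints given as sets of 'on' bit positions."""
        union = len(fp_a | fp_b)
        return len(fp_a & fp_b) / union if union else 0.0

    def max_fusion_score(candidate: set, actives: list) -> float:
        """Data fusion by the MAX rule: best similarity of a candidate to any known active."""
        return max(tanimoto(candidate, a) for a in actives)

    # Illustrative fingerprints (bit positions are made up).
    actives = [{1, 5, 9, 42}, {1, 5, 17, 23}]
    database = {"cmpd_A": {1, 5, 9, 40}, "cmpd_B": {2, 6, 30}, "cmpd_C": {1, 5, 17, 42}}
    ranked = sorted(database, key=lambda n: max_fusion_score(database[n], actives), reverse=True)
    print(ranked)  # compounds most similar to the actives come first
    ```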

  10. Scenario-Based Spoken Interaction with Virtual Agents

    ERIC Educational Resources Information Center

    Morton, Hazel; Jack, Mervyn A.

    2005-01-01

    This paper describes a CALL approach which integrates software for speaker independent continuous speech recognition with embodied virtual agents and virtual worlds to create an immersive environment in which learners can converse in the target language in contextualised scenarios. The result is a self-access learning package: SPELL (Spoken…

  11. Designing, Developing and Implementing a Software Tool for Scenario Based Learning

    ERIC Educational Resources Information Center

    Norton, Geoff; Taylor, Mathew; Stewart, Terry; Blackburn, Greg; Jinks, Audrey; Razdar, Bahareh; Holmes, Paul; Marastoni, Enrique

    2012-01-01

    The pedagogical value of problem-based and inquiry-based learning activities has led to increased use of this approach in many courses. While scenarios or case studies were initially presented to learners as text-based material, the development of modern software technology provides the opportunity to deliver scenarios as e-learning modules,…

  12. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios which best represent the probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics of a given region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computation power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture by minimizing the error between hazard curves derived from the full and the reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a set of 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking into account the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return period error weight is also assessed. The methodology could shorten the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes; however, it requires far less
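
    The selection idea can be illustrated with a toy calculation: pick a small subset of synthetic events whose rate-weighted exceedance counts approximately reproduce the hazard curve of the full catalogue. The sketch below uses a greedy heuristic as a stand-in for the mixed-integer linear program described in the abstract, and it selects without reweighting; all numbers are illustrative:

    ```python
    import numpy as np

    def hazard_curve(rates, exceed_matrix):
        """Annual exceedance rates per ground-motion level, summed over scenarios."""
        return rates @ exceed_matrix            # shape: (n_levels,)

    def greedy_reduce(rates, exceed_matrix, k):
        """Pick k scenarios whose combined hazard curve best matches the full catalogue."""
        full = hazard_curve(rates, exceed_matrix)
        chosen, resid = [], full.copy()
        for _ in range(k):
            errs = [np.inf if i in chosen
                    else np.linalg.norm(resid - rates[i] * exceed_matrix[i])
                    for i in range(len(rates))]
            best = int(np.argmin(errs))
            chosen.append(best)
            resid = resid - rates[best] * exceed_matrix[best]
        return chosen

    # Toy catalogue: 6 scenarios, 4 ground-motion levels (all values illustrative).
    rng = np.random.default_rng(0)
    rates = rng.uniform(1e-4, 1e-2, size=6)
    exceed = (rng.uniform(size=(6, 4)) > 0.5).astype(float)
    print(greedy_reduce(rates, exceed, k=2))
    ```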

  13. Earthquake scenarios based on lessons from the past

    NASA Astrophysics Data System (ADS)

    Solakov, Dimcho; Simeonova, Stella; Aleksandrova, Irena; Popova, Iliana

    2010-05-01

    Earthquakes are the most deadly of the natural disasters affecting the human environment; indeed, catastrophic earthquakes have marked the whole of human history. Global seismic hazard and vulnerability to earthquakes are increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of mega cities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructures, and undertaken with an insufficient knowledge of the regional seismicity peculiarities and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios is the first link in the prevention chain and the first step in the evaluation of seismic risk. The implementation of earthquake scenarios into policies for seismic risk reduction will allow focusing on the prevention of earthquake effects rather than on intervention following the disasters. The territory of Bulgaria (situated in the eastern part of the Balkan Peninsula) represents a typical example of a high seismic risk area. Over the centuries, Bulgaria has experienced strong earthquakes. At the beginning of the 20th century (from 1901 to 1928) five earthquakes with magnitude larger than or equal to MS=7.0 occurred in Bulgaria. However, no such large earthquakes have occurred in Bulgaria since 1928, which may lead non-professionals to underestimate the earthquake risk. The 1986 earthquake of magnitude MS=5.7, which occurred in central northern Bulgaria (near the town of Strazhitsa), is the strongest quake since 1928. Moreover, the seismicity of the neighboring countries, like Greece, Turkey, the former Yugoslavia and Romania (especially the Vrancea, Romania, intermediate earthquakes), influences the seismic hazard in Bulgaria. In the present study, deterministic scenarios (expressed in seismic intensity) for two Bulgarian cities (Rouse and Plovdiv) are presented. The work on

  14. Scenario simulation based assessment of subsurface energy storage

    NASA Astrophysics Data System (ADS)

    Beyer, C.; Bauer, S.; Dahmke, A.

    2014-12-01

    Energy production from renewable sources such as solar or wind power is characterized by temporally varying power supply. The politically intended transition towards renewable energies in Germany ("Energiewende") hence requires the installation of energy storage technologies to compensate for the fluctuating production. In this context, subsurface energy storage represents a viable option due to large potential storage capacities and the wide prevalence of suitable geological formations. Technologies for subsurface energy storage comprise cavern or deep porous-media storage of synthetic hydrogen or methane from electrolysis and methanization, or compressed air, as well as heat storage in shallow or moderately deep porous formations. Pressure build-up, fluid displacement or temperature changes induced by such operations may affect local and regional groundwater flow, geomechanical behavior, groundwater geochemistry and microbiology. Moreover, subsurface energy storage may interact and possibly be in conflict with other "uses" like drinking water abstraction or ecological goods and functions. Utilization of the subsurface for energy storage therefore requires an adequate system and process understanding for the evaluation and assessment of possible impacts of specific storage operations on other types of subsurface use, the affected environment and protected entities. This contribution presents the framework of the ANGUS+ project, in which tools and methods are developed for these types of assessments. Synthetic but still realistic scenarios of geological energy storage are derived and parameterized for representative North German storage sites by data acquisition and evaluation, and experimental work. Coupled numerical hydraulic, thermal, mechanical and reactive transport (THMC) simulation tools are developed and applied to simulate the energy storage and subsurface usage scenarios, which are analyzed for an assessment and generalization of the imposed THMC

  15. Evaluating performance of law enforcement personnel during a stressful training scenario.

    PubMed

    Meyerhoff, James L; Norris, William; Saviolakis, George A; Wollert, Terry; Burge, Bob; Atkins, Valerie; Spielberger, Charles

    2004-12-01

    Police trainees who were ready to graduate from the Federal Law Enforcement Training Center (FLETC) volunteered to participate in an exercise designed to evaluate their survivability. In a highly stressful interactive scenario, which included a hostage situation, performance was evaluated for a range of responses, including: shooting judgment and accuracy, communications, and coping with a weapon malfunction. Nineteen percent of subjects shot the hostage, a failure rate that falls in the reported range of friendly fire casualties in military combat. The Spielberger Trait Anger Scale showed an association with shot placement and performance during the gunfight as well as with overall performance scores. PMID:15677421

  16. Robust Performance of Marginal Pacific Coral Reef Habitats in Future Climate Scenarios

    PubMed Central

    Freeman, Lauren A.

    2015-01-01

    Coral reef ecosystems are under dual threat from climate change. Increasing sea surface temperatures and thermal stress create environmental limits at low latitudes, and decreasing aragonite saturation state creates environmental limits at high latitudes. This study examines the response of unique coral reef habitats to climate change in the remote Pacific, using the National Center for Atmospheric Research Community Earth System Model version 1 alongside the species distribution algorithm Maxent. Narrow ranges of physico-chemical variables are used to define unique coral habitats and their performance is tested in future climate scenarios. General loss of coral reef habitat is expected in future climate scenarios and has been shown in previous studies. This study found exactly that for most of the predominant physico-chemical environments. However, certain coral reef habitats considered marginal today at high latitude, along the equator and in the eastern tropical Pacific were found to be quite robust in climate change scenarios. Furthermore, an environmental coral reef refuge previously identified in the central south Pacific near French Polynesia was further reinforced. Studying the response of specific habitats showed that the prevailing conditions of this refuge during the 20th century shift to a new set of conditions, more characteristic of higher latitude coral reefs in the 20th century, in future climate scenarios projected to 2100. PMID:26053439

  17. Robust Performance of Marginal Pacific Coral Reef Habitats in Future Climate Scenarios.

    PubMed

    Freeman, Lauren A

    2015-01-01

    Coral reef ecosystems are under dual threat from climate change. Increasing sea surface temperatures and thermal stress create environmental limits at low latitudes, and decreasing aragonite saturation state creates environmental limits at high latitudes. This study examines the response of unique coral reef habitats to climate change in the remote Pacific, using the National Center for Atmospheric Research Community Earth System Model version 1 alongside the species distribution algorithm Maxent. Narrow ranges of physico-chemical variables are used to define unique coral habitats and their performance is tested in future climate scenarios. General loss of coral reef habitat is expected in future climate scenarios and has been shown in previous studies. This study found exactly that for most of the predominant physico-chemical environments. However, certain coral reef habitats considered marginal today at high latitude, along the equator and in the eastern tropical Pacific were found to be quite robust in climate change scenarios. Furthermore, an environmental coral reef refuge previously identified in the central south Pacific near French Polynesia was further reinforced. Studying the response of specific habitats showed that the prevailing conditions of this refuge during the 20th century shift to a new set of conditions, more characteristic of higher latitude coral reefs in the 20th century, in future climate scenarios projected to 2100. PMID:26053439

  18. Scenarios and performance measures for advanced ISDN satellite design and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1991-01-01

    Described here are the contemplated input and expected output for the Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) and Full Service ISDN Satellite (FSIS) Models. The discrete event simulations of these models are presented with specific scenarios that stress ISDN satellite parameters. Performance measure criteria are presented for evaluating the advanced ISDN communication satellite designs of the NASA Satellite Communications Research (SCAR) Program.

  19. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data

    PubMed Central

    2011-01-01

    Background No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. Methods A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Results Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of individualized assessments

  20. On the Performance of Video Quality Assessment Metrics under Different Compression and Packet Loss Scenarios

    PubMed Central

    Martínez-Rach, Miguel O.; Piñol, Pablo; López, Otoniel M.; Perez Malumbres, Manuel; Oliver, José; Calafate, Carlos Tavares

    2014-01-01

    When comparing the performance of video coding approaches, evaluating different commercial video encoders, or measuring the perceived video quality in a wireless environment, rate/distortion analysis is commonly used, where distortion is usually measured in terms of PSNR values. However, PSNR does not always capture the distortion perceived by a human being. As a consequence, significant efforts have focused on defining an objective video quality metric that is able to assess quality in the same way as a human does. We perform a study of some available objective quality assessment metrics in order to evaluate their behavior in two different scenarios. First, we deal with video sequences compressed by different encoders at different bitrates in order to properly measure the video quality degradation associated with the encoding system. In addition, we evaluate the behavior of the quality metrics when measuring video distortions produced by packet losses in mobile ad hoc network scenarios with variable degrees of network congestion and node mobility. Our purpose is to determine if the analyzed metrics can replace the PSNR while comparing, designing, and evaluating video codec proposals, and, in particular, under video delivery scenarios characterized by bursty and frequent packet losses, such as wireless multihop environments. PMID:24982988
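
    PSNR, the baseline metric the study questions, is straightforward to compute per frame from the mean squared error against a reference frame. A minimal sketch (the random frames are used only for illustration, not real video data):

    ```python
    import numpy as np

    def psnr(reference: np.ndarray, distorted: np.ndarray, peak: float = 255.0) -> float:
        """Peak signal-to-noise ratio (dB) between a reference frame and a distorted frame."""
        mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")        # identical frames
        return 10.0 * np.log10(peak ** 2 / mse)

    # Illustrative 8-bit frames (random data).
    rng = np.random.default_rng(1)
    ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    noisy = np.clip(ref.astype(int) + rng.integers(-5, 6, size=ref.shape), 0, 255).astype(np.uint8)
    print(round(psnr(ref, noisy), 2))
    ```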

  1. Performance Analysis of Ad Hoc Routing Protocols in City Scenario for VANET

    NASA Astrophysics Data System (ADS)

    Das, Sanjoy; Raw, Ram Shringar; Das, Indrani

    2011-12-01

    In this paper, a performance analysis of the Location Aided Routing (LAR), AODV and DSR protocols in city scenarios has been carried out. The mobility model considered is the Manhattan model. This mobility model is used to emulate the movement pattern of nodes, i.e., vehicles, on streets defined by maps. Our objective is to provide a comparative analysis among the LAR, AODV and DSR protocols in city scenarios in Vehicular Ad hoc Networks. The simulation work has been conducted using the Glomosim 2.03 simulator. The results show that the LAR1 protocol achieves a maximum packet delivery ratio of 100% in a sparsely populated network. The delay is maximum in AODV, 121.88 ms, when the number of nodes in the network is 10. The results show that LAR1 outperforms DSR and AODV in terms of packet delivery ratio and end-to-end delay.
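
    The two metrics reported above, packet delivery ratio and end-to-end delay, are typically computed from simulator send/receive traces. A minimal sketch with made-up trace entries (not Glomosim output):

    ```python
    def packet_delivery_ratio(sent: dict, received: dict) -> float:
        """Percentage of sent packets that were received (both dicts keyed by packet id)."""
        return 100.0 * len(set(sent) & set(received)) / len(sent)

    def mean_end_to_end_delay_ms(sent: dict, received: dict) -> float:
        """Average delay (ms) over delivered packets; timestamps are in seconds."""
        delivered = set(sent) & set(received)
        return 1000.0 * sum(received[p] - sent[p] for p in delivered) / len(delivered)

    # Illustrative trace excerpts (packet id -> timestamp in s); values are made up.
    sent = {1: 0.000, 2: 0.100, 3: 0.200, 4: 0.300}
    received = {1: 0.045, 2: 0.162, 4: 0.410}
    print(packet_delivery_ratio(sent, received))              # 75.0
    print(round(mean_end_to_end_delay_ms(sent, received), 1))
    ```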

  2. A Problem-Based Learning Scenario That Can Be Used in Science Teacher Education

    ERIC Educational Resources Information Center

    Sezgin Selçuk, Gamze

    2015-01-01

    The purpose of this study is to introduce a problem-based learning (PBL) scenario that elementary school science teachers in middle school (5th-8th grades) can use in their in-service training. The scenario treats the subjects of heat, temperature and thermal expansion within the scope of the 5th and 6th grade science course syllabi and has been…

  3. Work-Based Learning in the UK: Scenarios for the Future

    ERIC Educational Resources Information Center

    Mohamud, Mohamed; Jennings, Chris; Rix, Mike; Gold, Jeff

    2006-01-01

    Purpose: Aims to consider scenarios created by work-based learning (WBL) providers in the Tees Valley in the UK. Design/methodology/approach: The context of WBL is examined in relation to the notion of the skills gap. The method of scenario development is described. Findings: A key task of WBL is to raise the skills levels of young people. WBL…

  4. A Scenario-Based Dieting Self-Efficacy Scale: The DIET-SE

    ERIC Educational Resources Information Center

    Stich, Christine; Knauper, Barbel; Tint, Ami

    2009-01-01

    The article discusses a scenario-based dieting self-efficacy scale, the DIET-SE, developed from dieter's inventory of eating temptations (DIET). The DIET-SE consists of items that describe scenarios of eating temptations for a range of dieting situations, including high-caloric food temptations. Four studies assessed the psychometric properties of…

  5. Scenario-based water resources planning for utilities in the Lake Victoria region

    NASA Astrophysics Data System (ADS)

    Mehta, V. K.; Aslam, O.; Dale, L.; Miller, N.; Purkey, D.

    2010-12-01

    Cities in the Lake Victoria (LV) region are experiencing the highest growth rates in Africa, at the same time that their water resources are threatened by domestic waste and industrial pollution. Urban centers use local springs, wetlands and Lake Victoria as source waters. As efforts to meet increasing demand accelerate, integrated water resources management (IWRM) tools provide opportunities for utilities and other stakeholders to develop a planning framework comprehensive enough to include short-term (e.g., land-use change) as well as longer-term (e.g., climate change) scenarios. This paper presents IWRM models built using the Water Evaluation And Planning (WEAP) decision support system for three pilot towns in the LV region - Bukoba (Tanzania), Masaka (Uganda), and Kisii (Kenya). Their current populations are 100,000, 70,000 and 200,000 respectively. Demand coverage is ~70% in Masaka and Bukoba, and less than 50% in Kisii. IWRM models for each town were calibrated under current system performance based on site visits, utility reporting and interviews. Projected water supply, demand, revenues and costs were then evaluated against a combination of climate, demographic and infrastructure scenarios up to 2050. In Masaka, flow and climate data were available to calibrate a runoff model to simulate streamflow at the water intake. In Masaka, without considering climate change, the system is infrastructure-limited, not water-availability (hydrology) limited, until 2035, under projected population growth of 2.17%. Under a wet climate scenario as projected by GCMs for the LV region, the current wetland source could supply all expected demands until 2050. Even under a drought scenario, the wetland could supply all demand until 2032, if the supply infrastructure is updated at an estimated cost of USD 10.8 million. However, demand targets can only be met at the expense of almost no water returning to the wetland downstream of the intake by 2035, unless substantial investments
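
    Findings of the "infrastructure-limited until 2035" type come from comparing projected demand growth with installed supply capacity year by year. A toy sketch of that comparison (the function names and all numbers are illustrative, not taken from the WEAP models described above):

    ```python
    def project_demand(demand_m3_per_day: float, growth_rate: float, years: int) -> list:
        """Project daily water demand forward assuming constant population/demand growth."""
        return [demand_m3_per_day * (1.0 + growth_rate) ** t for t in range(years + 1)]

    def first_shortfall_year(start_year: int, demands: list, capacity_m3_per_day: float):
        """First year in which projected demand exceeds supply capacity, or None."""
        for t, d in enumerate(demands):
            if d > capacity_m3_per_day:
                return start_year + t
        return None

    # Illustrative numbers only: 2.17%/yr growth against a fixed design capacity.
    demands = project_demand(demand_m3_per_day=9000.0, growth_rate=0.0217, years=40)
    print(first_shortfall_year(2010, demands, capacity_m3_per_day=15500.0))
    ```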

  6. Scenario-based water resources planning for utilities in the Lake Victoria region

    NASA Astrophysics Data System (ADS)

    Mehta, Vishal K.; Aslam, Omar; Dale, Larry; Miller, Norman; Purkey, David R.

    Urban areas in the Lake Victoria (LV) region are experiencing the highest growth rates in Africa. As efforts to meet increasing demand accelerate, integrated water resources management (IWRM) tools provide opportunities for utilities and other stakeholders to develop a planning framework comprehensive enough to include short-term (e.g., land-use change) as well as longer-term (e.g., climate change) scenarios. This paper presents IWRM models built using the Water Evaluation And Planning (WEAP) decision support system for three towns in the LV region - Bukoba (Tanzania), Masaka (Uganda), and Kisii (Kenya). Each model was calibrated under current system performance based on site visits, utility reporting and interviews. Projected water supply, demand, revenues and costs were then evaluated against a combination of climate, demographic and infrastructure scenarios up to 2050. Our results show that water supply in all three towns is currently infrastructure-limited; achieving existing design capacity could meet most projected demand until the 2020s in Masaka, beyond which new supply and conservation strategies would be needed. In Bukoba, reducing leakages would provide little performance improvement in the short term, but doubling capacity would meet all demands until 2050. In Kisii, major infrastructure investment is urgently needed. In Masaka, streamflow simulations show that wetland sources could satisfy all demand until 2050, but at the cost of almost no water downstream of the intake. These models demonstrate the value of IWRM tools for developing water management plans that integrate hydroclimatology-driven supply and demand projections on a single platform.

  7. Performance Based Counselor Certification.

    ERIC Educational Resources Information Center

    Bernknopf, Stan; Ware, William B.

    For the past four years the Georgia Department of Education has been involved in a statewide effort to establish standards and procedures for certification of educational personnel based on competency demonstration. As part of this effort, a project was commissioned to develop a performance-based system for the certification of school counselors.…

  8. Accessing technical data bases using STDS: A collection of scenarios

    NASA Technical Reports Server (NTRS)

    Hardgrave, W. T.

    1975-01-01

    A line by line description is given of sessions using the set-theoretic data system (STDS) to interact with technical data bases. The data bases contain data from actual applications at NASA Langley Research Center. The report is meant to be a tutorial document that accompanies set processing in a network environment.

  9. Mannich Bases: An Important Pharmacophore in Present Scenario

    PubMed Central

    Sharma, Neha; Kajal, Anu; Saini, Vipin

    2014-01-01

    Mannich bases are the end products of the Mannich reaction and are known as beta-amino ketone carrying compounds. The Mannich reaction is a carbon-carbon bond-forming nucleophilic addition reaction and is a key step in the synthesis of a wide variety of natural products, pharmaceuticals, and so forth. The Mannich reaction is important for the construction of nitrogen-containing compounds. There are a number of aminoalkyl-chain-bearing Mannich bases, like fluoxetine, atropine, ethacrynic acid, trihexyphenidyl, and so forth, with high curative value. Literature studies highlight the fact that Mannich bases are very reactive and are recognized to possess potent and diverse activities such as anti-inflammatory, anticancer, antifilarial, antibacterial, antifungal, anticonvulsant, anthelmintic, antitubercular, analgesic, anti-HIV, antimalarial, antipsychotic and antiviral activities, and so forth. The biological activity of Mannich bases is mainly attributed to the α,β-unsaturated ketone, which can be generated by deamination of a hydrogen atom of the amine group. PMID:25478226

  10. Scenario based approach for multiple source Tsunami Hazard assessment for Sines, Portugal

    NASA Astrophysics Data System (ADS)

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-08-01

    In this paper, we present a scenario-based approach for tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE. Sines holds one of the most important deep-water ports, which contains oil-bearing, petrochemical, liquid bulk, coal and container terminals. The port and its industrial infrastructure face the ocean to the southwest, towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model With Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level) and MHHW (mean higher high water). For each scenario, inundation is described by maximum values of wave height, flow depth, drawback, run-up and inundation distance. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite of the Horseshoe and Marques de Pombal faults as the worst-case scenario. It governs the aggregate scenario with a contribution of about 60% and inundates an area of 3.5 km².
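
    The aggregate scenario referred to above is commonly built as the cell-wise worst case over the individual scenario grids. A minimal sketch of that aggregation and of an inundated-area calculation (toy grids and function names, not the Sines results):

    ```python
    import numpy as np

    def aggregate_scenario(flow_depth_grids: list) -> np.ndarray:
        """Aggregate (worst-case) flow depth: cell-wise maximum over all single scenarios."""
        return np.maximum.reduce(flow_depth_grids)

    def inundated_area_km2(flow_depth: np.ndarray, cell_size_m: float,
                           threshold_m: float = 0.0) -> float:
        """Area with flow depth above a threshold, on a regular grid of given cell size."""
        return float(np.count_nonzero(flow_depth > threshold_m)) * cell_size_m ** 2 / 1e6

    # Two illustrative 3x3 flow-depth grids (m); real studies use nested high-resolution grids.
    s1 = np.array([[0.0, 0.2, 0.0], [0.5, 1.0, 0.1], [0.0, 0.3, 0.0]])
    s2 = np.array([[0.1, 0.0, 0.0], [0.4, 1.5, 0.2], [0.0, 0.0, 0.0]])
    agg = aggregate_scenario([s1, s2])
    print(inundated_area_km2(agg, cell_size_m=10.0))
    ```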

  11. River discharge and flood inundation over the Amazon based on IPCC AR5 scenarios

    NASA Astrophysics Data System (ADS)

    Paiva, Rodrigo; Sorribas, Mino; Jones, Charles; Carvalho, Leila; Melack, John; Bravo, Juan Martin; Beighley, Edward

    2015-04-01

    Climate change and related effects on the hydrologic regime of the Amazon River basin could have major impacts on human and ecological communities, including issues with transportation, flood vulnerability, fisheries and hydropower generation. We examined future changes in discharge and floodplain inundation within the Amazon River basin. We used the hydrological model MGB-IPH (Modelo de Grandes Bacias - Instituto de Pesquisas Hidráulicas) coupled with a 1D river hydrodynamic model simulating water storage over the floodplains. The model was forced using satellite-based precipitation from the TRMM 3B42 dataset, and it performed well when validated against discharge and stage measurements as well as remotely sensed data, including radar altimetry-based water levels, gravity anomaly-based terrestrial water storage and flood inundation extent. Future scenarios of precipitation and other relevant climatic variables for the 2070 to 2100 time period were taken from five coupled atmosphere-ocean general circulation models (AOGCMs) from IPCC's Fifth Assessment Report (AR5) Coupled Model Intercomparison Project Phase 5 (CMIP5). The climate models were chosen based on their ability to represent the main aspects of recent (1970 to 2000) Amazon climate. A quantile-quantile bias removal procedure was applied to climate model precipitation to mitigate unreliable predictions. The hydrologic model was then forced using past observed climate data altered by delta change factors based on the past and future climate model runs, aiming to estimate projected discharge and floodplain inundation under climate change scenarios at several control points in the basin. The climate projections present large uncertainty, especially in the precipitation rate, and predictions using different climate models do not agree on the sign of changes in total Amazon flood extent or discharge along the main stem of the Amazon River. However, analyses of results at different regions indicate an increase
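
    A simplified sketch of the quantile-based delta-change idea described above: change factors are derived per quantile from the historical and future GCM runs and applied to the observed forcing. This is a minimal stand-in for the study's bias-removal procedure, run on synthetic data rather than CMIP5 or TRMM series:

    ```python
    import numpy as np

    def delta_change_factors(gcm_hist, gcm_future, n_quantiles: int = 10) -> np.ndarray:
        """Multiplicative change factors per quantile: future GCM precip / historical GCM precip."""
        qs = np.linspace(0.05, 0.95, n_quantiles)
        hist_q = np.quantile(gcm_hist, qs)
        fut_q = np.quantile(gcm_future, qs)
        return fut_q / np.maximum(hist_q, 1e-6)   # guard against near-zero precipitation

    def perturb_observations(obs, gcm_hist, factors) -> np.ndarray:
        """Scale each observed value by the change factor of the quantile it falls in."""
        qs = np.linspace(0.05, 0.95, len(factors))
        hist_q = np.quantile(gcm_hist, qs)
        idx = np.clip(np.searchsorted(hist_q, obs), 0, len(factors) - 1)
        return np.asarray(obs) * factors[idx]

    # Synthetic daily precipitation series (mm/day), for illustration only.
    rng = np.random.default_rng(2)
    obs, hist, fut = rng.gamma(2, 4, 300), rng.gamma(2, 4, 300), rng.gamma(2, 5, 300)
    perturbed = perturb_observations(obs, hist, delta_change_factors(hist, fut))
    print(round(float(obs.mean()), 2), round(float(perturbed.mean()), 2))
    ```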

  12. Scenario-Based Programming, Usability-Oriented Perception

    ERIC Educational Resources Information Center

    Alexandron, Giora; Armoni, Michal; Gordon, Michal; Harel, David

    2014-01-01

    In this article, we discuss the possible connection between the programming language and the paradigm behind it, and programmers' tendency to adopt an external or internal perspective of the system they develop. Based on a qualitative analysis, we found that when working with the visual, interobject language of live sequence charts (LSC),…

  13. Review of scenario selection approaches for performance assessment of high-level waste repositories and related issues.

    SciTech Connect

    Banano, E.J.; Baca, R.G.

    1995-08-01

    The selection of scenarios representing plausible realizations of the future conditions, with associated probabilities of occurrence, that can affect the long-term performance of a high-level radioactive waste (HLW) repository is the commonly used method for treating the uncertainty in the prediction of the future states of the system. This method, conventionally referred to as the "scenario approach," while common, is not the only method to deal with this uncertainty; other methods, such as the environmental simulation approach (ESA), have also been proposed. Two of the difficulties with the scenario approach are the lack of uniqueness in the definition of the term "scenario" and the lack of uniqueness in the approach to formulate scenarios, which relies considerably on subjective judgments. Consequently, it is difficult to assure that a complete and unique set of scenarios can be defined for use in a performance assessment. Because scenarios are key to the determination of the long-term performance of the repository system, this lack of uniqueness can present a considerable challenge when attempting to reconcile the sets of scenarios, and their level of detail, obtained using different approaches, particularly among proponents and regulators of a HLW repository.

  14. Applying fuzzy bi-dimensional scenario-based model to the assessment of Mars mission architecture scenarios

    NASA Astrophysics Data System (ADS)

    Tavana, Madjid; Zandi, Faramak

    2012-02-01

    Sending man to Mars has been a long-held dream of humankind. NASA plans human planetary explorations using approaches that are technically feasible, have reasonable risks and have relatively low costs. This study presents a novel Multi-Attribute Decision Making (MADM) model for evaluating a range of potential mission scenarios for the human exploration of Mars. The three alternatives identified by the Mission Operations Directorate (MOD) at the Johnson Space Center (JSC) include split mission, combo lander and dual scenarios. The proposed framework subsumes the following key methods: first, the conjunction method is used to minimize the number of alternative mission scenarios; second, the Fuzzy Risk Failure Mode and Effects Analysis (RFMEA) is used to analyze the potential failure of the alternative scenarios; third, the fuzzy group Real Option Analysis (ROA) is used to estimate the expected costs and benefits of the alternative scenarios; and fourth, the fuzzy group permutation approach is used to select the optimal mission scenario. We present the results of a case study at NASA's Johnson Space Center to demonstrate: (1) the complexity of mission scenario selection involving subjective and objective judgments provided by multiple space exploration experts; and (2) a systematic and structured method for aggregating quantitative and qualitative data concerning a large number of competing and conflicting mission events.

  15. Ground surface temperature scenarios in complex high-mountain topography based on regional climate model results

    NASA Astrophysics Data System (ADS)

    Salzmann, Nadine; Nötzli, Jeannette; Hauck, Christian; Gruber, Stephan; Hoelzle, Martin; Haeberli, Wilfried

    2007-06-01

    Climate change can have severe impacts on the high-mountain cryosphere, such as instabilities in rock walls induced by thawing permafrost. Relating climate change scenarios produced from global climate models (GCMs) and regional climate models (RCMs) to complex high-mountain environments is a challenging task. The qualitative and quantitative impact of changes in climatic conditions on local to microscale ground surface temperature (GST) and the ground thermal regime is not readily apparent. This study assesses a possible range of changes in the GST (ΔGST) in complex mountain topography. To account for uncertainties associated with RCM output, a set of 12 different scenario climate time series (including 10 RCM-based and 2 incremental scenarios) was applied to the topography and energy balance (TEBAL) model to simulate average ΔGST for 36 different topographic situations. Variability of the simulated ΔGST is related primarily to the emission scenarios, the RCM, and the approach used to apply RCM results to the impact model. In terms of topography, significant influence on GST simulation was shown by aspect because it modifies the received amount of solar radiation at the surface. North faces showed higher sensitivity to the applied climate scenarios, while uncertainties are higher for south faces. On the basis of the results of this study, use of RCM-based scenarios is recommended for mountain permafrost impact studies, as opposed to incremental scenarios.

  16. Elements of Scenario-Based Learning on Suicidal Patient Care Using Real-Time Video.

    PubMed

    Lu, Chuehfen; Lee, Hueying; Hsu, Shuhui; Shu, Inmei

    2016-01-01

    This study aims to understand students' learning experiences when receiving scenario-based learning combined with real-time video. Videos recording student nurses' interventions with a suicidal standardized patient (SP) were replayed immediately as teaching materials. Video clips and field notes from ten classes were analysed. Investigator and method triangulation were used to boost the robustness of the study. Three key elements, emotional involvement, concretizing of the teaching material and substitute learning, were identified. Emotions were evoked among the SP, the student performer and the students who were observing, thus facilitating a learning effect. Concretizing of the teaching material refers to students being able to focus on the discussions using visual and verbal information. Substitute learning occurred when the students watched the videos; both the strengths and weaknesses represented were similar to those that would be likely to occur in practice. These key elements explicate the students' learning experience and suggest a strategic teaching method. PMID:27332202

  17. ABM and GIS-based multi-scenarios volcanic evacuation modelling of Merapi

    NASA Astrophysics Data System (ADS)

    Jumadi; Carver, Steve; Quincey, Duncan

    2016-05-01

    Conducting an effective evacuation is one of the keys to dealing with a volcanic crisis. Therefore, a plan that considers the probability of the spatial extent of the hazard occurrence is needed. Likewise, an evacuation plan for Merapi had already been prepared before the eruption in 2010. However, the plan could not be carried out because the eruption magnitude was bigger than predicted. In this situation, the extent of the hazardous area increased beyond that of the prepared hazard model. Managing such an unpredicted situation needs adequate information that is flexible and adaptable to the current situation. Therefore, we applied an Agent-based Model (ABM) and Geographic Information System (GIS) using a multi-scenario hazard model to support evacuation management. The methodology and a case study for Merapi are provided.

  18. Ontology-based Software for Generating Scenarios for Characterizing Searches for Nuclear Materials

    SciTech Connect

    Ward, Richard C; Sorokine, Alexandre; Schlicher, Bob G; Wright, Michael C; Kruse, Kara L

    2011-01-01

    A software environment was created in which ontologies are used to significantly expand the number and variety of scenarios for special nuclear materials (SNM) detection based on a set of simple generalized initial descriptions. A framework was built that combined advanced reasoning from ontologies with geographical and other data sources to generate a much larger list of specific detailed descriptions from a simple initial set of user-input variables. This presentation shows how basing the scenario generation on a process of inferencing from multiple ontologies, including a new SNM Detection Ontology (DO) combined with data extraction from geodatabases, provided the desired significant variability of scenarios for testing search algorithms, including unique combinations of variables not previously expected. The various components of the software environment and the resulting scenarios generated will be discussed.

  19. Advance yield markings and drivers’ performance in response to multiple-threat scenarios at mid-block crosswalks

    PubMed Central

    Fisher, Donald; Garay-Vega, Lisandra

    2012-01-01

    This study compares, on a simulator, drivers’ performance (eye fixations and yielding behavior) at marked mid-block crosswalks in multi-threat scenarios when the crosswalks have advance yield markings and pedestrian crosswalk prompt signs versus their performance in such scenarios when the crosswalks have standard markings. Advance yield markings and prompt signs in multi-threat scenarios lead to changes in drivers’ behaviors which are likely to reduce pedestrian–vehicle conflicts, including increases in the likelihood that the driver glances towards the pedestrian, increases in the distance at which the first glance towards the pedestrian is taken, and increases in the likelihood of yielding to the pedestrian. PMID:22062334

  20. Design Scenarios for Web-Based Management of Online Information

    NASA Astrophysics Data System (ADS)

    Hepting, Daryl H.; Maciag, Timothy

    The Internet enables access to more information, from a greater variety of perspectives and with greater immediacy, than ever before. A person may be interested in information to become more informed or to coordinate his or her local activities and place them into a larger, more global context. The challenge, as has been noted by many, is to sift through all the information to find what is relevant without becoming overwhelmed. Furthermore, the selected information must be put into an actionable form. The diversity of the Web has important consequences for the variety of ideas that are now available. While people once relied on newspaper editors to shape their view of the world, today's technology creates room for a more democratic approach. Today it is easy to pull news feeds from a variety of sources and aggregate them. It is less easy to push that information to a variety of channels. At a higher level, we might have the goal of collecting all the available information about a certain topic, on a daily basis. There are many new technologies available under the umbrella of Web 2.0, but it can be difficult to use them together for the management of online information. Web-based support for online communication management is the most appropriate choice to address the deficiencies apparent with current technologies. We consider the requirements and potential designs for such information management support, by following an example related to local food.

  1. Nanocarriers Based Anticancer Drugs: Current Scenario and Future Perceptions.

    PubMed

    Raj, Rakesh; Mongia, Pooja; Kumar Sahu, Suresh; Ram, Alpana

    2016-01-01

    Anticancer therapies mostly depend on the ability of the bioactives to reach their designated cellular and subcellular target sites, while minimizing accumulation and side effects at non-specific sites. The development of nanotechnology-based drug delivery systems that are able to modify the biodistribution, tissue uptake and pharmacokinetics of therapeutic agents is considered of great importance in biomedical research and treatment therapy. Controlled release from nanocarriers can significantly enhance the therapeutic effect of a drug. Nanotechnology has the potential to revolutionize cancer diagnosis and therapy. Targeted nanomedicines, either marketed or under development, are designed for the treatment of various types of cancer. Nanocarriers are able to reduce the cytotoxic effects of active anticancer drugs by increasing cancer cell targeting in comparison to conventional formulations. Newly developed nanodevices such as quantum dots, liposomes, nanotubes, nanoparticles, micelles, gold nanoparticles, carbon nanotubes and solid lipid nanoparticles are the most promising applications for various cancer treatments. This review is focused on currently available information regarding pharmaceutical nanocarriers for cancer therapy and imaging. PMID:26201484

  2. Performance-based ratemaking

    SciTech Connect

    Cross, P.S.

    1995-07-15

    Performance-based ratemaking (PBR) departs from the cost-of-service standard in setting just and reasonable utility rates, but that departure isn't as easy as it looks. Up until now, cost-of-service ratemaking has provided relatively stable rates, while enabling utilities to attract enormous amounts of capital. Of late, however, regulators appear to be heeding the argument that changing markets warrant a second look. Throughout the country and across the utility industry, some regulators appear willing to abandon cost of service as a proxy for competition, instead favoring performance-based methods that would rely on competitive forces. These performance-based schemes vary in their details but generally afford utilities the opportunity to increase profits by exceeding targets for efficiency and cost savings. Moreover, these plans purport to streamline the regulatory process. Annual, accounting-type reviews replace rate hearings. Cost-of-service studies might not be required at all once initial rates are fixed. Nevertheless, these PBR plans rely on cost-based rates as a starting point and still contain safeguards to protect ratepayers. PBR falls short of true deregulation. As the Massachusetts Department of Public Utilities noted recently in an order approving a PBR variant known as price-cap regulation for New England Telephone and Telegraph Co., "price-cap regulation is not deregulation; it is merely another way for regulators to control the rates charged by a firm."

  3. Scenario Simulation-Based Assessment of Trip Difficulty for Urban Residents under Rainstorm Waterlogging

    PubMed Central

    Chen, Peng; Zhang, Jiquan; Jiang, Xinyu; Liu, Xingpeng; Bao, Yulong; Sun, Yingyue

    2012-01-01

    In this study, an experiment was performed to assess the trip difficulty for urban residents of different age groups walking in various depths of water, and the data were corroborated with real urban rainstorm waterlogging scenarios in downtown (Daoli district) Ha-Erbin (China). Mathematical models of urban rainstorm waterlogging were constructed using scenario simulation methods, aided by GIS spatial analysis technology and hydrodynamic analysis of the waterway systems in the study area. These models were then used to evaluate the impact of waterlogging on the safety of residents walking in the affected area. Results are summarized as follows: (1) for an urban rainstorm waterlogging scenario recurring once every 10 years, three grid regions would have waterlogging above 0.5 m moving at a velocity of 1.5 m/s; under this scenario, waterlogging would accumulate on traffic roads only in small areas, affecting the safety and mobility of residents walking in the neighborhood; (2) for an urban rainstorm waterlogging scenario recurring once every 20 years, 13 grids experienced the same waterlogging situation, affecting a larger area of the city; (3) for an urban rainstorm waterlogging scenario recurring once every 50 years, 86 grid regions were affected (waterlogging above 0.5 m moving at 1.5 m/s), and those areas would become impassable for residents. PMID:22829790
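
    The grid counts quoted above follow from flagging cells whose simulated depth and velocity jointly exceed the walking-safety thresholds. A minimal sketch using the 0.5 m and 1.5 m/s values as joint criteria (toy grids and a simplified criterion, not the age-group-specific assessment or the Daoli district results):

    ```python
    import numpy as np

    def impassable_cells(depth_m: np.ndarray, velocity_ms: np.ndarray,
                         depth_limit: float = 0.5, velocity_limit: float = 1.5) -> np.ndarray:
        """Boolean mask of grid cells where walking is judged unsafe (depth and velocity both high)."""
        return (depth_m >= depth_limit) & (velocity_ms >= velocity_limit)

    # Illustrative 2x3 waterlogging grids.
    depth = np.array([[0.2, 0.6, 0.7], [0.4, 0.9, 0.1]])
    velocity = np.array([[1.0, 1.6, 1.2], [2.0, 1.8, 0.5]])
    print(int(impassable_cells(depth, velocity).sum()))   # number of impassable grid cells
    ```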

  4. Scenario-Based Training on Human Errors Contributing to Security Incidents

    SciTech Connect

    Greitzer, Frank L.; Pond, Daniel J.; Jannotta, Marjorie

    2004-12-06

    Error assessment studies reveal that "human errors" are often the consequence of unsuitable environmental factors, ineffective systems, inappropriate task conditions, and individual actions or failures to act. The US Department of Energy (DOE) initiated a program to determine if system-induced human errors could also be contributing factors to security incidents. As the seminal basis for this work, the Enhanced Security Through Human Error Reduction (ESTHER) program at Los Alamos National Laboratory (LANL) produced a contributing factors data set and systems categorization for security related incidents attributed to human error. This material supports the development and delivery of training for security incident inquiry officials. While LANL's initial work focused on classroom training, a collaborative effort between LANL and Pacific Northwest National Laboratory (PNNL) has focused on delivering interactive e-Learning training applications based on ESTHER principles. Through training, inquiry officials will understand and be capable of applying the underlying human error control concepts to new or novel situations. Their performance requires a high degree of analysis and judgment to accomplish the associated cognitive and procedural tasks. To meet this requirement, we employed cognitive principles of instructional design to engage the learner in interactive, realistic, problem-centered activity; we constructed scenarios within a guided-discovery framework; and we utilized learner-centered developmental sequences leading to field application. To enhance the relevance and realism of the training experience, we employed 3-D modeling technologies in constructing interactive scenarios. This paper describes the application of cognitive learning principles, use of varied media, and the implementation challenges in developing a technology-rich, interactive security incident training program that includes Web-based training.

  5. Real-time determination of the worst tsunami scenario based on Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Furuya, Takashi; Koshimura, Shunichi; Hino, Ryota; Ohta, Yusaku; Inoue, Takuya

    2016-04-01

    In recent years, real-time tsunami inundation forecasting has been developed with the advances of dense seismic monitoring, GPS Earth observation, offshore tsunami observation networks, and high-performance computing infrastructure (Koshimura et al., 2014). Several uncertainties are involved in tsunami inundation modeling, and the tsunami generation model is believed to be one of the largest sources of uncertainty. An uncertain tsunami source model risks underestimating tsunami height, the extent of the inundation zone, and damage. Tsunami source inversion using observed seismic, geodetic, and tsunami data is the most effective way to avoid underestimating the tsunami, but acquiring the observed data takes time, and this limitation makes it difficult to complete real-time tsunami inundation forecasting quickly enough. Rather than waiting for precise tsunami observations, we aim, from a disaster management point of view, to determine the worst tsunami source scenario for use in real-time tsunami inundation forecasting and mapping, using the seismic information of Earthquake Early Warning (EEW), which can be obtained immediately after an event is triggered. After an earthquake occurs, JMA's EEW estimates its magnitude and hypocenter. With the constraints of earthquake magnitude, hypocenter, and a scaling law, we determine possible tsunami source scenarios and search for the worst one by superposition of pre-computed tsunami Green's functions, i.e., time series of tsunami height at offshore points corresponding to 2-dimensional Gaussian unit sources (e.g., Tsushima et al., 2014). The scenario analysis in our method consists of the following two steps. (1) Searching for the worst-scenario range by calculating 90 scenarios with various strikes and fault positions; from the maximum tsunami heights of the 90 scenarios, we determine a narrower strike range that causes high tsunami heights in the area of concern. (2) Calculating 900 scenarios that have different strike, dip, length
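
    The superposition step described above can be illustrated with a minimal sketch: a candidate scenario's waveform at an offshore gauge is approximated as a slip-weighted sum of pre-computed unit-source time series, and the "worst" candidate is the one with the largest peak. The arrays, candidate count, and function names below are synthetic placeholders, not the authors' actual Green's functions or search procedure.

```python
# Minimal sketch of Green's-function superposition for one offshore gauge.
# Arrays here are synthetic placeholders.
import numpy as np

def scenario_waveform(green_functions, slips):
    """green_functions: (n_sources, n_times) unit-source tsunami heights at one gauge.
    slips: (n_sources,) slip amounts assigned by a candidate scenario."""
    G = np.asarray(green_functions, dtype=float)
    s = np.asarray(slips, dtype=float)
    return s @ G   # linear superposition of unit-source time series

def worst_scenario(green_functions, candidate_slips):
    """Pick the candidate slip distribution giving the largest peak height."""
    peaks = [scenario_waveform(green_functions, s).max() for s in candidate_slips]
    i = int(np.argmax(peaks))
    return i, peaks[i]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    G = rng.normal(0.0, 0.1, size=(5, 200))           # 5 unit sources, 200 time steps
    candidates = rng.uniform(0.0, 3.0, size=(90, 5))  # 90 candidate slip distributions
    idx, peak = worst_scenario(G, candidates)
    print(f"worst candidate: {idx}, peak height {peak:.2f} m")
```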

  6. The centricity of presence in scenario-based high fidelity human patient simulation: a model.

    PubMed

    Dunnington, Renee M

    2015-01-01

    Enhancing immersive presence has been shown to have influence on learning outcomes in virtual types of simulation. Scenario-based human patient simulation, a mixed reality form, may pose unique challenges for inducing the centricity of presence among participants in simulation. A model for enhancing the centricity of presence in scenario-based human patient simulation is presented here. The model represents a theoretical linkage among the interaction of pedagogical, individual, and group factors that influence the centricity of presence among participants in simulation. Presence may have an important influence on the learning experiences and learning outcomes in scenario-based high fidelity human patient simulation. This report is a follow-up to an article published in 2014 by the author where connections were made to the theoretical basis of presence as articulated by nurse scholars. PMID:25520467

  7. SAFRR AND Physics-Based Scenarios: The Power of Scientifically Credible Stories

    NASA Astrophysics Data System (ADS)

    Cox, D. A.; Jones, L.

    2015-12-01

    USGS's SAFRR (Science Application for Risk Reduction) Project and its predecessor, the Multi Hazards Demonstration Project, use the latest earth science to develop scenarios so that communities can improve disaster resilience. SAFRR has created detailed physics-based natural-disaster scenarios of a M7.8 San Andreas earthquake in southern California (ShakeOut), atmospheric-river storms rivaling the Great California Flood of 1862 (ARkStorm), a Tohoku-sized earthquake and tsunami in the eastern Aleutians (SAFRR Tsunami), and now a M7.05 quake on the Hayward Fault in the San Francisco Bay area (HayWired), as novel ways of providing science for decision making. Each scenario is scientifically plausible, deterministic, and large enough to demand attention but not so large as to strain credibility. The scenarios address interacting hazards, requiring involvement of multiple science disciplines and user communities. The scenarios routinely expose hitherto unknown or ignored vulnerabilities, most often in cascading effects missed when impacts are considered in isolation. They take advantage of storytelling to provide decision makers with clear explanations and justifications for mitigation and preparedness actions, and have been used for national-to-local disaster response exercises and planning. Effectiveness is further leveraged by downscaling the scenarios to local levels. For example, although the ARkStorm scenario describes state-scale events and has been used that way by NASA and the Navy, SAFRR also partnered with FEMA to focus on two local areas: Ventura County in the coastal plain and the mountain setting of Lake Tahoe, with downstream impacts in Reno, Sparks, and Carson City. Downscaling and focused analyses increased usefulness to user communities, drawing new participants into the study. SAFRR scenarios have also motivated new research to answer questions uncovered by stakeholders, closing the circle of co-evolving disaster-science and disaster-response improvements.

  8. A multivariate copula-based framework for dealing with hazard scenarios and failure probabilities

    NASA Astrophysics Data System (ADS)

    Salvadori, G.; Durante, F.; De Michele, C.; Bernardi, M.; Petrella, L.

    2016-05-01

    This paper is of a methodological nature and deals with the foundations of Risk Assessment. Several international guidelines have recently recommended selecting appropriate/relevant Hazard Scenarios in order to tame the consequences of (extreme) natural phenomena. In particular, the scenarios should be multivariate, i.e., they should take into account the fact that several variables, generally not independent, may be of interest. In this work, it is shown how a Hazard Scenario can be identified in terms of (i) a specific geometry and (ii) a suitable probability level. Several scenarios, as well as a Structural approach, are presented, and due comparisons are carried out. In addition, it is shown how the Hazard Scenario approach illustrated here is well suited to cope with the notion of Failure Probability, a tool traditionally used for design and risk assessment in engineering practice. All the results outlined throughout the work are based on Copula Theory, which turns out to be a fundamental theoretical apparatus for multivariate risk assessment: formulas for calculating the probability of Hazard Scenarios in the general multidimensional case (d≥2) are derived, and useful analytical relationships among the probabilities of occurrence of Hazard Scenarios are presented. In addition, the Extreme Value and Archimedean special cases are dealt with, relationships between dependence ordering and scenario levels are studied, and a counter-example concerning Tail Dependence is shown. Suitable indications for the practical application of the techniques outlined in the work are given, and two case studies illustrate the procedures discussed in the paper.
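
    As a minimal worked example of the copula machinery the abstract refers to (not the paper's own formulas), the sketch below computes the probability of a bivariate "AND" Hazard Scenario, P(X > x and Y > y), from marginal probability levels via a Gumbel copula, which is both an Extreme Value and an Archimedean copula; the parameter theta and the levels are illustrative.

```python
# Joint exceedance probability of a bivariate "AND" Hazard Scenario via a
# Gumbel copula. Parameter theta and the levels u, v are illustrative assumptions.
import math

def gumbel_copula(u, v, theta):
    """C(u, v) for the Gumbel copula, theta >= 1 (theta = 1 is independence)."""
    lu, lv = -math.log(u), -math.log(v)
    return math.exp(-((lu ** theta + lv ** theta) ** (1.0 / theta)))

def and_scenario_prob(u, v, theta):
    """P(U > u, V > v) = 1 - u - v + C(u, v) by inclusion-exclusion."""
    return 1.0 - u - v + gumbel_copula(u, v, theta)

if __name__ == "__main__":
    u = v = 0.99            # marginal non-exceedance levels (e.g. 100-year events)
    for theta in (1.0, 2.0, 5.0):
        p = and_scenario_prob(u, v, theta)
        print(f"theta={theta}: joint exceedance probability = {p:.5f}")
```

    With theta = 1 the result reduces to the independent case (0.01 x 0.01 = 0.0001); stronger dependence raises the joint exceedance probability, which is why ignoring dependence can understate hazard.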

  9. Hybrid Modeling for Scenario-Based Evaluation of Failure Effects in Advanced Hardware-Software Designs

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land; Throop, David

    2001-01-01

    This paper describes an incremental scenario-based simulation approach to the evaluation of intelligent software for control and management of hardware systems. A hybrid continuous/discrete-event simulation of the hardware dynamically interacts with the intelligent software in operations scenarios. Embedded anomalous conditions and failures in the simulated hardware can lead to emergent software behavior and identification of missing or faulty software or hardware requirements. An approach is described for extending simulation-based automated incremental failure modes and effects analysis to support concurrent evaluation of intelligent software and the hardware controlled by the software.

  10. Response of extreme flood characteristics based on future climate change scenarios at Yermasoyia watershed, Cyprus

    NASA Astrophysics Data System (ADS)

    Vasiliades, Lampros; Gkilimanakis, Eleftherios; Loukas, Athanasios

    2014-05-01

    The aim of this study, performed within Working Group 4 of the FloodFreq COST Action, is to assess and quantify changes in daily streamflow and the subsequent flood response due to potential climate change in the Yermasoyia watershed, Cyprus. Eight statistical downscaling methods are used to estimate historical and future daily precipitation and temperature time series. Four methods are based on change factors and four are bias-correction methods; these are used to downscale precipitation and temperature output from fifteen RCMs from the ENSEMBLES project. Several well-known lumped hydrological model structures (such as the GR4J and IHACRES models, and the AWBM) are applied to estimate daily streamflows. Model performance is evaluated with fit statistics for calibration and validation periods using the split-sample test. A set of flood indices is derived from the daily simulated streamflows, and their changes have been evaluated by comparing the periods 1960-1990 and 2070-2100. The results show that both the magnitude and the volume of annual peak flows decrease for all examined scenarios, downscaling methods, and hydrological models employed.
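
    A minimal sketch of the flood-index comparison step, assuming synthetic daily streamflow for the two periods: annual maxima are extracted and the relative change in the mean annual peak is reported. The period lengths and flow series are illustrative, not the study's simulations.

```python
# Compare a simple flood index (mean annual peak flow) between two periods
# of simulated daily streamflow. The streamflow series here are synthetic.
import numpy as np

def annual_maxima(daily_flow, days_per_year=365):
    q = np.asarray(daily_flow, dtype=float)
    n_years = len(q) // days_per_year
    return q[:n_years * days_per_year].reshape(n_years, days_per_year).max(axis=1)

def peakflow_change(baseline_flow, future_flow):
    """Percent change in the mean annual peak flow between two periods."""
    b = annual_maxima(baseline_flow).mean()
    f = annual_maxima(future_flow).mean()
    return 100.0 * (f - b) / b

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    baseline = rng.gamma(2.0, 5.0, size=30 * 365)   # baseline-period daily flows
    future = rng.gamma(2.0, 4.0, size=30 * 365)     # drier future-scenario flows
    print(f"change in mean annual peak: {peakflow_change(baseline, future):+.1f}%")
```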

  11. Design Process of a Goal-Based Scenario on Computing Fundamentals

    ERIC Educational Resources Information Center

    Beriswill, Joanne Elizabeth

    2014-01-01

    In this design case, an instructor developed a goal-based scenario (GBS) for undergraduate computer fundamentals students to apply their knowledge of computer equipment and software. The GBS, entitled the MegaTech Project, presented the students with descriptions of the everyday activities of four persons needing to purchase a computer system. The…

  12. The Effects of Task, Database, and Guidance on Interaction in a Goal-Based Scenario.

    ERIC Educational Resources Information Center

    Bell, Benjamin

    This paper describes the "Sickle Cell Counselor" (SCC), a goal-based scenario on permanent display at the Museum of Science and Industry in Chicago. SCC is an exploratory hypermedia simulation program which provides users with a basic understanding of Sickle Cell Anemia. The user of the program plays the role of a genetic counselor, and, while…

  13. THE SCENARIOS APPROACH TO ATTENUATION-BASED REMEDIES FOR INORGANIC AND RADIONUCLIDE CONTAMINANTS

    SciTech Connect

    Vangelas, K.; Rysz, M.; Truex, M.; Brady, P.; Newell, C.; Denham, M.

    2011-08-04

    Guidance materials based on use of conceptual model scenarios were developed to assist evaluation and implementation of attenuation-based remedies for groundwater and vadose zones contaminated with inorganic and radionuclide contaminants. The Scenarios approach is intended to complement the comprehensive information provided in the US EPA's Technical Protocol for Monitored Natural Attenuation (MNA) of Inorganic Contaminants by providing additional information on site conceptual models and extending the evaluation to consideration of Enhanced Attenuation approaches. The conceptual models incorporate the notion of reactive facies, defined as units with hydrogeochemical properties that are different from surrounding units and that react with contaminants in distinct ways. The conceptual models also incorporate consideration of biogeochemical gradients, defined as boundaries between different geochemical conditions that have been induced by waste disposal or by natural phenomena. Gradients can change over time when geochemical conditions from one area migrate into another, potentially affecting contaminant mobility. A recognition of gradients allows the attenuation-affecting conditions of a site to be projected into the future. The Scenarios approach provides a stepwise process to identify an appropriate category of conceptual model and refine it for a specific site. Scenario materials provide links to pertinent sections in the EPA technical protocol and present information about contaminant mobility and important controlling mechanisms for attenuation-based remedies based on the categories of conceptual models.

  14. Assessing the Psychometric Properties of a Scenario-Based Measure of Achievement Guilt and Shame

    ERIC Educational Resources Information Center

    Thompson, Ted; Sharp, Jessica; Alexander, James

    2008-01-01

    In this study, the psychometric properties of the scenario-based Achievement Guilt and Shame Scale (AGSS) were established. The AGSS and scales assessing interpersonal guilt and shame, high standards, overgeneralization, self-criticism, self-esteem, academic self-concept, fear of failure, and tendency to respond in a socially desirable manner were…

  15. Pre-Service Teachers' Perspectives on Using Scenario-Based Virtual Worlds in Science Education

    ERIC Educational Resources Information Center

    Kennedy-Clark, Shannon

    2011-01-01

    This paper presents the findings of a study on the current knowledge and attitudes of pre-service teachers on the use of scenario-based multi-user virtual environments in science education. The 28 participants involved in the study were introduced to "Virtual Singapura," a multi-user virtual environment, and completed an open-ended questionnaire.…

  16. Multimedia Scenario Based Learning Programme for Enhancing the English Language Efficiency among Primary School Students

    ERIC Educational Resources Information Center

    Tupe, Navnath

    2015-01-01

    This research was undertaken with a view to assessing the deficiencies in English language among Primary School Children and to developing a Multimedia Scenario Based Learning Programme (MSBLP) for mastery of English language, which required special attention and effective treatment. The experimental study with pre-test, post-test control group design was…

  17. Teaching Early Childhood Education Students through Interactive Scenario-Based Course Design

    ERIC Educational Resources Information Center

    Sheridan, Kathleen Mary; Kelly, Melissa A.

    2012-01-01

    Early childhood teacher education courses must prepare students for the types of challenges they will face in communities and classrooms after graduation. By adopting a scenario-based approach, teacher educators and others designing online environments can help prepare students for these challenges. Solving complex problems inherent in a…

  18. Life cycle assessment of Italian citrus-based products. Sensitivity analysis and improvement scenarios.

    PubMed

    Beccali, Marco; Cellura, Maurizio; Iudicello, Maria; Mistretta, Marina

    2010-07-01

    Though many studies concern the agro-food sector in the EU and Italy and its environmental impacts, the literature is quite lacking in works regarding LCA application to citrus products. This paper represents one of the first studies on the environmental impacts of citrus products, aiming to suggest feasible strategies and actions to improve their environmental performance. In particular, it is part of a research effort aimed at estimating the environmental burdens associated with the production of the following citrus-based products: essential oil, natural juice and concentrated juice from oranges and lemons. The life cycle assessment of these products, published in a previous paper, had highlighted significant environmental issues in terms of energy consumption, associated CO2 emissions, and water consumption. Starting from such results, the authors carry out an improvement analysis of the assessed production system, whereby sustainable scenarios for saving water and energy are proposed to reduce the environmental burdens of the examined production system. In addition, a sensitivity analysis is performed to estimate the effects of the chosen methods on the outcome of the study. Uncertainty related to allocation methods, secondary data sources, and initial assumptions on cultivation, transport modes, and waste management is analysed. The results of the performed analyses show that every assessed eco-profile is influenced differently by the uncertainty study. Different assumptions on initial data and methods showed considerable variations in the energy and environmental performances of the final products. Besides, the results show energy and environmental benefits that clearly indicate improvement of the products' eco-profile by reusing purified water for irrigation, using rail for the delivery of final products where possible, and adopting efficient technologies, such as mechanical vapour recompression, in the pasteurisation and

  19. A Scenario-Based Protocol Checker for Public-Key Authentication Scheme

    NASA Astrophysics Data System (ADS)

    Saito, Takamichi

    Security protocols provide communication security for the Internet. One of their important features is authentication with key exchange, whose correctness is a requirement for the security of the communication as a whole. In this paper, we introduce three attack models realized as attack scenarios, and provide an authentication-protocol checker that applies the three attack scenarios based on these models. We also use it to check two popular security protocols: Secure SHell (SSH) and Secure Socket Layer/Transport Layer Security (SSL/TLS).

  20. An Ontology-Based Scenario for Teaching the Management of Health Information Systems.

    PubMed

    Jahn, Franziska; Schaaf, Michael; Kahmann, Christian; Tahar, Kais; Kücherer, Christian; Paech, Barbara; Winter, Alfred

    2016-01-01

    The terminology for the management of health information systems is characterized by complexity and polysemy, which is challenging for both medical informatics students and practitioners. SNIK, an ontology of information management (IM) in hospitals, brings together IM concepts from different literature sources. Based on SNIK, we developed a blended learning scenario to teach medical informatics students IM concepts and their relationships. In proof-of-concept teaching units, students found the use of SNIK in teaching and learning motivating and useful. In the next step, the blended learning scenario will be rolled out to an international course for medical informatics students. PMID:27577404

  1. Scenario planning based on geomatics: a case study in Zijin mountain national forest park

    NASA Astrophysics Data System (ADS)

    Li, Mingyang; He, Yanjie; Xu, Guangcai; Wu, Wenhao; Wang, Baozhong

    2007-06-01

    With the rapid development of forest tourism, it is crucial to coordinate the conflicting goals of a forest park by making a scientific plan. It is difficult to determine the complex relationships by means of traditional laboratory and field experiments at the landscape scale. Zijin Mountain national forest park is taken as the case study area, while the RS and GIS software packages ERDAS 8.7 and ArcGIS 9.0 are chosen as the spatial platforms for the scenario planning. Remote sensing data from three different periods, 2000 (IKONOS), 2002 (SPOT5) and 2004 (QuickBird), are gathered; supervised classification and neighborhood analysis are then performed before three ten-year scenarios of the national park are built based on a Cellular Automata Model (CAM). Three spatial pattern indices, mean patch area, shape index and patch density, are calculated for each scenario using the spatial pattern analysis program Fragstats 3.3. After comparison of the three scenarios in terms of landscape spatial pattern and protection goals, an optimized plan is made and compared with the land classes in 2002. At the end of the paper, some problems concerned with the scenario building are discussed.

  2. Scenario Based Approach for Multiple Source Tsunami Hazard Assessment for Sines, Portugal

    NASA Astrophysics Data System (ADS)

    Wronna, Martin; Omira, Rachid; Baptista, Maria Ana

    2015-04-01

    In this paper, we present a scenario-based approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE. Sines holds one of the most important deep-water ports, which contains oil-bearing, petrochemical, liquid-bulk, coal and container terminals. The port and its industrial infrastructure face the ocean to the southwest, towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, a total of five scenarios were selected to assess tsunami impact at the test site. These scenarios correspond to a worst-case credible scenario approach based upon the largest events of the historical and paleo-tsunami catalogues. The tsunami simulations from the source area towards the coast are carried out using NSWING, a Non-linear Shallow Water model With Nested Grids. The code solves the non-linear shallow water equations using an explicit leap-frog finite difference scheme in a Cartesian or spherical frame. The initial sea surface displacement is assumed to be equal to the sea bottom deformation, which is computed with the Okada equations. Both uniform and non-uniform slip conditions are used; the results presented correspond to the models using non-uniform slip conditions. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level) and MHHW (mean higher high water). For each scenario, inundation is described by maximum values of wave height, flow depth, drawdown, run-up and inundation distance. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results consist of aggregate scenario maps presented for the different inundation parameters. This work is funded by ASTARTE - Assessment, Strategy And Risk Reduction for Tsunamis in Europe - FP7-ENV2013 6.4-3, Grant 603839
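
    For orientation only, the sketch below shows a 1-D, linearised version of an explicit leap-frog (staggered-grid) shallow-water update of the kind the abstract mentions; NSWING itself solves the full non-linear equations on nested Cartesian or spherical grids with Okada-derived initial conditions. The depth, grid size, and Gaussian initial surface are illustrative assumptions.

```python
# 1-D linearised shallow-water propagation with a staggered explicit scheme.
# Free surface eta lives at cell centres, flux M at cell faces; the updates
# alternate, as in classic leap-frog tsunami codes. All parameters are illustrative.
import numpy as np

g, h = 9.81, 4000.0             # gravity, uniform ocean depth (m)
nx, dx = 400, 2000.0            # grid cells, spacing (m)
dt = 0.5 * dx / np.sqrt(g * h)  # time step satisfying the CFL condition

x = (np.arange(nx) + 0.5) * dx
eta = np.exp(-((x - 0.5 * nx * dx) / (20 * dx)) ** 2)   # initial sea-surface hump
M = np.zeros(nx + 1)                                    # fluxes at cell faces

for _ in range(500):
    # momentum update on interior faces (boundary fluxes stay zero: reflective walls)
    M[1:-1] -= g * h * dt / dx * (eta[1:] - eta[:-1])
    # continuity update at cell centres
    eta -= dt / dx * (M[1:] - M[:-1])

print(f"max surface elevation after 500 steps: {eta.max():.3f} m")
```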

  3. Analysis of cloud-based solutions on EHRs systems in different scenarios.

    PubMed

    Fernández-Cardeñosa, Gonzalo; de la Torre-Díez, Isabel; López-Coronado, Miguel; Rodrigues, Joel J P C

    2012-12-01

    Nowadays, with the growth of wireless connectivity, people can access resources hosted in the Cloud almost anywhere. In this context, organisations can take advantage of this fact, in terms of e-Health, by deploying Cloud-based solutions for e-Health services. In this paper, two Cloud-based solutions for different scenarios of an Electronic Health Records (EHRs) management system are proposed. We reviewed articles published between 2005 and 2011 in Medline about the implementation of e-Health services based on the Cloud. In order to analyze the best scenario for the deployment of Cloud Computing, two solutions, for a large hospital and for a network of primary care health centers, have been studied. An economic estimation of the cost of implementation for both scenarios has been made with the Amazon calculator tool. As a result of this analysis, two solutions are suggested depending on the scenario: for a large hospital, a typical Cloud solution in which only the needed services are hired is assumed; on the other hand, to work with several primary care centers, the implementation of a network that interconnects these centers with a single Cloud environment is suggested. Finally, a hybrid solution is considered, in which EHRs with images would be hosted in the hospital or primary care centers and the rest would be migrated to the Cloud. PMID:22492177

  4. Lung cancer screening: review and performance comparison under different risk scenarios.

    PubMed

    Tota, Joseph E; Ramanakumar, Agnihotram V; Franco, Eduardo L

    2014-02-01

    Lung cancer is currently one of the most common malignant diseases and is responsible for substantial mortality worldwide. Compared with never smokers, former smokers remain at relatively high risk for lung cancer, accounting for approximately half of all newly diagnosed cases in the US. Screening offers former smokers the best opportunity to reduce their risk of advanced stage lung cancer and there is now evidence that annual screening using low-dose computed tomography (LDCT) is effective in preventing mortality. Studies are being conducted to evaluate whether the benefits of LDCT screening outweigh its costs and potential harms and to determine the most appropriate workup for patients with screen-detected lung nodules. Program efficiency would be optimized by targeting high-risk current smokers, but low uptake among this group is a concern. Former smokers may be invited for screening; however, if fewer long-term current smokers and more former smokers with long quit durations elect to attend, this could have very adverse effects on cost and screening test parameters. To illustrate this point, we present three possible screening scenarios with lung cancer prevalence ranging between 0.62% and 5.0%. In summary, cost-effectiveness of lung cancer screening may be improved if linked to successful smoking cessation programs and if better approaches are developed to reach very high-risk patients, e.g., long-term current smokers or others based on more accurate risk prediction models. PMID:24153450
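
    The point about prevalence and screening test parameters can be made concrete with a small sketch: the positive predictive value (PPV) of a screen rises sharply across the prevalence range quoted in the abstract (0.62-5.0%). The sensitivity and specificity values used here are illustrative assumptions, not figures from the paper.

```python
# PPV of a screening test under different prevalence scenarios.
# Only the prevalence range comes from the abstract; sensitivity and
# specificity are illustrative assumptions.
def ppv(prevalence, sensitivity=0.9, specificity=0.75):
    tp = prevalence * sensitivity                 # true positives per screened person
    fp = (1.0 - prevalence) * (1.0 - specificity) # false positives per screened person
    return tp / (tp + fp)

if __name__ == "__main__":
    for prev in (0.0062, 0.02, 0.05):
        print(f"prevalence {prev:.2%}: PPV = {ppv(prev):.1%}")
```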

  5. Probabilistic scenario-based water resource planning and management:A case study in the Yellow River Basin, China

    NASA Astrophysics Data System (ADS)

    Dong, C.; Schoups, G.; van de Giesen, N.

    2012-04-01

    Water resource planning and management is subject to large uncertainties with respect to the impact of climate change and socio-economic development on water systems. In order to deal with these uncertainties, probabilistic climate and socio-economic scenarios were developed based on the Principle of Maximum Entropy, as defined within information theory, and used as inputs to hydrological models to construct probabilistic water scenarios via Monte Carlo simulation. Probabilistic scenarios provide more explicit information than equally-likely scenarios for decision-making in water resource management. A case study was developed for the Yellow River Basin, China, where future water availability and water demand are affected by both climate change and socio-economic development. Climate scenarios of future precipitation and temperature were developed based on the results of multiple global climate models, and socio-economic scenarios were downscaled from existing large-scale scenarios. Probability distributions were assigned to these scenarios to explicitly represent a full set of future possibilities. The probabilistic climate scenarios were used as input to a rainfall-runoff model to simulate future river discharge, and the socio-economic scenarios were used to calculate water demand. A full set of possible future water supply-demand scenarios and their associated probability distributions were generated. This set can feed further analysis of the future water balance, which can be used as a basis to plan and manage water resources in the Yellow River Basin. Key words: probabilistic scenarios, climate change, socio-economic development, water management
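
    A minimal sketch of the probabilistic-scenario idea, assuming made-up supply and demand values and probabilities: discrete climate and socio-economic scenarios are sampled by Monte Carlo and combined into a probability of a supply-demand deficit.

```python
# Monte Carlo combination of probabilistic supply and demand scenarios.
# All numbers (supplies, demands, probabilities) are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# climate scenarios -> annual water availability (billion m^3) with probabilities
supply_values = np.array([52.0, 48.0, 43.0])
supply_probs = np.array([0.3, 0.5, 0.2])

# socio-economic scenarios -> annual water demand (billion m^3) with probabilities
demand_values = np.array([45.0, 50.0, 55.0])
demand_probs = np.array([0.25, 0.5, 0.25])

supply = rng.choice(supply_values, size=n, p=supply_probs)
demand = rng.choice(demand_values, size=n, p=demand_probs)
deficit = demand - supply

print(f"P(deficit > 0) = {np.mean(deficit > 0):.2f}")
print(f"mean deficit when it occurs = {deficit[deficit > 0].mean():.1f} billion m^3")
```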

  6. Exposure Scenarios and Unit Dose Factors for the Hanford Immobilized Low Activity Tank Waste Performance Assessment

    SciTech Connect

    RITTMANN, P.D.

    1999-12-29

    Exposure scenarios are defined to identify potential pathways and combinations of pathways that could lead to radiation exposure from immobilized tank waste. Appropriate data and models are selected to permit calculation of dose factors for each exposure

  7. Scenarios of Future Water use on Mediterranean Islands based on an Integrated Assessment of Water Management

    NASA Astrophysics Data System (ADS)

    Lange, M. A.

    2006-12-01

    The availability of water in sufficient quantities and adequate quality presents considerable problems on Mediterranean islands. Because of their isolation, and thus the impossibility of drawing on more distant or more diverse aquifers, they rely entirely on precipitation as their natural replenishment mechanism. Recent observations indicate decreasing precipitation, increasing evaporation and steadily growing demand for water on the islands. Future climate change will exacerbate this problem, increasing the already pertinent vulnerability to droughts. Responsible planning of water management strategies requires scenarios of future supply and demand through an integrated assessment, including climate scenarios based on regional climate modeling as well as scenarios of changes in the societal and economic determinants of water demand. Constructing such strategies necessitates a thorough understanding of the interdependencies and feedbacks between the physical/hydrological and socio-economic determinants of water balances on an island. This has to be based on a solid understanding of past and present developments of these drivers. In the framework of the EU-funded MEDIS project (Towards sustainable water use on Mediterranean Islands: addressing conflicting demands and varying hydrological, social and economic conditions, EVK1-CT-2001-00092), detailed investigations of present vulnerabilities and adaptation strategies to droughts have been carried out on Mallorca, Corsica, Sicily, Crete and Cyprus. This was based on an interdisciplinary study design including hydrological, geophysical, agricultural, social and political science investigations. A central element of the study has been the close interaction with stakeholders on the islands and their contribution to strategy formulation. An important result has been a specification of vulnerability components including a physical/environmental, an economic/regulatory and a social/institutional/political component. Their

  8. Performance of a Frequency-Hopped Real-Time Remote Control System in a Multiple Access Scenario

    NASA Astrophysics Data System (ADS)

    Cervantes, Frank

    A recent trend is observed of radio-controlled aircraft and automobiles in the hobby-grade category, as well as Unmanned Aerial Vehicle (UAV) applications, moving to the well-known Industrial, Scientific and Medical (ISM) band. Based on this, the present thesis evaluates individual user performance in a multiple-user scenario where several co-located point-to-point real-time Remote Control (RC) applications operate using Frequency Hopping Spread Spectrum (FHSS) as the medium access technique in order to handle interference efficiently. Commercial off-the-shelf wireless transceivers ready to operate in the ISM band are considered as the operational platform supporting the above-mentioned applications. The impact of channel impairments and of critical system engineering issues, such as working with real clock oscillators and a variable packet duty cycle, is considered. Based on the above, simulation results allow us to evaluate the range of variation of those parameters for acceptable system performance in Multiple Access (MA) environments.
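
    One multiple-access effect relevant here can be sketched analytically: the chance that a frequency-hopping user's slot is hit by at least one of the other co-located users, assuming independent, uniformly random hops over K channels. The channel count (79, as in the 2.4 GHz ISM band hop sets used by some systems) and the user counts are illustrative, not the thesis's parameters.

```python
# Probability that a hop slot of one FHSS user collides with at least one
# other user, assuming independent uniform hopping over K channels.
def hit_probability(n_users, n_channels):
    """P(at least one interferer shares the channel in a given hop)."""
    return 1.0 - (1.0 - 1.0 / n_channels) ** (n_users - 1)

if __name__ == "__main__":
    K = 79  # e.g. a 79-channel hop set in the 2.4 GHz ISM band (illustrative)
    for n in (2, 5, 10, 20):
        print(f"{n:2d} users, {K} channels: hit probability = {hit_probability(n, K):.3f}")
```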

  9. Distributed Collaboration Activities in a Blended Learning Scenario and the Effects on Learning Performance

    ERIC Educational Resources Information Center

    Gerber, M.; Grund, S.; Grote, G.

    2008-01-01

    The aim of this study was to investigate the nature of tutor and student online communication and collaboration activities in a blended learning course. The hypothesis that these activities are related to student learning performance (exam results) was tested based on the number of messages posted, as well as the nature of these messages (type of…

  10. Analytic Performance of Monopulse Spread Spectrum Tracking System in Multiple-target Scenario

    NASA Astrophysics Data System (ADS)

    Cong, Bo; Qu, Yuanxin; Zhang, Yongliang

    2016-02-01

    With the application of spread spectrum techniques in satellite tracking, the accuracy of CDMA-based multiple-target tracking needs to be analyzed. In this paper, we present the analytic form of the multiple-target tracking performance through mathematical derivation, together with simulation results.

  11. Organizational Learning and Performance: Understanding Indian Scenario in Present Global Context

    ERIC Educational Resources Information Center

    Khandekar, Aradhana; Sharma, Anuradha

    2006-01-01

    Purpose: The purpose of this paper is to show that the role of organizational learning is increasingly becoming crucial for organizational performance. Based on the study of three Indian global firms operating in National Capital Region of Delhi, India, this study explores the correlation of organizational learning with organizational performance…

  12. Scenarios which may lead to the rise of an asteroid-based technical civilisation

    NASA Astrophysics Data System (ADS)

    Kecskes, Csaba

    2002-05-01

    In a previous paper, the author described a hypothetical development path of technical civilisations which has the following stages: planet dwellers, asteroid dwellers, interstellar travellers, interstellar space dwellers. In this paper, several scenarios are described which may cause the rise of an asteroid-based technical civilisation. Before such a transition may take place, certain space technologies must be developed fully (now these exist only in very preliminary forms): closed-cycle biological life support systems, space manufacturing systems, electrical propulsion systems. After mastering these technologies, certain events may provide the necessary financial means and social impetus for the foundation of the first asteroid-based colonies. In the first scenario, a rich minority group becomes persecuted and they decide to leave the Earth. In the second scenario, a "cold war"-like situation exists and the leaders of the superpowers order the creation of asteroid-based colonies to show off their empires' technological (and financial) grandiosity. In the third scenario, the basic situation is similar to the second one, but in this case the asteroids are not just occupied by the colonists. With several decades of hard work, an asteroid can be turned into a kinetic energy weapon which can provide the same (or greater) threat as the nuclear arsenal of a present superpower. In the fourth scenario, some military asteroids are moved to Earth-centred orbits and utilised as "solar power satellites" (SPS). This would be quite an economical solution because a "military asteroid" already contains most of the important components of an SPS (large solar collector arrays, power distribution devices, an orbit-modifying rocket engine); only a large microwave transmitter would need to be added.

  13. An exploration of scenario discussion in a Web-based nursing course.

    PubMed

    Hsu, Li-Ling; Hsieh, Suh-Ing

    2006-06-01

    Complexity in nursing education has increased as it is challenged to meet the needs of diverse populations in rapidly evolving and highly technical health care settings. To accomplish or meet these societal wants, needs, and demands, nursing educators must prepare students successfully to become active, independent learners and problem solvers. The purpose of this study was to design a nursing course on the basis of scenario discussion, Web-based instruction (WBI), and the assessment of learning outcomes. The design of the study involved two stages. The first, beginning in 2001, developed the scenario discussion with the WBI system. The second evaluated learning outcomes within the context of a scenario discussion. Two instruments were examined in this study: a nursing assessment score and a learning effectiveness survey. The target population in this study consisted of students enrolled in a two-year nursing program and registered for the course, Nursing I, during the fall semester of 2002. Using simple random sampling, 43 students were recruited and agreed to participate in the study. Most of the students chose "good" for learning effectiveness. Overall, the students gave higher learning effectiveness survey scores and nursing assessment scores. Due to their lack of previous exposure to scenario discussion, the students felt frustration and anxiety while taking this course. Faculty should devote more time to explaining the advantages of scenario discussion. In addition, in comparison with traditional teaching, Web-based instruction (WBI) imposes a heavier burden on the instructors and institutions involved. Nurse educators must continue to use innovative strategies to enhance student learning. Students registered both positive and negative feedback in open-ended questions on Web-based instruction. However, in the future, special attention should be given to the learning software, Internet access speed, synchronous and asynchronous meetings, and the interaction

  14. Scenarios which may lead to the rise of an asteroid-based technical civilisation.

    PubMed

    Kecskes, Csaba

    2002-05-01

    In a previous paper, the author described a hypothetical development path of technical civilisations which has the following stages: planet dwellers, asteroid dwellers, interstellar travellers, interstellar space dwellers. In this paper, several scenarios are described which may cause the rise of an asteroid-based technical civilisation. Before such a transition may take place, certain space technologies must be developed fully (now these exist only in very preliminary forms): closed-cycle biological life support systems, space manufacturing systems, electrical propulsion systems. After mastering these technologies, certain events may provide the necessary financial means and social impetus for the foundation of the first asteroid-based colonies. In the first scenario, a rich minority group becomes persecuted and they decide to leave the Earth. In the second scenario, a "cold war"-like situation exists and the leaders of the superpowers order the creation of asteroid-based colonies to show off their empires' technological (and financial) grandiosity. In the third scenario, the basic situation is similar to the second one, but in this case the asteroids are not just occupied by the colonists. With several decades of hard work, an asteroid can be turned into a kinetic energy weapon which can provide the same (or greater) threat as the nuclear arsenal of a present superpower. In the fourth scenario, some military asteroids are moved to Earth-centred orbits and utilised as "solar power satellites" (SPS). This would be quite an economical solution because a "military asteroid" already contains most of the important components of an SPS (large solar collector arrays, power distribution devices, an orbit-modifying rocket engine); only a large microwave transmitter would need to be added. PMID:11989487

  15. Lunar Outpost Life Support Architecture Study Based on a High Mobility Exploration Scenario

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2009-01-01

    As scenarios for lunar surface exploration and habitation continue to evolve within NASA's Constellation program, so must studies of optimal life support system architectures and technologies. This paper presents results of a life support architecture study based on a 2009 NASA scenario known as Scenario 12. Scenario 12 represents a consolidation of ideas from earlier NASA scenarios and includes an outpost near the Lunar South Pole comprised of three larger fixed surface elements and four attached pressurized rovers. The scenario places a high emphasis on surface mobility, with planning assuming that all four crewmembers spend roughly 50% of the time away from the outpost on 3-14 day excursions in two of the pressurized rovers. Some of the larger elements can also be mobilized for longer duration excursions. This emphasis on mobility poses a significant challenge for a regenerative life support system in terms of cost-effective waste collection and resource recovery across multiple elements, including rovers with very constrained infrastructure resources. The current study considers pressurized rovers as part of a distributed outpost life support architecture in both stand-alone and integrated configurations. A range of architectures is examined, reflecting different levels of closure and distributed functionality. Different lander propellant scavenging options are also considered, involving either initial conversion of residual oxygen and hydrogen propellants to water or initial direct oxygen scavenging. Monte Carlo simulations are used to assess the sensitivity of results to volatile high-impact mission variables, including the quantity of residual lander propellants available for scavenging, the fraction of crew time away from the outpost on excursions, total extravehicular activity hours, and habitat leakage. Architectures are evaluated by estimating surpluses or deficits of water and oxygen per 180-day mission and differences in fixed and 10-year

  16. Developing scenarios to assess future landslide risks: a model-based approach applied to mountainous regions

    NASA Astrophysics Data System (ADS)

    Vacquie, Laure; Houet, Thomas

    2016-04-01

    In the last century, European mountain landscapes have experienced significant transformations. Natural and anthropogenic changes, climate change, touristic and industrial development, socio-economic interactions, and their implications in terms of LUCC (land use and land cover changes) have directly influenced the spatial organization and vulnerability of mountain landscapes. This study is conducted as part of the SAMCO project funded by the French National Research Agency (ANR). It aims at developing a methodological approach, combining various tools, modelling platforms and methods, to identify regions vulnerable to landslide hazards while accounting for future LUCC. It presents an integrated approach combining participative scenarios and a LULC change simulation model to assess the combined effects of LUCC and climate change on landslide risks in the Cauterets valley (French Pyrenees) up to 2100. Through vulnerability and risk mapping, the objective is to gather information to support landscape planning and implement land use strategies with local stakeholders for risk management. Four scenarios are developed, exhibiting contrasting trajectories of socio-economic development. The prospective scenarios are based on national and international socio-economic contexts drawn from existing assessment reports. The methodological approach integrates knowledge from local stakeholders to refine each scenario during its construction and to reinforce its plausibility and relevance by accounting for local specificities, e.g. logging and pastoral activities, touristic development, urban planning, etc. A process-based model, the Forecasting Scenarios for Mountains (ForeSceM) model, developed on the Dinamica EGO modelling platform, is used to spatially allocate future LUCC for each prospective scenario. Concurrently, a spatial decision support tool, the SYLVACCESS model, is used to identify accessible areas for forestry in scenario projecting logging

  17. Health care professional workstation: software system construction using DSSA scenario-based engineering process.

    PubMed

    Hufnagel, S; Harbison, K; Silva, J; Mettala, E

    1994-01-01

    This paper describes a new method for the evolutionary determination of user requirements and system specifications called the scenario-based engineering process (SEP). Health care professional workstations are critical components of large-scale health care system architectures. We suggest that domain-specific software architectures (DSSAs) be used to specify standard interfaces and protocols for reusable software components throughout those architectures, including workstations. We encourage the use of engineering principles and abstraction mechanisms. Engineering principles are flexible guidelines, adaptable to particular situations. Abstraction mechanisms are simplifications for management of complexity. We recommend object-oriented design principles, graphical structural specifications, and formal behavioral specifications of components. We give an ambulatory care scenario and associated models to demonstrate SEP. The scenario uses health care terminology and gives patients' and health care providers' system views. Our goal is a threefold benefit: (i) scenario view abstractions provide consistent interdisciplinary communication; (ii) hierarchical object-oriented structures provide useful abstractions for reuse, understandability, and long-term evolution; and (iii) SEP and health care DSSAs integrate into computer-aided software engineering (CASE) environments, which should support rapid construction and certification of individualized systems from reuse libraries. PMID:8125652

  18. Making pharmacogenomic-based prescribing alerts more effective: A scenario-based pilot study with physicians.

    PubMed

    Overby, Casey Lynnette; Devine, Emily Beth; Abernethy, Neil; McCune, Jeannine S; Tarczy-Hornoch, Peter

    2015-06-01

    To facilitate personalized drug dosing (PDD), this pilot study explored the communication effectiveness and clinical impact of using a prototype clinical decision support (CDS) system embedded in an electronic health record (EHR) to deliver pharmacogenomic (PGx) information to physicians. We employed a conceptual framework and measurement model to assess the impact of physician characteristics (previous experience, awareness, relative advantage, perceived usefulness), technology characteristics (method of implementation: semi-active/active; actionability: low/high) and a task characteristic (drug prescribed) on communication effectiveness (usefulness, confidence in prescribing decision) and clinical impact (uptake, prescribing intent, change in drug dosing). Physicians performed prescribing tasks using five simulated clinical case scenarios, presented in random order within the prototype PGx-CDS system. Twenty-two physicians completed the study. The proportion of physicians that saw a relative advantage to using PGx-CDS was 83% at the start and 94% at the conclusion of our study. Physicians used semi-active alerts 74-88% of the time. There was no association between previous experience with, awareness of, and belief in a relative advantage of using PGx-CDS and improved uptake. The proportion of physicians reporting confidence in their prescribing decisions decreased significantly after using the prototype PGx-CDS system (p=0.02). Despite decreases in confidence, physicians perceived a relative advantage to using PGx-CDS, viewed semi-active alerts on most occasions, and more frequently changed doses toward doses supported by published evidence. Specifically, sixty-five percent of physicians reduced their dosing, significantly for capecitabine (p=0.002) and mercaptopurine/thioguanine (p=0.03). These findings suggest a need to improve our prototype such that PGx CDS content is more useful and delivered in a way that improves physician's confidence in their prescribing

  19. Representing Instructional Material for Scenario-Based Guided-Discovery Courseware

    SciTech Connect

    Greitzer, Frank L.; Merrill, M. DAVID.; Rice, Douglas M.; Curtis, Darren S.

    2004-12-06

    The focus of this paper is to discuss paradigms for learning that are based on sound principles of human learning and cognition, and to discuss technical challenges that must be overcome in achieving this research goal through instructional system design (ISD) approaches that are cost-effective as well as conformant with today's interactive multimedia instruction standards. Fundamental concepts are to: engage learners to solve real-world problems (progress from simple to complex); relate material to previous experience; demonstrate what is to be learned using interactive, problem-centered activities rather than passive exposure to material; require learners to use their new knowledge to solve problems that demonstrate their knowledge in a relevant applied setting; and guide the learner with feedback and coaching early, then gradually withdraw this support as learning progresses. Many of these principles have been put into practice by employing interactive learning objects as re-usable components of larger, more integrated exercises. A challenge is to make even more extensive use of interactive, scenario-based activities within a guided-discovery framework. Because the design and construction of interactive, scenario-based learning objects and more complex integrated exercises is labor-intensive, this paper explores the use of interactive learning objects and associated representation schema for instructional content to facilitate development of tools for creating scenario-based, guided-discovery courseware.

  20. Scenario-based User Testing to Guide Consumer Health Informatics Design

    PubMed Central

    Zayas-Cabán, Teresa; Marquard, Jenna L.; Radhakrishnan, Kavita; Duffey, Noah; Evernden, Dana L.

    2009-01-01

    For consumer health informatics (CHI) interventions to successfully aid laypeople, the interventions must fit and support their health work. This paper outlines a scenario-based human factors assessment of a disease management CHI intervention. Two student users undertook a patient use case and another user followed a nurse use case. Each user completed pre-specified tasks over a ten-day trial, recorded challenges encountered while utilizing the intervention, and logged daily time spent on each task. Results show the scenario-based user testing approach helps effectively and systematically assess potential physical, cognitive, and macroergonomic challenges for end-users, rate the severity of the challenges, and identify mediation strategies for each challenge. In particular, scenario-based user testing aids in identifying challenges that would be difficult, if not impossible, to detect in a laboratory-based usability study. With this information, CHI interventions can be re-designed and/or supplemented, making the intervention more closely fit end-users’ work. PMID:20351947

  1. Performance-Based Funding Brief

    ERIC Educational Resources Information Center

    Washington Higher Education Coordinating Board, 2011

    2011-01-01

    A number of states have made progress in implementing performance-based funding (PBF) and accountability. This policy brief summarizes the main features of performance-based funding systems in three states: Tennessee, Ohio, and Indiana. The brief also identifies key issues that states considering performance-based funding must address, as well as…

  2. The Workplace of the Future: Insights from Futures Scenarios and Today's High Performance Workplaces.

    ERIC Educational Resources Information Center

    Curtain, Richard

    1998-01-01

    Studies of the workplace of the future that used scenario-planning methodology and survey data suggest that nonmarket organizations will provide stability for temporary workers and result in the emergence of networks. Survey data suggest that future workplaces will foster intellectual capital through research and development. (JOW)

  3. Scenario analysis of energy-based low-carbon development in China.

    PubMed

    Zhou, Yun; Hao, Fanghua; Meng, Wei; Fu, Jiafeng

    2014-08-01

    China's increasing energy consumption and coal-dominant energy structure have contributed not only to severe environmental pollution, but also to global climate change. This article begins with a brief review of China's primary energy use and associated environmental problems and health risks. To analyze the potential of China's transition to low-carbon development, three scenarios are constructed to simulate energy demand and CO₂ emission trends in China up to 2050 using the Long-range Energy Alternatives Planning System (LEAP) model. Simulation results show that, assuming an average annual Gross Domestic Product (GDP) growth rate of 6.45%, total primary energy demand is expected to increase by 63.4%, 48.8% and 12.2% by 2050 from the 2009 levels under the Business as Usual (BaU), Carbon Reduction (CR) and Integrated Low Carbon Economy (ILCE) scenarios, respectively. Total energy-related CO₂ emissions will increase from 6.7 billion tons in 2009 to 9.5, 11, 11.6 and 11.2 billion tons; 8.2, 9.2, 9.6 and 9 billion tons; and 7.1, 7.4, 7.2 and 6.4 billion tons in 2020, 2030, 2040 and 2050 under the BaU, CR and ILCE scenarios, respectively. Total CO₂ emissions will drop by 19.6% and 42.9% under the CR and ILCE scenarios in 2050, compared with the BaU scenario. To realize a substantial cut in energy consumption and carbon emissions, China needs a long-term low-carbon development strategy targeting further improvement of energy efficiency, optimization of the energy structure, deployment of clean coal technology and use of market-based economic instruments such as energy/carbon taxation. PMID:25108719
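
    The flavour of such a scenario run can be sketched with a closed-form accounting identity rather than the LEAP model itself: demand grows with GDP but falls with energy intensity, and CO2 tracks demand times a declining emission factor. The 6.45% GDP growth rate and the 2009 emissions of 6.7 billion tons come from the abstract; the intensity and emission-factor decline rates below are illustrative values back-solved so that the 2050 outputs land near the abstract's figures, and the base energy value is approximate.

```python
# Toy scenario projection of energy demand and CO2 emissions, not the LEAP model.
def project(base_energy, base_co2, years, gdp_growth, intensity_decline, ef_decline):
    """Demand = GDP growth times intensity decline; CO2 = demand times a
    declining emission factor, compounded over the projection horizon."""
    demand_growth = ((1 + gdp_growth) * (1 - intensity_decline)) ** years
    energy = base_energy * demand_growth
    co2 = base_co2 * demand_growth * (1 - ef_decline) ** years
    return energy, co2

if __name__ == "__main__":
    # (annual intensity decline, annual emission-factor decline), illustrative
    scenarios = {"BaU": (0.049, 0.000), "CR": (0.0515, 0.0025), "ILCE": (0.058, 0.0039)}
    for name, (intens, ef) in scenarios.items():
        energy, co2 = project(base_energy=3.0, base_co2=6.7, years=41,
                              gdp_growth=0.0645, intensity_decline=intens, ef_decline=ef)
        print(f"{name:4s} 2050: energy = {energy:.1f} (vs 3.0 in 2009), "
              f"CO2 = {co2:.1f} billion tons")
```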

  4. Measuring Engagement in Later Life Activities: Rasch-Based Scenario Scales for Work, Caregiving, Informal Helping, and Volunteering

    ERIC Educational Resources Information Center

    Ludlow, Larry H.; Matz-Costa, Christina; Johnson, Clair; Brown, Melissa; Besen, Elyssa; James, Jacquelyn B.

    2014-01-01

    The development of Rasch-based "comparative engagement scenarios" based on Guttman's facet theory and sentence mapping procedures is described. The scenario scales measuring engagement in work, caregiving, informal helping, and volunteering illuminate the lived experiences of role involvement among older adults and offer multiple…

  5. Delivering CMIP5-based climate scenarios for impact assessments in Europe

    NASA Astrophysics Data System (ADS)

    Semenov, Mikhail

    2014-05-01

    Local-scale climate scenarios are required as input to impact models for the assessment of climate change impacts. These scenarios incorporate changes in climatic variability as well as in extreme events, which are particularly important when used in conjunction with process-based non-linear impact models. ELPIS is a repository of climate scenarios for Europe based on the LARS-WG weather generator and future climate projections. Recently, projections from 18 global climate models (GCMs) from the CMIP5 multi-model ensemble used in the latest IPCC AR5 were incorporated into ELPIS. In ELPIS, the site parameters of climatic variables for the baseline period, 1980-2010, were estimated by LARS-WG from European Crop Growth Monitoring System (CGMS) daily weather data, which were interpolated from observed sites onto a 25-km grid over Europe. Using change factors derived from the GCMs, LARS-WG perturbed the site distributions for the baseline climate to generate local-scale daily weather for the future under the RCP4.5 and RCP8.5 concentration pathways. The ability of LARS-WG to reproduce daily weather time series for 1980-2010 was assessed using statistical tests. Baseline site parameters derived from CGMS were validated against an independent dataset obtained from the ECA&D archive. ELPIS represents a unique resource for impact assessments of climate change in Europe.
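
    A simplified illustration of the change-factor idea (LARS-WG itself perturbs the parameters of fitted distributions and re-generates synthetic weather; this sketch only applies monthly deltas to an observed series): precipitation is scaled multiplicatively and temperature shifted additively, month by month. The change factors and the observed data are made up.

```python
# Monthly change-factor perturbation of an observed daily series (toy data).
import numpy as np

def apply_change_factors(obs_precip, obs_temp, months, dP, dT):
    """obs_*: daily baseline series; months: month index (1-12) per day;
    dP: multiplicative precipitation factor per month; dT: additive temperature
    change (degC) per month."""
    precip = np.array([p * dP[m] for p, m in zip(obs_precip, months)])
    temp = np.array([t + dT[m] for t, m in zip(obs_temp, months)])
    return precip, temp

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    months = np.repeat(np.arange(1, 13), 30)                         # toy 360-day year
    obs_p = rng.gamma(0.5, 4.0, size=360)                            # mm/day
    obs_t = 10 + 12 * np.sin((months - 4) / 12 * 2 * np.pi) + rng.normal(0, 2, 360)
    dP = {m: 0.9 for m in range(1, 13)}                              # 10% drier
    dT = {m: 2.5 for m in range(1, 13)}                              # +2.5 degC warmer
    fut_p, fut_t = apply_change_factors(obs_p, obs_t, months, dP, dT)
    print(f"annual precip: {obs_p.sum():.0f} -> {fut_p.sum():.0f} mm")
    print(f"mean temp: {obs_t.mean():.1f} -> {fut_t.mean():.1f} degC")
```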

  6. Scenario-based decision making in water resource management: A case study in the Yellow River Delta

    NASA Astrophysics Data System (ADS)

    Dong, Congli; Schoups, Gerrit; van de Giesen, Nick

    2013-04-01

    Decision making in water resource management encounters difficulties due to uncertainties about the future. Scenarios are useful to explore uncertainties and inform decision makers so that they can take action. Scenarios were originally used to describe future states in the form of storylines; these are then supplemented with numerical information from model predictions and expert judgement. Probabilities are attached to scenarios to encourage explicit explanation of the assumptions and expectations behind the storylines and to communicate the possibility of each scenario. Bayesian probability offers a prior probability on the basis of available knowledge and beliefs in the presence of uncertainties, and allows updating to a posterior probability as new evidence arises. Bayesian rules are also applicable to decision making given the existing probabilistic scenarios: decisions can be ranked according to their performance on a utility function under each possible scenario. A case study is provided to find an optimal solution to alleviate the water stress problem in the Yellow River Delta over the next 30 years. Scenarios of water availability and water demand are developed for the planning period. In order to make decisions rationally, cost-benefit analysis is used to evaluate the performance of viable decisions given the probabilistic scenarios. Key words: scenarios, water management, uncertainty, decision making, Bayesian approach
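
    The decision-ranking step can be sketched directly: each candidate management option is scored by its expected utility over the probabilistic scenarios. The scenario probabilities, options, and net benefits below are illustrative assumptions, not the case-study values.

```python
# Rank candidate decisions by expected utility over probabilistic scenarios.
import numpy as np

scenario_probs = np.array([0.25, 0.50, 0.25])      # e.g. wet / average / dry futures

# net benefit (arbitrary units) of each decision under each scenario
utilities = {
    "build reservoir":         np.array([ 5.0, 12.0,  30.0]),
    "water-saving irrigation": np.array([ 8.0, 10.0,  18.0]),
    "do nothing":              np.array([10.0,  2.0, -25.0]),
}

ranked = sorted(utilities.items(),
                key=lambda kv: float(kv[1] @ scenario_probs), reverse=True)
for name, u in ranked:
    print(f"{name:25s} expected utility = {float(u @ scenario_probs):6.2f}")
```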

  7. Thermal Performance Expectations of the Advanced Stirling Convertor Over a Range of Operating Scenarios

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Dyson, Rodger W.

    2010-01-01

    Objectives of this work are: (1) Assist the Science Mission Directorate in developing technologies for space missions. (2) Explore the capability of computational modeling to assist in the development of the Advanced Stirling Convertor. (3) Baseline computational simulation with available experimental data of the ASC. (4) Calculate peak external pressure vessel wall temperatures and compare them with anticipated values. (5) Calculate peak magnet temperature inside the ASC over a range of operational scenarios.

  8. Exposure to sulfosulfuron in agricultural drainage ditches: field monitoring and scenario-based modelling.

    PubMed

    Brown, Colin D; Dubus, Igor G; Fogg, Paul; Spirlet, Marie; Gustin, Christophe

    2004-08-01

    Field monitoring and scenario-based modelling were used to assess exposure of small ditches in the UK to the herbicide sulfosulfuron following transport via field drains. A site in central England on a high-pH clay soil was treated with sulfosulfuron, and concentrations were monitored in the single drain outfall and in the receiving ditch 1 km downstream. Drainflow in the nine months following application totalled 283 mm. Pesticide lost in the first 12.5 mm of flow was 99% of the total loading to drains (0.5% of applied). Significant dilution was observed in the receiving ditch, and quantifiable residues were detected in only one sample (0.06 microg litre(-1)). The MACRO model was evaluated against the field data with minimal calibration. The parameterisation over-estimated the importance of macropore flow at the site; as a consequence, the maximum concentration in drainflow (2.3 microg litre(-1)) and the total loading to drains (0.76 g) were over-estimated by factors of 2.4 and 5, respectively. MACRO was then used to simulate the long-term fate of the herbicide for each of 20 environmental scenarios. The resulting estimates for concentrations of sulfosulfuron in a receiving ditch were weighted according to the prevalence of each scenario to produce a probability distribution of daily exposure. PMID:15307668
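
    The final weighting step lends itself to a short sketch: per-scenario distributions of simulated daily concentrations are combined, using scenario prevalence as weights, into a single exposure distribution from which percentiles can be read. The scenario names, prevalences, and concentration values below are invented placeholders, not MACRO output.

        # Prevalence-weighted combination of per-scenario exposure distributions.
        import numpy as np

        rng = np.random.default_rng(1)
        conc = {                                     # simulated daily conc. (ug/L)
            "scenario_a": rng.lognormal(-3.0, 1.0, 365),
            "scenario_b": rng.lognormal(-4.0, 0.8, 365),
            "scenario_c": rng.lognormal(-5.0, 0.6, 365),
        }
        prevalence = {"scenario_a": 0.5, "scenario_b": 0.3, "scenario_c": 0.2}

        values = np.concatenate([conc[k] for k in conc])
        weights = np.concatenate([np.full(conc[k].size, prevalence[k] / conc[k].size)
                                  for k in conc])
        order = np.argsort(values)
        cdf = np.cumsum(weights[order]) / weights.sum()
        p90 = values[order][np.searchsorted(cdf, 0.90)]  # weighted 90th percentile
        print(round(float(p90), 4))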

  9. [Study on strategies of pollution prevention in coastal city of Zhejiang Province based on scenario analysis].

    PubMed

    Tian, Jin-Ping; Chen, Lü-Jun; Du, Peng-Fei; Qian, Yi

    2013-01-01

    Scenario analysis was used to study the environmental burden in a coastal city of Zhejiang Province under different patterns of economic development. The aim of this research is to offer advice for decision making by illustrating how emissions can be reduced by transforming the pattern of economic development in a developed coastal area that had already reached a GDP per capita of 70,000 yuan. First, 18 heavily polluting industries were screened out with reference to their total emissions of chemical oxygen demand, ammonia-nitrogen, sulfur dioxide, and nitrogen oxides. Then, a scenario-analysis model and the associated back-calculation program were designed to study the sustainable development of these industries. With 2008 as the reference year and 2015 as the target year, emissions of the four pollutants from the 18 heavily polluting industries in the city were analyzed under six scenarios, with the required reduction of total emissions set as the constraint of the scenario analysis. Finally, several suggestions for decision making are put forward: maintaining a moderate GDP growth rate of around 7%, strengthening the adjustment of the economic structure, controlling the growth rate of industrial added value in the heavily polluting industries, optimizing the structure of those industries, decreasing emission intensity by implementing cleaner production to cut emissions at the source, and strengthening regulation of the operation of waste treatment plants to further improve treatment efficiency. Only by implementing these measures can the total emissions of chemical oxygen demand, ammonia-nitrogen, sulfur dioxide, and nitrogen oxides from the city's 18 heavily polluting industries be reduced by 10%, 10%, 5%, and 15%, respectively, relative to the reference year. PMID:23487960

  10. Scenario based seismic hazard assessment and its application to the seismic verification of relevant buildings

    NASA Astrophysics Data System (ADS)

    Romanelli, Fabio; Vaccari, Franco; Altin, Giorgio; Panza, Giuliano

    2016-04-01

    The procedure we developed, and applied to a few relevant cases, leads to the seismic verification of a building through: a) the use of a scenario-based neodeterministic approach (NDSHA) for the calculation of the seismic input, and b) control of the numerical model of the existing building using free-vibration measurements of the real structure. The key point of this approach is the close collaboration between the seismologist and the civil engineer, from the definition of the seismic input to the monitoring of the building response in the calculation phase. The vibrometry study allows the engineer to adjust the computational model in the direction suggested by the experimental result of a physical measurement. Once the model has been calibrated by vibrometric analysis, one can select in the design spectrum the proper range of periods of interest for the structure. Then, realistic values of spectral acceleration can be selected, which include the appropriate amplification obtained through the modeling of a "scenario" input applied to the final model. Generally, but not necessarily, the "scenario" spectra lead to higher accelerations than those deduced by taking the spectra from the national codes (i.e. NTC 2008, for Italy). The task of the verifying engineer is to ensure that the outcome of the verification is both conservative and realistic. We show some examples of the application of the procedure to relevant buildings (e.g. schools) of the Trieste Province. In most cases, adopting the scenario input increased the number of critical elements that have to be taken into account in the design of reinforcements. However, the higher cost associated with the larger number of elements to reinforce is reasonable, especially considering the substantial reduction of the risk level.

  11. What Did I Do? A Scenario-Based Program To Assist Specific Learning Disabled Adolescents in Understanding Legal Issues.

    ERIC Educational Resources Information Center

    McDougall, Donna M.

    This practicum was designed to train eight adolescents with specific learning disabilities (SLD) about their legal rights and responsibilities, through a scenario-based program presented in the classroom as part of a transition program. The practicum involved the development of 22 scenarios, a pretest and posttest, and discussions and role-playing…

  12. Developing Authentic Online Problem-Based Learning Case Scenarios for Teachers of Students with Visual Impairments in the United Kingdom

    ERIC Educational Resources Information Center

    McLinden, Mike; McCall, Steve; Hinton, Danielle; Weston, Annette

    2010-01-01

    This article reports on the development of online problem-based learning case scenarios for use in a distance education program for teachers of students with visual impairments in the United Kingdom. Following participation in two case scenarios, a cohort of teachers provided feedback. This feedback was analyzed in relation to the relevant…

  13. CMIP5 Global Climate Model Performance Evaluation and Climate Scenario Development over the South-Central United States

    NASA Astrophysics Data System (ADS)

    Rosendahl, D. H.; Rupp, D. E.; Mcpherson, R. A.; Moore, B., III

    2015-12-01

    Future climate change projections from Global Climate Models (GCMs) are the primary drivers of regional downscaling and impacts research, from which relevant information for stakeholders is generated at the regional and local levels. Understanding uncertainties in GCMs is therefore a fundamental necessity if the scientific community is to provide useful and reliable future climate change information that can be utilized by end users and decision makers. Two different assessments of the Coupled Model Intercomparison Project Phase 5 (CMIP5) GCM ensemble were conducted for the south-central United States. The first was a performance evaluation over the historical period for metrics of near-surface meteorological variables (e.g., temperature, precipitation) and system-based phenomena, including large-scale processes that can influence the region (e.g., the low-level jet, ENSO). These metrics were used to identify a subset of higher-performing models across the region, which were then used to constrain future climate change projections. A second assessment explored climate scenario development in which all model climate change projections were assumed equally likely and the future projections with the highest impact were identified (e.g., temperature and precipitation combinations of hottest/driest, hottest/wettest, and highest variability). Each of these assessments identifies a subset of models that may prove useful to regional downscaling and impacts researchers who are restricted in the total number of GCMs they can utilize. Results from these assessments will be provided, along with a discussion of when each would be appropriate to use.
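
    As a simplified illustration of the first assessment (not the study's actual metric suite), historical-period skill can be summarised as an error score per model and used to retain a better-performing subset. Model names and values below are invented.

        # Rank GCMs by RMSE against observations and keep the best-performing subset.
        import numpy as np

        obs = np.array([24.0, 68.0, 82.0, 35.0])          # observed seasonal means (invented)
        models = {
            "GCM-A": np.array([22.0, 75.0, 70.0, 40.0]),
            "GCM-B": np.array([30.0, 50.0, 95.0, 20.0]),
            "GCM-C": np.array([25.0, 66.0, 85.0, 33.0]),
        }
        rmse = {name: float(np.sqrt(np.mean((sim - obs) ** 2)))
                for name, sim in models.items()}
        subset = sorted(rmse, key=rmse.get)[:2]           # retain the two best models
        print(rmse, subset)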

  14. Partial Ambiguity Resolution for Ground and Space-Based Applications in a GPS+Galileo scenario: A simulation study

    NASA Astrophysics Data System (ADS)

    Nardo, A.; Li, B.; Teunissen, P. J. G.

    2016-01-01

    Integer Ambiguity Resolution (IAR) is the key to fast and precise GNSS positioning. The proper diagnostic metric for successful IAR is the ambiguity success rate, i.e. the probability of correct integer estimation. In this contribution we analyse the performance of different GPS+Galileo models in terms of the number of epochs needed to reach a pre-determined success rate, for various ground and space-based applications. The simulation-based, controlled model environment enables us to gain insight into the factors contributing to the ambiguity resolution strength of the different GPS+Galileo models. Different scenarios of modernized GPS+Galileo are studied, encompassing the long-baseline ground case as well as the medium-dynamics case (airplane) and the space-based Low Earth Orbiter (LEO) case. In our analyses of these models the capabilities of partial ambiguity resolution (PAR) are demonstrated and compared to the limitations of full ambiguity resolution (FAR). The results show that PAR is generally a more efficient way than FAR to reduce the time needed to achieve centimetre-level positioning precision. For long single baselines, PAR can cut the time needed to reach such precision levels by fifty percent, while for multiple baselines it becomes even more effective, reaching reductions of up to eighty percent for four-station networks. For a LEO, the rapidly changing observation geometry does not even allow FAR, while PAR is then still possible for both dual- and triple-frequency scenarios. With the triple-frequency GPS+Galileo model the availability of precise positioning improves by fifteen percent with respect to the dual-frequency scenario.
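
    The abstract does not give a formula for the success rate; one widely used closed-form diagnostic is the integer-bootstrapping success rate, computed from the conditional standard deviations of the decorrelated float ambiguities. The sketch below assumes that formulation, with invented values, and is not necessarily the exact measure used by the authors.

        # Integer-bootstrapping success rate: P = prod_i [ 2*Phi(1/(2*sigma_i)) - 1 ].
        from math import erf, sqrt

        def bootstrap_success_rate(cond_std):
            """cond_std: conditional standard deviations of the ambiguities (cycles)."""
            phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))   # standard normal CDF
            p = 1.0
            for s in cond_std:
                p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
            return p

        print(bootstrap_success_rate([0.05, 0.08, 0.12]))      # illustrative values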

  15. Environmental performance of construction waste: Comparing three scenarios from a case study in Catalonia, Spain.

    PubMed

    Ortiz, O; Pasqualino, J C; Castells, F

    2010-04-01

    The main objective of this paper is to evaluate the environmental impacts of construction wastes in terms of the LIFE 98 ENV/E/351 project. Construction wastes are classified in accordance with the Life Program Environment Directive of the European Commission. Three alternative scenarios for the current waste management of a case study in Catalonia (Spain) have been compared: landfilling, recycling and incineration; these scenarios were evaluated by means of Life Cycle Assessment (LCA). The recommendations of the Catalan Waste Catalogue and the European Waste Catalogue have been taken into account, and the influence of transport has also been evaluated. Results show that, in terms of the Global Warming Potential (GWP), the most environmentally friendly treatment was recycling, followed by incineration and lastly landfilling. Regarding the influence of treatment plant location on the GWP indicator, we observe that incineration and recycling of construction wastes are better than landfilling, even for long distances from the building site to the plants. This is true for most wastes except the stony types, which should be recycled close to the building site. In summary, data on construction waste from a Catalan case study were evaluated using the well-established LCA method to determine the environmental impacts. PMID:20005694

  16. Environmental performance of construction waste: Comparing three scenarios from a case study in Catalonia, Spain

    SciTech Connect

    Ortiz, O.; Pasqualino, J.C.; Castells, F.

    2010-04-15

    The main objective of this paper is to evaluate the environmental impacts of construction wastes in terms of the LIFE 98 ENV/E/351 project. Construction wastes are classified in accordance with the Life Program Environment Directive of the European Commission. Three alternative scenarios for the current waste management of a case study in Catalonia (Spain) have been compared: landfilling, recycling and incineration; these scenarios were evaluated by means of Life Cycle Assessment (LCA). The recommendations of the Catalan Waste Catalogue and the European Waste Catalogue have been taken into account, and the influence of transport has also been evaluated. Results show that, in terms of the Global Warming Potential (GWP), the most environmentally friendly treatment was recycling, followed by incineration and lastly landfilling. Regarding the influence of treatment plant location on the GWP indicator, we observe that incineration and recycling of construction wastes are better than landfilling, even for long distances from the building site to the plants. This is true for most wastes except the stony types, which should be recycled close to the building site. In summary, data on construction waste from a Catalan case study were evaluated using the well-established LCA method to determine the environmental impacts.

  17. Incorporating scenario-based simulation into a hospital nursing education program.

    PubMed

    Nagle, Beth M; McHale, Jeanne M; Alexander, Gail A; French, Brian M

    2009-01-01

    Nurse educators are challenged to provide meaningful and effective learning opportunities for both new and experienced nurses. Simulation as a teaching and learning methodology is being embraced by nursing in academic and practice settings to provide innovative educational experiences to assess and develop clinical competency, promote teamwork, and improve care processes. This article provides an overview of the historical basis for using simulation in education, simulation methodologies, and perceived advantages and disadvantages. It also provides a description of the integration of scenario-based programs using a full-scale patient simulator into nursing education programming at a large academic medical center. PMID:19226995

  18. TRIDEC Cloud - a Web-based Platform for Tsunami Early Warning tested with NEAMWave14 Scenarios

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven; Necmioglu, Ocal; Comoglu, Mustafa; Ozer Sozdinler, Ceren; Carrilho, Fernando; Wächter, Joachim

    2015-04-01

    In times of cloud computing and ubiquitous computing, the use of concepts and paradigms introduced by information and communications technology (ICT) has to be considered even for early warning systems (EWS). Based on the experience and knowledge gained in research projects, new technologies are exploited to implement a cloud-based and web-based platform - the TRIDEC Cloud - to open up new prospects for EWS. The platform in its current version addresses tsunami early warning and mitigation. It merges several complementary external and in-house cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPU), for web-mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The TRIDEC Cloud can be accessed in two different modes, the monitoring mode and the exercise-and-training mode. The monitoring mode provides important functionality required to act in a real event; so far, it integrates historic and real-time sea level data and the latest earthquake information, and the integration of sources is supported by a simple and secure interface. The exercise-and-training mode enables training and exercises with virtual scenarios: it disconnects real-world systems and connects with a virtual environment that receives virtual earthquake information and virtual sea level data re-played by a scenario player. Thus operators and other stakeholders are able to train skills and prepare for real events and large exercises. The GFZ German Research Centre for Geosciences (GFZ), the Kandilli Observatory and Earthquake Research Institute (KOERI), and the Portuguese Institute for the Sea and Atmosphere (IPMA) have used the opportunity provided by NEAMWave14 to test the TRIDEC Cloud as a collaborative activity based on previous partnership and commitments at

  19. The Impact of New Estimates of Mixing Ratio and Flux-based Halogen Scenarios on Ozone Evolution

    NASA Technical Reports Server (NTRS)

    Oman, Luke D.; Douglass, Anne R.; Liang, Qing; Strahan, Susan E.

    2014-01-01

    The evolution of ozone in the 21st century has been shown to be mainly impacted by the halogen emissions scenario and by predicted changes in the circulation of the stratosphere. New estimates of mixing-ratio and flux-based emission scenarios have been produced from the SPARC Lifetime Assessment 2013. Simulations using the Goddard Earth Observing System Chemistry-Climate Model (GEOSCCM) are conducted using this new A1 2014 halogen scenario and compared to simulations using the A1 2010 scenario. This updated version of GEOSCCM includes a realistic representation of the Quasi-Biennial Oscillation and improvements related to the break-up of the Antarctic polar vortex. We will present results of the ozone evolution over the recent past and the 21st century for the A1 2010 scenario, the A1 2014 mixing-ratio scenario, and an A1 2014 flux-based halogen scenario. Implications of the uncertainties in these estimates, as well as those from possible circulation changes, will be discussed.

  20. Land-use threats and protected areas: a scenario-based, landscape level approach

    USGS Publications Warehouse

    Wilson, Tamara S.; Sleeter, Benjamin M.; Sleeter, Rachel R.; Soulard, Christopher E.

    2014-01-01

    Anthropogenic land use will likely present a greater challenge to biodiversity than climate change this century in the Pacific Northwest, USA. Even if species are equipped with the adaptive capacity to migrate in the face of a changing climate, they will likely encounter a human-dominated landscape as a major dispersal obstacle. Our goal was to identify, at the ecoregion level, protected areas in close proximity to lands with a higher likelihood of future land-use conversion. Using a state-and-transition simulation model, we modeled spatially explicit (1 km2) land use from 2000 to 2100 under seven alternative land-use and emission scenarios for ecoregions in the Pacific Northwest. We analyzed scenario-based land-use conversion threats from logging, agriculture, and development near existing protected areas. A conversion threat index (CTI) was created to identify ecoregions with the highest projected land-use conversion potential in closest proximity to existing protected areas. Our analysis indicated nearly 22% of land area in the Coast Range, over 16% of land area in the Puget Lowland, and nearly 11% of the Cascades had very high CTI values. Broader regional-scale land-use change is projected to impact nearly 40% of the Coast Range, 30% of the Puget Lowland, and 24% of the Cascades (i.e., the two highest CTI classes). A landscape-level, scenario-based approach to modeling future land use helps identify ecoregions with existing protected areas at greater risk from regional land-use threats and can help prioritize future conservation efforts.

  1. Scenario-based design: A method for connecting information system design with public health operations and emergency management

    PubMed Central

    Reeder, Blaine; Turner, Anne M

    2011-01-01

    Responding to public health emergencies requires rapid and accurate assessment of workforce availability under adverse and changing circumstances. However, public health information systems to support resource management during both routine and emergency operations are currently lacking. We applied scenario-based design as an approach to engage public health practitioners in the creation and validation of an information design to support routine and emergency public health activities. Methods: Using semi-structured interviews we identified the information needs and activities of senior public health managers of a large municipal health department during routine and emergency operations. Results: Interview analysis identified twenty-five information needs for public health operations management. The identified information needs were used in conjunction with scenario-based design to create twenty-five scenarios of use and a public health manager persona. Scenarios of use and persona were validated and modified based on follow-up surveys with study participants. Scenarios were used to test and gain feedback on a pilot information system. Conclusion: The method of scenario-based design was applied to represent the resource management needs of senior-level public health managers under routine and disaster settings. Scenario-based design can be a useful tool for engaging public health practitioners in the design process and to validate an information system design. PMID:21807120

  2. Emergence of the First Catalytic Oligonucleotides in a Formamide-Based Origin Scenario.

    PubMed

    Šponer, Judit E; Šponer, Jiří; Nováková, Olga; Brabec, Viktor; Šedo, Ondrej; Zdráhal, Zbyněk; Costanzo, Giovanna; Pino, Samanta; Saladino, Raffaele; Di Mauro, Ernesto

    2016-03-01

    Fifty years after the historic Miller-Urey experiment, the formamide-based scenario is perhaps the most powerful competing hypothesis for the origin of life on our planet besides the traditional HCN-based concept. The body of information accumulated on this topic over the last 15 years is growing at an astonishing rate, and nowadays the formamide-based model represents one of the most complete and coherent pathways leading from simple prebiotic precursors up to the first catalytically active RNA molecules. In this work, we review the major events of this long pathway that have emerged from recent experimental and theoretical studies, concentrating mainly on the mechanistic, methodological, and structural aspects of this research. PMID:26807661

  3. Multi-Purpose Avionic Architecture for Vision Based Navigation Systems for EDL and Surface Mobility Scenarios

    NASA Astrophysics Data System (ADS)

    Tramutola, A.; Paltro, D.; Cabalo Perucha, M. P.; Paar, G.; Steiner, J.; Barrio, A. M.

    2015-09-01

    Vision Based Navigation (VBNAV) has been identified as a valid technology to support space exploration because it can improve the autonomy and safety of space missions. Several mission scenarios can benefit from VBNAV: Rendezvous & Docking, Fly-Bys, Interplanetary cruise, Entry Descent and Landing (EDL) and Planetary Surface exploration. For some of them, VBNAV can improve the accuracy of state estimation as an additional relative navigation sensor or as an absolute navigation sensor. For others, like surface mobility and terrain exploration for path identification and planning, VBNAV is mandatory. This paper presents the general avionic architecture of a Vision Based System as defined in the frame of the ESA R&T study “Multi-purpose Vision-based Navigation System Engineering Model - part 1 (VisNav-EM-1)”, with special focus on the surface mobility application.

  4. Earthquake Scenario-Based Tsunami Wave Heights in the Eastern Mediterranean and Connected Seas

    NASA Astrophysics Data System (ADS)

    Necmioglu, Ocal; Özel, Nurcan Meral

    2015-12-01

    We identified a set of tsunami scenario input parameters in a 0.5° × 0.5° uniformly gridded area in the Eastern Mediterranean, Aegean (for both shallow and intermediate-depth earthquakes) and Black Seas (only shallow earthquakes) and calculated tsunami scenarios using the SWAN-Joint Research Centre (SWAN-JRC) code (Mader 2004; Annunziato 2007) with 2-arcmin resolution bathymetry data for the range from Mw 6.5 to Mwmax, with an Mw increment of 0.1 at each grid point, in order to realize a comprehensive analysis of tsunami wave heights from earthquakes originating in the region. We defined characteristic earthquake source parameters from a compiled set of sources, such as existing moment tensor catalogues and various reference studies, together with the Mwmax assigned in the literature, where possible. Results from 2,415 scenarios show that in the Eastern Mediterranean and its connected seas (Aegean and Black Sea), shallow earthquakes with Mw ≥ 6.5 may result in coastal wave heights of 0.5 m, whereas the same wave height would be expected only from intermediate-depth earthquakes with Mw ≥ 7.0. The distribution of maximum wave heights calculated indicates that tsunami wave heights up to 1 m could be expected in the northern Aegean, whereas in the Black Sea, Cyprus, the Levantine coasts, northern Libya, eastern Sicily, southern Italy, and western Greece, wave heights of up to 3 m could be possible. Crete, the southern Aegean, and the area between northeast Libya and Alexandria (Egypt) are prone to maximum tsunami wave heights of >3 m. Considering that calculations are performed at a minimum bathymetry depth of 20 m, these wave heights may, according to Green's Law, be amplified by a factor of 2 at the coastline. The study can provide a basis for detailed tsunami hazard studies in the region.
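
    The factor-of-2 amplification quoted above follows from Green's law, under which wave height scales with water depth as (h1/h2)^(1/4) during shoaling. The short check below assumes a representative nearshore depth of about 1.25 m, an illustrative choice rather than a value from the study.

        # Green's law shoaling check from the 20 m computation depth to the nearshore.
        h1, h2 = 20.0, 1.25                  # depths in metres (h2 is illustrative)
        amplification = (h1 / h2) ** 0.25    # H2/H1 = (h1/h2)^(1/4)
        print(round(amplification, 2))       # -> 2.0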

  5. Estimation of Design Rainfall Based on Climate Change Scenario in Jeju Island

    NASA Astrophysics Data System (ADS)

    Yang, S. K.; Lee, J. H.; Jung, W. Y.

    2014-12-01

    With extreme temperature events occurring increasingly often in Jeju Island, a hybrid downscaling technique that applies dynamical and statistical methods simultaneously was used to estimate design rainfall, in order to reduce flood damage from severe storms and typhoons. The region has a high density of rain gage stations (24 in total), but only records longer than 30 years are used for trend analysis. Accordingly, the Jeju and Seogwipo rain gage stations were selected for a comparative analysis based on daily rainfall actually measured at the stations for over 30 years and on rainfall projected under the A1B scenario (source: Climate Change Information Center). Future design rainfall was then computed for each station from the analysis results. Case 1 showed a strong tendency to overestimate rainfall, with values increasing continuously, while Case 2 showed a trend similar to Case 1 but yielded lower design rainfall computed from the A1B-scenario rainfall. Based on the design rainfall computation methods mainly used in the Preventive Disaster System (through the Pre-disaster Effect Examination System) and the Basic Plan for Rivers of Jeju Island, which take climate change into account in selecting the 50-year and 100-year frequencies, Case 3 was selected for the Jeju rain gage station and Case 1 for the Seogwipo rain gage station. The results differed between stations because of differences in rainfall characteristics under recent climate change, and the risk associated with the currently accepted design rainfall may increase in the near future.
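
    The abstract does not state which frequency-analysis method underlies the 50-year and 100-year design rainfall; a Gumbel (EV1) distribution fitted by the method of moments is one common choice and is sketched below with invented annual-maximum daily rainfall values.

        # Gumbel (EV1) design rainfall by the method of moments (illustrative data, mm).
        import numpy as np

        amax = np.array([182., 240., 155., 310., 205., 275., 330., 190., 260., 225.,
                         295., 170., 345., 215., 250., 285., 200., 320., 235., 265.])
        alpha = np.sqrt(6.0) * amax.std(ddof=1) / np.pi   # scale parameter
        u = amax.mean() - 0.5772 * alpha                  # location parameter

        def design_rainfall(T):
            """Rainfall depth with return period T years."""
            return u - alpha * np.log(-np.log(1.0 - 1.0 / T))

        print(round(design_rainfall(50), 1), round(design_rainfall(100), 1))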

  6. Two strategies to better constrain physics-based rupture scenarios and their uncertainties

    NASA Astrophysics Data System (ADS)

    Hok, Sébastien

    2016-04-01

    Physics-based rupture modelling needs estimates of the physical parameters controlling rupture mechanics, such as stresses, friction properties, and fault geometries, as well as their variability in space. Given the lack of knowledge and of any direct way to infer these parameters, they come with uncertainties. To go further toward physics-based source models, we need strategies both for improving constraints on the input parameters, especially their variability along the fault plane, and for taking the uncertainties into account in the models. Here I present two promising ways to improve our prediction capabilities. First, to reduce the uncertainties on the models, new strategies need to be tested for better estimation of the input friction and stress parameters. In this framework, I will show examples of using interseismic coupling maps (Japan, Chile) as a proxy for the variability of stress drop along the fault plane. This strategy is an efficient way to introduce independent external constraints on the modelling, reducing the total uncertainty of the scenarios. Second, in order to quantify the final uncertainty of the results, we need to choose an appropriate way to handle the variability of the input parameters. One way is to use logic trees, so that the final results (rupture scenarios or ground motions) come with an estimate of their uncertainty. I will illustrate this point with an application to rupture segmentation in the Corinth rift and probabilistic magnitude estimation.

  7. Variance-based global sensitivity analysis for multiple scenarios and models with implementation using sparse grid collocation

    NASA Astrophysics Data System (ADS)

    Dai, Heng; Ye, Ming

    2015-09-01

    Sensitivity analysis is a vital tool in hydrological modeling for identifying influential parameters for inverse modeling and uncertainty analysis, and variance-based global sensitivity analysis has gained popularity. However, the conventional global sensitivity indices are defined with consideration of only parametric uncertainty. Based on a hierarchical structure of parameter, model, and scenario uncertainties and on recently developed techniques of model- and scenario-averaging, this study derives new global sensitivity indices for multiple models and multiple scenarios. To reduce the computational cost of variance-based global sensitivity analysis, the sparse grid collocation method is used to evaluate the mean and variance terms involved. In a simple synthetic case of groundwater flow and reactive transport, it is demonstrated that the global sensitivity indices vary substantially between the four models and three scenarios. Not considering the model and scenario uncertainties might result in biased identification of important model parameters. This problem is resolved by using the new indices defined for multiple models and/or multiple scenarios, which is particularly important when the sensitivity indices and model/scenario probabilities vary substantially. The sparse grid collocation method dramatically reduces the computational cost in comparison with the popular quasi-random sampling method. The new framework of global sensitivity analysis is mathematically general and can be applied to a wide range of hydrologic and environmental problems.
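
    The new indices extend the classical variance-based (Sobol') indices across models and scenarios and are evaluated with sparse-grid collocation; the sketch below only recalls the classical first-order index, S_i = Var(E[Y|x_i]) / Var(Y), estimated by brute-force Monte Carlo on a toy function rather than by the paper's sparse-grid approach.

        # Brute-force Monte Carlo estimate of a first-order variance-based index.
        import numpy as np

        rng = np.random.default_rng(0)

        def model(x1, x2):                    # toy response, not a groundwater model
            return x1 ** 2 + 0.5 * x1 * x2 + x2

        n_outer, n_inner = 2000, 500
        x1 = rng.uniform(0, 1, n_outer)
        cond_mean = np.array([model(v, rng.uniform(0, 1, n_inner)).mean() for v in x1])

        x1_all, x2_all = rng.uniform(0, 1, 200000), rng.uniform(0, 1, 200000)
        S1 = cond_mean.var() / model(x1_all, x2_all).var()   # Var(E[Y|x1]) / Var(Y)
        print(round(float(S1), 3))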

  8. Policy Choice for Urban Low-carbon transportation in Beijing: Scenario Analysis Based on LEAP model

    NASA Astrophysics Data System (ADS)

    Zhang, Yu

    2016-04-01

    Beijing is a fast-developing megacity with serious traffic problems, such as high energy consumption, high CO2 emissions and traffic congestion. The coming 13th Five-Year Plan for Beijing's economic and social development will focus on low-carbon transportation policy to achieve sustainable development of urban traffic. In order to improve the feasibility of urban low-carbon transportation policies, this paper analyzes future trends in CO2 emissions from transportation in Beijing. First, a business-as-usual baseline and five policy scenarios are developed according to the coming Beijing 13th Five-Year Plan: "Business As Usual (BAU)", "Public Transportation Priority (PTP)", "New Energy Vehicle (NEV)", "Active Transportation (AT)", "Private Car Regulation (PCR)" and "Hybrid Policy (HP)". Then the Long-range Energy Alternatives Planning System (LEAP) framework is adopted to estimate CO2 emissions under the given scenarios up to 2020 and to analyze the implications. The results demonstrate that the low-carbon transportation policies can reduce CO2 emissions effectively; the "Hybrid Policy (HP)" has the best performance, and in terms of single-policy effect, "Private Car Regulation (PCR)" comes first, followed by "Public Transportation Priority (PTP)".
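
    At its core, LEAP performs bottom-up accounting of the kind sketched below: for each transport mode, emissions are the product of activity, energy intensity, and a fuel emission factor, summed over modes and compared across scenarios. The modes and figures below are invented placeholders, not the study's data.

        # Bottom-up CO2 accounting for one scenario (invented placeholder figures).
        modes = {
            #              (billion pkm, MJ per pkm, kg CO2 per MJ)
            "private car": (120.0, 2.3, 0.070),
            "bus":         ( 60.0, 0.7, 0.072),
            "metro":       ( 45.0, 0.3, 0.065),   # electricity, grid-average factor
        }
        total_kg = sum(pkm * 1e9 * intensity * ef
                       for pkm, intensity, ef in modes.values())
        print(round(total_kg / 1e9, 2), "Mt CO2")  # 1e9 kg = 1 Mt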

  9. Dynamic optimization of ISR sensors using a risk-based reward function applied to ground and space surveillance scenarios

    NASA Astrophysics Data System (ADS)

    DeSena, J. T.; Martin, S. R.; Clarke, J. C.; Dutrow, D. A.; Newman, A. J.

    2012-06-01

    As the number and diversity of sensing assets available for intelligence, surveillance and reconnaissance (ISR) operations continues to expand, the limited ability of human operators to effectively manage, control and exploit the ISR ensemble is exceeded, leading to reduced operational effectiveness. Automated support, both in the processing of voluminous sensor data and in sensor asset control, can relieve the burden on human operators and support the operation of larger ISR ensembles. In dynamic environments it is essential to react quickly to current information to avoid stale, sub-optimal plans. Our approach is to apply the principles of feedback control to ISR operations, "closing the loop" from the sensor collections through automated processing to ISR asset control. Previous work by the authors demonstrated non-myopic multiple-platform trajectory control using a receding horizon controller in a closed feedback loop with a multiple hypothesis tracker, applied to multi-target search-and-track simulation scenarios in the ground and space domains. This paper presents extensions in both size and scope of the previous work, demonstrating closed-loop control, involving both platform routing and sensor pointing, of a multisensor, multi-platform ISR ensemble tasked with providing situational awareness and performing search, track and classification of multiple moving ground targets in irregular warfare scenarios. The closed-loop ISR system is fully realized using distributed, asynchronous components that communicate over a network. The closed-loop ISR system has been exercised via a networked simulation test bed against a scenario in the Afghanistan theater implemented using high-fidelity terrain and imagery data. In addition, the system has been applied to space surveillance scenarios requiring tracking of space objects, where current deliberative, manually intensive processes for managing sensor assets are insufficiently responsive. Simulation experiment results are presented

  10. Application of State Analysis and Goal-Based Operations to a MER Mission Scenario

    NASA Technical Reports Server (NTRS)

    Morris, J. Richard; Ingham, Michel D.; Mishkin, Andrew H.; Rasmussen, Robert D.; Starbird, Thomas W.

    2006-01-01

    State Analysis is a model-based systems engineering methodology employing a rigorous discovery process which articulates operations concepts and operability needs as an integrated part of system design. The process produces requirements on system and software design in the form of explicit models which describe the behavior of states and the relationships among them. By applying State Analysis to an actual MER flight mission scenario, this study addresses the specific real world challenges of complex space operations and explores technologies that can be brought to bear on future missions. The paper describes the tools currently used on a daily basis for MER operations planning and provides an in-depth description of the planning process, in the context of a Martian day's worth of rover engineering activities, resource modeling, flight rules, science observations, and more. It then describes how State Analysis allows for the specification of a corresponding goal-based sequence that accomplishes the same objectives, with several important additional benefits.

  11. Assessing nitrate leaching losses with simulation scenarios and model based fertiliser recommendations

    NASA Astrophysics Data System (ADS)

    Michalczyk, A.; Kersebaum, K. C.; Hartmann, T.; Yue, S. C.; Chen, X. P.

    2012-04-01

    Excessive mineral nitrogen fertiliser application and irrigation in intensive agricultural cropping systems are seen as major reasons for low water and nitrogen use efficiencies in the North China Plain. High nitrogen fertiliser and irrigation water inputs not only lead to higher production costs but also to declining groundwater tables, nitrate accumulation in deeper soil layers below the root zone and water pollution. To evaluate the effects of improved management practices on environmental pollution risk, the HERMES model is used to simulate nitrate leaching losses. HERMES is a dynamic, process-based crop model made for practical applications such as fertiliser recommendations. The model was tested and validated in two field studies in the south of Hebei Province, each lasting about three years, with a winter wheat (Triticum aestivum L.) and summer maize (Zea mays L.) double-cropping system. Biomass, grain yield, plant N uptake and soil water content were better simulated than mineral nitrogen in the soil. A model-based nitrogen fertiliser recommendation was applied in the field for one wheat crop, and the parallel model simulation showed satisfactory results. Although there was no change in the amount of irrigation, the results indicated a possibility to reduce the fertiliser rate, and thus nitrogen leaching, even further than in the reduced treatment without reducing crop yields. Furthermore, a simulation scenario with a model-based fertiliser recommendation and field-capacity-based irrigation was compared to farmers' practice and a reduced-nitrogen treatment. The scenario results showed that the model recommendation together with the reduced irrigation has the highest potential to reduce nitrate leaching. The results also showed that flood irrigation as practised by the farmers, with water amounts that are difficult to estimate, introduces considerable uncertainty into the modelling.

  12. Scenario analysis and path selection of low-carbon transformation in China based on a modified IPAT model.

    PubMed

    Chen, Liang; Yang, Zhifeng; Chen, Bin

    2013-01-01

    This paper presents a forecast and analysis of population, economic development, energy consumption and CO2 emissions in China over short- and long-term horizons up to 2020, with 2007 as the base year. The widely applied IPAT model, which, reformulated as the Kaya equation, is the basis for calculations, projections, and scenarios of greenhouse gases (GHGs), is extended to analyze and predict the relations between human activities and the environment. Four scenarios of CO2 emissions are used: business as usual (BAU), an energy efficiency improvement scenario (EEI), a low carbon scenario (LC) and an enhanced low carbon scenario (ELC). The results show that carbon intensity will be reduced by 40-45% as scheduled and that the economic growth rate will be 6% in China under the LC scenario by 2020. The LC scenario, as the most appropriate and most feasible scheme for China's future low-carbon development, can maximize the harmonious development of the economic, social, energy and environmental systems. Assuming China's development follows the LC scenario, the paper further identifies four paths of low-carbon transformation in China: technological innovation, industrial structure optimization, energy structure optimization and policy guidance. PMID:24204922
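
    The Kaya decomposition referred to above expresses emissions as F = P x (G/P) x (E/G) x (F/E), i.e. population times GDP per capita times energy intensity times the carbon intensity of energy. The sketch below simply evaluates the identity with placeholder values, not the paper's projections.

        # Kaya identity with illustrative (non-paper) values.
        P = 1.40e9      # population
        g = 9.0e3       # GDP per capita (USD per person)
        e = 8.0         # energy intensity (MJ per USD of GDP)
        f = 0.065       # carbon intensity of energy (kg CO2 per MJ)

        F = P * g * e * f                       # kg CO2 per year
        print(round(F / 1e12, 2), "Gt CO2")     # 1e12 kg = 1 Gt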

  13. Scenario Analysis and Path Selection of Low-Carbon Transformation in China Based on a Modified IPAT Model

    PubMed Central

    Chen, Liang; Yang, Zhifeng; Chen, Bin

    2013-01-01

    This paper presents a forecast and analysis of population, economic development, energy consumption and CO2 emissions in China over short- and long-term horizons up to 2020, with 2007 as the base year. The widely applied IPAT model, which, reformulated as the Kaya equation, is the basis for calculations, projections, and scenarios of greenhouse gases (GHGs), is extended to analyze and predict the relations between human activities and the environment. Four scenarios of CO2 emissions are used: business as usual (BAU), an energy efficiency improvement scenario (EEI), a low carbon scenario (LC) and an enhanced low carbon scenario (ELC). The results show that carbon intensity will be reduced by 40–45% as scheduled and that the economic growth rate will be 6% in China under the LC scenario by 2020. The LC scenario, as the most appropriate and most feasible scheme for China’s future low-carbon development, can maximize the harmonious development of the economic, social, energy and environmental systems. Assuming China's development follows the LC scenario, the paper further identifies four paths of low-carbon transformation in China: technological innovation, industrial structure optimization, energy structure optimization and policy guidance. PMID:24204922

  14. Lunar Outpost Life Support Architecture Study Based on a High-Mobility Exploration Scenario

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2010-01-01

    This paper presents results of a life support architecture study based on a 2009 NASA lunar surface exploration scenario known as Scenario 12. The study focuses on the assembly complete outpost configuration and includes pressurized rovers as part of a distributed outpost architecture in both stand-alone and integrated configurations. A range of life support architectures are examined reflecting different levels of closure and distributed functionality. Monte Carlo simulations are used to assess the sensitivity of results to volatile high-impact mission variables, including the quantity of residual Lander oxygen and hydrogen propellants available for scavenging, the fraction of crew time away from the outpost on excursions, total extravehicular activity hours, and habitat leakage. Surpluses or deficits of water and oxygen are reported for each architecture, along with fixed and 10-year total equivalent system mass estimates relative to a reference case. System robustness is discussed in terms of the probability of no water or oxygen resupply as determined from the Monte Carlo simulations.
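
    A hedged sketch of the kind of Monte Carlo assessment described: the high-impact mission variables are sampled and the outpost water balance is tallied to estimate how often no water resupply would be needed. The balance coefficients and sampling ranges below are invented for illustration and bear no relation to the study's mass estimates.

        # Monte Carlo sampling of mission variables and a toy water-balance model.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 10000
        scavenged = rng.uniform(0, 300, n)       # residual O2/H2 scavenged (kg)
        eva_hours = rng.uniform(500, 1500, n)    # total EVA hours
        away_frac = rng.uniform(0.1, 0.5, n)     # crew time away on excursions
        leak_rate = rng.uniform(0.5, 2.0, n)     # habitat leakage (kg air/day)

        water_balance = (0.8 * scavenged         # water-equivalent credit
                         - 0.2 * eva_hours       # EVA water/oxygen losses
                         - 300.0 * away_frac     # reduced recovery while away
                         - 30.0 * leak_rate)     # make-up for leakage
        print(round(float((water_balance >= 0).mean()), 2))  # P(no resupply needed)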

  15. A Scenario-Based Process for Requirements Development: Application to Mission Operations Systems

    NASA Technical Reports Server (NTRS)

    Bindschadler, Duane L.; Boyles, Carole A.

    2008-01-01

    The notion of using operational scenarios as part of requirements development during mission formulation (Phases A & B) is widely accepted as good system engineering practice. In the context of developing a Mission Operations System (MOS), there are numerous practical challenges to translating that notion into the cost-effective development of a useful set of requirements. These challenges can include a lack of Project-level focus on operations, insufficient or improper flowdown of requirements, flowdown of immature or poor-quality requirements from the Project level, and MOS resource constraints (personnel expertise and/or dollars). System engineering theory must be translated into a practice that provides enough structure and standards to serve as guidance, but that retains sufficient flexibility to be tailored to the needs and constraints of a particular MOS or Project. We describe a detailed, scenario-based process for requirements development. Identifying a set of attributes for high-quality requirements, we show how portions of the process address many of those attributes. We also find that the basic process steps are robust and can be effective even in challenging Project environments.

  16. ESPC Overview. Cash Flows, Scenarios, and Associated Diagrams for Energy Savings Performance Contracts

    SciTech Connect

    Tetreault, T.; Regenthal, S.

    2011-05-01

    This document is meant to inform state and local decision makers about the process of energy savings performance contracts, and how projected savings and allocated energy-related budgets can be impacted by changes in utility prices.

  17. ESPC Overview: Cash Flows, Scenarios, and Associated Diagrams for Energy Savings Performance Contracts

    SciTech Connect

    Tetreault, T.; Regenthal, S.

    2011-05-01

    This document is meant to inform state and local decision makers about the process of energy savings performance contracts, and how projected savings and allocated energy-related budgets can be impacted by changes in utility prices.

  18. Application of risk-based multiple criteria decision analysis for selection of the best agricultural scenario for effective watershed management.

    PubMed

    Javidi Sabbaghian, Reza; Zarghami, Mahdi; Nejadhashemi, A Pouyan; Sharifi, Mohammad Bagher; Herman, Matthew R; Daneshvar, Fariborz

    2016-03-01

    Effective watershed management requires the evaluation of agricultural best management practice (BMP) scenarios that carefully consider the relevant environmental, economic, and social criteria involved. In the Multiple Criteria Decision-Making (MCDM) process, scenarios are first evaluated and then ranked to determine the most desirable outcome for the particular watershed. The main challenge of this process is the accurate identification of the best solution for the watershed in question, despite the various risk attitudes presented by the associated decision-makers (DMs). This paper introduces a novel approach for implementation of the MCDM process based on a comparative neutral-risk/risk-based decision analysis, which results in the selection of the most desirable scenario for the entire watershed. At the sub-basin level, each scenario includes multiple BMPs with scores calculated using criteria derived from two cases of neutral-risk and risk-based decision-making. The simple additive weighting (SAW) operator is applied for neutral-risk decision-making, while the ordered weighted averaging (OWA) and induced OWA (IOWA) operators are used for risk-based decision-making. At the watershed level, the BMP scores of the sub-basins are aggregated to calculate each scenario's combined goodness measure; the most desirable scenario for the entire watershed is then selected based on the combined goodness measures. Our final results illustrate which types of operator and risk attitudes are needed to satisfy the relevant criteria across the sub-basins, and how they ultimately affect the final ranking of the given scenarios. The methodology proposed here has been successfully applied to the Honeyoey Creek-Pine Creek watershed in Michigan, USA, to evaluate various BMP scenarios and determine the best solution for both the stakeholders and the overall stream health. PMID:26734840
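
    A minimal sketch of the two aggregation operators named above: SAW is a weighted sum of normalised criteria scores, while OWA first sorts the scores and then applies positional weights, which is how the decision-maker's risk attitude enters. The scores and weights below are invented.

        # SAW vs. OWA aggregation of one BMP's normalised criteria scores.
        import numpy as np

        scores = np.array([0.7, 0.4, 0.9])            # environmental, economic, social
        criteria_weights = np.array([0.5, 0.3, 0.2])
        saw = float(scores @ criteria_weights)        # neutral-risk aggregation

        owa_weights = np.array([0.1, 0.3, 0.6])       # pessimistic: weight on worst scores
        owa = float(np.sort(scores)[::-1] @ owa_weights)
        print(round(saw, 3), round(owa, 3))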

  19. Evaluating the impact of scenario-based high-fidelity patient simulation on academic metrics of student success.

    PubMed

    Sportsman, Susan; Schumacker, Randall E; Hamilton, Patti

    2011-01-01

    Despite the ongoing nursing shortage, nurse educators are responsible for preparing students to practice in highly complex health care systems. As nurse educators explore new learning strategies to support an increase in student admissions, they must also evaluate the impact of these strategies on the quality of the educational experience. The study reported here evaluated the impact of scenario-based, high-fidelity patient simulation, used to support increased student admissions in an associate degree and a baccalaureate nursing program in north-central Texas, on students' sense of their own clinical competence, graduating grade point average (GPA), and performance on standardized exit examinations. These are measures commonly used by nurse educators as metrics of success. PMID:21923008

  20. Current scenario of peptide-based drugs: the key roles of cationic antitumor and antiviral peptides

    PubMed Central

    Mulder, Kelly C. L.; Lima, Loiane A.; Miranda, Vivian J.; Dias, Simoni C.; Franco, Octávio L.

    2013-01-01

    Cationic antimicrobial peptides (AMPs) and host defense peptides (HDPs) show vast potential as peptide-based drugs. Great effort has been made to exploit their mechanisms of action, aiming to identify their targets as well as to enhance their activity and bioavailability. In this review, we will focus on both naturally occurring and designed antiviral and antitumor cationic peptides, including those here called promiscuous, in which multiple targets are associated with a single peptide structure. Emphasis will be given to their biochemical features, selectivity against extra targets, and molecular mechanisms. Peptides that possess antitumor activity against different cancer cell lines will be discussed, as will peptides that inhibit virus replication, focusing on their applications for human health, animal health and agriculture, and on their potential as new therapeutic drugs. Moreover, the current scenario for production, the use of nanotechnology as a delivery tool for both classes of cationic peptides, and the prospects for improving them are considered. PMID:24198814

  1. A web-based 3D visualisation and assessment system for urban precinct scenario modelling

    NASA Astrophysics Data System (ADS)

    Trubka, Roman; Glackin, Stephen; Lade, Oliver; Pettit, Chris

    2016-07-01

    Recent years have seen an increasing number of spatial tools and technologies for enabling better decision-making in the urban environment. They have largely arisen because of the need for cities to be planned more efficiently, to accommodate growing populations while mitigating urban sprawl, and because innovations in 3D data rendering are well suited to visualising the urban built environment. In this paper we review a number of systems that are better known and more commonly used in the field of urban planning. We then introduce Envision Scenario Planner (ESP), a web-based 3D precinct geodesign, visualisation and assessment tool developed using Agile and Co-design methods. We provide a comprehensive account of the tool, beginning with a discussion of its design and development process and concluding with an example use case and a discussion of the lessons learned in its development.

  2. Scenario-based Water Resources Management Using the Water Value Concept

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard

    2013-04-01

    The Saskatchewan River is the key water resource for the three prairie provinces of Alberta, Saskatchewan and Manitoba in Western Canada, and it is thus necessary to pursue long-term regional and watershed-based planning for the river basin. The water resources system is complex because it includes multiple components representing various demand sectors, including the environment, which impose conflicting objectives, and multiple jurisdictions. The biophysical complexity is exacerbated by the socioeconomic dimensions associated, for example, with the impacts of land and water management, value systems including environmental flows, and policy and governance. We focus on the South Saskatchewan River Basin (SSRB) in Alberta and Saskatchewan, which is already fully allocated in southern Alberta and is subject to increasing demand due to rapid economic development and a growing population. Multiple sectors and water uses include agricultural, municipal, industrial, mining, hydropower, and environmental flow requirements. The significant spatial variability in the level of development and future needs for water places different values on water across the basin. Water resources planning and decision making must take these complexities into consideration, yet also deal with a new dimension: climate change and its possible future impacts on water resources systems. There is a pressing need to deal with water in terms of its value, rather than as a mere commodity subject to traditional quantitative optimization. In this research, a value-based water resources system (VWRS) model is proposed to couple the hydrological and the societal aspects of water resources in one integrated modeling tool for the SSRB. The objective of this work is to develop the VWRS model as a negotiation, planning, and management tool that allows for the assessment of the availability, as well as the allocation scenarios, of water resources for competing users under varying conditions. The proposed

  3. Context-based handover of persons in crowd and riot scenarios

    NASA Astrophysics Data System (ADS)

    Metzler, Jürgen

    2015-02-01

    In order to control riots in crowds, it is helpful to get ringleaders under control and to pull them out of the crowd once they have become offenders. A great help in achieving these tasks is the capability to observe the crowd and the ringleaders automatically using cameras, which also allows better conservation of evidence in riot control. A ringleader who has become an offender should be tracked across and recognized by several cameras, regardless of whether the cameras' fields of view overlap. We propose a context-based approach for the handover of persons between different camera fields of view. This approach can be applied to overlapping as well as non-overlapping fields of view, so that fast and accurate identification of individual persons in camera networks is feasible. Within the scope of this paper, the approach is applied to the handover of persons between single images without using any temporal information. It is particularly developed for semiautomatic video editing and the handover of persons between cameras in order to improve the conservation of evidence. The approach has been developed on a dataset collected during a Crowd and Riot Control (CRC) training of the German armed forces, which comprised three levels of escalation: first the crowd started with a peaceful demonstration, later there were violent protests, and finally the riot escalated and offenders pushed into the chain of guards. One result of the work is a reliable context-based method for person re-identification between single images of different camera fields of view in crowd and riot scenarios. Furthermore, a qualitative assessment shows that the use of contextual information provides additional support for this task: it can decrease the time needed for handover and the number of confusions, which aids the conservation of evidence in crowd and riot scenarios.

  4. WSN system design by using an innovative neural network model to perform thermals forecasting in a urban canyon scenario

    NASA Astrophysics Data System (ADS)

    Giuseppina, Nicolosi; Salvatore, Tirrito

    2015-12-01

    Wireless Sensor Networks (WSNs) have been studied as a means of managing indoor Heating, Ventilating and Air-Conditioning (HVAC) systems. A WSN can be especially useful for regulating indoor comfort in an urban canyon scenario, where thermal parameters vary rapidly under the influence of changing outdoor climate. This paper presents an innovative neural network approach that uses the collected WSN data to forecast the indoor temperature under varying outdoor conditions, based on climate parameters and boundary conditions typical of an urban canyon. Particular attention is paid to the influence of traffic congestion and the number of vehicles in queue.
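
    A hedged sketch (not the authors' network) of the idea: a small feed-forward regressor maps outdoor conditions and a traffic proxy, as might be collected by the WSN, to an indoor temperature estimate. The features, coefficients, and data below are synthetic.

        # Small neural-network regressor for indoor temperature, on synthetic data.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        n = 500
        X = np.column_stack([
            rng.uniform(15, 35, n),     # outdoor air temperature (deg C)
            rng.uniform(0.3, 0.9, n),   # relative humidity (fraction)
            rng.uniform(1, 6, n),       # wind speed in the canyon (m/s)
            rng.integers(0, 60, n),     # vehicles in queue (traffic proxy)
        ])
        y = (0.6 * X[:, 0] + 3.0 * X[:, 1] - 0.4 * X[:, 2]
             + 0.03 * X[:, 3] + rng.normal(0, 0.5, n))   # synthetic indoor temperature

        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(16, 8),
                                           max_iter=2000, random_state=0))
        model.fit(X[:400], y[:400])
        print(round(float(model.score(X[400:], y[400:])), 2))   # R^2 on held-out data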

  5. Strong Effects of Vs30 Heterogeneity on Physics-Based Scenario Ground-Shaking Computations

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Pullammanappallil, S. K.

    2014-12-01

    Hazard mapping and building codes worldwide use the vertically time-averaged shear-wave velocity between the surface and 30 meters depth, Vs30, as one predictor of earthquake ground shaking. Intensive field campaigns a decade ago in Reno, Los Angeles, and Las Vegas measured urban Vs30 transects with 0.3-km spacing. The Clark County, Nevada, Parcel Map includes urban Las Vegas and comprises over 10,000 site measurements over 1500 km2, completed in 2010. All of these data demonstrate fractal spatial statistics, with a fractal dimension of 1.5-1.8 at scale lengths from 0.5 km to 50 km. Vs measurements in boreholes up to 400 m deep show very similar statistics at 1 m to 200 m lengths. When included in physics-based earthquake-scenario ground-shaking computations, the highly heterogeneous Vs30 maps exhibit unexpectedly strong influence. In sensitivity tests, low-frequency computations at 0.1 Hz display amplifications (as well as de-amplifications) of 20% due solely to Vs30. In 0.5-1.0 Hz computations, the amplifications are a factor of two or more. At 0.5 Hz and higher frequencies the amplifications can be larger than what the 1-d Building Code equations would predict from the Vs30 variations. Vs30 heterogeneities at one location have strong influence on amplifications at other locations, stretching out in the predominant direction of wave propagation for that scenario. The sensitivity tests show that shaking and amplifications are highly scenario-dependent. Animations of computed ground motions and how they evolve with time suggest that the fractal Vs30 variance acts to trap wave energy and increases the duration of shaking. Validations of the computations against recorded ground motions, possible in Las Vegas Valley due to the measurements of the Clark County Parcel Map, show that ground motion levels and amplifications match, while recorded shaking has longer duration than computed shaking. Several mechanisms may explain the amplification and increased
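
    Vs30 is the travel-time-averaged shear-wave velocity over the top 30 m, not the arithmetic mean of the layer velocities. As a concrete illustration, a minimal sketch of that calculation for a layered velocity profile is given below; the layer thicknesses and velocities are illustrative values, not Parcel Map data.

```python
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged shear-wave velocity over the top 30 m.

    Vs30 = 30 / sum(h_i / Vs_i), where the h_i are layer thicknesses
    (truncated so that they sum to exactly 30 m) and Vs_i are the layer
    shear-wave velocities.
    """
    remaining = 30.0
    travel_time = 0.0
    for h, vs in zip(thicknesses_m, velocities_mps):
        h_used = min(h, remaining)
        travel_time += h_used / vs
        remaining -= h_used
        if remaining <= 0.0:
            break
    if remaining > 0.0:
        raise ValueError("velocity profile shallower than 30 m")
    return 30.0 / travel_time

# Illustrative three-layer profile: 5 m at 200 m/s, 10 m at 350 m/s, 20 m at 600 m/s
print(vs30([5.0, 10.0, 20.0], [200.0, 350.0, 600.0]))  # ~382 m/s
```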

  6. Assessment and comparison of total RF-EMF exposure in femtocell and macrocell base station scenarios.

    PubMed

    Aerts, Sam; Plets, David; Verloock, Leen; Martens, Luc; Joseph, Wout

    2014-12-01

    The indoor coverage of a mobile service can be drastically improved by deployment of an indoor femtocell base station (FBS). However, the impact of its proximity on the total exposure of the human body to radio-frequency (RF) electromagnetic fields (EMFs) is unknown. Using a framework designed for the combination of near-field and far-field exposure, the authors assessed and compared the RF-EMF exposure of a mobile-phone (MP) user that is either connected to an FBS or a conventional macrocell base station while in an office environment. It is found that, in average macrocell coverage and MP use-time conditions and for Universal Mobile Telecommunications System technology, the total exposure can be reduced by a factor of 20-40 by using an FBS, mostly due to the significant decrease in the output power of the MP. In general, the framework presented in this study can be used for any exposure scenario, featuring any number of technologies, base stations and/or access points, users and duration. PMID:24185915

  7. Lunar base scenario cost estimates: Lunar base systems study task 6.1

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The projected development and production costs of each of the Lunar Base's systems are described and unit costs are estimated for transporting the systems to the lunar surface and for setting up the system.

  8. Scenarios, personas and user stories from design ethnography: Evidence-based design representations of communicable disease investigations

    PubMed Central

    Turner, Anne M; Reeder, Blaine; Ramey, Judith

    2014-01-01

    Purpose Despite years of effort and millions of dollars spent to create a unified electronic communicable disease reporting system, the goal remains elusive. A major barrier has been a lack of understanding by system designers of communicable disease (CD) work and the public health workers who perform this work. This study reports on the application of user-centered design representations, traditionally used for improving interface design, to translate the complex CD work identified through ethnographic studies into guidance for designers and developers of CD systems. The purpose of this work is to: (1) better understand public health practitioners and their information workflow with respect to CD monitoring and control at a local health department, and (2) develop evidence-based design representations that model this CD work to inform the design of future disease surveillance systems. Methods We performed extensive onsite semi-structured interviews, targeted work shadowing and a focus group to characterize local health department CD workflow. Informed by principles of design ethnography and user-centered design (UCD), we created personas, scenarios and user stories to accurately represent the user to system designers. Results We sought to convey to designers the key findings from the ethnographic studies: (1) public health CD work is mobile and episodic, in contrast to current CD reporting systems, which are stationary and fixed; (2) health department efforts are focused on CD investigation and response rather than reporting; and (3) current CD information systems must conform to PH workflow to ensure their usefulness. In an effort to illustrate our findings to designers, we developed three contemporary design-support representations: persona, scenario, and user story. Conclusions Through application of user-centered design principles, we were able to create design representations that illustrate complex public health communicable

  9. Thermal Performance Expectations of the Advanced Stirling Convertor Over a Range of Operating Scenarios

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Dyson, Rodger W.

    2010-01-01

    The Advanced Stirling Radioisotope Generator (ASRG) will enable various missions such as small body sample return, atmospheric missions around Venus, and long-duration deep space missions. Analyses of the temperature distributions in an Advanced Stirling Convertor are performed, and the results are compared with available experimental measurements. This analysis includes applied environmental conditions that are similar to those that will be experienced while the convertor is in operation. The applied conditions represent a potential mission profile including pre-takeoff sterilization, launch, transit, and return. The results focus on the anticipated peak temperatures of the magnets in the linear alternator. These results confirm that the ASC can support future missions to deep space targets and extreme environment landers, as well as more conventional goals.

  10. Application of State Analysis and Goal-based Operations to a MER Mission Scenario

    NASA Technical Reports Server (NTRS)

    Morris, John Richard; Ingham, Michel D.; Mishkin, Andrew H.; Rasmussen, Robert D.; Starbird, Thomas W.

    2006-01-01

    State Analysis is a model-based systems engineering methodology employing a rigorous discovery process which articulates operations concepts and operability needs as an integrated part of system design. The process produces requirements on system and software design in the form of explicit models which describe the system behavior in terms of state variables and the relationships among them. By applying State Analysis to an actual MER flight mission scenario, this study addresses the specific real world challenges of complex space operations and explores technologies that can be brought to bear on future missions. The paper first describes the tools currently used on a daily basis for MER operations planning and provides an in-depth description of the planning process, in the context of a Martian day's worth of rover engineering activities, resource modeling, flight rules, science observations, and more. It then describes how State Analysis allows for the specification of a corresponding goal-based sequence that accomplishes the same objectives, with several important additional benefits.

  11. Projecting the environmental profile of Singapore's landfill activities: Comparisons of present and future scenarios based on LCA.

    PubMed

    Khoo, Hsien H; Tan, Lester L Z; Tan, Reginald B H

    2012-05-01

    This article aims to generate the environmental profile of Singapore's Semakau landfill by comparing three different operational options associated with the life cycle stages of landfilling activities against a 'business as usual' scenario. Before life cycle assessment (LCA) was used to quantify the potential impacts from landfilling activities, an attempt was made to incorporate localized and empirical information into the amounts of ash and MSW sent to the landfill. A linear regression representing the relationship between the mass of waste disposed and the mass of incineration ash generated was modeled from waste statistics between 2004 and 2009. Next, the mass of individual MSW components was projected from 2010 to 2030. The LCA results highlighted that in a 'business as usual' scenario the normalized total impacts of global warming, acidification and human toxicity increased by about 2% annually from 2011 to 2030. By replacing the 8000-tonne barge with a 10,000-tonne coastal bulk carrier or freighter (scenario 2), a total reduction of 48% in both global warming potential and acidification can be realized by 2030. Scenario 3 explored the importance of having a wastewater treatment plant in place to reduce human toxicity levels; however, the overall long-term benefits were not as significant as those of scenario 2. Scenario 4 shows that the option of increased recycling outperformed the other three scenarios in the long run, resulting in a 58% reduction of the total normalized results in 2030. A separate comparison of scenarios 1-4 is also carried out for energy utilization and for land use in terms of the volume occupied by waste. Along with the predicted reductions in environmental burdens, an additional bonus is found in the expanded lifespan of Semakau landfill from 2032 (base case) to 2039. Model limitations and suggestions for improvements were also discussed. PMID:22257698
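
    The ash-versus-waste relationship mentioned above can be reproduced with an ordinary least-squares fit. The sketch below illustrates the idea; the yearly tonnages are placeholder values, not the actual 2004-2009 Singapore waste statistics.

```python
import numpy as np

# Placeholder annual figures (million tonnes), NOT the actual 2004-2009 statistics
msw_disposed = np.array([2.4, 2.6, 2.7, 2.8, 2.9, 3.0])
ash_generated = np.array([0.42, 0.45, 0.47, 0.49, 0.51, 0.53])

# Ordinary least-squares fit: ash = slope * msw + intercept
slope, intercept = np.polyfit(msw_disposed, ash_generated, deg=1)

def project_ash(msw_megatonnes):
    """Project the incineration ash mass from a projected MSW tonnage."""
    return slope * msw_megatonnes + intercept

print(f"ash = {slope:.3f} * MSW + {intercept:.3f}")
print("projected ash for 3.4 Mt of MSW:", round(project_ash(3.4), 3))
```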

  12. Selection of an appropriate wastewater treatment technology: a scenario-based multiple-attribute decision-making approach.

    PubMed

    Kalbar, Pradip P; Karmakar, Subhankar; Asolekar, Shyam R

    2012-12-30

    Many technological alternatives for wastewater treatment are available, ranging from advanced technologies to conventional treatment options. It is difficult to select the most appropriate technology from among a set of available alternatives to treat wastewater at a particular location. Many factors, such as capital costs, operation and maintenance costs and land requirements, are involved in the decision-making process. Sustainability criteria must also be incorporated into the decision-making process so that appropriate technologies are selected for developing economies such as that of India. A scenario-based multiple-attribute decision-making (MADM) methodology has been developed and applied to the selection of wastewater treatment alternatives. The four most commonly used technologies for the treatment of municipal wastewater in India are ranked for various scenarios. Six scenarios are developed that capture the regional and local societal priorities of urban, suburban and rural areas and translate them into the mathematical algorithm of the MADM methodology. The articulated scenarios depict the most commonly encountered decision-making situations in addressing technology selection for wastewater treatment in India. A widely used compensatory MADM technique, TOPSIS, has been selected to rank the alternatives. Seven criteria with twelve indicators are formulated to evaluate the alternatives. Different weight matrices are used for each scenario, depending on its priorities. This study shows that it is difficult to select the most appropriate wastewater treatment alternative under the "no scenario" condition (equal weights given to each attribute), whereas the decision-making methodology presented in this paper effectively identifies the most appropriate wastewater treatment alternative for each of the scenarios. PMID:23023038
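
    TOPSIS ranks alternatives by their relative closeness to an ideal solution. The sketch below implements the standard algorithm on an illustrative decision matrix; the alternatives, criteria and weights are made-up placeholders, not the seven criteria and twelve indicators of the study.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix : (n_alternatives, n_criteria) raw scores
    weights: criterion weights summing to 1
    benefit: boolean per criterion, True = higher is better, False = cost criterion
    Returns closeness coefficients in [0, 1]; higher is better.
    """
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    benefit = np.asarray(benefit, dtype=bool)

    # Vector normalization, then weighting
    v = (m / np.linalg.norm(m, axis=0)) * w

    # Ideal (best) and anti-ideal (worst) solutions per criterion
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))

    d_plus = np.linalg.norm(v - ideal, axis=1)
    d_minus = np.linalg.norm(v - anti, axis=1)
    return d_minus / (d_plus + d_minus)

# Illustrative example: 4 treatment alternatives x 3 criteria
# (capital cost, land requirement, removal efficiency); the first two are cost criteria
scores = [[100, 2.0, 85],
          [140, 0.5, 92],
          [ 60, 4.0, 75],
          [120, 1.0, 90]]
cc = topsis(scores, weights=[0.4, 0.3, 0.3], benefit=[False, False, True])
print("ranking (best first):", np.argsort(-cc))
```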

  13. Scenario-based prediction of Li-ion batteries fire-induced toxicity

    NASA Astrophysics Data System (ADS)

    Lecocq, Amandine; Eshetu, Gebrekidan Gebresilassie; Grugeon, Sylvie; Martin, Nelly; Laruelle, Stephane; Marlair, Guy

    2016-06-01

    The development of high-energy Li-ion batteries with improved durability and increased safety mostly relies on the use of newly developed electrolytes. A detailed appraisal of fire-induced thermal and chemical threats from LiPF6- and LiFSI-based electrolytes by means of the so-called "fire propagation apparatus" had highlighted that the salt anion is responsible for the emission of non-negligible amounts of irritant gases such as HF (PF6-) or HF and SO2 (FSI-). A more thorough comparative investigation of the toxicity threat in the case of larger-size 0.4 kWh Li-ion modules was thus undertaken. A modeling approach, which consists of extrapolating the experimental data obtained from 1.3 Ah LiFePO4/graphite pouch cells under fire conditions and applying state-of-the-art international fire safety standards for the evaluation of fire toxicity, was applied to two different real-scale simulation scenarios. The obtained results reveal that critical thresholds are highly dependent on the nature of the salt, LiPF6 or LiFSI, and on the cells' state of charge. Hence, this approach can help define appropriate fire safety engineering measures for a given technology (different chemistry) or application (fully charged backup batteries or batteries subjected to deep discharge).

  15. A scenario-based study on information flow and collaboration patterns in disaster management.

    PubMed

    Sagun, Aysu; Bouchlaghem, Dino; Anumba, Chimay J

    2009-04-01

    Disaster management (DM) is a continuous, highly collaborative process involving governments, DM organisations, responders, the construction sector, and the general public. Most research approaches to DM include the development of information and communication technologies (ICT) to support the collaboration process rather than the creation of a collaboration process to provide information flows and patterns. An Intelligent Disaster Collaboration System (IDCS) is introduced in this paper as a conceptual model to integrate ICT into DM and the mitigation process and to enhance collaboration. The framework is applicable to the collaboration process at the local, regional and national levels. Within this context, the deployment of ICT tools in DM is explored and scenario-based case studies on flooding and terrorism--examples of natural and human-induced disasters, respectively--are presented. Conclusions are drawn regarding the differences found in collaboration patterns and ICT used during natural and human-induced disasters and the differences between currently available ICT and proposed ICT. PMID:18699856

  16. Scenario-based risk analysis of winter snowstorms in the German lowlands

    NASA Astrophysics Data System (ADS)

    von Wulffen, Anja

    2014-05-01

    conditions. Based on these findings, an exemplary synoptic evolution of a snowstorm leading to representative infrastructure failure cascades is constructed. In a next step, an extrapolation of the obtained scenario to future climate and societal conditions, as well as to plausible but not yet observed, more extreme meteorological conditions, is planned in order to obtain a thorough analysis of possible threats to the German food distribution system and a strong foundation for future disaster mitigation planning efforts.

  17. Neptune Orbiter Mission Scenario Based on Nuclear Electric Propulsion and Aerocapture Orbital Insertion

    NASA Astrophysics Data System (ADS)

    Jits, R.

    2002-01-01

    Aerocapture insertion of the spacecraft into an elliptical orbit around the target planet is proposed for a Neptune orbiter mission. The primary goal of combining nuclear electric propulsion (NEP) and aerocapture orbital insertion is a reduction of the trip time compared to that of a similar mission using nuclear electric propulsion only. One of the limitations of an all-NEP orbiter is that at the planetary approach it must match its arrival velocity with Neptune's orbital speed in order to initiate a slow capture into the desired orbit using low-thrust electric propulsion. Use of aerocapture for insertion into a closed elliptical orbit around Neptune through a single aerodynamically controlled atmospheric pass allows higher entry velocities than would be possible in an all-NEP scenario, thus reducing the trip time required for the interplanetary transfer, while placing greater demands on the propulsion and thermal protection systems. Moreover, because faster interplanetary trip times for the combined NEP/aerocapture orbiter result in higher entry velocities into Neptune's atmosphere, they also drive an increase in the aerobrake mass fraction. In addition, aerocapture at Neptune presents a challenge for the aerobrake's guidance system, which must target the vehicle to the desired atmospheric exit conditions in the presence of significant uncertainties in Neptune's atmospheric density. Hence, there is a need to design a robust nominal aerocapture trajectory capable of accommodating density dispersions and also optimized for minimum thermal protection mass, thus contributing to an overall reduction of the aerobrake mass fraction. A study to determine the optimal combination between reduction of the trip time and increase in aerobrake mass fraction was undertaken. The initial assumptions on aerobrake thermal protection materials and NEP system characteristics were based on near-term state-of-the-art technology, corresponding to the 2007-2010 time frame, when such a mission to Neptune could be launched.

  18. The Scenario-Based Engineering Process (SEP): a user-centered approach for the development of health care systems.

    PubMed

    Harbison, K; Kelly, J; Burnell, L; Silva, J

    1995-01-01

    The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks. PMID:8591321

  19. The FORE-SCE model: a practical approach for projecting land cover change using scenario-based modeling

    USGS Publications Warehouse

    Sohl, Terry L.; Sayler, Kristi L.; Drummond, Mark A.; Loveland, Thomas R.

    2007-01-01

    A wide variety of ecological applications require spatially explicit, historic, current, and projected land use and land cover data. The U.S. Land Cover Trends project is analyzing contemporary (1973–2000) land-cover change in the conterminous United States. The newly developed FORE-SCE model used Land Cover Trends data and theoretical, statistical, and deterministic modeling techniques to project future land cover change through 2020 for multiple plausible scenarios. Projected proportions of future land use were initially developed, and then sited on the lands with the highest potential for supporting that land use and land cover using a statistically based stochastic allocation procedure. Three scenarios of 2020 land cover were mapped for the western Great Plains in the US. The model provided realistic, high-resolution, scenario-based land-cover products suitable for multiple applications, including studies of climate and weather variability, carbon dynamics, and regional hydrology.
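
    The allocation step described above can be illustrated with a simple stochastic procedure: projected demand for a land-use class is placed on candidate cells with selection probability proportional to their modelled suitability. The following sketch is a toy version of that idea under assumed inputs, not the actual FORE-SCE implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy suitability surface for one land-use class on a 100 x 100 grid (values in 0..1)
suitability = rng.random((100, 100))

def allocate(suitability, n_cells):
    """Pick n_cells grid cells for a land-use class, with selection
    probability proportional to suitability (stochastic allocation)."""
    flat = suitability.ravel()
    probs = flat / flat.sum()
    chosen = rng.choice(flat.size, size=n_cells, replace=False, p=probs)
    mask = np.zeros(flat.size, dtype=bool)
    mask[chosen] = True
    return mask.reshape(suitability.shape)

# Allocate the scenario's projected demand of 500 cells to this class
new_class_mask = allocate(suitability, 500)
print("cells allocated:", int(new_class_mask.sum()))
```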

  20. Model-based comparisons of phylogeographic scenarios resolve the intraspecific divergence of cactophilic Drosophila mojavensis.

    PubMed

    Smith, Gilbert; Lohse, Konrad; Etges, William J; Ritchie, Michael G

    2012-07-01

    The cactophilic fly Drosophila mojavensis exhibits considerable intraspecific genetic structure across allopatric geographic regions and shows associations with different host cactus species across its range. The divergence between these populations has been studied for more than 60 years, yet their exact historical relationships have not been resolved. We analysed sequence data from 15 intronic X-linked loci across populations from Baja California, mainland Sonora-Arizona and Mojave Desert regions under an isolation-with-migration model to assess multiple scenarios of divergence. We also compared the results with a pre-existing sequence data set of eight autosomal loci. We derived a population tree with Baja California placed at its base and link their isolation to Pleistocene climatic oscillations. Our estimates suggest the Baja California population diverged from an ancestral Mojave Desert/mainland Sonora-Arizona group around 230,000-270,000 years ago, while the split between the Mojave Desert and mainland Sonora-Arizona populations occurred one glacial cycle later, 117,000-135,000 years ago. Although we found these three populations to be effectively allopatric, model ranking could not rule out the possibility of a low level of gene flow between two of them. Finally, the Mojave Desert population showed a small effective population size, consistent with a historical population bottleneck. We show that model-based inference from multiple loci can provide accurate information on the historical relationships of closely related groups allowing us to set into historical context a classic system of incipient ecological speciation. PMID:22571504

  1. A Cloud Robotics Based Service for Managing RPAS in Emergency, Rescue and Hazardous Scenarios

    NASA Astrophysics Data System (ADS)

    Silvagni, Mario; Chiaberge, Marcello; Sanguedolce, Claudio; Dara, Gianluca

    2016-04-01

    Cloud robotics and cloud services are revolutionizing not only the ICT world but also the robotics industry, giving robots more computing capabilities, storage and connection bandwidth while opening new scenarios that blend the physical and the digital world. In this vision, new IT architectures are required to manage robots, retrieve data from them and create services to interact with users. Among all robots, this work focuses mainly on flying robots, better known as drones, UAVs (Unmanned Aerial Vehicles) or RPAS (Remotely Piloted Aircraft Systems). The cloud robotics approach shifts the concept of having a single local "intelligence" for every single UAV, as a unique device that carries out all computation and storage processes onboard, to a more powerful "centralized brain" located in the cloud. This breakthrough opens new scenarios where UAVs are agents, relying on remote servers for most of their computational load and data storage, creating a network of devices in which they can share knowledge and information. UAVs are increasingly seen as interesting and suitable devices for environment monitoring, and many applications using them are growing. Many services can be built by fetching data from UAVs, such as telemetry, video streaming, pictures or sensor data. These services, as part of the IT architecture, can be accessed via the web by other devices or shared with other UAVs. As test cases of the proposed architecture, two examples are reported. The first is a search-and-rescue or emergency-management case, in which UAVs are required for monitoring and intervention. In case of emergency or aggression, the user requests the emergency service from the IT architecture, providing GPS coordinates and an identification number. The IT architecture uses a UAV (choosing among the available ones according to distance, service status, etc.) to reach him/her for monitoring and support operations. In the meantime, an officer will use the service to see the current position of the UAV, its
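
    The dispatch logic sketched in the abstract (pick an available UAV according to distance to the requester's GPS position) could look roughly like the following; the data structures and the haversine-based selection rule are assumptions for illustration, not the service's actual API.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical fleet state kept by the cloud service
fleet = [
    {"id": "uav-01", "lat": 45.06, "lon": 7.66, "status": "available"},
    {"id": "uav-02", "lat": 45.10, "lon": 7.70, "status": "busy"},
    {"id": "uav-03", "lat": 45.02, "lon": 7.60, "status": "available"},
]

def dispatch(request_lat, request_lon):
    """Choose the nearest available UAV for an emergency request."""
    available = [u for u in fleet if u["status"] == "available"]
    if not available:
        return None
    return min(available,
               key=lambda u: haversine_km(u["lat"], u["lon"], request_lat, request_lon))

print(dispatch(45.05, 7.65)["id"])
```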

  2. Sustainable Systems Analysis of Production and Transportation Scenarios for Conventional and Bio-based Energy Commodities

    NASA Astrophysics Data System (ADS)

    Doran, E. M.; Golden, J. S.; Nowacek, D. P.

    2013-12-01

    International commerce places unique pressures on the sustainability of water resources and marine environments. System impacts include noise, emissions, and chemical and biological pollutants like introduction of invasive species into key ecosystems. At the same time, maritime trade also enables the sustainability ambition of intragenerational equity in the economy through the global circulation of commodities and manufactured goods, including agricultural, energy and mining resources (UN Trade and Development Board 2013). This paper presents a framework to guide the analysis of the multiple dimensions of the sustainable commerce-ocean nexus. As a demonstration case, we explore the social, economic and environmental aspects of the nexus framework using scenarios for the production and transportation of conventional and bio-based energy commodities. Using coupled LCA and GIS methodologies, we are able to orient the findings spatially for additional insight. Previous work on the sustainable use of marine resources has focused on distinct aspects of the maritime environment. The framework presented here, integrates the anthropogenic use, governance and impacts on the marine and coastal environments with the natural components of the system. A similar framework has been highly effective in progressing the study of land-change science (Turner et al 2007), however modification is required for the unique context of the marine environment. This framework will enable better research integration and planning for sustainability objectives including mitigation and adaptation to climate change, sea level rise, reduced dependence on fossil fuels, protection of critical marine habitat and species, and better management of the ocean as an emerging resource base for the production and transport of commodities and energy across the globe. The framework can also be adapted for vulnerability analysis, resilience studies and to evaluate the trends in production, consumption and

  3. Ethoprophos fate on soil-water interface and effects on non-target terrestrial and aquatic biota under Mediterranean crop-based scenarios.

    PubMed

    Leitão, Sara; Moreira-Santos, Matilde; Van den Brink, Paul J; Ribeiro, Rui; José Cerejeira, M; Sousa, José Paulo

    2014-05-01

    The present study aimed to assess the environmental fate of the insecticide and nematicide ethoprophos at the soil-water interface following the pesticide's application in simulated maize and potato crops under Mediterranean agricultural conditions, particularly of irrigation. Focus was given to the soil-water transfer pathways (leaching and runoff), to pesticide transport in soil between the application area (crop row) and non-application areas (between crop rows), as well as to the toxic effects of the various matrices on terrestrial and aquatic biota. A semi-field methodology mimicking a "worst-case" ethoprophos application (twice the recommended dosage for maize and potato crops: 100% concentration v/v) in agricultural field situations was used, in order to mimic possible misuse by the farmer under realistic conditions. A rainfall was simulated on a slope of 20° for both crop-based scenarios. Soil and water samples were collected for the analysis of pesticide residues. The ecotoxicity of soil and aquatic samples was assessed by performing lethal and sublethal bioassays with organisms from different trophic levels: the collembolan Folsomia candida, the earthworm Eisenia andrei and the cladoceran Daphnia magna. Although the majority of ethoprophos sorbed to the soil in the application area, pesticide concentrations were detected in all water matrices, illustrating transfer pathways of water contamination between environmental compartments. Leaching to groundwater proved to be an important transfer pathway of ethoprophos under both crop-based scenarios, as it resulted in high pesticide concentrations in leachates from the Maize (130 µg L(-1)) and Potato (630 µg L(-1)) crop scenarios, respectively. Ethoprophos application in the Potato crop scenario caused more toxic effects on terrestrial and aquatic biota than in the Maize scenario, at the recommended dosage and at lower concentrations. In both crop-based scenarios, ethoprophos moved with the irrigation water flow to the

  4. Relation of Student Characteristics to Learning of Basic Biochemistry Concepts from a Multimedia Goal-Based Scenario.

    ERIC Educational Resources Information Center

    Schoenfeld-Tacher, Regina; Persichitte, Kay A.; Jones, Loretta L.

    This study sought to answer the question: Do all students benefit equally from the use of a hypermedia Goal-Based Scenario (GBS)? GBS is a subcategory of anchored instruction. The correlations of demographic and specific cognitive variables with achievement were explored using a lesson on DNA, which was tested on…

  5. Differential Effects of a Multimedia Goal-based Scenario To Teach Introductory Biochemistry--Who Benefits Most?

    ERIC Educational Resources Information Center

    Schoenfeld-Tacher, Regina; Jones, Loretta L.; Persichitte, Kay A.

    2001-01-01

    Investigates the relationship of cognitive and demographic variables to learning outcomes from a multimedia Goal-Based Scenario (GBS) lesson on DNA. Focuses on gender, ethnicity, prior science coursework in college and high school, final score in current chemistry course as demographic variables and logical thinking ability, spatial ability, and…

  6. Blending Face-to-Face Higher Education with Web-Based Lectures: Comparing Different Didactical Application Scenarios

    ERIC Educational Resources Information Center

    Montrieux, Hannelore; Vangestel, Sandra; Raes, Annelies; Matthys, Paul; Schellens, Tammy

    2015-01-01

    Blended learning as an instructional approach is getting more attention in the educational landscape and has been researched thoroughly. Yet, this study reports the results of an innovation project aiming to gain insight into three different scenarios of applying web-based lectures: as preparation for face-to-face practical exercises, as a…

  7. Improved seismic risk estimation for Bucharest, based on multiple hazard scenarios, analytical methods and new techniques

    NASA Astrophysics Data System (ADS)

    Toma-Danila, Dragos; Florinela Manea, Elena; Ortanza Cioflan, Carmen

    2014-05-01

    a very local-dependent hazard. Also, for major earthquakes, nonlinear effects need to be considered. This problem is treated accordingly, by using recent microzonation studies together with real data recorded at four events with Mw ≥ 6. Different ground motion prediction equations are also analyzed, and their improvement is investigated. For the building and population damage assessment, two open-source software packages are used and compared: SELENA and ELER. The damage probability for buildings is obtained through capacity-spectrum-based methods. The spectral content is used in terms of spectral acceleration at 0.2, 0.3 and 1 s. As the level of analysis (six sectors for the whole city) does not provide the best resolution with respect to the defined Bucharest hazard scenarios, we propose a procedure for dividing the data into smaller units, taking into consideration the construction code (four periods) and material. This approach relies on free data available from real estate agencies' websites. The study provides insight into the seismic risk analysis for Bucharest and an improvement of the real-time emergency system. Most importantly, the system is also evaluated with real data and relevant scenarios. State-of-the-art GIS maps are also presented, for both seismic hazard and risk.

  8. Photodegradation of polycyclic aromatic hydrocarbons in soils under a climate change base scenario.

    PubMed

    Marquès, Montse; Mari, Montse; Audí-Miró, Carme; Sierra, Jordi; Soler, Albert; Nadal, Martí; Domingo, José L

    2016-04-01

    The photodegradation of polycyclic aromatic hydrocarbons (PAHs) in two typical Mediterranean soils, either coarse- or fine-textured, was investigated here. Soil samples, spiked with the 16 US EPA priority PAHs, were incubated in a climate chamber at stable conditions of temperature (20 °C) and light (9.6 W m(-2)) for 28 days, simulating a climate change base scenario. PAH concentrations in soils were analyzed throughout the experiment and correlated with data obtained by means of the Microtox(®) ecotoxicity test. Photodegradation was found to be dependent on exposure time, the molecular weight of each hydrocarbon, and soil texture. Fine-textured soil enhanced sorption, with PAHs being more photodegraded than in coarse-textured soil. According to the EC50 values reported by Microtox(®), a higher detoxification was observed in fine-textured soil, in agreement with the outcomes of the analytical study. Significant photodegradation rates were detected for a number of PAHs, namely phenanthrene, anthracene, benzo(a)pyrene, and indeno(123-cd)pyrene. Benzo(a)pyrene, commonly used as an indicator for PAH pollution, was completely removed after 7 days of light exposure. In addition to the PAH chemical analysis and the ecotoxicity tests, a hydrogen isotope analysis of benzo(a)pyrene was also carried out. The degradation of this specific compound was associated with a strong enrichment in (2)H, with a maximum δ(2)H isotopic shift of +232‰. This strong isotopic effect observed in benzo(a)pyrene suggests that compound-specific isotope analysis (CSIA) may be a powerful tool to monitor in situ degradation of PAHs. Moreover, hydrogen isotopes of benzo(a)pyrene evidenced a degradation process of unknown origin occurring in the dark. PMID:26841292

  9. The design of scenario-based training from the resilience engineering perspective: a study with grid electricians.

    PubMed

    Saurin, Tarcisio Abreu; Wachs, Priscila; Righi, Angela Weber; Henriqson, Eder

    2014-07-01

    Although scenario-based training (SBT) can be an effective means to help workers develop resilience skills, it has not yet been analyzed from the resilience engineering (RE) perspective. This study introduces a five-stage method for designing SBT from the RE view: (a) identification of resilience skills, work constraints and actions for re-designing the socio-technical system; (b) design of template scenarios, allowing the simulation of the work constraints and the use of resilience skills; (c) design of the simulation protocol, which includes briefing, simulation and debriefing; (d) implementation of both scenarios and simulation protocol; and (e) evaluation of the scenarios and simulation protocol. It is reported how the method was applied in an electricity distribution company, in order to train grid electricians. The study was framed as an application of design science research, and five research outputs are discussed: method, constructs, model of the relationships among constructs, instantiations of the method, and theory building. Concerning the last output, the operationalization of the RE perspective on three elements of SBT is presented: identification of training objectives; scenario design; and debriefing. PMID:23835132

  10. Scenario-Based Specification and Evaluation of Architectures for Health Monitoring of Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Sundaram, P.

    2001-01-01

    HUMS systems have been an area of increased research in recent times for two main reasons: (a) the increase in the occurrence of accidents in aerospace, and (b) stricter FAA regulations on aircraft maintenance [2]. There are several problems associated with aircraft maintenance that HUMS systems can solve through the use of several monitoring technologies. This paper documents our methodology of employing scenarios in the specification and evaluation of an architecture for HUMS. Section 2 reviews related work that uses scenarios in software development. Section 3 describes how we use scenarios in our work, which is followed by a demonstration of our methods in the development of HUMS in Section 4. The conclusion summarizes the results.

  11. FPGA Based High Performance Computing

    SciTech Connect

    Bennett, Dave; Mason, Jeff; Sundararajan, Prasanna; Dellinger, Erik; Putnam, Andrew; Storaasli, Olaf O

    2008-01-01

    Current high performance computing (HPC) applications are found in many consumer, industrial and research fields. From web searches to auto crash simulations to weather predictions, these applications require large amounts of power for the compute farms and supercomputers needed to run them. The demand for more and faster computation continues to increase, along with an even sharper increase in the cost of the power required to operate and cool these installations. The ability of standard processor-based systems to address these needs, in terms of both computational speed and power consumption, has declined over the past few years. This paper presents a new method of computation based upon programmable logic, as represented by Field Programmable Gate Arrays (FPGAs), that addresses these needs in a manner requiring only minimal changes to the current software design environment.

  12. Supporting Primary-Level Mathematics Teachers' Collaboration in Designing and Using Technology-Based Scenarios

    ERIC Educational Resources Information Center

    Misfeldt, Morten; Zacho, Lis

    2016-01-01

    In this article, we address how the design of educational scenarios can support teachers' adoption of both technology and open-ended projects endorsing creativity and innovation. We do so by describing how groups of teachers develop digital learning environments using a combination of GeoGebra and Google Sites. Both teachers and…

  13. Scenario-based Participatory Design of A Collaborative Clinical Trial Protocol Authoring System

    PubMed Central

    Weng, Chunhua; Gennari, John H.; McDonald, David W.

    2003-01-01

    We present our experience of using prototype scenarios to actively involve users in the design of a collaborative clinical trial protocol authoring system. This method enables us to do usability testing and elicit prompt user feedback at the early phase of design. We conclude that it is an effective approach to the design of complex medical information systems. PMID:14728554

  14. Scenario-based tsunami hazard assessment for the coast of Vietnam from the Manila Trench source

    NASA Astrophysics Data System (ADS)

    Hong Nguyen, Phuong; Cong Bui, Que; Ha Vu, Phuong; The Pham, Truyen

    2014-11-01

    This paper assesses the impact on the Vietnamese coast of tsunamis in the East Vietnam Sea potentially originating from a giant rupture along the Manila Trench. Tsunami heights and arrival times at the major forecast points along the Vietnamese coast are computed using the COMCOT model. The results of the worst-case scenario (Mw = 9.3) and two extreme scenarios were used to assess the tsunami hazards. The simulation results show that the Vietnamese coast can be divided into three parts with different levels of tsunami hazard. The highest threat exists along the coasts of Central and North-Central Vietnam, from Quang Binh to Ba Ria - Vung Tau provinces, with a maximum wave height of 18 m observed near the Quang Ngai coast; a tsunami would reach this coastline in two hours at the earliest. The northern coastal zone of Vietnam has lower tsunami hazard. In the worst-case scenario, the maximum amplitudes of tsunami waves at Hai Phong sea port and Nam Dinh city, North Vietnam, are 3.5 m and 3.7 m, respectively, while the travel times to these sites are much longer, over 8 h. The southern coastal zone of Vietnam has very low tsunami hazard. In the worst-case scenario, the maximum amplitude at Ca Mau is 0.12 m, while the travel time is over 10 h.

  15. Hydrological Response to Climate Change over the Blue Nile Basin Distributed hydrological modeling based on surrogate climate change scenarios

    NASA Astrophysics Data System (ADS)

    Berhane, F. G.; Anyah, R. O.

    2010-12-01

    The Soil and Water Assessment Tool (SWAT2009) model has been applied to the Blue Nile Basin (Ethiopia) to study the hydrological response to surrogate climate changes by downscaling gridded weather data. The specific objectives of the study include (i) examining the performance of the SWAT model in simulating hydrology-climate interactions and feedbacks within the entire Blue Nile Basin, and (ii) investigating the response of hydrological variables to surrogate climate changes. Monthly weather data from the Climate Research Unit (CRU) are converted to daily values as input to SWAT using the Monthly to Daily Weather Converter (MODAWEC). Using the program SUFI-2 (Sequential Uncertainty Fitting Algorithm), data from 1979 to 1983 are used for sensitivity analysis and calibration (P-factor = 90%, R-factor = 0.7, R2 = 0.93 and NS = 0.93), and hindcasts are subsequently validated over the period 1984-1989 (R2 = 0.92 and NS = 0.92). The period 1960-2000 was used as the baseline for determining the changes and the effects of the surrogate climate changes over the Blue Nile Basin. Overall, our surrogate-climate-change-based simulations indicate that the hydrology of the Blue Nile catchment is very sensitive to potential climate change, with increases of 100%, 34% and 51% in surface runoff, lateral flow and water yield, respectively, for the A2 surrogate scenario. Key words: SWAT, MODAWEC, Blue Nile Basin, SUFI-2, climate change, hydrological modeling, CRU
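
    The NS values quoted above are Nash-Sutcliffe efficiencies, which compare simulated and observed discharge against the variance of the observations. A minimal sketch of NS and R2, assuming two arrays of observed and simulated monthly flows (illustrative numbers only, not Blue Nile data), is:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - sum((obs-sim)^2) / sum((obs-mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Coefficient of determination (squared Pearson correlation)."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Illustrative monthly discharge values (m3/s)
observed  = [120, 340, 800, 1500, 2600, 1800, 900, 400, 250, 180, 150, 130]
simulated = [110, 360, 760, 1450, 2500, 1900, 950, 380, 260, 170, 160, 125]

print("NS =", round(nash_sutcliffe(observed, simulated), 3))
print("R2 =", round(r_squared(observed, simulated), 3))
```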

  16. Future drought scenarios for the Greater Alpine Region based on dynamical downscaling experiments.

    NASA Astrophysics Data System (ADS)

    Haslinger, Klaus; Anders, Ivonne; Schöner, Wolfgang

    2014-05-01

    Large-scale droughts have major ecological, agricultural, economic and societal impacts by reducing crop yields, producing low flows in river systems or limiting the public water supply. Given the prospect of rising temperatures and possibly altered precipitation regimes in the coming decades due to global climate change, we assess future drought characteristics for the Greater Alpine Region (GAR) using regional climate model simulations. This study consists of two parts: first, the ability of the Regional Climate Model COSMO-CLM (CCLM) to simulate past drought conditions in space and time is evaluated; second, an analysis of future drought scenarios for the GAR is conducted. As a drought index the Standardized Precipitation Evapotranspiration Index (SPEI) is used. For the evaluation of the Regional Climate Model in the past, simulations driven by ERA-40 are compared to observations, primarily the gridded observational datasets of the HISTALP database. To assess the skill of CCLM, correlation coefficients between the SPEI of model simulations and of gridded observations are computed, stratified by season and time scale. For the analysis of future changes in drought characteristics, four scenario runs are investigated: ECHAM5- and HadCM3-driven CCLM runs for the SRES scenarios A1B, A2 and B1. The SPEI is calculated over both the C20 and the scenario runs, which are therefore regarded as transient simulations. Generally, trends towards drier annual mean conditions are apparent in each of the scenario runs; the signal is rather strong in summer, whereas winter shows a slight increase in precipitation north of the Alps. This in turn leads to higher variability of the SPEI in the future, as differences between winter (wetter or no change) and summer (considerably drier) grow larger.
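
    The SPEI is built from the climatic water balance (precipitation minus potential evapotranspiration) accumulated over a chosen time scale and then standardized; operationally a log-logistic distribution is fitted per calendar month, but a simplified z-score version conveys the idea. The sketch below is that simplification, using made-up monthly series rather than HISTALP or CCLM data.

```python
import numpy as np

def simple_spei(precip, pet, scale=3):
    """Simplified SPEI-like index: accumulate the climatic water balance
    D = P - PET over `scale` months, then standardize.

    Note: the operational SPEI fits a log-logistic distribution to D per
    calendar month; a plain z-score is used here only for illustration.
    """
    d = np.asarray(precip, float) - np.asarray(pet, float)
    # Rolling sum of D over `scale` months
    d_acc = np.convolve(d, np.ones(scale), mode="valid")
    return (d_acc - d_acc.mean()) / d_acc.std(ddof=1)

# Made-up monthly precipitation and PET (mm) for two years
rng = np.random.default_rng(0)
precip = rng.normal(80, 25, 24).clip(min=0)
pet = rng.normal(60, 20, 24).clip(min=0)
print(np.round(simple_spei(precip, pet, scale=3), 2))
```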

  17. Tsunami hazard potential for the equatorial southwestern Pacific atolls of Tokelau from scenario-based simulations

    NASA Astrophysics Data System (ADS)

    Orpin, A. R.; Rickard, G. J.; Gerring, P. K.; Lamarche, G.

    2015-07-01

    Devastating tsunami over the last decade have significantly heightened awareness of the potential consequences and vulnerability to tsunami for low-lying Pacific islands and coastal regions. Our tsunami risk assessment for the atolls of the Tokelau Islands was based on a tsunami source-propagation-inundation model using Gerris Flow Solver, adapted from the companion study by Lamarche et al. (2015) for the islands of Wallis and Futuna. We assess whether there is potential for tsunami flooding on any of the village islets from a series of fourteen earthquake-source experiments that apply a combination of well-established fault parameters to represent plausible "high-risk scenarios" for each of the tsunamigenic sources. Earthquake source location and moment magnitude were related to tsunami wave heights and tsunami flood depths simulated for each of the three atolls of Tokelau. This approach was adopted to yield indicative and instructive results for a community advisory, rather than being fully deterministic. Results from our modelling show that wave fields are channelled by the bathymetry of the Pacific basin in such a way that the swathes of the highest waves sweep immediately northeast of the Tokelau Islands. From our series of limited simulations a great earthquake from the Kuril Trench poses the most significant inundation threat to Tokelau, with maximum modelled-wave heights in excess of 1 m, which may last a few hours and include several wave trains. Other sources can impact specific sectors of the atolls, particularly from regional sources to the south, and northern and eastern distant sources that generate trans-Pacific tsunami. In many cases impacts are dependent on the wave orientation and direct exposure to the oncoming tsunami. This study shows that dry areas remain around the villages in nearly all our "worst-case" tsunami simulations of the Tokelau Islands. Consistent with the oral history of little or no perceived tsunami threat, simulations from the

  18. A Modified Wilson Cycle Scenario Based on Thermo-Mechanical Model

    NASA Astrophysics Data System (ADS)

    Baes, M.; Sobolev, S. V.

    2014-12-01

    The major problem with the classical Wilson Cycle concept is the suggested conversion of a passive continental margin into an active subduction zone. Previous modeling studies assumed either unusually thick felsic continental crust at the margin (over 40 km) or unusually low lithospheric thickness (less than 70 km) to simulate this process. Here we propose a new triggering factor in the subduction initiation process: the mantle suction force. Based on this proposal we suggest a modification of the Wilson Cycle concept. Some time after the opening and extension of an oceanic basin, the continental passive margin moves over the slab remnants of former active subduction zones in the deep mantle. Such slab remnants, or deep slabs of neighboring active subduction zones, produce a suction mantle flow that introduces additional compression at the passive margin. This results in the initiation of a new subduction zone, hence starting the closing phase of the Wilson Cycle. In this scenario, the weakness of the continental crust near the passive margin, inherited from the rifting phase, and the horizontal push force induced by a far-field topographic gradient within the continent facilitate and speed up the subduction initiation process. Our thermo-mechanical modeling shows that, if there is a broad mantle downwelling flow below the margin, after a few tens of millions of years a shear zone may indeed develop along a passive margin that has a typical two-layered, 35 km thick continental crust and a thermal lithosphere thicker than 100 km. Soon after the formation of this shear zone, the oceanic plate descends into the mantle and subduction initiates. Subduction initiation follows over-thrusting of the continental crust and retreat of the future trench. In models without a far-field topographic gradient within the continent, subduction initiation requires a weaker passive margin. Our results also indicate that subduction initiation depends on several parameters such as the magnitude, domain size and location of the suction mantle flow

  19. Analytical and experimental performance evaluation of an integrated Si-photonic balanced coherent receiver in a colorless scenario.

    PubMed

    Morsy-Osman, Mohamed; Chagnon, Mathieu; Xu, Xian; Zhuge, Qunbi; Poulin, Michel; Painchaud, Yves; Pelletier, Martin; Paquet, Carl; Plant, David V

    2014-03-10

    We study analytically and experimentally the performance limits of a Si-photonic (SiP) balanced coherent receiver (CRx) co-packaged with transimpedance amplifiers (TIAs) in a colorless WDM scheme. Firstly, the CRx architecture is depicted and characterization results are presented. Secondly, an analytical expression for the signal-to-noise ratio (SNR) at the CRx output is rigorously developed and various noise sources in the context of colorless reception are outlined. Thirdly, we study experimentally the system-level CRx performance in colorless reception of 16 × 112 Gbps PDM-QPSK WDM channels. Using a 15.5 dBm local oscillator (LO) power, error free transmissions over 4800 and 4160 km at received powers of -3 and -21 dBm per channel, respectively, were achieved in a fully colorless and preamplifierless reception. Next, a set of measurements on one of the center WDM channels is performed where the LO power, received signal power, distance, and number of channels presented to the CRx are swept to evaluate the performance limits of colorless reception. Results reveal that the LO beating with optical noise incoming with the signal is a dominant noise source regardless of received signal power. In the high received signal power regime (~0 dBm/channel), the self-beat noise from out-of-band (OOB) channels is an additional major noise source especially for small LO-to-signal power ratio, short reach and large number of OOB channels. For example, at a received signal power of 0 dBm/channel after 1600 km transmission, the SNR difference between the fully filtered and colorless scenarios, where 1 and 16 channels are passed to the CRx respectively, grows from 0.5 to 3.3 dB as the LO power changes from 12 to 0 dBm. For low received power (~-12 dBm/channel), the effect of OOB channels becomes minor while the receiver shot and thermal noises become more significant. We identify the common mode rejection ratio (CMRR) and sensitivity as the two important CRx specifications that

  20. Sensor-Based Human Activity Recognition in a Multi-user Scenario

    NASA Astrophysics Data System (ADS)

    Wang, Liang; Gu, Tao; Tao, Xianping; Lu, Jian

    Existing work on sensor-based activity recognition focuses mainly on single-user activities. However, in real life, activities are often performed by multiple users and involve interactions between them. In this paper, we propose Coupled Hidden Markov Models (CHMMs) to recognize multi-user activities from sensor readings in a smart home environment. We develop a multimodal sensing platform and present a theoretical framework to recognize both single-user and multi-user activities. We conducted our trace collection in a smart home and evaluated our framework through experimental studies. Our experimental results show that we achieve an average accuracy of 85.46% with CHMMs.
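
    A coupled HMM over two users can be evaluated exactly by folding the two chains into a single HMM whose hidden state is the pair of per-user states. The sketch below shows that reduction and a forward-algorithm likelihood computation on toy parameters; it is an illustration of the idea, not the authors' trained model.

```python
import numpy as np

# Toy setup: 2 users, each with 2 hidden activity states and 3 observation symbols.
# In a coupled HMM each user's next state depends on BOTH users' current states.
n_s, n_obs = 2, 3
rng = np.random.default_rng(1)

def random_stochastic(shape):
    """Random array whose last axis sums to 1 (a conditional distribution)."""
    m = rng.random(shape)
    return m / m.sum(axis=-1, keepdims=True)

trans_u1 = random_stochastic((n_s, n_s, n_s))   # P(s1' | s1, s2)
trans_u2 = random_stochastic((n_s, n_s, n_s))   # P(s2' | s1, s2)
emit = [random_stochastic((n_s, n_obs)) for _ in range(2)]  # per-user P(o | s)
init = [random_stochastic((n_s,)) for _ in range(2)]        # per-user initial P(s)

# Fold the coupled chains into a single HMM over the joint state (s1, s2)
joint = [(i, j) for i in range(n_s) for j in range(n_s)]
A = np.array([[trans_u1[i, j, k] * trans_u2[i, j, l] for (k, l) in joint]
              for (i, j) in joint])                           # joint transition matrix
pi = np.array([init[0][i] * init[1][j] for (i, j) in joint])  # joint initial distribution

def forward_loglik(obs_u1, obs_u2):
    """Scaled forward algorithm: log-likelihood of two synchronized sequences."""
    def b(t):  # joint emission probability at time t
        return np.array([emit[0][i, obs_u1[t]] * emit[1][j, obs_u2[t]] for (i, j) in joint])
    alpha = pi * b(0)
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, len(obs_u1)):
        alpha = (alpha @ A) * b(t)
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

print(forward_loglik([0, 1, 2, 1], [2, 0, 0, 1]))
```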

  1. Stochastic Multi-Commodity Facility Location Based on a New Scenario Generation Technique

    NASA Astrophysics Data System (ADS)

    Mahootchi, M.; Fattahi, M.; Khakbazan, E.

    2011-11-01

    This paper extends two models for the stochastic multi-commodity facility location problem. The problem is formulated as a two-stage stochastic program. As the main contribution of this study, a new algorithm is applied to efficiently generate scenarios for uncertain, correlated customer demands. This algorithm uses Latin Hypercube Sampling (LHS) and a scenario reduction approach. The relation between customer satisfaction level and cost is considered in model I. A risk measure based on Conditional Value-at-Risk (CVaR) is embedded in optimization model II. The network structure contains three facility layers: plants, distribution centers, and retailers. The first-stage decisions are the number, locations, and capacities of the distribution centers. The second-stage decisions are the production amounts and the transportation volumes between plants and customers.
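
    One common way to generate correlated demand scenarios with Latin Hypercube Sampling is to draw stratified normal scores, impose the target correlation with a Cholesky factor, and map the result to the demand marginals. The sketch below, using scipy and placeholder means, standard deviations and a correlation matrix (not the paper's data or its exact algorithm), illustrates that combination.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

def lhs_correlated_demands(n_scenarios, means, stds, corr):
    """Generate correlated demand scenarios via Latin Hypercube Sampling.

    1. Stratified uniforms per dimension (one sample in each of n equal bins).
    2. Transform to standard normal scores.
    3. Impose the target correlation with a Cholesky factor.
    4. Scale/shift to the demand marginals (assumed normal here).
    """
    d = len(means)
    # Step 1: stratified uniforms, independently permuted per dimension
    u = (rng.permuted(np.tile(np.arange(n_scenarios), (d, 1)), axis=1).T
         + rng.random((n_scenarios, d))) / n_scenarios
    # Steps 2-3: normal scores with imposed correlation
    z = norm.ppf(u) @ np.linalg.cholesky(corr).T
    # Step 4: map to the demand marginals
    return np.asarray(means) + z * np.asarray(stds)

# Placeholder demand statistics for 3 commodities
means = [100.0, 250.0, 60.0]
stds = [15.0, 40.0, 10.0]
corr = np.array([[1.0, 0.6, 0.2],
                 [0.6, 1.0, 0.4],
                 [0.2, 0.4, 1.0]])
scenarios = lhs_correlated_demands(1000, means, stds, corr)
print(np.round(np.corrcoef(scenarios, rowvar=False), 2))
```

    An Iman-Conover-style rank reordering would preserve the LHS marginals exactly; the Cholesky shortcut above is used only for brevity, and the paper's scenario reduction step is not shown.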

  2. Application of a scenario-based modeling system to evaluate the air quality impacts of future growth

    NASA Astrophysics Data System (ADS)

    Kahyaoğlu-Koračin, Jülide; Bassett, Scott D.; Mouat, David A.; Gertler, Alan W.

    The structure and design of future urban development can have significant adverse effects on air pollutant emissions as well as on other environmental factors. When considering the future impact of growth on mobile source emissions, we generally model the increase in vehicle kilometers traveled (VKT) as a function of population growth. However, diverse and poorly planned urban development (i.e., urban sprawl) can force higher rates of motor vehicle use and, in turn, higher levels of pollutant emissions than alternative land-use scenarios. The objective of this study is to develop and implement an air quality assessment tool that takes into account the influence of alternative growth and development scenarios on air quality. The use of scenario-based techniques in land use planning has been around since the late 1940s and has been tested in many different applications to aid decision-making. In this study, we introduce the development of an advanced interactive scenario-based land use and atmospheric chemistry modeling system coupled with a GIS (Geographical Information System) framework. The modeling system is designed to be modular and includes land use/land cover information, transportation, meteorological, emissions, and photochemical modeling components. The methods and modularity of the developed system allow its application to a broad range of areas and applications. To investigate the impact of possible land use change and urbanization, we evaluated a set of alternative future patterns of land use developed for a study area in Southwest California. Four land use variants and two population variants (increases of 500k and 1M) were considered. Overall, a Regional Low-Density Future was seen to have the highest pollutant emissions, the largest increase in VKT, and the greatest impact on air quality. On the other hand, a Three-Centers Future appeared to be the most beneficial alternative future land-use scenario in terms of air quality. For all cases, the increase in population was

  3. Tsunami hazard potential for the equatorial southwestern Pacific atolls of Tokelau from scenario-based simulations

    NASA Astrophysics Data System (ADS)

    Orpin, Alan R.; Rickard, Graham J.; Gerring, Peter K.; Lamarche, Geoffroy

    2016-05-01

    Devastating tsunami over the last decade have significantly heightened awareness of the potential consequences and vulnerability of low-lying Pacific islands and coastal regions. Our appraisal of the potential tsunami hazard for the atolls of the Tokelau Islands is based on a tsunami source-propagation-inundation model using Gerris Flow Solver, adapted from the companion study by Lamarche et al. (2015) for the islands of Wallis and Futuna. We assess whether there is potential for tsunami flooding on any of the village islets from a selection of 14 earthquake-source experiments. These earthquake sources are primarily based on the largest Pacific earthquakes of Mw ≥ 8.1 since 1950 and other large credible sources of tsunami that may impact Tokelau. Earthquake-source location and moment magnitude are related to tsunami-wave amplitudes and tsunami flood depths simulated for each of the three atolls of Tokelau. This approach yields instructive results for a community advisory but is not intended to be fully deterministic. Rather, the underlying aim is to identify credible sources that present the greatest potential to trigger an emergency response. Results from our modelling show that wave fields are channelled by the bathymetry of the Pacific basin in such a way that the swathes of the highest waves sweep immediately northeast of the Tokelau Islands. Our limited simulations suggest that trans-Pacific tsunami from distant earthquake sources to the north of Tokelau pose the most significant inundation threat. In particular, our assumed worst-case scenario for the Kuril Trench generated maximum modelled-wave amplitudes in excess of 1 m, which may last a few hours and include several wave trains. Other sources can impact specific sectors of the atolls, particularly distant earthquakes from Chile and Peru, and regional earthquake sources to the south. Flooding is dependent on the wave orientation and direct alignment to the incoming tsunami. Our "worst-case" tsunami

  4. Simulation-based scenario-specific channel modeling for WBAN cooperative transmission schemes.

    PubMed

    Naganawa, Jun-ichi; Wangchuk, Karma; Kim, Minseok; Aoyagi, Takahiro; Takada, Jun-ichi

    2015-03-01

    Wireless body area networks (WBANs) are an emerging technology for realizing efficient healthcare and remote medicine for the aging society of the future. In order to improve the reliability of WBAN systems and support their various applications, channel modeling and performance evaluation are important. This paper proposes a simulation-based channel modeling approach for evaluating the performance of WBAN cooperative transmission schemes. The time series of path losses among seven on-body nodes are generated by the finite-difference time-domain method for seven body motions. The statistical parameters of the path loss for all the motions are also obtained. The generated path loss is then applied to the evaluation of a two-hop decode-and-forward relaying scheme, yielding an improvement in transmit power. From the evaluation across body motions, useful insights are obtained, such as which relay links are more robust than others. Finally, the proposed approach is validated through comparison with a measurement-based approach. PMID:24876134
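
    An illustrative sketch (hypothetical path-loss values) of how a two-hop decode-and-forward link can be compared with the direct on-body link in terms of required transmit power, the metric the evaluation above refers to; the sensitivity target and the traces are assumptions.

      import numpy as np

      def required_tx_power_dbm(path_loss_db, rx_sensitivity_dbm=-85.0):
          """Per-sample transmit power needed so the received power meets the sensitivity."""
          return path_loss_db + rx_sensitivity_dbm

      def two_hop_df_power(pl_src_relay_db, pl_relay_dst_db, rx_sensitivity_dbm=-85.0):
          """Decode-and-forward is limited by the weaker hop, i.e. the larger path loss."""
          bottleneck = np.maximum(pl_src_relay_db, pl_relay_dst_db)
          return required_tx_power_dbm(bottleneck, rx_sensitivity_dbm)

      # hypothetical path-loss traces (dB) over one body-motion cycle
      pl_direct = np.array([72.0, 80.0, 95.0, 88.0])
      pl_hop1 = np.array([60.0, 63.0, 66.0, 64.0])
      pl_hop2 = np.array([58.0, 61.0, 70.0, 65.0])
      gain_db = required_tx_power_dbm(pl_direct) - two_hop_df_power(pl_hop1, pl_hop2)
      print(gain_db)    # positive values mean the relay saves transmit power in that sample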

  5. Spartans: Single-Sample Periocular-Based Alignment-Robust Recognition Technique Applied to Non-Frontal Scenarios.

    PubMed

    Juefei-Xu, Felix; Luu, Khoa; Savvides, Marios

    2015-12-01

    In this paper, we investigate a single-sample periocular-based alignment-robust face recognition technique that is pose-tolerant under unconstrained face matching scenarios. Our Spartans framework starts by utilizing a single sample per subject class and generates new face images under a wide range of 3D rotations using the 3D generic elastic model, which is both accurate and computationally economical. Then, we focus on the periocular region, where the most stable and discriminant features on human faces are retained, and marginalize out the regions beyond it, since they are more susceptible to expression variations and occlusions. A novel facial descriptor, high-dimensional Walsh local binary patterns, is uniformly sampled on facial images with robustness toward alignment. During the learning stage, subject-dependent advanced correlation filters are learned for pose-tolerant non-linear subspace modeling in kernel feature space, followed by a coupled max-pooling mechanism which further improves performance. Given any unconstrained unseen face image, Spartans can produce a highly discriminative matching score, thus achieving a high verification rate. We have evaluated our method on the challenging Labeled Faces in the Wild database and solidly outperformed the state-of-the-art algorithms under four evaluation protocols with a high accuracy of 89.69%, a top score among image-restricted and unsupervised protocols. The advantage of Spartans is also demonstrated on the Face Recognition Grand Challenge and Multi-PIE databases. In addition, our learning method based on advanced correlation filters is much more effective, in terms of learning subject-dependent pose-tolerant subspaces, compared with many well-established subspace methods in both linear and non-linear cases. PMID:26285149

  6. A Social-Constructivist Adaptation of Case-Based Reasoning: Integrating Goal-Based Scenarios with Computer-Supported Collaborative Learning.

    ERIC Educational Resources Information Center

    Hung, David; Chen, Der-Thanq; Tan, Seng Chee

    2003-01-01

    Proposes a social constructivist adaptation of case-based reasoning (CBR) by incorporating computer-supported collaborative learning tools into the thinking and reasoning process. Explains goal-based scenarios (GBS) as translations of CBR into simulated learning environments and discusses the incorporation of facilitation cues and the inclusion of…

  7. Simulation of LRT Travel Time Reduction Scenarios Based on Passenger Behavior Modeling

    NASA Astrophysics Data System (ADS)

    Hirasawa, Takayuki; Matsuoka, Shigeki; Suda, Yoshihiro

    A physical model of dwell time at transit stops for LRT is developed from observed passenger behavior during commercial operation at the Kumamoto municipal transport system and from time-component measurement experiments at the depot for parameter identification. The developed model is able to express in detail the waiting queues of sequentially arriving and leaving passengers at the boarding and alighting doors for a variety of LRV usages. The model enables precise comparison of low-floor vehicle introduction and door-usage improvement scenarios in connection with fare transaction methods.
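
    A minimal sketch in the spirit of the dwell-time model described above: queued boarding and alighting at each door, with the stop time governed by the slowest door. The per-passenger service times are hypothetical, not the identified parameters.

      def dwell_time_s(boarding, alighting, t_board=2.5, t_alight=1.8, overhead=5.0):
          """boarding/alighting: passengers per door; returns stop dwell time in seconds."""
          per_door = [a * t_alight + b * t_board for b, a in zip(boarding, alighting)]
          return overhead + max(per_door)

      # the same 12 boarders / 8 alighters under two door-usage scenarios
      print(dwell_time_s(boarding=[12, 0], alighting=[0, 8]))    # segregated doors
      print(dwell_time_s(boarding=[6, 6], alighting=[4, 4]))     # balanced door usage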

  8. DRME: Count-based differential RNA methylation analysis at small sample size scenario.

    PubMed

    Liu, Lian; Zhang, Shao-Wu; Gao, Fan; Zhang, Yixin; Huang, Yufei; Chen, Runsheng; Meng, Jia

    2016-04-15

    Differential methylation, which concerns the difference in the degree of epigenetic regulation via methylation between two conditions, has been formulated with a beta or beta-binomial distribution to address the within-group biological variability in sequencing data. However, a beta or beta-binomial model is usually difficult to infer in small-sample-size scenarios with discrete read counts in sequencing data. On the other hand, as an emerging research field, RNA methylation has drawn more and more attention recently, and the differential analysis of RNA methylation is significantly different from that of DNA methylation due to the impact of transcriptional regulation. We developed DRME to better address the differential RNA methylation problem. The proposed model can effectively describe within-group biological variability in small-sample-size scenarios and handles the impact of transcriptional regulation on RNA methylation. We tested the newly developed DRME algorithm on simulated data and 4 MeRIP-Seq case-control studies and compared it with Fisher's exact test. It is in principle widely applicable to several other RNA-related data types as well, including RNA bisulfite sequencing and PAR-CLIP. The code together with a MeRIP-Seq dataset is available online (https://github.com/lzcyzm/DRME) for evaluation and reproduction of the figures shown in this article. PMID:26851340
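
    For context, a sketch of the Fisher's exact test baseline the study compares against (not DRME itself): a pooled 2x2 test of methylated (IP) versus input read counts between two conditions, which ignores within-group variability. Counts are hypothetical.

      from scipy.stats import fisher_exact

      # rows: condition A / condition B; columns: methylated (IP) reads / input reads
      table = [[45, 120],
               [18, 140]]
      odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
      print(odds_ratio, p_value)
      # DRME instead models within-group biological variability, which this pooled
      # 2x2 test cannot capture at small sample sizes.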

  9. Increasing Plant Based Foods or Dairy Foods Differentially Affects Nutrient Intakes: Dietary Scenarios Using NHANES 2007-2010.

    PubMed

    Cifelli, Christopher J; Houchins, Jenny A; Demmer, Elieke; Fulgoni, Victor L

    2016-01-01

    Diets rich in plant foods and lower in animal-based products have garnered increased attention among researchers, dietitians and health professionals in recent years for their potential not only to improve health but also to lessen environmental impact. However, the potential effects of increasing plant-based foods at the expense of animal-based foods on macro- and micronutrient adequacy in the U.S. diet are unknown. In addition, dairy foods are consistently underconsumed; thus, the impact of increased dairy on nutrient adequacy is important to measure. Accordingly, the objective of this study was to use national survey data to model three different dietary scenarios to assess the effects of increasing plant-based foods or dairy foods on macronutrient intake and nutrient adequacy. Data from the National Health and Nutrition Examination Survey (NHANES) 2007-2010 for persons two years and older (n = 17,387) were used in all the analyses. Comparisons were made of usual intake of macronutrients and shortfall nutrients across three dietary scenarios that increased intakes by 100%: (i) plant-based foods; (ii) protein-rich plant-based foods (i.e., legumes, nuts, seeds, soy); and (iii) milk, cheese and yogurt. Scenarios (i) and (ii) had commensurate reductions in animal product intake. In both children (2-18 years) and adults (≥19 years), the percent not meeting the Estimated Average Requirement (EAR) decreased for vitamin C, magnesium, vitamin E, folate and iron when plant-based foods were increased. However, the percent not meeting the EAR increased for calcium, protein, vitamin A, and vitamin D in this scenario. Doubling protein-rich plant-based foods had no effect on nutrient intake because they were consumed in very low quantities in the baseline diet. The dairy model reduced the percent not meeting the EAR for calcium, vitamin A, vitamin D, magnesium, and protein, while sodium and saturated fat levels increased. Our modeling shows that increasing plant-based

  10. Increasing Plant Based Foods or Dairy Foods Differentially Affects Nutrient Intakes: Dietary Scenarios Using NHANES 2007–2010

    PubMed Central

    Cifelli, Christopher J.; Houchins, Jenny A.; Demmer, Elieke; Fulgoni, Victor L.

    2016-01-01

    Diets rich in plant foods and lower in animal-based products have garnered increased attention among researchers, dietitians and health professionals in recent years for their potential not only to improve health but also to lessen environmental impact. However, the potential effects of increasing plant-based foods at the expense of animal-based foods on macro- and micronutrient adequacy in the U.S. diet are unknown. In addition, dairy foods are consistently underconsumed; thus, the impact of increased dairy on nutrient adequacy is important to measure. Accordingly, the objective of this study was to use national survey data to model three different dietary scenarios to assess the effects of increasing plant-based foods or dairy foods on macronutrient intake and nutrient adequacy. Data from the National Health and Nutrition Examination Survey (NHANES) 2007–2010 for persons two years and older (n = 17,387) were used in all the analyses. Comparisons were made of usual intake of macronutrients and shortfall nutrients across three dietary scenarios that increased intakes by 100%: (i) plant-based foods; (ii) protein-rich plant-based foods (i.e., legumes, nuts, seeds, soy); and (iii) milk, cheese and yogurt. Scenarios (i) and (ii) had commensurate reductions in animal product intake. In both children (2–18 years) and adults (≥19 years), the percent not meeting the Estimated Average Requirement (EAR) decreased for vitamin C, magnesium, vitamin E, folate and iron when plant-based foods were increased. However, the percent not meeting the EAR increased for calcium, protein, vitamin A, and vitamin D in this scenario. Doubling protein-rich plant-based foods had no effect on nutrient intake because they were consumed in very low quantities in the baseline diet. The dairy model reduced the percent not meeting the EAR for calcium, vitamin A, vitamin D, magnesium, and protein, while sodium and saturated fat levels increased. Our modeling shows that increasing plant-based
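
    A toy sketch of the scenario-modeling idea above (synthetic data, not NHANES): estimate the percent of a population below the Estimated Average Requirement (EAR) at baseline and after a scenario that doubles intake from one food group. The intake distributions and the EAR value are illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000
      dairy_calcium = rng.gamma(shape=2.0, scale=150.0, size=n)    # mg/day from dairy (synthetic)
      other_calcium = rng.gamma(shape=3.0, scale=120.0, size=n)    # mg/day from other foods (synthetic)
      EAR_CALCIUM = 800.0                                          # mg/day, illustrative adult value

      def pct_below_ear(intake):
          return 100.0 * np.mean(intake < EAR_CALCIUM)

      baseline = dairy_calcium + other_calcium
      dairy_doubled = 2.0 * dairy_calcium + other_calcium          # "increase dairy by 100%" scenario
      print(pct_below_ear(baseline), pct_below_ear(dairy_doubled))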

  11. Managing Obstetric Emergencies and Trauma (MOET) structured skills training in Armenia, utilising models and reality based scenarios

    PubMed Central

    Johanson, Richard B; Menon, Vijay; Burns, Ethel; Kargramanya, Eduard; Osipov, Vardges; Israelyan, Musheg; Sargsyan, Karine; Dobson, Sarah; Jones, Peter

    2002-01-01

    Background Mortality rates in Western Europe have fallen significantly over the last 50 years. Maternal mortality now averages 10 maternal deaths per 100,000 live births but in some of the Newly Independent States of the former Soviet Union, the ratio is nearly 4 times higher. The availability of skilled attendants to prevent, detect and manage major obstetric complications may be the single most important factor in preventing maternal deaths. A modern, multidisciplinary, scenario and model based training programme has been established in the UK (Managing Obstetric Emergencies and Trauma (MOET)) and allows specialist obstetricians to learn or revise the undertaking of procedures using models, and to have their skills tested in scenarios. Methods Given the success of the MOET course in the UK, the organisers were keen to evaluate it in another setting (Armenia). Pre-course knowledge and practice questionnaires were administered. In an exploratory analysis, post-course results were compared to pre-course answers obtained by the same interviewer. Results All candidates showed an improvement in post-course scores. The range was far narrower afterwards (167–188) than before (85–129.5). In the individual score analysis only two scenarios showed a non-significant change (cord prolapse and breech delivery). Conclusion This paper demonstrates the reliability of the model based scenarios, with a highly significant improvement in obstetric emergency management. However, clinical audit will be required to measure the full impact of training by longer term follow up. Audit of delays, specific obstetric complications, referrals and near misses may all be amenable to review. PMID:12020355

  12. Analytic Performance Prediction of Track-to-Track Association with Biased Data in Multi-Sensor Multi-Target Tracking Scenarios

    PubMed Central

    Tian, Wei; Wang, Yue; Shan, Xiuming; Yang, Jian

    2013-01-01

    An analytic method for predicting the performance of track-to-track association (TTTA) with biased data in multi-sensor multi-target tracking scenarios is proposed in this paper. The proposed method extends the existing results of the bias-free situation by accounting for the impact of sensor biases. Since little insight into the intrinsic relationship between scenario parameters and the performance of TTTA can be obtained by numerical simulations, the proposed analytic approach is a potential substitute for the costly Monte Carlo simulation method. Analytic expressions are developed for the global nearest neighbor (GNN) association algorithm in terms of correct association probability. The translational biases of sensors are incorporated in the expressions, which provide good insight into how the TTTA performance is affected by sensor biases, as well as other scenario parameters, including the target spatial density, the extraneous track density and the average association uncertainty error. To show the validity of the analytic predictions, we compare them with the simulation results, and the analytic predictions agree reasonably well with the simulations over a large range of normally anticipated scenario parameters. PMID:24036583

  13. Analytic performance prediction of track-to-track association with biased data in multi-sensor multi-target tracking scenarios.

    PubMed

    Tian, Wei; Wang, Yue; Shan, Xiuming; Yang, Jian

    2013-01-01

    An analytic method for predicting the performance of track-to-track association (TTTA) with biased data in multi-sensor multi-target tracking scenarios is proposed in this paper. The proposed method extends the existing results of the bias-free situation by accounting for the impact of sensor biases. Since little insight into the intrinsic relationship between scenario parameters and the performance of TTTA can be obtained by numerical simulations, the proposed analytic approach is a potential substitute for the costly Monte Carlo simulation method. Analytic expressions are developed for the global nearest neighbor (GNN) association algorithm in terms of correct association probability. The translational biases of sensors are incorporated in the expressions, which provide good insight into how the TTTA performance is affected by sensor biases, as well as other scenario parameters, including the target spatial density, the extraneous track density and the average association uncertainty error. To show the validity of the analytic predictions, we compare them with the simulation results, and the analytic predictions agree reasonably well with the simulations over a large range of normally anticipated scenario parameters. PMID:24036583
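
    A Monte Carlo companion to the analytic prediction described above (a sketch with hypothetical scenario parameters): estimate the correct-association probability of global nearest neighbor (GNN) track-to-track association when one sensor carries a translational bias.

      import numpy as np
      from scipy.optimize import linear_sum_assignment
      from scipy.spatial.distance import cdist

      def gnn_correct_assoc_prob(n_targets=20, side=100.0, sigma=5.0,
                                 bias=(8.0, 3.0), trials=500, seed=0):
          rng = np.random.default_rng(seed)
          bias = np.asarray(bias)
          correct = 0.0
          for _ in range(trials):
              targets = rng.uniform(0.0, side, size=(n_targets, 2))
              tracks_a = targets + rng.normal(0.0, sigma, size=targets.shape)
              tracks_b = targets + rng.normal(0.0, sigma, size=targets.shape) + bias
              cost = cdist(tracks_a, tracks_b)             # Euclidean track-to-track distances
              _, cols = linear_sum_assignment(cost)        # GNN = minimum total distance
              correct += np.mean(cols == np.arange(n_targets))
          return correct / trials

      print(gnn_correct_assoc_prob())          # decreases as the bias or track density grows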

  14. Alternative Geothermal Power Production Scenarios

    DOE Data Explorer

    Sullivan, John

    2014-03-14

    The information given in this file pertains to Argonne LCAs of the plant cycle stage for a set of ten new geothermal scenario pairs, each comprising a reference case and an improved case. These analyses were conducted to compare environmental performances among the scenarios and cases. The types of plants evaluated are hydrothermal binary and flash plants and Enhanced Geothermal Systems (EGS) binary and flash plants. Each scenario pair was developed by the LCOE group using GETEM as a way to identify plant operational and resource combinations that could reduce geothermal power plant LCOE values. Based on the specified plant and well field characteristics (plant type, capacity, capacity factor and lifetime, and well numbers and depths) for each case of each pair, Argonne generated a corresponding set of material-to-power ratios (MPRs) and greenhouse gas and fossil energy ratios.

  15. Future impact of traffic emissions on atmospheric ozone and OH based on two scenarios

    NASA Astrophysics Data System (ADS)

    Hodnebrog, Ø.; Berntsen, T. K.; Dessens, O.; Gauss, M.; Grewe, V.; Isaksen, I. S. A.; Koffi, B.; Myhre, G.; Olivié, D.; Prather, M. J.; Stordal, F.; Szopa, S.; Tang, Q.; van Velthoven, P.; Williams, J. E.

    2012-08-01

    The future impact of traffic emissions on atmospheric ozone and OH has been investigated separately for the three sectors AIRcraft, maritime SHIPping and ROAD traffic. To reduce uncertainties we present results from an ensemble of six different atmospheric chemistry models, each simulating the atmospheric chemical composition in a possible high emission scenario (A1B), and with emissions from each transport sector reduced by 5% to estimate sensitivities. Our results are compared with optimistic future emission scenarios (B1 and B1 ACARE), presented in a companion paper, and with the recent past (year 2000). Present-day activity indicates that anthropogenic emissions so far evolve closer to A1B than the B1 scenario. As a response to expected changes in emissions, AIR and SHIP will have increased impacts on atmospheric O3 and OH in the future while the impact of ROAD traffic will decrease substantially as a result of technological improvements. In 2050, maximum aircraft-induced O3 occurs near 80° N in the UTLS region and could reach 9 ppbv in the zonal mean during summer. Emissions from ship traffic have their largest O3 impact in the maritime boundary layer with a maximum of 6 ppbv over the North Atlantic Ocean during summer in 2050. The O3 impact of road traffic emissions in the lower troposphere peaks at 3 ppbv over the Arabian Peninsula, much lower than the impact in 2000. Radiative Forcing (RF) calculations show that the net effect of AIR, SHIP and ROAD combined will change from a marginal cooling of -0.38 ± 13 mW m-2 in 2000 to a relatively strong cooling of -32 ± 8.9 (B1) or -31 ± 20 mW m-2 (A1B) in 2050, when taking into account RF due to changes in O3, CH4 and CH4-induced O3. This is caused both by the enhanced negative net RF from SHIP, which will change from -20 ± 5.4 mW m-2 in 2000 to -31 ± 4.8 (B1) or -40 ± 11 mW m-2 (A1B) in 2050, and from reduced O3 warming from ROAD, which is likely to turn from a positive net RF of 13 ± 7.9 mW m-2 in 2000 to

  16. Future impact of traffic emissions on atmospheric ozone and OH based on two scenarios

    NASA Astrophysics Data System (ADS)

    Hodnebrog, Ø.; Berntsen, T. K.; Dessens, O.; Gauss, M.; Grewe, V.; Isaksen, I. S. A.; Koffi, B.; Myhre, G.; Olivié, D.; Prather, M. J.; Stordal, F.; Szopa, S.; Tang, Q.; van Velthoven, P.; Williams, J. E.

    2012-12-01

    The future impact of traffic emissions on atmospheric ozone and OH has been investigated separately for the three sectors AIRcraft, maritime SHIPping and ROAD traffic. To reduce uncertainties we present results from an ensemble of six different atmospheric chemistry models, each simulating the atmospheric chemical composition in a possible high emission scenario (A1B), and with emissions from each transport sector reduced by 5% to estimate sensitivities. Our results are compared with optimistic future emission scenarios (B1 and B1 ACARE), presented in a companion paper, and with the recent past (year 2000). Present-day activity indicates that anthropogenic emissions so far evolve closer to A1B than the B1 scenario. As a response to expected changes in emissions, AIR and SHIP will have increased impacts on atmospheric O3 and OH in the future while the impact of ROAD traffic will decrease substantially as a result of technological improvements. In 2050, maximum aircraft-induced O3 occurs near 80° N in the UTLS region and could reach 9 ppbv in the zonal mean during summer. Emissions from ship traffic have their largest O3 impact in the maritime boundary layer with a maximum of 6 ppbv over the North Atlantic Ocean during summer in 2050. The O3 impact of road traffic emissions in the lower troposphere peaks at 3 ppbv over the Arabian Peninsula, much lower than the impact in 2000. Radiative forcing (RF) calculations show that the net effect of AIR, SHIP and ROAD combined will change from a marginal cooling of -0.44 ± 13 mW m-2 in 2000 to a relatively strong cooling of -32 ± 9.3 (B1) or -32 ± 18 mW m-2 (A1B) in 2050, when taking into account RF due to changes in O3, CH4 and CH4-induced O3. This is caused both by the enhanced negative net RF from SHIP, which will change from -19 ± 5.3 mW m-2 in 2000 to -31 ± 4.8 (B1) or -40 ± 9 mW m-2 (A1B) in 2050, and from reduced O3 warming from ROAD, which is likely to turn from a positive net RF of 12 ± 8.5 mW m-2 in 2000 to a

  17. Science and technology based earthquake risk reduction strategies: The Indian scenario

    NASA Astrophysics Data System (ADS)

    Bansal, Brijesh; Verma, Mithila

    2013-08-01

    Science and Technology (S & T) interventions are considered to be very important in any effort related to earthquake risk reduction. Their three main components are: earthquake forecast, assessment of earthquake hazard, and education and awareness. In India, although efforts towards earthquake forecasting were initiated about two decades ago, systematic studies started only recently with the launch of a National Program on Earthquake Precursors. The quantification of seismic hazard, which is imperative in the present scenario, started in India with the establishment of the first seismic observatory in 1898, and since then substantial progress has been made in this direction. A dedicated education and awareness program was initiated about 10 years ago to provide earthquake education and create awareness among students and society at large. The paper highlights significant S & T efforts made in India towards reducing the risk from future large earthquakes.

  18. A methanotroph-based biorefinery: Potential scenarios for generating multiple products from a single fermentation.

    PubMed

    Strong, P J; Kalyuzhnaya, M; Silverman, J; Clarke, W P

    2016-09-01

    Methane, a carbon source for methanotrophic bacteria, is the principal component of natural gas and is produced during anaerobic digestion of organic matter (biogas). Methanotrophs are a viable source of single cell protein (feed supplement) and can produce various products, since they accumulate osmolytes (e.g. ectoine, sucrose), phospholipids (potential biofuels) and biopolymers (polyhydroxybutyrate, glycogen), among others. Other cell components, such as surface layers, metal chelating proteins (methanobactin), enzymes (methane monooxygenase) or heterologous proteins hold promise as future products. Here, scenarios are presented where ectoine, polyhydroxybutyrate or protein G are synthesised as the primary product, in conjunction with a variety of ancillary products that could enhance process viability. Single or dual-stage processes and volumetric requirements for bioreactors are discussed, in terms of an annual biomass output of 1000 tonnes year-1. Product yields are discussed in relation to methane and oxygen consumption and organic waste generation. PMID:27146469

  19. The Nankai Trough earthquake tsunamis in Korea: numerical studies of the 1707 Hoei earthquake and physics-based scenarios

    NASA Astrophysics Data System (ADS)

    Kim, SatByul; Saito, Tatsuhiko; Fukuyama, Eiichi; Kang, Tae-Seob

    2016-04-01

    Historical documents in Korea and China report abnormal waves in the sea and rivers close to the date of the 1707 Hoei earthquake, which occurred in the Nankai Trough, off southwestern Japan. This indicates that the tsunami caused by the Hoei earthquake might have reached Korea and China, which suggests a potential hazard in Korea from large earthquakes in the Nankai Trough. We conducted tsunami simulations to study the details of tsunamis in Korea caused by large earthquakes. Our results showed that the Hoei earthquake (Mw 8.8) tsunami reached the Korean Peninsula about 200 min after the earthquake occurred. The maximum tsunami height was ~0.5 m along the Korean coast. The model of the Hoei earthquake predicted a long-lasting tsunami whose highest peak arrived 600 min after the first arrival near the coastline of Jeju Island. In addition, we conducted tsunami simulations using physics-based scenarios of anticipated earthquakes in the Nankai subduction zone. The maximum tsunami height in the scenarios (Mw 8.5-8.6) was ~0.4 m along the Korean coast. As a simple evaluation of larger possible tsunamis, we increased the amount of stress released by the earthquake by factors of two and three, resulting in scenarios for Mw 8.8 and 8.9 earthquakes, respectively. The tsunami height increased by 0.1-0.4 m compared to that estimated for the Hoei earthquake.
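
    A back-of-the-envelope check of the magnitude scaling mentioned above, assuming the released moment scales with the stress drop for fixed fault geometry and using the standard moment-magnitude relation Mw = (2/3)(log10 M0 - 9.1) with M0 in N m.

      import numpy as np

      def mw_from_moment(m0):
          return (2.0 / 3.0) * (np.log10(m0) - 9.1)    # M0 in N*m

      def moment_from_mw(mw):
          return 10.0 ** (1.5 * mw + 9.1)

      m0 = moment_from_mw(8.6)                         # physics-based scenario magnitude
      for factor in (2.0, 3.0):
          print(factor, round(mw_from_moment(factor * m0), 2))   # ~8.8 and ~8.9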

  20. Scenario-based impact analysis of disaster risks exploring potential implications for disaster prevention strategies in spatial and urban planning

    NASA Astrophysics Data System (ADS)

    Lüke, J.; Wenzel, F.; Vogt, J.

    2009-04-01

    The project deals with scenario techniques to assess, estimate, and communicate the potential consequences of natural disasters on risk governance arrangements. It aims to create a methodology which allows the development of disaster scenarios for different types of natural hazards. This enables relevant stakeholders to derive planning strategies to prevent harmful damage to the community through adequate adaptation. Some main questions in the project are: - How do changing boundary conditions in economic, social and ecological systems influence the significance and the benefit of existing risk analyses as a basis for spatial planning decisions? - Which factors represent or influence the forecast uncertainty of existing extrapolations within the scope of risk analysis? Which of these uncertainties have spatial relevance? (Which go beyond sectoral considerations of risk? Which refer to reservations concerning spatial development? Which influence a community as a whole?) - How can we quantify these uncertainties? Do they change according to altered hazards or vulnerabilities? - How does the explored risk vary, once quantified uncertainties are integrated into current extrapolations? What are the implications for spatial planning activities? - Which software application is suitable to visualize and communicate the scenario methodology? The work is mainly based on existing results of previous hazard analyses and vulnerability studies which have been carried out by the Center of Disaster Management and Risk Reduction Technology (CEDIM) for the federal state of Baden-Württemberg. Existing data concern the risk of damage to residential buildings, industrial and traffic infrastructure, and social and economic vulnerability. We will link this data with various assumptions of potentially changing economic, social and built environments and visualize those using Geographical Information Systems (GIS). Although the scenario methodology is conceived as a multi-hazard oriented and

  1. Scenario-based assessment of buildings damage and population exposure due to tsunamis for the town of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Pagnoni, G.; Armigliato, A.; Tinti, S.

    2015-08-01

    Alexandria is the second largest city in Egypt by population, a key economic area in northern Africa, and an important tourist destination. Historical catalogues indicate that it was severely affected by a number of tsunami events. In this work we assess the tsunami hazard by running numerical simulations of tsunami impact in Alexandria through the Worst-case Credible Tsunami Scenario Analysis (WCTSA). We identify three main seismic sources: the Western Hellenic Arc (WHA - reference event AD 365, Mw = 8.5), the Eastern Hellenic Arc (EHA - reference event 1303, Mw = 8.0) and the Cyprus Arc (CA - hypothetical scenario earthquake with Mw = 8.0), inferred from the tectonic setting and from historical tsunami catalogues. All numerical simulations are carried out by means of the code UBO-TSUFD, developed and maintained by the Tsunami Research Team of the University of Bologna. Relevant tsunami metrics are computed for each scenario and then used to build aggregated fields such as the maximum flood depth and the maximum inundation area. We find that the case producing the most significant flooding in Alexandria is the EHA scenario, with wave heights up to 4 m. The aggregated fields are used for a building vulnerability assessment according to a methodology developed in the frame of the EU-FP6 project SCHEMA and further refined in this study, based on the adoption of a suitable building damage matrix and on water inundation depth. It is found that in the districts of El Dekhila and Al Amriyah, to the south-west of the port of Dekhila, over 12 000 buildings could be affected and hundreds of them could suffer consequences ranging from significant damage to total collapse. It is also found that in the same districts the tsunami inundation covers an area of about 15 km2, resulting in more than 150 000 residents being exposed.

  2. Spatial, temporal and frequency based climate change assessment in Columbia River Basin using multi downscaled-scenarios

    NASA Astrophysics Data System (ADS)

    Rana, Arun; Moradkhani, Hamid

    2016-07-01

    Uncertainties in climate modelling are well documented in the literature. Global Climate Model (GCM) outputs are often downscaled to provide climatic parameters on a regional scale. In the present work, we have analyzed the changes in precipitation and temperature for the future scenario period of 2070-2099 with respect to the historical period of 1970-2000 from statistically downscaled GCM projections in the Columbia River Basin (CRB). Analysis is performed using two different statistically downscaled climate projections (with ten GCM-downscaled products each, for RCP 4.5 and RCP 8.5, from the CMIP5 dataset), namely those from the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and from the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, totaling 40 different scenarios. The two datasets for BCSD and MACA are downscaled from observed data for both scenario projections, i.e., RCP4.5 and RCP8.5. Analysis is performed using spatial change (yearly scale), temporal change (monthly scale), percentile change (seasonal scale), quantile change (yearly scale), and wavelet analysis (yearly scale) of the future period relative to the historical period, at a scale of 1/16th of a degree for the entire CRB region. Results indicate varied degrees of spatial change across the entire Columbia River Basin, especially the western part of the basin. At temporal scales, winter precipitation has higher variability than summer, and vice versa for temperature. Most of the models indicate considerable positive changes in quantiles and percentiles for both precipitation and temperature. Wavelet analysis provided insights into possible explanations for the changes in precipitation.

  3. Spatial, temporal and frequency based climate change assessment in Columbia River Basin using multi downscaled-scenarios

    NASA Astrophysics Data System (ADS)

    Rana, Arun; Moradkhani, Hamid

    2015-10-01

    Uncertainties in climate modelling are well documented in the literature. Global Climate Model (GCM) outputs are often downscaled to provide climatic parameters on a regional scale. In the present work, we have analyzed the changes in precipitation and temperature for the future scenario period of 2070-2099 with respect to the historical period of 1970-2000 from statistically downscaled GCM projections in the Columbia River Basin (CRB). Analysis is performed using two different statistically downscaled climate projections (with ten GCM-downscaled products each, for RCP 4.5 and RCP 8.5, from the CMIP5 dataset), namely those from the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and from the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, totaling 40 different scenarios. The two datasets for BCSD and MACA are downscaled from observed data for both scenario projections, i.e., RCP4.5 and RCP8.5. Analysis is performed using spatial change (yearly scale), temporal change (monthly scale), percentile change (seasonal scale), quantile change (yearly scale), and wavelet analysis (yearly scale) of the future period relative to the historical period, at a scale of 1/16th of a degree for the entire CRB region. Results indicate varied degrees of spatial change across the entire Columbia River Basin, especially the western part of the basin. At temporal scales, winter precipitation has higher variability than summer, and vice versa for temperature. Most of the models indicate considerable positive changes in quantiles and percentiles for both precipitation and temperature. Wavelet analysis provided insights into possible explanations for the changes in precipitation.
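
    A minimal sketch of the quantile-change metric described above for a single grid cell, with synthetic annual precipitation standing in for the downscaled GCM output.

      import numpy as np

      rng = np.random.default_rng(42)
      hist_annual_precip = rng.gamma(shape=8.0, scale=90.0, size=31)    # 1970-2000, mm (synthetic)
      fut_annual_precip = rng.gamma(shape=8.0, scale=100.0, size=30)    # 2070-2099, mm (synthetic)

      quantiles = [0.1, 0.5, 0.9]
      hist_q = np.quantile(hist_annual_precip, quantiles)
      fut_q = np.quantile(fut_annual_precip, quantiles)
      pct_change = 100.0 * (fut_q - hist_q) / hist_q
      for q, c in zip(quantiles, pct_change):
          print(f"q{int(q * 100):02d}: {c:+.1f}%")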

  4. The multiscale importance of road segments in a network disruption scenario: a risk-based approach.

    PubMed

    Freiria, Susana; Tavares, Alexandre O; Pedro Julião, Rui

    2015-03-01

    This article addresses the problem of the multiscale importance of road networks, with the aim of helping to establish a more resilient network in the event of a road disruption scenario. A new model for identifying the most important roads is described and applied on a local and regional scale. The work presented here represents a step forward, since it focuses on the interaction between identifying the most important roads in a network that connect people and health services, the specificity of the natural hazards that threaten the normal functioning of the network, and an assessment of the consequences of three real-world interruptions from a multiscale perspective. The case studies concern three different past events: road interruptions due to a flood, a forest fire, and a mass movement. On the basis of the results obtained, it is possible to establish the roads for which risk management should be a priority. The multiscale perspective shows that in a road interruption the regional system may have the capacity to reorganize itself, although the interruption may have consequences for local dynamics. Coordination between local and regional scales is therefore important. The model proposed here allows for the scaling of emergency response facilities and human and physical resources. It represents an innovative approach to defining priorities, not only in the prevention phase but also in terms of the response to natural disasters, such as awareness of the consequences of road disruption for the rescue services sent out to local communities. PMID:25263956
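
    A sketch of the core idea on a hypothetical toy network: rank a road segment by how much its removal worsens travel cost from population nodes to the nearest health service at one scale; the graph, weights and disconnection penalty are assumptions.

      import networkx as nx

      G = nx.Graph()
      G.add_weighted_edges_from([          # (node, node, travel time in minutes)
          ("village_A", "junction", 10), ("village_B", "junction", 8),
          ("junction", "hospital", 12),  ("village_A", "hospital", 35),
      ])
      population_nodes = ["village_A", "village_B"]

      def total_access_cost(graph):
          cost = 0.0
          for v in population_nodes:
              try:
                  cost += nx.shortest_path_length(graph, v, "hospital", weight="weight")
              except nx.NetworkXNoPath:
                  cost += 1e6                       # penalty when a community is cut off
          return cost

      baseline = total_access_cost(G)
      for edge in list(G.edges):
          H = G.copy()
          H.remove_edge(*edge)
          print(edge, total_access_cost(H) - baseline)   # larger increase = more critical segment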

  5. Formation Of The Haumea System: Checking Alternative Scenarios By N-body Based Numerical Simulations.

    NASA Astrophysics Data System (ADS)

    Thirouin, Audrey; Bagati, A. C.; Ortiz, J.; Duffard, R.; Benavidez, P.; Richardson, D.

    2010-10-01

    Haumea is one of the most studied and probably one of the most interesting Trans-Neptunian Objects (TNOs) and a fast-spinning dwarf planet (Rabinowitz et al., 2006; Thirouin et al., 2010) that has at least two satellites and whose orbital elements are related to a group/family of bodies. A catastrophic collision at high relative velocity (around 3 km/s) between two bodies in the 1000-1500 km size range with a mass ratio of 0.2 has been suggested for the formation of the 'family' and the presence of satellites (Brown et al. 2007). Leinhardt et al. (2010) proposed another mechanism in which two 1300 km size bodies undergo a grazing collision with reaccumulation of part of the mass and dispersion of the rest, partly into satellites. The likelihood of both scenarios is small when tested against collisional and dynamical evolution model predictions of collisional probabilities. Furthermore, these mechanisms have trouble explaining the velocity dispersion of the family members and the fast spin of Haumea. Schlichting and Sari (2009) proposed that an earlier low-speed collision formed a proto-satellite that later underwent a final impact, leading to the currently observed system. In this work, we propose alternative mechanisms for the formation of the Haumea system and test them by means of N-body numerical simulations (PKDGRAV code; Richardson, 1994).

  6. Performance-Based Enrollment Management.

    ERIC Educational Resources Information Center

    McIntyre, Chuck

    Accountability in higher education most often concentrates on what and how to measure performance, but less often on how it can be used for planning, managing, and teaching. Besides serving higher education's consumers, accountability measures should also serve those who plan and manage institutions, especially those engaged in managing…

  7. Prescriptive vs. performance based cook-off fire testing.

    SciTech Connect

    Nakos, James Thomas; Tieszen, Sheldon Robert; Erikson, William Wilding; Gill, Walter; Blanchat, Thomas K.

    2010-07-01

    In the fire safety community, the trend is toward implementing performance-based standards in place of existing prescriptive ones. Prescriptive standards can be difficult to adapt to changing design methods, materials, and application situations of systems that ultimately must perform well in unwanted fire situations. In general, this trend has produced positive results and is embraced by the fire protection community. The question arises as to whether this approach could be used to advantage in cook-off testing. Prescribed fuel fire cook-off tests have been instigated because of historical incidents that led to extensive damage to structures and loss of life. They are designed to evaluate the propensity for a violent response. The prescribed protocol has several advantages: it can be defined in terms of controllable parameters (wind speed, fuel type, pool size, etc.), and it may be conservative for a particular scenario. However, fires are inherently variable and prescribed tests are not necessarily representative of a particular accident scenario. Moreover, prescribed protocols are not necessarily adaptable and may not be conservative. We also consider performance-based testing. This requires more knowledge and thought regarding not only the fire environment, but the behavior of the munitions themselves. Sandia uses a performance-based approach in assuring the safe behavior of systems of interest that contain energetic materials. Sandia also conducts prescriptive fire testing for the IAEA, NRC and the DOT. Here we comment on the strengths and weaknesses of both approaches and suggest a path forward should it be desirable to pursue a performance-based cook-off standard.

  8. Nephrologists' likelihood of referring patients for kidney transplant based on hypothetical patient scenarios

    PubMed Central

    Tandon, Ankita; Wang, Ming; Roe, Kevin C.; Patel, Surju; Ghahramani, Nasrollah

    2016-01-01

    Background There is wide variation in referral for kidney transplant and preemptive kidney transplant (PKT). Patient characteristics such as age, race, sex and geographic location have been cited as contributing factors to this disparity. We hypothesize that the characteristics of nephrologists interplay with the patients' characteristics to influence the referral decision. In this study, we used hypothetical case scenarios to assess nephrologists' decisions regarding transplant referral. Methods A total of 3180 nephrologists were invited to participate. Among those interested, 252 were randomly selected to receive a survey in which nephrologists were asked whether they would recommend transplant for the 25 hypothetical patients. Logistic regression models with single covariates and multiple covariates were used to identify patient characteristics associated with likelihood of being referred for transplant and to identify nephrologists' characteristics associated with likelihood of referring for transplant. Results Of the 252 potential participants, 216 completed the survey. A nephrologist's affiliation with an academic institution was associated with a higher likelihood of referral, and being ‘>10 years from fellowship’ was associated with lower likelihood of referring patients for transplant. Patient age <50 years was associated with higher likelihood of referral. Rural location and smoking history/chronic obstructive pulmonary disease were associated with lower likelihood of being referred for transplant. The nephrologist's affiliation with an academic institution was associated with higher likelihood of referring for preemptive transplant, and the patient having a rural residence was associated with lower likelihood of being referred for preemptive transplant. Conclusions The variability in transplant referral is related to patients' age and geographic location as well as the nephrologists' affiliation with an academic institution and time since completion
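
    An illustrative sketch only (synthetic data, hypothetical covariate names): the kind of logistic regression used to relate vignette characteristics to the odds of a transplant referral.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n = 500
      df = pd.DataFrame({
          "age_lt_50": rng.integers(0, 2, n),      # hypothetical patient characteristics
          "rural": rng.integers(0, 2, n),
          "smoker_copd": rng.integers(0, 2, n),
      })
      logit_p = -0.2 + 0.9 * df.age_lt_50 - 0.6 * df.rural - 0.7 * df.smoker_copd
      df["referred"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(float)

      X = sm.add_constant(df[["age_lt_50", "rural", "smoker_copd"]])
      fit = sm.Logit(df["referred"], X).fit(disp=0)
      print(np.exp(fit.params))                    # odds ratios for each characteristic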

  9. DEROCS: A computer program to simulate offshore oil and natural gas development scenarios and onshore service base requirements

    USGS Publications Warehouse

    Marcus, Philip A.; Smith, E.T.; Robinson, S.R.; Wong, A.T.

    1977-01-01

    The FORTRAN IV (H) computer program, DEROCS, constructs Outer Continental Shelf (OCS) resource development scenarios and quantifies the requirements for and impacts of the operation of the onshore service bases necessary to support offshore oil and gas operations. The acronym DEROCS stands for 'Development of Energy Resources of the Outer Continental Shelf.' The user may specify the number, timing, and amounts of offshore oil and natural gas finds, onshore service base locations, and multiplier relationships between offshore development activities and onshore land, supply, labor and facility requirements. The program determines schedules of platform installation, development drilling, production from platforms, and well workover, and calculates on a yearly basis the requirements for and impacts of the operation of the onshore service bases demanded by offshore activities. We present two examples of program application.

  10. A Usability and Learnability Case Study of Glass Flight Deck Interfaces and Pilot Interactions through Scenario-based Training

    NASA Astrophysics Data System (ADS)

    De Cino, Thomas J., II

    In the aviation industry, digitally produced and presented flight, navigation, and aircraft information is commonly referred to as glass flight decks. Glass flight decks are driven by computer-based subsystems and have long been a part of military and commercial aviation sectors. Over the past 15 years, the General Aviation (GA) sector of the aviation industry has become a recent beneficiary of the rapid advancement of computer-based glass flight deck (GFD) systems. While providing the GA pilot considerable enhancements in the quality of information about the status and operations of the aircraft, training pilots on the use of glass flight decks is often delivered with traditional methods (e.g. textbooks, PowerPoint presentations, user manuals, and limited computer-based training modules). These training methods have been reported as less than desirable for learning to use the glass flight deck interface. Difficulties in achieving a complete understanding of functional and operational characteristics of the GFD systems, acquiring a full understanding of the interrelationships of the varied subsystems, and handling the wealth of flight information provided have been reported. Documented pilot concerns about poor user experience and satisfaction, and problems with learning the complex and sophisticated interface of the GFD, are additional issues with current pilot training approaches. A case study was executed to explore ways to improve training using GFD systems at a Midwestern aviation university. The researcher investigated whether variations in instructional systems design and training methods for learning glass flight deck technology would affect pilots' perceptions and attitudes toward the learnability (an attribute of usability) of the glass flight deck interface. Specifically, this study investigated the effectiveness of scenario-based training (SBT) methods to potentially improve pilot knowledge and understanding of a GFD system, and overall pilot user

  11. Supporting Problem Solving with Case-Stories Learning Scenario and Video-Based Collaborative Learning Technology

    ERIC Educational Resources Information Center

    Hung, David; Tan, Seng Chee; Cheung, Wing Sum; Hu, Chun

    2004-01-01

    In this paper, we suggest that case-based resources, which are used for assisting cognition during problem solving, can be structured around the work of narratives in social cultural psychology. Theories and other research methods have proposed structures within narratives and stories which may be useful to the design of case-based resources.…

  12. Distributed ecohydrological modelling to evaluate irrigation system performance in Sirsa district, India II: Impact of viable water management scenarios

    NASA Astrophysics Data System (ADS)

    Singh, R.; Jhorar, R. K.; van Dam, J. C.; Feddes, R. A.

    2006-10-01

    This study focuses on the identification of appropriate strategies to improve water management and productivity in an irrigated area of 4270 km2 in India (Sirsa district). The field scale ecohydrological model SWAP in combination with field experiments, remote sensing and GIS has been applied in a distributed manner generating the required hydrological and biophysical variables to evaluate alternative water management scenarios at different spatial and temporal scales. Simulation results for the period 1991-2001 show that the water and salt limited crop production is 1.2-2.0 times higher than the actual recorded crop production. Improved crop husbandry in terms of improved crop varieties, timely sowing, better nutrient supply and more effective weed, pest and disease control, will increase crop yields and water productivity in Sirsa district. The scenario results further showed that reduction of seepage losses to 25-30% of the total canal inflow and reallocation of 15% canal water inflow from the northern to the central canal commands will improve significantly the long term water productivity, halt the rising and declining groundwater levels, and decrease the salinization in Sirsa district.

  13. Evaluation of resident evacuations in urban rainstorm waterlogging disasters based on scenario simulation: Daoli district (Harbin, China) as an example.

    PubMed

    Chen, Peng; Zhang, Jiquan; Zhang, Lifeng; Sun, Yingyue

    2014-01-01

    With the acceleration of urbanization, waterlogging has become an increasingly serious issue. Road waterlogging has a great influence on residents' travel and traffic safety. Thus, evaluation of residents' travel difficulties caused by rainstorm waterlogging disasters is of great significance for their travel safety and emergency shelter needs. This study investigated urban rainstorm waterlogging disasters, evaluating the impact of such disasters' evolution on residents' evacuation, using Daoli District (Harbin, China) as the research demonstration area; empirical research combined scenario simulations, questionnaires, GIS spatial analysis and a hydrodynamic method to establish an urban rainstorm waterlogging numerical simulation model. The results show that under the conditions of a 10-year frequency rainstorm, there are three street sections in the study area with a high difficulty index, five street sections with a medium difficulty index, and the index is low in other districts, while under the conditions of a 50-year frequency rainstorm, there are five street sections with a high difficulty index, nine street sections with a medium difficulty index, and all other districts have a low index. These research results can help lay the foundation for further small-scale urban rainstorm waterlogging disaster scenario simulations and emergency shelter planning, as well as forecasting and warning, and provide a new line of thought and research method for studying residents' safe travel. PMID:25264676

  14. Evaluation of Resident Evacuations in Urban Rainstorm Waterlogging Disasters Based on Scenario Simulation: Daoli District (Harbin, China) as an Example

    PubMed Central

    Chen, Peng; Zhang, Jiquan; Zhang, Lifeng; Sun, Yingyue

    2014-01-01

    With the acceleration of urbanization, waterlogging has become an increasingly serious issue. Road waterlogging has a great influence on residents’ travel and traffic safety. Thus, evaluation of residents’ travel difficulties caused by rainstorm waterlogging disasters is of great significance for their travel safety and emergency shelter needs. This study investigated urban rainstorm waterlogging disasters, evaluating the impact of such disasters’ evolution on residents’ evacuation, using Daoli District (Harbin, China) as the research demonstration area; empirical research combined scenario simulations, questionnaires, GIS spatial analysis and a hydrodynamic method to establish an urban rainstorm waterlogging numerical simulation model. The results show that under the conditions of a 10-year frequency rainstorm, there are three street sections in the study area with a high difficulty index, five street sections with a medium difficulty index, and the index is low in other districts, while under the conditions of a 50-year frequency rainstorm, there are five street sections with a high difficulty index, nine street sections with a medium difficulty index, and all other districts have a low index. These research results can help lay the foundation for further small-scale urban rainstorm waterlogging disaster scenario simulations and emergency shelter planning, as well as forecasting and warning, and provide a new line of thought and research method for studying residents’ safe travel. PMID:25264676

  15. The STAR Project: Enhancing Adolescents' Social Understanding through Video-based, Multimedia Scenarios.

    ERIC Educational Resources Information Center

    Goldsworthy, Richard C.; Barab, Sasha A.; Goldsworthy, Elizabeth L.

    2000-01-01

    This article describes a computer game that supports the development of learners' social problem-solving skills. In a controlled three-group design, the group using the prototype game performed significantly better than an attention-placebo control and comparably to a therapist-directed group on measures of problem solving and engagement. However,…

  16. Proposal of Comprehensive Model of Teaching Basic Nursing Skills Under Goal-Based Scenario Theory.

    PubMed

    Sannomiya, Yuri; Muranaka, Yoko; Teraoka, Misako; Suzuki, Sayuri; Saito, Yukie; Yamato, Hiromi; Ishii, Mariko

    2016-01-01

    The purpose of this study is to design and develop a comprehensive model for teaching basic nursing skills based on GBS theory and the Four-Stage Performance Cycle. We designed a basic nursing skill program that consists of three courses: basic, application and multi-tasking. The program will be offered as blended learning, utilizing e-learning. PMID:27332480

  17. Exploring an Ecologically Sustainable Scheme for Landscape Restoration of Abandoned Mine Land: Scenario-Based Simulation Integrated Linear Programming and CLUE-S Model.

    PubMed

    Zhang, Liping; Zhang, Shiwen; Huang, Yajie; Cao, Meng; Huang, Yuanfang; Zhang, Hongyan

    2016-04-01

    Understanding abandoned mine land (AML) changes during land reclamation is crucial for reusing damaged land resources and formulating sound ecological restoration policies. This study combines the linear programming (LP) model and the CLUE-S model to simulate land-use dynamics in the Mentougou District (Beijing, China) from 2007 to 2020 under three reclamation scenarios, that is, the planning scenario based on the general land-use plan of the study area (scenario 1), maximal comprehensive benefits (scenario 2), and maximal ecosystem service value (scenario 3). Nine landscape-scale graph metrics were then selected to describe the landscape characteristics. The results show that the presented coupled model can simulate the dynamics of AML effectively and that the spatially explicit transformations of AML differ between scenarios. New cultivated land dominates in scenario 1, while construction land and forest land account for major percentages in scenarios 2 and 3, respectively. Scenario 3 has an advantage in most of the selected indices, as its patches are combined most closely. To conclude, reclaiming AML by transforming it into more forest can reduce the variability and maintain the stability of the landscape ecological system in the study area. These findings contribute to better mapping of AML dynamics and provide policy support for the management of AML. PMID:27023575

  18. Exploring an Ecologically Sustainable Scheme for Landscape Restoration of Abandoned Mine Land: Scenario-Based Simulation Integrated Linear Programming and CLUE-S Model

    PubMed Central

    Zhang, Liping; Zhang, Shiwen; Huang, Yajie; Cao, Meng; Huang, Yuanfang; Zhang, Hongyan

    2016-01-01

    Understanding abandoned mine land (AML) changes during land reclamation is crucial for reusing damaged land resources and formulating sound ecological restoration policies. This study combines the linear programming (LP) model and the CLUE-S model to simulate land-use dynamics in the Mentougou District (Beijing, China) from 2007 to 2020 under three reclamation scenarios, that is, the planning scenario based on the general land-use plan of the study area (scenario 1), maximal comprehensive benefits (scenario 2), and maximal ecosystem service value (scenario 3). Nine landscape-scale graph metrics were then selected to describe the landscape characteristics. The results show that the presented coupled model can simulate the dynamics of AML effectively and that the spatially explicit transformations of AML differ between scenarios. New cultivated land dominates in scenario 1, while construction land and forest land account for major percentages in scenarios 2 and 3, respectively. Scenario 3 has an advantage in most of the selected indices, as its patches are combined most closely. To conclude, reclaiming AML by transforming it into more forest can reduce the variability and maintain the stability of the landscape ecological system in the study area. These findings contribute to better mapping of AML dynamics and provide policy support for the management of AML. PMID:27023575
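
    A toy sketch of the linear-programming side of the coupled approach (hypothetical coefficients and constraints): allocate reclaimed AML area among uses to maximize ecosystem service value, the objective of scenario 3; CLUE-S would then allocate the resulting demands spatially.

      import numpy as np
      from scipy.optimize import linprog

      # land uses: cultivated, forest, construction (areas in ha); ESV units per ha are made up
      esv_per_ha = np.array([3.0, 8.5, 0.5])
      total_area = 1200.0

      res = linprog(
          c=-esv_per_ha,                               # maximize ESV -> minimize its negative
          A_ub=[[0.0, 0.0, 1.0]], b_ub=[300.0],        # cap on construction land
          A_eq=[[1.0, 1.0, 1.0]], b_eq=[total_area],   # all AML is reallocated
          bounds=[(100.0, None), (0.0, None), (50.0, None)],   # minimum cultivated / construction
          method="highs")
      print(dict(zip(["cultivated", "forest", "construction"], res.x.round(1))))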

  19. Evolving practices in environmental scenarios: a new scenario typology

    NASA Astrophysics Data System (ADS)

    Wilkinson, Angela; Eidinow, Esther

    2008-10-01

    A new approach to scenarios focused on environmental concerns, changes and challenges, i.e. so-called 'environmental scenarios', is necessary if global environmental changes are to be more effectively appreciated and addressed through sustained and collaborative action. On the basis of a comparison of previous approaches to global environmental scenarios and a review of existing scenario typologies, we propose a new scenario typology to help guide scenario-based interventions. This typology makes explicit the types of and/or the approaches to knowledge ('the epistemologies') which underpin a scenario approach. Drawing on previous environmental scenario projects, we distinguish and describe two main types in this new typology: 'problem-focused' and 'actor-centric'. This leads in turn to our suggestion for a third type, which we call 'RIMA'—'reflexive interventionist or multi-agent based'. This approach to scenarios emphasizes the importance of the involvement of different epistemologies in a scenario-based process of action learning in the public interest. We suggest that, by combining the epistemologies apparent in the previous two types, this approach can create a more effective bridge between longer-term thinking and more immediate actions. Our description is aimed at scenario practitioners in general, as well as those who work with (environmental) scenarios that address global challenges.

  20. A triangular fuzzy TOPSIS-based approach for the application of water technologies in different emergency water supply scenarios.

    PubMed

    Qu, Jianhua; Meng, Xianlin; Yu, Huan; You, Hong

    2016-09-01

    Because of the increasing frequency and intensity of unexpected natural disasters, providing safe drinking water for the affected population following a disaster has become a global challenge of growing concern. An onsite water supply technology that is portable, mobile, or modular is a more suitable and sustainable solution for the victims than transporting bottled water. In recent years, various water techniques, such as membrane-assisted technologies, have been proposed and successfully implemented in many places. Given the diversity of techniques available, the current challenge is how to scientifically identify the optimum options for different disaster scenarios. Hence, a triangular fuzzy-based multi-criteria, group decision-making tool was developed in this research. The approach was then applied to the selection of the most appropriate water technologies corresponding to the different emergency water supply scenarios. The results show that this tool is capable of facilitating scientific analysis in the evaluation and selection of emergency water technologies for ensuring a secure drinking water supply in disaster relief. PMID:27221588
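
    To make the decision mechanics concrete, the following sketch ranks three hypothetical water-supply technologies with a minimal triangular-fuzzy TOPSIS: fuzzy ratings are normalized, weighted, compared against fuzzy positive and negative ideal solutions with the vertex distance, and summarized as a closeness coefficient. The alternatives, criteria, ratings and weights are invented for illustration and do not reproduce the authors' tool.

```python
# Minimal triangular-fuzzy TOPSIS sketch (illustrative data, not the authors' tool).
import numpy as np

# ratings[i, j] = (l, m, u) triangular fuzzy score of alternative i on criterion j
ratings = np.array([
    [[5, 7, 9], [3, 5, 7], [7, 9, 9]],   # "membrane unit"
    [[7, 9, 9], [5, 7, 9], [3, 5, 7]],   # "mobile chlorination"
    [[3, 5, 7], [7, 9, 9], [5, 7, 9]],   # "modular UV system"
], dtype=float)
# Triangular fuzzy weight (l, m, u) of each criterion (assumed).
weights = np.array([[0.5, 0.5, 0.5], [0.3, 0.3, 0.3], [0.2, 0.2, 0.2]])

# Normalize benefit criteria by the largest upper value, then apply fuzzy weights.
u_max = ratings[:, :, 2].max(axis=0)
weighted = ratings / u_max[None, :, None] * weights[None, :, :]

# Fuzzy positive/negative ideal solutions and vertex distances.
fpis = weighted.max(axis=0)
fnis = weighted.min(axis=0)

def vertex(a, b):
    """Vertex distance between triangular fuzzy numbers (element-wise)."""
    return np.sqrt(((a - b) ** 2).sum(axis=-1) / 3.0)

d_plus = vertex(weighted, fpis[None]).sum(axis=1)
d_minus = vertex(weighted, fnis[None]).sum(axis=1)
closeness = d_minus / (d_plus + d_minus)

for name, cc in zip(["membrane", "chlorination", "UV"], closeness):
    print(f"{name:>12s}: closeness coefficient = {cc:.3f}")
```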

  1. Climate influences on the cost-effectiveness of vector-based interventions against malaria in elimination scenarios.

    PubMed

    Parham, Paul E; Hughes, Dyfrig A

    2015-04-01

    Despite the dependence of mosquito population dynamics on environmental conditions, the associated impact of climate and climate change on present and future malaria remains an area of ongoing debate and uncertainty. Here, we develop a novel integration of mosquito, transmission and economic modelling to assess whether the cost-effectiveness of indoor residual spraying (IRS) and long-lasting insecticidal nets (LLINs) against Plasmodium falciparum transmission by Anopheles gambiae s.s. mosquitoes depends on climatic conditions in low endemicity scenarios. We find that although temperature and rainfall affect the cost-effectiveness of IRS and/or LLIN scale-up, whether this is sufficient to influence policy depends on local endemicity, existing interventions, host immune response to infection and the emergence rate of insecticide resistance. For the scenarios considered, IRS is found to be more cost-effective than LLINs for the same level of scale-up, and both are more cost-effective at lower mean precipitation and higher variability in precipitation and temperature. We also find that the dependence of peak transmission on mean temperature translates into optimal temperatures for vector-based intervention cost-effectiveness. Further cost-effectiveness analysis that accounts for country-specific epidemiological and environmental heterogeneities is required to assess optimal intervention scale-up for elimination and better understand future transmission trends under climate change. PMID:25688017

  2. Climate influences on the cost-effectiveness of vector-based interventions against malaria in elimination scenarios

    PubMed Central

    Parham, Paul E.; Hughes, Dyfrig A.

    2015-01-01

    Despite the dependence of mosquito population dynamics on environmental conditions, the associated impact of climate and climate change on present and future malaria remains an area of ongoing debate and uncertainty. Here, we develop a novel integration of mosquito, transmission and economic modelling to assess whether the cost-effectiveness of indoor residual spraying (IRS) and long-lasting insecticidal nets (LLINs) against Plasmodium falciparum transmission by Anopheles gambiae s.s. mosquitoes depends on climatic conditions in low endemicity scenarios. We find that although temperature and rainfall affect the cost-effectiveness of IRS and/or LLIN scale-up, whether this is sufficient to influence policy depends on local endemicity, existing interventions, host immune response to infection and the emergence rate of insecticide resistance. For the scenarios considered, IRS is found to be more cost-effective than LLINs for the same level of scale-up, and both are more cost-effective at lower mean precipitation and higher variability in precipitation and temperature. We also find that the dependence of peak transmission on mean temperature translates into optimal temperatures for vector-based intervention cost-effectiveness. Further cost-effectiveness analysis that accounts for country-specific epidemiological and environmental heterogeneities is required to assess optimal intervention scale-up for elimination and better understand future transmission trends under climate change. PMID:25688017

  3. Surface impedance based microwave imaging method for breast cancer screening: contrast-enhanced scenario

    NASA Astrophysics Data System (ADS)

    Güren, Onan; Çayören, Mehmet; Tükenmez Ergene, Lale; Akduman, Ibrahim

    2014-10-01

    A new microwave imaging method that uses microwave contrast agents is presented for the detection and localization of breast tumours. The method is based on the reconstruction of breast surface impedance through a measured scattered field. The surface impedance modelling allows for representing the electrical properties of the breasts in terms of impedance boundary conditions, which enable us to map the inner structure of the breasts into surface impedance functions. Later a simple quantitative method is proposed to screen breasts against malignant tumours where the detection procedure is based on weighted cross correlations among impedance functions. Numerical results demonstrate that the method is capable of detecting small malignancies and provides reasonable localization.
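
    The detection step described above rests on weighted cross-correlations among reconstructed surface impedance functions. The sketch below shows that kind of comparison on synthetic impedance profiles with an arbitrary weighting window; the actual reconstruction of impedance from the measured scattered field is not reproduced.

```python
# Weighted cross-correlation between two synthetic surface-impedance profiles.
import numpy as np

theta = np.linspace(0.0, 2 * np.pi, 360)                     # position along the surface
z_ref = 1.0 + 0.05 * np.sin(3 * theta)                       # "healthy" impedance profile
z_test = z_ref + 0.4 * np.exp(-((theta - 1.2) / 0.1) ** 2)   # profile with a local anomaly
w = np.hanning(theta.size)                                   # arbitrary weighting window

def weighted_xcorr(a, b, w):
    """Weighted, normalized cross-correlation coefficient of two profiles."""
    a = (a - np.average(a, weights=w)) * np.sqrt(w)
    b = (b - np.average(b, weights=w)) * np.sqrt(w)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print("healthy vs healthy  :", round(weighted_xcorr(z_ref, z_ref, w), 3))
print("healthy vs anomalous:", round(weighted_xcorr(z_ref, z_test, w), 3))
```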

  4. Inquiry-Based Science Education: A Scenario on Zambia's High School Science Curriculum

    ERIC Educational Resources Information Center

    Chabalengula, Vivien M.; Mumba, Frackson

    2012-01-01

    This paper is aimed at elucidating the current state of inquiry-based science education (IBSE) in Zambia's high school science curriculum. Therefore, we investigated Zambian teachers' conceptions of inquiry; determined inquiry levels in the national high school science curriculum materials, which include syllabi, textbooks and practical exams; and…

  5. Supply Chain Simulator: A Scenario-Based Educational Tool to Enhance Student Learning

    ERIC Educational Resources Information Center

    Siddiqui, Atiq; Khan, Mehmood; Akhtar, Sohail

    2008-01-01

    Simulation-based educational products are excellent set of illustrative tools that proffer features like visualization of the dynamic behavior of a real system, etc. Such products have great efficacy in education and are known to be one of the first-rate student centered learning methodologies. These products allow students to practice skills such…

  6. Designing Collaborative E-Learning Environments Based upon Semantic Wiki: From Design Models to Application Scenarios

    ERIC Educational Resources Information Center

    Li, Yanyan; Dong, Mingkai; Huang, Ronghuai

    2011-01-01

    The knowledge society requires life-long learning and flexible learning environment that enables fast, just-in-time and relevant learning, aiding the development of communities of knowledge, linking learners and practitioners with experts. Based upon semantic wiki, a combination of wiki and Semantic Web technology, this paper designs and develops…

  7. A comparison between the example reference biosphere model ERB 2B and a process-based model: simulation of a natural release scenario.

    PubMed

    Almahayni, T

    2014-12-01

    The BIOMASS methodology was developed with the objective of constructing defensible assessment biospheres for assessing potential radiological impacts of radioactive waste repositories. To this end, a set of Example Reference Biospheres were developed to demonstrate the use of the methodology and to provide an international point of reference. In this paper, the performance of the Example Reference Biosphere model ERB 2B associated with the natural release scenario, discharge of contaminated groundwater to the surface environment, was evaluated by comparing its long-term projections of radionuclide dynamics and distribution in a soil-plant system to those of a process-based, transient advection-dispersion model (AD). The models were parametrised with data characteristic of a typical rainfed winter wheat crop grown on a sandy loam soil under temperate climate conditions. Three safety-relevant radionuclides, (99)Tc, (129)I and (237)Np with different degree of sorption were selected for the study. Although the models were driven by the same hydraulic (soil moisture content and water fluxes) and radiological (Kds) input data, their projections were remarkably different. On one hand, both models were able to capture short and long-term variation in activity concentration in the subsoil compartment. On the other hand, the Reference Biosphere model did not project any radionuclide accumulation in the topsoil and crop compartments. This behaviour would underestimate the radiological exposure under natural release scenarios. The results highlight the potential role deep roots play in soil-to-plant transfer under a natural release scenario where radionuclides are released into the subsoil. When considering the relative activity and root depth profiles within the soil column, much of the radioactivity was taken up into the crop from the subsoil compartment. Further improvements were suggested to address the limitations of the Reference Biosphere model presented in this paper
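
    For readers unfamiliar with the process-based side of the comparison, the sketch below is a generic explicit finite-difference solver for one-dimensional advection-dispersion with linear sorption (retardation), the class of model referred to as "AD" above. Column depth, velocity, dispersion and sorption parameters are illustrative, not those of the assessment.

```python
# Generic explicit finite-difference solver for 1-D advection-dispersion with
# linear sorption (retardation); parameters are illustrative only.
import numpy as np

L, nz = 1.0, 101                    # column depth (m) and number of grid points
dz = L / (nz - 1)
v = 0.5 / 365.0                     # pore-water velocity (m/day)
D = 1e-4                            # dispersion coefficient (m^2/day)
Kd, rho, theta = 1e-3, 1.5e3, 0.3   # weak sorption (m^3/kg), bulk density, moisture
R = 1.0 + rho * Kd / theta          # retardation factor

dt = 0.4 * min(dz / v, dz**2 / (2 * D))   # keep the explicit scheme stable
C = np.zeros(nz)
C[0] = 1.0                                # constant-concentration upper boundary

t, t_end = 0.0, 5 * 365.0                 # simulate five years
while t < t_end:
    adv = -v * (C[1:-1] - C[:-2]) / dz                    # upwind advection
    disp = D * (C[2:] - 2.0 * C[1:-1] + C[:-2]) / dz**2   # central-difference dispersion
    C[1:-1] += dt * (adv + disp) / R
    C[-1] = C[-2]                                         # zero-gradient outlet
    C[0] = 1.0
    t += dt

print("relative concentration at mid-depth after 5 years:", round(C[nz // 2], 3))
```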

  8. Scripting Scenarios for the Human Patient Simulator

    NASA Technical Reports Server (NTRS)

    Bacal, Kira; Miller, Robert; Doerr, Harold

    2004-01-01

    The Human Patient Simulator (HPS) is particularly useful in providing scenario-based learning which can be tailored to fit specific scenarios and which can be modified in real time to enhance the teaching environment. Scripting these scenarios so as to maximize learning requires certain skills to ensure that a change in student performance, understanding, critical thinking, and/or communication skills results. Methods: A "good" scenario can be defined in terms of applicability, learning opportunities, student interest, and clearly associated metrics. Obstacles to such a scenario include a lack of understanding of the applicable environment by the scenario author(s), a desire (common among novices) to cover too many topics, failure to define learning objectives, mutually exclusive or confusing learning objectives, unskilled instructors, poor preparation, a disorganized approach, or an inappropriate teaching philosophy (such as "trial by fire" or education through humiliation). Results: Descriptions of several successful teaching programs, used in the military, civilian, and NASA medical environments, will be provided, along with sample scenarios. Discussion: Simulator-based lessons have proven to be a time- and cost-efficient manner by which to educate medical personnel. Particularly when training for medical care in austere environments (pre-hospital, aeromedical transport, International Space Station, military operations), the HPS can enhance the learning experience.

  9. Perspectives on Performance-Based Incentive Plans.

    ERIC Educational Resources Information Center

    Duttweiler, Patricia Cloud; Ramos-Cancel, Maria L.

    This document is a synthesis of the current literature on performance-based incentive systems for teachers and administrators. Section one provides an introduction to the reform movement and to performance-based pay initiatives; a definition of terms; a brief discussion of funding sources; a discussion of compensation strategies; a description of…

  10. TAP 2: Performance-Based Training Manual

    SciTech Connect

    Not Available

    1993-08-01

    The cornerstone of safe operation of DOE nuclear facilities is personnel performing the day-to-day functions that accomplish the facility mission. Performance-based training is fundamental to this safe operation. This manual has been developed to support the Training Accreditation Program (TAP) and assist contractors in their efforts to develop performance-based training programs. It provides contractors with narrative procedures on performance-based training that can be modified and incorporated for facility-specific application. It is divided into sections dealing with analysis, design, development, implementation, and evaluation.

  11. The use of open-ended problem-based learning scenarios in an interdisciplinary biotechnology class: evaluation of a problem-based learning course across three years.

    PubMed

    Steck, Todd R; Dibiase, Warren; Wang, Chuang; Boukhtiarov, Anatoli

    2012-01-01

    Use of open-ended Problem-Based Learning (PBL) in biology classrooms has been limited by the difficulty in designing problem scenarios such that the content learned in a course can be predicted and controlled, the lack of familiarity of this method of instruction by faculty, and the difficulty in assessment. Here we present the results of a study in which we developed a team-based interdisciplinary course that combined the fields of biology and civil engineering across three years. We used PBL scenarios as the only learning tool, wrote the problem scenarios, and developed the means to assess these courses and the results of that assessment. Our data indicates that PBL changed students' perception of their learning in content knowledge and promoted a change in students' learning styles. Although no statistically significant improvement in problem-solving skills and critical thinking skills was observed, students reported substantial changes in their problem-based learning strategies and critical thinking skills. PMID:23653774

  12. Evaluating interactive computer-based scenarios designed for learning medical technology.

    PubMed

    Persson, Johanna; Dalholm, Elisabeth Hornyánszky; Wallergård, Mattias; Johansson, Gerd

    2014-11-01

    The use of medical equipment is growing in healthcare, resulting in an increased need for resources to educate users in how to manage the various devices. Learning the practical operation of a device is one thing, but learning how to work with the device in the actual clinical context is more challenging. This paper presents a computer-based simulation prototype for learning medical technology in the context of critical care. Properties from simulation and computer games have been adopted to create a visualization-based, interactive and contextually bound tool for learning. A participatory design process, including three researchers and three practitioners from a clinic for infectious diseases, was adopted to adjust the form and content of the prototype to the needs of the clinical practice and to create a situated learning experience. An evaluation with 18 practitioners showed that practitioners were positive to this type of tool for learning and that it served as a good platform for eliciting and sharing knowledge. Our conclusion is that this type of tools can be a complement to traditional learning resources to situate the learning in a context without requiring advanced technology or being resource-demanding. PMID:24898339

  13. Modeling post-fire sediment yield based on two burn scenarios at the Sooke Lake Reservoir, BC, Canada

    NASA Astrophysics Data System (ADS)

    Dobre, Mariana; Elliot, William J.; Brooks, Erin S.; Smith, Tim

    2016-04-01

    Wildfires can have major adverse effects on municipal water sources. Local governments need methods to evaluate fire risk and to develop mitigation procedures. The Sooke Lake Reservoir is the primary source of water for the city of Victoria, BC and the concern is that sediment delivered from upland burned areas could have a detrimental impact on the reservoir and the water supply. We conducted a sediment delivery modeling pilot study on a portion of the Sooke Lake Reservoir (specifically, the Trestle Creek Management Unit (TCMU)) to evaluate the potential impacts of wildfire on sediment delivery from hillslopes and sub-catchments. We used a process-based hydrologic and soil erosion model called Water Erosion Prediction Project geospatial interface, GeoWEPP, to predict the sediment delivery from specific return period design storms for two burn severity scenarios: real (low-intensity burn severity) and worst (high-intensity burn severity) case scenarios. The GeoWEPP model allows users to simulate streamflow and erosion from hillslope polygons within a watershed. The model requires information on the topographic, soil and vegetative characteristics for each hillslope and a weather file. WEPP default values and several assumptions were necessary to apply the model where data were missing. Based on a 10-m DEM we delineated 16 watersheds within the TCMU area. A long term 100-year daily climate file was generated for this analysis using the CLIGEN model based on the historical observations recorded at Concrete, WA in United States, and adjusted for observed monthly precipitation observed in the Sooke Basin. We ran 100-year simulations and calculated yearly and event-based return periods (for 2, 5, 10, 20, 25, and 50 years) for each of the 16 watersheds. Overall, WEPP simulations indicate that the storms that are most likely to produce the greatest runoff and sediment load in these coastal, maritime climates with relatively low rainfall intensities are likely to occur in
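
    The event-based return periods mentioned above are typically derived by ranking simulated annual maxima and assigning plotting-position probabilities. The sketch below does this with Weibull plotting positions on a synthetic 100-year series standing in for GeoWEPP output.

```python
# Return periods from simulated annual maxima via Weibull plotting positions.
# The lognormal series below is a synthetic stand-in for GeoWEPP output.
import numpy as np

rng = np.random.default_rng(42)
years = 100
annual_max_yield = rng.lognormal(mean=0.0, sigma=0.8, size=years)   # t/ha, synthetic

ranked = np.sort(annual_max_yield)[::-1]        # largest event first
rank = np.arange(1, years + 1)
return_period = (years + 1) / rank              # Weibull plotting position

for T in (2, 5, 10, 20, 25, 50):
    idx = np.argmin(np.abs(return_period - T))  # event closest to the target return period
    print(f"~{T:2d}-yr sediment event: {ranked[idx]:.2f} t/ha")
```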

  14. South African maize production scenarios for 2055 using a combined empirical and process-based model approach

    NASA Astrophysics Data System (ADS)

    Estes, L.; Bradley, B.; Oppenheimer, M.; Wilcove, D.; Beukes, H.; Schulze, R. E.; Tadross, M.

    2011-12-01

    In South Africa, a semi-arid country with a diverse agricultural sector, climate change is projected to negatively impact staple crop production. Our study examines future impacts to maize, South Africa's most widely grown staple crop. Working at finer spatial resolution than previous studies, we combine the process-based DSSAT4.5 and the empirical MAXENT models to study future maize suitability. Climate scenarios were based on 9 GCMs run under SRES A2 and B1 emissions scenarios down-scaled (using self-organizing maps) to 5838 locations. Soil properties were derived from textural and compositional data linked to 26422 landforms. DSSAT was run with typical dryland planting parameters and mean projected CO2 values. MAXENT was trained using aircraft-observed distributions and monthly climatologies data derived from downscaled daily records, with future rainfall increased by 10% to simulate CO2 related water-use efficiency gains. We assessed model accuracy based on correlations between model output and a satellite-derived yield proxy (integrated NDVI), and the overlap of modeled and observed maize field distributions. DSSAT yields were linearly correlated to mean integrated NDVI (R2 = 0.38), while MAXENT's relationship was logistic. Binary suitability maps based on thresholding model outputs were slightly more accurate for MAXENT (88%) than for DSSAT (87%) when compared to current maize field distribution. We created 18 suitability maps for each model (9 GCMs X 2 SRES) using projected changes relative to historical suitability thresholds. Future maps largely agreed in eastern South Africa, but disagreed strongly in the semi-arid west. Using a 95% confidence criterion (17 models agree), MAXENT showed a 241305 km2 suitability loss relative to its modeled historical suitability, while DSSAT showed a potential loss of only 112446 km2. Even the smaller potential loss highlighted by DSSAT is uncertain, given that DSSAT's mean (across all 18 climate scenarios) projected yield

  15. Performance evaluation of ground based radar systems

    NASA Astrophysics Data System (ADS)

    Grant, Stanley E.

    1994-06-01

    Ground based radar systems are a critical resource to the command, control, and communications system. This thesis provides the tools and methods to better understand the actual performance of an operational ground based radar system. This thesis defines two measurable performance standards: (1) the baseline performance, which is based on the sensor's internal characteristics, and (2) the theoretical performance, which considers not only the sensor's internal characteristics, but also the effects of the surrounding terrain and atmosphere on the sensor's performance. The baseline radar system performance, often used by operators, contractors, and radar modeling software to determine the expected system performance, is a simplistic and unrealistic means to predict actual radar system performance. The theoretical radar system performance is more complex, but the results are much more indicative of the actual performance of an operational radar system. The AN/UPS-1 at the Naval Postgraduate School was used as the system under test to illustrate the baseline and theoretical radar system performance. The terrain effects are shown by performing a multipath study and producing coverage diagrams. The key variables used to construct the multipath study and coverage diagrams are discussed in detail. The atmospheric effects are illustrated by using the Integrated Refractive Effects Prediction System (IREPS) and the Engineer's Refractive Effects Prediction System (EREPS) software tools to produce propagation conditions summaries and coverage displays.
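
    The "baseline performance" discussed above is essentially what the free-space radar range equation predicts from the sensor's internal characteristics alone. The sketch below evaluates that equation with assumed parameter values (not the AN/UPS-1 characteristics) to show the kind of calculation involved before terrain and atmosphere are considered.

```python
# Free-space radar range equation with assumed (not AN/UPS-1) parameters.
import math

Pt = 100e3            # peak transmit power (W)
G = 10 ** (33 / 10)   # antenna gain (33 dB)
lam = 3e8 / 1.3e9     # wavelength (m) at an assumed 1.3 GHz
sigma = 1.0           # target radar cross-section (m^2)
Smin = 1e-13          # minimum detectable signal (W)

Rmax = ((Pt * G**2 * lam**2 * sigma) / ((4 * math.pi) ** 3 * Smin)) ** 0.25
print(f"free-space maximum detection range: {Rmax / 1e3:.0f} km")
```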

  16. Industrialization scenario for X-ray telescopes production based on glass slumping

    NASA Astrophysics Data System (ADS)

    Proserpio, Laura; Döhring, Thorsten; Breunig, Elias; Friedrich, Peter; Winter, Anita

    2014-07-01

    Large X-ray segmented telescopes will be a key element for future missions aiming to solve still hidden mysteries of the hot and energetic Universe, such as the role of black holes in shaping their surroundings or how and why ordinary matter assembles into galaxies and clusters as it does. The major challenge of these systems is to guarantee a large effective area in combination with large field of view and good angular resolution, while maintaining the mass of the entire system within the geometrical and mass budget posed by space launchers. The slumping technology presents all the technical potentiality to be implemented for the realization of such demanding systems: it is based on the use of thin glass foils, shaped at high temperature in an oven over a suitable mould. Thousands of slumped segments are then aligned and assembled together into the optical payload. An exercise on the mass production approach has been conducted at Max Planck Institute for Extraterrestrial Physics (MPE) to show that the slumping technology can be a valuable approach for the realization of future X-ray telescopes also from a point of view of industrialization. For the analysis, a possible design for the ATHENA mission telescope was taken as reference.

  17. Moral foundations vignettes: a standardized stimulus database of scenarios based on moral foundations theory

    PubMed Central

    Iyengar, Vijeth; Cabeza, Roberto; Sinnott-Armstrong, Walter

    2016-01-01

    Research on the emotional, cognitive, and social determinants of moral judgment has surged in recent years. The development of moral foundations theory (MFT) has played an important role, demonstrating the breadth of morality. Moral psychology has responded by investigating how different domains of moral judgment are shaped by a variety of psychological factors. Yet, the discipline lacks a validated set of moral violations that span the moral domain, creating a barrier to investigating influences on judgment and how their neural bases might vary across the moral domain. In this paper, we aim to fill this gap by developing and validating a large set of moral foundations vignettes (MFVs). Each vignette depicts a behavior violating a particular moral foundation and not others. The vignettes are controlled on many dimensions including syntactic structure and complexity making them suitable for neuroimaging research. We demonstrate the validity of our vignettes by examining respondents’ classifications of moral violations, conducting exploratory and confirmatory factor analysis, and demonstrating the correspondence between the extracted factors and existing measures of the moral foundations. We expect that the MFVs will be beneficial for a wide variety of behavioral and neuroimaging investigations of moral cognition. PMID:25582811

  18. Scenario-Based Validation of Moderate Resolution DEMs Freely Available for Complex Himalayan Terrain

    NASA Astrophysics Data System (ADS)

    Singh, Mritunjay Kumar; Gupta, R. D.; Snehmani; Bhardwaj, Anshuman; Ganju, Ashwagosha

    2016-02-01

    Accuracy of the Digital Elevation Model (DEM) affects the accuracy of various geoscience and environmental modelling results. This study evaluates accuracies of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global DEM Version-2 (GDEM V2), the Shuttle Radar Topography Mission (SRTM) X-band DEM and the NRSC Cartosat-1 DEM V1 (CartoDEM). A high resolution (1 m) photogrammetric DEM (ADS80 DEM), having a high absolute accuracy [1.60 m linear error at 90 % confidence (LE90)], resampled at 30 m cell size was used as reference. The overall root mean square error (RMSE) in vertical accuracy was 23, 73, and 166 m and the LE90 was 36, 75, and 256 m for ASTER GDEM V2, SRTM X-band DEM and CartoDEM, respectively. A detailed error analysis was performed for individual as well as combinations of different classes of aspect, slope, land-cover and elevation zones for the study area. For the ASTER GDEM V2, forest areas with North facing slopes (0°-5°) in the 4th elevation zone (3773-4369 m) showed minimum LE90 of 0.99 m, and barren with East facing slopes (>60°) falling under the 2nd elevation zone (2581-3177 m) showed maximum LE90 of 166 m. For the SRTM DEM, pixels with South-East facing slopes of 0°-5° in the 4th elevation zone covered with forest showed least LE90 of 0.33 m and maximum LE90 of 521 m was observed in the barren area with North-East facing slope (>60°) in the 4th elevation zone. In case of the CartoDEM, the snow pixels in the 2nd elevation zone with South-East facing slopes of 5°-15° showed least LE90 of 0.71 m and maximum LE90 of 1266 m was observed for the snow pixels in the 3rd elevation zone (3177-3773 m) within the South facing slope of 45°-60°. These results can be highly useful for the researchers using DEM products in various modelling exercises.
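
    The two accuracy measures used throughout this study, RMSE and LE90, are straightforward to compute from co-registered DEM and reference elevations, as the sketch below shows on synthetic elevation differences (the real ADS80 comparison data are not reproduced).

```python
# RMSE and LE90 computed from co-registered DEM and reference elevations.
# The arrays below are synthetic placeholders for the real comparison data.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.uniform(2500, 4500, size=10_000)           # reference heights (m)
test_dem = reference + rng.normal(0.0, 20.0, size=10_000)  # DEM under evaluation

err = test_dem - reference
rmse = float(np.sqrt(np.mean(err**2)))
le90 = float(np.percentile(np.abs(err), 90))               # 90% linear error

print(f"RMSE = {rmse:.1f} m")
print(f"LE90 = {le90:.1f} m   (about 1.6449 * RMSE for zero-mean normal errors)")
```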

  19. Projections of high resolution climate changes for South Korea using multiple-regional climate models based on four RCP scenarios. Part 1: Surface air temperature

    NASA Astrophysics Data System (ADS)

    Suh, Myoung-Seok; Oh, Seok-Geun; Lee, Young-Suk; Ahn, Joong-Bae; Cha, Dong-Hyun; Lee, Dong-Kyou; Hong, Song-You; Min, Seung-Ki; Park, Seong-Chan; Kang, Hyun-Suk

    2016-05-01

    We projected surface air temperature changes over South Korea during the mid (2026-2050) and late (2076-2100) 21st century against the current climate (1981-2005) using the simulation results from five regional climate models (RCMs) driven by Hadley Centre Global Environmental Model, version 2, coupled with the Atmosphere- Ocean (HadGEM2-AO), and two ensemble methods (equal weighted averaging, weighted averaging based on Taylor's skill score) under four Representative Concentration Pathways (RCP) scenarios. In general, the five RCM ensembles captured the spatial and seasonal variations, and probability distribution of temperature over South Korea reasonably compared to observation. They particularly showed a good performance in simulating annual temperature range compared to HadGEM2-AO. In future simulation, the temperature over South Korea will increase significantly for all scenarios and seasons. Stronger warming trends are projected in the late 21st century than in the mid-21st century, in particular under RCP8.5. The five RCM ensembles projected that temperature changes for the mid/late 21st century relative to the current climate are +1.54°C/+1.92°C for RCP2.6, +1.68°C/+2.91°C for RCP4.5, +1.17°C/+3.11°C for RCP6.0, and +1.75°C/+4.73°C for RCP8.5. Compared to the temperature projection of HadGEM2-AO, the five RCM ensembles projected smaller increases in temperature for all RCP scenarios and seasons. The inter-RCM spread is proportional to the simulation period (i.e., larger in the late-21st than mid-21st century) and significantly greater (about four times) in winter than summer for all RCP scenarios. Therefore, the modeled predictions of temperature increases during the late 21st century, particularly for winter temperatures, should be used with caution.

  20. Projections of high resolution climate changes for South Korea using multiple-regional climate models based on four RCP scenarios. Part 1: surface air temperature

    NASA Astrophysics Data System (ADS)

    Suh, Myoung-Seok; Oh, Seok-Geun; Lee, Young-Suk; Ahn, Joong-Bae; Cha, Dong-Hyun; Lee, Dong-Kyou; Hong, Song-You; Min, Seung-Ki; Park, Seong-Chan; Kang, Hyun-Suk

    2016-05-01

    We projected surface air temperature changes over South Korea during the mid (2026-2050) and late (2076-2100) 21st century against the current climate (1981-2005) using the simulation results from five regional climate models (RCMs) driven by Hadley Centre Global Environmental Model, version 2, coupled with the Atmosphere- Ocean (HadGEM2-AO), and two ensemble methods (equal weighted averaging, weighted averaging based on Taylor's skill score) under four Representative Concentration Pathways (RCP) scenarios. In general, the five RCM ensembles captured the spatial and seasonal variations, and probability distribution of temperature over South Korea reasonably compared to observation. They particularly showed a good performance in simulating annual temperature range compared to HadGEM2-AO. In future simulation, the temperature over South Korea will increase significantly for all scenarios and seasons. Stronger warming trends are projected in the late 21st century than in the mid-21st century, in particular under RCP8.5. The five RCM ensembles projected that temperature changes for the mid/late 21st century relative to the current climate are +1.54°C/+1.92°C for RCP2.6, +1.68°C/+2.91°C for RCP4.5, +1.17°C/+3.11°C for RCP6.0, and +1.75°C/+4.73°C for RCP8.5. Compared to the temperature projection of HadGEM2-AO, the five RCM ensembles projected smaller increases in temperature for all RCP scenarios and seasons. The inter-RCM spread is proportional to the simulation period (i.e., larger in the late-21st than mid-21st century) and significantly greater (about four times) in winter than summer for all RCP scenarios. Therefore, the modeled predictions of temperature increases during the late 21st century, particularly for winter temperatures, should be used with caution.
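
    The two ensemble methods named above differ only in the weights applied to the individual RCM projections. The sketch below contrasts an equal-weighted mean with a mean weighted by a Taylor-type skill score computed from assumed pattern correlations and standard-deviation ratios; all model statistics and projections are made-up stand-ins for the five RCMs.

```python
# Equal-weighted vs skill-weighted ensemble means; all numbers are assumed.
import numpy as np

# Per-model pattern correlation with observations and std-dev ratio (model/obs).
corr = np.array([0.92, 0.88, 0.95, 0.80, 0.85])
sigma_ratio = np.array([1.05, 0.90, 1.10, 1.30, 0.75])
R0 = 0.99                                   # assumed maximum attainable correlation

# One common form of Taylor's (2001) skill score.
skill = 4.0 * (1.0 + corr) / ((sigma_ratio + 1.0 / sigma_ratio) ** 2 * (1.0 + R0))

# Projected annual-mean warming (deg C) from the same five hypothetical models.
delta_T = np.array([1.4, 1.7, 1.5, 2.0, 1.8])

print(f"equal-weighted ensemble change: {delta_T.mean():.2f} deg C")
print(f"skill-weighted ensemble change: {np.average(delta_T, weights=skill):.2f} deg C")
```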

  1. Assessment of vulnerability to future marine processes of urbanized coastal environments by a GIS-based approach: expected scenario in the metropolitan area of Bari (Italy)

    NASA Astrophysics Data System (ADS)

    Mancini, F.; Ceppi, C.; Christopulos, V.

    2013-12-01

    The literature on risk assessment after extreme meteorological events generally focuses on establishing relationships between the observed severe weather conditions and the impact detected over the affected zones. Such events are classified on the basis of measurements and observations able to assess the magnitude of the phenomena, or on the basis of their effects on the affected area, the latter being deeply connected with the overall physical vulnerability. However, such assessments almost never consider scenarios for expected extreme events and the possible pattern of urbanization at the time of impact, nor is the spatial and temporal uncertainty of the phenomena taken into account. Drawing up future scenarios of coastal vulnerability to marine processes is therefore difficult. This work focuses on the case study of the Metropoli Terra di Bari (metropolitan area of Bari, Apulia, Italy), where a coastal vulnerability analysis for the climate changes expected on the basis of expert opinion from the scientific community was carried out. Several possible impacts on the coastal environment were considered, in particular sea level rise inundation, flooding due to storm surge, and coastal erosion. For this purpose, the methodology based on the SRES (Special Report on Emission Scenarios) produced by the IPCC (Intergovernmental Panel on Climate Change) was adopted after a regionalization procedure as carried out by Verburgh and others (2006) at the European scale. The open source software SLEUTH, based on the cellular automata principle, was used, and the reliability of the obtained scenarios was verified through the Monte Carlo method. Once these scenarios were produced, a GIS-based multicriteria methodology was implemented to evaluate the vulnerability of the urbanized coastal area of interest. Several vulnerability maps are therefore available for the different scenarios, accounting for the degree of hazard and the potential development of the typology and extent

  2. Risk-based decision making for staggered bioterrorist attacks : resource allocation and risk reduction in "reload" scenarios.

    SciTech Connect

    Lemaster, Michelle Nicole; Gay, David M.; Ehlen, Mark Andrew; Boggs, Paul T.; Ray, Jaideep

    2009-10-01

    Staggered bioterrorist attacks with aerosolized pathogens on population centers present a formidable challenge to resource allocation and response planning. The response and planning will commence immediately after the detection of the first attack and with no or little information of the second attack. In this report, we outline a method by which resource allocation may be performed. It involves probabilistic reconstruction of the bioterrorist attack from partial observations of the outbreak, followed by an optimization-under-uncertainty approach to perform resource allocations. We consider both single-site and time-staggered multi-site attacks (i.e., a reload scenario) under conditions when resources (personnel and equipment which are difficult to gather and transport) are insufficient. Both communicable (plague) and non-communicable diseases (anthrax) are addressed, and we also consider cases when the data, the time-series of people reporting with symptoms, are confounded with a reporting delay. We demonstrate how our approach develops allocations profiles that have the potential to reduce the probability of an extremely adverse outcome in exchange for a more certain, but less adverse outcome. We explore the effect of placing limits on daily allocations. Further, since our method is data-driven, the resource allocation progressively improves as more data becomes available.

  3. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    NASA Astrophysics Data System (ADS)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-07-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
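
    The core epistemic idea above, updating sparse prior knowledge of a failure probability as evidence accumulates, can be shown with a minimal Beta-Binomial example. The prior pseudo-counts and the hypothetical field evidence below are invented; this is a generic Bayesian sketch, not the paper's event-tree/SLIM/Bayesian-network implementation.

```python
# Beta-Binomial update of a sparse-data failure probability (illustrative numbers).
from scipy import stats

# Prior belief about the per-lift failure probability of a rigging component,
# roughly equivalent to having seen 1 failure in 50 comparable lifts.
a_prior, b_prior = 1.0, 49.0

# Hypothetical new field evidence: 2 failures observed in 200 further lifts.
failures, lifts = 2, 200
a_post = a_prior + failures
b_post = b_prior + (lifts - failures)

posterior = stats.beta(a_post, b_post)
print(f"prior mean           : {a_prior / (a_prior + b_prior):.4f}")
print(f"posterior mean       : {posterior.mean():.4f}")
lo, hi = posterior.interval(0.95)
print(f"95% credible interval: [{lo:.4f}, {hi:.4f}]")
```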

  4. Performance Based Education: A Social Alchemy.

    ERIC Educational Resources Information Center

    Clements, Millard

    1982-01-01

    An exploration of performance-based education is focused through these questions: What image of human beings does it project? What image of professionals does it project? What purpose does it serve? What image of knowledge does it project? (CT)

  5. A Semantic Web-Based Authoring Tool to Facilitate the Planning of Collaborative Learning Scenarios Compliant with Learning Theories

    ERIC Educational Resources Information Center

    Isotani, Seiji; Mizoguchi, Riichiro; Isotani, Sadao; Capeli, Olimpio M.; Isotani, Naoko; de Albuquerque, Antonio R. P. L.; Bittencourt, Ig. I.; Jaques, Patricia

    2013-01-01

    When the goal of group activities is to support long-term learning, the task of designing well-thought-out collaborative learning (CL) scenarios is an important key to success. To help students adequately acquire and develop their knowledge and skills, a teacher can plan a scenario that increases the probability for learning to occur. Such a…

  6. Off-Nominal Performance of the International Space Station Solar Array Wings Under Orbital Eclipse Lighting Scenarios

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Scheiman, David A.

    2005-01-01

    This paper documents testing and analyses to quantify International Space Station (ISS) Solar Array Wing (SAW) string electrical performance under highly off-nominal, low-temperature-low-intensity (LILT) operating conditions with nonsolar light sources. This work is relevant for assessing feasibility and risks associated with a Sequential Shunt Unit (SSU) remove and replace (R&R) Extravehicular Activity (EVA). During eclipse, SAW strings can be energized by moonlight, EVA suit helmet lights or video camera lights. To quantify SAW performance under these off-nominal conditions, solar cell performance testing was performed using full moon, solar simulator and Video Camera Luminaire (VCL) light sources. Test conditions included 25 to 110 C temperatures and 1- to 0.0001-Sun illumination intensities. Electrical performance data and calculated eclipse lighting intensities were combined to predict SAW current-voltage output for comparison with electrical hazard thresholds. Worst case predictions show there is no connector pin molten metal hazard but crew shock hazard limits are exceeded due to VCL illumination. Assessment uncertainties and limitations are discussed along with operational solutions to mitigate SAW electrical hazards from VCL illumination. Results from a preliminary assessment of SAW arcing are also discussed. The authors recommend further analyses once SSU, R&R, and EVA procedures are better defined.
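
    The collapse of string output at very low illumination can be seen from an ideal single-diode cell model, sketched below with generic cell parameters (not ISS SAW data); series and shunt resistance and the temperature dependence of the saturation current are ignored.

```python
# Ideal single-diode cell model; series/shunt resistance and the temperature
# dependence of the saturation current are ignored. Parameters are generic.
import numpy as np

q, k_B = 1.602e-19, 1.381e-23

def cell_current(V, intensity, T=300.0, isc_1sun=2.5, i0=1e-10, n=1.3):
    """Cell current (A) at voltage V for a given relative illumination intensity."""
    Vt = n * k_B * T / q
    iph = isc_1sun * intensity            # photocurrent roughly proportional to irradiance
    return iph - i0 * (np.exp(V / Vt) - 1.0)

V = np.linspace(0.0, 0.85, 500)
for intensity in (1.0, 1e-2, 1e-4):       # full Sun down to very dim illumination
    I = np.clip(cell_current(V, intensity), 0.0, None)
    P = V * I
    print(f"intensity {intensity:7.0e} Sun: Isc = {I[0]:.2e} A, Pmax = {P.max():.2e} W")
```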

  7. Material Performance of Fully-Ceramic Micro-Encapsulated Fuel under Selected LWR Design Basis Scenarios: Final Report

    SciTech Connect

    B. Boer; R. S. Sen; M. A. Pope; A. M. Ougouag

    2011-09-01

    The extension to LWRs of the use of Deep-Burn coated particle fuel envisaged for HTRs has been investigated. TRISO coated fuel particles are used in Fully-Ceramic Microencapsulated (FCM) fuel within a SiC matrix rather than the graphite of HTRs. TRISO particles are well characterized for uranium-fueled HTRs. However, operating conditions of LWRs are different from those of HTRs (temperature, neutron energy spectrum, fast fluence levels, power density). Furthermore, the time scales of transient core behavior during accidents are usually much shorter and thus more severe in LWRs. The PASTA code was updated for analysis of stresses in coated particle FCM fuel. The code extensions enable the automatic use of neutronic data (burnup, fast fluence as a function of irradiation time) obtained using the DRAGON neutronics code. An input option for automatic evaluation of temperature rise during anticipated transients was also added. A new thermal model for FCM was incorporated into the code, as were updated correlations (for pyrocarbon coating layers) suitable for estimating dimensional changes at the high fluence levels attained in LWR DB fuel. Analyses of the FCM fuel using the updated PASTA code under nominal and accident conditions show: (1) Stress levels in SiC coatings are low for low fission gas release (FGR) fractions of several percent, as based on data of fission gas diffusion in UO2 kernels. However, the high burnup level of LWR-DB fuel implies that the FGR fraction is more likely to be in the range of 50-100%, similar to Inert Matrix Fuels (IMFs). For this range the predicted stresses and failure fractions of the SiC coating are high for the reference particle design (500 µm kernel diameter, 100 µm buffer, 35 µm IPyC, 35 µm SiC, 40 µm OPyC). A conservative case, assuming 100% FGR, 900 K fuel temperature and 705 MWd/kg (77% FIMA) fuel burnup, results in a 8.0 × 10^-2 failure probability. For a 'best-estimate' FGR fraction
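
    A back-of-the-envelope version of the kind of quantity such a fuel-performance code evaluates is sketched below: an ideal-gas estimate of the internal fission-gas pressure in a TRISO particle of roughly the reference geometry, and the resulting thin-shell hoop stress in the SiC layer. The released gas inventory, buffer porosity and temperature are assumed values, and the calculation is not the PASTA model.

```python
# Ideal-gas internal pressure and thin-shell SiC hoop stress for a TRISO particle of
# roughly the reference geometry; gas inventory, porosity and temperature are assumed.
import math

R_gas = 8.314                 # J/(mol K)
T = 900.0                     # fuel temperature (K)

r_kernel = 250e-6             # kernel radius (m), i.e. 500 um diameter
t_buffer, t_ipyc, t_sic = 100e-6, 35e-6, 35e-6   # layer thicknesses (m)
r_sic = r_kernel + t_buffer + t_ipyc             # inner radius of the SiC shell (m)

porosity = 0.5                # fraction of buffer volume open to gas (assumed)
free_vol = porosity * 4.0 / 3.0 * math.pi * ((r_kernel + t_buffer) ** 3 - r_kernel ** 3)

n_gas = 1.0e-6                # mol of released fission gas per particle (assumed)
p = n_gas * R_gas * T / free_vol                 # ideal-gas internal pressure (Pa)
sigma_hoop = p * r_sic / (2.0 * t_sic)           # thin-shell stress in a spherical shell

print(f"internal gas pressure: {p / 1e6:.0f} MPa")
print(f"SiC hoop stress      : {sigma_hoop / 1e6:.0f} MPa")
```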

  8. Projected climate change impacts on North Sea and Baltic Sea: CMIP3 and CMIP5 model based scenarios

    NASA Astrophysics Data System (ADS)

    Pushpadas, D.; Schrum, C.; Daewel, U.

    2015-08-01

    Climate change impacts on the marine biogeochemistry and lower trophic level dynamics in the North Sea and Baltic Sea have been assessed using regional downscaling in a number of recent studies. However, most of these were forced only by physical conditions from Global Climate Models (GCMs); regional downscalings that consider the climate change impact on oceanic nutrient conditions from Global Earth System Models (ESMs) are rare and so far solely based on CMIP3-generation climate models. The few studies published show a large range in projected future primary production and hydrodynamic conditions. With the addition of CMIP5 models and scenarios, the demand to explore the uncertainty in regional climate change projections increased. Moreover, the question arises how projections based on CMIP5-generation models compare to earlier projections and multi-model ensembles comprising both AR4 and AR5 generation forcing models. Here, we investigated the potential future climate change impacts on the North Sea and the Baltic Sea ecosystem using a coherent regional downscaling strategy based on the regional coupled bio-physical model ECOSMO. ECOSMO was forced by output from different ESMs from both CMIP3 and CMIP5 models. Multi-model ensembles using CMIP3/A1B and CMIP5/RCP4.5 scenarios are examined, where the selected CMIP5 models are the successors of the chosen CMIP3 models. Comparing projected changes with the present-day reference condition, all these simulations predicted an increase in Sea Surface Temperature (SST) in both the North Sea and Baltic Sea, a reduction in sea ice in the Baltic, a decrease in primary production in the North Sea and an increase in primary production in the Baltic Sea. Despite these largely consistent results on the direction of the projected changes, our results revealed a broad range in the amplitude of projected climate change impacts. Our study strengthens the claim that the choice of the ESM is a major factor for regional climate projections

  9. Relation of student characteristics to learning of basic biochemistry concepts from a multimedia goal-based scenario

    NASA Astrophysics Data System (ADS)

    Schoenfeld-Tacher, Regina M.

    2000-10-01

    The purpose of this study was to investigate the relationship of several cognitive and demographic variables to learning outcomes from a multimedia Goal-Based Scenario lesson on DNA. The demographic variables under investigation were: gender, ethnicity, prior science coursework in college and high school, final score in current chemistry course and prior experience with computers. The cognitive variables under study were logical thinking ability, spatial ability and disembedding ability. The subjects for this study were a total of 525 college students enrolled in introductory chemistry classes for non-majors at one of four participating institutions in the U.S. and Canada. Of these participants, 488 formed the experimental group and 37 formed a control group. All subjects completed content pre- and post-tests, a demographic questionnaire and three cognitive tests: Test of Logical Thinking, Hidden Figures Test and Purdue Visualization of Rotations Test. Students in the experimental group used "Whodunnit?," a multimedia Goal-Based Scenario to teach basic biochemistry concepts pertaining to DNA, while students in the control group completed a similar paper-based activity. A combination of general linear models and linear regression analysis was used to examine the data obtained. Post-hoc analyses were conducted for categorical variables when they were found to be significant contributors to the model tested. The results showed that there was no relationship between gender or ethnicity and academic Outcomes. Prior science coursework completed in college did not show a relationship with post-test scores, although number of science courses completed in high school was a significant predictor of academic outcomes. A relationship was observed between course rank and learning outcomes, as students with final course grades in the upper quartile of the sample scored significantly higher on the post-test than all others. The amount of variance in outcomes explained by prior

  10. Evaluation of shallow landslide-triggering scenarios through a physically based approach: an example of application in the southern Messina area (northeastern Sicily, Italy)

    NASA Astrophysics Data System (ADS)

    Schilirò, L.; Esposito, C.; Scarascia Mugnozza, G.

    2015-09-01

    Rainfall-induced shallow landslides are a widespread phenomenon that frequently causes substantial damage to property, as well as numerous casualties. In recent years a wide range of physically based models have been developed to analyze the triggering process of these events. Specifically, in this paper we propose an approach for the evaluation of different shallow landslide-triggering scenarios by means of the TRIGRS (transient rainfall infiltration and grid-based slope stability) numerical model. For the validation of the model, a back analysis of the landslide event that occurred in the study area (located SW of Messina, northeastern Sicily, Italy) on 1 October 2009 was performed, using different methods and techniques for the definition of the input parameters. After evaluating the reliability of the model through comparison with the 2009 landslide inventory, different triggering scenarios were defined using rainfall values derived from the rainfall probability curves, reconstructed on the basis of daily and hourly historical rainfall data. The results emphasize how likely these phenomena are to occur in the area, given that even short-duration (1-3 h) rainfall events with a relatively low return period (e.g., 10-20 years) can trigger numerous slope failures. Furthermore, for the same rainfall amount, the daily simulations underestimate the instability conditions. The high susceptibility of this area to shallow landslides is attested by the high number of landslide/flood events that have occurred in the past and are summarized in this paper by means of archival research. Considering the main features of the proposed approach, the authors suggest that this methodology could be applied to different areas, even for the development of landslide early warning systems.
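
    TRIGRS evaluates, cell by cell, an infinite-slope factor of safety that falls as transient infiltration raises the pressure head at the potential failure surface. The sketch below computes that factor of safety for a single hypothetical cell at increasing pressure heads; the soil parameters are illustrative, not the Messina calibration.

```python
# Infinite-slope factor of safety at a single hypothetical cell for increasing
# pressure heads; soil parameters are illustrative only.
import math

phi = math.radians(34.0)      # effective friction angle
c = 5.0e3                     # effective cohesion (Pa)
beta = math.radians(35.0)     # slope angle
gamma_s = 19.0e3              # soil unit weight (N/m^3)
gamma_w = 9.81e3              # water unit weight (N/m^3)
z = 1.5                       # depth of the potential failure surface (m)

def factor_of_safety(psi):
    """psi: pressure head (m) at depth z, rising as rainfall infiltrates."""
    return (math.tan(phi) / math.tan(beta)
            + (c - psi * gamma_w * math.tan(phi))
            / (gamma_s * z * math.sin(beta) * math.cos(beta)))

for psi in (0.0, 0.5, 1.0, 1.5):
    print(f"pressure head {psi:.1f} m -> FS = {factor_of_safety(psi):.2f}")
```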

  11. Performance-Based Funding in Public Schools.

    ERIC Educational Resources Information Center

    Foster, Charles A.; Marquart, Deanna J.

    This report examines three performance-based funding (PBF) plans: (1) merit pay for teachers and/or administrators; (2) career ladders; and (3) formula-driven incentive payments to schools. The report contends that present-day problems in public schools result largely from the organizational structure of the educational enterprise. Being based on…

  12. Using an Animated Case Scenario Based on Constructivist 5E Model to Enhance Pre-Service Teachers' Awareness of Electrical Safety

    ERIC Educational Resources Information Center

    Hirca, Necati

    2013-01-01

    The objective of this study is to get pre-service teachers to develop an awareness of first aid knowledge and skills related to electrical shocking and safety within a scenario based animation based on a Constructivist 5E model. The sample of the study was composed of 78 (46 girls and 32 boys) pre-service classroom teachers from two faculties of…

  13. Medical Content Searching, Retrieving, and Sharing Over the Internet: Lessons Learned From the mEducator Through a Scenario-Based Evaluation

    PubMed Central

    Spachos, Dimitris; Mylläri, Jarkko; Giordano, Daniela; Dafli, Eleni; Mitsopoulou, Evangelia; Schizas, Christos N; Pattichis, Constantinos; Nikolaidou, Maria

    2015-01-01

    Background The mEducator Best Practice Network (BPN) implemented and extended standards and reference models in e-learning to develop innovative frameworks as well as solutions that enable specialized state-of-the-art medical educational content to be discovered, retrieved, shared, and re-purposed across European Institutions, targeting medical students, doctors, educators and health care professionals. Scenario-based evaluation for usability testing, complemented with data from online questionnaires and field notes of users’ performance, was designed and utilized for the evaluation of these solutions. Objective The objective of this work is twofold: (1) to describe one instantiation of the mEducator BPN solutions (mEducator3.0 - “MEdical Education LINnked Arena” MELINA+) with a focus on the metadata schema used, as well as on other aspects of the system that pertain to usability and acceptance, and (2) to present evaluation results on the suitability of the proposed metadata schema for searching, retrieving, and sharing of medical content and with respect to the overall usability and acceptance of the system from the target users. Methods A comprehensive evaluation methodology framework was developed and applied to four case studies, which were conducted in four different countries (ie, Greece, Cyprus, Bulgaria and Romania), with a total of 126 participants. In these case studies, scenarios referring to creating, sharing, and retrieving medical educational content using mEducator3.0 were used. The data were collected through two online questionnaires, consisting of 36 closed-ended questions and two open-ended questions that referred to mEducator 3.0 and through the use of field notes during scenario-based evaluations. Results The main findings of the study showed that even though the informational needs of the mEducator target groups were addressed to a satisfactory extent and the metadata schema supported content creation, sharing, and retrieval from an end

  14. Nitride fuels irradiation performance data base

    SciTech Connect

    Brozak, D.E.; Thomas, J.K.; Peddicord, K.L.

    1987-01-01

    An irradiation performance data base for nitride fuels has been developed from an extensive literature search and review that emphasized uranium nitride, but also included performance data for mixed nitrides ((U,Pu)N) and carbonitrides ((U,Pu)C,N) to increase the quantity and depth of pin data available. This work represents a very extensive effort to systematically collect and organize irradiation data for nitride-based fuels. The data base has many potential applications. First, it can facilitate parametric studies of nitride-based fuels to be performed using a wide range of pin designs and operating conditions. This should aid in the identification of important parameters and design requirements for multimegawatt and SP-100 fuel systems. Secondly, the data base can be used to evaluate fuel performance models. For detailed studies, it can serve as a guide to selecting a small group of pin specimens for extensive characterization. Finally, the data base will serve as an easily accessible and expandable source of irradiation performance information for nitride fuels.

  15. Moving from pixels to parcels: Modeling agricultural scenarios in the northern Great Plains using a hybrid raster- and vector-based approach

    NASA Astrophysics Data System (ADS)

    Sohl, T.; Wika, S.; Dornbierer, J.; Sayler, K. L.; Quenzer, R.

    2015-12-01

    Policy and economic driving forces have resulted in a higher demand for biofuel feedstocks in recent years, resulting in substantial increases in cultivated cropland in the northern Great Plains. A cellulosic-based biofuel industry could potentially further impact the region, with grassland and marginal agricultural land converted to perennial grasses or other feedstocks. Scenarios of projected land-use change are needed to enable regional stakeholders to plan for the potential consequences of expanded agricultural activity. Land-use models used to produce spatially explicit scenarios are typically raster-based and are poor at representing ownership units on which land-use change is based. This work describes a hybrid raster/vector-based modeling approach for modeling scenarios of agricultural change in the northern Great Plains. Regional scenarios of agricultural change from 2012 to 2050 were constructed, based partly on the U.S. Department of Energy's Billion Ton Update. Land-use data built from the 2012 Cropland Data Layer and the 2011 National Land Cover Database was used to establish initial conditions. Field boundaries from the U.S. Department of Agriculture's Common Land Unit dataset were used to establish ownership units. A modified version of the U.S. Geological Survey's Forecasting Scenarios of land-use (FORE-SCE) model was used to ingest vector-based field boundaries to facilitate the modeling of a farmer's choice of land use for a given year, while patch-based raster methodologies were used to represent expansion of urban/developed lands and other land use conversions. All modeled data were merged to a common raster dataset representing annual land use from 2012 to 2050. The hybrid modeling approach enabled the use of traditional, raster-based methods while integrating vector-based data to represent agricultural fields and other ownership-based units upon which land-use decisions are typically made.

  16. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project. [Assessment of post-closure performance for a proposed repository for high-level nuclear waste

    SciTech Connect

    Roberds, W.J.; Plum, R.J.; Visca, P.J.

    1984-11-01

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab.

  17. Variability of tsunami inundation footprints considering stochastic scenarios based on a single rupture model: Application to the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Goda, Katsuichiro; Yasuda, Tomohiro; Mori, Nobuhito; Mai, P. Martin

    2015-06-01

    The sensitivity and variability of spatial tsunami inundation footprints in coastal cities and towns due to a megathrust subduction earthquake in the Tohoku region of Japan are investigated by considering different fault geometries and slip distributions. Stochastic tsunami scenarios are generated based on the spectral analysis and synthesis method applied to an inverted source model. To assess spatial inundation processes accurately, tsunami modeling is conducted using bathymetry and elevation data with 50 m grid resolution. Using the developed methodology for assessing the variability of tsunami hazard estimates, stochastic inundation depth maps can be generated for local coastal communities. These maps are important for improving disaster preparedness by conveying the consequences of different situations/conditions and by communicating the uncertainty associated with hazard predictions. The analysis indicates that the sensitivity of inundation areas to the geometrical parameters (i.e., top-edge depth, strike, and dip) depends on the tsunami source characteristics and the site location, and is therefore complex and highly nonlinear. The variability assessment of inundation footprints indicates a significant influence of slip distributions. In particular, topographical features of the region, such as ria coasts and near-shore plains, have a major influence on the tsunami inundation footprints.
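
    The spectral analysis-and-synthesis step can be illustrated in one dimension: the amplitude spectrum of an inverted slip profile is retained while phases are randomized, producing stochastic slip realizations with similar spectral content. The toy profile and the purely random phases below are assumptions for illustration; the actual method works on a two-dimensional slip model with constraints from the inversion.

        # Minimal 1D sketch of spectral analysis-and-synthesis: keep the amplitude spectrum
        # of an inverted slip profile and randomize the phases to generate stochastic slip
        # realizations with similar spectral content (toy profile, not the Tohoku inversion).
        import numpy as np

        rng = np.random.default_rng(11)
        x = np.linspace(0, 1, 256)
        inverted_slip = 10 * np.exp(-((x - 0.4) / 0.15) ** 2)     # toy "inverted" slip profile

        spec = np.fft.rfft(inverted_slip)
        amplitude = np.abs(spec)

        for _ in range(3):
            phase = rng.uniform(0, 2 * np.pi, amplitude.size)
            phase[0] = 0.0                                        # keep the mean (zero frequency) real
            synth = np.fft.irfft(amplitude * np.exp(1j * phase), n=inverted_slip.size)
            synth = np.clip(synth, 0, None)                       # slip cannot be negative
            print("peak slip of stochastic realization:", round(float(synth.max()), 2))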

  18. Combined magnetic and kinetic control of advanced tokamak steady state scenarios based on semi-empirical modelling

    NASA Astrophysics Data System (ADS)

    Moreau, D.; Artaud, J. F.; Ferron, J. R.; Holcomb, C. T.; Humphreys, D. A.; Liu, F.; Luce, T. C.; Park, J. M.; Prater, R.; Turco, F.; Walker, M. L.

    2015-06-01

    This paper shows that semi-empirical data-driven models based on a two-time-scale approximation for the magnetic and kinetic control of advanced tokamak (AT) scenarios can be advantageously identified from simulated rather than real data, and used for control design. The method is applied to the combined control of the safety factor profile, q(x), and normalized pressure parameter, βN, using DIII-D parameters and actuators (on-axis co-current neutral beam injection (NBI) power, off-axis co-current NBI power, electron cyclotron current drive power, and ohmic coil). The approximate plasma response model was identified from simulated open-loop data obtained using a rapidly converging plasma transport code, METIS, which includes an MHD equilibrium and current diffusion solver, and combines plasma transport nonlinearity with 0D scaling laws and 1.5D ordinary differential equations. The paper discusses the results of closed-loop METIS simulations, using the near-optimal ARTAEMIS control algorithm (Moreau D et al 2013 Nucl. Fusion 53 063020) for steady state AT operation. With feedforward plus feedback control, the steady state target q-profile and βN are satisfactorily tracked with a time scale of about 10 s, despite large disturbances applied to the feedforward powers and plasma parameters. The robustness of the control algorithm with respect to disturbances of the H&CD actuators and of plasma parameters such as the H-factor, plasma density and effective charge, is also shown.
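
    The two-time-scale idea of combining a model-based feedforward with slow integral feedback can be illustrated on a toy linear response model; the sketch below uses invented response matrices, gains and targets and is not the ARTAEMIS algorithm or the METIS code.

        # Minimal sketch of feedforward + integral feedback on a linear two-output response
        # model (illustrative numbers only; not the ARTAEMIS controller or METIS model).
        import numpy as np

        A = np.array([[-0.5, 0.1],          # assumed slow linear response of [q0, betaN]
                      [0.05, -0.8]])
        B = np.array([[0.4, 0.0],           # two actuators, e.g. two heating/CD powers
                      [0.1, 0.6]])

        target = np.array([1.8, 2.5])       # hypothetical steady-state targets for [q0, betaN]
        u_ff = np.linalg.solve(B, -A @ target)   # feedforward: makes the target an equilibrium

        x = np.array([1.2, 1.5])            # initial state
        integral = np.zeros(2)
        dt, Ki = 0.1, 0.8
        for k in range(300):                # simulate the slow dynamics
            err = target - x
            integral += err * dt
            u = u_ff + np.linalg.solve(B, Ki * integral)   # feedback correction via the model
            x = x + dt * (A @ x + B @ u)
        print("final state:", np.round(x, 3), "target:", target)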

  19. Preliminary Safety Analysis of the Gorleben Site: Safety Concept and Application to Scenario Development Based on a Site-Specific Features, Events and Processes (FEP) Database - 13304

    SciTech Connect

    Moenig, Joerg; Beuth, Thomas; Wolf, Jens; Lommerzheim, Andre; Mrugalla, Sabine

    2013-07-01

    Based upon the German safety criteria, released in 2010 by the Federal Ministry of the Environment (BMU), a safety concept and a safety assessment concept for the disposal of heat-generating high-level waste have both been developed in the framework of the preliminary safety case for the Gorleben site (Project VSG). The main objective of the disposal is to contain the radioactive waste inside a defined rock zone, which is called the containment-providing rock zone. The radionuclides shall remain essentially at the emplacement site, and at most a small defined quantity of material shall be able to leave this rock zone. This shall be accomplished by the geological barrier and a technical barrier system, which is required to seal the inevitable penetration of the geological barrier by the construction of the mine. The safe containment has to be demonstrated for probable and less probable evolutions of the site, while evolutions with very low probability (less than 1% over the demonstration period of 1 million years) need not be considered. Owing to the uncertainty in predicting the real evolution of the site, plausible scenarios have been derived in a systematic manner. To this end, a comprehensive site-specific features, events and processes (FEP) data base for the Gorleben site has been developed. The safety concept was directly taken into account, e.g., by identification of FEP with a direct influence on the barriers that provide the containment. No effort was spared to identify the interactions of the FEP, their probabilities of occurrence, and their characteristics (values). The information stored in the data base provided the basis for the development of scenarios. The scenario development methodology is based on FEP related to an impairment of the functionality of a subset of barriers, called initial barriers. By taking these FEP into account in their probable characteristics, the reference scenario is derived. Thus, the reference scenario describes a

  20. Class diagram based evaluation of software performance

    NASA Astrophysics Data System (ADS)

    Pham, Huong V.; Nguyen, Binh N.

    2013-03-01

    The evaluation of software performance in the early stages of the software life cycle is important and has been widely studied. Within software model specification, the class diagram is the key object-oriented specification model. Measures based on class diagrams have been widely studied for evaluating software quality attributes such as complexity, maintainability, and reuse capability. However, software performance evaluation based on the class model has not been widely studied, especially for the object-oriented design of embedded software. Therefore, in this paper we propose a new approach to evaluate software performance directly from class diagrams. From a class diagram, we determine the parameters used to evaluate and build formulas for measures such as Size of Class Variables, Size of Class Methods, Size of Instance Variables, and Size of Instance Methods. We then analyze the dependence of performance on these measures and build a performance evaluation function from the class diagram. This evaluation function can then be used to choose the best class diagram.
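
    A minimal sketch of the kind of measure extraction described above is shown below: size measures are computed from a toy class-diagram description and combined into a weighted score. The weights and the linear form of the evaluation function are illustrative assumptions, not the function derived in the paper.

        # Minimal sketch: derive size measures from a toy class-diagram description and
        # combine them into an illustrative performance-evaluation score. The weights and
        # the linear form are assumptions for demonstration, not the paper's function.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class ClassSpec:
            name: str
            class_variables: List[str] = field(default_factory=list)
            class_methods: List[str] = field(default_factory=list)
            instance_variables: List[str] = field(default_factory=list)
            instance_methods: List[str] = field(default_factory=list)

        def measures(diagram):
            """Aggregate the size measures named in the abstract over all classes."""
            return {
                "SCV": sum(len(c.class_variables) for c in diagram),
                "SCM": sum(len(c.class_methods) for c in diagram),
                "SIV": sum(len(c.instance_variables) for c in diagram),
                "SIM": sum(len(c.instance_methods) for c in diagram),
            }

        def performance_score(m, weights={"SCV": 1.0, "SCM": 2.0, "SIV": 1.5, "SIM": 2.5}):
            """Lower is better here: a weighted size penalty standing in for estimated cost."""
            return sum(weights[k] * v for k, v in m.items())

        diagram_a = [ClassSpec("Sensor", ["units"], ["calibrate"], ["value"], ["read", "reset"])]
        diagram_b = [ClassSpec("Sensor", [], [], ["value", "units", "offset"],
                               ["read", "reset", "calibrate", "log"])]
        for name, d in [("A", diagram_a), ("B", diagram_b)]:
            print(name, measures(d), "score:", performance_score(measures(d)))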

  1. High performance pitch-based carbon fiber

    SciTech Connect

    Tadokoro, Hiroyuki; Tsuji, Nobuyuki; Shibata, Hirotaka; Furuyama, Masatoshi

    1996-12-31

    A high-performance pitch-based carbon fiber with a smaller diameter (six microns) was developed by Nippon Graphite Fiber Corporation. This fiber possesses high tensile modulus, high tensile strength, excellent yarn handleability, a low thermal expansion coefficient, and high thermal conductivity, which make it an ideal material for space applications such as artificial satellites. The performance of this fiber as a reinforcement of composites was sufficient. With these characteristics, this pitch-based carbon fiber is expected to find a wide variety of applications in space structures, industrial fields, sporting goods, and civil infrastructure.

  2. Implementation and Analysis of a Wireless Sensor Network-Based Pet Location Monitoring System for Domestic Scenarios.

    PubMed

    Aguirre, Erik; Lopez-Iturri, Peio; Azpilicueta, Leyre; Astrain, José Javier; Villadangos, Jesús; Santesteban, Daniel; Falcone, Francisco

    2016-01-01

    The flexibility of new-age wireless networks and the variety of sensors available to measure a high number of variables lead to new scenarios where anything can be monitored by small electronic devices, thereby implementing Wireless Sensor Networks (WSN). Thanks to ZigBee, RFID or WiFi networks, the precise location of humans or animals as well as some biological parameters can be known in real time. However, since wireless sensors must be attached to biological tissues, which are highly dispersive, the propagation of electromagnetic waves must be studied to deploy an efficient and well-working network. The main goal of this work is to study the influence of wireless channel limitations on the operation of a specific pet monitoring system, validated at the physical channel level as well as at the functional level. In this sense, radio wave propagation produced by ZigBee devices operating in the ISM 2.4 GHz band is studied through an in-house developed 3D Ray Launching simulation tool, in order to analyze coverage/capacity relations for optimal system selection as well as deployment strategy in terms of the number and location of transceivers. Furthermore, a simplified dog model is developed for the simulation code, considering not only its morphology but also its dielectric properties. Relevant wireless channel information such as power distribution, power delay profile, and delay spread graphs is obtained, providing an extensive wireless channel analysis. A functional dog monitoring system is presented, operating over the implemented ZigBee network and providing real-time information to Android-based devices. The proposed system can be scaled to consider different types of domestic pets as well as new user-based functionalities. PMID:27589751
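
    For a rough sense of how coverage planning follows from the channel analysis, the sketch below uses a simple log-distance path-loss budget at 2.4 GHz instead of the full 3D Ray Launching simulation; the transmit power, path-loss exponent and receiver sensitivity are assumed values.

        # Minimal sketch: log-distance path-loss budget at 2.4 GHz (illustrative parameters),
        # standing in for the full 3D Ray Launching simulation described in the abstract.
        import math

        def path_loss_db(d_m, f_hz=2.4e9, n=2.8, d0=1.0):
            """Log-distance model: free-space loss at d0 plus 10*n*log10(d/d0)."""
            c = 3e8
            fspl_d0 = 20 * math.log10(4 * math.pi * d0 * f_hz / c)
            return fspl_d0 + 10 * n * math.log10(d_m / d0)

        tx_power_dbm = 0.0          # typical low-power ZigBee node (assumed)
        sensitivity_dbm = -95.0     # assumed receiver sensitivity

        for d in (2, 5, 10, 20, 40):
            rx = tx_power_dbm - path_loss_db(d)
            margin = rx - sensitivity_dbm
            print(f"d = {d:3d} m  Rx = {rx:6.1f} dBm  margin = {margin:5.1f} dB")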

  3. Enhanced Confinement Scenarios Without Large Edge Localized Modes in Tokamaks: Control, Performance, and Extrapolability Issues for ITER

    SciTech Connect

    Maingi, R

    2014-07-01

    Large edge localized modes (ELMs) typically accompany good H-mode confinement in fusion devices, but can present problems for plasma facing components because of high transient heat loads. Here the range of techniques for ELM control deployed in fusion devices is reviewed. The two baseline strategies in the ITER baseline design are emphasized: rapid ELM triggering and peak heat flux control via pellet injection, and the use of magnetic perturbations to suppress or mitigate ELMs. While both of these techniques are moderately well developed, with reasonable physical bases for projecting to ITER, differing observations between multiple devices are also discussed to highlight the needed community R & D. In addition, recent progress in ELM-free regimes, namely Quiescent H-mode, I-mode, and Enhanced Pedestal H-mode is reviewed, and open questions for extrapolability are discussed. Finally progress and outstanding issues in alternate ELM control techniques are reviewed: supersonic molecular beam injection, edge electron cyclotron heating, lower hybrid heating and/or current drive, controlled periodic jogs of the vertical centroid position, ELM pace-making via periodic magnetic perturbations, ELM elimination with lithium wall conditioning, and naturally occurring small ELM regimes.

  4. Columnar modelling of nucleation burst evolution in the convective boundary layer - first results from a feasibility study Part III: Preliminary results on physicochemical model performance using two "clean air mass" reference scenarios

    NASA Astrophysics Data System (ADS)

    Hellmuth, O.

    2006-09-01

    In Paper I of four papers, a revised columnar high-order model to investigate gas-aerosol-turbulence interactions in the convective boundary layer (CBL) was proposed. In Paper II, the model capability to predict first-, second- and third-order moments of meteorological variables in the CBL was demonstrated using available observational data. In the present Paper III, the high-order modelling concept is extended to sulphur and ammonia chemistry as well as to aerosol dynamics. Based on the previous CBL simulation, a feasibility study is performed using two "clean air mass" scenarios with an emission source at the ground but low aerosol background concentration. Such scenarios synoptically correspond to the advection of fresh post-frontal air in an anthropogenically influenced region. The aim is to evaluate the time-height evolution of ultrafine condensation nuclei (UCNs) and to elucidate the interactions between meteorological and physicochemical variables in a CBL column. The scenarios differ in the treatment of new particle formation (NPF); in both, homogeneous nucleation according to classical nucleation theory (CNT) is considered. The first scenario considers nucleation of a binary system consisting of water vapour and sulphuric acid (H2SO4) vapour, the second one nucleation of a ternary system additionally involving ammonia (NH3). Here, the two synthetic scenarios are discussed in detail, with special attention paid to the role of turbulence in the formation of the typical UCN burst behaviour that can often be observed in the surface layer. The intercomparison of the two scenarios reveals large differences in the evolution of the UCN number concentration in the surface layer as well as in the time-height cross-sections of first-order moments and double correlation terms. Although in both cases the occurrence of NPF bursts could be simulated, the burst characteristics and genesis of the bursts are completely different. It is demonstrated that

  5. Climate Change Effects on Heat- and Cold-Related Mortality in the Netherlands: A Scenario-Based Integrated Environmental Health Impact Assessment

    PubMed Central

    Huynen, Maud M. T. E.; Martens, Pim

    2015-01-01

    Although people will most likely adjust to warmer temperatures, it is still difficult to assess what this adaptation will look like. This scenario-based integrated health impacts assessment explores baseline (1981–2010) and future (2050) population attributable fractions (PAF) of mortality due to heat (PAFheat) and cold (PAFcold), by combining observed temperature–mortality relationships with the Dutch KNMI’14 climate scenarios and three adaptation scenarios. The 2050 model results without adaptation reveal a decrease in PAFcold (8.90% at baseline; 6.56%–7.85% in 2050) that outweighs the increase in PAFheat (1.15% at baseline; 1.66%–2.52% in 2050). When the 2050 model runs applying the different adaptation scenarios are considered as well, however, the PAFheat ranges between 0.94% and 2.52% and the PAFcold between 6.56% and 9.85%. Hence, PAFheat and PAFcold can decrease as well as increase in view of climate change (depending on the adaptation scenario). The associated annual mortality burdens in 2050—accounting for both the increasing temperatures and mortality trend—show that heat-related deaths will range between 1879 and 5061 (1511 at baseline) and cold-related deaths between 13,149 and 19,753 (11,727 at baseline). Our results clearly illustrate that model outcomes are not only highly dependent on climate scenarios, but also on adaptation assumptions. Hence, a better understanding of (the impact of various) plausible adaptation scenarios is required to advance future integrated health impact assessments. PMID:26512680
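
    The population attributable fractions reported above can be illustrated with a simple calculation that applies an assumed log-linear exposure-response curve to a daily temperature series and averages the daily attributable fractions (assuming deaths are spread evenly over days); all coefficients, thresholds and temperatures below are synthetic, not the study's estimates.

        # Minimal sketch of a temperature-attributable fraction calculation with an assumed
        # log-linear exposure-response curve; the coefficients, threshold and temperature
        # series are illustrative, not the Dutch study's observed relationships.
        import numpy as np

        rng = np.random.default_rng(0)
        daily_temp = rng.normal(loc=10.5, scale=6.5, size=365)   # synthetic daily mean temps (degC)

        T_MM = 16.5            # assumed minimum-mortality temperature
        beta_heat = 0.015      # assumed log-RR per degC above T_MM
        beta_cold = 0.012      # assumed log-RR per degC below T_MM

        def attributable_fraction(temps):
            rr = np.where(temps > T_MM, np.exp(beta_heat * (temps - T_MM)),
                          np.exp(beta_cold * (T_MM - temps)))
            af_daily = (rr - 1.0) / rr            # daily attributable fraction of deaths
            heat = af_daily[temps > T_MM].sum() / len(temps)
            cold = af_daily[temps <= T_MM].sum() / len(temps)
            return heat, cold

        paf_heat, paf_cold = attributable_fraction(daily_temp)
        print(f"PAF_heat = {paf_heat:.2%}, PAF_cold = {paf_cold:.2%}")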

  6. Source-Based Modeling Of Urban Stormwater Quality Response to the Selected Scenarios Combining Future Changes in Climate and Socio-Economic Factors.

    PubMed

    Borris, Matthias; Leonhardt, Günther; Marsalek, Jiri; Österlund, Heléne; Viklander, Maria

    2016-08-01

    The assessment of future trends in urban stormwater quality should be most helpful for ensuring the effectiveness of the existing stormwater quality infrastructure in the future and mitigating the associated impacts on receiving waters. Combined effects of expected changes in climate and socio-economic factors on stormwater quality were examined in two urban test catchments by applying a source-based computer model (WinSLAMM) for TSS and three heavy metals (copper, lead, and zinc) for various future scenarios. Generally, both catchments showed similar responses to the future scenarios and pollutant loads were generally more sensitive to changes in socio-economic factors (i.e., increasing traffic intensities, growth and intensification of the individual land-uses) than in the climate. Specifically, for the selected Intermediate socio-economic scenario and two climate change scenarios (RCP = 2.6 and 8.5), the TSS loads from both catchments increased by about 10 % on average, but when applying the Intermediate climate change scenario (RCP = 4.5) for two SSPs, the Sustainability and Security scenarios (SSP1 and SSP3), the TSS loads increased on average by 70 %. Furthermore, it was observed that well-designed and maintained stormwater treatment facilities targeting local pollution hotspots exhibited the potential to significantly improve stormwater quality, however, at potentially high costs. In fact, it was possible to reduce pollutant loads from both catchments under the future Sustainability scenario (on average, e.g., TSS were reduced by 20 %), compared to the current conditions. The methodology developed in this study was found useful for planning climate change adaptation strategies in the context of local conditions. PMID:27153819

  7. Source-Based Modeling Of Urban Stormwater Quality Response to the Selected Scenarios Combining Future Changes in Climate and Socio-Economic Factors

    NASA Astrophysics Data System (ADS)

    Borris, Matthias; Leonhardt, Günther; Marsalek, Jiri; Österlund, Heléne; Viklander, Maria

    2016-08-01

    The assessment of future trends in urban stormwater quality should be most helpful for ensuring the effectiveness of the existing stormwater quality infrastructure in the future and mitigating the associated impacts on receiving waters. Combined effects of expected changes in climate and socio-economic factors on stormwater quality were examined in two urban test catchments by applying a source-based computer model (WinSLAMM) for TSS and three heavy metals (copper, lead, and zinc) for various future scenarios. Generally, both catchments showed similar responses to the future scenarios and pollutant loads were generally more sensitive to changes in socio-economic factors (i.e., increasing traffic intensities, growth and intensification of the individual land-uses) than in the climate. Specifically, for the selected Intermediate socio-economic scenario and two climate change scenarios (RCP = 2.6 and 8.5), the TSS loads from both catchments increased by about 10 % on average, but when applying the Intermediate climate change scenario (RCP = 4.5) for two SSPs, the Sustainability and Security scenarios (SSP1 and SSP3), the TSS loads increased on average by 70 %. Furthermore, it was observed that well-designed and maintained stormwater treatment facilities targeting local pollution hotspots exhibited the potential to significantly improve stormwater quality, however, at potentially high costs. In fact, it was possible to reduce pollutant loads from both catchments under the future Sustainability scenario (on average, e.g., TSS were reduced by 20 %), compared to the current conditions. The methodology developed in this study was found useful for planning climate change adaptation strategies in the context of local conditions.

  8. Demystifying Results-Based Performance Measurement.

    ERIC Educational Resources Information Center

    Jorjani, Hamid

    Many evaluators are convinced that Results-based Performance Measurement (RBPM) is an effective tool to improve service delivery and cost effectiveness in both public and private sectors. Successful RBPM requires self-directed and cross-functional work teams and the supporting infrastructure to make it work. There are many misconceptions and…

  9. A Performance-Based Web Budget Tool

    ERIC Educational Resources Information Center

    Abou-Sayf, Frank K.; Lau, Wilson

    2007-01-01

    A web-based, formula-driven tool has been developed for the purpose of performing two distinct academic department budgeting functions: allocation of funding to the department, and budget management by the department. The tool's major features are discussed and its uses demonstrated. The tool's advantages are presented. (Contains 10 figures.)

  10. Performance-Based Evaluation and School Librarians

    ERIC Educational Resources Information Center

    Church, Audrey P.

    2015-01-01

    Evaluation of instructional personnel is standard procedure in our Pre-K-12 public schools, and its purpose is to document educator effectiveness. With Race to the Top and No Child Left Behind waivers, states are required to implement performance-based evaluations that demonstrate student academic progress. This three-year study describes the…

  11. Performance-Based Rewards and Work Stress

    ERIC Educational Resources Information Center

    Ganster, Daniel C.; Kiersch, Christa E.; Marsh, Rachel E.; Bowen, Angela

    2011-01-01

    Even though reward systems play a central role in the management of organizations, their impact on stress and the well-being of workers is not well understood. We review the literature linking performance-based reward systems to various indicators of employee stress and well-being. Well-controlled experiments in field settings suggest that certain…

  12. ICT-Supported, Scenario-Based Learning in Preclinical Veterinary Science Education: Quantifying Learning Outcomes and Facilitating the Novice-Expert Transition

    ERIC Educational Resources Information Center

    Seddon, Jennifer M.; McDonald, Brenda; Schmidt, Adele L.

    2012-01-01

    Problem and/or scenario-based learning is often deployed in preclinical education and training as a means of: (a) developing students' capacity to respond to authentic, real-world problems; (b) facilitating integration of knowledge across subject areas, and; (c) increasing motivation for learning. Six information and communication technology (ICT)…

  13. A Study of the Competency of Third Year Medical Students to Interpret Biochemically Based Clinical Scenarios Using Knowledge and Skills Gained in Year 1 and 2

    ERIC Educational Resources Information Center

    Gowda, Veena Bhaskar S.; Nagaiah, Bhaskar Hebbani; Sengodan, Bharathi

    2016-01-01

    Medical students build clinical knowledge on the grounds of previously obtained basic knowledge. The study aimed to evaluate the competency of third year medical students to interpret biochemically based clinical scenarios using knowledge and skills gained during year 1 and 2 of undergraduate medical training. Study was conducted on year 3 MBBS…

  14. Attractive scenario writing.

    PubMed

    Takahashi, Yuzo; Oku, Sachiko Alexandra

    2009-05-01

    This article describes the key steps of scenario writing to facilitate problem-based learning discussion to aid student learning of basic medical science in combination with clinical medicine. The scenario has to amplify and deepen the students' thinking so that they can correlate findings from the case and knowledge from textbooks. This can be achieved in three ways: (1) a comparison of cases; (2) demonstrating a scientific link between symptoms and basic medicine; and (3) introducing a personal and emotional aspect to the scenario. A comparison of two cases enables us to shed light on the pathological differences and think about the underlying biological mechanisms. These include: (a) a comparison of two cases with similar symptoms, but different diseases; (b) a comparison of two cases with different symptoms, but the same cause; and (c) a comparison of two cases, with an easy case, followed by a complicated case. The scenarios may be disclosed in a sequence to show a scientific link between symptoms of the patient and basic medicine, which may help to cultivate a physician with a scientific mind. Examples are given by the relationship between: (a) symptoms, pathology and morphology; and (b) symptoms, pathology and physiology. When the scenario is written in such a way that students are personally and/or emotionally involved in the case, they will be more motivated in learning as if involved in the case themselves. To facilitate this, the scenario can be written in the first-person perspective. Examples include "I had a very bad headache, and vomited several times...", and "I noticed that my father was screaming at night...". The description of the events may be in chronological order with actual time, which makes students feel as if they are really the primary responding person. PMID:19502145

  15. Performance-based inspection and maintenance strategies

    SciTech Connect

    Vesely, W.E.

    1995-04-01

    Performance-based inspection and maintenance strategies utilize measures of equipment performance to help guide inspection and maintenance activities. A relevant measure of performance for safety system components is component unavailability. The component unavailability can also be input into a plant risk model such as a Probabilistic Risk Assessment (PRA) to determine the associated plant risk performance. Based on the present and projected unavailability performance, or the present and projected risk performance, the effectiveness of current maintenance activities can be evaluated and this information can be used to plan future maintenance activities. A significant amount of information other than downtimes or failure times is collected or can be collected when an inspection or maintenance is conducted, which can be used to estimate the component unavailability. This information generally involves observations on the condition or state of the component or component piecepart. The information can be detailed such as the amount of corrosion buildup or can be general such as the general state of the component described as "high degradation", "moderate degradation", or "low degradation". Much of the information collected in maintenance logs is qualitative and fuzzy. As part of an NRC Research program on performance-based engineering modeling, approaches have been developed to apply Fuzzy Set Theory to information collected on the state of the component to determine the implied component or component piecepart unavailability. Demonstrations of the applications of Fuzzy Set Theory are presented utilizing information from plant maintenance logs. The demonstrations show the power of Fuzzy Set Theory in translating engineering information to reliability and risk implications.
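
    A minimal sketch of the fuzzy-set translation described above is given below: qualitative degradation judgements are mapped to a crisp unavailability estimate via triangular membership functions and centroid defuzzification. The membership shapes and the unavailability scale are illustrative assumptions, not those of the NRC program.

        # Minimal sketch: map a qualitative degradation judgement ("low"/"moderate"/"high")
        # to a crisp unavailability estimate with triangular fuzzy sets and centroid
        # defuzzification. Membership shapes and the unavailability scale are assumptions.
        import numpy as np

        u = np.linspace(0.0, 0.1, 1001)        # candidate unavailability values

        def tri(x, a, b, c):
            """Triangular membership function with support [a, c] and peak at b."""
            return np.clip(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0, 1)

        membership = {
            "low degradation":      tri(u, 0.000, 0.005, 0.02),
            "moderate degradation": tri(u, 0.01,  0.03,  0.06),
            "high degradation":     tri(u, 0.04,  0.08,  0.10),
        }

        def unavailability_estimate(observation_weights):
            """Aggregate weighted fuzzy sets and defuzzify by centroid."""
            agg = np.zeros_like(u)
            for label, w in observation_weights.items():
                agg = np.maximum(agg, w * membership[label])
            return (u * agg).sum() / agg.sum()

        # e.g. a maintenance-log entry judged mostly "moderate", partly "high" degradation
        print(round(unavailability_estimate({"moderate degradation": 0.7,
                                             "high degradation": 0.3}), 4))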

  16. Simulation-Based Assessment to Evaluate Cognitive Performance in an Anesthesiology Residency Program

    PubMed Central

    Sidi, Avner; Baslanti, Tezcan Ozrazgat; Gravenstein, Nikolaus; Lampotang, Samsun

    2014-01-01

    Background Problem solving in a clinical context requires knowledge and experience, and most traditional examinations for learners do not capture skills that are required in some situations where there is uncertainty about the proper course of action. Objective We sought to evaluate anesthesiology residents for deficiencies in cognitive performance within and across 3 clinical domains (operating room, trauma, and cardiac resuscitation) using simulation-based assessment. Methods Individual basic knowledge and cognitive performance in each simulation-based scenario were assessed in 47 residents using a 15- to 29-item scenario-specific checklist. For every scenario and item we calculated group error scenario rate (frequency) and individual (resident) item success. For all analyses, alpha was designated as 0.05. Results Postgraduate year (PGY)-3 and PGY-4 residents' cognitive items error rates were higher and success rates lower compared to basic and technical performance in each domain tested (P < .05). In the trauma and resuscitation scenarios, the cognitive error rate by PGY-4 residents was fairly high (0.29–0.5) and their cognitive success rate was low (0.5–0.68). The most common cognitive errors were anchoring, availability bias, premature closure, and confirmation bias. Conclusions Simulation-based assessment can differentiate between higher-order (cognitive) and lower-order (basic and technical) skills expected of relatively experienced (PGY-3 and PGY-4) anesthesiology residents. Simulation-based assessments can also highlight areas of relative strength and weakness in a resident group, and this information can be used to guide curricular modifications to address deficiencies in tasks requiring higher-order processing and cognition. PMID:24701316

  17. Economic-based projections of future land use in the conterminous United States under alternative policy scenarios.

    PubMed

    Radeloff, V C; Nelson, E; Plantinga, A J; Lewis, D J; Helmers, D; Lawler, J J; Withey, J C; Beaudry, F; Martinuzzi, S; Butsic, V; Lonsdorf, E; White, D; Polasky, S

    2012-04-01

    Land-use change significantly contributes to biodiversity loss, invasive species spread, changes in biogeochemical cycles, and the loss of ecosystem services. Planning for a sustainable future requires a thorough understanding of expected land use at the fine spatial scales relevant for modeling many ecological processes and at dimensions appropriate for regional or national-level policy making. Our goal was to construct and parameterize an econometric model of land-use change to project future land use to the year 2051 at a fine spatial scale across the conterminous United States under several alternative land-use policy scenarios. We parameterized the econometric model of land-use change with the National Resources Inventory (NRI) 1992 and 1997 land-use data for 844,000 sample points. Land-use transitions were estimated for five land-use classes (cropland, pasture, range, forest, and urban). We predicted land-use change under four scenarios: business-as-usual, afforestation, removal of agricultural subsidies, and increased urban rents. Our results for the business-as-usual scenario showed widespread changes in land use, affecting 36% of the land area of the conterminous United States, with large increases in urban land (79%) and forest (7%), and declines in cropland (-16%) and pasture (-13%). Areas with particularly high rates of land-use change included the larger Chicago area, parts of the Pacific Northwest, and the Central Valley of California. However, while land-use change was substantial, differences in results among the four scenarios were relatively minor. The only scenario that was markedly different was the afforestation scenario, which resulted in an increase of forest area that was twice as high as the business-as-usual scenario. Land-use policies can affect trends, but only so much. The basic economic and demographic factors shaping land-use changes in the United States are powerful, and even fairly dramatic policy changes showed only moderate
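
    The plot-level transition logic of such econometric models can be sketched with a multinomial-logit choice among the five land-use classes; the utility coefficients, returns and inertia term below are invented for illustration and are not the parameters estimated from the NRI data.

        # Minimal sketch of a multinomial-logit land-use transition choice, in the spirit of
        # plot-level econometric models; the utility coefficients and returns are invented,
        # not the parameters estimated from the NRI data.
        import numpy as np

        CLASSES = ["crop", "pasture", "range", "forest", "urban"]
        rng = np.random.default_rng(1)

        def transition_probs(net_returns, current_use, inertia=2.0, scale=1.0):
            """Softmax over per-class utilities, with an inertia bonus for staying put."""
            utility = scale * np.array([net_returns[c] for c in CLASSES])
            utility[CLASSES.index(current_use)] += inertia     # conversion is costly
            expu = np.exp(utility - utility.max())
            return expu / expu.sum()

        returns = {"crop": 1.2, "pasture": 0.6, "range": 0.4, "forest": 0.5, "urban": 2.0}
        p = transition_probs(returns, current_use="pasture")
        print(dict(zip(CLASSES, np.round(p, 3))))
        print("sampled next use for one plot:", rng.choice(CLASSES, p=p))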

  18. Calculation of lifetime lung cancer risks associated with radon exposure, based on various models and exposure scenarios.

    PubMed

    Hunter, Nezahat; Muirhead, Colin R; Bochicchio, Francesco; Haylock, Richard G E

    2015-09-01

    The risk of lung cancer mortality up to 75 years of age due to radon exposure has been estimated for both male and female continuing, ex- and never-smokers, based on various radon risk models and exposure scenarios. We used risk models derived from (i) the BEIR VI analysis of cohorts of radon-exposed miners, (ii) cohort and nested case-control analyses of a European cohort of uranium miners and (iii) the joint analysis of European residential radon case-control studies. Estimates of the lifetime lung cancer risk due to radon varied between these models by just over a factor of 2, and risk estimates based on models from analyses of European uranium miners exposed at comparatively low rates and of people exposed to radon in homes were broadly compatible. For a given smoking category, there was not much difference in lifetime lung cancer risk between males and females. The estimated lifetime risk of radon-induced lung cancer for exposure to a concentration of 200 Bq m(-3) was in the range 2.98-6.55% for male continuing smokers and 0.19-0.42% for male never-smokers, depending on the model used and assuming a multiplicative relationship for the joint effect of radon and smoking. Stopping smoking at age 50 years decreases the lifetime risk due to radon by around a half relative to continuing smoking, but the risk for ex-smokers remains about a factor of 5-7 higher than that for never-smokers. Under a sub-multiplicative model for the joint effect of radon and smoking, the lifetime risk of radon-induced lung cancer was still estimated to be substantially higher for continuing smokers than for never-smokers. Radon mitigation, used to reduce radon concentrations in homes, can also have a substantial impact on lung cancer risk, even for persons in their 50s; for each of continuing smokers, ex-smokers and never-smokers, radon mitigation at age 50 would lower the lifetime risk of radon-induced lung cancer by about one-third. To maximise risk reductions, smokers in high

  19. A simplified physically-based model to calculate surface water temperature of lakes from air temperature in climate change scenarios

    NASA Astrophysics Data System (ADS)

    Piccolroaz, S.; Toffolon, M.

    2012-12-01

    Modifications of water temperature are crucial for the ecology of lakes, but long-term analyses are not usually able to provide reliable estimations. This is particularly true for climate change studies based on Global Circulation Models, whose mesh size is normally too coarse for explicitly including even some of the biggest lakes on Earth. On the other hand, modeled predictions of air temperature changes are more reliable, and long-term, high-resolution air temperature observational datasets are more available than water temperature measurements. For these reasons, air temperature series are often used to obtain some information about the surface temperature of water bodies. In order to do that, it is common to exploit regression models, but they are questionable especially when it is necessary to extrapolate current trends beyond maximum (or minimum) measured temperatures. Moreover, water temperature is influenced by a variety of processes of heat exchange across the lake surface and by the thermal inertia of the water mass, which also causes an annual hysteresis cycle between air and water temperatures that is hard to consider in regressions. In this work we propose a simplified, physically-based model for the estimation of the epilimnetic temperature in lakes. Starting from the zero-dimensional heat budget, we derive a simplified first-order differential equation for water temperature, primarily forced by a seasonally varying external term (mainly related to solar radiation) and an exchange term explicitly depending on the difference between air and water temperatures. Assuming annual sinusoidal cycles of the main heat flux components at the atmosphere-lake interface, eight parameters (some of them can be disregarded, though) are identified, which can be calibrated if two temporal series of air and water temperature are available. We note that such a calibration is supported by the physical interpretation of the parameters, which provide good initial
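
    A minimal sketch of the type of first-order model described above is given below, with water temperature relaxing toward a level set by air temperature plus a sinusoidal seasonal forcing term; the parameter values and the synthetic air-temperature series are placeholders that would normally be obtained by calibration on paired air and water series.

        # Minimal sketch of a first-order air-to-water temperature model of the type
        # described in the abstract: dTw/dt = a1 + a2*Ta - a3*Tw + a4*cos(2*pi*(t/ty - a5)).
        # Parameter values are placeholders; in practice they are calibrated on paired series.
        import math

        a1, a2, a3, a4, a5 = 0.05, 0.02, 0.025, 0.08, 0.60   # assumed parameters (per day)
        ty = 365.0

        def air_temp(t):
            """Synthetic seasonal air-temperature forcing (degC)."""
            return 12.0 + 10.0 * math.cos(2 * math.pi * (t / ty - 0.55))

        Tw, dt = 5.0, 1.0                                    # initial water temp, daily step
        series = []
        for day in range(2 * 365):                           # two years: spin-up plus one cycle
            dTw = a1 + a2 * air_temp(day) - a3 * Tw + a4 * math.cos(2 * math.pi * (day / ty - a5))
            Tw += dt * dTw
            series.append(Tw)
        print("simulated late-summer surface temp (degC):", round(series[365 + 212], 1))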

  20. A comprehensive multi-scenario based approach for a reliable flood-hazard assessment: a case-study application

    NASA Astrophysics Data System (ADS)

    Lanni, Cristiano; Mazzorana, Bruno; Volcan, Claudio; Bertagnolli, Rudi

    2015-04-01

    Flood hazard is generally assessed by assuming the return period of the rainfall as a proxy for the return period of the discharge and the related hydrograph. Frequently this deterministic view is extended also to the straightforward application of hydrodynamic models. However, the climate (i.e. precipitation), the catchment (i.e. geology, soil and antecedent soil-moisture condition) and the anthropogenic (i.e. drainage system and its regulation) systems interact in a complex way, and the occurrence probability of a flood inundation event can significantly differ from the occurrence probability of the triggering event (i.e. rainfall). In order to reliably determine the spatial patterns of flood intensities and probabilities, the rigorous determination of flood event scenarios is beneficial because it provides a clear, rational method to recognize and unveil the inherent stochastic behavior of natural processes. Therefore, a multi-scenario approach for hazard assessment should be applied and should consider the possible events taking place in the area potentially subject to flooding (i.e. floodplains). Here, we apply a multi-scenario approach for the assessment of the flood hazard around the Idro lake (Italy). We consider and estimate the probability of occurrence of several scenarios related to the initial (i.e. initial water level in the lake) and boundary (i.e. shape of the hydrograph, downslope drainage, spillway opening operations) conditions characterizing the lake. Finally, we discuss the advantages and issues of the presented methodological procedure compared to traditional (and essentially deterministic) approaches.

  1. So These Numbers Really Mean Something? A Role Playing Scenario-Based Approach to the Undergraduate Instrumental Analysis Laboratory

    ERIC Educational Resources Information Center

    Grannas, Amanda M.; Lagalante, Anthony F.

    2010-01-01

    A new curricular approach in our undergraduate second-year instrumental analysis laboratory was implemented. Students work collaboratively on scenarios in diverse fields including pharmaceuticals, forensics, gemology, art conservation, and environmental chemistry. Each laboratory section (approximately 12 students) is divided into three groups…

  2. A DYNAMIC PHYSIOLOGICALLY-BASED TOXICOKINETIC (DPBTK) MODEL FOR SIMULATION OF COMPLEX TOLUENE EXPOSURE SCENARIOS IN HUMANS

    EPA Science Inventory

    A General Physiological and Toxicokinetic (GPAT) Model for Simulation of Complex Toluene Exposure Scenarios in Humans. E M Kenyon (1), T Colemen (2), C R Eklund (1) and V A Benignus (3). (1) U.S. EPA, ORD, NHEERL, ETD, PKB, RTP, NC, USA; (2) Biological Simulators, Inc., Jackson MS, USA; (3) U.S. EP...

  3. Performance- and risk-based regulation

    SciTech Connect

    Sauter, G.D.

    1994-12-31

    Risk-based regulation (RBR) and performance-based regulation (PBR) are two relatively new concepts for the regulation of nuclear reactor power plants by the U.S. Nuclear Regulatory Commission (NRC). Although RBR and PBR are often considered to be somewhat equivalent, they, in fact, address two fundamentally different regulatory questions. To fruitfully discuss these two concepts, it is important to recognize what each entails. This paper identifies those two fundamental questions and discusses how they are addressed by RBR and PBR.

  4. Performance-based asphalt mixture design methodology

    NASA Astrophysics Data System (ADS)

    Ali, Al-Hosain Mansour

    performance based design procedure. Finally, the developed guidelines with easy-to-use flow charts for the integrated mix design methodology are presented.

  5. A High Performance COTS Based Computer Architecture

    NASA Astrophysics Data System (ADS)

    Patte, Mathieu; Grimoldi, Raoul; Trautner, Roland

    2014-08-01

    Using Commercial Off The Shelf (COTS) electronic components for space applications is a long-standing idea. Indeed, the difference in processing performance and energy efficiency between radiation-hardened components and COTS components is so large that COTS components are very attractive for use in mass- and power-constrained systems. However, using COTS components in space is not straightforward, as one must account for the effects of the space environment on the behavior of the COTS components. In the framework of the ESA-funded activity called High Performance COTS Based Computer, Airbus Defense and Space and its subcontractor OHB CGS have developed and prototyped a versatile COTS-based architecture for high-performance processing. The rest of the paper is organized as follows: in the first section we recapitulate the benefits and constraints of using COTS components for space applications; then we briefly describe existing fault mitigation architectures and present our solution for fault mitigation, based on a component called the SmartIO; in the last part of the paper we describe the prototyping activities executed during the HiP CBC project.

  6. Investigating the impact of land cover change on peak river flow in UK upland peat catchments, based on modelled scenarios

    NASA Astrophysics Data System (ADS)

    Gao, Jihui; Holden, Joseph; Kirkby, Mike

    2014-05-01

    Changes to land cover can influence the velocity of overland flow. In headwater peatlands, saturation means that overland flow is a dominant source of runoff, particularly during heavy rainfall events. Human modifications in headwater peatlands may include removal of vegetation (e.g. by erosion processes, fire, pollution, overgrazing) or pro-active revegetation of peat with sedges such as Eriophorum or mosses such as Sphagnum. How these modifications affect the river flow, and in particular the flood peak, in headwater peatlands is a key problem for land management. In particular, the impact of the spatial distribution of land cover change (e.g. different locations and sizes of land cover change area) on river flow is not clear. In this presentation a new fully distributed version of TOPMODEL, which represents the effects of distributed land cover change on river discharge, was employed to investigate land cover change impacts in three UK upland peat catchments (Trout Beck in the North Pennines, the Wye in mid-Wales and the East Dart in southwest England). Land cover scenarios with three typical land covers (i.e. Eriophorum, Sphagnum and bare peat) having different surface roughness in upland peatlands were designed for these catchments to investigate land cover impacts on river flow through simulation runs of the distributed model. As a result of hypothesis testing three land cover principles emerged from the work as follows: Principle (1): Well vegetated buffer strips are important for reducing flow peaks. A wider bare peat strip nearer to the river channel gives a higher flow peak and reduces the delay to peak; conversely, a wider buffer strip with higher density vegetation (e.g. Sphagnum) leads to a lower peak and postpones the peak. In both cases, a narrower buffer strip surrounding upstream and downstream channels has a greater effect than a thicker buffer strip just based around the downstream river network. Principle (2): When the area of change is equal
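
    The mechanism behind these principles, namely that the surface roughness of the cover controls overland-flow velocity, can be illustrated with Manning's equation; the roughness coefficients, flow depth and slope below are assumed values, not those used in the distributed TOPMODEL runs.

        # Minimal sketch: Manning's equation for shallow overland-flow velocity under different
        # surface covers; the roughness coefficients are illustrative assumptions, not the
        # values used in the distributed TOPMODEL simulations.
        def manning_velocity(n, depth_m, slope):
            """v = (1/n) * R^(2/3) * S^(1/2); for shallow sheet flow R is taken as flow depth."""
            return (1.0 / n) * depth_m ** (2.0 / 3.0) * slope ** 0.5

        covers = {"bare peat": 0.03, "Eriophorum": 0.15, "Sphagnum": 0.40}   # assumed Manning n
        depth, slope = 0.005, 0.05                                           # 5 mm flow, 5% slope

        for cover, n in covers.items():
            v = manning_velocity(n, depth, slope)
            print(f"{cover:11s} n = {n:.2f}  v = {v * 100:5.2f} cm/s")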

  7. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
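
    The distinction drawn above between Markov and semi-Markov behaviour can be sketched by simulating an embedded transition chain whose holding times are Weibull rather than exponential; the states, transition probabilities and distribution parameters below are illustrative, not the measured IBM 3081 values.

        # Minimal sketch of a semi-Markov simulation: an embedded Markov chain chooses the
        # next state, but holding times are Weibull rather than exponential (illustrative
        # states and parameters, not the IBM 3081 measurements).
        import numpy as np

        rng = np.random.default_rng(7)
        STATES = ["normal", "error", "recovery"]
        P = np.array([[0.0, 1.0, 0.0],        # embedded transition probabilities
                      [0.2, 0.0, 0.8],
                      [1.0, 0.0, 0.0]])
        weibull_shape = {"normal": 0.7, "error": 1.5, "recovery": 0.9}   # non-exponential holding times
        weibull_scale = {"normal": 100.0, "error": 0.5, "recovery": 5.0}

        def simulate(t_end=10_000.0):
            t, s, time_in = 0.0, 0, dict.fromkeys(STATES, 0.0)
            while t < t_end:
                name = STATES[s]
                hold = weibull_scale[name] * rng.weibull(weibull_shape[name])
                time_in[name] += hold
                t += hold
                s = rng.choice(len(STATES), p=P[s])          # next state from the embedded chain
            return {k: v / t for k, v in time_in.items()}

        print({k: round(v, 4) for k, v in simulate().items()})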

  8. Alternative zoning scenarios for regional sustainable land use controls in China: a knowledge-based multiobjective optimisation model.

    PubMed

    Xia, Yin; Liu, Dianfeng; Liu, Yaolin; He, Jianhua; Hong, Xiaofeng

    2014-09-01

    Alternative land use zoning scenarios provide guidance for sustainable land use controls. This study focused on an ecologically vulnerable catchment on the Loess Plateau in China, proposed a novel land use zoning model, and generated alternative zoning solutions to satisfy the various requirements of land use stakeholders and managers. This model combined multiple zoning objectives, i.e., maximum zoning suitability, maximum planning compatibility and maximum spatial compactness, with land use constraints by using goal programming technique, and employed a modified simulated annealing algorithm to search for the optimal zoning solutions. The land use zoning knowledge was incorporated into the initialisation operator and neighbourhood selection strategy of the simulated annealing algorithm to improve its efficiency. The case study indicates that the model is both effective and robust. Five optimal zoning scenarios of the study area were helpful for satisfying the requirements of land use controls in loess hilly regions, e.g., land use intensification, agricultural protection and environmental conservation. PMID:25170679
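
    A minimal sketch of the simulated-annealing search over a zoning grid is given below, combining a suitability term and a compactness term in one weighted objective; the suitability values, weights, neighbourhood move and cooling schedule are illustrative assumptions, not the knowledge-based operators of the paper.

        # Minimal sketch of simulated-annealing zoning on a small grid, combining a
        # suitability objective with a spatial-compactness objective; suitability values,
        # weights and the cooling schedule are illustrative, not the paper's model.
        import numpy as np

        rng = np.random.default_rng(3)
        n_zones, n = 3, 12
        suitability = rng.random((n_zones, n, n))            # assumed per-zone suitability

        def compactness(z):
            """Fraction of horizontally/vertically adjacent cell pairs sharing a zone."""
            same = (z[:, :-1] == z[:, 1:]).sum() + (z[:-1, :] == z[1:, :]).sum()
            return same / (2 * n * (n - 1))

        def objective(z, w_suit=1.0, w_comp=0.8):
            suit = suitability[z, np.arange(n)[:, None], np.arange(n)[None, :]].mean()
            return w_suit * suit + w_comp * compactness(z)

        z = rng.integers(0, n_zones, (n, n))
        current, T = objective(z), 1.0
        for it in range(20_000):
            r, c = rng.integers(0, n, size=2)
            old = z[r, c]
            z[r, c] = rng.integers(0, n_zones)               # neighbourhood move: re-zone one cell
            new = objective(z)
            if new >= current or rng.random() < np.exp((new - current) / T):
                current = new                                # accept (possibly worse) solution
            else:
                z[r, c] = old                                # reject and restore
            T *= 0.9997                                      # geometric cooling
        print("final combined objective:", round(current, 3))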

  9. Alternative Zoning Scenarios for Regional Sustainable Land Use Controls in China: A Knowledge-Based Multiobjective Optimisation Model

    PubMed Central

    Xia, Yin; Liu, Dianfeng; Liu, Yaolin; He, Jianhua; Hong, Xiaofeng

    2014-01-01

    Alternative land use zoning scenarios provide guidance for sustainable land use controls. This study focused on an ecologically vulnerable catchment on the Loess Plateau in China, proposed a novel land use zoning model, and generated alternative zoning solutions to satisfy the various requirements of land use stakeholders and managers. This model combined multiple zoning objectives, i.e., maximum zoning suitability, maximum planning compatibility and maximum spatial compactness, with land use constraints by using goal programming technique, and employed a modified simulated annealing algorithm to search for the optimal zoning solutions. The land use zoning knowledge was incorporated into the initialisation operator and neighbourhood selection strategy of the simulated annealing algorithm to improve its efficiency. The case study indicates that the model is both effective and robust. Five optimal zoning scenarios of the study area were helpful for satisfying the requirements of land use controls in loess hilly regions, e.g., land use intensification, agricultural protection and environmental conservation. PMID:25170679

  10. Risk-based damage potential and loss estimation of extreme flooding scenarios in the Austrian Federal Province of Tyrol

    NASA Astrophysics Data System (ADS)

    Huttenlau, M.; Stötter, J.; Stiefelmeyer, H.

    2010-12-01

    Within the last decades, serious flooding events have occurred in many parts of Europe, and especially in 2005 the Austrian Federal Province of Tyrol was seriously affected. These events in general, and particularly the 2005 event, have sensitised decision makers and the public. Besides discussions pertaining to protection goals and lessons learnt, the issue of the potential consequences of extreme and severe flooding events has been raised. In addition to the general interest of the public, decision makers in the insurance industry, public authorities, and responsible politicians are especially confronted with the question of the possible consequences of extreme events. Answers to this question are necessary for the implementation of appropriate preventive risk management strategies. Property and liability losses represent a large proportion of the direct tangible losses. These are of great interest to the insurance sector and can be understood as the main indicators for interpreting the severity of potential events. The natural scientific-technical risk analysis concept provides a predefined and structured framework to analyse the quantities of affected elements at risk, their corresponding damage potentials, and the potential losses. Generally, this risk concept framework follows the process steps of hazard analysis, exposure analysis, and consequence analysis. In addition to the conventional hazard analysis, the potential number of endangered elements and their corresponding damage potentials were analysed and, thereupon, concrete losses were estimated. These estimates took the specific vulnerability of the various individual elements at risk into consideration. The present flood risk analysis first estimates the general exposures of the risk indicators in the study area and then analyses the specific exposures and consequences of five extreme event scenarios. In order to precisely identify, localize, and characterize the relevant risk indicators of buildings, dwellings and inventory

  11. POSIX and Object Distributed Storage Systems Performance Comparison Studies With Real-Life Scenarios in an Experimental Data Taking Context Leveraging OpenStack Swift & Ceph

    NASA Astrophysics Data System (ADS)

    Poat, M. D.; Lauret, J.; Betts, W.

    2015-12-01

    The STAR online computing infrastructure has become an intensive, dynamic system used for first-hand data collection and analysis, resulting in a dense collection of data output. As we have transitioned to our current state, inefficient, limited storage systems have become an impediment to fast feedback to online shift crews. A centrally accessible, scalable and redundant distributed storage system has therefore become a necessity in this environment. OpenStack Swift Object Storage and Ceph Object Storage are two eye-opening technologies, as community use and development have led to success elsewhere. In this contribution, OpenStack Swift and Ceph have been put to the test with single and parallel I/O tests, emulating real-world scenarios for data processing and workflows. The Ceph file system storage, offering a POSIX-compliant file system mounted similarly to an NFS share, was of particular interest as it aligned with our requirements and was retained as our solution. I/O performance tests were run against the Ceph POSIX file system and presented surprising results indicating true potential for fast I/O and reliability. STAR's online compute farm has historically been used for job submission and first-hand data analysis. Reusing the online compute farm to maintain a storage cluster alongside job submission will be an efficient use of the current infrastructure.
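
    A single-stream write/read throughput probe of the kind used to compare mounted storage back ends might look like the sketch below; the mount point, block size and total size are placeholders, and the read pass may be served from the page cache unless it is dropped first.

        # Minimal sketch of a single-stream POSIX write/read throughput test of the kind
        # used to compare mounted storage back ends; the mount path and sizes are placeholders.
        import os, time

        MOUNT = "/mnt/cephfs_test"                      # hypothetical CephFS (or NFS) mount point
        FILE = os.path.join(MOUNT, "io_probe.bin")
        BLOCK, N_BLOCKS = 4 * 1024 * 1024, 256          # 4 MiB blocks, 1 GiB total

        def write_test():
            buf = os.urandom(BLOCK)
            t0 = time.perf_counter()
            with open(FILE, "wb") as f:
                for _ in range(N_BLOCKS):
                    f.write(buf)
                f.flush()
                os.fsync(f.fileno())                    # make sure data actually reached the backend
            return (BLOCK * N_BLOCKS / 2**20) / (time.perf_counter() - t0)

        def read_test():
            # Note: this may measure the page cache unless caches are dropped beforehand.
            t0 = time.perf_counter()
            with open(FILE, "rb") as f:
                while f.read(BLOCK):
                    pass
            return (BLOCK * N_BLOCKS / 2**20) / (time.perf_counter() - t0)

        if __name__ == "__main__":
            print(f"write: {write_test():7.1f} MiB/s")
            print(f"read:  {read_test():7.1f} MiB/s")
            os.remove(FILE)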

  12. Performance-based assessment of reconstructed images

    SciTech Connect

    Hanson, Kenneth

    2009-01-01

    During the early 90s, I engaged in a productive and enjoyable collaboration with Robert Wagner and his colleague, Kyle Myers. We explored the ramifications of the principle that the quality of an image should be assessed on the basis of how well it facilitates the performance of appropriate visual tasks. We applied this principle to algorithms used to reconstruct scenes from incomplete and/or noisy projection data. For binary visual tasks, we used both the conventional disk-detection task and a new, challenging task, inspired by the Rayleigh resolution criterion, of deciding whether an object was a blurred version of two dots or a bar. The results of human and machine observer tests were summarized with the detectability index based on the area under the ROC curve. We investigated a variety of reconstruction algorithms, including ART, with and without a nonnegativity constraint, and the MEMSYS3 algorithm. We concluded that performance of the Rayleigh task was optimized when the strength of the prior was near MEMSYS's default 'classic' value for both human and machine observers. A notable result was that the most-often-used metric of rms error in the reconstruction was not necessarily indicative of the value of a reconstructed image for the purpose of performing visual tasks.
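
    The ROC-based summary mentioned above can be sketched as follows: the area under the ROC curve is estimated nonparametrically from observer ratings and converted to a detectability index d_a = sqrt(2) * Phi^-1(AUC); the ratings below are synthetic, not data from the reconstruction experiments.

        # Minimal sketch: summarize two-alternative observer ratings by the nonparametric
        # AUC (Mann-Whitney statistic) and the detectability index d_a = sqrt(2)*Phi^-1(AUC).
        # The ratings below are synthetic, not data from the reconstruction experiments.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(5)
        ratings_signal = rng.normal(1.2, 1.0, 200)     # e.g. "two dots" trials
        ratings_noise = rng.normal(0.0, 1.0, 200)      # e.g. "bar" trials

        def auc(pos, neg):
            """P(rating_pos > rating_neg), with ties counted as 1/2."""
            diff = pos[:, None] - neg[None, :]
            return (diff > 0).mean() + 0.5 * (diff == 0).mean()

        a = auc(ratings_signal, ratings_noise)
        d_a = np.sqrt(2) * norm.ppf(a)
        print(f"AUC = {a:.3f}, detectability index d_a = {d_a:.2f}")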

  13. Using Comprehensive Science-based Disaster Scenarios to Support Seismic Safety Policy: A Case Study in Los Angeles, California

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2014-12-01

    In 2014, the USGS entered a technical assistance agreement with the City of Los Angeles to apply the results of the 2008 ShakeOut Scenario of a M7.8 earthquake on the southern San Andreas fault to develop a comprehensive plan to increase the seismic resilience of the City. The results of this project are to be submitted to the Mayor of Los Angeles at the Great ShakeOut on October 16, 2014. The ShakeOut scenario detailed how the expected cascade of failures in a big earthquake could lead to significant delays in disaster recovery that could create financial losses that greatly exceed the direct losses in the event. The goal of the seismic resilience plan is to: (1) protect the lives of residents during earthquakes; (2) improve the capacity of the City to respond to the earthquake; and (3) prepare the City to recover quickly after the earthquake so as to protect the economy of the City and all of southern California. To accomplish these goals, the project addresses three areas of seismic vulnerability that were identified in the original ShakeOut Scenario: (1) pre-1980 buildings that present an unacceptable risk to the lives of residents, including "non-ductile reinforced concrete" and "soft-first-story" buildings; (2) water system infrastructure (including impact on firefighting capability); and (3) communications infrastructure. The critical science needed to support policy decisions is to understand the probable consequences to the regional long-term economy caused by decisions to undertake (or not) different levels of mitigation. The arguments against mitigation are the immediate financial costs, so a better understanding of the eventual benefit is required. However, the direct savings rarely justify the mitigation costs, so the arguments in favor of mitigation are driven by the potential for cascading failures and the potential to trigger the type of long term reduction in population and economic activity that has occurred in New Orleans since Hurricane Katrina.

  14. TAP 2, Performance-Based Training Manual

    SciTech Connect

    Not Available

    1991-07-01

    Training programs at DOE nuclear facilities should provide well-trained, qualified personnel to safely and efficiently operate the facilities in accordance with DOE requirements. A need has been identified for guidance regarding analysis, design, development, implementation, and evaluation of consistent and reliable performance-based training programs. Accreditation of training programs at Category A reactors and high-hazard and selected moderate-hazard nonreactor facilities will assure consistent, appropriate, and cost-effective training of personnel responsible for the operation, maintenance, and technical support of these facilities. Training programs that are systematically designed and based on job requirements, instead of on subjective estimates of trainee needs, yield training activities that are consistent and that develop or improve knowledge, skills, and abilities directly related to the work setting. Because the training is job-related, the content of these programs more efficiently and effectively meets the needs of the employee. In addition to a better-trained workforce, a greater level of operational reactor safety can be realized. This manual is intended to provide an overview of the accreditation process and a brief description of the elements necessary to construct and maintain training programs that are based on the requirements of the job. Two comparison manuals provide additional information to assist contractors in their efforts to accredit training programs.

  15. Estimating future ecoregion distributions within the Okavango Delta Wetlands based on hydrological simulations and future climate and development scenarios

    NASA Astrophysics Data System (ADS)

    Milzow, C.; Burg, V.; Kinzelbach, W.

    2010-02-01

    The terminal wetlands of the Okavango Delta in northern Botswana are driven by the balance between inflows and evapotranspiration. The present situation is threatened by climate change and possible agricultural and industrial development in Botswana and the upstream countries Angola and Namibia. A new balance will affect the spatial extent and character of the Okavango Delta Wetlands. We apply a distributed hydrological model to study the impact of those threats on the hydrology and ecology of the wetlands. The relation between the present distribution of hydrological conditions and the occurrence of vegetation classes is investigated, and a good correlation is found between depth to groundwater and vegetation class. By assuming that the distribution of vegetation will in the long term adapt to hydrological conditions, the simulated hydrological conditions under climate change and water management scenarios are translated into vegetation maps for these scenarios. Drier conditions are expected for the future, and aquatic vegetation zones will be reduced in size. This change will, however, occur non-homogeneously over the Delta.
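
    The translation step described above (mapping simulated depth to groundwater onto vegetation classes) can be illustrated with a minimal sketch; the class names and depth thresholds below are hypothetical placeholders, not values derived from the study.

      # Minimal sketch: translate simulated depth to groundwater into coarse vegetation classes.
      # Thresholds and class names are illustrative placeholders, not values from the study.
      import numpy as np

      def classify_vegetation(depth_to_gw_m):
          """Map depth to groundwater (metres) onto a coarse vegetation class."""
          bins = [0.0, 0.5, 2.0, 5.0]        # hypothetical class boundaries (m)
          labels = ["permanent swamp", "seasonal floodplain", "riparian woodland", "dry savanna"]
          idx = np.searchsorted(bins, depth_to_gw_m, side="right") - 1
          idx = np.clip(idx, 0, len(labels) - 1)
          return np.take(labels, idx)

      simulated_depths = np.array([0.2, 1.1, 3.5, 8.0])   # e.g. output of one scenario run
      print(classify_vegetation(simulated_depths))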

  16. Scenario-based modelling of mass transfer mechanisms at a petroleum contaminated field site-numerical implications.

    PubMed

    Vasudevan, M; Nambi, Indumathi M; Suresh Kumar, G

    2016-06-15

    Knowledge about the distribution of dissolved plumes and their influencing factors is essential for risk assessment and remediation of light non-aqueous phase liquid contamination in groundwater. The present study deals with the applicability of a numerical model for simulating various hydro-geological scenarios considering non-uniform source distribution at a petroleum contaminated site in Chennai, India. The complexity associated with the hydrogeology of the site has limited the scope for on-site quantification of petroleum pipeline spillage. The change in fuel composition under mass-transfer-limited conditions was predicted by simultaneously comparing deviations in aqueous concentrations and activity coefficients (between Raoult's law and analytical approaches). The effects of source migration and weathering on the dissolution of major soluble fractions of petroleum fuel were also studied in relation to the apparent change in their activity coefficients and molar fractions. The model results were compared with field observations, and it was found that field conditions were favourable for biodegradation, especially for the aromatic fractions: benzene and toluene (nearly 95% removal), polycyclic aromatic hydrocarbons (up to 65% removal) and xylene (nearly 45% removal). The results help to differentiate the effect of compositional non-ideality from rate-limited dissolution towards tailing of less soluble compounds (alkanes and trimethylbenzene). Although the effect of non-ideality decreased with distance from the source, the assumption of spatially varying residual saturation could effectively illustrate the post-spill scenario by estimating the consequent decrease in mass transfer rate. PMID:27017268
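
    The Raoult's-law dissolution model referred to above estimates the effective aqueous concentration of each fuel component from its mole fraction in the residual fuel, its activity coefficient and its single-compound solubility; in the usual form (the paper's exact formulation may differ in detail),

      C_i^{eff} = x_i \, \gamma_i \, S_i

    where x_i is the mole fraction of compound i in the NAPL, \gamma_i its activity coefficient (\gamma_i = 1 for an ideal mixture, which is where compositional non-ideality enters), and S_i its pure-compound aqueous solubility.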

  17. Decision Making Under Uncertainty and Complexity: A Model-Based Scenario Approach to Supporting Integrated Water Resources Management

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Gupta, H.; Wagener, T.; Stewart, S.; Mahmoud, M.; Hartmann, H.; Springer, E.

    2007-12-01

    Some of the most challenging issues facing contemporary water resources management are those typified by complex coupled human-environmental systems with poorly characterized uncertainties. In other words, major decisions regarding water resources have to be made in the face of substantial uncertainty and complexity. It has been suggested that integrated models can be used to coherently assemble information from a broad set of domains, and can therefore serve as an effective means for tackling the complexity of environmental systems. Further, well-conceived scenarios can effectively inform decision making, particularly when high complexity and poorly characterized uncertainties make the problem intractable via traditional uncertainty analysis methods. This presentation discusses the integrated modeling framework adopted by SAHRA, an NSF Science & Technology Center, to investigate stakeholder-driven water sustainability issues within the semi-arid southwestern US. The multi-disciplinary, multi-resolution modeling framework incorporates a formal scenario approach to analyze the impacts of plausible (albeit uncertain) alternative futures to support adaptive management of water resources systems. Some of the major challenges involved in, and lessons learned from, this effort will be discussed.

  18. High Performance Oxides-Based Thermoelectric Materials

    NASA Astrophysics Data System (ADS)

    Ren, Guangkun; Lan, Jinle; Zeng, Chengcheng; Liu, Yaochun; Zhan, Bin; Butt, Sajid; Lin, Yuan-Hua; Nan, Ce-Wen

    2015-01-01

    Thermoelectric materials have attracted much attention due to their applications in waste-heat recovery, power generation, and solid-state cooling. In comparison with thermoelectric alloys, oxide semiconductors, which are thermally and chemically stable in air at high temperature, are regarded as candidates for high-temperature thermoelectric applications. However, their figure-of-merit ZT value has remained low, around 0.1-0.4, for more than 20 years. The poor performance of oxides is ascribed to their low electrical conductivity and high thermal conductivity. Since the electrical transport properties in these thermoelectric oxides are strongly correlated, it is difficult to improve both the thermoelectric power and the electrical conductivity simultaneously by conventional methods. This review summarizes recent progress on high-performance oxide-based thermoelectric bulk materials, including n-type ZnO, SrTiO3, and In2O3, and p-type Ca3Co4O9, BiCuSeO, and NiO, enhanced by heavy-element doping, band engineering and nanostructuring.
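
    For reference, the dimensionless figure of merit mentioned above combines exactly the transport properties the review targets:

      ZT = \frac{S^2 \sigma T}{\kappa}

    where S is the Seebeck coefficient (thermoelectric power), \sigma the electrical conductivity, \kappa the total thermal conductivity and T the absolute temperature; raising S or \sigma while suppressing \kappa is what heavy-element doping, band engineering and nanostructuring aim to achieve.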

  19. Using after-action review based on automated performance assessment to enhance training effectiveness.

    SciTech Connect

    Stevens-Adams, Susan Marie; Gieseler, Charles J.; Basilico, Justin Derrick; Abbott, Robert G.; Forsythe, James Chris

    2010-09-01

    Training simulators have become increasingly popular tools for instructing humans on performance in complex environments. However, how to provide individualized and scenario-specific assessment and feedback to students remains largely an open question. In this work, we follow up on previous evaluations of the Automated Expert Modeling and Automated Student Evaluation (AEMASE) system, which automatically assesses student performance based on observed examples of good and bad performance in a given domain. The current study provides a rigorous empirical evaluation of the enhanced training effectiveness achievable with this technology. In particular, we found that students given feedback via the AEMASE-based debrief tool performed significantly better than students given only instructor feedback on two out of three domain-specific performance metrics.

  20. Seismic performance assessment of base-isolated safety-related nuclear structures

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2010-01-01

    Seismic or base isolation is a proven technology for reducing the effects of earthquake shaking on buildings, bridges and infrastructure. The benefit of base isolation has been presented in terms of reduced accelerations and drifts on superstructure components but never quantified in terms of either a percentage reduction in seismic loss (or percentage increase in safety) or the probability of an unacceptable performance. Herein, we quantify the benefits of base isolation in terms of increased safety (or smaller loss) by comparing the safety of a sample conventional and base-isolated nuclear power plant (NPP) located in the Eastern U.S. Scenario- and time-based assessments are performed using a new methodology. Three base isolation systems are considered, namely, (1) Friction Pendulum™ bearings, (2) lead-rubber bearings and (3) low-damping rubber bearings together with linear viscous dampers. Unacceptable performance is defined by the failure of key secondary systems because these systems represent much of the investment in a new-build power plant and ensure the safe operation of the plant. For the scenario-based assessments, the probability of unacceptable performance is computed for an earthquake with a magnitude of 5.3 at a distance of 7.5 km from the plant. For the time-based assessments, the annual frequency of unacceptable performance is computed considering all potential earthquakes that may occur. For both assessments, the implementation of base isolation reduces the probability of unacceptable performance by approximately four orders of magnitude for the same NPP superstructure and secondary systems. The increase in NPP construction cost associated with the installation of seismic isolators can be offset by substantially reducing the required seismic strength of secondary components and systems and potentially eliminating the need to seismically qualify many secondary components and systems. © 2010 John Wiley & Sons, Ltd.
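
    In the usual performance-based framing (the paper's exact formulation may differ), the scenario-based result is the conditional probability of unacceptable performance for one event, while the time-based result integrates that conditional probability over the site hazard curve:

      \lambda_{UP} = \int_0^{\infty} P\left(UP \mid IM = x\right) \left| \frac{d\lambda_{IM}(x)}{dx} \right| dx

    where \lambda_{IM}(x) is the mean annual frequency of exceeding ground-motion intensity x and \lambda_{UP} is the annual frequency of unacceptable performance.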

  1. Integrating experimental and numerical methods for a scenario-based quantitative assessment of subsurface energy storage options

    NASA Astrophysics Data System (ADS)

    Kabuth, Alina; Dahmke, Andreas; Hagrey, Said Attia al; Berta, Márton; Dörr, Cordula; Koproch, Nicolas; Köber, Ralf; Köhn, Daniel; Nolde, Michael; Tilmann Pfeiffer, Wolf; Popp, Steffi; Schwanebeck, Malte; Bauer, Sebastian

    2016-04-01

    Within the framework of the transition to renewable energy sources ("Energiewende"), the German government defined the target of producing 60 % of the final energy consumption from renewable energy sources by the year 2050. However, renewable energies are subject to natural fluctuations. Energy storage can help to buffer the resulting time shifts between production and demand. Subsurface geological structures provide large potential capacities for energy stored in the form of heat or gas on daily to seasonal time scales. In order to explore this potential sustainably, the possible induced effects of energy storage operations have to be quantified for both specified normal operation and events of failure. The ANGUS+ project therefore integrates experimental laboratory studies with numerical approaches to assess subsurface energy storage scenarios and monitoring methods. Subsurface storage options for gas, i.e. hydrogen, synthetic methane and compressed air in salt caverns or porous structures, as well as subsurface heat storage are investigated with respect to site prerequisites, storage dimensions, induced effects, monitoring methods and integration into spatial planning schemes. The conceptual interdisciplinary approach of the ANGUS+ project towards the integration of subsurface energy storage into a sustainable subsurface planning scheme is presented here, and this approach is then demonstrated using the examples of two selected energy storage options: Firstly, the option of seasonal heat storage in a shallow aquifer is presented. Coupled thermal and hydraulic processes induced by periodic heat injection and extraction were simulated in the open-source numerical modelling package OpenGeoSys. Situations of specified normal operation as well as cases of failure in operational storage with leaking heat transfer fluid are considered. Bench-scale experiments provided parameterisations of temperature dependent changes in shallow groundwater hydrogeochemistry. As a

  2. Medical students’ satisfaction with the Applied Basic Clinical Seminar with Scenarios for Students, a novel simulation-based learning method in Greece

    PubMed Central

    2016-01-01

    Purpose: The integration of simulation-based learning (SBL) methods holds promise for improving the medical education system in Greece. The Applied Basic Clinical Seminar with Scenarios for Students (ABCS3) is a novel two-day SBL course that was designed by the Scientific Society of Hellenic Medical Students. The ABCS3 targeted undergraduate medical students and consisted of three core components: the case-based lectures, the ABCDE hands-on station, and the simulation-based clinical scenarios. The purpose of this study was to evaluate the general educational environment of the course, as well as the skills and knowledge acquired by the participants. Methods: Two sets of questions were distributed to the participants: the Dundee Ready Educational Environment Measure (DREEM) questionnaire and an internally designed feedback questionnaire (InEv). A multiple-choice question (MCQ) examination was also distributed prior to the course and following its completion. A total of 176 participants answered the DREEM questionnaire, 56 the InEv, and 60 the MCQs. Results: The overall DREEM score was 144.61 (±28.05) out of 200. Delegates who participated in both the case-based lectures and the interactive scenarios core components scored higher than those who only completed the case-based lecture session (P=0.038). The mean overall feedback score was 4.12 (±0.56) out of 5. Students scored significantly higher on the post-test than on the pre-test (P<0.001). Conclusion: The ABCS3 was found to be an effective SBL program, as medical students reported positive opinions about their experiences and exhibited improvements in their clinical knowledge and skills. PMID:27012313

  3. Comparison of leaching predictions based on PRZM3.12, LEACHP, and RZWQM98 using standard scenario modeling.

    PubMed

    Jackson, Scott H; Estes, Tammara L

    2007-06-27

    Regulatory agencies in the NAFTA region use ground water leaching models to help determine risks to ground water resources. The results of three models for leaching predictions are compared using a standard soil and weather scenario currently used by the New York Department of Environmental Conservation (NYDEC) to simulate the Riverhead soil found on Long Island, New York. The three models, PRZM3.12, LEACHP, and RZWQM98, were configured to simulate the behavior of two example molecules in corn, turfgrass, and bare soil. For the bare soil simulations, LEACHP and RZWQM98 predicted similar peak concentrations and timing of peak concentration. Depending on the dissipation rate of the molecule, PRZM3.12 predicted similar or reduced peak concentrations, owing to the delayed timing of the peak. For the corn and turfgrass simulations, peak concentrations and the timing to reach peak concentrations varied between the models due to differences in how each simulates plant growth and evapotranspiration. PMID:17547417

  4. Nonuniversal scalar mass scenario with Higgs funnel region of supersymmetric dark matter: A signal-based analysis for the Large Hadron Collider

    SciTech Connect

    Bhattacharya, Subhaditya; Mukhopadhyaya, Biswarup; Chattopadhyay, Utpal; Das, Debottam; Choudhury, Debajyoti

    2010-04-01

    We perform a multilepton channel analysis in the context of the Large Hadron Collider (LHC) for Wilkinson Microwave Anisotropy Probe compatible points in a model with nonuniversal scalar masses, which admits a Higgs funnel region of supersymmetric dark matter even for a small tan{beta}. In addition to two- and three-lepton final states, four-lepton events, too, are shown to be useful for this purpose. We also compare the collider signatures in similar channels for Wilkinson Microwave Anisotropy Probe compatible points in the minimal supergravity (mSUGRA) framework with similar gluino masses. Some definite features of such a nonuniversal scenario emerge from the analysis.

  5. Scenario-based tsunami risk assessment using a static flooding approach and high-resolution digital elevation data: An example from Muscat in Oman

    NASA Astrophysics Data System (ADS)

    Schneider, Bastian; Hoffmann, Gösta; Reicherter, Klaus

    2016-04-01

    Knowledge of tsunami risk and vulnerability is essential to establish a well-adapted Multi Hazard Early Warning System, land-use planning and emergency management. As the tsunami risk for the coastline of Oman is still under discussion and remains enigmatic, various scenarios based on historical tsunamis were created. The suggested inundation and run-up heights were projected onto the modern infrastructural setting of the Muscat Capital Area. Furthermore, possible impacts of the worst-case tsunami event for Muscat are discussed. The approved Papathoma Tsunami Vulnerability Assessment Model was used to model the structural vulnerability of the infrastructure for a 2 m tsunami scenario, depicting the 1945 tsunami, and for a 5 m tsunami in Muscat. Considering structural vulnerability, the results suggest a minor tsunami risk for the 2 m tsunami scenario, as the flooding is mainly confined to beaches and wadis. In particular, traditional brick buildings, still predominant in numerous rural suburbs, and a prevalently coast-parallel road network lead to an increased tsunami risk. In contrast, the 5 m tsunami scenario reveals extensively inundated areas, with up to 48% of the buildings flooded, and consequently a significantly higher tsunami risk. We expect up to 60,000 damaged buildings and up to 380,000 residents directly affected in the Muscat Capital Area, accompanied by a significant loss of life and damage to vital infrastructure. The rapid urbanization processes in the Muscat Capital Area, predominantly in areas along the coast, in combination with infrastructural, demographic and economic growth, will additionally increase the tsunami risk and therefore emphasize the importance of tsunami risk assessment in Oman.

  6. The effects of performance-based assessment criteria on student performance and self-assessment skills.

    PubMed

    Fastré, Greet Mia Jos; van der Klink, Marcel R; van Merriënboer, Jeroen J G

    2010-10-01

    This study investigated the effect of performance-based versus competence-based assessment criteria on task performance and self-assessment skills among 39 novice secondary vocational education students in the domain of nursing and care. In the performance-based assessment group, students were provided with a preset list of performance-based assessment criteria for the task at hand, describing what students should do. This group was compared to a competence-based assessment group in which students received a preset list of competence-based assessment criteria, describing what students should be able to do. The test phase revealed that the performance-based group outperformed the competence-based group on test task performance. In addition, the higher performance of the performance-based group was reached with lower reported mental effort during training, indicating a higher instructional efficiency for novice students. PMID:20054648
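
    The "instructional efficiency" referred to above is commonly computed from standardized performance and mental-effort scores following Paas and van Merriënboer; assuming that convention (the study may use a variant),

      E = \frac{z_{performance} - z_{effort}}{\sqrt{2}}

    so that higher test performance achieved with lower reported mental effort yields a higher efficiency E.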

  7. CSI-Chocolate Science Investigation and the Case of the Recipe Rip-Off: Using an Extended Problem-Based Scenario to Enhance High School Students' Science Engagement

    ERIC Educational Resources Information Center

    Marle, Peter D.; Decker, Lisa; Taylor, Victoria; Fitzpatrick, Kathleen; Khaliqi, David; Owens, Janel E.; Henry, Renee M.

    2014-01-01

    This paper discusses a K-12/university collaboration in which students participated in a four-day scenario-based summer STEM (science, technology, engineering, and mathematics) camp aimed at making difficult scientific concepts salient. This scenario, Jumpstart STEM-CSI: Chocolate Science Investigation (JSCSI), used open- and guided-inquiry…

  8. Groundwater from Clouds - Coupling a Regional Groundwater Model with Recharge Scenarios Based on Cloud Forest Distribution in Oman

    NASA Astrophysics Data System (ADS)

    Mueller, T. H.; Bawain, A. M., Sr.; Friesen, J.

    2014-12-01

    The Dhofar mountain range in southern Oman divides the semi-arid coastal plain to the South from the arid desert region to the North of the mountain range. The demand for fresh water in the region is almost exclusively covered by groundwater. Possible sources for groundwater recharge in Dhofar are the monsoon season - which is unique for the Arabian Peninsula - and rare storm events recurring infrequently every 3 to 7 years. The present study focuses on the recharge potential of the monsoon and the role of the Dhofar cloud forest. The monsoon with its orographic rainfall distribution is the most reliable source of precipitation for the Dhofar area, but is limited to the south side of the mountains. Every year between June and September, light rain and fog are brought to the area, yielding mean rainfall values of 104 mm per year in the plain and 299 mm per year at high elevations. This results in higher spring discharge at the foot of the mountains and subsurface recharge into the coastal plain. Results from previous ecohydrological studies on the Dhofar cloud forest hydrology suggest that the recharge potential in forested regions is substantially higher than from other land cover in the region. Reasons for this lie in the ability of forested regions to capture cloud water or fog and in the interception processes, specifically stemflow, that channel substantial amounts of water directly into deeper soil layers. In short, the cloud forest land cover receives fog water in addition to rainfall, and the interception processes result in preferential pathways below the forest. The different recharge potential of cloud forest areas versus a classic distributed infiltration pattern caused by regular rainfall, as well as the spatial distribution of the cloud forest, has been used to formulate different recharge scenarios. A groundwater flow model was developed reproducing the north-to-south gradient and the observed heads and outflows at the foot of the mountain. Different

  9. The Effects of Performance-Based Assessment Criteria on Student Performance and Self-Assessment Skills

    ERIC Educational Resources Information Center

    Fastre, Greet Mia Jos; van der Klink, Marcel R.; van Merrienboer, Jeroen J. G.

    2010-01-01

    This study investigated the effect of performance-based versus competence-based assessment criteria on task performance and self-assessment skills among 39 novice secondary vocational education students in the domain of nursing and care. In a performance-based assessment group students are provided with a preset list of performance-based…

  10. Physics-model-based nonlinear actuator trajectory optimization and safety factor profile feedback control for advanced scenario development in DIII-D

    NASA Astrophysics Data System (ADS)

    Barton, J. E.; Boyer, M. D.; Shi, W.; Wehner, W. P.; Schuster, E.; Ferron, J. R.; Walker, M. L.; Humphreys, D. A.; Luce, T. C.; Turco, F.; Penaflor, B. G.; Johnson, R. D.

    2015-09-01

    DIII-D experimental results are reported to demonstrate the potential of physics-model-based safety factor profile control for robust and reproducible sustainment of advanced scenarios. In the absence of feedback control, variability in wall conditions and plasma impurities, as well as drifts due to external disturbances, can limit the reproducibility of discharges with simple pre-programmed scenario trajectories. The control architecture utilized is a feedforward + feedback scheme where the feedforward commands are computed off-line and the feedback commands are computed on-line. In this work, a first-principles-driven (FPD), physics-based model of the q profile and normalized beta (β_N) dynamics is first embedded into a numerical optimization algorithm to design feedforward actuator trajectories that steer the plasma through the tokamak operating space to reach a desired stationary target state that is characterized by the achieved q profile and β_N. Good agreement between experimental results and simulations demonstrates the accuracy of the models employed for physics-model-based control design. Second, a feedback algorithm for q profile control is designed following an FPD approach, and the ability of the controller to achieve and maintain a target q profile evolution is tested in DIII-D high confinement (H-mode) experiments. The controller is shown to be able to effectively control the q profile when β_N is relatively close to the target, indicating the need for integrated q profile and β_N control to further enhance the ability to achieve robust scenario execution. The ability of an integrated q profile + β_N feedback controller to track a desired target is demonstrated through simulation.
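
    The feedforward + feedback architecture described above can be sketched generically: an off-line optimized actuator trajectory supplies the nominal command, and an on-line feedback term corrects for the deviation of the measured q profile from its target. The sketch below is a schematic proportional-integral correction under those assumptions, not the first-principles-driven controller of the paper.

      # Schematic feedforward + feedback actuator command (illustrative only; the gains,
      # signals and control law are placeholders, not the DIII-D design).
      import numpy as np

      def control_step(u_ff, q_target, q_measured, integ_err, dt, kp=0.5, ki=0.1):
          """Total actuator command = feedforward + PI feedback on the q-profile error."""
          err = q_target - q_measured          # tracking error at a few radial points
          integ_err = integ_err + err * dt     # accumulated integral of the error
          u_fb = kp * err + ki * integ_err     # proportional-integral correction
          return u_ff + u_fb, integ_err

      u_ff = np.array([1.0, 0.8, 0.6])         # value of the pre-computed (off-line) trajectory
      q_target = np.array([1.5, 2.0, 3.0])
      q_measured = np.array([1.4, 2.1, 2.9])
      u, integ = control_step(u_ff, q_target, q_measured, np.zeros(3), dt=0.02)
      print(u)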

  11. Soil retention of hexavalent chromium released from construction and demolition waste in a road-base-application scenario.

    PubMed

    Butera, Stefania; Trapp, Stefan; Astrup, Thomas F; Christensen, Thomas H

    2015-11-15

    We investigated the retention of Cr(VI) in three subsoils with low organic matter content in laboratory experiments at concentration levels relevant to represent leachates from construction and demolition waste (C&DW) reused as unbound material in road construction. The retention mechanism appeared to be reduction and subsequent precipitation as Cr(III) on the soil. The reduction process was slow, and in several experiments it was still proceeding at the end of the six-month experimental period. The overall retention reaction was well described by a second-order reaction governed by the actual Cr(VI) concentration and the reduction capacity of the soil. The experimentally determined reduction capacities and second-order kinetic parameters were used to model, for a 100-year period, the one-dimensional migration of Cr(VI) in the subsoil under a layer of C&DW. The resulting Cr(VI) concentration would be negligible below 7-70 cm depth. However, in frigid climates and with high water infiltration through the road pavement, the reduction reaction could be so slow that Cr(VI) might migrate as deep as 200 cm under the road. The reaction parameters and the model can form the basis for systematically assessing under which scenarios Cr(VI) from C&DW could lead to an environmental issue for ground- and receiving surface waters. PMID:26148961
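
    The second-order retention model described above (rate governed jointly by the remaining Cr(VI) concentration and the remaining reduction capacity of the soil) can be written as dC/dt = -k·C·RC and integrated numerically as in the sketch below; the rate constant and initial values are illustrative, not the experimentally fitted parameters.

      # Second-order Cr(VI) reduction sketch: dC/dt = dRC/dt = -k * C * RC, where C is the
      # Cr(VI) concentration and RC the remaining soil reduction capacity. All values are
      # illustrative placeholders, not the parameters determined in the study.
      def integrate_cr_reduction(c0, rc0, k, t_end, dt):
          c, rc = c0, rc0
          steps = int(t_end / dt)
          for _ in range(steps):
              rate = k * c * rc                # second-order rate
              c = max(c - rate * dt, 0.0)
              rc = max(rc - rate * dt, 0.0)
          return c, rc

      c_final, rc_final = integrate_cr_reduction(c0=1.0, rc0=5.0, k=0.01, t_end=180.0, dt=0.1)
      print(f"Cr(VI) remaining after 180 time units: {c_final:.3f} (arbitrary units)")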

  12. Projections of Water Stress Based on an Ensemble of Socioeconomic Growth and Climate Change Scenarios: A Case Study in Asia.

    PubMed

    Fant, Charles; Schlosser, C Adam; Gao, Xiang; Strzepek, Kenneth; Reilly, John

    2016-01-01

    The sustainability of future water resources is of paramount importance and is affected by many factors, including population, wealth and climate. Inherent in current methods to estimate these factors in the future is the uncertainty of their prediction. In this study, we integrate a large ensemble of scenarios--internally consistent across economics, emissions, climate, and population--to develop a risk portfolio of water stress over a large portion of Asia that includes China, India, and Mainland Southeast Asia in a future with unconstrained emissions. We isolate the effects of socioeconomic growth from the effects of climate change in order to identify the primary drivers of stress on water resources. We find that water needs related to socioeconomic changes, which are currently small, are likely to increase considerably in the future, often overshadowing the effect of climate change on levels of water stress. As a result, there is a high risk of severe water stress in densely populated watersheds by 2050, compared to recent history. There is strong evidence to suggest that, in the absence of autonomous adaptation or societal response, a much larger portion of the region's population will live in water-stressed regions in the near future. Tools and studies such as these can effectively investigate large-scale system sensitivities and can be useful in engaging and informing decision makers. PMID:27028871

  13. Scenario analysis for integrated water resources planning and management under uncertainty in the Zayandehrud river basin

    NASA Astrophysics Data System (ADS)

    Safavi, Hamid R.; Golmohammadi, Mohammad H.; Sandoval-Solis, Samuel

    2016-08-01

    The goal of this study is to develop and analyze three scenarios in the Zayandehrud river basin in Iran using a model already built and calibrated by Safavi et al. (2015) that has results for the baseline scenario. Results from the baseline scenario show that water demands will be supplied at the cost of depletion of surface and ground water resources, making this scenario undesirable and unsustainable. Supply Management, Demand Management, and Meta (supply and demand management) scenarios are the selected scenarios in this study. They are developed and incorporated into the Zayandehrud model to assess and evaluate the future status of the basin, and certain strategies are employed to improve and rectify the current management policies. The five performance criteria of time-based and volumetric reliability, resilience, vulnerability, and maximum deficit are employed in the process of scenario analysis and evaluation. The results obtained from the performance criteria are combined into a so-called 'Water Resources Sustainability Index' to facilitate comparison among the likely trade-offs. Uncertainties arising from historical data, management policies, the rainfall-runoff model, demand priorities, and performance criteria are considered in the proposed conceptual framework and modeled by appropriate approaches. Results show that the Supply Management scenario can improve the supply of water demands but has no tangible effect on the improvement of the resources in the study region. In this regard, the Demand Management scenario is found to be more effective than the supply management one, although it still remains unacceptable. Results of the Meta scenario indicate that both the supply and demand management scenarios must be applied if the water resources are to be safeguarded against degradation and depletion. In other words, the supply management scenario is necessary but not adequate; rather, it must be coupled to the demand
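
    The aggregation of the performance criteria into a 'Water Resources Sustainability Index' can be sketched along the lines of the geometric-mean index often attributed to Sandoval-Solis and co-authors; the definitions below (and the omission of the maximum-deficit criterion) are a simplification and may differ from the exact index used in this study.

      # Sketch: time-based performance criteria for one demand site and a geometric-mean
      # sustainability index. Definitions are a common simplification, not the study's exact index.
      import numpy as np

      def sustainability_index(supplied, demand):
          deficit = np.maximum(demand - supplied, 0.0)
          failure = deficit > 0
          reliability = 1.0 - failure.mean()                    # fraction of periods fully supplied
          if failure.sum() == 0:
              resilience = 1.0
          else:
              recoveries = np.sum(failure[:-1] & ~failure[1:])  # failures followed by a recovery
              resilience = recoveries / failure.sum()
          vulnerability = deficit.sum() / demand.sum()          # dimensionless deficit severity
          return (reliability * resilience * (1.0 - vulnerability)) ** (1.0 / 3.0)

      supplied = np.array([10, 9, 7, 10, 10, 6, 10.0])          # delivered volumes per period
      demand = np.full(7, 10.0)                                 # target demand per period
      print(round(sustainability_index(supplied, demand), 3))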

  14. Fuzzy logic based sensor performance evaluation of vehicle mounted metal detector systems

    NASA Astrophysics Data System (ADS)

    Abeynayake, Canicious; Tran, Minh D.

    2015-05-01

    Vehicle Mounted Metal Detector (VMMD) systems are widely used for detection of threat objects in humanitarian demining and military route clearance scenarios. Due to the diverse nature of such operational conditions, operational use of VMMD without a proper understanding of its capability boundaries may lead to heavy casualties. Multi-criteria fitness evaluations are crucial for determining the capability boundaries of any sensor-based demining equipment. Evaluation of sensor-based military equipment is a multi-disciplinary topic combining the efforts of researchers, operators, managers and commanders having different professional backgrounds and knowledge profiles. Information acquired through field tests usually involves uncertainty, vagueness and imprecision due to variations in test and evaluation conditions during a single test or series of tests. This report presents a fuzzy logic based methodology for experimental data analysis and performance evaluation of VMMD. This data evaluation methodology has been developed to evaluate sensor performance by consolidating expert knowledge with experimental data. A case study is presented by implementing the proposed data analysis framework in a VMMD evaluation scenario. The results of this analysis confirm the accuracy, practicability and reliability of the fuzzy logic based sensor performance evaluation framework.
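
    A minimal flavour of the fuzzy evaluation step (mapping a measured detection metric into linguistic grades and combining them into a single fitness value) is sketched below; the membership functions, grade centroids and weights are invented for illustration and are not those of the report.

      # Minimal fuzzy-evaluation sketch: triangular memberships over a detection-rate score,
      # defuzzified into one fitness value. Shapes and centroids are illustrative only.
      def tri(x, a, b, c):
          """Triangular membership function with feet at a and c and peak at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def fuzzy_fitness(detection_rate):
          poor = tri(detection_rate, 0.0, 0.3, 0.6)
          acceptable = tri(detection_rate, 0.4, 0.7, 0.9)
          good = tri(detection_rate, 0.7, 1.0, 1.3)
          # defuzzify as a weighted average of the grade centroids (0.2, 0.7, 1.0)
          num = 0.2 * poor + 0.7 * acceptable + 1.0 * good
          den = poor + acceptable + good
          return num / den if den > 0 else 0.0

      print(round(fuzzy_fitness(0.85), 3))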

  15. Evaluation of Precipitation from CMIP5 Models for Western Colorado and Development of a Scenario based Method for Regional Climate Change Planning

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Arnott, J. C.; Rood, R. B.

    2015-12-01

    As the latest generation of climate and Earth system models becomes more complex, the use of model output to project certain impact-relevant indicators for regional climate change adaptation planning is increasingly sought. However, barriers due to limited model skill remain when utilizing these model data to project changes in precipitation over mountainous areas at regional and subregional scales. Complex topography, localized meteorological phenomena, and other factors are still not well represented by global-scale models, which can limit the representation of key impact criteria needed for planning. We explore limitations and opportunities of utilizing precipitation data from Coupled Model Intercomparison Project Phase 5 (CMIP5) models to provide use-relevant projections of future precipitation conditions in Western Colorado, with a focus on applications relevant to climate information needs in the resort community of Aspen. First, a model skill evaluation is conducted by comparing precipitation and temperature values of a selected model ensemble from CMIP5 to observations during a historical period. The comparison is conducted at both temporal and spatial scales, on both yearly and seasonal increments. Results indicate that the models are more skillful at representing temperature than precipitation and that the apparent lack of skill for precipitation warrants caution in the use of such data in climate impact assessments that serve to inform adaptation planning and preparedness decision making. In light of the model evaluation, the authors introduce a scenario-based method in which individual models within the CMIP5 ensemble are organized into plausible qualitative futures and individual model runs are selected as representative scenarios to which detailed analysis can then be applied. The results from the scenario-based method are viewed as useful for exploring regional climate futures in instances when it is not appropriate to utilize data directly from global-scale climate models.
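
    The historical skill evaluation described above can be sketched with simple aggregate metrics (bias, RMSE and correlation) computed per model and season; the arrays below are random placeholders standing in for regridded model output and observations over the Western Colorado domain.

      # Sketch of a per-model skill check: bias, RMSE and correlation of simulated versus
      # observed seasonal precipitation over a historical period. Data are placeholders.
      import numpy as np

      def skill_metrics(model, obs):
          bias = np.mean(model - obs)
          rmse = np.sqrt(np.mean((model - obs) ** 2))
          corr = np.corrcoef(model, obs)[0, 1]
          return bias, rmse, corr

      rng = np.random.default_rng(0)
      obs = rng.gamma(4.0, 25.0, size=30)                    # 30 winters of observed precip (mm)
      model = obs * 0.8 + rng.normal(0.0, 20.0, size=30)     # a model with a dry bias plus noise
      print("bias=%.1f mm  rmse=%.1f mm  r=%.2f" % skill_metrics(model, obs))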

  16. Scenario-based numerical modelling and the palaeo-historic record of tsunamis in Wallis and Futuna, Southwest Pacific

    NASA Astrophysics Data System (ADS)

    Lamarche, G.; Popinet, S.; Pelletier, B.; Mountjoy, J.; Goff, J.; Delaux, S.; Bind, J.

    2015-08-01

    We investigated the tsunami hazard in the remote French territory of Wallis and Futuna, Southwest Pacific, using the Gerris flow solver to produce numerical models of tsunami generation, propagation and inundation. Wallis consists of the inhabited volcanic island of Uvéa that is surrounded by a lagoon delimited by a barrier reef. Futuna and the island of Alofi form the Horn Archipelago located ca. 240 km east of Wallis. They are surrounded by a narrow fringing reef. Futuna and Alofi emerge from the North Fiji Transform Fault that marks the seismically active Pacific-Australia plate boundary. We generated 15 tsunami scenarios. For each, we calculated maximum wave elevation (MWE), inundation distance and expected time of arrival (ETA). The tsunami sources were local, regional and distant earthquake faults located along the Pacific Rim. In Wallis, the outer reef may experience 6.8 m-high MWE. Uvéa is protected by the barrier reef and the lagoon, but inundation depths of 2-3 m occur in several coastal areas. In Futuna, flow depths exceeding 2 m are modelled in several populated areas, and have been confirmed by a post-September 2009 South Pacific tsunami survey. The channel between the islands of Futuna and Alofi amplified the 2009 tsunami, which resulted in inundation distance of almost 100 m and MWE of 4.4 m. This first ever tsunami hazard modelling study of Wallis and Futuna compares well with palaeotsunamis recognised on both islands and observation of the impact of the 2009 South Pacific tsunami. The study provides evidence for the mitigating effect of barrier and fringing reefs from tsunamis.

  17. Scenario-based numerical modelling and the palaeo-historic record of tsunamis in Wallis and Futuna, Southwest Pacific

    NASA Astrophysics Data System (ADS)

    Lamarche, G.; Popinet, S.; Pelletier, B.; Mountjoy, J.; Goff, J.; Delaux, S.; Bind, J.

    2015-04-01

    We investigated the tsunami hazard in the remote French territory of Wallis and Futuna, Southwest Pacific, using the Gerris flow solver to produce numerical models of tsunami generation, propagation and inundation. Wallis consists of the inhabited volcanic island of Uvéa that is surrounded by a lagoon delimited by a barrier reef. Futuna and the island of Alofi form the Horn Archipelago located ca. 240 km east of Wallis. They are surrounded by a narrow fringing reef. Futuna and Alofi emerge from the North Fiji Transform Fault that marks the seismically active Pacific-Australia plate boundary. We generated fifteen tsunami scenarios. For each, we calculated maximum wave elevation (MWE), inundation distance, and Expected Time of Arrival (ETA). The tsunami sources were local, regional and distant earthquake faults located along the Pacific Rim. In Wallis, the outer reef may experience 6.8 m-high MWE. Uvéa is protected by the barrier reef and the lagoon, but inundation depths of 2-3 m occur in several coastal areas. In Futuna, flow depths exceeding 2 m are modelled in several populated areas, and have been confirmed by a post-September 2009 South Pacific tsunami survey. The channel between the islands of Futuna and Alofi amplified the 2009 tsunami, which resulted in inundation distance of almost 100 m and MWE of 4.4 m. This first-ever tsunami hazard modelling study of Wallis and Futuna compares well with palaeotsunamis recognised on both islands and observation of the impact of the 2009 South Pacific tsunami. The study provides evidence for the mitigating effect of barrier and fringing reefs from tsunamis.

  18. Limits on the significant mass-loss scenario based on the globular clusters of the Fornax dwarf spheroidal galaxy

    NASA Astrophysics Data System (ADS)

    Khalaj, P.; Baumgardt, H.

    2016-03-01

    Many of the scenarios proposed to explain the origin of chemically peculiar stars in globular clusters (GCs) require significant mass loss (≥95 per cent) to explain the observed fraction of such stars. In the GCs of the Fornax dwarf galaxy, significant mass loss could be a problem. Larsen et al. showed that there is a large ratio of GCs to metal-poor field stars in Fornax and that about 20-25 per cent of all the stars with [Fe/H] < -2 belong to the four metal-poor GCs. This imposes an upper limit of ~80 per cent on the mass loss that could have happened in Fornax GCs. In this paper, we propose a solution to this problem by suggesting that stars can leave the Fornax galaxy. We use a series of N-body simulations to determine the limit of mass loss from Fornax as a function of the initial orbital radii of GCs and the speed with which stars leave Fornax GCs. We consider a set of cored and cuspy density profiles for Fornax. Our results show that with a cuspy model for Fornax, the fraction of stars that leave the galaxy can be as high as ~90 per cent, when the initial orbital radii of GCs are R = 2-3 kpc and the initial speed of stars is v > 20 km s-1. We show that such large velocities can be achieved by mass loss induced by gas expulsion but not by mass loss induced by stellar evolution. Our results imply that one cannot interpret the metallicity distribution of Fornax field stars as evidence against significant mass loss in Fornax GCs, if mass loss is due to gas expulsion.

  19. Policy and Validity Prospects for Performance-Based Assessment.

    ERIC Educational Resources Information Center

    Baker, Eva L.; And Others

    1994-01-01

    This article describes performance-based assessment as expounded by its proponents, comments on these conceptions, reviews evidence regarding the technical quality of performance-based assessment, and considers its validity under various policy options. (JDD)

  20. A Behavior-Based Employee Performance System.

    ERIC Educational Resources Information Center

    Abernathy, William B.

    2003-01-01

    Discusses human performance technology models for describing and understanding factors involved in day-to-day functioning of employees and then to develop specific remedial interventions as needed, and contrasts it to an organizational performance system perspective used to design an organization before employees are even hired to prevent bad…

  1. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Bases for performance... REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments. Performance-based payments may be made on any of the following bases:...

  2. The use of a decision tree based on the rabies diagnosis scenario, to assist the implementation of alternatives to laboratory animals.

    PubMed

    Bones, Vanessa C; Molento, Carla Forte Maiolino

    2016-05-01

    Brazilian federal legislation makes the use of alternatives mandatory when there are validated methods to replace the use of laboratory animals. The objective of this paper is to introduce a novel decision tree (DT)-based approach, which can be used to assist the replacement of laboratory animal procedures in Brazil. This project is based on a previous analysis of the rabies diagnosis scenario, in which we identified certain barriers that hinder replacement, such as: a) the perceived higher costs of alternative methods; b) the limited availability of staff qualified in these methods; c) resistance to change by laboratory staff; d) regulatory obstacles, including incompatibilities between the Federal Environmental Crimes Act and specific norms and working practices relating to the use of laboratory animals; and e) the lack of government incentives. The DT represents a highly promising means of overcoming these reported barriers to the replacement of laboratory animal use in Brazil. It provides guidance to address the main obstacles and, followed step-by-step, would lead to the implementation of validated alternative methods (VAMs), or to their development when such alternatives do not exist. The DT appears suitable for application to laboratory animal use scenarios where alternative methods already exist, such as in the case of rabies diagnosis, and could contribute to increased compliance with the Three Rs principles in science and with the current legal requirements in Brazil. PMID:27256454

  3. The National Commission on Performance-Based Education.

    ERIC Educational Resources Information Center

    McDonald, Frederick J.

    The National Commission on Performance-Based Education was formed to coordinate and integrate nationally plans of performance-based teacher education. The goals of the commission are defining competence, evaluating it, training for it, and managing programs of performance-based education and certification. The initial programs of the commission…

  4. A Global Scale Scenario for Prebiotic Chemistry: Silica-Based Self-Assembled Mineral Structures and Formamide

    PubMed Central

    2016-01-01

    membrane are clearly specific, demonstrating that the mineral self-assembled membranes at the same time create space compartmentalization and selective catalysis of the synthesis of relevant compounds. Rather than requiring odd local conditions, the prebiotic organic chemistry scenario for the origin of life appears to be common at a universal scale and, most probably, earlier than ever thought for our planet. PMID:27115539

  5. A Global Scale Scenario for Prebiotic Chemistry: Silica-Based Self-Assembled Mineral Structures and Formamide.

    PubMed

    Saladino, Raffaele; Botta, Giorgia; Bizzarri, Bruno Mattia; Di Mauro, Ernesto; Garcia Ruiz, Juan Manuel

    2016-05-17

    clearly specific, demonstrating that the mineral self-assembled membranes at the same time create space compartmentalization and selective catalysis of the synthesis of relevant compounds. Rather than requiring odd local conditions, the prebiotic organic chemistry scenario for the origin of life appears to be common at a universal scale and, most probably, earlier than ever thought for our planet. PMID:27115539

  6. The need for and use of socio-economic scenarios for climate change analysis: A new approach based on shared socio-economic pathways

    SciTech Connect

    Kriegler, Elmar; O'Neill, Brian; Hallegatte, Stephane; Kram, Tom; Lempert, Rob; Moss, Richard H.; Wilbanks, Thomas

    2012-10-01

    A new set of socioeconomic scenarios, the Shared Socioeconomic Pathways, is described; these provide global narratives and socioeconomic pathways to pair with the climate model scenarios developed using the new Representative Concentration Pathways.

  7. Impacts of Performance-Based Accountability on Institutional Performance in the U.S.

    ERIC Educational Resources Information Center

    Shin, Jung Cheol

    2010-01-01

    In the 1990s, most US states adopted new forms of performance-based accountability, e.g., performance-based budgeting, funding, or reporting. This study analyzed changes in institutional performance following the adoption of these new accountability standards. We measured institutional performance by representative education and research…

  8. Scenario Development for the Southwestern United States

    NASA Astrophysics Data System (ADS)

    Mahmoud, M.; Gupta, H.; Stewart, S.; Liu, Y.; Hartmann, H.; Wagener, T.

    2006-12-01

    The primary goal of employing a scenario development approach for the U.S. southwest is to inform regional policy by examining future possibilities related to regional vegetation change, water-leasing, and riparian restoration. This approach is necessary because no existing scenario exercise has been applied explicitly to water resources across the entire southwest region. A formal approach for scenario development is adopted and applied to water resources issues within the arid and semi-arid regions of the U.S. southwest, following five progressive and reiterative phases: scenario definition, scenario construction, scenario analysis, scenario assessment, and risk management. In the scenario definition phase, the inputs of scientists, modelers, and stakeholders were collected in order to define and construct scenarios relevant to the southwest and its water sustainability needs. From stakeholder-driven scenario workshops and breakout sessions, the three main axes of principal change were identified to be climate change, population development patterns, and quality of information monitoring technology. Based on the extreme and varying conditions of these three main axes, eight scenario narratives were drafted to describe the state of each scenario's respective future and the events which led to it. Events and situations are described within each scenario narrative with respect to key variables: variables that are both important to regional water resources (as identified by scientists and modelers) and good tracking and monitoring indicators of change. The current phase consists of scenario construction, where the drafted scenarios are re-presented to regional scientists and modelers to verify that the proper key variables are included in (or excluded from) the eight narratives. The next step is to construct the data sets necessary to implement the eight scenarios on the respective computational models of modelers investigating vegetation change, water-leasing, and riparian

  9. Mission Scenario Development Workbench

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Baker, John; Gilbert, John; Hanks, David; Mandutianu, Dan; Hooper, David

    2006-01-01

    The Mission Scenario Development Workbench (MSDW) is a multidisciplinary performance analysis software tool for planning and optimizing space missions. It provides a number of new capabilities that are particularly useful for planning surface activities on other planets. MSDW enables rapid planning of a space mission and supports flight system and scientific-instrumentation trades. It also provides an estimate of the ability of flight, ground, and science systems to meet high-level mission goals and provides a means of evaluating expected mission performance at an early stage of planning in the project life cycle. In MSDW, activity plans and equipment-list spreadsheets are integrated with validated parameterized simulation models of spacecraft systems. In contrast to traditional approaches involving worst-case estimates with large margins, the approach embodied in MSDW affords more flexibility and more credible results early in the life cycle through the use of validated, variable-fidelity models of spacecraft systems. MSDW is expected to help maximize the scientific return on investment for space missions by establishing early an understanding of the performance required for a successful mission while reducing the risk of costly design changes made at late stages in the project life cycle.

  10. Scenario-based assessment of buildings' damage and population exposure due to earthquake-induced tsunamis for the town of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Pagnoni, G.; Armigliato, A.; Tinti, S.

    2015-12-01

    Alexandria is the second biggest city in Egypt with regards to population, is a key economic area in northern Africa and has very important tourist activity. Historical records indicate that it was severely affected by a number of tsunami events. In this work we assess the tsunami hazard by running numerical simulations of tsunami impact in Alexandria through the worst-case credible tsunami scenario analysis (WCTSA). We identify three main seismic sources: the western Hellenic Arc (WHA - reference event AD 365, Mw = 8.5), the eastern Hellenic Arc (EHA - reference event 1303, Mw = 8.0) and the Cyprus Arc (CA - hypothetical scenario earthquake with Mw = 8.0), inferred from the tectonic setting and from historical tsunami catalogues. All numerical simulations are carried out in two sea level conditions (mean sea level and maximum high-tide sea level) by means of the code UBO-TSUFD, developed and maintained by the Tsunami Research Team of the University of Bologna. Relevant tsunami metrics are computed for each scenario and then used to build aggregated fields such as the maximum flood depth and the maximum inundation area. We find that the case that produces the most relevant flooding in Alexandria is the EHA scenario, with wave heights up to 4 m. The aggregate fields are used for a building vulnerability assessment according to a methodology developed in the framework of the EU-FP6 project SCHEMA and further refined in this study, based on the adoption of a suitable building damage matrix and on water inundation depth. It is found that in the districts of El Dekhila and Al Amriyah, to the south-west of the port of Dekhila, over 12 000 (13 400 in the case of maximum high tide) buildings could be affected and hundreds of them could sustain damaging consequences, ranging from critical damage to total collapse. It is also found that in the same districts tsunami inundation covers an area of about 15 km2, resulting in more than 150 000 (165 000 in the case of maximum high

  11. Medical Scenarios Relevant to Spaceflight

    NASA Technical Reports Server (NTRS)

    Bacal, Kira; Hurs, Victor; Doerr, Harold

    2004-01-01

    The Medical Operational Support Team (MOST) was tasked by the JSC Space Medicine and Life Sciences Directorate (SLSD) to incorporate medical simulation into 1) medical training for astronaut crew medical officers (CMO) and medical flight control teams and 2) evaluations of procedures and resources required for medical care aboard the International Space Station (ISS). Development of evidence-based medical scenarios that mimic the physiology observed during spaceflight will be needed for the MOST to complete these two tasks. The MOST used a human patient simulator, the ISS-like resources in the Medical Simulation Laboratory (MSL), and evidence from space operations, military operations and the medical literature to develop space-relevant medical scenarios. These scenarios include conditions concerning airway management, Advanced Cardiac Life Support (ACLS) and mitigation of anaphylactic symptoms. The MOST has used these space-relevant medical scenarios to develop a preliminary space medical training regimen for NASA flight surgeons, Biomedical Flight Controllers (Biomedical Engineers; BME) and CMO-analogs. This regimen is conducted by the MOST in the MSL. The MOST has the capability to develop evidence-based, space-relevant medical scenarios that can help SLSD 1) demonstrate the proficiency of medical flight control teams to mitigate space-relevant medical events and 2) validate next-generation medical equipment and procedures for space medicine applications.

  12. An integrated exposure assessment of phthalates for the general population in China based on both exposure scenario and biomonitoring estimation approaches.

    PubMed

    Cao, Yan; Liu, Jianguo; Liu, Yang; Wang, Jie; Hao, Xuewen

    2016-02-01

    Available studies on the integrated exposure assessment of phthalates for the general population in China lack representativeness. Based on an exhaustive review of the extensive monitoring data available for China, this study presents a large-scale estimation of exposure levels to three typical phthalates, di(2-ethylhexyl) phthalate (DEHP), di-n-butyl phthalate (DBP) and diisobutyl phthalate (DiBP), by applying both exposure scenario and biomonitoring estimation approaches. The respective median exposure levels from the exposure scenario and biomonitoring estimation approaches were 3.80, 3.02 and 1.00 μg/kg bw/day and 3.38, 3.21 and 3.32 μg/kg bw/day for DEHP, DBP and DiBP, which are acceptable levels of exposure with respect to current international guidelines. Evaluation results from the two approaches showed both similarities and differences among the different phthalates, making the exposure assessment comparable and more comprehensive. In terms of sources of exposure, food intake was the largest contributor, while indoor air exposure made a greater contribution to the estimated daily intake (EDI) of DiBP than to that of the other phthalates. Moreover, more attention should be paid to the higher exposure levels of phthalates in several intensively industrialized and urbanized areas, and the causes of the different exposure levels in the different regions need to be further explored. PMID:26654930
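
    In the exposure-scenario approach referred to above, the estimated daily intake for each route is, in schematic form, concentration times intake rate divided by body weight, summed over routes; the sketch below uses placeholder numbers, not the Chinese monitoring data analysed in the study.

      # Schematic exposure-scenario estimate of daily intake (EDI) for one phthalate.
      # Concentrations, intake rates and body weight are placeholders only.
      routes = {
          # route: (concentration, daily intake rate)
          "food (ug/kg * kg/day)":         (150.0, 1.2),
          "indoor air (ug/m3 * m3/day)":   (0.5, 15.0),
          "dust ingestion (ug/g * g/day)": (20.0, 0.05),
      }
      body_weight_kg = 60.0

      edi = sum(conc * rate for conc, rate in routes.values()) / body_weight_kg
      print(f"EDI ~ {edi:.2f} ug/kg bw/day")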

  13. 3-D or median map? Earthquake scenario ground-motion maps from physics-based models versus maps from ground-motion prediction equations

    NASA Astrophysics Data System (ADS)

    Porter, K.

    2015-12-01

    There are two common ways to create a ground-motion map for a hypothetical earthquake: using ground motion prediction equations (by far the more common of the two) and using 3-D physics-based modeling. The former is very familiar to engineers, the latter much less so, and the difference can present a problem because engineers tend to trust the familiar and distrust novelty. Maps for essentially the same hypothetical earthquake using the two different methods can look very different, while appearing to present the same information. Using one or the other can lead an engineer or disaster planner to very different estimates of damage and risk. The reasons have to do with depiction of variability, spatial correlation of shaking, the skewed distribution of real-world shaking, and the upward-curving relationship between shaking and damage. The scientists who develop the two kinds of map tend to specialize in one or the other and seem to defend their turf, which can aggravate the problem of clearly communicating with engineers. The USGS Science Application for Risk Reduction's (SAFRR) HayWired scenario has addressed the challenge of explaining to engineers the differences between the two maps, and why, in a disaster planning scenario, one might want to use the less-familiar 3-D map.
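
    The central point above (that a damage estimate made from a median ground-motion map understates expected damage when shaking is roughly lognormal and damage grows convexly with shaking) can be demonstrated in a few lines; the fragility-style damage function below is invented purely for illustration.

      # Why a median shaking map can understate expected loss: shaking at a site is roughly
      # lognormal and damage rises convexly with shaking, so E[damage(S)] > damage(median S).
      # The power-law 'damage' function is an illustrative placeholder.
      import numpy as np

      rng = np.random.default_rng(1)
      median_pga = 0.3                                    # median shaking at one site (g)
      sigma_ln = 0.6                                      # lognormal standard deviation of shaking
      shaking = rng.lognormal(np.log(median_pga), sigma_ln, size=100_000)

      damage = lambda s: np.minimum((s / 0.8) ** 2, 1.0)  # convex, capped damage ratio

      print("damage at median shaking:", round(float(damage(median_pga)), 3))
      print("expected damage         :", round(float(damage(shaking).mean()), 3))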

  14. SPREADSHEET BASED SCALING CALCULATIONS AND MEMBRANE PERFORMANCE

    EPA Science Inventory

    Many membrane element manufacturers provide a computer program to aid buyers in the use of their elements. However, to date there are few examples of fully integrated public domain software available for calculating reverse osmosis and nanofiltration system performance. The Total...

  15. Performance Based Education. Technology Activity Modules.

    ERIC Educational Resources Information Center

    Custer, Rodney L., Ed.

    These Technology Activity Modules are designed to serve as an implementation resource for technology education teachers as they integrate technology education with Missouri's Academic Performance Standards and provide a source of activities and activity ideas that can be used to integrate and reinforce learning across the curriculum. The modules…

  16. SEU Performance of TAG Based Flip Flops

    NASA Technical Reports Server (NTRS)

    Shuler, Robert L.; Kouba, Coy; O'Neill, Patrick M.

    2005-01-01

    We describe heavy ion test results for two new SEU tolerant latches based on transition NAND gates, one for single-rail asynchronous and the other for dual-rail synchronous designs, implemented in a 0.5 μm AMI process.

  17. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    ERIC Educational Resources Information Center

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and engineers. The gameplay has two parts: a planning part,…

  18. The Experimental Effects of the Strategic Adolescent Reading Intervention (STARI) on a Scenarios-Based Reading Comprehension Assessment

    ERIC Educational Resources Information Center

    Kim, James; Hemphill, Lowry; Troyer, Margaret; Jones, Stephanie; LaRusso, Maria; Kim, Ha-Yeon; Donovan, Suzanne; Snow, Catherine

    2016-01-01

    Nearly one-quarter of U.S. eighth graders score below basic on national assessments of reading (NCES, 2013) and are poorly equipped for the reading demands of secondary school. Struggling adolescent readers cannot summarize a simple passage or use context to determine word meanings, and they have difficulty making text-based inferences. In addition,…

  19. Pre-Service Teachers' Perceptions on Game Based Learning Scenarios in Primary Reading and Writing Instruction Courses

    ERIC Educational Resources Information Center

    Karadag, Ruhan

    2015-01-01

    The aim of this study was to explore pre-service teachers' perceptions on the use of game-based learning in a Primary Reading and Writing Instruction Course. A mixed method research was used in the study. Participants were composed of a total of 189 pre-service teachers taking the Primary Reading and Writing Instruction course during the fall term…

  20. Context Impact of Clinical Scenario on Knowledge Transfer and Reasoning Capacity in a Medical Problem-Based Learning Curriculum

    ERIC Educational Resources Information Center

    Collard, A.; Brédart, S.; Bourguignon, J.-P.

    2016-01-01

    Since 2000, the faculty of Medicine at the University of Liège has integrated problem-based learning (PBL) seminars from year two to seven in its seven-year curriculum. The PBL approach has been developed to facilitate students' acquisition of reasoning capacity. This contextualized learning raises the question of the de- and re-contextualization…

  1. Human health screening level risk assessments of tertiary-butyl acetate (TBAC): calculated acute and chronic reference concentration (RfC) and Hazard Quotient (HQ) values based on toxicity and exposure scenario evaluations.

    PubMed

    Bus, James S; Banton, Marcy I; Faber, Willem D; Kirman, Christopher R; McGregor, Douglas B; Pourreau, Daniel B

    2015-02-01

    A screening level risk assessment has been performed for tertiary-butyl acetate (TBAC) examining its primary uses as a solvent in industrial and consumer products. Hazard quotients (HQ) were developed by merging TBAC animal toxicity and dose-response data with population-level, occupational and consumer exposure scenarios. TBAC has a low order of toxicity following subchronic inhalation exposure, and neurobehavioral changes (hyperactivity) in mice observed immediately after termination of exposure were used as conservative endpoints for derivation of acute and chronic reference concentration (RfC) values. TBAC is not genotoxic but has not been tested for carcinogenicity. However, TBAC is unlikely to be a human carcinogen in that its non-genotoxic metabolic surrogates tertiary-butanol (TBA) and methyl tertiary butyl ether (MTBE) produce only male rat α-2u-globulin-mediated kidney cancer and high-dose specific mouse thyroid tumors, both of which have little qualitative or quantitative relevance to humans. Benchmark dose (BMD)-modeling of the neurobehavioral responses yielded acute and chronic RfC values of 1.5 ppm and 0.3 ppm, respectively. After conservative modeling of general population and near-source occupational and consumer product exposure scenarios, almost all HQs were substantially less than 1. HQs exceeding 1 were limited to consumer use of automotive products and paints in a poorly ventilated garage-sized room (HQ = 313) and occupational exposures in small and large brake shops using no personal protective equipment or ventilation controls (HQs = 3.4-126.6). The screening level risk assessments confirm low human health concerns with most uses of TBAC and indicate that further data-informed refinements can address problematic health/exposure scenarios. The assessments also illustrate how tier-based risk assessments using read-across toxicity information to metabolic surrogates reduce the need for comprehensive animal testing. PMID:25629921
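
    The core arithmetic of a screening-level assessment of this kind is the hazard quotient, HQ = exposure concentration / RfC. The sketch below reuses the RfC values quoted in the abstract (acute 1.5 ppm, chronic 0.3 ppm), but the exposure concentrations and scenario names are hypothetical.

```python
# Minimal hazard-quotient (HQ) sketch: HQ = exposure concentration / RfC.
# RfC values mirror those reported in the abstract; the exposure
# concentrations below are hypothetical placeholders.

RFC_PPM = {"acute": 1.5, "chronic": 0.3}

def hazard_quotient(exposure_ppm: float, duration: str) -> float:
    return exposure_ppm / RFC_PPM[duration]

scenarios = {
    "general population (chronic)": ("chronic", 0.001),
    "consumer product use (acute)": ("acute", 0.4),
}

for name, (duration, conc) in scenarios.items():
    hq = hazard_quotient(conc, duration)
    flag = "potential concern" if hq > 1 else "below level of concern"
    print(f"{name}: HQ = {hq:.3f} ({flag})")
```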

  2. Scenario based tsunami wave height estimation towards hazard evaluation for the Hellenic coastline and examples of extreme inundation zones in South Aegean

    NASA Astrophysics Data System (ADS)

    Melis, Nikolaos S.; Barberopoulou, Aggeliki; Frentzos, Elias; Krassanakis, Vassilios

    2016-04-01

    A scenario-based methodology for tsunami hazard assessment is used, incorporating earthquake sources with the potential to produce extreme tsunamis (measured through their capacity to cause maximum wave height and inundation extent). In the present study we follow a two-phase approach. In the first phase, existing earthquake hazard zoning in the greater Aegean region is used to derive representative maximum expected earthquake magnitude events, with realistic seismotectonic source characteristics and the greatest tsunamigenic potential within each zone. By stacking the scenario-produced maximum wave heights, a global maximum map is constructed for the entire Hellenic coastline, corresponding to all expected extreme offshore earthquake sources. Further evaluation of the resulting coastline categories based on the maximum expected wave heights emphasizes the tsunami hazard in selected coastal zones with important functions (e.g. crowded tourist zones, industrial zones, airports, power plants, etc.). Owing to their proximity to the Hellenic Arc, their many urban centres and their popularity as tourist destinations, Crete and the South Aegean region are given top priority for defining extreme inundation zoning. In the second phase, a set of four large coastal cities (Kalamata, Chania, Heraklion and Rethymno), important for tsunami hazard owing, for example, to crowded beaches during the summer season or to industrial facilities, is explored with a view to preparedness and resilience for tsunami hazard in Greece. To simulate tsunamis in the Aegean region (generation, propagation and runup) the MOST - ComMIT NOAA code was used. High resolution DEMs for bathymetry and topography were joined via an interface specifically developed for the inundation maps in this study and with similar products in mind. For the examples explored in the present study, we used 5 m resolution for the topography and 30 m resolution for the bathymetry. Although this study can be considered as
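
    The "stacking" of scenario-produced maximum wave heights described above amounts to taking the element-wise maximum over the per-scenario grids. A minimal sketch with synthetic grids (not model output from this study) follows; the hazard-class thresholds are illustrative.

```python
import numpy as np

# Sketch of stacking per-scenario maximum wave-height grids into a single
# envelope map by taking the element-wise maximum over all scenarios.
# The grids here are random placeholders standing in for model output.

rng = np.random.default_rng(0)
n_scenarios, ny, nx = 5, 100, 120
scenario_grids = rng.gamma(shape=2.0, scale=0.8, size=(n_scenarios, ny, nx))

envelope = scenario_grids.max(axis=0)                   # max wave height per cell
hazard_class = np.digitize(envelope, [1.0, 3.0, 6.0])   # e.g. four coastal categories

print("envelope shape:", envelope.shape, "max:", round(float(envelope.max()), 2))
print("cells per hazard class:", np.bincount(hazard_class.ravel(), minlength=4))
```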

  3. Reliable Multihop Broadcast Protocol with a Low-Overhead Link Quality Assessment for ITS Based on VANETs in Highway Scenarios

    PubMed Central

    Galaviz-Mosqueda, Alejandro; Villarreal-Reyes, Salvador; Galeana-Zapién, Hiram; Rubio-Loyola, Javier; Covarrubias-Rosales, David H.

    2014-01-01

    Vehicular ad hoc networks (VANETs) have been identified as a key technology for enabling intelligent transport systems (ITS), which aim to radically improve the safety, comfort, and greenness of vehicles on the road. However, in order to fully exploit the potential of VANETs, several issues must be addressed. Because of the highly dynamic topology of VANETs and the impairments of the wireless channel, one key issue is the multihop dissemination of broadcast packets for safety and infotainment applications. In this paper a reliable low-overhead multihop broadcast (RLMB) protocol is proposed to address the well-known broadcast storm problem. The proposed RLMB takes advantage of the hello messages exchanged between vehicles and processes this information to intelligently select a relay set and reduce redundant broadcasts. Additionally, to reduce the dependency on the hello message rate, RLMB uses a point-to-zone link evaluation approach. RLMB performance is compared with one of the leading multihop broadcast protocols available to date. Performance metrics show that our RLMB solution outperforms the leading protocol in terms of important metrics such as packet dissemination ratio, overhead, and delay. PMID:25133224
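
    For illustration only, the sketch below shows a generic greedy relay-set selection driven by hello-message neighbour tables (cover all known two-hop neighbours with as few one-hop relays as possible). It conveys the general idea of hello-based relay selection, not the actual RLMB algorithm or its point-to-zone link evaluation.

```python
# Generic greedy relay-set selection from hello-message neighbour tables.
# NOT the RLMB algorithm from the paper; a hypothetical illustration only.

def select_relays(my_neighbors, neighbor_tables):
    """my_neighbors: set of one-hop neighbour ids.
    neighbor_tables: dict mapping each one-hop neighbour to the neighbour set
    it advertises in its hello messages (own id assumed to be "me")."""
    two_hop = set().union(*neighbor_tables.values()) - my_neighbors - {"me"}
    relays, uncovered = set(), set(two_hop)
    while uncovered:
        candidates = my_neighbors - relays
        if not candidates:
            break
        best = max(candidates, key=lambda n: len(neighbor_tables[n] & uncovered))
        gained = neighbor_tables[best] & uncovered
        if not gained:
            break  # remaining two-hop nodes are not reachable via known relays
        relays.add(best)
        uncovered -= gained
    return relays

tables = {"A": {"me", "X", "Y"}, "B": {"me", "Y", "Z"}, "C": {"me", "Z"}}
print(select_relays({"A", "B", "C"}, tables))  # e.g. {'A', 'B'}
```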

  4. Reliable multihop broadcast protocol with a low-overhead link quality assessment for ITS based on VANETs in highway scenarios.

    PubMed

    Galaviz-Mosqueda, Alejandro; Villarreal-Reyes, Salvador; Galeana-Zapién, Hiram; Rubio-Loyola, Javier; Covarrubias-Rosales, David H

    2014-01-01

    Vehicular ad hoc networks (VANETs) have been identified as a key technology for enabling intelligent transport systems (ITS), which aim to radically improve the safety, comfort, and greenness of vehicles on the road. However, in order to fully exploit the potential of VANETs, several issues must be addressed. Because of the highly dynamic topology of VANETs and the impairments of the wireless channel, one key issue is the multihop dissemination of broadcast packets for safety and infotainment applications. In this paper a reliable low-overhead multihop broadcast (RLMB) protocol is proposed to address the well-known broadcast storm problem. The proposed RLMB takes advantage of the hello messages exchanged between vehicles and processes this information to intelligently select a relay set and reduce redundant broadcasts. Additionally, to reduce the dependency on the hello message rate, RLMB uses a point-to-zone link evaluation approach. RLMB performance is compared with one of the leading multihop broadcast protocols available to date. Performance metrics show that our RLMB solution outperforms the leading protocol in terms of important metrics such as packet dissemination ratio, overhead, and delay. PMID:25133224

  5. Environmental Health Practice: Statistically Based Performance Measurement

    PubMed Central

    Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.

    2007-01-01

    Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was also shown when a modified Bonferroni adjustment for multiple comparisons was performed. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
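
    The statistical comparison described, Fisher's exact test per performance category with a Bonferroni-type adjustment, can be sketched as follows; the compliant/non-compliant facility counts are hypothetical, not the study's data.

```python
from scipy.stats import fisher_exact

# Baseline vs post-intervention comparison per performance category using
# Fisher's exact test and a Bonferroni-adjusted threshold. Counts are
# hypothetical placeholders, not the Rhode Island study data.

categories = {
    # (baseline compliant, baseline non-compliant, post compliant, post non-compliant)
    "occupational health & safety": (40, 42, 65, 17),
    "air pollution control":        (35, 47, 60, 22),
    "hazardous waste management":   (50, 32, 70, 12),
    "wastewater discharge":         (45, 37, 66, 16),
}
alpha = 0.05 / len(categories)  # Bonferroni-adjusted significance threshold

for name, (bc, bn, pc, pn) in categories.items():
    _, p = fisher_exact([[bc, bn], [pc, pn]], alternative="two-sided")
    verdict = "significant" if p < alpha else "n.s."
    print(f"{name}: p = {p:.4f} ({verdict} at adjusted alpha)")
```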

  6. SDG&E's performance-based ratemaking

    SciTech Connect

    Scadding, J.

    1995-11-01

    Performance-based ratemaking (PBR) in the electric utility industry is outlined. The following topics are discussed: average cents/kWh for residential customers; PBR throws its shadow before it; two phases of SDG&E's PBR; elements of base-rate PBR; the price performance benchmark; 'non-price' performance indicators; two-way conditionality; and sharing and off-ramps.

  7. Competency/Performance-Based Student Teaching Program.

    ERIC Educational Resources Information Center

    Simms, Earline M.

    The competency-based, student teaching program of the Southern University (Baton Rouge, Louisiana) College of Education is described. The program basis is a listing of fourteen competencies (teaching skills) which provides a guide for structured and meaningful activities during the observation period, consistency in directing those experiences,…

  8. Network based high performance concurrent computing

    SciTech Connect

    Sunderam, V.S.

    1991-01-01

    The overall objectives of this project are to investigate research issues pertaining to programming tools and efficiency issues in network based concurrent computing systems. The basis for these efforts is the PVM project that evolved during my visits to Oak Ridge Laboratories under the DOE Faculty Research Participation program; I continue to collaborate with researchers at Oak Ridge on some portions of the project.

  9. Performance optimization for space-based sensors: simulation and modelling at Fraunhofer IOSB

    NASA Astrophysics Data System (ADS)

    Schweitzer, Caroline; Stein, Karin

    2014-10-01

    The prediction of the effectiveness of a space-based sensor for its designated application in space (e.g. special earth surface observations or missile detection) can help to reduce the expenses, especially during the phases of mission planning and instrumentation. In order to optimize the performance of such systems we simulate and analyse the entire operational scenario, including the waveband options; various orbit heights and viewing angles; system design characteristics, e.g. pixel size and filter transmission; and atmospheric effects, e.g. different cloud types, climate zones and seasons. In the following, an evaluation of the appropriate infrared (IR) waveband for the designated sensor application is given. The simulation environment is also capable of simulating moving objects like aircraft or missiles. Therefore, the spectral signature of the object/missile as well as its track along a flight path is implemented. The resulting video sequence is then analysed by a tracking algorithm, and an estimate of the effectiveness of the sensor system can be obtained. This paper summarizes the work carried out at Fraunhofer IOSB in the field of simulation and modelling for the performance optimization of space-based sensors. The paper is structured as follows: First, an overview of the applied simulation and modelling software is given. Then, the capability of those tools is illustrated by means of a hypothetical threat scenario for space-based early warning (launch of a long-range ballistic missile (BM)).

  10. Assessment of future scenarios for wind erosion sensitivity changes based on ALADIN and REMO regional climate model simulation data

    NASA Astrophysics Data System (ADS)

    Mezősi, Gábor; Blanka, Viktória; Bata, Teodóra; Ladányi, Zsuzsanna; Kemény, Gábor; Meyer, Burghard C.

    2016-07-01

    The changes in rate and pattern of wind erosion sensitivity due to climate change were investigated for 2021-2050 and 2071-2100 compared to the reference period (1961-1990) in Hungary. The sensitivities of the main influencing factors (soil texture, vegetation cover and climate factor) were evaluated by a fuzzy method and a combined wind erosion sensitivity map was compiled. The climate factor, as the driving factor of the changes, was assessed based on observed data for the reference period, while REMO and ALADIN regional climate model simulation data were used for the future periods. The changes in wind erosion sensitivity were evaluated for potentially affected agricultural land use types, and hot spot areas were allocated. Based on the results, 5-6% of the total agricultural area was highly sensitive in the reference period. In the 21st century slight or moderate changes of wind erosion sensitivity can be expected, and mostly 'pastures', 'complex cultivation patterns', and 'land principally occupied by agriculture with significant areas of natural vegetation' are affected. The applied combination of a multi-indicator approach and fuzzy analysis provides novelty in the field of land sensitivity assessment. The method is suitable for regional scale analysis of wind erosion sensitivity changes and supports regional planning by allocating priority areas where changes in agro-technics or land use have to be considered.

  11. How do we know about Earth's history? Constructing the story of Earth's geologic history by collecting and interpreting evidence based scenarios.

    NASA Astrophysics Data System (ADS)

    Ruthford, Steven; DeBari, Susan; Linneman, Scott; Boriss, Miguel; Chesbrough, John; Holmes, Randall; Thibault, Allison

    2013-04-01

    Beginning in 2003, faculty from Western Washington University, Skagit Valley Community College, local public school teachers, and area tribal college members created an innovative, inquiry based undergraduate geology curriculum. The curriculum, titled "Energy and Matter in Earth's Systems," was supported through various grants and partnerships, including Math and Science Partnership and Noyce Teacher Scholarship grants from the National Science Foundation. During 2011, the authors wrote a geologic time unit for the curriculum. The unit is titled, "How Do We Know About Earth's History?" and has students actively investigate the concepts related to geologic time and methods for determining age. Starting with reflection and assessment of personal misconceptions called "Initial Ideas," students organize a series of events into a timeline. The unit then focuses on the concepts of relative dating, biostratigraphy, and historical attempts at absolute dating, including uniformitarianism, catastrophism, Halley and Joly's Salinity hypothesis, and Kelvin's Heat Loss model. With limited lecture and text, students then dive into current understandings of the age of the Earth, which include radioactive decay rates and radiometric dating. Finally, using their newfound understanding, students investigate a number of real world scenarios and create a timeline of events related to the geologic history of the Earth. The unit concludes with activities that reinforce the Earth's absolute age and direct students to summarize what they have learned by reorganizing the timeline from the "Initial Ideas" and sharing with the class. This presentation will include the lesson materials and findings from one activity titled, "The Earth's Story." The activity is located midway through the unit and begins with reflection on the question, "What are the major events in the Earth's history and when did they happen?" Students are directed to revisit the timeline of events from the "Initial Ideas

  12. Age-specific mechanisms in an SSVEP-based BCI scenario: evidences from spontaneous rhythms and neuronal oscillators.

    PubMed

    Ehlers, Jan; Valbuena, Diana; Stiller, Anja; Gräser, Axel

    2012-01-01

    Utilizing changes in steady-state visual evoked potentials (SSVEPs) is an established approach to operate a brain-computer interface (BCI). The present study elucidates to what extent development-specific changes in the background EEG influence the ability to proper handle a stimulus-driven BCI. Therefore we investigated the effects of a wide range of photic driving on children between six and ten years in comparison to an adult control group. The results show differences in the driving profiles apparently in close communication with the specific type of intermittent stimulation. The factor age gains influence with decreasing stimulation frequency, whereby the superior performance of the adults seems to be determined to a great extent by elaborated driving responses at 10 and 11 Hz, matching the dominant resonance frequency of the respective background EEG. This functional interplay was only partially obtained in higher frequency ranges and absent in the induced driving between 30 and 40 Hz, indicating distinctions in the operating principles and developmental changes of the underlying neuronal oscillators. PMID:23365562

  13. Novel free-boundary equilibrium and transport solver with theory-based models and its validation against ASDEX Upgrade current ramp scenarios

    NASA Astrophysics Data System (ADS)

    Fable, E.; Angioni, C.; Casson, F. J.; Told, D.; Ivanov, A. A.; Jenko, F.; McDermott, R. M.; Medvedev, S. Yu; Pereverzev, G. V.; Ryter, F.; Treutterer, W.; Viezzer, E.; the ASDEX Upgrade Team

    2013-12-01

    Tokamak scenario development requires an understanding of the properties that determine the kinetic profiles in non-steady plasma phases and of the self-consistent evolution of the magnetic equilibrium. Current ramps are of particular interest since many transport-relevant parameters explore a large range of values and their impact on transport mechanisms has to be assessed. To this purpose, a novel full-discharge modelling tool has been developed, which couples the transport code ASTRA (Pereverzev et al 1991 IPP Report 5/42) and the free boundary equilibrium code SPIDER (Ivanov et al 2005 32nd EPS Conf. on Plasma Physics vol 29C (ECA) P-5.063 and http://epsppd.epfl.ch/Tarragona/pdf/P5_063.pdf), utilizing a specifically designed coupling scheme. The current ramp-up phase can be accurately and reliably simulated using this scheme, where a plasma shape, position and current controller is applied, which mimics the one of ASDEX Upgrade. Transport of energy is provided by theory-based models (e.g. TGLF (Staebler et al 2007 Phys. Plasmas 14 055909)). A recipe based on edge-relevant parameters (Scott 2000 Phys. Plasmas 7 1845) is proposed to resolve the low current phase of the current ramps, where the impact of the safety factor on micro-instabilities could make quasi-linear approaches questionable in the plasma outer region. Current ramp scenarios, selected from ASDEX Upgrade discharges, are then simulated to validate both the coupling with the free-boundary evolution and the prediction of profiles. Analysis of the underlying transport mechanisms is presented, to clarify the possible physics origin of the observed L-mode empirical energy confinement scaling. The role of toroidal micro-instabilities (ITG, TEM) and of non-linear effects is discussed.

  14. Habitat availability and gene flow influence diverging local population trajectories under scenarios of climate change: a place-based approach.

    PubMed

    Schwalm, Donelle; Epps, Clinton W; Rodhouse, Thomas J; Monahan, William B; Castillo, Jessica A; Ray, Chris; Jeffress, Mackenzie R

    2016-04-01

    Ecological niche theory holds that species distributions are shaped by a large and complex suite of interacting factors. Species distribution models (SDMs) are increasingly used to describe species' niches and predict the effects of future environmental change, including climate change. Currently, SDMs often fail to capture the complexity of species' niches, resulting in predictions that are generally limited to climate-occupancy interactions. Here, we explore the potential impact of climate change on the American pika using a replicated place-based approach that incorporates climate, gene flow, habitat configuration, and microhabitat complexity into SDMs. Using contemporary presence-absence data from occupancy surveys, genetic data to infer connectivity between habitat patches, and 21 environmental niche variables, we built separate SDMs for pika populations inhabiting eight US National Park Service units representing the habitat and climatic breadth of the species across the western United States. We then predicted occurrence probability under current (1981-2010) and three future time periods (out to 2100). Occurrence probabilities and the relative importance of predictor variables varied widely among study areas, revealing important local-scale differences in the realized niche of the American pika. This variation resulted in diverse and - in some cases - highly divergent future potential occupancy patterns for pikas, ranging from complete extirpation in some study areas to stable occupancy patterns in others. Habitat composition and connectivity, which are rarely incorporated in SDM projections, were influential in predicting pika occupancy in all study areas and frequently outranked climate variables. Our findings illustrate the importance of a place-based approach to species distribution modeling that includes fine-scale factors when assessing current and future climate impacts on species' distributions, especially when predictions are intended to manage and

  15. Spreadsheet Based Scaling Calculations and Membrane Performance

    SciTech Connect

    Wolfe, T D; Bourcier, W L; Speth, T F

    2000-12-28

    Many membrane element manufacturers provide a computer program to aid buyers in the use of their elements. However, to date there are few examples of fully integrated public domain software available for calculating reverse osmosis and nanofiltration system performance. The Total Flux and Scaling Program (TFSP), written for Excel 97 and above, provides designers and operators new tools to predict membrane system performance, including scaling and fouling parameters, for a wide variety of membrane system configurations and feedwaters. The TFSP development was funded under EPA contract 9C-R193-NTSX. It is freely downloadable at www.reverseosmosis.com/download/TFSP.zip. TFSP includes detailed calculations of reverse osmosis and nanofiltration system performance. Of special significance, the program provides scaling calculations for mineral species not normally addressed in commercial programs, including aluminum, iron, and phosphate species. In addition, ASTM calculations for common species such as calcium sulfate (CaSO4·2H2O), BaSO4, SrSO4, SiO2, and LSI are also provided. Scaling calculations in commercial membrane design programs are normally limited to the common minerals and typically follow basic ASTM methods, which are for the most part graphical approaches adapted to curves. In TFSP, the scaling calculations for the less common minerals use subsets of the USGS PHREEQE and WATEQ4F databases and use the same general calculational approach as PHREEQE and WATEQ4F. The activities of ion complexes are calculated iteratively. Complexes that are unlikely to form in significant concentration were eliminated to simplify the calculations. The calculation provides the distribution of ions and ion complexes that is used to calculate an effective ion product "Q". The effective ion product is then compared to temperature adjusted solubility products (Ksp's) of solids in order to calculate a Saturation Index (SI) for each solid of
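
    In the same spirit as the TFSP calculation, a saturation index can be sketched as SI = log10(Q/Ksp), with Q the effective ion activity product and Ksp the temperature-adjusted solubility product; the activities and Ksp value below are hypothetical placeholders, not TFSP data.

```python
import math

# Minimal saturation-index sketch: SI = log10(Q / Ksp). A positive SI
# indicates supersaturation (scaling likely). Inputs are hypothetical.

def saturation_index(ion_activities, ksp):
    q = math.prod(ion_activities)  # effective ion activity product
    return math.log10(q / ksp)

# e.g. a sparingly soluble sulfate with activities {M2+} and {SO4 2-}
si = saturation_index([2.0e-4, 8.0e-4], ksp=1.1e-10)
print(f"SI = {si:.2f} ({'scaling likely' if si > 0 else 'undersaturated'})")
```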

  16. NTD-Ge Based Microcalorimeter Performance

    NASA Technical Reports Server (NTRS)

    Bandler, Simon; Silver, Eric; Schnopper, Herbert; Murray, Stephen; Barbera, Marco; Madden, Norm; Landis, Don; Beeman, Jeff; Haller, Eugene; Tucker, Greg

    2000-01-01

    Our group has been developing x-ray microcalorimeters consisting of neutron transmutation doped (NTD) germanium thermistors attached to superconducting tin absorbers. We discuss the performance of single pixel x-ray detectors, and describe an array technology. In this paper we describe the read-out circuit that allows us to measure fast signals in our detectors as this will be important in understanding the primary cause of resolution broadening. We describe briefly a multiplexing scheme that allows a number of different calorimeters to be read out using a single JFET. We list the possible causes of broadening and give a description of the experiment which best demonstrates the cause of the primary broadening source. We mention our strategy for finding a suitable solution to this problem and describe briefly a technology for building arrays of these calorimeters.

  17. Performance-Based Thinking and Training for Competence.

    ERIC Educational Resources Information Center

    Rakow, Joel

    1982-01-01

    Discusses five job behavior functions viewed as necessary for practicing performance-based thinking in instructional development activities. Functions examined include the abilities to plan to perform a job, execute a task, monitor or control execution, troubleshoot, and evaluate. (MER)

  18. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Bases for performance-based payments. 32.1002 Section 32.1002 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments....

  19. Assessment of Folsom Lake response to historical and potential future climate scenarios

    USGS Publications Warehouse

    Yao, Huaming; Georgakakos, Aris P.

    2000-01-01

    An integrated forecast-decision system for Folsom Lake (California) is developed and used to assess the sensitivity of reservoir performance to various forecast-management schemes under historical and future climate scenarios. The assessments are based on various combinations of inflow forecasting models, decision rules, and climate scenarios and demonstrate that (1) reliable inflow forecasts and adaptive decision systems can substantially benefit reservoir performance and (2) dynamic operational procedures represent effective climate change coping strategies.

  20. Comparison of Sigma-Point and Extended Kalman Filters on a Realistic Orbit Determination Scenario

    NASA Technical Reports Server (NTRS)

    Gaebler, John; Hur-Diaz, Sun; Carpenter, Russell

    2010-01-01

    Sigma-point filters have received a lot of attention in recent years as a better alternative to extended Kalman filters for highly nonlinear problems. In this paper, we compare the performance of the additive divided difference sigma-point filter to the extended Kalman filter when applied to orbit determination of a realistic operational scenario based on the Interstellar Boundary Explorer mission. For the scenario studied, both filters provided equivalent results. The performance of each is discussed in detail.
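
    To illustrate the sigma-point idea the paper relies on, the sketch below propagates a mean and covariance through a nonlinear measurement function with a plain unscented transform; the paper's filter is an additive divided-difference variant, so this shows the general concept rather than that specific algorithm, and the toy measurement model is hypothetical.

```python
import numpy as np

# Plain unscented transform: propagate (mean, cov) through a nonlinear f
# using 2n+1 deterministically chosen sigma points, instead of linearising
# as the extended Kalman filter does.

def unscented_transform(mean, cov, f, kappa=1.0):
    n = mean.size
    scale = np.linalg.cholesky((n + kappa) * cov)              # matrix square root
    sigma = np.vstack([mean, mean + scale.T, mean - scale.T])  # (2n+1, n) points
    w = np.full(2 * n + 1, 0.5 / (n + kappa))
    w[0] = kappa / (n + kappa)
    y = np.array([f(x) for x in sigma])
    y_mean = w @ y
    y_cov = (w[:, None] * (y - y_mean)).T @ (y - y_mean)
    return y_mean, y_cov

# toy example: range-bearing measurement of a 2-D position
f = lambda x: np.array([np.hypot(*x), np.arctan2(x[1], x[0])])
m, P = np.array([10.0, 5.0]), np.diag([0.5, 0.5])
print(unscented_transform(m, P, f))
```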

  1. High Performance Graphene Oxide Based Rubber Composites

    NASA Astrophysics Data System (ADS)

    Mao, Yingyan; Wen, Shipeng; Chen, Yulong; Zhang, Fazhong; Panine, Pierre; Chan, Tung W.; Zhang, Liqun; Liang, Yongri; Liu, Li

    2013-08-01

    In this paper, graphene oxide/styrene-butadiene rubber (GO/SBR) composites with complete exfoliation of GO sheets were prepared by aqueous-phase mixing of GO colloid with SBR latex and a small loading of butadiene-styrene-vinyl-pyridine rubber (VPR) latex, followed by their co-coagulation. During co-coagulation, VPR not only plays a key role in the prevention of aggregation of GO sheets but also acts as an interface-bridge between GO and SBR. The results demonstrated that the mechanical properties of the GO/SBR composite with 2.0 vol.% GO is comparable with those of the SBR composite reinforced with 13.1 vol.% of carbon black (CB), with a low mass density and a good gas barrier ability to boot. The present work also showed that GO-silica/SBR composite exhibited outstanding wear resistance and low-rolling resistance which make GO-silica/SBR very competitive for the green tire application, opening up enormous opportunities to prepare high performance rubber composites for future engineering applications.

  2. High Performance Graphene Oxide Based Rubber Composites

    PubMed Central

    Mao, Yingyan; Wen, Shipeng; Chen, Yulong; Zhang, Fazhong; Panine, Pierre; Chan, Tung W.; Zhang, Liqun; Liang, Yongri; Liu, Li

    2013-01-01

    In this paper, graphene oxide/styrene-butadiene rubber (GO/SBR) composites with complete exfoliation of GO sheets were prepared by aqueous-phase mixing of GO colloid with SBR latex and a small loading of butadiene-styrene-vinyl-pyridine rubber (VPR) latex, followed by their co-coagulation. During co-coagulation, VPR not only plays a key role in the prevention of aggregation of GO sheets but also acts as an interface-bridge between GO and SBR. The results demonstrated that the mechanical properties of the GO/SBR composite with 2.0 vol.% GO is comparable with those of the SBR composite reinforced with 13.1 vol.% of carbon black (CB), with a low mass density and a good gas barrier ability to boot. The present work also showed that GO-silica/SBR composite exhibited outstanding wear resistance and low-rolling resistance which make GO-silica/SBR very competitive for the green tire application, opening up enormous opportunities to prepare high performance rubber composites for future engineering applications. PMID:23974435

  3. Transportation accident scenarios for commercial spent fuel

    SciTech Connect

    Wilmot, E L

    1981-02-01

    A spectrum of high-severity, low-probability transportation accident scenarios involving commercial spent fuel is presented, together with mechanisms, pathways and quantities of material that might be released from spent fuel to the environment. These scenarios are based on conclusions from a workshop, conducted in May 1980 to discuss transportation accident scenarios, in which a group of experts reviewed and critiqued available literature relating to spent fuel behavior and cask response in accidents.

  4. Projections of high resolution climate changes for South Korea using multiple-regional climate models based on four RCP scenarios. Part 2: Precipitation

    NASA Astrophysics Data System (ADS)

    Oh, Seok-Geun; Suh, Myoung-Seok; Lee, Young-Suk; Ahn, Joong-Bae; Cha, Dong-Hyun; Lee, Dong-Kyou; Hong, Song-You; Min, Seung-Ki; Park, Seong-Chan; Kang, Hyun-Suk

    2016-05-01

    Precipitation changes over South Korea were projected using five regional climate models (RCMs) with a horizontal resolution of 12.5 km for the mid and late 21st century (2026-2050, 2076-2100) under four Representative Concentration Pathways (RCP) scenarios against present precipitation (1981-2005). The simulation data of the Hadley Centre Global Environmental Model version 2 coupled with the Atmosphere-Ocean (HadGEM2-AO) were used as boundary data for the RCMs. In general, the RCMs simulated the spatial and seasonal variations of present precipitation well compared with observations and HadGEM2-AO. Equal Weighted Averaging without Bias Correction (EWA_NBC) reduced the model biases to some extent, but systematic biases remained in the results. However, the Weighted Averaging based on Taylor's skill score (WEA_Tay) showed a good statistical correction in terms of the spatial and seasonal variations, the magnitude of precipitation amount, and the probability density. In the mid-21st century, the spatial and interannual variabilities of precipitation over South Korea are projected to increase regardless of the RCP scenarios and seasons. However, the changes in area-averaged seasonal precipitation are not significant due to mixed changing patterns depending on location. In contrast, in the late 21st century, precipitation is projected to increase in proportion to the changes in net radiative forcing. Under RCP8.5, WEA_Tay projects annual, summer and winter precipitation to increase by about +19.1%, +20.5% and +33.3%, respectively, at the 1-5% significance levels. In addition, the probability of strong precipitation (≥ 15 mm d-1) is also projected to increase significantly, particularly for WEA_Tay under RCP8.5.
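
    The weighting of RCM output by a Taylor-type skill score can be sketched as below, using one common form of the score (Taylor 2001); the exact WEA_Tay weighting used in the paper may differ, and the precipitation series here are synthetic.

```python
import numpy as np

# Skill-weighted ensemble mean using a Taylor-type skill score:
# S = 4(1+R) / [ (s + 1/s)^2 (1+R0) ], with s = sigma_model / sigma_obs and
# R the correlation with observations. Series below are synthetic stand-ins.

def taylor_skill(model, obs, r0=1.0):
    r = np.corrcoef(model, obs)[0, 1]
    s = np.std(model) / np.std(obs)
    return 4.0 * (1.0 + r) / ((s + 1.0 / s) ** 2 * (1.0 + r0))

def skill_weighted_mean(models, obs):
    scores = np.array([taylor_skill(m, obs) for m in models])
    weights = scores / scores.sum()
    return weights @ np.array(models), weights

rng = np.random.default_rng(1)
obs = 80 + 30 * np.sin(np.linspace(0, 4 * np.pi, 60)) + rng.normal(0, 5, 60)
models = [obs + rng.normal(0, 10, 60), 0.8 * obs + 20, obs + rng.normal(0, 25, 60)]

ensemble, w = skill_weighted_mean(models, obs)
print("weights:", np.round(w, 2), "ensemble mean:", round(float(ensemble.mean()), 1))
```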

  5. Projections of high resolution climate changes for South Korea using multiple-regional climate models based on four RCP scenarios. Part 2: precipitation

    NASA Astrophysics Data System (ADS)

    Oh, Seok-Geun; Suh, Myoung-Seok; Lee, Young-Suk; Ahn, Joong-Bae; Cha, Dong-Hyun; Lee, Dong-Kyou; Hong, Song-You; Min, Seung-Ki; Park, Seong-Chan; Kang, Hyun-Suk

    2016-05-01

    Precipitation changes over South Korea were projected using five regional climate models (RCMs) with a horizontal resolution of 12.5 km for the mid and late 21st century (2026-2050, 2076-2100) under four Representative Concentration Pathways (RCP) scenarios against present precipitation (1981-2005). The simulation data of the Hadley Centre Global Environmental Model version 2 coupled with the Atmosphere-Ocean (HadGEM2-AO) were used as boundary data for the RCMs. In general, the RCMs simulated the spatial and seasonal variations of present precipitation well compared with observations and HadGEM2-AO. Equal Weighted Averaging without Bias Correction (EWA_NBC) reduced the model biases to some extent, but systematic biases remained in the results. However, the Weighted Averaging based on Taylor's skill score (WEA_Tay) showed a good statistical correction in terms of the spatial and seasonal variations, the magnitude of precipitation amount, and the probability density. In the mid-21st century, the spatial and interannual variabilities of precipitation over South Korea are projected to increase regardless of the RCP scenarios and seasons. However, the changes in area-averaged seasonal precipitation are not significant due to mixed changing patterns depending on location. In contrast, in the late 21st century, precipitation is projected to increase in proportion to the changes in net radiative forcing. Under RCP8.5, WEA_Tay projects annual, summer and winter precipitation to increase by about +19.1%, +20.5% and +33.3%, respectively, at the 1-5% significance levels. In addition, the probability of strong precipitation (≥ 15 mm d-1) is also projected to increase significantly, particularly for WEA_Tay under RCP8.5.

  6. Characterizing the emission implications of future natural gas production and use in the U.S. and Rocky Mountain region: A scenario-based energy system modeling approach

    NASA Astrophysics Data System (ADS)

    McLeod, Jeffrey

    The recent increase in U.S. natural gas production made possible through advancements in extraction techniques including hydraulic fracturing has transformed the U.S. energy supply landscape while raising questions regarding the balance of environmental impacts associated with natural gas production and use. Impact areas at issue include emissions of methane and criteria pollutants from natural gas production, alongside changes in emissions from increased use of natural gas in place of coal for electricity generation. In the Rocky Mountain region, these impact areas have been subject to additional scrutiny due to the high level of regional oil and gas production activity and concerns over its links to air quality. Here, the MARKAL (MArket ALlocation) least-cost energy system optimization model in conjunction with the EPA-MARKAL nine-region database has been used to characterize future regional and national emissions of CO2, CH4, VOC, and NOx attributed to natural gas production and use in several sectors of the economy. The analysis is informed by comparing and contrasting a base case, business-as-usual scenario with scenarios featuring variations in future natural gas supply characteristics, constraints affecting the electricity generation mix, carbon emission reduction strategies and increased demand for natural gas in the transportation sector. Emission trends and their associated sensitivities are identified and contrasted between the Rocky Mountain region and the U.S. as a whole. The modeling results of this study illustrate the resilience of the short term greenhouse gas emission benefits associated with fuel switching from coal to gas in the electric sector, but also call attention to the long term implications of increasing natural gas production and use for emissions of methane and VOCs, especially in the Rocky Mountain region. This analysis can help to inform the broader discussion of the potential environmental impacts of future natural gas production

  7. A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for exploring various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
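
    A toy version of the trace-driven side of this methodology is sketched below: the address trace of a loop nest is generated from an array base address and loop bounds, then replayed through a small direct-mapped cache model. The cache geometry and access patterns are illustrative, not the paper's configuration.

```python
# Toy trace-driven cache simulation: generate an address trace from loop
# bounds and an array base address, then replay it through a direct-mapped
# cache model to compare miss rates of two traversal orders.

def direct_mapped_miss_rate(trace, n_sets=64, line_bytes=64):
    tags = [None] * n_sets
    misses = 0
    for addr in trace:
        block = addr // line_bytes
        idx, tag = block % n_sets, block // n_sets
        if tags[idx] != tag:
            misses += 1
            tags[idx] = tag
    return misses / len(trace)

def traversal_trace(base, n, elem=8, row_major_order=True):
    """Addresses touched when reading an n x n array of 8-byte elements."""
    for i in range(n):
        for j in range(n):
            r, c = (i, j) if row_major_order else (j, i)
            yield base + (r * n + c) * elem

n = 256
row = list(traversal_trace(0, n, row_major_order=True))
col = list(traversal_trace(0, n, row_major_order=False))
print("row-major walk miss rate:   ", direct_mapped_miss_rate(row))
print("column-major walk miss rate:", direct_mapped_miss_rate(col))
```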

  8. The USGS Earthquake Scenario Project

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Petersen, M. D.; Wald, L. A.; Frankel, A. D.; Quitoriano, V. R.; Lin, K.; Luco, N.; Mathias, S.; Bausch, D.

    2009-12-01

    The U.S. Geological Survey’s (USGS) Earthquake Hazards Program (EHP) is producing a comprehensive suite of earthquake scenarios for planning, mitigation, loss estimation, and scientific investigations. The Earthquake Scenario Project (ESP), though lacking clairvoyance, is a forward-looking project, estimating earthquake hazard and loss outcomes as they may occur one day. For each scenario event, fundamental input includes i) the magnitude and specified fault mechanism and dimensions, ii) regional Vs30 shear velocity values for site amplification, and iii) event metadata. A grid of standard ShakeMap ground motion parameters (PGA, PGV, and three spectral response periods) is then produced using the well-defined, regionally-specific approach developed by the USGS National Seismic Hazard Mapping Project (NSHMP), including recent advances in empirical ground motion predictions (e.g., the NGA relations). The framework also allows for numerical (3D) ground motion computations for specific, detailed scenario analyses. Unlike NSHMP ground motions, for ESP scenarios, local rock and soil site conditions and commensurate shaking amplifications are applied based on detailed Vs30 maps where available or based on topographic slope as a proxy. The scenario event set comprises primarily events selected from the NSHMP, though custom events are also allowed based on coordination of the ESP team with regional coordinators, seismic hazard experts, seismic network operators, and response coordinators. The event set will be harmonized with existing and future scenario earthquake events produced regionally or by other researchers. The event list includes approximately 200 earthquakes in CA, 100 in NV, dozens in each of NM, UT, WY, and a smaller number in other regions. Systematic output will include all standard ShakeMap products, including HAZUS input, GIS, KML, and XML files used for visualization, loss estimation, ShakeCast, PAGER, and for other systems. All products will be

  9. GLOBAL ALTERNATIVE FUTURE SCENARIOS

    EPA Science Inventory

    One way to examine possible future outcomes for environmental protection is through the development and analysis of alternative future scenarios. This type of assessment postulates two or more different paths that social and environmental development might take, using correspond...

  10. 48 CFR 970.3706 - Performance-based acquisition.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Performance-based acquisition. 970.3706 Section 970.3706 Federal Acquisition Regulations System DEPARTMENT OF ENERGY AGENCY SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS Facilities Management Contracting 970.3706 Performance-based...

  11. 48 CFR 52.232-32 - Performance-Based Payments.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 2 2014-10-01 2014-10-01 false Performance-Based Payments. 52.232-32 Section 52.232-32 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION (CONTINUED) CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Text of Provisions and Clauses 52.232-32 Performance-Based Payments....

  12. Performance-Based Staff Development: The Cost-Effective Alternative.

    ERIC Educational Resources Information Center

    Boyer, Catherine M.

    1981-01-01

    Describes how to use the performance-based concept in developing staff. Discusses the identification of objectives based on performance expectations and the development of learning experiences that (1) emphasize application of knowledge; (2) integrate adult learning principles; and (3) make use of learning contracts, self-learning packages,…

  13. Team Primacy Concept (TPC) Based Employee Evaluation and Job Performance

    ERIC Educational Resources Information Center

    Muniute, Eivina I.; Alfred, Mary V.

    2007-01-01

    This qualitative study explored how employees learn from Team Primacy Concept (TPC) based employee evaluation and how they use the feedback in performing their jobs. TPC based evaluation is a form of multirater evaluation, during which the employee's performance is discussed by one's peers in a face-to-face team setting. The study used Kolb's…

  14. The Pros and Cons of Performance-Based Compensation.

    ERIC Educational Resources Information Center

    Solmon, Lewis C.; Podgursky, Michael

    This paper analyzes the current and historical criticism of performance-based compensation in K-12 education. It claims that new compensation methods are feasible and are necessary in order to attract and retain the best and the brightest into the teaching profession. The document outlines the objections to performance-based compensation, which in…

  15. Guidelines for Performance Based Evaluation: Teachers, Counselors, Librarians. [New Edition.

    ERIC Educational Resources Information Center

    Missouri State Dept. of Elementary and Secondary Education, Jefferson City.

    Guidelines for the performance-based evaluation of teachers, counselors, and librarians in the Missouri public schools are provided in this manual. Performance-based evaluation of school staff, mandated by state law, is described in terms of its philosophy and procedures, suggested evaluation criteria, and descriptors for each of the three job…

  16. Application of the Water Evaluation and Planning (WEAP) System for Integrated Hydrologic and Scenario-based Water Resources Systems Modeling in the Western Sierra Nevada

    NASA Astrophysics Data System (ADS)

    Mehta, V. K.; Purkey, D. R.; Young, C.; Joyce, B.; Yates, D.

    2008-12-01

    Rivers draining western slopes of the Sierra Nevada provide critical water supply, hydropower, fisheries and recreation services to California. Coordinated efforts are under way to better characterize and model the possible impacts of climate change on Sierra Nevada hydrology. Research suggests substantial end-of-century reductions in Sierra Nevada snowpack and a shift in the center of mass of the snowmelt hydrograph. Management decisions, land use change and population growth add further complexity, necessitating the use of scenario-based modeling tools. The Water Evaluation and Planning (WEAP) system is one of the suite of tools being employed in this effort. Unlike several models that rely on perturbation of historical runoff data to simulate future climate conditions, WEAP includes a dynamically integrated watershed hydrology module that is forced by input climate time series. This allows direct simulation of water management response to climate and land use change. This paper presents ABY2008, a WEAP application for the Yuba, Bear and American River (ABY) watersheds of the Sierra Nevada. These rivers are managed by water agencies and hydropower utilities through a complex network of reservoirs, dams, hydropower plants and water conveyances. Historical watershed hydrology in ABY2008 is driven by a 10 year weekly climate time series from 1991-2000. Land use and soils data were combined into 12 landclasses representing each of 324 hydrological response units. Hydrologic parameters were incorporated from a calibration against observed streamflow developed for the entire western Sierra. Physical reservoir data, operating rules, and water deliveries to water agencies were obtained from public documents of water agencies and power utilities that manage facilities in the watersheds. ABY2008 includes 25 major reservoirs, 39 conveyances, 33 hydropower plants and 14 transmission links to 13 major water demand points. In WEAP, decisions for transferring water at

  17. Competency-Based Performance Appraisals: Improving Performance Evaluations of School Nutrition Managers and Assistants/Technicians

    ERIC Educational Resources Information Center

    Cross, Evelina W.; Asperin, Amelia Estepa; Nettles, Mary Frances

    2009-01-01

    Purpose: The purpose of the research was to develop a competency-based performance appraisal resource for evaluating school nutrition (SN) managers and assistants/technicians. Methods: A two-phased process was used to develop the competency-based performance appraisal resource for SN managers and assistants/technicians. In Phase I, draft…

  18. Standards of Performance for Community Based Educational Institutions: Quick Check of Institutional Performance.

    ERIC Educational Resources Information Center

    Association of Community Based Education, Washington, DC.

    Designed for use with "Standards of Performance for Community Based Educational Institutions" and a "Self-Assessment Workbook," this checklist helps community based educational institutions in identifying areas of performance which need improvement or further study and in assessing the overall effectiveness of the institution in carrying out its…

  19. MIOSAT Mission Scenario and Design

    NASA Astrophysics Data System (ADS)

    Agostara, C.; Dionisio, C.; Sgroi, G.; di Salvo, A.

    2008-08-01

    MIOSAT ("Mssione Ottica su microSATellite") is a low-cost technological / scientific microsatellite mission for Earth Observation, funded by Italian Space Agency (ASI) and managed by a Group Agreement between Rheinmetall Italia - B.U. Spazio - Contraves as leader and Carlo Gavazzi Space as satellite manufacturer. Several others Italians Companies, SME and Universities are involved in the development team with crucial roles. MIOSAT is a microsatellite weighting around 120 kg and placed in a 525 km altitude sun-synchronuos circular LEO orbit. The microsatellite embarks three innovative optical payloads: Sagnac multi spectral radiometer (IFAC-CNR), Mach Zehender spectrometer (IMM-CNR), high resolution pancromatic camera (Selex Galileo). In addition three technological experiments will be tested in-flight. The first one is an heat pipe based on Marangoni effect with high efficiency. The second is a high accuracy Sun Sensor using COTS components and the last is a GNSS SW receiver that utilizes a Leon2 processor. Finally a new generation of 28% efficiency solar cells will be adopted for the power generation. The platform is highly agile and can tilt along and cross flight direction. The pointing accuracy is in the order of 0,1° for each axe. The pointing determination during images acquisition is <0,02° for the axis normal to the boresight and 0,04° for the boresight. This paper deals with MIOSAT mission scenario and definition, highlighting trade-offs for mission implementation. MIOSAT mission design has been constrained from challenging requirements in terms of satellite mass, mission lifetime, instrument performance, that have implied the utilization of satellite agility capability to improve instruments performance in terms of S/N and resolution. The instruments provide complementary measurements that can be combined in effective ways to exploit new applications in the fields of atmosphere composition analysis, Earth emissions, antropic phenomena, etc. The Mission

  20. Enhanced Engine Performance During Emergency Operation Using a Model-Based Engine Control Architecture

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2015-01-01

    This paper discusses the design and application of model-based engine control (MBEC) for use during emergency operation of the aircraft. The MBEC methodology is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40,000) and features an optimal tuner Kalman Filter (OTKF) to estimate unmeasured engine parameters, which can then be used for control. During an emergency scenario, normally-conservative engine operating limits may be relaxed to increase the performance of the engine and overall survivability of the aircraft; this comes at the cost of additional risk of an engine failure. The MBEC architecture offers the advantage of estimating key engine parameters that are not directly measurable. Estimating the unknown parameters allows for tighter control over these parameters and over the level of risk at which the engine operates. This will allow the engine to achieve better performance than possible when operating to more conservative limits on a related, measurable parameter.
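
    As a rough illustration of estimating an unmeasured parameter from a measured one, the sketch below runs a minimal linear Kalman filter on a hypothetical two-state model; it is not the OTKF or the CMAPSS40,000 engine model, and all matrices and noise levels are placeholders.

```python
import numpy as np

# Minimal linear Kalman filter: the first state is measured, the second
# (a slowly varying "health" parameter coupled into the first) is not.
# Model, noise levels and true values are hypothetical.

rng = np.random.default_rng(42)
A = np.array([[0.98, 0.05],   # measured output dynamics, coupled to the
              [0.00, 1.00]])  # unmeasured, slowly varying parameter
H = np.array([[1.0, 0.0]])    # only the first state is measured
Q = np.diag([1e-4, 1e-6])     # process noise
R = np.array([[1e-2]])        # measurement noise

x_true = np.array([1.0, 0.2])
x_hat, P = np.zeros(2), np.eye(2)

for _ in range(200):
    x_true = A @ x_true + rng.multivariate_normal([0, 0], Q)
    z = H @ x_true + rng.multivariate_normal([0], R)
    # predict
    x_hat = A @ x_hat
    P = A @ P @ A.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_hat = x_hat + (K @ (z - H @ x_hat)).ravel()
    P = (np.eye(2) - K @ H) @ P

print("estimated unmeasured parameter:", round(x_hat[1], 3),
      "true:", round(x_true[1], 3))
```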

  1. Enhanced Engine Performance During Emergency Operation Using a Model-Based Engine Control Architecture

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2016-01-01

    This paper discusses the design and application of model-based engine control (MBEC) for use during emergency operation of the aircraft. The MBEC methodology is applied to the Commercial Modular Aero-Propulsion System Simulation 40k (CMAPSS40k) and features an optimal tuner Kalman Filter (OTKF) to estimate unmeasured engine parameters, which can then be used for control. During an emergency scenario, normally-conservative engine operating limits may be relaxed to increase the performance of the engine and overall survivability of the aircraft; this comes at the cost of additional risk of an engine failure. The MBEC architecture offers the advantage of estimating key engine parameters that are not directly measurable. Estimating the unknown parameters allows for tighter control over these parameters and over the level of risk at which the engine operates. This will allow the engine to achieve better performance than possible when operating to more conservative limits on a related, measurable parameter.

  2. Management strategies in hospitals: scenario planning

    PubMed Central

    Ghanem, Mohamed; Schnoor, Jörg; Heyde, Christoph-Eckhard; Kuwatsch, Sandra; Bohn, Marco; Josten, Christoph

    2015-01-01

    Background: Instead of waiting for challenges to confront hospital management, doctors and managers should act in advance to optimize and sustain value-based health. This work highlights the importance of scenario planning in hospitals, proposes an elaborated definition of the stakeholders of a hospital and defines the influence factors to which hospitals are exposed. Methodology: Based on literature analysis as well as on personal interviews with stakeholders, we propose an elaborated definition of stakeholders and designed a questionnaire that integrated the following influence factors, which have a relevant impact on hospital management: political/legal, economic, social, technological and environmental forces. These influence factors are examined to develop the so-called critical uncertainties. Thorough identification of uncertainties was based on a “Stakeholder Feedback”. Results: Two key uncertainties were identified and considered in this study: the development of the workload for the medical staff, and the profit-oriented performance of the medical staff. According to the developed scenarios, complementary education of the medical staff as well as of non-medical top executives and managers of hospitals was the recommended core strategy. Complementary scenario-specific strategic options should be considered whenever needed to optimize dealing with a specific future development of the health care environment. Conclusion: Strategic planning in hospitals is essential to ensure sustainable success. It considers multiple situations and integrates internal and external insights and perspectives, in addition to identifying weak signals and “blind spots”. This flows into sound planning for multiple strategic options. It is a state-of-the-art tool that allows dealing with the increasing challenges facing hospital management. PMID:26504735

  3. The changing nutrition scenario.

    PubMed

    Gopalan, C

    2013-09-01

    The past seven decades have seen remarkable shifts in the nutritional scenario in India. Even up to the 1950s severe forms of malnutrition such as kwashiorkor and pellagra were endemic. As nutritionists were finding home-grown and common-sense solutions for these widespread problems, the population was burgeoning and food was scarce. The threat of widespread household food insecurity and chronic undernutrition was very real. Then came the Green Revolution. Shortages of food grains disappeared within less than a decade and India became self-sufficient in food grain production. But more insidious problems arising from this revolution were looming, and cropping patterns giving low priority to coarse grains and pulses, and monocropping led to depletion of soil nutrients and 'Green Revolution fatigue'. With improved household food security and better access to health care, clinical manifestations of severe malnutrition virtually disappeared. But the decline in chronic undernutrition and "hidden hunger" from micronutrient deficiencies was slow. On the cusp of the new century, an added factor appeared on the nutritional scene in India. With steady urban migration, upward mobility out of poverty, and an increasingly sedentary lifestyle because of improvements in technology and transport, obesity rates began to increase, resulting in a dual burden. Measured in terms of its performance in meeting its Millennium Development Goals, India has fallen short. Despite its continuing high levels of poverty and illiteracy, India has a huge demographic potential in the form of a young population. This advantage must be leveraged by investing in nutrition education, household access to nutritious diets, sanitary environment and a health-promoting lifestyle. This requires co-operation from all the stakeholders, including governments, non-government organizations, scientists and the people at large. PMID:24135189

  4. The changing nutrition scenario

    PubMed Central

    Gopalan, C.

    2013-01-01

    The past seven decades have seen remarkable shifts in the nutritional scenario in India. Even up to the 1950s severe forms of malnutrition such as kwashiorkor and pellagra were endemic. As nutritionists were finding home-grown and common-sense solutions for these widespread problems, the population was burgeoning and food was scarce. The threat of widespread household food insecurity and chronic undernutrition was very real. Then came the Green Revolution. Shortages of food grains disappeared within less than a decade and India became self-sufficient in food grain production. But more insidious problems arising from this revolution were looming, and cropping patterns giving low priority to coarse grains and pulses, and monocropping led to depletion of soil nutrients and ‘Green Revolution fatigue’. With improved household food security and better access to health care, clinical manifestations of severe malnutrition virtually disappeared. But the decline in chronic undernutrition and “hidden hunger” from micronutrient deficiencies was slow. On the cusp of the new century, an added factor appeared on the nutritional scene in India. With steady urban migration, upward mobility out of poverty, and an increasingly sedentary lifestyle because of improvements in technology and transport, obesity rates began to increase, resulting in a dual burden. Measured in terms of its performance in meeting its Millennium Development Goals, India has fallen short. Despite its continuing high levels of poverty and illiteracy, India has a huge demographic potential in the form of a young population. This advantage must be leveraged by investing in nutrition education, household access to nutritious diets, sanitary environment and a health-promoting lifestyle. This requires co-operation from all the stakeholders, including governments, non-government organizations, scientists and the people at large. PMID:24135189

  5. Comparative study of performance of neutral axis tracking based damage detection

    NASA Astrophysics Data System (ADS)

    Soman, R.; Malinowski, P.; Ostachowicz, W.

    2015-07-01

    This paper presents a comparative study of a novel SHM technique for damage isolation. The performance of the Neutral Axis (NA) tracking based damage detection strategy is compared to other popularly used vibration-based damage detection methods, viz. ECOMAC, the Mode Shape Curvature Method and the Strain Flexibility Index Method. The sensitivity of the novel method is compared under changing ambient temperature conditions and in the presence of measurement noise. Finite Element Analysis (FEA) of the DTU 10 MW Wind Turbine was conducted to compare the local damage identification capability of each method and the results are presented. Under the conditions examined, the proposed method was found to be robust to ambient condition changes and measurement noise. Under the investigated damage scenarios, its damage identification capability is on par with, or better than, the methods reported in the literature.
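    The abstract does not give the neutral-axis estimator; a minimal sketch of the underlying idea, assuming Euler-Bernoulli bending with a linear strain profile and two strain sensors at known heights of a cross-section, is shown below. The sensor layout and the damage-indexing details of the cited method are not reproduced.

    ```python
    def neutral_axis_position(eps_1, eps_2, y_1, y_2):
        """Locate the neutral axis from two bending strains measured at heights
        y_1 and y_2, assuming a linear strain profile eps(y) = kappa * (y - y_na).
        Illustrative only."""
        if eps_1 == eps_2:
            raise ValueError("equal strains: neutral axis position is undefined")
        return (eps_1 * y_2 - eps_2 * y_1) / (eps_1 - eps_2)

    # Healthy vs. (hypothetical) damaged readings: a shift in the estimated
    # neutral axis position can serve as a damage indicator.
    print(neutral_axis_position(250e-6, -250e-6, 1.0, -1.0))   # 0.0 (centred)
    print(neutral_axis_position(300e-6, -200e-6, 1.0, -1.0))   # shifted (-0.2)
    ```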

  6. Automata learning algorithms and processes for providing more complete systems requirements specification by scenario generation, CSP-based syntax-oriented model construction, and R2D2C system requirements transformation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Margaria, Tiziana (Inventor); Rash, James L. (Inventor); Rouff, Christopher A. (Inventor); Steffen, Bernard (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, automata learning algorithms and techniques are implemented to generate a more complete set of scenarios for requirements-based programming. More specifically, a CSP-based, syntax-oriented model construction, which requires the support of a theorem prover, is complemented by model extrapolation via automata learning. This may support the systematic completion of the requirements, which are by nature partial and focus on the most prominent scenarios. This may generalize requirement skeletons by extrapolation and may indicate, by way of automatically generated traces, where the requirement specification is too loose and additional information is required.

  7. [The quality control based on the predictable performance].

    PubMed

    Zheng, D X

    2016-09-01

    In the conventional prosthetic workflow, the clinical performance can only be evaluated at the last step. However, this often leads to failure because of the mismatch between the expectation and the final performance. In response to this situation, quality control based on predictable results has been suggested. It is a new idea based on reverse thinking: it focuses on the needs of the patient and puts the final performance of the prosthesis first. With a prosthodontically driven procedure, dentists can bring the final performance into agreement with the expectation. PMID:27596338

  8. Scenarios for gluino coannihilation

    NASA Astrophysics Data System (ADS)

    Ellis, John; Evans, Jason L.; Luo, Feng; Olive, Keith A.

    2016-02-01

    We study supersymmetric scenarios in which the gluino is the next-to-lightest supersymmetric particle (NLSP), with a mass sufficiently close to that of the lightest supersymmetric particle (LSP) that gluino coannihilation becomes important. One of these scenarios is the MSSM with soft supersymmetry-breaking squark and slepton masses that are universal at an input GUT renormalization scale, but with non-universal gaugino masses. The other scenario is an extension of the MSSM to include vector-like supermultiplets. In both scenarios, we identify the regions of parameter space where gluino coannihilation is important, and discuss their relations to other regions of parameter space where other mechanisms bring the dark matter density into the range allowed by cosmology. In the case of the non-universal MSSM scenario, we find that the allowed range of parameter space is constrained by the requirement of electroweak symmetry breaking, the avoidance of a charged LSP and the measured mass of the Higgs boson, in particular, as well as the appearance of other dark matter (co)annihilation processes. Nevertheless, LSP masses m_χ ≲ 8 TeV with the correct dark matter density are quite possible. In the case of pure gravity mediation with additional vector-like supermultiplets, changes to the anomaly-mediated gluino mass and the threshold effects associated with these states can make the gluino almost degenerate with the LSP, and we find a similar upper bound.
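    The abstract stresses that coannihilation matters only when the gluino is nearly degenerate with the LSP. The standard way this is quantified, given here as the generic Griest-Seckel textbook expression rather than a result derived in the cited paper, is through a thermally weighted effective annihilation cross-section:

    $$ \sigma_{\mathrm{eff}} \;=\; \sum_{i,j} \sigma_{ij}\,\frac{g_i g_j}{g_{\mathrm{eff}}^{2}}\,(1+\Delta_i)^{3/2}(1+\Delta_j)^{3/2}\,e^{-x(\Delta_i+\Delta_j)}, \qquad g_{\mathrm{eff}} \;=\; \sum_i g_i\,(1+\Delta_i)^{3/2}\,e^{-x\Delta_i}, $$

    where \(\Delta_i = (m_i - m_\chi)/m_\chi\), \(x = m_\chi/T\), and \(g_i\) counts internal degrees of freedom. The Boltzmann factor \(e^{-x\Delta_i}\) at freeze-out (x of order 20-30) is what restricts the gluino's contribution to the relic-density calculation to the near-degenerate region discussed above.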

  9. BCube Ocean Scenario

    NASA Astrophysics Data System (ADS)

    Santoro, Mattia; Schofield, Oscar; Pearlman, Jay; Nativi, Stefano

    2015-04-01

    To address complex Earth system issues such as climate change and water resources, geoscientists must work across disciplinary boundaries; this requires them to access data outside of their fields. Scientists are being called upon to find, access, and use diverse and voluminous data types that are described with semantics. Within the framework of the NSF EarthCube programme, the BCube project (A Broker Framework for Next Generation Geoscience) is addressing the need for effective and efficient multi-disciplinary collaboration and interoperability through the advancement of brokering technologies. BCube develops science scenarios as key elements in providing an environment for demonstrating capabilities, benefits, and challenges of the developed e-infrastructure. The initial focus is on hydrology, oceans, polar and weather, with the intent to make the technology applicable and available to all the geosciences. This presentation focuses on the BCube ocean scenario. The purpose of this scenario is to increase the understanding of the ocean dynamics through incorporation of a wide range of in-situ and satellite data into ocean models using net primary productivity as the initial variable. The science scenario aims to identify spatial and temporal domains in ocean models, and key ecological variables. Field data sets and remote observations data sets from distributed and heterogeneous systems are accessed through the broker and will be incorporated into the models. In this work we will present the achievements in the development of the BCube ocean scenario.

  10. Safety evaluation of MHTGR licensing basis accident scenarios

    SciTech Connect

    Kroeger, P.G.

    1989-04-01

    The safety potential of the Modular High-Temperature Gas Reactor (MHTGR) was evaluated, based on the Preliminary Safety Information Document (PSID), as submitted by the US Department of Energy to the US Nuclear Regulatory Commission. The relevant reactor safety codes were extended for this purpose and applied to this new reactor concept, searching primarily for potential accident scenarios that might lead to fuel failures due to excessive core temperatures and/or to vessel damage due to excessive vessel temperatures. The design basis accident scenario leading to the highest vessel temperatures is the depressurized core heatup scenario without any forced cooling and with decay heat rejection to the passive Reactor Cavity Cooling System (RCCS). This scenario was evaluated, including numerous parametric variations of input parameters, such as material properties and decay heat. It was found that significant safety margins exist, but that high confidence levels in the core effective thermal conductivity, the reactor vessel and RCCS thermal emissivities and the decay heat function are required to maintain this safety margin. Severe accident extensions of this depressurized core heatup scenario included the cases of complete RCCS failure, cases of massive air ingress, core heatup without scram and cases of degraded RCCS performance due to absorbing gases in the reactor cavity. Except for no-scram scenarios extending beyond 100 hr, the fuel never reached the limiting temperature of 1600 °C, below which measurable fuel failures are not expected. In some of the scenarios, excessive vessel and concrete temperatures could lead to investment losses but are not expected to lead to any source term beyond that from the circulating inventory. 19 refs., 56 figs., 11 tabs.
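    The structure of a depressurized core heatup calculation can be sketched with a back-of-the-envelope lumped-parameter balance: decay heat in, radiation to the RCCS out. Everything below is an illustrative placeholder (single core node, generic decay-heat power law, assumed areas, emissivities and heat capacities); none of the values are MHTGR design data or PSID results.

    ```python
    # Lumped-parameter core heatup sketch; all parameter values are assumed.
    P0     = 350e6        # rated thermal power, W (assumed)
    m_cp   = 4.0e8        # lumped core heat capacity, J/K (assumed)
    eps    = 0.8          # effective emissivity (assumed)
    sigma  = 5.67e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)
    A      = 300.0        # effective radiating area, m^2 (assumed)
    T_rccs = 350.0        # RCCS panel temperature, K (assumed constant)

    def decay_heat(t_s):
        """Crude ~t^-0.2 decay-heat power law, for illustration only."""
        return 0.066 * P0 * max(t_s, 1.0) ** -0.2

    T = 1000.0            # initial average core temperature, K (assumed)
    dt = 60.0             # time step, s
    for step in range(int(100 * 3600 / dt)):       # march 100 hours
        t = step * dt
        q_in  = decay_heat(t)
        q_out = eps * sigma * A * (T ** 4 - T_rccs ** 4)
        T += dt * (q_in - q_out) / m_cp

    print(f"lumped core temperature after 100 h: {T:.0f} K")
    ```

    The real evaluations cited above resolve the core spatially and use validated property data; the sketch only shows why the effective conductivity, emissivities and decay-heat function dominate the margin.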

  11. Testing Damage Scenarios. From Historical Earthquakes To Silent Active Faults

    NASA Astrophysics Data System (ADS)

    Galli, P.; Orsini, G.; Bosi, V.; di Pasquale, G.; Galadini, F.

    Italy is rich with historical scenarios of disruption and death that have come down to us through the insightful descriptions of hundreds of manuscripts, reports, treatises, letters and epigraphs. All these historical data constitute today one of the most powerful databases of earthquake-induced effects. Moreover, it is now possible to relate many of these earthquakes to geological structures, the seismogenic behavior of which has been investigated by means of paleoseismological studies. On the basis of this information and of that gathered through the national census (performed on population and dwellings by ISTAT, the Italian Institute of Statistics, in 1991) we developed a methodology (FaCES, Fault-Controlled Earthquake Scenario) which reproduces the damage scenario caused by the rupture of a defined fault, providing an estimate of the losses in terms of damage to buildings and consequences for the population. The reliability of the scenarios has been tested by comparing the historical damage distribution of an earthquake with that obtained by applying FaCES to the responsible fault. Finally, we hypothesize the scenario related to three historically silent faults of the central Apennines (Mt. Vettore, Mt. Gorzano and Gran Sasso faults), the Holocene activity of which has been recently ascertained through paleoseismological analyses.

  12. Acting performance and flow state enhanced with sensory-motor rhythm neurofeedback comparing ecologically valid immersive VR and training screen scenarios.

    PubMed

    Gruzelier, John; Inoue, Atsuko; Smart, Roger; Steed, Anthony; Steffert, Tony

    2010-08-16

    Actors were trained in sensory-motor rhythm (SMR) neurofeedback interfaced with a computer rendition of a theatre auditorium. Enhancement of SMR led to changes in the lighting while inhibition of theta and high beta led to a reduction in intrusive audience noise. Participants were randomised to a virtual reality (VR) representation in a ReaCTor, with surrounding image projection seen through glasses, or to a 2D computer screen, which is the conventional neurofeedback medium. In addition there was a no-training comparison group. Acting performance was evaluated by three experts from both filmed studio monologues and Hamlet excerpts on the stage of Shakespeare's Globe Theatre. Neurofeedback learning, as well as identification of the required mental state following training, reached an asymptote earlier with ReaCTor training than with the computer screen, though both groups reached the same asymptote. These advantages were paralleled by higher ratings of acting performance overall, well-rounded performance, and especially the creativity subscale including imaginative expression, conviction and characterisation. On the Flow State scales both neurofeedback groups scored higher than the no-training controls on self-ratings of sense of control, confidence and feeling at one. This is the first demonstration of enhancement of artistic performance with eyes-open neurofeedback training, previously demonstrated only with eyes-closed slow-wave training. Efficacy is attributed to psychological engagement through the ecologically relevant learning context of the acting-space, putatively allowing transfer to the real world otherwise achieved with slow-wave training through imaginative visualisation. The immersive VR technology was more successful than a 2D rendition. PMID:20542087

  13. Performance-Based Pay in the Federal Government. Research Brief

    ERIC Educational Resources Information Center

    National Center on Performance Incentives, 2008

    2008-01-01

    In "Performance-Based Pay in the Federal Government"--a paper presented at the February 2008 National Center on Performance Incentives research to policy conference--Steve Nelson discusses the evolution of employee pay systems in the federal government, from the inception of the General Schedule to continuing interest in creating more…

  14. A Model of Statistics Performance Based on Achievement Goal Theory.

    ERIC Educational Resources Information Center

    Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.

    2003-01-01

    Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…

  15. Assessment in Performance-Based Secondary Music Classes

    ERIC Educational Resources Information Center

    Pellegrino, Kristen; Conway, Colleen M.; Russell, Joshua A.

    2015-01-01

    After sharing research findings about grading and assessment practices in secondary music ensemble classes, we offer examples of commonly used assessment tools (ratings scale, checklist, rubric) for the performance ensemble. Then, we explore the various purposes of assessment in performance-based music courses: (1) to meet state, national, and…

  16. Guidelines for Performance Based Teacher Evaluation in Missouri.

    ERIC Educational Resources Information Center

    Missouri State Dept. of Elementary and Secondary Education, Jefferson City.

    A step-by-step outline of recommended procedures is presented for a performance based evaluation system for teachers in Missouri. Four general performance areas form the basis for the system: (1) instructional processes; (2) classroom management; (3) interpersonal relationships; and (4) professional responsibilities. Nineteen specific,…

  17. Benchmarking and parallel scalability of MANCINTAP, a Parallel High-Performance Tool For Neutron Activation Analysis in Complex 4D Scenarios

    NASA Astrophysics Data System (ADS)

    Firpo, G.; Frambati, S.; Frignani, M.; Gerra, G.

    2014-06-01

    MANCINTAP is a parallel computational tool developed by Ansaldo Nucleare to perform 4D neutron transport, activation and time-resolved dose-rate calculations in very complex geometries for CPU-intensive fission and fusion applications. MANCINTAP creates an automated link between the 3D radiation transport code MCNP5—which is used to evaluate both the neutron fluxes for activation calculations and the resulting secondary gamma dose rates—and the zero-dimensional activation code Anita2000 by handling crucial processes such as data exchange, determination of material mixtures and generation of cumulative probability distributions. A brief description of the computational tool is given here, with particular emphasis on the key technical choices underlying the project. Benchmarking of MANCINTAP has been performed in three steps: (i) against a very simplified model, where an analytical solution is available for comparison; (ii) against the well-established deterministic transport and activation code ATTILA and (iii) against experimental data obtained at the Frascati Neutron Generator (FNG) facility. An analysis of MANCINTAP scalability performances is proposed to demonstrate the robustness of its parallel structure, tailored for HPC applications, which makes it—to the best of our knowledge—a novel tool.

  18. A Native American exposure scenario.

    PubMed

    Harris, S G; Harper, B L

    1997-12-01

    EPA's Risk Assessment Guidance for Superfund (RAGS) and later documents provide guidance for estimating exposures received from suburban and agricultural activity patterns and lifestyles. However, these methods are not suitable for typical tribal communities whose members pursue, at least in part, traditional lifestyles. These lifestyles are derived from a long association with all of the resources in a particular region. We interviewed 35 members of a Columbia River Basin tribe to develop a lifestyle-based subsistence exposure scenario that represents a midrange exposure that a traditional tribal member would receive. This scenario provides a way to partially satisfy Executive Order 12,898 on environmental justice, which requires a specific evaluation of impacts from federal actions to peoples with subsistence diets. Because a subsistence diet is only a portion of what is important to a traditional lifestyle, we also used information obtained from the interviews to identify parameters for evaluating impacts to environmental and sociocultural quality of life. PMID:9463932
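    The quantitative core of a subsistence exposure scenario of this kind is an intake calculation of the general RAGS form. The sketch below uses the generic chronic-daily-intake equation with entirely hypothetical numbers for a fish-ingestion pathway; none of the parameter values are those developed in the cited tribal scenario.

    ```python
    def chronic_daily_intake(conc, ingestion_rate, exp_freq, exp_dur,
                             body_weight, averaging_time):
        """Generic RAGS-style intake estimate, in mg per kg body weight per day.
        conc            : contaminant concentration in the medium, mg/kg
        ingestion_rate  : medium ingested per day, kg/day
        exp_freq        : exposure frequency, days/year
        exp_dur         : exposure duration, years
        body_weight     : kg
        averaging_time  : days
        """
        return (conc * ingestion_rate * exp_freq * exp_dur) / (body_weight * averaging_time)

    # Purely hypothetical subsistence fish-ingestion pathway.
    cdi = chronic_daily_intake(conc=0.2, ingestion_rate=0.5, exp_freq=365,
                               exp_dur=70, body_weight=70, averaging_time=70 * 365)
    print(f"chronic daily intake: {cdi:.4f} mg/kg-day")
    ```

    The point made in the abstract is that subsistence lifestyles drive the ingestion-rate and exposure-frequency terms far above suburban defaults, which is why the standard scenarios underestimate tribal exposures.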

  19. FUTURE SCENARIOS OF CHANGE IN WILDLIFE HABITAT

    EPA Science Inventory

    Studies in Pennsylvania, Iowa, California, and Oregon show varying losses of terrestrial wildlife habitat in scenarios based on different assumptions about future human land use patterns. Retrospective estimates of losses of habitat since Euro-American settlement in several stud...

  20. From Performance Reporting to Performance-Based Funding: Florida's Experiences in Workforce Development Performance Measurement.

    ERIC Educational Resources Information Center

    Pfeiffer, Jay J.

    1998-01-01

    Discusses accountability in Florida colleges, specifically the movement toward providing state funds to public higher education institutions based on student outputs and outcome--including post-graduation earnings--instead of full-time equivalency enrollment data. Describes several related legislative policies, including the Workforce Florida Act…

  1. Estimating the economic impact of a repository from scenario-based surveys: Models of the relation of stated intent to actual behavior

    SciTech Connect

    Easterling, D.; Morwitz, V.; Kunreuther, H.

    1990-12-01

    The task of estimating the economic impact of a facility as novel and long-lived as a high-level nuclear waste (HLNW) repository is fraught with uncertainty. One approach to the forecasting problems is to survey economic agents as to how they would respond when confronted with hypothetical repository scenarios. A series of such studies conducted for the state of Nevada have examined the potential impact of a Yucca Mountain repository on behavior such as planning conventions, attending conventions, vacationing, outmigration, immigration, and business location. In each case, respondents drawn from a target population report on whether a particular repository event (either some form of an accident, or simply the presence of the facility) would cause them to act any differently than they otherwise would. The responses to such a survey provide an indication of whether or not economic behavior would be altered. However, the analysis is inevitably plagued with the question of how much credence to place in the reports of intended behavior; can we believe what people report they would do in a hypothetical situation? The present study examines a more precise version of this question regarding the validity of stated intent data. After reviewing a variety of literature in the area of intent versus actual behavior, we provide an answer to the question, "What levels of actual behavior are consistent with the intent data that have been observed in the repository surveys?" More formally, we assume that we are generally interested in predicting the proportion of a sample who will actually perform a target behavior. 86 refs., 6 figs., 9 tabs.

  2. Performance analysis of a potassium-base AMTEC cell

    SciTech Connect

    Huang, C.; Hendricks, T.J.; Hunt, T.K.

    1998-07-01

    Sodium-BASE Alkali-Metal-Thermal-to-Electric-Conversion (AMTEC) cells have been receiving increased attention and funding from the Department of Energy, NASA and the United States Air Force. Recently, sodium-BASE (Na-BASE) AMTEC cells were selected for the Advanced Radioisotope Power System (ARPS) program for the next generation of deep-space missions and spacecraft. Potassium-BASE (K-BASE) AMTEC cells have not received as much attention to date, even though the vapor pressure of potassium is higher than that of sodium at the same temperature. As a result, K-BASE AMTEC cells with potentially higher open-circuit voltage and higher power output than Na-BASE AMTEC cells are possible. Because the surface tension of potassium is about half of the surface tension of sodium at the same temperature, the artery and evaporator design in a potassium AMTEC cell has much more challenging pore size requirements than designs using sodium. This paper uses a flexible thermal/fluid/electrical model to predict the performance of a K-BASE AMTEC cell. Pore sizes in the artery of K-BASE AMTEC cells must be smaller by an order of magnitude than in Na-BASE AMTEC cells. The performance of a K-BASE AMTEC cell was higher than that of a Na-BASE AMTEC cell at low voltages/high currents. K-BASE AMTEC cells also have the potential of much better electrode performance, thereby creating another avenue for potentially better performance in K-BASE AMTEC cells.
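    The link between working-fluid vapor pressure and open-circuit voltage invoked above can be illustrated with the ideal Nernst-type relation for an AMTEC cell. The sketch below takes the anode- and cathode-side pressures as inputs rather than computing them from a Na or K vapor-pressure correlation, ignores real-cell losses, and uses a hypothetical operating point; it is not the thermal/fluid/electrical model of the cited paper.

    ```python
    import math

    R = 8.314      # gas constant, J/(mol*K)
    F = 96485.0    # Faraday constant, C/mol

    def amtec_open_circuit_voltage(T_base, p_anode, p_cathode):
        """Ideal AMTEC open-circuit voltage for a singly charged working-fluid
        ion: V_oc = (R*T/F) * ln(p_anode / p_cathode).  Illustrative only."""
        return (R * T_base / F) * math.log(p_anode / p_cathode)

    # Hypothetical operating point: 1100 K electrolyte, three orders of
    # magnitude pressure ratio across the BASE.
    print(f"{amtec_open_circuit_voltage(1100.0, 5.0e4, 50.0):.2f} V")
    ```

    A higher-vapor-pressure working fluid raises the pressure ratio achievable at a given temperature, which is the basis of the potential K-BASE advantage described in the abstract.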

  3. The SAFRR Tsunami Scenario

    USGS Publications Warehouse

    Porter, K.; Jones, Lucile M.; Ross, Stephanie L.; Borrero, J.; Bwarie, J.; Dykstra, D.; Geist, Eric L.; Johnson, L.; Kirby, Stephen H.; Long, K.; Lynett, P.; Miller, K.; Mortensen, Carl E.; Perry, S.; Plumlee, G.; Real, C.; Ritchie, L.; Scawthorn, C.; Thio, H.K.; Wein, Anne; Whitmore, P.; Wilson, R.; Wood, Nathan J.

    2013-01-01

    The U.S. Geological Survey and several partners operate a program called Science Application for Risk Reduction (SAFRR) that produces (among other things) emergency planning scenarios for natural disasters. The scenarios show how science can be used to enhance community resiliency. The SAFRR Tsunami Scenario describes potential impacts of a hypothetical, but realistic, tsunami affecting California (as well as the west coast of the United States, Alaska, and Hawaii) for the purpose of informing planning and mitigation decisions by a variety of stakeholders. The scenario begins with an Mw 9.1 earthquake off the Alaska Peninsula. With Pacific basin-wide modeling, we estimate up to 5 m waves and 10 m/sec currents would strike California 5 hours later. In marinas and harbors, 13,000 small boats are damaged or sunk (1 in 3) at a cost of $350 million, causing navigation and environmental problems. Damage in the Ports of Los Angeles and Long Beach amounts to $110 million, half of it water damage to vehicles and containerized cargo. Flooding of coastal communities affects 1800 city blocks, resulting in $640 million in damage. The tsunami damages 12 bridge abutments and 16 lane-miles of coastal roadway, costing $85 million to repair. Fire and business interruption losses will substantially add to direct losses. Flooding affects 170,000 residents and workers. A wide range of environmental impacts could occur. An extensive public education and outreach program is underway, as well as an evaluation of the overall effort.

  4. Biomass Scenario Model

    SciTech Connect

    2015-09-01

    The Biomass Scenario Model (BSM) is a unique, carefully validated, state-of-the-art dynamic model of the domestic biofuels supply chain which explicitly focuses on policy issues, their feasibility, and potential side effects. It integrates resource availability, physical/technological/economic constraints, behavior, and policy. The model uses a system dynamics simulation (not optimization) to model dynamic interactions across the supply chain.

  5. Writing clinical scenarios for clinical science questions.

    PubMed

    Smith, Phil Em; Mucklow, John C

    2016-04-01

    Written knowledge assessments for physicians in training typically involve multiple-choice questions that use a clinical scenario in a single-best-answer format. The Royal College of Physicians Part 1 MRCP(UK) examination includes basic sciences themes that are challenging to assess through a clinical scenario. A realistic clinical setting based on everyday clinical practice and integral to the question is the clearest demonstration that the knowledge being assessed is clinically relevant. However, without special attention to detail, the scenario in a clinical science question can appear redundant or artificial. Reading unnecessary material frustrates candidates and threatens the reputation of the assessment. In this paper we discuss why a clinical scenario is important for basic science questions and offer advice on setting realistic and plausible clinical scenarios for such questions. PMID:27037383

  6. Control of large antennas based on electromagnetic performance criteria

    NASA Technical Reports Server (NTRS)

    Lin, Y. H.; Hamidi, M.; Manshadi, M.

    1985-01-01

    The electromagnetic (EM) performance of large flexible antennas is traditionally achieved by imposing stringent geometric restrictions on the structural distortions from a nominal optimum configuration. An approach to alleviate the stringency of the geometrical criteria of satisfactory performance is presented. The approach consists of generating a linear optimal control problem with a quadratic cost functional, where the cost functional is obtained from the EM characteristics of the antenna and the dynamic system constraint is given by the structural model of the antenna. It is established that the EM-based optimal controller is considerably more efficient than traditional geometry-based controllers. The same EM performance can be achieved with a much reduced control effort.
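    The "linear optimal control problem with quadratic cost" described above is the standard LQR setting. The sketch below solves a generic continuous-time LQR with scipy, where a toy two-state structural mode stands in for the antenna model and the weighting matrix Q stands in for a cost derived from EM performance; the matrices are placeholders, not those of the cited work.

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Toy structural model (one vibration mode): states = [displacement, velocity].
    omega, zeta = 2.0, 0.02
    A = np.array([[0.0, 1.0],
                  [-omega ** 2, -2.0 * zeta * omega]])
    B = np.array([[0.0],
                  [1.0]])

    Q = np.diag([50.0, 1.0])   # stand-in for an EM-performance-derived weighting
    R = np.array([[1.0]])      # control-effort weighting

    # Solve the continuous algebraic Riccati equation and form the LQR gain.
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.inv(R) @ B.T @ P
    print("state-feedback gain K =", K)
    print("closed-loop poles     =", np.linalg.eigvals(A - B @ K))
    ```

    In the approach described above, the essential change is in where Q comes from: it is derived from the antenna's EM characteristics rather than from purely geometric distortion limits.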

  7. Effect of hand-based sensors on manipulator control performance

    NASA Technical Reports Server (NTRS)

    Bejczy, A. K.

    1977-01-01

    Manipulator task categories and motion phases require various hand-based information systems to meet the control performance requirements. The effect of proximity, tactile and force/torque sensors on the performance of remote manipulator control is discussed. An overview is presented on various experimental hand-based information systems which provide the manipulator controller some non-visual 'awareness' of the task environment. The rest of the paper describes and evaluates various control experiments performed at JPL using hand-mounted proximity sensors to guide and control hand motion near solid objects.

  8. Performance assessment of simulated 3D laser images using Geiger-mode avalanche photo-diode: tests on simple synthetic scenarios

    NASA Astrophysics Data System (ADS)

    Coyac, Antoine; Hespel, Laurent; Riviere, Nicolas; Briottet, Xavier

    2015-10-01

    In the past few decades, laser imaging has demonstrated its potential for delivering accurate range images of objects or scenes, even at long range or under bad weather conditions (rain, fog, day and night vision). We note great improvements in the design and development of single- and multi-element infrared sensors, concerning embeddability, readout circuitry capacity, and pixel resolution and sensitivity, allowing a wide diversity of applications (i.e. enhanced vision, long-distance target detection and reconnaissance, 3D DSM generation). Unfortunately, it is often difficult to have all of the instruments at hand to compare their performance for a given application. Laser imaging simulation has been shown to be an interesting alternative to acquiring real data, offering higher flexibility for such sensor comparisons while being time and cost efficient. In this paper, we present a 3D laser imaging end-to-end simulator using a focal plane array with Geiger-mode detection, named LANGDOC. This work aims to highlight the interest and capability of this new generation of photo-diode arrays, especially for airborne mapping and surveillance of high-risk areas.
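    A common first-order model of a Geiger-mode avalanche photodiode pixel, which such simulators build on, is that the pixel fires if at least one primary event occurs in the range gate. The sketch below is this generic Poisson model with hypothetical pixel parameters; it is not the LANGDOC simulator, and dead-time and crosstalk effects are ignored.

    ```python
    import math

    def gm_apd_trigger_probability(n_signal_photons, pde, dark_count_rate, gate_s):
        """First-order Geiger-mode APD model: the pixel fires if at least one
        primary event (detected signal photon or dark count) occurs in the gate.
        Assumes Poisson statistics; illustrative only."""
        mean_events = pde * n_signal_photons + dark_count_rate * gate_s
        return 1.0 - math.exp(-mean_events)

    # Hypothetical pixel: 5 returned photons, 30% photon detection efficiency,
    # 10 kHz dark-count rate, 1 microsecond range gate.
    print(f"{gm_apd_trigger_probability(5, 0.30, 1e4, 1e-6):.3f}")
    ```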

  9. Environmental assessment of spatial plan policies through land use scenarios

    SciTech Connect

    Geneletti, Davide

    2012-01-15

    This paper presents a method based on scenario analysis to compare the environmental effects of different spatial plan policies in a range of possible futures. The study aimed at contributing to overcoming two limitations encountered in Strategic Environmental Assessment (SEA) for spatial planning: poor exploration of how the future might unfold, and poor consideration of alternative plan policies. Scenarios were developed through what-if functions and spatial modeling in a Geographical Information System (GIS), and consisted of maps that represent future land uses under different assumptions on key driving forces. The use of land use scenarios provided a representation of what the different policies will look like on the ground. This allowed gaining a better understanding of the policies' implications for the environment, which could be measured through a set of indicators. The research undertook a case-study approach by developing and assessing land use scenarios for the future growth of Caia, a strategically located and fast-developing town in rural Mozambique. The effects of alternative spatial plan policies were assessed against a set of environmental performance indicators, including deforestation, loss of agricultural land, encroachment of flood-prone areas and wetlands, and access to water sources. In this way, critical environmental effects related to the implementation of each policy were identified and discussed, suggesting possible strategies to address them. Research Highlights: ► The method contributes to two critical issues in SEA: exploration of the future and consideration of alternatives. ► Future scenarios are used to test the environmental performance of different spatial plan policies under uncertainty. ► Spatially explicit land use scenarios provide a representation of what different policies will look like on the ground.
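    A minimal what-if land-use rule of the kind described above can be expressed directly on a raster. The toy sketch below applies two urban-growth assumptions to a random land-use grid and scores them with a single forest-loss indicator; the land-use codes, growth rule and radii are illustrative assumptions, not the Caia case-study model.

    ```python
    import numpy as np

    # Toy land-use raster and a "distance to town centre" surface.
    FOREST, AGRICULTURE, URBAN, WETLAND = 1, 2, 3, 4
    rng = np.random.default_rng(1)
    land_use = rng.choice([FOREST, AGRICULTURE, WETLAND], size=(50, 50))
    dist_to_town = np.fromfunction(lambda r, c: np.hypot(r - 25, c - 25), (50, 50))

    def urban_growth_scenario(lu, dist, radius):
        """What-if rule: cells within `radius` of the town centre urbanise,
        unless they are wetlands (assumed protected in this scenario)."""
        out = lu.copy()
        out[(dist <= radius) & (lu != WETLAND)] = URBAN
        return out

    compact = urban_growth_scenario(land_use, dist_to_town, radius=8)
    sprawl  = urban_growth_scenario(land_use, dist_to_town, radius=15)

    # Simple environmental indicator: forest cells lost under each scenario.
    for name, scen in [("compact", compact), ("sprawl", sprawl)]:
        loss = np.sum((land_use == FOREST) & (scen == URBAN))
        print(name, "forest cells lost:", int(loss))
    ```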

  10. Guidelines for performance-based supplier audits (NCIG-16)

    SciTech Connect

    Lauderdale, J.R.; Mattu, R.K.; Roman, W.S. )

    1990-06-01

    This document provides guidelines for planning and conducting performance-based audits of suppliers of items used in nuclear power plants. A common purpose of audits is to provide a basis for confidence in the supplier's controls to ensure that products received will perform their intended functions satisfactorily. Performance-based audits offer means of raising the level of confidence. This confidence comes from evaluation of important features of the product and the processes and activities that produce it. This document does not add requirements to those in existing codes, standards, or regulations. The guidance herein is intended to complement the information in existing industry standards and practices. Performance-based audits are one element of an effective procurement program. A companion EPRI/NCIG document, EPRI NP-6629, Guidelines for the Procurement and Receipt of Items for Nuclear Power Plants (NCIG-15), provides guidance for other elements of an effective procurement program.

  11. Mapping of multiple parameter m-health scenarios to mobile WiMAX QoS variables.

    PubMed

    Alinejad, Ali; Philip, N; Istepanian, R S H

    2011-01-01

    Multiparameter m-health scenarios with bandwidth-demanding requirements will be one of the key applications in future 4G mobile communication systems. These applications will potentially require specific spectrum allocations with higher quality of service requirements. Furthermore, one of the key 4G technologies targeting m-health will be medical applications based on WiMAX systems. Hence, it is timely to evaluate such multiple parametric m-health scenarios over mobile WiMAX networks. In this paper, we address the preliminary performance analysis of a mobile WiMAX network for multiparametric telemedical scenarios. In particular, we map the medical QoS to typical WiMAX QoS parameters to optimise the performance of these parameters in a typical m-health scenario. Preliminary performance analyses of the proposed multiparametric scenarios are evaluated to provide essential information for future medical QoS requirements and constraints in these telemedical network environments. PMID:22254612
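    The mapping idea can be sketched as a simple lookup from medical traffic types to the mobile WiMAX (IEEE 802.16e) scheduling services. The service class names (UGS, ertPS, rtPS, nrtPS, BE) are standard, but the particular assignments and the rate/latency figures below are illustrative assumptions, not the mapping proposed in the cited paper.

    ```python
    # Illustrative mapping of m-health traffic onto WiMAX scheduling services.
    MHEALTH_TO_WIMAX_QOS = {
        "real_time_ecg":      {"service": "rtPS",  "rate_kbps": 24,   "max_latency_ms": 300},
        "video_consultation": {"service": "ertPS", "rate_kbps": 768,  "max_latency_ms": 150},
        "medical_image_send": {"service": "nrtPS", "rate_kbps": 2000, "max_latency_ms": None},
        "vital_signs_poll":   {"service": "UGS",   "rate_kbps": 8,    "max_latency_ms": 100},
        "admin_messaging":    {"service": "BE",    "rate_kbps": None, "max_latency_ms": None},
    }

    def required_uplink_kbps(streams):
        """Sum the nominal rates of the active streams (ignoring protocol overheads)."""
        return sum(MHEALTH_TO_WIMAX_QOS[s]["rate_kbps"] or 0 for s in streams)

    print(required_uplink_kbps(["real_time_ecg", "video_consultation", "vital_signs_poll"]))
    ```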

  12. Combination of Face Regions in Forensic Scenarios.

    PubMed

    Tome, Pedro; Fierrez, Julian; Vera-Rodriguez, Ruben; Ortega-Garcia, Javier

    2015-07-01

    This article presents an experimental analysis of the combination of different regions of the human face in various forensic scenarios to generate scientific knowledge useful for forensic experts. Three scenarios of interest at different distances are considered, comparing mugshot and CCTV face images using the MORPH and SCface databases. One of the main findings is that inner facial regions combine better in mugshot and close CCTV scenarios and outer facial regions combine better in far CCTV scenarios. This means that, depending on the acquisition distance, the discriminative power of the facial regions changes, in some cases performing better than the full face. This effect can be exploited by considering the fusion of facial regions, which results in a very significant improvement of the discriminative performance compared to just using the full face. PMID:26189995
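    A common baseline for the region fusion described above is min-max normalisation followed by a weighted sum of per-region matcher scores. The sketch below uses that baseline with fabricated similarity scores and weights; it is not necessarily the fusion rule or the data of the cited study.

    ```python
    import numpy as np

    def minmax_normalize(scores):
        s = np.asarray(scores, dtype=float)
        return (s - s.min()) / (s.max() - s.min() + 1e-12)

    def fuse_region_scores(region_scores, weights):
        """Weighted-sum fusion of per-region matcher scores after min-max
        normalisation (a common baseline fusion rule)."""
        fused = np.zeros_like(minmax_normalize(next(iter(region_scores.values()))))
        total = sum(weights.values())
        for region, scores in region_scores.items():
            fused += (weights[region] / total) * minmax_normalize(scores)
        return fused

    # Hypothetical similarity scores for five probe-gallery comparisons.
    scores = {
        "eyes":  [0.91, 0.40, 0.35, 0.88, 0.20],
        "nose":  [0.75, 0.55, 0.30, 0.80, 0.25],
        "mouth": [0.60, 0.45, 0.50, 0.70, 0.15],
    }
    # Emphasise inner regions, e.g. for a close-range (mugshot-like) scenario.
    print(fuse_region_scores(scores, {"eyes": 0.5, "nose": 0.3, "mouth": 0.2}))
    ```

    Re-weighting the regions per scenario (inner regions at close range, outer regions at far range) is exactly the kind of distance-dependent fusion the abstract reports as beneficial.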

  13. Spent fuel receipt scenarios study

    SciTech Connect

    Ballou, L.B.; Montan, D.N.; Revelli, M.A.

    1990-09-01

    This study reports on the results of an assignment from the DOE Office of Civilian Radioactive Waste Management to evaluate the effects of different scenarios for receipt of spent fuel on the potential performance of the waste packages in the proposed Yucca Mountain high-level waste repository. The initial evaluations were performed and an interim letter report was prepared during the fall of 1988. Subsequently, the scope of work was expanded and additional analyses were conducted in 1989. This report combines the results of the two phases of the activity. This study is a part of a broader effort to investigate the options available to the DOE and the nuclear utilities for selection of spent fuel for acceptance into the Federal Waste Management System for disposal. Each major element of the system has evaluated the effects of various options on its own operations, with the objective of providing the basis for performing system-wide trade-offs and determining an optimum acceptance scenario. Therefore, this study considers different scenarios for receipt of spent fuel by the repository only from the narrow perspective of their effect on the very-near-field temperatures in the repository following permanent closure. This report is organized into three main sections. The balance of this section is devoted to a statement of the study objective and a summary of the assumptions. The second section of the report contains a discussion of the major elements of the study. The third section summarizes the results of the study and draws some conclusions from them. The appendices include copies of the waste acceptance schedule and the existing and projected spent fuel inventory that were used in the study. 10 refs., 27 figs.

  14. High performance computing for three-dimensional agent-based molecular models.

    PubMed

    Pérez-Rodríguez, G; Pérez-Pérez, M; Fdez-Riverola, F; Lourenço, A

    2016-07-01

    Agent-based simulations are increasingly popular in exploring and understanding cellular systems, but the natural complexity of these systems and the desire to grasp different modelling levels demand cost-effective simulation strategies and tools. In this context, the present paper introduces novel sequential and distributed approaches for the three-dimensional agent-based simulation of individual molecules in cellular events. These approaches are able to describe the dimensions and position of the molecules with high accuracy and thus, study the critical effect of spatial distribution on cellular events. Moreover, two of the approaches allow multi-thread high performance simulations, distributing the three-dimensional model in a platform independent and computationally efficient way. Evaluation addressed the reproduction of molecular scenarios and different scalability aspects of agent creation and agent interaction. The three approaches simulate common biophysical and biochemical laws faithfully. The distributed approaches show improved performance when dealing with large agent populations while the sequential approach is better suited for small to medium size agent populations. Overall, the main new contribution of the approaches is the ability to simulate three-dimensional agent-based models at the molecular level with reduced implementation effort and moderate-level computational capacity. Since these approaches have a generic design, they have the major potential of being used in any event-driven agent-based tool. PMID:27372059
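    A minimal sequential 3D agent step of the kind described above is shown below: molecules diffuse by Brownian motion and are flagged when two come within a capture radius. The population size, box, step length and the "reaction" rule are illustrative assumptions; this is not the framework or the distributed implementation of the cited paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_agents, box, radius, step_sigma = 200, 100.0, 2.0, 1.5

    pos = rng.uniform(0.0, box, size=(n_agents, 3))
    reacted = np.zeros(n_agents, dtype=bool)

    for t in range(100):
        # Diffusion: Gaussian displacement, clipped (reflected) at the box walls.
        pos = np.clip(pos + rng.normal(0.0, step_sigma, size=pos.shape), 0.0, box)

        # Pairwise distances (O(n^2); fine for a small illustrative population).
        d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        reacted |= np.any(d < radius, axis=1)

    print("agents that encountered a partner at least once:", int(reacted.sum()))
    ```

    Scaling this to large populations is precisely where the distributed, multi-threaded approaches discussed in the abstract become necessary, typically by replacing the all-pairs distance check with spatial partitioning.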

  15. Riparian vegetation structure under desertification scenarios

    NASA Astrophysics Data System (ADS)

    Rosário Fernandes, M.; Segurado, Pedro; Jauch, Eduardo; Ferreira, M. Teresa

    2015-04-01

    Riparian areas are responsible for many ecological and ecosystem services, including the filtering function, that are considered crucial to the preservation of water quality and social benefits. The main goal of this study is to quantify and understand riparian variability under desertification scenario(s) and to identify the optimal riparian indicators for water scarcity and droughts (WS&D), thereby improving river basin management. This study was performed in the Iberian Tâmega basin, using riparian woody patches, mapped by visual interpretation of Google Earth imagery, along 130 Sampling Units of 250 m long river stretches. Eight riparian structural indicators, related to the lateral dimension, weighted area and shape complexity of riparian patches, were calculated using the Patch Analyst extension for ArcGIS 10. A set of 29 hydrological, climatic, and hydrogeomorphological variables were computed by a water modelling system (MOHID), using monthly meteorological data between 2008 and 2014. Land-use classes were also calculated, in a 250 m buffer surrounding each sampling unit, using a classification system based on Corine Land Cover. Boosted Regression Trees identified Mean Width (MW) as the optimal riparian indicator for water scarcity and drought, followed by the Weighted Class Area (WCA) (classification accuracy = 0.79 and 0.69, respectively). Average Flow and Strahler number were consistently selected, by all boosted models, as the most important explanatory variables. However, a combined effect of hydrogeomorphology and land use can explain the high variability found in the riparian width, mainly in Tâmega tributaries. Riparian patches are larger towards the Tâmega river mouth although with lower shape complexity, probably related to more continuous and almost monospecific stands. Climatic, hydrological and land use scenarios, singly and combined, were used to quantify the riparian variability responding to these changes, and to assess the loss of riparian

  16. Underground infrastructure damage for a Chicago scenario

    SciTech Connect

    Dey, Thomas N; Bos, Randall J

    2011-01-25

    Estimating effects due to an urban IND (improvised nuclear device) on underground structures and underground utilities is a challenging task. Nuclear effects tests performed at the Nevada Test Site (NTS) during the era of nuclear weapons testing provide much information on how underground military structures respond. Transferring this knowledge to answer questions about the urban civilian environment is needed to help plan responses to IND scenarios. Explosions just above the ground surface can only couple a small fraction of the blast energy into an underground shock. The various forms of nuclear radiation have limited penetration into the ground. While the shock transmitted into the ground carries only a small fraction of the blast energy, peak stresses are generally higher and peak ground displacement is lower than in the air blast. While underground military structures are often designed to resist stresses substantially higher than due to the overlying rocks and soils (overburden), civilian structures such as subways and tunnels would generally only need to resist overburden conditions with a suitable safety factor. Just as we expect the buildings themselves to channel and shield air blast above ground, basements and other underground openings as well as changes of geology will channel and shield the underground shock wave. While a weaker shock is expected in an urban environment, small displacements on very close-by faults, and more likely, soils being displaced past building foundations where utility lines enter, could readily damage or disable these services. Immediately near an explosion, the blast can 'liquefy' a saturated soil, creating a quicksand-like condition for a period of time. We extrapolate the nuclear effects experience to a Chicago-based scenario. We consider the TARP (Tunnel and Reservoir Project) and subway system and the underground lifeline (electric, gas, water, etc.) system and provide guidance for planning this scenario.

  17. Understanding the relationship between safety investment and safety performance of construction projects through agent-based modeling.

    PubMed

    Lu, Miaojia; Cheung, Clara Man; Li, Heng; Hsu, Shu-Chien

    2016-09-01

    The construction industry in Hong Kong increased its safety investment by 300% in the past two decades; however, its accident rate has plateaued at around 50% for one decade. Against this backdrop, researchers have found inconclusive results on the causal relationship between safety investment and safety performance. Using agent-based modeling, this study takes an unconventional bottom-up approach to study safety performance on a construction site as an outcome of a complex system defined by interactions among a worksite, individual construction workers, and different safety investments. Instead of focusing on finding the absolute relationship between safety investment and safety performance, this study contributes to providing a practical framework to investigate how different safety investments interacting with different parameters such as human and environmental factors could affect safety performance. As a result, we could identify cost-effective safety investments under different construction scenarios for delivering optimal safety performance. PMID:27240124
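    The bottom-up framing can be illustrated with a toy agent-based Monte Carlo in which each worker has a daily incident probability reduced, with diminishing returns, by training and equipment spending. The functional forms, coefficients and budgets below are assumptions for illustration, not the model or data of the cited study.

    ```python
    import numpy as np

    def simulate_site(n_workers, days, base_risk, training_spend, equipment_spend,
                      seed=0):
        """Toy agent-based safety model; all functional forms are assumed."""
        rng = np.random.default_rng(seed)
        # Heterogeneous baseline risk across workers.
        risk = base_risk * rng.uniform(0.5, 1.5, size=n_workers)
        # Diminishing-returns mitigation from the two investment types.
        mitigation = 1.0 / (1.0 + 0.002 * training_spend) \
                   * 1.0 / (1.0 + 0.001 * equipment_spend)
        daily_p = risk * mitigation
        return int(rng.binomial(1, np.tile(daily_p, (days, 1))).sum())

    for budget_split in [(0, 0), (500, 500), (2000, 0), (0, 2000), (1000, 1000)]:
        inc = simulate_site(100, 250, 0.002, *budget_split)
        print(f"training={budget_split[0]:>5}  equipment={budget_split[1]:>5}  incidents={inc}")
    ```

    Running such a sweep over budget splits is one way to compare the cost-effectiveness of different safety investments under a given scenario, which is the kind of question the framework above is built to answer.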

  18. Evaluation of the Terminal Sequencing and Spacing System for Performance Based Navigation Arrivals

    NASA Technical Reports Server (NTRS)

    Thipphavong, Jane; Jung, Jaewoo; Swenson, Harry N.; Martin, Lynne; Lin, Melody; Nguyen, Jimmy

    2013-01-01

    NASA has developed the Terminal Sequencing and Spacing (TSS) system, a suite of advanced arrival management technologies combining time-based scheduling and controller precision spacing tools. TSS is a ground-based controller automation tool that facilitates sequencing and merging arrivals that have both current standard ATC routes and terminal Performance-Based Navigation (PBN) routes, especially during highly congested demand periods. In collaboration with the FAA and MITRE's Center for Advanced Aviation System Development (CAASD), TSS system performance was evaluated in human-in-the-loop (HITL) simulations with currently active controllers as participants. Traffic scenarios had mixed Area Navigation (RNAV) and Required Navigation Performance (RNP) equipage, where the more advanced RNP-equipped aircraft had preferential treatment with a shorter approach option. Simulation results indicate the TSS system achieved benefits by enabling PBN, while maintaining high throughput rates, 10% above baseline demand levels. Flight path predictability improved, where path deviation was reduced by 2 NM on average and variance in the downwind leg length was 75% less. Arrivals flew more fuel-efficient descents for longer, spending an average of 39 seconds less in step-down level altitude segments. Self-reported controller workload was reduced, with statistically significant differences at the p < 0.01 level. The RNP-equipped arrivals were also able to more frequently capitalize on the benefits of being "Best-Equipped, Best-Served" (BEBS), where less vectoring was needed and nearly all RNP approaches were conducted without interruption.

  19. Integrating policy-based management and SLA performance monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Tzong-Jye; Lin, Chin-Yi; Chang, Shu-Hsin; Yen, Meng-Tzu

    2001-10-01

    A policy-based management system provides the configuration capability for system administrators to focus on the requirements of customers. The service-level agreement performance monitoring mechanism helps system administrators verify the correctness of policies. However, it is difficult for a device to process the policies directly because policies are a management-level concept. This paper proposes a mechanism to decompose a policy into rules that can be efficiently processed by a device. Thus, the device may process the rules and collect the performance statistics efficiently, and the policy-based management system may collect these performance statistics and report the service-level agreement performance monitoring information to the system administrator. The proposed policy-based management system meets both the policy configuration and service-level agreement performance monitoring requirements. A policy consists of a condition part and an action part. The condition part is a Boolean expression of a source host IP group, a destination host IP group, etc. The action part is the parameters of services. We say that an address group is compact if it consists only of a range of IP addresses that can be denoted by a pair of an IP address and a corresponding IP mask. If the condition part of a policy consists only of compact address groups, we say that the policy is a rule. Since a device can efficiently process a compact address group while a system administrator prefers to define a range of IP addresses, the policy-based management system has to translate policies into rules and bridge the gaps between them. The proposed policy-based management system builds the relationships between VPNs and policies, and between policies and rules. Since the system administrator wants to monitor the system performance information of VPNs and policies, the proposed policy-based management system downloads the relationships among VPNs, policies and rules to the
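    The range-to-compact-group decomposition described above can be sketched with Python's standard ipaddress module, which splits an arbitrary address range into the minimal set of address/mask (prefix) groups. The policy structure and the action field below are hypothetical; only the range decomposition itself is standard-library behaviour.

    ```python
    import ipaddress

    def decompose_range(first_ip, last_ip):
        """Split an arbitrary IP range into 'compact' address groups, i.e.
        (network address, mask) pairs a device can match efficiently."""
        first = ipaddress.IPv4Address(first_ip)
        last = ipaddress.IPv4Address(last_ip)
        return list(ipaddress.summarize_address_range(first, last))

    def policy_to_rules(policy):
        """Expand a (hypothetical) policy whose condition uses address ranges
        into device-level rules whose conditions use only compact groups."""
        rules = []
        for src in decompose_range(*policy["src_range"]):
            for dst in decompose_range(*policy["dst_range"]):
                rules.append({"src": src, "dst": dst, "action": policy["action"]})
        return rules

    policy = {
        "src_range": ("10.0.0.3", "10.0.0.12"),        # not prefix-aligned
        "dst_range": ("192.168.1.0", "192.168.1.255"),
        "action": {"dscp": 46},                         # illustrative action
    }
    for rule in policy_to_rules(policy):
        print(rule["src"], "->", rule["dst"], rule["action"])
    ```

    Keeping the policy-to-rule mapping explicit, as in the dictionaries above, is also what lets per-rule device statistics be rolled back up into per-policy and per-VPN monitoring reports.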

  20. Performance optimization of web-based medical simulation.

    PubMed

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2013-01-01

    This paper presents a technique for performance optimization of multimodal interactive web-based medical simulation. A web-based simulation framework is promising for easy access and wide dissemination of medical simulation. However, the real-time performance of the simulation highly depends on hardware capability on the client side. Providing consistent simulation on different hardware is critical for reliable medical simulation. This paper proposes a non-linear mixed integer programming model to optimize the performance of visualization and physics computation while considering hardware capability and application-specific constraints. The optimization model identifies and parameterizes the rendering and computing capabilities of the client hardware using an exploratory proxy code. The parameters are utilized to determine the optimized simulation conditions including texture sizes, mesh sizes and canvas resolution. The test results show that the optimization model not only achieves a desired frame rate but also resolves visual artifacts due to low-performance hardware. PMID:23400151
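    The structure of the decision problem can be sketched by enumerating a small discrete design space against an assumed per-frame cost model, which is a simplification of the non-linear mixed integer program described above. The cost coefficients stand in for parameters that, in the real system, would be measured by the exploratory proxy code; all values are placeholders.

    ```python
    from itertools import product

    # Discrete options the optimizer may choose from (illustrative values).
    TEXTURE = [256, 512, 1024, 2048]        # texture edge size, px
    MESH    = [2000, 5000, 10000, 20000]    # number of mesh elements
    CANVAS  = [(640, 480), (1280, 720), (1920, 1080)]

    def predicted_fps(tex, mesh, canvas, gpu_score, cpu_score):
        """Toy per-frame cost model: rendering cost grows with texture area and
        canvas pixels, physics cost with mesh size.  Coefficients are assumed."""
        render_ms  = (tex * tex * 2e-7 + canvas[0] * canvas[1] * 1e-6) / gpu_score
        physics_ms = mesh * 4e-3 / cpu_score
        return 1000.0 / (render_ms + physics_ms + 1.0)   # +1 ms fixed overhead

    def optimize(gpu_score, cpu_score, target_fps=30.0):
        """Pick the highest-fidelity configuration that still meets target_fps."""
        best, best_quality = None, -1.0
        for tex, mesh, canvas in product(TEXTURE, MESH, CANVAS):
            if predicted_fps(tex, mesh, canvas, gpu_score, cpu_score) < target_fps:
                continue
            quality = tex / 2048 + mesh / 20000 + (canvas[0] * canvas[1]) / (1920 * 1080)
            if quality > best_quality:
                best, best_quality = (tex, mesh, canvas), quality
        return best

    print("low-end client :", optimize(gpu_score=0.5, cpu_score=0.5))
    print("high-end client:", optimize(gpu_score=4.0, cpu_score=4.0))
    ```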

  1. Parallel performance optimizations on unstructured mesh-based simulations

    SciTech Connect

    Sarje, Abhinav; Song, Sukhyun; Jacobsen, Douglas; Huck, Kevin; Hollingsworth, Jeffrey; Malony, Allen; Williams, Samuel; Oliker, Leonid

    2015-06-01

    This paper addresses two key parallelization challenges in the unstructured mesh-based ocean modeling code MPAS-Ocean, which uses a mesh based on Voronoi tessellations: (1) load imbalance across processes, and (2) unstructured data access patterns that inhibit intra- and inter-node performance. Our work analyzes the load imbalance due to naive partitioning of the mesh, and develops methods to generate mesh partitioning with better load balance and reduced communication. Furthermore, we present methods that minimize both inter- and intra-node data movement and maximize data reuse. Our techniques include predictive ordering of data elements for higher cache efficiency, as well as communication reduction approaches. We present detailed performance data when running on thousands of cores using the Cray XC30 supercomputer and show that our optimization strategies can exceed the original performance by over 2×. Additionally, many of these solutions can be broadly applied to a wide variety of unstructured grid-based computations.
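    One of the ideas above, ordering data elements so that cells close in space are close in memory, can be illustrated with a space-filling-curve (Morton/Z-order) sort of cell coordinates. This is a generic stand-in for the cache-oriented orderings discussed in the paper, not the MPAS-Ocean implementation, and the random points merely stand in for Voronoi cell centres.

    ```python
    import numpy as np

    def morton_key(ix, iy, bits=16):
        """Interleave the bits of integer grid coordinates (Z-order / Morton
        code) so that cells close in 2D space tend to be close in memory."""
        key = 0
        for b in range(bits):
            key |= ((ix >> b) & 1) << (2 * b)
            key |= ((iy >> b) & 1) << (2 * b + 1)
        return key

    def reorder_cells(x, y, n_bins=1024):
        """Return a permutation of cell indices sorted by Morton key."""
        ix = np.minimum((np.asarray(x) * n_bins).astype(int), n_bins - 1)
        iy = np.minimum((np.asarray(y) * n_bins).astype(int), n_bins - 1)
        keys = [morton_key(a, b) for a, b in zip(ix, iy)]
        return np.argsort(keys)

    # Random "cell centres" in the unit square (a stand-in for Voronoi cells).
    rng = np.random.default_rng(3)
    x, y = rng.random(10_000), rng.random(10_000)
    perm = reorder_cells(x, y)
    print("first few reordered cells:", perm[:8])
    ```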

  2. Human Factors Considerations for Performance-Based Navigation

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Adams, Catherine A.

    2006-01-01

    A transition toward a performance-based navigation system is currently underway in both the United States and around the world. Performance-based navigation incorporates Area Navigation (RNAV) and Required Navigation Performance (RNP) procedures that do not rely on the location of ground-based navigation aids. These procedures offer significant benefits to both operators and air traffic managers. Under sponsorship from the Federal Aviation Administration (FAA), the National Aeronautics and Space Administration (NASA) has undertaken a project to document human factors issues that have emerged during RNAV and RNP operations and propose areas for further consideration. Issues were found to include aspects of air traffic control and airline procedures, aircraft systems, and procedure design. Major findings suggest the need for human factors-specific instrument procedure design guidelines. Ongoing industry and government activities to address air-ground communication terminology, procedure design improvements, and chart-database commonality are strongly encouraged.

  3. Evaluation of a weather generator-based method for statistically downscaling non-stationary climate scenarios for impact assessment at a point scale

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The non-stationarity is a major concern for statistically downscaling climate change scenarios for impact assessment. This study is to evaluate whether a statistical downscaling method is fully applicable to generate daily precipitation under non-stationary conditions in a wide range of climatic zo...

  4. Alpha neurofeedback training improves SSVEP-based BCI performance

    NASA Astrophysics Data System (ADS)

    Wan, Feng; Nuno da Cruz, Janir; Nan, Wenya; Wong, Chi Man; Vai, Mang I.; Rosa, Agostinho

    2016-06-01

    Objective. Steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) can provide relatively easy, reliable and high speed communication. However, the performance is still not satisfactory, especially in some users who are not able to generate strong enough SSVEP signals. This work aims to strengthen a user’s SSVEP by alpha down-regulating neurofeedback training (NFT) and consequently improve the performance of the user in using SSVEP-based BCIs. Approach. An experiment with two steps was designed and conducted. The first step was to investigate the relationship between the resting alpha activity and the SSVEP-based BCI performance, in order to determine the training parameter for the NFT. Then in the second step, half of the subjects with ‘low’ performance (i.e. BCI classification accuracy <80%) were randomly assigned to an NFT group to perform real-time NFT, and the other half to a non-NFT control group for comparison. Main results. The first step revealed a significant negative correlation between the BCI performance and the individual alpha band (IAB) amplitudes in the eyes-open resting condition in a total of 33 subjects. In the second step, it was found that during the IAB down-regulating NFT, on average the subjects were able to successfully decrease their IAB amplitude over training sessions. More importantly, the NFT group showed an average increase of 16.5% in the SSVEP signal SNR (signal-to-noise ratio) and an average increase of 20.3% in the BCI classification accuracy, which was significant compared to the non-NFT control group. Significance. These findings indicate that the alpha down-regulating NFT can be used to improve the SSVEP signal quality and the subjects’ performance in using SSVEP-based BCIs. It could be helpful to SSVEP-related studies and would contribute to more effective SSVEP-based BCI applications.
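    A common way to compute the SSVEP SNR reported above is the spectral power at the stimulation frequency divided by the mean power of neighbouring bins. The sketch below applies that definition to a synthetic trial; the signal parameters are illustrative, and this is not necessarily the exact estimator used in the cited study.

    ```python
    import numpy as np

    def ssvep_snr(eeg, fs, stim_freq, n_neighbors=10):
        """SNR of an SSVEP response: power at the stimulation frequency divided
        by the mean power of the surrounding bins (a common definition)."""
        spectrum = np.abs(np.fft.rfft(eeg)) ** 2
        freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
        k = int(np.argmin(np.abs(freqs - stim_freq)))
        lo, hi = max(k - n_neighbors, 1), k + n_neighbors + 1
        neighbors = np.r_[spectrum[lo:k], spectrum[k + 1:hi]]
        return spectrum[k] / neighbors.mean()

    # Synthetic 4-second trial: a 12 Hz SSVEP buried in noise.
    fs = 250
    t = np.arange(0, 4, 1 / fs)
    rng = np.random.default_rng(7)
    eeg = 0.8 * np.sin(2 * np.pi * 12 * t) + rng.normal(0, 2.0, t.size)
    print(f"SNR at 12 Hz: {ssvep_snr(eeg, fs, 12.0):.1f}")
    ```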

  5. A Likelihood-Based Approach to Identifying Contaminated Food Products Using Sales Data: Performance and Challenges

    PubMed Central

    Kaufman, James; Lessler, Justin; Harry, April; Edlund, Stefan; Hu, Kun; Douglas, Judith; Thoens, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias

    2014-01-01

    Foodborne disease outbreaks of recent years demonstrate that, due to increasingly interconnected supply chains, these types of crisis situations have the potential to affect thousands of people, leading to significant healthcare costs, loss of revenue for food companies, and—in the worst cases—death. When a disease outbreak is detected, identifying the contaminated food quickly is vital to minimize suffering and limit economic losses. Here we present a likelihood-based approach, based on exploiting food product sales data and the distribution of foodborne illness case reports, that has the potential to shorten the time needed to identify possibly contaminated food products. Using a real-world food sales data set and artificially generated outbreak scenarios, we show that this method performs very well for contamination scenarios originating from a single “guilty” food product. As it is neither always possible nor necessary to identify the single offending product, the method has been extended so that it can be used as a binary classifier. With this extension it is possible to generate a set of potentially “guilty” products that contains the real outbreak source with very high accuracy. Furthermore, we explore the patterns of food distribution that lead to “hard-to-identify” foods, the possibility of identifying these food groups a priori, and the extent to which the likelihood-based method can be used to quantify uncertainty. We find that high spatial correlation of sales data between products may be a useful indicator for “hard-to-identify” products. PMID:24992565
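
    One way to make the likelihood idea above concrete is to score each candidate product by the multinomial log-likelihood of the regional illness case counts under that product's regional sales shares and rank products by that score. The Python sketch below illustrates this; it is only a rough stand-in for the published model, and the product names, regions and counts are invented.

    ```python
    import numpy as np

    def rank_products(sales, cases):
        """Rank candidate products by the log-likelihood of the observed case
        counts per region under each product's regional sales distribution.
        `sales` maps product name -> sales per region; `cases` holds illness
        case counts per region. Illustrative multinomial likelihood only."""
        scores = {}
        for product, s in sales.items():
            p = np.asarray(s, dtype=float)
            p = (p + 1e-9) / (p.sum() + 1e-9 * len(p))   # sales share per region
            scores[product] = float(np.dot(cases, np.log(p)))
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    # Toy example: three products sold across four regions, five reported cases.
    sales = {"A": [100, 10, 10, 10], "B": [30, 30, 30, 30], "C": [5, 5, 100, 5]}
    cases = np.array([4, 0, 1, 0])
    print(rank_products(sales, cases))   # product "A" should score highest
    ```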

  6. Roadmap Toward a Predictive Performance-based Commercial Energy Code

    SciTech Connect

    Rosenberg, Michael I.; Hart, Philip R.

    2014-10-01

    Energy codes have provided significant increases in building efficiency over the last 38 years, since the first national model energy code was published in late 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. The current focus on prescriptive codes has limitations, including significant variation in actual energy performance depending on which prescriptive options are chosen, a lack of flexibility for designers and developers, and the inability to handle control optimization that is specific to building type and use. This paper provides a high-level review of different options for energy codes, including prescriptive, prescriptive packages, EUI target, outcome-based, and predictive performance approaches. It also explores a next-generation commercial energy code approach that places a greater emphasis on performance-based criteria. A vision is outlined to serve as a roadmap for future commercial code development. That vision is based on code development being led by a specific approach to predictive energy performance, combined with building-specific prescriptive packages that are designed to be both cost-effective and to achieve a desired level of performance. Compliance with this new approach can be achieved either by meeting the performance target as demonstrated by whole-building energy modeling, or by choosing one of the prescriptive packages.

  7. Performance Comparison of HPF and MPI Based NAS Parallel Benchmarks

    NASA Technical Reports Server (NTRS)

    Saini, Subhash

    1997-01-01

    Compilers supporting High Performance Fortran (HPF) features first appeared in late 1994 and early 1995 from Applied Parallel Research (APR), Digital Equipment Corporation, and The Portland Group (PGI). IBM introduced an HPF compiler for the IBM RS/6000 SP2 in April of 1996. Over the past two years, these implementations have shown steady improvement in terms of both features and performance. The performance of various hardware/programming model (HPF and MPI) combinations will be compared, based on the latest NAS Parallel Benchmark results, thus providing a cross-machine and cross-model comparison. Specifically, HPF-based NPB results will be compared with MPI-based NPB results to provide perspective on the performance currently obtainable using HPF versus MPI or versus hand-tuned implementations such as those supplied by the hardware vendors. In addition, we present NPB (Version 1.0) performance results for the following systems: DEC Alpha Server 8400 5/440, Fujitsu VPP Series (VX, VPP300, and VPP700), HP/Convex Exemplar SPP2000, IBM RS/6000 SP P2SC node (120 MHz), NEC SX-4/32, SGI/CRAY T3E, and SGI Origin2000. We also present sustained performance per dollar for the Class B LU, SP and BT benchmarks.

  8. Biotechnology-based odour control: design criteria and performance data.

    PubMed

    Quigley, C; Easter, C; Burrowes, P; Witherspoon, J

    2004-01-01

    As neighbouring areas continue to encroach upon wastewater treatment plants, there is an increasing need for odour control to mitigate potential negative offsite odorous impacts. One technology that is gaining widespread acceptance is biotechnology, which utilises the inherent ability of certain microorganisms to biodegrade offensive odorous compounds. Two main advantages of this form of treatment over other odour control technologies include the absence of hazardous chemicals and relatively low operation and maintenance requirements. The purpose of this paper is to provide information related to odour control design criteria used in sizing/selecting biotechnology-based odour control technologies, and to provide odour removal performance data obtained from several different biotechnology-based odour control systems. CH2M HILL has collected biotechnology-based odour control performance data over the last several years in order to track the continued performance of various biofilters and biotowers over time. Specifically, odour removal performance data have been collected from soil-, organic- and inorganic-media biofilters and inert inorganic media biotowers. Results indicate that biotechnology-based odour control is a viable and consistent technology capable of achieving high removal performance for odour and hydrogen sulphide. It is anticipated that the information presented in this paper will be of interest to anyone involved with odour control technology evaluation/selection or design review. PMID:15484776

  9. Policy design and performance of emissions trading markets: an adaptive agent-based analysis.

    PubMed

    Bing, Zhang; Qinqin, Yu; Jun, Bi

    2010-08-01

    Emissions trading is considered to be a cost-effective environmental economic instrument for pollution control. However, the pilot emissions trading programs in China have failed to bring remarkable success in the campaign for pollution control. The policy design of an emissions trading program is found to have a decisive impact on its performance. In this study, an artificial market for sulfur dioxide (SO2) emissions trading was constructed using an agent-based model. The performance of the Jiangsu SO2 emissions trading market under different policy design scenarios was also examined. Results show that the market efficiency of emissions trading is significantly affected by policy design and existing policies. China's coal-electricity price system is the principal factor influencing the performance of the SO2 emissions trading market. Transaction costs would also reduce market efficiency. In addition, current-level emissions discharge fees/taxes and banking mechanisms do not distinctly affect policy performance. Thus, applying emissions trading to emission control in China should consider policy design and interaction with other existing policies. PMID:20590153

  10. Development of the Computerized Model of Performance-Based Measurement System to Measure Nurses' Clinical Competence.

    PubMed

    Liou, Shwu-Ru; Liu, Hsiu-Chen; Tsai, Shu-Ling; Cheng, Ching-Yu; Yu, Wei-Chieh; Chu, Tsui-Ping

    2016-04-01

    Critical thinking skills and clinical competence are essential for providing quality patient care. The purpose of this study is to develop the Computerized Model of Performance-Based Measurement system based on the Clinical Reasoning Model. The system can evaluate and identify learning needs for clinical competency and be used as a learning tool to increase clinical competency by using computers. The system includes 10 high-risk, high-volume clinical case scenarios coupled with questions testing clinical reasoning, interpersonal, and technical skills. Questions were sequenced to reflect patients' changing condition and arranged by following the process of collecting and managing information, diagnosing and differentiating the urgency of problems, and solving problems. Content validity and known-groups validity were established. The Kuder-Richardson Formula 20 was 0.90 and test-retest reliability was supported (r = 0.78). Nursing educators can use the system to understand students' needs for achieving clinical competence, and therefore educational plans can be made to better prepare students and facilitate their smooth transition to a future clinical environment. Clinical nurses can use the system to evaluate their performance-based abilities and weaknesses in clinical reasoning. Appropriate training programs can be designed and implemented to practically promote nurses' clinical competence and quality of patient care. PMID:26829522

  11. Scenario-Based Multi-Objective Optimum Allocation Model for Earthquake Emergency Shelters Using a Modified Particle Swarm Optimization Algorithm: A Case Study in Chaoyang District, Beijing, China

    PubMed Central

    Zhao, Xiujuan; Xu, Wei; Ma, Yunjia; Hu, Fuyu

    2015-01-01

    The correct location of earthquake emergency shelters and their allocation to residents can effectively reduce the number of casualties by providing safe havens and efficient evacuation routes during the chaotic period of the unfolding disaster. However, diverse and strict constraints and the discrete feasible domain of the required models make the problem of shelter location and allocation more difficult. A number of models have been developed to solve this problem, but there are still large differences between the models and the actual situation because the characteristics of the evacuees and the construction costs of the shelters have been excessively simplified. We report here the development of a multi-objective model for the allocation of residents to earthquake shelters that considers these factors, using the Chaoyang district, Beijing, China as a case study. The two objectives of this model were to minimize the total weighted evacuation time from residential areas to a specified shelter and to minimize the total area of all the shelters. The two constraints were the shelter capacity and the service radius. Three scenarios were considered to estimate the number of people who would need to be evacuated. The particle swarm optimization algorithm was first modified by applying the von Neumann structure in earlier loops and the global structure in later loops, and then used to solve this problem. The results show that increasing the shelter area can produce a large decrease in the total weighted evacuation time from scheme 1 to scheme 9 in scenario A, from scheme 1 to scheme 9 in scenario B, and from scheme 1 to scheme 19 in scenario C. If funding were not a limitation, the final scheme of each scenario would be the best solution; otherwise, the earlier schemes are more reasonable. The modified model proved to be useful for the optimization of shelter allocation, and the result can be used as a scientific reference for planning shelters in the Chaoyang district
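
    The algorithmic modification described above, switching the particle swarm topology from a von Neumann (grid) neighbourhood in earlier loops to the global-best topology in later loops, can be illustrated on a single-objective toy problem as in the Python sketch below. The actual model is multi-objective with capacity and service-radius constraints, so the objective function and every parameter used here are illustrative assumptions.

    ```python
    import numpy as np

    def modified_pso(f, dim=2, n=16, iters=200, switch=100, seed=0):
        """Minimise f with PSO using a von Neumann (4-neighbour grid) topology
        in early iterations and a global-best topology afterwards. Toy,
        single-objective version of the topology switch only."""
        rng = np.random.default_rng(seed)
        side = int(np.sqrt(n))                       # particles on a square grid
        x = rng.uniform(-5.0, 5.0, (n, dim))
        v = np.zeros((n, dim))
        pbest, pval = x.copy(), np.array([f(p) for p in x])
        for t in range(iters):
            if t < switch:                           # von Neumann neighbourhood best
                lbest = np.empty_like(x)
                for i in range(n):
                    r, c = divmod(i, side)
                    nbrs = [i,
                            ((r - 1) % side) * side + c, ((r + 1) % side) * side + c,
                            r * side + (c - 1) % side, r * side + (c + 1) % side]
                    lbest[i] = pbest[nbrs[int(np.argmin(pval[nbrs]))]]
            else:                                    # global best in later loops
                lbest = np.tile(pbest[int(np.argmin(pval))], (n, 1))
            r1, r2 = rng.random((n, dim)), rng.random((n, dim))
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (lbest - x)
            x = x + v
            fx = np.array([f(p) for p in x])
            improved = fx < pval
            pbest[improved], pval[improved] = x[improved], fx[improved]
        return pbest[int(np.argmin(pval))], float(pval.min())

    # Toy usage: minimise the sphere function; the optimum is at the origin.
    best_x, best_f = modified_pso(lambda p: float(np.sum(p ** 2)))
    print(best_x, best_f)
    ```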

  12. Agent Assignment for Process Management: Pattern Based Agent Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Jablonski, Stefan; Talib, Ramzan

    In almost all workflow management systems, the role concept is determined once, at the introduction of the workflow application, and is not reevaluated to observe how successfully certain processes are performed by the authorized agents. This paper describes an approach that evaluates how successfully agents are working and feeds this information back into future agent assignment to achieve maximum business benefit for the enterprise. The approach, called Pattern-based Agent Performance Evaluation (PAPE), is based on machine learning techniques combined with post-processing techniques. We report on the results of our experiments and discuss issues and improvements of our approach.

  13. Performance-Based Technology Selection Filter description report

    SciTech Connect

    O'Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process where technologies and systems are rated and assessments made based on performance measures, and regulatory and technical requirements. The results are auditable, and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  14. Lunar transportation scenarios utilising the Space Elevator

    NASA Astrophysics Data System (ADS)

    Engel, Kilian A.

    2005-07-01

    The Space Elevator (SE) concept has begun to receive an increasing amount of attention within the space community over the past couple of years and is no longer widely dismissed as pure science fiction. In light of the renewed interest in a possibly sustained human presence on the Moon, and the fact that transportation and logistics form the bottleneck of many conceivable lunar missions, it is interesting to investigate what role the SE could eventually play in implementing an efficient Earth to Moon transportation system. The elevator allows vehicles to ascend from Earth and be injected into a trans-lunar trajectory without the use of chemical thrusters, thus eliminating gravity loss, aerodynamic loss and the need for high-thrust multistage launch systems. Such a system therefore promises substantial savings of propellant and structural mass and could greatly increase the efficiency of Earth to Moon transportation. This paper analyzes different elevator-based trans-lunar transportation scenarios and characterizes them in terms of a number of benchmark figures. The transportation scenarios include direct elevator-launched trans-lunar trajectories, elevator-launched trajectories via L1 and L2, as well as launch from an Earth-based elevator and subsequent rendezvous with lunar elevators placed either on the near or on the far side of the Moon. The benchmark figures by which the different transfer options are characterized and evaluated include release radius (RR), required Δv, and transfer times, as well as other factors such as accessibility of different lunar latitudes, frequency of launch opportunities and mission complexity. The performances of the different lunar transfer options are compared with each other as well as with the performance of conventional mission concepts, represented by Apollo.

  15. Lunar transportation scenarios utilising the Space Elevator.

    PubMed

    Engel, Kilian A

    2005-01-01

    The Space Elevator (SE) concept has begun to receive an increasing amount of attention within the space community over the past couple of years and is no longer widely dismissed as pure science fiction. In light of the renewed interest in a possibly sustained human presence on the Moon, and the fact that transportation and logistics form the bottleneck of many conceivable lunar missions, it is interesting to investigate what role the SE could eventually play in implementing an efficient Earth to Moon transportation system. The elevator allows vehicles to ascend from Earth and be injected into a trans-lunar trajectory without the use of chemical thrusters, thus eliminating gravity loss, aerodynamic loss and the need for high-thrust multistage launch systems. Such a system therefore promises substantial savings of propellant and structural mass and could greatly increase the efficiency of Earth to Moon transportation. This paper analyzes different elevator-based trans-lunar transportation scenarios and characterizes them in terms of a number of benchmark figures. The transportation scenarios include direct elevator-launched trans-lunar trajectories, elevator-launched trajectories via L1 and L2, as well as launch from an Earth-based elevator and subsequent rendezvous with lunar elevators placed either on the near or on the far side of the Moon. The benchmark figures by which the different transfer options are characterized and evaluated include release radius (RR), required delta-v, and transfer times, as well as other factors such as accessibility of different lunar latitudes, frequency of launch opportunities and mission complexity. The performances of the different lunar transfer options are compared with each other as well as with the performance of conventional mission concepts, represented by Apollo. PMID:16010760

  16. Model-based approach for elevator performance estimation

    NASA Astrophysics Data System (ADS)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field-oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs a Kalman filter as an observer. The Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated based on the estimated car acceleration. The proposed procedure is experimentally evaluated by comparing the key performance indicators calculated from the estimated car acceleration with the values obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with a sliding mode observer.
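
    A minimal version of the observer step, estimating car acceleration from encoder position samples with a discrete Kalman filter, is sketched below in Python. It assumes a simple constant-acceleration kinematic model with process noise; the paper's observer is built on the full electromechanical state-space model and also uses the machine control signature, so the matrices and noise levels here are illustrative only.

    ```python
    import numpy as np

    def kalman_acceleration(z, dt, q=1e-2, r=1e-4):
        """Estimate position, velocity and acceleration from noisy position
        samples `z` using a 3-state kinematic Kalman filter (illustrative
        stand-in for the paper's full electromechanical observer)."""
        A = np.array([[1, dt, 0.5 * dt**2],
                      [0, 1, dt],
                      [0, 0, 1]])
        H = np.array([[1.0, 0.0, 0.0]])
        Q, R = q * np.eye(3), np.array([[r]])
        x, P = np.zeros(3), np.eye(3)
        acc = []
        for zk in z:
            x, P = A @ x, A @ P @ A.T + Q                      # predict
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)                     # Kalman gain
            x = x + (K @ (np.atleast_1d(zk) - H @ x)).ravel()  # update
            P = (np.eye(3) - K @ H) @ P
            acc.append(x[2])
        return np.array(acc)

    # Toy usage: a ride with constant 0.8 m/s^2 acceleration, noisy encoder.
    dt = 0.01
    t = np.arange(0, 2, dt)
    pos = 0.5 * 0.8 * t**2 + 1e-3 * np.random.randn(t.size)
    print(kalman_acceleration(pos, dt)[-1])   # should settle near 0.8
    ```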

  17. Examining the ethical and social issues of health technology design through the public appraisal of prospective scenarios: a study protocol describing a multimedia-based deliberative method

    PubMed Central

    2014-01-01

    Background The design of health technologies relies on assumptions that affect how they will be implemented, such as intended use, complexity, impact on user autonomy, and appropriateness. Those who design and implement technologies make several ethical and social assumptions on behalf of users and society more broadly, but there are very few tools to examine prospectively whether such assumptions are warranted and how the public define and appraise the desirability of health innovations. This study protocol describes a three-year study that relies on a multimedia-based prospective method to support public deliberations that will enable a critical examination of the social and ethical issues of health technology design. Methods The first two steps of our mixed-method study were completed: relying on a literature review and the support of our multidisciplinary expert committee, we developed scenarios depicting social and technical changes that could unfold in three thematic areas within a 25-year timeframe; and for each thematic area, we created video clips to illustrate prospective technologies and short stories to describe their associated dilemmas. Using this multimedia material, we will: conduct four face-to-face deliberative workshops with members of the public (n = 40) who will later join additional participants (n = 25) through an asynchronous online forum; and analyze and integrate three data sources: observation, group deliberations, and a self-administered participant survey. Discussion This study protocol will be of interest to those who design and assess public involvement initiatives and to those who examine the implementation of health innovations. Our premise is that using user-friendly tools in a deliberative context that foster participants’ creativity and reflexivity in pondering potential technoscientific futures will enable our team to analyze a range of normative claims, including some that may prove problematic and others that may

  18. Building performance-based accountability with limited empirical evidence: performance measurement for public health preparedness.

    PubMed

    Shelton, Shoshana R; Nelson, Christopher D; McLees, Anita W; Mumford, Karen; Thomas, Craig

    2013-08-01

    Efforts to respond to performance-based accountability mandates for public health emergency preparedness have been hindered by a weak evidence base linking preparedness activities with response outcomes. We describe an approach to measure development that was successfully implemented in the Centers for Disease Control and Prevention Public Health Emergency Preparedness Cooperative Agreement. The approach leverages insights from process mapping and experts to guide measure selection, and provides mechanisms for reducing performance-irrelevant variation in measurement data. Also, issues are identified that need to be addressed to advance the science of measurement in public health emergency preparedness. PMID:24229520

  19. The Motivational Impact of School-Based Performance Awards.

    ERIC Educational Resources Information Center

    Kelley, Carolyn

    1999-01-01

    Examines the ways in which school-based performance award (SBPA) programs motivate teachers to modify or improve teaching practice. Qualitative and survey data from Kentucky, North Carolina, Colorado, and Maryland suggest that SBPA programs motivate teachers largely by creating conditions that increase intrinsic rewards and focus teacher efforts.…

  20. Planning and Implementing a High Performance Knowledge Base.

    ERIC Educational Resources Information Center

    Cortez, Edwin M.

    1999-01-01

    Discusses the conceptual framework for developing a rapid-prototype high-performance knowledge base for the four mission agencies of the United States Department of Agriculture and their university partners. Describes the background of the project and methods used for establishing the requirements; examines issues and problems surrounding semantic…

  1. The Authentic Performance-Based Assessment of Problem Solving.

    ERIC Educational Resources Information Center

    Curtis, David; Denton, Rob

    A new authentic performance-based approach to assessing problem solving was developed for use in vocational education and other programs in Australia. The process of developing the problem-solving assessment instrument and process included the following phases: (1) exploration of the theoretical conceptions of problem solving; (2) identification…

  2. Use and Performances of Web-Based Portfolio Assessment

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Tseng, Kuo-Hung

    2009-01-01

    This research explored the influence of a Web-based portfolio assessment system on students' performances. The methodological procedure adopted was to have the experimental group use the system, with the control group using conventional assessment. The study subjects were junior high school students of two computer classes. The experimental…

  3. School-Based Management: Organizing for High Performance.

    ERIC Educational Resources Information Center

    Mohrman, Susan Albers, Ed.; And Others

    School-based management (SBM) has gained popularity as a method for local school participants to improve their schools. As yet, however, there is little empirical evidence supporting a link between SBM and improved school performance. This book examines the SBM strategies that hold the most promise for increasing organizational effectiveness…

  4. 48 CFR 52.232-32 - Performance-Based Payments.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... accomplished in accordance with the terms of the contract. The Contracting Officer may, at any time, require... percentage. (2) If at any time the amount of payments under this contract exceeds any limitation in this... entitlement to performance-based payments during any time the Contractor's records or controls are...

  5. Teacher Motivation and School-Based Performance Awards.

    ERIC Educational Resources Information Center

    Kelley, Carolyn; Heneman, Herbert, III; Milanowski, Anthony

    2002-01-01

    Summarizes findings from series of research studies on the motivational effects of school-based performance award (SBPA) programs in Kentucky and the Charlotte-Mecklenburg (North Carolina) School District. Finds that teachers associate various positive and negative outcomes with the programs. Draws several implications for design and…

  6. School-Based Performance Awards: Research Findings and Future Directions.

    ERIC Educational Resources Information Center

    Kelley, Carolyn; Heneman, Herbert, III; Milanowski, Anthony

    This paper synthesizes research on how motivation influenced teachers at two school-based performance award (SBPA) programs in Kentucky and in North Carolina. The research was conducted between 1995 and 1998 by the Consortium for Policy Research in Education. SBPA programs provide teachers and other school staff with pay bonuses for the…

  7. 48 CFR 970.1100-1 - Performance-based contracting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Performance-based contracting. 970.1100-1 Section 970.1100-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY AGENCY SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS Describing Agency Needs 970.1100-1...

  8. 48 CFR 970.1100-1 - Performance-based contracting.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Performance-based contracting. 970.1100-1 Section 970.1100-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY AGENCY SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS Describing Agency Needs 970.1100-1...

  9. 48 CFR 970.1100-1 - Performance-based contracting.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Performance-based contracting. 970.1100-1 Section 970.1100-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY AGENCY SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS Describing Agency Needs 970.1100-1...

  10. Critical Arts-Based Research in Education: Performing Undocumented Historias

    ERIC Educational Resources Information Center

    Bagley, Carl; Castro-Salazar, Ricardo

    2012-01-01

    The article seeks to elucidate and academically position the genre of critical arts-based research in education. The article fuses Critical Race Theory (CRT), life history and performance, alongside work with undocumented American students of Mexican origin, to show how a politicised qualitative paradigmatic re-envisioning can occur in which…

  11. The Evolution of Performance Based Teacher Education Programs.

    ERIC Educational Resources Information Center

    Aubertine, Horace E.

    This document is a discussion of a systemized approach to education theory and practice, especially as it applies to performance-based teacher education. The author uses as the basis of his discussion the physical sciences and their use of approximation models (an illustration of this use is the historical development of the description of matter…

  12. Internal Structure of DISCOVER: A Performance-based Assessment.

    ERIC Educational Resources Information Center

    Sarouphim, Ketty M.

    2000-01-01

    A study involving 257 Navajo and Mexican-American elementary students investigated the internal structure of the DISCOVER assessment, a performance-based assessment grounded in Gardner's theory of multiple intelligences. Results showed low intercorrelations among the five assessment activities, indicating students gifted in one intelligence…

  13. Advanced Organic Permeable-Base Transistor with Superior Performance.

    PubMed

    Klinger, Markus P; Fischer, Axel; Kaschura, Felix; Scholz, Reinhard; Lüssem, Björn; Kheradmand-Boroujeni, Bahman; Ellinger, Frank; Kasemann, Daniel; Leo, Karl

    2015-12-16

    An optimized vertical organic permeable-base transistor (OPBT) competing with the best organic field-effect transistors in performance, while employing low-cost fabrication techniques, is presented. The OPBT stands out by its excellent power efficiency at the highest frequencies. PMID:26484500

  14. Teachers' Reactions towards Performance-Based Language Assessment

    ERIC Educational Resources Information Center

    Chinda, Bordin

    2014-01-01

    This research aims at examining the reactions of tertiary EFL teachers towards the use of performance-based language assessment. The study employed a mixed-method research methodology. For the quantitative method, 36 teachers responded to a questionnaire survey. In addition, four teachers participated in the in-depth interviews which were…

  15. 48 CFR 970.1100-1 - Performance-based contracting.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Performance-based contracting. 970.1100-1 Section 970.1100-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY AGENCY SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS Describing Agency Needs 970.1100-1...

  16. Performance-Based Measurement: Action for Organizations and HPT Accountability

    ERIC Educational Resources Information Center

    Larbi-Apau, Josephine A.; Moseley, James L.

    2010-01-01

    Basic measurements and applications of six selected general but critical operational performance-based indicators--effectiveness, efficiency, productivity, profitability, return on investment, and benefit-cost ratio--are presented. With each measurement, goals and potential impact are explored. Errors, risks, limitations to measurements, and a…

  17. Begging the Question: Performativity and Studio-Based Research

    ERIC Educational Resources Information Center

    Petelin, George

    2014-01-01

    The requirement that candidates in studio-based or practice-led higher degrees by research should formulate a research question has been found to be problematic by some writers. The present article argues that this stance, particularly as it is articulated by proponents of the influential category of "performative research" (Haseman,…

  18. Energy Conservation in the Home. Performance Based Lesson Plans.

    ERIC Educational Resources Information Center

    Alabama State Dept. of Education, Montgomery. Home Economics Service.

    These ten performance-based lesson plans concentrate on tasks related to energy conservation in the home. They are (1) caulk cracks, holes, and joints; (2) apply weatherstripping to doors and windows; (3) add plastic/solar screen window covering; (4) arrange furniture for saving energy; (5) set heating/cooling thermostat; (6) replace faucet…

  19. A Performance-Based Assessment for Limiting Reactants

    ERIC Educational Resources Information Center

    Walker, Joi Phelps; Sampson, Victor; Zimmerman, Carol O.; Grooms, Jonathon A.

    2011-01-01

    Educators are increasingly being called upon to provide evidence of student learning. Traditional assessments are not always the best venue for demonstrating conceptual understanding, particularly in science. This paper presents details on the design, use, and scoring of a performance-based assessment for measuring student understanding of…

  20. Faculty Commitment to Performance Based Funding for Academic Programs

    ERIC Educational Resources Information Center

    Sandiford, Janice R.; Montoya, Rolando, Jr.

    2005-01-01

    Higher education institutions receiving public financial support are accountable to the governmental bodies providing their funding. The current accountability movement has generated demands for greater effectiveness and efficiency from public higher education institutions. A recent manifestation of this movement is performance-based funding that…

  1. Leading Instructional Practices in a Performance-Based System

    ERIC Educational Resources Information Center

    Kauble, Anna; Wise, Donald

    2015-01-01

    Given the shift to Common Core, educational leaders are challenged to see new directions in teaching and learning. The purpose of this study was to investigate the instructional practices which may be related to the effectiveness of a performance-based system (PBS) and their impact on student achievement, as part of a thematic set of dissertations…

  2. The Hidden Curriculum of Performance-Based Teacher Education

    ERIC Educational Resources Information Center

    Rennert-Ariev, Peter

    2008-01-01

    Purpose/Objective/Research question/Focus of study: This study describes and analyzes the student and faculty experiences of a "performance-based" preservice teacher education program at a large comprehensive university in the mid-Atlantic region. The aim is to understand the "hidden" curricular messages within the program and the ways that these…

  3. Performance Evaluation in Network-Based Parallel Computing

    NASA Technical Reports Server (NTRS)

    Dezhgosha, Kamyar

    1996-01-01

    Network-based parallel computing is emerging as a cost-effective alternative for solving many problems which require the use of supercomputers or massively parallel computers. The primary objective of this project has been to conduct experimental research on performance evaluation for clustered parallel computing. First, a testbed was established by augmenting our existing SUN SPARC network with PVM (Parallel Virtual Machine), which is a software system for linking clusters of machines. Second, a set of three basic applications was selected. The applications consist of a parallel search, a parallel sort, and a parallel matrix multiplication. These application programs were implemented in the C programming language under PVM. Third, we conducted performance evaluation under various configurations and problem sizes. Alternative parallel computing models and workload allocations for the application programs were explored. The performance metric was limited to elapsed time or response time, which in the context of parallel computing can be expressed in terms of speedup. The results reveal that the overhead of communication latency between processes is in many cases the restricting factor to performance. That is, coarse-grain parallelism, which requires less frequent communication between processes, will result in higher performance in network-based computing. Finally, we are in the final stages of installing an Asynchronous Transfer Mode (ATM) switch and four ATM interfaces (each 155 Mbps), which will allow us to extend our study to newer applications, performance metrics, and configurations.
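
    Since the study's performance metric is elapsed time expressed as speedup, the relationship can be stated in a few lines of Python; the helper below is an illustrative aid, not part of the project's code.

    ```python
    def speedup_report(t_serial, t_parallel, workers):
        """Speedup (serial time / parallel time) and parallel efficiency
        (speedup / number of workers) from measured elapsed times."""
        s = t_serial / t_parallel
        return {"speedup": s, "efficiency": s / workers}

    # e.g. a parallel sort taking 12.4 s on one node and 3.8 s on four nodes
    print(speedup_report(12.4, 3.8, 4))   # ~3.26x speedup, ~0.82 efficiency
    ```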

  4. Design and performance comparison of fuzzy logic based tracking controllers

    NASA Technical Reports Server (NTRS)

    Lea, Robert N.; Jani, Yashvant

    1992-01-01

    Several camera tracking controllers based on fuzzy logic principles have been designed and tested in software simulation in the Software Technology Branch at the Johnson Space Center. The fuzzy logic based controllers utilize range measurements and pixel positions from the image as input parameters and provide pan and tilt gimbal rate commands as output. Two designs of the rule base, and the tuning process applied to the membership functions, are discussed in light of optimizing performance. Seven test cases have been designed to test the performance of the controllers for proximity operations where approaches like v-bar, fly-around and station keeping are performed. The controllers are compared in terms of responsiveness and ability to maintain the object in the field of view of the camera. Advantages of the fuzzy logic approach with respect to the conventional approach are discussed in terms of simplicity and robustness.
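
    To make the notion of a fuzzy rule base concrete, the Python sketch below maps a horizontal pixel error of the tracked object to a pan gimbal rate command using triangular membership functions and weighted-average defuzzification. The membership functions, rules and gains are invented for illustration and are not the controllers evaluated in the simulations described above.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with peak at b over [a, c]."""
        return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

    def pan_rate(pixel_error, max_rate=5.0):
        """Toy fuzzy rule base: horizontal pixel error -> pan gimbal rate
        command (deg/s). Rules and gains are illustrative assumptions."""
        mu = {"neg": tri(pixel_error, -320, -160, 0),
              "zero": tri(pixel_error, -160, 0, 160),
              "pos": tri(pixel_error, 0, 160, 320)}
        out = {"neg": -max_rate, "zero": 0.0, "pos": max_rate}   # rule outputs
        num = sum(mu[k] * out[k] for k in mu)
        den = sum(mu.values()) or 1.0
        return num / den                      # weighted-average defuzzification

    print(pan_rate(80.0))    # object right of centre -> moderate positive rate
    print(pan_rate(-200.0))  # far left -> strong negative rate
    ```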

  5. Fundamental performance improvement to dispersive spectrograph based imaging technologies

    NASA Astrophysics Data System (ADS)

    Meade, Jeff T.; Behr, Bradford B.; Cenko, Andrew T.; Christensen, Peter; Hajian, Arsen R.; Hendrikse, Jan; Sweeney, Frederic D.

    2011-03-01

    Dispersive spectrometers may be characterized by their spectral resolving power and their throughput efficiency. A device known as a virtual slit is able to improve the resolving power by a factor of several with minimal loss in throughput, thereby fundamentally improving the quality of the spectrometer. A virtual slit was built and incorporated into a low-performing spectrometer (R ~ 300) and was shown to increase the performance without a significant loss in signal. The operation and description of virtual slits are also given. High-performance, low-light, and high-speed imaging instruments based on a dispersive spectrometer see the greatest impact from a virtual slit. The impact of a virtual slit on spectral domain optical coherence tomography (SD-OCT) is shown to improve the imaging quality substantially.

  6. Implementation of evidence-based practice and organizational performance.

    PubMed

    Hovmand, Peter S; Gillespie, David F

    2010-01-01

    Administrators of mental health services may expect evidence-based practice (EBP) to offer strategic benefits. Existing theory suggests that the benefits of implementing EBP vary by organizational characteristics. This paper presents a conceptual framework for considering how implementation impacts organizational performance. The framework is developed as a system dynamics simulation model based on existing literature, organizational theory, and key informant interviews with mental health services administrators and clinical directors. Results from the simulations show how gains in performance depended on organizations' initial inertia and initial efficiency and that only the most efficient organizations may see benefits in organizational performance from implementing EBP. Implications for administrators, policy makers, and services researchers are discussed. PMID:19085109

  7. Performance.

    PubMed

    Chambers, David W

    2006-01-01

    High performance is difficult to maintain because it is dynamic and not well understood. Based on a synthesis of many sources, a model is proposed where performance is a function of the balance between capacity and challenge. Too much challenge produces coping (or a crash); excess capacity results in boredom. Over time, peak performance drifts toward boredom. Performance can be managed by adjusting our level of ability, our effort, the opportunity to perform, and the challenge we agree to take on. Coping, substandard but acceptable performance, is common among professionals and its long-term side effects can be debilitating. A crash occurs when coping mechanisms fail. PMID:17020177

  8. Europa Explorer Operational Scenarios Development

    NASA Technical Reports Server (NTRS)

    Lock, Robert E.; Pappalardo, Robert T.; Clark, Karla B.

    2008-01-01

    In 2007, NASA conducted four advanced mission concept studies for outer planets targets: Europa, Ganymede, Titan and Enceladus. The studies were conducted in close cooperation with the planetary science community. Of the four, the Europa Explorer Concept Study focused on refining mission options, science trades and implementation details for a potential flagship mission to Europa in the 2015 timeframe. A science definition team (SDT) was appointed by NASA to guide the study. A JPL-led engineering team worked closely with the science team to address 3 major focus areas: 1) credible cost estimates, 2) rationale and logical discussion of radiation risk and mitigation approaches, and 3) better definition and exploration of science operational scenario trade space. This paper will address the methods and results of the collaborative process used to develop Europa Explorer operations scenarios. Working in concert with the SDT, and in parallel with the SDT's development of a science value matrix, key mission capabilities and constraints were challenged by the science and engineering members of the team. Science goals were advanced and options were considered for observation scenarios. Data collection and return strategies were tested via simulation, and mission performance was estimated and balanced with flight and ground system resources and science priorities. The key to this successful collaboration was a concurrent development environment in which all stakeholders could rapidly assess the feasibility of strategies for their success in the full system context. Issues of science and instrument compatibility, system constraints, and mission opportunities were treated analytically and objectively leading to complementary strategies for observation and data return. Current plans are that this approach, as part of the system engineering process, will continue as the Europa Explorer Concept Study moves toward becoming a development project.

  9. Performance evaluation of wavelet-based face verification on a PDA recorded database

    NASA Astrophysics Data System (ADS)

    Sellahewa, Harin; Jassim, Sabah A.

    2006-05-01

    The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera that can capture both still and streaming video clips, and with a touch-sensitive display panel. Besides convenience, such devices provide an adequate secure infrastructure for sensitive and financial transactions by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios where communication from beyond enemy lines is essential to save soldier and civilian lives. In areas of conflict or disaster the luxury of fixed infrastructure is not available or is destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.

  10. Using a Web-Based Portfolio Assessment System to Elevate Project-Based Learning Performances

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Tseng, Kuo-Hung

    2011-01-01

    This study examines the effect of a Web-based portfolio assessment system on the performances of students undertaking project-based learning (PBL). The research targets were 60 students from two grade-8 classes taking senior high school computer courses. The experimental group comprised 30 students, who used the system to perform PBL and…

  11. Performability modeling based on real data: A case study

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, R. K.; Trivedi, K. S.

    1987-01-01

    Described is a measurement-based performability model based on error and resource usage data collected on a multiprocessor system. A method for identifying the model structure is introduced and the resulting model is validated against real data. Model development from the collection of raw data to the estimation of the expected reward is described. Both normal and error behavior of the system are characterized. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different types of errors.
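
    The modelling recipe sketched in the abstract, a semi-Markov process over operational and error states with a per-state reward rate, can be summarized as follows: the expected reward rate is the reward in each state weighted by the fraction of time spent there, which in turn is the embedded-chain stationary probability weighted by the mean holding time. The Python sketch below computes that quantity; the transition matrix, holding times and reward rates are invented examples, not the measured multiprocessor data.

    ```python
    import numpy as np

    def semi_markov_reward(P, mean_hold, reward):
        """Steady-state expected reward rate of a semi-Markov process.
        P is the embedded-chain transition matrix, `mean_hold` the mean
        holding time per state, `reward` the reward rate per state."""
        n = P.shape[0]
        # Stationary distribution of the embedded chain: pi P = pi, sum(pi) = 1.
        A = np.vstack([P.T - np.eye(n), np.ones(n)])
        b = np.append(np.zeros(n), 1.0)
        pi = np.linalg.lstsq(A, b, rcond=None)[0]
        # Fraction of time in each state weights the embedded chain by holding time.
        time_frac = pi * mean_hold / np.dot(pi, mean_hold)
        return float(np.dot(time_frac, reward))

    # Toy 3-state model: normal operation, degraded (error) state, recovery.
    P = np.array([[0.0, 0.9, 0.1],
                  [0.7, 0.0, 0.3],
                  [1.0, 0.0, 0.0]])
    mean_hold = np.array([120.0, 5.0, 1.0])   # seconds per visit
    reward = np.array([1.0, 0.4, 0.0])        # relative service rate
    print(semi_markov_reward(P, mean_hold, reward))
    ```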

  12. High performance network and channel-based storage

    NASA Technical Reports Server (NTRS)

    Katz, Randy H.

    1991-01-01

    In the traditional mainframe-centered view of a computer system, storage devices are coupled to the system through complex hardware subsystems called input/output (I/O) channels. With the dramatic shift towards workstation-based computing, and its associated client/server model of computation, storage facilities are now found attached to file servers and distributed throughout the network. We discuss the underlying technology trends that are leading to high performance network-based storage, namely advances in networks, storage devices, and I/O controller and server architectures. We review several commercial systems and research prototypes that are leading to a new approach to high performance computing based on network-attached storage.

  13. A measurement-based performability model for a multiprocessor system

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, Ravi K.; Trivedi, K. S.

    1987-01-01

    A measurement-based performability model based on real error data collected on a multiprocessor system is described. Model development from the raw error data to the estimation of cumulative reward is described. Both normal and failure behavior of the system are characterized. The measured data show that the holding times in key operational and failure states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different failure types and recovery procedures.

  14. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation

    PubMed Central

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-01-01

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011. PMID:27161897

  15. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation.

    PubMed

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-01-01

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011. PMID:27161897

  16. Integrated Ring Oscillators based on high-performance Graphene Inverters

    PubMed Central

    Schall, Daniel; Otto, Martin; Neumaier, Daniel; Kurz, Heinrich

    2013-01-01

    The road to the realization of complex integrated circuits based on graphene remains an open issue so far. Current graphene-based integrated circuits are limited by low integration depth and significant doping variations, representing major roadblocks for the success of graphene in future electronic devices. Here we report on the realization of graphene-based integrated inverters and ring oscillators. By using an optimized process technology for high-performance graphene transistors with local back-gate electrodes, we demonstrate that complex graphene-based integrated circuits can be manufactured reproducibly, circumventing problems associated with doping variations. The fabrication process developed here is scalable and fully compatible with conventional silicon technology. Therefore, our results pave the way towards applications based on graphene transistors in future electronic devices. PMID:24005257

  17. Application and Evaluation of Control Modes for Risk-Based Engine Performance Enhancements

    NASA Technical Reports Server (NTRS)

    Liu, Yuan; Litt, Jonathan S.; Sowers, T. Shane; Owen, A. Karl (Compiler); Guo, Ten-Huei

    2014-01-01

    The engine control system for civil transport aircraft imposes operational limits on the propulsion system to ensure compliance with safety standards. However, during certain emergency situations, aircraft survivability may benefit from engine performance beyond its normal limits despite the increased risk of failure. Accordingly, control modes were developed to improve the maximum thrust output and responsiveness of a generic high-bypass turbofan engine. The algorithms were designed such that the enhanced performance would always constitute an elevation in failure risk to a consistent predefined likelihood. This paper presents an application of these risk-based control modes to a combined engine/aircraft model. Through computer and piloted simulation tests, the aim is to present a notional implementation of these modes, evaluate their effects on a generic airframe, and demonstrate their usefulness during emergency flight situations. Results show that minimal control effort is required to compensate for the changes in flight dynamics due to control mode activation. The benefits gained from enhanced engine performance for various runway incursion scenarios are investigated. Finally, the control modes are shown to protect against potential instabilities during propulsion-only flight where all aircraft control surfaces are inoperable.

  18. Application and Evaluation of Control Modes for Risk-Based Engine Performance Enhancements

    NASA Technical Reports Server (NTRS)

    Liu, Yuan; Litt, Jonathan S.; Sowers, T. Shane; Owen, A. Karl; Guo, Ten-Huei

    2015-01-01

    The engine control system for civil transport aircraft imposes operational limits on the propulsion system to ensure compliance with safety standards. However, during certain emergency situations, aircraft survivability may benefit from engine performance beyond its normal limits despite the increased risk of failure. Accordingly, control modes were developed to improve the maximum thrust output and responsiveness of a generic high-bypass turbofan engine. The algorithms were designed such that the enhanced performance would always constitute an elevation in failure risk to a consistent predefined likelihood. This paper presents an application of these risk-based control modes to a combined engine/aircraft model. Through computer and piloted simulation tests, the aim is to present a notional implementation of these modes, evaluate their effects on a generic airframe, and demonstrate their usefulness during emergency flight situations. Results show that minimal control effort is required to compensate for the changes in flight dynamics due to control mode activation. The benefits gained from enhanced engine performance for various runway incursion scenarios are investigated. Finally, the control modes are shown to protect against potential instabilities during propulsion-only flight where all aircraft control surfaces are inoperable.

  19. EXAMPLE EXPOSURE SCENARIOS ASSESSMENT TOOL

    EPA Science Inventory

    Exposure scenarios are a tool to help the assessor develop estimates of exposure, dose, and risk. An exposure scenario generally includes facts, data, assumptions, inferences, and sometimes professional judgment about how the exposure takes place. The human physiological and beh...

  20. Demonstration of ITER Operational Scenarios on DIII-D

    SciTech Connect

    Doyle, E J; Budny, R V; DeBoo, J C; Ferron, J R; Jackson, G L; Luce, T C; Murakami, M; Osborne, T H; Park, J; Politzer, P A; Reimerdes, H; Casper, T A; Challis, C D; Groebner, R J; Holcomb, C T; Hyatt, A W; La Haye, R J; McKee, G R; Petrie, T W; Petty, C C; Rhodes, T L; Shafer, M W; Snyder, P B; Strait, E J; Wade, M R; Wang, G; West, W P; Zeng, L

    2008-10-13

    The DIII-D program has recently initiated an effort to provide suitably scaled experimental evaluations of four primary ITER operational scenarios. New and unique features of this work are that the plasmas incorporate essential features of the ITER scenarios and anticipated operating characteristics; e.g., the plasma cross-section, aspect ratio and value of I/aB of the DIII-D discharges match the ITER design, with size reduced by a factor of 3.7. Key aspects of all four scenarios, such as target values for {beta}{sub N} and H{sub 98}, have been replicated successfully on DIII-D, providing an improved and unified physics basis for transport and stability modeling, as well as for performance extrapolation to ITER. In all four scenarios normalized performance equals or closely approaches that required to realize the physics and technology goals of ITER, and projections of the DIII-D discharges are consistent with ITER achieving its goals of {ge} 400 MW of fusion power production and Q {ge} 10. These studies also address many of the key physics issues related to the ITER design, including the L-H transition power threshold, the size of ELMs, pedestal parameter scaling, the impact of tearing modes on confinement and disruptivity, beta limits and the required capabilities of the plasma control system. An example of direct influence on the ITER design from this work is a modification of the specified operating range in internal inductance at 15 MA for the poloidal field coil set, based on observations that the measured inductance in the baseline scenario case lay outside the original ITER specification.