Science.gov

Sample records for scenario based performance

  1. Mars base buildup scenarios

    SciTech Connect

    Blacic, J.D.

    1985-01-01

    Two surface base build-up scenarios are presented in order to help visualize the mission and to serve as a basis for trade studies. In the first scenario, direct manned landings on the Martian surface occur early in the missions and scientific investigation is the main driver and rationale. In the second scenario, early development of an infrastructure to exploite the volatile resources of the Martian moons for economic purposes is emphasized. Scientific exploration of the surface is delayed at first, but once begun develops rapidly aided by the presence of a permanently manned orbital station.

  2. Mars base buildup scenarios

    NASA Technical Reports Server (NTRS)

    Blacic, J. D.

    1986-01-01

    Two Mars surface base build-up scenarios are presented in order to help visualize the mission and to serve as a basis for trade studies. In the first scenario, direct manned landings on the Martian surface occur early in the missions and scientific investigation is the main driver and rationale. In the second scenario, early development of an infrastructure to exploit the volatile resources of the Martian moons for economic purposes is emphasized. Scientific exploration of the surface is delayed at first in this scenario relative to the first, but once begun develops rapidly, aided by the presence of a permanently manned orbital station.

  3. Base case and perturbation scenarios

    SciTech Connect

    Edmunds, T

    1998-10-01

    This report describes fourteen energy factors that could affect electricity markets in the future (demand, prices, source mix, etc.). These fourteen factors are believed to have the most influence on the State's energy environment. A base case, or most probable, characterization is given for each of these fourteen factors over a twenty-year time horizon. The base case characterization is derived from quantitative and qualitative information provided by State of California government agencies, where possible. Federal government databases are used where needed to supplement the California data. It is envisioned that an initial selection of issue areas will be based upon an evaluation of them under base case conditions. For most of the fourteen factors, the report identifies possible perturbations from base case values or assumptions that may be used to construct additional scenarios. Only those perturbations that are plausible and would have a significant effect on energy markets are included in the table. The fourteen factors and potential perturbations of the factors are listed in Table 1.1. These perturbations can be combined to generate internally consistent combinations of perturbations relative to the base case. For example, a low natural gas price perturbation should be combined with a high natural gas demand perturbation. The factor perturbations are based upon alternative quantitative forecasts provided by other institutions (the Department of Energy - Energy Information Administration in some cases), changes in assumptions that drive the quantitative forecasts, or changes in assumptions about the structure of the California energy markets. The perturbations are intended to be used for a qualitative reexamination of issue areas after an initial evaluation under the base case. The perturbation information would be used as a "tiebreaker" to make decisions regarding those issue areas that were marginally accepted or rejected under the base case. If a
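The report's rule for combining perturbations into internally consistent scenarios can be sketched as a simple filter over factor combinations. The factor names and the single anti-correlation rule below are illustrative stand-ins, not the actual contents of Table 1.1:

```python
# Screen factor perturbations for internal consistency (hypothetical
# factors and rule; the report's real factor list is in its Table 1.1).
import itertools

levels = {"gas_price": ["low", "base", "high"],
          "gas_demand": ["low", "base", "high"]}

def consistent(combo):
    """Low prices must pair with high demand, and vice versa."""
    order = {"low": 0, "base": 1, "high": 2}
    return order[combo["gas_price"]] + order[combo["gas_demand"]] == 2

combos = [dict(zip(levels, values))
          for values in itertools.product(*levels.values())]
scenarios = [c for c in combos if consistent(c)]
print(len(combos), len(scenarios))  # 9 raw combinations, 3 consistent ones
```

Only combinations whose price and demand levels are anti-correlated (or both at base) survive, mirroring the report's example of pairing a low natural gas price with high natural gas demand.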

  4. Performance analysis of communication links based on VCSEL and silicon photonics technology for high-capacity data-intensive scenario.

    PubMed

    Boletti, A; Boffi, P; Martelli, P; Ferrario, M; Martinelli, M

    2015-01-26

    To face the increased demand for bandwidth, cost-effectiveness and simplicity of future Ethernet data communications, a comparison between two different solutions, based on directly modulated VCSEL sources and on Silicon Photonics technology, is carried out. By exploiting 4-PAM modulation, the transmission of 50-Gb/s and beyond capacity per channel is analyzed by means of BER performance. Applications for optical backplanes, very short reach links, client-optics networks, and intra- and inter-data-center communications (up to 10 km) are taken into account. A comparative analysis based on power consumption is also proposed.
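The bandwidth argument behind 4-PAM is simple to state in code: carrying two bits per symbol halves the symbol rate needed for a given bit rate. This is a generic illustration of the modulation trade-off, not the paper's specific link model:

```python
# 4-PAM carries 2 bits/symbol, so a 50 Gb/s channel needs only a 25 GBd
# symbol rate versus 50 GBd for binary NRZ/OOK signaling.
GRAY_4PAM = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}  # Gray mapping

def required_baud(bit_rate_gbps: float, bits_per_symbol: int) -> float:
    """Symbol rate (GBd) needed to carry the given bit rate."""
    return bit_rate_gbps / bits_per_symbol

nrz_baud = required_baud(50, 1)   # binary signaling
pam4_baud = required_baud(50, 2)  # 4-PAM
print(nrz_baud, pam4_baud)
```

The Gray mapping shown makes adjacent amplitude levels differ in a single bit, so most symbol errors cost one bit error; the price is a smaller eye opening per level, which is why such links are compared via BER performance.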

  5. Web Based Tool for Mission Operations Scenarios

    NASA Technical Reports Server (NTRS)

    Boyles, Carole A.; Bindschadler, Duane L.

    2008-01-01

    A conventional practice for spaceflight projects is to document scenarios in a monolithic Operations Concept document. Such documents can be hundreds of pages long and may require laborious updates. Software development practice utilizes scenarios in the form of smaller, individual use cases, which are often structured and managed using UML. We have developed a process and a web-based scenario tool that utilizes a similar philosophy of smaller, more compact scenarios (but avoids the formality of UML). The need for a scenario process and tool became apparent during the authors' work on a large astrophysics mission. It was noted that every phase of the Mission (e.g., formulation, design, verification and validation, and operations) looked back to scenarios to assess completeness of requirements and design. It was also noted that terminology needed to be clarified and structured to assure communication across all levels of the project. Attempts to manage, communicate, and evolve scenarios at all levels of a project using conventional tools (e.g., Excel) and methods (Scenario Working Group meetings) were not effective given limitations on budget and staffing. The objective of this paper is to document the scenario process and tool created to offer projects a low-cost capability to create, communicate, manage, and evolve scenarios throughout project development. The process and tool have the further benefit of allowing the association of requirements with particular scenarios, establishing and viewing relationships between higher- and lower-level scenarios, and the ability to place all scenarios in a shared context. The resulting structured set of scenarios is widely visible (using a web browser), easily updated, and can be searched according to various criteria including the level (e.g., Project, System, and Team) and Mission Phase. Scenarios are maintained in a web-accessible environment that provides a structured set of scenario fields and allows for maximum

  6. Comparison of traditional advanced cardiac life support (ACLS) course instruction vs. a scenario-based, performance oriented team instruction (SPOTI) method for Korean paramedic students.

    PubMed

    Lee, Christopher C; Im, Mark; Kim, Tae Min; Stapleton, Edward R; Kim, Kyuseok; Suh, Gil Joon; Singer, Adam J; Henry, Mark C

    2010-01-01

    Current Advanced Cardiac Life Support (ACLS) course instruction involves a 2-day course with traditional lectures and limited team interaction. We wish to explore the advantages of a scenario-based performance-oriented team instruction (SPOTI) method to implement core ACLS skills for non-English-speaking international paramedic students. The objective of this study was to determine if scenario-based, performance-oriented team instruction (SPOTI) improves educational outcomes for the ACLS instruction of Korean paramedic students. Thirty Korean paramedic students were randomly selected into two groups. One group of 15 students was taught the traditional ACLS course. The other 15 students were instructed using a SPOTI method. Each group was tested using ACLS megacode examinations endorsed by the American Heart Association. All 30 students passed the ACLS megacode examination. In the traditional ACLS study group an average of 85% of the core skills were met. In the SPOTI study group an average of 93% of the core skills were met. In particular, the SPOTI study group excelled at physical examination skills such as airway opening, assessment of breathing, signs of circulation, and compression rates. In addition, the SPOTI group performed with higher marks on rhythm recognition compared to the traditional group. The traditional group performed with higher marks at providing proper drug dosages compared to the SPOTI students. However, students trained with the SPOTI method achieved higher megacode core compliance scores than students trained with traditional ACLS course instruction. These differences did not achieve statistical significance due to the small sample size.
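The closing caveat about sample size can be made concrete with a quick two-proportion z-test on the reported averages. This is a sketch only: it treats each group's mean core-skill compliance (85% vs. 93%) as a single proportion over n = 15 students, a simplification of the study's actual per-skill scoring:

```python
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p = (p1 * n1 + p2 * n2) / (n1 + n2)            # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))     # pooled standard error
    z = (p2 - p1) / se
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))        # normal CDF at |z|
    return z, 2 * (1 - phi)

z, p_value = two_proportion_z(0.85, 15, 0.93, 15)
print(round(z, 2), round(p_value, 2))  # p-value well above 0.05
```

With 15 students per group, an 8-percentage-point gap is nowhere near the conventional 0.05 threshold, which is consistent with the abstract's conclusion.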

  7. The Impact of Collegiate Aviation Student Learning Styles on Flight Performance: A Scenario-Based Training Approach

    ERIC Educational Resources Information Center

    Harriman, Stanley L.

    2011-01-01

    The introduction of the glass cockpit, as well as a whole new generation of high performance general aviation aircraft, highlights the need for a comprehensive overhaul of the traditional approach to training pilots. Collegiate aviation institutions that are interested in upgrading their training aircraft fleets will need to design new curricula…

  8. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    SciTech Connect

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie; Mandelli, Diego; Smith, Curtis Lee

    2015-09-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  9. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  10. Promoting Discourse with Task-Based Scenario Interaction.

    ERIC Educational Resources Information Center

    Dinapoli, Russell

    Tasks have become an essential feature of second language (L2) learning in recent years. Tasks range from getting learners to repeat linguistic elements satisfactorily to having them perform in "free" production. Along this task-based continuum, task-based scenario interaction lies at the point midway between controlled and semi-controlled…

  11. Mars Scenario-Based Visioning: Logistical Optimization of Transportation Architectures

    NASA Astrophysics Data System (ADS)

    1999-01-01

    The purpose of this conceptual design investigation is to examine transportation forecasts for future human missions to Mars. Scenario-Based Visioning is used to generate possible future demand projections. These scenarios are then coupled with availability, cost, and capacity parameters for indigenously designed Mars Transfer Vehicles (solar electric, nuclear thermal, and chemical propulsion types) and Earth-to-Orbit launch vehicles (current, future, and indigenous) to provide a cost-conscious dual-phase launch manifest to meet such future demand. A simulator named M-SAT (Mars Scenario Analysis Tool) is developed using this method. This simulation is used to examine three specific transportation scenarios to Mars: a limited "flags and footprints" mission, a more ambitious scientific expedition similar to an expanded version of the Design Reference Mission from NASA, and a long-term colonization scenario. Initial results from the simulation indicate that chemical propulsion systems might be the architecture of choice for all three scenarios. With this in mind, "what if" analyses were performed, which indicated that if nuclear production costs were reduced by 30% for the colonization scenario, then the nuclear architecture would have a lower life cycle cost than the chemical one. Results indicate that the most cost-effective solution to the Mars transportation problem is to plan for segmented development: development of one vehicle at one opportunity and derivatives of that vehicle at subsequent opportunities.
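The 30% sensitivity result can be illustrated with a toy life-cycle cost comparison. All figures below are hypothetical placeholders, not M-SAT outputs; the point is only the structure of the trade, in which high development cost plus low recurring cost can beat the reverse at a high flight rate:

```python
def life_cycle_cost(dev_cost, unit_cost, flights):
    """One-time development cost plus recurring per-flight production cost."""
    return dev_cost + unit_cost * flights

FLIGHTS = 20  # a colonization-scale campaign (assumed)

chemical = life_cycle_cost(dev_cost=4.0, unit_cost=1.00, flights=FLIGHTS)
nuclear = life_cycle_cost(dev_cost=8.0, unit_cost=0.90, flights=FLIGHTS)
# A 30% cut in nuclear production cost flips the ranking over the campaign:
nuclear_cut = life_cycle_cost(dev_cost=8.0, unit_cost=0.90 * 0.7, flights=FLIGHTS)
print(chemical, nuclear, nuclear_cut)
```

At these (invented) numbers chemical wins at baseline, but the discounted nuclear architecture overtakes it once its recurring cost drops, which is the shape of the "what if" finding for the colonization scenario.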

  12. Reality based scenarios facilitate knowledge network development.

    PubMed

    Manning, J; Broughton, V; McConnell, E A

    1995-03-01

    The challenge in nursing education is to create a learning environment that enables students to learn new knowledge, access previously acquired information from a variety of disciplines, and apply this newly constructed knowledge to the complex and constantly changing world of practice. Faculty at the University of South Australia, School of Nursing, City Campus describe the use of reality based scenarios to acquire domain-specific knowledge and develop well connected associative knowledge networks, both of which facilitate theory based practice and the student's transition to the role of registered nurse.

  13. Scenarios for the Hanford Immobilized Low-Activity Waste (ILAW) performance assessment

    SciTech Connect

    MANN, F.M.

    1999-03-17

    Scenarios describing representative exposure cases associated with the disposal of low activity waste from the Hanford Waste Tanks have been defined. These scenarios are based on guidance from the Department of Energy, the U.S. Nuclear Regulatory Commission, and previous Hanford waste disposal performance assessments.

  14. Reliable freestanding position-based routing in highway scenarios.

    PubMed

    Galaviz-Mosqueda, Gabriel A; Aquino-Santos, Raúl; Villarreal-Reyes, Salvador; Rivera-Rodríguez, Raúl; Villaseñor-González, Luis; Edwards, Arthur

    2012-01-01

    Vehicular Ad Hoc Networks (VANETs) are considered by car manufacturers and the research community as the enabling technology to radically improve the safety, efficiency and comfort of everyday driving. However, before VANET technology can fulfill all its expected potential, several difficulties must be addressed. One key issue arising when working with VANETs is the complexity of the networking protocols compared to those used by traditional infrastructure networks. Therefore, proper design of the routing strategy becomes a main issue for the effective deployment of VANETs. In this paper, a reliable freestanding position-based routing algorithm (FPBR) for highway scenarios is proposed. For this scenario, several important issues such as the high mobility of vehicles and the propagation conditions may affect the performance of the routing strategy. These constraints have only been partially addressed in previous proposals. In contrast, the design approach used for developing FPBR considered the constraints imposed by a highway scenario and implements mechanisms to overcome them. FPBR performance is compared to one of the leading protocols for highway scenarios. Performance metrics show that FPBR yields similar results when considering freespace propagation conditions, and outperforms the leading protocol when considering a realistic highway path loss model. PMID:23202159

  15. Future Scenarios for Fission Based Reactors

    NASA Astrophysics Data System (ADS)

    David, S.

    2005-04-01

    The coming century will see the exhaustion of standard fossil fuels (coal, gas and oil), which today represent 75% of the world energy production. Moreover, their use will have caused large-scale emission of greenhouse gases (GEG) and induced global climate change. This problem is exacerbated by a growing world energy demand. In this context, nuclear power is the only GEG-free energy source available today capable of responding significantly to this demand. Some scenarios consider a nuclear energy production of around 5 Gtoe in 2050, which would represent a 20% share of the world energy supply. Present reactors generate energy from the fission of U-235 and require around 200 tons of natural Uranium to produce 1 GWe.y of energy, equivalent to the fission of one ton of fissile material. In a scenario of a significant increase in nuclear energy generation, these standard reactors will consume the whole of the world's estimated Uranium reserves in a few decades. However, natural Uranium or Thorium ore, which are not themselves fissile, can produce a fissile material after a neutron capture (239Pu and 233U, respectively). In a breeder reactor, the mass of fissile material remains constant, and the fertile ore is the only material to be consumed. In this case, only 1 ton of natural ore is needed to produce 1 GWe.y. Thus, the breeding concept allows optimal use of fertile ore and development of sustainable nuclear energy production for several thousand years into the future. Different sustainable nuclear reactor concepts are studied in the international forum "Generation IV". Different types of coolant (Na, Pb and He) are studied for fast breeder reactors based on the Uranium cycle. The thermal Thorium cycle requires the use of a liquid fuel, which can be reprocessed online in order to extract the neutron poisons. This paper presents these different sustainable reactors, based on the Uranium or Thorium fuel cycles, and will compare the different options in terms of fissile
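The fuel-demand figures in the abstract can be checked with rough arithmetic. The 5 Gtoe scenario and the 200 t vs. 1 t per GWe.y figures come from the text; the 33% thermal-to-electric conversion efficiency is an assumed round number:

```python
# Annual fuel demand implied by a 5 Gtoe/yr nuclear scenario.
TOE_J = 41.868e9                      # joules per tonne of oil equivalent
GWEY_J = 1e9 * 365.25 * 86400         # joules in one GWe.y (electric)

primary_j = 5e9 * TOE_J               # 5 Gtoe of primary energy per year
electric_gwey = primary_j * 0.33 / GWEY_J   # assumed 33% efficiency

u_once_through = electric_gwey * 200  # tonnes natural U/yr, current reactors
u_breeder = electric_gwey * 1         # tonnes fertile ore/yr, breeders

print(f"{electric_gwey:.0f} GWe.y/yr: {u_once_through:.0f} t U/yr "
      f"once-through vs {u_breeder:.0f} t/yr with breeding")
```

At roughly 400,000+ tonnes of natural uranium per year, conventional reserves on the order of millions of tonnes would indeed be consumed within decades, while the factor-of-200 reduction from breeding underpins the "several thousand years" claim.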

  16. Wiki Based Collaborative Learning in Interuniversity Scenarios

    ERIC Educational Resources Information Center

    Katzlinger, Elisabeth; Herzog, Michael A.

    2014-01-01

    In business education advanced collaboration skills and media literacy are important for surviving in a globalized business where virtual communication between enterprises is part of the day-by-day business. To transform these global working situations into higher education, a learning scenario between two universities in Germany and Austria was…

  17. Flooding Capability for River-based Scenarios

    SciTech Connect

    Smith, Curtis L.; Prescott, Steven; Ryan, Emerald; Calhoun, Donna; Sampath, Ramprasad; Anderson, S. Danielle; Casteneda, Cody

    2015-10-01

    This report describes the initial investigation into modeling and simulation tools for application of riverine flooding representation as part of the Risk-Informed Safety Margin Characterization (RISMC) Pathway external hazards evaluations. The report provides examples of different flooding conditions and scenarios that could impact river and watershed systems. Both 2D and 3D modeling approaches are described.

  18. An Example Implementation of Schank's Goal-Based Scenarios

    ERIC Educational Resources Information Center

    Hsu, Chung-Yuan; Moore, David Richard

    2010-01-01

    The Goal-based Scenario method is a design model for applying simulations to instruction. This portfolio item describes an implementation of Goal-based Scenarios for the teaching of statistics. The application demonstrates how simulations can be contextualized and how they can allow learners to engage in legitimate inquiry in the pursuit of their…

  19. "The Strawberry Caper": Using Scenario-Based Problem Solving to Integrate Middle School Science Topics

    ERIC Educational Resources Information Center

    Gonda, Rebecca L.; DeHart, Kyle; Ashman, Tia-Lynn; Legg, Alison Slinskey

    2015-01-01

    Achieving a deep understanding of the many topics covered in middle school biology classes is difficult for many students. One way to help students learn these topics is through scenario-based learning, which enhances students' performance. The scenario-based problem-solving module presented here, "The Strawberry Caper," not only…

  20. Aerosol cloud interaction: a multiplatform-scenario-based methodology

    NASA Astrophysics Data System (ADS)

    Landulfo, Eduardo; Lopes, Fábio J. S.; Guerrero-Rascado, Juan Luis; Alados-Arboledas, Lucas

    2015-10-01

    Suspended atmospheric particles, i.e. aerosol particles, undergo many chemical and physical processes; these interactions and transformations may change particle size, structure and composition through mechanisms that are also present in clouds. These interactions play a great role in radiative transfer in the atmosphere and are not completely understood, as competing effects, known as indirect aerosol effects, might occur. Performing remote-sensing measurements and experiments to improve the knowledge of these processes is also a challenge. In face of that, we propose a multi-platform approach based on lidar, sun photometry and satellite observations, organized into scenarios defined by cloud height, viewing and optical geometries, and diurnal/nocturnal conditions; each scenario yields a set of products that specify the aerosol present in the vicinity of clouds and its optical and physical properties. These scenarios are meant to aid in tagging the expected products and help in creating a robust database to systematically study the aerosol-cloud interaction. In total we present 6 scenarios: 3 under daylight conditions and 3 at nighttime. Each scenario and its counterpart should be able to provide the cloud base/top height, the aerosol backscattering profile and the cloud optical/geometric thickness. In each instance we rely on measurements from a 5-wavelength Raman lidar system, a collocated sun photometer and CALIPSO/MODIS observations from the AQUA/TERRA platforms. To further improve the aerosol-cloud interaction analysis, the Raman lidar system should have a water vapor channel or, moreover, a liquid water channel. We present a two-day case study to show the feasibility of the methodology and its potential application.

  1. E-maintenance Scenarios Based on Augmented Reality Software Architecture

    NASA Astrophysics Data System (ADS)

    Benbelkacem, S.; Zenati-Henda, N.; Belhocine, M.

    2008-06-01

    This paper presents an augmented reality architecture for e-maintenance applications. In our case, the aim is not to develop a vision system based on the augmented reality concept, but to show the relationships between the different actors in the proposed architecture and to facilitate maintenance of the machine. This architecture allows implementing different scenarios that give the technician the possibility to intervene on a broken-down device with the help of a distant expert. Each scenario is established according to machine parameters and technician competences. In our case, a hardware platform is designed to carry out e-maintenance scenarios. An example of an e-maintenance scenario is then presented.

  2. Modeling and Composing Scenario-Based Requirements with Aspects

    NASA Technical Reports Server (NTRS)

    Araujo, Joao; Whittle, Jon; Kim, Dae-Kyoo

    2004-01-01

    There has been significant recent interest, within the Aspect-Oriented Software Development (AOSD) community, in representing crosscutting concerns at various stages of the software lifecycle. However, most of these efforts have concentrated on the design and implementation phases. We focus in this paper on representing aspects during use case modeling. In particular, we focus on scenario-based requirements and show how to compose aspectual and non-aspectual scenarios so that they can be simulated as a whole. Non-aspectual scenarios are modeled as UML sequence diagrams. Aspectual scenarios are modeled as Interaction Pattern Specifications (IPSs). In order to simulate them, the scenarios are transformed into a set of executable state machines using an existing state machine synthesis algorithm. Previous work composed aspectual and non-aspectual scenarios at the sequence diagram level. In this paper, the composition is done at the state machine level.

  3. ITER Scenario Performance Simulations Assessing Control and Vertical Stability

    SciTech Connect

    Casper, T; Ferron, J; Humphreys, D; Jackson, G; Leuer, J; LoDestro, L; Luce, T; Meyer, W; Pearlstein, L; Welander, A

    2008-05-20

    In simulating reference scenarios proposed for ITER operation, we also explore performance of the poloidal field (PF) and central solenoid (CS) coil systems using a controller to maintain plasma shape and vertical stability during the discharge evolution. We employ a combination of techniques to evaluate system constraints and stability using time-dependent transport simulations of ITER discharges. We have begun the process of benchmarking these simulations with experiments on the DIII-D tokamak. Simulations include startup on the outside limiter, X-point formation and current ramp up to full power, plasma burn conditions at 15MA and 17MA, and ramp down at the end of the pulse. We also simulate perturbative events such as H-to-L back transitions. Our results indicate the viability of proposed ITER operating modes.

  4. Security Analysis of Selected AMI Failure Scenarios Using Agent Based Game Theoretic Simulation

    SciTech Connect

    Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T

    2014-01-01

    Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the Advanced Metering Infrastructure (AMI) functional domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) working group has currently documented 29 failure scenarios. The strategy for the game was developed by analyzing five electric sector representative failure scenarios contained in the AMI functional domain. We characterize these five selected scenarios into three specific threat categories affecting confidentiality, integrity and availability (CIA). The analysis using our ABGT simulation demonstrates how to model the AMI functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the AMI network with respect to CIA.
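A minimal flavor of the game-theoretic setup can be sketched as a one-shot attacker/defender game over the three CIA properties. The payoff numbers below are invented for illustration; the actual NESCOR failure scenarios induce far richer rule sets:

```python
# payoff[t][d]: attacker's gain from targeting property t while the
# defender hardens property d (hypothetical values).
payoff = {
    "confidentiality": {"confidentiality": 0.1, "integrity": 0.6, "availability": 0.6},
    "integrity":       {"confidentiality": 0.8, "integrity": 0.1, "availability": 0.8},
    "availability":    {"confidentiality": 0.9, "integrity": 0.9, "availability": 0.2},
}

def attacker_best_response(d):
    """Attacker's maximum gain once the defender's hardening choice is fixed."""
    return max(payoff[t][d] for t in payoff)

# Minimax defense: harden the property that leaves the attacker the least gain.
best_defense = min(payoff, key=attacker_best_response)
print(best_defense, attacker_best_response(best_defense))
```

An agent-based simulation of the same game replaces this closed-form minimax step with repeated attacker and defender agents choosing moves, which is what allows scaling to many interacting assets.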

  5. OBEST: The Object-Based Event Scenario Tree Methodology

    SciTech Connect

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-03-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenario Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result: scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology and the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies.
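The recursive solution idea is easy to sketch: walk the object model depth-first, multiplying branch probabilities, and emit every complete path with its likelihood. The three-object model below is invented for illustration and is far simpler than a real OBEST object model:

```python
def enumerate_scenarios(objects, path=(), prob=1.0):
    """Recursively expand probabilistic branches into (scenario, likelihood)."""
    if not objects:
        yield path, prob
        return
    name, branches = objects[0]
    for outcome, p in branches.items():
        yield from enumerate_scenarios(objects[1:],
                                       path + ((name, outcome),), prob * p)

model = [  # hypothetical objects with probabilistic behaviors
    ("pump", {"runs": 0.95, "fails": 0.05}),
    ("valve", {"opens": 0.99, "sticks": 0.01}),
    ("operator", {"responds": 0.90, "omits": 0.10}),
]

scenarios = list(enumerate_scenarios(model))
rare = min(scenarios, key=lambda s: s[1])  # found exactly, without sampling
print(len(scenarios), rare[1])
```

Because the rarest path (probability 5e-5 here) is produced analytically rather than sampled, it cannot be "missed" the way a short Monte Carlo run might miss it, which is the abstract's key point.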

  6. Dual Mission Scenarios for the Human Lunar Campaign - Performance, Cost and Risk Benefits

    NASA Technical Reports Server (NTRS)

    Saucillo, Rudolph J.; Reeves, David M.; Chrone, Jonathan D.; Stromgren, Chel; Reeves, John D.; North, David D.

    2008-01-01

    Scenarios for human lunar operations with capabilities significantly beyond Constellation Program baseline missions are potentially feasible based on the concept of dual, sequential missions utilizing a common crew and a single Ares I/CEV (Crew Exploration Vehicle). For example, scenarios possible within the scope of baseline technology planning include outpost-based sortie missions and dual sortie missions. Top-level cost benefits of these dual sortie scenarios may be estimated by comparison to the Constellation Program reference two-mission-per-year lunar campaign. The primary cost benefit is the accomplishment of Mission B with a "single launch solution," since no Ares I launch is required. Cumulative risk to the crew is lowered since crew exposure to launch risks and Earth return risks is reduced versus comparable Constellation Program reference two-mission-per-year scenarios. Payload-to-the-lunar-surface capability is substantially increased in the Mission B sortie as a result of additional propellant available for Lunar Lander #2 descent. This additional propellant is a result of EDS #2 transferring a smaller stack through trans-lunar injection and using the remaining propellant to perform a portion of the lunar orbit insertion (LOI) maneuver. This paper describes these dual mission concepts, including cost, risk and performance benefits per lunar sortie site, and provides an initial feasibility assessment.

  7. Scenario-Based Training at the F.B.I.

    ERIC Educational Resources Information Center

    Whitcomb, Chris

    1999-01-01

    The 16-week training program offered by the FBI Academy for all new agents is a scenario-based curriculum that includes a range of subjects from the rules of evidence to defensive tactics and provides agents with a clear understanding of how to conduct a full investigation from start to finish. (JOW)

  8. Improving learning performance with happiness by interactive scenarios.

    PubMed

    Chuang, Chi-Hung; Chen, Ying-Nong; Tsai, Luo-Wei; Lee, Chun-Chieh; Tsai, Hsin-Chun

    2014-01-01

    Recently, digital learning has attracted many researchers seeking to address the problems of learning carelessness, low learning ability, lack of concentration, and difficulties in comprehending the logic of math. In this study, a digital learning system based on the Kinect somatosensory system is proposed to let children and teenagers learn happily in the course of the games and improve their learning performance. We propose two interactive geometry and puzzle games. The proposed somatosensory games can make learners feel curious and raise their motivation to find solutions to otherwise boring problems via abundant physical expressions and interactive operations. The players are asked to select a particular operation by gestures and physical expressions within a certain time. By doing so, the learners can feel the fun of game playing and train their logic ability before they are aware of it. Experimental results demonstrate that the proposed somatosensory system can effectively improve the students' learning performance. PMID:24558331

  9. Improving Learning Performance with Happiness by Interactive Scenarios

    PubMed Central

    Chuang, Chi-Hung; Chen, Ying-Nong; Tsai, Luo-Wei; Lee, Chun-Chieh; Tsai, Hsin-Chun

    2014-01-01

    Recently, digital learning has attracted many researchers seeking to address the problems of learning carelessness, low learning ability, lack of concentration, and difficulty in comprehending the logic of math. In this study, a digital learning system based on the Kinect somatosensory system is proposed to let children and teenagers learn happily in the course of games and improve their learning performance. We propose two interactive geometry and puzzle games. The proposed somatosensory games can make learners feel curious and raise their motivation to find solutions to otherwise boring problems via abundant physical expressions and interactive operations. The players are asked to select a particular operation by gestures and physical expressions within a certain time. By doing so, the learners can feel the fun of game playing and train their logic ability before they are aware of it. Experimental results demonstrate that the proposed somatosensory system can effectively improve the students' learning performance. PMID:24558331

  10. Scenario-based Storm Surge Vulnerability Assessment of Catanduanes

    NASA Astrophysics Data System (ADS)

    Suarez, J. K. B.

    2015-12-01

    After the devastating storm surge effect of Typhoon Haiyan, the public recognized the need for improved communication about risks, vulnerabilities, and what is threatened by storm surge. This can be provided by vulnerability maps, which allow better visual presentation and understanding of the risks and vulnerabilities, so that local implementers can direct the resources needed for protection of these areas. Moreover, vulnerability and hazard maps are relevant in all phases of disaster management defined by the National Disaster Risk Reduction and Management Council (NDRRMC): disaster preparedness, prevention and mitigation, and response, recovery, and rehabilitation. This paper aims to analyze the vulnerability of Catanduanes, a coastal province in the Philippines, to storm surges in terms of four parameters: population, built environment, natural environment, and agricultural production. The vulnerability study relies on storm surge inundation maps based on the four Storm Surge Advisory (SSA) scenarios (1-2, 3, 4, and 5 meters) proposed by the Department of Science and Technology Nationwide Operational Assessment of Hazards (DOST-Project NOAH) for predicting storm surge heights. To determine the total percent affected for the elements of each parameter, an overlay analysis was performed in ArcGIS Desktop. Vulnerability and hazard maps are generated as a final output and as a tool for visualizing the impacts of a storm surge event at different surge heights. The results of this study would help the province know its present condition and adapt strategies to strengthen the areas found to be most vulnerable in order to prepare better for the future.
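
    The overlay step described above reduces, per scenario, to the share of each element at risk that falls inside the inundation extent. A toy version of that calculation; the exposure counts per Storm Surge Advisory scenario and the total population below are invented for illustration.

```python
def percent_affected(total, exposed):
    """Share (%) of an element at risk inside the hazard extent."""
    return 100.0 * exposed / total

# Invented exposure counts: residents inside the inundation extent of
# each Storm Surge Advisory (SSA) scenario, against a total population.
population = 260_000
exposed_by_ssa = {"SSA 1-2 (1-2 m)": 4_200, "SSA 3 (3 m)": 9_800,
                  "SSA 4 (4 m)": 15_600, "SSA 5 (5 m)": 22_000}
for scenario, exposed in exposed_by_ssa.items():
    print(scenario, round(percent_affected(population, exposed), 1))
```

    The same ratio applies to any of the four parameters (population, built environment, natural environment, agricultural production) once the GIS overlay has counted what lies inside each inundation polygon.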

  11. Performance analysis of seismocardiography for heart sound signal recording in noisy scenarios.

    PubMed

    Jain, Puneet Kumar; Tiwari, Anil Kumar; Chourasia, Vijay S

    2016-01-01

    This paper presents a system based on Seismocardiography (SCG) to monitor the heart sound signal over the long term. It uses an accelerometer, which is small and lightweight and thus convenient to wear. Such a system should also be robust to the various noises that occur in real-life scenarios. Therefore, a detailed analysis of the proposed system is provided and its performance is compared to that of a Phonocardiography (PCG) system. For this purpose, both signals were recorded simultaneously from five subjects in clinical and different real-life noisy scenarios. For the quantitative analysis, the detection rate of the fundamental heart sound components, S1 and S2, is obtained. Furthermore, a quality index based on the energy of the fundamental components is also proposed and obtained. Results show that both techniques are able to acquire S1 and S2 in a clinical set-up. However, in real-life scenarios, we observed many features of the proposed system that favour it over PCG for long-term monitoring. PMID:26860039

  12. Assessing magnitude probability distribution through physics-based rupture scenarios

    NASA Astrophysics Data System (ADS)

    Hok, Sébastien; Durand, Virginie; Bernard, Pascal; Scotti, Oona

    2016-04-01

    When faced with a complex network of faults in a seismic hazard assessment study, the first question raised is to what extent the fault network is connected and what the probability is that an earthquake simultaneously ruptures a series of neighboring segments. Physics-based dynamic rupture models can provide useful insight as to which rupture scenario is most probable, provided that an exhaustive exploration of the variability of the input parameters necessary for the dynamic rupture modeling is accounted for. Given the random nature of some parameters (e.g. hypocenter location) and the limitation of our knowledge, we used a logic-tree approach in order to build the different scenarios and to be able to associate them with a probability. The methodology is applied to the three main faults located along the southern coast of the West Corinth rift. Our logic tree takes into account different hypotheses for fault geometry, location of hypocenter, seismic cycle position, and fracture energy on the fault plane. The variability of these parameters is discussed, and the different values tested are weighted accordingly. 64 scenarios resulting from 64 parameter combinations were included. Sensitivity studies were done to illustrate which parameters control the variability of the results. Given the weights of the input parameters, we evaluated the probability of obtaining a full network break to be 15%, while single-segment ruptures represent 50% of the scenarios. This rupture scenario probability distribution along the three faults of the West Corinth rift fault network can then be used as input to a seismic hazard calculation.
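
    The logic-tree bookkeeping behind such a study can be sketched in a few lines: each scenario's probability is the product of the weights of the branches it draws from, and aggregating by rupture extent gives the outcome distribution. The parameter names, branch weights, and the outcome rule below are invented placeholders standing in for the study's actual fault parameters and dynamic-rupture runs.

```python
from itertools import product

# Hypothetical logic tree: four parameters, each with weighted branches.
tree = {
    "geometry":    {"connected": 0.5, "segmented": 0.5},
    "hypocenter":  {"west": 0.5, "east": 0.5},
    "cycle_phase": {"early": 0.5, "late": 0.5},
    "frac_energy": {"low": 0.5, "high": 0.5},
}

def outcome(combo):
    """Toy stand-in for a dynamic-rupture run: maps a parameter
    combination to a rupture extent."""
    geometry, _hypocenter, phase, energy = combo
    if geometry == "connected" and phase == "late" and energy == "low":
        return "full_network"
    if phase == "early":
        return "single_segment"
    return "multi_segment"

# Each scenario's probability is the product of its branch weights;
# summing per outcome gives the rupture-extent distribution.
probs = {}
for combo in product(*(tree[p] for p in tree)):
    w = 1.0
    for param, choice in zip(tree, combo):
        w *= tree[param][choice]
    probs[outcome(combo)] = probs.get(outcome(combo), 0.0) + w
```

    With equal weights, as here, every combination is equally likely; the value of the logic tree is that unequal branch weights propagate directly into the outcome probabilities.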

  13. The real world and lunar base activation scenarios

    NASA Technical Reports Server (NTRS)

    Schmitt, Harrison H.

    1992-01-01

    A lunar base or a network of lunar bases may have highly desirable support functions in a national or international program to explore and settle Mars. In addition, He-3 exported from the Moon could be the basis for providing much of the energy needs of humankind in the twenty-first century. Both technical and managerial issues must be addressed when considering the establishment of a lunar base that can serve the needs of human civilization in space. Many of the technical issues become evident in the consideration of hypothetical scenarios for the activation of a network of lunar bases. Specific and realistic assumptions must be made about the conduct of various types of activities in addition to the general assumptions given above. These activities include landings, crew consumables, power production, crew selection, risk management, habitation, science station placement, base planning, science, agriculture, resource evaluation, readaptation, plant activation and test, storage module landings, resource transport module landings, integrated operations, maintenance, Base 2 activation, and management. The development of scenarios for the activation of a lunar base or network of bases will require close attention to the 'real world' of space operations. That world is defined by the natural environment, available technology, realistic objectives, and common sense.

  14. Feature, Event, and Process Screening and Scenario Development for the Yucca Mountain Total System Performance Assessment

    SciTech Connect

    Barnard, R.; Barr, G.; Burch, P.; Freeze, G.; Rechard, R.; Schenker, A.; Swift, P.

    1999-04-05

    Scenario development has two primary purposes in the design and documentation of post-closure performance assessments in a regulatory setting. First, scenario development ensures a sufficiently comprehensive consideration of the possible future states of the system. Second, scenario development identifies the important scenarios that must be considered in quantitative analyses of the total system performance assessment (TSPA). Section 2.0 of this report describes the scenario development process. Steps in the process are described in Section 2.1, and terms introduced in this section are defined in Section 2.2. The electronic database used to document the process is described in Section 3, and Section 4 provides a summary of the current status of the YMP scenario development work. Section 5 contains acknowledgments, and Section 6 contains a list of the references cited.

  15. Problem-based learning: description, advantages, disadvantages, scenarios and facilitation.

    PubMed

    Jones, R W

    2006-08-01

    Problem-based learning arose out of educational initiatives in the 1960s and is often one of the most contentious issues within medical education. McMaster University in Canada was the first to implement problem-based learning on a large scale within medicine, and it was soon followed by universities in Europe and Australia. In modern times, few western medical schools do not include at least some aspect of problem-based learning within their instructional itinerary, and many build their entire curriculum and instructional procedures around it. This article provides an overview of problem-based learning within medical education and its pertinent background, and describes the characteristics of problem-based learning, its advantages and disadvantages, problem-based learning scenarios, and facilitation.

  16. Mortality estimation based on Business as Usual Scenario

    NASA Astrophysics Data System (ADS)

    Pozzer, Andrea; Lelieveld, Jos; Barlas, Ceren

    2013-04-01

    Air pollution by fine particulate matter (PM2.5) and ozone (O3) has increased strongly with industrialization and urbanization. Epidemiological studies have shown that these pollutants increase lung cancer, cardiopulmonary, and respiratory mortality. The atmospheric chemistry general circulation model EMAC has been used to estimate the concentration of these pollutants in recent and future years (2005, 2010, 2025 and 2050), based on a Business as Usual scenario. The emission scenario assumes that population and economic growth largely determine energy consumption and consequent pollution sources ("business as usual"). Based on the modeled pollutant concentrations and the UN estimates of future population growth, we assessed the premature mortality and the years of human life lost (YLL) caused by anthropogenic PM2.5 and O3 for epidemiological regions defined by the World Health Organization. Premature mortality for people aged 30 years and older was estimated using a health impact function with parameters derived from epidemiological studies. Our results suggest that under a Business as Usual scenario, the ratio between mortality and population would increase by ~50% by 2050. This ratio, together with the increase of world population, would lead by the year 2050 to 8.9 million premature deaths, equivalent to 79 million YLL.
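
    Health impact functions of the kind described above commonly take a log-linear concentration-response form: attributable deaths = baseline mortality × attributable fraction × exposed population. The sketch below shows the general shape of such a function; the coefficient beta, the exposure threshold, the baseline mortality rate, and the population figure are illustrative values, not those used in the study.

```python
import math

def premature_deaths(conc, threshold, beta, baseline_rate, pop_30plus):
    """Attributable premature deaths among people aged 30+ for a
    log-linear concentration-response function (illustrative form)."""
    delta_c = max(conc - threshold, 0.0)
    rr = math.exp(beta * delta_c)     # relative risk at this exposure
    af = (rr - 1.0) / rr              # attributable fraction
    return baseline_rate * af * pop_30plus

# Illustrative numbers only: annual PM2.5 of 35 ug/m3 against a
# 7.5 ug/m3 threshold, for a city of one million adults aged 30+.
deaths = premature_deaths(conc=35.0, threshold=7.5, beta=0.006,
                          baseline_rate=0.01, pop_30plus=1_000_000)
```

    Summing such estimates over model grid cells, with region-specific baseline rates and projected populations, yields the global totals reported in the abstract.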

  17. Improvement of nursing students' learning outcomes through scenario-based skills training

    PubMed Central

    Uysal, Nurcan

    2016-01-01

    Abstract Objective: this study analyzed the influence of scenario-based skills training on students' learning skills. Method: the author evaluated the nursing skills laboratory exam papers of 605 sophomores in nursing programs for seven years. The study determined the common mistakes of students and the laboratory work was designed in a scenario-based format. The effectiveness of this method was evaluated by assessing the number of errors the students committed and their achievement scores in laboratory examinations. This study presents the students' common mistakes in intramuscular and subcutaneous injection and their development of intravenous access skills, included in the nursing skills laboratory examination. Results: an analysis of the students' most common mistakes revealed that the most common was not following the principles of asepsis for all three skills (intramuscular, subcutaneous injection, intravenous access) in the first year of the scenario-based training. The students' exam achievement scores increased gradually, except in the fall semester of the academic year 2009-2010. The study found that the scenario-based skills training reduced students' common mistakes in examinations and enhanced their performance on exams. Conclusion: this method received a positive response from both students and instructors. The scenario-based training is available for use in addition to other skills training methods. PMID:27508922

  18. Late Pleistocene ice age scenarios based on observational evidence

    NASA Astrophysics Data System (ADS)

    Deblonde, G.; Peltier, W. R.

    1993-04-01

    Ice age scenarios for the last glacial-interglacial cycle, based on observations of Boyle and Keigwin (1982) concerning the North Atlantic thermohaline circulation and of Barnola et al. (1987) concerning atmospheric CO2 variations derived from the Vostok ice cores, are analyzed. Northern Hemisphere continental ice sheets are simulated with an energy balance model (EBM) that is asynchronously coupled to vertically integrated ice sheet models based on the Glen flow law. The EBM includes both a realistic land-sea distribution and temperature-albedo feedback and is driven with orbital variations of effective solar insolation. With the addition of atmospheric CO2 and ocean heat flux variations, but not in their absence, a complete collapse is obtained for the Eurasian ice sheet but not for the North American ice sheet. Further feedback mechanisms, perhaps involving more accurate modeling of the dynamics of the mostly marine-based Laurentide complex, appear necessary to explain termination I.

  19. Scenario-based approach to risk analysis in support of cyber security

    SciTech Connect

    Gertman, D. I.; Folkers, R.; Roberts, J.

    2006-07-01

    Perpetrators will attempt to control and defeat automation systems, engineering access, control systems, and protective systems implemented in today's critical infrastructures. Major systems such as supervisory control and data acquisition (SCADA) systems are likely targets for attack. Not all attack scenarios have the same expected frequency or consequence. The attacks will be directed and structured and thus are not characterized as random events when one considers failure probabilities. Attack types differ in their consequence as a function of the probability associated with various sub-events in the presence of specific system configurations. Ideally, a series of generic scenarios can be identified for each of the major critical infrastructure (CI) sectors. A scenario-based approach to risk assessment allows decision makers to place financial and personnel resources in place for attacks that truly matter, e.g. attacks that generate physical and economic damage. The use of scenario-based analysis allows risk reduction goals to be informed by more than consequence analysis alone. The key CI targets used in the present study were identified previously as part of a mid-level consequence analysis performed at INL by the Control System Security Program (CSSP) for the National Cyber Security Div. (NCSD) of the Dept. of Homeland Security (DHS). This paper discusses the process for and results associated with the development of scenario-based cyber attacks upon control systems, including the information and personnel requirements for scenario development. Challenges to scenario development, including completeness and uncertainty characterization, are discussed as well. Further, the scenario discussed herein is one of a number of scenarios for infrastructures currently under review. (authors)

  20. Raman resonance in iron-based superconductors: The magnetic scenario

    NASA Astrophysics Data System (ADS)

    Hinojosa, Alberto; Cai, Jiashen; Chubukov, Andrey V.

    2016-02-01

    We perform a theoretical analysis of polarization-sensitive Raman spectroscopy on NaFe1-xCoxAs, EuFe2As2, SrFe2As2, and Ba(Fe1-xCox)2As2, focusing on two features seen in the B1g symmetry channel (in one-Fe unit cell notation): the strong temperature dependence of the static, uniform Raman response in the normal state and the existence of a collective mode in the superconducting state. We show that both features can be explained by the coupling of fermions to pairs of magnetic fluctuations via the Aslamazov-Larkin process. We first analyze the magnetically mediated Raman intensity at the leading two-loop order and then include interactions between pairs of magnetic fluctuations. We show that the full Raman intensity in the B1g channel can be viewed as the result of the coupling of light to the Ising-nematic susceptibility via the Aslamazov-Larkin process. We argue that the singular temperature dependence in the normal state is the combination of the temperature dependencies of the Aslamazov-Larkin vertex and of the Ising-nematic susceptibility. We discuss two scenarios for the resonance below Tc. One is a resonance due to the development of a pole in the fully renormalized Ising-nematic susceptibility. The other is an orbital excitonic scenario, in which spin fluctuations generate attractive interaction between low-energy fermions.

  1. Construct Validity and Generalizability of Simulation-Based Objective Structured Clinical Examination Scenarios

    PubMed Central

    Sidi, Avner; Gravenstein, Nikolaus; Lampotang, Samsun

    2014-01-01

    Background It is not known if construct-related validity (progression of scores with different levels of training) and generalizability of Objective Structured Clinical Examination (OSCE) scenarios previously used with non-US graduating anesthesiology residents translate to a US training program. Objective We assessed for progression of scores with training for a validated high-stakes simulation-based anesthesiology examination. Methods Fifty US anesthesiology residents in postgraduate years (PGYs) 2 to 4 were evaluated in operating room, trauma, and resuscitation scenarios developed for and used in a high-stakes Israeli Anesthesiology Board examination, requiring a score of 70% on the checklist for passing (including all critical items). Results The OSCE error rate was lower for PGY-4 than PGY-2 residents in each field, and for most scenarios within each field. The critical item error rate was significantly lower for PGY-4 than PGY-3 residents in operating room scenarios, and for PGY-4 than PGY-2 residents in resuscitation scenarios. The final pass rate was significantly higher for PGY-3 and PGY-4 than PGY-2 residents in operating room scenarios, and also was significantly higher for PGY-4 than PGY-2 residents overall. PGY-4 residents had a better error rate, total scenarios score, general evaluation score, critical items error rate, and final pass rate than PGY-2 residents. Conclusions The comparable error rates, performance grades, and pass rates for US PGY-4 and non-US (Israeli) graduating (PGY-4 equivalent) residents, and the progression of scores among US residents with training level, demonstrate the construct-related validity and generalizability of these high-stakes OSCE scenarios. PMID:26279774

  2. The use of scenario-based-learning interactive software to create custom virtual laboratory scenarios for teaching genetics.

    PubMed

    Breakey, Kate M; Levin, Daniel; Miller, Ian; Hentges, Kathryn E

    2008-07-01

    Mutagenesis screens and analysis of mutant phenotypes are one of the most powerful approaches for the study of genetics. Yet genetics students often have difficulty understanding the experimental procedures and breeding crosses required in mutagenesis screens and linking mutant phenotypes to molecular defects. Performing these experiments themselves often aids students in understanding the methodology. However, there are limitations to performing genetics experiments in a student laboratory. For example, the generation time of laboratory model organisms is considerable, and a laboratory exercise that involves many rounds of breeding or analysis of many mutants is not often feasible. Additionally, the cost of running a laboratory practical, along with safety considerations for particular reagents or protocols, often dictates the experiments that students can perform. To provide an alternative to a traditional laboratory module, we have used Scenario-Based-Learning Interactive (SBLi) software to develop a virtual laboratory to support a second year undergraduate course entitled "Genetic Analysis." This resource allows students to proceed through the steps of a genetics experiment, without the time, cost, or safety constraints of a traditional laboratory exercise.

  3. Using Crash Data to Develop Simulator Scenarios for Assessing Novice Driver Performance.

    PubMed

    McDonald, Catherine C; Tanenbaum, Jason B; Lee, Yi-Ching; Fisher, Donald L; Mayhew, Daniel R; Winston, Flaura K

    2012-01-01

    Teenage drivers are at their highest crash risk in their first 6 months or first 1,000 mi of driving. Driver training, adult-supervised practice driving, and other interventions are aimed at improving driving performance in novice drivers. Previous driver training programs have enumerated thousands of scenarios, with each scenario requiring one or more skills. Although there is general agreement about the broad set of skills needed to become a competent driver, there is no consensus set of scenarios and skills to assess whether novice drivers are likely to crash or to assess the effects of novice driver training programs on the likelihood of a crash. The authors propose that a much narrower, common set of scenarios can be used to focus on the high-risk crashes of young drivers. Until recently, it was not possible to identify the detailed set of scenarios that were specific to high-risk crashes. However, an integration of police crash reports from previous research, a number of critical simulator studies, and a nationally representative database of serious teen crashes (the National Motor Vehicle Crash Causation Survey) now make identification of these scenarios possible. In this paper, the authors propose this novel approach and discuss how to create a common set of simulated scenarios and skills to assess novice driver performance and the effects of training and interventions as they relate to high-risk crashes. PMID:23543947

  4. Supply Chain Vulnerability Analysis Using Scenario-Based Input-Output Modeling: Application to Port Operations.

    PubMed

    Thekdi, Shital A; Santos, Joost R

    2016-05-01

    Disruptive events such as natural disasters, loss or reduction of resources, work stoppages, and emergent conditions have potential to propagate economic losses across trade networks. In particular, disruptions to the operation of container port activity can be detrimental for international trade and commerce. Risk assessment should anticipate the impact of port operation disruptions with consideration of how priorities change due to uncertain scenarios and guide investments that are effective and feasible for implementation. Priorities for protective measures and continuity of operations planning must consider the economic impact of such disruptions across a variety of scenarios. This article introduces new performance metrics to characterize resiliency in interdependency modeling and also integrates scenario-based methods to measure economic sensitivity to sudden-onset disruptions. The methods will be demonstrated on a U.S. port responsible for handling $36.1 billion of cargo annually. The methods will be useful to port management, private industry supply chain planning, and transportation infrastructure management. PMID:26271771
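
    The interdependency modeling this line of work builds on is commonly expressed as an inoperability input-output model, q = A*q + c*, where q is the vector of sector inoperabilities, A* the interdependency matrix, and c* the direct disruption. A minimal sketch under those assumptions, with an invented two-sector economy (port operations and a dependent manufacturing sector) and made-up coefficients:

```python
# q = A* q + c*  =>  sector inoperability q given direct disruption c*
# and interdependency matrix A*.  Solved by fixed-point iteration,
# which converges when the spectral radius of A* is below 1.

def solve_iim(a_star, c_star, iters=200):
    n = len(c_star)
    q = [0.0] * n
    for _ in range(iters):
        q = [c_star[i] + sum(a_star[i][j] * q[j] for j in range(n))
             for i in range(n)]
    return q

# Invented two-sector example: port operations and a manufacturing
# sector that depends on the port.
a_star = [[0.0, 0.1],
          [0.4, 0.0]]
c_star = [0.30, 0.05]   # direct disruption: 30% port, 5% manufacturing
q = solve_iim(a_star, c_star)
```

    The interdependency terms push each sector's total inoperability above its direct disruption, which is precisely the propagation of losses across trade networks that the abstract describes.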

  5. Supply Chain Vulnerability Analysis Using Scenario-Based Input-Output Modeling: Application to Port Operations.

    PubMed

    Thekdi, Shital A; Santos, Joost R

    2016-05-01

    Disruptive events such as natural disasters, loss or reduction of resources, work stoppages, and emergent conditions have potential to propagate economic losses across trade networks. In particular, disruptions to the operation of container port activity can be detrimental for international trade and commerce. Risk assessment should anticipate the impact of port operation disruptions with consideration of how priorities change due to uncertain scenarios and guide investments that are effective and feasible for implementation. Priorities for protective measures and continuity of operations planning must consider the economic impact of such disruptions across a variety of scenarios. This article introduces new performance metrics to characterize resiliency in interdependency modeling and also integrates scenario-based methods to measure economic sensitivity to sudden-onset disruptions. The methods will be demonstrated on a U.S. port responsible for handling $36.1 billion of cargo annually. The methods will be useful to port management, private industry supply chain planning, and transportation infrastructure management.

  6. Late pleistocene ice age scenarios based on observational evidence

    SciTech Connect

    DeBlonde, G.; Peltier, W.R.

    1993-04-01

    Ice age scenarios for the last glacial-interglacial cycle, based on observations of Boyle and Keigwin concerning the North Atlantic thermohaline circulation and of Barnola et al. concerning atmospheric CO2 variations derived from the Vostok ice cores, are herein analyzed. Northern Hemisphere continental ice sheets are simulated with an energy balance model (EBM) that is asynchronously coupled to vertically integrated ice sheet models based on the Glen flow law. The EBM includes both a realistic land-sea distribution and temperature-albedo feedback and is driven with orbital variations of effective solar insolation. With the addition of atmospheric CO2 and ocean heat flux variations, but not in their absence, a complete collapse is obtained for the Eurasian ice sheet but not for the North American ice sheet. We therefore suggest that further feedback mechanisms, perhaps involving more accurate modeling of the dynamics of the mostly marine-based Laurentide complex, appear necessary to explain termination I. 96 refs., 12 figs., 2 tabs.

  7. Treatment of hypogonadotropic male hypogonadism: Case-based scenarios

    PubMed Central

    Crosnoe-Shipley, Lindsey E; Elkelany, Osama O; Rahnema, Cyrus D; Kim, Edward D

    2015-01-01

    The aim of this study is to review four case-based scenarios regarding the treatment of symptomatic hypogonadism in men. The article is designed as a review of published literature. We conducted a PubMed literature search for the period 1989-2014, concentrating on 26 studies investigating the efficacy of various therapeutic options on semen analysis, pregnancy outcomes, time to recovery of spermatogenesis, and serum and intratesticular testosterone levels. Our results demonstrated that exogenous testosterone suppresses intratesticular testosterone production, which is an absolute prerequisite for normal spermatogenesis. Cessation of exogenous testosterone should be recommended for men desiring to maintain their fertility. Therapies that protect the testis involve human chorionic gonadotropin (hCG) therapy or selective estrogen receptor modulators (SERMs), but may also include low-dose hCG with exogenous testosterone. Off-label use of SERMs, such as clomiphene citrate, is effective for maintaining testosterone production long-term and offers the convenience of a safe, oral therapy. At present, routine use of aromatase inhibitors is not recommended based on a lack of long-term data. We concluded that exogenous testosterone supplementation decreases sperm production. It was determined that clomiphene citrate is a safe and effective therapy for men who desire to maintain fertility. Although less frequently used in the general population, hCG therapy with or without testosterone supplementation represents an alternative treatment. PMID:25949938

  8. Diminished Wastewater Treatment: Evaluation of Septic System Performance Under a Climate Change Scenario

    NASA Astrophysics Data System (ADS)

    Cooper, J.; Loomis, G.; Kalen, D.; Boving, T. B.; Morales, I.; Amador, J.

    2015-12-01

    The effects of climate change are expected to reduce the ability of soil-based onsite wastewater treatment systems (OWTS) to treat domestic wastewater. In the northeastern U.S., the projected increase in atmospheric temperature, elevation of water tables from rising sea levels, and heightened precipitation will reduce the volume of unsaturated soil and oxygen available for treatment. Incomplete removal of contaminants may lead to transport of pathogens, nutrients, and biochemical oxygen demand (BOD) to groundwater, increasing the risk to public health and the likelihood of eutrophying aquatic ecosystems. Advanced OWTS, which include pre-treatment steps and provide unsaturated drainfields of greater volume relative to conventional OWTS, are expected to be more resilient to climate change. We used intact soil mesocosms to quantify water quality functions for two advanced shallow narrow drainfield types and a conventional drainfield under a current climate scenario and a moderate climate change scenario of a 30 cm rise in water table and a 5°C increase in soil temperature. While no fecal coliform bacteria (FCB) were released under the current climate scenario, up to 109 CFU FCB/mL (conventional) and up to 20 CFU FCB/mL (shallow narrow) were released under the climate change scenario. Total P removal rates dropped from 100% to 54% (conventional) and 71% (shallow narrow) under the climate change scenario. Total N removal averaged 17% under both climate scenarios in the conventional drainfield, but dropped from 5.4% to 0% in the shallow narrow drainfield under the climate change scenario, with additional leaching of N in excess of inputs indicating release of previously held N. No significant difference was observed between scenarios for BOD removal. The initial data indicate that while advanced OWTS retain more function under the climate change scenario, all three drainfield types experience some diminished treatment capacity.

  9. Scenario details of NPE 2012 - Independent performance assessment by simulated CTBT violation

    NASA Astrophysics Data System (ADS)

    Gestermann, N.; Bönnemann, C.; Ceranna, L.; Ross, O.; Schlosser, C.

    2012-04-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature on 24 September 1996. The technical preparations for monitoring CTBT compliance are moving ahead rapidly, and many efforts have been made since then to establish the verification system. In that regard, the two underground nuclear explosions conducted by the Democratic People's Republic of Korea in 2006 and 2009 were the first real performance tests of the system. In the light of these events, National Data Centres (NDCs) realized the need to become more familiar with the details of the verification regime. The idea of an independent annual exercise to evaluate the processing and analysis procedures applied at the National Data Centres of the CTBT was born at the NDC Evaluation Workshop in Kiev, Ukraine, 2006. The exercises simulate a fictitious violation of the CTBT, and all NDCs are invited to clarify the nature of the selected event. This exercise should help to evaluate the effectiveness of procedures applied at NDCs, as well as the quality, completeness, and usefulness of IDC products. Moreover, the National Data Centres Preparedness Exercise (NPE) is a measure of the readiness of the NDCs to fulfill their CTBT verification duties, including treaty-compliance judgments about the nature of events as natural or artificial and chemical or nuclear, respectively. NPEs have proved to be an efficient tool for testing the performance of the verification system and its elements. In 2007 and 2008, the exercises focused on seismic waveform data analysis. Since 2009, the analysis of infrasound data has been included and additional attention paid to the radionuclide component. In 2010, a realistic noble gas release scenario was selected as the trigger event, as could be expected after an underground nuclear test. The epicenter location of an event from the Reviewed Event Bulletin (REB), unknown to participants of the exercise, was selected as the source of the noble gas

  10. Tracking Systems for Virtual Rehabilitation: Objective Performance vs. Subjective Experience. A Practical Scenario

    PubMed Central

    Lloréns, Roberto; Noé, Enrique; Naranjo, Valery; Borrego, Adrián; Latorre, Jorge; Alcañiz, Mariano

    2015-01-01

    Motion tracking systems are commonly used in virtual reality-based interventions to detect movements in the real world and transfer them to the virtual environment. There are different tracking solutions based on different physical principles, which mainly define their performance parameters. However, special requirements have to be considered for rehabilitation purposes. This paper studies and compares the accuracy and jitter of three tracking solutions (optical, electromagnetic, and skeleton tracking) in a practical scenario and analyzes the subjective perceptions of 19 healthy subjects, 22 stroke survivors, and 14 physical therapists. The optical tracking system provided the best accuracy (1.074 ± 0.417 cm) while the electromagnetic device provided the most inaccurate results (11.027 ± 2.364 cm). However, this tracking solution provided the best jitter values (0.324 ± 0.093 cm), in contrast to the skeleton tracking, which had the worst results (1.522 ± 0.858 cm). Healthy individuals and professionals preferred the skeleton tracking solution over the optical and electromagnetic solutions (in that order). Individuals with stroke chose the optical solution over the other options. Our results show that subjective perceptions and preferences are far from constant across different populations, thus suggesting that these considerations, together with the performance parameters, should also be taken into account when designing a rehabilitation system. PMID:25808765

  11. Tracking systems for virtual rehabilitation: objective performance vs. subjective experience. A practical scenario.

    PubMed

    Lloréns, Roberto; Noé, Enrique; Naranjo, Valery; Borrego, Adrián; Latorre, Jorge; Alcañiz, Mariano

    2015-03-19

    Motion tracking systems are commonly used in virtual reality-based interventions to detect movements in the real world and transfer them to the virtual environment. There are different tracking solutions based on different physical principles, which mainly define their performance parameters. However, special requirements have to be considered for rehabilitation purposes. This paper studies and compares the accuracy and jitter of three tracking solutions (optical, electromagnetic, and skeleton tracking) in a practical scenario and analyzes the subjective perceptions of 19 healthy subjects, 22 stroke survivors, and 14 physical therapists. The optical tracking system provided the best accuracy (1.074 ± 0.417 cm) while the electromagnetic device provided the most inaccurate results (11.027 ± 2.364 cm). However, this tracking solution provided the best jitter values (0.324 ± 0.093 cm), in contrast to the skeleton tracking, which had the worst results (1.522 ± 0.858 cm). Healthy individuals and professionals preferred the skeleton tracking solution over the optical and electromagnetic solutions (in that order). Individuals with stroke chose the optical solution over the other options. Our results show that subjective perceptions and preferences are far from constant across different populations, thus suggesting that these considerations, together with the performance parameters, should also be taken into account when designing a rehabilitation system.
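    The accuracy and jitter figures reported above can be reproduced from raw tracker samples with simple statistics. The Python sketch below uses hypothetical static-marker data: it takes accuracy as the mean Euclidean error against ground truth and jitter as the standard deviation of that error, one common operationalization (the paper's exact definitions may differ).

    ```python
    import statistics

    def accuracy_and_jitter(measured, ground_truth):
        """Mean per-sample position error (accuracy) and its spread (jitter).

        measured / ground_truth: lists of (x, y, z) positions in cm.
        """
        errors = [
            sum((m - g) ** 2 for m, g in zip(mp, gp)) ** 0.5
            for mp, gp in zip(measured, ground_truth)
        ]
        return statistics.mean(errors), statistics.stdev(errors)

    # Hypothetical samples from a static marker held at the origin:
    truth = [(0.0, 0.0, 0.0)] * 4
    meas = [(1.0, 0.0, 0.0), (1.2, 0.0, 0.0), (0.8, 0.0, 0.0), (1.0, 0.0, 0.0)]
    acc, jit = accuracy_and_jitter(meas, truth)  # acc = 1.0 cm
    ```

    A real evaluation would average such figures over many trajectories and subjects rather than a single static hold.
    
    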

  12. Long Pulse High Performance Plasma Scenario Development for the National Spherical Torus Experiment

    SciTech Connect

    Kessel, C.E.; Bell, R.E.; Bell, M.G.; Gates, D.A.; Harvey, R.W.

    2006-01-01

    The National Spherical Torus Experiment [Ono et al., Nucl. Fusion, 44, 452 (2004)] is targeting long-pulse, high-performance, noninductively sustained operation at low aspect ratio, and the demonstration of nonsolenoidal startup and current rampup. The modeling of these plasmas provides a framework for experimental planning and identifies the tools to access these regimes. Simulations based on neutral beam injection (NBI)-heated plasmas are made to understand the impact of various modifications and to identify the requirements for (1) high elongation and triangularity, (2) density control to optimize the current drive, (3) plasma rotation and/or feedback stabilization to operate above the no-wall limit, and (4) electron Bernstein waves (EBW) for off-axis heating/current drive (H/CD). Integrated scenarios are constructed to provide the transport evolution and H/CD source modeling, supported by rf and stability analyses. Important factors include the energy confinement, Zeff, early heating/H-mode, broadening of the NBI-driven current profile, and maintaining q(0) and qmin>1.0. Simulations show that noninductively sustained plasmas can be reached at IP=800 kA, BT=0.5 T, κ=2.5, βN=5, βT=15%, fNI=92%, and q(0)>1.0 with NBI H/CD, density control, and global energy confinement similar to experiments. Noninductively sustained high-β plasmas can be reached at IP=1.0 MA, BT=0.35 T, κ=2.5, βN=9, βT=43%, fNI=100%, and q(0)>1.5 with NBI H/CD and 3.0 MW of EBW H/CD, density control, and 25% higher global energy confinement than experiments. A scenario for nonsolenoidal plasma current rampup is developed using high harmonic fast wave H/CD in the early low-IP, low-Te phase, followed by NBI H/CD to continue the current ramp, reaching a maximum of 480 kA after 3.4 s.

  13. Logistics of a Lunar Based Solar Power Satellite Scenario

    NASA Technical Reports Server (NTRS)

    Melissopoulos, Stefanos

    1995-01-01

    A logistics system comprised of two orbital stations for the support of a 500 GW space power satellite scenario in a geostationary orbit was investigated in this study. A subsystem mass model, a mass flow model and a life cycle cost model were developed. The results regarding logistics cost and burden rates show that the transportation cost contributed the most (96%) to the overall cost of the scenario. The orbital stations at a geostationary and at a lunar orbit contributed 4 % to that cost.

  14. Virtual screening applications: a study of ligand-based methods and different structure representations in four different scenarios.

    PubMed

    Hristozov, Dimitar P; Oprea, Tudor I; Gasteiger, Johann

    2007-01-01

    Four different ligand-based virtual screening scenarios are studied: (1) prioritizing compounds for subsequent high-throughput screening (HTS); (2) selecting a predefined (small) number of potentially active compounds from a large chemical database; (3) assessing the probability that a given structure will exhibit a given activity; (4) selecting the most active structure(s) for a biological assay. Each of the four scenarios is exemplified by performing retrospective ligand-based virtual screening for eight different biological targets using two large databases, MDDR and WOMBAT. A comparison between the chemical spaces covered by these two databases is presented. The performance of two techniques for ligand-based virtual screening is investigated: similarity search with subsequent data fusion (SSDF) and novelty detection with Self-Organizing Maps (ndSOM). Three different structure representations are compared: 2,048-dimensional Daylight fingerprints, topological autocorrelation weighted by atomic physicochemical properties (sigma electronegativity, polarizability, partial charge, and identity), and radial distribution functions weighted by the same atomic physicochemical properties. Both methods were found applicable in scenario one. The similarity search was found to perform slightly better in scenario two, while the SOM novelty detection is preferred in scenario three. No method/descriptor combination achieved significant success in scenario four.
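    Of the two techniques compared, similarity search with data fusion is the more easily sketched. The following Python fragment is an illustrative toy, not the authors' code: it scores a candidate against a set of known actives by Tanimoto similarity over binary fingerprints (represented here as sets of on-bit indices) and fuses the per-active scores with the MAX rule, one common fusion choice.

    ```python
    def tanimoto(fp_a, fp_b):
        """Tanimoto coefficient between two fingerprints given as sets of on-bits."""
        union = len(fp_a | fp_b)
        return len(fp_a & fp_b) / union if union else 0.0

    def max_fusion_score(candidate, actives):
        """MAX-rule data fusion: score a candidate by its best similarity
        to any known active structure."""
        return max(tanimoto(candidate, a) for a in actives)

    # Toy fingerprints (on-bit index sets, purely illustrative):
    actives = [{1, 2, 3, 4}, {2, 3, 5}]
    candidate = {1, 2, 3}
    score = max_fusion_score(candidate, actives)  # 3/4 against the first active
    ```

    Ranking a database by this fused score and taking the top of the list corresponds to scenario two above.
    
    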

  15. Scenario-Based Spoken Interaction with Virtual Agents

    ERIC Educational Resources Information Center

    Morton, Hazel; Jack, Mervyn A.

    2005-01-01

    This paper describes a CALL approach which integrates software for speaker independent continuous speech recognition with embodied virtual agents and virtual worlds to create an immersive environment in which learners can converse in the target language in contextualised scenarios. The result is a self-access learning package: SPELL (Spoken…

  16. Designing, Developing and Implementing a Software Tool for Scenario Based Learning

    ERIC Educational Resources Information Center

    Norton, Geoff; Taylor, Mathew; Stewart, Terry; Blackburn, Greg; Jinks, Audrey; Razdar, Bahareh; Holmes, Paul; Marastoni, Enrique

    2012-01-01

    The pedagogical value of problem-based and inquiry-based learning activities has led to increased use of this approach in many courses. While scenarios or case studies were initially presented to learners as text-based material, the development of modern software technology provides the opportunity to deliver scenarios as e-learning modules,…

  17. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios which best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics of a given region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computational power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture, by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology could speed up full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes; however, it requires far less
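    The Monte-Carlo catalogue step can be illustrated compactly. The Python sketch below draws magnitudes from an untruncated Gutenberg-Richter distribution by inverse-transform sampling; it is a minimal stand-in for the paper's full simulator (which also samples location, focal depth and fault characteristics), and the minimum magnitude and b-value used here are illustrative assumptions, not values from the study.

    ```python
    import math
    import random

    def sample_magnitudes(n, m_min, b_value, seed=0):
        """Draw n magnitudes from an untruncated Gutenberg-Richter law.

        F(M) = 1 - 10**(-b * (M - m_min)), so inverting a uniform draw u
        gives M = m_min - log10(1 - u) / b.
        """
        rng = random.Random(seed)
        return [m_min - math.log10(1.0 - rng.random()) / b_value for _ in range(n)]

    # Illustrative catalogue: 10,000 events above magnitude 4.0 with b = 1.0
    mags = sample_magnitudes(10_000, m_min=4.0, b_value=1.0)
    ```

    For this law the mean magnitude is m_min + 1/(b·ln 10) ≈ m_min + 0.434, a quick sanity check on the sample.
    
    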

  18. Earthquake scenarios based on lessons from the past

    NASA Astrophysics Data System (ADS)

    Solakov, Dimcho; Simeonova, Stella; Aleksandrova, Irena; Popova, Iliana

    2010-05-01

    Earthquakes are the most deadly of the natural disasters affecting the human environment; indeed, catastrophic earthquakes have marked the whole of human history. Global seismic hazard and vulnerability to earthquakes are increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of megacities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, and is undertaken with insufficient knowledge of the peculiarities of regional seismicity and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios is the first link in the prevention chain and the first step in the evaluation of seismic risk. The implementation of earthquake scenarios in policies for seismic risk reduction will allow focusing on the prevention of earthquake effects rather than on intervention following disasters. The territory of Bulgaria (situated in the eastern part of the Balkan Peninsula) is a typical example of a high-seismic-risk area. Over the centuries, Bulgaria has experienced strong earthquakes. At the beginning of the 20th century (from 1901 to 1928), five earthquakes with magnitude larger than or equal to MS=7.0 occurred in Bulgaria. However, no such large earthquakes have occurred in Bulgaria since 1928, which may induce non-professionals to underestimate the earthquake risk. The 1986 earthquake of magnitude MS=5.7, which occurred in central northern Bulgaria (near the town of Strazhitsa), is the strongest quake since 1928. Moreover, the seismicity of neighboring countries, like Greece, Turkey, the former Yugoslavia and Romania (especially the Vrancea, Romania intermediate-depth earthquakes), influences the seismic hazard in Bulgaria. In the present study, deterministic scenarios (expressed in seismic intensity) for two Bulgarian cities (Rouse and Plovdiv) are presented. 
The work on

  19. Robust Performance of Marginal Pacific Coral Reef Habitats in Future Climate Scenarios.

    PubMed

    Freeman, Lauren A

    2015-01-01

    Coral reef ecosystems are under dual threat from climate change. Increasing sea surface temperatures and thermal stress create environmental limits at low latitudes, and decreasing aragonite saturation state creates environmental limits at high latitudes. This study examines the response of unique coral reef habitats to climate change in the remote Pacific, using the National Center for Atmospheric Research Community Earth System Model version 1 alongside the species distribution algorithm Maxent. Narrow ranges of physico-chemical variables are used to define unique coral habitats and their performance is tested in future climate scenarios. General loss of coral reef habitat is expected in future climate scenarios and has been shown in previous studies. This study found exactly that for most of the predominant physico-chemical environments. However, certain coral reef habitats considered marginal today at high latitude, along the equator and in the eastern tropical Pacific were found to be quite robust in climate change scenarios. Furthermore, an environmental coral reef refuge previously identified in the central south Pacific near French Polynesia was further reinforced. Studying the response of specific habitats showed that the prevailing conditions of this refuge during the 20th century shift to a new set of conditions, more characteristic of higher latitude coral reefs in the 20th century, in future climate scenarios projected to 2100.

  20. Robust Performance of Marginal Pacific Coral Reef Habitats in Future Climate Scenarios

    PubMed Central

    Freeman, Lauren A.

    2015-01-01

    Coral reef ecosystems are under dual threat from climate change. Increasing sea surface temperatures and thermal stress create environmental limits at low latitudes, and decreasing aragonite saturation state creates environmental limits at high latitudes. This study examines the response of unique coral reef habitats to climate change in the remote Pacific, using the National Center for Atmospheric Research Community Earth System Model version 1 alongside the species distribution algorithm Maxent. Narrow ranges of physico-chemical variables are used to define unique coral habitats and their performance is tested in future climate scenarios. General loss of coral reef habitat is expected in future climate scenarios and has been shown in previous studies. This study found exactly that for most of the predominant physico-chemical environments. However, certain coral reef habitats considered marginal today at high latitude, along the equator and in the eastern tropical Pacific were found to be quite robust in climate change scenarios. Furthermore, an environmental coral reef refuge previously identified in the central south Pacific near French Polynesia was further reinforced. Studying the response of specific habitats showed that the prevailing conditions of this refuge during the 20th century shift to a new set of conditions, more characteristic of higher latitude coral reefs in the 20th century, in future climate scenarios projected to 2100. PMID:26053439

  1. Scenarios and performance measures for advanced ISDN satellite design and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1991-01-01

    Described here are the contemplated input and expected output for the Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) and Full Service ISDN Satellite (FSIS) Models. The discrete event simulations of these models are presented with specific scenarios that stress ISDN satellite parameters. Performance measure criteria are presented for evaluating the advanced ISDN communication satellite designs of the NASA Satellite Communications Research (SCAR) Program.

  2. Graph-Based Comparisons of Scenarios in Intelligence Analysis

    SciTech Connect

    Chin, George; Kuchar, Olga A.; Whitney, Paul D.; Powers, Mary E.; Johnson, Katherine E.

    2004-10-13

    In the role of detecting and preventing strategic surprise, intelligence analysts rely heavily on history as they refer to past cases and events to find similarities and distinctions to a current situation or threat. Analysts’ ability to identify similar cases and events, however, is very limited because the intelligence data available for searching are largely disconnected and limited in context. As a result, analysts searching intelligence data often resort to searching by keyword or timestamp. To make better use of intelligence data, analysts should not only be able to construct hypotheses of a current situation or threat but also reconstruct the full context of past events such that those events may then be effectively compared to the current situation and hypotheses. In constructing the context of a situation, case, or hypothesis, analysts decipher patterns and relationships among many different kinds and sources of information. Analysts identify these patterns and relationships through different kinds of analyses. For instance, analysts may conduct link analysis to examine different kinds of relationships, geographical analysis to look at spatial positioning and grouping, timeline analysis to lay out chronological events, and visual analysis to identify clusters of similar information. What these different kinds of analyses provide are analytical frameworks or models for organizing intelligence data along key properties. Unfortunately, these higher-level analytical models are transient: they disappear when their associated analysis tools stop executing. Consequently, analysts cannot easily associate their analyses with their intelligence data in a way that facilitates the integration of different analyses into a larger picture or the comparison of analytical models to other cases. On the Scenario and Knowledge Framework for Analytical Modelling (SKFAM) project at the Pacific Northwest National Laboratory, we are researching and prototyping a framework that

  3. On the Performance of Video Quality Assessment Metrics under Different Compression and Packet Loss Scenarios

    PubMed Central

    Martínez-Rach, Miguel O.; Piñol, Pablo; López, Otoniel M.; Perez Malumbres, Manuel; Oliver, José; Calafate, Carlos Tavares

    2014-01-01

    When comparing the performance of video coding approaches, evaluating different commercial video encoders, or measuring the perceived video quality in a wireless environment, rate/distortion analysis is commonly used, where distortion is usually measured in terms of PSNR values. However, PSNR does not always capture the distortion perceived by a human being. As a consequence, significant efforts have focused on defining an objective video quality metric that is able to assess quality in the same way as a human does. We perform a study of some available objective quality assessment metrics in order to evaluate their behavior in two different scenarios. First, we deal with video sequences compressed by different encoders at different bitrates in order to properly measure the video quality degradation associated with the encoding system. In addition, we evaluate the behavior of the quality metrics when measuring video distortions produced by packet losses in mobile ad hoc network scenarios with variable degrees of network congestion and node mobility. Our purpose is to determine if the analyzed metrics can replace the PSNR while comparing, designing, and evaluating video codec proposals, and, in particular, under video delivery scenarios characterized by bursty and frequent packet losses, such as wireless multihop environments. PMID:24982988

  4. On the performance of video quality assessment metrics under different compression and packet loss scenarios.

    PubMed

    Martínez-Rach, Miguel O; Piñol, Pablo; López, Otoniel M; Perez Malumbres, Manuel; Oliver, José; Calafate, Carlos Tavares

    2014-01-01

    When comparing the performance of video coding approaches, evaluating different commercial video encoders, or measuring the perceived video quality in a wireless environment, rate/distortion analysis is commonly used, where distortion is usually measured in terms of PSNR values. However, PSNR does not always capture the distortion perceived by a human being. As a consequence, significant efforts have focused on defining an objective video quality metric that is able to assess quality in the same way as a human does. We perform a study of some available objective quality assessment metrics in order to evaluate their behavior in two different scenarios. First, we deal with video sequences compressed by different encoders at different bitrates in order to properly measure the video quality degradation associated with the encoding system. In addition, we evaluate the behavior of the quality metrics when measuring video distortions produced by packet losses in mobile ad hoc network scenarios with variable degrees of network congestion and node mobility. Our purpose is to determine if the analyzed metrics can replace the PSNR while comparing, designing, and evaluating video codec proposals, and, in particular, under video delivery scenarios characterized by bursty and frequent packet losses, such as wireless multihop environments.
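    The PSNR baseline these metrics are measured against is a fixed formula: PSNR = 10·log10(MAX²/MSE), with MAX the peak pixel value (255 for 8-bit video). A minimal Python sketch over flattened toy frames (the pixel values are illustrative, not from the study):

    ```python
    import math

    def psnr(reference, distorted, max_value=255):
        """Peak signal-to-noise ratio in dB between two equal-length
        pixel sequences: 10 * log10(MAX**2 / MSE)."""
        mse = sum((r - d) ** 2 for r, d in zip(reference, distorted)) / len(reference)
        if mse == 0:
            return float("inf")  # identical frames
        return 10.0 * math.log10(max_value ** 2 / mse)

    # Toy 8-bit "frames" flattened to 1-D; every pixel is off by one,
    # so MSE = 1 and PSNR = 10 * log10(255**2) ≈ 48.13 dB.
    ref = [100, 120, 130, 140]
    dist = [101, 119, 131, 139]
    value = psnr(ref, dist)
    ```

    Perceptual metrics aim to improve on exactly this quantity, since equal MSE can correspond to very different perceived quality.
    
    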

  5. Performance Analysis of Ad Hoc Routing Protocols in City Scenario for VANET

    NASA Astrophysics Data System (ADS)

    Das, Sanjoy; Raw, Ram Shringar; Das, Indrani

    2011-12-01

    In this paper, a performance analysis of the Location Aided Routing (LAR), AODV and DSR protocols in city scenarios has been carried out. The mobility model considered is the Manhattan model. This mobility model is used to emulate the movement pattern of nodes, i.e., vehicles on streets defined by maps. Our objective is to provide a comparative analysis among the LAR, AODV and DSR protocols in city scenarios in Vehicular Ad hoc Networks. The simulation work has been conducted using the Glomosim 2.03 simulator. The results show that the LAR1 protocol achieves a maximum packet delivery ratio of 100% in a sparsely populated network. The delay is highest for AODV (121.88 ms) when there are 10 nodes in the network. The results show that LAR1 outperforms DSR and AODV in terms of packet delivery ratio and end-to-end delay.
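    The two reported metrics are straightforward to compute from simulation traces. A minimal Python sketch with illustrative numbers (not taken from the paper):

    ```python
    def packet_delivery_ratio(sent, received):
        """Packet delivery ratio: fraction of sent packets that arrived."""
        return received / sent if sent else 0.0

    def mean_end_to_end_delay(delays_ms):
        """Average end-to-end delay (ms) over successfully delivered packets."""
        return sum(delays_ms) / len(delays_ms)

    # Hypothetical per-run trace from a simulator:
    pdr = packet_delivery_ratio(sent=1000, received=950)       # 0.95
    delay = mean_end_to_end_delay([110.0, 120.0, 130.0])        # 120.0 ms
    ```

    Simulators such as GloMoSim report these per run; comparisons like the one above average them over many runs and node densities.
    
    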

  6. A Problem-Based Learning Scenario That Can Be Used in Science Teacher Education

    ERIC Educational Resources Information Center

    Sezgin Selçuk, Gamze

    2015-01-01

    The purpose of this study is to introduce a problem-based learning (PBL) scenario that elementary school science teachers in middle school (5th-8th grades) can use in their in-service training. The scenario treats the subjects of heat, temperature and thermal expansion within the scope of the 5th and 6th grade science course syllabi and has been…

  7. Work-Based Learning in the UK: Scenarios for the Future

    ERIC Educational Resources Information Center

    Mohamud, Mohamed; Jennings, Chris; Rix, Mike; Gold, Jeff

    2006-01-01

    Purpose: Aims to consider scenarios created by work-based learning (WBL) providers in the Tees Valley in the UK. Design/methodology/approach: The context of WBL is examined in relation to the notion of the skills gap. The method of scenario development is described. Findings: A key task of WBL is to raise the skills levels of young people. WBL…

  8. A Scenario-Based Dieting Self-Efficacy Scale: The DIET-SE

    ERIC Educational Resources Information Center

    Stich, Christine; Knauper, Barbel; Tint, Ami

    2009-01-01

    The article discusses a scenario-based dieting self-efficacy scale, the DIET-SE, developed from dieter's inventory of eating temptations (DIET). The DIET-SE consists of items that describe scenarios of eating temptations for a range of dieting situations, including high-caloric food temptations. Four studies assessed the psychometric properties of…

  9. Scenario-based water resources planning for utilities in the Lake Victoria region

    NASA Astrophysics Data System (ADS)

    Mehta, V. K.; Aslam, O.; Dale, L.; Miller, N.; Purkey, D.

    2010-12-01

    Cities in the Lake Victoria (LV) region are experiencing the highest growth rates in Africa, at the same time that their water resource is threatened by domestic waste and industrial pollution. Urban centers use local springs, wetlands and Lake Victoria as source waters. As efforts to meet increasing demand accelerate, integrated water resources management (IWRM) tools provide opportunities for utilities and other stakeholders to develop a planning framework comprehensive enough to include short-term (e.g. landuse change) as well as longer-term (e.g. climate change) scenarios. This paper presents IWRM models built using the Water Evaluation And Planning (WEAP) decision support system for three pilot towns in the LV region: Bukoba (Tanzania), Masaka (Uganda), and Kisii (Kenya). Their current populations are 100,000, 70,000 and 200,000, respectively. Demand coverage is ~70% in Masaka and Bukoba, and less than 50% in Kisii. The IWRM models for each town were calibrated to current system performance based on site visits, utility reporting and interviews. Projected water supply, demand, revenues and costs were then evaluated against a combination of climate, demographic and infrastructure scenarios up to 2050. In Masaka, flow and climate data were available to calibrate a runoff model to simulate streamflow at the water intake. Without considering climate change, the Masaka system is infrastructure-limited, not water-availability (hydrology) limited, until 2035, under a projected population growth rate of 2.17%. Under a wet climate scenario as projected by GCMs for the LV region, the current wetland source could supply all expected demands until 2050. Even under a drought scenario, the wetland could supply all demand until 2032, if the supply infrastructure is updated at an estimated cost of USD 10.8 million. However, demand targets can only be met at the expense of almost no water returning to the wetland downstream of the intake by 2035, unless substantial investments

  10. Scenario-based water resources planning for utilities in the Lake Victoria region

    NASA Astrophysics Data System (ADS)

    Mehta, Vishal K.; Aslam, Omar; Dale, Larry; Miller, Norman; Purkey, David R.

    Urban areas in the Lake Victoria (LV) region are experiencing the highest growth rates in Africa. As efforts to meet increasing demand accelerate, integrated water resources management (IWRM) tools provide opportunities for utilities and other stakeholders to develop a planning framework comprehensive enough to include short-term (e.g. landuse change) as well as longer-term (e.g. climate change) scenarios. This paper presents IWRM models built using the Water Evaluation And Planning (WEAP) decision support system for three towns in the LV region: Bukoba (Tanzania), Masaka (Uganda), and Kisii (Kenya). Each model was calibrated to current system performance based on site visits, utility reporting and interviews. Projected water supply, demand, revenues and costs were then evaluated against a combination of climate, demographic and infrastructure scenarios up to 2050. Our results show that water supply in all three towns is currently infrastructure-limited; achieving existing design capacity could meet most projected demand until the 2020s in Masaka, beyond which new supply and conservation strategies would be needed. In Bukoba, reducing leakages would provide little performance improvement in the short term, but doubling capacity would meet all demands until 2050. In Kisii, major infrastructure investment is urgently needed. In Masaka, streamflow simulations show that wetland sources could satisfy all demand until 2050, but at the cost of almost no water downstream of the intake. These models demonstrate the value of IWRM tools for developing water management plans that integrate hydroclimatology-driven supply and demand projections on a single platform.

  11. Optimizing performance of hybrid FSO/RF networks in realistic dynamic scenarios

    NASA Astrophysics Data System (ADS)

    Llorca, Jaime; Desai, Aniket; Baskaran, Eswaran; Milner, Stuart; Davis, Christopher

    2005-08-01

    Hybrid Free Space Optical (FSO) and Radio Frequency (RF) networks promise highly available wireless broadband connectivity and quality of service (QoS), particularly suitable for emerging network applications involving extremely high data rate transmissions such as high-quality video-on-demand and real-time surveillance. FSO links are prone to atmospheric obscuration (fog, clouds, snow, etc.) and are difficult to align over long distances due to the use of narrow laser beams and the effect of atmospheric turbulence. These problems can be mitigated by using adjunct directional RF links, which provide backup connectivity. In this paper, methodologies for modeling and simulation of hybrid FSO/RF networks are described. Individual link propagation models are derived using scattering theory, as well as experimental measurements. MATLAB is used to generate realistic atmospheric obscuration scenarios, including moving cloud layers at different altitudes. These scenarios are then imported into a network simulator (OPNET) to emulate mobile hybrid FSO/RF networks. This framework allows accurate analysis of the effects of node mobility, atmospheric obscuration and traffic demands on network performance, and precise evaluation of topology reconfiguration algorithms as they react to dynamic changes in the network. Results show how topology reconfiguration algorithms, together with enhancements to TCP/IP protocols which reduce the network response time, enable the network to rapidly detect and act upon link state changes in highly dynamic environments, ensuring optimized network performance and availability.

  12. Accessing technical data bases using STDS: A collection of scenarios

    NASA Technical Reports Server (NTRS)

    Hardgrave, W. T.

    1975-01-01

    A line-by-line description is given of sessions using the set-theoretic data system (STDS) to interact with technical data bases. The data bases contain data from actual applications at NASA Langley Research Center. The report is meant to be a tutorial document that accompanies set processing in a network environment.

  13. Review of scenario selection approaches for performance assessment of high-level waste repositories and related issues.

    SciTech Connect

    Banano, E.J.; Baca, R.G.

    1995-08-01

    The selection of scenarios representing plausible realizations of the future conditions (with associated probabilities of occurrence) that can affect the long-term performance of a high-level radioactive waste (HLW) repository is the commonly used method for treating the uncertainty in the prediction of the future states of the system. This method, conventionally referred to as the "scenario approach," while common, is not the only method to deal with this uncertainty; other methods, such as the environmental simulation approach (ESA), have also been proposed. Two of the difficulties with the scenario approach are the lack of uniqueness in the definition of the term "scenario" and the lack of uniqueness in the approach to formulating scenarios, which relies considerably on subjective judgments. Consequently, it is difficult to assure that a complete and unique set of scenarios can be defined for use in a performance assessment. Because scenarios are key to the determination of the long-term performance of the repository system, this lack of uniqueness can present a considerable challenge when attempting to reconcile the set of scenarios, and their level of detail, obtained using different approaches, particularly among proponents and regulators of a HLW repository.

  14. Mannich Bases: An Important Pharmacophore in Present Scenario

    PubMed Central

    Sharma, Neha; Kajal, Anu; Saini, Vipin

    2014-01-01

    Mannich bases are the end products of the Mannich reaction and are known as beta-amino ketone carrying compounds. The Mannich reaction is a carbon-carbon bond forming nucleophilic addition reaction and is a key step in the synthesis of a wide variety of natural products, pharmaceuticals, and so forth. The Mannich reaction is important for the construction of nitrogen-containing compounds. There are a number of aminoalkyl-chain-bearing Mannich bases, like fluoxetine, atropine, ethacrynic acid, trihexyphenidyl, and so forth, with high curative value. The literature enlightens the fact that Mannich bases are very reactive and are recognized to possess potent, diverse activities such as anti-inflammatory, anticancer, antifilarial, antibacterial, antifungal, anticonvulsant, anthelmintic, antitubercular, analgesic, anti-HIV, antimalarial, antipsychotic, and antiviral activities. The biological activity of Mannich bases is mainly attributed to the α,β-unsaturated ketone, which can be generated by deamination of a hydrogen atom of the amine group. PMID:25478226

  15. Scenario based approach for multiple source Tsunami Hazard assessment for Sines, Portugal

    NASA Astrophysics Data System (ADS)

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-08-01

    In this paper, we present a scenario-based approach for tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE. Sines holds one of the most important deep-water ports, which contains oil-bearing, petrochemical, liquid bulk, coal and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model With Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level) and MHHW (mean higher high water). For each scenario, inundation is described by maximum values of wave height, flow depth, drawback, runup and inundation distance. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite of the Horseshoe and Marques Pombal faults as the worst case scenario. It governs the aggregate scenario with about 60 % and inundates an area of 3.5 km².
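
    The "aggregate scenario" idea above (a cell-wise worst case over the individual scenarios, with a static tide level added on top) can be sketched as follows; the grids and the tide offset are toy values, not the Sines results:

```python
import numpy as np

def aggregate_scenario(height_grids, tide_offset=0.0):
    """Cell-wise maximum of per-scenario maximum wave-height grids, with a
    static tide offset (e.g. MHHW instead of MSL) added on top."""
    return np.stack(height_grids).max(axis=0) + tide_offset

# Two toy 2x2 scenario grids (metres) and a +0.5 m high-tide stage.
s1 = np.array([[1.0, 2.0], [0.5, 3.0]])
s2 = np.array([[2.5, 1.0], [0.2, 2.0]])
agg = aggregate_scenario([s1, s2], tide_offset=0.5)
```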

  16. Performance-Based Assessment

    ERIC Educational Resources Information Center

    ERIC Review, 1994

    1994-01-01

    "The ERIC Review" is published three times a year and announces research results, publications, and new programs relevant to each issue's theme topic. This issue explores performance-based assessment via two principal articles: "Performance Assessment" (Lawrence M. Rudner and Carol Boston); and "Alternative Assessment: Implications for Social…

  17. River discharge and flood inundation over the Amazon based on IPCC AR5 scenarios

    NASA Astrophysics Data System (ADS)

    Paiva, Rodrigo; Sorribas, Mino; Jones, Charles; Carvalho, Leila; Melack, John; Bravo, Juan Martin; Beighley, Edward

    2015-04-01

    Climate change and related effects on the hydrologic regime of the Amazon River basin could have major impacts on human and ecological communities, including issues with transportation, flood vulnerability, fisheries and hydropower generation. We examined future changes in discharge and floodplain inundation within the Amazon River basin. We used the hydrological model MGB-IPH (Modelo de Grandes Bacias - Instituto de Pesquisas Hidráulicas) coupled with a 1D river hydrodynamic model simulating water storage over the floodplains. The model was forced using satellite-based precipitation from the TRMM 3B42 dataset, and it performed well when validated against discharge and stage measurements as well as remotely sensed data, including radar altimetry-based water levels, gravity anomaly-based terrestrial water storage and flood inundation extent. Future scenarios of precipitation and other relevant climatic variables for the 2070 to 2100 time period were taken from five coupled atmosphere-ocean general circulation models (AOGCMs) from IPCC's Fifth Assessment Report (AR5) Coupled Model Intercomparison Project Phase 5 (CMIP5). The climate models were chosen based on their ability to represent the main aspects of recent (1970 to 2000) Amazon climate. A quantile-quantile bias removal procedure was applied to climate model precipitation to mitigate unreliable predictions. The hydrologic model was then forced using past observed climate data altered by delta change factors derived from the past and future climate model runs, aiming to estimate projected discharge and floodplain inundation under the climate change scenarios at several control points in the basin. The climate projections present large uncertainty, especially in the precipitation rate, and predictions using different climate models do not agree on the sign of changes in total Amazon flood extent or discharge along the main stem of the Amazon River. However, analyses of results at different regions indicate an increase

  18. Scenario-Based Programming, Usability-Oriented Perception

    ERIC Educational Resources Information Center

    Alexandron, Giora; Armoni, Michal; Gordon, Michal; Harel, David

    2014-01-01

    In this article, we discuss the possible connection between the programming language and the paradigm behind it, and programmers' tendency to adopt an external or internal perspective of the system they develop. Based on a qualitative analysis, we found that when working with the visual, interobject language of live sequence charts (LSC),…

  19. 77 FR 48107 - Workshop on Performance Assessments of Near-Surface Disposal Facilities: FEPs Analysis, Scenario...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ...-Surface Disposal Facilities: FEPs Analysis, Scenario and Conceptual Model Development, and Code Selection...-level radioactive waste (LLW) disposal facilities. The workshop has been developed to facilitate...) the development of scenarios and conceptual models, and (3) the selection of computer...

  20. ABM and GIS-based multi-scenarios volcanic evacuation modelling of Merapi

    NASA Astrophysics Data System (ADS)

    Jumadi; Carver, Steve; Quincey, Duncan

    2016-05-01

    Conducting an effective evacuation is one of the keys to dealing with such a crisis. Therefore, a plan that considers the probability of the spatial extent of hazard occurrences is needed. Likewise, the evacuation plan for Merapi was already prepared before the eruption in 2010. However, the plan could not be executed because the eruption magnitude was bigger than predicted, and the hazardous area extended beyond the prepared hazard model. Managing such an unpredicted situation needs adequate information that is flexible and adaptable to the current situation. Therefore, we applied an Agent-Based Model (ABM) and a Geographic Information System (GIS) using multi-scenario hazard models to support evacuation management. The methodology and a case study of Merapi are provided.

  1. Ontology-based Software for Generating Scenarios for Characterizing Searches for Nuclear Materials

    SciTech Connect

    Ward, Richard C; Sorokine, Alexandre; Schlicher, Bob G; Wright, Michael C; Kruse, Kara L

    2011-01-01

    A software environment was created in which ontologies are used to significantly expand the number and variety of scenarios for special nuclear materials (SNM) detection based on a set of simple generalized initial descriptions. A framework was built that combined advanced reasoning from ontologies with geographical and other data sources to generate a much larger list of specific detailed descriptions from a simple initial set of user-input variables. This presentation shows how basing the scenario generation on a process of inferencing from multiple ontologies, including a new SNM Detection Ontology (DO) combined with data extraction from geodatabases, provided the desired significant variability of scenarios for testing search algorithms, including unique combinations of variables not previously expected. The various components of the software environment and the resulting scenarios generated will be discussed.

  2. Situation-Based Access Control: privacy management via modeling of patient data access scenarios.

    PubMed

    Peleg, Mor; Beimel, Dizza; Dori, Dov; Denekamp, Yaron

    2008-12-01

    Access control is a central problem in privacy management. A common practice in controlling access to sensitive data, such as electronic health records (EHRs), is Role-Based Access Control (RBAC). RBAC is limited as it does not account for the circumstances under which access to sensitive data is requested. Following a qualitative study that elicited access scenarios, we used Object-Process Methodology to structure the scenarios and conceive a Situation-Based Access Control (SitBAC) model. SitBAC is a conceptual model that defines scenarios in which access to patient data is permitted or denied. The main concept underlying this model is the Situation Schema, which is a pattern consisting of the entities Data-Requestor, Patient, EHR, Access Task, Legal-Authorization, and Response, along with their properties and relations. The various data access scenarios are expressed via Situation Instances. While we focus on the medical domain, the model is generic and can be adapted to other domains.

  3. Design Scenarios for Web-Based Management of Online Information

    NASA Astrophysics Data System (ADS)

    Hepting, Daryl H.; Maciag, Timothy

    The Internet enables access to more information, from a greater variety of perspectives and with greater immediacy, than ever before. A person may be interested in information to become more informed or to coordinate his or her local activities and place them into a larger, more global context. The challenge, as has been noted by many, is to sift through all the information to find what is relevant without becoming overwhelmed. Furthermore, the selected information must be put into an actionable form. The diversity of the Web has important consequences for the variety of ideas that are now available. While people once relied on newspaper editors to shape their view of the world, today's technology creates room for a more democratic approach. Today it is easy to pull news feeds from a variety of sources and aggregate them. It is less easy to push that information to a variety of channels. At a higher level, we might have the goal of collecting all the available information about a certain topic, on a daily basis. There are many new technologies available under the umbrella of Web 2.0, but it can be difficult to use them together for the management of online information. Web-based support for online communication management is the most appropriate choice to address the deficiencies apparent with current technologies. We consider the requirements and potential designs for such information management support, by following an example related to local food.

  4. Nanocarriers Based Anticancer Drugs: Current Scenario and Future Perceptions.

    PubMed

    Raj, Rakesh; Mongia, Pooja; Kumar Sahu, Suresh; Ram, Alpana

    2016-01-01

    Anticancer therapies mostly depend on the ability of the bioactives to reach their designated cellular and subcellular target sites, while minimizing accumulation and side effects at nonspecific sites. The development of nanotechnology-based drug delivery systems that are able to modify the biodistribution, tissue uptake and pharmacokinetics of therapeutic agents is considered of great importance in biomedical research and treatment therapy. Controlled release from nanocarriers can significantly enhance the therapeutic effect of a drug. Nanotechnology has the potential to revolutionize cancer diagnosis and therapy. Targeted nanomedicines, either marketed or under development, are designed for the treatment of various types of cancer. Nanocarriers are able to reduce the cytotoxic effects of the active anticancer drugs by increasing cancer-cell targeting in comparison to conventional formulations. The newly developed nanodevices, such as quantum dots, liposomes, nanotubes, nanoparticles, micelles, gold nanoparticles, carbon nanotubes and solid lipid nanoparticles, are the most promising applications for various cancer treatments. This review is focused on currently available information regarding pharmaceutical nanocarriers for cancer therapy and imaging.

  5. NPE 2010 results - Independent performance assessment by simulated CTBT violation scenarios

    NASA Astrophysics Data System (ADS)

    Ross, O.; Bönnemann, C.; Ceranna, L.; Gestermann, N.; Hartmann, G.; Plenefisch, T.

    2012-04-01

    For verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the global International Monitoring System (IMS) is currently being built up. The IMS is designed to detect nuclear explosions through their seismic, hydroacoustic, infrasound, and radionuclide signatures. The IMS data are collected, processed into analysis products, and distributed to the state signatories by the International Data Centre (IDC) in Vienna. The state signatories themselves may operate National Data Centers (NDC) giving technical advice concerning CTBT verification to their governments. NDC Preparedness Exercises (NPE) are regularly performed to practice the verification procedures for the detection of nuclear explosions in the framework of CTBT monitoring. The initial focus of NPE 2010 was on the radionuclide component and the application of Atmospheric Transport Modeling (ATM) for defining the source region of a radionuclide event. The exercise was triggered by fictitious radioactive noble gas detections which had been secretly calculated beforehand by forward ATM for a hypothetical xenon release scenario starting at the location and time of a real seismic event. The task for the exercise participants was to find potential source events by atmospheric backtracking and then to analyze promising candidate events with respect to their waveform signals. The study shows one possible solution path for NPE 2010, as carried out at the German NDC by a team with no prior knowledge of the selected event and release scenario. The ATM Source Receptor Sensitivity (SRS) fields as provided by the IDC were evaluated in a logical approach in order to define probable source regions for several days before the first reported fictitious radioactive xenon finding. Additional information on likely event times was derived from xenon isotopic ratios where applicable. Of the considered seismic events in the potential source region, all except one could be identified as

  6. Inertial sensing-based pre-impact detection of falls involving near-fall scenarios.

    PubMed

    Lee, Jung Keun; Robinovitch, Stephen N; Park, Edward J

    2015-03-01

    Although near-falls (or recoverable imbalances) are common episodes for many older adults, they have received little attention and were not considered in previous laboratory-based fall assessments. Hence, this paper addresses near-fall scenarios in addition to the typical falls and activities of daily living (ADLs). First, a novel vertical velocity-based pre-impact fall detection method using a wearable inertial sensor is proposed. Second, to investigate the effect of near-fall conditions on the detection performance and the feasibility of vertical velocity as a fall detection parameter, the detection performance of the proposed method (Method 1) is evaluated by comparing it to that of an acceleration-based method (Method 2) for the following two different discrimination cases: falls versus ADLs (i.e., excluding near-falls) and falls versus non-falls (i.e., including near-falls). Our experimental results show that both methods produce similar accuracies for the fall versus ADL detection case; however, Method 1 exhibits a much higher accuracy than Method 2 for the fall versus non-fall detection case. This result demonstrates the superiority of vertical velocity over peak acceleration as a fall detection parameter when near-fall conditions are included in the non-fall category, in addition to its capability of detecting pre-impact falls.
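
    A minimal sketch of vertical-velocity thresholding for pre-impact detection; the threshold, sampling rate, and signals are illustrative assumptions, not the paper's Method 1:

```python
def detect_pre_impact_fall(vertical_accel, dt, v_threshold=-1.3):
    """Integrate gravity-compensated vertical acceleration (m/s^2, downward
    negative) and flag a fall once downward velocity crosses the threshold
    (m/s). Returns the sample index of detection, or None."""
    v = 0.0
    for i, a in enumerate(vertical_accel):
        v += a * dt
        if v <= v_threshold:
            return i
    return None

# A free-fall-like segment sampled at 100 Hz crosses -1.3 m/s within ~0.14 s,
# well before impact; a slow lowering movement (near-fall-like) never does.
fall_idx = detect_pre_impact_fall([-9.81] * 50, dt=0.01)
adl_idx = detect_pre_impact_fall([-0.5] * 50, dt=0.01)
```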

  7. Scenario Generation Using Differential Scenario Information

    NASA Astrophysics Data System (ADS)

    Makino, Masayuki; Ohnishi, Atsushi

    A method of generating scenarios using differential scenario information is presented. The behaviors of normal scenarios with similar purposes are quite similar to each other, while the actors and data differ among these scenarios. We derive the differential information between them and apply it to generate new alternative/exceptional scenarios. Our method is illustrated with examples. This paper describes (1) a language for describing scenarios based on a simple case grammar of actions, (2) the introduction of the differential scenario, and (3) a method and examples of scenario generation using the differential scenario.
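
    The core idea (keep the behavior, substitute the actors and data) can be rendered as a toy sketch; the event structure and substitution table below are illustrative, not the authors' case-grammar language:

```python
def apply_differential(scenario, substitutions):
    """Generate an alternative scenario by substituting actors and data while
    keeping the behavior (the sequence of actions) unchanged."""
    return [{role: substitutions.get(value, value) for role, value in event.items()}
            for event in scenario]

withdraw = [
    {"agent": "customer", "action": "insert", "object": "cash card"},
    {"agent": "ATM", "action": "dispense", "object": "cash"},
]
# Same behavior, different data: derive a deposit-like scenario.
deposit = apply_differential(withdraw, {"cash card": "envelope", "cash": "receipt"})
```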

  8. Scenario-Based Training on Human Errors Contributing to Security Incidents

    SciTech Connect

    Greitzer, Frank L.; Pond, Daniel J.; Jannotta, Marjorie

    2004-12-06

    Error assessment studies reveal that "human errors" are often the consequence of unsuitable environmental factors, ineffective systems, inappropriate task conditions, and individual actions or failures to act. The US Department of Energy (DOE) initiated a program to determine if system-induced human errors could also be contributing factors to security incidents. As the seminal basis for this work, the Enhanced Security Through Human Error Reduction (ESTHER) program at Los Alamos National Laboratory (LANL) produced a contributing factors data set and systems categorization for security related incidents attributed to human error. This material supports the development and delivery of training for security incident inquiry officials. While LANL's initial work focused on classroom training, a collaborative effort between LANL and Pacific Northwest National Laboratory (PNNL) has focused on delivering interactive e-Learning training applications based on ESTHER principles. Through training, inquiry officials will understand and be capable of applying the underlying human error control concepts to new or novel situations. Their performance requires a high degree of analysis and judgment to accomplish the associated cognitive and procedural tasks. To meet this requirement, we employed cognitive principles of instructional design to engage the learner in interactive, realistic, problem-centered activity; we constructed scenarios within a guided-discovery framework; and we utilized learner-centered developmental sequences leading to field application. To enhance the relevance and realism of the training experience, we employed 3-D modeling technologies in constructing interactive scenarios. This paper describes the application of cognitive learning principles, use of varied media, and the implementation challenges in developing a technology-rich, interactive security incident training program that includes Web-based training.

  9. Real-time determination of the worst tsunami scenario based on Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Furuya, Takashi; Koshimura, Shunichi; Hino, Ryota; Ohta, Yusaku; Inoue, Takuya

    2016-04-01

    In recent years, real-time tsunami inundation forecasting has been developed with the advances of dense seismic monitoring, GPS Earth observation, offshore tsunami observation networks, and high-performance computing infrastructure (Koshimura et al., 2014). Several uncertainties are involved in tsunami inundation modeling, and the tsunami generation model is believed to be one of the greatest sources of uncertainty. An uncertain tsunami source model risks underestimating tsunami height, the extent of the inundation zone, and damage. Tsunami source inversion using observed seismic, geodetic and tsunami data is the most effective way to avoid underestimating the tsunami, but acquiring the observed data takes time, and this limitation makes it difficult to complete real-time tsunami inundation forecasting within sufficient lead time. Rather than waiting for precise tsunami observations, from a disaster management point of view we aim to determine the worst tsunami source scenario, for use in real-time tsunami inundation forecasting and mapping, using the seismic information of Earthquake Early Warning (EEW) that can be obtained immediately after an event is triggered. After an earthquake occurs, JMA's EEW estimates its magnitude and hypocenter. With the constraints of earthquake magnitude, hypocenter and scaling laws, we determine possible multiple tsunami source scenarios and search for the worst one by superposition of pre-computed tsunami Green's functions, i.e. time series of tsunami height at offshore points corresponding to 2-dimensional Gaussian unit sources, e.g. Tsushima et al., 2014. The scenario analysis in our method consists of the following 2 steps. (1) Searching the worst-scenario range by calculating 90 scenarios with various strikes and fault positions. From the maximum tsunami heights of the 90 scenarios, we determine a narrower strike range which causes high tsunami heights in the area of concern. (2) Calculating 900 scenarios that have different strike, dip, length
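
    The superposition-and-search step can be sketched as follows. The unit waveforms and slip weights are toy values standing in for the pre-computed Green's functions; only the linear-superposition idea is taken from the abstract:

```python
import numpy as np

def worst_scenario(unit_waveforms, scenarios):
    """Rank candidate sources by peak coastal tsunami height.

    unit_waveforms: (n_sources, n_times) pre-computed Green's functions at a
    coastal point; each scenario is a vector of slip weights over the unit
    sources (linear superposition assumption)."""
    best_idx, best_peak = None, float("-inf")
    for i, weights in enumerate(scenarios):
        waveform = np.asarray(weights) @ unit_waveforms  # superpose unit sources
        peak = float(waveform.max())
        if peak > best_peak:
            best_idx, best_peak = i, peak
    return best_idx, best_peak

# Two toy unit sources and three candidate slip distributions.
units = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 2.0]])
idx, peak = worst_scenario(units, [[1.0, 0.0], [0.5, 0.5], [0.0, 2.0]])
```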

  10. The centricity of presence in scenario-based high fidelity human patient simulation: a model.

    PubMed

    Dunnington, Renee M

    2015-01-01

    Enhancing immersive presence has been shown to have influence on learning outcomes in virtual types of simulation. Scenario-based human patient simulation, a mixed reality form, may pose unique challenges for inducing the centricity of presence among participants in simulation. A model for enhancing the centricity of presence in scenario-based human patient simulation is presented here. The model represents a theoretical linkage among the interaction of pedagogical, individual, and group factors that influence the centricity of presence among participants in simulation. Presence may have an important influence on the learning experiences and learning outcomes in scenario-based high fidelity human patient simulation. This report is a follow-up to an article published in 2014 by the author where connections were made to the theoretical basis of presence as articulated by nurse scholars. PMID:25520467

  11. SAFRR AND Physics-Based Scenarios: The Power of Scientifically Credible Stories

    NASA Astrophysics Data System (ADS)

    Cox, D. A.; Jones, L.

    2015-12-01

    USGS's SAFRR (Science Application for Risk Reduction) Project and its predecessor, the Multi Hazards Demonstration Project, uses the latest earth science to develop scenarios so that communities can improve disaster resilience. SAFRR has created detailed physics-based natural-disaster scenarios of a M7.8 San Andreas earthquake in southern California (ShakeOut), atmospheric-river storms rivaling the Great California Flood of 1862 (ARkStorm), a Tohoku-sized earthquake and tsunami in the eastern Aleutians (SAFRR Tsunami), and now a M7.05 quake on the Hayward Fault in the San Francisco Bay area (HayWired), as novel ways of providing science for decision making. Each scenario is scientifically plausible, deterministic, and large enough to demand attention but not too large to be believable. The scenarios address interacting hazards, requiring involvement of multiple science disciplines and user communities. The scenarios routinely expose hitherto unknown or ignored vulnerabilities, most often in cascading effects missed when impacts are considered in isolation. They take advantage of story telling to provide decision makers with clear explanations and justifications for mitigation and preparedness actions, and have been used for national-to-local disaster response exercises and planning. Effectiveness is further leveraged by downscaling the scenarios to local levels. For example, although the ARkStorm scenario describes state-scale events and has been used that way by NASA and the Navy, SAFRR also partnered with FEMA to focus on two local areas, Ventura County in the coastal plain and the mountain setting of Lake Tahoe with downstream impacts in Reno, Sparks and Carson City. Downscaling and focused analyses increased usefulness to user communities, drawing new participants into the study. 
SAFRR scenarios have also motivated new research to answer questions uncovered by stakeholders, closing the circle of co-evolving disaster-science and disaster-response improvements.

  12. A multivariate copula-based framework for dealing with hazard scenarios and failure probabilities

    NASA Astrophysics Data System (ADS)

    Salvadori, G.; Durante, F.; De Michele, C.; Bernardi, M.; Petrella, L.

    2016-05-01

    This paper is of methodological nature, and deals with the foundations of Risk Assessment. Several international guidelines have recently recommended to select appropriate/relevant Hazard Scenarios in order to tame the consequences of (extreme) natural phenomena. In particular, the scenarios should be multivariate, i.e., they should take into account the fact that several variables, generally not independent, may be of interest. In this work, it is shown how a Hazard Scenario can be identified in terms of (i) a specific geometry and (ii) a suitable probability level. Several scenarios, as well as a Structural approach, are presented, and due comparisons are carried out. In addition, it is shown how the Hazard Scenario approach illustrated here is well suited to cope with the notion of Failure Probability, a tool traditionally used for design and risk assessment in engineering practice. All the results outlined throughout the work are based on the Copula Theory, which turns out to be a fundamental theoretical apparatus for doing multivariate risk assessment: formulas for the calculation of the probability of Hazard Scenarios in the general multidimensional case (d≥2) are derived, and worthy analytical relationships among the probabilities of occurrence of Hazard Scenarios are presented. In addition, the Extreme Value and Archimedean special cases are dealt with, relationships between dependence ordering and scenario levels are studied, and a counter-example concerning Tail Dependence is shown. Suitable indications for the practical application of the techniques outlined in the work are given, and two case studies illustrate the procedures discussed in the paper.
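
    The "OR"/"AND" Hazard Scenario probabilities from copula theory can be illustrated with a Gumbel copula (a standard Extreme Value/Archimedean family; the levels and theta below are illustrative, not the paper's case studies):

```python
from math import exp, log

def gumbel_copula(u, v, theta=2.0):
    """Gumbel copula C(u, v); theta >= 1, with theta = 1 giving independence."""
    return exp(-(((-log(u)) ** theta + (-log(v)) ** theta) ** (1.0 / theta)))

def hazard_scenario_probs(u, v, theta=2.0):
    """Probabilities of two classical Hazard Scenarios at marginal
    non-exceedance levels u, v: 'OR' (at least one variable exceeds its
    threshold) and 'AND' (both exceed), via standard survival formulas."""
    c = gumbel_copula(u, v, theta)
    return 1.0 - c, 1.0 - u - v + c

p_or, p_and = hazard_scenario_probs(0.9, 0.8, theta=2.0)
```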

  13. Performance-based ratemaking

    SciTech Connect

    Cross, P.S.

    1995-07-15

    Performance-based ratemaking (PBR) departs from the cost-of-service standard in setting just and reasonable utility rates, but that departure isn't as easy as it looks. Up until now, cost-of-service ratemaking has provided relatively stable rates, while enabling utilities to attract enormous amounts of capital. Of late, however, regulators appear to be heeding the argument that changing markets warrant a second look. Throughout the country and across the utility industry, some regulators appear willing to abandon cost of service as a proxy for competition, instead favoring performance-based methods that would rely on competitive forces. These performance-based schemes vary in their details but generally afford utilities the opportunity to increase profits by exceeding targets for efficiency and cost savings. Moreover, these plans purport to streamline the regulatory process. Annual, accounting-type reviews replace rate hearings. Cost-of-service studies might not be required at all once initial rates are fixed. Nevertheless, these PBR plans rely on cost-based rates as a starting point and still contain safeguards to protect ratepayers. PBR falls short of true deregulation. As the Massachusetts Department of Public Utilities noted recently in an order approving a PBR variant known as price-cap regulation for New England Telephone and Telegraph Co., "price-cap regulation is not deregulation; it is merely another way for regulators to control the rates charged by a firm."

  14. Hybrid Modeling for Scenario-Based Evaluation of Failure Effects in Advanced Hardware-Software Designs

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land; Throop, David

    2001-01-01

    This paper describes an incremental scenario-based simulation approach to evaluation of intelligent software for control and management of hardware systems. A hybrid continuous/discrete event simulation of the hardware dynamically interacts with the intelligent software in operations scenarios. Embedded anomalous conditions and failures in simulated hardware can lead to emergent software behavior and identification of missing or faulty software or hardware requirements. An approach is described for extending simulation-based automated incremental failure modes and effects analysis to support concurrent evaluation of intelligent software and the hardware controlled by the software.

  15. The hydroclimatological response to global warming based on the dynamically downscaled climate change scenario

    NASA Astrophysics Data System (ADS)

    Im, Eun-Soon; Coppola, Erika; Giorgi, Filippo

    2010-05-01

    Given the discernible evidence of climate change due to human activity, there is a growing demand for reliable climate change scenarios in response to future emission forcing. One of the most significant impacts of climate change is that on hydrological processes. Changes in seasonality and increases in low and high rainfall extremes can severely influence the water balance of a river basin, with serious consequences for societies and ecosystems. In fact, recent studies have reported that East Asia, including the Korean peninsula, is regarded as a highly vulnerable region under global warming, in particular for water resources. In an attempt to accurately assess the impact of climate change over Korea, we performed a downscaling of the ECHAM5-MPI/OM global projection under the A1B emission scenario for the period 1971-2100 using the RegCM3 one-way double-nested system. Physically based long-term (130 years) fine-scale (20 km) climate information is appropriate for analyzing the detailed structure of the hydroclimatological response to climate change. Changes in temperature and precipitation are translated into hydrological conditions in direct or indirect ways. The change in precipitation shows distinct seasonal variations and a complicated spatial pattern. While changes in total precipitation do not show any relevant trend, the change patterns in daily precipitation clearly show an enhancement of high-intensity precipitation and a reduction of weak-intensity precipitation. The increase in temperature enhances evapotranspiration, and hence the actual water stress becomes more pronounced in the future climate. Precipitation, snow, and runoff changes show relevant topographical modulation under global warming. This study clearly demonstrates the importance of a refined topography for improving the accuracy of the local climatology. Improved accuracy of regional climate projection could lead to an enhanced reliability of the
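
    Quantile-quantile bias removal, as used in the downscaling workflows described in records 15 and 17, maps each model value to the observed value at the same empirical quantile. A minimal empirical sketch (toy data, no extrapolation beyond the historical range):

```python
import numpy as np

def quantile_map(model_future, model_hist, obs_hist):
    """Empirical quantile-quantile bias removal: each future model value is
    mapped to the observed value at the same quantile of the historical
    model distribution."""
    p = np.linspace(0.0, 1.0, len(model_hist))
    quantiles = np.interp(model_future, np.sort(model_hist), p)
    return np.interp(quantiles, np.linspace(0.0, 1.0, len(obs_hist)),
                     np.sort(obs_hist))

# A model wet-biased by +1 mm/day: the correction removes the offset.
obs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
mod = obs + 1.0
corrected = quantile_map(np.array([3.0]), mod, obs)
```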

  16. Lung cancer screening: review and performance comparison under different risk scenarios.

    PubMed

    Tota, Joseph E; Ramanakumar, Agnihotram V; Franco, Eduardo L

    2014-02-01

    Lung cancer is currently one of the most common malignant diseases and is responsible for substantial mortality worldwide. Compared with never-smokers, former smokers remain at relatively high risk for lung cancer, accounting for approximately half of all newly diagnosed cases in the US. Screening offers former smokers the best opportunity to reduce their risk of advanced-stage lung cancer, and there is now evidence that annual screening using low-dose computed tomography (LDCT) is effective in preventing mortality. Studies are being conducted to evaluate whether the benefits of LDCT screening outweigh its costs and potential harms, and to determine the most appropriate workup for patients with screen-detected lung nodules. Program efficiency would be optimized by targeting high-risk current smokers, but low uptake among this group is a concern. Former smokers may be invited for screening; however, if fewer long-term current smokers and more former smokers with a long quit duration elect to attend, this could have very adverse effects on cost and screening test parameters. To illustrate this point, we present three possible screening scenarios with lung cancer prevalence ranging from 0.62 % to 5.0 %. In summary, the cost-effectiveness of lung cancer screening may be improved if it is linked to successful smoking cessation programs and if better approaches are developed to reach very high-risk patients, e.g., long-term current smokers or others identified by more accurate risk prediction models.
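
    The dependence of screening performance on prevalence can be illustrated with Bayes' rule. The sketch below is not from the review itself: only the 0.62 %-5.0 % prevalence range comes from the abstract, while the LDCT sensitivity and specificity figures are hypothetical placeholders chosen to show the trend.

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """PPV = P(cancer | positive screen) via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# three prevalence scenarios from the abstract; the test parameters
# (sensitivity 0.90, specificity 0.75) are illustrative only
for prev in (0.0062, 0.02, 0.05):
    ppv = positive_predictive_value(prev, sensitivity=0.90, specificity=0.75)
    print(f"prevalence {prev:.2%} -> PPV {ppv:.1%}")
```

    Positive predictive value rises steeply with prevalence, which is why a screening program attended mostly by lower-risk former smokers sees its test parameters degrade.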

  17. Exposure Scenarios and Unit Dose Factors for the Hanford Immobilized Low Activity Tank Waste Performance Assessment

    SciTech Connect

    RITTMANN, P.D.

    1999-12-29

    Exposure scenarios are defined to identify potential pathways and combinations of pathways that could lead to radiation exposure from immobilized tank waste. Appropriate data and models are selected to permit calculation of dose factors for each exposure

  18. Pre-Service Teachers' Perspectives on Using Scenario-Based Virtual Worlds in Science Education

    ERIC Educational Resources Information Center

    Kennedy-Clark, Shannon

    2011-01-01

    This paper presents the findings of a study on the current knowledge and attitudes of pre-service teachers on the use of scenario-based multi-user virtual environments in science education. The 28 participants involved in the study were introduced to "Virtual Singapura," a multi-user virtual environment, and completed an open-ended questionnaire.…

  19. Design Process of a Goal-Based Scenario on Computing Fundamentals

    ERIC Educational Resources Information Center

    Beriswill, Joanne Elizabeth

    2014-01-01

    In this design case, an instructor developed a goal-based scenario (GBS) for undergraduate computer fundamentals students to apply their knowledge of computer equipment and software. The GBS, entitled the MegaTech Project, presented the students with descriptions of the everyday activities of four persons needing to purchase a computer system. The…

  20. THE SCENARIOS APPROACH TO ATTENUATION-BASED REMEDIES FOR INORGANIC AND RADIONUCLIDE CONTAMINANTS

    SciTech Connect

    Vangelas, K.; Rysz, M.; Truex, M.; Brady, P.; Newell, C.; Denham, M.

    2011-08-04

    Guidance materials based on use of conceptual model scenarios were developed to assist evaluation and implementation of attenuation-based remedies for groundwater and vadose zones contaminated with inorganic and radionuclide contaminants. The Scenarios approach is intended to complement the comprehensive information provided in the US EPA's Technical Protocol for Monitored Natural Attenuation (MNA) of Inorganic Contaminants by providing additional information on site conceptual models and extending the evaluation to consideration of Enhanced Attenuation approaches. The conceptual models incorporate the notion of reactive facies, defined as units with hydrogeochemical properties that are different from surrounding units and that react with contaminants in distinct ways. The conceptual models also incorporate consideration of biogeochemical gradients, defined as boundaries between different geochemical conditions that have been induced by waste disposal or other natural phenomena. Gradients can change over time when geochemical conditions from one area migrate into another, potentially affecting contaminant mobility. A recognition of gradients allows the attenuation-affecting conditions of a site to be projected into the future. The Scenarios approach provides a stepwise process to identify an appropriate category of conceptual model and refine it for a specific site. Scenario materials provide links to pertinent sections in the EPA technical protocol and present information about contaminant mobility and important controlling mechanisms for attenuation-based remedies based on the categories of conceptual models.

  1. The Scenarios Approach to Attenuation-Based Remedies for Inorganic and Radionuclide Contaminants (Invited)

    NASA Astrophysics Data System (ADS)

    Truex, M.; Brady, P.; Newell, C. J.; Denham, M.; Vangelas, K.

    2010-12-01

    Guidance materials based on use of conceptual model scenarios were developed to assist evaluation and implementation of attenuation-based remedies for groundwater and vadose zones contaminated with inorganic and radionuclide contaminants. The Scenarios approach is intended to complement the comprehensive information provided in the US EPA's Technical Protocol for Monitored Natural Attenuation (MNA) of Inorganic Contaminants by providing additional information on site conceptual models and extending the evaluation to consideration of Enhanced Attenuation approaches. The conceptual models incorporate the notion of reactive facies, defined as units with hydrogeochemical properties that are different from surrounding units and that react with contaminants in distinct ways. The conceptual models also incorporate consideration of biogeochemical gradients, defined as boundaries between different geochemical conditions that have been induced by waste disposal or other natural phenomena. Gradients can change over time when geochemical conditions from one area migrate into another, potentially affecting contaminant mobility. A recognition of gradients allows the attenuation-affecting conditions of a site to be projected into the future. The Scenarios approach provides a stepwise process to identify an appropriate category of conceptual model and refine it for a specific site. Scenario materials provide links to pertinent sections in the EPA technical protocol and present information about contaminant mobility and important controlling mechanisms for attenuation-based remedies based on the categories of conceptual models.

  2. The Effects of Task, Database, and Guidance on Interaction in a Goal-Based Scenario.

    ERIC Educational Resources Information Center

    Bell, Benjamin

    This paper describes the "Sickle Cell Counselor" (SCC), a goal based scenario on permanent display at the Museum of Science and Industry in Chicago. SCC is an exploratory hypermedia simulation program which provides users with a basic understanding of Sickle Cell Anemia. The user of the program plays the role of a genetic counselor, and, while…

  3. Adapting Scenario-Based Curriculum Materials to Community College Technical Courses

    ERIC Educational Resources Information Center

    Yarnall, Louise; Toyama, Yukie; Gong, Bowyee; Ayers, Catherine; Ostrander, Jane

    2007-01-01

    Community college educators seek to infuse their workforce courses with more "real world" activities. This 3-year case study examined how 7 instructors and 78 students in California and Texas responded to the changes involved in implementing one type of reform program--the scenario-based curriculum (Schank, 1997). The study shows that the…

  4. Multimedia Scenario Based Learning Programme for Enhancing the English Language Efficiency among Primary School Students

    ERIC Educational Resources Information Center

    Tupe, Navnath

    2015-01-01

    This research was undertaken with a view to assess the deficiencies in English language among Primary School Children and to develop Multimedia Scenario Based Learning Programme (MSBLP) for mastery of English language which required special attention and effective treatment. The experimental study with pre-test, post-test control group design was…

  5. A Scenario-Based Protocol Checker for Public-Key Authentication Scheme

    NASA Astrophysics Data System (ADS)

    Saito, Takamichi

    Security protocols provide communication security for the Internet. One of their important features is authentication with key exchange, whose correctness is a prerequisite for the security of the communication as a whole. In this paper, we introduce three attack models realized as attack scenarios, and provide an authentication-protocol checker that applies the three attack scenarios based on these models. We also use it to check two popular security protocols: Secure Shell (SSH) and Secure Socket Layer/Transport Layer Security (SSL/TLS).

  6. An Ontology-Based Scenario for Teaching the Management of Health Information Systems.

    PubMed

    Jahn, Franziska; Schaaf, Michael; Kahmann, Christian; Tahar, Kais; Kücherer, Christian; Paech, Barbara; Winter, Alfred

    2016-01-01

    The terminology for the management of health information systems is characterized by complexity and polysemy, which is challenging for both medical informatics students and practitioners. SNIK, an ontology of information management (IM) in hospitals, brings together IM concepts from different literature sources. Based on SNIK, we developed a blended learning scenario to teach medical informatics students IM concepts and their relationships. In proof-of-concept teaching units, students found the use of SNIK in teaching and learning motivating and useful. As a next step, the blended learning scenario will be rolled out to an international course for medical informatics students. PMID:27577404

  8. Scenario planning based on geomatics: a case study in Zijin mountain national forest park

    NASA Astrophysics Data System (ADS)

    Li, Mingyang; He, Yanjie; Xu, Guangcai; Wu, Wenhao; Wang, Baozhong

    2007-06-01

    With the rapid development of forest tourism, it is crucial to coordinate the conflicting goals of a forest park by making a scientific plan. It is difficult to determine the complex relationships involved by means of traditional laboratory and field experiments at the landscape scale. Zijin Mountain national forest park is taken as the case study area, while the RS and GIS software packages ERDAS 8.7 and ArcGIS 9.0 are chosen as the spatial platforms for the scenario planning. Remote sensing data from three different periods, 2000 (IKONOS), 2002 (SPOT5) and 2004 (QuickBird), are gathered; supervised classification and neighborhood analysis are then performed before three ten-year scenarios for the national park are built based on a Cellular Automaton Model (CAM). Three spatial pattern indices (mean patch area, shape index and patch density) of each scenario are calculated using the spatial pattern analysis program Fragstats 3.3. After comparing the three scenarios in terms of landscape spatial pattern and protection goals, an optimized plan is made and compared with the land classes of 2002. At the end of the paper, some problems concerning the scenario making are discussed.
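
    Two of the pattern indices named here (mean patch area and patch density) can be computed from a classified raster with a simple connected-components pass. The 5x5 map below is a made-up stand-in for the classified imagery, and Fragstats itself uses richer metric definitions; this is only a sketch of the idea.

```python
def patch_metrics(grid, target, cell_area=1.0):
    """Mean patch area and patch density for one land-cover class,
    using 4-connected flood fill (a toy version of a Fragstats pass)."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    patch_sizes = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != target or (r, c) in seen:
                continue
            stack, size = [(r, c)], 0
            seen.add((r, c))
            while stack:                       # grow one patch
                y, x = stack.pop()
                size += 1
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and grid[ny][nx] == target
                            and (ny, nx) not in seen):
                        seen.add((ny, nx))
                        stack.append((ny, nx))
            patch_sizes.append(size * cell_area)
    landscape_area = rows * cols * cell_area
    mean_area = sum(patch_sizes) / len(patch_sizes) if patch_sizes else 0.0
    density = len(patch_sizes) / landscape_area
    return mean_area, density

# hypothetical 5x5 classified map: 1 = forest, 0 = other
forest = [[1, 1, 0, 0, 1],
          [1, 0, 0, 0, 1],
          [0, 0, 1, 1, 0],
          [0, 0, 1, 0, 0],
          [0, 0, 0, 0, 0]]
mean_area, density = patch_metrics(forest, target=1)
```

    On this toy map there are three forest patches of 3, 2 and 3 cells, so the mean patch area is 8/3 cells and the patch density is 3 patches per 25 cells.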

  9. Performance of a Frequency-Hopped Real-Time Remote Control System in a Multiple Access Scenario

    NASA Astrophysics Data System (ADS)

    Cervantes, Frank

    A recent trend is observed of hobby-grade radio-controlled aircraft and automobiles, as well as Unmanned Aerial Vehicle (UAV) applications, moving to the well-known Industrial, Scientific and Medical (ISM) band. Based on this technological fact, the present thesis evaluates individual user performance in a multiple-user scenario where several point-to-point, co-located, real-time Remote Control (RC) applications operate using Frequency Hopping Spread Spectrum (FHSS) as the medium access technique in order to handle interference efficiently. Commercial off-the-shelf wireless transceivers ready to operate in the ISM band are considered as the operational platform supporting the above-mentioned applications. The impact of channel impairments and of different critical system engineering issues, such as working with real clock oscillators and a variable packet duty cycle, is considered. Based on the above, simulation results allow us to evaluate the range of variation of those parameters for acceptable system performance in Multiple Access (MA) environments.
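
    The multiple-access behaviour of FHSS can be sketched with a standard slotted-collision model: each of N links hops independently and uniformly over C channels, and a reference hop is lost whenever any other link lands on the same channel. The user count and the 79-channel figure below are illustrative assumptions, not parameters taken from the thesis.

```python
import random

def collision_probability(n_users, n_channels):
    """Closed form: chance that a reference user's hop is jammed by at
    least one of the other n_users - 1 independent uniform hoppers."""
    return 1.0 - (1.0 - 1.0 / n_channels) ** (n_users - 1)

def simulate(n_users, n_channels, hops=100000, seed=7):
    """Monte Carlo check of the same quantity."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(hops):
        ref = rng.randrange(n_channels)  # reference user's channel this hop
        if any(rng.randrange(n_channels) == ref for _ in range(n_users - 1)):
            hits += 1
    return hits / hops

# e.g. 10 co-located RC links over 79 Bluetooth-style ISM hop channels
analytic = collision_probability(10, 79)
empirical = simulate(10, 79)
```

    With these assumed numbers roughly one hop in nine collides, which is why per-packet retransmission and hop-rate choices dominate real-time RC performance under MA load.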

  10. Scenario Based Approach for Multiple Source Tsunami Hazard Assessment for Sines, Portugal

    NASA Astrophysics Data System (ADS)

    Wronna, Martin; Omira, Rachid; Baptista, Maria Ana

    2015-04-01

    In this paper, we present a scenario-based approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE. Sines holds one of the most important deep-water ports, which contains oil-bearing, petrochemical, liquid-bulk, coal and container terminals. The port and its industrial infrastructure face the ocean to the southwest, towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, a total of five scenarios were selected to assess tsunami impact at the test site. These scenarios correspond to a worst-case credible scenario approach based upon the largest events of the historical and paleo-tsunami catalogues. The tsunami simulations from the source area towards the coast are carried out using NSWING, a Non-linear Shallow Water model With Nested Grids. The code solves the non-linear shallow water equations using an explicit leap-frog finite-difference discretization, in a Cartesian or spherical frame. The initial sea-surface displacement is assumed to be equal to the sea-bottom deformation computed with the Okada equations. Both uniform and non-uniform slip conditions are used; the results presented correspond to the models using non-uniform slip. In this study, the static effect of tides is analysed for three tidal stages: MLLW (mean lower low water), MSL (mean sea level) and MHHW (mean higher high water). For each scenario, inundation is described by maximum values of wave height, flow depth, drawdown, run-up and inundation distance. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results consist of aggregate scenario maps presented for the different inundation parameters. 
This work is funded by ASTARTE - Assessment, Strategy And Risk Reduction for Tsunamis in Europe - FP7-ENV2013 6.4-3, Grant 603839
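
    NSWING itself is a full 2-D nested-grid code, but the leap-frog idea behind it can be sketched in one dimension. Everything below (basin depth, grid, initial hump) is an invented, linearized illustration of a staggered-grid explicit scheme, not the NSWING discretization.

```python
import math

def shallow_water_1d(eta0, depth, dx, dt, nsteps, g=9.81):
    """Explicit staggered-grid step of the linearized 1-D shallow-water
    equations: du/dt = -g d(eta)/dx, d(eta)/dt = -H du/dx.
    Walls at both ends (u = 0), so total surface volume is conserved."""
    n = len(eta0)
    eta = list(eta0)
    u = [0.0] * (n + 1)            # u[i] lives between eta[i-1] and eta[i]
    for _ in range(nsteps):
        for i in range(1, n):      # momentum update at interior faces
            u[i] -= g * dt / dx * (eta[i] - eta[i - 1])
        for i in range(n):         # continuity update from flow divergence
            eta[i] -= depth * dt / dx * (u[i + 1] - u[i])
    return eta

# illustrative basin: 4 km deep, 100 cells of 10 km, 1 m initial hump
depth, dx = 4000.0, 10e3
dt = 0.5 * dx / math.sqrt(9.81 * depth)   # CFL-limited time step
eta0 = [1.0 if 45 <= i < 55 else 0.0 for i in range(100)]
eta = shallow_water_1d(eta0, depth, dx, dt, nsteps=50)
```

    With reflective walls, the summed surface displacement is conserved exactly, which makes a quick sanity check on any implementation of this family of schemes.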

  11. Organizational Learning and Performance: Understanding Indian Scenario in Present Global Context

    ERIC Educational Resources Information Center

    Khandekar, Aradhana; Sharma, Anuradha

    2006-01-01

    Purpose: The purpose of this paper is to show that the role of organizational learning is increasingly becoming crucial for organizational performance. Based on the study of three Indian global firms operating in National Capital Region of Delhi, India, this study explores the correlation of organizational learning with organizational performance…

  12. Analysis of cloud-based solutions on EHRs systems in different scenarios.

    PubMed

    Fernández-Cardeñosa, Gonzalo; de la Torre-Díez, Isabel; López-Coronado, Miguel; Rodrigues, Joel J P C

    2012-12-01

    Nowadays, with the growth of wireless connectivity, people can access resources hosted in the Cloud almost anywhere. In this context, organizations can take advantage of this fact in e-Health by deploying Cloud-based solutions for e-Health services. In this paper, two Cloud-based solutions for different scenarios of Electronic Health Records (EHRs) management systems are proposed. We reviewed articles published between 2005 and 2011 in Medline about the implementation of Cloud-based e-Health services. In order to analyze the best scenario for the deployment of Cloud Computing, two solutions were studied: one for a large hospital and one for a network of primary care health centers. An economic estimate of the implementation cost for both scenarios was made using the Amazon calculator tool. As a result of this analysis, two solutions are suggested depending on the scenario: for a large hospital, a typical Cloud solution in which only the needed services are hired; for several primary care centers, a network interconnecting these centers with a single Cloud environment. Finally, a hybrid solution is considered, in which EHRs with images are hosted at the hospital or primary care centers and the rest are migrated to the Cloud. PMID:22492177

  14. Probabilistic scenario-based water resource planning and management:A case study in the Yellow River Basin, China

    NASA Astrophysics Data System (ADS)

    Dong, C.; Schoups, G.; van de Giesen, N.

    2012-04-01

    Water resource planning and management is subject to large uncertainties with respect to the impacts of climate change and socio-economic development on water systems. In order to deal with these uncertainties, probabilistic climate and socio-economic scenarios were developed based on the Principle of Maximum Entropy, as defined within information theory, and used as inputs to hydrological models to construct probabilistic water scenarios via Monte Carlo simulation. Probabilistic scenarios provide more explicit information than equally likely scenarios for decision-making in water resource management. A case study was developed for the Yellow River Basin, China, where future water availability and water demand are affected by both climate change and socio-economic development. Climate scenarios of future precipitation and temperature were developed based on the results of multiple global climate models, and socio-economic scenarios were downscaled from existing large-scale scenarios. Probability distributions were assigned to these scenarios to explicitly represent the full set of future possibilities. The probabilistic climate scenarios were used as input to a rainfall-runoff model to simulate future river discharge, and the socio-economic scenarios were used to calculate water demand. A full set of possible future water supply-demand scenarios and their associated probability distributions was generated. This set can feed further analysis of the future water balance, which can serve as a basis for planning and managing water resources in the Yellow River Basin. Key words: probabilistic scenarios, climate change, socio-economic development, water management
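
    The scenario-weighting and Monte Carlo idea can be sketched as follows: with no information beyond normalization, the maximum-entropy assignment over a discrete scenario set is uniform, and random draws propagate those weights through a supply-demand model. The basin area, runoff coefficient, scenario values and 10 % precipitation noise below are all hypothetical placeholders, not the study's data.

```python
import random

# hypothetical scenario set: (annual precipitation mm, water demand km^3/yr)
SCENARIOS = [(380.0, 48.0), (450.0, 52.0), (520.0, 57.0)]
# maximum entropy under the normalization constraint alone -> equal weights
WEIGHTS = [1.0 / len(SCENARIOS)] * len(SCENARIOS)

def simulate_deficit_probability(area_km2=752000.0, runoff_coeff=0.15,
                                 n=20000, seed=42):
    """Monte Carlo estimate of P(supply < demand).
    supply = runoff_coeff * precip * area (mm * km^2 * 1e-6 -> km^3);
    precipitation carries +/-10 % Gaussian noise. Numbers illustrative."""
    rng = random.Random(seed)
    deficits = 0
    for _ in range(n):
        precip, demand = rng.choices(SCENARIOS, weights=WEIGHTS)[0]
        precip *= rng.gauss(1.0, 0.1)          # sampling within a scenario
        supply = runoff_coeff * precip * area_km2 * 1e-6
        if supply < demand:
            deficits += 1
    return deficits / n

p_deficit = simulate_deficit_probability()
```

    The output is a probability of deficit rather than a single balance figure, which is the practical gain of probabilistic over equally-likely scenarios for decision-making.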

  15. Scenarios of Future Water use on Mediterranean Islands based on an Integrated Assessment of Water Management

    NASA Astrophysics Data System (ADS)

    Lange, M. A.

    2006-12-01

    The availability of water in sufficient quantity and adequate quality presents considerable problems on Mediterranean islands. Because of their isolation, and thus the impossibility of drawing on more distant or more diverse aquifers, they rely entirely on precipitation as the natural replenishing mechanism. Recent observations indicate decreasing precipitation, increasing evaporation and steadily growing demand for water on the islands. Future climate change will exacerbate this problem, increasing the already pronounced vulnerability to droughts. Responsible planning of water management strategies requires scenarios of future supply and demand derived through an integrated assessment, including climate scenarios based on regional climate modeling as well as scenarios of changes in the societal and economic determinants of water demand. Constructing such strategies necessitates a thorough understanding of the interdependencies and feedbacks between the physical/hydrological and socio-economic determinants of water balances on an island, which in turn has to be based on a solid understanding of past and present developments of these drivers. In the framework of the EU-funded MEDIS project (Towards sustainable water use on Mediterranean Islands: addressing conflicting demands and varying hydrological, social and economic conditions, EVK1-CT-2001-00092), detailed investigations of present vulnerabilities and adaptation strategies to droughts have been carried out on Mallorca, Corsica, Sicily, Crete and Cyprus. These were based on an interdisciplinary study design including hydrological, geophysical, agricultural, social and political sciences investigations. A central element of the study has been the close interaction with stakeholders on the islands and their contribution to strategy formulation. An important result has been a specification of vulnerability components, including a physical/environmental, an economic/regulatory and a social/institutional/political component. Their

  16. An exploration of scenario discussion in a Web-based nursing course.

    PubMed

    Hsu, Li-Ling; Hsieh, Suh-Ing

    2006-06-01

    Complexity in nursing education has increased as it is challenged to meet the needs of diverse populations in rapidly evolving and highly technical health care settings. To meet these societal wants, needs and demands, nursing educators must prepare students to become active, independent learners and problem solvers. The purpose of this study was to design a nursing course on the basis of scenario discussion and Web-based instruction (WBI), and to assess its learning outcomes. The study involved two stages. The first, beginning in 2001, developed the scenario discussion within the WBI system. The second evaluated learning outcomes within the context of a scenario discussion. Two instruments were used in this study: a nursing assessment score and a learning effectiveness survey. The target population consisted of students enrolled in a two-year nursing program and registered for the course Nursing I during the fall semester of 2002. Using simple random sampling, 43 students were recruited and agreed to participate in the study. Most of the students rated learning effectiveness as "good", and overall the students gave high learning effectiveness survey scores and nursing assessment scores. Due to their lack of previous exposure to scenario discussion, the students felt frustration and anxiety while taking this course, so faculty should devote more time to explaining the advantages of scenario discussion. In addition, in comparison with traditional teaching, Web-based instruction imposes a heavier burden on the instructors and institutions involved. Nurse educators must continue to use innovative strategies to enhance student learning. Students registered both positive and negative feedback in open-ended questions on Web-based instruction. In the future, special attention should be given to the learning software, Internet access speed, synchronous and asynchronous meetings, and the interaction

  17. Scenarios which may lead to the rise of an asteroid-based technical civilisation.

    PubMed

    Kecskes, Csaba

    2002-05-01

    In a previous paper, the author described a hypothetical development path of technical civilisations which has the following stages: planet dwellers, asteroid dwellers, interstellar travellers, interstellar space dwellers. In this paper, several scenarios are described which may cause the rise of an asteroid-based technical civilisation. Before such a transition may take place, certain space technologies must be developed fully (now these exist only in very preliminary forms): closed-cycle biological life support systems, space manufacturing systems, electrical propulsion systems. After mastering these technologies, certain events may provide the necessary financial means and social impetus for the foundation of the first asteroid-based colonies. In the first scenario, a rich minority group becomes persecuted and they decide to leave the Earth. In the second scenario, a "cold war"-like situation exists and the leaders of the superpowers order the creation of asteroid-based colonies to show off their empires' technological (and financial) grandiosity. In the third scenario, the basic situation is similar to the second one, but in this case the asteroids are not just occupied by the colonists. With several decades of hard work, an asteroid can be turned into a kinetic energy weapon which can provide the same (or greater) threat as the nuclear arsenal of a present superpower. In the fourth scenario, some military asteroids are moved to Earth-centred orbits and utilised as "solar power satellites" (SPS). This would be a quite economical solution because a "military asteroid" already contains most of the important components of an SPS (large solar collector arrays, power distribution devices, orbit modifying rocket engine), one should add only a large microwave transmitter.

  18. Lunar Outpost Life Support Architecture Study Based on a High Mobility Exploration Scenario

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2009-01-01

    As scenarios for lunar surface exploration and habitation continue to evolve within NASA's Constellation program, so must studies of optimal life support system architectures and technologies. This paper presents results of a life support architecture study based on a 2009 NASA scenario known as Scenario 12. Scenario 12 represents a consolidation of ideas from earlier NASA scenarios and includes an outpost near the Lunar South Pole composed of three larger fixed surface elements and four attached pressurized rovers. The scenario places a high emphasis on surface mobility, with planning assuming that all four crewmembers spend roughly 50% of the time away from the outpost on 3-14 day excursions in two of the pressurized rovers. Some of the larger elements can also be mobilized for longer duration excursions. This emphasis on mobility poses a significant challenge for a regenerative life support system in terms of cost-effective waste collection and resource recovery across multiple elements, including rovers with very constrained infrastructure resources. The current study considers pressurized rovers as part of a distributed outpost life support architecture in both stand-alone and integrated configurations. A range of architectures is examined, reflecting different levels of closure and distributed functionality. Different lander propellant scavenging options are also considered, involving either initial conversion of residual oxygen and hydrogen propellants to water or initial direct oxygen scavenging. Monte Carlo simulations are used to assess the sensitivity of results to volatile high-impact mission variables, including the quantity of residual lander propellants available for scavenging, the fraction of crew time away from the outpost on excursions, total extravehicular activity hours, and habitat leakage. Architectures are evaluated by estimating surpluses or deficits of water and oxygen per 180-day mission and differences in fixed and 10-year
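
    The Monte Carlo sensitivity idea can be sketched by sampling the four high-impact variables the study names and tallying how often the mission water balance goes negative. Every range and rate below is an invented placeholder, not a NASA figure; the sketch shows only the method, not the study's results.

```python
import random

def water_balance_mc(n=20000, seed=3):
    """Toy Monte Carlo over the four high-impact variables named in the
    study: scavenged propellant water, crew time away on excursions,
    total EVA hours, and habitat leakage. Returns the fraction of trials
    ending a 180-day mission with a water deficit. All numbers invented."""
    rng = random.Random(seed)
    short = 0
    for _ in range(n):
        scavenged = rng.uniform(0.0, 1000.0)   # kg water from lander residuals
        away_frac = rng.uniform(0.3, 0.7)      # crew time on rover excursions
        eva_hours = rng.uniform(100.0, 400.0)  # total EVA hours per mission
        leak_rate = rng.uniform(0.1, 0.5)      # kg water-equivalent lost / day
        used = 4 * 180 * 3.5                   # 4 crew, 3.5 kg/person-day
        # assume recovery is less efficient away from the outpost
        recovered = used * (0.9 - 0.2 * away_frac)
        losses = used + 0.24 * eva_hours + 180 * leak_rate
        if scavenged + recovered - losses < 0:
            short += 1
    return short / n

p_short = water_balance_mc()
```

    Sweeping one input range at a time while holding the others fixed turns the same loop into the sensitivity analysis the abstract describes.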

  19. The Workplace of the Future: Insights from Futures Scenarios and Today's High Performance Workplaces.

    ERIC Educational Resources Information Center

    Curtain, Richard

    1998-01-01

    Studies of the workplace of the future that used scenario-planning methodology and survey data suggest that nonmarket organizations will provide stability for temporary workers and result in the emergence of networks. Survey data suggest that future workplaces will foster intellectual capital through research and development. (JOW)

  20. Developing scenarios to assess future landslide risks: a model-based approach applied to mountainous regions

    NASA Astrophysics Data System (ADS)

    Vacquie, Laure; Houet, Thomas

    2016-04-01

    In the last century, European mountain landscapes have experienced significant transformations. Natural and anthropogenic changes, climate change, touristic and industrial development, socio-economic interactions, and their implications in terms of land use and land cover changes (LUCC) have directly influenced the spatial organization and vulnerability of mountain landscapes. This study is conducted as part of the SAMCO project funded by the French National Science Agency (ANR). It aims at developing a methodological approach, combining various tools, modelling platforms and methods, to identify regions vulnerable to landslide hazards while accounting for future LUCC. It presents an integrated approach combining participative scenarios and a LUCC simulation model to assess the combined effects of LUCC and climate change on landslide risks in the Cauterets valley (French Pyrenees) up to 2100. Through vulnerability and risk mapping, the objective is to gather information to support landscape planning and to implement land use strategies with local stakeholders for risk management. Four contrasting scenarios are developed, exhibiting distinct trajectories of socio-economic development. The prospective scenarios are based on national and international socio-economic contexts drawn from existing assessment reports. The methodological approach integrates knowledge from local stakeholders to refine each scenario during its construction and to reinforce its plausibility and relevance by accounting for local specificities, e.g. logging and pastoral activities, touristic development, urban planning, etc. A process-based model, the Forecasting Scenarios for Mountains (ForeSceM) model, developed on the Dinamica EGO modelling platform, is used to spatially allocate future LUCC for each prospective scenario. Concurrently, a spatial decision support tool, i.e. the SYLVACCESS model, is used to identify accessible areas for forestry in scenarios projecting logging

  1. Representing Instructional Material for Scenario-Based Guided-Discovery Courseware

    SciTech Connect

    Greitzer, Frank L.; Merrill, M. David; Rice, Douglas M.; Curtis, Darren S.

    2004-12-06

    The focus of this paper is to discuss paradigms for learning that are based on sound principles of human learning and cognition, and to discuss technical challenges that must be overcome in achieving this research goal through instructional system design (ISD) approaches that are cost-effective as well as conformant with today's interactive multimedia instruction standards. Fundamental concepts are to: engage learners to solve real-world problems (progress from simple to complex); relate material to previous experience; demonstrate what is to be learned using interactive, problem-centered activities rather than passive exposure to material; require learners to use their new knowledge to solve problems that demonstrate their knowledge in a relevant applied setting; and guide the learner with feedback and coaching early, then gradually withdraw this support as learning progresses. Many of these principles have been put into practice by employing interactive learning objects as re-usable components of larger, more integrated exercises. A challenge is to make even more extensive use of interactive, scenario-based activities within a guided-discovery framework. Because the design and construction of interactive, scenario-based learning objects and more complex integrated exercises is labor-intensive, this paper explores the use of interactive learning objects and associated representation schema for instructional content to facilitate development of tools for creating scenario-based, guided-discovery courseware.

  2. Fuzzy Cognitive Map scenario-based medical decision support systems for education.

    PubMed

    Georgopoulos, Voula C; Chouliara, Spyridoula; Stylios, Chrysostomos D

    2014-01-01

    Soft Computing (SC) techniques are based on exploiting human knowledge and experience and they are extremely useful to model any complex decision making procedure. Thus, they have a key role in the development of Medical Decision Support Systems (MDSS). The soft computing methodology of Fuzzy Cognitive Maps has successfully been used to represent human reasoning and to infer conclusions and decisions in a human-like way and thus, FCM-MDSSs have been developed. Such systems are able to assist in critical decision-making, support diagnosis procedures and consult medical professionals. Here a new methodology is introduced to expand the utilization of FCM-MDSS for learning and educational purposes using a scenario-based learning (SBL) approach. This is particularly important in medical education since it allows future medical professionals to safely explore extensive "what-if" scenarios in case studies and prepare for dealing with critical adverse events.
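
    The FCM inference the abstract refers to can be sketched as a simple iterative update: concept activations are repeatedly propagated through a signed weight matrix and passed through a squashing function until they settle at a fixed point. The three-concept map, its weights, and the sigmoid squashing below are illustrative assumptions for a minimal sketch, not the authors' actual medical model:

```python
import numpy as np

def fcm_step(state, weights):
    """One FCM inference step: each concept's new activation is the
    squashed sum of its own previous value (memory) and the weighted
    activations of the concepts influencing it (sigmoid squashing is
    one common choice)."""
    return 1.0 / (1.0 + np.exp(-(weights.T @ state + state)))

def fcm_infer(state, weights, tol=1e-6, max_iter=100):
    """Iterate until the concept activations converge to a fixed point."""
    for _ in range(max_iter):
        new_state = fcm_step(state, weights)
        if np.max(np.abs(new_state - state)) < tol:
            return new_state
        state = new_state
    return state

# Hypothetical 3-concept map: symptom -> test result -> diagnosis.
# W[i, j] is the causal influence of concept i on concept j.
W = np.array([[0.0, 0.7, 0.0],
              [0.0, 0.0, 0.8],
              [0.0, 0.0, 0.0]])
decision = fcm_infer(np.array([1.0, 0.0, 0.0]), W)
```

    In an SBL setting, a "what-if" scenario amounts to clamping or perturbing the initial activation vector and observing how the decision concepts respond.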

  3. Making pharmacogenomic-based prescribing alerts more effective: A scenario-based pilot study with physicians.

    PubMed

    Overby, Casey Lynnette; Devine, Emily Beth; Abernethy, Neil; McCune, Jeannine S; Tarczy-Hornoch, Peter

    2015-06-01

    To facilitate personalized drug dosing (PDD), this pilot study explored the communication effectiveness and clinical impact of using a prototype clinical decision support (CDS) system embedded in an electronic health record (EHR) to deliver pharmacogenomic (PGx) information to physicians. We employed a conceptual framework and measurement model to assess the impact of physician characteristics (previous experience, awareness, relative advantage, perceived usefulness), technology characteristics (method of implementation: semi-active/active; actionability: low/high) and a task characteristic (drug prescribed) on communication effectiveness (usefulness, confidence in the prescribing decision) and clinical impact (uptake, prescribing intent, change in drug dosing). Physicians performed prescribing tasks using five simulated clinical case scenarios, presented in random order within the prototype PGx-CDS system. Twenty-two physicians completed the study. The proportion of physicians who saw a relative advantage to using PGx-CDS was 83% at the start and 94% at the conclusion of our study. Physicians used semi-active alerts 74-88% of the time. There was no association between previous experience with, awareness of, or belief in a relative advantage of using PGx-CDS and improved uptake. The proportion of physicians reporting confidence in their prescribing decisions decreased significantly after using the prototype PGx-CDS system (p=0.02). Despite decreases in confidence, physicians perceived a relative advantage to using PGx-CDS, viewed semi-active alerts on most occasions, and more frequently changed doses toward doses supported by published evidence. Specifically, sixty-five percent of physicians reduced their dosing, significantly so for capecitabine (p=0.002) and mercaptopurine/thioguanine (p=0.03). 
    These findings suggest a need to improve our prototype such that PGx-CDS content is more useful and delivered in a way that improves physicians' confidence in their prescribing

  4. Analyzing Process Data from Game/Scenario-Based Tasks: An Edit Distance Approach

    ERIC Educational Resources Information Center

    Hao, Jiangang; Shu, Zhan; von Davier, Alina

    2015-01-01

    Students' activities in game/scenario-based tasks (G/SBTs) can be characterized by a sequence of time-stamped actions of different types with different attributes. For a subset of G/SBTs in which only the order of the actions is of great interest, the process data can be well characterized as a string of characters (i.e., action string) if we…
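
    The edit-distance approach named in the title can be illustrated with a standard Levenshtein computation over coded action strings, where each character stands for one action type; the example sequences below are hypothetical:

```python
def edit_distance(a, b):
    """Levenshtein distance between two action strings: the minimum
    number of insertions, deletions, and substitutions needed to turn
    one sequence of coded actions into the other."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))  # distances from a[:0] to every prefix of b
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # delete a[i-1]
                          curr[j - 1] + 1,     # insert b[j-1]
                          prev[j - 1] + cost)  # substitute (or match)
        prev = curr
    return prev[n]

# Two hypothetical students' coded action sequences differ by one action.
assert edit_distance("ABCAD", "ABDAD") == 1
assert edit_distance("kitten", "sitting") == 3
```

    Pairwise distances of this kind can then feed clustering or scoring of student process data.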

  5. Scenario analysis of energy-based low-carbon development in China.

    PubMed

    Zhou, Yun; Hao, Fanghua; Meng, Wei; Fu, Jiafeng

    2014-08-01

    China's increasing energy consumption and coal-dominant energy structure have contributed not only to severe environmental pollution, but also to global climate change. This article begins with a brief review of China's primary energy use and associated environmental problems and health risks. To analyze the potential of China's transition to low-carbon development, three scenarios are constructed to simulate energy demand and CO₂ emission trends in China up to 2050 using the Long-range Energy Alternatives Planning System (LEAP) model. Simulation results show that, assuming an average annual Gross Domestic Product (GDP) growth rate of 6.45%, total primary energy demand is expected to increase from 2009 levels by 63.4%, 48.8% and 12.2% in 2050 under the Business as Usual (BaU), Carbon Reduction (CR) and Integrated Low Carbon Economy (ILCE) scenarios, respectively. Total energy-related CO₂ emissions will increase from 6.7 billion tons in 2009 to 9.5, 11, 11.6 and 11.2 billion tons; 8.2, 9.2, 9.6 and 9 billion tons; and 7.1, 7.4, 7.2 and 6.4 billion tons in 2020, 2030, 2040 and 2050 under the BaU, CR and ILCE scenarios, respectively. Total CO₂ emissions will drop by 19.6% and 42.9% under the CR and ILCE scenarios in 2050, compared with the BaU scenario. To realize a substantial cut in energy consumption and carbon emissions, China needs to make a long-term low-carbon development strategy targeting further improvement of energy efficiency, optimization of the energy structure, deployment of clean coal technology and use of market-based economic instruments like energy/carbon taxation. PMID:25108719
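
    The reported 2050 reductions follow directly from the scenario figures quoted in the abstract; a quick arithmetic check (all numbers taken from the abstract):

```python
# 2050 energy-related CO2 emissions, billion tons, from the abstract
bau_2050, cr_2050, ilce_2050 = 11.2, 9.0, 6.4

cr_drop = (bau_2050 - cr_2050) / bau_2050 * 100
ilce_drop = (bau_2050 - ilce_2050) / bau_2050 * 100

print(f"CR vs BaU:   {cr_drop:.1f}%")   # matches the reported 19.6%
print(f"ILCE vs BaU: {ilce_drop:.1f}%") # matches the reported 42.9%
```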

  7. Measuring Engagement in Later Life Activities: Rasch-Based Scenario Scales for Work, Caregiving, Informal Helping, and Volunteering

    ERIC Educational Resources Information Center

    Ludlow, Larry H.; Matz-Costa, Christina; Johnson, Clair; Brown, Melissa; Besen, Elyssa; James, Jacquelyn B.

    2014-01-01

    The development of Rasch-based "comparative engagement scenarios" based on Guttman's facet theory and sentence mapping procedures is described. The scenario scales measuring engagement in work, caregiving, informal helping, and volunteering illuminate the lived experiences of role involvement among older adults and offer multiple…

  8. Thermal Performance Expectations of the Advanced Stirling Convertor Over a Range of Operating Scenarios

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Dyson, Rodger W.

    2010-01-01

    Objectives of this work are: (1) Assist the Science Mission Directorate in developing technologies for space missions. (2) Explore the capability of computational modeling to assist in the development of the Advanced Stirling Convertor. (3) Baseline computational simulation with available experimental data of the ASC. (4) Calculate peak external pressure vessel wall temperatures and compare them with anticipated values. (5) Calculate peak magnet temperatures inside the ASC over a range of operational scenarios.

  9. Performance-Based Funding Brief

    ERIC Educational Resources Information Center

    Washington Higher Education Coordinating Board, 2011

    2011-01-01

    A number of states have made progress in implementing performance-based funding (PFB) and accountability. This policy brief summarizes main features of performance-based funding systems in three states: Tennessee, Ohio, and Indiana. The brief also identifies key issues that states considering performance-based funding must address, as well as…

  10. [Scenario analysis on sustainable development of Sino-Singapore Tianjin Eco-city based on emergy and system dynamics].

    PubMed

    Li, Chun-fa; Cao, Ying-ying; Yang, Jian-cho; Yang, Qi-qi

    2015-08-01

    Dynamic evaluation of sustainable development is one of the key fundamental parts of the success of Sino-Singapore Tianjin Eco-city, which is the first eco-city in China constructed by international cooperation. Based on the analysis of nature and economy, function and structure, planning control indices and so on, we constructed a sustainable development evaluation index system and a system dynamics model of Sino-Singapore Tianjin Eco-city to explore dynamic trends of its population, material and currency by comprehensive utilization of emergy analysis and system dynamics method. Five scenarios were set up and simulated, including inertial scenario, scientific and technological scenario, economic scenario, environmental scenario and harmonious development scenario. Then, the sustainability of the 5 scenarios was evaluated and compared. The results showed that in the economy and environment sustainable development scenario, there was a steady growth trend of GDP, accumulation of both emergy and currency, and relatively lower values in emergy waste ratio, emergy ratio of waste, and emergy loading ratio. Although both sustainable evaluation indices, such as ESI and UEI, were relatively low, the economy and environment sustainable development scenario was still the best development scenario which was more active than others. PMID:26685610

  12. Exposure to sulfosulfuron in agricultural drainage ditches: field monitoring and scenario-based modelling.

    PubMed

    Brown, Colin D; Dubus, Igor G; Fogg, Paul; Spirlet, Marie; Gustin, Christophe

    2004-08-01

    Field monitoring and scenario-based modelling were used to assess exposure of small ditches in the UK to the herbicide sulfosulfuron following transport via field drains. A site in central England on a high pH, clay soil was treated with sulfosulfuron, and concentrations were monitored in the single drain outfall and in the receiving ditch 1 km downstream. Drainflow in the nine months following application totalled 283 mm. Pesticide lost in the first 12.5 mm of flow was 99% of the total loading to drains (0.5% of applied). Significant dilution was observed in the receiving ditch and quantifiable residues were only detected in one sample (0.06 microg litre(-1)). The MACRO model was evaluated against the field data with minimal calibration. The parameterisation over-estimated the importance of macropore flow at the site. As a consequence, the maximum concentration in drainflow (2.3 microg litre(-1)) and the total loading to drains (0.76 g) were over-estimated by factors of 2.4 and 5, respectively. MACRO was then used to simulate long-term fate of the herbicide for each of 20 environmental scenarios. Resulting estimates for concentrations of sulfosulfuron in a receiving ditch were weighted according to the prevalence of each scenario to produce a probability distribution of daily exposure. PMID:15307668

  13. Subjective-probability-based scenarios for uncertain input parameters: Stratospheric ozone depletion

    SciTech Connect

    Hammitt, J.K.

    1990-04-01

    Risk analysis often depends on complex, computer-based models to describe links between policies (e.g., required emission-control equipment) and consequences (e.g., probabilities of adverse health effects). Appropriate specification of many model aspects is uncertain, including details of the model structure; transport, reaction-rate, and other parameters; and application-specific inputs such as pollutant-release rates. Because these uncertainties preclude calculation of the precise consequences of a policy, it is important to characterize the plausible range of effects. In principle, a probability distribution function for the effects can be constructed using Monte Carlo analysis, but the combinatorics of multiple uncertainties and the often high cost of model runs quickly exhaust available resources. A method to choose sets of input conditions (scenarios) that efficiently represent knowledge about the joint probability distribution of inputs is presented and applied. A simple score function approximately relating inputs to a policy-relevant output, in this case, globally averaged stratospheric ozone depletion, is developed. The probability density function for the score-function value is analytically derived from a subjective joint probability density for the inputs. Scenarios are defined by selected quantiles of the score function. Using this method, scenarios can be systematically selected in terms of the approximate probability distribution function for the output of concern, and probability intervals for the joint effect of the inputs can be readily constructed.
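
    The quantile-based selection described above can be sketched as follows. The linear score-function weights and the input distributions are illustrative placeholders standing in for the approximate input-to-output relation (here, ozone depletion), not the model from the report:

```python
import numpy as np

rng = np.random.default_rng(0)

def score(inputs):
    """Hypothetical score function: a weighted sum approximating how the
    uncertain inputs jointly drive the policy-relevant output."""
    weights = np.array([0.5, 0.3, 0.2])
    return inputs @ weights

# Sample the subjective joint probability distribution of three inputs
# (independent normals here, purely for illustration).
samples = rng.normal(loc=[1.0, 2.0, 0.5], scale=[0.2, 0.5, 0.1],
                     size=(100_000, 3))
scores = score(samples)

# Define scenarios at selected quantiles of the score distribution:
# for each target quantile, keep the sampled input vector whose score
# is closest to that quantile value.
quantiles = [0.05, 0.50, 0.95]
targets = np.quantile(scores, quantiles)
scenarios = [samples[np.argmin(np.abs(scores - t))] for t in targets]
```

    The few resulting scenarios can then be run through the expensive full model, giving approximate probability intervals for the output without an exhaustive Monte Carlo campaign.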

  14. CMIP5 Global Climate Model Performance Evaluation and Climate Scenario Development over the South-Central United States

    NASA Astrophysics Data System (ADS)

    Rosendahl, D. H.; Rupp, D. E.; Mcpherson, R. A.; Moore, B., III

    2015-12-01

    Future climate change projections from Global Climate Models (GCMs) are the primary drivers of regional downscaling and impacts research - from which relevant information for stakeholders is generated at the regional and local levels. Therefore understanding uncertainties in GCMs is a fundamental necessity if the scientific community is to provide useful and reliable future climate change information that can be utilized by end users and decision makers. Two different assessments of the Coupled Model Intercomparison Project Phase 5 (CMIP5) GCM ensemble were conducted for the south-central United States. The first was a performance evaluation over the historical period for metrics of near surface meteorological variables (e.g., temperature, precipitation) and system-based phenomena, which include large-scale processes that can influence the region (e.g., low-level jet, ENSO). These metrics were used to identify a subset of models of higher performance across the region which were then used to constrain future climate change projections. A second assessment explored climate scenario development where all model climate change projections were assumed equally likely and future projections with the highest impact were identified (e.g., temperature and precipitation combinations of hottest/driest, hottest/wettest, and highest variability). Each of these assessments identify a subset of models that may prove useful to regional downscaling and impacts researchers who may be restricted by the total number of GCMs they can utilize. Results from these assessments will be provided as well as a discussion on when each would be useful and appropriate to use.

  15. Environmental performance of construction waste: Comparing three scenarios from a case study in Catalonia, Spain.

    PubMed

    Ortiz, O; Pasqualino, J C; Castells, F

    2010-04-01

    The main objective of this paper is to evaluate environmental impacts of construction wastes in terms of the LIFE 98 ENV/E/351 project. Construction wastes are classified in accordance with the Life Program Environment Directive of the European Commission. Three alternative scenarios for current waste management from a case study in Catalonia (Spain) have been compared: landfilling, recycling and incineration; these scenarios were evaluated by means of Life Cycle Assessment (LCA). The recommendations of the Catalan Waste Catalogue and the European Waste Catalogue have been taken into account, and the influence of transport has been evaluated. Results show that in terms of Global Warming Potential (GWP), the most environmentally friendly treatment was recycling, followed by incineration and lastly landfilling. Regarding the influence of treatment plant location on the GWP indicator, we observe that incineration and recycling of construction wastes are better than landfilling, even for long distances from the building site to the plants. This is true for most wastes except the stony types, which should be recycled close to the building site. In summary, data from construction waste of a Catalan case study was evaluated using the well-established method of LCA to determine the environmental impacts. PMID:20005694

  16. Environmental performance of construction waste: Comparing three scenarios from a case study in Catalonia, Spain

    SciTech Connect

    Ortiz, O.; Pasqualino, J.C.; Castells, F.

    2010-04-15

    The main objective of this paper is to evaluate environmental impacts of construction wastes in terms of the LIFE 98 ENV/E/351 project. Construction wastes are classified in accordance with the Life Program Environment Directive of the European Commission. Three alternative scenarios for current waste management from a case study in Catalonia (Spain) have been compared: landfilling, recycling and incineration; these scenarios were evaluated by means of Life Cycle Assessment (LCA). The recommendations of the Catalan Waste Catalogue and the European Waste Catalogue have been taken into account, and the influence of transport has been evaluated. Results show that in terms of Global Warming Potential (GWP), the most environmentally friendly treatment was recycling, followed by incineration and lastly landfilling. Regarding the influence of treatment plant location on the GWP indicator, we observe that incineration and recycling of construction wastes are better than landfilling, even for long distances from the building site to the plants. This is true for most wastes except the stony types, which should be recycled close to the building site. In summary, data from construction waste of a Catalan case study was evaluated using the well-established method of LCA to determine the environmental impacts.

  17. Model based systems engineering (MBSE) applied to Radio Aurora Explorer (RAX) CubeSat mission operational scenarios

    NASA Astrophysics Data System (ADS)

    Spangelo, S. C.; Cutler, J.; Anderson, L.; Fosse, E.; Cheng, L.; Yntema, R.; Bajaj, M.; Delp, C.; Cole, B.; Soremekum, G.; Kaslow, D.

    Small satellites are far more constrained in mass, power, volume, delivery timelines, and financial cost than their larger counterparts. Small satellites are operationally challenging because subsystem functions are coupled and constrained by the limited available commodities (e.g. data, energy, and access times to ground resources). Furthermore, additional operational complexities arise because small satellite components are physically integrated, which may yield thermal or radio frequency interference. In this paper, we extend our initial Model Based Systems Engineering (MBSE) framework developed for a small satellite mission by demonstrating the ability to model different behaviors and scenarios. We integrate several simulation tools to execute SysML-based behavior models, including subsystem functions and internal states of the spacecraft. We demonstrate the utility of this approach to drive the system analysis and design process. We demonstrate the applicability of the simulation environment to capture realistic satellite operational scenarios, which include energy collection, data acquisition, and downloading to ground stations. The integrated modeling environment enables users to extract feasibility, performance, and robustness metrics. This enables visualization of both the physical states (e.g. position, attitude) and functional states (e.g. operating points of various subsystems) of the satellite for representative mission scenarios. The modeling approach presented in this paper offers satellite designers and operators the opportunity to assess the feasibility of vehicle and network parameters, as well as the feasibility of operational schedules. This will enable future missions to benefit from using these models throughout the full design, test, and fly cycle. 
In particular, vehicle and network parameters and schedules can be verified prior to being implemented, during mission operations, and can also be updated in near real-time with oper
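
    A commodity-tracking simulation of the sort described (energy collection, data acquisition, ground-station downloads) can be sketched in a few lines; all rates, capacities, and the orbit timeline below are made-up placeholders for illustration, not RAX parameters:

```python
from dataclasses import dataclass

@dataclass
class SatState:
    energy_wh: float   # battery charge, Wh
    data_mb: float     # onboard data buffer, MB

def step(state, in_sun, over_station, dt_s=60):
    """One illustrative time step of a small-satellite commodity model:
    collect energy in sunlight, acquire data continuously, and download
    when over a ground station and the battery is healthy."""
    charge = 2.0 if in_sun else 0.0        # W collected (placeholder)
    load = 1.5                             # W consumed by the bus
    energy = min(state.energy_wh + (charge - load) * dt_s / 3600, 20.0)
    data = state.data_mb + 0.1             # MB acquired per step
    if over_station and energy > 1.0:
        data = max(data - 5.0, 0.0)        # MB downlinked per pass step
    return SatState(max(energy, 0.0), data)

# One hypothetical 90-minute orbit: 60 min sunlight, 10 min ground pass.
sat = SatState(energy_wh=10.0, data_mb=0.0)
for minute in range(90):
    sat = step(sat, in_sun=minute < 60, over_station=80 <= minute < 90)
```

    Stepping a state model like this through a mission timeline is what lets the feasibility of an operational schedule (does the buffer drain, does the battery survive eclipse) be checked before it is implemented.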

  18. [Study on strategies of pollution prevention in coastal city of Zhejiang Province based on scenario analysis].

    PubMed

    Tian, Jin-Ping; Chen, Lü-Jun; Du, Peng-Fei; Qian, Yi

    2013-01-01

    Scenario analysis was used to study the environmental burden in a coastal city of Zhejiang Province under different patterns of economic development. The aim of this research is to propose advice on decision-making by illustrating how emissions can be reduced by transforming the pattern of economic development in a developed coastal area that has reached a GDP of 70 000 yuan per capita. First, 18 heavy-pollution industries were screened out by referencing total emissions of chemical oxygen demand, ammonia-nitrogen, sulfur dioxide, and nitrogen oxide. Then, a scenario analysis model and the corresponding back-calculation program were designed to study the sustainable development of the heavy-pollution industries. With 2008 and 2015 as the reference year and the target year respectively, emissions of the four pollutants mentioned above in the 18 heavy-pollution industries in the city were analyzed under six scenarios. The requirement that total emissions of the four pollutants be reduced to an expected degree is set as the constraint of the scenario analysis. Finally, some suggestions for decision-making are put forward, which include maintaining a moderate GDP growth rate of around 7%, strengthening the adjustment of the economic structure, controlling the growth rate of industrial added value in the heavy-pollution industries, optimizing the structure of those industries, decreasing the intensity of waste emissions by implementing cleaner production to reduce emissions at the source, and strengthening regulation of the operation of waste treatment plants to further promote the efficiency of waste treatment. Only by implementing such measures can the total emissions of chemical oxygen demand, ammonia-nitrogen, sulfur dioxide, and nitrogen oxide of the 18 heavy-pollution industries in the city be reduced by 10%, 10%, 5%, and 15%, respectively, relative to the reference year. PMID:23487960

  19. Scenario based seismic hazard assessment and its application to the seismic verification of relevant buildings

    NASA Astrophysics Data System (ADS)

    Romanelli, Fabio; Vaccari, Franco; Altin, Giorgio; Panza, Giuliano

    2016-04-01

    The procedure we developed, and applied to a few relevant cases, leads to the seismic verification of a building by: a) using a scenario-based neo-deterministic approach (NDSHA) for the calculation of the seismic input, and b) controlling the numerical model of an existing building using free-vibration measurements of the real structure. The key point of this approach is the close collaboration of the seismologist and the civil engineer, from the definition of the seismic input to the monitoring of the building's response in the calculation phase. The vibrometric study allows the engineer to adjust the computational model in the direction suggested by the experimental result of a physical measurement. Once the model has been calibrated by vibrometric analysis, one can select in the design spectrum the proper range of periods of interest for the structure. Then, realistic values of spectral acceleration can be selected, which include the appropriate amplification obtained through the modelling of a "scenario" input applied to the final model. Generally, but not necessarily, the "scenario" spectra lead to higher accelerations than those deduced from the national codes (e.g. NTC 2008 for Italy). The task of the verifying engineer is to ensure that the outcome of the verification is both conservative and realistic. We show some examples of the application of the procedure to relevant buildings (e.g. schools) of the Trieste Province. The adoption of the scenario input has, in most cases, increased the number of critical elements that have to be taken into account in the design of reinforcements. However, the higher cost associated with the increase in elements to reinforce is reasonable, especially considering the important reduction of the risk level.

  1. What Did I Do? A Scenario-Based Program To Assist Specific Learning Disabled Adolescents in Understanding Legal Issues.

    ERIC Educational Resources Information Center

    McDougall, Donna M.

    This practicum was designed to train eight adolescents with specific learning disabilities (SLD) about their legal rights and responsibilities, through a scenario-based program presented in the classroom as part of a transition program. The practicum involved the development of 22 scenarios, a pretest and posttest, and discussions and role-playing…

  2. Developing Authentic Online Problem-Based Learning Case Scenarios for Teachers of Students with Visual Impairments in the United Kingdom

    ERIC Educational Resources Information Center

    McLinden, Mike; McCall, Steve; Hinton, Danielle; Weston, Annette

    2010-01-01

    This article reports on the development of online problem-based learning case scenarios for use in a distance education program for teachers of students with visual impairments in the United Kingdom. Following participation in two case scenarios, a cohort of teachers provided feedback. This feedback was analyzed in relation to the relevant…

  3. Incorporating scenario-based simulation into a hospital nursing education program.

    PubMed

    Nagle, Beth M; McHale, Jeanne M; Alexander, Gail A; French, Brian M

    2009-01-01

    Nurse educators are challenged to provide meaningful and effective learning opportunities for both new and experienced nurses. Simulation as a teaching and learning methodology is being embraced by nursing in academic and practice settings to provide innovative educational experiences to assess and develop clinical competency, promote teamwork, and improve care processes. This article provides an overview of the historical basis for using simulation in education, simulation methodologies, and perceived advantages and disadvantages. It also provides a description of the integration of scenario-based programs using a full-scale patient simulator into nursing education programming at a large academic medical center.

  4. A method for scenario-based risk assessment for robust aerospace systems

    NASA Astrophysics Data System (ADS)

    Thomas, Victoria Katherine

    In years past, aircraft conceptual design centered on creating a feasible aircraft that could be built and could fly the required missions. More recently, aircraft viability entered into conceptual design, recognizing that the product's potential to be profitable should also be examined early in the design process. While examining an aerospace system's feasibility and viability early in the design process is extremely important, it is also important to examine system risk. In traditional aerospace systems risk analysis, risk is examined from the perspective of performance, schedule, and cost. Recently, safety and reliability analysis have been brought forward in the design process to also be examined during late conceptual and early preliminary design. While these analyses work as designed, existing risk analysis methods and techniques are not designed to examine an aerospace system's external operating environment and the risks present there. A new method has been developed here to examine, during the early part of concept design, the risk associated with not meeting assumptions about the system's external operating environment. The risks are examined in five categories: employment, culture, government and politics, economics, and technology. The risks are examined over a long time period, up to the system's entire life cycle. The method consists of eight steps over three focus areas. The first focus area is Problem Setup, during which the problem is defined and understood to the best of the decision maker's ability. There are four steps in this area, in the following order: Establish the Need, Scenario Development, Identify Solution Alternatives, and Uncertainty and Risk Identification. There is significant iteration between steps two through four. Focus area two is Modeling and Simulation. In this area the solution alternatives and risks are modeled, and a numerical value for risk is calculated. A risk mitigation model is also created. The four steps…

  5. The Impact of New Estimates of Mixing Ratio and Flux-based Halogen Scenarios on Ozone Evolution

    NASA Technical Reports Server (NTRS)

    Oman, Luke D.; Douglass, Anne R.; Liang, Qing; Strahan, Susan E.

    2014-01-01

    The evolution of ozone in the 21st century has been shown to be mainly affected by the halogen emissions scenario and predicted changes in the circulation of the stratosphere. New estimates of mixing-ratio and flux-based emission scenarios have been produced from the SPARC Lifetime Assessment 2013. Simulations using the Goddard Earth Observing System Chemistry-Climate Model (GEOSCCM) are conducted using this new A1 2014 halogen scenario and compared to ones using the A1 2010 scenario. This updated version of GEOSCCM includes a realistic representation of the Quasi-Biennial Oscillation and improvements related to the breakup of the Antarctic polar vortex. We present results of the ozone evolution over the recent past and the 21st century for the A1 2010 scenario, the A1 2014 mixing-ratio scenario, and an A1 2014 flux-based halogen scenario. Implications of the uncertainties in these estimates, as well as those from possible circulation changes, will be discussed.

  6. TRIDEC Cloud - a Web-based Platform for Tsunami Early Warning tested with NEAMWave14 Scenarios

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven; Necmioglu, Ocal; Comoglu, Mustafa; Ozer Sozdinler, Ceren; Carrilho, Fernando; Wächter, Joachim

    2015-04-01

    In times of cloud computing and ubiquitous computing, the use of concepts and paradigms introduced by information and communications technology (ICT) has to be considered even for early warning systems (EWS). Based on the experience and knowledge gained in research projects, new technologies are exploited to implement a cloud-based and web-based platform, the TRIDEC Cloud, to open up new prospects for EWS. The platform in its current version addresses tsunami early warning and mitigation. It merges several complementary external and in-house cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPU), for web mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The TRIDEC Cloud can be accessed in two different modes: the monitoring mode and the exercise-and-training mode. The monitoring mode provides important functionality required to act in a real event. So far, the monitoring mode integrates historic and real-time sea level data and the latest earthquake information. The integration of sources is supported by a simple and secure interface. The exercise-and-training mode enables training and exercises with virtual scenarios. This mode disconnects real-world systems and connects with a virtual environment that receives virtual earthquake information and virtual sea level data re-played by a scenario player. Thus, operators and other stakeholders are able to train skills and prepare for real events and large exercises. The GFZ German Research Centre for Geosciences (GFZ), the Kandilli Observatory and Earthquake Research Institute (KOERI), and the Portuguese Institute for the Sea and Atmosphere (IPMA) have used the opportunity provided by NEAMWave14 to test the TRIDEC Cloud as a collaborative activity based on previous partnership and commitments at…

  7. Land-use threats and protected areas: a scenario-based, landscape level approach

    USGS Publications Warehouse

    Wilson, Tamara S.; Sleeter, Benjamin M.; Sleeter, Rachel R.; Soulard, Christopher E.

    2014-01-01

    Anthropogenic land use will likely present a greater challenge to biodiversity than climate change this century in the Pacific Northwest, USA. Even if species are equipped with the adaptive capacity to migrate in the face of a changing climate, they will likely encounter a human-dominated landscape as a major dispersal obstacle. Our goal was to identify, at the ecoregion level, protected areas in close proximity to lands with a higher likelihood of future land-use conversion. Using a state-and-transition simulation model, we modeled spatially explicit (1 km²) land use from 2000 to 2100 under seven alternative land-use and emission scenarios for ecoregions in the Pacific Northwest. We analyzed scenario-based land-use conversion threats from logging, agriculture, and development near existing protected areas. A conversion threat index (CTI) was created to identify ecoregions with the highest projected land-use conversion potential within closest proximity to existing protected areas. Our analysis indicated nearly 22% of land area in the Coast Range, over 16% of land area in the Puget Lowland, and nearly 11% of the Cascades had very high CTI values. Broader regional-scale land-use change is projected to impact nearly 40% of the Coast Range, 30% of the Puget Lowland, and 24% of the Cascades (i.e., the two highest CTI classes). A landscape-level, scenario-based approach to modeling future land use helps identify ecoregions with existing protected areas at greater risk from regional land-use threats and can help prioritize future conservation efforts.

  8. Scenario-based design: A method for connecting information system design with public health operations and emergency management

    PubMed Central

    Reeder, Blaine; Turner, Anne M

    2011-01-01

    Responding to public health emergencies requires rapid and accurate assessment of workforce availability under adverse and changing circumstances. However, public health information systems to support resource management during both routine and emergency operations are currently lacking. We applied scenario-based design as an approach to engage public health practitioners in the creation and validation of an information design to support routine and emergency public health activities. Methods: Using semi-structured interviews we identified the information needs and activities of senior public health managers of a large municipal health department during routine and emergency operations. Results: Interview analysis identified twenty-five information needs for public health operations management. The identified information needs were used in conjunction with scenario-based design to create twenty-five scenarios of use and a public health manager persona. Scenarios of use and persona were validated and modified based on follow-up surveys with study participants. Scenarios were used to test and gain feedback on a pilot information system. Conclusion: The method of scenario-based design was applied to represent the resource management needs of senior-level public health managers under routine and disaster settings. Scenario-based design can be a useful tool for engaging public health practitioners in the design process and to validate an information system design. PMID:21807120

  9. Emergence of the First Catalytic Oligonucleotides in a Formamide-Based Origin Scenario.

    PubMed

    Šponer, Judit E; Šponer, Jiří; Nováková, Olga; Brabec, Viktor; Šedo, Ondrej; Zdráhal, Zbyněk; Costanzo, Giovanna; Pino, Samanta; Saladino, Raffaele; Di Mauro, Ernesto

    2016-03-01

    Fifty years after the historic Miller-Urey experiment, the formamide-based scenario is perhaps the most powerful competing hypothesis for the origin of life on our planet besides the traditional HCN-based concept. The information accumulated on this topic during the last 15 years is growing astonishingly, and nowadays the formamide-based model represents one of the most complete and coherent pathways leading from simple prebiotic precursors up to the first catalytically active RNA molecules. In this work, we review the major events of this long pathway that have emerged from recent experimental and theoretical studies, concentrating mainly on the mechanistic, methodological, and structural aspects of this research.

  10. Multi-Purpose Avionic Architecture for Vision Based Navigation Systems for EDL and Surface Mobility Scenarios

    NASA Astrophysics Data System (ADS)

    Tramutola, A.; Paltro, D.; Cabalo Perucha, M. P.; Paar, G.; Steiner, J.; Barrio, A. M.

    2015-09-01

    Vision Based Navigation (VBNAV) has been identified as a valid technology to support space exploration because it can improve the autonomy and safety of space missions. Several mission scenarios can benefit from VBNAV: Rendezvous & Docking, Fly-Bys, Interplanetary cruise, Entry Descent and Landing (EDL) and Planetary Surface exploration. For some of them, VBNAV can improve the accuracy of state estimation as an additional relative navigation sensor or as an absolute navigation sensor. For others, like surface mobility and terrain exploration for path identification and planning, VBNAV is mandatory. This paper presents the general avionic architecture of a Vision Based System as defined in the frame of the ESA R&T study “Multi-purpose Vision-based Navigation System Engineering Model - part 1 (VisNav-EM-1)” with special focus on the surface mobility application.

  11. Earthquake Scenario-Based Tsunami Wave Heights in the Eastern Mediterranean and Connected Seas

    NASA Astrophysics Data System (ADS)

    Necmioglu, Ocal; Özel, Nurcan Meral

    2015-12-01

    We identified a set of tsunami scenario input parameters in a 0.5° × 0.5° uniformly gridded area in the Eastern Mediterranean, Aegean (both for shallow- and intermediate-depth earthquakes) and Black Seas (only shallow earthquakes) and calculated tsunami scenarios using the SWAN-Joint Research Centre (SWAN-JRC) code (Mader 2004; Annunziato 2007) with 2-arcmin resolution bathymetry data for the range of Mw 6.5 to Mwmax, with an Mw increment of 0.1 at each grid point, in order to realize a comprehensive analysis of tsunami wave heights from earthquakes originating in the region. We defined characteristic earthquake source parameters from a compiled set of sources such as existing moment tensor catalogues and various reference studies, together with the Mwmax assigned in the literature, where possible. Results from 2,415 scenarios show that in the Eastern Mediterranean and its connected seas (Aegean and Black Sea), shallow earthquakes with Mw ≥ 6.5 may result in coastal wave heights of 0.5 m, whereas the same wave height would be expected only from intermediate-depth earthquakes with Mw ≥ 7.0. The distribution of maximum wave heights calculated indicates that tsunami wave heights up to 1 m could be expected in the northern Aegean, whereas in the Black Sea, Cyprus, the Levantine coasts, northern Libya, eastern Sicily, southern Italy, and western Greece, wave heights up to 3 m could be possible. Crete, the southern Aegean, and the area between northeast Libya and Alexandria (Egypt) are prone to maximum tsunami wave heights of >3 m. Considering that calculations are performed at a minimum bathymetry depth of 20 m, these wave heights may, according to Green's Law, be amplified by a factor of 2 at the coastline. The study can provide a basis for detailed tsunami hazard studies in the region.
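
    The Green's Law amplification invoked above relates wave height to water depth through a fourth-root scaling, H2 = H1 * (d1/d2)^(1/4). A minimal sketch (the 1.25 m coastal depth is an illustrative value chosen to reproduce the factor-of-2 amplification, not a figure from the study):

```python
def greens_law_amplification(h_offshore, d_offshore, d_coast):
    """Shoaled wave height via Green's Law: H2 = H1 * (d1 / d2) ** 0.25."""
    return h_offshore * (d_offshore / d_coast) ** 0.25

# A 1.0 m wave height computed at 20 m depth, shoaling to 1.25 m depth:
# (20 / 1.25) ** 0.25 = 16 ** 0.25 = 2, i.e. the factor-of-2 amplification.
print(greens_law_amplification(1.0, 20.0, 1.25))
```

    The fourth-root dependence means amplification grows only slowly as the wave approaches the shore, which is why stopping the computation at 20 m depth still bounds the coastal height reasonably well.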

  12. Variance-based global sensitivity analysis for multiple scenarios and models with implementation using sparse grid collocation

    NASA Astrophysics Data System (ADS)

    Dai, Heng; Ye, Ming

    2015-09-01

    Sensitivity analysis is a vital tool in hydrological modeling for identifying influential parameters for inverse modeling and uncertainty analysis, and variance-based global sensitivity analysis has gained popularity. However, the conventional global sensitivity indices are defined with consideration of only parametric uncertainty. Based on a hierarchical structure of parameter, model, and scenario uncertainties and on recently developed techniques of model- and scenario-averaging, this study derives new global sensitivity indices for multiple models and multiple scenarios. To reduce the computational cost of variance-based global sensitivity analysis, the sparse grid collocation method is used to evaluate the mean and variance terms involved. In a simple synthetic case of groundwater flow and reactive transport, it is demonstrated that the global sensitivity indices vary substantially between the four models and three scenarios. Not considering the model and scenario uncertainties might result in biased identification of important model parameters. This problem is resolved by using the new indices defined for multiple models and/or multiple scenarios; this is particularly true when the sensitivity indices and model/scenario probabilities vary substantially. The sparse grid collocation method dramatically reduces the computational cost in comparison with the popular quasi-random sampling method. The new framework of global sensitivity analysis is mathematically general, and can be applied to a wide range of hydrologic and environmental problems.
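
    For a model that is linear in independent inputs, the first-order variance-based (Sobol) index has a closed form, S_i = a_i² Var(X_i) / Var(Y), which makes a compact sketch of the idea; the scenario-averaging helper below is only an illustrative simplification of the paper's hierarchical definition, not its actual formulation:

```python
def first_order_sobol_linear(coeffs, variances):
    """First-order Sobol indices for Y = sum(a_i * X_i) with independent X_i:
    S_i = a_i^2 * Var(X_i) / Var(Y), where Var(Y) = sum_j a_j^2 * Var(X_j)."""
    var_y = sum(a * a * v for a, v in zip(coeffs, variances))
    return [a * a * v / var_y for a, v in zip(coeffs, variances)]

def scenario_averaged_index(indices_per_scenario, scenario_probs):
    """Probability-weighted average of one parameter's first-order index
    across scenarios (illustrative simplification of scenario-averaging)."""
    return sum(p * s for p, s in zip(scenario_probs, indices_per_scenario))

# Y = 3*X1 + X2 with unit-variance inputs: X1 explains 9/10 of Var(Y).
print(first_order_sobol_linear([3.0, 1.0], [1.0, 1.0]))
```

    If a parameter's index is 0.9 under one scenario but 0.5 under another, a single-scenario analysis can misrank parameters; weighting by scenario probabilities is what guards against that bias.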

  13. Policy Choice for Urban Low-carbon transportation in Beijing: Scenario Analysis Based on LEAP model

    NASA Astrophysics Data System (ADS)

    Zhang, Yu

    2016-04-01

    Beijing is a fast-developing megacity with serious traffic problems, such as high energy consumption, high CO2 emissions and traffic congestion. The coming 13th Five-Year Plan for Beijing's economic and social development will focus on low-carbon transportation policy to achieve sustainable development of urban traffic. In order to improve the feasibility of urban low-carbon transportation policies, this paper analyzes the future trends of CO2 emissions from transportation in Beijing. First, six policy scenarios are developed according to the coming Beijing 13th Five-Year Plan: "Business As Usual (BAU)", "Public Transportation Priority (PTP)", "New Energy Vehicle (NEV)", "Active Transportation (AT)", "Private Car Regulation (PCR)" and "Hybrid Policy (HP)". Then the Long-range Energy Alternatives Planning System (LEAP) model framework is adopted to estimate CO2 emissions under the given policy scenarios up to the year 2020 and to analyze the implications. The results demonstrate that low-carbon transportation policies can reduce CO2 emissions effectively. Specifically, the "Hybrid Policy (HP)" has the best performance. In terms of single-policy effect, "Private Car Regulation (PCR)" comes first, followed by "Public Transportation Priority (PTP)".
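
    LEAP-style accounting of the kind described above is bottom-up: for each transport mode, emissions are activity × energy intensity × emission factor, summed across modes. A minimal sketch with invented illustrative numbers (not Beijing data):

```python
# mode: (activity [billion passenger-km], energy intensity [MJ/pkm],
#        CO2 emission factor [g CO2/MJ]) -- all values illustrative
modes = {
    "car":   (100.0, 2.0, 70.0),
    "bus":   (50.0, 0.5, 70.0),
    "metro": (30.0, 0.3, 150.0),  # electric; factor reflects grid mix
}

def transport_co2_mt(modes):
    """Total CO2 in megatonnes: sum over modes of activity * intensity * factor."""
    grams = sum(pkm_billion * 1e9 * mj_per_pkm * g_per_mj
                for pkm_billion, mj_per_pkm, g_per_mj in modes.values())
    return grams / 1e12  # grams -> megatonnes

print(transport_co2_mt(modes))
```

    A policy scenario is then just a different `modes` table: a PCR-like scenario shrinks the car activity row, while an NEV-like scenario lowers the car emission factor.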

  14. Dynamic optimization of ISR sensors using a risk-based reward function applied to ground and space surveillance scenarios

    NASA Astrophysics Data System (ADS)

    DeSena, J. T.; Martin, S. R.; Clarke, J. C.; Dutrow, D. A.; Newman, A. J.

    2012-06-01

    As the number and diversity of sensing assets available for intelligence, surveillance and reconnaissance (ISR) operations continue to expand, the limited ability of human operators to effectively manage, control and exploit the ISR ensemble is exceeded, leading to reduced operational effectiveness. Automated support, both in the processing of voluminous sensor data and in sensor asset control, can relieve the burden on human operators and support operation of larger ISR ensembles. In dynamic environments it is essential to react quickly to current information to avoid stale, sub-optimal plans. Our approach is to apply the principles of feedback control to ISR operations, "closing the loop" from the sensor collections through automated processing to ISR asset control. Previous work by the authors demonstrated non-myopic multiple-platform trajectory control using a receding horizon controller in a closed feedback loop with a multiple hypothesis tracker, applied to multi-target search and track simulation scenarios in the ground and space domains. This paper presents extensions in both size and scope of the previous work, demonstrating closed-loop control, involving both platform routing and sensor pointing, of a multisensor, multi-platform ISR ensemble tasked with providing situational awareness and performing search, track and classification of multiple moving ground targets in irregular warfare scenarios. The closed-loop ISR system is fully realized using distributed, asynchronous components that communicate over a network. The closed-loop ISR system has been exercised via a networked simulation test bed against a scenario in the Afghanistan theater implemented using high-fidelity terrain and imagery data. In addition, the system has been applied to space surveillance scenarios requiring tracking of space objects, where current deliberative, manually intensive processes for managing sensor assets are insufficiently responsive. Simulation experiment results are presented…

  15. Application of State Analysis and Goal-Based Operations to a MER Mission Scenario

    NASA Technical Reports Server (NTRS)

    Morris, J. Richard; Ingham, Michel D.; Mishkin, Andrew H.; Rasmussen, Robert D.; Starbird, Thomas W.

    2006-01-01

    State Analysis is a model-based systems engineering methodology employing a rigorous discovery process which articulates operations concepts and operability needs as an integrated part of system design. The process produces requirements on system and software design in the form of explicit models which describe the behavior of states and the relationships among them. By applying State Analysis to an actual MER flight mission scenario, this study addresses the specific real world challenges of complex space operations and explores technologies that can be brought to bear on future missions. The paper describes the tools currently used on a daily basis for MER operations planning and provides an in-depth description of the planning process, in the context of a Martian day's worth of rover engineering activities, resource modeling, flight rules, science observations, and more. It then describes how State Analysis allows for the specification of a corresponding goal-based sequence that accomplishes the same objectives, with several important additional benefits.

  16. Scenario analysis and path selection of low-carbon transformation in China based on a modified IPAT model.

    PubMed

    Chen, Liang; Yang, Zhifeng; Chen, Bin

    2013-01-01

    This paper presents a forecast and analysis of population, economic development, energy consumption and CO2 emission variation in China over short- and long-term horizons up to 2020, with 2007 as the base year. The widely applied IPAT model, reformulated as the Kaya equation and used as the basis for calculations, projections, and scenarios of greenhouse gases (GHGs), is extended to analyze and predict the relations between human activities and the environment. Four CO2 emission scenarios are used: business as usual (BAU), energy efficiency improvement (EEI), low carbon (LC) and enhanced low carbon (ELC). The results show that under the LC scenario carbon intensity will be reduced by 40-45% as scheduled and China's economic growth rate will be 6% by 2020. The LC scenario, as the most appropriate and most feasible scheme for China's low-carbon development in the future, can maximize the harmonious development of the economic, social, energy and environmental systems. Assuming China's development follows the LC scenario, the paper further gives four paths of low-carbon transformation in China: technological innovation, industrial structure optimization, energy structure optimization and policy guidance. PMID:24204922
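
    The Kaya equation referenced above decomposes emissions into multiplicative drivers, CO2 = P × (GDP/P) × (E/GDP) × (CO2/E). A minimal sketch with illustrative round numbers (not the paper's data):

```python
def kaya_co2(population, gdp_per_capita, energy_per_gdp, co2_per_energy):
    """Kaya identity: CO2 = P * (GDP/P) * (E/GDP) * (CO2/E)."""
    return population * gdp_per_capita * energy_per_gdp * co2_per_energy

# Illustrative inputs: 1.4e9 people, 10,000 $/capita, 8 MJ/$, 0.07 kg CO2/MJ.
base = kaya_co2(1.4e9, 1.0e4, 8.0, 0.07)  # kg CO2 per year (~7.84e12)
# A "low carbon" variant: 20% lower energy intensity, 10% cleaner energy.
low_carbon = kaya_co2(1.4e9, 1.0e4, 8.0 * 0.8, 0.07 * 0.9)
print(base, low_carbon, 1 - low_carbon / base)  # drivers combine multiplicatively
```

    Because the factors multiply, a 20% intensity cut and a 10% carbon-factor cut compound to a 28% emission reduction at unchanged population and GDP, which is why scenarios differ so sharply even with the same growth assumptions.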

  18. Methodology To Define Drought Management Scenarios Based On Accumulated Future Projections Of Risk

    NASA Astrophysics Data System (ADS)

    Haro-Monteagudo, David; Solera-Solera, Abel; Andreu-Álvarez, Joaquín

    2014-05-01

    Drought is a serious threat to many water resources systems in the world, especially to those in which the equilibrium between resource availability and water uses is very fragile, so that a deviation below normality compromises the capacity of the system to cope with all the demands and environmental requirements. Since droughts are not isolated events but develop through time in what could be considered creeping behavior, it is very difficult to determine when an episode starts and how long it will last. Because this is a major concern for water managers and society in general, scientific research has striven to develop indices for evaluating the risk of a drought event occurring. These indices are often based on previous and current state variables of the system which, combined, give decision makers an indication of the risk of being in a drought situation, normally through the definition of a drought scenario. While this way of proceeding has been found effective in many systems, there are cases in which indicator systems fail to identify the appropriate ongoing drought scenario early enough to start measures that would minimize the possible impacts. This is the case, for example, of systems with high seasonal precipitation variability. The use of risk assessment models to evaluate possible future states of the system becomes handy in cases like this, although it is not limited to such systems. We present a method to refine the drought scenario definition within a water resources system. To implement this methodology, we use a risk assessment model generalized to water resources systems, based on the stochastic generation of multiple possible future streamflow series and simulation of the system with a Monte Carlo approach. We carry out this assessment every month of the year up to the end of the hydrologic year, which normally corresponds with the end of the irrigation…
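
    The Monte Carlo idea described above can be sketched as: generate many synthetic inflow traces, simulate the system's water balance under each, and report the fraction of traces that fail to meet demand. Everything here (lognormal inflows, the storage and demand numbers) is an illustrative assumption, not the authors' model:

```python
import random

def deficit_risk(storage, demand, n_months, n_traces=2000,
                 mu=1.0, sigma=0.6, seed=42):
    """Fraction of synthetic traces in which the reservoir runs dry within
    n_months, under lognormal synthetic monthly inflows (illustrative only)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_traces):
        s = storage
        for _ in range(n_months):
            s += rng.lognormvariate(mu, sigma) - demand  # inflow minus demand
            if s < 0:
                failures += 1
                break
    return failures / n_traces

# Risk grows as demand approaches (and exceeds) the mean monthly inflow.
print(deficit_risk(storage=5.0, demand=3.5, n_months=6))
```

    Re-running this each month with the current storage as the starting state gives exactly the kind of accumulated future risk projection the method uses to declare a drought scenario earlier than a backward-looking index would.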

  19. Assessing nitrate leaching losses with simulation scenarios and model based fertiliser recommendations

    NASA Astrophysics Data System (ADS)

    Michalczyk, A.; Kersebaum, K. C.; Hartmann, T.; Yue, S. C.; Chen, X. P.

    2012-04-01

    Excessive mineral nitrogen fertiliser application and irrigation in intensive agricultural cropping systems are seen as a major reason for low water and nitrogen use efficiencies in the North China Plain. High nitrogen fertiliser and irrigation water inputs not only lead to higher production costs but also to declining groundwater tables, nitrate accumulation in deeper soil layers below the root zone, and water pollution. To evaluate the effects of improved management practices on environmental pollution risk, the HERMES model is used to simulate nitrate leaching losses. The HERMES model is a dynamic, process-based crop model made for practical applications such as fertiliser recommendations. The model was tested and validated on two field studies in the south of Hebei Province that lasted for about three years with a winter wheat (Triticum aestivum L.) and summer maize (Zea mays L.) double cropping system. Biomass, grain yield, plant N uptake and soil water content were better simulated than mineral nitrogen in the soil. A model-based nitrogen fertiliser recommendation was applied in the field for one wheat crop, and the parallel model simulation showed satisfactory results. Although there was no change in the amount of irrigation, the results indicated a possibility to reduce the fertiliser rate, and thus nitrogen leaching, even more than in the reduced treatment without reducing crop yields. Furthermore, a simulation scenario with a model-based fertiliser recommendation and field-capacity-based irrigation was compared to farmers' practice and the reduced nitrogen treatment. The scenario results showed that the model recommendation together with the reduced irrigation has the highest potential to reduce nitrate leaching. The results also showed that flood irrigation as practiced by the farmers, with its difficult-to-estimate water amounts, introduces a large uncertainty into the modelling.

  20. ESPC Overview: Cash Flows, Scenarios, and Associated Diagrams for Energy Savings Performance Contracts

    SciTech Connect

    Tetreault, T.; Regenthal, S.

    2011-05-01

    This document is meant to inform state and local decision makers about the process of energy savings performance contracts, and how projected savings and allocated energy-related budgets can be impacted by changes in utility prices.

  2. Lunar Outpost Life Support Architecture Study Based on a High-Mobility Exploration Scenario

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2010-01-01

    This paper presents results of a life support architecture study based on a 2009 NASA lunar surface exploration scenario known as Scenario 12. The study focuses on the assembly complete outpost configuration and includes pressurized rovers as part of a distributed outpost architecture in both stand-alone and integrated configurations. A range of life support architectures are examined reflecting different levels of closure and distributed functionality. Monte Carlo simulations are used to assess the sensitivity of results to volatile high-impact mission variables, including the quantity of residual Lander oxygen and hydrogen propellants available for scavenging, the fraction of crew time away from the outpost on excursions, total extravehicular activity hours, and habitat leakage. Surpluses or deficits of water and oxygen are reported for each architecture, along with fixed and 10-year total equivalent system mass estimates relative to a reference case. System robustness is discussed in terms of the probability of no water or oxygen resupply as determined from the Monte Carlo simulations.

  3. WSN system design by using an innovative neural network model to perform thermals forecasting in a urban canyon scenario

    NASA Astrophysics Data System (ADS)

    Giuseppina, Nicolosi; Salvatore, Tirrito

    2015-12-01

    Wireless Sensor Networks (WSNs) have been studied by researchers as a means to manage indoor Heating, Ventilating and Air-Conditioning (HVAC) systems. A WSN can be especially useful for regulating indoor comfort in an urban canyon scenario, where the thermal parameters vary rapidly, influenced by outdoor climate changes. This paper shows an innovative neural network approach that uses the collected WSN data to forecast the indoor temperature as outdoor conditions vary, based on climate parameters and boundary conditions typical of an urban canyon. Particular attention is given to the influence of traffic jams and the number of vehicles in queue.

  4. Supporting Algebraic Reasoning through Personalized Story Scenarios: How Situational Understanding Mediates Performance

    ERIC Educational Resources Information Center

    Walkington, Candace; Petrosino, Anthony; Sherman, Milan

    2013-01-01

    Context personalization refers to matching instruction to students' out-of-school interests and experiences. Belief in the benefits of matching instruction to interests is widely held in the culture of schooling; however, little research has empirically examined how interest impacts performance and learning in secondary mathematics. Here we…

  5. Application of risk-based multiple criteria decision analysis for selection of the best agricultural scenario for effective watershed management.

    PubMed

    Javidi Sabbaghian, Reza; Zarghami, Mahdi; Nejadhashemi, A Pouyan; Sharifi, Mohammad Bagher; Herman, Matthew R; Daneshvar, Fariborz

    2016-03-01

    Effective watershed management requires the evaluation of agricultural best management practice (BMP) scenarios which carefully consider the relevant environmental, economic, and social criteria involved. In the Multiple Criteria Decision-Making (MCDM) process, scenarios are first evaluated and then ranked to determine the most desirable outcome for the particular watershed. The main challenge of this process is the accurate identification of the best solution for the watershed in question, despite the various risk attitudes presented by the associated decision-makers (DMs). This paper introduces a novel approach for implementation of the MCDM process based on a comparative neutral risk/risk-based decision analysis, which results in the selection of the most desirable scenario for use in the entire watershed. At the sub-basin level, each scenario includes multiple BMPs with scores that have been calculated using the criteria derived from two cases of neutral risk and risk-based decision-making. The simple additive weighting (SAW) operator is applied for use in neutral risk decision-making, while the ordered weighted averaging (OWA) and induced OWA (IOWA) operators are effective for risk-based decision-making. At the watershed level, the BMP scores of the sub-basins are aggregated to calculate each scenario's combined goodness measurement; the most desirable scenario for the entire watershed is then selected based on these combined goodness measurements. Our final results illustrate the type of operator and the risk attitudes needed to satisfy the relevant criteria across the sub-basins, and how these choices ultimately affect the final ranking of the given scenarios. The methodology proposed here has been successfully applied to the Honeyoey Creek-Pine Creek watershed in Michigan, USA to evaluate various BMP scenarios and determine the best solution for both the stakeholders and the overall stream health.
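
The two aggregation styles contrasted above can be sketched in a few lines: SAW attaches weights to criteria, while OWA attaches weights to rank positions of the sorted scores, which is how it encodes a risk attitude. The scores and weight vectors here are illustrative, not taken from the study.

```python
def saw(scores, weights):
    """Simple additive weighting: criteria-wise weighted sum (neutral risk)."""
    return sum(s * w for s, w in zip(scores, weights))

def owa(scores, weights):
    """Ordered weighted averaging: weights apply to sorted scores (largest
    first), so the weight placement encodes the decision-maker's risk attitude."""
    return sum(s * w for s, w in zip(sorted(scores, reverse=True), weights))

# Illustrative BMP scores on three criteria (environmental, economic, social)
scores = [0.9, 0.4, 0.6]
crit_w = [0.5, 0.3, 0.2]        # criteria importance, for SAW
optimistic = [0.7, 0.2, 0.1]    # OWA weights emphasizing the best outcomes
pessimistic = [0.1, 0.2, 0.7]   # OWA weights emphasizing the worst outcomes
```

With these vectors the pessimistic OWA score is pulled toward the weakest criterion, mirroring a risk-averse DM, while the optimistic OWA score is pulled toward the strongest one.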

  6. A web-based 3D visualisation and assessment system for urban precinct scenario modelling

    NASA Astrophysics Data System (ADS)

    Trubka, Roman; Glackin, Stephen; Lade, Oliver; Pettit, Chris

    2016-07-01

    Recent years have seen an increasing number of spatial tools and technologies for enabling better decision-making in the urban environment. They have largely arisen because of the need for cities to be more efficiently planned to accommodate growing populations while mitigating urban sprawl, and also because of innovations in rendering data in 3D being well suited for visualising the urban built environment. In this paper we review a number of systems that are better known and more commonly used in the field of urban planning. We then introduce Envision Scenario Planner (ESP), a web-based 3D precinct geodesign, visualisation and assessment tool, developed using Agile and Co-design methods. We provide a comprehensive account of the tool, beginning with a discussion of its design and development process and concluding with an example use case and a discussion of the lessons learned in its development.

  7. Development of responses based on IPCC and "what-if?" IWRM scenarios

    NASA Astrophysics Data System (ADS)

    Giannini, V.; Ceccato, L.; Hutton, C.; Allan, A. A.; Kienberger, S.; Flügel, W.-A.; Giupponi, C.

    2011-04-01

    This work illustrates the findings of a participatory research process aimed at identifying responses for sustainable water management in a climate change perspective, in two river basins in Europe and Asia. The chapter describes the methodology implemented through local participatory workshops, aimed at eliciting and evaluating possible responses to flood risk, which were then assessed with respect to the existing governance framework. Socio-economic vulnerability was also investigated by developing an indicator, whose future trend was analysed with reference to IPCC scenarios. The main outcome of these activities consists in the identification of Integrated Water Resource Management Strategies (IWRMS) based upon the issues and preferences elicited from local experts. The mDSS decision support tool was used to facilitate transparent and robust management of the information collected and communication of the outputs.

  8. Current scenario of peptide-based drugs: the key roles of cationic antitumor and antiviral peptides

    PubMed Central

    Mulder, Kelly C. L.; Lima, Loiane A.; Miranda, Vivian J.; Dias, Simoni C.; Franco, Octávio L.

    2013-01-01

    Cationic antimicrobial peptides (AMPs) and host defense peptides (HDPs) show vast potential as peptide-based drugs. Great effort has been made in order to exploit their mechanisms of action, aiming to identify their targets as well as to enhance their activity and bioavailability. In this review, we will focus on both naturally occurring and designed antiviral and antitumor cationic peptides, including those here called promiscuous, in which multiple targets are associated with a single peptide structure. Emphasis will be given to their biochemical features, selectivity against extra targets, and molecular mechanisms. Peptides which possess antitumor activity against different cancer cell lines will be discussed, as well as peptides which inhibit virus replication, focusing on their applications for human health, animal health and agriculture, and their potential as new therapeutic drugs. Moreover, the current scenario for production and the use of nanotechnology as a delivery tool for both classes of cationic peptides, as well as perspectives on improving them, are considered. PMID:24198814

  9. Context-based handover of persons in crowd and riot scenarios

    NASA Astrophysics Data System (ADS)

    Metzler, Jürgen

    2015-02-01

    In order to control riots in crowds, it is helpful to get ringleaders under control and pull them out of the crowd once one has become an offender. A great help in achieving these tasks is the capability of observing the crowd and the ringleaders automatically using cameras, which also allows a better conservation of evidence in riot control. A ringleader who has become an offender should be tracked across and recognized by several cameras, regardless of whether the cameras' fields of view overlap. We propose a context-based approach for the handover of persons between different camera fields of view. This approach can be applied to overlapping as well as non-overlapping fields of view, so that a fast and accurate identification of individual persons in camera networks is feasible. Within the scope of this paper, the approach is applied to a handover of persons between single images without using any temporal information. It is particularly developed for semiautomatic video editing and a handover of persons between cameras in order to improve conservation of evidence. The approach has been developed on a dataset collected during a Crowd and Riot Control (CRC) training of the German armed forces. It consists of three different levels of escalation: first, the crowd started with a peaceful demonstration; second, there were violent protests; and third, the riot escalated and offenders bumped into the chain of guards. One result of the work is a reliable context-based method for person re-identification between single images of different camera fields of view in crowd and riot scenarios. Furthermore, a qualitative assessment shows that the use of contextual information can additionally support this task: it can decrease the time needed for handover and the number of confusions, which supports the conservation of evidence in crowd and riot scenarios.

  10. Thermal Performance Expectations of the Advanced Stirling Convertor Over a Range of Operating Scenarios

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Dyson, Rodger W.

    2010-01-01

    The Advanced Stirling Radioisotope Generator (ASRG) will enable various missions such as small body sample return, atmospheric missions around Venus, as well as long-duration deep space missions. An analysis of the temperature distributions is performed on an Advanced Stirling Convertor (ASC), and the results are compared with available experimental measurements. This analysis includes applied environmental conditions that are similar to those that will be experienced while the convertor is in operation. The applied conditions represent a potential mission profile including pre-takeoff sterilization, launch, transit, and return. The results focus on the anticipated peak temperatures of the magnets in the linear alternator. These results confirm that the ASC can support future missions to deep space targets, extreme environment landers, as well as more conventional goals.

  11. Performance Monitoring Based on UML Performance Profile

    NASA Astrophysics Data System (ADS)

    Kim, Dong Kwan; Kim, Chul Jin; Cho, Eun Sook

    In this paper we propose a way of measuring software performance metrics such as response time, throughput, and resource utilization. It is obvious that performance-related Quality of Service (QoS) is one of the important factors that must be satisfied to meet users' needs. The proposed approach uses the UML performance profile for the performance specification and the aspect-oriented paradigm for the performance measurement. Code instrumentation in AOP is a mechanism to insert source code for performance measurement into business logic code. We used AspectJ, an aspect-oriented extension to Java. AspectJ code for performance measurement is separated from Java code for functional requirements. Both AspectJ and Java code can be woven together for the performance measurement. The key component of the proposed approach is an AspectJ code generator. It creates AspectJ code for the performance measurement from the UML [1] models containing the performance profile.
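
The weaving idea, measurement code kept apart from business logic and then combined, can be mimicked outside AspectJ. The sketch below uses a Python decorator as a stand-in for around advice; the function names and metrics store are hypothetical.

```python
import functools
import time

def measure(metrics):
    """Plays the role of AspectJ around advice: weaves response-time
    measurement around a function without modifying its body."""
    def advice(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)  # business logic, untouched
            finally:
                elapsed = time.perf_counter() - start
                metrics.setdefault(func.__name__, []).append(elapsed)
        return wrapper
    return advice

metrics = {}

@measure(metrics)  # the "weaving" step: measurement joins the business logic
def handle_request(n):
    # stand-in for business logic
    return sum(range(n))
```

As in the AspectJ approach, the measurement concern lives entirely in `measure`, so removing the decorator line restores the pure functional code.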

  12. Scenarios, personas and user stories from design ethnography: Evidence-based design representations of communicable disease investigations

    PubMed Central

    Turner, Anne M; Reeder, Blaine; Ramey, Judith

    2014-01-01

    Purpose Despite years of effort and millions of dollars spent to create a unified electronic communicable disease reporting system, the goal remains elusive. A major barrier has been a lack of understanding by system designers of communicable disease (CD) work and the public health workers who perform this work. This study reports on the application of user-centered design representations, traditionally used for improving interface design, to translate the complex CD work identified through ethnographic studies to guide designers and developers of CD systems. The purpose of this work is to: (1) better understand public health practitioners and their information workflow with respect to communicable disease (CD) monitoring and control at a local health department, and (2) develop evidence-based design representations that model this CD work to inform the design of future disease surveillance systems. Methods We performed extensive onsite semi-structured interviews, targeted work shadowing and a focus group to characterize local health department communicable disease workflow. Informed by principles of design ethnography and user-centered design (UCD), we created personas, scenarios and user stories to accurately represent the user to system designers. Results We sought to convey to designers the key findings from the ethnographic studies: 1) public health CD work is mobile and episodic, in contrast to current CD reporting systems, which are stationary and fixed; 2) health department efforts are focused on CD investigation and response rather than reporting; and 3) current CD information systems must conform to PH workflow to ensure their usefulness. In an effort to illustrate our findings to designers, we developed three contemporary design-support representations: personas, scenarios, and user stories. Conclusions Through application of user-centered design principles, we were able to create design representations that illustrate complex public health communicable

  13. Lunar base scenario cost estimates: Lunar base systems study task 6.1

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The projected development and production costs of each of the Lunar Base's systems are described and unit costs are estimated for transporting the systems to the lunar surface and for setting up the system.

  14. Application of State Analysis and Goal-based Operations to a MER Mission Scenario

    NASA Technical Reports Server (NTRS)

    Morris, John Richard; Ingham, Michel D.; Mishkin, Andrew H.; Rasmussen, Robert D.; Starbird, Thomas W.

    2006-01-01

    State Analysis is a model-based systems engineering methodology employing a rigorous discovery process which articulates operations concepts and operability needs as an integrated part of system design. The process produces requirements on system and software design in the form of explicit models which describe the system behavior in terms of state variables and the relationships among them. By applying State Analysis to an actual MER flight mission scenario, this study addresses the specific real world challenges of complex space operations and explores technologies that can be brought to bear on future missions. The paper first describes the tools currently used on a daily basis for MER operations planning and provides an in-depth description of the planning process, in the context of a Martian day's worth of rover engineering activities, resource modeling, flight rules, science observations, and more. It then describes how State Analysis allows for the specification of a corresponding goal-based sequence that accomplishes the same objectives, with several important additional benefits.

  15. Selection of an appropriate wastewater treatment technology: a scenario-based multiple-attribute decision-making approach.

    PubMed

    Kalbar, Pradip P; Karmakar, Subhankar; Asolekar, Shyam R

    2012-12-30

    Many technological alternatives for wastewater treatment are available, ranging from advanced technologies to conventional treatment options. It is difficult to select the most appropriate technology from among a set of available alternatives to treat wastewater at a particular location. Many factors, such as capital costs, operation and maintenance costs and land requirement, are involved in the decision-making process. Sustainability criteria must also be incorporated into the decision-making process such that appropriate technologies are selected for developing economies such as that of India. A scenario-based multiple-attribute decision-making (MADM) methodology has been developed and applied to the selection of wastewater treatment alternative. The four most commonly used wastewater treatment technologies for treatment of municipal wastewater in India are ranked for various scenarios. Six scenarios are developed that capture the regional and local societal priorities of urban, suburban and rural areas and translate them into the mathematical algorithm of the MADM methodology. The articulated scenarios depict the most commonly encountered decision-making situations in addressing technology selection for wastewater treatment in India. A widely used compensatory MADM technique, TOPSIS, has been selected to rank the alternatives. Seven criteria with twelve indicators are formulated to evaluate the alternatives. Different weight matrices are used for each scenario, depending on the priorities of the scenario. This study shows that it is difficult to select the most appropriate wastewater treatment alternative under the "no scenario" condition (equal weights given to each attribute), and the decision-making methodology presented in this paper effectively identifies the most appropriate wastewater treatment alternative for each of the scenarios. PMID:23023038
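
TOPSIS, the compensatory technique named above, ranks alternatives by their relative closeness to an ideal solution. A minimal sketch follows, with a hypothetical two-criterion decision matrix (treatment efficiency as a benefit criterion, cost as a non-benefit one) rather than the paper's seven criteria and twelve indicators.

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    matrix: rows = alternatives, columns = raw criterion scores
    weights: criterion weights summing to 1
    benefit: True where higher is better, False where lower is better."""
    ncols = len(weights)
    # Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[row[j] / norms[j] * weights[j] for j in range(ncols)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    def dist(row, ref):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ref)))
    # Relative closeness to the ideal solution, in [0, 1].
    return [dist(r, worst) / (dist(r, worst) + dist(r, ideal)) for r in v]

# Hypothetical alternatives: rows = technologies, cols = (efficiency, cost)
closeness = topsis([[90, 10], [50, 50], [30, 90]], [0.5, 0.5], [True, False])
```

Changing the weight vector per scenario, as the paper does, simply reweights the normalized matrix before the distance computation, so the same routine serves all six scenarios.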

  16. Projecting the environmental profile of Singapore's landfill activities: Comparisons of present and future scenarios based on LCA.

    PubMed

    Khoo, Hsien H; Tan, Lester L Z; Tan, Reginald B H

    2012-05-01

    This article aims to generate the environmental profile of Singapore's Semakau landfill by comparing three different operational options associated with the life cycle stages of landfilling activities, against a 'business as usual' scenario. Before life cycle assessment or LCA is used to quantify the potential impacts from landfilling activities, an attempt to incorporate localized and empirical information into the amounts of ash and MSW sent to the landfill was made. A linear regression representation of the relationship between the mass of waste disposed and the mass of incineration ash generated was modeled from waste statistics between years 2004 and 2009. Next, the mass of individual MSW components was projected from 2010 to 2030. The LCA results highlighted that in a 'business as usual' scenario the normalized total impacts of global warming, acidification and human toxicity increased by about 2% annually from 2011 to 2030. By replacing the 8000-tonne barge with a 10000-tonne coastal bulk carrier or freighter (in scenario 2) a grand total reduction of 48% in both global warming potential and acidification can be realized by year 2030. Scenario 3 explored the importance of having a Waste Water Treatment Plant in place to reduce human toxicity levels - however, the overall long-term benefits were not as significant as scenario 2. It is shown in scenario 4 that the option of increased recycling outperformed the other three scenarios in the long run, resulting in a total 58% reduction in year 2030 for the total normalized results. A separate comparison of scenarios 1-4 is also carried out for energy utilization and land use in terms of volume of waste occupied. Along with the predicted reductions in environmental burdens, an additional bonus is found in the expanded lifespan of Semakau landfill from year 2032 (base case) to year 2039. Model limitations and suggestions for improvements were also discussed.

  19. Scenario-based prediction of Li-ion batteries fire-induced toxicity

    NASA Astrophysics Data System (ADS)

    Lecocq, Amandine; Eshetu, Gebrekidan Gebresilassie; Grugeon, Sylvie; Martin, Nelly; Laruelle, Stephane; Marlair, Guy

    2016-06-01

    The development of high energy Li-ion batteries with improved durability and increased safety mostly relies on the use of newly developed electrolytes. A detailed appraisal of fire-induced thermal and chemical threats on LiPF6- and LiFSI-based electrolytes by means of the so-called "fire propagation apparatus" had highlighted that the salt anion was responsible for the emission of a non-negligible amount of irritant gases such as HF (PF6-) or HF and SO2 (FSI-). A more thorough comparative investigation of the toxicity threat in the case of larger-size 0.4 kWh Li-ion modules was thus undertaken. A modeling approach that consists of extrapolating the experimental data obtained from 1.3 Ah LiFePO4/graphite pouch cells under fire conditions and using the state-of-the-art fire safety international standards for the evaluation of fire toxicity was applied under two different real-scale simulating scenarios. The obtained results reveal that critical thresholds are highly dependent on the nature of the salt, LiPF6 or LiFSI, and on the cells' state of charge. Hence, this approach can help define appropriate fire safety engineering measures for a given technology (different chemistry) or application (fully charged backup batteries or batteries subjected to deep discharge).
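
The cell-to-module extrapolation step can be sketched as a linear scaling of per-cell gas yields to total module capacity, averaged over an enclosure volume. All numbers here (yield per Ah, cell count, volume) are placeholder assumptions, not measured values from the study.

```python
def hf_concentration_mg_m3(hf_yield_mg_per_ah, cell_capacity_ah, n_cells,
                           room_volume_m3):
    """Extrapolate HF released by a module fire from per-cell test data,
    assuming emissions scale linearly with total capacity (a simplification),
    and express the result as an average concentration in a closed volume."""
    total_hf_mg = hf_yield_mg_per_ah * cell_capacity_ah * n_cells
    return total_hf_mg / room_volume_m3

# Hypothetical figures: 100 mg HF per Ah burned, 100 cells of 1.3 Ah,
# released into a 50 m^3 enclosed space.
concentration = hf_concentration_mg_m3(100.0, 1.3, 100, 50.0)
```

Comparing such a concentration against the irritant-gas thresholds in fire safety standards is the kind of real-scale scenario assessment the paper performs.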

  20. Life began when evolution began: a lipidic vesicle-based scenario.

    PubMed

    Tessera, Marc

    2009-12-01

    The research on the origin of life, as such, seems to have reached an impasse as a clear and universal scientific definition of life is probably impossible. On the contrary, the research on the origin of evolution may provide a clue. But it is necessary to identify the minimum requirements that allowed evolution to emerge on early Earth. The classical approach, the 'RNA world hypothesis' is one way, but an alternative based on nonlinear dynamics dealing with far-from-equilibrium self-organization and dissipative structures can also be proposed. The conditions on early Earth, near deep-sea hydrothermal sites, were favorable to the emergence of dissipative structures such as vesicles with bilayer membranes composed of a mixture of amphiphilic and hydrophobic molecules. Experimentally these vesicles are able to self-reproduce but not to evolve. A plausible scenario for the emergence of a positive feedback process giving them the capability of evolving on early Earth is suggested. The possibilities offered by such a process are described in regard to specific characteristics of extant biological organisms and leads for future research in the field are suggested.

  1. [Online bedside teaching: multimedia, interactive and case-based teaching scenarios in dermatology].

    PubMed

    Avila, Javier; Kaiser, Gerd; Nguyen-Dobinsky, Trong-Nghia; Zielke, Hendrik; Sterry, Wolfram; Rzany, Berthold

    2004-12-01

    The MeduMobile project, or Mobiler Campus Charité, was started at the beginning of 2003 to evaluate new, multimedia-supported teaching and learning scenarios. The project focuses especially on acute and rare diseases to which students usually have very limited access. During the teaching sessions the MeduOnCall team and the academic teacher communicate through a wireless network with the students, who are equipped with individual notebooks. In dermatology the MeduMobile project focused in summer 2003 and winter 2003/2004 on bedside teaching. Despite some technical difficulties in the beginning, the project was well received by teachers and students. The possibility of simultaneous online searches on the Internet, as well as the possibility to include photographs depicting the course of the skin disease, was felt to be one of the main advantages of this project. Based on these positive experiences, an attempt will be made to integrate the project further into the regular teaching of the Charité.

  2. Life Began When Evolution Began: A Lipidic Vesicle-Based Scenario

    NASA Astrophysics Data System (ADS)

    Tessera, Marc

    2009-12-01

    The research on the origin of life, as such, seems to have reached an impasse as a clear and universal scientific definition of life is probably impossible. On the contrary, the research on the origin of evolution may provide a clue. But it is necessary to identify the minimum requirements that allowed evolution to emerge on early Earth. The classical approach, the ‘RNA world hypothesis’ is one way, but an alternative based on nonlinear dynamics dealing with far-from-equilibrium self-organization and dissipative structures can also be proposed. The conditions on early Earth, near deep-sea hydrothermal sites, were favorable to the emergence of dissipative structures such as vesicles with bilayer membranes composed of a mixture of amphiphilic and hydrophobic molecules. Experimentally these vesicles are able to self-reproduce but not to evolve. A plausible scenario for the emergence of a positive feedback process giving them the capability of evolving on early Earth is suggested. The possibilities offered by such a process are described in regard to specific characteristics of extant biological organisms and leads for future research in the field are suggested.

  3. Scenario-based risk analysis of winter snowstorms in the German lowlands

    NASA Astrophysics Data System (ADS)

    von Wulffen, Anja

    2014-05-01

    conditions. Based on these findings, an exemplary synoptic evolution of a snowstorm leading to representative infrastructure failure cascades is constructed. In a next step, an extrapolation of this obtained scenario to future climate and societal conditions as well as plausible more extreme but not yet observed meteorological conditions is planned in order to obtain a thorough analysis of possible threats to the German food distribution system and a strong foundation for future disaster mitigation planning efforts.

  4. Moving beyond the EHR: a scenario-based approach to IDS design.

    PubMed

    Ellis, Todd; Moran, Roey

    2015-10-01

    When seeking to establish an integrated delivery system, provider organizations can use the following broad steps to map out IT solutions: Identify high-level focus areas for integrated care. Analyze care gaps in those areas using hypothetical scenarios. Project how patients would interact with newly implemented integrated-care elements. Translate this projection (i.e., a target-state scenario) into technical design.

  5. The FORE-SCE model: a practical approach for projecting land cover change using scenario-based modeling

    USGS Publications Warehouse

    Sohl, Terry L.; Sayler, Kristi L.; Drummond, Mark A.; Loveland, Thomas R.

    2007-01-01

    A wide variety of ecological applications require spatially explicit, historic, current, and projected land use and land cover data. The U.S. Land Cover Trends project is analyzing contemporary (1973–2000) land-cover change in the conterminous United States. The newly developed FORE-SCE model used Land Cover Trends data and theoretical, statistical, and deterministic modeling techniques to project future land cover change through 2020 for multiple plausible scenarios. Projected proportions of future land use were initially developed, and then sited on the lands with the highest potential for supporting that land use and land cover using a statistically based stochastic allocation procedure. Three scenarios of 2020 land cover were mapped for the western Great Plains in the US. The model provided realistic, high-resolution, scenario-based land-cover products suitable for multiple applications, including studies of climate and weather variability, carbon dynamics, and regional hydrology.
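
The suitability-weighted stochastic allocation step can be sketched as below. This is a toy illustration of the general idea, not the FORE-SCE implementation; the cell identifiers and suitability scores are invented.

```python
import random

def allocate(demand_cells, suitability, seed=7):
    """Stochastically site a projected land-use demand on the cells with the
    highest potential: each remaining cell is drawn with probability
    proportional to its suitability score."""
    rng = random.Random(seed)
    cells = dict(suitability)          # cell id -> suitability weight
    chosen = []
    for _ in range(demand_cells):
        ids = list(cells)
        weights = [cells[i] for i in ids]
        pick = rng.choices(ids, weights=weights, k=1)[0]
        chosen.append(pick)
        del cells[pick]                # a cell can be converted only once
    return chosen

# Hypothetical suitability surface: site 2 cells of projected urban growth.
converted = allocate(2, {'a': 10.0, 'b': 1.0, 'c': 0.1, 'd': 5.0})
```

Because the draw is stochastic rather than a strict top-k selection, repeated runs produce plausible alternative spatial patterns for the same scenario demand, which is the behavior the model's scenario products rely on.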

  6. A Cloud Robotics Based Service for Managing RPAS in Emergency, Rescue and Hazardous Scenarios

    NASA Astrophysics Data System (ADS)

    Silvagni, Mario; Chiaberge, Marcello; Sanguedolce, Claudio; Dara, Gianluca

    2016-04-01

    Cloud robotics and cloud services are revolutionizing not only the ICT world but also the robotics industry, giving robots more computing capabilities, storage and connection bandwidth while opening new scenarios that blend the physical and digital worlds. In this vision, new IT architectures are required to manage robots, retrieve data from them and create services to interact with users. Among all robots, this work is mainly focused on flying robots, better known as drones, UAVs (Unmanned Aerial Vehicles) or RPAS (Remotely Piloted Aircraft Systems). The cloud robotics approach shifts the concept of having a single local "intelligence" for every single UAV, as a unique device that carries out all computation and storage processes onboard, to a more powerful "centralized brain" located in the cloud. This breakthrough opens new scenarios where UAVs are agents, relying on remote servers for most of their computational load and data storage, creating a network of devices where they can share knowledge and information. Many applications using UAVs are growing, as these are interesting and suitable devices for environmental monitoring. Many services can be built by fetching data from UAVs, such as telemetry, video streaming, pictures or sensor data. These services, part of the IT architecture, can be accessed via the web by other devices or shared with other UAVs. As test cases of the proposed architecture, two examples are reported. In the first one, a search and rescue or emergency management scenario, where UAVs are required for monitoring intervention, is shown. In case of emergency or aggression, the user requests the emergency service from the IT architecture, providing GPS coordinates and an identification number. The IT architecture uses a UAV (choosing among the available ones according to distance, service status, etc.) to reach him/her for monitoring and support operations. In the meantime, an officer will use the service to see the current position of the UAV, its

  7. On Improving the Reliability of Distribution Networks Based on Investment Scenarios Using Reference Networks

    NASA Astrophysics Data System (ADS)

    Kawahara, Koji

    Distribution systems are natural monopolies and have therefore generally been regulated in order to protect customers and to ensure cost-effective operation. In the UK this is one of the functions of OFGEM (Office of Gas and Electricity Markets). Initially, regulation was based on the value of assets, but there is a trend nowadays towards performance-based regulation. To achieve this, a methodology is needed that enables the reliability performance associated with alternative investment strategies to be compared with the investment cost of those strategies. At present there is no accepted approach for such assessments. Building on the concept of reference networks proposed in Refs. (1) and (2), this paper describes how these networks can be used to assess the impact that performance-driven investment strategies will have on the improvement of reliability indices. The method has been tested on the underground and overhead parts of a real system.

  8. Sustainable Systems Analysis of Production and Transportation Scenarios for Conventional and Bio-based Energy Commodities

    NASA Astrophysics Data System (ADS)

    Doran, E. M.; Golden, J. S.; Nowacek, D. P.

    2013-12-01

    International commerce places unique pressures on the sustainability of water resources and marine environments. System impacts include noise, emissions, and chemical and biological pollutants, such as the introduction of invasive species into key ecosystems. At the same time, maritime trade also enables the sustainability ambition of intragenerational equity in the economy through the global circulation of commodities and manufactured goods, including agricultural, energy and mining resources (UN Trade and Development Board 2013). This paper presents a framework to guide the analysis of the multiple dimensions of the sustainable commerce-ocean nexus. As a demonstration case, we explore the social, economic and environmental aspects of the nexus framework using scenarios for the production and transportation of conventional and bio-based energy commodities. Using coupled LCA and GIS methodologies, we are able to orient the findings spatially for additional insight. Previous work on the sustainable use of marine resources has focused on distinct aspects of the maritime environment. The framework presented here integrates the anthropogenic use, governance and impacts on the marine and coastal environments with the natural components of the system. A similar framework has been highly effective in advancing the study of land-change science (Turner et al. 2007); however, modification is required for the unique context of the marine environment. This framework will enable better research integration and planning for sustainability objectives, including mitigation of and adaptation to climate change, sea level rise, reduced dependence on fossil fuels, protection of critical marine habitat and species, and better management of the ocean as an emerging resource base for the production and transport of commodities and energy across the globe. The framework can also be adapted for vulnerability analysis, resilience studies and to evaluate the trends in production, consumption and

  9. Ethoprophos fate on soil-water interface and effects on non-target terrestrial and aquatic biota under Mediterranean crop-based scenarios.

    PubMed

    Leitão, Sara; Moreira-Santos, Matilde; Van den Brink, Paul J; Ribeiro, Rui; José Cerejeira, M; Sousa, José Paulo

    2014-05-01

    The present study aimed to assess the environmental fate of the insecticide and nematicide ethoprophos at the soil-water interface following pesticide application in simulated maize and potato crops under Mediterranean agricultural conditions, particularly of irrigation. Focus was given to the soil-water transfer pathways (leaching and runoff), to pesticide transport in soil between pesticide application areas (crop rows) and non-application areas (between crop rows), as well as to toxic effects of the various matrices on terrestrial and aquatic biota. A semi-field methodology mimicking a "worst-case" ethoprophos application (twice the recommended dosage for maize and potato crops: 100% concentration v/v) in agricultural field situations was used, in order to mimic a possible misuse by the farmer under realistic conditions. Rainfall was simulated on a slope of 20° for both crop-based scenarios. Soil and water samples were collected for the analysis of pesticide residues. The ecotoxicity of soil and aquatic samples was assessed by performing lethal and sublethal bioassays with organisms from different trophic levels: the collembolan Folsomia candida, the earthworm Eisenia andrei and the cladoceran Daphnia magna. Although the majority of ethoprophos sorbed to the soil in the application area, pesticide concentrations were detected in all water matrices, illustrating transfer pathways of water contamination between environmental compartments. Leaching to groundwater proved to be an important transfer pathway of ethoprophos under both crop-based scenarios, resulting in high pesticide concentrations in leachates from the Maize (130 µg L⁻¹) and Potato (630 µg L⁻¹) crop scenarios. Ethoprophos application in the Potato crop scenario caused more toxic effects on terrestrial and aquatic biota than in the Maize scenario at the recommended dosage and lower concentrations. In both crop-based scenarios, ethoprophos moved with the irrigation water flow to the

  11. Dark scenarios

    NASA Astrophysics Data System (ADS)

    Ahonen, Pasi; Alahuhta, Petteri; Daskala, Barbara; Delaitre, Sabine; Hert, Paul De; Lindner, Ralf; Maghiros, Ioannis; Moscibroda, Anna; Schreurs, Wim; Verlinden, Michiel

    In this chapter, we present four "dark scenarios" that highlight the key socio-economic, legal, technological and ethical risks to privacy, identity, trust, security and inclusiveness posed by new AmI technologies. We call them dark scenarios, because they show things that could go wrong in an AmI world, because they present visions of the future that we do not want to become reality. The scenarios expose threats and vulnerabilities as a way to inform policy-makers and planners about issues they need to take into account in developing new policies or updating existing legislation. Before presenting the four scenarios and our analysis of each, we describe the process of how we created the scenarios as well as the elements in our methodology for analysing the scenarios.

  12. Appendix E: Other NEMS-MP results for the base case and scenarios.

    SciTech Connect

    Plotkin, S. E.; Singh, M. K.; Energy Systems

    2009-12-03

    The NEMS-MP model generates numerous results for each run of a scenario. (This model is the integrated National Energy Modeling System [NEMS] version used for the Multi-Path Transportation Futures Study [MP].) This appendix examines additional findings beyond the primary results reported in the Multi-Path Transportation Futures Study: Vehicle Characterization and Scenario Analyses (Reference 1). These additional results are provided in order to help further illuminate some of the primary results. Specifically discussed in this appendix are: (1) Energy use results for light vehicles (LVs), including details about the underlying total vehicle miles traveled (VMT), the average vehicle fuel economy, and the volumes of the different fuels used; (2) Resource fuels and their use in the production of ethanol, hydrogen (H2), and electricity; (3) Ethanol use in the scenarios (i.e., the ethanol consumption in E85 vs. other blends, the percent of travel by flex fuel vehicles on E85, etc.); (4) Relative availability of E85 and H2 stations; (5) Fuel prices; (6) Vehicle prices; and (7) Consumer savings. These results are discussed as follows: (1) The three scenarios (Mixed, (P)HEV & Ethanol, and H2 Success) when assuming vehicle prices developed through literature review; (2) The three scenarios with vehicle prices that incorporate the achievement of the U.S. Department of Energy (DOE) program vehicle cost goals; (3) The three scenarios with 'literature review' vehicle prices, plus vehicle subsidies; and (4) The three scenarios with 'program goals' vehicle prices, plus vehicle subsidies. The four versions or cases of each scenario are referred to as: Literature Review No Subsidies, Program Goals No Subsidies, Literature Review with Subsidies, and Program Goals with Subsidies. Two additional points must be made here. First, none of the results presented for LVs in this section include Class 2B trucks. Results for this class are included occasionally in Reference 1. They

  13. Relation of Student Characteristics to Learning of Basic Biochemistry Concepts from a Multimedia Goal-Based Scenario.

    ERIC Educational Resources Information Center

    Schoenfeld-Tacher, Regina; Persichitte, Kay A.; Jones, Loretta L.

    This study sought to answer the question, Do all students benefit equally from the use of a hypermedia Goal-Based Scenario (GBS)? GBS is a subcategory of anchored instruction. The correlation between the demographic variables and achievement and specific cognitive variables and achievement was explored using a lesson on DNA, and was tested on…

  14. Blending Face-to-Face Higher Education with Web-Based Lectures: Comparing Different Didactical Application Scenarios

    ERIC Educational Resources Information Center

    Montrieux, Hannelore; Vangestel, Sandra; Raes, Annelies; Matthys, Paul; Schellens, Tammy

    2015-01-01

    Blended learning as an instructional approach is getting more attention in the educational landscape and has been researched thoroughly. Yet, this study reports the results of an innovation project aiming to gain insight into three different scenarios of applying web-based lectures: as preparation for face-to-face practical exercises, as a…

  15. Improved seismic risk estimation for Bucharest, based on multiple hazard scenarios, analytical methods and new techniques

    NASA Astrophysics Data System (ADS)

    Toma-Danila, Dragos; Florinela Manea, Elena; Ortanza Cioflan, Carmen

    2014-05-01

    a very local-dependent hazard. Also, for major earthquakes, nonlinear effects need to be considered. This problem is treated accordingly, by using recent microzonation studies together with real data recorded at four events with Mw ≥ 6. Different ground motion prediction equations are also analyzed, and their improvement is investigated. For the building and population damage assessment, two open-source software packages are used and compared: SELENA and ELER. The damage probability for buildings is obtained through capacity-spectrum-based methods. The spectral content is characterized by spectral accelerations at 0.2, 0.3 and 1.0 s. As the level of analysis (six sectors for the whole city) does not provide the best resolution with respect to the defined Bucharest hazard scenarios, we propose a procedure for dividing the data into smaller units, taking into consideration the construction code (four periods) and material. This approach relies on free data available from real estate agencies' websites. The study provides insight into the seismic risk analysis for Bucharest and an improvement of the real-time emergency system. Most importantly, the system is also evaluated through real data and relevant scenarios. State-of-the-art GIS maps are also presented, both for seismic hazard and risk.

  16. The design of scenario-based training from the resilience engineering perspective: a study with grid electricians.

    PubMed

    Saurin, Tarcisio Abreu; Wachs, Priscila; Righi, Angela Weber; Henriqson, Eder

    2014-07-01

    Although scenario-based training (SBT) can be an effective means to help workers develop resilience skills, it has not yet been analyzed from the resilience engineering (RE) perspective. This study introduces a five-stage method for designing SBT from the RE view: (a) identification of resilience skills, work constraints and actions for re-designing the socio-technical system; (b) design of template scenarios, allowing the simulation of the work constraints and the use of resilience skills; (c) design of the simulation protocol, which includes briefing, simulation and debriefing; (d) implementation of both scenarios and simulation protocol; and (e) evaluation of the scenarios and simulation protocol. It is reported how the method was applied in an electricity distribution company, in order to train grid electricians. The study was framed as an application of design science research, and five research outputs are discussed: method, constructs, model of the relationships among constructs, instantiations of the method, and theory building. Concerning the last output, the operationalization of the RE perspective on three elements of SBT is presented: identification of training objectives; scenario design; and debriefing.

  17. GIS-based quantification of future nutrient loads into Lake Peipsi/Chudskoe using qualitative regional development scenarios.

    PubMed

    Mourad, D S J; Van der Perk, M; Gooch, G D; Loigu, E; Piirimäe, K; Stålnacke, P

    2005-01-01

    This study aims at the quantification of possible future nutrient loads into Lake Peipsi/Chudskoe under different economic development scenarios. This drainage basin is on the borders of Russia, Estonia and Latvia. The sudden disintegration of the Soviet Union in 1991 caused a collapse of agricultural economy, and consequently, a substantial decrease of diffuse and point-source nutrient emissions. For the future, uncertainties about economic development and the priorities that will be set for this region make it difficult to assess the consequences for river water quality and nutrient loads into the lake. We applied five integrated scenarios of future development of this transboundary region for the next twelve to fifteen years. Each scenario consists of a qualitative story line, which was translated into quantitative changes in the input variables for a geographical information system based nutrient transport model. This model calculates nutrient emissions, as well as transport and retention and the resulting nutrient loads into the lake. The model results show that the effects of the different development scenarios on nutrient loads are relatively limited over a time span of about 15 years. In general, a further reduction of nutrient loads is expected, except for a fast economic development scenario. PMID:15850209
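The chain described above (qualitative storyline, translated into quantitative input changes, fed to a model of emissions, transport and retention) can be sketched as a toy calculation. All numbers, sub-catchments and scenario multipliers below are invented for illustration and are not the study's calibrated GIS model:

```python
import numpy as np

# Hypothetical sub-catchment data: annual nutrient emissions (tonnes)
# and the retention fraction lost along the flow path to the lake.
emissions = np.array([120.0, 85.0, 40.0, 210.0])   # diffuse + point sources
retention = np.array([0.35, 0.20, 0.50, 0.30])     # fraction retained in transit

# Scenario storylines translated into multiplicative emission changes,
# e.g. "fast economic development" raising agricultural inputs.
scenarios = {
    "baseline": 1.00,
    "slow_growth": 1.05,
    "fast_growth": 1.30,
}

def lake_load(emis, ret, factor):
    """Nutrient load reaching the lake: emissions scaled by the scenario
    factor, reduced by in-stream retention."""
    return float(np.sum(emis * factor * (1.0 - ret)))

for name, factor in scenarios.items():
    print(name, round(lake_load(emissions, retention, factor), 1))
```

In the real model, emissions, retention and routing are spatially distributed in the GIS; here they collapse to one value per sub-catchment.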

  18. Photodegradation of polycyclic aromatic hydrocarbons in soils under a climate change base scenario.

    PubMed

    Marquès, Montse; Mari, Montse; Audí-Miró, Carme; Sierra, Jordi; Soler, Albert; Nadal, Martí; Domingo, José L

    2016-04-01

    The photodegradation of polycyclic aromatic hydrocarbons (PAHs) in two typical Mediterranean soils, either coarse- or fine-textured, was investigated. Soil samples, spiked with the 16 US EPA priority PAHs, were incubated in a climate chamber at stable conditions of temperature (20 °C) and light (9.6 W m⁻²) for 28 days, simulating a climate change base scenario. PAH concentrations in soils were analyzed throughout the experiment and correlated with data obtained by means of the Microtox® ecotoxicity test. Photodegradation was found to depend on exposure time, the molecular weight of each hydrocarbon, and soil texture. Fine-textured soil enhanced sorption, with PAHs being more photodegraded there than in coarse-textured soil. According to the EC50 values reported by Microtox®, a higher detoxification was observed in fine-textured soil, in agreement with the outcomes of the analytical study. Significant photodegradation rates were detected for a number of PAHs, namely phenanthrene, anthracene, benzo(a)pyrene, and indeno(1,2,3-cd)pyrene. Benzo(a)pyrene, commonly used as an indicator of PAH pollution, was completely removed after 7 days of light exposure. In addition to the PAH chemical analysis and the ecotoxicity tests, a hydrogen isotope analysis of benzo(a)pyrene was also carried out. The degradation of this specific compound was associated with a strong enrichment in ²H, with a maximum δ²H isotopic shift of +232‰. This strong isotopic effect observed in benzo(a)pyrene suggests that compound-specific isotope analysis (CSIA) may be a powerful tool to monitor in situ degradation of PAHs. Moreover, the hydrogen isotopes of benzo(a)pyrene evidenced a degradation process of unknown origin occurring in darkness.

  20. Scenario-Based Specification and Evaluation of Architectures for Health Monitoring of Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Sundaram, P.

    2001-01-01

    HUMS systems have been an area of increased research in recent times for two main reasons: (a) an increase in the occurrence of aerospace accidents, and (b) stricter FAA regulations on aircraft maintenance [2]. There are several problems associated with the maintenance of aircraft that HUMS systems can solve through the use of several monitoring technologies. This paper documents our methodology of employing scenarios in the specification and evaluation of an architecture for HUMS. Section 2 reviews related work that uses scenarios in software development. Section 3 describes how we use scenarios in our work, which is followed by a demonstration of our methods in the development of HUMS in Section 4. The conclusion summarizes our results.

  1. Subspace based non-parametric approach for hyperspectral anomaly detection in complex scenarios

    NASA Astrophysics Data System (ADS)

    Matteoli, Stefania; Acito, Nicola; Diani, Marco; Corsini, Giovanni

    2014-10-01

    Recent studies on global anomaly detection (AD) in hyperspectral images have focused on non-parametric approaches that seem particularly suitable for detecting anomalies in complex backgrounds without the need to assume any specific model for the background distribution. Among these, AD algorithms based on the kernel density estimator (KDE) benefit from the flexibility provided by KDE, which attempts to estimate the background probability density function (PDF) regardless of its specific form. The high computational burden associated with KDE requires that KDE-based AD algorithms be preceded by a suitable dimensionality reduction (DR) procedure aimed at identifying the subspace where most of the useful signal lies. In most cases, this may lead to a degradation of detection performance due to the leakage of some anomalous target components into the subspace orthogonal to the one identified by the DR procedure. This work presents a novel subspace-based AD strategy that combines the use of KDE with a simple parametric detector applied to the orthogonal complement of the signal subspace, in order to benefit from the non-parametric nature of KDE and, at the same time, avoid the performance loss that may occur due to the DR procedure. Experimental results indicate that the proposed AD strategy is promising and deserves further investigation.
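A minimal sketch of the two-branch strategy described above, assuming synthetic Gaussian background data, a PCA-derived signal subspace, and a simple additive fusion of the two detector branches (none of these choices are specified by the abstract):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Synthetic "background" pixels (rows = pixels, cols = spectral bands).
X = rng.normal(size=(2000, 10))

# --- Dimensionality reduction: PCA retains the leading signal subspace ---
mean = X.mean(axis=0)
Xc = X - mean
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                          # retained subspace dimension (assumption)
P_sig, P_orth = Vt[:k], Vt[k:]

# --- Non-parametric branch: KDE score in the signal subspace ---
kde = gaussian_kde(P_sig @ Xc.T)

# --- Parametric branch: energy leaked into the orthogonal complement ---
def anomaly_scores(pixels):
    Z = pixels - mean
    kde_score = -kde.logpdf(P_sig @ Z.T)          # low density -> high score
    resid = np.linalg.norm(P_orth @ Z.T, axis=0)  # leaked target energy
    return kde_score + resid**2                   # simple fusion rule

# An off-distribution pixel should score higher than a background pixel.
print(anomaly_scores(np.vstack([X[0], X[0] + 8.0])))
```

The second branch is what recovers anomalous components that the DR step pushes out of the KDE's subspace, which is the failure mode the abstract warns about.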

  2. Scenario-based tsunami hazard assessment for the coast of Vietnam from the Manila Trench source

    NASA Astrophysics Data System (ADS)

    Hong Nguyen, Phuong; Cong Bui, Que; Ha Vu, Phuong; The Pham, Truyen

    2014-11-01

    This paper assesses the impact on the Vietnamese coast of tsunamis in the East Vietnam Sea potentially originating from a giant rupture along the Manila Trench. Tsunami heights and arrival times at the major forecast points along the Vietnamese coast are computed using the COMCOT model. The results of the worst-case scenario (Mw = 9.3) and two extreme scenarios were used to assess the tsunami hazard. The simulation results show that the Vietnamese coast can be divided into three parts with different levels of tsunami hazard. The highest threat exists along the coasts of Central and North-Central Vietnam, from Quang Binh to Ba Ria - Vung Tau provinces, with a maximum wave height of 18 m observed near the Quang Ngai coast; a tsunami would reach this coastline in two hours at the earliest. The northern coastal zone of Vietnam has a lower tsunami hazard. In the worst-case scenario, the maximum amplitudes of tsunami waves at Hai Phong sea port and Nam Dinh city, North Vietnam, are 3.5 m and 3.7 m, respectively, while the travel times to these sites are much longer, over 8 h. The southern coastal zone of Vietnam has a very low tsunami hazard. In the worst-case scenario, the maximum amplitude at Ca Mau is 0.12 m, while the travel time is over 10 h.

  3. The Education Council Report 2001: An Evaluation Based on the ATEE-RDC19 Scenarios.

    ERIC Educational Resources Information Center

    Mikl, Josef

    2003-01-01

    Examines recent developments in European Union educational policy, highlighting a 2001 document of the Education Council. Evaluates the document's intentions and directions from a pedagogical viewpoint and assesses the document using the Association for Teacher Education in Europe's scenario framework. Shows that future European Union education…

  4. Supporting Primary-Level Mathematics Teachers' Collaboration in Designing and Using Technology-Based Scenarios

    ERIC Educational Resources Information Center

    Misfeldt, Morten; Zacho, Lis

    2016-01-01

    In this article, we address how the design of educational scenarios can support teachers' adoption of both technology and open-ended projects endorsing creativity and innovation. We do so by describing how groups of teachers develop digital learning environments using a combination of GeoGebra and Google Sites. Both teachers and…

  5. Future drought scenarios for the Greater Alpine Region based on dynamical downscaling experiments.

    NASA Astrophysics Data System (ADS)

    Haslinger, Klaus; Anders, Ivonne; Schöner, Wolfgang

    2014-05-01

    Large-scale droughts have major ecological, agricultural, economic and societal impacts by reducing crop yields, producing low flows in river systems or limiting public water supply. Under the perspective of rising temperatures and possibly altered precipitation regimes in the upcoming decades due to global climate change, we assess future drought characteristics for the Greater Alpine Region (GAR) with regional climate model simulations. This study consists of two parts. First, the ability of the regional climate model COSMO-CLM (CCLM) to simulate past drought conditions in space and time is evaluated. Second, an analysis of future drought scenarios for the GAR is conducted. As a drought index, the Standardized Precipitation Evapotranspiration Index (SPEI) is used. For the evaluation of the regional climate model in the past, simulations driven by ERA-40 are compared to observations, primarily the gridded observational datasets of the HISTALP database. To assess the skill of CCLM, correlation coefficients between the SPEI of the model simulations and that of the gridded observations, stratified by season and time scale, are computed. For the analysis of future changes in drought characteristics, four scenario runs are investigated: ECHAM5- and HadCM3-driven CCLM runs for the SRES scenarios A1B, A2 and B1. The SPEI is calculated spanning both the C20 and the scenario runs, which are therefore regarded as transient simulations. Generally, trends towards drier annual mean conditions are apparent in each of the scenario runs; the signal is rather strong in summer, whereas winter shows a slight increase in precipitation north of the Alps. This in turn leads to a higher variability of the SPEI in the future, as differences between winter (wetter or no change) and summer (considerably drier) grow larger.
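As an illustration of how a standardized drought index of the SPEI family is built, the sketch below applies an empirical (plotting-position) standardization of the aggregated climatic water balance to synthetic data; the operational SPEI instead fits a log-logistic distribution to P - PET, so this is a simplified stand-in:

```python
import numpy as np
from scipy.stats import norm, rankdata

def spei_like(precip, pet, scale=3):
    """Empirical standardized drought index: aggregate the climatic water
    balance (P - PET) over `scale` months, then map its empirical
    cumulative probabilities onto standard normal deviates.
    (The operational SPEI fits a log-logistic distribution instead.)"""
    d = np.asarray(precip, float) - np.asarray(pet, float)
    # Rolling sum over the aggregation time scale.
    agg = np.convolve(d, np.ones(scale), mode="valid")
    # Gringorten plotting positions -> probabilities strictly in (0, 1).
    n = len(agg)
    p = (rankdata(agg) - 0.44) / (n + 0.12)
    return norm.ppf(p)

rng = np.random.default_rng(1)
precip = rng.gamma(2.0, 30.0, size=120)   # synthetic monthly precipitation
pet = np.full(120, 55.0)                  # constant PET, for illustration only
index = spei_like(precip, pet)
print(index.min(), index.max())           # negative = dry, positive = wet
```

Because the index is standardized, values from different seasons, time scales and model runs are directly comparable, which is what makes the correlation-based skill assessment above possible.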

  6. Moon manned mission scenarios

    NASA Astrophysics Data System (ADS)

    de Angelis, G.; Tripathi, R. K.; Wilson, J. W.; Clowdsley, M. S.; Nealy, J. E.; Badavi, F. F.

    An analysis is performed on the radiation environment found around and on the surface of the Moon, and applied to different possible lunar mission scenarios. An optimization technique has been used to obtain mission scenarios minimizing the astronaut radiation exposure and at the same time controlling the effect of shielding, in terms of mass addition and material choice, as a mission cost driver. The scenarios are evaluated from the point of view of radiation safety with the radiation protection quantities recommended for LEO scenarios.

  7. Tsunami hazard potential for the equatorial southwestern Pacific atolls of Tokelau from scenario-based simulations

    NASA Astrophysics Data System (ADS)

    Orpin, A. R.; Rickard, G. J.; Gerring, P. K.; Lamarche, G.

    2015-07-01

    Devastating tsunami over the last decade have significantly heightened awareness of the potential consequences of, and vulnerability to, tsunami for low-lying Pacific islands and coastal regions. Our tsunami risk assessment for the atolls of the Tokelau Islands was based on a tsunami source-propagation-inundation model using the Gerris Flow Solver, adapted from the companion study by Lamarche et al. (2015) for the islands of Wallis and Futuna. We assess whether there is potential for tsunami flooding on any of the village islets from a series of fourteen earthquake-source experiments that apply a combination of well-established fault parameters to represent plausible "high-risk scenarios" for each of the tsunamigenic sources. Earthquake source location and moment magnitude were related to tsunami wave heights and tsunami flood depths simulated for each of the three atolls of Tokelau. This approach was adopted to yield indicative and instructive results for a community advisory, rather than being fully deterministic. Results from our modelling show that wave fields are channelled by the bathymetry of the Pacific basin in such a way that the swathes of the highest waves sweep immediately northeast of the Tokelau Islands. From our series of limited simulations, a great earthquake from the Kuril Trench poses the most significant inundation threat to Tokelau, with maximum modelled wave heights in excess of 1 m, which may last a few hours and include several wave trains. Other sources can impact specific sectors of the atolls, particularly regional sources to the south, and northern and eastern distant sources that generate trans-Pacific tsunami. In many cases impacts depend on the wave orientation and direct exposure to the oncoming tsunami. This study shows that dry areas remain around the villages in nearly all our "worst-case" tsunami simulations of the Tokelau Islands. Consistent with the oral history of little or no perceived tsunami threat, simulations from the

  8. A Modified Wilson Cycle Scenario Based on Thermo-Mechanical Model

    NASA Astrophysics Data System (ADS)

    Baes, M.; Sobolev, S. V.

    2014-12-01

    The major problem with the classical Wilson Cycle concept is the suggested conversion of a passive continental margin into an active subduction zone. Previous modeling studies assumed either unusually thick felsic continental crust at the margin (over 40 km) or unusually low lithospheric thickness (less than 70 km) to simulate this process. Here we propose a new triggering factor in the subduction initiation process, namely the mantle suction force, and based on this proposal we suggest a modification of the Wilson Cycle concept. Some time after the opening and extension of an oceanic basin, the continental passive margin moves over the slab remnants of former active subduction zones in the deep mantle. Such slab remnants, or the deep slabs of neighboring active subduction zones, produce a suction mantle flow that introduces additional compression at the passive margin. This results in the initiation of a new subduction zone, hence starting the closing phase of the Wilson Cycle. In this scenario, the weakness of the continental crust near the passive margin, inherited from the rifting phase, and the horizontal push force induced by the far-field topographic gradient within the continent facilitate and speed up the subduction initiation process. Our thermo-mechanical modeling shows that after a few tens of millions of years a shear zone may indeed develop along a passive margin that has a typical two-layered, 35 km thick continental crust and a thermal lithosphere thicker than 100 km, if there is a broad mantle down-welling flow below the margin. Soon after the formation of this shear zone, the oceanic plate descends into the mantle and subduction initiates, following over-thrusting of the continental crust and retreat of the future trench. In models without a far-field topographic gradient within the continent, subduction initiation requires a weaker passive margin. Our results also indicate that subduction initiation depends on several parameters, such as the magnitude, domain size and location of the suction mantle flow.

  9. Stochastic Multi-Commodity Facility Location Based on a New Scenario Generation Technique

    NASA Astrophysics Data System (ADS)

    Mahootchi, M.; Fattahi, M.; Khakbazan, E.

    2011-11-01

    This paper extends two models for the stochastic multi-commodity facility location problem. The problem is formulated as a two-stage stochastic program. As the main contribution of this study, a new algorithm is applied to efficiently generate scenarios for uncertain, correlated customer demands. This algorithm uses Latin Hypercube Sampling (LHS) and a scenario reduction approach. The relation between customer satisfaction level and cost is considered in Model I. A risk measure based on Conditional Value-at-Risk (CVaR) is embedded into optimization Model II. Here, the network comprises three facility layers: plants, distribution centers, and retailers. The first-stage decisions are the number, locations, and capacities of the distribution centers. In the second stage, the decisions are the production amounts and the volumes of transportation between plants and customers.
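
    A minimal sketch of the scenario-generation idea described above: Latin Hypercube Sampling of correlated demands followed by a crude reduction step. The normal-demand assumption, the k-means-style reduction, and all numbers are illustrative choices, not the authors' exact algorithm.

```python
import numpy as np
from statistics import NormalDist  # stdlib inverse normal CDF

def lhs_correlated_demands(n, mean, cov, seed=0):
    """Generate n demand scenarios for correlated products via Latin Hypercube
    Sampling: stratified uniforms -> standard normal scores -> correlated demands."""
    rng = np.random.default_rng(seed)
    d = len(mean)
    # One stratum per scenario, shuffled independently in each dimension.
    strata = np.stack([rng.permutation(n) for _ in range(d)], axis=1)
    u = (strata + rng.random((n, d))) / n            # stratified U(0,1) draws
    z = np.vectorize(NormalDist().inv_cdf)(u)        # map to N(0,1) scores
    L = np.linalg.cholesky(np.asarray(cov))          # impose the correlation
    return np.asarray(mean) + z @ L.T

def reduce_scenarios(scen, k, iters=20, seed=0):
    """Crude scenario reduction: k-means centroids with cluster probabilities."""
    rng = np.random.default_rng(seed)
    centers = scen[rng.choice(len(scen), size=k, replace=False)]
    for _ in range(iters):
        labels = ((scen[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.stack([scen[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return centers, np.bincount(labels, minlength=k) / len(scen)

# Two correlated product demands (means and covariance are made-up numbers).
scen = lhs_correlated_demands(500, mean=[100.0, 80.0],
                              cov=[[25.0, 10.0], [10.0, 16.0]])
centers, probs = reduce_scenarios(scen, k=10)
```

    The reduced set (10 scenarios with probabilities) would then feed the second stage of the two-stage stochastic program in place of the full 500-scenario sample.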

  10. Scenario-based extreme seismic hazard and risk assessment for the Baku city (Azerbaijan)

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Babayev, G.; Le Mouel, J.

    2010-12-01

    A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g. landslides, significant underground water level fluctuations), and the lack of public awareness regarding seismic hazard contribute to the increasing vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess earthquake risk in the city, determined as a convolution of seismic hazard (in terms of the surface peak ground acceleration, PGA), vulnerability (due to building construction fragility, population features, the gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information for identifying the factors influencing the risk. A deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The seismic hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and northeastern parts of the city and in some parts of the downtown. The PGA attains its maximum values for the local and extreme earthquake scenarios. We show that for all earthquake scenarios the risk depends essentially on the quality of buildings and the probability of their damage, and on the distribution of urban population and exposure, rather than on the pattern of peak ground acceleration. Our results can support the elaboration of strategic countermeasure plans for earthquake risk mitigation in Baku.
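
    The risk-as-convolution formulation can be illustrated with a toy grid calculation. The grid values and the exponential damage model below are invented for illustration; they are not the study's data or fragility curves.

```python
import numpy as np

# Toy 3x3 city grid; all values are invented for illustration, not Baku data.
pga = np.array([[0.1, 0.2, 0.3],          # hazard: peak ground acceleration (g)
                [0.2, 0.4, 0.3],
                [0.1, 0.3, 0.5]])
fragility = np.array([[0.3, 0.5, 0.9],    # vulnerability: fragility index in [0, 1]
                      [0.4, 0.8, 0.7],
                      [0.2, 0.6, 0.9]])
exposure = np.array([[1000, 5000, 2000],  # exposure: residents per cell
                     [3000, 12000, 4000],
                     [500, 6000, 7000]])

def damage_probability(pga, fragility, pga_ref=0.4):
    """Assumed toy damage model: probability rises with PGA, scaled by fragility."""
    return fragility * (1.0 - np.exp(-pga / pga_ref))

risk = damage_probability(pga, fragility) * exposure  # expected residents affected
hotspot = np.unravel_index(risk.argmax(), risk.shape)
pga_peak = np.unravel_index(pga.argmax(), pga.shape)
```

    In this toy grid the highest-risk cell does not coincide with the highest-PGA cell, echoing the finding above that building quality and population distribution dominate the risk pattern.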

  11. Tsunami hazard potential for the equatorial southwestern Pacific atolls of Tokelau from scenario-based simulations

    NASA Astrophysics Data System (ADS)

    Orpin, Alan R.; Rickard, Graham J.; Gerring, Peter K.; Lamarche, Geoffroy

    2016-05-01

    Devastating tsunami over the last decade have significantly heightened awareness of the potential consequences and vulnerability of low-lying Pacific islands and coastal regions. Our appraisal of the potential tsunami hazard for the atolls of the Tokelau Islands is based on a tsunami source-propagation-inundation model using Gerris Flow Solver, adapted from the companion study by Lamarche et al. (2015) for the islands of Wallis and Futuna. We assess whether there is potential for tsunami flooding on any of the village islets from a selection of 14 earthquake-source experiments. These earthquake sources are primarily based on the largest Pacific earthquakes of Mw ≥ 8.1 since 1950 and other large credible sources of tsunami that may impact Tokelau. Earthquake-source location and moment magnitude are related to tsunami-wave amplitudes and tsunami flood depths simulated for each of the three atolls of Tokelau. This approach yields instructive results for a community advisory but is not intended to be fully deterministic. Rather, the underlying aim is to identify credible sources that present the greatest potential to trigger an emergency response. Results from our modelling show that wave fields are channelled by the bathymetry of the Pacific basin in such a way that the swathes of the highest waves sweep immediately northeast of the Tokelau Islands. Our limited simulations suggest that trans-Pacific tsunami from distant earthquake sources to the north of Tokelau pose the most significant inundation threat. In particular, our assumed worst-case scenario for the Kuril Trench generated maximum modelled-wave amplitudes in excess of 1 m, which may last a few hours and include several wave trains. Other sources can impact specific sectors of the atolls, particularly distant earthquakes from Chile and Peru, and regional earthquake sources to the south. Flooding is dependent on the wave orientation and direct alignment to the incoming tsunami. Our "worst-case" tsunami

  12. Analytic performance prediction of track-to-track association with biased data in multi-sensor multi-target tracking scenarios.

    PubMed

    Tian, Wei; Wang, Yue; Shan, Xiuming; Yang, Jian

    2013-01-01

    An analytic method for predicting the performance of track-to-track association (TTTA) with biased data in multi-sensor multi-target tracking scenarios is proposed in this paper. The proposed method extends existing results for the bias-free situation by accounting for the impact of sensor biases. Since numerical simulations provide little insight into the intrinsic relationship between scenario parameters and TTTA performance, the proposed analytic approach is a potential substitute for the costly Monte Carlo simulation method. Analytic expressions are developed for the global nearest neighbor (GNN) association algorithm in terms of the correct association probability. The translational biases of the sensors are incorporated in the expressions, which provide good insight into how TTTA performance is affected by sensor biases, as well as by other scenario parameters, including the target spatial density, the extraneous track density and the average association uncertainty error. To validate the analytic predictions, we compare them with simulation results; the predictions agree reasonably well with the simulations over a large range of normally anticipated scenario parameters. PMID:24036583
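
    The Monte Carlo baseline that the analytic expressions replace can be sketched as follows. The geometry, bias values, and the use of simple nearest-neighbor pairing (rather than a full GNN global assignment) are simplifications chosen for brevity, not the paper's setup.

```python
import numpy as np

def mc_correct_association(n_targets=10, region=1000.0, bias=(5.0, -3.0),
                           noise_sd=2.0, trials=2000, seed=0):
    """Monte Carlo estimate of the probability that nearest-neighbor track
    pairing associates every target correctly when one sensor has a
    translational bias. A simplified stand-in for GNN association."""
    rng = np.random.default_rng(seed)
    correct = 0
    for _ in range(trials):
        truth = rng.uniform(0.0, region, size=(n_targets, 2))
        # Sensor A: unbiased tracks; sensor B: translationally biased tracks.
        tracks_a = truth + rng.normal(0.0, noise_sd, (n_targets, 2))
        tracks_b = truth + np.asarray(bias) + rng.normal(0.0, noise_sd, (n_targets, 2))
        d2 = ((tracks_a[:, None, :] - tracks_b[None, :, :]) ** 2).sum(-1)
        correct += np.array_equal(d2.argmin(axis=1), np.arange(n_targets))
    return correct / trials

p_small_bias = mc_correct_association(bias=(5.0, -3.0))
p_large_bias = mc_correct_association(bias=(200.0, -150.0))
```

    As expected, a translational bias comparable to the inter-target spacing collapses the correct association probability, which is the regime the analytic expressions aim to characterize without simulation.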

  14. Alternative Geothermal Power Production Scenarios

    DOE Data Explorer

    Sullivan, John

    2014-03-14

    The information in this file pertains to Argonne life cycle assessments (LCAs) of the plant-cycle stage for a set of ten new geothermal scenario pairs, each comprising a reference and an improved case. These analyses were conducted to compare environmental performance among the scenarios and cases. The plant types evaluated are hydrothermal binary and flash plants and Enhanced Geothermal System (EGS) binary and flash plants. Each scenario pair was developed by the LCOE group using GETEM to identify combinations of plant operation and resource characteristics that could reduce geothermal power plant LCOE values. Based on the specified plant and well-field characteristics (plant type, capacity, capacity factor and lifetime, and well numbers and depths) for each case of each pair, Argonne generated a corresponding set of material-to-power ratios (MPRs) and greenhouse gas and fossil energy ratios.
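
    The LCOE screening mentioned above can be sketched with the standard levelized-cost formula using a capital recovery factor. The plant figures and the fixed-charge simplification below are illustrative assumptions, not GETEM's actual model or data.

```python
def lcoe_usd_per_mwh(capex_usd, fixed_om_usd_per_yr, capacity_mw,
                     capacity_factor, lifetime_yr, discount_rate):
    """Levelized cost of electricity via a capital recovery factor (CRF):
    LCOE = (CAPEX * CRF + fixed O&M) / annual generation."""
    r, n = discount_rate, lifetime_yr
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)
    annual_mwh = capacity_mw * capacity_factor * 8760
    return (capex_usd * crf + fixed_om_usd_per_yr) / annual_mwh

# Hypothetical 30 MW binary plant: reference case vs improved case.
ref = lcoe_usd_per_mwh(150e6, 4.0e6, 30, 0.90, 30, 0.07)
imp = lcoe_usd_per_mwh(120e6, 3.5e6, 30, 0.95, 30, 0.07)
```

    A scenario pair "improves" in this sense when lower drilling or plant cost, or a higher capacity factor, pushes the levelized cost down; the LCA then asks whether the environmental metrics (MPRs, greenhouse gas ratios) move the same way.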

  15. Performance Evaluation of Analog Beamforming with Hardware Impairments for mmW Massive MIMO Communication in an Urban Scenario.

    PubMed

    Gimenez, Sonia; Roger, Sandra; Baracca, Paolo; Martín-Sacristán, David; Monserrat, Jose F; Braun, Volker; Halbauer, Hardy

    2016-09-22

    The use of massive multiple-input multiple-output (MIMO) techniques for communication at millimeter-wave (mmW) frequency bands has become a key enabler for meeting the data rate demands of the upcoming fifth generation (5G) cellular systems. In particular, analog and hybrid beamforming solutions are receiving increasing attention as less expensive and more power-efficient alternatives to fully digital precoding schemes. Despite their proven good performance in simple setups, their suitability for realistic cellular systems with many interfering base stations and users is still unclear. Furthermore, the performance of massive MIMO beamforming and precoding methods is in practice also affected by practical limitations and hardware constraints. This paper therefore assesses the performance of digital precoding and analog beamforming in an urban cellular system with an accurate mmW channel model under both ideal and realistic assumptions. The results show that analog beamforming can reach the performance of fully digital maximum ratio transmission under line-of-sight conditions and with a sufficient number of parallel radio-frequency (RF) chains, especially when the practical limitations of outdated channel information and per-antenna power constraints are considered. This work also shows the impact of the phase shifter errors and combiner losses introduced by real phase shifter and combiner implementations on analog beamforming: the former have a minor impact on performance, while the latter determine the optimum number of RF chains to be used in practice.
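
    The finding that phase-shifter errors cost little under line-of-sight conditions can be illustrated with a toy uniform linear array. The array size, steering angle, and 5-degree RMS phase error are assumed figures for this sketch, not the paper's system model.

```python
import numpy as np

def ula_channel(n_antennas, angle_rad, spacing_wavelengths=0.5):
    """Line-of-sight channel vector for a uniform linear array (unit-gain path)."""
    phase = 2 * np.pi * spacing_wavelengths * np.arange(n_antennas) * np.sin(angle_rad)
    return np.exp(1j * phase)

def beamforming_gain(h, w):
    """Array gain |h^T w|^2 for a transmit precoder w."""
    return np.abs(h @ w) ** 2

rng = np.random.default_rng(1)
n = 64
h = ula_channel(n, np.deg2rad(20.0))

# Fully digital maximum ratio transmission (MRT): conjugate channel, unit norm.
w_mrt = h.conj() / np.linalg.norm(h)

# Analog beamforming: phase-only weights with Gaussian phase-shifter errors.
phase_err = rng.normal(0.0, np.deg2rad(5.0), n)
w_analog = np.exp(-1j * (np.angle(h) + phase_err)) / np.sqrt(n)

g_mrt = beamforming_gain(h, w_mrt)      # equals n for a unit-gain LOS channel
g_analog = beamforming_gain(h, w_analog)
loss_db = 10 * np.log10(g_mrt / g_analog)
```

    For a pure LOS channel the phase-only weights match MRT up to the phase errors, so the gain loss stays well under a decibel, consistent with the paper's conclusion that phase-shifter errors have only a minor impact.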

  17. [New paradigm for soil and water conservation: a method based on watershed process modeling and scenario analysis].

    PubMed

    Zhu, A-Xing; Chen, La-Jiao; Qin, Cheng-Zhi; Wang, Ping; Liu, Jun-Zhi; Li, Run-Kui; Cai, Qiang-Guo

    2012-07-01

    With increasingly severe soil erosion, soil and water conservation has become an urgent concern for sustainable development. Small-watershed experimental observation is the traditional paradigm for soil and water control. However, establishing an experimental watershed usually takes a long time, and the approach suffers from poor repeatability and high cost. Moreover, results from an experimental watershed transfer poorly to other areas because of differences in watershed conditions. It is therefore not sufficient to rely entirely on this old paradigm for soil and water loss control. Recently, scenario analysis based on watershed modeling has been introduced into watershed management; it can provide information about the effectiveness of different management practices based on quantitative simulation of watershed processes. Because of its merits, such as low cost, short study period, and high repeatability, scenario analysis shows great potential for aiding the development of watershed management strategies. This paper elaborates a new paradigm using watershed modeling and scenario analysis for soil and water conservation, illustrates it through two practical watershed management cases, and explores the future development of this new paradigm.
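
    The modeling-plus-scenario-analysis paradigm can be shown in miniature with the Universal Soil Loss Equation (USLE), A = R * K * LS * C * P, varying the cover (C) and practice (P) factors across management scenarios. The factor values below are illustrative placeholders, not calibrated for any real watershed, and the watershed-process models referred to above are far richer than this.

```python
def usle_soil_loss(R, K, LS, C, P):
    """Annual soil loss (t/ha/yr) as the USLE factor product A = R*K*LS*C*P."""
    return R * K * LS * C * P

BASE = dict(R=300.0, K=0.3, LS=1.2)   # climate, soil and slope factors (fixed)

# Management scenarios differ only in cover (C) and support practice (P).
scenarios = {
    "bare fallow":       usle_soil_loss(**BASE, C=1.00, P=1.0),
    "contour cropping":  usle_soil_loss(**BASE, C=0.30, P=0.6),
    "terraced + forest": usle_soil_loss(**BASE, C=0.05, P=0.5),
}
best = min(scenarios, key=scenarios.get)
```

    Ranking the scenarios by simulated soil loss, rather than building an experimental watershed for each practice, is exactly the cheap, repeatable comparison the new paradigm advocates.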

  18. Increasing Plant Based Foods or Dairy Foods Differentially Affects Nutrient Intakes: Dietary Scenarios Using NHANES 2007-2010.

    PubMed

    Cifelli, Christopher J; Houchins, Jenny A; Demmer, Elieke; Fulgoni, Victor L

    2016-07-11

    Diets rich in plant foods and lower in animal-based products have garnered increased attention among researchers, dietitians and health professionals in recent years for their potential not only to improve health but also to lessen environmental impact. However, the potential effects of increasing plant-based foods at the expense of animal-based foods on macro- and micronutrient adequacy in the U.S. diet are unknown. In addition, dairy foods are consistently under-consumed, so the impact of increased dairy on nutrient adequacy is important to measure. Accordingly, the objective of this study was to use national survey data to model three dietary scenarios to assess the effects of increasing plant-based foods or dairy foods on macronutrient intake and nutrient adequacy. Data from the National Health and Nutrition Examination Survey (NHANES) 2007-2010 for persons two years and older (n = 17,387) were used in all analyses. Usual intakes of macronutrients and shortfall nutrients were compared across three dietary scenarios that increased intakes by 100%: (i) plant-based foods; (ii) protein-rich plant-based foods (i.e., legumes, nuts, seeds, soy); and (iii) milk, cheese and yogurt. Scenarios (i) and (ii) had commensurate reductions in animal product intake. In both children (2-18 years) and adults (≥19 years), the percent not meeting the Estimated Average Requirement (EAR) decreased for vitamin C, magnesium, vitamin E, folate and iron when plant-based foods were increased. However, the percent not meeting the EAR increased for calcium, protein, vitamin A, and vitamin D in this scenario. Doubling protein-rich plant-based foods had no effect on nutrient intake because they were consumed in very low quantities in the baseline diet. The dairy model reduced the percent not meeting the EAR for calcium, vitamin A, vitamin D, magnesium, and protein, while sodium and saturated fat levels increased. 
Our modeling shows that increasing plant-based
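
    The "percent not meeting the EAR" metric above can be sketched with the EAR cut-point method under a normal usual-intake assumption. The calcium figures below are invented for illustration, and real NHANES analyses estimate usual-intake distributions with more sophisticated statistical methods.

```python
from statistics import NormalDist

def percent_below_ear(mean_intake, sd_intake, ear):
    """EAR cut-point method under a normal usual-intake assumption: the share
    of the population whose usual intake falls below the requirement."""
    return 100.0 * NormalDist(mean_intake, sd_intake).cdf(ear)

# Illustrative calcium figures in mg/day; these are NOT NHANES estimates.
baseline = percent_below_ear(mean_intake=900.0, sd_intake=300.0, ear=800.0)
dairy_doubled = percent_below_ear(mean_intake=1150.0, sd_intake=320.0, ear=800.0)
```

    Shifting the whole intake distribution upward, as the dairy scenario does for calcium, mechanically shrinks the share of the population below the EAR, which is the effect the modeling reports.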

  1. [Synergistic emission reduction of chief air pollutants and greenhouse gases-based on scenario simulations of energy consumptions in Beijing].

    PubMed

    Xie, Yuan-bo; Li, Wei

    2013-05-01

    Improving urban air quality while reducing greenhouse gas (GHG) emissions is a common target and an important task for energy management and environmental control in Beijing. Here, based on the city's medium- and long-term development planning and energy structure, three energy consumption scenarios with low, moderate and high restrictions were designed, taking potential energy-saving policies and environmental targets into account. The long-range energy alternatives planning (LEAP) model was employed to predict and evaluate the reductions of the chief air pollutants and GHG during 2010 to 2020 under the three scenarios. The results showed that if the urban energy consumption system is optimized or adjusted through energy-saving, emission-reduction and pollution-control measures, the predicted energy use will be reduced by 10 to 30 million tons of coal equivalent by 2020. Under the two energy scenarios with moderate and high restrictions, the anticipated emissions of SO2, NOx, PM10, PM2.5, VOC and GHG will be reduced to 71 to 100.2, 159.2 to 218.7, 89.8 to 133.8, 51.4 to 96.0, 56.4 to 74.8 and 148,200 to 164,700 thousand tons, respectively. Correspondingly, compared with the low-restriction scenario, the reduction rates will be 53% to 67%, 50% to 64%, 33% to 55%, 25% to 60%, 41% to 55% and 26% to 34%, respectively. Furthermore, based on a study of synergistic emission reduction of the air pollutants and GHG, it is proposed that the adjustment and control of energy consumption be intensively developed in the three sectors of industry, transportation and services. In this way the synergistic reduction of the emissions of chief air pollutants and GHG can be achieved, while pressures from energy demand may be relieved.
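
    The bottom-up accounting that LEAP-style models perform can be sketched as activity times energy intensity times emission factor, summed over sectors. The sector values and scenario cuts below are illustrative placeholders, not Beijing data.

```python
# Bottom-up scenario accounting in the spirit of LEAP: per sector,
# energy use = activity * intensity, emissions = energy * emission factor.
# All numbers are illustrative placeholders, not Beijing data.

SECTORS = {
    # sector: (activity index, energy intensity [tce/unit], SO2 factor [kg/tce])
    "industry":  (100.0, 0.50, 1.8),
    "transport": (80.0,  0.30, 0.4),
    "services":  (60.0,  0.20, 0.6),
}

def scenario_emissions(sectors, intensity_cut=0.0, factor_cut=0.0):
    """Total energy (tce) and SO2 (kg) after efficiency and end-of-pipe cuts."""
    energy = sum(a * i * (1 - intensity_cut) for a, i, _ in sectors.values())
    so2 = sum(a * i * (1 - intensity_cut) * f * (1 - factor_cut)
              for a, i, f in sectors.values())
    return energy, so2

low = scenario_emissions(SECTORS)               # low restriction: baseline
high = scenario_emissions(SECTORS, 0.20, 0.40)  # high restriction scenario
reduction = 1 - high[1] / low[1]                # fractional SO2 cut vs baseline
```

    The synergy argument falls out of the structure: an efficiency cut (`intensity_cut`) lowers energy use and therefore every pollutant and GHG at once, while an end-of-pipe cut (`factor_cut`) lowers only the targeted pollutant.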

  2. A methanotroph-based biorefinery: Potential scenarios for generating multiple products from a single fermentation.

    PubMed

    Strong, P J; Kalyuzhnaya, M; Silverman, J; Clarke, W P

    2016-09-01

    Methane, a carbon source for methanotrophic bacteria, is the principal component of natural gas and is produced during anaerobic digestion of organic matter (biogas). Methanotrophs are a viable source of single-cell protein (feed supplement) and can produce various products, since they accumulate osmolytes (e.g. ectoine, sucrose), phospholipids (potential biofuels) and biopolymers (polyhydroxybutyrate, glycogen), among others. Other cell components, such as surface layers, metal-chelating proteins (methanobactin), enzymes (methane monooxygenase) or heterologous proteins, hold promise as future products. Here, scenarios are presented in which ectoine, polyhydroxybutyrate or protein G is synthesised as the primary product, in conjunction with a variety of ancillary products that could enhance process viability. Single- or dual-stage processes and volumetric requirements for bioreactors are discussed, in terms of an annual biomass output of 1000 tonnes. Product yields are discussed in relation to methane and oxygen consumption and organic waste generation.
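
    The volumetric requirement for a 1000-tonne annual biomass output can be sketched with a simple mass balance. The productivity and uptime figures below are assumptions for illustration, not values from the paper.

```python
def required_reactor_volume_m3(annual_biomass_t=1000.0,
                               productivity_g_per_l_h=0.5,
                               uptime_fraction=0.9):
    """Working bioreactor volume needed to hit a target annual biomass output
    from continuous fermentation: volume = output / (productivity * hours)."""
    operating_hours = 8760.0 * uptime_fraction
    grams_per_year = annual_biomass_t * 1e6
    litres = grams_per_year / (productivity_g_per_l_h * operating_hours)
    return litres / 1000.0

volume_m3 = required_reactor_volume_m3()
```

    Under these assumed figures the plant needs a working volume on the order of a few hundred cubic metres; doubling the volumetric productivity halves the required volume, which is why strain and process improvements dominate this kind of scoping.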

  4. Future impact of traffic emissions on atmospheric ozone and OH based on two scenarios

    NASA Astrophysics Data System (ADS)

    Hodnebrog, Ø.; Berntsen, T. K.; Dessens, O.; Gauss, M.; Grewe, V.; Isaksen, I. S. A.; Koffi, B.; Myhre, G.; Olivié, D.; Prather, M. J.; Stordal, F.; Szopa, S.; Tang, Q.; van Velthoven, P.; Williams, J. E.

    2012-12-01

    The future impact of traffic emissions on atmospheric ozone and OH has been investigated separately for the three sectors AIRcraft, maritime SHIPping and ROAD traffic. To reduce uncertainties we present results from an ensemble of six different atmospheric chemistry models, each simulating the atmospheric chemical composition in a possible high emission scenario (A1B), and with emissions from each transport sector reduced by 5% to estimate sensitivities. Our results are compared with optimistic future emission scenarios (B1 and B1 ACARE), presented in a companion paper, and with the recent past (year 2000). Present-day activity indicates that anthropogenic emissions so far evolve closer to A1B than the B1 scenario. As a response to expected changes in emissions, AIR and SHIP will have increased impacts on atmospheric O3 and OH in the future while the impact of ROAD traffic will decrease substantially as a result of technological improvements. In 2050, maximum aircraft-induced O3 occurs near 80° N in the UTLS region and could reach 9 ppbv in the zonal mean during summer. Emissions from ship traffic have their largest O3 impact in the maritime boundary layer with a maximum of 6 ppbv over the North Atlantic Ocean during summer in 2050. The O3 impact of road traffic emissions in the lower troposphere peaks at 3 ppbv over the Arabian Peninsula, much lower than the impact in 2000. Radiative forcing (RF) calculations show that the net effect of AIR, SHIP and ROAD combined will change from a marginal cooling of -0.44 ± 13 mW m-2 in 2000 to a relatively strong cooling of -32 ± 9.3 (B1) or -32 ± 18 mW m-2 (A1B) in 2050, when taking into account RF due to changes in O3, CH4 and CH4-induced O3. This is caused both by the enhanced negative net RF from SHIP, which will change from -19 ± 5.3 mW m-2 in 2000 to -31 ± 4.8 (B1) or -40 ± 9 mW m-2 (A1B) in 2050, and from reduced O3 warming from ROAD, which is likely to turn from a positive net RF of 12 ± 8.5 mW m-2 in 2000 to a

  5. Future impact of traffic emissions on atmospheric ozone and OH based on two scenarios

    NASA Astrophysics Data System (ADS)

    Hodnebrog, Ø.; Berntsen, T. K.; Dessens, O.; Gauss, M.; Grewe, V.; Isaksen, I. S. A.; Koffi, B.; Myhre, G.; Olivié, D.; Prather, M. J.; Stordal, F.; Szopa, S.; Tang, Q.; van Velthoven, P.; Williams, J. E.

    2012-08-01

    The future impact of traffic emissions on atmospheric ozone and OH has been investigated separately for the three sectors AIRcraft, maritime SHIPping and ROAD traffic. To reduce uncertainties we present results from an ensemble of six different atmospheric chemistry models, each simulating the atmospheric chemical composition in a possible high emission scenario (A1B), and with emissions from each transport sector reduced by 5% to estimate sensitivities. Our results are compared with optimistic future emission scenarios (B1 and B1 ACARE), presented in a companion paper, and with the recent past (year 2000). Present-day activity indicates that anthropogenic emissions so far evolve closer to A1B than the B1 scenario. As a response to expected changes in emissions, AIR and SHIP will have increased impacts on atmospheric O3 and OH in the future while the impact of ROAD traffic will decrease substantially as a result of technological improvements. In 2050, maximum aircraft-induced O3 occurs near 80° N in the UTLS region and could reach 9 ppbv in the zonal mean during summer. Emissions from ship traffic have their largest O3 impact in the maritime boundary layer with a maximum of 6 ppbv over the North Atlantic Ocean during summer in 2050. The O3 impact of road traffic emissions in the lower troposphere peaks at 3 ppbv over the Arabian Peninsula, much lower than the impact in 2000. Radiative Forcing (RF) calculations show that the net effect of AIR, SHIP and ROAD combined will change from a marginal cooling of -0.38 ± 13 mW m-2 in 2000 to a relatively strong cooling of -32 ± 8.9 (B1) or -31 ± 20 mW m-2 (A1B) in 2050, when taking into account RF due to changes in O3, CH4 and CH4-induced O3. This is caused both by the enhanced negative net RF from SHIP, which will change from -20 ± 5.4 mW m-2 in 2000 to -31 ± 4.8 (B1) or -40 ± 11 mW m-2 (A1B) in 2050, and from reduced O3 warming from ROAD, which is likely to turn from a positive net RF of 13 ± 7.9 mW m-2 in 2000 to

  6. The Nankai Trough earthquake tsunamis in Korea: numerical studies of the 1707 Hoei earthquake and physics-based scenarios

    NASA Astrophysics Data System (ADS)

    Kim, SatByul; Saito, Tatsuhiko; Fukuyama, Eiichi; Kang, Tae-Seob

    2016-04-01

    Historical documents in Korea and China report abnormal waves in the sea and rivers close to the date of the 1707 Hoei earthquake, which occurred in the Nankai Trough, off southwestern Japan. This indicates that the tsunami caused by the Hoei earthquake might have reached Korea and China, suggesting a potential hazard in Korea from large earthquakes in the Nankai Trough. We conducted tsunami simulations to study the details of tsunamis in Korea caused by large earthquakes. Our results showed that the Hoei earthquake (Mw 8.8) tsunami reached the Korean Peninsula about 200 min after the earthquake occurred. The maximum tsunami height was ~0.5 m along the Korean coast. The model of the Hoei earthquake predicted a long-lasting tsunami whose highest peak arrived near the coastline of Jeju Island 600 min after the first arrival. In addition, we conducted tsunami simulations using physics-based scenarios of anticipated earthquakes in the Nankai subduction zone. The maximum tsunami height in these scenarios (Mw 8.5-8.6) was ~0.4 m along the Korean coast. As a simple evaluation of larger possible tsunamis, we increased the amount of stress released by the earthquake by factors of two and three, yielding scenarios for Mw 8.8 and 8.9 earthquakes, respectively. The tsunami height increased by 0.1-0.4 m compared with that estimated for the Hoei earthquake.
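
    The jump from "stress scaled by two and three" to "Mw 8.8 and 8.9" follows from the moment-magnitude definition, assuming the released seismic moment scales in proportion to the stress increase (the spirit of the exercise, not a claim about the authors' exact source models).

```python
import math

def mw_from_moment(m0_nm):
    """Moment magnitude from seismic moment M0 in N*m: Mw = (2/3)(log10 M0 - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

def moment_from_mw(mw):
    """Inverse of the above: M0 = 10^(1.5*Mw + 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

# Scale the released moment of an Mw 8.6 scenario by factors of two and three.
m0 = moment_from_mw(8.6)
mw_x2 = mw_from_moment(2 * m0)
mw_x3 = mw_from_moment(3 * m0)
```

    Each doubling of moment adds (2/3)·log10(2) ≈ 0.2 magnitude units, so factors of two and three lift an Mw 8.6 scenario to roughly 8.8 and 8.9, matching the scenario magnitudes quoted above.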

  7. Scenario-based assessment of buildings damage and population exposure due to tsunamis for the town of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Pagnoni, G.; Armigliato, A.; Tinti, S.

    2015-08-01

    Alexandria, the second largest city in Egypt by population, is a key economic area in northern Africa and an important tourist destination. Historical catalogues indicate that it was severely affected by a number of tsunami events. In this work we assess the tsunami hazard by running numerical simulations of tsunami impact in Alexandria through the Worst-case Credible Tsunami Scenario Analysis (WCTSA). We identify three main seismic sources: the Western Hellenic Arc (WHA - reference event AD 365, Mw = 8.5), the Eastern Hellenic Arc (EHA - reference event 1303, Mw = 8.0) and the Cyprus Arc (CA - hypothetical scenario earthquake with Mw = 8.0), inferred from the tectonic setting and from historical tsunami catalogues. All numerical simulations are carried out by means of the code UBO-TSUFD, developed and maintained by the Tsunami Research Team of the University of Bologna. Relevant tsunami metrics are computed for each scenario and then used to build aggregated fields such as the maximum flood depth and the maximum inundation area. We find that the case producing the most relevant flooding in Alexandria is the EHA scenario, with wave heights up to 4 m. The aggregate fields are used for a building vulnerability assessment according to a methodology developed in the frame of the EU-FP6 project SCHEMA and further refined in this study, based on the adoption of a suitable building damage matrix and on water inundation depth. It is found that in the districts of El Dekhila and Al Amriyah, to the south-west of the port of Dekhila, over 12 000 buildings could be affected and hundreds of them could incur consequences ranging from important damage to total collapse. It is also found that in the same districts the tsunami inundation covers an area of about 15 km², resulting in more than 150 000 residents being exposed.
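
    The SCHEMA-style vulnerability assessment maps local inundation depth and building class to a damage grade through a damage matrix; a minimal sketch of the lookup (the building classes and depth thresholds here are hypothetical, not the actual SCHEMA matrix):

```python
# Hypothetical damage matrix: for each building class, the flood-depth
# thresholds (m) at which the damage grade steps up by one.
# Grades: 0 = no damage ... 5 = total collapse.
DAMAGE_THRESHOLDS = {
    "masonry":             [0.5, 1.0, 2.0, 3.0, 4.0],
    "reinforced_concrete": [1.0, 2.0, 4.0, 6.0, 8.0],
}

def damage_grade(building_class, depth_m):
    """Damage grade = number of thresholds the local flood depth reaches."""
    return sum(depth_m >= t for t in DAMAGE_THRESHOLDS[building_class])

print(damage_grade("masonry", 4.0))               # prints 5 (total collapse)
print(damage_grade("reinforced_concrete", 4.0))   # prints 3
```

    Applied cell by cell to the maximum-flood-depth field, this yields a spatial damage map from which building counts per grade can be aggregated.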

  8. Spatial, temporal and frequency based climate change assessment in Columbia River Basin using multi downscaled-scenarios

    NASA Astrophysics Data System (ADS)

    Rana, Arun; Moradkhani, Hamid

    2016-07-01

    Uncertainties in climate modelling are well documented in the literature. Outputs from Global Climate Models (GCMs) are often downscaled to provide climatic parameters on a regional scale. In the present work, we have analyzed the changes in precipitation and temperature for the future scenario period 2070-2099 with respect to the historical period 1970-2000 from statistically downscaled GCM projections in the Columbia River Basin (CRB). Analysis is performed using two different statistically downscaled climate projections (with ten GCM downscaled products each, for RCP 4.5 and RCP 8.5, from the CMIP5 dataset), namely those from the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and from the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, totaling 40 different scenarios. Both the BCSD and MACA datasets are downscaled from observed data for both scenario projections, i.e. RCP4.5 and RCP8.5. Analysis is performed using spatial change (yearly scale), temporal change (monthly scale), percentile change (seasonal scale), quantile change (yearly scale), and wavelet analysis (yearly scale) in the future period relative to the historical period, at a spatial resolution of 1/16th degree for the entire CRB region. Results indicate varied spatial change patterns across the Columbia River Basin, especially in the western part of the basin. At temporal scales, winter precipitation has higher variability than summer, and vice versa for temperature. Most of the models indicate considerable positive changes in quantiles and percentiles for both precipitation and temperature. Wavelet analysis provided insights into possible explanations for the changes in precipitation.
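
    The quantile-change analysis amounts to comparing empirical quantiles of an annual series between the historical (1970-2000) and future (2070-2099) periods; a stdlib sketch with synthetic precipitation data (the numbers are illustrative, not CRB output):

```python
import random

random.seed(1)
# Synthetic annual precipitation totals (mm) -- illustrative only.
historical = [random.gauss(800, 100) for _ in range(31)]   # 1970-2000
future = [random.gauss(860, 130) for _ in range(30)]       # 2070-2099

def empirical_quantile(sample, q):
    """Simple empirical quantile by sorting (no interpolation)."""
    s = sorted(sample)
    return s[min(int(q * len(s)), len(s) - 1)]

# Quantile change = future quantile minus historical quantile.
for q in (0.1, 0.5, 0.9):
    change = empirical_quantile(future, q) - empirical_quantile(historical, q)
    print(f"q={q}: change = {change:+.1f} mm")
```

    In the study this comparison is repeated at every 1/16th-degree grid cell and for each of the 40 downscaled scenarios.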

  9. The multiscale importance of road segments in a network disruption scenario: a risk-based approach.

    PubMed

    Freiria, Susana; Tavares, Alexandre O; Pedro Julião, Rui

    2015-03-01

    This article addresses the problem of the multiscale importance of road networks, with the aim of helping to establish a more resilient network in the event of a road disruption scenario. A new model for identifying the most important roads is described and applied on a local and regional scale. The work presented here represents a step forward, since it focuses on the interaction between identifying the most important roads in a network that connect people and health services, the specificity of the natural hazards that threaten the normal functioning of the network, and an assessment of the consequences of three real-world interruptions from a multiscale perspective. The case studies concern three different past events: road interruptions due to a flood, a forest fire, and a mass movement. On the basis of the results obtained, it is possible to establish the roads for which risk management should be a priority. The multiscale perspective shows that in a road interruption the regional system may have the capacity to reorganize itself, although the interruption may have consequences for local dynamics. Coordination between local and regional scales is therefore important. The model proposed here allows for the scaling of emergency response facilities and human and physical resources. It represents an innovative approach to defining priorities, not only in the prevention phase but also in terms of the response to natural disasters, such as awareness of the consequences of road disruption for the rescue services sent out to local communities. PMID:25263956

  11. Space-enabled information environment for crisis management. Scenario-based analysis and evaluation in an operational environment

    NASA Astrophysics Data System (ADS)

    Ryzenko, Jakub; Smolarkiewicz, Marcin

    2010-01-01

    The paper presents an analysis of the usefulness of space applications in crisis management activities carried out at the national level. The analytical approach was based on the development of realistic disaster scenarios and their evaluation under the assumption that space-related capabilities are available to rescue forces. Building on the results of this analysis, an experimental information environment was developed, which successfully supported the command of a large-scale crisis management field training exercise. The results prove that many crisis management needs can be served with existing, commercially available products. The key to success lies in understanding operational needs, integration into a common information environment, and standardisation of information exchange.

  12. Study of the triton-burnup process in different JET scenarios using neutron monitor based on CVD diamond

    NASA Astrophysics Data System (ADS)

    Nemtsev, G.; Amosov, V.; Meshchaninov, S.; Popovichev, S.; Rodionov, R.

    2016-11-01

    We present the results of an analysis of the triton burn-up process using data from a diamond detector. A neutron monitor based on CVD diamond was installed in the JET torus hall close to the plasma center. We measured the fraction of 14 MeV neutrons in scenarios where the plasma current varies in the range 1-3 MA. In this experiment the diamond neutron monitor was also able to detect strong gamma bursts produced by runaway electrons arising during disruptions. We conclude that the CVD diamond detector will contribute to the study of fast-particle confinement and help predict disruption events in future tokamaks.

  13. Nephrologists' likelihood of referring patients for kidney transplant based on hypothetical patient scenarios

    PubMed Central

    Tandon, Ankita; Wang, Ming; Roe, Kevin C.; Patel, Surju; Ghahramani, Nasrollah

    2016-01-01

    Background There is wide variation in referral for kidney transplant and preemptive kidney transplant (PKT). Patient characteristics such as age, race, sex and geographic location have been cited as contributing factors to this disparity. We hypothesize that the characteristics of nephrologists interplay with the patients' characteristics to influence the referral decision. In this study, we used hypothetical case scenarios to assess nephrologists' decisions regarding transplant referral. Methods A total of 3180 nephrologists were invited to participate. Among those interested, 252 were randomly selected to receive a survey in which nephrologists were asked whether they would recommend transplant for the 25 hypothetical patients. Logistic regression models with single covariates and multiple covariates were used to identify patient characteristics associated with likelihood of being referred for transplant and to identify nephrologists' characteristics associated with likelihood of referring for transplant. Results Of the 252 potential participants, 216 completed the survey. A nephrologist's affiliation with an academic institution was associated with a higher likelihood of referral, and being ‘>10 years from fellowship’ was associated with lower likelihood of referring patients for transplant. Patient age <50 years was associated with higher likelihood of referral. Rural location and smoking history/chronic obstructive pulmonary disease were associated with lower likelihood of being referred for transplant. The nephrologist's affiliation with an academic institution was associated with higher likelihood of referring for preemptive transplant, and the patient having a rural residence was associated with lower likelihood of being referred for preemptive transplant. Conclusions The variability in transplant referral is related to patients' age and geographic location as well as the nephrologists' affiliation with an academic institution and time since completion

  14. DEROCS: A computer program to simulate offshore oil and natural gas development scenarios and onshore service base requirements

    USGS Publications Warehouse

    Marcus, Philip A.; Smith, E.T.; Robinson, S.R.; Wong, A.T.

    1977-01-01

    The FORTRAN IV (H) computer program, DEROCS, constructs Outer Continental Shelf (OCS) resource development scenarios and quantifies the requirements for and impacts of the operation of the onshore service bases necessary to support offshore oil and gas operations. The acronym DEROCS stands for 'Development of Energy Resources of the Outer Continental Shelf.' The user may specify the number, timing, and amounts of offshore oil and natural gas finds, onshore service base locations, and multiplier relationships between offshore development activities and onshore land, supply, labor and facility requirements. The program determines schedules of platform installation, development drilling, production from platforms, and well workover, and calculates on a yearly basis the requirements for and impacts of the operation of the onshore service bases demanded by offshore activities. We present two examples of program application.

  15. A Usability and Learnability Case Study of Glass Flight Deck Interfaces and Pilot Interactions through Scenario-based Training

    NASA Astrophysics Data System (ADS)

    De Cino, Thomas J., II

    In the aviation industry, digitally produced and presented flight, navigation, and aircraft information is commonly referred to as glass flight decks. Glass flight decks are driven by computer-based subsystems and have long been a part of military and commercial aviation sectors. Over the past 15 years, the General Aviation (GA) sector of the aviation industry has become a recent beneficiary of the rapid advancement of computer-based glass flight deck (GFD) systems. While providing the GA pilot considerable enhancements in the quality of information about the status and operations of the aircraft, training pilots on the use of glass flight decks is often delivered with traditional methods (e.g. textbooks, PowerPoint presentations, user manuals, and limited computer-based training modules). These training methods have been reported as less than desirable in learning to use the glass flight deck interface. Difficulties in achieving a complete understanding of functional and operational characteristics of the GFD systems, acquiring a full understanding of the interrelationships of the varied subsystems, and handling the wealth of flight information provided have been reported. Documented pilot concerns of poor user experience and satisfaction, and problems with learning the complex and sophisticated interface of the GFD, are additional issues with current pilot training approaches. A case study was executed to explore ways to improve training using GFD systems at a Midwestern aviation university. The researcher investigated whether variations in instructional systems design and training methods for learning glass flight deck technology would affect the perceptions and attitudes of pilots of the learnability (an attribute of usability) of the glass flight deck interface. Specifically, this study investigated the effectiveness of scenario-based training (SBT) methods to potentially improve pilot knowledge and understanding of a GFD system, and overall pilot user

  16. Evaluation of Resident Evacuations in Urban Rainstorm Waterlogging Disasters Based on Scenario Simulation: Daoli District (Harbin, China) as an Example

    PubMed Central

    Chen, Peng; Zhang, Jiquan; Zhang, Lifeng; Sun, Yingyue

    2014-01-01

    With the acceleration of urbanization, waterlogging has become an increasingly serious issue. Road waterlogging has a great influence on residents’ travel and traffic safety. Thus, evaluation of residents’ travel difficulties caused by rainstorm waterlogging disasters is of great significance for their travel safety and emergency shelter needs. This study investigated urban rainstorm waterlogging disasters, evaluating the impact of such disasters’ evolution on residents’ evacuation, using Daoli District (Harbin, China) as the research demonstration area to perform empirical research with a combination of scenario simulations, questionnaires, GIS spatial analysis and a hydrodynamics method to establish an urban rainstorm waterlogging numerical simulation model. The results show that under the conditions of a 10-year frequency rainstorm, there are three street sections in the study area with a high difficulty index and five street sections with a medium difficulty index, while the index is low elsewhere; under the conditions of a 50-year frequency rainstorm, there are five street sections with a high difficulty index and nine street sections with a medium difficulty index, with a low index elsewhere. These research results can lay the foundation for further small-scale urban rainstorm waterlogging disaster scenario simulations and emergency shelter planning, as well as forecasting and warning, and provide a new line of thought and a research method for studying residents’ safe travel. PMID:25264676

  18. Proposal of Comprehensive Model of Teaching Basic Nursing Skills Under Goal-Based Scenario Theory.

    PubMed

    Sannomiya, Yuri; Muranaka, Yoko; Teraoka, Misako; Suzuki, Sayuri; Saito, Yukie; Yamato, Hiromi; Ishii, Mariko

    2016-01-01

    The purpose of this study is to design and develop a comprehensive model of teaching basic nursing skills based on GBS theory and the Four-Stage Performance Cycle. We designed a basic nursing skill program that consists of three courses: basic, application and multi-tasking. The program will be offered as blended learning, utilizing e-learning. PMID:27332480

  19. Prescriptive vs. performance based cook-off fire testing.

    SciTech Connect

    Nakos, James Thomas; Tieszen, Sheldon Robert; Erikson, William Wilding; Gill, Walter; Blanchat, Thomas K.

    2010-07-01

    In the fire safety community, the trend is toward implementing performance-based standards in place of existing prescriptive ones. Prescriptive standards can be difficult to adapt to changing design methods, materials, and application situations of systems that ultimately must perform well in unwanted fire situations. In general, this trend has produced positive results and is embraced by the fire protection community. The question arises as to whether this approach could be used to advantage in cook-off testing. Prescribed fuel fire cook-off tests were instituted in response to historical incidents that led to extensive damage to structures and loss of life. They are designed to evaluate the propensity for a violent response. The prescribed protocol has several advantages: it can be defined in terms of controllable parameters (wind speed, fuel type, pool size, etc.), and it may be conservative for a particular scenario. However, fires are inherently variable, and prescribed tests are not necessarily representative of a particular accident scenario. Moreover, prescribed protocols are not necessarily adaptable and may not be conservative. We also consider performance-based testing. This requires more knowledge and thought regarding not only the fire environment, but also the behavior of the munitions themselves. Sandia uses a performance-based approach in assuring the safe behavior of systems of interest that contain energetic materials. Sandia also conducts prescriptive fire testing for the IAEA, NRC and the DOT. Here we comment on the strengths and weaknesses of both approaches and suggest a path forward should it be desirable to pursue a performance-based cook-off standard.

  20. Scripting Scenarios for the Human Patient Simulator

    NASA Technical Reports Server (NTRS)

    Bacal, Kira; Miller, Robert; Doerr, Harold

    2004-01-01

    The Human Patient Simulator (HPS) is particularly useful in providing scenario-based learning which can be tailored to fit specific scenarios and which can be modified in real time to enhance the teaching environment. Scripting these scenarios so as to maximize learning requires certain skills, in order to ensure that a change in student performance, understanding, critical thinking, and/or communication skills results. Methods: A "good" scenario can be defined in terms of applicability, learning opportunities, student interest, and clearly associated metrics. Obstacles to such a scenario include a lack of understanding of the applicable environment by the scenario author(s), a desire (common among novices) to cover too many topics, failure to define learning objectives, mutually exclusive or confusing learning objectives, unskilled instructors, poor preparation, a disorganized approach, or an inappropriate teaching philosophy (such as "trial by fire" or education through humiliation). Results: Descriptions of several successful teaching programs, used in the military, civilian, and NASA medical environments, will be provided, along with sample scenarios. Discussion: Simulator-based lessons have proven to be a time- and cost-efficient manner by which to educate medical personnel. Particularly when training for medical care in austere environments (pre-hospital, aeromedical transport, International Space Station, military operations), the HPS can enhance the learning experience.

  1. Exploring an Ecologically Sustainable Scheme for Landscape Restoration of Abandoned Mine Land: Scenario-Based Simulation Integrated Linear Programming and CLUE-S Model

    PubMed Central

    Zhang, Liping; Zhang, Shiwen; Huang, Yajie; Cao, Meng; Huang, Yuanfang; Zhang, Hongyan

    2016-01-01

    Understanding abandoned mine land (AML) changes during land reclamation is crucial for reusing damaged land resources and formulating sound ecological restoration policies. This study combines the linear programming (LP) model and the CLUE-S model to simulate land-use dynamics in the Mentougou District (Beijing, China) from 2007 to 2020 under three reclamation scenarios, that is, the planning scenario based on the general land-use plan in the study area (scenario 1), maximal comprehensive benefits (scenario 2), and maximal ecosystem service value (scenario 3). Nine landscape-scale graph metrics were then selected to describe the landscape characteristics. The results show that the coupled model presented can simulate the dynamics of AML effectively and that the spatially explicit transformations of AML differ among scenarios. New cultivated land dominates in scenario 1, while construction land and forest land account for the major percentages in scenarios 2 and 3, respectively. Scenario 3 has an advantage in most of the selected indices, as its patches are combined most closely. To conclude, reclaiming AML by transformation into more forest can reduce the variability and maintain the stability of the landscape ecological system in the study area. These findings contribute to better mapping of AML dynamics and provide policy support for the management of AML. PMID:27023575
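
    The LP step allocates the reclaimed AML area among land-use types to maximize an objective (comprehensive benefit in scenario 2, ecosystem service value in scenario 3) subject to area constraints. With a single total-area constraint and per-use bounds, the optimum can be reached greedily, as this toy sketch shows (all per-hectare values and bounds are hypothetical, not the study's coefficients):

```python
# (name, objective value per ha, min ha, max ha) -- illustrative numbers.
land_uses = [
    ("forest",       9.0, 100, 1500),
    ("cultivated",   5.0, 200, 1000),
    ("construction", 2.0,  50,  800),
]
TOTAL_HA = 2000

# Meet every minimum first, then spend the remaining area greedily in
# decreasing order of per-hectare value -- optimal for this
# single-constraint LP with fractional (continuous) allocation.
alloc = {name: lo for name, _, lo, _ in land_uses}
remaining = TOTAL_HA - sum(alloc.values())
for name, value, lo, hi in sorted(land_uses, key=lambda u: -u[1]):
    extra = min(remaining, hi - lo)
    alloc[name] += extra
    remaining -= extra

print(alloc)   # forest receives the largest share, echoing scenario 3
```

    The real study has multiple constraints (policy targets, spatial demands), which require a general LP solver rather than this greedy shortcut; CLUE-S then distributes the resulting areas spatially.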

  4. Performance Based Budgeting Update. Information Capsule.

    ERIC Educational Resources Information Center

    Bashford, Joanne

    The report shows the performance of Miami-Dade Community College (M-DCC) (Florida) on the measures stipulated for the 2000-01 allocation of Performance Based Budgeting (PBB). Of the total state funds allocated for Performance Based Budgeting, a certain percentage is designated for each of the measures. Colleges earn "points" according to the…

  5. Climate influences on the cost-effectiveness of vector-based interventions against malaria in elimination scenarios.

    PubMed

    Parham, Paul E; Hughes, Dyfrig A

    2015-04-01

    Despite the dependence of mosquito population dynamics on environmental conditions, the associated impact of climate and climate change on present and future malaria remains an area of ongoing debate and uncertainty. Here, we develop a novel integration of mosquito, transmission and economic modelling to assess whether the cost-effectiveness of indoor residual spraying (IRS) and long-lasting insecticidal nets (LLINs) against Plasmodium falciparum transmission by Anopheles gambiae s.s. mosquitoes depends on climatic conditions in low endemicity scenarios. We find that although temperature and rainfall affect the cost-effectiveness of IRS and/or LLIN scale-up, whether this is sufficient to influence policy depends on local endemicity, existing interventions, host immune response to infection and the emergence rate of insecticide resistance. For the scenarios considered, IRS is found to be more cost-effective than LLINs for the same level of scale-up, and both are more cost-effective at lower mean precipitation and higher variability in precipitation and temperature. We also find that the dependence of peak transmission on mean temperature translates into optimal temperatures for vector-based intervention cost-effectiveness. Further cost-effectiveness analysis that accounts for country-specific epidemiological and environmental heterogeneities is required to assess optimal intervention scale-up for elimination and better understand future transmission trends under climate change. PMID:25688017
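
    Comparing IRS and LLINs at the same level of scale-up is, at its core, an incremental cost-effectiveness ratio (ICER) calculation: incremental cost divided by incremental health effect. A sketch with hypothetical numbers (not the paper's data):

```python
def icer(cost_a, effect_a, cost_b, effect_b):
    """Incremental cost-effectiveness ratio of intervention A over B,
    i.e. extra cost per extra unit of effect (e.g. per DALY averted)."""
    return (cost_a - cost_b) / (effect_a - effect_b)

# Hypothetical costs (USD) and effects (DALYs averted) per 1000 people.
irs = {"cost": 24_000, "effect": 120}
llin = {"cost": 18_000, "effect": 80}

print(icer(irs["cost"], irs["effect"], llin["cost"], llin["effect"]))
# prints 150.0 -- cost per additional DALY averted by choosing IRS over LLINs
```

    In the study, the cost and effect inputs themselves vary with temperature and rainfall through the mosquito and transmission models, which is how climate enters the cost-effectiveness comparison.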

  7. A triangular fuzzy TOPSIS-based approach for the application of water technologies in different emergency water supply scenarios.

    PubMed

    Qu, Jianhua; Meng, Xianlin; Yu, Huan; You, Hong

    2016-09-01

    Because of the increasing frequency and intensity of unexpected natural disasters, providing safe drinking water for the affected population following a disaster has become a global challenge of growing concern. An onsite water supply technology that is portable, mobile, or modular is a more suitable and sustainable solution for the victims than transporting bottled water. In recent years, various water techniques, such as membrane-assisted technologies, have been proposed and successfully implemented in many places. Given the diversity of techniques available, the current challenge is how to scientifically identify the optimum options for different disaster scenarios. Hence, a triangular fuzzy TOPSIS-based multi-criteria group decision-making tool was developed in this research. The approach was then applied to the selection of the most appropriate water technologies for the different emergency water supply scenarios. The results show that this tool is capable of facilitating scientific analysis in the evaluation and selection of emergency water technologies for ensuring a secure drinking water supply in disaster relief. PMID:27221588
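
    The triangular fuzzy TOPSIS ranking named above can be sketched as follows; the alternatives, criteria, fuzzy scores and weights are invented for illustration, and fuzzy scores are defuzzified by centroid before a standard TOPSIS pass:

```python
# Minimal TOPSIS sketch with triangular fuzzy scores (l, m, u).
# Alternatives, criteria and weights are invented, not taken from the paper.
import math

def defuzzify(tfn):
    l, m, u = tfn
    return (l + m + u) / 3.0  # centroid of a triangular fuzzy number

# rows = candidate water-supply technologies, columns = benefit criteria
scores = [
    [(5, 7, 9), (3, 5, 7)],   # e.g. a membrane unit
    [(3, 5, 7), (5, 7, 9)],   # e.g. a mobile treatment truck
]
weights = [0.6, 0.4]

crisp = [[defuzzify(t) for t in row] for row in scores]
# vector-normalise each column, then apply the criterion weights
norms = [math.sqrt(sum(row[j] ** 2 for row in crisp)) for j in range(len(weights))]
v = [[weights[j] * row[j] / norms[j] for j in range(len(weights))] for row in crisp]

ideal = [max(col) for col in zip(*v)]   # positive ideal solution
anti = [min(col) for col in zip(*v)]    # negative ideal solution

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# closeness coefficient: higher = closer to the ideal solution
cc = [dist(row, anti) / (dist(row, ideal) + dist(row, anti)) for row in v]
ranking = sorted(range(len(cc)), key=lambda i: -cc[i])
```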

  8. Uncertainty in local and regional tsunami earthquake source parameters: Implications for scenario based hazard assessment and forecasting

    NASA Astrophysics Data System (ADS)

    Müller, Christof; Power, William; Burbidge, David; Wang, Xiaoming

    2016-04-01

    Over the last decade, tsunami propagation models have been used extensively for tsunami forecasting and for hazard and risk assessment. However, the effect of uncertainty in earthquake source parameters, such as the location and distribution of slip in the earthquake source, on the results of tsunami models has not always been examined in great detail. We have developed a preliminary combined and continuous Hikurangi-Kermadec subduction zone interface model. The model is defined by a spline surface and is based on a previously published spline model for the Hikurangi interface and a more traditional unit-source model for the Kermadec interface. The model allows the earthquake epicenter to be positioned freely and non-uniform slip to be considered. Using this model, we have investigated the effects of variability in non-uniform slip and epicenter location on the distribution of offshore maximum wave heights for local New Zealand targets. Which scenario out of an ensemble is responsible for the maximum wave height locally is a spatially highly variable function of earthquake location and/or the distribution of slip. We use the Coefficient of Variation (CoV) to quantify the variability of offshore wave heights as a function of source location and distribution of slip. CoV increases significantly with closer proximity to the shore, in bays and in shallow water. The study has implications for tsunami hazard assessment and forecasting. For example, our results challenge the concept of hazard assessment using a single worst-case scenario, in particular for local tsunamis.
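
    The Coefficient of Variation used above is simply the ensemble standard deviation divided by the ensemble mean at each target point; a minimal sketch with invented wave heights:

```python
# CoV (= std / mean) across a scenario ensemble, as used above to quantify
# wave-height variability per offshore target. Heights are invented values.
import statistics

# max wave height (m) at two target sites over an ensemble of source scenarios
heights = {
    "deep_water_site": [1.9, 2.0, 2.1, 2.0],
    "shallow_bay_site": [0.5, 2.5, 1.0, 4.0],
}

cov = {site: statistics.pstdev(h) / statistics.mean(h)
       for site, h in heights.items()}
# the shallow/bay site shows far larger scenario-to-scenario variability,
# mirroring the near-shore CoV increase reported above
```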

  9. Surface impedance based microwave imaging method for breast cancer screening: contrast-enhanced scenario.

    PubMed

    Güren, Onan; Çayören, Mehmet; Ergene, Lale Tükenmez; Akduman, Ibrahim

    2014-10-01

    A new microwave imaging method that uses microwave contrast agents is presented for the detection and localization of breast tumours. The method is based on the reconstruction of the breast surface impedance from the measured scattered field. The surface impedance modelling allows the electrical properties of the breasts to be represented in terms of impedance boundary conditions, which enable the inner structure of the breasts to be mapped into surface impedance functions. A simple quantitative method is then proposed to screen breasts for malignant tumours, where the detection procedure is based on weighted cross-correlations among impedance functions. Numerical results demonstrate that the method is capable of detecting small malignancies and provides reasonable localization.

  10. A comparison between the example reference biosphere model ERB 2B and a process-based model: simulation of a natural release scenario.

    PubMed

    Almahayni, T

    2014-12-01

    The BIOMASS methodology was developed with the objective of constructing defensible assessment biospheres for assessing the potential radiological impacts of radioactive waste repositories. To this end, a set of Example Reference Biospheres was developed to demonstrate the use of the methodology and to provide an international point of reference. In this paper, the performance of the Example Reference Biosphere model ERB 2B associated with the natural release scenario, discharge of contaminated groundwater to the surface environment, was evaluated by comparing its long-term projections of radionuclide dynamics and distribution in a soil-plant system to those of a process-based, transient advection-dispersion (AD) model. The models were parametrised with data characteristic of a typical rainfed winter wheat crop grown on a sandy loam soil under temperate climate conditions. Three safety-relevant radionuclides, (99)Tc, (129)I and (237)Np, with different degrees of sorption, were selected for the study. Although the models were driven by the same hydraulic (soil moisture content and water fluxes) and radiological (Kds) input data, their projections were remarkably different. On the one hand, both models were able to capture short- and long-term variation in activity concentration in the subsoil compartment. On the other hand, the Reference Biosphere model did not project any radionuclide accumulation in the topsoil and crop compartments. This behaviour would underestimate the radiological exposure under natural release scenarios. The results highlight the potential role deep roots play in soil-to-plant transfer under a natural release scenario in which radionuclides are released into the subsoil. When considering the relative activity and root-depth profiles within the soil column, much of the radioactivity was taken up into the crop from the subsoil compartment. Further improvements were suggested to address the limitations of the Reference Biosphere model presented in this paper.


  12. Improving the performance of a filling line based on simulation

    NASA Astrophysics Data System (ADS)

    Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.

    2016-08-01

    The paper describes a method of improving the performance of a filling line based on simulation. The study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated against line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations formed the basis of a financial analysis: NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
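
    The financial comparison described above rests on standard NPV and ROI formulas; a simplified sketch with invented cash flows (the paper's treatment of depreciation, CIT and inflation is not reproduced here):

```python
# NPV/ROI sketch for comparing improvement scenarios; the investment and
# yearly profits below are invented illustration values.
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the (negative) investment at t=0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

investment = 100_000.0
yearly_net_profit = [30_000.0] * 5        # after-tax profit per year
flows = [-investment] + yearly_net_profit

project_npv = npv(0.08, flows)            # discounted at an assumed 8%
roi = sum(yearly_net_profit) / investment - 1.0   # simple, undiscounted ROI
# a positive NPV means the improvement scenario repays its investment
```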

  13. Inquiry-Based Science Education: A Scenario on Zambia's High School Science Curriculum

    ERIC Educational Resources Information Center

    Chabalengula, Vivien M.; Mumba, Frackson

    2012-01-01

    This paper is aimed at elucidating the current state of inquiry-based science education (IBSE) in Zambia's high school science curriculum. Therefore, we investigated Zambian teachers' conceptions of inquiry; determined inquiry levels in the national high school science curriculum materials, which include syllabi, textbooks and practical exams; and…

  14. Designing Collaborative E-Learning Environments Based upon Semantic Wiki: From Design Models to Application Scenarios

    ERIC Educational Resources Information Center

    Li, Yanyan; Dong, Mingkai; Huang, Ronghuai

    2011-01-01

    The knowledge society requires life-long learning and flexible learning environment that enables fast, just-in-time and relevant learning, aiding the development of communities of knowledge, linking learners and practitioners with experts. Based upon semantic wiki, a combination of wiki and Semantic Web technology, this paper designs and develops…

  15. Supply Chain Simulator: A Scenario-Based Educational Tool to Enhance Student Learning

    ERIC Educational Resources Information Center

    Siddiqui, Atiq; Khan, Mehmood; Akhtar, Sohail

    2008-01-01

    Simulation-based educational products are excellent set of illustrative tools that proffer features like visualization of the dynamic behavior of a real system, etc. Such products have great efficacy in education and are known to be one of the first-rate student centered learning methodologies. These products allow students to practice skills such…

  16. Application of physiologically based toxicokinetic modelling to study the impact of the exposure scenario on the toxicokinetics and the behavioural effects of toluene in rats.

    PubMed

    van Asperen, Judith; Rijcken, W Robert Pels; Lammers, Jan H C M

    2003-02-18

    The toxicity of inhalatory exposure to organic solvents may be related not only to the total external dose, but also to the pattern of exposure. In this study, physiologically based toxicokinetic (PBTK) modelling was used to study the impact of the exposure scenario on the toxicokinetics and the behavioural effects of the model solvent toluene in rats. After construction of the model with parameters from the literature, toxicokinetic data were collected from rats exposed to either a constant concentration or fluctuating concentrations at total external dose levels of 20,000 and 10,000 ppm x h for model validation. Under the same exposure conditions, the effects on learned performance were evaluated in separate groups of rats using a visual discrimination task. In general, the PBTK model provided reliable predictions of the toxicokinetics of toluene under different exposure scenarios, but it also tended to underestimate the blood and brain concentrations in the descending parts of the tissue concentration-time curves. At these high dose levels, the differences in toxicokinetics between the constant and the fluctuating exposure groups were relatively small. The visual discrimination experiments demonstrated a slowing of response speed and disinhibition of responding in all toluene-exposed groups. The results suggest that the brain concentration of toluene is one of the major determinants of its effect on disinhibition of responding. PMID:12559692
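
    The contrast between constant and fluctuating exposure at the same total external dose can be illustrated with a one-compartment toxicokinetic toy model, a drastic simplification of the multi-compartment PBTK model above; all rate constants and units are invented:

```python
# One-compartment toy model: dC/dt = k_in * C_air(t) - k_out * C,
# integrated with forward Euler. Not the paper's PBTK model; k_in, k_out
# and the exposure profiles are invented for illustration.
def simulate(conc_air_ppm, hours, k_in=0.5, k_out=0.3, dt=0.01):
    """Return the tissue concentration (arbitrary units) after `hours`."""
    c = 0.0
    for i in range(int(round(hours / dt))):
        t = i * dt
        c += dt * (k_in * conc_air_ppm(t) - k_out * c)
    return c

# same total external dose (20,000 ppm x h over 8 h): constant vs fluctuating
constant = simulate(lambda t: 2500.0, 8.0)
fluctuating = simulate(lambda t: 5000.0 if int(t) % 2 == 0 else 0.0, 8.0)
# even at equal external dose, the end-of-exposure tissue levels differ
```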

  17. Physics and control of ELMing H-mode negative-central-shear advanced tokamak ITER scenario based on experimental profiles from DIII-D

    NASA Astrophysics Data System (ADS)

    Lao, L. L.; Chan, V. S.; Chu, M. S.; Evans, T.; Humphreys, D. A.; Leuer, J. A.; Mahdavi, M. A.; Petrie, T. W.; Snyder, P. B.; St. John, H. E.; Staebler, G. M.; Stambaugh, R. D.; Taylor, T. S.; Turnbull, A. D.; West, W. P.; Brennan, D. P.

    2003-10-01

    Key DIII-D advanced tokamak (AT) experimental and modelling results are applied to examine the physics and control issues for ITER to operate in a negative central shear (NCS) AT scenario. The effects of a finite edge pressure pedestal and current density are included based on the DIII-D experimental profiles. Ideal and resistive stability analyses demonstrate that feedback control of resistive wall modes by rotational drive or flux-conserving intelligent coils is crucial for these AT configurations to operate at attractive βN values in the range 3.0-3.5. Vertical stability and halo current analyses show that reliable disruption mitigation is essential and that mitigation control using an impurity gas can significantly reduce the local mechanical stress to an acceptable level. Core transport and turbulence analyses indicate that control of the rotational shear profile is essential to reduce the pedestal temperature required for high β. Consideration of edge stability and core transport suggests that a sufficiently wide pedestal is necessary for the projected fusion performance. Heat flux analyses indicate that, with core-only radiation enhancement, the outboard peak divertor heat load is near the design limit of 10 MW m-2. Detached operation may be necessary to reduce the heat flux to a more manageable level. Evaluation of the ITER pulse length using a local step response approach indicates that the 3000 s ITER long-pulse scenario is probably both necessary and sufficient for demonstration of local current profile control.

  18. Validation of a scenario-based assessment of critical thinking using an externally validated tool.

    PubMed

    Buur, Jennifer L; Schmidt, Peggy; Smylie, Dean; Irizarry, Kris; Crocker, Carlos; Tyler, John; Barr, Margaret

    2012-01-01

    With medical education transitioning from knowledge-based curricula to competency-based curricula, critical thinking skills have emerged as a major competency. While there are validated external instruments for assessing critical thinking, many educators have created their own custom assessments of critical thinking. However, the face validity of these assessments has not been challenged. The purpose of this study was to compare results from a custom assessment of critical thinking with the results from a validated external instrument of critical thinking. Students from the College of Veterinary Medicine at Western University of Health Sciences were administered a custom assessment of critical thinking (ACT) examination and the externally validated instrument, the California Critical Thinking Skills Test (CCTST), in the spring of 2011. Total scores and sub-scores from each exam were analyzed for significant correlations using Pearson correlation coefficients. Significant correlations were found between ACT Blooms 2 and deductive reasoning, and between the total ACT score and deductive reasoning, with correlation coefficients of 0.24 and 0.22, respectively. No other statistically significant correlations were found. The lack of significant correlation between the two examinations illustrates the need in medical education to externally validate internal custom assessments. Ultimately, the development and validation of custom assessments of non-knowledge-based competencies will produce higher quality medical professionals.
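
    The comparison above relies on Pearson correlation coefficients between paired exam scores; a minimal sketch with invented score lists:

```python
# Pearson's r between two sets of exam scores, as used above to compare the
# custom ACT with the CCTST; the paired score lists below are invented.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

act = [62, 70, 75, 80, 90, 55, 68]     # custom assessment total scores
cctst = [18, 21, 20, 24, 27, 16, 19]   # external instrument total scores

r = pearson(act, cctst)
# values near zero, like the 0.22-0.24 reported above, would indicate that
# the two instruments largely do not measure the same construct
```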

  19. The Use of Open-Ended Problem-Based Learning Scenarios in an Interdisciplinary Biotechnology Class: Evaluation of a Problem-Based Learning Course Across Three Years†

    PubMed Central

    Steck, Todd R.; DiBiase, Warren; Wang, Chuang; Boukhtiarov, Anatoli

    2012-01-01

    Use of open-ended Problem-Based Learning (PBL) in biology classrooms has been limited by the difficulty of designing problem scenarios such that the content learned in a course can be predicted and controlled, the lack of familiarity of this method of instruction among faculty, and the difficulty of assessment. Here we present the results of a study in which we developed a team-based interdisciplinary course that combined the fields of biology and civil engineering across three years. We used PBL scenarios as the only learning tool, wrote the problem scenarios, and developed the means to assess these courses, and we report the results of that assessment. Our data indicate that PBL changed students' perception of their learning of content knowledge and promoted a change in students' learning styles. Although no statistically significant improvement in problem-solving skills or critical thinking skills was observed, students reported substantial changes in their problem-based learning strategies and critical thinking skills. PMID:23653774

  20. Evaluating interactive computer-based scenarios designed for learning medical technology.

    PubMed

    Persson, Johanna; Dalholm, Elisabeth Hornyánszky; Wallergård, Mattias; Johansson, Gerd

    2014-11-01

    The use of medical equipment is growing in healthcare, resulting in an increased need for resources to educate users in how to manage the various devices. Learning the practical operation of a device is one thing, but learning how to work with the device in the actual clinical context is more challenging. This paper presents a computer-based simulation prototype for learning medical technology in the context of critical care. Properties from simulation and computer games have been adopted to create a visualization-based, interactive and contextually bound tool for learning. A participatory design process, including three researchers and three practitioners from a clinic for infectious diseases, was adopted to adjust the form and content of the prototype to the needs of clinical practice and to create a situated learning experience. An evaluation with 18 practitioners showed that they were positive toward this type of learning tool and that it served as a good platform for eliciting and sharing knowledge. Our conclusion is that this type of tool can complement traditional learning resources by situating the learning in a context without requiring advanced technology or being resource-demanding. PMID:24898339


  2. A SCENARIO FOR THE FINE STRUCTURES OF SOLAR TYPE IIIb RADIO BURSTS BASED ON ELECTRON CYCLOTRON MASER EMISSION

    SciTech Connect

    Wang, C. B.

    2015-06-10

    A scenario based on electron cyclotron maser (ECM) emission is proposed for the fine structures of solar radio emission. It is suggested that under certain conditions modulation of the ratio between the plasma frequency and electron gyro frequency by ultra-low-frequency waves, which is a key parameter for excitation of ECM instability, may lead to the intermittent emission of radio waves. As an example, the explanation for the observed fine-structure components in the solar Type IIIb bursts is discussed in detail. Three primary issues of Type IIIb bursts are addressed: (1) the physical mechanism that results in intermittent emission elements that form a chain in the dynamic spectrum of Type IIIb bursts, (2) the cause of split pairs (or double stria) and triple stria, and (3) why only IIIb–III bursts are observed in the events of fundamental harmonic pair emission whereas IIIb–IIIb or III–IIIb bursts are very rarely observed.

  3. A Scenario for the Fine Structures of Solar Type IIIb Radio Bursts Based on Electron Cyclotron Maser Emission

    NASA Astrophysics Data System (ADS)

    Wang, C. B.

    2015-06-01

    A scenario based on electron cyclotron maser (ECM) emission is proposed for the fine structures of solar radio emission. It is suggested that under certain conditions modulation of the ratio between the plasma frequency and electron gyro frequency by ultra-low-frequency waves, which is a key parameter for excitation of ECM instability, may lead to the intermittent emission of radio waves. As an example, the explanation for the observed fine-structure components in the solar Type IIIb bursts is discussed in detail. Three primary issues of Type IIIb bursts are addressed: (1) the physical mechanism that results in intermittent emission elements that form a chain in the dynamic spectrum of Type IIIb bursts, (2) the cause of split pairs (or double stria) and triple stria, and (3) why only IIIb-III bursts are observed in the events of fundamental harmonic pair emission whereas IIIb-IIIb or III-IIIb bursts are very rarely observed.

  4. South African maize production scenarios for 2055 using a combined empirical and process-based model approach

    NASA Astrophysics Data System (ADS)

    Estes, L.; Bradley, B.; Oppenheimer, M.; Wilcove, D.; Beukes, H.; Schulze, R. E.; Tadross, M.

    2011-12-01

    In South Africa, a semi-arid country with a diverse agricultural sector, climate change is projected to negatively impact staple crop production. Our study examines future impacts on maize, South Africa's most widely grown staple crop. Working at finer spatial resolution than previous studies, we combine the process-based DSSAT4.5 model and the empirical MAXENT model to study future maize suitability. Climate scenarios were based on 9 GCMs run under the SRES A2 and B1 emissions scenarios, downscaled (using self-organizing maps) to 5838 locations. Soil properties were derived from textural and compositional data linked to 26422 landforms. DSSAT was run with typical dryland planting parameters and mean projected CO2 values. MAXENT was trained using aircraft-observed distributions and monthly climatologies derived from downscaled daily records, with future rainfall increased by 10% to simulate CO2-related water-use efficiency gains. We assessed model accuracy based on correlations between model output and a satellite-derived yield proxy (integrated NDVI), and the overlap of modeled and observed maize field distributions. DSSAT yields were linearly correlated with mean integrated NDVI (R2 = 0.38), while MAXENT's relationship was logistic. Binary suitability maps based on thresholding model outputs were slightly more accurate for MAXENT (88%) than for DSSAT (87%) when compared to the current maize field distribution. We created 18 suitability maps for each model (9 GCMs x 2 SRES) using projected changes relative to historical suitability thresholds. Future maps largely agreed in eastern South Africa, but disagreed strongly in the semi-arid west. Using a 95% confidence criterion (17 models agree), MAXENT showed a 241305 km2 suitability loss relative to its modeled historical suitability, while DSSAT showed a potential loss of only 112446 km2.
Even the smaller potential loss highlighted by DSSAT is uncertain, given that DSSAT's mean (across all 18 climate scenarios) projected yield
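
    The "17 of 18 scenarios agree" (~95% confidence) loss criterion described above can be sketched as a per-cell vote count; the tiny grids of suitability-loss votes below are invented:

```python
# Ensemble-agreement sketch for the 95% confidence loss criterion: a cell
# counts as a confident loss only if at least 17 of the 18 suitability maps
# (9 GCMs x 2 SRES) project loss there. The vote grid is invented.
n_scenarios = 18
agreement_threshold = 17

# loss_votes[i][j] = number of scenarios projecting suitability loss at cell (i, j)
loss_votes = [
    [18, 17],
    [9, 2],
]

confident_loss = [[votes >= agreement_threshold for votes in row]
                  for row in loss_votes]
# only near-unanimous cells contribute to the reported loss area
```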

  5. Modeling post-fire sediment yield based on two burn scenarios at the Sooke Lake Reservoir, BC, Canada

    NASA Astrophysics Data System (ADS)

    Dobre, Mariana; Elliot, William J.; Brooks, Erin S.; Smith, Tim

    2016-04-01

    Wildfires can have major adverse effects on municipal water sources. Local governments need methods to evaluate fire risk and to develop mitigation procedures. The Sooke Lake Reservoir is the primary source of water for the city of Victoria, BC, and the concern is that sediment delivered from upland burned areas could have a detrimental impact on the reservoir and the water supply. We conducted a sediment delivery modeling pilot study on a portion of the Sooke Lake Reservoir (specifically, the Trestle Creek Management Unit (TCMU)) to evaluate the potential impacts of wildfire on sediment delivery from hillslopes and sub-catchments. We used a process-based hydrologic and soil erosion model, the Water Erosion Prediction Project geospatial interface (GeoWEPP), to predict sediment delivery from specific return-period design storms for two burn severity scenarios: a real (low-intensity burn) case and a worst (high-intensity burn) case. The GeoWEPP model allows users to simulate streamflow and erosion from hillslope polygons within a watershed. The model requires information on the topographic, soil and vegetative characteristics of each hillslope, and a weather file. WEPP default values and several assumptions were necessary to apply the model where data were missing. Based on a 10-m DEM, we delineated 16 watersheds within the TCMU area. A long-term 100-year daily climate file was generated for this analysis using the CLIGEN model, based on historical observations recorded at Concrete, WA, in the United States, and adjusted for monthly precipitation observed in the Sooke Basin. We ran 100-year simulations and calculated yearly and event-based return periods (for 2, 5, 10, 20, 25, and 50 years) for each of the 16 watersheds.
Overall, WEPP simulations indicate that the storms that are most likely to produce the greatest runoff and sediment load in these coastal, maritime climates with relatively low rainfall intensities are likely to occur in

  6. Projections of high resolution climate changes for South Korea using multiple-regional climate models based on four RCP scenarios. Part 1: surface air temperature

    NASA Astrophysics Data System (ADS)

    Suh, Myoung-Seok; Oh, Seok-Geun; Lee, Young-Suk; Ahn, Joong-Bae; Cha, Dong-Hyun; Lee, Dong-Kyou; Hong, Song-You; Min, Seung-Ki; Park, Seong-Chan; Kang, Hyun-Suk

    2016-05-01

    We projected surface air temperature changes over South Korea during the mid (2026-2050) and late (2076-2100) 21st century against the current climate (1981-2005) using the simulation results from five regional climate models (RCMs) driven by the Hadley Centre Global Environmental Model version 2 coupled Atmosphere-Ocean model (HadGEM2-AO), and two ensemble methods (equal weighted averaging, and weighted averaging based on Taylor's skill score) under four Representative Concentration Pathways (RCP) scenarios. In general, the five RCM ensembles captured the spatial and seasonal variations and the probability distribution of temperature over South Korea reasonably well compared to observations. They showed a particularly good performance in simulating the annual temperature range compared to HadGEM2-AO. In the future simulations, the temperature over South Korea increases significantly for all scenarios and seasons. Stronger warming trends are projected in the late 21st century than in the mid-21st century, in particular under RCP8.5. The five RCM ensembles projected that temperature changes for the mid/late 21st century relative to the current climate are +1.54°C/+1.92°C for RCP2.6, +1.68°C/+2.91°C for RCP4.5, +1.17°C/+3.11°C for RCP6.0, and +1.75°C/+4.73°C for RCP8.5. Compared to the temperature projection of HadGEM2-AO, the five RCM ensembles projected smaller increases in temperature for all RCP scenarios and seasons. The inter-RCM spread is proportional to the simulation period (i.e., larger in the late than in the mid-21st century) and significantly greater (about four times) in winter than in summer for all RCP scenarios. Therefore, modeled predictions of temperature increases during the late 21st century, particularly for winter temperatures, should be used with caution.
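
    The two ensemble methods named above differ only in their weights; a minimal sketch with invented model projections and skill scores (the paper weights by Taylor's skill score, which is not computed here):

```python
# Equal-weight vs skill-weighted ensemble means. The RCM names, projected
# warmings and skill scores below are invented illustration values.
models = {"RCM_A": 2.8, "RCM_B": 3.2, "RCM_C": 3.0}    # projected warming (deg C)
skill = {"RCM_A": 0.90, "RCM_B": 0.60, "RCM_C": 0.75}  # skill scores in [0, 1]

equal_mean = sum(models.values()) / len(models)

total_skill = sum(skill.values())
weighted_mean = sum(models[m] * skill[m] / total_skill for m in models)
# the better-scoring, cooler model pulls the weighted mean below the equal mean
```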

  7. Moral foundations vignettes: a standardized stimulus database of scenarios based on moral foundations theory.

    PubMed

    Clifford, Scott; Iyengar, Vijeth; Cabeza, Roberto; Sinnott-Armstrong, Walter

    2015-12-01

    Research on the emotional, cognitive, and social determinants of moral judgment has surged in recent years. The development of moral foundations theory (MFT) has played an important role, demonstrating the breadth of morality. Moral psychology has responded by investigating how different domains of moral judgment are shaped by a variety of psychological factors. Yet, the discipline lacks a validated set of moral violations that span the moral domain, creating a barrier to investigating influences on judgment and how their neural bases might vary across the moral domain. In this paper, we aim to fill this gap by developing and validating a large set of moral foundations vignettes (MFVs). Each vignette depicts a behavior violating a particular moral foundation and not others. The vignettes are controlled on many dimensions including syntactic structure and complexity making them suitable for neuroimaging research. We demonstrate the validity of our vignettes by examining respondents' classifications of moral violations, conducting exploratory and confirmatory factor analysis, and demonstrating the correspondence between the extracted factors and existing measures of the moral foundations. We expect that the MFVs will be beneficial for a wide variety of behavioral and neuroimaging investigations of moral cognition. PMID:25582811

  8. Moral foundations vignettes: a standardized stimulus database of scenarios based on moral foundations theory

    PubMed Central

    Iyengar, Vijeth; Cabeza, Roberto; Sinnott-Armstrong, Walter

    2016-01-01

    Research on the emotional, cognitive, and social determinants of moral judgment has surged in recent years. The development of moral foundations theory (MFT) has played an important role, demonstrating the breadth of morality. Moral psychology has responded by investigating how different domains of moral judgment are shaped by a variety of psychological factors. Yet, the discipline lacks a validated set of moral violations that span the moral domain, creating a barrier to investigating influences on judgment and how their neural bases might vary across the moral domain. In this paper, we aim to fill this gap by developing and validating a large set of moral foundations vignettes (MFVs). Each vignette depicts a behavior violating a particular moral foundation and not others. The vignettes are controlled on many dimensions including syntactic structure and complexity making them suitable for neuroimaging research. We demonstrate the validity of our vignettes by examining respondents’ classifications of moral violations, conducting exploratory and confirmatory factor analysis, and demonstrating the correspondence between the extracted factors and existing measures of the moral foundations. We expect that the MFVs will be beneficial for a wide variety of behavioral and neuroimaging investigations of moral cognition. PMID:25582811

  9. Moral foundations vignettes: a standardized stimulus database of scenarios based on moral foundations theory.

    PubMed

    Clifford, Scott; Iyengar, Vijeth; Cabeza, Roberto; Sinnott-Armstrong, Walter

    2015-12-01

    Research on the emotional, cognitive, and social determinants of moral judgment has surged in recent years. The development of moral foundations theory (MFT) has played an important role, demonstrating the breadth of morality. Moral psychology has responded by investigating how different domains of moral judgment are shaped by a variety of psychological factors. Yet, the discipline lacks a validated set of moral violations that span the moral domain, creating a barrier to investigating influences on judgment and how their neural bases might vary across the moral domain. In this paper, we aim to fill this gap by developing and validating a large set of moral foundations vignettes (MFVs). Each vignette depicts a behavior violating a particular moral foundation and not others. The vignettes are controlled on many dimensions including syntactic structure and complexity making them suitable for neuroimaging research. We demonstrate the validity of our vignettes by examining respondents' classifications of moral violations, conducting exploratory and confirmatory factor analysis, and demonstrating the correspondence between the extracted factors and existing measures of the moral foundations. We expect that the MFVs will be beneficial for a wide variety of behavioral and neuroimaging investigations of moral cognition.

  10. Scenario-Based Validation of Moderate Resolution DEMs Freely Available for Complex Himalayan Terrain

    NASA Astrophysics Data System (ADS)

    Singh, Mritunjay Kumar; Gupta, R. D.; Snehmani; Bhardwaj, Anshuman; Ganju, Ashwagosha

    2016-02-01

    Accuracy of the Digital Elevation Model (DEM) affects the accuracy of various geoscience and environmental modelling results. This study evaluates accuracies of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global DEM Version-2 (GDEM V2), the Shuttle Radar Topography Mission (SRTM) X-band DEM and the NRSC Cartosat-1 DEM V1 (CartoDEM). A high resolution (1 m) photogrammetric DEM (ADS80 DEM), having a high absolute accuracy [1.60 m linear error at 90 % confidence (LE90)], resampled at 30 m cell size was used as reference. The overall root mean square error (RMSE) in vertical accuracy was 23, 73, and 166 m and the LE90 was 36, 75, and 256 m for ASTER GDEM V2, SRTM X-band DEM and CartoDEM, respectively. A detailed error analysis was performed for individual classes as well as combinations of different classes of aspect, slope, land-cover and elevation zones for the study area. For the ASTER GDEM V2, forest areas with North-facing slopes (0°-5°) in the 4th elevation zone (3773-4369 m) showed minimum LE90 of 0.99 m, and barren areas with East-facing slopes (>60°) falling under the 2nd elevation zone (2581-3177 m) showed maximum LE90 of 166 m. For the SRTM DEM, pixels with South-East-facing slopes of 0°-5° in the 4th elevation zone covered with forest showed least LE90 of 0.33 m, and maximum LE90 of 521 m was observed in the barren area with North-East-facing slopes (>60°) in the 4th elevation zone. In case of the CartoDEM, the snow pixels in the 2nd elevation zone with South-East-facing slopes of 5°-15° showed least LE90 of 0.71 m, and maximum LE90 of 1266 m was observed for the snow pixels in the 3rd elevation zone (3177-3773 m) within the South-facing slopes of 45°-60°. These results can be highly useful for researchers using DEM products in various modelling exercises.
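
    The two headline statistics above can be reproduced from a set of elevation differences. A minimal Python sketch, using hypothetical elevations rather than the study's DEM cells, and taking LE90 as the 90th-percentile absolute error (the paper's exact estimator may differ):

```python
import math

def vertical_accuracy(dem, reference):
    """Return (RMSE, LE90) of DEM elevations against a reference DEM.

    LE90 is taken here as the nearest-rank 90th-percentile absolute error;
    the elevations below are illustrative, not the study's data.
    """
    errors = [d - r for d, r in zip(dem, reference)]
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    abs_sorted = sorted(abs(e) for e in errors)
    idx = max(0, math.ceil(0.9 * len(abs_sorted)) - 1)  # nearest-rank percentile
    le90 = abs_sorted[idx]
    return rmse, le90

rmse, le90 = vertical_accuracy([105.0, 98.0, 120.0, 87.0, 101.0],
                               [100.0, 100.0, 115.0, 90.0, 100.0])
```

    With the sample values the RMSE is about 3.58 m and the LE90 is 5.0 m; on real DEMs the same computation would be run per aspect/slope/land-cover class.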

  11. Performance of regional climate models in simulations of present-day Irish climate: Implications for constructing future scenarios

    NASA Astrophysics Data System (ADS)

    Foley, A. M.

    2009-04-01

    Simulations of present-day (1961-1990) climate from 19 regional climate model experiments have been compared to the observed baseline climate for Ireland. These simulations, driven by global climate models, are obtained through the EC PRUDENCE (Prediction of Regional scenarios and Uncertainties for Defining EuropeaN Climate change risks and Effects) project. The ability to represent the statistics of Irish climate has been assessed, both temporally (comparisons of meteorological year, seasonal mean and seasonal variance) and spatially (seasonal covariation and pattern analysis) for two key meteorological parameters, temperature and precipitation. For the average meteorological year (30-year averages of each month), mean temperatures are found to be within 1.5 °C of observations, except in winter, when temperatures are overestimated by up to 2.5 °C. Temporal variation is also not well represented by some models in winter. Conversely, temporal variation in precipitation is most poorly simulated in summer. Seasonal variance is the area in which the greatest inter-model variability is shown. The ratio of observed variance to modeled variance ranges from weak (0.5 or less) to very strong (greater than 0.7) in both seasons, and for both parameters. Spatially, temperature is overestimated throughout Ireland, by up to 4.6 °C in winter and 2.7 °C in summer in individual grid cells from some models. Precipitation is found to be both under- and over-estimated, with grid cell biases ranging from -2.5 mm/day to 4.2 mm/day in winter and from -1.2 mm/day to 1.8 mm/day in summer. While skill at representing the spatial precipitation pattern is found to be very strong in 16 out of 19 experiments in winter, only 2 of those experiments show the same level of skill in summer. Errors are identified in all individual models, both systematic and random. While using an ensemble average overcomes some of these deficiencies, the optimal approach is to correct systematic errors
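
    The two temporal comparisons described above, seasonal mean bias and the observed-to-modeled variance ratio, reduce to simple statistics over monthly series. A small illustration with made-up values (not PRUDENCE output):

```python
from statistics import mean, pvariance

def seasonal_bias_and_variance_ratio(modeled, observed):
    """Mean bias (model minus observation) and observed/modeled variance
    ratio for one season's monthly series; values are illustrative only."""
    bias = mean(modeled) - mean(observed)
    ratio = pvariance(observed) / pvariance(modeled)
    return bias, ratio

bias, ratio = seasonal_bias_and_variance_ratio(
    modeled=[6.1, 7.0, 8.4],   # hypothetical winter monthly mean temps, °C
    observed=[4.2, 5.1, 6.0])
```

    A positive bias corresponds to the winter warm bias reported above; a variance ratio well below 1 indicates the model overstates month-to-month variability.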

  12. Off-Nominal Performance of the International Space Station Solar Array Wings Under Orbital Eclipse Lighting Scenarios

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Scheiman, David A.

    2005-01-01

    This paper documents testing and analyses to quantify International Space Station (ISS) Solar Array Wing (SAW) string electrical performance under highly off-nominal, low-temperature-low-intensity (LILT) operating conditions with nonsolar light sources. This work is relevant for assessing feasibility and risks associated with a Sequential Shunt Unit (SSU) remove and replace (R&R) Extravehicular Activity (EVA). During eclipse, SAW strings can be energized by moonlight, EVA suit helmet lights or video camera lights. To quantify SAW performance under these off-nominal conditions, solar cell performance testing was performed using full moon, solar simulator and Video Camera Luminaire (VCL) light sources. Test conditions included 25 to 110 °C temperatures and 1- to 0.0001-Sun illumination intensities. Electrical performance data and calculated eclipse lighting intensities were combined to predict SAW current-voltage output for comparison with electrical hazard thresholds. Worst case predictions show there is no connector pin molten metal hazard, but crew shock hazard limits are exceeded due to VCL illumination. Assessment uncertainties and limitations are discussed along with operational solutions to mitigate SAW electrical hazards from VCL illumination. Results from a preliminary assessment of SAW arcing are also discussed. The authors recommend further analyses once SSU, R&R, and EVA procedures are better defined.

  13. Assessment of vulnerability to future marine processes of urbanized coastal environments by a GIS-based approach: expected scenario in the metropolitan area of Bari (Italy)

    NASA Astrophysics Data System (ADS)

    Mancini, F.; Ceppi, C.; Christopulos, V.

    2013-12-01

    The literature on risk assessment procedures for extreme meteorological events generally focuses on establishing relationships between actual severe weather conditions and the impacts detected over the affected zones. Such events are classified either on the basis of measurements and observations that assess the magnitude of the phenomena or on the basis of their effects on the affected area, the latter being closely connected with overall physical vulnerability. However, such assessments almost never consider scenarios for expected extreme events and possible patterns of urbanization at the time of impact, nor do they take into account the spatial and temporal uncertainty of the phenomena. Drawing future scenarios of coastal vulnerability to marine processes is therefore difficult. This work focuses on the case study of the Metropoli Terra di Bari (metropolitan area of Bari, Apulia, Italy), where a coastal vulnerability analysis was carried out for the climate changes expected on the basis of expert opinion from the scientific community. Several possible impacts on the coastal environment were considered, in particular sea level rise inundation, flooding due to storm surge, and coastal erosion. For this purpose, the methodology based on the SRES (Special Report on Emissions Scenarios) produced by the IPCC (Intergovernmental Panel on Climate Change) was adopted, after a regionalization procedure as carried out by Verburgh and others (2006) at the European scale. The open source software SLEUTH, based on the cellular automaton principle, was used, and the reliability of the obtained scenarios was verified through the Monte Carlo method. Once these scenarios were produced, a GIS-based multicriteria methodology was implemented to evaluate the vulnerability of the urbanized coastal area of interest. Several related vulnerability maps are therefore available for the different scenarios, able to account for the degree of hazard and the potential development of the typology and extent
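
    A GIS-based multicriteria evaluation of this kind is, at its core, a weighted overlay of normalized hazard layers computed cell by cell. A toy sketch; the layer values and weights below are invented, not the study's criteria:

```python
def vulnerability_index(layers, weights):
    """Weighted-sum multicriteria overlay: for each cell, combine the
    normalized (0-1) hazard scores of all layers with criterion weights.
    Layers and weights here are hypothetical."""
    return [sum(w * score for w, score in zip(weights, cell_scores))
            for cell_scores in zip(*layers)]

# per-cell scores for three hazards: sea-level-rise inundation,
# storm-surge flooding, coastal erosion (three cells each)
v = vulnerability_index(
    [[0.2, 0.8, 0.5],
     [0.1, 0.9, 0.4],
     [0.0, 0.6, 0.7]],
    weights=[0.4, 0.35, 0.25])
```

    The resulting per-cell index can then be classed and mapped, one map per scenario, as described above.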

  14. Material Performance of Fully-Ceramic Micro-Encapsulated Fuel under Selected LWR Design Basis Scenarios: Final Report

    SciTech Connect

    B. Boer; R. S. Sen; M. A. Pope; A. M. Ougouag

    2011-09-01

    The extension to LWRs of the use of Deep-Burn coated particle fuel envisaged for HTRs has been investigated. TRISO coated fuel particles are used in Fully-Ceramic Microencapsulated (FCM) fuel within a SiC matrix rather than the graphite of HTRs. TRISO particles are well characterized for uranium-fueled HTRs. However, operating conditions of LWRs are different from those of HTRs (temperature, neutron energy spectrum, fast fluence levels, power density). Furthermore, the time scales of transient core behavior during accidents are usually much shorter and thus more severe in LWRs. The PASTA code was updated for analysis of stresses in coated particle FCM fuel. The code extensions enable the automatic use of neutronic data (burnup, fast fluence as a function of irradiation time) obtained using the DRAGON neutronics code. An input option for automatic evaluation of temperature rise during anticipated transients was also added. A new thermal model for FCM was incorporated into the code, as were updated correlations (for pyrocarbon coating layers) suitable for estimating dimensional changes at the high fluence levels attained in LWR DB fuel. Analyses of the FCM fuel using the updated PASTA code under nominal and accident conditions show: (1) Stress levels in SiC coatings are low for low fission gas release (FGR) fractions of several percent, as based on data of fission gas diffusion in UO₂ kernels. However, the high burnup level of LWR-DB fuel implies that the FGR fraction is more likely to be in the range of 50-100%, similar to Inert Matrix Fuels (IMFs). For this range the predicted stresses and failure fractions of the SiC coating are high for the reference particle design (500 µm kernel diameter, 100 µm buffer, 35 µm IPyC, 35 µm SiC, 40 µm OPyC). A conservative case, assuming 100% FGR, 900 K fuel temperature and 705 MWd/kg (77% FIMA) fuel burnup, results in a 8.0 × 10⁻² failure probability. 
For a 'best-estimate' FGR fraction

  15. Perspectives on Performance-Based Incentive Plans.

    ERIC Educational Resources Information Center

    Duttweiler, Patricia Cloud; Ramos-Cancel, Maria L.

    This document is a synthesis of the current literature on performance-based incentive systems for teachers and administrators. Section one provides an introduction to the reform movement and to performance-based pay initiatives; a definition of terms; a brief discussion of funding sources; a discussion of compensation strategies; a description of…

  16. TAP 2: Performance-Based Training Manual

    SciTech Connect

    Not Available

    1993-08-01

    The cornerstone of safe operation of DOE nuclear facilities is personnel performing the day-to-day functions that accomplish the facility mission, and performance-based training is fundamental to that safe operation. This manual has been developed to support the Training Accreditation Program (TAP) and to assist contractors in their efforts to develop performance-based training programs. It provides contractors with narrative procedures on performance-based training that can be modified and incorporated for facility-specific application. It is divided into sections dealing with analysis, design, development, implementation, and evaluation.

  17. Alberta's Performance-Based Funding Mechanism.

    ERIC Educational Resources Information Center

    Barnetson, Bob

    This paper provides an overview of the performance indicator-based accountability and funding mechanism implemented in the higher education system of Alberta, Canada. The paper defines the terms accountability and regulation, examines the use of performance indicators to demonstrate accountability, and explains how performance indicator-based…

  18. Launching a performance-based pay plan.

    PubMed

    Berger, S; Moyer, J

    1991-08-19

    Performance-based compensation is increasingly replacing the annual bonus as hospitals seek ways to motivate their management. Two Ernst & Young authorities outline how to establish the incentive approach and put the performance measures in place. In the process, the performance goals should communicate what's important to the organization.

  19. Risk-based decision making for staggered bioterrorist attacks : resource allocation and risk reduction in "reload" scenarios.

    SciTech Connect

    Lemaster, Michelle Nicole; Gay, David M.; Ehlen, Mark Andrew; Boggs, Paul T.; Ray, Jaideep

    2009-10-01

    Staggered bioterrorist attacks with aerosolized pathogens on population centers present a formidable challenge to resource allocation and response planning. Response and planning must commence immediately after the detection of the first attack, with little or no information about the second attack. In this report, we outline a method by which resource allocation may be performed. It involves probabilistic reconstruction of the bioterrorist attack from partial observations of the outbreak, followed by an optimization-under-uncertainty approach to perform resource allocation. We consider both single-site and time-staggered multi-site attacks (i.e., a reload scenario) under conditions when resources (personnel and equipment that are difficult to gather and transport) are insufficient. Both communicable (plague) and non-communicable (anthrax) diseases are addressed, and we also consider cases when the data, the time series of people reporting with symptoms, are confounded by a reporting delay. We demonstrate how our approach develops allocation profiles that have the potential to reduce the probability of an extremely adverse outcome in exchange for a more certain, but less adverse, outcome. We explore the effect of placing limits on daily allocations. Further, since our method is data-driven, the resource allocation progressively improves as more data become available.
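
    One very reduced reading of "optimization under uncertainty" here is to weight each site's demand by the probability of the scenarios that produce it and split the scarce resource proportionally. A toy sketch only; the report's actual method is a full stochastic optimization, and all numbers below are hypothetical:

```python
def allocate(total, scenario_probs, demands):
    """Split a scarce resource across sites in proportion to
    probability-weighted demand.

    scenario_probs: probability of each attack scenario.
    demands: per-scenario list of per-site demands (same site order).
    All values are illustrative.
    """
    n_sites = len(demands[0])
    weighted = [sum(p * d[i] for p, d in zip(scenario_probs, demands))
                for i in range(n_sites)]
    scale = total / sum(weighted)  # exhaust the available resource
    return [w * scale for w in weighted]

# two scenarios (second attack hits site B with probability 0.3), two sites
alloc = allocate(100.0, [0.7, 0.3], [[80.0, 20.0], [40.0, 60.0]])
```

    As the outbreak reconstruction sharpens the scenario probabilities, re-running the allocation shifts resources toward the more likely attack pattern, mirroring the data-driven improvement noted above.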

  20. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    NASA Astrophysics Data System (ADS)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-09-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
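
    The core epistemic move described above, compensating for minimal field data with prior belief, can be illustrated by the simplest Bayesian update for a single event probability, a Beta-Binomial model. The prior and counts below are hypothetical, not the case study's:

```python
def beta_posterior_mean(alpha, beta, failures, trials):
    """Posterior mean of an event probability under a Beta(alpha, beta)
    prior updated with binomial field data: a minimal example of an
    epistemic estimate when observations are scarce."""
    return (alpha + failures) / (alpha + beta + trials)

# weak prior belief of ~10% failure per lift, updated with 1 failure
# observed in 20 lifting operations (illustrative counts)
p = beta_posterior_mean(alpha=1.0, beta=9.0, failures=1, trials=20)
```

    In a Bayesian network, each node's conditional probability can be estimated this way and then propagated through the event-tree structure.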

  1. A physiologically based toxicokinetic model for dietary uptake of hydrophobic organic compounds by fish: II. simulation of chronic exposure scenarios.

    PubMed

    Nichols, John W; Fitzsimmons, Patrick N; Whiteman, Frank W

    2004-02-01

    A physiologically based toxicokinetic (PBTK) model for dietary uptake of hydrophobic organic compounds by fish was used to simulate dosing scenarios commonly encountered in experimental and field studies. Simulations were initially generated for the model compound [UL-(14)C] 2,2',5,5'-tetrachlorobiphenyl ([(14)C] PCB 52). Steady-state exposures were simulated by calculating chemical concentrations in tissues of the predator corresponding to an equilibrium distribution between the fish and water (termed the bioconcentration or BCF residue data set). This residue data set was then varied in a proportional manner until whole-body chemical concentrations exhibited no net change for each set of exposure conditions. For [(14)C] PCB 52, the proportional increase in BCF residues (termed the biomagnification factor or BMF) required to achieve steady state in a food-only exposure was 2.24, while in a combined food and water exposure the BMF was 3.11. Additional simulations for the food and water exposure scenario were obtained for a set of hypothetical organic compounds with increasing log K(OW) values. Using gut permeability coefficients determined for [(14)C] PCB 52, predicted BMFs increased with chemical log K(OW), achieving levels much higher than those reported in field sampling efforts. BMFs comparable to measured values were obtained by reducing permeability coefficients within each gut segment in a log K(OW)-dependent manner. This predicted decrease in chemical permeability is consistent with earlier work, suggesting that dietary absorption of hydrophobic compounds by fish is controlled in part by factors that vary with chemical log K(OW). Relatively low rates of metabolism or growth were shown to have a substantial impact on steady-state biomagnification of chemical residues. PMID:14657516
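
    In the simplest one-compartment view, the biomagnification factor discussed above is a ratio of uptake to total loss rate constants, which also shows why even modest growth or metabolism depresses steady-state residues. A sketch with illustrative rate constants, not the PBTK model's parameters:

```python
def steady_state_bmf(k_diet, k_elim, k_growth=0.0, k_met=0.0):
    """Steady-state biomagnification factor for a one-compartment fish:
    dC/dt = k_diet*C_food - (k_elim + k_met + k_growth)*C = 0
    => BMF = C_fish/C_food = k_diet / (k_elim + k_met + k_growth).
    Rate constants (per day) are hypothetical."""
    return k_diet / (k_elim + k_met + k_growth)

bmf_no_growth = steady_state_bmf(k_diet=0.02, k_elim=0.008)
bmf_growth = steady_state_bmf(k_diet=0.02, k_elim=0.008, k_growth=0.004)
```

    Adding a growth term to the denominator lowers the BMF, the "growth dilution" effect the abstract points to.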

  2. Quantitative evaluation of lake eutrophication responses under alternative water diversion scenarios: a water quality modeling based statistical analysis approach.

    PubMed

    Liu, Yong; Wang, Yilin; Sheng, Hu; Dong, Feifei; Zou, Rui; Zhao, Lei; Guo, Huaicheng; Zhu, Xiang; He, Bin

    2014-01-15

    China is confronting the challenge of accelerated lake eutrophication, among which Lake Dianchi is considered the most serious case. Eutrophication control for Lake Dianchi began in the mid-1980s. However, decision makers have been puzzled by the lack of visible water quality response to past efforts given the tremendous investment. Therefore, decision makers desperately need a scientifically sound way to quantitatively evaluate the response of lake water quality to proposed management measures and engineering works. We used a water quality modeling based scenario analysis approach to quantitatively evaluate the eutrophication responses of Lake Dianchi to an under-construction water diversion project. The primary analytic framework was built on a three-dimensional hydrodynamic, nutrient fate and transport, and algae dynamics model, which had previously been calibrated and validated using historical data. We designed 16 scenarios to analyze the water quality effects of three driving forces: watershed nutrient loading, variations in diverted inflow water, and lake water level. A two-step statistical analysis consisting of an orthogonal test analysis and linear regression was then conducted to distinguish the contributions of the various driving forces to lake water quality. The analysis results show that (a) the different ways of managing the diversion project would result in different water quality responses in Lake Dianchi, though the differences do not appear to be significant; (b) the maximum reductions in annual average and peak Chl-a concentration from the various ways of operating the diversion project are 11% and 5%, respectively; (c) a combined 66% watershed load reduction and water diversion can reduce the lake hypoxia volume percentage from the existing 6.82% to 3.00%; and (d) the water diversion will decrease the occurrence of algal blooms, and the effect of algae reduction can be enhanced if diverted water is seasonally allocated such that wet
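
    The "orthogonal test analysis" step amounts to computing main effects from the scenario design matrix: the mean response at a factor's high level minus the mean at its low level. A toy illustration with an invented scenario table, not Lake Dianchi model output:

```python
from statistics import mean

def main_effect(levels, responses):
    """Main effect of one factor in an orthogonal test: mean response at
    the factor's high level (1) minus mean at its low level (0)."""
    hi = mean(r for lv, r in zip(levels, responses) if lv == 1)
    lo = mean(r for lv, r in zip(levels, responses) if lv == 0)
    return hi - lo

# hypothetical factor: watershed load reduction (1 = 66% cut, 0 = none);
# hypothetical response: peak Chl-a concentration per scenario (mg/m^3)
effect = main_effect([0, 1, 0, 1], [52.0, 44.0, 50.0, 46.0])
```

    A negative effect means the factor's high level lowers peak Chl-a; comparing effect magnitudes across factors is what distinguishes the drivers' contributions.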

  3. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    NASA Astrophysics Data System (ADS)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-07-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.

  4. Quantitative evaluation of lake eutrophication responses under alternative water diversion scenarios: a water quality modeling based statistical analysis approach.

    PubMed

    Liu, Yong; Wang, Yilin; Sheng, Hu; Dong, Feifei; Zou, Rui; Zhao, Lei; Guo, Huaicheng; Zhu, Xiang; He, Bin

    2014-01-15

    China is confronting the challenge of accelerated lake eutrophication, among which Lake Dianchi is considered the most serious case. Eutrophication control for Lake Dianchi began in the mid-1980s. However, decision makers have been puzzled by the lack of visible water quality response to past efforts given the tremendous investment. Therefore, decision makers desperately need a scientifically sound way to quantitatively evaluate the response of lake water quality to proposed management measures and engineering works. We used a water quality modeling based scenario analysis approach to quantitatively evaluate the eutrophication responses of Lake Dianchi to an under-construction water diversion project. The primary analytic framework was built on a three-dimensional hydrodynamic, nutrient fate and transport, and algae dynamics model, which had previously been calibrated and validated using historical data. We designed 16 scenarios to analyze the water quality effects of three driving forces: watershed nutrient loading, variations in diverted inflow water, and lake water level. A two-step statistical analysis consisting of an orthogonal test analysis and linear regression was then conducted to distinguish the contributions of the various driving forces to lake water quality. The analysis results show that (a) the different ways of managing the diversion project would result in different water quality responses in Lake Dianchi, though the differences do not appear to be significant; (b) the maximum reductions in annual average and peak Chl-a concentration from the various ways of operating the diversion project are 11% and 5%, respectively; (c) a combined 66% watershed load reduction and water diversion can reduce the lake hypoxia volume percentage from the existing 6.82% to 3.00%; and (d) the water diversion will decrease the occurrence of algal blooms, and the effect of algae reduction can be enhanced if diverted water is seasonally allocated such that wet

  5. Assessment of H.264 video compression on automated face recognition performance in surveillance and mobile video scenarios

    NASA Astrophysics Data System (ADS)

    Klare, Brendan; Burge, Mark

    2010-04-01

    We assess the impact of the H.264 video codec on the match performance of automated face recognition in surveillance and mobile video applications. A set of two hundred access control (90 pixel inter-pupillary distance) and distance surveillance (45 pixel inter-pupillary distance) videos taken under non-ideal imaging and facial recognition (e.g., pose, illumination, and expression) conditions were matched using two commercial face recognition engines in the studies. The first study evaluated automated face recognition performance on access control and distance surveillance videos at CIF and VGA resolutions using the H.264 baseline profile at nine bitrates ranging from 8 kbps to 2048 kbps. In our experiments, video signals could be compressed to as low as 128 kbps before a significant drop in face recognition performance occurred. The second study evaluated automated face recognition on mobile devices at QCIF, iPhone, and Android resolutions for each of the H.264 PDA profiles. Rank-one match performance, cumulative match scores, and failure-to-enroll rates are reported.
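
    The rank-one and cumulative match scores reported in such studies come directly from the rank each probe's true identity receives in the candidate list returned by the matcher. A minimal sketch with fabricated ranks:

```python
def cumulative_match_scores(ranks, max_rank):
    """Cumulative match characteristic: fraction of probes whose true
    match appears at rank <= k, for k = 1..max_rank. The rank list is
    illustrative, not the study's data."""
    n = len(ranks)
    return [sum(1 for r in ranks if r <= k) / n for k in range(1, max_rank + 1)]

# rank of the true identity for eight hypothetical probe videos
cmc = cumulative_match_scores([1, 1, 2, 3, 1, 5, 1, 1], max_rank=3)
rank_one = cmc[0]  # rank-one match performance
```

    Re-computing this curve at each bitrate is how a "significant drop in face recognition performance" past a compression threshold would show up.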

  6. A Semantic Web-Based Authoring Tool to Facilitate the Planning of Collaborative Learning Scenarios Compliant with Learning Theories

    ERIC Educational Resources Information Center

    Isotani, Seiji; Mizoguchi, Riichiro; Isotani, Sadao; Capeli, Olimpio M.; Isotani, Naoko; de Albuquerque, Antonio R. P. L.; Bittencourt, Ig. I.; Jaques, Patricia

    2013-01-01

    When the goal of group activities is to support long-term learning, the task of designing well-thought-out collaborative learning (CL) scenarios is an important key to success. To help students adequately acquire and develop their knowledge and skills, a teacher can plan a scenario that increases the probability for learning to occur. Such a…

  7. Performing Gender: A Discourse Analysis of Theatre-Based Sexual Violence Prevention Programs

    ERIC Educational Resources Information Center

    Iverson, Susan V.

    2006-01-01

    Among the numerous approaches that are employed to prevent sexual violence, the performance of scenarios has become one of the "promising practices" in U.S. postsecondary education. This article describes findings from a pilot study to analyze scripts used for theatre-based sexual violence prevention programs. Employing the method of discourse…

  8. A specific scenario for the origin of life and the genetic code based on peptide/oligonucleotide interdependence.

    PubMed

    Griffith, Robert W

    2009-12-01

    Among various scenarios that attempt to explain how life arose, the RNA world is currently the most widely accepted scientific hypothesis among biologists. However, the RNA world is logistically implausible and doesn't explain how translation arose and DNA became incorporated into living systems. Here I propose an alternative hypothesis for life's origin based on cooperation between simple nucleic acids, peptides and lipids. Organic matter that accumulated on the prebiotic Earth segregated into phases in the ocean based on density and solubility. Synthesis of complex organic monomers and polymerization reactions occurred within a surface hydrophilic layer and at its aqueous and atmospheric interfaces. Replication of nucleic acids and translation of peptides began at the emulsified interface between hydrophobic and aqueous layers. At the core of the protobiont was a family of short nucleic acids bearing arginine's codon and anticodon that added this amino acid to pre-formed peptides. In turn, the survival and replication of nucleic acid was aided by the peptides. The arginine-enriched peptides served to sequester and transfer phosphate bond energy and acted as cohesive agents, aggregating nucleic acids and keeping them at the interface.

  9. A Specific Scenario for the Origin of Life and the Genetic Code Based on Peptide/Oligonucleotide Interdependence

    NASA Astrophysics Data System (ADS)

    Griffith, Robert W.

    2009-12-01

    Among various scenarios that attempt to explain how life arose, the RNA world is currently the most widely accepted scientific hypothesis among biologists. However, the RNA world is logistically implausible and doesn’t explain how translation arose and DNA became incorporated into living systems. Here I propose an alternative hypothesis for life’s origin based on cooperation between simple nucleic acids, peptides and lipids. Organic matter that accumulated on the prebiotic Earth segregated into phases in the ocean based on density and solubility. Synthesis of complex organic monomers and polymerization reactions occurred within a surface hydrophilic layer and at its aqueous and atmospheric interfaces. Replication of nucleic acids and translation of peptides began at the emulsified interface between hydrophobic and aqueous layers. At the core of the protobiont was a family of short nucleic acids bearing arginine’s codon and anticodon that added this amino acid to pre-formed peptides. In turn, the survival and replication of nucleic acid was aided by the peptides. The arginine-enriched peptides served to sequester and transfer phosphate bond energy and acted as cohesive agents, aggregating nucleic acids and keeping them at the interface.

  10. [Impact of demographic change on pharmaceutical expenses in private health insurance--a scenario-based analysis].

    PubMed

    Böcking, W; Tidelski, O; Skuras, B; Bäumler, A; Kitzmann, F

    2012-08-01

    Health insurance costs in Germany have grown constantly in recent years. This increase is observable not only in the aggregate but also in every individual cost item, with an outstanding growth rate in pharmaceutical expenses. Detailed analyses of the distribution and development of these costs, separated by age and indication groups, are currently only sporadically available and mostly focus on the Statutory Health Insurance system in Germany. This research article is based on an initial data analysis and focuses on the question of how pharmaceutical expenses in a German private health insurance company will develop until the year 2050 if the observed trend of the past years continues. This analysis focuses on different age groups. The objective is to demonstrate several scenarios that illustrate the influence of different parameters (demographic changes, price developments for pharmaceuticals). Based on knowledge of these effects, measures for handling the growing challenge of financing the health system can be deduced. As a result, both demographic changes and price effects have a significant impact on the future development of per capita pharmaceutical expenses. Whereas older age groups will still cause the highest costs, middle-aged people will show the highest growth rates. This strong cost increase is not sustainable for the German health insurance system. In addition to previous measures of regulatory health policy (especially improved cost-benefit assessments), the article shows new approaches for intensified prevention and health promotion. PMID:22872541
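The projection logic this abstract describes reduces to per-capita cost times population by age group, compounded by a price-growth factor. A minimal sketch, with all age bands, populations, costs and growth rates invented for illustration (none are from the study):

```python
# Hypothetical sketch of the projection described above: per-capita
# pharmaceutical expenses times population by age group, compounded by an
# assumed annual price-growth rate. All figures below are invented.

def project_expenses(per_capita_by_age, population_by_age, price_growth, years):
    """Total expenses after `years`, if the observed trend continues."""
    price_factor = (1.0 + price_growth) ** years
    return sum(per_capita_by_age[age] * population_by_age[age] * price_factor
               for age in per_capita_by_age)

per_capita = {"20-39": 300.0, "40-59": 600.0, "60+": 1500.0}  # EUR/person/yr
pop_2012   = {"20-39": 2.0e6, "40-59": 2.5e6, "60+": 1.5e6}  # insured persons
pop_2050   = {"20-39": 1.6e6, "40-59": 2.2e6, "60+": 2.4e6}  # ageing shift

base = project_expenses(per_capita, pop_2012, 0.00, 0)
proj = project_expenses(per_capita, pop_2050, 0.02, 38)   # 2012 -> 2050
print(proj / base)  # combined demographic + price effect
```

Separating the two multipliers (population shift vs. price growth) is what lets such a study attribute the total increase to demographic versus price effects.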

  11. Medical Content Searching, Retrieving, and Sharing Over the Internet: Lessons Learned From the mEducator Through a Scenario-Based Evaluation

    PubMed Central

    Spachos, Dimitris; Mylläri, Jarkko; Giordano, Daniela; Dafli, Eleni; Mitsopoulou, Evangelia; Schizas, Christos N; Pattichis, Constantinos; Nikolaidou, Maria

    2015-01-01

    Background The mEducator Best Practice Network (BPN) implemented and extended standards and reference models in e-learning to develop innovative frameworks as well as solutions that enable specialized state-of-the-art medical educational content to be discovered, retrieved, shared, and re-purposed across European Institutions, targeting medical students, doctors, educators and health care professionals. Scenario-based evaluation for usability testing, complemented with data from online questionnaires and field notes of users’ performance, was designed and utilized for the evaluation of these solutions. Objective The objective of this work is twofold: (1) to describe one instantiation of the mEducator BPN solutions (mEducator3.0 - “MEdical Education LINnked Arena” MELINA+) with a focus on the metadata schema used, as well as on other aspects of the system that pertain to usability and acceptance, and (2) to present evaluation results on the suitability of the proposed metadata schema for searching, retrieving, and sharing of medical content and with respect to the overall usability and acceptance of the system from the target users. Methods A comprehensive evaluation methodology framework was developed and applied to four case studies, which were conducted in four different countries (ie, Greece, Cyprus, Bulgaria and Romania), with a total of 126 participants. In these case studies, scenarios referring to creating, sharing, and retrieving medical educational content using mEducator3.0 were used. The data were collected through two online questionnaires, consisting of 36 closed-ended questions and two open-ended questions that referred to mEducator 3.0 and through the use of field notes during scenario-based evaluations. Results The main findings of the study showed that even though the informational needs of the mEducator target groups were addressed to a satisfactory extent and the metadata schema supported content creation, sharing, and retrieval from an end

  12. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project. [Assessment of post-closure performance for a proposed repository for high-level nuclear waste

    SciTech Connect

    Roberds, W.J.; Plum, R.J.; Visca, P.J.

    1984-11-01

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab.
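The aggregation step described here, combining screened scenarios into a system-level risk figure, is in essence a probability-weighted sum. A toy sketch, assuming a simple probability cutoff and invented scenario values (this is not the BWIP methodology itself):

```python
# Illustrative sketch (not the BWIP methodology itself): screen candidate
# scenarios against a credibility cutoff, then aggregate probability-
# weighted consequences into a single system risk figure. Invented values.

scenarios = [
    {"name": "nominal case",    "p": 0.90, "consequence": 1.0},
    {"name": "fault movement",  "p": 0.05, "consequence": 20.0},
    {"name": "human intrusion", "p": 0.04, "consequence": 50.0},
    {"name": "meteorite hit",   "p": 1e-9, "consequence": 1e4},  # screened out
]

CUTOFF = 1e-6  # assumed credibility threshold for illustration
credible = [s for s in scenarios if s["p"] >= CUTOFF]
risk = sum(s["p"] * s["consequence"] for s in credible)
print(len(credible), risk)
```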

  13. Using an Animated Case Scenario Based on Constructivist 5E Model to Enhance Pre-Service Teachers' Awareness of Electrical Safety

    ERIC Educational Resources Information Center

    Hirca, Necati

    2013-01-01

    The objective of this study is to get pre-service teachers to develop an awareness of first aid knowledge and skills related to electric shock and safety within a scenario-based animation based on a Constructivist 5E model. The sample of the study was composed of 78 (46 girls and 32 boys) pre-service classroom teachers from two faculties of…

  14. Performance Based Education: A Social Alchemy.

    ERIC Educational Resources Information Center

    Clements, Millard

    1982-01-01

    An exploration of performance-based education is focused through these questions: What image of human beings does it project? What image of professionals does it project? What purpose does it serve? What image of knowledge does it project? (CT)

  15. Moving from pixels to parcels: Modeling agricultural scenarios in the northern Great Plains using a hybrid raster- and vector-based approach

    NASA Astrophysics Data System (ADS)

    Sohl, T.; Wika, S.; Dornbierer, J.; Sayler, K. L.; Quenzer, R.

    2015-12-01

    Policy and economic driving forces have resulted in a higher demand for biofuel feedstocks in recent years, resulting in substantial increases in cultivated cropland in the northern Great Plains. A cellulosic-based biofuel industry could potentially further impact the region, with grassland and marginal agricultural land converted to perennial grasses or other feedstocks. Scenarios of projected land-use change are needed to enable regional stakeholders to plan for the potential consequences of expanded agricultural activity. Land-use models used to produce spatially explicit scenarios are typically raster-based and are poor at representing ownership units on which land-use change is based. This work describes a hybrid raster/vector-based modeling approach for modeling scenarios of agricultural change in the northern Great Plains. Regional scenarios of agricultural change from 2012 to 2050 were constructed, based partly on the U.S. Department of Energy's Billion Ton Update. Land-use data built from the 2012 Cropland Data Layer and the 2011 National Land Cover Database was used to establish initial conditions. Field boundaries from the U.S. Department of Agriculture's Common Land Unit dataset were used to establish ownership units. A modified version of the U.S. Geological Survey's Forecasting Scenarios of land-use (FORE-SCE) model was used to ingest vector-based field boundaries to facilitate the modeling of a farmer's choice of land use for a given year, while patch-based raster methodologies were used to represent expansion of urban/developed lands and other land use conversions. All modeled data were merged to a common raster dataset representing annual land use from 2012 to 2050. The hybrid modeling approach enabled the use of traditional, raster-based methods while integrating vector-based data to represent agricultural fields and other ownership-based units upon which land-use decisions are typically made.
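The core of the hybrid approach, making one land-use decision per ownership unit and writing it to every raster cell of that unit, can be sketched with a field-ID grid. A minimal illustration with invented codes (not the FORE-SCE implementation):

```python
import numpy as np

# Simplified sketch of the hybrid idea: a raster of ownership-unit (field)
# IDs alongside the land-use raster. A land-use decision is made once per
# field and written to every cell of that field, so parcels convert as
# whole units rather than cell by cell. Codes and layout are invented.
GRASS, CROP = 1, 2
field_id = np.array([[1, 1, 2],
                     [1, 1, 2],
                     [3, 3, 2]])
landuse = np.full(field_id.shape, GRASS)

rng = np.random.default_rng(0)
for fid in np.unique(field_id):
    if rng.random() < 0.5:             # scenario-driven conversion choice
        landuse[field_id == fid] = CROP

# Each field is internally uniform: one decision per ownership unit.
uniform = all(np.unique(landuse[field_id == fid]).size == 1
              for fid in np.unique(field_id))
print(uniform)
```

A purely raster model would instead flip cells independently, producing conversions that cut across field boundaries on which real land-use decisions are made.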

  16. Combined magnetic and kinetic control of advanced tokamak steady state scenarios based on semi-empirical modelling

    NASA Astrophysics Data System (ADS)

    Moreau, D.; Artaud, J. F.; Ferron, J. R.; Holcomb, C. T.; Humphreys, D. A.; Liu, F.; Luce, T. C.; Park, J. M.; Prater, R.; Turco, F.; Walker, M. L.

    2015-06-01

    This paper shows that semi-empirical data-driven models based on a two-time-scale approximation for the magnetic and kinetic control of advanced tokamak (AT) scenarios can be advantageously identified from simulated rather than real data, and used for control design. The method is applied to the combined control of the safety factor profile, q(x), and normalized pressure parameter, βN, using DIII-D parameters and actuators (on-axis co-current neutral beam injection (NBI) power, off-axis co-current NBI power, electron cyclotron current drive power, and ohmic coil). The approximate plasma response model was identified from simulated open-loop data obtained using a rapidly converging plasma transport code, METIS, which includes an MHD equilibrium and current diffusion solver, and combines plasma transport nonlinearity with 0D scaling laws and 1.5D ordinary differential equations. The paper discusses the results of closed-loop METIS simulations, using the near-optimal ARTAEMIS control algorithm (Moreau D et al 2013 Nucl. Fusion 53 063020) for steady state AT operation. With feedforward plus feedback control, the steady state target q-profile and βN are satisfactorily tracked with a time scale of about 10 s, despite large disturbances applied to the feedforward powers and plasma parameters. The robustness of the control algorithm with respect to disturbances of the H&CD actuators and of plasma parameters such as the H-factor, plasma density and effective charge, is also shown.
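The identification step, fitting a data-driven response model to simulated open-loop data, can be illustrated with a generic least-squares fit of a first-order linear model; this is a stand-in sketch, not the ARTAEMIS two-time-scale algorithm:

```python
import numpy as np

# Generic identification sketch (not the ARTAEMIS algorithm): fit a
# first-order linear response x[k+1] = a*x[k] + b*u[k] to "simulated"
# open-loop data by least squares. The true (a, b) are invented.
rng = np.random.default_rng(1)
a_true, b_true = 0.95, 0.4
u = rng.standard_normal(500)
x = np.zeros(501)
for k in range(500):
    x[k + 1] = a_true * x[k] + b_true * u[k] + 1e-3 * rng.standard_normal()

A = np.column_stack([x[:-1], u])             # regressor matrix
(a_est, b_est), *_ = np.linalg.lstsq(A, x[1:], rcond=None)
print(a_est, b_est)
```

Identifying from transport-code output rather than experimental shots, as the paper advocates, means the "data" are cheap, repeatable, and free of diagnostic noise.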

  17. A scenario for solar wind penetration of earth's magnetic tail based on ion composition data from the ISEE 1 spacecraft

    NASA Technical Reports Server (NTRS)

    Lennartsson, W.

    1992-01-01

    Based on He(2+) and H(+) ion composition data from the Plasma Composition Experiment on ISEE 1, a scenario is proposed for the solar wind penetration of the earth's magnetic tail, which does not require that the solar wind plasma be magnetized. While this study does not take issue with the notion that earth's magnetic field merges with the solar wind magnetic field on a regular basis, it focuses on certain aspects of interaction between the solar wind particles and the earth's field, e.g., the fact that the geomagnetic tail always has a plasma sheet, even during times when the physical signs of magnetic merging are weak or absent. It is argued that the solar plasma enters along slots between the tail lobes and the plasma sheet, even quite close to earth, convected inward along the plasma sheet boundary layer or adjacent to it, by the electric fringe field of the ever present low-latitude magnetopause boundary layer (LLBL). The required E x B drifts are produced by closing LLBL equipotential surfaces through the plasma sheet.
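The convection invoked at the end of the abstract is the standard E x B drift, v = (E x B)/|B|^2. A quick numerical check with illustrative (assumed) field magnitudes:

```python
import numpy as np

# The convection invoked above is the E x B drift, v = (E x B) / |B|^2.
# Field magnitudes are illustrative assumptions, not measured values.
E = np.array([0.0, 1.0e-3, 0.0])   # V/m
B = np.array([0.0, 0.0, 2.0e-8])   # T
v = np.cross(E, B) / np.dot(B, B)  # m/s, perpendicular to both E and B
print(v)
```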

  18. Variability of tsunami inundation footprints considering stochastic scenarios based on a single rupture model: Application to the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Goda, Katsuichiro; Yasuda, Tomohiro; Mori, Nobuhito; Mai, P. Martin

    2015-06-01

    The sensitivity and variability of spatial tsunami inundation footprints in coastal cities and towns due to a megathrust subduction earthquake in the Tohoku region of Japan are investigated by considering different fault geometry and slip distributions. Stochastic tsunami scenarios are generated based on the spectral analysis and synthesis method with regard to an inverted source model. To assess spatial inundation processes accurately, tsunami modeling is conducted using bathymetry and elevation data with 50 m grid resolutions. Using the developed methodology for assessing variability of tsunami hazard estimates, stochastic inundation depth maps can be generated for local coastal communities. These maps are important for improving disaster preparedness by understanding the consequences of different situations/conditions, and by communicating uncertainty associated with hazard predictions. The analysis indicates that the sensitivity of inundation areas to the geometrical parameters (i.e., top-edge depth, strike, and dip) depends on the tsunami source characteristics and the site location, and is therefore complex and highly nonlinear. The variability assessment of inundation footprints indicates a significant influence of slip distributions. In particular, topographical features of the region, such as ria coast and near-shore plain, have a major influence on the tsunami inundation footprints.
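The "spectral analysis and synthesis" generation of stochastic scenarios can be sketched as Fourier phase randomization: keep the amplitude spectrum of the inverted slip model and draw random phases. A toy version only; the published method imposes further constraints on the slip statistics:

```python
import numpy as np

# Toy version of "spectral analysis and synthesis": keep the amplitude
# spectrum of an inverted slip model, randomize the Fourier phases, and
# invert back to obtain a stochastic slip realization with similar
# spectral content. The slip field here is a random stand-in.
rng = np.random.default_rng(42)
slip = rng.random((32, 32))                  # stand-in for inverted model

F = np.fft.rfft2(slip)
phase = np.exp(2j * np.pi * rng.random(F.shape))
realization = np.fft.irfft2(np.abs(F) * phase, s=slip.shape)
realization -= realization.min()             # keep slip non-negative
print(realization.shape)
```

Each random phase draw yields a new scenario, which is what allows the variability of the resulting inundation footprints to be quantified.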

  19. Preliminary Safety Analysis of the Gorleben Site: Safety Concept and Application to Scenario Development Based on a Site-Specific Features, Events and Processes (FEP) Database - 13304

    SciTech Connect

    Moenig, Joerg; Beuth, Thomas; Wolf, Jens; Lommerzheim, Andre; Mrugalla, Sabine

    2013-07-01

    Based upon the German safety criteria, released in 2010 by the Federal Ministry of the Environment (BMU), a safety concept and a safety assessment concept for the disposal of heat-generating high-level waste have both been developed in the framework of the preliminary safety case for the Gorleben site (Project VSG). The main objective of the disposal is to contain the radioactive waste inside a defined rock zone, which is called the containment-providing rock zone. The radionuclides shall remain essentially at the emplacement site, and at the most, a small defined quantity of material shall be able to leave this rock zone. This shall be accomplished by the geological barrier and a technical barrier system, which is required to seal the inevitable penetration of the geological barrier by the construction of the mine. The safe containment has to be demonstrated for probable and less probable evolutions of the site, while evolutions with very low probability (less than 1 % over the demonstration period of 1 million years) need not be considered. Owing to the uncertainty in predicting the real evolution of the site, plausible scenarios have been derived in a systematic manner. Therefore, a comprehensive site-specific features, events and processes (FEP) database for the Gorleben site has been developed. The safety concept was directly taken into account, e.g. by identification of FEP with direct influence on the barriers that provide the containment. No effort was spared to identify the interactions of the FEP, their probabilities of occurrence, and their characteristics (values). The information stored in the database provided the basis for the development of scenarios. The scenario development methodology is based on FEP related to an impairment of the functionality of a subset of barriers, called initial barriers. By taking these FEP into account in their probable characteristics, the reference scenario is derived. Thus, the reference scenario describes a
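The screening rule quoted above (exclude evolutions below 1 % probability over the 1-million-year demonstration period) can be expressed directly; the FEP entries and annual rates below are invented:

```python
# Direct expression of the quoted screening rule: evolutions with less
# than 1 % probability of occurrence over the 1-million-year demonstration
# period are excluded from scenario development. FEP entries are invented.
DEMONSTRATION_YEARS = 1_000_000
CUTOFF = 0.01

feps = [
    {"fep": "glacial channel erosion", "p_per_year": 1e-7},
    {"fep": "seal degradation",        "p_per_year": 1e-5},
    {"fep": "large meteorite impact",  "p_per_year": 1e-11},
]

def prob_over_period(p_per_year, years=DEMONSTRATION_YEARS):
    # P(at least one occurrence), treating years as independent trials
    return 1.0 - (1.0 - p_per_year) ** years

considered = [f["fep"] for f in feps
              if prob_over_period(f["p_per_year"]) >= CUTOFF]
print(considered)
```

Note how even a very small annual rate (1e-7 per year) survives the cutoff once compounded over a million years, which is why the demonstration period matters as much as the threshold.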

  20. Enhanced Confinement Scenarios Without Large Edge Localized Modes in Tokamaks: Control, Performance, and Extrapolability Issues for ITER

    SciTech Connect

    Maingi, R

    2014-07-01

    Large edge localized modes (ELMs) typically accompany good H-mode confinement in fusion devices, but can present problems for plasma facing components because of high transient heat loads. Here the range of techniques for ELM control deployed in fusion devices is reviewed. The two strategies in the ITER baseline design are emphasized: rapid ELM triggering and peak heat flux control via pellet injection, and the use of magnetic perturbations to suppress or mitigate ELMs. While both of these techniques are moderately well developed, with reasonable physical bases for projecting to ITER, differing observations between multiple devices are also discussed to highlight the needed community R&D. In addition, recent progress in ELM-free regimes, namely Quiescent H-mode, I-mode, and Enhanced Pedestal H-mode, is reviewed, and open questions for extrapolability are discussed. Finally, progress and outstanding issues in alternate ELM control techniques are reviewed: supersonic molecular beam injection, edge electron cyclotron heating, lower hybrid heating and/or current drive, controlled periodic jogs of the vertical centroid position, ELM pace-making via periodic magnetic perturbations, ELM elimination with lithium wall conditioning, and naturally occurring small ELM regimes.

  1. Enhanced confinement scenarios without large edge localized modes in tokamaks: control, performance, and extrapolability issues for ITER

    NASA Astrophysics Data System (ADS)

    Maingi, R.

    2014-11-01

    Large edge localized modes (ELMs) typically accompany good H-mode confinement in fusion devices, but can present problems for plasma facing components because of high transient heat loads. Here the range of techniques for ELM control deployed in fusion devices is reviewed. Two strategies in the ITER baseline design are emphasized: rapid ELM triggering and peak heat flux control via pellet injection, and the use of magnetic perturbations to suppress or mitigate ELMs. While both of these techniques are moderately well developed, with reasonable physical bases for projecting to ITER, differing observations between multiple devices are also discussed to highlight the needed community R&D. In addition, recent progress in ELM-free regimes, namely quiescent H-mode, I-mode, and enhanced pedestal H-mode is reviewed, and open questions for extrapolability are discussed. Finally progress and outstanding issues in alternate ELM control techniques are reviewed: supersonic molecular beam injection, edge electron cyclotron heating, lower hybrid heating and/or current drive, controlled periodic jogs of the vertical centroid position, ELM pace-making via periodic magnetic perturbations, ELM elimination with lithium wall conditioning, and naturally occurring small ELM regimes.

  2. Columnar modelling of nucleation burst evolution in the convective boundary layer - first results from a feasibility study Part III: Preliminary results on physicochemical model performance using two "clean air mass" reference scenarios

    NASA Astrophysics Data System (ADS)

    Hellmuth, O.

    2006-09-01

    In Paper I of four papers, a revised columnar high-order model to investigate gas-aerosol-turbulence interactions in the convective boundary layer (CBL) was proposed. In Paper II, the model capability to predict first-, second- and third-order moments of meteorological variables in the CBL was demonstrated using available observational data. In the present Paper III, the high-order modelling concept is extended to sulphur and ammonia chemistry as well as to aerosol dynamics. Based on the previous CBL simulation, a feasibility study is performed using two "clean air mass" scenarios with an emission source at the ground but low aerosol background concentration. Such scenarios synoptically correspond to the advection of fresh post-frontal air in an anthropogenically influenced region. The aim is to evaluate the time-height evolution of ultrafine condensation nuclei (UCNs) and to elucidate the interactions between meteorological and physicochemical variables in a CBL column. The scenarios differ in the treatment of new particle formation (NPF), whereas homogeneous nucleation according to the classical nucleation theory (CNT) is considered. The first scenario considers nucleation of a binary system consisting of water vapour and sulphuric acid (H2SO4) vapour, the second one nucleation of a ternary system additionally involving ammonia (NH3). Here, the two synthetic scenarios are discussed in detail, whereas special attention is paid to the role of turbulence in the formation of the typical UCN burst behaviour that can often be observed in the surface layer. The intercomparison of the two scenarios reveals large differences in the evolution of the UCN number concentration in the surface layer as well as in the time-height cross-sections of first-order moments and double correlation terms. Although in both cases the occurrence of NPF bursts could be simulated, the burst characteristics and genesis of the bursts are completely different. It is demonstrated that
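For orientation on the CNT rate expressions underlying both scenarios, a single-component sketch (not the binary H2SO4/H2O or ternary NH3 parameterizations actually used): the critical-cluster barrier is dG* = 16*pi*sigma^3*v^2 / (3*(kT ln S)^2), and the nucleation rate scales as exp(-dG*/kT). All parameter values below are assumed:

```python
import math

# Orientation only: single-component classical nucleation theory, not the
# binary H2SO4/H2O or ternary NH3 parameterizations used in the paper.
# Barrier: dG* = 16*pi*sigma^3*v^2 / (3*(k*T*ln S)^2); rate ~ exp(-dG*/kT).
# All parameter values below are assumed, water-like numbers.
k = 1.380649e-23      # J/K
T = 293.0             # K
sigma = 0.072         # N/m, surface tension (assumed)
v = 3.0e-29           # m^3, molecular volume (assumed)
S = 5.0               # saturation ratio (assumed)

dG = 16 * math.pi * sigma**3 * v**2 / (3 * (k * T * math.log(S)) ** 2)
barrier = dG / (k * T)   # dimensionless barrier controlling the rate
print(barrier)
```

Because the barrier enters exponentially, modest turbulence-driven fluctuations in supersaturation S can switch nucleation on and off, which is the mechanism behind the simulated UCN bursts.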

  3. Implementation and Analysis of a Wireless Sensor Network-Based Pet Location Monitoring System for Domestic Scenarios.

    PubMed

    Aguirre, Erik; Lopez-Iturri, Peio; Azpilicueta, Leyre; Astrain, José Javier; Villadangos, Jesús; Santesteban, Daniel; Falcone, Francisco

    2016-08-30

    The flexibility of new age wireless networks and the variety of sensors to measure a high number of variables lead to new scenarios where anything can be monitored by small electronic devices, thereby implementing Wireless Sensor Networks (WSN). Thanks to ZigBee, RFID or WiFi networks the precise location of humans or animals as well as some biological parameters can be known in real-time. However, since wireless sensors must be attached to biological tissues and they are highly dispersive, propagation of electromagnetic waves must be studied to deploy an efficient and well-working network. The main goal of this work is to study the influence of wireless channel limitations in the operation of a specific pet monitoring system, validated at physical channel as well as at functional level. In this sense, radio wave propagation produced by ZigBee devices operating at the ISM 2.4 GHz band is studied through an in-house developed 3D Ray Launching simulation tool, in order to analyze coverage/capacity relations for the optimal system selection as well as deployment strategy in terms of number of transceivers and location. Furthermore, a simplified dog model is developed for simulation code, considering not only its morphology but also its dielectric properties. Relevant wireless channel information such as power distribution, power delay profile and delay spread graphs are obtained providing an extensive wireless channel analysis. A functional dog monitoring system is presented, operating over the implemented ZigBee network and providing real time information to Android based devices. The proposed system can be scaled in order to consider different types of domestic pets as well as new user based functionalities.
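Where full 3D ray launching is unavailable, a first-cut link budget for a 2.4 GHz ZigBee link can use a log-distance path-loss model; the exponent and transmit power below are assumptions, not values from the study:

```python
import math

# First-cut alternative to full 3D ray launching: a log-distance path-loss
# model for a 2.4 GHz ZigBee link. The exponent n and transmit power are
# assumptions for an indoor-like environment, not values from the study.
def path_loss_db(d_m, f_hz=2.4e9, n=2.8, d0=1.0):
    c = 3.0e8
    fspl_d0 = 20 * math.log10(4 * math.pi * d0 * f_hz / c)  # free space at d0
    return fspl_d0 + 10 * n * math.log10(d_m / d0)

tx_dbm = 0.0                        # typical ZigBee transmit power
rx_dbm = tx_dbm - path_loss_db(10.0)
print(rx_dbm)
```

Comparing such an estimate against the receiver sensitivity gives a rough feasible range per transceiver, which is the coverage/capacity trade-off the ray-launching analysis resolves in detail.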

  4. Implementation and Analysis of a Wireless Sensor Network-Based Pet Location Monitoring System for Domestic Scenarios

    PubMed Central

    Aguirre, Erik; Lopez-Iturri, Peio; Azpilicueta, Leyre; Astrain, José Javier; Villadangos, Jesús; Santesteban, Daniel; Falcone, Francisco

    2016-01-01

    The flexibility of new age wireless networks and the variety of sensors to measure a high number of variables lead to new scenarios where anything can be monitored by small electronic devices, thereby implementing Wireless Sensor Networks (WSN). Thanks to ZigBee, RFID or WiFi networks the precise location of humans or animals as well as some biological parameters can be known in real-time. However, since wireless sensors must be attached to biological tissues and they are highly dispersive, propagation of electromagnetic waves must be studied to deploy an efficient and well-working network. The main goal of this work is to study the influence of wireless channel limitations in the operation of a specific pet monitoring system, validated at physical channel as well as at functional level. In this sense, radio wave propagation produced by ZigBee devices operating at the ISM 2.4 GHz band is studied through an in-house developed 3D Ray Launching simulation tool, in order to analyze coverage/capacity relations for the optimal system selection as well as deployment strategy in terms of number of transceivers and location. Furthermore, a simplified dog model is developed for simulation code, considering not only its morphology but also its dielectric properties. Relevant wireless channel information such as power distribution, power delay profile and delay spread graphs are obtained providing an extensive wireless channel analysis. A functional dog monitoring system is presented, operating over the implemented ZigBee network and providing real time information to Android based devices. The proposed system can be scaled in order to consider different types of domestic pets as well as new user based functionalities. PMID:27589751

  5. Implementation and Analysis of a Wireless Sensor Network-Based Pet Location Monitoring System for Domestic Scenarios.

    PubMed

    Aguirre, Erik; Lopez-Iturri, Peio; Azpilicueta, Leyre; Astrain, José Javier; Villadangos, Jesús; Santesteban, Daniel; Falcone, Francisco

    2016-01-01

    The flexibility of new age wireless networks and the variety of sensors to measure a high number of variables lead to new scenarios where anything can be monitored by small electronic devices, thereby implementing Wireless Sensor Networks (WSN). Thanks to ZigBee, RFID or WiFi networks the precise location of humans or animals as well as some biological parameters can be known in real-time. However, since wireless sensors must be attached to biological tissues and they are highly dispersive, propagation of electromagnetic waves must be studied to deploy an efficient and well-working network. The main goal of this work is to study the influence of wireless channel limitations in the operation of a specific pet monitoring system, validated at physical channel as well as at functional level. In this sense, radio wave propagation produced by ZigBee devices operating at the ISM 2.4 GHz band is studied through an in-house developed 3D Ray Launching simulation tool, in order to analyze coverage/capacity relations for the optimal system selection as well as deployment strategy in terms of number of transceivers and location. Furthermore, a simplified dog model is developed for simulation code, considering not only its morphology but also its dielectric properties. Relevant wireless channel information such as power distribution, power delay profile and delay spread graphs are obtained providing an extensive wireless channel analysis. A functional dog monitoring system is presented, operating over the implemented ZigBee network and providing real time information to Android based devices. The proposed system can be scaled in order to consider different types of domestic pets as well as new user based functionalities. PMID:27589751

  6. Climate Change Effects on Heat- and Cold-Related Mortality in the Netherlands: A Scenario-Based Integrated Environmental Health Impact Assessment

    PubMed Central

    Huynen, Maud M. T. E.; Martens, Pim

    2015-01-01

    Although people will most likely adjust to warmer temperatures, it is still difficult to assess what this adaptation will look like. This scenario-based integrated health impacts assessment explores baseline (1981–2010) and future (2050) population attributable fractions (PAF) of mortality due to heat (PAFheat) and cold (PAFcold), by combining observed temperature–mortality relationships with the Dutch KNMI’14 climate scenarios and three adaptation scenarios. The 2050 model results without adaptation reveal a decrease in PAFcold (8.90% at baseline; 6.56%–7.85% in 2050) that outweighs the increase in PAFheat (1.15% at baseline; 1.66%–2.52% in 2050). When the 2050 model runs applying the different adaptation scenarios are considered as well, however, the PAFheat ranges between 0.94% and 2.52% and the PAFcold between 6.56% and 9.85%. Hence, PAFheat and PAFcold can decrease as well as increase in view of climate change (depending on the adaptation scenario). The associated annual mortality burdens in 2050—accounting for both the increasing temperatures and mortality trend—show that heat-related deaths will range between 1879 and 5061 (1511 at baseline) and cold-related deaths between 13,149 and 19,753 (11,727 at baseline). Our results clearly illustrate that model outcomes are not only highly dependent on climate scenarios, but also on adaptation assumptions. Hence, a better understanding of (the impact of various) plausible adaptation scenarios is required to advance future integrated health impact assessments. PMID:26512680
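The burden arithmetic here is simply attributable deaths = PAF x total deaths. A sketch using the abstract's baseline PAFs and an assumed all-cause total chosen to be roughly consistent with the reported baseline burdens:

```python
# Burden arithmetic behind the numbers above: attributable deaths equal
# the population attributable fraction (PAF) times total mortality. The
# PAFs are the abstract's baseline values; the all-cause total is an
# assumption chosen to be roughly consistent with the reported burdens.
def attributable_deaths(paf, total_deaths):
    return paf * total_deaths

total_deaths = 131_000                            # assumed annual all-cause total
heat = attributable_deaths(0.0115, total_deaths)  # PAFheat = 1.15 %
cold = attributable_deaths(0.0890, total_deaths)  # PAFcold = 8.90 %
print(heat, cold)
```

The 2050 estimates additionally scale the total mortality for population trends, which is why the projected burdens grow even in scenarios where the PAFs themselves fall.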

  7. Source-Based Modeling Of Urban Stormwater Quality Response to the Selected Scenarios Combining Future Changes in Climate and Socio-Economic Factors

    NASA Astrophysics Data System (ADS)

    Borris, Matthias; Leonhardt, Günther; Marsalek, Jiri; Österlund, Heléne; Viklander, Maria

    2016-08-01

    The assessment of future trends in urban stormwater quality should be most helpful for ensuring the effectiveness of the existing stormwater quality infrastructure in the future and mitigating the associated impacts on receiving waters. Combined effects of expected changes in climate and socio-economic factors on stormwater quality were examined in two urban test catchments by applying a source-based computer model (WinSLAMM) for TSS and three heavy metals (copper, lead, and zinc) for various future scenarios. Generally, both catchments showed similar responses to the future scenarios and pollutant loads were generally more sensitive to changes in socio-economic factors (i.e., increasing traffic intensities, growth and intensification of the individual land-uses) than in the climate. Specifically, for the selected Intermediate socio-economic scenario and two climate change scenarios (RCP = 2.6 and 8.5), the TSS loads from both catchments increased by about 10 % on average, but when applying the Intermediate climate change scenario (RCP = 4.5) for two SSPs, the Sustainability and Security scenarios (SSP1 and SSP3), the TSS loads increased on average by 70 %. Furthermore, it was observed that well-designed and maintained stormwater treatment facilities targeting local pollution hotspots exhibited the potential to significantly improve stormwater quality, however, at potentially high costs. In fact, it was possible to reduce pollutant loads from both catchments under the future Sustainability scenario (on average, e.g., TSS were reduced by 20 %), compared to the current conditions. The methodology developed in this study was found useful for planning climate change adaptation strategies in the context of local conditions.
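The reported scenario effects amount to multiplicative scaling of a baseline load by separate climate and socio-economic factors. A toy sketch with invented numbers chosen only to mirror the reported magnitudes (~+10 % climate-only, ~+70 % combined):

```python
# Toy version of the scenario arithmetic reported above: a baseline annual
# load scaled by separate climate and socio-economic multipliers. Numbers
# are invented, chosen only to mirror the reported magnitudes.
def scenario_load(base_load, climate_factor, socio_factor):
    return base_load * climate_factor * socio_factor

base_tss = 1000.0                                    # kg/yr, invented
climate_only = scenario_load(base_tss, 1.10, 1.00)   # ~ +10 % (RCP change)
combined     = scenario_load(base_tss, 1.10, 1.55)   # ~ +70 % (with SSPs)
print(climate_only, combined)
```

Keeping the two factors separate makes the study's headline finding explicit: the socio-economic multiplier dominates the climate one.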

  8. Source-Based Modeling Of Urban Stormwater Quality Response to the Selected Scenarios Combining Future Changes in Climate and Socio-Economic Factors.

    PubMed

    Borris, Matthias; Leonhardt, Günther; Marsalek, Jiri; Österlund, Heléne; Viklander, Maria

    2016-08-01

    The assessment of future trends in urban stormwater quality should be most helpful for ensuring the effectiveness of the existing stormwater quality infrastructure in the future and mitigating the associated impacts on receiving waters. Combined effects of expected changes in climate and socio-economic factors on stormwater quality were examined in two urban test catchments by applying a source-based computer model (WinSLAMM) for TSS and three heavy metals (copper, lead, and zinc) for various future scenarios. Generally, both catchments showed similar responses to the future scenarios and pollutant loads were generally more sensitive to changes in socio-economic factors (i.e., increasing traffic intensities, growth and intensification of the individual land-uses) than in the climate. Specifically, for the selected Intermediate socio-economic scenario and two climate change scenarios (RCP = 2.6 and 8.5), the TSS loads from both catchments increased by about 10 % on average, but when applying the Intermediate climate change scenario (RCP = 4.5) for two SSPs, the Sustainability and Security scenarios (SSP1 and SSP3), the TSS loads increased on average by 70 %. Furthermore, it was observed that well-designed and maintained stormwater treatment facilities targeting local pollution hotspots exhibited the potential to significantly improve stormwater quality, however, at potentially high costs. In fact, it was possible to reduce pollutant loads from both catchments under the future Sustainability scenario (on average, e.g., TSS were reduced by 20 %), compared to the current conditions. The methodology developed in this study was found useful for planning climate change adaptation strategies in the context of local conditions.

  10. A Study of the Competency of Third Year Medical Students to Interpret Biochemically Based Clinical Scenarios Using Knowledge and Skills Gained in Year 1 and 2

    ERIC Educational Resources Information Center

    Gowda, Veena Bhaskar S.; Nagaiah, Bhaskar Hebbani; Sengodan, Bharathi

    2016-01-01

    Medical students build clinical knowledge on the grounds of previously obtained basic knowledge. The study aimed to evaluate the competency of third year medical students to interpret biochemically based clinical scenarios using knowledge and skills gained during year 1 and 2 of undergraduate medical training. Study was conducted on year 3 MBBS…

  11. Driver performance-based assessment of thermal display degradation effects

    NASA Astrophysics Data System (ADS)

    Ruffner, John W.; Massimi, Michael S.; Choi, Yoon S.; Ferrett, Donald A.

    1998-07-01

The Driver's Vision Enhancer (DVE) is a thermal sensor and display combination currently being procured for use in U.S. Army combat and tactical wheeled vehicles. During the DVE production process, a given number of sensor or display pixels may either vary from the desired luminance values (nonuniform) or be inactive (nonresponsive). The amount and distribution of pixel luminance nonuniformity (NU) and nonresponsivity (NR) allowable in production DVEs is a significant cost factor. No driver performance-based criteria exist for determining the maximum amount of allowable NU and NR. For safety reasons, these characteristics are specified conservatively. This paper describes an experiment to assess the effects of different levels of display NU and NR on Army drivers' ability to identify scene features and obstacles using a simulated DVE display and videotaped driving scenarios. Baseline, NU, and NR display conditions were simulated using real-time image processing techniques and a computer graphics workstation. The results indicate that there is a small but statistically insignificant decrease in identification performance with the NU conditions tested. The pattern of the performance-based results is consistent with drivers' subjective assessments of display adequacy. The implications of the results for specifying NU and NR criteria for the DVE display are discussed.

  12. An Analysis Of The Impact Of Selected Carbon Capture And Storage Policy Scenarios On The US Fossil-Based Electric Power Sector

    SciTech Connect

    Davidson, Casie L.; Dooley, James J.; Dahowski, Robert T.; Mahasenan, N Maha

    2003-09-13

CO2 capture and storage (CCS) is rapidly emerging as a potential key climate change mitigation option. However, as policymakers and industrial stakeholders begin the process of formulating new policy for implementing CCS technologies, participants require a tool to assess large-scale CCS deployment over a number of different possible future scenarios. This paper will analyze several scenarios using two state-of-the-art Battelle-developed models, MiniCAM and the CO2-GIS, to examine CCS deployment. Outputs include the total amount of CO2 captured, total annual emissions, and fossil-based generating capacity.

  13. The Scenario Model Intercomparison Project (ScenarioMIP) for CMIP6

    NASA Astrophysics Data System (ADS)

    O'Neill, Brian C.; Tebaldi, Claudia; van Vuuren, Detlef P.; Eyring, Veronika; Friedlingstein, Pierre; Hurtt, George; Knutti, Reto; Kriegler, Elmar; Lamarque, Jean-Francois; Lowe, Jason; Meehl, Gerald A.; Moss, Richard; Riahi, Keywan; Sanderson, Benjamin M.

    2016-09-01

    Projections of future climate change play a fundamental role in improving understanding of the climate system as well as characterizing societal risks and response options. The Scenario Model Intercomparison Project (ScenarioMIP) is the primary activity within Phase 6 of the Coupled Model Intercomparison Project (CMIP6) that will provide multi-model climate projections based on alternative scenarios of future emissions and land use changes produced with integrated assessment models. In this paper, we describe ScenarioMIP's objectives, experimental design, and its relation to other activities within CMIP6. The ScenarioMIP design is one component of a larger scenario process that aims to facilitate a wide range of integrated studies across the climate science, integrated assessment modeling, and impacts, adaptation, and vulnerability communities, and will form an important part of the evidence base in the forthcoming Intergovernmental Panel on Climate Change (IPCC) assessments. At the same time, it will provide the basis for investigating a number of targeted science and policy questions that are especially relevant to scenario-based analysis, including the role of specific forcings such as land use and aerosols, the effect of a peak and decline in forcing, the consequences of scenarios that limit warming to below 2 °C, the relative contributions to uncertainty from scenarios, climate models, and internal variability, and long-term climate system outcomes beyond the 21st century. To serve this wide range of scientific communities and address these questions, a design has been identified consisting of eight alternative 21st century scenarios plus one large initial condition ensemble and a set of long-term extensions, divided into two tiers defined by relative priority. Some of these scenarios will also provide a basis for variants planned to be run in other CMIP6-Endorsed MIPs to investigate questions related to specific forcings. Harmonized, spatially explicit

  14. Economic-based projections of future land use in the conterminous United States under alternative policy scenarios.

    PubMed

    Radeloff, V C; Nelson, E; Plantinga, A J; Lewis, D J; Helmers, D; Lawler, J J; Withey, J C; Beaudry, F; Martinuzzi, S; Butsic, V; Lonsdorf, E; White, D; Polasky, S

    2012-04-01

    Land-use change significantly contributes to biodiversity loss, invasive species spread, changes in biogeochemical cycles, and the loss of ecosystem services. Planning for a sustainable future requires a thorough understanding of expected land use at the fine spatial scales relevant for modeling many ecological processes and at dimensions appropriate for regional or national-level policy making. Our goal was to construct and parameterize an econometric model of land-use change to project future land use to the year 2051 at a fine spatial scale across the conterminous United States under several alternative land-use policy scenarios. We parameterized the econometric model of land-use change with the National Resource Inventory (NRI) 1992 and 1997 land-use data for 844 000 sample points. Land-use transitions were estimated for five land-use classes (cropland, pasture, range, forest, and urban). We predicted land-use change under four scenarios: business-as-usual, afforestation, removal of agricultural subsidies, and increased urban rents. Our results for the business-as-usual scenario showed widespread changes in land use, affecting 36% of the land area of the conterminous United States, with large increases in urban land (79%) and forest (7%), and declines in cropland (-16%) and pasture (-13%). Areas with particularly high rates of land-use change included the larger Chicago area, parts of the Pacific Northwest, and the Central Valley of California. However, while land-use change was substantial, differences in results among the four scenarios were relatively minor. The only scenario that was markedly different was the afforestation scenario, which resulted in an increase of forest area that was twice as high as the business-as-usual scenario. Land-use policies can affect trends, but only so much. The basic economic and demographic factors shaping land-use changes in the United States are powerful, and even fairly dramatic policy changes, showed only moderate
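The projection mechanics behind a study like this reduce to repeatedly applying an estimated land-use transition matrix to current land-use shares. A minimal sketch follows; the five-class transition probabilities and starting shares below are invented for illustration, not the econometric estimates from the NRI data:

```python
# Five land-use classes, as in the study: cropland, pasture, range, forest, urban.
CLASSES = ["cropland", "pasture", "range", "forest", "urban"]

# Illustrative 5-year transition probabilities (each row sums to 1); the real
# model estimates these econometrically from NRI 1992/1997 point data.
P = [
    [0.95, 0.01, 0.00, 0.01, 0.03],  # cropland
    [0.02, 0.94, 0.01, 0.02, 0.01],  # pasture
    [0.00, 0.01, 0.97, 0.01, 0.01],  # range
    [0.00, 0.00, 0.00, 0.98, 0.02],  # forest
    [0.00, 0.00, 0.00, 0.00, 1.00],  # urban (effectively absorbing)
]

def project(shares, steps):
    """Apply the transition matrix `steps` times (each step = 5 years)."""
    for _ in range(steps):
        shares = [sum(shares[i] * P[i][j] for i in range(5)) for j in range(5)]
    return shares

# Roughly 1997 -> 2052 in eleven 5-year steps; starting shares are illustrative.
start = [0.20, 0.10, 0.25, 0.35, 0.10]
end = project(start, 11)
```

Even this toy version reproduces the qualitative pattern reported: urban area grows steadily while cropland declines, because urbanization is close to irreversible in the transition structure.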

  15. High performance pitch-based carbon fiber

    SciTech Connect

    Tadokoro, Hiroyuki; Tsuji, Nobuyuki; Shibata, Hirotaka; Furuyama, Masatoshi

    1996-12-31

A high performance pitch-based carbon fiber with a smaller diameter (six microns) was developed by Nippon Graphite Fiber Corporation. This fiber possesses high tensile modulus, high tensile strength, excellent yarn handling ability, a low thermal expansion coefficient, and high thermal conductivity, which make it an ideal material for space applications such as artificial satellites. Its performance as a reinforcement of composites was also sufficient. With these characteristics, this pitch-based carbon fiber is expected to find a wide variety of applications in space structures, industrial fields, sporting goods, and civil infrastructure.

  16. Calculation of lifetime lung cancer risks associated with radon exposure, based on various models and exposure scenarios.

    PubMed

    Hunter, Nezahat; Muirhead, Colin R; Bochicchio, Francesco; Haylock, Richard G E

    2015-09-01

The risk of lung cancer mortality up to 75 years of age due to radon exposure has been estimated for both male and female continuing, ex- and never-smokers, based on various radon risk models and exposure scenarios. We used risk models derived from (i) the BEIR VI analysis of cohorts of radon-exposed miners, (ii) cohort and nested case-control analyses of a European cohort of uranium miners and (iii) the joint analysis of European residential radon case-control studies. Estimates of the lifetime lung cancer risk due to radon varied between these models by just over a factor of 2, and risk estimates based on models from analyses of European uranium miners exposed at comparatively low rates and of people exposed to radon in homes were broadly compatible. For a given smoking category, there was not much difference in lifetime lung cancer risk between males and females. The estimated lifetime risk of radon-induced lung cancer for exposure to a concentration of 200 Bq m(-3) was in the range 2.98-6.55% for male continuing smokers and 0.19-0.42% for male never-smokers, depending on the model used and assuming a multiplicative relationship for the joint effect of radon and smoking. Stopping smoking at age 50 years decreases the lifetime risk due to radon by around a half relative to continuing smoking, but the risk for ex-smokers remains about a factor of 5-7 higher than that for never-smokers. Under a sub-multiplicative model for the joint effect of radon and smoking, the lifetime risk of radon-induced lung cancer was still estimated to be substantially higher for continuing smokers than for never-smokers. Radon mitigation, used to reduce radon concentrations in homes, can also have a substantial impact on lung cancer risk, even for persons in their 50s; for each of continuing smokers, ex-smokers and never-smokers, radon mitigation at age 50 would lower the lifetime risk of radon-induced lung cancer by about one-third. To maximise risk reductions, smokers in high
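The multiplicative joint-effect calculation underlying numbers like these can be sketched in a few lines. The 16 % excess relative risk per 100 Bq m(-3) is the commonly cited European pooled residential estimate; the baseline lifetime lung cancer risks used below are illustrative assumptions, not values taken from this paper:

```python
def radon_induced_lifetime_risk(baseline_risk, radon_bq_m3, err_per_100bq=0.16):
    """Under a multiplicative joint effect, radon scales the smoking-specific
    baseline lifetime lung cancer risk by an excess relative risk (ERR)
    proportional to the long-term radon concentration."""
    err = err_per_100bq * radon_bq_m3 / 100.0
    return baseline_risk * err

# Illustrative (assumed) baseline lifetime lung cancer risks to age 75.
smoker_baseline = 0.15   # continuing smoker
never_baseline = 0.01    # never-smoker

risk_smoker = radon_induced_lifetime_risk(smoker_baseline, 200.0)
risk_never = radon_induced_lifetime_risk(never_baseline, 200.0)
```

With these assumed baselines, the radon-induced risk at 200 Bq m(-3) comes out near 4.8 % for a continuing smoker and 0.32 % for a never-smoker, which falls inside the ranges the abstract reports and shows why the absolute radon risk is dominated by smoking status under a multiplicative model.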

  17. A simplified physically-based model to calculate surface water temperature of lakes from air temperature in climate change scenarios

    NASA Astrophysics Data System (ADS)

    Piccolroaz, S.; Toffolon, M.

    2012-12-01

    Modifications of water temperature are crucial for the ecology of lakes, but long-term analyses are not usually able to provide reliable estimations. This is particularly true for climate change studies based on Global Circulation Models, whose mesh size is normally too coarse for explicitly including even some of the biggest lakes on Earth. On the other hand, modeled predictions of air temperature changes are more reliable, and long-term, high-resolution air temperature observational datasets are more available than water temperature measurements. For these reasons, air temperature series are often used to obtain some information about the surface temperature of water bodies. In order to do that, it is common to exploit regression models, but they are questionable especially when it is necessary to extrapolate current trends beyond maximum (or minimum) measured temperatures. Moreover, water temperature is influenced by a variety of processes of heat exchange across the lake surface and by the thermal inertia of the water mass, which also causes an annual hysteresis cycle between air and water temperatures that is hard to consider in regressions. In this work we propose a simplified, physically-based model for the estimation of the epilimnetic temperature in lakes. Starting from the zero-dimensional heat budget, we derive a simplified first-order differential equation for water temperature, primarily forced by a seasonally varying external term (mainly related to solar radiation) and an exchange term explicitly depending on the difference between air and water temperatures. Assuming annual sinusoidal cycles of the main heat flux components at the atmosphere-lake interface, eight parameters (some of them can be disregarded, though) are identified, which can be calibrated if two temporal series of air and water temperature are available. We note that such a calibration is supported by the physical interpretation of the parameters, which provide good initial
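The class of model described here can be illustrated with a toy version of the governing equation: a first-order ODE in which surface water temperature relaxes toward air temperature under an additional seasonal forcing term. All parameter values below are made up for illustration, not calibrated to any lake:

```python
import math

def air_temp(day, mean=10.0, amp=12.0):
    # Idealized sinusoidal annual air temperature cycle (deg C).
    return mean + amp * math.sin(2.0 * math.pi * day / 365.0)

def simulate(days=730, k=0.02, s_amp=0.03, tw=4.0):
    """Forward-Euler integration (1-day step) of
        dTw/dt = k * (Ta - Tw) + s(t),
    where k is an air-water exchange coefficient (1/day) and s(t) is a
    seasonal forcing term (deg C/day) standing in for solar radiation."""
    out = []
    for day in range(days):
        ta = air_temp(day)
        s = s_amp * math.sin(2.0 * math.pi * day / 365.0)
        tw += k * (ta - tw) + s
        out.append((ta, tw))
    return out
```

The thermal-inertia time scale 1/k is what produces the annual hysteresis cycle the abstract mentions: the simulated water temperature lags the air temperature by several weeks and has a smaller annual amplitude, behavior a simple regression between the two series cannot capture.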

  19. So These Numbers Really Mean Something? A Role Playing Scenario-Based Approach to the Undergraduate Instrumental Analysis Laboratory

    ERIC Educational Resources Information Center

    Grannas, Amanda M.; Lagalante, Anthony F.

    2010-01-01

    A new curricular approach in our undergraduate second-year instrumental analysis laboratory was implemented. Students work collaboratively on scenarios in diverse fields including pharmaceuticals, forensics, gemology, art conservation, and environmental chemistry. Each laboratory section (approximately 12 students) is divided into three groups…

  20. A DYNAMIC PHYSIOLOGICALLY-BASED TOXICOKINETIC (DPBTK) MODEL FOR SIMULATION OF COMPLEX TOLUENE EXPOSURE SCENARIOS IN HUMANS

    EPA Science Inventory

    A GENERAL PHYSIOLOGICAL AND TOXICOKINETIC (GPAT) MODEL FOR SIMULATION OF COMPLEX TOLUENE EXPOSURE SCENARIOS IN HUMANS. E M Kenyon1, T Colemen2, C R Eklund1 and V A Benignus3. 1U.S. EPA, ORD, NHEERL, ETD, PKB, RTP, NC, USA; 2Biological Simulators, Inc., Jackson MS, USA, 3U.S. EP...

  1. Simulation-Based Assessment to Evaluate Cognitive Performance in an Anesthesiology Residency Program

    PubMed Central

    Sidi, Avner; Baslanti, Tezcan Ozrazgat; Gravenstein, Nikolaus; Lampotang, Samsun

    2014-01-01

    Background Problem solving in a clinical context requires knowledge and experience, and most traditional examinations for learners do not capture skills that are required in some situations where there is uncertainty about the proper course of action. Objective We sought to evaluate anesthesiology residents for deficiencies in cognitive performance within and across 3 clinical domains (operating room, trauma, and cardiac resuscitation) using simulation-based assessment. Methods Individual basic knowledge and cognitive performance in each simulation-based scenario were assessed in 47 residents using a 15- to 29-item scenario-specific checklist. For every scenario and item we calculated group error scenario rate (frequency) and individual (resident) item success. For all analyses, alpha was designated as 0.05. Results Postgraduate year (PGY)-3 and PGY-4 residents' cognitive items error rates were higher and success rates lower compared to basic and technical performance in each domain tested (P < .05). In the trauma and resuscitation scenarios, the cognitive error rate by PGY-4 residents was fairly high (0.29–0.5) and their cognitive success rate was low (0.5–0.68). The most common cognitive errors were anchoring, availability bias, premature closure, and confirmation bias. Conclusions Simulation-based assessment can differentiate between higher-order (cognitive) and lower-order (basic and technical) skills expected of relatively experienced (PGY-3 and PGY-4) anesthesiology residents. Simulation-based assessments can also highlight areas of relative strength and weakness in a resident group, and this information can be used to guide curricular modifications to address deficiencies in tasks requiring higher-order processing and cognition. PMID:24701316

  2. Performance-Based Evaluation and School Librarians

    ERIC Educational Resources Information Center

    Church, Audrey P.

    2015-01-01

    Evaluation of instructional personnel is standard procedure in our Pre-K-12 public schools, and its purpose is to document educator effectiveness. With Race to the Top and No Child Left Behind waivers, states are required to implement performance-based evaluations that demonstrate student academic progress. This three-year study describes the…

  3. Performance-Based Rewards and Work Stress

    ERIC Educational Resources Information Center

    Ganster, Daniel C.; Kiersch, Christa E.; Marsh, Rachel E.; Bowen, Angela

    2011-01-01

    Even though reward systems play a central role in the management of organizations, their impact on stress and the well-being of workers is not well understood. We review the literature linking performance-based reward systems to various indicators of employee stress and well-being. Well-controlled experiments in field settings suggest that certain…

  4. Performance-based inspection and maintenance strategies

    SciTech Connect

    Vesely, W.E.

    1995-04-01

Performance-based inspection and maintenance strategies utilize measures of equipment performance to help guide inspection and maintenance activities. A relevant measure of performance for safety system components is component unavailability. The component unavailability can also be input into a plant risk model such as a Probabilistic Risk Assessment (PRA) to determine the associated plant risk performance. Based on the present and projected unavailability performance, or the present and projected risk performance, the effectiveness of current maintenance activities can be evaluated and this information can be used to plan future maintenance activities. A significant amount of information other than downtimes or failure times is collected or can be collected when an inspection or maintenance is conducted which can be used to estimate the component unavailability. This information generally involves observations on the condition or state of the component or component piecepart. The information can be detailed such as the amount of corrosion buildup or can be general such as the general state of the component described as "high degradation", "moderate degradation", or "low degradation". Much of the information collected in maintenance logs is qualitative and fuzzy. As part of an NRC Research program on performance-based engineering modeling, approaches have been developed to apply Fuzzy Set Theory to information collected on the state of the component to determine the implied component or component piecepart unavailability. Demonstrations of the applications of Fuzzy Set Theory are presented utilizing information from plant maintenance logs. The demonstrations show the power of Fuzzy Set Theory in translating engineering information to reliability and risk implications.
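The idea of translating fuzzy maintenance-log descriptions into an unavailability estimate can be sketched with triangular membership functions over a degradation score. The membership shapes and the unavailability anchor assigned to each linguistic state are invented for illustration; they are not the NRC program's actual mappings:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic degradation states over a normalized score in [0, 1], each paired
# with an assumed unavailability anchor.
STATES = {
    "low degradation": (lambda x: tri(x, -0.5, 0.0, 0.5), 1e-4),
    "moderate degradation": (lambda x: tri(x, 0.0, 0.5, 1.0), 1e-3),
    "high degradation": (lambda x: tri(x, 0.5, 1.0, 1.5), 1e-2),
}

def unavailability(score):
    """Centroid-style defuzzification: state memberships weight the
    unavailability anchor assigned to each linguistic state."""
    num = den = 0.0
    for membership, q in STATES.values():
        m = membership(score)
        num += m * q
        den += m
    return num / den if den else 0.0
```

A log entry judged "moderate degradation" (score near 0.5) maps to the moderate anchor, while intermediate judgments blend neighboring states, which is exactly the kind of graded translation from qualitative observation to a quantitative reliability input the abstract describes.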

  5. Investigating the impact of land cover change on peak river flow in UK upland peat catchments, based on modelled scenarios

    NASA Astrophysics Data System (ADS)

    Gao, Jihui; Holden, Joseph; Kirkby, Mike

    2014-05-01

    Changes to land cover can influence the velocity of overland flow. In headwater peatlands, saturation means that overland flow is a dominant source of runoff, particularly during heavy rainfall events. Human modifications in headwater peatlands may include removal of vegetation (e.g. by erosion processes, fire, pollution, overgrazing) or pro-active revegetation of peat with sedges such as Eriophorum or mosses such as Sphagnum. How these modifications affect the river flow, and in particular the flood peak, in headwater peatlands is a key problem for land management. In particular, the impact of the spatial distribution of land cover change (e.g. different locations and sizes of land cover change area) on river flow is not clear. In this presentation a new fully distributed version of TOPMODEL, which represents the effects of distributed land cover change on river discharge, was employed to investigate land cover change impacts in three UK upland peat catchments (Trout Beck in the North Pennines, the Wye in mid-Wales and the East Dart in southwest England). Land cover scenarios with three typical land covers (i.e. Eriophorum, Sphagnum and bare peat) having different surface roughness in upland peatlands were designed for these catchments to investigate land cover impacts on river flow through simulation runs of the distributed model. As a result of hypothesis testing three land cover principles emerged from the work as follows: Principle (1): Well vegetated buffer strips are important for reducing flow peaks. A wider bare peat strip nearer to the river channel gives a higher flow peak and reduces the delay to peak; conversely, a wider buffer strip with higher density vegetation (e.g. Sphagnum) leads to a lower peak and postpones the peak. In both cases, a narrower buffer strip surrounding upstream and downstream channels has a greater effect than a thicker buffer strip just based around the downstream river network. Principle (2): When the area of change is equal

  6. POSIX and Object Distributed Storage Systems Performance Comparison Studies With Real-Life Scenarios in an Experimental Data Taking Context Leveraging OpenStack Swift & Ceph

    NASA Astrophysics Data System (ADS)

    Poat, M. D.; Lauret, J.; Betts, W.

    2015-12-01

The STAR online computing infrastructure has become an intensive dynamic system used for first-hand data collection and analysis, resulting in a dense collection of data output. As we have transitioned to our current state, inefficient, limited storage systems have become an impediment to fast feedback to online shift crews. A centrally accessible, scalable and redundant distributed storage system had become a necessity in this environment. OpenStack Swift Object Storage and Ceph Object Storage are two eye-opening technologies, as community use and development have led to success elsewhere. In this contribution, OpenStack Swift and Ceph have been put to the test with single and parallel I/O tests, emulating real-world scenarios for data processing and workflows. The Ceph file system storage, offering a POSIX-compliant file system mounted similarly to an NFS share, was of particular interest as it aligned with our requirements and was retained as our solution. I/O performance tests were run against the Ceph POSIX file system and presented surprising results indicating true potential for fast I/O and reliability. STAR's online compute farm has historically been used for job submission and first-hand data analysis. Reusing the online compute farm to maintain a storage cluster alongside job submission will be an efficient use of the current infrastructure.
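Single- versus parallel-writer tests of the kind described can be emulated against any POSIX mount point (a Ceph kernel or FUSE mount in the paper's setting; any writable directory here). The sketch below measures aggregate write throughput only and makes no claim about the STAR configuration or its benchmark tooling:

```python
import os
import time
from concurrent.futures import ThreadPoolExecutor

def write_file(path, size_mb=1):
    """Write size_mb MiB and fsync, so the timing includes a flush to storage."""
    chunk = b"\0" * (1 << 20)
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())

def aggregate_throughput(mount_dir, n_writers=4, size_mb=1):
    """Launch n_writers concurrent writers against mount_dir and return the
    aggregate write throughput in MiB/s."""
    paths = [os.path.join(mount_dir, "bench_%d.bin" % i) for i in range(n_writers)]
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=n_writers) as pool:
        list(pool.map(lambda p: write_file(p, size_mb), paths))
    elapsed = time.perf_counter() - start
    for p in paths:  # clean up benchmark files
        os.remove(p)
    return (n_writers * size_mb) / elapsed
```

Running the same function with `n_writers=1` and then with several writers against a networked mount is the basic single-versus-parallel comparison; on a distributed file system the fsync cost, not the in-memory write, usually dominates.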

  7. Alternative zoning scenarios for regional sustainable land use controls in China: a knowledge-based multiobjective optimisation model.

    PubMed

    Xia, Yin; Liu, Dianfeng; Liu, Yaolin; He, Jianhua; Hong, Xiaofeng

    2014-08-28

Alternative land use zoning scenarios provide guidance for sustainable land use controls. This study focused on an ecologically vulnerable catchment on the Loess Plateau in China, proposed a novel land use zoning model, and generated alternative zoning solutions to satisfy the various requirements of land use stakeholders and managers. This model combined multiple zoning objectives, i.e., maximum zoning suitability, maximum planning compatibility and maximum spatial compactness, with land use constraints by using a goal programming technique, and employed a modified simulated annealing algorithm to search for the optimal zoning solutions. The land use zoning knowledge was incorporated into the initialisation operator and neighbourhood selection strategy of the simulated annealing algorithm to improve its efficiency. The case study indicates that the model is both effective and robust. Five optimal zoning scenarios of the study area were helpful for satisfying the requirements of land use controls in loess hilly regions, e.g., land use intensification, agricultural protection and environmental conservation.
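The optimisation loop described (weighted zoning objectives searched with simulated annealing) can be sketched on a toy grid. The objective weights, grid size, random suitability scores, and neighbour-pair compactness measure are assumptions of this sketch, not the paper's formulation:

```python
import math
import random

def anneal_zoning(rows=10, cols=10, n_zones=3, iters=10000,
                  t0=1.0, cool=0.9995, w_suit=1.0, w_comp=0.3, seed=7):
    """Maximise weighted cell-zone suitability plus spatial compactness
    (count of same-zone neighbour pairs) via single-cell zone flips with
    Metropolis acceptance and geometric cooling."""
    rng = random.Random(seed)
    # Random per-cell suitability for each zone (stand-in for real criteria).
    suit = [[[rng.random() for _ in range(n_zones)] for _ in range(cols)]
            for _ in range(rows)]
    zone = [[rng.randrange(n_zones) for _ in range(cols)] for _ in range(rows)]

    def score():
        s = sum(suit[r][c][zone[r][c]] for r in range(rows) for c in range(cols))
        comp = sum(1 for r in range(rows) for c in range(cols)
                   for dr, dc in ((1, 0), (0, 1))
                   if r + dr < rows and c + dc < cols
                   and zone[r][c] == zone[r + dr][c + dc])
        return w_suit * s + w_comp * comp

    init = cur = best = score()
    best_zone = [row[:] for row in zone]
    t = t0
    for _ in range(iters):
        r, c = rng.randrange(rows), rng.randrange(cols)
        old = zone[r][c]
        zone[r][c] = rng.randrange(n_zones)
        new = score()
        if new >= cur or rng.random() < math.exp((new - cur) / t):
            cur = new
            if cur > best:
                best, best_zone = cur, [row[:] for row in zone]
        else:
            zone[r][c] = old  # reject: revert the flip
        t *= cool
    return best_zone, init, best
```

The paper's knowledge-based refinements correspond to replacing the random initialisation and the uniform single-cell flip here with informed operators, which shrinks the search without changing the acceptance machinery.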

  9. Method meets application: on the use of earthquake scenarios in community-based disaster preparedness and response

    NASA Astrophysics Data System (ADS)

    Sargeant, S.; Sorensen, M. B.

    2011-12-01

    More than 50% of the world's population now live in urban areas. In less developed countries, future urban population increase will be due to natural population growth and rural-to-urban migration. As urban growth continues, the vulnerability of those living in these areas is also increasing. This presents a wide variety of challenges for humanitarian organisations that often have more experience of disaster response in rural settings rather than planning for large urban disasters. The 2010 Haiti earthquake highlighted the vulnerability of these organisations and the communities that they seek to support. To meet this challenge, a key consideration is how scientific information can support the humanitarian sector and their working practices. Here we review the current state of earthquake scenario modelling practice, with special focus on scenarios to be used in disaster response and response planning, and present an evaluation of how the field looks set to evolve. We also review current good practice and lessons learned from previous earthquakes with respect to planning for and responding to earthquakes in urban settings in the humanitarian sector, identifying key sectoral priorities. We then investigate the interface between these two areas to investigate the use of earthquake scenarios in disaster response planning and identify potential challenges both with respect to development of scientific models and their application on the ground.

  10. Performance-based asphalt mixture design methodology

    NASA Astrophysics Data System (ADS)

    Ali, Al-Hosain Mansour

    performance based design procedure. Finally, the developed guidelines with easy-to-use flow charts for the integrated mix design methodology are presented.

  11. A High Performance COTS Based Computer Architecture

    NASA Astrophysics Data System (ADS)

    Patte, Mathieu; Grimoldi, Raoul; Trautner, Roland

    2014-08-01

    Using Commercial Off The Shelf (COTS) electronic components for space applications is a long-standing idea. Indeed, the difference in processing performance and energy efficiency between radiation-hardened components and COTS components is so large that COTS components are very attractive for use in mass- and power-constrained systems. However, using COTS components in space is not straightforward, as one must account for the effects of the space environment on the behavior of the COTS components. In the frame of the ESA-funded activity called High Performance COTS Based Computer, Airbus Defense and Space and its subcontractor OHB CGS have developed and prototyped a versatile COTS-based architecture for high-performance processing. The rest of the paper is organized as follows: in the first section we recapitulate the interests and constraints of using COTS components for space applications; we then briefly describe existing fault mitigation architectures and present our solution for fault mitigation, based on a component called the SmartIO; in the last part of the paper we describe the prototyping activities executed during the HiP CBC project.

  12. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models built from real error data collected on a multiprocessor system are described. Model development, from the raw error data to the estimation of cumulative reward, is also described. A workload/reliability model is developed based on low-level error and resource-usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
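The distinction drawn above is that a Markov model forces exponentially distributed holding times, whereas a semi-Markov process keeps a Markov embedded jump chain but allows an arbitrary dwell-time distribution in each state. A minimal simulation sketch of that idea, using a hypothetical two-state (normal/error) model with Weibull dwell times rather than the paper's measured distributions:

```python
import random

def simulate_semi_markov(trans, holding, t_end, state=0, seed=1):
    """Simulate a semi-Markov process: jumps follow the embedded Markov
    chain `trans`, but the time spent in each state is drawn from an
    arbitrary distribution `holding[state]` (not necessarily exponential).
    Returns total time accumulated in each state up to t_end."""
    rng = random.Random(seed)
    t, time_in_state = 0.0, [0.0] * len(trans)
    while t < t_end:
        dwell = min(holding[state](rng), t_end - t)  # truncate at horizon
        time_in_state[state] += dwell
        t += dwell
        # embedded chain: pick the next state by transition probabilities
        state = rng.choices(range(len(trans)), weights=trans[state])[0]
    return time_in_state

# Hypothetical 2-state model: state 0 = normal operation, 1 = error/recovery.
trans = [[0.0, 1.0], [1.0, 0.0]]
holding = [
    lambda rng: rng.weibullvariate(100.0, 0.7),  # heavy-tailed normal dwell
    lambda rng: rng.weibullvariate(1.0, 1.5),    # short recovery dwell
]
times = simulate_semi_markov(trans, holding, t_end=10_000.0)
availability = times[0] / sum(times)
```

Replacing each `holding` entry with an exponential draw recovers the ordinary Markov case, which is exactly the comparison the sensitivity analysis in the abstract performs.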

  13. Using Comprehensive Science-based Disaster Scenarios to Support Seismic Safety Policy: A Case Study in Los Angeles, California

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2014-12-01

    In 2014, the USGS entered a technical assistance agreement with the City of Los Angeles to apply the results of the 2008 ShakeOut Scenario of a M7.8 earthquake on the southern San Andreas fault to develop a comprehensive plan to increase the seismic resilience of the City. The results of this project are to be submitted to the Mayor of Los Angeles at the Great ShakeOut on October 16, 2014. The ShakeOut scenario detailed how the expected cascade of failures in a big earthquake could lead to significant delays in disaster recovery that could create financial losses that greatly exceed the direct losses in the event. The goals of the seismic resilience plan are to: (1) protect the lives of residents during earthquakes; (2) improve the capacity of the City to respond to the earthquake; and (3) prepare the City to recover quickly after the earthquake so as to protect the economy of the City and all of southern California. To accomplish these goals, the project addresses three areas of seismic vulnerability that were identified in the original ShakeOut Scenario: (1) pre-1980 buildings that present an unacceptable risk to the lives of residents, including "non-ductile reinforced concrete" and "soft-first-story" buildings; (2) water system infrastructure (including impact on firefighting capability); and (3) communications infrastructure. The critical science needed to support policy decisions is to understand the probable consequences to the regional long-term economy caused by decisions to undertake (or not) different levels of mitigation. The arguments against mitigation are the immediate financial costs, so a better understanding of the eventual benefit is required. However, the direct savings rarely justify the mitigation costs, so the arguments in favor of mitigation are driven by the potential for cascading failures and the potential to trigger the type of long-term reduction in population and economic activity that has occurred in New Orleans since Hurricane Katrina.

  14. Designer protein-based performance materials.

    PubMed

    Kumar, Manoj; Sanford, Karl J; Cuevas, William A; Cuevas, William P; Du, Mai; Collier, Katharine D; Chow, Nicole

    2006-09-01

    Repeat sequence protein polymer (RSPP) technology provides a platform to design and make protein-based performance polymers and represents the best nature has to offer. We report here that the RSPP platform is a novel approach to produce functional protein polymers that have both biomechanical and biofunctional blocks built into one molecule by design, using peptide motifs. We have shown that protein-based designer biopolymers can be made using recombinant DNA technology and fermentation, and that they offer the ability to screen for desired properties utilizing the tremendous potential diversity of amino acid combinations. The technology also allows for large-scale manufacturing with a favorable fermentative cost structure to deliver commercially viable performance polymers. Using three diverse examples (an antimicrobial agent, a textile-targeting agent, and a UV-protective agent), we have introduced functional attributes into structural protein polymers and shown, for example, that the functionalized RSPPs have possible applications in biodefense, industrial biotechnology, and personal care. This new class of biobased materials simulates natural biomaterials that can be modified for desired function and has many advantages over conventional petroleum-based polymers.

  15. Scenario-based modelling of mass transfer mechanisms at a petroleum contaminated field site-numerical implications.

    PubMed

    Vasudevan, M; Nambi, Indumathi M; Suresh Kumar, G

    2016-06-15

    Knowledge about the distribution of dissolved plumes and their influencing factors is essential for risk assessment and remediation of light non-aqueous phase liquid contamination in groundwater. The present study deals with the applicability of a numerical model for simulating various hydro-geological scenarios considering non-uniform source distribution at a petroleum contaminated site in Chennai, India. The complexity associated with the hydrogeology of the site has limited the scope for on-site quantification of petroleum pipeline spillage. The change in fuel composition under mass-transfer limited conditions was predicted by simultaneously comparing deviations in aqueous concentrations and activity coefficients (between Raoult's law and analytical approaches). The effects of source migration and weathering on the dissolution of major soluble fractions of petroleum fuel were also studied in relation to the apparent change in their activity coefficients and molar fractions. The model results were compared with field observations, which indicated that field conditions were favourable for biodegradation, especially for the aromatic fractions: benzene and toluene (nearly 95% removal), polycyclic aromatic hydrocarbons (up to 65% removal) and xylene (nearly 45% removal). The results help to differentiate the effect of compositional non-ideality from rate-limited dissolution towards tailing of less soluble compounds (alkanes and trimethylbenzene). Although the effect of non-ideality decreased with distance from the source, the assumption of spatially varying residual saturation could effectively illustrate the post-spill scenario by estimating the consequent decrease in mass transfer rate. PMID:27017268
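The Raoult's-law comparison mentioned above rests on the standard effective-solubility relation C_i = x_i·γ_i·S_i, where x_i is the component's mole fraction in the fuel, S_i its pure-phase aqueous solubility, and γ_i its activity coefficient (γ_i = 1 recovers the ideal Raoult case). A minimal sketch with illustrative numbers only: the solubilities are rounded literature values, and the mole fractions and activity coefficients are hypothetical, not the study's data:

```python
def effective_solubility(mole_fraction, activity_coeff, pure_solubility):
    """Raoult's-law analogue for NAPL dissolution: the equilibrium aqueous
    concentration of component i is x_i * gamma_i * S_i. Setting
    gamma_i = 1 gives the ideal (Raoult) prediction; gamma_i != 1
    captures compositional non-ideality."""
    return mole_fraction * activity_coeff * pure_solubility

# Illustrative composition (x_i, gamma_i, S_i in mg/L); values hypothetical.
fuel = {
    "benzene": (0.02, 1.0, 1780.0),
    "toluene": (0.08, 1.0, 515.0),
}
aqueous = {name: effective_solubility(*p) for name, p in fuel.items()}

# Non-ideality (gamma_i > 1) raises the predicted aqueous concentration:
nonideal_benzene = effective_solubility(0.02, 1.8, 1780.0)
```

As the NAPL weathers, the soluble components' mole fractions x_i fall, which is why the predicted aqueous concentrations decline over time even before rate limitation is considered.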

  17. Creating and coupling a high-resolution DTM with a 1-D hydraulic model in a GIS for scenario-based assessment of avulsion hazard in a gravel-bed river

    NASA Astrophysics Data System (ADS)

    Aggett, G. R.; Wilson, J. P.

    2009-12-01

    In this paper we explore the development and assimilation of a high-resolution topographic surface with a one-dimensional hydraulic model for investigation of avulsion hazard potential on a gravel-bed river. A detailed channel and floodplain digital terrain model (DTM) is created to define the geometry parameter required by the 1D hydraulic model HEC-RAS. The ability to extract dense and optimally located cross-sections is presented as a means to optimize HEC-RAS performance. A number of flood scenarios are then run in HEC-RAS to determine the inundation potential of modeled events, the post-processed output of which facilitates calculation of spatially explicit shear stress (τ) and level of geomorphic work (specific stream power per unit bed area, ω) for each of these. Further enhancing this scenario-based approach, the DTM is modified to simulate a large woody debris (LWD) jam and active-channel sediment aggradation to assess their impact on inundation, τ, and ω under previously modeled flow conditions. The high-resolution DTM facilitates overlay and evaluation of modeled scenario results in a spatially explicit context containing considerable detail of hydrogeomorphic and other features influencing hydraulics (bars, secondary and scour channels, levees). This offers advantages for: (i) assessing the avulsion hazard potential and spatial distribution of other hydrologic and fluvial geomorphic processes; and (ii) exploration of the potential impacts of specific management strategies on the channel, including river restoration activities.
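The two hydraulic indices named above have standard forms: boundary shear stress τ = ρgRS and specific stream power ω = ρgQS/w, where R is the hydraulic radius, S the energy slope, Q the discharge, and w the flow width. A short sketch of the per-cross-section calculation, using made-up values of the kind a 1-D model such as HEC-RAS reports (not values from this study):

```python
RHO = 1000.0  # water density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def shear_stress(hydraulic_radius, energy_slope):
    """Boundary shear stress tau = rho * g * R * S, in N/m^2."""
    return RHO * G * hydraulic_radius * energy_slope

def specific_stream_power(discharge, energy_slope, width):
    """Specific (unit) stream power omega = rho * g * Q * S / w, in W/m^2:
    total stream power per unit bed area."""
    return RHO * G * discharge * energy_slope / width

# Hypothetical cross-section output: R = 1.2 m, S = 0.004, Q = 150 m^3/s,
# flow width w = 40 m.
tau = shear_stress(hydraulic_radius=1.2, energy_slope=0.004)
omega = specific_stream_power(discharge=150.0, energy_slope=0.004, width=40.0)
```

Evaluating these at every cross-section for each modeled flood scenario is what yields the spatially explicit τ and ω surfaces the paper overlays on the DTM.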

  18. Integrating experimental and numerical methods for a scenario-based quantitative assessment of subsurface energy storage options

    NASA Astrophysics Data System (ADS)

    Kabuth, Alina; Dahmke, Andreas; Hagrey, Said Attia al; Berta, Márton; Dörr, Cordula; Koproch, Nicolas; Köber, Ralf; Köhn, Daniel; Nolde, Michael; Tilmann Pfeiffer, Wolf; Popp, Steffi; Schwanebeck, Malte; Bauer, Sebastian

    2016-04-01

    Within the framework of the transition to renewable energy sources ("Energiewende"), the German government defined the target of producing 60 % of the final energy consumption from renewable energy sources by the year 2050. However, renewable energies are subject to natural fluctuations. Energy storage can help to buffer the resulting time shifts between production and demand. Subsurface geological structures provide large potential capacities for energy stored in the form of heat or gas on daily to seasonal time scales. In order to explore this potential sustainably, the possible induced effects of energy storage operations have to be quantified for both specified normal operation and events of failure. The ANGUS+ project therefore integrates experimental laboratory studies with numerical approaches to assess subsurface energy storage scenarios and monitoring methods. Subsurface storage options for gas, i.e. hydrogen, synthetic methane and compressed air in salt caverns or porous structures, as well as subsurface heat storage are investigated with respect to site prerequisites, storage dimensions, induced effects, monitoring methods and integration into spatial planning schemes. The conceptual interdisciplinary approach of the ANGUS+ project towards the integration of subsurface energy storage into a sustainable subsurface planning scheme is presented here, and this approach is then demonstrated using the examples of two selected energy storage options: Firstly, the option of seasonal heat storage in a shallow aquifer is presented. Coupled thermal and hydraulic processes induced by periodic heat injection and extraction were simulated in the open-source numerical modelling package OpenGeoSys. Situations of specified normal operation as well as cases of failure in operational storage with leaking heat transfer fluid are considered. Bench-scale experiments provided parameterisations of temperature dependent changes in shallow groundwater hydrogeochemistry. As a

  19. Performance-based assessment of reconstructed images

    SciTech Connect

    Hanson, Kenneth

    2009-01-01

    During the early 90s, I engaged in a productive and enjoyable collaboration with Robert Wagner and his colleague, Kyle Myers. We explored the ramifications of the principle that the quality of an image should be assessed on the basis of how well it facilitates the performance of appropriate visual tasks. We applied this principle to algorithms used to reconstruct scenes from incomplete and/or noisy projection data. For binary visual tasks, we used both the conventional disk detection and a new, challenging task, inspired by the Rayleigh resolution criterion, of deciding whether an object was a blurred version of two dots or a bar. The results of human and machine observer tests were summarized with the detectability index based on the area under the ROC curve. We investigated a variety of reconstruction algorithms, including ART, with and without a nonnegativity constraint, and the MEMSYS3 algorithm. We concluded that performance on the Rayleigh task was optimized when the strength of the prior was near MEMSYS's default 'classic' value for both human and machine observers. A notable result was that the most-often-used metric of rms error in the reconstruction was not necessarily indicative of the value of a reconstructed image for the purpose of performing visual tasks.
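A standard way to convert the area under the ROC curve into a detectability index is d_A = √2·Φ⁻¹(AUC): the separation, in pooled-standard-deviation units, of two equal-variance Gaussians whose ROC has that area. This is a common formulation assumed here for illustration, not a detail taken from the paper; the observer scores below are likewise hypothetical:

```python
from math import sqrt
from statistics import NormalDist

def detectability_from_auc(auc):
    """Detectability index d_A = sqrt(2) * Phi^{-1}(AUC), i.e. the mean
    separation of two equal-variance Gaussians whose ROC area is `auc`."""
    return sqrt(2.0) * NormalDist().inv_cdf(auc)

def auc_from_scores(signal, noise):
    """Empirical AUC: the probability that a signal-trial score exceeds a
    noise-trial score (ties count 1/2) -- the Wilcoxon/Mann-Whitney form."""
    pairs = [(s > n) + 0.5 * (s == n) for s in signal for n in noise]
    return sum(pairs) / len(pairs)

# Chance performance (AUC = 0.5) maps to d_A = 0.
assert detectability_from_auc(0.5) == 0.0

# Hypothetical observer scores on signal-present vs signal-absent trials:
auc = auc_from_scores(signal=[2.1, 1.4, 3.0, 2.2], noise=[0.9, 1.5, 0.7, 1.0])
d_a = detectability_from_auc(auc)
```

Summarizing both human and machine observers with the same AUC-derived index is what makes their task performance directly comparable across reconstruction algorithms.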

  20. Medical students’ satisfaction with the Applied Basic Clinical Seminar with Scenarios for Students, a novel simulation-based learning method in Greece

    PubMed Central

    2016-01-01

    Purpose: The integration of simulation-based learning (SBL) methods holds promise for improving the medical education system in Greece. The Applied Basic Clinical Seminar with Scenarios for Students (ABCS3) is a novel two-day SBL course that was designed by the Scientific Society of Hellenic Medical Students. The ABCS3 targeted undergraduate medical students and consisted of three core components: the case-based lectures, the ABCDE hands-on station, and the simulation-based clinical scenarios. The purpose of this study was to evaluate the general educational environment of the course, as well as the skills and knowledge acquired by the participants. Methods: Two sets of questions were distributed to the participants: the Dundee Ready Educational Environment Measure (DREEM) questionnaire and an internally designed feedback questionnaire (InEv). A multiple-choice question (MCQ) examination was also distributed prior to the course and following its completion. A total of 176 participants answered the DREEM questionnaire, 56 answered the InEv, and 60 completed the MCQ examination. Results: The overall DREEM score was 144.61 (±28.05) out of 200. Delegates who participated in both the case-based lectures and the interactive scenarios core components scored higher than those who only completed the case-based lecture session (P=0.038). The mean overall feedback score was 4.12 (±0.56) out of 5. Students scored significantly higher on the post-test than on the pre-test (P<0.001). Conclusion: The ABCS3 was found to be an effective SBL program, as medical students reported positive opinions about their experiences and exhibited improvements in their clinical knowledge and skills. PMID:27012313

  1. TAP 2, Performance-Based Training Manual

    SciTech Connect

    Not Available

    1991-07-01

    Training programs at DOE nuclear facilities should provide well-trained, qualified personnel to safely and efficiently operate the facilities in accordance with DOE requirements. A need has been identified for guidance regarding analysis, design, development, implementation, and evaluation of consistent and reliable performance-based training programs. Accreditation of training programs at Category A reactors and high-hazard and selected moderate-hazard nonreactor facilities will assure consistent, appropriate, and cost-effective training of personnel responsible for the operation, maintenance, and technical support of these facilities. Training programs that are designed systematically, based on job requirements rather than on subjective estimates of trainee needs, yield training activities that are consistent and that develop or improve knowledge, skills, and abilities directly related to the work setting. Because the training is job-related, the content of these programs more efficiently and effectively meets the needs of the employee. Besides a better-trained workforce, a greater level of operational reactor safety can be realized. This manual is intended to provide an overview of the accreditation process and a brief description of the elements necessary to construct and maintain training programs that are based on the requirements of the job. Two companion manuals provide additional information to assist contractors in their efforts to accredit training programs.

  2. Scenario-based tsunami risk assessment using a static flooding approach and high-resolution digital elevation data: An example from Muscat in Oman

    NASA Astrophysics Data System (ADS)

    Schneider, Bastian; Hoffmann, Gösta; Reicherter, Klaus

    2016-04-01

    Knowledge of tsunami risk and vulnerability is essential to establish a well-adapted Multi Hazard Early Warning System, land-use planning and emergency management. As the tsunami risk for the coastline of Oman is still under discussion and remains enigmatic, various scenarios based on historical tsunamis were created, and the suggested inundation and run-up heights were projected onto the modern infrastructural setting of the Muscat Capital Area. Possible impacts of the worst-case tsunami event for Muscat are also discussed. The established Papathoma Tsunami Vulnerability Assessment model was used to model the structural vulnerability of the infrastructure for a 2 m tsunami scenario, representing the 1945 tsunami, and for a 5 m tsunami in Muscat. Considering structural vulnerability, the results suggest a minor tsunami risk for the 2 m tsunami scenario, as the flooding is mainly confined to beaches and wadis. Traditional brick buildings, still predominant in numerous rural suburbs, and a predominantly coast-parallel road network in particular lead to an increased tsunami risk. In contrast, the 5 m tsunami scenario reveals extensively inundated areas, with up to 48% of the buildings flooded, and consequently a significantly higher tsunami risk. We expect up to 60,000 damaged buildings and up to 380,000 residents directly affected in the Muscat Capital Area, accompanied by a significant loss of life and damage to vital infrastructure. The rapid urbanization processes in the Muscat Capital Area, predominantly in areas along the coast, in combination with infrastructural, demographic and economic growth, will further increase the tsunami risk and therefore emphasize the importance of tsunami risk assessment in Oman.

  3. CSI-Chocolate Science Investigation and the Case of the Recipe Rip-Off: Using an Extended Problem-Based Scenario to Enhance High School Students' Science Engagement

    ERIC Educational Resources Information Center

    Marle, Peter D.; Decker, Lisa; Taylor, Victoria; Fitzpatrick, Kathleen; Khaliqi, David; Owens, Janel E.; Henry, Renee M.

    2014-01-01

    This paper discusses a K-12/university collaboration in which students participated in a four-day scenario-based summer STEM (science, technology, engineering, and mathematics) camp aimed at making difficult scientific concepts salient. This scenario, Jumpstart STEM-CSI: Chocolate Science Investigation (JSCSI), used open- and guided-inquiry…

  4. Seismic performance assessment of base-isolated safety-related nuclear structures

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2010-01-01

    Seismic or base isolation is a proven technology for reducing the effects of earthquake shaking on buildings, bridges and infrastructure. The benefit of base isolation has been presented in terms of reduced accelerations and drifts on superstructure components but never quantified in terms of either a percentage reduction in seismic loss (or percentage increase in safety) or the probability of an unacceptable performance. Herein, we quantify the benefits of base isolation in terms of increased safety (or smaller loss) by comparing the safety of a sample conventional and base-isolated nuclear power plant (NPP) located in the Eastern U.S. Scenario- and time-based assessments are performed using a new methodology. Three base isolation systems are considered, namely, (1) Friction Pendulum™ bearings, (2) lead-rubber bearings and (3) low-damping rubber bearings together with linear viscous dampers. Unacceptable performance is defined by the failure of key secondary systems because these systems represent much of the investment in a new build power plant and ensure the safe operation of the plant. For the scenario-based assessments, the probability of unacceptable performance is computed for an earthquake with a magnitude of 5.3 at a distance 7.5 km from the plant. For the time-based assessments, the annual frequency of unacceptable performance is computed considering all potential earthquakes that may occur. For both assessments, the implementation of base isolation reduces the probability of unacceptable performance by approximately four orders of magnitude for the same NPP superstructure and secondary systems. The increase in NPP construction cost associated with the installation of seismic isolators can be offset by substantially reducing the required seismic strength of secondary components and systems and potentially eliminating the need to seismically qualify many secondary components and systems. © 2010 John Wiley & Sons, Ltd.

  5. Comparison and Computational Performance of Tsunami-HySEA and MOST Models for LANTEX 2013 Scenario: Impact Assessment on Puerto Rico Coasts

    NASA Astrophysics Data System (ADS)

    Macías, Jorge; Mercado, Aurelio; González-Vida, José Manuel; Ortega, Sergio; Castro, Manuel Jesús

    2016-09-01

    Tsunami-HySEA model is used to simulate the Caribbean LANTEX 2013 scenario (LANTEX is the acronym for Large AtlaNtic Tsunami Exercise, which is carried out annually). The numerical simulation of the propagation and inundation phases is performed with a single integrated model but using different mesh resolutions and nested meshes. Special emphasis is placed on assessing the most exposed coastal areas at Puerto Rico affected by this event. Some comparisons with the MOST tsunami model available at the University of Puerto Rico (UPR) are made. Both models compare well for propagating tsunami waves in open sea, producing very similar results. In near-shore shallow waters, Tsunami-HySEA should be compared with the inundation version of MOST, since the propagation version is limited to deeper waters. For inundation, larger differences between model results are observed. Nevertheless, the most striking difference resides in computational time; Tsunami-HySEA is coded using the advantages of GPU architecture, and can produce a 4 h simulation in a 60 arc-sec resolution grid for the whole Caribbean Sea in less than 4 min with a single GPU and as fast as 11 s with 32 GPUs. When details about the inundation must be simulated, a 1 arc-sec (approximately 30 m) inundation resolution mesh covering all of Puerto Rico, an island with dimensions of 160 km east-west and 56 km north-south, is used, and a three-level nested meshes technique implemented. In this case approximately 8 ¾ h of wall clock time is needed for a 2-h simulation in a single GPU (versus more than 2 days for the MOST inundation, running three different parts of the island—West, Center, East—at the same time due to memory limitations in MOST). When domain decomposition techniques are finally implemented by breaking up the computational domain into sub-domains and assigning a GPU to each sub-domain (multi-GPU Tsunami-HySEA version), we show that the wall clock time significantly decreases, allowing high

  6. Scenario analysis for integrated water resources planning and management under uncertainty in the Zayandehrud river basin

    NASA Astrophysics Data System (ADS)

    Safavi, Hamid R.; Golmohammadi, Mohammad H.; Sandoval-Solis, Samuel

    2016-08-01

    The goal of this study is to develop and analyze three scenarios in the Zayandehrud river basin in Iran using a model already built and calibrated by Safavi et al. (2015) that has results for the baseline scenario. Results from the baseline scenario show that water demands will be supplied at the cost of depletion of surface and ground water resources, making this scenario undesirable and unsustainable. Supply Management, Demand Management, and Meta (supply and demand management) scenarios are the selected scenarios in this study. They are developed and incorporated into the Zayandehrud model to assess and evaluate the future status of the basin. Certain strategies will be employed for this purpose to improve and rectify the current management policies. The five performance criteria of time-based and volumetric reliability, resilience, vulnerability, and maximum deficit will be employed in the process of scenario analysis and evaluation. The results obtained from the performance criteria will be aggregated into a 'Water Resources Sustainability Index' to facilitate comparison among the likely trade-offs. Uncertainties arising from historical data, management policies, the rainfall-runoff model, demand priorities, and performance criteria are considered in the proposed conceptual framework and modeled by appropriate approaches. Results show that the Supply Management scenario can improve the demand supply but has no tangible effect on the improvement of the resources in the study region. In this regard, the Demand Management scenario is found to be more effective than the water supply one, although it still remains unacceptable. Results of the Meta scenario indicate that both the supply and demand management scenarios must be applied if the water resources are to be safeguarded against degradation and depletion. In other words, the supply management scenario is necessary but not adequate; rather, it must be coupled to the demand
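The performance criteria named above have standard (Hashimoto-style) definitions: reliability is the fraction of periods with no deficit, resilience the probability that a failure period is followed by recovery, and vulnerability the mean severity of failures; a common aggregation is the geometric-mean index SI = (Rel · Res · (1 − Vul))^(1/3). The exact formulation used in the basin study is not reproduced here, so the sketch below is a generic one over a hypothetical deficit series expressed as a fraction of demand:

```python
def performance_criteria(deficit):
    """Hashimoto-style criteria from a supply-deficit series, where
    deficit[t] is the unmet fraction of demand in period t (0 = met)."""
    n = len(deficit)
    failures = [d > 0 for d in deficit]
    reliability = 1.0 - sum(failures) / n
    # resilience: chance that a failure period is followed by a success
    recoveries = sum(1 for a, b in zip(failures, failures[1:]) if a and not b)
    resilience = recoveries / sum(failures) if any(failures) else 1.0
    # vulnerability: average severity over the failure periods only
    fails = [d for d in deficit if d > 0]
    vulnerability = sum(fails) / len(fails) if fails else 0.0
    return reliability, resilience, vulnerability, max(deficit)

def sustainability_index(rel, res, vul):
    """Geometric-mean aggregation SI = (Rel * Res * (1 - Vul))^(1/3)."""
    return (rel * res * (1.0 - vul)) ** (1.0 / 3.0)

# Hypothetical 10-period deficit series (fractions of demand unmet):
deficits = [0.0, 0.0, 0.3, 0.5, 0.0, 0.0, 0.2, 0.0, 0.0, 0.0]
rel, res, vul, dmax = performance_criteria(deficits)
si = sustainability_index(rel, res, vul)
```

Collapsing the separate criteria into a single bounded index is what allows direct trade-off comparison among the Supply Management, Demand Management, and Meta scenarios.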

  7. High Performance Oxides-Based Thermoelectric Materials

    NASA Astrophysics Data System (ADS)

    Ren, Guangkun; Lan, Jinle; Zeng, Chengcheng; Liu, Yaochun; Zhan, Bin; Butt, Sajid; Lin, Yuan-Hua; Nan, Ce-Wen

    2015-01-01

    Thermoelectric materials have attracted much attention due to their applications in waste-heat recovery, power generation, and solid-state cooling. In comparison with thermoelectric alloys, oxide semiconductors, which are thermally and chemically stable in air at high temperature, are regarded as candidates for high-temperature thermoelectric applications. However, their figure-of-merit ZT value has remained low, around 0.1-0.4, for more than 20 years. The poor performance of oxides is ascribed to their low electrical conductivity and high thermal conductivity. Since the electrical transport properties in these thermoelectric oxides are strongly correlated, it is difficult to improve both the thermoelectric power and the electrical conductivity simultaneously by conventional methods. This review summarizes recent progress on high-performance oxide-based thermoelectric bulk materials, including n-type ZnO, SrTiO3, and In2O3, and p-type Ca3Co4O9, BiCuSeO, and NiO, enhanced by heavy-element doping, band engineering and nanostructuring.
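The figure of merit quoted above is ZT = S²σT/κ, where S is the Seebeck coefficient (thermoelectric power), σ the electrical conductivity, κ the thermal conductivity, and T the absolute temperature; the trade-off described in the text arises because S, σ, and κ are coupled. A one-line check with illustrative, hypothetical oxide-like values (not data from the review):

```python
def figure_of_merit(seebeck, elec_conductivity, thermal_conductivity,
                    temperature):
    """Dimensionless thermoelectric figure of merit ZT = S^2 * sigma * T / kappa.
    Units: S in V/K, sigma in S/m, kappa in W/(m*K), T in K."""
    return seebeck ** 2 * elec_conductivity * temperature / thermal_conductivity

# Hypothetical values: S = 200 uV/K, sigma = 10^4 S/m, kappa = 2 W/(m*K),
# evaluated at 1000 K.
zt = figure_of_merit(seebeck=200e-6, elec_conductivity=1.0e4,
                     thermal_conductivity=2.0, temperature=1000.0)
```

These made-up numbers give ZT = 0.2, inside the 0.1-0.4 range the review cites for oxides; raising σ or lowering κ (e.g. by nanostructuring) without degrading S is exactly the optimization problem the review surveys.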

  8. Projections of Water Stress Based on an Ensemble of Socioeconomic Growth and Climate Change Scenarios: A Case Study in Asia.

    PubMed

    Fant, Charles; Schlosser, C Adam; Gao, Xiang; Strzepek, Kenneth; Reilly, John

    2016-01-01

    The sustainability of future water resources is of paramount importance and is affected by many factors, including population, wealth and climate. Inherent in current methods to estimate these factors in the future is the uncertainty of their prediction. In this study, we integrate a large ensemble of scenarios--internally consistent across economics, emissions, climate, and population--to develop a risk portfolio of water stress over a large portion of Asia that includes China, India, and Mainland Southeast Asia in a future with unconstrained emissions. We isolate the effects of socioeconomic growth from the effects of climate change in order to identify the primary drivers of stress on water resources. We find that water needs related to socioeconomic changes, which are currently small, are likely to increase considerably in the future, often overshadowing the effect of climate change on levels of water stress. As a result, there is a high risk of severe water stress in densely populated watersheds by 2050, compared to recent history. There is strong evidence to suggest that, in the absence of autonomous adaptation or societal response, a much larger portion of the region's population will live in water-stressed regions in the near future. Tools and studies such as these can effectively investigate large-scale system sensitivities and can be useful in engaging and informing decision makers. PMID:27028871

  10. Soil retention of hexavalent chromium released from construction and demolition waste in a road-base-application scenario.

    PubMed

    Butera, Stefania; Trapp, Stefan; Astrup, Thomas F; Christensen, Thomas H

    2015-11-15

    We investigated the retention of Cr(VI) in three subsoils with low organic matter content in laboratory experiments at concentration levels representative of leachates from construction and demolition waste (C&DW) reused as unbound material in road construction. The retention mechanism appeared to be reduction and subsequent precipitation as Cr(III) on the soil. The reduction process was slow, and in several experiments it was still proceeding at the end of the six-month experimental period. The overall retention reaction was well fitted by a second-order rate law governed by the actual Cr(VI) concentration and the reduction capacity of the soil. The experimentally determined reduction capacities and second-order kinetic parameters were used to model, for a 100-year period, the one-dimensional migration of Cr(VI) in the subsoil under a layer of C&DW. The resulting Cr(VI) concentration would be negligible below 7-70 cm depth. However, in frigid climates and with high water infiltration through the road pavement, the reduction reaction could be so slow that Cr(VI) might migrate as deep as 200 cm under the road. The reaction parameters and the model can form the basis for systematically assessing under which scenarios Cr(VI) from C&DW could pose an environmental issue for ground- and receiving surface waters.
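    The second-order kinetics described above can be sketched numerically: the reduction rate depends on both the remaining Cr(VI) concentration and the soil's remaining reduction capacity. All parameter values below are illustrative placeholders, not the fitted values from the study.

```python
# Minimal sketch of second-order retention kinetics:
#   dC/dt = -k * C * S   and   dS/dt = -k * C * S,
# where C is Cr(VI) concentration and S is remaining reduction capacity.
# k, dt and the initial values are purely illustrative.

def simulate_reduction(c0, capacity0, k, dt, steps):
    """Explicit Euler integration of the coupled second-order rate law."""
    c, s = c0, capacity0
    history = [c]
    for _ in range(steps):
        rate = k * c * s
        c = max(c - rate * dt, 0.0)
        s = max(s - rate * dt, 0.0)
        history.append(c)
    return history

# A six-month (180-day) window, mirroring the experimental period above.
conc = simulate_reduction(c0=1.0, capacity0=5.0, k=0.01, dt=1.0, steps=180)
print(f"remaining Cr(VI) fraction after 180 d: {conc[-1]:.4f}")
```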

  12. Evaluation of Precipitation from CMIP5 Models for Western Colorado and Development of a Scenario-Based Method for Regional Climate Change Planning

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Arnott, J. C.; Rood, R. B.

    2015-12-01

    As the latest generation of climate and Earth system models becomes more complex, model output is increasingly sought to project impact-relevant indicators for regional climate change adaptation planning. However, skill barriers remain when using these model data to project changes in precipitation over mountainous areas at regional and subregional scales. Complex topography, localized meteorological phenomena, and other factors are still not well represented by global-scale models, which can limit the representation of key impact criteria needed for planning. We explore limitations and opportunities of using precipitation data from Coupled Model Intercomparison Project Phase 5 (CMIP5) models to provide use-relevant projections of future precipitation conditions in Western Colorado, with a focus on applications relevant to climate information needs in the resort community of Aspen. First, a model skill evaluation is conducted by comparing precipitation and temperature values from a selected CMIP5 model ensemble to observations during a historical period. The comparison is conducted across both temporal and spatial scales, at yearly and seasonal increments. Results indicate that the models are more skillful at representing temperature than precipitation, and that the apparent lack of skill for precipitation warrants caution in the use of such data in climate impact assessments that inform adaptation planning and preparedness decision making. In light of the model evaluation, the authors introduce a scenario-based method in which individual models within the CMIP5 ensemble are organized into plausible qualitative futures and individual model runs are selected as representative scenarios to which detailed analysis can then be applied. The results from the scenario-based method are viewed as useful for exploring regional climate futures in instances when it is not appropriate to use data directly from global-scale climate models.
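    The skill evaluation step described above amounts to comparing model output with observations via summary error statistics. A minimal sketch with made-up data, assuming bias and RMSE as the skill metrics (the study may use others):

```python
# Bias and RMSE of modelled values against observations over a historical
# period. The data arrays below are hypothetical placeholders.
import math

def bias(model, obs):
    """Mean error of model relative to observations."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root-mean-square error of model relative to observations."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

obs_temp = [10.2, 11.0, 9.8, 10.5]     # deg C, hypothetical seasonal means
mod_temp = [10.0, 11.3, 9.9, 10.4]
obs_pr = [320.0, 280.0, 410.0, 350.0]  # mm/season, hypothetical
mod_pr = [250.0, 360.0, 300.0, 430.0]

# Temperature errors are small relative to the signal; precipitation errors
# are large, mirroring the finding that the models are more skillful for
# temperature than for precipitation.
print(f"temp bias={bias(mod_temp, obs_temp):+.2f}  rmse={rmse(mod_temp, obs_temp):.2f}")
print(f"prcp bias={bias(mod_pr, obs_pr):+.1f}  rmse={rmse(mod_pr, obs_pr):.1f}")
```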

  13. Scenario-based numerical modelling and the palaeo-historic record of tsunamis in Wallis and Futuna, Southwest Pacific

    NASA Astrophysics Data System (ADS)

    Lamarche, G.; Popinet, S.; Pelletier, B.; Mountjoy, J.; Goff, J.; Delaux, S.; Bind, J.

    2015-08-01

    We investigated the tsunami hazard in the remote French territory of Wallis and Futuna, Southwest Pacific, using the Gerris flow solver to produce numerical models of tsunami generation, propagation and inundation. Wallis consists of the inhabited volcanic island of Uvéa that is surrounded by a lagoon delimited by a barrier reef. Futuna and the island of Alofi form the Horn Archipelago located ca. 240 km east of Wallis. They are surrounded by a narrow fringing reef. Futuna and Alofi emerge from the North Fiji Transform Fault that marks the seismically active Pacific-Australia plate boundary. We generated 15 tsunami scenarios. For each, we calculated maximum wave elevation (MWE), inundation distance and expected time of arrival (ETA). The tsunami sources were local, regional and distant earthquake faults located along the Pacific Rim. In Wallis, the outer reef may experience 6.8 m-high MWE. Uvéa is protected by the barrier reef and the lagoon, but inundation depths of 2-3 m occur in several coastal areas. In Futuna, flow depths exceeding 2 m are modelled in several populated areas, and have been confirmed by a post-September 2009 South Pacific tsunami survey. The channel between the islands of Futuna and Alofi amplified the 2009 tsunami, which resulted in an inundation distance of almost 100 m and an MWE of 4.4 m. This first-ever tsunami hazard modelling study of Wallis and Futuna compares well with palaeotsunamis recognised on both islands and with observations of the impact of the 2009 South Pacific tsunami. The study provides evidence for the mitigating effect of barrier and fringing reefs against tsunamis.
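    A back-of-envelope check on expected times of arrival (ETA) in studies like this uses the long-wave (shallow-water) speed sqrt(g*h). The distance and mean ocean depth below are illustrative, not those of the modelled sources.

```python
# Tsunamis propagate as shallow-water waves with phase speed sqrt(g * h),
# where h is the water depth. Distance and depth here are illustrative.
import math

G = 9.81  # gravitational acceleration, m/s^2

def travel_time_hours(distance_km, mean_depth_m):
    """Rough tsunami travel time assuming a constant mean ocean depth."""
    speed_ms = math.sqrt(G * mean_depth_m)  # long-wave phase speed
    return (distance_km * 1000.0) / speed_ms / 3600.0

# A regional source ~240 km away over ~3000 m deep ocean arrives in well
# under an hour, which is why local warning time is so short.
print(f"ETA ~ {travel_time_hours(240.0, 3000.0):.2f} h")
```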

  15. Limits on the significant mass-loss scenario based on the globular clusters of the Fornax dwarf spheroidal galaxy

    NASA Astrophysics Data System (ADS)

    Khalaj, P.; Baumgardt, H.

    2016-03-01

    Many of the scenarios proposed to explain the origin of chemically peculiar stars in globular clusters (GCs) require significant mass loss (≥95 per cent) to explain the observed fraction of such stars. For the GCs of the Fornax dwarf galaxy, however, such significant mass loss poses a problem. Larsen et al. showed that there is a large ratio of GCs to metal-poor field stars in Fornax and that about 20-25 per cent of all the stars with [Fe/H] < -2 belong to the four metal-poor GCs. This imposes an upper limit of ~80 per cent on the mass loss that could have happened in Fornax GCs. In this paper, we propose a solution to this problem by suggesting that stars can leave the Fornax galaxy. We use a series of N-body simulations to determine the limit of mass loss from Fornax as a function of the initial orbital radii of GCs and the speed with which stars leave Fornax GCs. We consider a set of cored and cuspy density profiles for Fornax. Our results show that with a cuspy model for Fornax, the fraction of stars that leave the galaxy can be as high as ~90 per cent, when the initial orbital radii of GCs are R = 2-3 kpc and the initial speed of stars is v > 20 km s⁻¹. We show that such large velocities can be achieved by mass loss induced by gas expulsion but not mass loss induced by stellar evolution. Our results imply that one cannot interpret the metallicity distribution of Fornax field stars as evidence against significant mass loss in Fornax GCs, if mass loss is due to gas expulsion.
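    The escape-speed argument above can be sketched with a simple potential. Here a Plummer sphere stands in for a cored Fornax profile, and the mass and scale radius are illustrative round numbers rather than the paper's fitted values.

```python
# Stars ejected from a GC at radius r can leave the host galaxy if their
# speed exceeds the local escape speed v_esc = sqrt(-2 * phi(r)).
# Plummer potential: phi(r) = -G * M / sqrt(r^2 + a^2).
# The Fornax-like mass (1e8 Msun) and scale radius (1 kpc) are assumptions.
import math

G = 4.301e-6  # gravitational constant in kpc (km/s)^2 / Msun

def plummer_escape_speed(r_kpc, mass_msun, scale_kpc):
    """Escape speed from a Plummer sphere at radius r."""
    phi = -G * mass_msun / math.sqrt(r_kpc**2 + scale_kpc**2)
    return math.sqrt(-2.0 * phi)

for r in (2.0, 3.0):
    v_esc = plummer_escape_speed(r, mass_msun=1.0e8, scale_kpc=1.0)
    print(f"r = {r:.0f} kpc: v_esc ~ {v_esc:.0f} km/s")
# With these round numbers v_esc is of order 20 km/s at r = 2-3 kpc, so
# stars leaving the GCs faster than ~20 km/s can escape, as in the scenario.
```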

  16. Scenario Development for the Southwestern United States

    NASA Astrophysics Data System (ADS)

    Mahmoud, M.; Gupta, H.; Stewart, S.; Liu, Y.; Hartmann, H.; Wagener, T.

    2006-12-01

    The primary goal of employing a scenario development approach for the U.S. southwest is to inform regional policy by examining future possibilities related to regional vegetation change, water-leasing, and riparian restoration. This approach is necessary because no existing scenario effort explicitly addresses water resources across the entire southwest region. A formal approach for scenario development is adopted and applied to water resources issues within the arid and semi-arid regions of the U.S. southwest, following five progressive and reiterative phases: scenario definition, scenario construction, scenario analysis, scenario assessment, and risk management. In the scenario definition phase, the inputs of scientists, modelers, and stakeholders were collected in order to define and construct scenarios relevant to the southwest and its water sustainability needs. From stakeholder-driven scenario workshops and breakout sessions, the three main axes of principal change were identified as climate change, population development patterns, and the quality of information monitoring technology. Based on the extreme and varying conditions of these three axes, eight scenario narratives were drafted to describe the state of each scenario's respective future and the events that led to it. Events and situations are described within each narrative with respect to key variables: variables that are both important to regional water resources (as identified by scientists and modelers) and good indicators for tracking and monitoring change. The current phase consists of scenario construction, in which the drafted scenarios are presented again to regional scientists and modelers to verify that the proper key variables are included in (or excluded from) the eight narratives. 
The next step is to construct the data sets necessary to implement the eight scenarios on the respective computational models of modelers investigating vegetation change, water-leasing, and riparian
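    The eight narratives above correspond to the 2^3 combinations of extreme conditions on the three principal axes of change. A minimal sketch; the axis end-point labels are invented for illustration.

```python
# Eight scenario skeletons from the Cartesian product of two extremes on
# each of three axes. The end-point labels are hypothetical placeholders.
from itertools import product

axes = {
    "climate change": ("wetter", "drier"),
    "population development": ("compact", "sprawling"),
    "information monitoring": ("high quality", "low quality"),
}

scenarios = [dict(zip(axes, combo)) for combo in product(*axes.values())]
print(len(scenarios), "scenario skeletons")
```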

  17. The Effects of Performance-Based Assessment Criteria on Student Performance and Self-Assessment Skills

    ERIC Educational Resources Information Center

    Fastre, Greet Mia Jos; van der Klink, Marcel R.; van Merrienboer, Jeroen J. G.

    2010-01-01

    This study investigated the effect of performance-based versus competence-based assessment criteria on task performance and self-assessment skills among 39 novice secondary vocational education students in the domain of nursing and care. In a performance-based assessment group students are provided with a preset list of performance-based…

  18. Fuzzy logic based sensor performance evaluation of vehicle mounted metal detector systems

    NASA Astrophysics Data System (ADS)

    Abeynayake, Canicious; Tran, Minh D.

    2015-05-01

    Vehicle Mounted Metal Detector (VMMD) systems are widely used for detection of threat objects in humanitarian demining and military route clearance scenarios. Due to the diverse nature of such operational conditions, operational use of VMMD without a proper understanding of its capability boundaries may lead to heavy casualties. Multi-criteria fitness evaluations are crucial for determining the capability boundaries of any sensor-based demining equipment. Evaluation of sensor-based military equipment is a multi-disciplinary topic combining the efforts of researchers, operators, managers and commanders with different professional backgrounds and knowledge profiles. Information acquired through field tests usually involves uncertainty, vagueness and imprecision due to variations in test and evaluation conditions during a single test or series of tests. This report presents a fuzzy logic based methodology for experimental data analysis and performance evaluation of VMMD. This data evaluation methodology has been developed to evaluate sensor performance by consolidating expert knowledge with experimental data. A case study is presented by implementing the proposed data analysis framework in a VMMD evaluation scenario. The results of this analysis confirm the accuracy, practicability and reliability of the fuzzy logic based sensor performance evaluation framework.
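    The fuzzy evaluation approach described above can be sketched as fuzzifying crisp field measurements with membership functions and combining them with simple min/max rules. The variables, membership breakpoints and the single rule below are hypothetical, not the authors' actual framework.

```python
# Minimal fuzzy-logic fitness score: crisp measurements are mapped to
# membership degrees by triangular functions, and rules combine them
# (min for AND). All breakpoints and the rule are invented for illustration.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fitness(detection_rate, false_alarms_per_km):
    high_det = tri(detection_rate, 0.6, 1.0, 1.4)      # "high" detection
    low_fa = tri(false_alarms_per_km, -2.0, 0.0, 2.0)  # "low" false alarms
    # Rule: fitness is good to the degree that detection is high AND
    # false alarms are low.
    return min(high_det, low_fa)

print(f"fitness = {fitness(0.92, 0.5):.2f}")
```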

  19. The need for and use of socio-economic scenarios for climate change analysis: A new approach based on shared socio-economic pathways

    SciTech Connect

    Kriegler, Elmar; O'Neill, Brian; Hallegatte, Stephane; Kram, Tom; Lempert, Rob; Moss, Richard H.; Wilbanks, Thomas

    2012-10-01

    A new set of socioeconomic scenarios (Shared Socioeconomic Pathways) is described that provides global narratives and socioeconomic pathways to pair with climate model scenarios developed using the new Representative Concentration Pathways.

  20. A Global Scale Scenario for Prebiotic Chemistry: Silica-Based Self-Assembled Mineral Structures and Formamide.

    PubMed

    Saladino, Raffaele; Botta, Giorgia; Bizzarri, Bruno Mattia; Di Mauro, Ernesto; Garcia Ruiz, Juan Manuel

    2016-05-17

    clearly specific, demonstrating that the mineral self-assembled membranes at the same time create space compartmentalization and selective catalysis of the synthesis of relevant compounds. Rather than requiring odd local conditions, the prebiotic organic chemistry scenario for the origin of life appears to be common at a universal scale and, most probably, earlier than ever thought for our planet.

  1. A Global Scale Scenario for Prebiotic Chemistry: Silica-Based Self-Assembled Mineral Structures and Formamide

    PubMed Central

    2016-01-01

    membrane are clearly specific, demonstrating that the mineral self-assembled membranes at the same time create space compartmentalization and selective catalysis of the synthesis of relevant compounds. Rather than requiring odd local conditions, the prebiotic organic chemistry scenario for the origin of life appears to be common at a universal scale and, most probably, earlier than ever thought for our planet. PMID:27115539

  3. Medical Scenarios Relevant to Spaceflight

    NASA Technical Reports Server (NTRS)

    Bacal, Kira; Hurs, Victor; Doerr, Harold

    2004-01-01

    The Medical Operational Support Team (MOST) was tasked by the JSC Space Medicine and Life Sciences Directorate (SLSD) to incorporate medical simulation into 1) medical training for astronaut-crew medical officers (CMO) and medical flight control teams and 2) evaluations of procedures and resources required for medical care aboard the International Space Station (ISS). Development of evidence-based medical scenarios that mimic the physiology observed during spaceflight will be needed for the MOST to complete these two tasks. The MOST used a human patient simulator, the ISS-like resources in the Medical Simulation Laboratory (MSL), and evidence from space operations, military operations and medical literature to develop space-relevant medical scenarios. These scenarios include conditions concerning airway management, Advanced Cardiac Life Support (ACLS) and mitigating anaphylactic symptoms. The MOST has used these space-relevant medical scenarios to develop a preliminary space medical training regimen for NASA flight surgeons, Biomedical Flight Controllers (Biomedical Engineers; BME) and CMO-analogs. This regimen is conducted by the MOST in the MSL. The MOST has the capability to develop evidence-based space-relevant medical scenarios that can help SLSD 1) demonstrate the proficiency of medical flight control teams to mitigate space-relevant medical events and 2) validate next-generation medical equipment and procedures for space medicine applications.

  4. Scenario-based assessment of buildings' damage and population exposure due to earthquake-induced tsunamis for the town of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Pagnoni, G.; Armigliato, A.; Tinti, S.

    2015-12-01

    Alexandria is the second-largest city in Egypt by population, is a key economic area in northern Africa, and has very important tourist activity. Historical records indicate that it was severely affected by a number of tsunami events. In this work we assess the tsunami hazard by running numerical simulations of tsunami impact in Alexandria through the worst-case credible tsunami scenario analysis (WCTSA). We identify three main seismic sources: the western Hellenic Arc (WHA - reference event AD 365, Mw = 8.5), the eastern Hellenic Arc (EHA - reference event 1303, Mw = 8.0) and the Cyprus Arc (CA - hypothetical scenario earthquake with Mw = 8.0), inferred from the tectonic setting and from historical tsunami catalogues. All numerical simulations are carried out in two sea level conditions (mean sea level and maximum high-tide sea level) by means of the code UBO-TSUFD, developed and maintained by the Tsunami Research Team of the University of Bologna. Relevant tsunami metrics are computed for each scenario and then used to build aggregated fields such as the maximum flood depth and the maximum inundation area. We find that the case that produces the most relevant flooding in Alexandria is the EHA scenario, with wave heights up to 4 m. The aggregate fields are used for a building vulnerability assessment according to a methodology developed in the framework of the EU-FP6 project SCHEMA and further refined in this study, based on the adoption of a suitable building damage matrix and on water inundation depth. It is found that in the districts of El Dekhila and Al Amriyah, to the south-west of the port of Dekhila, over 12 000 (13 400 in the case of maximum high tide) buildings could be affected and hundreds of them could sustain damage ranging from critical damage to total collapse. 
It is also found that in the same districts tsunami inundation covers an area of about 15 km², resulting in more than 150 000 (165 000 in the case of maximum high
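    The SCHEMA-style damage assessment described above rests on a building damage matrix that maps inundation depth to a damage level per building class. A minimal sketch with invented depth thresholds; the project's calibrated matrix differs.

```python
# Depth-based damage matrix: inundation depth thresholds (m) separate
# damage levels D0 (no damage) through D4 (total collapse). The building
# classes and thresholds below are hypothetical placeholders.
import bisect

DAMAGE_THRESHOLDS = {
    "masonry": [0.5, 1.5, 3.0, 5.0],
    "reinforced_concrete": [1.0, 2.5, 4.5, 7.0],
}

def damage_level(building_class, flow_depth_m):
    """Return the damage level (D0..D4) for a class at a given flow depth."""
    thresholds = DAMAGE_THRESHOLDS[building_class]
    return f"D{bisect.bisect_right(thresholds, flow_depth_m)}"

# A 4 m flood (the order of magnitude of the EHA scenario above) affects
# the two building classes differently:
print(damage_level("masonry", 4.0), damage_level("reinforced_concrete", 4.0))
```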

  5. An integrated exposure assessment of phthalates for the general population in China based on both exposure scenario and biomonitoring estimation approaches.

    PubMed

    Cao, Yan; Liu, Jianguo; Liu, Yang; Wang, Jie; Hao, Xuewen

    2016-02-01

    Available studies providing a representative, integrated exposure assessment of phthalates for the general population in China are lacking. Based on an exhaustive review of the extensive monitoring data available for China, this study presents a large-scale estimation of exposure levels to three typical phthalates, di(2-ethylhexyl) phthalate (DEHP), di-n-butyl phthalate (DBP) and diisobutyl phthalate (DiBP), by applying both exposure scenario and biomonitoring estimation approaches. The respective median exposure levels from the exposure scenario and biomonitoring estimation approaches were 3.80, 3.02 and 1.00 μg/kg bw/day and 3.38, 3.21 and 3.32 μg/kg bw/day for DEHP, DBP and DiBP, which are acceptable levels of exposure with respect to current international guidelines. Evaluation results from the two approaches showed both similarities and differences among the different phthalates, making the exposure assessment comparable and more comprehensive. In terms of sources of exposure, food intake was the largest contributor, while indoor air exposure contributed more to the estimated daily intakes (EDIs) of DiBP than to those of the other phthalates. Moreover, more attention should be paid to the higher exposure levels of phthalates in several intensively industrialized and urbanized areas, and the causes of the different exposure levels in the different regions need to be further explored.
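    The two estimation routes compared above can be sketched as follows: the exposure-scenario route sums concentration times intake over exposure media, while the biomonitoring route back-calculates intake from urinary metabolite excretion. All numbers below are invented for a hypothetical DEHP-like case; the molecular weights and urinary excretion fraction are assumptions for illustration.

```python
# Two routes to an estimated daily intake (EDI, ug/kg bw/day).
# Scenario route: sum of (concentration x intake rate) over media.
# Biomonitoring route: back-calculation from urinary metabolite excretion,
# correcting for the excretion fraction and the metabolite/parent
# molecular-weight ratio. All inputs below are illustrative.

def edi_scenario(media, body_weight_kg):
    """EDI from per-medium (concentration, intake rate) pairs."""
    return sum(conc * intake for conc, intake in media) / body_weight_kg

def edi_biomonitoring(urinary_ug_per_day, excretion_fraction, mw_parent,
                      mw_metabolite, body_weight_kg):
    """EDI back-calculated from daily urinary metabolite excretion."""
    parent_ug = urinary_ug_per_day / excretion_fraction * (mw_parent / mw_metabolite)
    return parent_ug / body_weight_kg

# Hypothetical DEHP-like example for a 60 kg adult.
media = [(0.15, 1500.0),     # food: ug/g x g/day
         (0.0005, 15000.0)]  # indoor air: ug/L x L/day
print(f"scenario EDI: {edi_scenario(media, 60.0):.2f} ug/kg bw/day")
print(f"biomonitoring EDI: "
      f"{edi_biomonitoring(120.0, 0.6, 390.6, 278.3, 60.0):.2f} ug/kg bw/day")
```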

  6. 3-D or median map? Earthquake scenario ground-motion maps from physics-based models versus maps from ground-motion prediction equations

    NASA Astrophysics Data System (ADS)

    Porter, K.

    2015-12-01

    There are two common ways to create a ground-motion map for a hypothetical earthquake: using ground motion prediction equations (by far the more common of the two) and using 3-D physics-based modeling. The former is very familiar to engineers, the latter much less so, and the difference can present a problem because engineers tend to trust the familiar and distrust novelty. Maps for essentially the same hypothetical earthquake using the two different methods can look very different, while appearing to present the same information. Using one or the other can lead an engineer or disaster planner to very different estimates of damage and risk. The reasons have to do with depiction of variability, spatial correlation of shaking, the skewed distribution of real-world shaking, and the upward-curving relationship between shaking and damage. The scientists who develop the two kinds of map tend to specialize in one or the other and seem to defend their turf, which can aggravate the problem of clearly communicating with engineers. The USGS Science Application for Risk Reduction's (SAFRR) HayWired scenario has addressed the challenge of explaining to engineers the differences between the two maps, and why, in a disaster planning scenario, one might want to use the less-familiar 3-D map.
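    The skewed-distribution argument above can be made concrete: if shaking is lognormally distributed and damage curves upward with shaking, then damage evaluated at the median shaking understates the expected damage (Jensen's inequality). The parameters and the damage relation below are purely illustrative.

```python
# Median-map vs. expectation-over-realizations: a convex damage function
# applied to lognormally distributed shaking. All numbers are illustrative.
import math
import random

random.seed(1)
median_pga, beta = 0.3, 0.6          # median shaking (g), lognormal sigma
damage = lambda pga: pga ** 2.0      # hypothetical convex damage relation

shaking = [median_pga * math.exp(random.gauss(0.0, beta)) for _ in range(100000)]
mean_damage = sum(damage(s) for s in shaking) / len(shaking)
print(f"damage at median shaking:          {damage(median_pga):.3f}")
print(f"expected damage over realizations: {mean_damage:.3f}")
# The expectation is higher: the convex damage function weights the skewed
# upper tail of the shaking distribution more heavily than the median does.
```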

  7. A Behavior-Based Employee Performance System.

    ERIC Educational Resources Information Center

    Abernathy, William B.

    2003-01-01

    Discusses human performance technology models for describing and understanding factors involved in day-to-day functioning of employees and then to develop specific remedial interventions as needed, and contrasts it to an organizational performance system perspective used to design an organization before employees are even hired to prevent bad…

  8. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Bases for performance... REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments. Performance-based payments may be made on any of the following bases:...

  9. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Bases for performance... REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments. Performance-based payments may be made on any of the following bases:...

  10. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Bases for performance... REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments. Performance-based payments may be made on any of the following bases:...

  11. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Bases for performance... REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments. Performance-based payments may be made on any of the following bases:...

  12. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Bases for performance... REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments. Performance-based payments may be made on any of the following bases:...

  13. Human health screening level risk assessments of tertiary-butyl acetate (TBAC): calculated acute and chronic reference concentration (RfC) and Hazard Quotient (HQ) values based on toxicity and exposure scenario evaluations.

    PubMed

    Bus, James S; Banton, Marcy I; Faber, Willem D; Kirman, Christopher R; McGregor, Douglas B; Pourreau, Daniel B

    2015-02-01

    A screening level risk assessment has been performed for tertiary-butyl acetate (TBAC) examining its primary uses as a solvent in industrial and consumer products. Hazard quotients (HQ) were developed by merging TBAC animal toxicity and dose-response data with population-level, occupational and consumer exposure scenarios. TBAC has a low order of toxicity following subchronic inhalation exposure, and neurobehavioral changes (hyperactivity) in mice observed immediately after termination of exposure were used as conservative endpoints for derivation of acute and chronic reference concentration (RfC) values. TBAC is not genotoxic but has not been tested for carcinogenicity. However, TBAC is unlikely to be a human carcinogen in that its non-genotoxic metabolic surrogates tertiary-butanol (TBA) and methyl tertiary butyl ether (MTBE) produce only male rat α-2u-globulin-mediated kidney cancer and high-dose specific mouse thyroid tumors, both of which have little qualitative or quantitative relevance to humans. Benchmark dose (BMD)-modeling of the neurobehavioral responses yielded acute and chronic RfC values of 1.5 ppm and 0.3 ppm, respectively. After conservative modeling of general population and near-source occupational and consumer product exposure scenarios, almost all HQs were substantially less than 1. HQs exceeding 1 were limited to consumer use of automotive products and paints in a poorly ventilated garage-sized room (HQ = 313) and occupational exposures in small and large brake shops using no personal protective equipment or ventilation controls (HQs = 3.4-126.6). The screening level risk assessments confirm low human health concerns with most uses of TBAC and indicate that further data-informed refinements can address problematic health/exposure scenarios. The assessments also illustrate how tier-based risk assessments using read-across toxicity information to metabolic surrogates reduce the need for comprehensive animal testing.
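    The hazard-quotient arithmetic used above is simply the ratio of an estimated exposure concentration to the reference concentration, with HQ > 1 flagging a scenario for refinement. The RfC values follow the abstract; the exposure concentrations below are invented for illustration (the garage value is back-calculated from the reported HQ of 313).

```python
# Hazard quotient: HQ = exposure concentration / reference concentration.
# RfC values are taken from the abstract above; the scenario exposure
# concentrations are hypothetical illustrations.

ACUTE_RFC_PPM = 1.5
CHRONIC_RFC_PPM = 0.3

def hazard_quotient(exposure_ppm, rfc_ppm):
    return exposure_ppm / rfc_ppm

scenarios = {
    "general population (chronic)": (0.003, CHRONIC_RFC_PPM),
    "poorly ventilated garage (acute)": (470.0, ACUTE_RFC_PPM),
}
for name, (exposure, rfc) in scenarios.items():
    hq = hazard_quotient(exposure, rfc)
    flag = "refine" if hq > 1 else "ok"
    print(f"{name}: HQ = {hq:g} ({flag})")
```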

  14. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    ERIC Educational Resources Information Center

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and engineers. The gameplay has two parts: a planning part,…

  15. Context Impact of Clinical Scenario on Knowledge Transfer and Reasoning Capacity in a Medical Problem-Based Learning Curriculum

    ERIC Educational Resources Information Center

    Collard, A.; Brédart, S.; Bourguignon, J.-P.

    2016-01-01

    Since 2000, the faculty of Medicine at the University of Liège has integrated problem-based learning (PBL) seminars from year two to seven in its seven-year curriculum. The PBL approach has been developed to facilitate students' acquisition of reasoning capacity. This contextualized learning raises the question of the de- and re-contextualization…

  16. The Experimental Effects of the Strategic Adolescent Reading Intervention (STARI) on a Scenarios-Based Reading Comprehension Assessment

    ERIC Educational Resources Information Center

    Kim, James; Hemphill, Lowry; Troyer, Margaret; Jones, Stephanie; LaRusso, Maria; Kim, Ha-Yeon; Donovan, Suzanne; Snow, Catherine

    2016-01-01

    Nearly one-quarter of U.S. eighth graders score below basic on national assessments of reading (NCES, 2013) and are poorly equipped for the reading demands of secondary school. Struggling adolescent readers cannot summarize a simple passage or use context to determine word meanings, and they have difficulty making text-based inferences. In addition,…

  17. Pre-Service Teachers' Perceptions on Game Based Learning Scenarios in Primary Reading and Writing Instruction Courses

    ERIC Educational Resources Information Center

    Karadag, Ruhan

    2015-01-01

    The aim of this study was to explore pre-service teachers' perceptions on the use of game-based learning in a Primary Reading and Writing Instruction Course. A mixed method research was used in the study. Participants were composed of a total of 189 pre-service teachers taking the Primary Reading and Writing Instruction course during the fall term…

  18. Coffee Beverage Quality Assessment Based on ETA/CPTEC-HadCM3 Model (A1B-IPCC/SRES Scenario), Southeastern Brazil

    NASA Astrophysics Data System (ADS)

    Giarolla, A.; Resende, N.; Chou, S. C.; Tavares, P. S.; Rodrigues, D. C.

    2012-04-01

    Environmental factors influence coffee beverage quality, and air temperature plays a significant role in this process. Grain maturation occurs very quickly in regions with high temperatures, and sometimes there is not enough time to complete this phase adequately. On the other hand, with mild temperatures grain maturation occurs more slowly, which promotes a better quality beverage. The aim of this study was to assess coffee beverage quality in southeastern Brazil based on climate projections using the Eta-CPTEC regional model driven by four members of an ensemble of the Met Office Hadley Centre global coupled climate model (HadCM3). The global model ensemble was run over the 21st century according to the IPCC SRES A1B emissions scenario. Each ensemble member presented different climate sensitivity in the analysis. The Eta-CPTEC-HadCM3 model was configured with a 40-km grid size and was run over the period 1961-90 to represent a baseline climate, and over the period 2011-2100 to simulate possible future changes and their effects on coffee beverage quality. A coffee beverage quality classification depending on annual air temperature, proposed by Bressani (2007), and a coffee beverage sensory quality classification based on Camargo and Cortez (1998) were considered in this study. An evaluation of the systematic errors (BIAS) of each member for the period 1961-1990 was made. The results of the Eta/CPTEC-HadCM3 model indicated that under the A1B emission scenario coffee beverage quality could be affected in this region, because rising air temperatures may make the flavor stronger and unpleasant. The BIAS evaluation and subsequent error removal demonstrated improvement in the scenario simulations. 
A short review concerning agronomic techniques to mitigate extreme meteorological events or global warming on coffee crop based on Camargo (2010) also is

  19. Scenario based tsunami wave height estimation towards hazard evaluation for the Hellenic coastline and examples of extreme inundation zones in South Aegean

    NASA Astrophysics Data System (ADS)

    Melis, Nikolaos S.; Barberopoulou, Aggeliki; Frentzos, Elias; Krassanakis, Vassilios

    2016-04-01

    A scenario based methodology for tsunami hazard assessment is used, incorporating earthquake sources with the potential to produce extreme tsunamis (measured through their capacity to cause maximum wave height and inundation extent). In the present study we follow a two-phase approach. In the first phase, existing earthquake hazard zoning in the greater Aegean region is used to derive representative maximum expected earthquake magnitude events, with realistic seismotectonic source characteristics and the greatest tsunamigenic potential within each zone. By stacking the scenario-produced maximum wave heights, a global maximum map is constructed for the entire Hellenic coastline, corresponding to all expected extreme offshore earthquake sources. Further evaluation of the produced coastline categories based on the maximum expected wave heights emphasizes the tsunami hazard in selected coastal zones with important functions (i.e. crowded tourist zones, industrial zones, airports, power plants, etc.). Owing to their proximity to the Hellenic Arc, their many urban centres, and their popularity as tourist destinations, Crete and the South Aegean region are given top priority for defining extreme inundation zoning. In the second phase, a set of four large coastal cities (Kalamata, Chania, Heraklion and Rethymno), important for tsunami hazard due, for example, to beaches crowded during the summer season or to industrial facilities, are explored with respect to preparedness and resilience for tsunami hazard in Greece. To simulate tsunamis in the Aegean region (generation, propagation and runup) the MOST - ComMIT NOAA code was used. High resolution DEMs for bathymetry and topography were joined via an interface specifically developed for the inundation maps in this study, with similar products in mind. For the examples explored in the present study, we used 5 m resolution for the topography and 30 m resolution for the bathymetry. Although this study can be considered as
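
    The stacking step described above, combining per-scenario maximum wave heights into a single coastline-wide maximum map, can be sketched with plain array operations. The grids and hazard thresholds below are invented placeholders, not the study's data:

```python
import numpy as np

# Hypothetical per-scenario maximum wave-height grids (metres) on a shared
# coastal grid; in the study these would come from MOST-ComMIT runs.
rng = np.random.default_rng(0)
scenario_grids = [rng.gamma(2.0, 0.8, size=(4, 5)) for _ in range(6)]

# "Stacking" the scenarios: the elementwise maximum, so each grid cell
# records the worst wave height produced by any scenario.
global_max = np.maximum.reduce(scenario_grids)

# Classify the coastline into hazard categories from the stacked maxima
# (bin edges in metres are illustrative).
categories = np.digitize(global_max, bins=[0.5, 1.0, 2.0, 4.0])
```

    A real workflow would carry georeferencing and coastline masks along with the grids; the elementwise-maximum reduction is the essential operation.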

  20. Reliable multihop broadcast protocol with a low-overhead link quality assessment for ITS based on VANETs in highway scenarios.

    PubMed

    Galaviz-Mosqueda, Alejandro; Villarreal-Reyes, Salvador; Galeana-Zapién, Hiram; Rubio-Loyola, Javier; Covarrubias-Rosales, David H

    2014-01-01

    Vehicular ad hoc networks (VANETs) have been identified as a key technology for enabling intelligent transport systems (ITS), which aim to radically improve the safety, comfort, and greenness of vehicles on the road. However, to fully exploit the potential of VANETs, several issues must be addressed. Because of the highly dynamic nature of VANETs and the impairments of the wireless channel, one key issue is the multihop dissemination of broadcast packets for safety and infotainment applications. In this paper a reliable low-overhead multihop broadcast (RLMB) protocol is proposed to address the well-known broadcast storm problem. RLMB takes advantage of the hello messages exchanged between vehicles, processing this information to intelligently select a relay set and reduce redundant broadcasts. Additionally, to reduce the dependency on the hello message rate, RLMB uses a point-to-zone link evaluation approach. RLMB's performance is compared with one of the leading multihop broadcast protocols existing to date. Performance metrics show that RLMB outperforms the leading protocol in terms of important metrics such as packet dissemination ratio, overhead, and delay. PMID:25133224
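
    The relay-set idea, choosing just enough neighbours from hello-message tables to cover the 2-hop neighbourhood, resembles a greedy set-cover pass. The sketch below illustrates that general idea with invented data structures; it is not the actual RLMB message format or selection rule:

```python
# Greedy relay-set selection from hello-message neighbour tables: each vehicle
# advertises its 1-hop neighbours, and a sender keeps only enough relays to
# cover its 2-hop neighbourhood, suppressing redundant rebroadcasts.

def select_relays(neighbours: dict[str, set[str]]) -> set[str]:
    """neighbours maps each 1-hop neighbour id to ITS advertised neighbours."""
    two_hop = set().union(*neighbours.values()) - set(neighbours)
    relays, uncovered = set(), set(two_hop)
    while uncovered:
        # Pick the neighbour covering the most still-uncovered 2-hop nodes.
        best = max(neighbours, key=lambda v: len(neighbours[v] & uncovered))
        if not neighbours[best] & uncovered:
            break  # remaining 2-hop nodes are unreachable via known relays
        relays.add(best)
        uncovered -= neighbours[best]
    return relays

# Example: "a" reaches x and y, "c" reaches z, so "b" is redundant.
relays = select_relays({"a": {"x", "y"}, "b": {"y"}, "c": {"z"}})
```
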

  3. Ontology-Driven Knowledge-Based Health-Care System, An Emerging Area - Challenges And Opportunities - Indian Scenario

    NASA Astrophysics Data System (ADS)

    Sunitha, A.; Babu, G. Suresh

    2014-11-01

    Recent studies of decision making in public healthcare systems have been strongly influenced by the introduction of ontologies. Ontology-driven systems enable policy makers to implement healthcare strategies effectively. The central source of knowledge is the ontology, containing all relevant domain concepts, such as locations, diseases, and environments, together with their domain-sensitive inter-relationships; building this ontology is the prime objective and motivation behind this paper. The paper further focuses on the development of a semantic knowledge base for a public healthcare system. It describes the approach and methodology for establishing, for the first time in India, a firm linkage between three different ontologies (diseases, places, and environments) on one integrated platform. This platform correlates the real-time mechanisms prevailing within the semantic knowledge base and establishes their inter-relationships. This is hoped to form a strong foundation for a much-needed healthcare decision-making system in the country. Introduction through a wide range of best practices facilitates the adoption of this approach for better appreciation, understanding, and long-term outcomes in the area. The methods and approach illustrated in the paper relate to health-mapping methods, reusability of health applications, and interoperability issues based on mapping data attributes to ontology concepts, generating semantically integrated data that drives an inference engine for user-facing semantic queries.

  4. Assessment of future scenarios for wind erosion sensitivity changes based on ALADIN and REMO regional climate model simulation data

    NASA Astrophysics Data System (ADS)

    Mezősi, Gábor; Blanka, Viktória; Bata, Teodóra; Ladányi, Zsuzsanna; Kemény, Gábor; Meyer, Burghard C.

    2016-07-01

    The changes in rate and pattern of wind erosion sensitivity due to climate change were investigated for 2021-2050 and 2071-2100 compared to the reference period (1961-1990) in Hungary. The sensitivities of the main influencing factors (soil texture, vegetation cover and climate factor) were evaluated by a fuzzy method, and a combined wind erosion sensitivity map was compiled. The climate factor, as the driving factor of the changes, was assessed based on observed data for the reference period, while REMO and ALADIN regional climate model simulation data were used for the future periods. The changes in wind erosion sensitivity were evaluated on potentially affected agricultural land use types, and hot spot areas were identified. Based on the results, 5-6% of the total agricultural area was highly sensitive in the reference period. In the 21st century, slight or moderate changes in wind erosion sensitivity can be expected, mostly affecting `pastures', `complex cultivation patterns', and `land principally occupied by agriculture with significant areas of natural vegetation'. The applied combination of a multi-indicator approach and fuzzy analysis provides novelty in the field of land sensitivity assessment. The method is suitable for regional-scale analysis of wind erosion sensitivity changes and supports regional planning by allocating priority areas where changes in agro-technics or land use have to be considered.
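
    Fuzzy evaluation of factor sensitivities can be illustrated as follows; the membership functions, factor ranges and combination rule are invented stand-ins for those used in the study:

```python
import numpy as np

def membership(x, lo, hi):
    """Linear fuzzy 'sensitive' membership in [0, 1], rising from lo to hi."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

# Three illustrative grid cells with invented factor values.
sand_pct = np.array([15.0, 55.0, 85.0])     # soil texture proxy (% sand)
cover_pct = np.array([80.0, 45.0, 10.0])    # vegetation cover (%)
wind_idx = np.array([0.2, 0.5, 0.9])        # climate (wind) factor, 0..1

soil = membership(sand_pct, 0.0, 100.0)           # sandier -> more sensitive
veg = 1.0 - membership(cover_pct, 0.0, 100.0)     # barer -> more sensitive
clim = membership(wind_idx, 0.0, 1.0)

# Conservative fuzzy AND: a cell is only as sensitive as its least
# sensitive factor allows.
combined = np.minimum.reduce([soil, veg, clim])
```

    Other fuzzy operators (product, weighted mean) would give a less conservative combined map; the choice is part of the method design.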

  5. Transportation accident scenarios for commercial spent fuel

    SciTech Connect

    Wilmot, E L

    1981-02-01

    A spectrum of high-severity, low-probability transportation accident scenarios involving commercial spent fuel is presented, together with the mechanisms, pathways, and quantities of material that might be released from spent fuel to the environment. These scenarios are based on conclusions from a workshop, conducted in May 1980 to discuss transportation accident scenarios, in which a group of experts reviewed and critiqued available literature relating to spent fuel behavior and cask response in accidents.

  6. How do we know about Earth's history? Constructing the story of Earth's geologic history by collecting and interpreting evidence based scenarios.

    NASA Astrophysics Data System (ADS)

    Ruthford, Steven; DeBari, Susan; Linneman, Scott; Boriss, Miguel; Chesbrough, John; Holmes, Randall; Thibault, Allison

    2013-04-01

    Beginning in 2003, faculty from Western Washington University, Skagit Valley Community College, local public school teachers, and area tribal college members created an innovative, inquiry based undergraduate geology curriculum. The curriculum, titled "Energy and Matter in Earth's Systems," was supported through various grants and partnerships, including Math and Science Partnership and Noyce Teacher Scholarship grants from the National Science Foundation. During 2011, the authors wrote a geologic time unit for the curriculum. The unit is titled, "How Do We Know About Earth's History?" and has students actively investigate the concepts related to geologic time and methods for determining age. Starting with reflection and assessment of personal misconceptions called "Initial Ideas," students organize a series of events into a timeline. The unit then focuses on the concepts of relative dating, biostratigraphy, and historical attempts at absolute dating, including uniformitarianism, catastrophism, Halley and Joly's Salinity hypothesis, and Kelvin's Heat Loss model. With limited lecture and text, students then dive into current understandings of the age of the Earth, which include radioactive decay rates and radiometric dating. Finally, using their newfound understanding, students investigate a number of real world scenarios and create a timeline of events related to the geologic history of the Earth. The unit concludes with activities that reinforce the Earth's absolute age and direct students to summarize what they have learned by reorganizing the timeline from the "Initial Ideas" and sharing with the class. This presentation will include the lesson materials and findings from one activity titled, "The Earth's Story." The activity is located midway through the unit and begins with reflection on the question, "What are the major events in the Earth's history and when did they happen?" Students are directed to revisit the timeline of events from the "Initial Ideas

  7. Assessment of Folsom Lake response to historical and potential future climate scenarios

    USGS Publications Warehouse

    Yao, Huaming; Georgakakos, Aris P.

    2000-01-01

    An integrated forecast-decision system for Folsom Lake (California) is developed and used to assess the sensitivity of reservoir performance to various forecast-management schemes under historical and future climate scenarios. The assessments are based on various combinations of inflow forecasting models, decision rules, and climate scenarios and demonstrate that (1) reliable inflow forecasts and adaptive decision systems can substantially benefit reservoir performance and (2) dynamic operational procedures represent effective climate change coping strategies.

  8. Comparison of Sigma-Point and Extended Kalman Filters on a Realistic Orbit Determination Scenario

    NASA Technical Reports Server (NTRS)

    Gaebler, John; Hur-Diaz, Sun; Carpenter, Russell

    2010-01-01

    Sigma-point filters have received a lot of attention in recent years as a better alternative to extended Kalman filters for highly nonlinear problems. In this paper, we compare the performance of the additive divided difference sigma-point filter to the extended Kalman filter when applied to orbit determination of a realistic operational scenario based on the Interstellar Boundary Explorer mission. For the scenario studied, both filters provided equivalent results. The performance of each is discussed in detail.
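
    The core difference between the two filters, linearised versus sigma-point propagation of the state mean through a nonlinearity, can be sketched in a few lines. The toy dynamics, covariance and scaling parameter below are illustrative; the paper's additive divided difference filter uses a related but distinct interpolation scheme:

```python
import numpy as np

def f(x):
    """Toy nonlinear map (polar to Cartesian), standing in for orbit dynamics."""
    return np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])

mean = np.array([1.0, 0.5])
cov = np.diag([0.04, 0.01])

# EKF-style mean propagation: push only the mean through f (the covariance
# would be propagated with the Jacobian, i.e. a first-order linearisation).
ekf_mean = f(mean)

# Sigma-point propagation: a deterministic, symmetric sample set scaled by
# the covariance square root captures the curvature of f.
n, kappa = 2, 1.0
S = np.linalg.cholesky((n + kappa) * cov)
points = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
weights = np.array([kappa / (n + kappa)] + [0.5 / (n + kappa)] * (2 * n))
sp_mean = sum(w * f(p) for w, p in zip(weights, points))
```

    For mildly nonlinear problems the two means nearly coincide, which is consistent with the equivalent results the paper reports for its scenario.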

  9. Performance of Skutterudite-Based Modules

    NASA Astrophysics Data System (ADS)

    Nie, G.; Suzuki, S.; Tomida, T.; Sumiyoshi, A.; Ochi, T.; Mukaiyama, K.; Kikuchi, M.; Guo, J. Q.; Yamamoto, A.; Obara, H.

    2016-08-01

    Due to their excellent thermoelectric (TE) performance, skutterudite materials have been selected by many laboratories and companies for development of TE modules to recover power from waste heat at high temperatures (300°C to 600°C). After years of effort, we have developed reliable n- and p-type skutterudite materials showing maximum figures of merit (ZT) of 1.0 at 550°C and 0.75 at 450°C, respectively. In this work, we systematically investigated the performance of a module made using these two kinds of skutterudite. We demonstrate ~7.2% conversion efficiency for a temperature of 600°C at the hot side of the module and 50°C at the cold side, and show that the module had excellent stability in the high-temperature environment. After further improving the TE performance of our skutterudites, the conversion efficiency reached ~8.5% under the same conditions.
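
    As a rough plausibility check, the standard ideal efficiency formula for a thermoelectric generator, eta = (dT/Th)(sqrt(1+ZT) - 1)/(sqrt(1+ZT) + Tc/Th), gives an upper bound consistent with the reported values. This is a textbook estimate, not the authors' measurement procedure, and the device-average ZT below is an assumption:

```python
import math

def te_efficiency(zt, t_hot_c, t_cold_c):
    """Ideal TE generator efficiency from a device-average ZT (textbook formula)."""
    th, tc = t_hot_c + 273.15, t_cold_c + 273.15   # convert to kelvin
    carnot = (th - tc) / th                        # Carnot limit for this gradient
    root = math.sqrt(1.0 + zt)
    return carnot * (root - 1.0) / (root + tc / th)

# Assumed device-average ZT of ~0.9 between the reported n/p maxima; the
# ideal value bounds the measured ~7.2-8.5% from above, as expected once
# contact resistance and parasitic heat losses are included.
eff = te_efficiency(0.9, 600.0, 50.0)
```
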

  10. Habitat availability and gene flow influence diverging local population trajectories under scenarios of climate change: a place-based approach.

    PubMed

    Schwalm, Donelle; Epps, Clinton W; Rodhouse, Thomas J; Monahan, William B; Castillo, Jessica A; Ray, Chris; Jeffress, Mackenzie R

    2016-04-01

    Ecological niche theory holds that species distributions are shaped by a large and complex suite of interacting factors. Species distribution models (SDMs) are increasingly used to describe species' niches and predict the effects of future environmental change, including climate change. Currently, SDMs often fail to capture the complexity of species' niches, resulting in predictions that are generally limited to climate-occupancy interactions. Here, we explore the potential impact of climate change on the American pika using a replicated place-based approach that incorporates climate, gene flow, habitat configuration, and microhabitat complexity into SDMs. Using contemporary presence-absence data from occupancy surveys, genetic data to infer connectivity between habitat patches, and 21 environmental niche variables, we built separate SDMs for pika populations inhabiting eight US National Park Service units representing the habitat and climatic breadth of the species across the western United States. We then predicted occurrence probability under current (1981-2010) and three future time periods (out to 2100). Occurrence probabilities and the relative importance of predictor variables varied widely among study areas, revealing important local-scale differences in the realized niche of the American pika. This variation resulted in diverse and - in some cases - highly divergent future potential occupancy patterns for pikas, ranging from complete extirpation in some study areas to stable occupancy patterns in others. Habitat composition and connectivity, which are rarely incorporated in SDM projections, were influential in predicting pika occupancy in all study areas and frequently outranked climate variables. 
Our findings illustrate the importance of a place-based approach to species distribution modeling that includes fine-scale factors when assessing current and future climate impacts on species' distributions, especially when predictions are intended to manage and

  11. Performance Based Education. Technology Activity Modules.

    ERIC Educational Resources Information Center

    Custer, Rodney L., Ed.

    These Technology Activity Modules are designed to serve as an implementation resource for technology education teachers as they integrate technology education with Missouri's Academic Performance Standards and provide a source of activities and activity ideas that can be used to integrate and reinforce learning across the curriculum. The modules…

  12. SEU Performance of TAG Based Flip Flops

    NASA Technical Reports Server (NTRS)

    Shuler, Robert L.; Kouba, Coy; O'Neill, Patrick M.

    2005-01-01

    We describe heavy ion test results for two new SEU-tolerant latches based on transition NAND gates, one for single-rail asynchronous and the other for dual-rail synchronous designs, implemented in the AMI 0.5 μm process.

  13. The USGS Earthquake Scenario Project

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Petersen, M. D.; Wald, L. A.; Frankel, A. D.; Quitoriano, V. R.; Lin, K.; Luco, N.; Mathias, S.; Bausch, D.

    2009-12-01

    The U.S. Geological Survey’s (USGS) Earthquake Hazards Program (EHP) is producing a comprehensive suite of earthquake scenarios for planning, mitigation, loss estimation, and scientific investigations. The Earthquake Scenario Project (ESP), though lacking clairvoyance, is a forward-looking project, estimating earthquake hazard and loss outcomes as they may occur one day. For each scenario event, fundamental input includes i) the magnitude and specified fault mechanism and dimensions, ii) regional Vs30 shear velocity values for site amplification, and iii) event metadata. A grid of standard ShakeMap ground motion parameters (PGA, PGV, and three spectral response periods) is then produced using the well-defined, regionally-specific approach developed by the USGS National Seismic Hazard Mapping Project (NSHMP), including recent advances in empirical ground motion predictions (e.g., the NGA relations). The framework also allows for numerical (3D) ground motion computations for specific, detailed scenario analyses. Unlike NSHMP ground motions, for ESP scenarios, local rock and soil site conditions and commensurate shaking amplifications are applied based on detailed Vs30 maps where available or on topographic slope as a proxy. The scenario event set is composed primarily of events selected from the NSHMP events, though custom events are also allowed based on coordination of the ESP team with regional coordinators, seismic hazard experts, seismic network operators, and response coordinators. The event set will be harmonized with existing and future scenario earthquake events produced regionally or by other researchers. The event list includes approximately 200 earthquakes in CA, 100 in NV, dozens in each of NM, UT, WY, and a smaller number in other regions. Systematic output will include all standard ShakeMap products, including HAZUS input, GIS, KML, and XML files used for visualization, loss estimation, ShakeCast, PAGER, and for other systems. All products will be

  14. Use Of Scenario Ground Motion Maps In Earthquake Engineering

    NASA Astrophysics Data System (ADS)

    Somerville, P. G.

    2001-12-01

    dominant magnitude-distance combination. The design ground motions in current building codes correspond to a single annual probability of occurrence, so a single earthquake scenario may be used to approximate the design ground motion. However, the next generation of building codes will be based on the concept of performance based earthquake engineering. PBEE requires that ground motions be specified for several different annual frequencies of occurrence that correspond to different levels of building performance, with increasing ground motion levels (corresponding to decreasing annual probability of occurrence) causing increasingly unacceptable damage states. Thus ground motions for a set of different earthquake scenarios may be required to approximate the ground motions needed for use in PBEE.

  15. Containment performance perspectives based on IPE results

    SciTech Connect

    Lehner, J.R.; Lin, C.C.; Pratt, W.T.

    1996-12-31

    Perspectives on Containment Performance were obtained from the accident progression analyses, i.e. level 2 PRA analyses, found in the IPE submittals. Insights related to the containment failure modes, the releases associated with those failure modes, and the factors responsible for the types of containment failures and release sizes reported were gathered. The results summarized here are discussed in detail in volumes 1 and 2 of NUREG 1560. 3 refs., 4 figs.

  16. GLOBAL ALTERNATIVE FUTURE SCENARIOS

    EPA Science Inventory

    One way to examine possible future outcomes for environmental protection is through the development and analysis of alternative future scenarios. This type of assessment postulates two or more different paths that social and environmental development might take, using correspond...

  17. Performance Evaluation of Triangulation Based Range Sensors

    PubMed Central

    Guidi, Gabriele; Russo, Michele; Magrassi, Grazia; Bordegoni, Monica

    2010-01-01

    The performance of 2D digital imaging systems depends on several factors related to both optical and electronic processing. These concepts have given rise to standards, conceived for photographic equipment and two-dimensional scanning systems, aimed at estimating parameters such as resolution, noise or dynamic range. Conversely, no standard test protocols currently exist for evaluating the corresponding performance of 3D imaging systems such as laser scanners or pattern-projection range cameras. This paper focuses on investigating experimental processes for evaluating critical parameters of 3D equipment, by extending the concepts defined by the ISO standards to the 3D domain. The experimental part of this work concerns the characterization of different range sensors through the extraction of their resolution, accuracy and uncertainty from sets of 3D acquisitions of specifically designed test objects whose geometrical characteristics are known in advance. The major objective of this contribution is to suggest an easy characterization process for generating reliable comparisons between the performance of different range sensors and for checking whether a specific piece of equipment is compliant with the expected characteristics. PMID:22163599
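
    One common instance of such a characterization is scanning a nominally planar test object, fitting a least-squares plane, and reporting the residual spread as measurement uncertainty. The sketch below is illustrative, not the paper's protocol; the simulated noise level is an assumption:

```python
import numpy as np

def plane_residuals(points):
    """Signed distances of 3D points from their best-fit (total least squares) plane."""
    centroid = points.mean(axis=0)
    # The singular vector for the smallest singular value of the centred
    # cloud is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return (points - centroid) @ normal

# Simulated scan of a tilted planar target with 0.05-unit sensor noise.
rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 100.0, size=(500, 2))
z = 0.2 * xy[:, 0] + rng.normal(0.0, 0.05, 500)
cloud = np.column_stack([xy, z])

res = plane_residuals(cloud)
uncertainty = res.std()   # spread along the normal ~ sensor noise
```

    Against a calibrated reference object, the mean residual would additionally expose a systematic (accuracy) error, separate from the random spread.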

  18. SDG&E's performance-based ratemaking

    SciTech Connect

    Scadding, J.

    1995-11-01

    Performance-based ratemaking (PBR) in the electric utility industry is outlined. The following topics are discussed: average cents/kWh for residential customers; PBR throws its shadow before it; the two phases of SDG&E's PBR; elements of base-rates PBR; the price performance benchmark; 'non-price' performance indicators; two-way conditionality; and sharing and off-ramps.

  19. Projections of high resolution climate changes for South Korea using multiple-regional climate models based on four RCP scenarios. Part 2: precipitation

    NASA Astrophysics Data System (ADS)

    Oh, Seok-Geun; Suh, Myoung-Seok; Lee, Young-Suk; Ahn, Joong-Bae; Cha, Dong-Hyun; Lee, Dong-Kyou; Hong, Song-You; Min, Seung-Ki; Park, Seong-Chan; Kang, Hyun-Suk

    2016-05-01

    Precipitation changes over South Korea were projected using five regional climate models (RCMs) with a horizontal resolution of 12.5 km for the mid and late 21st century (2026-2050, 2076-2100) under four Representative Concentration Pathways (RCP) scenarios against present precipitation (1981-2005). The simulation data of the Hadley Centre Global Environmental Model version 2 coupled with the Atmosphere-Ocean (HadGEM2-AO) were used as boundary data for the RCMs. In general, the RCMs simulated the spatial and seasonal variations of present precipitation well compared with observations and HadGEM2-AO. Equal Weighted Averaging without Bias Correction (EWA_NBC) significantly reduced the model biases to some extent, but systematic biases remained in the results. However, Weighted Averaging based on Taylor's skill score (WEA_Tay) showed a good statistical correction in terms of the spatial and seasonal variations, the magnitude of precipitation amount, and the probability density. In the mid-21st century, the spatial and interannual variabilities of precipitation over South Korea are projected to increase regardless of the RCP scenarios and seasons. However, the changes in area-averaged seasonal precipitation are not significant, owing to mixed changing patterns depending on location. In the late 21st century, by contrast, precipitation is projected to increase in proportion to the changes in net radiative forcing. Under RCP8.5, WEA_Tay projects increases of about +19.1%, +20.5% and +33.3% in annual, summer and winter precipitation, respectively, at 1-5% significance levels. In addition, the probability of strong precipitation (≥ 15 mm d-1) is also projected to increase significantly, particularly in WEA_Tay under RCP8.5.
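
    Skill-weighted ensemble averaging in the spirit of WEA_Tay can be sketched as follows. The Taylor (2001)-style skill score and the synthetic data are illustrative; the paper's exact score and normalisation are not reproduced here:

```python
import numpy as np

def taylor_skill(sim, obs, r0=1.0):
    """Taylor (2001)-style skill score: 1 for a perfect simulation, -> 0 as
    correlation drops or the variance ratio departs from 1."""
    r = np.corrcoef(sim, obs)[0, 1]
    s = np.std(sim) / np.std(obs)          # normalised standard-deviation ratio
    return 4.0 * (1.0 + r) / ((s + 1.0 / s) ** 2 * (1.0 + r0))

# Synthetic "observations" and five pseudo-RCMs with increasing noise.
obs = np.sin(np.linspace(0.0, 6.0, 50))
models = [obs + np.random.default_rng(i).normal(0.0, 0.1 * (i + 1), 50)
          for i in range(5)]

# Weight each model by its skill, normalise, and form the weighted ensemble.
weights = np.array([taylor_skill(m, obs) for m in models])
weights /= weights.sum()
ensemble = sum(w * m for w, m in zip(weights, models))
```

    The effect is that better-verifying members dominate the ensemble mean, which is why WEA_Tay corrects the magnitude and distribution better than equal weighting.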

  20. Characterizing the emission implications of future natural gas production and use in the U.S. and Rocky Mountain region: A scenario-based energy system modeling approach

    NASA Astrophysics Data System (ADS)

    McLeod, Jeffrey

    The recent increase in U.S. natural gas production made possible through advancements in extraction techniques including hydraulic fracturing has transformed the U.S. energy supply landscape while raising questions regarding the balance of environmental impacts associated with natural gas production and use. Impact areas at issue include emissions of methane and criteria pollutants from natural gas production, alongside changes in emissions from increased use of natural gas in place of coal for electricity generation. In the Rocky Mountain region, these impact areas have been subject to additional scrutiny due to the high level of regional oil and gas production activity and concerns over its links to air quality. Here, the MARKAL (MArket ALlocation) least-cost energy system optimization model, in conjunction with the EPA-MARKAL nine-region database, has been used to characterize future regional and national emissions of CO2, CH4, VOC, and NOx attributed to natural gas production and use in several sectors of the economy. The analysis is informed by comparing and contrasting a base case, business-as-usual scenario with scenarios featuring variations in future natural gas supply characteristics, constraints affecting the electricity generation mix, carbon emission reduction strategies, and increased demand for natural gas in the transportation sector. Emission trends and their associated sensitivities are identified and contrasted between the Rocky Mountain region and the U.S. as a whole. The modeling results of this study illustrate the resilience of the short-term greenhouse gas emission benefits associated with fuel switching from coal to gas in the electric sector, but also call attention to the long-term implications of increasing natural gas production and use for emissions of methane and VOCs, especially in the Rocky Mountain region. 
This analysis can help to inform the broader discussion of the potential environmental impacts of future natural gas production

  1. Competency/Performance-Based Student Teaching Program.

    ERIC Educational Resources Information Center

    Simms, Earline M.

    The competency-based student teaching program of the Southern University (Baton Rouge, Louisiana) College of Education is described. The basis of the program is a listing of fourteen competencies (teaching skills), which provides a guide for structured and meaningful activities during the observation period, consistency in directing those experiences,…

  2. Performance Appraisal Is Based on Five Major Assumptions.

    ERIC Educational Resources Information Center

    Silver, Harvey A.

    This review of the performance appraisal process discusses the major assumptions on which performance appraisal is based, the general goals of performance appraisal, and the characteristics of effective performance appraisal programs. The author stresses the dependence of the process on the assumption that human behavior can be changed; he…

  3. MIOSAT Mission Scenario and Design

    NASA Astrophysics Data System (ADS)

    Agostara, C.; Dionisio, C.; Sgroi, G.; di Salvo, A.

    2008-08-01

    MIOSAT ("Missione Ottica su microSATellite") is a low-cost technological/scientific microsatellite mission for Earth observation, funded by the Italian Space Agency (ASI) and managed by a Group Agreement between Rheinmetall Italia - B.U. Spazio - Contraves as leader and Carlo Gavazzi Space as satellite manufacturer. Several other Italian companies, SMEs and universities are involved in the development team with crucial roles. MIOSAT is a microsatellite weighing around 120 kg, placed in a 525 km altitude sun-synchronous circular LEO orbit. The microsatellite embarks three innovative optical payloads: a Sagnac multispectral radiometer (IFAC-CNR), a Mach-Zehnder spectrometer (IMM-CNR) and a high-resolution panchromatic camera (Selex Galileo). In addition, three technological experiments will be tested in flight. The first is a high-efficiency heat pipe based on the Marangoni effect. The second is a high-accuracy Sun sensor using COTS components, and the last is a GNSS software receiver running on a Leon2 processor. Finally, a new generation of 28%-efficiency solar cells will be adopted for power generation. The platform is highly agile and can tilt along and across the flight direction. The pointing accuracy is on the order of 0.1° per axis. The pointing determination during image acquisition is <0.02° for the axes normal to the boresight and 0.04° for the boresight. This paper deals with the MIOSAT mission scenario and definition, highlighting trade-offs for mission implementation. The MIOSAT mission design has been constrained by challenging requirements in terms of satellite mass, mission lifetime and instrument performance, which have implied using the satellite's agility to improve instrument performance in terms of S/N and resolution. The instruments provide complementary measurements that can be combined in effective ways to exploit new applications in the fields of atmospheric composition analysis, Earth emissions, anthropic phenomena, etc. The Mission

  4. Structuring a Performance-Based Teacher Education Program in Science

    ERIC Educational Resources Information Center

    Dziuban, Charles D.; Esler, William K.

    1973-01-01

    Discusses three components of a performance-based teacher education program. The program objectives are defined in terms of knowledge, performance, consequence, and affective objectives. The selection of conditions and evaluation methods for each objective is outlined. (PS)

  5. 48 CFR 970.1100-1 - Performance-based contracting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-based contracting concepts and methodologies that may be generally applied to management and operating... methods of accomplishing the work; use measurable (i.e., terms of quality, timeliness, quantity) performance standards and objectives and quality assurance surveillance plans; provide performance...

  6. Performance-Based Thinking and Training for Competence.

    ERIC Educational Resources Information Center

    Rakow, Joel

    1982-01-01

    Discusses five job behavior functions viewed as necessary for practicing performance-based thinking in instructional development activities. Functions examined include the abilities to plan to perform a job, execute a task, monitor or control execution, troubleshoot, and evaluate. (MER)

  7. Bioretention function under climate change scenarios in North Carolina, USA

    NASA Astrophysics Data System (ADS)

    Hathaway, J. M.; Brown, R. A.; Fu, J. S.; Hunt, W. F.

    2014-11-01

    The effect of climate change on stormwater controls is largely unknown. Evaluating such effects is important for understanding how well resiliency can be built into urban watersheds by implementing these systems. Bioretention areas with varied media depths, in situ soil types, drainage configurations, and surface infiltration capabilities have previously been monitored, modelled, and calibrated using the continuous simulation model, DRAINMOD. In this study, data from downscaled climate projections for 2055 through 2058 were utilized in these models to evaluate changes in system hydrologic function under two climate change scenarios (RCP 4.5 and 8.5). The results were compared to those generated using a “Base” scenario of observed data from 2001 to 2004. The results showed a modest change in the overall water balance of the system. In particular, the frequency and magnitude of overflow from the systems substantially increased under the climate change scenarios. As this represents an increase in the amount of uncontrolled, untreated runoff from the contributing watersheds, it is of particular concern. Further modelling showed that between 9.0 and 31.0 cm of additional storage would be required under the climate change scenarios to restrict annual overflow to that of the base scenario. Bioretention surface storage volume and infiltration rate appeared important in determining a system's ability to cope with increased yearly rainfall and higher rainfall magnitudes. As climate change effects vary based on location, similar studies should be performed in other locations to determine localized effects on stormwater controls.
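
    The storage computation described above (the additional storage needed to hold overflow down to the base-scenario level) can be sketched with a simple event-based bucket model; the event depths, storage and infiltration values below are hypothetical, not DRAINMOD results:

```python
def annual_overflow(event_depths_cm, storage_cm, infiltration_cm):
    """Event-based bucket model: rain beyond surface storage plus
    per-event infiltration leaves the system as untreated overflow."""
    capacity = storage_cm + infiltration_cm
    return sum(max(0.0, depth - capacity) for depth in event_depths_cm)

def extra_storage_needed(event_depths_cm, base_storage_cm, infiltration_cm,
                         target_overflow_cm, step_cm=0.1):
    """Smallest added storage (in step_cm increments) that keeps annual
    overflow at or below the target."""
    extra = 0.0
    while annual_overflow(event_depths_cm, base_storage_cm + extra,
                          infiltration_cm) > target_overflow_cm + 1e-9:
        extra += step_cm
    return extra

base_events = [1.0, 2.5, 4.0, 0.5, 3.0]    # hypothetical event depths, cm
future_events = [1.2, 3.5, 5.5, 0.8, 4.2]  # wetter climate-change scenario
target = annual_overflow(base_events, storage_cm=2.0, infiltration_cm=1.0)
extra = extra_storage_needed(future_events, 2.0, 1.0, target)
```

    DRAINMOD resolves the full continuous water balance; this sketch only illustrates the "match the base-scenario overflow" search.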

  8. Spreadsheet Based Scaling Calculations and Membrane Performance

    SciTech Connect

    Wolfe, T D; Bourcier, W L; Speth, T F

    2000-12-28

    Many membrane element manufacturers provide a computer program to aid buyers in the use of their elements. However, to date there are few examples of fully integrated public domain software available for calculating reverse osmosis and nanofiltration system performance. The Total Flux and Scaling Program (TFSP), written for Excel 97 and above, provides designers and operators new tools to predict membrane system performance, including scaling and fouling parameters, for a wide variety of membrane system configurations and feedwaters. The TFSP development was funded under EPA contract 9C-R193-NTSX. It is freely downloadable at www.reverseosmosis.com/download/TFSP.zip. TFSP includes detailed calculations of reverse osmosis and nanofiltration system performance. Of special significance, the program provides scaling calculations for mineral species not normally addressed in commercial programs, including aluminum, iron, and phosphate species. In addition, ASTM calculations for common species such as calcium sulfate (CaSO4·2H2O), BaSO4, SrSO4, SiO2, and LSI are also provided. Scaling calculations in commercial membrane design programs are normally limited to the common minerals and typically follow basic ASTM methods, which are for the most part graphical approaches adapted to curves. In TFSP, the scaling calculations for the less common minerals use subsets of the USGS PHREEQE and WATEQ4F databases and use the same general calculational approach as PHREEQE and WATEQ4F. The activities of ion complexes are calculated iteratively. Complexes that are unlikely to form in significant concentration were eliminated to simplify the calculations. The calculation provides the distribution of ions and ion complexes that is used to calculate an effective ion product “Q”. The effective ion product is then compared to temperature-adjusted solubility products (Ksp) of solids in order to calculate a Saturation Index (SI) for each solid of
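
    The saturation-index comparison described above can be sketched as follows; the Q and Ksp values are illustrative placeholders, not drawn from the PHREEQE/WATEQ4F databases:

```python
import math

def saturation_index(ion_product_q, ksp):
    """SI = log10(Q / Ksp); SI > 0 indicates supersaturation (scaling risk)."""
    return math.log10(ion_product_q / ksp)

# Illustrative values for a sparingly soluble sulfate at 25 C (hypothetical,
# not from the PHREEQE/WATEQ4F databases):
q = 3.0e-5    # effective ion product from the speciated ion activities
ksp = 2.5e-5  # temperature-adjusted solubility product
si = saturation_index(q, ksp)
scaling_risk = si > 0.0
```

    In TFSP the speciation step that produces Q is iterative; only the final comparison against Ksp is shown here.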

  9. The changing nutrition scenario

    PubMed Central

    Gopalan, C.

    2013-01-01

    The past seven decades have seen remarkable shifts in the nutritional scenario in India. Even up to the 1950s severe forms of malnutrition such as kwashiorkor and pellagra were endemic. As nutritionists were finding home-grown and common-sense solutions for these widespread problems, the population was burgeoning and food was scarce. The threat of widespread household food insecurity and chronic undernutrition was very real. Then came the Green Revolution. Shortages of food grains disappeared within less than a decade and India became self-sufficient in food grain production. But more insidious problems arising from this revolution were looming, and cropping patterns giving low priority to coarse grains and pulses, and monocropping led to depletion of soil nutrients and ‘Green Revolution fatigue’. With improved household food security and better access to health care, clinical manifestations of severe malnutrition virtually disappeared. But the decline in chronic undernutrition and “hidden hunger” from micronutrient deficiencies was slow. On the cusp of the new century, an added factor appeared on the nutritional scene in India. With steady urban migration, upward mobility out of poverty, and an increasingly sedentary lifestyle because of improvements in technology and transport, obesity rates began to increase, resulting in a dual burden. Measured in terms of its performance in meeting its Millennium Development Goals, India has fallen short. Despite its continuing high levels of poverty and illiteracy, India has a huge demographic potential in the form of a young population. This advantage must be leveraged by investing in nutrition education, household access to nutritious diets, a sanitary environment and a health-promoting lifestyle. This requires co-operation from all the stakeholders, including governments, non-governmental organizations, scientists and the people at large. PMID:24135189

  10. A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for the exploration of various “what-if” scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
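
    The trace-driven simulation idea mentioned above can be illustrated with a minimal direct-mapped cache model; the cache geometry and address trace are assumptions for the sketch, not the paper's actual simulator:

```python
def simulate_direct_mapped(addresses, num_lines=64, line_size=64):
    """Trace-driven simulation of a direct-mapped cache: count hits and misses."""
    tags = [None] * num_lines          # one tag per cache line
    hits = misses = 0
    for addr in addresses:
        block = addr // line_size      # memory block number
        index = block % num_lines      # cache line the block maps to
        tag = block // num_lines
        if tags[index] == tag:
            hits += 1
        else:
            misses += 1
            tags[index] = tag          # fill the line on a miss
    return hits, misses

# Sequential 8-byte accesses over 512 bytes touch 8 cache lines,
# so 8 cold misses and 56 hits are expected.
trace = [0x1000 + 8 * i for i in range(64)]
hits, misses = simulate_direct_mapped(trace)
```

    The paper's contribution is predicting such counts from minimal runtime information rather than from a full address trace; the model above shows what is being predicted.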

  11. Development of advanced inductive scenarios for ITER

    NASA Astrophysics Data System (ADS)

    Luce, T. C.; Challis, C. D.; Ide, S.; Joffrin, E.; Kamada, Y.; Politzer, P. A.; Schweinzer, J.; Sips, A. C. C.; Stober, J.; Giruzzi, G.; Kessel, C. E.; Murakami, M.; Na, Y.-S.; Park, J. M.; Polevoi, A. R.; Budny, R. V.; Citrin, J.; Garcia, J.; Hayashi, N.; Hobirk, J.; Hudson, B. F.; Imbeaux, F.; Isayama, A.; McDonald, D. C.; Nakano, T.; Oyama, N.; Parail, V. V.; Petrie, T. W.; Petty, C. C.; Suzuki, T.; Wade, M. R.; the ITPA Integrated Operation Scenario Topical Group Members; the ASDEX-Upgrade Team; the DIII-D Team; EFDA Contributors, JET; the JT-60U Team

    2014-01-01

    Since its inception in 2002, the International Tokamak Physics Activity topical group on Integrated Operational Scenarios (IOS) has coordinated experimental and modelling activity on the development of advanced inductive scenarios for applications in the ITER tokamak. The physics basis and the prospects for applications in ITER have been advanced significantly during that time, especially with respect to experimental results. The principal findings of this research activity are as follows. Inductive scenarios capable of higher normalized pressure (βN ⩾ 2.4) than the ITER baseline scenario (βN = 1.8) with normalized confinement at or above the standard H-mode scaling are well established under stationary conditions on the four largest diverted tokamaks (AUG, DIII-D, JET, JT-60U), demonstrated in a database of more than 500 plasmas from these tokamaks analysed here. The parameter range where high performance is achieved is broad in q95 and density normalized to the empirical density limit. MHD modes can play a key role in reaching stationary high performance, but also define the limits to achieved stability and confinement. Projection of performance in ITER from existing experiments uses empirical scalings and theory-based modelling. The status of the experimental validation of both approaches is summarized here. The database shows significant variation in the energy confinement normalized to standard H-mode confinement scalings, indicating the possible influence of additional physics variables absent from the scalings. Tests using the available information on rotation and the ratio of the electron and ion temperatures indicate neither of these variables in isolation can explain the variation in normalized confinement observed. Trends in the normalized confinement with the two dimensionless parameters that vary most from present-day experiments to ITER, gyroradius and collision frequency, are significant. Regression analysis on the multi-tokamak database has been
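
    The regression analysis mentioned above can be sketched as an ordinary least-squares fit of normalized confinement against normalized gyroradius (rho*) and collisionality (nu*) in log space; the data points and exponents below are synthetic, not values from the multi-tokamak database:

```python
import math

def solve3(a, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
    return x

def fit_confinement_scaling(rho_star, nu_star, h98):
    """Fit H98 = c * rho*^a * nu*^b by least squares in log space
    (normal equations for two predictors plus an intercept)."""
    rows = [[1.0, math.log(r), math.log(n)] for r, n in zip(rho_star, nu_star)]
    ys = [math.log(h) for h in h98]
    xtx = [[sum(ri[i] * ri[j] for ri in rows) for j in range(3)] for i in range(3)]
    xty = [sum(ri[i] * y for ri, y in zip(rows, ys)) for i in range(3)]
    logc, a, b = solve3(xtx, xty)
    return math.exp(logc), a, b

# Synthetic points generated exactly from H98 = 1.1 * rho*^-0.2 * nu*^-0.1
rho = [2e-3, 3e-3, 4e-3, 5e-3, 6e-3, 8e-3]
nu = [0.05, 0.10, 0.02, 0.30, 0.08, 0.15]
h = [1.1 * r ** -0.2 * n ** -0.1 for r, n in zip(rho, nu)]
c, a, b = fit_confinement_scaling(rho, nu, h)
```

    On the real database the residual scatter, not shown here, is what signals physics variables missing from the scalings.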

  12. Application of the Water Evaluation and Planning (WEAP) System for Integrated Hydrologic and Scenario-based Water Resources Systems Modeling in the Western Sierra Nevada

    NASA Astrophysics Data System (ADS)

    Mehta, V. K.; Purkey, D. R.; Young, C.; Joyce, B.; Yates, D.

    2008-12-01

    Rivers draining western slopes of the Sierra Nevada provide critical water supply, hydropower, fisheries and recreation services to California. Coordinated efforts are under way to better characterize and model the possible impacts of climate change on Sierra Nevada hydrology. Research suggests substantial end-of-century reductions in Sierra Nevada snowpack and a shift in the center of mass of the snowmelt hydrograph. Management decisions, land use change and population growth add further complexity, necessitating the use of scenario-based modeling tools. The Water Evaluation and Planning (WEAP) system is one of the suite of tools being employed in this effort. Unlike several models that rely on perturbation of historical runoff data to simulate future climate conditions, WEAP includes a dynamically integrated watershed hydrology module that is forced by input climate time series. This allows direct simulation of water management response to climate and land use change. This paper presents ABY2008, a WEAP application for the Yuba, Bear and American River (ABY) watersheds of the Sierra Nevada. These rivers are managed by water agencies and hydropower utilities through a complex network of reservoirs, dams, hydropower plants and water conveyances. Historical watershed hydrology in ABY2008 is driven by a 10-year weekly climate time series from 1991-2000. Land use and soils data were combined into 12 landclasses representing each of 324 hydrological response units. Hydrologic parameters were incorporated from a calibration against observed streamflow developed for the entire western Sierra. Physical reservoir data, operating rules, and water deliveries to water agencies were obtained from public documents of water agencies and power utilities that manage facilities in the watersheds. ABY2008 includes 25 major reservoirs, 39 conveyances, 33 hydropower plants and 14 transmission links to 13 major water demand points. In WEAP, decisions for transferring water at

  13. BCube Ocean Scenario

    NASA Astrophysics Data System (ADS)

    Santoro, Mattia; Schofield, Oscar; Pearlman, Jay; Nativi, Stefano

    2015-04-01

    To address complex Earth system issues such as climate change and water resources, geoscientists must work across disciplinary boundaries; this requires them to access data outside of their fields. Scientists are being called upon to find, access, and use diverse and voluminous data types that are described with semantics. Within the framework of the NSF EarthCube programme, the BCube project (A Broker Framework for Next Generation Geoscience) is addressing the need for effective and efficient multi-disciplinary collaboration and interoperability through the advancement of brokering technologies. BCube develops science scenarios as key elements in providing an environment for demonstrating capabilities, benefits, and challenges of the developed e-infrastructure. The initial focus is on hydrology, oceans, polar and weather, with the intent to make the technology applicable and available to all the geosciences. This presentation focuses on the BCube ocean scenario. The purpose of this scenario is to increase the understanding of the ocean dynamics through incorporation of a wide range of in-situ and satellite data into ocean models using net primary productivity as the initial variable. The science scenario aims to identify spatial and temporal domains in ocean models, and key ecological variables. Field data sets and remote observations data sets from distributed and heterogeneous systems are accessed through the broker and will be incorporated into the models. In this work we will present the achievements in the development of the BCube ocean scenario.

  14. Scenarios for gluino coannihilation

    DOE PAGES

    Ellis, John; Evans, Jason L.; Luo, Feng; Olive, Keith A.

    2016-02-11

    In this article, we study supersymmetric scenarios in which the gluino is the next-to-lightest supersymmetric particle (NLSP), with a mass sufficiently close to that of the lightest supersymmetric particle (LSP) that gluino coannihilation becomes important. One of these scenarios is the MSSM with soft supersymmetry-breaking squark and slepton masses that are universal at an input GUT renormalization scale, but with non-universal gaugino masses. The other scenario is an extension of the MSSM to include vector-like supermultiplets. In both scenarios, we identify the regions of parameter space where gluino coannihilation is important, and discuss their relations to other regions of parameter space where other mechanisms bring the dark matter density into the range allowed by cosmology. In the case of the non-universal MSSM scenario, we find that the allowed range of parameter space is constrained by the requirement of electroweak symmetry breaking, the avoidance of a charged LSP and the measured mass of the Higgs boson, in particular, as well as the appearance of other dark matter (co)annihilation processes. Nevertheless, LSP masses mX ≲ 8 TeV with the correct dark matter density are quite possible. In the case of pure gravity mediation with additional vector-like supermultiplets, changes to the anomaly-mediated gluino mass and the threshold effects associated with these states can make the gluino almost degenerate with the LSP, and we find a similar upper bound.

  15. Image based performance analysis of thermal imagers

    NASA Astrophysics Data System (ADS)

    Wegner, D.; Repasi, E.

    2016-05-01

    Due to advances in technology, modern thermal imagers resemble sophisticated image processing systems in functionality. Advanced signal and image processing tools enclosed in the camera body extend the basic image capturing capability of thermal cameras, in order to enhance the display presentation of the captured scene or specific scene details. Usually, the implemented methods are proprietary company expertise, distributed without extensive documentation. This makes the comparison of thermal imagers, especially those from different companies, a difficult task (or at least a very time-consuming/expensive task, e.g. requiring the execution of a field trial and/or an observer trial). For example, a thermal camera equipped with turbulence mitigation capability stands for such a closed system. The Fraunhofer IOSB has started to build up a system for testing thermal imagers by image-based methods in the lab environment. This will extend our capability of measuring the classical IR-system parameters (e.g. MTF, MTDP, etc.) in the lab. The system is set up around the IR-scene projector, which is necessary for the thermal display (projection) of an image sequence for the IR camera under test. The same set of thermal test sequences can be presented to every unit under test. For turbulence mitigation tests, this could be e.g. the same turbulence sequence. During system tests, gradual variation of input parameters (e.g. thermal contrast) can be applied. First ideas on the selection of test scenes, and on how to assemble an imaging suite (a set of image sequences) for the analysis of thermal imaging systems containing such black boxes in the image-forming path, are discussed.

  16. Characterising Seismic Hazard Input for Analysis of Risk to Multi-System Infrastructures: Application to Scenario Event-Based Models and Extension to Probabilistic Risk

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Silva, V.

    2011-12-01

    The potential human and economic cost of earthquakes to complex urban infrastructures has been demonstrated in the most emphatic manner by recent large earthquakes such as those of Haiti (January 2010), Christchurch (September 2010 and February 2011) and Tohoku (March 2011). Consideration of seismic risk for a homogeneous portfolio, such as a single building typology or infrastructure, or independent analyses of separate typologies or infrastructures, are insufficient to fully characterise the potential impacts that arise from inter-connected system failure. Individual elements of each infrastructure may be adversely affected by different facets of the ground motion (e.g. short-period acceleration, long-period displacement, cumulative energy input, etc.). The accuracy and efficiency of the risk analysis is dependent on the ability to characterise these multiple features of the ground motion over a spatially distributed portfolio of elements. The modelling challenges raised by this extension to multi-system analysis of risk have been a key focus of the European Project "Systemic Seismic Vulnerability and Risk Analysis for Buildings, Lifeline Networks and Infrastructures Safety Gain (SYNER-G)", and are expected to be developed further within the Global Earthquake Model (GEM). Seismic performance of a spatially distributed infrastructure during an earthquake may be assessed by means of Monte Carlo simulation, in order to incorporate the aleatory variability of the ground motion into the network analysis. Methodologies for co-simulating large numbers of spatially cross-correlated ground motion fields are appraised, and their potential impacts on a spatially distributed portfolio of mixed building typologies assessed using idealised case study scenarios from California and Europe. Potential developments to incorporate correlation and uncertainty in site amplification and geotechnical hazard are also explored.
Whilst the initial application of the seismic risk analysis is
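
    Co-simulating spatially cross-correlated ground-motion fields, as described above, is commonly done by factorizing a spatial correlation matrix and applying it to independent normal draws. A minimal sketch with an assumed exponential correlation model (the correlation range and site layout are illustrative, not SYNER-G's actual model):

```python
import math, random

def correlation(h_km, corr_range_km=10.0):
    """Assumed exponential spatial correlation of ground-motion residuals."""
    return math.exp(-3.0 * h_km / corr_range_km)

def cholesky(a):
    """Lower-triangular Cholesky factor of a symmetric positive-definite matrix."""
    n = len(a)
    low = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(low[i][k] * low[j][k] for k in range(j))
            if i == j:
                low[i][j] = math.sqrt(a[i][i] - s)
            else:
                low[i][j] = (a[i][j] - s) / low[j][j]
    return low

sites_km = [0.0, 2.0, 5.0]  # hypothetical site positions along a line
cov = [[correlation(abs(x - y)) for y in sites_km] for x in sites_km]
L = cholesky(cov)

# One Monte Carlo realization of a correlated residual field: L @ z, z ~ N(0, I).
random.seed(1)
z = [random.gauss(0.0, 1.0) for _ in sites_km]
field = [sum(L[i][k] * z[k] for k in range(len(z))) for i in range(len(z))]
```

    Repeating the last three lines many times yields the ensemble of fields over which network losses are aggregated.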

  17. Automata learning algorithms and processes for providing more complete systems requirements specification by scenario generation, CSP-based syntax-oriented model construction, and R2D2C system requirements transformation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Margaria, Tiziana (Inventor); Rash, James L. (Inventor); Rouff, Christopher A. (Inventor); Steffen, Bernard (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, automata learning algorithms and techniques are implemented to generate a more complete set of scenarios for requirements-based programming. More specifically, a CSP-based, syntax-oriented model construction, which requires the support of a theorem prover, is complemented by model extrapolation via automata learning. This may support the systematic completion of requirements that are partial by nature, focusing on the most prominent scenarios. It may generalize requirement skeletons by extrapolation and may indicate, by way of automatically generated traces, where the requirement specification is too loose and additional information is required.

  18. Performance-Based Staff Development: The Cost-Effective Alternative.

    ERIC Educational Resources Information Center

    Boyer, Catherine M.

    1981-01-01

    Describes how to use the performance-based concept in developing staff. Discusses the identification of objectives based on performance expectations and the development of learning experiences that (1) emphasize application of knowledge; (2) integrate adult learning principles; and (3) make use of learning contracts, self-learning packages,…

  19. Team Primacy Concept (TPC) Based Employee Evaluation and Job Performance

    ERIC Educational Resources Information Center

    Muniute, Eivina I.; Alfred, Mary V.

    2007-01-01

    This qualitative study explored how employees learn from Team Primacy Concept (TPC) based employee evaluation and how they use the feedback in performing their jobs. TPC-based evaluation is a form of multirater evaluation, during which the employee's performance is discussed by one's peers in a face-to-face team setting. The study used Kolb's…

  20. Detect-to-warn scenarios for defense against airborne contaminants

    NASA Astrophysics Data System (ADS)

    Cousins, Daniel; Campbell, Steven D.; Joseph, Rose

    2004-12-01

    Detect-to-warn defense strategies against airborne contamination are based on providing warning to personnel to take temporary protective actions. The effectiveness of such detect-to-warn active strategies is measured by the reduction in contaminant exposure compared to passive exposure. Effectiveness depends on several factors, including the contaminant release and transport properties, the warning sensor performance and the protective actions taken. In this paper we analyze effectiveness for several specific scenarios where certain reasonable protective actions are assumed and sensor performance is varied. One type of scenario analyzed is the protection of outdoor personnel against an upwind instantaneous point release. Meteorological conditions (wind speed, turbulence level and heat flux) that result in high exposure levels are assumed. Personnel are warned to temporarily use filter masks based on a warning signal from a sensor placed between them and the release point. Another type of scenario is the protection of personnel inside a building using active ventilation control. The building air handling properties, such as air exchange and recirculation, degree of leakage and filtration and zone volume, are representative of modern office buildings. Different sensor locations and ventilation control strategies are chosen to defend against outside and inside instantaneous point releases. In each scenario, we evaluate the dependence of effectiveness on sensor sensitivity threshold and response time. In addition, we describe desired values of other sensor attributes, such as false positive sensing rate, size, power consumption, maintenance frequency and procurement cost, to support realistic deployment and operations.
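
    The effectiveness metric described above (reduction in exposure relative to passive exposure) can be sketched as follows, assuming effectiveness is the fraction of passive dosage avoided; the plume shape, warning time and mask protection factor are hypothetical:

```python
def dosage(conc_series, dt_s):
    """Time-integrated concentration for a sampled concentration series."""
    return sum(c * dt_s for c in conc_series)

def detect_to_warn_effectiveness(conc_series, dt_s, warn_step, protection_factor):
    """Fraction of the passive dosage avoided when masks with the given
    protection factor are donned at sample index warn_step."""
    passive = dosage(conc_series, dt_s)
    protected = (dosage(conc_series[:warn_step], dt_s)
                 + dosage(conc_series[warn_step:], dt_s) / protection_factor)
    return 1.0 - protected / passive

# Hypothetical plume passage sampled at 1 s intervals.
conc = [0, 0, 1, 2, 4, 8, 8, 4, 2, 1]
eff = detect_to_warn_effectiveness(conc, dt_s=1.0, warn_step=4, protection_factor=50.0)
```

    Varying warn_step models sensor sensitivity and response time: a later warning leaves more of the plume's dosage unprotected.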

  1. Safety evaluation of MHTGR licensing basis accident scenarios

    SciTech Connect

    Kroeger, P.G.

    1989-04-01

    The safety potential of the Modular High-Temperature Gas Reactor (MHTGR) was evaluated, based on the Preliminary Safety Information Document (PSID) as submitted by the US Department of Energy to the US Nuclear Regulatory Commission. The relevant reactor safety codes were extended for this purpose and applied to this new reactor concept, searching primarily for potential accident scenarios that might lead to fuel failures due to excessive core temperatures and/or to vessel damage due to excessive vessel temperatures. The design basis accident scenario leading to the highest vessel temperatures is the depressurized core heatup scenario without any forced cooling and with decay heat rejection to the passive Reactor Cavity Cooling System (RCCS). This scenario was evaluated, including numerous parametric variations of input parameters, like material properties and decay heat. It was found that significant safety margins exist, but that high confidence levels in the core effective thermal conductivity, the reactor vessel and RCCS thermal emissivities and the decay heat function are required to maintain this safety margin. Severe accident extensions of this depressurized core heatup scenario included the cases of complete RCCS failure, cases of massive air ingress, core heatup without scram and cases of degraded RCCS performance due to absorbing gases in the reactor cavity. Except for no-scram scenarios extending beyond 100 hr, the fuel never reached the limiting temperature of 1600 °C, below which measurable fuel failures are not expected. In some of the scenarios, excessive vessel and concrete temperatures could lead to investment losses but are not expected to lead to any source term beyond that from the circulating inventory. 19 refs., 56 figs., 11 tabs.
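
    The depressurized heatup scenario above balances decay heat against passive radiative rejection to the RCCS. A lumped-parameter sketch of that balance (all constants are illustrative round numbers, not PSID values, and real analyses use time-dependent decay heat):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def vessel_heatup(hours, decay_heat_w=1.0e6, heat_capacity_j_per_k=5.0e8,
                  emissivity=0.8, area_m2=300.0, t_rccs_k=300.0, dt_s=3600.0):
    """Lumped-parameter core/vessel heatup: constant decay heat in,
    thermal radiation out to the RCCS panels; explicit Euler integration."""
    t = t_rccs_k
    temps = [t]
    for _ in range(int(hours * 3600.0 / dt_s)):
        radiated = emissivity * SIGMA * area_m2 * (t ** 4 - t_rccs_k ** 4)
        t += (decay_heat_w - radiated) * dt_s / heat_capacity_j_per_k
        temps.append(t)
    return temps

temps = vessel_heatup(200.0)
t_final = temps[-1]
# Equilibrium temperature where radiated power equals decay heat:
t_eq = (300.0 ** 4 + 1.0e6 / (0.8 * SIGMA * 300.0)) ** 0.25
```

    The T^4 dependence is why the scenario is so sensitive to the emissivity values called out in the abstract: the equilibrium temperature scales with emissivity to the -1/4 power.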

  2. Standards of Performance for Community Based Educational Institutions: Quick Check of Institutional Performance.

    ERIC Educational Resources Information Center

    Association of Community Based Education, Washington, DC.

    Designed for use with "Standards of Performance for Community Based Educational Institutions" and a "Self-Assessment Workbook," this checklist helps community based educational institutions in identifying areas of performance which need improvement or further study and in assessing the overall effectiveness of the institution in carrying out its…

  3. Competency-Based Performance Appraisals: Improving Performance Evaluations of School Nutrition Managers and Assistants/Technicians

    ERIC Educational Resources Information Center

    Cross, Evelina W.; Asperin, Amelia Estepa; Nettles, Mary Frances

    2009-01-01

    Purpose: The purpose of the research was to develop a competency-based performance appraisal resource for evaluating school nutrition (SN) managers and assistants/technicians. Methods: A two-phased process was used to develop the competency-based performance appraisal resource for SN managers and assistants/technicians. In Phase I, draft…

  4. Creative use of scenarios. Final report, September 1986-April 1987

    SciTech Connect

    Tritten, J.J.

    1987-04-30

    "Surprise and the Single Scenario" is the title of an article by Sir James Cable. The essence of his thesis is that the United Kingdom should not prepare its military with just one contingency in mind. Related theses have been debated for many years: should Soviet military strategy be based upon the doctrinal assumption of quick escalation to nuclear war? Should U.S. nuclear forces be procured with the requirement to survive a well-executed surprise first strike? In considering these and related political-military questions, scenarios are often created to flesh out the concept being considered. For example, military planners in the USSR undoubtedly use alternative scenarios to consider possible courses that armed conflict could take, in order that they might assess the impact of short or long time scales on nuclear/conventional interactions. Similarly, varying scenarios are used in the U.S. to demonstrate the impact of different threat assumptions on the amount and types of nuclear forces that the U.S. should buy to guarantee an acceptable level of retaliation. The major point of all this, and of this report, is that in order to perform complex political-military assessments, political scientists either explicitly or implicitly use operations analysis techniques, including simulations, gaming and scenarios.

  5. Enhanced Engine Performance During Emergency Operation Using a Model-Based Engine Control Architecture

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2016-01-01

    This paper discusses the design and application of model-based engine control (MBEC) for use during emergency operation of the aircraft. The MBEC methodology is applied to the Commercial Modular Aero-Propulsion System Simulation 40k (CMAPSS40k) and features an optimal tuner Kalman filter (OTKF) to estimate unmeasured engine parameters, which can then be used for control. During an emergency scenario, normally conservative engine operating limits may be relaxed to increase the performance of the engine and the overall survivability of the aircraft; this comes at the cost of additional risk of an engine failure. The MBEC architecture offers the advantage of estimating key engine parameters that are not directly measurable. Estimating the unknown parameters allows for tighter control over these parameters and over the level of risk at which the engine operates. This allows the engine to achieve better performance than is possible when operating to more conservative limits on a related, measurable parameter.
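
    The estimation step can be illustrated with the simplest possible case: a scalar Kalman filter tracking a single unmeasured, nearly constant parameter from noisy measurements of a related quantity. This is a hedged sketch of the principle only, not the OTKF of the paper (which tunes a reduced-order estimator against a full engine model); the parameter value and noise levels are hypothetical:

```python
import random

random.seed(0)

TRUE_PARAM = 2.5          # hypothetical unmeasured engine health parameter
MEAS_NOISE_VAR = 0.04     # variance of the related, measurable signal

def kalman_constant(measurements, q=1e-6, r=MEAS_NOISE_VAR):
    """Scalar Kalman filter for a (nearly) constant parameter:
         model    theta_k = theta_{k-1} + w,  w ~ N(0, q)
         measure  y_k     = theta_k + v,      v ~ N(0, r)."""
    theta, p = 0.0, 1.0            # initial estimate and covariance
    for y in measurements:
        p += q                     # predict: covariance grows by process noise
        k = p / (p + r)            # Kalman gain
        theta += k * (y - theta)   # update with the innovation
        p *= (1.0 - k)             # covariance shrinks after the update
    return theta

ys = [TRUE_PARAM + random.gauss(0.0, MEAS_NOISE_VAR ** 0.5) for _ in range(200)]
est = kalman_constant(ys)
```

The estimate converges to the true parameter despite it never being measured cleanly, which is what lets a controller enforce limits on the estimated quantity itself rather than on a conservative surrogate.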

  6. Enhanced Engine Performance During Emergency Operation Using a Model-Based Engine Control Architecture

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2015-01-01

    This paper discusses the design and application of model-based engine control (MBEC) for use during emergency operation of the aircraft. The MBEC methodology is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40,000) and features an optimal tuner Kalman filter (OTKF) to estimate unmeasured engine parameters, which can then be used for control. During an emergency scenario, normally conservative engine operating limits may be relaxed to increase the performance of the engine and the overall survivability of the aircraft; this comes at the cost of additional risk of an engine failure. The MBEC architecture offers the advantage of estimating key engine parameters that are not directly measurable. Estimating the unknown parameters allows for tighter control over these parameters and over the level of risk at which the engine operates. This allows the engine to achieve better performance than is possible when operating to more conservative limits on a related, measurable parameter.

  7. A Native American exposure scenario.

    PubMed

    Harris, S G; Harper, B L

    1997-12-01

    EPA's Risk Assessment Guidance for Superfund (RAGS) and later documents provide guidance for estimating exposures received from suburban and agricultural activity patterns and lifestyles. However, these methods are not suitable for typical tribal communities whose members pursue, at least in part, traditional lifestyles. These lifestyles are derived from a long association with all of the resources in a particular region. We interviewed 35 members of a Columbia River Basin tribe to develop a lifestyle-based subsistence exposure scenario that represents a midrange exposure that a traditional tribal member would receive. This scenario provides a way to partially satisfy Executive Order 12898 on environmental justice, which requires a specific evaluation of impacts from federal actions to peoples with subsistence diets. Because a subsistence diet is only a portion of what is important to a traditional lifestyle, we also used information obtained from the interviews to identify parameters for evaluating impacts to environmental and sociocultural quality of life. PMID:9463932
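
    Scenario parameters of this kind feed the standard RAGS-style chronic daily intake calculation, CDI = (C x IR x EF x ED) / (BW x AT). A sketch comparing a suburban-default fish-ingestion pathway against a subsistence one; the formula is the standard intake equation, but every number below is purely illustrative, not an interview-derived value from the study:

```python
def chronic_daily_intake(c, ir, ef, ed, bw, at):
    """RAGS-style intake [mg/kg-day]:
         c  contaminant concentration [mg/kg]
         ir ingestion rate [kg/day]
         ef exposure frequency [days/yr], ed exposure duration [yr]
         bw body weight [kg], at averaging time [days]."""
    return c * ir * ef * ed / (bw * at)

# Hypothetical comparison: a subsistence fish-ingestion rate roughly ten times
# the suburban default dominates the intake difference.
suburban = chronic_daily_intake(c=0.1, ir=0.0175, ef=350, ed=30, bw=70, at=30 * 365)
subsistence = chronic_daily_intake(c=0.1, ir=0.170, ef=365, ed=70, bw=70, at=70 * 365)
```

The point of the scenario work is precisely that parameters like IR, EF and ED differ so much for traditional lifestyles that suburban defaults understate exposure.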

  8. Acting performance and flow state enhanced with sensory-motor rhythm neurofeedback comparing ecologically valid immersive VR and training screen scenarios.

    PubMed

    Gruzelier, John; Inoue, Atsuko; Smart, Roger; Steed, Anthony; Steffert, Tony

    2010-08-16

    Actors were trained in sensory-motor rhythm (SMR) neurofeedback interfaced with a computer rendition of a theatre auditorium. Enhancement of SMR led to changes in the lighting, while inhibition of theta and high beta led to a reduction in intrusive audience noise. Participants were randomised to a virtual reality (VR) representation in a ReaCTor, with surrounding image projection seen through glasses, or to a 2D computer screen, which is the conventional neurofeedback medium. In addition, there was a no-training comparison group. Acting performance was evaluated by three experts from both filmed studio monologues and Hamlet excerpts on the stage of Shakespeare's Globe Theatre. Neurofeedback learning reached an asymptote earlier with ReaCTor training than with the computer screen, as did identification of the required mental state following training, though both groups reached the same asymptote. These advantages were paralleled by higher ratings of acting performance overall, well-rounded performance, and especially the creativity subscale, including imaginative expression, conviction and characterisation. On the Flow State scales, both neurofeedback groups scored higher than the no-training controls on self-ratings of sense of control, confidence and feeling at one. This is the first demonstration of enhancement of artistic performance with eyes-open neurofeedback training, previously demonstrated only with eyes-closed slow-wave training. Efficacy is attributed to psychological engagement through the ecologically relevant learning context of the acting space, putatively allowing transfer to the real world otherwise achieved with slow-wave training through imaginative visualisation. The immersive VR technology was more successful than a 2D rendition. PMID:20542087

  9. Biomass Scenario Model

    SciTech Connect

    2015-09-01

    The Biomass Scenario Model (BSM) is a unique, carefully validated, state-of-the-art dynamic model of the domestic biofuels supply chain which explicitly focuses on policy issues, their feasibility, and potential side effects. It integrates resource availability, physical/technological/economic constraints, behavior, and policy. The model uses a system dynamics simulation (not optimization) to model dynamic interactions across the supply chain.
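
    System dynamics models of this kind reduce to stocks updated over time by inflows and outflows, with policy levers modulating the flows. A toy stock-and-flow sketch, in no way the BSM itself; the capacity stock, the investment response to a policy incentive, and the retirement rate are all invented for illustration:

```python
def simulate(years=20, dt=0.25):
    """Toy stock-and-flow dynamics: installed biorefinery capacity (the stock)
    grows via investment boosted by a policy incentive and shrinks via
    retirement. Euler integration with step dt (years)."""
    capacity = 1.0        # stock: installed capacity (arbitrary units)
    incentive = 0.3       # policy lever: fractional boost to investment
    history = []
    t = 0.0
    while t < years:
        investment = 0.1 * capacity * (1.0 + incentive)  # inflow
        retirement = 0.05 * capacity                     # outflow
        capacity += dt * (investment - retirement)
        history.append(capacity)
        t += dt
    return history
```

Because the flows feed back on the stock, a modest change to the incentive compounds over the simulated horizon, which is the kind of policy side effect a system dynamics simulation (as opposed to a one-shot optimization) is built to expose.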

  10. The SAFRR Tsunami Scenario

    USGS Publications Warehouse

    Porter, K.; Jones, Lucile M.; Ross, Stephanie L.; Borrero, J.; Bwarie, J.; Dykstra, D.; Geist, Eric L.; Johnson, L.; Kirby, Stephen H.; Long, K.; Lynett, P.; Miller, K.; Mortensen, Carl E.; Perry, S.; Plumlee, G.; Real, C.; Ritchie, L.; Scawthorn, C.; Thio, H.K.; Wein, Anne; Whitmore, P.; Wilson, R.; Wood, Nathan J.; Ostbo, Bruce I.; Oates, Don

    2013-01-01

    The U.S. Geological Survey and several partners operate a program called Science Application for Risk Reduction (SAFRR) that produces (among other things) emergency planning scenarios for natural disasters. The scenarios show how science can be used to enhance community resiliency. The SAFRR Tsunami Scenario describes potential impacts of a hypothetical, but realistic, tsunami affecting California (as well as the west coast of the United States, Alaska, and Hawaii) for the purpose of informing planning and mitigation decisions by a variety of stakeholders. The scenario begins with an Mw 9.1 earthquake off the Alaska Peninsula. With Pacific basin-wide modeling, we estimate up to 5 m waves and 10 m/sec currents would strike California 5 hours later. In marinas and harbors, 13,000 small boats are damaged or sunk (1 in 3) at a cost of $350 million, causing navigation and environmental problems. Damage in the Ports of Los Angeles and Long Beach amounts to $110 million, half of it water damage to vehicles and containerized cargo. Flooding of coastal communities affects 1800 city blocks, resulting in $640 million in damage. The tsunami damages 12 bridge abutments and 16 lane-miles of coastal roadway, costing $85 million to repair. Fire and business interruption losses will substantially add to direct losses. Flooding affects 170,000 residents and workers. A wide range of environmental impacts could occur. An extensive public education and outreach program is underway, as well as an evaluation of the overall effort.

  11. FUTURE SCENARIOS OF CHANGE IN WILDLIFE HABITAT

    EPA Science Inventory

    Studies in Pennsylvania, Iowa, California, and Oregon show varying losses of terrestrial wildlife habitat in scenarios based on different assumptions about future human land use patterns. Retrospective estimates of losses of habitat since Euro-American settlement in several stud...

  12. Ecological performance of electrical consumer products: the influence of automation and information-based measures.

    PubMed

    Sauer, Juergen; Wiese, Bettina S; Rüttinger, Bruno

    2004-01-01

    Being concerned with the environmental impact of electrical consumer products, this article examines possibilities of influencing ecological user performance through design features. Furthermore, it looks at the relationship of user characteristics and ecological performance. The impact of level of automation and type of control labelling on ecological user performance was examined in a lab-based experimental scenario with 36 users. In addition to performance indicators, a range of user variables (e.g., self-reported domestic behaviour, environmental knowledge and attitude) was measured to assess their influence on user behaviour. The results showed that low-level automation improved ecological performance whereas no such positive effect was observed for enhanced display-control labelling. Furthermore, the results suggested that the user's mental model of ecological performance was rather limited. No relationship was found between environmental knowledge, attitude and performance. The findings pointed to the strong prevalence of habits in the domestic domain. The implications of the results for designers of consumer products are discussed.

  13. Performance of standard fluoroscopy antiscatter grids in flat-detector-based cone-beam CT

    NASA Astrophysics Data System (ADS)

    Wiegert, Jens; Bertram, Matthias; Schaefer, Dirk; Conrads, Norbert; Timmer, Jan; Aach, Til; Rose, Georg

    2004-05-01

    In this paper, the performance of focused lamellar anti-scatter grids, as currently used in fluoroscopy, is studied in order to derive guidelines for grid usage in flat-detector-based cone beam CT. The investigation aims at obtaining the signal-to-noise ratio improvement factor achieved by the use of anti-scatter grids. First, the results of detailed Monte Carlo simulations as well as measurements are presented. From these, the general characteristics of the impinging field of scattered and primary photons are derived. Phantoms modeling the head, thorax and pelvis regions have been studied for various imaging geometries with varying phantom size, cone and fan angles, and patient-detector distances. Second, simulation results are shown for ideally focused, vacuum-spaced grids as a best-case approach, as well as for grids with realistic spacing materials. The grid performance is evaluated by means of the primary and scatter transmission and the signal-to-noise ratio improvement factor as a function of imaging geometry and grid parameters. For a typical flat-detector cone beam CT setup, the grid selectivity, and thus the performance of anti-scatter grids, is much lower than in setups where the grid is located directly behind the irradiated object. While for small object-to-grid distances a standard grid improves the SNR, for geometries as used in flat-detector-based cone beam CT the SNR is deteriorated by the use of an anti-scatter grid in many application scenarios. This holds even for the pelvic region. Standard fluoroscopy anti-scatter grids were thus found to decrease the SNR in many application scenarios of cone beam CT, due to the large patient-detector distance, and therefore have only a limited benefit in flat-detector-based cone beam CT.
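
    The trade-off reported above follows directly from the ideal quantum-limited SNR improvement factor of a grid, SNRIF = Tp * sqrt((1 + SPR) / (Tp + Ts * SPR)), where Tp and Ts are the primary and scatter transmissions and SPR is the scatter-to-primary ratio reaching the detector. A short sketch with assumed transmission values (not those of the paper's grids) showing that a grid helps at the high SPR of a grid close to the object but hurts at the low SPR left after a large air gap:

```python
import math

def snr_improvement(tp, ts, spr):
    """Quantum-limited SNR improvement factor of an anti-scatter grid:
         SNRIF = Tp * sqrt((1 + SPR) / (Tp + Ts * SPR))
       tp, ts: primary/scatter transmission of the grid
       spr:    scatter-to-primary ratio at the detector without the grid."""
    return tp * math.sqrt((1.0 + spr) / (tp + ts * spr))

# Assumed grid: 70% primary transmission, 10% scatter transmission.
close_geometry = snr_improvement(0.7, 0.1, 5.0)  # high SPR: grid near the object
cbct_geometry = snr_improvement(0.7, 0.1, 0.3)   # low SPR: large air gap, CBCT-like
```

With SNRIF below 1 the grid removes more primary signal than the residual scatter justifies, which is the mechanism behind the paper's conclusion for large patient-detector distances.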

  14. Comparative study of performance of neutral axis tracking based damage detection

    NASA Astrophysics Data System (ADS)

    Soman, R.; Malinowski, P.; Ostachowicz, W.

    2015-07-01

    This paper presents a comparative study of a novel SHM technique for damage isolation. The performance of the Neutral Axis (NA) tracking based damage detection strategy is compared to other popularly used vibration-based damage detection methods, viz. ECOMAC, the Mode Shape Curvature Method and the Strain Flexibility Index Method. The sensitivity of the novel method is compared under changing ambient temperature conditions and in the presence of measurement noise. Finite Element Analysis (FEA) of the DTU 10 MW wind turbine was conducted to compare the local damage identification capability of each method, and the results are presented. Under the conditions examined, the proposed method was found to be robust to ambient condition changes and measurement noise. Its damage identification is in some cases on par with the methods mentioned in the literature, and better in others, under the investigated damage scenarios.
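
    Neutral axis tracking rests on the Euler-Bernoulli assumption that bending strain varies linearly through the section, eps(y) = kappa * (y - y_na), so two strain sensors at known heights suffice to locate the neutral axis; a shift in y_na under comparable loading indicates a local stiffness change. A minimal sketch with hypothetical sensor heights and strains (not the paper's turbine data):

```python
def neutral_axis(eps1, y1, eps2, y2):
    """Neutral-axis height from two bending strains at heights y1, y2,
    assuming the linear profile eps(y) = kappa * (y - y_na):
         y_na = (eps1*y2 - eps2*y1) / (eps1 - eps2)."""
    return (eps1 * y2 - eps2 * y1) / (eps1 - eps2)

# Hypothetical section of unit depth: symmetric strains put the NA at
# mid-height; a changed strain ratio (e.g. after stiffness loss) shifts it.
healthy = neutral_axis(-100e-6, 0.0, 100e-6, 1.0)  # -> 0.5
damaged = neutral_axis(-80e-6, 0.0, 120e-6, 1.0)   # -> 0.4
```

Because y_na depends only on the strain ratio, not on the load magnitude, the indicator is comparatively insensitive to ambient and loading variability, which is consistent with the robustness the study reports.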

  15. Performance assessment of simulated 3D laser images using Geiger-mode avalanche photo-diode: tests on simple synthetic scenarios

    NASA Astrophysics Data System (ADS)

    Coyac, Antoine; Hespel, Laurent; Riviere, Nicolas; Briottet, Xavier

    2015-10-01

    In the past few decades, laser imaging has demonstrated its potential for delivering accurate range images of objects or scenes, even at long range or under bad weather conditions (rain, fog, day and night vision). Great improvements have been made in the conception and development of single- and multi-element infrared sensors, concerning embeddability, readout-circuitry capacity, and pixel resolution and sensitivity, allowing a wide diversity of applications (e.g., enhanced vision, long-distance target detection and reconnaissance, 3D DSM generation). Unfortunately, it is often difficult to have all the instruments at one's disposal to compare their performance for a given application. Laser imaging simulation has proven to be an interesting alternative to acquiring real data, offering higher flexibility to perform this sensor comparison while being time- and cost-efficient. In this paper, we present a 3D laser imaging end-to-end simulator using a focal plane array with Geiger-mode detection, named LANGDOC. This work aims to highlight the interest and capability of this new generation of photo-diode arrays, especially for airborne mapping and surveillance of high-risk areas.
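
    Geiger-mode detection is inherently probabilistic: with Poisson photon statistics, a pixel triggers in a range gate with probability 1 - exp(-(PDE * N_signal + N_noise)). A sketch of this per-gate model; the photon-detection efficiency and noise-count values are generic assumptions, not parameters of the LANGDOC simulator:

```python
import math

def trigger_probability(mean_signal_photons, pde=0.3, mean_noise_counts=0.01):
    """Poisson model of one Geiger-mode APD pixel and range gate:
         P(at least one avalanche) = 1 - exp(-(pde * N_signal + N_noise))
       pde: photon detection efficiency; mean_noise_counts folds in dark
       counts and background for the gate."""
    return 1.0 - math.exp(-(pde * mean_signal_photons + mean_noise_counts))
```

This saturating behavior is why Geiger-mode sensors are typically characterized by detection and false-alarm probabilities per gate rather than by a linear signal level, and why end-to-end simulation is so useful for predicting their performance on a given scenario.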

  16. Environmental assessment of spatial plan policies through land use scenarios

    SciTech Connect

    Geneletti, Davide

    2012-01-15

    This paper presents a method based on scenario analysis to compare the environmental effects of different spatial plan policies in a range of possible futures. The study aimed at contributing to overcoming two limitations encountered in Strategic Environmental Assessment (SEA) for spatial planning: poor exploration of how the future might unfold, and poor consideration of alternative plan policies. Scenarios were developed through what-if functions and spatial modeling in a Geographical Information System (GIS), and consisted of maps representing future land uses under different assumptions on key driving forces. The use of land use scenarios provided a representation of what the different policies would look like on the ground. This allowed a better understanding of the policies' implications for the environment, which could be measured through a set of indicators. The research undertook a case-study approach by developing and assessing land use scenarios for the future growth of Caia, a strategically located and fast-developing town in rural Mozambique. The effects of alternative spatial plan policies were assessed against a set of environmental performance indicators, including deforestation, loss of agricultural land, encroachment of flood-prone areas and wetlands, and access to water sources. In this way, critical environmental effects related to the implementation of each policy were identified and discussed, suggesting possible strategies to address them. Research highlights: the method contributes to two critical issues in SEA, exploration of the future and consideration of alternatives; future scenarios are used to test the environmental performance of different spatial plan policies under uncertainty; spatially explicit land use scenarios provide a representation of what different policies would look like on the ground.

  17. Spent fuel receipt scenarios study

    SciTech Connect

    Ballou, L.B.; Montan, D.N.; Revelli, M.A.

    1990-09-01

    This study reports on the results of an assignment from the DOE Office of Civilian Radioactive Waste Management to evaluate the effects of different scenarios for receipt of spent fuel on the potential performance of the waste packages in the proposed Yucca Mountain high-level waste repository. The initial evaluations were performed and an interim letter report was prepared during the fall of 1988. Subsequently, the scope of work was expanded and additional analyses were conducted in 1989. This report combines the results of the two phases of the activity. This study is part of a broader effort to investigate the options available to the DOE and the nuclear utilities for selection of spent fuel for acceptance into the Federal Waste Management System for disposal. Each major element of the system has evaluated the effects of various options on its own operations, with the objective of providing the basis for performing system-wide trade-offs and determining an optimum acceptance scenario. Therefore, this study considers different scenarios for receipt of spent fuel by the repository only from the narrow perspective of their effect on the very-near-field temperatures in the repository following permanent closure. This report is organized into three main sections. The balance of this section is devoted to a statement of the study objective and a summary of the assumptions. The second section of the report contains a discussion of the major elements of the study. The third section summarizes the results of the study and draws some conclusions from them. The appendices include copies of the waste acceptance schedule and the existing and projected spent fuel inventory that were used in the study. 10 refs., 27 figs.

  18. Estimating the economic impact of a repository from scenario-based surveys: Models of the relation of stated intent to actual behavior

    SciTech Connect

    Easterling, D.; Morwitz, V.; Kunreuther, H.

    1990-12-01

    The task of estimating the economic impact of a facility as novel and long-lived as a high-level nuclear waste (HLNW) repository is fraught with uncertainty. One approach to the forecasting problem is to survey economic agents as to how they would respond when confronted with hypothetical repository scenarios. A series of such studies conducted for the state of Nevada have examined the potential impact of a Yucca Mountain repository on behavior such as planning conventions, attending conventions, vacationing, outmigration, immigration, and business location. In each case, respondents drawn from a target population report on whether a particular repository event (either some form of an accident, or simply the presence of the facility) would cause them to act any differently than they otherwise would. The responses to such a survey provide an indication of whether or not economic behavior would be altered. However, the analysis is inevitably plagued by the question of how much credence to place in the reports of intended behavior; can we believe what people report they would do in a hypothetical situation? The present study examines a more precise version of this question regarding the validity of stated intent data. After reviewing a variety of literature in the area of intent versus actual behavior, we provide an answer to the question, "What levels of actual behavior are consistent with the intent data that have been observed in the repository surveys?" More formally, we assume that we are generally interested in predicting the proportion of a sample who will actually perform a target behavior. 86 refs., 6 figs., 9 tabs.
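
    One common way to formalize the intent-to-behavior question is to discount each stated-intent category by an assumed follow-through rate and sum, giving the level of actual behavior consistent with the survey responses. The categories and weights below are purely illustrative assumptions, not the values estimated in the report:

```python
def predicted_behavior(stated, weights):
    """Expected proportion actually performing the behavior, discounting
    each stated-intent category by an assumed follow-through rate."""
    return sum(stated[k] * weights[k] for k in stated)

# Hypothetical survey shares and follow-through rates.
stated = {"definitely": 0.10, "probably": 0.25, "maybe": 0.40, "no": 0.25}
weights = {"definitely": 0.80, "probably": 0.40, "maybe": 0.10, "no": 0.02}

predicted = predicted_behavior(stated, weights)
```

Varying the weight vector over a plausible range taken from the intent-behavior literature then yields a band of actual behavior consistent with the stated-intent data, which is the kind of answer the study sets out to provide.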

  19. Mapping of multiple parameter m-health scenarios to mobile WiMAX QoS variables.

    PubMed

    Alinejad, Ali; Philip, N; Istepanian, R S H

    2011-01-01

    Multiparameter m-health scenarios with bandwidth-demanding requirements will be one of the key applications in future 4G mobile communication systems. These applications will potentially require specific spectrum allocations with higher quality of service (QoS) requirements. Furthermore, one of the key 4G technologies targeting m-health will be medical applications based on WiMAX systems. Hence, it is timely to evaluate such multiparametric m-health scenarios over mobile WiMAX networks. In this paper, we address the preliminary performance analysis of a mobile WiMAX network for multiparametric telemedical scenarios. In particular, we map the medical QoS to typical WiMAX QoS parameters to optimise the performance of these parameters in a typical m-health scenario. Preliminary performance analyses of the proposed multiparametric scenarios are evaluated to provide essential information for future medical QoS requirements and constraints in these telemedical network environments. PMID:22254612
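
    Mobile WiMAX (IEEE 802.16e) defines five scheduling service classes (UGS, rtPS, ertPS, nrtPS and BE), and the mapping exercise amounts to assigning each medical traffic type to the class whose delay and rate guarantees fit it. A sketch of such a mapping; the class names are standard, but the specific assignments and stream names are illustrative assumptions rather than the paper's mapping:

```python
# Illustrative mapping of telemedical traffic to IEEE 802.16e service classes.
MEDICAL_QOS_MAP = {
    "ecg_realtime":   ("UGS",   "constant-rate, delay-critical vital signs"),
    "video_consult":  ("rtPS",  "variable-rate real-time video"),
    "voice":          ("ertPS", "VoIP-style traffic with silence suppression"),
    "medical_images": ("nrtPS", "delay-tolerant bulk image transfer"),
    "records_sync":   ("BE",    "background traffic, no guarantees needed"),
}

def service_class(stream):
    """Return the assumed 802.16e scheduling class for a medical stream."""
    return MEDICAL_QOS_MAP[stream][0]
```

In a real deployment each class then carries its own parameter set (maximum sustained rate, maximum latency, jitter tolerance), which is where the medical QoS requirements are actually encoded.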

  1. [The quality control based on the predictable performance].

    PubMed

    Zheng, D X

    2016-09-01

    In the conventional prosthodontic workflow, the clinical performance of a prosthesis can only be evaluated at the last step. This often causes failure because of the discrepancy between the expectation and the final performance. In response to this situation, quality control based on predictable results has been suggested. It is a new idea based on reverse thinking: it focuses on the needs of the patient and puts the final performance of the prosthesis first. With a prosthodontically driven procedure, dentists can bring the final performance into agreement with the expectation. PMID:27596338

  2. Underground infrastructure damage for a Chicago scenario

    SciTech Connect

    Dey, Thomas N; Bos, Randall J

    2011-01-25

    Estimating the effects of an urban IND (improvised nuclear device) on underground structures and underground utilities is a challenging task. Nuclear effects tests performed at the Nevada Test Site (NTS) during the era of nuclear weapons testing provide much information on how underground military structures respond. Transferring this knowledge to answer questions about the urban civilian environment is needed to help plan responses to IND scenarios. Explosions just above the ground surface can couple only a small fraction of the blast energy into an underground shock, and the various forms of nuclear radiation have limited penetration into the ground. While the shock transmitted into the ground carries only a small fraction of the blast energy, peak stresses are generally higher and peak ground displacement is lower than in the air blast. While underground military structures are often designed to resist stresses substantially higher than those due to the overlying rocks and soils (overburden), civilian structures such as subways and tunnels would generally only need to resist overburden conditions with a suitable safety factor. Just as we expect the buildings themselves to channel and shield air blast above ground, basements and other underground openings, as well as changes of geology, will channel and shield the underground shock wave. While a weaker shock is expected in an urban environment, small displacements on nearby faults and, more likely, soils being displaced past building foundations where utility lines enter could readily damage or disable these services. Immediately near an explosion, the blast can 'liquefy' a saturated soil, creating a quicksand-like condition for a period of time. We extrapolate the nuclear effects experience to a Chicago-based scenario. We consider the TARP (Tunnel and Reservoir Project) and subway system and the underground lifeline (electric, gas, water, etc.) system, and provide guidance for planning this scenario.

  3. Riparian vegetation structure under desertification scenarios

    NASA Astrophysics Data System (ADS)

    Rosário Fernandes, M.; Segurado, Pedro; Jauch, Eduardo; Ferreira, M. Teresa

    2015-04-01

    Riparian areas are responsible for many ecological and ecosystem services, including the filtering function, that are considered crucial to the preservation of water quality and social benefits. The main goal of this study is to quantify and understand riparian variability under desertification scenario(s) and to identify the optimal riparian indicators for water scarcity and droughts (WS&D), thereby improving river basin management. This study was performed in the Iberian Tâmega basin, using riparian woody patches mapped by visual interpretation of Google Earth imagery along 130 sampling units of 250 m long river stretches. Eight riparian structural indicators, related to the lateral dimension, weighted area and shape complexity of riparian patches, were calculated using the Patch Analyst extension for ArcGIS 10. A set of 29 hydrological, climatic and hydrogeomorphological variables was computed by a water modelling system (MOHID), using monthly meteorological data between 2008 and 2014. Land-use classes were also calculated, in a 250 m buffer surrounding each sampling unit, using a classification system based on Corine Land Cover. Boosted Regression Trees identified Mean width (MW) as the optimal riparian indicator for water scarcity and drought, followed by the Weighted Class Area (WCA) (classification accuracy = 0.79 and 0.69, respectively). Average flow and Strahler number were consistently selected by all boosted models as the most important explanatory variables. However, a combined effect of hydrogeomorphology and land use can explain the high variability found in the riparian width, mainly in the Tâmega tributaries. Riparian patches are larger towards the Tâmega river mouth, although with lower shape complexity, probably related to more continuous and almost monospecific stands. Climatic, hydrological and land use scenarios, singly and combined, were used to quantify the riparian variability responding to these changes, and to assess the loss of riparian

  4. Model-based Scenario Analysis of the Impact of Remediation Measures on Metal Leaching from Soils Contaminated by Historic Smelter Emissions.

    PubMed

    Joris, Ingeborg; Bronders, Jan; van der Grift, Bas; Seuntjens, Piet

    2014-05-01

    A spatially distributed model for leaching of Cd from the unsaturated zone was developed for the Belgian-Dutch transnational Kempen region. The model uses as input land-use maps, atmospheric deposition data, and soil data and is part of a larger regional model that simulates transport of Cd in soil, groundwater, and surface water. A new method for deriving deposition from multiple sites was validated using soil data in different wind directions. Leaching was calculated for the period 1890 to 2010 using a reconstruction of metal loads in the region. The model was able to reproduce spatial patterns of concentrations in soil and groundwater and predicted the concentration in shallow groundwater adequately well for the purpose of evaluating management options. For 42% of the data points, measurements and calculations were within the same concentration class. The model was used for forecasting under a reference scenario, an autonomous development scenario including climate change, and a scenario with implementation of remediation measures. The impact of autonomous development (under the most extreme scenario of climatic change) amounted to an increase of 10% in cumulative Cd flux after 100 yr as compared with the reference scenario. The impact of remediation measures was mainly local and is less pronounced (i.e., only 3% change in cumulative flux at the regional scale). The integrated model served as a tool to assist in developing management strategies and prioritization of remediation of the widespread heavy metal contamination in the region. PMID:25602815

  6. Design of a Performance-Adaptive PID Control System Based on Modeling Performance Assessment

    NASA Astrophysics Data System (ADS)

    Yamamoto, Toru

    In industrial processes, as represented by petroleum and refinery processes, it is necessary to establish a performance-driven control strategy to improve productivity, in which the control performance is first evaluated and the controller is then reconstructed. This paper describes a design scheme for performance-adaptive PID controllers based on this control mechanism. In the proposed scheme, system identification is performed according to the result of modeling performance assessment, and PID parameters are computed using the newly estimated system parameters. The desired control performance is taken into account in calculating the PID parameters. The behaviour of the proposed control scheme is examined numerically in several simulation examples.
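    The identify-then-retune loop described above can be sketched in Python. This is a minimal illustration, not the paper's actual algorithm: it assumes a hypothetical first-order plant model y[k+1] = a*y[k] + b*u[k], uses the one-step prediction-error variance as the modeling performance index, and applies IMC-style PI tuning to the newly estimated parameters (the plant, the threshold, and the tuning constant lam are all invented for the example).

```python
import numpy as np

def fit_first_order(u, y):
    """Least-squares fit of the assumed plant model y[k+1] = a*y[k] + b*u[k].
    len(y) must be len(u) + 1."""
    X = np.column_stack([y[:-1], u])
    a, b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    return float(a), float(b)

def modeling_performance(u, y, a, b):
    """Modeling performance index: variance of the one-step prediction error."""
    e = y[1:] - (a * y[:-1] + b * u)
    return float(np.var(e))

def pi_from_model(a, b, dt=1.0, lam=2.0):
    """IMC-style PI gains from the identified model via its continuous-time
    equivalents: gain K = b/(1-a), time constant tau = -dt/ln(a)."""
    K = b / (1.0 - a)
    tau = -dt / np.log(a)
    kp = tau / (K * lam)
    ki = kp / tau
    return kp, ki

# Closed test case: simulate a known plant, re-identify it, and retune.
rng = np.random.default_rng(0)
u = rng.normal(size=200)
y = np.zeros(201)
for k in range(200):
    y[k + 1] = 0.9 * y[k] + 0.5 * u[k]

a, b = fit_first_order(u, y)
perf = modeling_performance(u, y, a, b)
if perf < 1e-6:  # model adequate: compute gains from the fresh estimates
    kp, ki = pi_from_model(a, b)
```

    In a running loop, a large `perf` would instead trigger re-identification on a new data window before the gains are recomputed.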

  7. Performance-Based Pay in the Federal Government. Research Brief

    ERIC Educational Resources Information Center

    National Center on Performance Incentives, 2008

    2008-01-01

    In "Performance-Based Pay in the Federal Government"--a paper presented at the February 2008 National Center on Performance Incentives research to policy conference--Steve Nelson discusses the evolution of employee pay systems in the federal government, from the inception of the General Schedule to continuing interest in creating more…

  8. Assessment in Performance-Based Secondary Music Classes

    ERIC Educational Resources Information Center

    Pellegrino, Kristen; Conway, Colleen M.; Russell, Joshua A.

    2015-01-01

    After sharing research findings about grading and assessment practices in secondary music ensemble classes, we offer examples of commonly used assessment tools (ratings scale, checklist, rubric) for the performance ensemble. Then, we explore the various purposes of assessment in performance-based music courses: (1) to meet state, national, and…

  9. Lunar transportation scenarios utilising the Space Elevator

    NASA Astrophysics Data System (ADS)

    Engel, Kilian A.

    2005-07-01

    The Space Elevator (SE) concept has begun to receive an increasing amount of attention within the space community over the past couple of years and is no longer widely dismissed as pure science fiction. In light of the renewed interest in a possibly sustained human presence on the Moon and the fact that transportation and logistics form the bottleneck of many conceivable lunar missions, it is interesting to investigate what role the SE could eventually play in implementing an efficient Earth to Moon transportation system. The elevator allows vehicles to ascend from Earth and be injected into a trans-lunar trajectory without the use of chemical thrusters, thus eliminating gravity loss, aerodynamic loss and the need for high-thrust multistage launch systems. Such a system therefore promises substantial savings of propellant and structural mass and could greatly increase the efficiency of Earth to Moon transportation. This paper analyzes different elevator-based trans-lunar transportation scenarios and characterizes them in terms of a number of benchmark figures. The transportation scenarios include direct elevator-launched trans-lunar trajectories, elevator-launched trajectories via L1 and L2, as well as launch from an Earth-based elevator and subsequent rendezvous with lunar elevators placed either on the near or on the far side of the Moon. The benchmark figures by which the different transfer options are characterized and evaluated include release radius (RR), required Δv, transfer times as well as other factors such as accessibility of different lunar latitudes, frequency of launch opportunities and mission complexity. The performances of the different lunar transfer options are compared with each other as well as with the performance of conventional mission concepts, represented by Apollo.

  10. Lunar transportation scenarios utilising the Space Elevator.

    PubMed

    Engel, Kilian A

    2005-01-01

    The Space Elevator (SE) concept has begun to receive an increasing amount of attention within the space community over the past couple of years and is no longer widely dismissed as pure science fiction. In light of the renewed interest in a possibly sustained human presence on the Moon and the fact that transportation and logistics form the bottleneck of many conceivable lunar missions, it is interesting to investigate what role the SE could eventually play in implementing an efficient Earth to Moon transportation system. The elevator allows vehicles to ascend from Earth and be injected into a trans-lunar trajectory without the use of chemical thrusters, thus eliminating gravity loss, aerodynamic loss and the need for high-thrust multistage launch systems. Such a system therefore promises substantial savings of propellant and structural mass and could greatly increase the efficiency of Earth to Moon transportation. This paper analyzes different elevator-based trans-lunar transportation scenarios and characterizes them in terms of a number of benchmark figures. The transportation scenarios include direct elevator-launched trans-lunar trajectories, elevator-launched trajectories via L1 and L2, as well as launch from an Earth-based elevator and subsequent rendezvous with lunar elevators placed either on the near or on the far side of the Moon. The benchmark figures by which the different transfer options are characterized and evaluated include release radius (RR), required delta v, transfer times as well as other factors such as accessibility of different lunar latitudes, frequency of launch opportunities and mission complexity. The performances of the different lunar transfer options are compared with each other as well as with the performance of conventional mission concepts, represented by Apollo.

  11. Workforce management strategies in a disaster scenario.

    SciTech Connect

    Kelic, Andjelka; Turk, Adam L.

    2008-08-01

    A model of the repair operations of the voice telecommunications network is used to study labor management strategies under a disaster scenario where the workforce is overwhelmed. The model incorporates overtime and fatigue functions and optimizes the deployment of the workforce based on the cost of the recovery and the time it takes to recover. The analysis shows that the current practices employed in workforce management in a disaster scenario are not optimal and more strategic deployment of that workforce is beneficial.

  12. Evaluation of a weather generator-based method for statistically downscaling non-stationary climate scenarios for impact assessment at a point scale

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Non-stationarity is a major concern in statistically downscaling climate change scenarios for impact assessment. This study evaluates whether a statistical downscaling method is fully applicable for generating daily precipitation under non-stationary conditions in a wide range of climatic zo...

  13. Improving Public School Performance through Vision-Based Leadership

    ERIC Educational Resources Information Center

    Kantabutra, Sooksan

    2005-01-01

    While vision-based leadership, frequently referred to as transformational leadership in the education literature, is widely regarded as critical to successful organization transformation, little research has been conducted into the relationship between vision-based leadership and public school performance in Thailand. Derived from substantial…

  14. Evaluating hydrological model performance using information theory-based metrics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...

  15. High performance computing for three-dimensional agent-based molecular models.

    PubMed

    Pérez-Rodríguez, G; Pérez-Pérez, M; Fdez-Riverola, F; Lourenço, A

    2016-07-01

    Agent-based simulations are increasingly popular in exploring and understanding cellular systems, but the natural complexity of these systems and the desire to grasp different modelling levels demand cost-effective simulation strategies and tools. In this context, the present paper introduces novel sequential and distributed approaches for the three-dimensional agent-based simulation of individual molecules in cellular events. These approaches are able to describe the dimensions and position of the molecules with high accuracy and thus study the critical effect of spatial distribution on cellular events. Moreover, two of the approaches allow multi-thread high-performance simulations, distributing the three-dimensional model in a platform independent and computationally efficient way. Evaluation addressed the reproduction of molecular scenarios and different scalability aspects of agent creation and agent interaction. The three approaches simulate common biophysical and biochemical laws faithfully. The distributed approaches show improved performance when dealing with large agent populations while the sequential approach is better suited for small to medium size agent populations. Overall, the main new contribution of the approaches is the ability to simulate three-dimensional agent-based models at the molecular level with reduced implementation effort and moderate-level computational capacity. Since these approaches have a generic design, they have great potential to be used in any event-driven agent-based tool. PMID:27372059
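    A minimal sketch of the kind of three-dimensional, individual-molecule simulation described here, assuming Brownian agents with an explicit radius in a reflective box. All dimensions, coefficients and counts are hypothetical, and the event handling and distribution machinery of the actual tools is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
# agents, box edge, agent radius, diffusion coefficient, time step (assumed units)
N, BOX, RADIUS, D, DT = 50, 100.0, 1.0, 5.0, 0.1

pos = rng.uniform(0.0, BOX, size=(N, 3))  # each molecule is an agent with a 3-D position

def step(pos):
    """One Brownian step per agent: Gaussian displacement with per-axis
    standard deviation sqrt(2*D*DT), then reflection at the box walls."""
    new = pos + rng.normal(0.0, np.sqrt(2.0 * D * DT), size=pos.shape)
    new = np.where(new < 0.0, -new, new)             # reflect at the lower walls
    new = np.where(new > BOX, 2.0 * BOX - new, new)  # reflect at the upper walls
    return new

def contacts(pos):
    """Number of agent pairs close enough to interact (centre distance < 2R)."""
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    return int(np.triu(d < 2.0 * RADIUS, k=1).sum())

for _ in range(100):  # in a full model, each contact would trigger a reaction event
    pos = step(pos)
n_contacts = contacts(pos)
```

    The O(N^2) pairwise contact check is the part that the paper's distributed approaches parallelize for large agent populations.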

  16. High performance computing for three-dimensional agent-based molecular models.

    PubMed

    Pérez-Rodríguez, G; Pérez-Pérez, M; Fdez-Riverola, F; Lourenço, A

    2016-07-01

    Agent-based simulations are increasingly popular in exploring and understanding cellular systems, but the natural complexity of these systems and the desire to grasp different modelling levels demand cost-effective simulation strategies and tools. In this context, the present paper introduces novel sequential and distributed approaches for the three-dimensional agent-based simulation of individual molecules in cellular events. These approaches are able to describe the dimensions and position of the molecules with high accuracy and thus study the critical effect of spatial distribution on cellular events. Moreover, two of the approaches allow multi-thread high-performance simulations, distributing the three-dimensional model in a platform independent and computationally efficient way. Evaluation addressed the reproduction of molecular scenarios and different scalability aspects of agent creation and agent interaction. The three approaches simulate common biophysical and biochemical laws faithfully. The distributed approaches show improved performance when dealing with large agent populations while the sequential approach is better suited for small to medium size agent populations. Overall, the main new contribution of the approaches is the ability to simulate three-dimensional agent-based models at the molecular level with reduced implementation effort and moderate-level computational capacity. Since these approaches have a generic design, they have great potential to be used in any event-driven agent-based tool.

  17. Scenario-Based Multi-Objective Optimum Allocation Model for Earthquake Emergency Shelters Using a Modified Particle Swarm Optimization Algorithm: A Case Study in Chaoyang District, Beijing, China

    PubMed Central

    Zhao, Xiujuan; Xu, Wei; Ma, Yunjia; Hu, Fuyu

    2015-01-01

    The correct location of earthquake emergency shelters and their allocation to residents can effectively reduce the number of casualties by providing safe havens and efficient evacuation routes during the chaotic period of the unfolding disaster. However, diverse and strict constraints and the discrete feasible domain of the required models make the problem of shelter location and allocation more difficult. A number of models have been developed to solve this problem, but there are still large differences between the models and the actual situation because the characteristics of the evacuees and the construction costs of the shelters have been excessively simplified. We report here the development of a multi-objective model for the allocation of residents to earthquake shelters by considering these factors using the Chaoyang district, Beijing, China as a case study. The two objectives of this model were to minimize the total weighted evacuation time from residential areas to a specified shelter and to minimize the total area of all the shelters. The two constraints were the shelter capacity and the service radius. Three scenarios were considered to estimate the number of people who would need to be evacuated. The particle swarm optimization algorithm was first modified by applying the von Neumann structure in earlier loops and the global structure in later loops, and then used to solve this problem. The results show that increasing the shelter area can result in a large decrease in the total weighted evacuation time from scheme 1 to scheme 9 in scenario A, from scheme 1 to scheme 9 in scenario B, and from scheme 1 to scheme 19 in scenario C. If funding were not a limitation, the final scheme of each scenario would be the best solution; otherwise, the earlier schemes are more reasonable. The modified model proved to be useful for the optimization of shelter allocation, and the result can be used as a scientific reference for planning shelters in the Chaoyang district
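    The topology switch described here (von Neumann neighbourhoods in earlier loops, the global best in later loops) can be sketched on a generic objective function. This is an illustrative re-implementation under assumed coefficients, not the authors' shelter-allocation model, which adds capacity and service-radius constraints and a discrete feasible domain:

```python
import numpy as np

def modified_pso(f, dim=2, n=16, iters=200, switch=100, seed=0):
    """PSO that uses a von Neumann (4-neighbour grid) topology for the first
    `switch` iterations and the global-best topology afterwards."""
    rng = np.random.default_rng(seed)
    side = int(np.sqrt(n))                      # particles arranged on a side x side torus
    x = rng.uniform(-5.0, 5.0, (n, dim))
    v = np.zeros((n, dim))
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    w, c1, c2 = 0.7, 1.5, 1.5                   # assumed inertia / acceleration constants
    idx = np.arange(n).reshape(side, side)
    neigh = [np.array([i,
                       idx[(r - 1) % side, c], idx[(r + 1) % side, c],
                       idx[r, (c - 1) % side], idx[r, (c + 1) % side]])
             for i, (r, c) in enumerate(np.ndindex(side, side))]
    for t in range(iters):
        if t < switch:   # earlier loops: best within each von Neumann neighbourhood
            lbest = np.array([pbest[nb[np.argmin(pval[nb])]] for nb in neigh])
        else:            # later loops: single global best
            lbest = np.tile(pbest[np.argmin(pval)], (n, 1))
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (lbest - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
    return pbest[np.argmin(pval)], float(pval.min())

# Demonstration on a simple sphere function standing in for the real objective.
best, val = modified_pso(lambda p: float(np.sum(p ** 2)))
```

    The local topology early on preserves diversity while the swarm explores; switching to the global topology later accelerates convergence once good regions are found.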

  18. Memory Benchmarks for SMP-Based High Performance Parallel Computers

    SciTech Connect

    Yoo, A B; de Supinski, B; Mueller, F; Mckee, S A

    2001-11-20

    As the speed gap between CPU and main memory continues to grow, memory accesses increasingly dominate the performance of many applications. The problem is particularly acute for symmetric multiprocessor (SMP) systems, where the shared memory may be accessed concurrently by a group of threads running on separate CPUs. Unfortunately, several key issues governing memory system performance in current systems are not well understood. Complex interactions between the levels of the memory hierarchy, buses or switches, DRAM back-ends, system software, and application access patterns can make it difficult to pinpoint bottlenecks and determine appropriate optimizations, and the situation is even more complex for SMP systems. To partially address this problem, we formulated a set of multi-threaded microbenchmarks for characterizing and measuring the performance of the underlying memory system in SMP-based high-performance computers. We report our use of these microbenchmarks on two important SMP-based machines. This paper has four primary contributions. First, we introduce a microbenchmark suite to systematically assess and compare the performance of different levels in SMP memory hierarchies. Second, we present a new tool based on hardware performance monitors to determine a wide array of memory system characteristics, such as cache sizes, quickly and easily; by using this tool, memory performance studies can be targeted to the full spectrum of performance regimes with many fewer data points than is otherwise required. Third, we present experimental results indicating that the performance of applications with large memory footprints remains largely constrained by memory. Fourth, we demonstrate that thread-level parallelism further degrades memory performance, even for the latest SMPs with hardware prefetching and switch-based memory interconnects.
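    The core idea behind memory-latency microbenchmarks of this kind, dependent loads chasing pointers through memory so that each access must wait for the previous one, can be sketched even in Python. A real suite would be written in C against hardware counters; here interpreter overhead dominates the timings, so this only shows the structure, and the sizes are arbitrary:

```python
import random
import time

def build_random_cycle(n, seed=0):
    """Successor array forming a single random cycle over all n slots, so a
    traversal touches memory in cache-unfriendly order."""
    p = list(range(n))
    random.Random(seed).shuffle(p)
    nxt = [0] * n
    for k in range(n):
        nxt[p[k]] = p[(k + 1) % n]
    return nxt

def chase(nxt, steps):
    """Pointer chase: each access depends on the previous one, which defeats
    prefetching; returns (elapsed wall time, final index)."""
    i = 0
    t0 = time.perf_counter()
    for _ in range(steps):
        i = nxt[i]
    return time.perf_counter() - t0, i

n = 1 << 16
sequential = list(range(1, n)) + [0]   # stride-1 cycle (prefetch-friendly)
rand_cycle = build_random_cycle(n)

t_seq, _ = chase(sequential, n)
t_rand, _ = chase(rand_cycle, n)
```

    A C version of the same loop, run for working-set sizes swept past each cache level, is what exposes the latency cliffs between cache levels that the paper's suite measures.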

  19. Understanding the relationship between safety investment and safety performance of construction projects through agent-based modeling.

    PubMed

    Lu, Miaojia; Cheung, Clara Man; Li, Heng; Hsu, Shu-Chien

    2016-09-01

    The construction industry in Hong Kong increased its safety investment by 300% in the past two decades; however, its accident rate has plateaued at around 50% for one decade. Against this backdrop, researchers have found inconclusive results on the causal relationship between safety investment and safety performance. Using agent-based modeling, this study takes an unconventional bottom-up approach to study safety performance on a construction site as an outcome of a complex system defined by interactions among a worksite, individual construction workers, and different safety investments. Instead of focusing on finding the absolute relationship between safety investment and safety performance, this study contributes to providing a practical framework to investigate how different safety investments interacting with different parameters such as human and environmental factors could affect safety performance. As a result, we could identify cost-effective safety investments under different construction scenarios for delivering optimal safety performance.

  20. Understanding the relationship between safety investment and safety performance of construction projects through agent-based modeling.

    PubMed

    Lu, Miaojia; Cheung, Clara Man; Li, Heng; Hsu, Shu-Chien

    2016-09-01

    The construction industry in Hong Kong increased its safety investment by 300% in the past two decades; however, its accident rate has plateaued at around 50% for one decade. Against this backdrop, researchers have found inconclusive results on the causal relationship between safety investment and safety performance. Using agent-based modeling, this study takes an unconventional bottom-up approach to study safety performance on a construction site as an outcome of a complex system defined by interactions among a worksite, individual construction workers, and different safety investments. Instead of focusing on finding the absolute relationship between safety investment and safety performance, this study contributes to providing a practical framework to investigate how different safety investments interacting with different parameters such as human and environmental factors could affect safety performance. As a result, we could identify cost-effective safety investments under different construction scenarios for delivering optimal safety performance. PMID:27240124

  1. Examining the ethical and social issues of health technology design through the public appraisal of prospective scenarios: a study protocol describing a multimedia-based deliberative method

    PubMed Central

    2014-01-01

    Background The design of health technologies relies on assumptions that affect how they will be implemented, such as intended use, complexity, impact on user autonomy, and appropriateness. Those who design and implement technologies make several ethical and social assumptions on behalf of users and society more broadly, but there are very few tools to examine prospectively whether such assumptions are warranted and how the public define and appraise the desirability of health innovations. This study protocol describes a three-year study that relies on a multimedia-based prospective method to support public deliberations that will enable a critical examination of the social and ethical issues of health technology design. Methods The first two steps of our mixed-method study were completed: relying on a literature review and the support of our multidisciplinary expert committee, we developed scenarios depicting social and technical changes that could unfold in three thematic areas within a 25-year timeframe; and for each thematic area, we created video clips to illustrate prospective technologies and short stories to describe their associated dilemmas. Using this multimedia material, we will: conduct four face-to-face deliberative workshops with members of the public (n = 40) who will later join additional participants (n = 25) through an asynchronous online forum; and analyze and integrate three data sources: observation, group deliberations, and a self-administered participant survey. Discussion This study protocol will be of interest to those who design and assess public involvement initiatives and to those who examine the implementation of health innovations. Our premise is that using user-friendly tools in a deliberative context that foster participants’ creativity and reflexivity in pondering potential technoscientific futures will enable our team to analyze a range of normative claims, including some that may prove problematic and others that may

  2. Evaluation of the Terminal Sequencing and Spacing System for Performance Based Navigation Arrivals

    NASA Technical Reports Server (NTRS)

    Thipphavong, Jane; Jung, Jaewoo; Swenson, Harry N.; Martin, Lynne; Lin, Melody; Nguyen, Jimmy

    2013-01-01

    NASA has developed the Terminal Sequencing and Spacing (TSS) system, a suite of advanced arrival management technologies combining time-based scheduling and controller precision spacing tools. TSS is a ground-based controller automation tool that facilitates sequencing and merging arrivals that have both current standard ATC routes and terminal Performance-Based Navigation (PBN) routes, especially during highly congested demand periods. In collaboration with the FAA and MITRE's Center for Advanced Aviation System Development (CAASD), TSS system performance was evaluated in human-in-the-loop (HITL) simulations with currently active controllers as participants. Traffic scenarios had mixed Area Navigation (RNAV) and Required Navigation Performance (RNP) equipage, where the more advanced RNP-equipped aircraft had preferential treatment with a shorter approach option. Simulation results indicate the TSS system achieved benefits by enabling PBN, while maintaining high throughput rates, 10% above baseline demand levels. Flight path predictability improved, where path deviation was reduced by 2 NM on average and variance in the downwind leg length was 75% less. Arrivals flew more fuel-efficient descents for longer, spending an average of 39 seconds less in step-down level altitude segments. Self-reported controller workload was reduced, with statistically significant differences at the p less than 0.01 level. The RNP-equipped arrivals were also able to more frequently capitalize on the benefits of being "Best-Equipped, Best-Served" (BEBS), where less vectoring was needed and nearly all RNP approaches were conducted without interruption.

  3. Ischemic preconditioning and clinical scenarios

    PubMed Central

    Narayanan, Srinivasan V.; Dave, Kunjan R.; Perez-Pinzon, Miguel A.

    2013-01-01

    Purpose of review Ischemic preconditioning (IPC) is gaining attention as a novel neuroprotective therapy and could provide an improved mechanistic understanding of tolerance to cerebral ischemia. The purpose of this article is to review the recent work in the field of IPC and its applications to clinical scenarios. Recent findings The cellular signaling pathways that are activated following IPC are now better understood and have enabled investigators to identify several IPC mimetics. Most of these studies were performed in rodents, and efficacy of these mimetics remains to be evaluated in human patients. Additionally, remote ischemic preconditioning (RIPC) may have higher translational value than IPC. Repeated cycles of temporary ischemia in a remote organ can activate protective pathways in the target organ, including the heart and brain. Clinical trials are underway to test the efficacy of RIPC in protecting brain against subarachnoid hemorrhage. Summary IPC, RIPC, and IPC mimetics have the potential to be therapeutic in various clinical scenarios. Further understanding of IPC-induced neuroprotection pathways and utilization of clinically relevant animal models are necessary to increase the translational potential of IPC in the near future. PMID:23197083

  4. Policy design and performance of emissions trading markets: an adaptive agent-based analysis.

    PubMed

    Bing, Zhang; Qinqin, Yu; Jun, Bi

    2010-08-01

    Emissions trading is considered to be a cost-effective environmental economic instrument for pollution control. However, the pilot emissions trading programs in China have failed to bring remarkable success in the campaign for pollution control. The policy design of an emissions trading program is found to have a decisive impact on its performance. In this study, an artificial market for sulfur dioxide (SO2) emissions trading was constructed using an agent-based model. The performance of the Jiangsu SO2 emissions trading market under different policy design scenarios was also examined. Results show that the market efficiency of emissions trading is significantly affected by policy design and existing policies. China's coal-electricity price system is the principal factor influencing the performance of the SO2 emissions trading market. Transaction costs would also reduce market efficiency. In addition, current-level emissions discharge fee/tax and banking mechanisms do not distinctly affect policy performance. Thus, applying emissions trading in emission control in China should consider policy design and interaction with other existing policies.
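    The qualitative finding that transaction costs reduce market efficiency can be reproduced with a toy adaptive-agent market. The sketch below is a stylized stand-in, not the Jiangsu model: firms with assumed linear marginal abatement costs trade abatement duties bilaterally, and a per-unit transaction cost tc leaves marginal-cost gaps unexploited:

```python
import numpy as np

def simulate_trading(slopes, abate, rounds=2000, tc=0.0, step=0.01, seed=0):
    """Adaptive bilateral trading: each round, two random firms compare
    marginal abatement costs (MAC_i = slope_i * q_i) and shift a small amount
    of abatement from the high-MAC firm to the low-MAC firm whenever the MAC
    gap exceeds the per-unit transaction cost tc."""
    rng = np.random.default_rng(seed)
    q = abate.astype(float).copy()
    for _ in range(rounds):
        i, j = rng.choice(len(q), 2, replace=False)
        mac_i, mac_j = slopes[i] * q[i], slopes[j] * q[j]
        if mac_i - mac_j > tc and q[i] >= step:
            q[i] -= step; q[j] += step   # j sells abatement effort to i
        elif mac_j - mac_i > tc and q[j] >= step:
            q[j] -= step; q[i] += step
    return q

def total_cost(slopes, q):
    """Total abatement cost with quadratic cost curves 0.5 * slope * q^2."""
    return float(np.sum(0.5 * slopes * q ** 2))

slopes = np.array([1.0, 4.0, 2.0])       # assumed marginal-cost slopes
initial = np.array([3.0, 3.0, 3.0])      # equal abatement duty, no trading
q_free = simulate_trading(slopes, initial, tc=0.0)   # frictionless market
q_tc = simulate_trading(slopes, initial, tc=1.0)     # costly transactions
```

    With these numbers, the no-trade cost is 31.5 and the frictionless optimum (equalized marginal costs) is about 23.14; the transaction-cost run stops short of that optimum because small marginal-cost gaps are not worth trading away.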

  5. Policy design and performance of emissions trading markets: an adaptive agent-based analysis.

    PubMed

    Bing, Zhang; Qinqin, Yu; Jun, Bi

    2010-08-01

    Emissions trading is considered to be a cost-effective environmental economic instrument for pollution control. However, the pilot emissions trading programs in China have failed to bring remarkable success in the campaign for pollution control. The policy design of an emissions trading program is found to have a decisive impact on its performance. In this study, an artificial market for sulfur dioxide (SO2) emissions trading was constructed using an agent-based model. The performance of the Jiangsu SO2 emissions trading market under different policy design scenarios was also examined. Results show that the market efficiency of emissions trading is significantly affected by policy design and existing policies. China's coal-electricity price system is the principal factor influencing the performance of the SO2 emissions trading market. Transaction costs would also reduce market efficiency. In addition, current-level emissions discharge fee/tax and banking mechanisms do not distinctly affect policy performance. Thus, applying emissions trading in emission control in China should consider policy design and interaction with other existing policies. PMID:20590153

  6. A Likelihood-Based Approach to Identifying Contaminated Food Products Using Sales Data: Performance and Challenges

    PubMed Central

    Kaufman, James; Lessler, Justin; Harry, April; Edlund, Stefan; Hu, Kun; Douglas, Judith; Thoens, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias

    2014-01-01

    Foodborne disease outbreaks of recent years demonstrate that, due to increasingly interconnected supply chains, these types of crises have the potential to affect thousands of people, leading to significant healthcare costs, loss of revenue for food companies, and—in the worst cases—death. When a disease outbreak is detected, identifying the contaminated food quickly is vital to minimize suffering and limit economic losses. Here we present a likelihood-based approach that has the potential to shorten the time needed to identify possibly contaminated food products, based on the exploitation of food product sales data and the distribution of foodborne illness case reports. Using a real world food sales data set and artificially generated outbreak scenarios, we show that this method performs very well for contamination scenarios originating from a single “guilty” food product. As it is neither always possible nor necessary to identify the single offending product, the method has been extended such that it can be used as a binary classifier. With this extension it is possible to generate a set of potentially “guilty” products that contains the real outbreak source with very high accuracy. Furthermore we explore the patterns of food distributions that lead to “hard-to-identify” foods, the possibility of identifying these food groups a priori, and the extent to which the likelihood-based method can be used to quantify uncertainty. We find that high spatial correlation of sales data between products may be a useful indicator for “hard-to-identify” products. PMID:24992565
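    The core scoring step of such an approach can be illustrated with a small multinomial-likelihood sketch: candidate products are ranked by how well their regional sales shares explain where the cases occurred. The data below are invented, and this simplified version ignores reporting delays and the binary-classifier extension discussed in the paper:

```python
import numpy as np

def rank_products(sales, cases, eps=1e-12):
    """Rank candidate products by the multinomial log-likelihood of the
    observed case counts per region, assuming cases occur in proportion to
    each product's regional sales shares."""
    shares = sales / sales.sum(axis=1, keepdims=True)
    loglik = (cases * np.log(shares + eps)).sum(axis=1)
    return np.argsort(loglik)[::-1], loglik  # most likely product first

# Hypothetical example: three products sold across four regions
# (rows: products, columns: regions).
sales = np.array([[90.0,  5.0,  3.0,  2.0],   # product 0: sold mostly in region 0
                  [25.0, 25.0, 25.0, 25.0],   # product 1: uniform sales
                  [ 5.0, 45.0, 45.0,  5.0]])  # product 2: sold in regions 1-2
cases = np.array([44.0, 3.0, 2.0, 1.0])       # outbreak concentrated in region 0

order, loglik = rank_products(sales, cases)
```

    Product 0 comes out on top because its sales distribution matches the spatial case pattern; the "hard-to-identify" situation arises exactly when several products have highly correlated sales shares and hence nearly equal log-likelihoods.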

  7. Biomass Scenario Model Scenario Library: Definitions, Construction, and Description

    SciTech Connect

    Inman, D.; Vimmerstedt, L.; Bush, B.; Peterson, S.

    2014-04-01

    Understanding the development of the biofuels industry in the United States is important to policymakers and industry. The Biomass Scenario Model (BSM) is a system dynamics model of the biomass-to-biofuels system that can be used to explore policy effects on biofuels development. Because of the complexity of the model, as well as the wide range of possible future conditions that affect biofuels industry development, we have not developed a single reference case but instead developed a set of specific scenarios that provide various contexts for our analyses. The purpose of this report is to describe the scenarios that comprise the BSM scenario library. At present, we have the following policy-focused scenarios in our library: minimal policies, ethanol-focused policies, equal access to policies, output-focused policies, technological-diversity-focused policies, and point-of-production-focused policies. This report describes each scenario, its policy settings, and general insights gained through use of the scenarios in analytic studies.

  8. Integrating policy-based management and SLA performance monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Tzong-Jye; Lin, Chin-Yi; Chang, Shu-Hsin; Yen, Meng-Tzu

    2001-10-01

    A policy-based management system provides configuration capability that lets system administrators focus on customer requirements, while the service-level agreement performance monitoring mechanism helps administrators verify the correctness of policies. However, it is difficult for a device to process policies directly because policies are a management-level concept. This paper proposes a mechanism to decompose a policy into rules that a device can process efficiently. The device then processes the rules and collects performance statistics efficiently, and the policy-based management system gathers these statistics and reports service-level agreement performance monitoring information to the system administrator. The proposed policy-based management system thus satisfies both the policy configuration and service-level agreement performance monitoring requirements. A policy consists of a condition part and an action part. The condition part is a Boolean expression over a source host IP group, a destination host IP group, etc.; the action part is the parameters of services. We say that an address group is compact if it consists only of a range of IP addresses that can be denoted by a pair of IP address and corresponding IP mask. If the condition part of a policy consists only of compact address groups, we say that the policy is a rule. Since a device can efficiently process a compact address while a system administrator prefers to define an arbitrary range of IP addresses, the policy-based management system has to translate policies into rules and bridge the gap between policies and rules. The proposed policy-based management system builds the relationships between VPNs and policies, and between policies and rules. Since the system administrator wants to monitor the system performance information of VPNs and policies, the proposed policy-based management system downloads the relationships among VPNs, policies and rules to the

  9. Development of the Computerized Model of Performance-Based Measurement System to Measure Nurses' Clinical Competence.

    PubMed

    Liou, Shwu-Ru; Liu, Hsiu-Chen; Tsai, Shu-Ling; Cheng, Ching-Yu; Yu, Wei-Chieh; Chu, Tsui-Ping

    2016-04-01

    Critical thinking skills and clinical competence are essential for providing quality patient care. The purpose of this study was to develop the Computerized Model of Performance-Based Measurement system based on the Clinical Reasoning Model. The system can evaluate and identify learning needs for clinical competency and be used as a learning tool to increase clinical competency by using computers. The system includes 10 high-risk, high-volume clinical case scenarios coupled with questions testing clinical reasoning, interpersonal, and technical skills. Questions were sequenced to reflect patients' changing condition and arranged by following the process of collecting and managing information, diagnosing and differentiating urgency of problems, and solving problems. Content validity and known-groups validity were established. The Kuder-Richardson Formula 20 was 0.90 and test-retest reliability was supported (r = 0.78). Nursing educators can use the system to understand students' needs for achieving clinical competence, so that educational plans can be made to better prepare students and facilitate their smooth transition to a future clinical environment. Clinical nurses can use the system to evaluate their performance-based abilities and weaknesses in clinical reasoning. Appropriate training programs can then be designed and implemented to practically promote nurses' clinical competence and quality of patient care. PMID:26829522

  10. Human Factors Considerations for Performance-Based Navigation

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Adams, Catherine A.

    2006-01-01

    A transition toward a performance-based navigation system is currently underway both in the United States and around the world. Performance-based navigation incorporates Area Navigation (RNAV) and Required Navigation Performance (RNP) procedures that do not rely on the location of ground-based navigation aids. These procedures offer significant benefits to both operators and air traffic managers. Under sponsorship from the Federal Aviation Administration (FAA), the National Aeronautics and Space Administration (NASA) has undertaken a project to document human factors issues that have emerged during RNAV and RNP operations and propose areas for further consideration. Issues were found to include aspects of air traffic control and airline procedures, aircraft systems, and procedure design. Major findings suggest the need for human factors-specific instrument procedure design guidelines. Ongoing industry and government activities to address air-ground communication terminology, procedure design improvements, and chart-database commonality are strongly encouraged.

  11. Parallel performance optimizations on unstructured mesh-based simulations

    SciTech Connect

    Sarje, Abhinav; Song, Sukhyun; Jacobsen, Douglas; Huck, Kevin; Hollingsworth, Jeffrey; Malony, Allen; Williams, Samuel; Oliker, Leonid

    2015-06-01

    This paper addresses two key parallelization challenges in the unstructured mesh-based ocean modeling code MPAS-Ocean, which uses a mesh based on Voronoi tessellations: (1) load imbalance across processes, and (2) unstructured data access patterns that inhibit intra- and inter-node performance. Our work analyzes the load imbalance due to naive partitioning of the mesh, and develops methods to generate mesh partitionings with better load balance and reduced communication. Furthermore, we present methods that minimize both inter- and intra-node data movement and maximize data reuse. Our techniques include predictive ordering of data elements for higher cache efficiency, as well as communication reduction approaches. We present detailed performance data when running on thousands of cores using the Cray XC30 supercomputer and show that our optimization strategies can exceed the original performance by over 2×. Additionally, many of these solutions can be broadly applied to a wide variety of unstructured grid-based computations.
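The predictive data-ordering idea can be illustrated with a generic bandwidth-reducing (Cuthill-McKee-style) reordering that places neighbouring mesh cells close together in memory; this is a common technique of the same family, not the authors' exact algorithm:

```python
from collections import deque

def bandwidth_reducing_order(adjacency):
    """BFS (Cuthill-McKee-style) reordering: visit cells breadth-first
    so that neighbours end up close together in memory, improving
    cache reuse for unstructured-mesh sweeps."""
    n = len(adjacency)
    visited = [False] * n
    order = []
    # Start from low-degree cells, as Cuthill-McKee does.
    for seed in sorted(range(n), key=lambda c: len(adjacency[c])):
        if visited[seed]:
            continue
        visited[seed] = True
        queue = deque([seed])
        while queue:
            cell = queue.popleft()
            order.append(cell)
            for nb in sorted(adjacency[cell], key=lambda c: len(adjacency[c])):
                if not visited[nb]:
                    visited[nb] = True
                    queue.append(nb)
    return order

# A four-cell chain stored in scrambled order; the true chain is 0-2-3-1.
adjacency = [[2], [3], [0, 3], [1, 2]]
print(bandwidth_reducing_order(adjacency))  # → [0, 2, 3, 1]
```

In practice such orderings are applied once at start-up, after which all cell arrays are permuted accordingly.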

  12. Performance-based development of a basic nuclear engineering course

    SciTech Connect

    Knief, R.A.

    1993-01-01

    Over the past 19 yr, a basic nuclear engineering course has been developed with methods traditionally applied to training programs. The performance-based approach uses elements classified roughly as analysis, design/development, implementation, and evaluation/feedback. Prior to the accident at Three Mile Island unit 2 (TMI-2), performance-based training applications were rare in the nuclear community. An exception was at Sandia Laboratories (SNL). American Telephone & Telegraph (AT&T) - holder of the SNL contract with what is now the US Department of Energy - and its Bell Laboratories subsidiary had for some time emphasized in-house technical education and training using performance-based methods to ensure that in-house and contracted instructors focused on course outcomes rather than merely subject matter.

  13. Alpha neurofeedback training improves SSVEP-based BCI performance

    NASA Astrophysics Data System (ADS)

    Wan, Feng; Nuno da Cruz, Janir; Nan, Wenya; Wong, Chi Man; Vai, Mang I.; Rosa, Agostinho

    2016-06-01

    Objective. Steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) can provide relatively easy, reliable and high speed communication. However, the performance is still not satisfactory, especially in some users who are not able to generate strong enough SSVEP signals. This work aims to strengthen a user’s SSVEP by alpha down-regulating neurofeedback training (NFT) and consequently improve the performance of the user in using SSVEP-based BCIs. Approach. An experiment with two steps was designed and conducted. The first step was to investigate the relationship between the resting alpha activity and the SSVEP-based BCI performance, in order to determine the training parameter for the NFT. Then in the second step, half of the subjects with ‘low’ performance (i.e. BCI classification accuracy <80%) were randomly assigned to a NFT group to perform a real-time NFT, and the other half to a non-NFT control group for comparison. Main results. The first step revealed a significant negative correlation between the BCI performance and the individual alpha band (IAB) amplitudes in the eyes-open resting condition in a total of 33 subjects. In the second step, it was found that during the IAB down-regulating NFT, on average the subjects were able to successfully decrease their IAB amplitude over training sessions. More importantly, the NFT group showed an average increase of 16.5% in the SSVEP signal SNR (signal-to-noise ratio) and an average increase of 20.3% in the BCI classification accuracy, which was significant compared to the non-NFT control group. Significance. These findings indicate that the alpha down-regulating NFT can be used to improve the SSVEP signal quality and the subjects’ performance in using SSVEP-based BCIs. It could be helpful to the SSVEP related studies and would contribute to more effective SSVEP-based BCI applications.
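The SSVEP SNR mentioned above is conventionally the power at the stimulus frequency divided by the mean power of neighbouring frequency bins. A minimal sketch of that generic definition (not necessarily the exact formula used in the study):

```python
import math

def bin_power(signal, k):
    """Power of DFT bin k, computed directly (an FFT is unnecessary
    when only a handful of bins are needed)."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(signal))
    return re * re + im * im

def ssvep_snr(signal, fs, f_target, n_side=4):
    """SNR: power at the stimulus frequency over the mean power of
    n_side neighbouring bins on each side."""
    n = len(signal)
    k = round(f_target * n / fs)
    side = [bin_power(signal, k + d) for d in range(-n_side, n_side + 1) if d != 0]
    return bin_power(signal, k) / (sum(side) / len(side))

# A clean 10 Hz component sampled at 250 Hz yields a very large SNR:
sig = [math.sin(2 * math.pi * 10 * t / 250)
       + 0.05 * math.cos(2 * math.pi * 37 * t / 250)
       for t in range(500)]
print(ssvep_snr(sig, fs=250, f_target=10) > 100)  # True
```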

  14. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation.

    PubMed

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-05-10

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011.

  15. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation

    NASA Astrophysics Data System (ADS)

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-05-01

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011.

  16. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation.

    PubMed

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-01-01

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011. PMID:27161897

  17. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation

    PubMed Central

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-01-01

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011. PMID:27161897

  18. Performance Comparison of HPF and MPI Based NAS Parallel Benchmarks

    NASA Technical Reports Server (NTRS)

    Saini, Subhash

    1997-01-01

    Compilers supporting High Performance Fortran (HPF) features first appeared in late 1994 and early 1995 from Applied Parallel Research (APR), Digital Equipment Corporation, and The Portland Group (PGI). IBM introduced an HPF compiler for the IBM RS/6000 SP2 in April of 1996. Over the past two years, these implementations have shown steady improvement in terms of both features and performance. The performance of various hardware/programming model (HPF and MPI) combinations is compared, based on the latest NAS Parallel Benchmark results, thus providing a cross-machine and cross-model comparison. Specifically, HPF-based NPB results are compared with MPI-based NPB results to provide perspective on the performance currently obtainable using HPF versus MPI, or versus hand-tuned implementations such as those supplied by the hardware vendors. In addition, we present NPB (Version 1.0) performance results for the following systems: DEC AlphaServer 8400 5/440, Fujitsu VPP Series (VX, VPP300, and VPP700), HP/Convex Exemplar SPP2000, IBM RS/6000 SP P2SC node (120 MHz), NEC SX-4/32, SGI/CRAY T3E, and SGI Origin2000. We also present sustained performance per dollar for the Class B LU, SP, and BT benchmarks.

  19. Roadmap Toward a Predictive Performance-based Commercial Energy Code

    SciTech Connect

    Rosenberg, Michael I.; Hart, Philip R.

    2014-10-01

    Energy codes have provided significant increases in building efficiency over the last 38 years, since the first national energy model code was published in late 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. The current focus on prescriptive codes has limitations including significant variation in actual energy performance depending on which prescriptive options are chosen, a lack of flexibility for designers and developers, and the inability to handle control optimization that is specific to building type and use. This paper provides a high level review of different options for energy codes, including prescriptive, prescriptive packages, EUI Target, outcome-based, and predictive performance approaches. This paper also explores a next generation commercial energy code approach that places a greater emphasis on performance-based criteria. A vision is outlined to serve as a roadmap for future commercial code development. That vision is based on code development being led by a specific approach to predictive energy performance combined with building specific prescriptive packages that are designed to be both cost-effective and to achieve a desired level of performance. Compliance with this new approach can be achieved by either meeting the performance target as demonstrated by whole building energy modeling, or by choosing one of the prescriptive packages.

  20. Biotechnology-based odour control: design criteria and performance data.

    PubMed

    Quigley, C; Easter, C; Burrowes, P; Witherspoon, J

    2004-01-01

    As neighbouring areas continue to encroach upon wastewater treatment plants, there is an increasing need for odour control to mitigate potential negative offsite odorous impacts. One technology that is gaining widespread acceptance is biotechnology, which utilises the inherent ability of certain microorganisms to biodegrade offensive odorous compounds. Two main advantages of this form of treatment over other odour control technologies include the absence of hazardous chemicals and relatively low operation and maintenance requirements. The purpose of this paper is to provide information related to odour control design criteria used in sizing/selecting biotechnology-based odour control technologies, and to provide odour removal performance data obtained from several different biotechnology-based odour control systems. CH2M HILL has collected biotechnology-based odour control performance data over the last several years in order to track the continued performance of various biofilters and biotowers over time. Specifically, odour removal performance data have been collected from soil-, organic- and inorganic-media biofilters and inert inorganic media biotowers. Results indicate that biotechnology-based odour control is a viable and consistent technology capable of achieving high removal performance for odour and hydrogen sulphide. It is anticipated that the information presented in this paper will be of interest to anyone involved with odour control technology evaluation/selection or design review.

  1. Biotechnology-based odour control: design criteria and performance data.

    PubMed

    Quigley, C; Easter, C; Burrowes, P; Witherspoon, J

    2004-01-01

    As neighbouring areas continue to encroach upon wastewater treatment plants, there is an increasing need for odour control to mitigate potential negative offsite odorous impacts. One technology that is gaining widespread acceptance is biotechnology, which utilises the inherent ability of certain microorganisms to biodegrade offensive odorous compounds. Two main advantages of this form of treatment over other odour control technologies include the absence of hazardous chemicals and relatively low operation and maintenance requirements. The purpose of this paper is to provide information related to odour control design criteria used in sizing/selecting biotechnology-based odour control technologies, and to provide odour removal performance data obtained from several different biotechnology-based odour control systems. CH2M HILL has collected biotechnology-based odour control performance data over the last several years in order to track the continued performance of various biofilters and biotowers over time. Specifically, odour removal performance data have been collected from soil-, organic- and inorganic-media biofilters and inert inorganic media biotowers. Results indicate that biotechnology-based odour control is a viable and consistent technology capable of achieving high removal performance for odour and hydrogen sulphide. It is anticipated that the information presented in this paper will be of interest to anyone involved with odour control technology evaluation/selection or design review. PMID:15484776

  2. Viral hepatitis: Indian scenario.

    PubMed

    Satsangi, Sandeep; Chawla, Yogesh K

    2016-07-01

    Viral hepatitis is a cause of major health care burden in India and is now equated as a threat comparable to the "big three" communicable diseases - HIV/AIDS, malaria and tuberculosis. Hepatitis A virus and Hepatitis E virus are predominantly enterically transmitted pathogens and are responsible for both sporadic infections and epidemics of acute viral hepatitis. Hepatitis B virus and Hepatitis C virus are predominantly spread via the parenteral route and are notorious for causing chronic hepatitis, which can lead to grave complications including cirrhosis of the liver and hepatocellular carcinoma. Around 400 million people all over the world suffer from chronic hepatitis, and the Asia-Pacific region constitutes the epicentre of this epidemic. The present article aims to cover the basic virologic aspects of these viruses and highlight the present scenario of viral hepatitis in India. PMID:27546957

  3. Performance-Based Technology Selection Filter description report

    SciTech Connect

    O'Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process in which technologies and systems are rated and assessments are made based on performance measures and regulatory and technical requirements. The results are auditable and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic-contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  4. Measurement-based performance evaluation technique for high-performance computers

    NASA Technical Reports Server (NTRS)

    Sharma, S.; Natarajan, C.; Iyer, R. K.

    1993-01-01

    A measurement-based performance evaluation technique has been used to characterize the OS performance of Cedar, a hierarchical shared-memory multiprocessor system. Thirteen OS performance meters were used to capture the operating system activities for compute-bound workloads. Three representative applications from the Perfect Benchmark Suite were used to measure the OS performance in a dedicated system and in multiprogrammed workloads. It was found that 13-23 percent of the total execution time on a dedicated system was spent in executing OS-related activities. Under multiprogramming, 12-14 percent of the total execution time was used by the OS. The impact of multiprogramming on the operating system performance meters was also measured.

  5. Optimizing Decision Preparedness by Adapting Scenario Complexity and Automating Scenario Generation

    NASA Technical Reports Server (NTRS)

    Dunne, Rob; Schatz, Sae; Flore, Stephen M.; Nicholson, Denise

    2011-01-01

    Klein's recognition-primed decision (RPD) framework proposes that experts make decisions by recognizing similarities between current decision situations and previous decision experiences. Unfortunately, military personnel are often presented with situations that they have not experienced before. Scenario-based training (SBT) can help mitigate this gap. However, SBT remains a challenging and inefficient training approach. To address these limitations, the authors present an innovative formulation of scenario complexity that contributes to the larger research goal of developing an automated scenario generation system. This system will enable trainees to effectively advance through a variety of increasingly complex decision situations and experiences. By adapting scenario complexities and automating generation, trainees will be provided with a greater variety of appropriately calibrated training events, thus broadening their repositories of experience. Preliminary results from empirical testing (N=24) of the proof-of-concept formula are presented, and future avenues of scenario complexity research are also discussed.

  6. Model-based approach for elevator performance estimation

    NASA Astrophysics Data System (ADS)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs the Kalman filter as an observer. A Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated based on the estimated car acceleration. The proposed procedure is experimentally evaluated by comparing the key performance indicators calculated from the estimated car acceleration with the values obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with a sliding mode observer.
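A full Kalman observer over the machine-control and encoder signals is beyond a short sketch, but the predict-correct loop for estimating acceleration from a position signal can be illustrated with a steady-state simplification (a g-h-k tracker with critically damped fading-memory gains). This is an illustrative stand-in, not the paper's observer:

```python
def estimate_acceleration(positions, dt, g=0.875, h=0.5625, k=0.0625):
    """Track position/velocity/acceleration from a sampled position
    signal (e.g. an elevator encoder) with a g-h-k tracker; default
    gains are the critically damped fading-memory set for theta = 0.5."""
    x, v, a = positions[0], 0.0, 0.0
    accel = []
    for z in positions[1:]:
        # Predict with a constant-acceleration motion model.
        x_pred = x + v * dt + 0.5 * a * dt * dt
        v_pred = v + a * dt
        # Correct each state component with the measurement residual.
        r = z - x_pred
        x = x_pred + g * r
        v = v_pred + h * r / dt
        a = a + 2.0 * k * r / (dt * dt)
        accel.append(a)
    return accel

# A ride with constant acceleration of 1 m/s^2: the estimate converges to 1.
pos = [0.5 * (0.01 * i) ** 2 for i in range(1000)]
est = estimate_acceleration(pos, dt=0.01)
print(abs(est[-1] - 1.0) < 1e-3)  # True
```

Ride-quality indicators (peak acceleration, jerk, vibration levels) can then be computed from the estimated acceleration trace.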

  7. EDITORIAL: Where next with global environmental scenarios? Where next with global environmental scenarios?

    NASA Astrophysics Data System (ADS)

    O'Neill, Brian; Pulver, Simone; Van Deveer, Stacy; Garb, Yaakov

    2008-12-01

    -oriented scenario exercises also generate scenario products, but such products are recognized as meaningful mostly (or only) in the social context in which they were developed. It should be noted that those seeking to understand the functions, implications and utility of scenarios can approach analysis of scenarios and their impacts from either perspective—focusing attention on product outcomes and influence or assessing procedural and contextual dynamics and implications. Papers in this issue examine various aspects of scenario products, scenario processes and their interactions, with specific reference to global environmental change scenarios. Hulme and Dessai (2008) use the product-process distinction as a starting point for developing a framework to evaluate the success of scenario exercises. They identify 'prediction success', 'decision success' and 'learning success' as three evaluation metrics for scenarios, with the first two most relevant to scenario products and the last emphasizing procedural aspects of scenarios. They suggest that viewing scenarios primarily as products implies examining how closely actual outcomes have matched envisioned outcomes, while viewing them primarily as processes suggests evaluating the extent to which scenarios engaged participants and enabled their learning. O'Neill and Nakicenovic (2008) focus on Hulme and Dessai's evaluation metric, learning. Based on a review of six scenario/assessment exercises, they ask if and how scenario products have incorporated comparative assessments of results in order to enable cumulative learning across scenario efforts. The authors conclude that, although participating modelling teams have benefited greatly from the process of scenario activities and applied that learning to other scenario exercises in which they engage, learning from comparative assessments of scenario products has been rather limited; the latter due to the limited time and resources invested in comparative analysis. 
Pitcher (2009) speaks

  8. Generating Scenarios When Data Are Missing

    NASA Technical Reports Server (NTRS)

    Mackey, Ryan

    2007-01-01

    The Hypothetical Scenario Generator (HSG) is being developed in conjunction with other components of artificial-intelligence systems for automated diagnosis and prognosis of faults in spacecraft, aircraft, and other complex engineering systems. The HSG accepts, as input, possibly incomplete data on the current state of a system (see figure). The HSG models a potential fault scenario as an ordered disjunctive tree of conjunctive consequences, wherein the ordering is based upon the likelihood that a particular conjunctive path will be taken for the given set of inputs. The computation of likelihood is based partly on a numerical ranking of the degree of completeness of data with respect to satisfaction of the antecedent conditions of prognostic rules. The results from the HSG are then used by a model-based artificial-intelligence subsystem to predict realistic scenarios and states.
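The likelihood ordering based on completeness of data against rule antecedents might be sketched as follows; all names and the scoring formula are illustrative assumptions, not taken from the HSG:

```python
def completeness(antecedents, observations):
    """Score a conjunctive branch by the fraction of its antecedent
    conditions confirmed by the (possibly incomplete) state data.
    Hypothetical scoring rule for illustration."""
    if not antecedents:
        return 0.0
    confirmed = sum(1 for cond, val in antecedents.items()
                    if observations.get(cond) == val)
    return confirmed / len(antecedents)

def rank_branches(branches, observations):
    """Order the disjunctive branches of a fault-scenario tree by how
    likely each conjunctive path is for the given inputs."""
    return sorted(branches,
                  key=lambda b: completeness(b["antecedents"], observations),
                  reverse=True)

# Hypothetical fault branches; 'temp_high' is missing from the data.
branches = [
    {"name": "sensor drift", "antecedents": {"bias_rising": True, "temp_high": True}},
    {"name": "valve stuck", "antecedents": {"flow_zero": True, "cmd_open": True}},
]
observations = {"flow_zero": True, "cmd_open": True, "bias_rising": True}
print([b["name"] for b in rank_branches(branches, observations)])
# → ['valve stuck', 'sensor drift']
```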

  9. Development of a Prototype Automation Simulation Scenario Generator for Air Traffic Management Software Simulations

    NASA Technical Reports Server (NTRS)

    Khambatta, Cyrus F.

    2007-01-01

    A technique for automated development of scenarios for use in the Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data files from the McTMA system, and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.

  10. Performance Invalidity Base Rates Among Healthy Undergraduate Research Participants.

    PubMed

    Ross, Thomas P; Poston, Ashley M; Rein, Patricia A; Salvatore, Andrew N; Wills, Nathan L; York, Taylor M

    2016-02-01

    Few studies have examined base rates of suboptimal effort among healthy, undergraduate students recruited for neuropsychological research. An and colleagues (2012, Conducting research with non-clinical healthy undergraduates: Does effort play a role in neuropsychological test performance? Archives of Clinical Neuropsychology, 27, 849-857) reported high rates of performance invalidity (30.8%-55.6%), calling into question the validity of findings generated from samples of college students. In contrast, subsequent studies have reported much lower base rates ranging from 2.6% to 12%. The present study replicated and extended previous work by examining the performance of 108 healthy undergraduates on the Dot Counting Test, Victoria Symptom Validity Test, Word Memory Test, and a brief battery of neuropsychological measures. During initial testing, 8.3% of the sample scored below cutoffs on at least one Performance Validity Test, while 3.7% were classified as invalid at Time 2 (M interval = 34.4 days). The present findings add to a growing number of studies that suggest performance invalidity base rates in samples of non-clinical, healthy college students are much lower than An and colleagues' initial findings. Although suboptimal effort is much less problematic than suggested by An and colleagues, recent reports as high as 12% indicate that including measures of effort may be of value when using college students as participants. Methodological issues and recommendations for future research are presented.

  11. Performance evaluation of wavelet-based face verification on a PDA recorded database

    NASA Astrophysics Data System (ADS)

    Sellahewa, Harin; Jassim, Sabah A.

    2006-05-01

    The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera that can capture both still and streaming video clips, and a touch-sensitive display panel. Besides convenience, such devices provide an adequate secure infrastructure for sensitive and financial transactions, protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios, when communication from beyond enemy lines is essential to save soldier and civilian lives. In areas of conflict or disaster, the luxury of fixed infrastructure is not available or is destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We shall report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.
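As a rough illustration of wavelet-based verification, the following computes a low-resolution Haar approximation (the LL subband) as a face feature and accepts an identity claim when the feature distance falls below a threshold. This is a generic sketch under assumed names, not the authors' scheme:

```python
def haar2d_ll(img):
    """One level of the 2-D Haar wavelet transform; the low-low (LL)
    subband is a half-resolution approximation usable as a compact
    face feature."""
    rows = [[(r[2 * i] + r[2 * i + 1]) / 2 for i in range(len(r) // 2)]
            for r in img]
    return [[(rows[2 * j][i] + rows[2 * j + 1][i]) / 2
             for i in range(len(rows[0]))]
            for j in range(len(rows) // 2)]

def verify(probe, template, threshold):
    """Accept the claimed identity if the Euclidean distance between
    wavelet features is below a preset threshold (hypothetical rule)."""
    a = [v for row in haar2d_ll(probe) for v in row]
    b = [v for row in haar2d_ll(template) for v in row]
    dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return dist < threshold

# Toy 4x4 "image"; the LL subband halves each dimension.
face = [[1, 1, 3, 3], [1, 1, 3, 3], [5, 5, 7, 7], [5, 5, 7, 7]]
print(haar2d_ll(face))  # → [[1.0, 3.0], [5.0, 7.0]]
print(verify(face, face, threshold=0.5))  # → True
```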

  12. Advanced Organic Permeable-Base Transistor with Superior Performance.

    PubMed

    Klinger, Markus P; Fischer, Axel; Kaschura, Felix; Scholz, Reinhard; Lüssem, Björn; Kheradmand-Boroujeni, Bahman; Ellinger, Frank; Kasemann, Daniel; Leo, Karl

    2015-12-16

    An optimized vertical organic permeable-base transistor (OPBT) is presented that competes with the best organic field-effect transistors in performance while employing low-cost fabrication techniques. The OPBT stands out for its excellent power efficiency at the highest frequencies.

  13. Leading Instructional Practices in a Performance-Based System

    ERIC Educational Resources Information Center

    Kauble, Anna; Wise, Donald

    2015-01-01

    Given the shift to Common Core, educational leaders are challenged to see new directions in teaching and learning. The purpose of this study was to investigate the instructional practices which may be related to the effectiveness of a performance-based system (PBS) and their impact on student achievement, as part of a thematic set of dissertations…

  14. Joint Workshops. Performance Based Apprentice and Technical Training. Final Report.

    ERIC Educational Resources Information Center

    Oriel, Arthur E.

    A series of five workshops was held to disseminate information about the principles, methods, and effectiveness of Performance Based Training (PBT) in apprentice programs to 39 industrial and college personnel and 61 Bureau of Apprenticeship and Training (BAT) personnel. Following the workshops, 90% of the industrial and 61% of the BAT personnel…

  15. Performance-Based Measurement: Action for Organizations and HPT Accountability

    ERIC Educational Resources Information Center

    Larbi-Apau, Josephine A.; Moseley, James L.

    2010-01-01

    Basic measurements and applications of six selected general but critical operational performance-based indicators--effectiveness, efficiency, productivity, profitability, return on investment, and benefit-cost ratio--are presented. With each measurement, goals and potential impact are explored. Errors, risks, limitations to measurements, and a…

  16. 48 CFR 970.3706 - Performance-based acquisition.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Performance-based acquisition. 970.3706 Section 970.3706 Federal Acquisition Regulations System DEPARTMENT OF ENERGY AGENCY SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS Facilities Management Contracting...

  17. 48 CFR 970.3706 - Performance-based acquisition.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Performance-based acquisition. 970.3706 Section 970.3706 Federal Acquisition Regulations System DEPARTMENT OF ENERGY AGENCY SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS Facilities Management Contracting...

  18. Begging the Question: Performativity and Studio-Based Research

    ERIC Educational Resources Information Center

    Petelin, George

    2014-01-01

    The requirement that candidates in studio-based or practice-led higher degrees by research should formulate a research question has been found to be problematic by some writers. The present article argues that this stance, particularly as it is articulated by proponents of the influential category of "performative research" (Haseman,…

  19. Energy Conservation in the Home. Performance Based Lesson Plans.

    ERIC Educational Resources Information Center

    Alabama State Dept. of Education, Montgomery. Home Economics Service.

    These ten performance-based lesson plans concentrate on tasks related to energy conservation in the home. They are (1) caulk cracks, holes, and joints; (2) apply weatherstripping to doors and windows; (3) add plastic/solar screen window covering; (4) arrange furniture for saving energy; (5) set heating/cooling thermostat; (6) replace faucet…

  20. The Evolution of Performance Based Teacher Education Programs.

    ERIC Educational Resources Information Center

    Aubertine, Horace E.

    This document discusses a systematized approach to education theory and practice, especially as it applies to performance-based teacher education. The author bases his discussion on the physical sciences and their use of approximation models (an illustration of this use is the historical development of the description of matter…