Sample records for define reference scenarios

  1. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain-specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and limiting the software engineer's role to the development of new components and to component modifications. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources, including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations produces descriptions of the tasks needed for object-oriented analysis and of the subtasks needed for functional system analysis. Object-oriented analysis of the task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use these representations are explained through the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  2. 3D reconstruction optimization using imagery captured by unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Bassie, Abby L.; Meacham, Sean; Young, David; Turnage, Gray; Moorhead, Robert J.

    2017-05-01

    Because unmanned aerial vehicles (UAVs) are emerging as an indispensable image acquisition platform in precision agriculture, it is vitally important that researchers understand how to optimize UAV camera payloads for analysis of surveyed areas. In this study, imagery captured by a Nikon RGB camera attached to a Precision Hawk Lancaster was used to survey an agricultural field from six different altitudes ranging from 45.72 m (150 ft.) to 121.92 m (400 ft.). After collecting imagery, two different software packages (MeshLab and AgiSoft) were used to measure predetermined reference objects within six three-dimensional (3-D) point clouds (one per altitude scenario). In-silico measurements were then compared to actual reference object measurements, as recorded with a tape measure. Deviations of in-silico measurements from actual measurements were recorded as Δx, Δy, and Δz, and the average measurement deviation in each coordinate direction was calculated for each of the six flight scenarios. Comparing results from MeshLab and AgiSoft offered insight into the effectiveness of GPS-defined point cloud scaling versus user-defined point cloud scaling. In three of the six flight scenarios flown, MeshLab's 3-D imaging software (user-defined scale) was able to measure object dimensions of 50.8 to 76.2 cm (20-30 in.) with greater than 93% accuracy. The largest average deviation from actual measurements in any flight scenario was 14.77 cm (5.82 in.). Analysis of the point clouds in AgiSoft (GPS-defined scale) yielded even smaller Δx, Δy, and Δz than the MeshLab measurements in over 75% of the flight scenarios. The precision of these results is satisfactory for a wide variety of precision agriculture applications focused on differentiating and identifying objects using remote imagery.
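    The deviation metrics described in this record reduce to a simple per-axis comparison of point-cloud measurements against tape-measure ground truth. A minimal sketch, with invented measurement values standing in for the study's data:

```python
# Hypothetical illustration of the deviation metrics: compare in-silico
# point-cloud measurements of reference objects with tape-measure ground
# truth, per coordinate axis (all values in cm; data are invented).
measured = {"x": [52.1, 77.5, 61.0], "y": [50.2, 74.9, 60.1], "z": [51.6, 76.8, 59.4]}
actual   = {"x": [50.8, 76.2, 60.0], "y": [50.8, 76.2, 60.0], "z": [50.8, 76.2, 60.0]}

def mean_abs_deviation(meas, true):
    """Average |measured - actual| for one coordinate direction."""
    return sum(abs(m - t) for m, t in zip(meas, true)) / len(meas)

for axis in ("x", "y", "z"):
    dev = mean_abs_deviation(measured[axis], actual[axis])
    print(f"mean |delta_{axis}| = {dev:.2f} cm")
```

Averaging these per-axis deviations over each altitude scenario gives the per-flight figures (e.g. the 14.77 cm worst case) reported in the abstract.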

  3. Energy crops on landfills: functional, environmental, and costs analysis of different landfill configurations.

    PubMed

    Pivato, Alberto; Garbo, Francesco; Moretto, Marco; Lavagnolo, Maria Cristina

    2018-02-09

    The cultivation of energy crops on landfills represents an important challenge for the near future, as the possibility of using devalued sites for energy production is very attractive. In this study, four scenarios were assessed and compared with respect to a reference case defined for northern Italy. The scenarios were defined taking current energy crop issues into consideration. In particular, the first three scenarios were based on energy maximisation, phytotreatment ability, and environmental impact, respectively; the fourth scenario was a combination of the characteristics emphasised by the previous three. A multi-criteria analysis, based on economic, energetic, and environmental aspects, was performed. The fourth scenario emerged from the analysis as the best, given its ability to pursue several objectives simultaneously and its top score on both the environmental and the energetic criteria. By contrast, the economic criterion emerged as weak, as all the considered scenarios showed some limits from this point of view. Important indications for future designs can be derived. The decrease in leachate production due to the presence of energy crops on the top cover, which enhances evapotranspiration, represents a favourable but critical aspect in the definition of the results.

  4. Requirements for plug and play information infrastructure frameworks and architectures to enable virtual enterprises

    NASA Astrophysics Data System (ADS)

    Bolton, Richard W.; Dewey, Allen; Horstmann, Paul W.; Laurentiev, John

    1997-01-01

    This paper examines the role virtual enterprises will have in supporting future business engagements and the resulting technology requirements. Two representative end-user scenarios are proposed that define the requirements for 'plug-and-play' information infrastructure frameworks and architectures necessary to enable 'virtual enterprises' in US manufacturing industries. The scenarios provide a high-level 'needs analysis' for identifying key technologies, defining a reference architecture, and developing compliant reference implementations. Virtual enterprises are short-term consortia or alliances of companies formed to address fast-changing opportunities. Members of a virtual enterprise carry out their tasks as if they all worked for a single organization under 'one roof', using 'plug-and-play' information infrastructure frameworks and architectures to access and manage all information needed to support the product cycle. 'Plug-and-play' information infrastructure frameworks and architectures are required to enhance collaboration between companies working together on different aspects of a manufacturing process. This new form of collaborative computing will decrease cycle time and increase responsiveness to change.

  5. Biomass Scenario Model Documentation: Data and References

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Y.; Newes, E.; Bush, B.

    2013-05-01

    The Biomass Scenario Model (BSM) is a system dynamics model that represents the entire biomass-to-biofuels supply chain, from feedstock to fuel use. The BSM is a complex model that has been used for extensive analyses; the model and its results can be better understood if input data used for initialization and calibration are well-characterized. It has been carefully validated and calibrated against the available data, with data gaps filled in using expert opinion and internally consistent assumed values. Most of the main data sources that feed into the model are recognized as baseline values by the industry. This report documents data sources and references in Version 2 of the BSM (BSM2), which only contains the ethanol pathway, although subsequent versions of the BSM contain multiple conversion pathways. The BSM2 contains over 12,000 total input values, with 506 distinct variables. Many of the variables are opportunities for the user to define scenarios, while others are simply used to initialize a stock, such as the initial number of biorefineries. However, around 35% of the distinct variables are defined by external sources, such as models or reports. The focus of this report is to provide insight into which sources are most influential in each area of the supply chain.

  6. Extraterrestrial processing and manufacturing of large space systems, volume 1, chapters 1-6

    NASA Technical Reports Server (NTRS)

    Miller, R. H.; Smith, D. B. S.

    1979-01-01

    Space program scenarios for production of large space structures from lunar materials are defined. The concept of the space manufacturing facility (SMF) is presented. The manufacturing processes and equipment for the SMF are defined, and conceptual layouts are described for the production of solar cells and arrays, structures and joints, conduits, waveguides, RF equipment, radiators, wire cables, and converters. A 'reference' SMF was designed, and its operational requirements are described.

  7. Review of potential EGS sites and possible EGS demonstration scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1999-09-01

    Review of potential sites for Enhanced Geothermal Systems (EGS) and development of reference scenarios for EGS demonstration projects are two sub-tasks included in the FY 1999 EGS Research and Development (R&D) Management Task (DOE Task Order Number DE-AT07-99ID60365, included in the Appendix of this report). These sub-tasks are consistent with the EGS Strategic Plan, which includes milestones relating to EGS site selection (Milestone 4, to be completed in 2004) and development of a cost-shared, pilot-scale demonstration project (Milestone 5, to be completed in 2008). The purpose of the present work is to provide some reference points for discussing what type of EGS projects might be undertaken, where they might be located, and what the associated benefits are likely to be. The review of potential EGS sites is presented in Chapter 2 of this report. It draws upon site-selection criteria (and potential project sites that were identified using those criteria) developed at a mini-workshop held at the April 1998 DOE Geothermal Program Review to discuss EGS R&D issues. The criteria and the sites were the focus of a paper presented at the 4th International Hot Dry Rock Forum in Strasbourg in September 1998 (Sass and Robertson-Tait, 1998). The selection criteria, project sites and possible EGS developments discussed in the workshop and paper are described in more detail herein. Input from geothermal operators is incorporated, and water availability and transmission-line access are emphasized. The reference scenarios for EGS demonstration projects are presented in Chapter 3. Three alternative scenarios are discussed: (1) a stand-alone demonstration plant in an area with no existing geothermal development; (2) a separate generating facility adjacent to an existing geothermal development; and (3) an EGS project that supplies an existing geothermal power plant with additional generating capacity.
Furthermore, information potentially useful to DOE in framing solicitations and selecting projects for funding is discussed objectively. Although defined as separate sub-tasks, the EGS site review and reference scenarios are closely related. The incremental approach to EGS development that has recently been adopted could logically be expected to yield proposals for studies that lead up to and include production-enhancement experiments in producing geothermal fields in the very near future. However, the strategic plan clearly calls for the development of a more comprehensive demonstration project that can generate up to perhaps 10 MW (gross). It is anticipated that a series of small-scale experiments will define what realistically may be achieved in the near future, thus setting the stage for a successful pilot demonstration. This report continues the process of presenting information on EGS sites and experiments, and begins the process of defining what a demonstration project might be.

  8. The space station assembly phase: Flight telerobotic servicer feasibility. Volume 2: Methodology and case study

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Gyamfi, Max A.; Volkmer, Kent; Zimmerman, Wayne F.

    1987-01-01

    A methodology is described for examining the feasibility of a Flight Telerobotic Servicer (FTS) using two assembly scenarios, defined at the EVA task level, for the 30 shuttle flights (beginning with MB-1) over a four-year period. Performing all EVA tasks by crew only is compared to a scenario in which crew EVA is augmented by FTS. A reference FTS concept is used as a technology baseline and life-cycle cost analysis is performed to highlight cost tradeoffs. The methodology, procedure, and data used to complete the analysis are documented in detail.

  9. An evaluation of climate change effects in estuarine salinity patterns: Application to Ria de Aveiro shallow water system

    NASA Astrophysics Data System (ADS)

    Vargas, Catarina I. C.; Vaz, Nuno; Dias, João M.

    2017-04-01

    It is of global interest, for the definition of effective adaptation strategies, to assess the impacts of climate change on coastal environments. In this study, the adjustments in salinity patterns and the corresponding adaptations of the Venice System zonation are evaluated through numerical modelling for Ria de Aveiro, a mesotidal shallow-water lagoon on the Portuguese coast, for the end of the 21st century in a climate change context. A reference scenario (equivalent to present conditions) and three future scenarios are defined and simulated, both for wet and dry conditions. The future scenarios are designed with the following changes to the reference: scenario 1) projected mean sea level (MSL) rise; scenario 2) projected river flow discharges; and scenario 3) projections for both MSL and river flow discharges. The projections imposed are a MSL rise of 0.42 m and a freshwater flow reduction of ∼22% for the wet season and ∼87% for the dry season. Modelling results are analyzed for different tidal ranges. Results indicate: a) upstream salinity intrusion and a generalized salinity increase for the sea level rise scenario, with higher significance in the middle-to-upper lagoon zones; b) a maximum salinity increase of ∼12 in scenario 3 under wet conditions for the Espinheiro channel, the one with the highest freshwater contribution; c) an upstream displacement of the saline fronts under wet conditions for all future scenarios, strongest for scenario 3, of ∼2 km in the Espinheiro channel; and d) a landward progression of the saltier physical zones established in the Venice System scheme. The adaptation of the ecosystem to the upstream relocation of physical zones may be blocked by human settlements and other artificial barriers surrounding the estuarine environment.

  10. Problems encountered when defining Arctic amplification as a ratio

    PubMed Central

    Hind, Alistair; Zhang, Qiong; Brattström, Gudrun

    2016-01-01

    In climate change science the term ‘Arctic amplification’ has become synonymous with an estimate of the ratio of a change in Arctic temperatures to a broader reference change over the same period, usually in global temperatures. Here, it is shown that this definition of Arctic amplification comes with a suite of difficulties related to the statistical properties of the ratio estimator itself. Most problematic is the complexity of categorizing uncertainty in Arctic amplification when the global, or reference, change in temperature is close to 0 over a period of interest, in which case it may be impossible to set bounds on this uncertainty. An important conceptual distinction is made between the ‘Ratio of Means’ and ‘Mean Ratio’ approaches to defining a ratio estimate of Arctic amplification, as they not only possess different uncertainty properties regarding the amplification factor but are also demonstrated to ask different scientific questions. Uncertainty in the estimated range of the Arctic amplification factor using the latest global climate models and climate forcing scenarios is expanded upon and shown to be greater than previously demonstrated for future climate projections, particularly using forcing scenarios with lower concentrations of greenhouse gases. PMID:27461918

  11. Problems encountered when defining Arctic amplification as a ratio.

    PubMed

    Hind, Alistair; Zhang, Qiong; Brattström, Gudrun

    2016-07-27

    In climate change science the term 'Arctic amplification' has become synonymous with an estimate of the ratio of a change in Arctic temperatures to a broader reference change over the same period, usually in global temperatures. Here, it is shown that this definition of Arctic amplification comes with a suite of difficulties related to the statistical properties of the ratio estimator itself. Most problematic is the complexity of categorizing uncertainty in Arctic amplification when the global, or reference, change in temperature is close to 0 over a period of interest, in which case it may be impossible to set bounds on this uncertainty. An important conceptual distinction is made between the 'Ratio of Means' and 'Mean Ratio' approaches to defining a ratio estimate of Arctic amplification, as they not only possess different uncertainty properties regarding the amplification factor but are also demonstrated to ask different scientific questions. Uncertainty in the estimated range of the Arctic amplification factor using the latest global climate models and climate forcing scenarios is expanded upon and shown to be greater than previously demonstrated for future climate projections, particularly using forcing scenarios with lower concentrations of greenhouse gases.
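    The distinction between the two estimators named in this record can be made concrete with a short numerical sketch; the paired ensemble values below are invented for illustration, not taken from any climate model:

```python
# Hypothetical ensemble of paired temperature changes (deg C) contrasting the
# 'Ratio of Means' and 'Mean Ratio' estimators of Arctic amplification.
arctic_change = [2.4, 3.1, 1.8, 2.9]
global_change = [1.0, 1.2, 0.8, 1.1]
n = len(arctic_change)

# 'Ratio of Means': ratio of the ensemble-mean Arctic change to the
# ensemble-mean global change.
ratio_of_means = (sum(arctic_change) / n) / (sum(global_change) / n)

# 'Mean Ratio': ensemble mean of the per-member ratios. When any member's
# global change is near zero, its ratio explodes -- the unbounded-uncertainty
# problem the abstract warns about.
mean_ratio = sum(a / g for a, g in zip(arctic_change, global_change)) / n

print(ratio_of_means, mean_ratio)  # the two estimators generally disagree
```

Even on well-behaved data the two numbers differ, which is the abstract's point that they answer different scientific questions.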

  12. Protocol Architecture Model Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    NASA's Glenn Research Center (GRC) defines and develops advanced technology for high-priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), and Aerospace Technology. This report applies the methodology to three space Internet-based communications scenarios for future missions. CNS has conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. The scenarios are: Scenario 1: unicast communications between a Low-Earth-Orbit (LEO) spacecraft in-space Internet node and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer; Scenario 2: unicast communications between a Low-Earth-Orbit (LEO) International Space Station and a ground terminal Internet node via a TDRS transfer; Scenario 3: multicast communications ("multicasting"), one spacecraft to N ground receivers, or N ground transmitters to one ground receiver via a spacecraft.

  13. No-reference image quality assessment for horizontal-path imaging scenarios

    NASA Astrophysics Data System (ADS)

    Rios, Carlos; Gladysz, Szymon

    2013-05-01

    There exist several image-enhancement algorithms and tasks associated with imaging through turbulence that depend on defining the quality of an image. Examples include: "lucky imaging", choosing the width of the inverse filter for image reconstruction, or stopping iterative deconvolution. We collected a number of image quality metrics found in the literature. Particularly interesting are the blind, "no-reference" metrics. We discuss ways of evaluating the usefulness of these metrics, even when a fully objective comparison is impossible because of the lack of a reference image. Metrics are tested on simulated and real data. Field data comes from experiments performed by the NATO SET 165 research group over a 7 km distance in Dayton, Ohio.

  14. Energy and environmental evaluation of combined cooling heating and power system

    NASA Astrophysics Data System (ADS)

    Bugaj, Andrzej

    2017-11-01

    The paper addresses issues involved in implementing a combined cooling, heating and power (CCHP) system in an industrial facility with well-defined demand profiles for cooling, heating and electricity. The application of a CCHP system in this particular industrial facility is evaluated by comparison with a reference system consisting of three conventional methods of energy supply: (a) electricity from the external grid, (b) heat from gas-fired boilers and (c) cooling from vapour compression chillers run on electricity from the grid. The CCHP scenario is based on a combined heat and power (CHP) plant with a gas turbine-compressor arrangement and a single-effect water/lithium bromide absorption chiller. The two scenarios are analysed in terms of annual primary energy usage as well as CO2 emissions. The results of the analysis show the extent of the primary energy savings of the CCHP system in comparison with the reference system. Furthermore, the environmental impact of CCHP usage, in the form of greenhouse gas emission reductions, compares quite favourably with the reference conventional option.
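    The primary-energy comparison described in this record can be sketched in a few lines; all efficiencies, demands and fuel inputs below are assumed placeholder values, not figures from the paper:

```python
# Minimal sketch of the comparison: annual primary energy of a reference
# system (grid electricity + gas boilers + electric chillers) versus a CCHP
# plant burning fuel directly. All numbers are illustrative assumptions.
def reference_primary_energy(elec, heat, cool,
                             grid_eff=0.40, boiler_eff=0.90, chiller_cop=3.0):
    """Primary energy (MWh) for separate production of each demand (MWh)."""
    return elec / grid_eff + heat / boiler_eff + (cool / chiller_cop) / grid_eff

def cchp_primary_energy(fuel_input):
    """For the CCHP scenario, primary energy is simply the fuel burned."""
    return fuel_input

ref = reference_primary_energy(elec=1000, heat=1500, cool=600)
cchp = cchp_primary_energy(fuel_input=3500)
savings = 1 - cchp / ref
print(f"primary energy saving: {savings:.1%}")
```

With a CO2 emission factor per unit of fuel and per unit of grid electricity, the same structure yields the greenhouse-gas comparison.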

  15. The role of internal reference prices in consumers' willingness to pay judgments: Thaler's Beer Pricing Task revisited.

    PubMed

    Ranyard, R; Charlton, J P; Williamson, J

    2001-02-01

    Alternative reference prices, either displayed in the environment (external) or recalled from memory (internal) are known to influence consumer judgments and decisions. In one line of previous research, internal reference prices have been defined in terms of general price expectations. However, Thaler (Marketing Science 4 (1985) 199; Journal of Behavioral Decision Making 12 (1999) 183) defined them as fair prices expected from specific types of seller. Using a Beer Pricing Task, he found that seller context had a substantial effect on willingness to pay, and concluded that this was due to specific internal reference prices evoked by specific contexts. In a think aloud study using the same task (N = 48), we found only a marginal effect of seller context. In a second study using the Beer Pricing Task and seven analogous ones (N = 144), general internal reference prices were estimated by asking people what they normally paid for various commodities. Both general internal reference prices and seller context influenced willingness to pay, although the effect of the latter was again rather small. We conclude that general internal reference prices have a greater impact in these scenarios than specific ones, because of the lower cognitive load involved in their storage and retrieval.

  16. Cost Optimal Elastic Auto-Scaling in Cloud Infrastructure

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, S.; Sidhanta, S.; Ganguly, S.; Nemani, R. R.

    2014-12-01

    Today, elastic scaling is a critical part of leveraging the cloud. Elastic scaling refers to adding resources only when they are needed and deleting resources when they are not in use; it ensures compute/server resources are not over-provisioned. Today, Amazon and Windows Azure are the only two platform providers that allow auto-scaling of cloud resources, where servers are automatically added and deleted. However, these solutions fall short on the following key features: (A) they require explicit policy definitions, such as server-load thresholds, and therefore lack any predictive intelligence to make optimal decisions; (B) they do not decide on the right size of resource and thereby do not produce a cost-optimal resource pool. In a typical cloud deployment model, we consider two types of application scenario: (A) batch processing jobs (the Hadoop/Big Data case); (B) transactional applications (any application that processes continuous request/response transactions). With reference to the classical queueing model, we are trying to model a scenario where servers have a price and a capacity (size), and the system can add and delete servers to maintain a certain queue length. Classical queueing models apply to scenarios where the number of servers is constant, so we cannot apply stationary system analysis in this case. We investigate the following questions: (1) Can we define a job queue, and a metric on such a queue, to predict the resource requirement in a quasi-stationary way? Can we map that into an optimal sizing problem? (2) Do we need to consider the level of load (CPU/data) at the server level to characterize the size requirement, and how do we learn that based on job type?
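    One way to read question (1) is as a quasi-stationary sizing rule: predict the offered load over a window and choose the smallest server count that keeps utilization below a target. A toy sketch under that assumption (names and thresholds are illustrative, not from the cited work):

```python
# Toy quasi-stationary sizing rule: offered load (Erlangs) divided by the
# per-server utilization headroom, rounded up. A real elastic auto-scaler
# would also weigh server price, as the abstract proposes.
import math

def servers_needed(arrival_rate, service_time, target_utilization=0.7):
    """Smallest server count keeping utilization under the target.

    arrival_rate in jobs/s, service_time in s/job, so the offered load
    arrival_rate * service_time is dimensionless (Erlangs).
    """
    offered_load = arrival_rate * service_time
    return max(1, math.ceil(offered_load / target_utilization))

print(servers_needed(arrival_rate=50, service_time=0.2))  # load 10 E -> 15 servers
```

This is only the stationary-snapshot half of the problem; the abstract's open question is how to re-estimate `arrival_rate` and `service_time` predictively as the queue evolves.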

  17. Forward modeling of an atmospheric scenario: path characterization in terms of scattering intensity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosisio, Ada Vittoria; Cadeddu, Maria P.; Fionda, Ermanno

    The knowledge of possible impairments due to atmospheric propagation is of importance in the framework of future 5G mobile networks that use spectrum resources up to the W band. Here, the authors propose the scalar Scatter Indicator (SI), defined as the difference between the simulated TB at 72 GHz and the TB value at the same frequency estimated from a combination of TB values at 23.8 and 31.4 GHz under assumed scatter-free conditions. On the basis of radiosonde profiles observed in Milan, Linate (Italy) in 2005, clear-sky scenarios are used as a reference to define a scatter-free TB database. A second database of simulated TBs including scattering effects is generated with ARTS to build the SI. Numerical results show that the SI assumes significant positive values with increasing drop effective radius and total liquid water path (LWP), and it can be used to identify the scattering due to hydrometeors.

  18. Impact of pre-imputation SNP-filtering on genotype imputation results

    PubMed Central

    2014-01-01

    Background: Imputation of partially missing or unobserved genotypes is an indispensable tool for SNP data analyses. However, research on and understanding of the impact of initial SNP-data quality control on imputation results is still limited. In this paper, we aim to evaluate the effect of different strategies of pre-imputation quality filtering on the performance of the widely used imputation algorithms MaCH and IMPUTE. Results: We considered three scenarios: imputation of partially missing genotypes with use of an external reference panel, without use of an external reference panel, and imputation of completely un-typed SNPs using an external reference panel. We first created various datasets applying different SNP quality filters and masking certain percentages of randomly selected high-quality SNPs. We imputed these SNPs and compared the results between the different filtering scenarios using established and newly proposed measures of imputation quality. While the established measures assess the certainty of imputation results, our newly proposed measures focus on agreement with the true genotypes. These measures showed that pre-imputation SNP-filtering can be detrimental to imputation quality. Moreover, the strongest drivers of imputation quality were in general the burden of missingness and the number of SNPs used for imputation. We also found that using a reference panel always improves the imputation quality of partially missing genotypes. MaCH performed slightly better than IMPUTE2 in most of our scenarios. Again, these results were more pronounced when using our newly defined measures of imputation quality. Conclusion: Even moderate filtering has a detrimental effect on imputation quality. Therefore, little or no SNP filtering prior to imputation appears to be the best strategy for imputing small to moderately sized datasets. Our results also showed that for these datasets, MaCH performs slightly better than IMPUTE2 in most scenarios, at the cost of increased computing time. PMID:25112433
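    The "agreement with true genotypes" idea in this record amounts to masking known genotypes, imputing them, and scoring concordance. A minimal sketch with invented 0/1/2-coded genotypes (minor-allele dosage):

```python
# Concordance with masked true genotypes: the fraction of masked SNP
# genotypes that an imputation algorithm recovers exactly. Genotypes are
# coded 0/1/2 (copies of the minor allele); the data here are invented.
true_genotypes    = [0, 1, 2, 1, 0, 2, 1, 1]  # held-out ground truth
imputed_genotypes = [0, 1, 2, 2, 0, 2, 1, 0]  # what the algorithm returned

def concordance(imputed, true):
    """Fraction of masked genotypes imputed to the correct value."""
    matches = sum(i == t for i, t in zip(imputed, true))
    return matches / len(true)

print(concordance(imputed_genotypes, true_genotypes))  # 0.75
```

Unlike the certainty scores reported by the imputation tools themselves, this measure requires ground truth, which is why the study obtains it by masking high-quality SNPs before imputation.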

  19. Defining climate change scenario characteristics with a phase space of cumulative primary energy and carbon intensity

    NASA Astrophysics Data System (ADS)

    Ritchie, Justin; Dowlatabadi, Hadi

    2018-02-01

    Climate change modeling relies on projections of future greenhouse gas emissions and other phenomena leading to changes in planetary radiative forcing. Scenarios of socio-technical development consistent with end-of-century forcing levels are commonly produced by integrated assessment models. However, outlooks for forcing from fossil energy combustion can also be presented and defined in terms of two essential components: total energy use this century and the carbon intensity of that energy. This formulation allows a phase space diagram to succinctly describe a broad range of possible outcomes for carbon emissions from the future energy system. In the following paper, we demonstrate this phase space method with the Representative Concentration Pathways (RCPs) as used in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). The resulting RCP phase space is applied to map IPCC Working Group III (WGIII) reference case ‘no policy’ scenarios. Once these scenarios are described as coordinates in the phase space, data mining techniques can readily distill their core features. Accordingly, we conduct a k-means cluster analysis to distinguish the shared outlooks of these scenarios for oil, gas and coal resource use. As a whole, the AR5 database depicts a transition toward re-carbonization, where a world without climate policy inevitably leads to an energy supply with increasing carbon intensity. This orientation runs counter to the experienced ‘dynamics as usual’ of gradual decarbonization, suggesting climate change targets outlined in the Paris Accord are more readily achievable than projected to date.
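    The phase-space construction in this record rests on a simple identity: cumulative carbon emissions are the product of cumulative primary energy use and its average carbon intensity. A sketch with placeholder coordinates (not RCP values):

```python
# Each scenario is a point (cumulative energy, average carbon intensity) in
# the phase space; emissions follow from the product. Numbers are
# illustrative placeholders, not values from the RCP or AR5 databases.
def cumulative_emissions_gtc(energy_ej, intensity_kgc_per_gj):
    """Cumulative emissions (GtC) = energy (EJ) x carbon intensity (kgC/GJ).

    1 EJ = 1e9 GJ and 1 GtC = 1e12 kgC, so the unit conversion is a
    factor of 1e9 / 1e12 = 1e-3.
    """
    return energy_ej * intensity_kgc_per_gj * 1e-3

# Two hypothetical scenario coordinates in the phase space:
low  = cumulative_emissions_gtc(40_000, 15)   # decarbonizing energy supply
high = cumulative_emissions_gtc(60_000, 25)   # re-carbonizing energy supply
print(low, high)  # 600.0 1500.0 (GtC)
```

Because the same emissions total can arise from many (energy, intensity) pairs, clustering scenarios in this two-dimensional space, as the paper's k-means analysis does, reveals structure that a single emissions number hides.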

  20. Trade-Off Analysis Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    NASA's Glenn Research Center (GRC) defines and develops advanced technology for high-priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), and Aerospace Technology. CNS previously developed a report which applied the methodology to three space Internet-based communications scenarios for future missions. CNS conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. GRC selected for further analysis the scenario that involves unicast communications between a Low-Earth-Orbit (LEO) International Space Station (ISS) and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer. This report contains a tradeoff analysis of the selected scenario. The analysis examines the performance characteristics of the various protocols and architectures. The tradeoff analysis incorporates the results of a CNS-developed analytical model that examined performance parameters.

  21. Preliminary identification of potentially disruptive scenarios at the Greater Confinement Disposal Facility, Area 5 of the Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guzowski, R.V.; Newman, G.

    1993-12-01

    The Greater Confinement Disposal location is being evaluated to determine whether defense-generated transuranic waste buried at this location complies with the Containment Requirements established by the US Environmental Protection Agency. One step in determining compliance is to identify those combinations of events and processes (scenarios) that define possible future states of the disposal system for which performance assessments must be performed. An established scenario-development procedure was used to identify a comprehensive set of mutually exclusive scenarios. To assure completeness, 761 features, events, processes, and other listings (FEPs) were compiled from 11 references. This number was reduced to 205, primarily through the elimination of duplications. The 205 FEPs were screened based on site-specific, goal-specific, and regulatory criteria. Four events survived screening and were used in preliminary scenario development: (1) exploratory drilling penetrates a GCD borehole, (2) drilling of a withdrawal/injection well penetrates a GCD borehole, (3) subsidence occurs at the RWMS, and (4) irrigation occurs at the RWMS. A logic diagram was used to develop 16 scenarios from the four events. No screening of these scenarios was attempted at this time. Additional screening of the currently retained events and processes will be based on additional data and information from site-characterization activities. When screening of the events and processes is completed, a final set of scenarios will be developed and screened based on consequence and probability of occurrence.
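    The logic-diagram step above can be sketched directly: a diagram over n independent binary events enumerates the 2**n mutually exclusive scenarios, here 2**4 = 16. The event labels below paraphrase the four retained events.

```python
from itertools import product

# the four events retained after screening (paraphrased labels)
events = [
    "exploratory drilling penetrates a GCD borehole",
    "withdrawal/injection well penetrates a GCD borehole",
    "subsidence at the RWMS",
    "irrigation at the RWMS",
]

# a logic diagram over n binary events yields 2**n mutually exclusive
# scenarios: each scenario records, for every event, whether it occurs
scenarios = [dict(zip(events, occurs))
             for occurs in product([False, True], repeat=len(events))]

print(len(scenarios))  # 16, including the base case in which no event occurs
```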

  2. CERISE, a French radioprotection code, to assess the radiological impact and acceptance criteria of installations for material handling, and recycling or disposal of very low-level radioactive waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santucci, P.; Guetat, P.

    1993-12-31

    This document describes the code CERISE (Code d'Evaluations Radiologiques Individuelles pour des Situations en Entreprise et dans l'Environnement). The code was developed in the framework of European studies to establish acceptance criteria for very low-level radioactive waste and materials. It is written in Fortran and runs on PC. It calculates doses received via the different pathways: external exposure, ingestion, inhalation, and skin contamination. Twenty basic scenarios, determined from previous studies, have already been elaborated. Calculations establish the relation between surface, specific and/or total activities, and doses. Results can be expressed as doses for an average activity unit, or as average activity limits for a set of reference doses (defined for each scenario analyzed). In the latter case, the minimal activity values and the corresponding limiting scenarios are selected and summarized in a final table.

  3. Scenarios Based on Shared Socioeconomic Pathway Assumptions

    NASA Astrophysics Data System (ADS)

    Edmonds, J.

    2013-12-01

    A set of new scenarios is being developed by the international scientific community as part of a larger program that was articulated in Moss et al. (2009), published in Nature. A long series of meetings including climate researchers drawn from the climate modeling, impacts, adaptation and vulnerability (IAV) and integrated assessment modeling (IAM) communities has led to the development of a set of five Shared Socioeconomic Pathways (SSPs), which define the state of human and natural societies at a macro scale over the course of the 21st century without regard to climate mitigation or change. SSPs were designed to explore a range of possible futures consistent with greater or lesser challenges to mitigation and challenges to adaptation. They include a narrative storyline and a set of quantified measures--e.g. demographic and economic profiles--that define the high-level state of society as it evolves over the 21st century under the assumption of no significant climate feedback. SSPs can be used to develop quantitative scenarios of human Earth systems using IAMs. IAMs produce information about greenhouse gas emissions, energy systems, the economy, agriculture and land use. Each set of SSPs will have a different human Earth system realization for each IAM. Five groups from the IAM community have begun to explore the implications of SSP assumptions for emissions, energy, economy, agriculture and land use. We report the quantitative results of initial experiments from those groups. A major goal of the Moss et al. strategy was to enable the use of CMIP5 climate model ensemble products for IAV research. CMIP5 climate scenarios used four Representative Concentration Pathway (RCP) scenarios, defined in terms of radiative forcing in the year 2100: 2.6, 4.5, 6.0, and 8.5 W m-2.
There is no reason to believe that the SSPs will generate year-2100 levels of radiative forcing that correspond to the four RCP levels, though it is important that at least one SSP produce a scenario with at least 8.5 W m-2. To address this problem, each SSP scenario can be treated as a reference scenario to which emissions mitigation policies can be applied, creating a set of RCP replications. These RCP replications share the underlying SSP socio-economic assumptions, add policy assumptions, and have radiative forcing levels consistent with the CMIP5 products. We report quantitative results of initial experiments from the five participating groups.

  4. A multi-source probabilistic hazard assessment of tephra dispersal in the Neapolitan area

    NASA Astrophysics Data System (ADS)

    Sandri, Laura; Costa, Antonio; Selva, Jacopo; Folch, Arnau; Macedonio, Giovanni; Tonini, Roberto

    2015-04-01

    In this study we present the results obtained from a long-term Probabilistic Hazard Assessment (PHA) of tephra dispersal in the Neapolitan area. Conventional PHA for tephra dispersal requires the definition of eruptive scenarios (usually by grouping eruption sizes and possible vent positions in a limited number of classes) with associated probabilities, a meteorological dataset covering a representative time period, and a tephra dispersal model. PHA then results from combining simulations considering different volcanological and meteorological conditions through weights associated with their specific probability of occurrence. However, volcanological parameters (i.e., erupted mass, eruption column height, eruption duration, bulk granulometry, fraction of aggregates) typically encompass a wide range of values. Because of such natural variability, single representative scenarios or size classes cannot be adequately defined using single values for the volcanological inputs. In the present study, we use a method that accounts for this within-size-class variability in the framework of Event Trees. The variability of each parameter is modeled with specific Probability Density Functions, and meteorological and volcanological input values are chosen by using a stratified sampling method. This procedure allows for quantifying hazard without relying on the definition of scenarios, thus avoiding potential biases introduced by selecting single representative scenarios. Embedding this procedure into the Bayesian Event Tree scheme enables quantification of the tephra fall PHA and of its epistemic uncertainties. We have applied this scheme to analyze long-term tephra fall PHA from Vesuvius and Campi Flegrei in a multi-source paradigm. We integrate two tephra dispersal models (the analytical HAZMAP and the numerical FALL3D) into BET_VH. The ECMWF reanalysis dataset is used for exploring different meteorological conditions.
The results obtained show that PHA accounting for the whole natural variability is consistent with previous probability maps elaborated for Vesuvius and Campi Flegrei on the basis of single representative scenarios, but shows significant differences. In particular, the area characterized by a 300 kg/m2-load exceedance probability larger than 5%, accounting for the whole range of variability (that is, from small violent strombolian to plinian eruptions), is similar to that displayed in the maps based on the medium magnitude reference eruption, but smaller in extent. This is due to the relatively higher weight of the small magnitude eruptions considered in this study, but neglected in the reference scenario maps. On the other hand, in our new maps the area characterized by a 300 kg/m2-load exceedance probability larger than 1% is much larger than that of the medium magnitude reference eruption, due to the contribution of plinian eruptions at lower probabilities, again neglected in the reference scenario maps.
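    The stratified sampling of volcanological inputs can be sketched as follows: instead of drawing plain Monte Carlo samples, one draw is taken per equal-probability stratum of the parameter's distribution, so the tails are always represented. The lognormal parameters below (median 8 km column height) are illustrative assumptions, not values from the study.

```python
import numpy as np
from statistics import NormalDist

def stratified_sample(inv_cdf, n):
    """One draw at the midpoint of each of n equal-probability strata."""
    u = (np.arange(n) + 0.5) / n          # stratum midpoints in (0, 1)
    return np.array([inv_cdf(x) for x in u])

nd = NormalDist()

# hypothetical lognormal distribution for eruption column height (km)
# within one size class; median and spread are illustrative only
def height_inv_cdf(u):
    return float(np.exp(np.log(8.0) + 0.4 * nd.inv_cdf(u)))

heights = stratified_sample(height_inv_cdf, n=100)
```

    Each sampled height would then be paired with sampled meteorological conditions and fed to the dispersal model, with every simulation weighted by its stratum probability (here uniform, 1/n).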

  5. The development of an inherent safety approach to the prevention of domino accidents.

    PubMed

    Cozzani, Valerio; Tugnoli, Alessandro; Salzano, Ernesto

    2009-11-01

    The severity of industrial accidents in which a domino effect takes place is well known in the chemical and process industry. The application of an inherent safety approach for the prevention of escalation events leading to domino accidents was explored in the present study. Reference primary scenarios were analyzed and escalation vectors were defined. Inherent safety distances were defined and proposed as a metric to express the intensity of the escalation vectors. Simple rules of thumb were presented for a preliminary screening of these distances. Swift reference indices for layout screening with respect to escalation hazard were also defined. Two case studies derived from existing layouts of oil refineries were selected to illustrate the potential of applying the methodology. The results showed that the approach allows a first comparative assessment of the actual domino hazard in a layout, and the identification of critical primary units with respect to escalation events. The methodology developed also represents a useful screening tool to identify where to dedicate major efforts in the design of add-on measures, optimizing conventional passive and active measures for the prevention of severe domino accidents.
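    The inherent safety distance concept can be sketched numerically: for a monotonically decaying escalation vector, it is the distance at which the vector's intensity falls below the damage threshold of a target unit. The inverse-square heat-radiation model and its parameters below are illustrative assumptions, not the correlations used in the paper.

```python
def inherent_safety_distance(intensity, threshold, d_lo=1.0, d_hi=1000.0, tol=1e-6):
    """Distance at which a monotonically decaying escalation vector
    drops below its damage threshold, found by bisection."""
    assert intensity(d_lo) > threshold > intensity(d_hi)
    while d_hi - d_lo > tol:
        mid = 0.5 * (d_lo + d_hi)
        if intensity(mid) > threshold:
            d_lo = mid
        else:
            d_hi = mid
    return 0.5 * (d_lo + d_hi)

# hypothetical heat-radiation escalation vector decaying with the
# inverse square of distance (toy model; q0 is an assumed strength)
q0 = 1.0e6
heat = lambda d: q0 / d**2

# 15 kW/m2 is used here as an assumed damage threshold for the target
d_safe = inherent_safety_distance(heat, threshold=15.0)
```

    A layout screening would then flag any target unit sited closer to the primary unit than its inherent safety distance.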

  6. Regional Issue Identification and Assessment (RIIA). Volume III. Institutional barriers to developing power generation facilities in the Pacific Northwest

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, F. A.; Sawyer, C. H.; Maxwell, J. H.

    1979-10-01

    The Regional Assessments Division in the US Department of Energy (DOE) has undertaken a program to assess the probable consequences of various national energy policies in regions of the United States and to evaluate the constraints on national energy policy imposed by conditions in these regions. The program is referred to as the Regional Issues Identification and Assessment (RIIA) Program. Currently the RIIA Program is evaluating the Trendlong Mid-Mid scenario, a pattern of energy development for 1985 and 1990 derived from the Project Independence Evaluation System (PIES) model. This scenario assumes a medium annual growth rate in both the national demand for and national supply of energy. It has been disaggregated to specify the generating capacity to be supplied by each energy source in each state. Pacific Northwest Laboratory (PNL) has the responsibility for evaluating the scenario for the Federal Region 10, consisting of Alaska, Idaho, Oregon, and Washington. PNL is identifying impacts and constraints associated with realizing the scenario in a variety of categories, including air and water quality impacts, health and safety effects, and socioeconomic impacts. This report summarizes the analysis of one such category: institutional constraints - defined to include legal, organizational, and political barriers to the achievement of the scenario in the Northwest.

  7. Exploring the reversibility of marine climate change impacts in temperature overshoot scenarios

    NASA Astrophysics Data System (ADS)

    Zickfeld, K.; Li, X.; Tokarska, K.; Kohfeld, K. E.

    2017-12-01

    Artificial carbon dioxide removal (CDR) from the atmosphere has been proposed as a measure for mitigating climate change and restoring the climate system to a 'safe' state after overshoot. Previous studies have demonstrated that the changes in surface air temperature due to anthropogenic CO2 emissions can be reversed through CDR, while some oceanic properties, for example thermosteric sea level rise, show a delay in their response to CDR. This research aims to investigate the reversibility of changes in ocean conditions after implementation of CDR, with a focus on ocean biogeochemical properties. To achieve this, we analyze climate model simulations based on two sets of emission scenarios. We first use RCP2.6 and its extension until year 2300 as the reference scenario and design several temperature and cumulative CO2 emissions "overshoot" scenarios based on other RCPs, which represent cases with less ambitious mitigation policies in the near term that temporarily exceed the 2 °C target adopted by the Paris Agreement. In addition, we use a set of emission scenarios with a reference scenario limiting warming to 1.5 °C in the long term and two overshoot scenarios. The University of Victoria Earth System Climate Model (UVic ESCM), a climate model of intermediate complexity, is forced with these emission scenarios. We compare the response of select ocean variables (seawater temperature, pH, dissolved oxygen) in the overshoot scenarios to that in the respective reference scenario at the time the same amount of cumulative emissions is achieved. Our results suggest that the overshoot and subsequent return to a reference CO2 cumulative emissions level would leave substantial impacts on the marine environment. Although changes in global mean sea surface variables (temperature, pH and dissolved oxygen) are largely reversible, the whole-ocean mean temperature, dissolved oxygen and pH differ significantly from those in the reference scenario.
Large ocean areas exhibit temperature increase and pH and dissolved oxygen decrease relative to the reference scenario without cumulative CO2 emissions overshoot. Furthermore, our results show that the higher the level of overshoot, the lower the reversibility of changes in the marine environment.

  8. Comparison of tablet-based strategies for incision planning in laser microsurgery

    NASA Astrophysics Data System (ADS)

    Schoob, Andreas; Lekon, Stefan; Kundrat, Dennis; Kahrs, Lüder A.; Mattos, Leonardo S.; Ortmaier, Tobias

    2015-03-01

    Recent research has revealed that incision planning in laser surgery deploying stylus and tablet outperforms state-of-the-art micro-manipulator-based laser control. Providing more detailed quantification of that approach, a comparative study of six tablet-based strategies for laser path planning is presented. The reference strategy is defined by monoscopic visualization and continuous path drawing on a graphics tablet. Further concepts deploying a stereoscopic view, a synthesized laser view, point-based path definition, real-time teleoperation, or a pen display are compared with the reference scenario. Volunteers were asked to redraw and ablate stamped lines on a sample. Performance is assessed by measuring planning accuracy, completion time and ease of use. Results demonstrate that significant differences exist between the proposed concepts. The reference strategy provides more accurate incision planning than the stereo or laser view scenario. Real-time teleoperation performs best with respect to completion time without indicating any significant deviation in accuracy and usability. Point-based planning as well as the pen display provide the most accurate planning and increased ease of use compared to the reference strategy. As a result, combining the pen display approach with point-based planning has potential to become a powerful strategy, benefiting from improved hand-eye coordination on the one hand and from a simple but accurate technique for path definition on the other. These findings, as well as the overall usability scale indicating high acceptance and consistency of the proposed strategies, motivate further advanced tablet-based planning in laser microsurgery.

  9. Children and adults exposed to electromagnetic fields at the ICNIRP reference levels: theoretical assessment of the induced peak temperature increase.

    PubMed

    Bakker, J F; Paulides, M M; Neufeld, E; Christ, A; Kuster, N; van Rhoon, G C

    2011-08-07

    To avoid potentially adverse health effects of electromagnetic fields (EMF), the International Commission on Non-Ionizing Radiation Protection (ICNIRP) has defined EMF reference levels. Restrictions on induced whole-body-averaged specific absorption rate (SAR(wb)) are provided to keep the whole-body temperature increase (T(body, incr)) under 1 °C during 30 min. Additional restrictions on the peak 10 g spatial-averaged SAR (SAR(10g)) are provided to prevent excessive localized tissue heating. The objective of this study is to assess the localized peak temperature increase (T(incr, max)) in children upon exposure at the reference levels. Finite-difference time-domain modeling was used to calculate T(incr, max) in six children and two adults exposed to orthogonal plane-wave configurations. We performed a sensitivity study and Monte Carlo analysis to assess the uncertainty of the results. Considering the uncertainties in the model parameters, we found that a peak temperature increase as high as 1 °C can occur for worst-case scenarios at the ICNIRP reference levels. Since the guidelines are deduced from temperature increase, we consider T(incr, max) a better metric than localized peak SAR for preventing excessive localized tissue heating. However, we note that the exposure time should also be considered in future guidelines. Hence, we advise defining limits on T(incr, max) for specified durations of exposure.
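    The Monte Carlo step of such a sensitivity analysis can be sketched generically: perturb uncertain model parameters around their nominal values and collect the distribution of the resulting peak temperature increase. The toy relation below (heating proportional to local SAR, inversely proportional to perfusion) is a deliberately simplified stand-in for the study's FDTD thermal model, and all numbers are assumptions.

```python
import random

random.seed(0)

def peak_temp_increase(sar_10g, perfusion):
    # toy steady-state relation: heating scales with local SAR and
    # inversely with blood perfusion (NOT the FDTD model of the study)
    return 0.25 * sar_10g / perfusion

# perturb the two uncertain parameters around assumed nominal values
samples = [peak_temp_increase(sar_10g=random.gauss(2.0, 0.2),
                              perfusion=random.gauss(1.0, 0.1))
           for _ in range(10000)]

# a high percentile characterizes the worst-case scenarios
p95 = sorted(samples)[int(0.95 * len(samples))]
```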

  10. An effective XML based name mapping mechanism within StoRM

    NASA Astrophysics Data System (ADS)

    Corso, E.; Forti, A.; Ghiselli, A.; Magnoni, L.; Zappi, R.

    2008-07-01

    In a Grid environment the naming capability allows users to refer to specific data resources in a physical storage system using a high level logical identifier. This logical identifier is typically organized in a file system like structure, a hierarchical tree of names. Storage Resource Manager (SRM) services map the logical identifier to the physical location of data by evaluating a set of parameters such as the desired quality of service and the VOMS attributes specified in the requests. StoRM is an SRM service developed by INFN and ICTP-EGRID to manage file and space on standard POSIX and high performing parallel and cluster file systems. An upcoming requirement in the Grid data scenario is the orthogonality of the logical name and the physical location of data, in order to refer, with the same identifier, to different copies of data archived in various storage areas with different quality of service. The mapping mechanism proposed in StoRM is based on an XML document that represents the different storage components managed by the service, the storage areas defined by the site administrator, the quality of service they provide, and the Virtual Organizations that want to use them. An appropriate directory tree is realized in each storage component reflecting the XML schema. In this scenario StoRM is able to identify the physical location of requested data by evaluating the logical identifier and the specified attributes following the XML schema, without querying any database service. This paper presents the namespace schema defined, the different entities represented and the technical details of the StoRM implementation.
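    The mapping idea can be sketched with a toy namespace document: resolve a logical identifier by finding the storage area whose logical root is its longest matching prefix, then substituting the area's physical root. The element and attribute names below are illustrative, not StoRM's actual schema.

```python
import xml.etree.ElementTree as ET

# hypothetical namespace document in the spirit of the StoRM approach;
# element/attribute names are illustrative, not the real StoRM schema
NAMESPACE_XML = """
<namespace>
  <storage-area name="atlas-disk" root="/lfn/atlas" voname="atlas">
    <physical-root>/storage/gpfs/atlas</physical-root>
  </storage-area>
  <storage-area name="cms-tape" root="/lfn/cms" voname="cms">
    <physical-root>/storage/tape/cms</physical-root>
  </storage-area>
</namespace>
"""

def resolve(logical_name, tree):
    """Map a logical identifier to a physical path via the longest
    matching storage-area root -- no database lookup, just the XML."""
    best = None
    for sa in tree.iter("storage-area"):
        root = sa.get("root")
        if logical_name.startswith(root + "/") and (best is None or len(root) > len(best[0])):
            best = (root, sa.findtext("physical-root"))
    if best is None:
        raise KeyError(logical_name)
    root, phys = best
    return phys + logical_name[len(root):]

tree = ET.fromstring(NAMESPACE_XML)
print(resolve("/lfn/atlas/data/file1", tree))  # /storage/gpfs/atlas/data/file1
```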

  11. Emissions of indoor air pollutants from six user scenarios in a model room

    NASA Astrophysics Data System (ADS)

    Höllbacher, Eva; Ters, Thomas; Rieder-Gradinger, Cornelia; Srebotnik, Ewald

    2017-02-01

    In this study six common user scenarios putatively influencing indoor air quality were performed in a model room constructed according to the specifications of the European Reference Room given in the new horizontal prestandard prEN 16516, to gain further information about the influence of user activities on indoor air quality. These scenarios included the use of a cleaning agent, an electric air freshener, an ethanol fireplace and cosmetics, as well as cigarette smoking and peeling of oranges. Four common indoor air pollutants were monitored: volatile organic compounds (VOC), particulate matter (PM), carbonyl compounds and CO2. The development of all pollutants was determined during and after the test performance. For each measured pollutant, well-defined maximum values could be assigned to one or more of the individual user scenarios. The highest VOC concentration was measured during orange-peeling, reaching a maximum value of 3547 μg m-3. Carbonyl compounds and PM were strongly elevated during cigarette smoking. Here, a maximum formaldehyde concentration of 76 μg m-3 and PM concentration of 378 μg m-3 were measured. CO2 was only slightly affected by most of the tests except the use of the ethanol fireplace, where a maximum concentration of 1612 ppm was reached. Generally, the user scenarios resulted in a distinct increase of several indoor pollutants that usually decreased rapidly after the removal of the source.
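    The rapid post-source decay noted above is commonly described with a single-zone box model: once the source is removed, a well-mixed pollutant decays exponentially at the air-exchange rate. This is a generic textbook sketch; the air-exchange rate below is an assumption, not the model room's measured value.

```python
import math

def concentration(c_peak, air_changes_per_hour, hours):
    """First-order decay of a well-mixed indoor pollutant after source
    removal: C(t) = C0 * exp(-ACH * t) (single-zone box model)."""
    return c_peak * math.exp(-air_changes_per_hour * hours)

c0 = 3547.0   # peak VOC during orange peeling, ug/m3 (from the abstract)
ach = 0.5     # assumed air-exchange rate, 1/h (illustrative)

# concentration remaining two hours after the source is removed
c_2h = concentration(c0, ach, hours=2.0)
```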

  12. Operations analysis for lunar surface construction: Results of two office of exploration case studies

    NASA Astrophysics Data System (ADS)

    Bell, Lisa Y.; Boles, Walter; Smith, Alvin

    1991-08-01

    In an environment of intense competition for Federal funding, the U.S. space research community is responsible for developing a feasible, cost-effective approach to establishing a surface base on the moon to fulfill long-term Government objectives. This report presents the results of a construction operations analysis of two lunar scenarios provided by the National Aeronautics and Space Administration (NASA). Activities necessary to install the lunar base surface elements are defined and scheduled, based on the productivities and availability of the base resources allocated to the projects depicted in each scenario. The only construction project in which the required project milestones were not completed within the nominal timeframe was the initial startup phase of NASA's FY89 Lunar Evolution Case Study (LECS), primarily because this scenario did not include any Earth-based telerobotic site preparation before the arrival of the first crew. The other scenario analyzed, Reference Mission A from NASA's 90-Day Study of the Human Exploration of the Moon and Mars, did use telerobotic site preparation before the manned phase of the base construction. Details of the analysis for LECS are provided, including spreadsheets indicating quantities of work and Gantt charts depicting the general schedule for the work. This level of detail is not presented for the scenario based on the 90-Day Study because many of the projects include the same (or similar) surface elements and facilities.

  13. Communication architecture for AAL. Supporting patient care by health care providers in AAL-enhanced living quarters.

    PubMed

    Nitzsche, T; Thiele, S; Häber, A; Winter, A

    2014-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Using Data from Ambient Assisted Living and Smart Homes in Electronic Health Records". Concepts of Ambient Assisted Living (AAL) support long-term health monitoring and further medical and other services for multi-morbid patients with chronic diseases. In Germany many AAL and telemedical applications exist. Synergy effects from common agreements on essential application components and standards have not been achieved. It is necessary to define a communication architecture which is based on common definitions of communication scenarios, application components and communication standards. The development of a communication architecture requires different steps. To gain a reference model for the problem area, different AAL and telemedicine projects were compared and relevant data elements were generalized. The derived reference model defines standardized communication links. As a result, the authors present an approach towards a reference architecture for AAL communication. The focus of the architecture lies on the communication layer. The necessary application components are identified and a communication based on standards and their extensions is highlighted. The exchange of patient-individual events, supported by an event classification model, and of raw and aggregated data from the personal home area over a telemedicine center to health care providers is possible.

  14. Modeling the impact of climate change in Germany with biosphere models for long-term safety assessment of nuclear waste repositories.

    PubMed

    Staudt, C; Semiochkina, N; Kaiser, J C; Pröhl, G

    2013-01-01

    Biosphere models are used to evaluate the exposure of populations to radionuclides from a deep geological repository. Since the time frame for assessments of long-term disposal safety is 1 million years, potential future climate changes need to be accounted for. Potential future climate conditions were defined for northern Germany according to model results from the BIOCLIM project. Nine present day reference climate regions were defined to cover those future climate conditions. A biosphere model was developed according to the BIOMASS methodology of the IAEA and model parameters were adjusted to the conditions at the reference climate regions. The model includes exposure pathways common to those reference climate regions in a stylized biosphere and relevant to the exposure of a hypothetical self-sustaining population at the site of potential radionuclide contamination from a deep geological repository. The end points of the model are Biosphere Dose Conversion factors (BDCF) for a range of radionuclides and scenarios, normalized for a constant radionuclide concentration in near-surface groundwater. Model results suggest increased exposure in dry climate regions, with a high impact of drinking-water consumption rates and of the amount of irrigation water used for agriculture. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Emotion recognition bias for contempt and anger in body dysmorphic disorder.

    PubMed

    Buhlmann, Ulrike; Etcoff, Nancy L; Wilhelm, Sabine

    2006-03-01

    Body dysmorphic disorder (BDD) patients are preoccupied with imagined defects or flaws in appearance (e.g., size or shape of nose). They are afraid of negative evaluations by others and often suffer significant morbidity including hospitalization and suicide attempts. Many patients experience ideas of reference, e.g., they often believe others take special notice of their "flaw". Facial expressions play an important role in conveying negative or positive feelings, and sympathy or rejection. In this study, we investigated emotion recognition deficits in 18 BDD patients and 18 healthy controls. Participants were presented with two questionnaires accompanying facial photographs. One questionnaire included self-referent scenarios ("Imagine that the bank teller is looking at you. What is his facial expression like?"), whereas the other one included other-referent scenarios ("Imagine that the bank teller is looking at a friend of yours," etc.), and participants were asked to identify the corresponding emotion (e.g., anger, contempt, neutral, or surprise). Overall, BDD patients, relative to controls, had difficulty identifying emotional expressions in self-referent scenarios. They misinterpreted more expressions as contemptuous and angry in self-referent scenarios than did controls. However, they did not have significantly more difficulties identifying emotional expressions in other-referent scenarios than controls. Thus, poor insight and ideas of reference, common in BDD, might be related to a bias for misinterpreting other people's emotional expressions as negative. Perceiving others as rejecting might reinforce concerns about one's personal perceived ugliness and social desirability.

  16. HUMEX, a study on the survivability and adaptation of humans to long-duration exploratory missions, part I: lunar missions.

    PubMed

    Horneck, G; Facius, R; Reichert, M; Rettberg, P; Seboldt, W; Manzey, D; Comet, B; Maillet, A; Preiss, H; Schauer, L; Dussap, C G; Poughon, L; Belyavin, A; Reitz, G; Baumstark-Khan, C; Gerzer, R

    2003-01-01

    The European Space Agency has recently initiated a study of the human responses, limits and needs with regard to the stress environments of interplanetary and planetary missions. Emphasis has been laid on human health and performance care as well as advanced life support developments including bioregenerative life support systems and environmental monitoring. The overall study goals were as follows: (i) to define reference scenarios for a European participation in human exploration and to estimate their influence on the life sciences and life support requirements; (ii) for selected mission scenarios, to critically assess the limiting factors for human health, wellbeing, and performance and to recommend relevant countermeasures; (iii) for selected mission scenarios, to critically assess the potential of advanced life support developments and to propose a European strategy including terrestrial applications; (iv) to critically assess the feasibility of existing facilities and technologies on ground and in space as testbeds in preparation for human exploratory missions and to develop a test plan for ground and space campaigns; (v) to develop a roadmap for a future European strategy towards human exploratory missions, including preparatory activities and terrestrial applications and benefits. This paper covers the part of the HUMEX study dealing with lunar missions. A lunar base at the south pole, where long periods of sunlight and potential water ice deposits could be assumed, was selected as the Moon reference scenario. The impact on human health, performance and well-being has been investigated from the viewpoint of the effects of microgravity (during space travel), reduced gravity (on the Moon) and abrupt gravity changes (during launch and landing), of the effects of cosmic radiation including solar particle events, of psychological issues, as well as general health care.
Countermeasures, as well as necessary research using ground-based test beds and/or the International Space Station, have been defined. Likewise, advanced life support systems with a high degree of autonomy and regenerative capacity, together with their synergy effects, were considered; here bioregenerative life support systems and biodiagnostic systems become essential. Finally, a European strategy leading to a potential European participation in future human exploratory missions has been recommended. © 2003 COSPAR. Published by Elsevier Ltd. All rights reserved.

  17. HUMEX, a study on the survivability and adaptation of humans to long-duration exploratory missions, part I: lunar missions

    NASA Technical Reports Server (NTRS)

    Horneck, G.; Facius, R.; Reichert, M.; Rettberg, P.; Seboldt, W.; Manzey, D.; Comet, B.; Maillet, A.; Preiss, H.; Schauer, L.; hide

    2003-01-01

    The European Space Agency has recently initiated a study of the human responses, limits and needs with regard to the stress environments of interplanetary and planetary missions. Emphasis has been laid on human health and performance care as well as advanced life support developments including bioregenerative life support systems and environmental monitoring. The overall study goals were as follows: (i) to define reference scenarios for a European participation in human exploration and to estimate their influence on the life sciences and life support requirements; (ii) for selected mission scenarios, to critically assess the limiting factors for human health, wellbeing, and performance and to recommend relevant countermeasures; (iii) for selected mission scenarios, to critically assess the potential of advanced life support developments and to propose a European strategy including terrestrial applications; (iv) to critically assess the feasibility of existing facilities and technologies on ground and in space as testbeds in preparation for human exploratory missions and to develop a test plan for ground and space campaigns; (v) to develop a roadmap for a future European strategy towards human exploratory missions, including preparatory activities and terrestrial applications and benefits. This paper covers the part of the HUMEX study dealing with lunar missions. A lunar base at the south pole where long-time sunlight and potential water ice deposits could be assumed was selected as the Moon reference scenario. The impact on human health, performance and well being has been investigated from the view point of the effects of microgravity (during space travel), reduced gravity (on the Moon) and abrupt gravity changes (during launch and landing), of the effects of cosmic radiation including solar particle events, of psychological issues as well as general health care. 
    Countermeasures, as well as necessary research using ground-based test beds and/or the International Space Station, have been defined. Likewise, advanced life support systems with a high degree of autonomy and regenerative capacity, along with their synergy effects, were considered, for which bioregenerative life support systems and biodiagnostic systems become essential. Finally, a European strategy leading to a potential European participation in future human exploratory missions has been recommended. © 2003 COSPAR. Published by Elsevier Ltd. All rights reserved.

  18. Tsallis’ quantum q-fields

    NASA Astrophysics Data System (ADS)

    Plastino, A.; Rocca, M. C.

    2018-05-01

    We generalize several well-known quantum equations to a Tsallis’ q-scenario, and provide a quantum version of some classical fields associated with them in the recent literature. We refer to the q-Schrödinger, q-Klein-Gordon, q-Dirac, and q-Proca equations advanced in, respectively, Phys. Rev. Lett. 106, 140601 (2011), EPL 118, 61004 (2017) and references therein. We also introduce here equations corresponding to q-Yang-Mills fields, both in the Abelian and non-Abelian instances. We show how to define the q-quantum field theories corresponding to the above equations, introduce the pertinent actions, and obtain equations of motion via the minimum action principle. These q-fields are meaningful at very high energies (TeV scale) for q = 1.15, high energies (GeV scale) for q = 1.001, and low energies (MeV scale) for q = 1.000001 [Nucl. Phys. A 955 (2016) 16 and references therein]. (See the ALICE experiment at the LHC). Surprisingly enough, these q-fields are simultaneously q-exponential functions of the usual linear fields’ logarithms.
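    The q-exponential that underlies these q-fields reduces to the ordinary exponential as q → 1. A minimal numerical sketch of that limit (illustrative only, not code from the paper):

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1-q)x]^(1/(1-q)), with the usual
    cutoff convention (value 0) when the bracket is non-positive.
    Reduces to the ordinary exp(x) at q = 1."""
    if q == 1.0:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0
    return base ** (1.0 / (1.0 - q))

# For the low-energy value q = 1.000001 quoted above, the q-exponential
# is numerically indistinguishable from the ordinary exponential.
print(abs(q_exp(1.0, 1.000001) - math.e))  # tiny difference
```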

  19. A safety rule approach to surveillance and eradication of biological invasions

    PubMed Central

    Haight, Robert G.; Koch, Frank H.; Venette, Robert; Studens, Kala; Fournier, Ronald E.; Swystun, Tom; Turgeon, Jean J.

    2017-01-01

    Uncertainty about future spread of invasive organisms hinders planning of effective response measures. We present a two-stage scenario optimization model that accounts for uncertainty about the spread of an invader, and determines survey and eradication strategies that minimize the expected program cost subject to a safety rule for eradication success. The safety rule includes a risk standard for the desired probability of eradication in each invasion scenario. Because the risk standard may not be attainable in every scenario, the safety rule defines a minimum proportion of scenarios with successful eradication. We apply the model to the problem of allocating resources to survey and eradicate the Asian longhorned beetle (ALB, Anoplophora glabripennis) after its discovery in the Greater Toronto Area, Ontario, Canada. We use historical data on ALB spread to generate a set of plausible invasion scenarios that characterizes the uncertainty of the beetle’s extent. We use these scenarios in the model to find survey and tree removal strategies that minimize the expected program cost while satisfying the safety rule. We also identify strategies that reduce the risk of very high program costs. Our results reveal two alternative strategies: (i) delimiting surveys and subsequent tree removal based on the surveys' outcomes, or (ii) preventive host tree removal without referring to delimiting surveys. The second strategy is more likely to meet the stated objectives when the capacity to detect an invader is low or the aspirations to eradicate it are high. Our results provide practical guidelines to identify the best management strategy given aspirational targets for eradication and spending. PMID:28759584
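    The safety rule described above can be sketched as a feasibility constraint in a simple strategy-selection loop. The strategy names, costs and probabilities below are hypothetical, chosen only to mirror the abstract's qualitative finding that preventive removal can win when detection capacity is low:

```python
# Hypothetical strategies: (name, expected cost, eradication probability per invasion scenario)
strategies = [
    ("delimit_survey_then_removal", 10.0, [0.95, 0.90, 0.60]),
    ("preventive_host_removal", 14.0, [0.97, 0.96, 0.93]),
]

RISK_STANDARD = 0.95    # desired probability of eradication in each scenario
MIN_PROPORTION = 2 / 3  # safety rule: minimum fraction of scenarios meeting the standard

def meets_safety_rule(probs):
    """True when enough scenarios reach the per-scenario risk standard."""
    met = sum(p >= RISK_STANDARD for p in probs)
    return met / len(probs) >= MIN_PROPORTION

# Minimize expected cost subject to the safety rule.
feasible = [(name, cost) for name, cost, probs in strategies if meets_safety_rule(probs)]
best = min(feasible, key=lambda s: s[1])
print(best[0])  # only the preventive strategy satisfies the rule here
```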

  20. Analysis of LNG peakshaving-facility release-prevention systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelto, P.J.; Baker, E.G.; Powers, T.B.

    1982-05-01

    The purpose of this study is to provide an analysis of release prevention systems for a reference LNG peakshaving facility. An overview assessment of the reference peakshaving facility, which preceded this effort, identified 14 release scenarios which are typical of the potential hazards involved in the operation of LNG peakshaving facilities. These scenarios formed the basis for this more detailed study. Failure modes and effects analysis and fault tree analysis were used to estimate the expected frequency of each release scenario for the reference peakshaving facility. In addition, the effectiveness of release prevention, release detection, and release control systems was evaluated.

  1. Evaluating the use of biomass energy with carbon capture and storage in low emission scenarios

    NASA Astrophysics Data System (ADS)

    Vaughan, Naomi E.; Gough, Clair; Mander, Sarah; Littleton, Emma W.; Welfle, Andrew; Gernaat, David E. H. J.; van Vuuren, Detlef P.

    2018-04-01

    Biomass Energy with Carbon Capture and Storage (BECCS) is heavily relied upon in scenarios of future emissions that are consistent with limiting global mean temperature increase to 1.5 °C or 2 °C above pre-industrial. These temperature limits are defined in the Paris Agreement in order to reduce the risks and impacts of climate change. Here, we explore the use of BECCS technologies in a reference scenario and three low emission scenarios generated by an integrated assessment model (IMAGE). Using these scenarios we investigate the feasibility of key implicit and explicit assumptions about these BECCS technologies, including biomass resource, land use, CO2 storage capacity and carbon capture and storage (CCS) deployment rate. In these scenarios, we find that half of all global CO2 storage required by 2100 occurs in USA, Western Europe, China and India, which is compatible with current estimates of regional CO2 storage capacity. CCS deployment rates in the scenarios are very challenging compared to historical rates of fossil, renewable or nuclear technologies and are entirely dependent on stringent policy action to incentivise CCS. In the scenarios, half of the biomass resource is derived from agricultural and forestry residues and half from dedicated bioenergy crops grown on abandoned agricultural land and expansion into grasslands (i.e. land for forests and food production is protected). Poor governance of the sustainability of bioenergy crop production can significantly limit the amount of CO2 removed by BECCS, through soil carbon loss from direct and indirect land use change. Only one-third of the bioenergy crops are grown in regions associated with more developed governance frameworks. Overall, the scenarios in IMAGE are ambitious but consistent with current relevant literature with respect to assumed biomass resource, land use and CO2 storage capacity.

  2. Effects of the proposed California WaterFix North Delta Diversion on survival of juvenile Chinook salmon (Oncorhynchus tshawytscha) in the Sacramento-San Joaquin River Delta, northern California

    USGS Publications Warehouse

    Perry, Russell W.; Pope, Adam C.

    2018-05-11

    The California Department of Water Resources and Bureau of Reclamation propose new water intake facilities on the Sacramento River in northern California that would convey some of the water for export to areas south of the Sacramento-San Joaquin River Delta (hereinafter referred to as the Delta) through tunnels rather than through the Delta. The collection of water intakes, tunnels, pumping facilities, associated structures, and proposed operations are collectively referred to as California WaterFix. The water intake facilities, hereinafter referred to as the North Delta Diversion (NDD), are proposed to be located on the Sacramento River downstream of the city of Sacramento and upstream of the first major river junction where Sutter Slough branches from the Sacramento River. The NDD can divert a maximum discharge of 9,000 cubic feet per second (ft³/s) from the Sacramento River, which reduces the amount of Sacramento River inflow into the Delta. In this report, we conduct four analyses to investigate the effect of the NDD and its proposed operation on survival of juvenile Chinook salmon (Oncorhynchus tshawytscha). All analyses used the results of a Bayesian survival model that allowed us to simulate travel time, migration routing, and survival of juvenile Chinook salmon migrating through the Delta in response to NDD operations, which affected both inflows to the Delta and operation of the Delta Cross Channel (DCC). For the first analysis, we evaluated the effect of the NDD bypass rules on salmon survival. The NDD bypass rules are a set of operational rule curves designed to provide adaptive levels of fish protection by defining allowable diversion rates as a function of (1) Sacramento River discharge as measured at Freeport, and (2) time of year when endangered runs requiring the most protection are present.
    We determined that all bypass rule curves except constant low-level pumping (maximum diversion of 900 ft³/s) could cause a sizeable decrease in survival by as much as 6–10 percentage points. The maximum decrease in survival occurred at an intermediate Sacramento River flow of about 20,000–30,000 ft³/s. Diversion rates increased rapidly as Sacramento River flows increased from 20,000 ft³/s to 30,000 ft³/s, until a maximum diversion rate was reached at 9,000 ft³/s. Because through-Delta survival increases sharply over this range of Sacramento River flow before beginning to level off with further flow increases, increasing diversion rates over this flow range causes a large decrease in survival relative to no diversion. For the second analysis, we applied the survival model to 82 years of daily simulated flows under the Proposed Action (PA) and No Action Alternative (NAA). The PA includes operation of the Central Valley Project/State Water Project with implementation of the NDD and its operations prescribed by the NDD bypass rules, whereas the NAA assumes system operations without implementation of the NDD. We also evaluated a “Level 1” (L1) scenario, which was similar to the PA scenario but applied the most protective bypass rule, known as Level 1 post-pulse operations. We noted a high probability that survival under the PA scenario was lower than under the NAA scenario, and that travel time was longer under PA relative to NAA in most simulation years. However, the largest survival differences between the PA and NAA scenarios occurred during October–November and May–June. Although bypass rules are less restrictive during these periods, we determined that more frequent use of the DCC under PA led to the largest differences in survival between the two scenarios. Additionally, we noted no difference in median survival decreases between the PA and L1 scenarios, although in some years the L1 scenario had a lower survival decrease than the PA scenario.
    For the third analysis, we proposed a quantitative approach for developing NDD rule curves (that is, prescribed diversion flows for given inflows) by using the survival model to identify diversion rates that meet a criterion of having a small probability of exceeding a given decrease in survival. We examined diversion rates that led to a 10% chance of exceeding a given decrease in survival for a range of absolute and relative decreases in survival. To maintain a given constant level of protection across the range of river flows, our analysis indicated that diversions had to increase at a much slower rate with respect to Sacramento River flow relative to the rule curves defined in the NDD bypass table. Additionally, we determined that diversion rates could be higher than under the bypass table rule curves at river flows less than 20,000 ft³/s, but diversions had to be less than defined by NDD bypass rules at higher flows. For the fourth analysis, we simulated the effect of “real-time operations” on salmon survival, where bypass flow rates were determined by the presence of juvenile salmon entering the Delta, as indicated by juvenile salmon catch in a rotary screw trap upstream of the Delta. For this analysis, we evaluated NDD operations as defined by the L1 scenario and an additional scenario (Unlimited Pulse Protection [UPP]) that provided protection to an unlimited number of fish pulses. This analysis indicated that the highest catches occurred during flow pulses when daily survival was high, which caused annual survival to be weighted towards periods of high daily survival, resulting in high annual survival. We determined that mean annual survival decreased by 1–4 percentage points, and annual survival decreases were more frequently smaller for the UPP scenario. Additionally, because the UPP scenario protected an unlimited number of fish pulses, decreases in daily survival under the UPP scenario were less than under the L1 scenario.
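    The rule-curve criterion from the third analysis (find the largest diversion whose probability of exceeding a given survival decrease stays at or below 10%) can be sketched with a toy Monte Carlo search. The survival-decrease function below is a made-up stand-in for the report's fitted Bayesian survival model, for illustration only:

```python
import random

random.seed(1)

def survival_decrease(diversion_cfs, flow_cfs):
    """Toy model: survival decrease grows with the diverted share of flow,
    plus year-to-year noise. Hypothetical, not the report's model."""
    share = diversion_cfs / flow_cfs
    return max(0.0, 0.5 * share + random.gauss(0.0, 0.01))

def max_allowable_diversion(flow_cfs, max_decrease=0.05, risk=0.10, n=2000):
    """Largest diversion (ft³/s) with P(decrease > max_decrease) <= risk."""
    allowable = 0
    for diversion in range(0, 9001, 100):
        exceed = sum(survival_decrease(diversion, flow_cfs) > max_decrease
                     for _ in range(n)) / n
        if exceed <= risk:
            allowable = diversion
    return allowable

d = max_allowable_diversion(25000)
print(d)  # well below the 9,000 ft³/s physical maximum at this flow
```

    Under this toy model the allowable diversion rises with river flow, which is the qualitative shape of a rule curve.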

  3. Future Scenarios of Livestock and Land Use in Brazil

    NASA Astrophysics Data System (ADS)

    Costa, M. H.; Abrahão, G. M.

    2016-12-01

    Brazil currently has about 213 M cattle heads on 151 M ha of pastures. In the last 40 years, both the top 5% and the average stocking rates have been increasing exponentially in Brazil, while the relative yield gap has remained constant. Using these historical relationships, we estimate future scenarios of livestock and land use in Brazil. We assume a reference scenario for the top 5%, in which pasturelands are adequately fertilized, the soil is not compacted and is well drained, grasses are never burned, pastures are divided into 8 subdivisions of equal area, and cattle are rotated through the subdivisions. The reference scenario does not consider irrigation or feed supplementation. We calibrate a computer model and run it for the pasturelands throughout the entire country. We conclude that current pastures have about 20% efficiency for raising cattle compared to the reference scenario. Considering the reference scenario, we predict an equilibrium will be reached in about 100 years, with the top 5% at about 9.3 heads per ha and the average at 4.3 heads per ha, or 600 M heads of livestock. Considering a more pessimistic scenario, which assumes an inflection of the curve at present times, we predict an equilibrium will be reached in about 60 years, with the top 5% stocking rate equal to 4.3 heads per ha and the average equal to 2.2 heads per ha, or 300 M heads of livestock. Both cases represent a considerable expansion of the livestock herd, possibly even exceeding the growth of global demand for beef. These scenarios indicate that not all existing pasturelands need to be used in the future - a significant part of them may be converted to croplands, which would also contribute to the reduction of deforestation.

  4. Comment on "Calculations for the one-dimensional soft Coulomb problem and the hard Coulomb limit".

    PubMed

    Carrillo-Bernal, M A; Núñez-Yépez, H N; Salas-Brito, A L; Solis, Didier A

    2015-02-01

    In the paper under comment, the authors use a numerical method for solving ordinary differential equations and a softened Coulomb potential −1/√(x²+β²) to study the one-dimensional Coulomb problem by letting the parameter β approach zero. We note that even though their numerical findings in the soft potential scenario are correct, their conclusions do not extend to the one-dimensional Coulomb problem (β=0). Their claims regarding the possible existence of an even ground state with energy −∞ with a Dirac-δ eigenfunction and of well-defined parity eigenfunctions in the one-dimensional hydrogen atom are questioned.

  5. A study of the application of differential techniques to the global positioning system for a helicopter precision approach

    NASA Technical Reports Server (NTRS)

    Mccall, D. L.

    1984-01-01

    The results of a simulation study to define the functional characteristics of an airborne and a ground reference GPS receiver for use in a Differential GPS system are documented. The operations of a variety of receiver types (sequential single-channel, continuous multi-channel, etc.) are evaluated for a typical civil helicopter mission scenario. The math model of each receiver type incorporated representative system errors, including intentional degradation. The results include a discussion of the receivers' relative performance, the spatial correlative properties of individual range error sources, and the navigation algorithm used to smooth the position data.

  6. Using reference values to define disease based on the lower limit of normal biased the population attributable fraction, but not the population excess risk: the example of chronic airflow obstruction.

    PubMed

    Burney, Peter; Minelli, Cosetta

    2018-01-01

    The impact of disease on population health is most commonly estimated by the population attributable fraction (PAF), or less commonly by the excess risk, an alternative measure that estimates the absolute risk of disease in the population that can be ascribed to the exposure. Using chronic airflow obstruction as an example, we examined the impact on these estimates of defining disease based on different "normal" values. We estimated PAF and the excess risk in scenarios in which the true rate of disease was 10% in the exposed and 5% in the unexposed, and where either 50% or 20% of the population was exposed. Disease definition was based on a "lower limit of normal", using the 5th, 1st and 0.2nd centile of values in a "normal" population as thresholds to define normality. Where normality is defined by centiles of values in a "normal" population, PAF is strongly influenced by which centile is selected to define normality. This is not true for the population excess risk. Care should be taken when interpreting estimates of PAF when disease is defined from a centile of a normal population. Copyright © 2017 Elsevier Inc. All rights reserved.
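    The two measures compared in the abstract differ in that one is a ratio of rates and the other an absolute difference, which is why shifting the disease-defining threshold can bias one and not the other. Using the abstract's stated scenario (10% risk in the exposed, 5% in the unexposed), a minimal arithmetic sketch:

```python
def population_measures(risk_exposed, risk_unexposed, exposed_fraction):
    """Population attributable fraction (relative) and population excess risk (absolute)."""
    risk_pop = exposed_fraction * risk_exposed + (1 - exposed_fraction) * risk_unexposed
    paf = (risk_pop - risk_unexposed) / risk_pop
    excess_risk = risk_pop - risk_unexposed
    return paf, excess_risk

for frac in (0.5, 0.2):  # the two exposure prevalences from the abstract
    paf, excess = population_measures(0.10, 0.05, frac)
    print(f"exposed fraction {frac}: PAF = {paf:.3f}, excess risk = {excess:.3f}")
# 50% exposed: PAF = 0.333, excess risk = 0.025
# 20% exposed: PAF = 0.167, excess risk = 0.010
```

    The threshold-dependence itself (how the chosen centile moves both risks) is not modeled here; the sketch only shows the two definitions side by side.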

  7. Economic evaluation of genomic selection in small ruminants: a sheep meat breeding program.

    PubMed

    Shumbusho, F; Raoul, J; Astruc, J M; Palhiere, I; Lemarié, S; Fugeray-Scarbel, A; Elsen, J M

    2016-06-01

    Recent genomic evaluation studies using real data and predicting genetic gain by modeling breeding programs have reported moderate expected benefits from the replacement of classic selection schemes by genomic selection (GS) in small ruminants. The objectives of this study were to compare the cost, monetary genetic gain and economic efficiency of classic selection and GS schemes in the meat sheep industry. Deterministic methods were used to model selection based on multi-trait indices from a sheep meat breeding program. Decisional variables related to male selection candidates and progeny testing were optimized to maximize the annual monetary genetic gain (AMGG), that is, a weighted sum of meat and maternal traits' annual genetic gains. For GS, a reference population of 2000 individuals was assumed and genomic information was available for evaluation of male candidates only. In the classic selection scheme, males' breeding values were estimated from their own and offspring phenotypes. In GS, different scenarios were considered, differing by the information used to select males (genomic only, genomic + own performance, genomic + offspring phenotypes). The results showed that all GS scenarios were associated with higher total variable costs than classic selection (if the cost of genotyping was 123 euros/animal). In terms of AMGG and economic returns, GS scenarios were found to be superior to classic selection only if genomic information was combined with the males' own meat phenotypes (GS-Pheno) or with their progeny test information. The predicted economic efficiency, defined as returns (proportional to the number of expressions of AMGG in the nucleus and commercial flocks) minus total variable costs, showed that the best GS scenario (GS-Pheno) was up to 15% more efficient than classic selection. For all selection scenarios, optimization increased the overall AMGG, returns and economic efficiency.
As a conclusion, our study shows that some forms of GS strategies are more advantageous than classic selection, provided that GS is already initiated (i.e. the initial reference population is available). Optimizing decisional variables of the classic selection scheme could be of greater benefit than including genomic information in optimized designs.

  8. Defining Scenarios: Linking Integrated Models, Regional Concerns, and Stakeholders

    NASA Astrophysics Data System (ADS)

    Hartmann, H. C.; Stewart, S.; Liu, Y.; Mahmoud, M.

    2007-05-01

    Scenarios are important tools for long-term planning, and there is great interest in using integrated models in scenario studies. However, scenario definition and assessment are creative, as well as scientific, efforts. Using facilitated creative processes, we have worked with stakeholders to define regionally significant scenarios that encompass a broad range of hydroclimatic, socioeconomic, and institutional dimensions. The regional scenarios subsequently inform the definition of local scenarios that work with context-specific integrated models that, individually, can address only a subset of overall regional complexity. Based on concerns of stakeholders in the semi-arid US Southwest, we prioritized three dimensions that are especially important, yet highly uncertain, for long-term planning: hydroclimatic conditions (increased variability, persistent drought), development patterns (urban consolidation, distributed rural development), and the nature of public institutions (stressed, proactive). Linking across real-world decision contexts and integrated modeling efforts poses challenges of creatively connecting the conceptual models held by both the research and stakeholder communities.

  9. Untangling Consequential Futures: Discovering Self-Consistent Regional and Global Multi-Sector Change Scenarios

    NASA Astrophysics Data System (ADS)

    Lamontagne, J. R.; Reed, P. M.

    2017-12-01

    Impacts and adaptations to global change largely occur at regional scales, yet they are shaped globally through the interdependent evolution of the climate, energy, agriculture, and industrial systems. It is important for regional actors to account for the impacts of global changes on their systems in a globally consistent but regionally relevant way. This can be challenging because emerging global reference scenarios may not reflect regional challenges. Likewise, regionally specific scenarios may miss important global feedbacks. In this work, we contribute a scenario discovery framework to identify regionally specific, decision-relevant scenarios from an ensemble of scenarios of global change. To this end, we generated a large ensemble of time-evolving regional, multi-sector global change scenarios by a full factorial sampling of the underlying assumptions in the emerging shared socio-economic pathways (SSPs), using the Global Change Assessment Model (GCAM). Statistical and visual analytics were then used to discover which SSP assumptions are particularly consequential for various regions, considering a broad range of time-evolving metrics that encompass multiple spatial scales and sectors. In an illustrative example, we identify the most important global change narratives to inform water resource scenarios for several geographic regions using the proposed scenario discovery framework. Our results highlight the importance of demographic and agricultural evolution compared to technical improvements in the energy sector. We show that narrowly sampling a few canonical reference scenarios provides a very narrow view of the consequence space, increasing the risk of tacitly ignoring major impacts. Even optimistic scenarios contain unintended, disproportionate regional impacts and intergenerational transfers of consequence.
    Formulating consequential scenarios of deeply and broadly uncertain futures requires a better exploration of which quantitative measures of consequence are important, for whom they are important, where, and when. To this end, we have contributed a large database of climate change futures that can support 'backwards' scenario generation techniques that capture a broader array of consequences than those that emerge from limited sampling of a few reference scenarios.

  10. Designing a Methodology for Future Air Travel Scenarios

    NASA Technical Reports Server (NTRS)

    Wuebbles, Donald J.; Baughcum, Steven L.; Gerstle, John H.; Edmonds, Jae; Kinnison, Douglas E.; Krull, Nick; Metwally, Munir; Mortlock, Alan; Prather, Michael J.

    1992-01-01

    The growing demand for air travel throughout the world has prompted several proposals for the development of commercial aircraft capable of transporting a large number of passengers at supersonic speeds. Emissions from a projected fleet of such aircraft, referred to as high-speed civil transports (HSCTs), are being studied because of their possible effects on the chemistry and physics of the global atmosphere, in particular on stratospheric ozone. At the same time, there is growing concern about the effects on ozone of emissions from current (primarily subsonic) aircraft. Evaluating the potential atmospheric impact of aircraft emissions from HSCTs requires a scientifically sound understanding of where the aircraft fly and under what conditions the aircraft effluents are injected into the atmosphere. A preliminary set of emissions scenarios is presented. These scenarios will be used to understand the sensitivity of environmental effects to a range of fleet operations, flight conditions, and aircraft specifications. The baseline specifications for the scenarios are provided: the criteria to be used for developing the scenarios are defined, the required database for initiating the development of the scenarios is established, and the state of the art for those scenarios that have already been developed is discussed. An important aspect of the assessment will be the evaluation of realistic projections of emissions as a function of both geographical distribution and altitude from an economically viable commercial HSCT fleet. With an assumed introduction date of around the year 2005, it is anticipated that there will be no HSCT aircraft in the global fleet at that time. However, projections show that, by 2015, the HSCT fleet could reach significant size.
    We assume these projections of HSCT and subsonic fleets for about 2015 can then be used as input to global atmospheric chemistry models to evaluate the impact of the HSCT fleets, relative to an all-subsonic future fleet. The methodology, procedures, and recommendations for the development of the future HSCT and subsonic fleet scenarios used for this evaluation are discussed.

  11. ISECG Mission Scenarios and Their Role in Informing Next Steps for Human Exploration Beyond Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Culbert, Christopher J.; Mongrard, Olivier; Satoh, Naoki; Goodliff, Kandyce; Seaman, Calvin H.; Troutman, Patrick; Martin, Eric

    2011-01-01

    The International Space Exploration Coordination Group (ISECG) was established in response to The Global Exploration Strategy (GES): The Framework for Coordination developed by fourteen space agencies* and released in May 2007. This GES Framework Document recognizes that preparing for human space exploration is a stepwise process, starting with basic knowledge and culminating in a sustained human presence in deep space. ISECG has developed several optional global exploration mission scenarios enabling the phased transition from human operations in Low Earth Orbit (LEO) and utilization of the International Space Station (ISS) to human missions beyond LEO leading ultimately to human missions to cis-lunar space, the Moon, Near Earth Asteroids, Mars and its environs. Mission scenarios provide the opportunity for judging various exploration approaches in a manner consistent with agreed international goals and strategies. Each ISECG notional mission scenario reflects a series of coordinated human and robotic exploration missions over a 25-year horizon. Mission scenarios are intended to provide insights into next steps for agency investments, following on the success of the ISS. They also provide a framework for advancing the definition of Design Reference Missions (DRMs) and the concepts for capabilities contained within. Each of the human missions contained in the scenarios has been characterized by a DRM which is a top level definition of mission sequence and the capabilities needed to execute that mission. While DRMs are generally destination focused, they will comprise capabilities which are reused or evolved from capabilities used at other destinations. In this way, an evolutionary approach to developing a robust set of capabilities to sustainably explore our solar system is defined. 
Agencies also recognize that jointly planning for our next steps, building on the accomplishments of ISS, is important to ensuring the robustness and sustainability of any human exploration plan. Developing a shared long-term vision is important, but agencies recognize this is an evolutionary process and requires consideration of many strategic factors. Strategic factors such as the implications of an emerging commercial space industry in LEO, the opportunity provided by extending ISS lifetime to at least 2020, and the importance of defining a plan which is sustainable in light of inevitable domestic policy shifts are timely for agency consideration.

  12. New trends in transportation and land use scenario planning : five case studies of regional and local scenario planning efforts

    DOT National Transportation Integrated Search

    2010-04-01

    This report summarizes important findings from a literature review on scenario planning processes and a scan of stakeholders. It also presents case studies on innovative, next generation scenario planning efforts. The project team defined next ...

  13. Decommissioning of offshore oil and gas facilities: a comparative assessment of different scenarios.

    PubMed

    Ekins, Paul; Vanner, Robin; Firebrace, James

    2006-06-01

    A material and energy flow analysis, with corresponding financial flows, was carried out for different decommissioning scenarios for the different elements of an offshore oil and gas structure. A comparative assessment was made of the non-financial (especially environmental) outcomes of the different scenarios, with the reference scenario being to leave all structures in situ, while other scenarios envisaged leaving them on the seabed or removing them to shore for recycling and disposal. The costs of each scenario, when compared with the reference scenario, give an implicit valuation of the non-financial outcomes (e.g. environmental improvements), should that scenario be adopted by society. The paper concludes that it is not clear that the removal of the topsides and jackets of large steel structures to shore, as currently required by regulations, is environmentally justified; that concrete structures should certainly be left in place; and that leaving footings, cuttings and pipelines in place, with subsequent monitoring, would also be justified unless very large values were placed by society on a clear seabed and trawling access.

  14. A reliable simultaneous representation of seismic hazard and of ground shaking recurrence

    NASA Astrophysics Data System (ADS)

    Peresan, A.; Panza, G. F.; Magrin, A.; Vaccari, F.

    2015-12-01

    Different earthquake hazard maps may be appropriate for different purposes - such as emergency management, insurance and engineering design. Accounting for the lower occurrence rate of larger sporadic earthquakes may make it possible to formulate cost-effective policies in some specific applications, provided that statistically sound recurrence estimates are used, which is not typically the case for PSHA (Probabilistic Seismic Hazard Assessment). We illustrate the procedure to associate the expected ground motions from Neo-deterministic Seismic Hazard Assessment (NDSHA) with an estimate of their recurrence. Neo-deterministic refers to a scenario-based approach, which allows for the construction of a broad range of earthquake scenarios via full waveform modeling. From the synthetic seismograms, the estimates of peak ground acceleration, velocity and displacement, or any other parameter relevant to seismic engineering, can be extracted. NDSHA, in its standard form, defines the hazard computed from a wide set of scenario earthquakes (including the largest deterministically or historically defined credible earthquake, MCE) and it does not supply the frequency of occurrence of the expected ground shaking. A recent enhanced variant of NDSHA that reliably accounts for recurrence has been developed and is applied here to the Italian territory. The characterization of the frequency-magnitude relation can be performed by any statistically sound method supported by data (e.g. a multi-scale seismicity model), so that a recurrence estimate is associated with each of the pertinent sources. In this way a standard NDSHA map of ground shaking is obtained simultaneously with the map of the corresponding recurrences. The introduction of recurrence estimates in NDSHA naturally allows for the generation of ground shaking maps at specified return periods. This permits a straightforward comparison between NDSHA and PSHA maps.

  15. Human factors requirements for telerobotic command and control: The European Space Agency experimental programme

    NASA Technical Reports Server (NTRS)

    Stone, Robert J.

    1991-01-01

    Space Telerobotics research, performed under contract to the European Space Agency (ESA), concerning the execution of human factors experiments, and ultimately leading to the development of a telerobotics test bed, has been carried out since 1985 by a British Consortium consisting of British Aerospace, the United Kingdom Atomic Energy Authority and, more recently, the UK National Advanced Robotics Research Centre. The principal aim of the first study of the series was to derive preliminary requirements for a teleoperation servicing system, with reference to two mission model scenarios. The first scenario introduced the problem of communications time delays, and their likely effect on the ground-based operator in control of a manipulator system on board an unmanned servicing vehicle in Low Earth Orbit. In the second scenario, the operator was located on the NASA Orbiter aft flight deck, supervising the control of a prototype manipulator in the 'servicing' of an experimental payload in the cargo bay area. Human factors analyses centered on defining the requirements for the teleoperator workstation, such as identifying basic ergonomic requirements for workstation and panel layouts, defining teleoperation strategies, developing alphanumeric and graphic screen formats for the supervision or direct control of the manipulator, and the potential applications of expert system technology. The second study for ESA involved an experimental appraisal of some of the important issues highlighted in the first study, for which relevant human factors data did not exist. Of central importance during the second study was the issue of communications time delays and their effect on the manual control of a teleoperated manipulator from a ground-based command and control station.

  16. An Economic Aspect of the AVOID Programme: Analysis Using the AIM/CGE Model

    NASA Astrophysics Data System (ADS)

    Matsumoto, Ken'ichi; Masui, Toshihiko

    2010-05-01

    This presentation aims to show the results of the analysis that the AIM/CGE [Global] model contributed to Work Stream 1 of the AVOID programme. Three economic models participate in this work stream to analyze the economic aspects of defined climate policies, and the AIM/CGE [Global] model is one of them. The reference scenario is SRES A1B, and five policy scenarios (2016.R2.H, 2016.R4.L, 2016.R5.L, 2030.R2.H, and 2030.R5.L) are considered. The climate policies are expressed as emissions pathways of several gases such as greenhouse gases and aerosols. The AIM/CGE [Global] model is a recursive dynamic global CGE model with 21 industrial sectors and 24 world regions. These definitions are based on the GTAP6 database, which is also used as the economic data for the base year. Some important characteristics of this model can be summarized as follows: power generation by various sources (from non-renewables to renewables) is considered; CCS technology is modeled; biomass energy (both traditional and purpose-grown) production and consumption are included; not only CO2 emissions but also other gases are considered; international markets are modeled for international trade of some fossil fuels; and relationships between the costs and resource reserves of fossil fuels are modeled. The model is run with 10-year time steps until 2100. For the reference case, there are no constraints and the model is run based on the drivers (assumptions on GDP and population for A1B) and AEEI. The reference case does not have the same emissions pathways as the prescribed emissions for A1B in AVOID. For scenario cases, the model is run under emissions constraints; in particular, for each policy scenario, the constraint on each gas in each 10-year step is derived.
The percentage reduction in emissions between the AVOID A1B scenario and each policy scenario, for each gas in each 10-year period, is first calculated; these percentage reductions are then applied to the AIM reference case to derive the constraints for each gas over the 21st century. The main results provided to AVOID were carbon prices and GDP for each scenario case. Regarding carbon prices, the results show that the higher the emissions reduction rate and the earlier the peak, the higher the carbon prices, and the prices tend to rise over time (536/tCO2 in 2100 for 2016.R5.L). These trends are quite different from those of the E3MG model, which assumes a constant carbon tax for each scenario (232/tCO2 in 2100 for 2016.R5.L). In addition, higher carbon prices are necessary in the AIM/CGE model than in the E3MG model, especially in the latter half of the century. Regarding GDP trends, the results indicate that negative GDP changes occur for all scenario cases, with higher GDP damage observed as the reduction rate becomes higher and the peak comes earlier (-7.04% in 2100 for 2016.R5.L). These trends differ markedly from those of the E3MG model, which shows positive GDP effects (+4.89% in 2100 for 2016.R5.L). The differences between the two models' results are caused by (1) technological change assumptions, (2) revenue recycling methodology, (3) timing of emissions cuts, and (4) modeling approaches. We expect to have a more detailed discussion at the session.
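
The constraint-derivation step described above can be sketched as follows. This is a minimal illustration only: all emissions values below are hypothetical placeholders, not figures from the AVOID or AIM model runs, and the function name is invented for the example.

```python
# Sketch of the constraint derivation described in the abstract:
# compute the percentage reduction between the AVOID A1B pathway and a
# policy pathway per period, then apply it to the AIM reference case.
# All values are illustrative placeholders (GtC), NOT model output.
avoid_a1b = {2050: 12.0, 2100: 14.0}      # hypothetical AVOID A1B pathway
avoid_policy = {2050: 6.0, 2100: 2.8}     # hypothetical policy pathway
aim_reference = {2050: 13.0, 2100: 15.5}  # hypothetical AIM reference case

def derive_constraints(avoid_ref, avoid_pol, aim_ref):
    """Apply the AVOID percentage reductions to the AIM reference case."""
    constraints = {}
    for year in avoid_ref:
        reduction = 1.0 - avoid_pol[year] / avoid_ref[year]
        constraints[year] = aim_ref[year] * (1.0 - reduction)
    return constraints

constraints = derive_constraints(avoid_a1b, avoid_policy, aim_reference)
# e.g. the 2050 constraint is 13.0 * (6.0 / 12.0) = 6.5
```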

  17. Integrated modeling of agricultural scenarios (IMAS) to support pesticide action plans: the case of the Coulonge drinking water catchment area (SW France).

    PubMed

    Vernier, Françoise; Leccia-Phelpin, Odile; Lescot, Jean-Marie; Minette, Sébastien; Miralles, André; Barberis, Delphine; Scordia, Charlotte; Kuentz-Simonet, Vanessa; Tonneau, Jean-Philippe

    2017-03-01

    Non-point source pollution is a cause of major concern within the European Union. This is reflected in increasing public and political focus on a more sustainable use of pesticides, as well as a reduction in diffuse pollution. Climate change is likely to lead to an even more intensive use of pesticides in the future, affecting agriculture in many ways. At the same time, the Water Framework Directive (WFD) and associated EU policies called for "good" ecological and chemical status to be achieved for water bodies by the end of 2015, currently delayed to 2021-2027 due to a lack of efficiency in policies and the timescale of resilience for hydrosystems, especially groundwater systems. Water managers need appropriate and user-friendly tools to design agro-environmental policies. These tools should help them evaluate the potential impacts of mitigation measures on water resources, more clearly define protected areas, and more efficiently distribute financial incentives to farmers who agree to implement alternative practices. At present, a number of reports point out that water managers do not use appropriate information from monitoring or models to make decisions and set environmental action plans. In this paper, we propose an integrated and collaborative approach to analyzing changes in land use, farming systems, and practices and to assessing their effects on agricultural pressure and pesticide transfers to waters. The integrated modeling of agricultural scenarios (IMAS) framework draws on a range of data and expert knowledge available within areas where a pesticide action plan can be defined to restore water quality (French "Grenelle law" catchment areas, French Water Development and Management Plan areas, etc.). A so-called "reference scenario" represents the actual soil occupation and pesticide-spraying practices used in both conventional and organic farming.
A number of alternative scenarios are then defined in cooperation with stakeholders, including socio-economic conditions for developing alternative agricultural systems or targeting mitigation measures. Our integrated assessment of these scenarios combines the calculation of spatialized environmental indicators with integrated bio-economic modeling. The latter is achieved by a combined use of Soil and Water Assessment Tool (SWAT) modeling with our own purpose-built land use generator module (Generator of Land Use version 2 (GenLU2)) and an economic model developed using General Algebraic Modeling System (GAMS) for cost-effectiveness assessment. This integrated approach is applied to two embedded catchment areas (total area of 360,000 ha) within the Charente river basin (SW France). Our results show that it is possible to differentiate scenarios based on their effectiveness, represented by either evolution of pressure (agro-environmental indicators) or transport into waters (pesticide concentrations). By analyzing the implementation costs borne by farmers, it is possible to identify the most cost-effective scenarios at sub-basin and other aggregated levels (WFD hydrological entities, sensitive areas). Relevant results and indicators are fed into a specifically designed database. Data warehousing is used to provide analyses and outputs at all thematic, temporal, or spatial aggregated levels, defined by the stakeholders (type of crops, herbicides, WFD areas, years), using Spatial On-Line Analytical Processing (SOLAP) tools. The aim of this approach is to allow public policy makers to make more informed and reasoned decisions when managing sensitive areas and/or implementing mitigation measures.

  18. CDM analysis

    NASA Technical Reports Server (NTRS)

    Larson, Robert E.; Mcentire, Paul L.; Oreilly, John G.

    1993-01-01

    The C Data Manager (CDM) is an advanced tool for creating an object-oriented database and for processing queries related to objects stored in that database. The CDM source code was purchased and will be modified over the course of the Arachnid project. In this report, the modified CDM is referred to as MCDM. Using MCDM, a detailed series of experiments was designed and conducted on a Sun Sparcstation. The primary results and analysis of the CDM experiment are provided in this report. The experiments involved creating the Long-form Faint Source Catalog (LFSC) database and then analyzing it with respect to the following: (1) the relationships between the volume of data and the time required to create a database; (2) the storage requirements of the database files; and (3) the properties of query algorithms. The effort focused on defining, implementing, and analyzing seven experimental scenarios: (1) find all sources by right ascension--RA; (2) find all sources by declination--DEC; (3) find all sources in the right ascension interval--RA1, RA2; (4) find all sources in the declination interval--DEC1, DEC2; (5) find all sources in the rectangle defined by--RA1, RA2, DEC1, DEC2; (6) find all sources that meet certain compound conditions; and (7) analyze a variety of query algorithms. Throughout this document, the numerical results obtained from these scenarios are reported; conclusions are presented at the end of the document.
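
Query scenario (5) above, finding all sources inside a right-ascension/declination rectangle, can be sketched as a simple predicate filter. The catalog records and names below are invented for illustration and are not drawn from the actual LFSC database.

```python
# Sketch of query scenario (5): find all sources inside an RA/DEC rectangle.
# The catalog entries are hypothetical, NOT real LFSC records.
sources = [
    {"name": "S1", "ra": 10.2, "dec": -5.0},
    {"name": "S2", "ra": 45.7, "dec": 12.3},
    {"name": "S3", "ra": 44.1, "dec": 11.0},
]

def in_rectangle(src, ra1, ra2, dec1, dec2):
    """True if the source lies in the rectangle [RA1, RA2] x [DEC1, DEC2]."""
    return ra1 <= src["ra"] <= ra2 and dec1 <= src["dec"] <= dec2

hits = [s["name"] for s in sources if in_rectangle(s, 40.0, 50.0, 10.0, 15.0)]
print(hits)  # → ['S2', 'S3']
```

A real implementation would of course index RA and DEC rather than scan linearly, which is exactly the kind of trade-off the report's query-algorithm scenario (7) examines.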

  19. Scope of practice: freedom within limits.

    PubMed

    Schuiling, K D; Slager, J

    2000-01-01

    "Scope of practice" has a variety of meanings amongst midwives, other health professionals, health organizations, and consumers of midwifery care. For some, it refers to the Standards for the Practice of Midwifery; for others, it encompasses the legal base of practice; still others equate it with the components of the clinical parameters of practice. Because "scope of practice" is dynamic and parameters of practice can be impacted by many variables, succinctly defining "scope of practice" is difficult. This article provides a comprehensive discussion of the concept "scope of practice." Clinical scenarios are provided as case exemplars. The aim of this paper is to provide both new and experienced midwives with a substantive definition of the concept "scope of practice."

  20. Pre-crash scenario typology for crash avoidance research

    DOT National Transportation Integrated Search

    2007-04-01

    This report defines a new pre-crash scenario typology for crash avoidance research based on the 2004 General Estimates System (GES) crash database, which consists of pre-crash scenarios depicting vehicle movements and dynamics as well as the critical...

  1. Complexity associated with the optimisation of capability options in military operations

    NASA Astrophysics Data System (ADS)

    Pincombe, A.; Bender, A.; Allen, G.

    2005-12-01

    In the context of a military operation, even if the intended actions, the geographic location, and the capabilities of the opposition are known, there are still some critical uncertainties that could have a major impact on the effectiveness of a given set of capabilities. These uncertainties include unpredictable events and the response alternatives that are available to the command and control elements of the capability set. They greatly complicate any a priori mathematical description. In a forecasting approach, the most likely future might be chosen and a solution sought that is optimal for that case. With scenario analysis, futures are proposed on the basis of critical uncertainties and the option that is most robust is chosen. We use scenario analysis but our approach is different in that we focus on the complexity and use the coupling between scenarios and options to create information on ideal options. The approach makes use of both soft and hard operations research methods, with subject matter expertise being used to define plausible responses to scenarios. In each scenario, uncertainty affects only a subset of the system-inherent variables and the variables that describe system-environment interactions. It is this scenario-specific reduction of variables that makes the problem mathematically tractable. The process we define is significantly different to existing scenario analysis processes, so we have named it adversarial scenario analysis. It can be used in conjunction with other methods, including recent improvements to the scenario analysis process. To illustrate the approach, we undertake a tactical level scenario analysis for a logistics problem that is defined by a network, expected throughputs to end users, the transport capacity available, the infrastructure at the nodes, and the capacities of roads, stocks, etc. The throughput capacity, i.e. the effectiveness, of the system relies on all of these variables and on the couplings between them. 
The system is initially in equilibrium for a given level of demand. However, different, and simpler, solutions emerge as the balance of couplings and the importance of variables change. The scenarios describe such changes in conditions. For each scenario it was possible to define measures that describe the differences between options. As with agent-based distillations, the solution is essentially qualitative and exploratory, bringing awareness of possible future difficulties and of the capabilities that are necessary if we are to deal successfully with those difficulties.

  2. ARAMIS project: a comprehensive methodology for the identification of reference accident scenarios in process industries.

    PubMed

    Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno

    2006-03-31

    In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the construction of a methodology for the identification of major accident hazards (MIMAH), which is carried out through the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called the methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called a "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage facility.

  3. AERIS - applications for the environment : real-time information synthesis : eco-lanes operational scenario modeling report.

    DOT National Transportation Integrated Search

    2014-12-01

    This report constitutes the detailed modeling and evaluation results of the Eco-Lanes Operational Scenario defined by the Applications for the Environment: Real-Time Information Synthesis (AERIS) Program. The Operational Scenario constitutes six appl...

  4. A new item response theory model to adjust data allowing examinee choice

    PubMed Central

    Costa, Marcelo Azevedo; Braga Oliveira, Rivert Paulo

    2018-01-01

    In a typical questionnaire testing situation, examinees are not allowed to choose which items they answer because of a technical issue in obtaining satisfactory statistical estimates of examinee ability and item difficulty. This paper introduces a new item response theory (IRT) model that incorporates information from a novel representation of questionnaire data using network analysis. Three scenarios in which examinees select a subset of items were simulated. In the first scenario, the assumptions required to apply the standard Rasch model are met, thus establishing a reference for parameter accuracy. The second and third scenarios include five increasing levels of violating those assumptions. The results show substantial improvements over the standard model in item parameter recovery. Furthermore, the accuracy was closer to the reference in almost every evaluated scenario. To the best of our knowledge, this is the first proposal to obtain satisfactory IRT statistical estimates in the last two scenarios. PMID:29389996

  5. Scenarios for Dutch Teacher Education. A Trip to Rome: Coach Bus Company or Travel Agency?

    ERIC Educational Resources Information Center

    Snoek, Marco

    2003-01-01

    Stimulated by severe teacher shortages, teacher education in the Netherlands is changing toward competence-based, work- based-, and market-oriented programs. A Dutch scenario project was developed in which four scenarios for the future of teacher education emerged. These scenarios were structured by the freedom of schools to define the type of…

  6. Assessing the Formation of Experience-Based Gender Expectations in an Implicit Learning Scenario

    PubMed Central

    Öttl, Anton; Behne, Dawn M.

    2017-01-01

    The present study investigates the formation of new word-referent associations in an implicit learning scenario, using a gender-coded artificial language with spoken words and visual referents. Previous research has shown that when participants are explicitly instructed about the gender-coding system underlying an artificial lexicon, they monitor the frequency of exposure to male vs. female referents within this lexicon, and subsequently use this probabilistic information to predict the gender of an upcoming referent. In an explicit learning scenario, the auditory and visual gender cues are necessarily highlighted prior to acquisition, and the effects previously observed may therefore depend on participants' overt awareness of these cues. To assess whether the formation of experience-based expectations is dependent on explicit awareness of the underlying coding system, we present data from an experiment in which gender-coding was acquired implicitly, thereby reducing the likelihood that visual and auditory gender cues are used strategically during acquisition. Results show that even if the gender coding system was not perfectly mastered (as reflected in the number of gender coding errors), participants develop frequency-based expectations comparable to those previously observed in an explicit learning scenario. In line with previous findings, participants are quicker at recognizing a referent whose gender is consistent with an induced expectation than one whose gender is inconsistent with an induced expectation. At the same time, however, eyetracking data suggest that these expectations may surface earlier in an implicit learning scenario. These findings suggest that experience-based expectations are robust against manner of acquisition, and contribute to understanding why similar expectations observed in the activation of stereotypes during the processing of natural language stimuli are difficult or impossible to suppress. PMID:28936186

  7. Data Assimilation Techniques for Ionospheric Reference Scenarios - project overview and achieved outcomes

    NASA Astrophysics Data System (ADS)

    Gerzen, Tatjana; Wilken, Volker; Hoque, Mainul; Minkwitz, David; Schlueter, Stefan

    2016-04-01

    The ionosphere is the upper part of the Earth's atmosphere, where sufficient free electrons exist to affect the propagation of radio waves. Therefore, the treatment of the ionosphere is a critical issue for many applications dealing with trans-ionospheric signals such as GNSS positioning, GNSS related augmentation systems (e.g. EGNOS and WAAS) and remote sensing. The European Geostationary Navigation Overlay Service (EGNOS) is the European Satellite Based Augmentation Service (SBAS) that provides value added services, in particular to safety critical GNSS applications, e.g. aviation and maritime traffic. In the frame of the European GNSS Evolution Programme (EGEP), ESA has launched several activities, supporting the design, development and qualification of the operational EGNOS infrastructure and associated services. Ionospheric Reference Scenarios (IRSs) are used by ESA in order to conduct the EGNOS performance simulations and to assure the capability for maintaining accuracy, integrity and availability of the EGNOS system, especially during ionospheric storm conditions. The project Data Assimilation Techniques for Ionospheric Reference Scenarios (DAIS) aims to provide improved EGNOS IRSs. The main tasks are the calculation and validation of time series of IRSs by a 3D assimilation approach that combines space borne and ground based GNSS observations as well as ionosonde measurements with an ionospheric background model. The special focus is to demonstrate that space-based measurements can significantly contribute to fill data gaps in GNSS ground networks (particularly in Africa and over the oceans) when generating the IRSs. In this project we selected test periods of perturbed and nominal ionospheric conditions and filtered the collected data for outliers. We defined and developed an applicable technique for the 3D assimilation and applied this technique for the generation of IRSs covering the EGNOS V3 extended service area. 
Afterwards the generated 3D ionosphere reconstructions as well as the final IRSs are validated with independent GNSS slant TEC (Total Electron Content) data, vertical sounding observations and JASON 1 and 2 derived vertical TEC. This presentation gives an overview about the DAIS project and the achieved results. We outline the assimilation approach, show the reconstruction and the validation results and finally address open questions.

  8. Data Assimilation Techniques for Ionospheric Reference Scenarios - project overview and first results

    NASA Astrophysics Data System (ADS)

    Gerzen, Tatjana; Mainul Hoque, M.; Wilken, Volker; Minkwitz, David; Schlüter, Stefan

    2015-04-01

    The European Geostationary Navigation Overlay Service (EGNOS) is the European Satellite Based Augmentation Service (SBAS) that provides value added services, in particular to Safety of Life (SoL) users of the Global Navigation Satellite Systems (GNSS). In the frame of the European GNSS Evolution Programme (EGEP), ESA has launched several activities, which are aiming to support the design, development and qualification of the future operational EGNOS infrastructure and associated services. The ionosphere is the part of the Earth's upper atmosphere between about 50 km and 1000 km above the Earth's surface, which contains sufficient free electrons to cause strong impact on radio signal propagation. Therefore, treatment of the ionosphere is a critical issue to guarantee the EGNOS system performance. In order to conduct the EGNOS end-to-end performance simulations and to assure the capability for maintaining integrity of the EGNOS system especially during ionospheric storm conditions, Ionospheric Reference Scenarios (IRSs) are introduced by ESA. The project Data Assimilation Techniques for Ionospheric Reference Scenarios (DAIS) aims to generate improved EGNOS IRSs by combining space borne and ground based GNSS observations. The main focus of this project is to demonstrate that ionospheric radio occultation (IRO) measurements can significantly contribute to fill data gaps in GNSS ground networks (particularly in Africa and over the oceans) when generating the IRSs. The primary tasks are the calculation and validation of time series of IRSs (i.e. TEC maps) by a 3D assimilation approach that combines IRO and ground based GNSS measurements with an ionospheric background model in an optimal way. In the first phase of the project we selected appropriate test periods, one representing perturbed and the other nominal ionospheric conditions, and collected and filtered the corresponding data. 
We defined and developed an applicable technique for the 3D assimilation and applied this technique for the generation of IRSs covering the EGNOS V3 service area. This presentation gives an overview about the DAIS project and the first results. We outline the assimilation approach, show test run results and finally address and discuss open questions.

  9. A scenario elicitation methodology to identify the drivers of electricity infrastructure cost in South America

    NASA Astrophysics Data System (ADS)

    Moksnes, Nandi; Taliotis, Constantinos; Broad, Oliver; de Moura, Gustavo; Howells, Mark

    2017-04-01

    Developing a set of scenarios to assess a proposed policy or future development pathways requires a certain level of information, as well as establishing the socio-economic context. As the future is difficult to predict, great care in defining the selected scenarios is needed. Even so, it can be difficult to assess whether the selected scenarios cover the possible solution space. Instead, this paper's methodology develops a large set of scenarios (324) in OSeMOSYS using the SAMBA 2.0 (South America Model Base) model to assess long-term electricity supply scenarios, and applies a scenario-discovery statistical data mining algorithm, the Patient Rule Induction Method (PRIM). By creating a multidimensional space, regions related to high and low cost can be identified, as well as their key drivers. The six key drivers are defined a priori with three levels (high, medium, low) or two levels (high, low): 1) demand projected from GDP, population, urbanization and transport; 2) fossil fuel price; 3) climate change impact on hydropower; 4) renewable technology learning rate; 5) discount rate; 6) CO2 emission targets.
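
The 324-scenario space above can be reproduced by taking the Cartesian product of the driver levels: four three-level drivers and two two-level drivers give 3**4 * 2**2 = 324 combinations. Which drivers take three levels and which take two is an assumption in this sketch; the abstract only states the split, not the assignment.

```python
# Sketch of enumerating the scenario space described in the abstract.
# The assignment of three vs. two levels to specific drivers is assumed.
from itertools import product

drivers = {
    "demand": ["high", "medium", "low"],
    "fossil_fuel_price": ["high", "medium", "low"],
    "hydropower_climate_impact": ["high", "medium", "low"],
    "renewable_learning_rate": ["high", "medium", "low"],
    "discount_rate": ["high", "low"],
    "co2_target": ["high", "low"],
}

# One dict per scenario, mapping each driver to its level in that scenario.
scenarios = [dict(zip(drivers, combo)) for combo in product(*drivers.values())]
print(len(scenarios))  # → 324
```

PRIM would then search this space (augmented with each scenario's modelled cost) for boxes of driver levels that best separate the high-cost from the low-cost runs.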

  10. Assessing Health Impacts of Pictorial Health Warning Labels on Cigarette Packs in Korea Using DYNAMO-HIA

    PubMed Central

    2017-01-01

    Objectives This study aimed to predict the 10-year impacts of the introduction of pictorial warning labels (PWLs) on cigarette packaging in 2016 in Korea for adults using DYNAMO-HIA. Methods In total, four scenarios were constructed to better understand the potential health impacts of PWLs: two for PWLs and the other two for a hypothetical cigarette tax increase. In both policies, an optimistic and a conservative scenario were constructed. The reference scenario assumed the 2015 smoking rate would remain the same. Demographic data and epidemiological data were obtained from various sources. Differences in the predicted smoking prevalence and prevalence, incidence, and mortality from diseases were compared between the reference scenario and the four policy scenarios. Results It was predicted that the optimistic PWLs scenario (PWO) would lower the smoking rate by 4.79% in males and 0.66% in females compared to the reference scenario in 2017. However, the impact on the reduction of the smoking rate was expected to diminish over time. PWO will prevent 85 238 cases of diabetes, 67 948 of chronic obstructive pulmonary disease, 31 526 of ischemic heart disease, 21 036 of lung cancer, and 3972 prevalent cases of oral cancer in total over the 10-year span due to the reductions in smoking prevalence. The impacts of PWO are expected to be between the impact of the optimistic and the conservative cigarette tax increase scenarios. The results were sensitive to the transition probability of smoking status. Conclusions The introduction of PWLs in 2016 in Korea is expected to reduce smoking prevalence and disease cases for the next 10 years, but regular replacements of PWLs are needed for persistent impacts. PMID:28768403

  11. Assessing Health Impacts of Pictorial Health Warning Labels on Cigarette Packs in Korea Using DYNAMO-HIA.

    PubMed

    Kang, Eunjeong

    2017-07-01

    This study aimed to predict the 10-year impacts of the introduction of pictorial warning labels (PWLs) on cigarette packaging in 2016 in Korea for adults using DYNAMO-HIA. In total, four scenarios were constructed to better understand the potential health impacts of PWLs: two for PWLs and the other two for a hypothetical cigarette tax increase. In both policies, an optimistic and a conservative scenario were constructed. The reference scenario assumed the 2015 smoking rate would remain the same. Demographic data and epidemiological data were obtained from various sources. Differences in the predicted smoking prevalence and prevalence, incidence, and mortality from diseases were compared between the reference scenario and the four policy scenarios. It was predicted that the optimistic PWLs scenario (PWO) would lower the smoking rate by 4.79% in males and 0.66% in females compared to the reference scenario in 2017. However, the impact on the reduction of the smoking rate was expected to diminish over time. PWO will prevent 85 238 cases of diabetes, 67 948 of chronic obstructive pulmonary disease, 31 526 of ischemic heart disease, 21 036 of lung cancer, and 3972 prevalent cases of oral cancer in total over the 10-year span due to the reductions in smoking prevalence. The impacts of PWO are expected to be between the impact of the optimistic and the conservative cigarette tax increase scenarios. The results were sensitive to the transition probability of smoking status. The introduction of PWLs in 2016 in Korea is expected to reduce smoking prevalence and disease cases for the next 10 years, but regular replacements of PWLs are needed for persistent impacts.

  12. Projecting county-level populations under three future scenarios: a technical document supporting the Forest Service 2010 RPA Assessment

    Treesearch

    Stanley J. Zarnoch; H. Ken Cordell; Carter J. Betz

    2010-01-01

    County-level population projections from 2010 to 2060 are developed under three national population growth scenarios for reporting in the 2010 Renewable Resources Planning Act (RPA) Assessment. These population growth scenarios are tied to global futures scenarios defined by the Intergovernmental Panel on Climate Change (IPCC), a program within the United Nations...

  13. Modeling the impact of development and management options on future water resource use in the Nyangores sub-catchment of the Mara Basin in Kenya

    NASA Astrophysics Data System (ADS)

    Omonge, Paul; Herrnegger, Mathew; Fürst, Josef; Olang, Luke

    2016-04-01

    Despite the increasing water insecurity resulting from competing uses, the Nyangores sub-catchment of Kenya is yet to develop an inclusive water use and allocation plan for its water resource systems. As a step towards achieving this, this contribution employed the Water Evaluation and Planning (WEAP) system to evaluate selected policy-based water development and management options for future planning purposes. Major water resources of the region were mapped and quantified to establish the current demand versus supply status. To define a reference scenario for subsequent model projections, additional data on urban and rural water consumption, water demand for crop types, and daily water use for existing factories and industries were also collated through a rigorous fieldwork procedure. The model was calibrated using the parameter estimation tool (PEST) and validated against observed streamflow data, and subsequently used to simulate feasible management options. Due to the lack of up-to-date data for the current year, the year 2000 was selected as the base year for the scenario simulations up to the year 2030, which has been set by the country for realizing most flagship development projects. From the results obtained, the current annual water demand within the sub-catchment is estimated to be around 27.2 million m³, of which 24% is met through improved and protected water sources including springs, wells and boreholes, while 76% is met through informal and unprotected sources which are insufficient to cater for future increases in demand. Under the reference scenario, the WEAP model predicted an annual total supply shortfall of 8.1 million m³, mostly in the dry season, by the year 2030. The current annual unmet water demand is 1.3 million m³ and is most pronounced in the dry season of December through February at the irrigation demand site. The monthly unmet domestic demand under High Population Growth (HPG) was projected to be 1.06 million m³ by the year 2030.
However, within the improved Water Conservation Scenario (WCS), the total water demand is projected to decline by 24.2% in the same period. Key words: Nyangores catchment, Water Resources, WEAP, Scenario Analysis, Kenya

  14. Large Ensemble Analytic Framework for Consequence-Driven Discovery of Climate Change Scenarios

    NASA Astrophysics Data System (ADS)

    Lamontagne, Jonathan R.; Reed, Patrick M.; Link, Robert; Calvin, Katherine V.; Clarke, Leon E.; Edmonds, James A.

    2018-03-01

    An analytic scenario generation framework is developed based on the idea that the same climate outcome can result from very different socioeconomic and policy drivers. The framework builds on the Scenario Matrix Framework's abstraction of "challenges to mitigation" and "challenges to adaptation" to facilitate the flexible discovery of diverse and consequential scenarios. We combine visual and statistical techniques for interrogating a large factorial data set of 33,750 scenarios generated using the Global Change Assessment Model. We demonstrate how the analytic framework can aid in identifying which scenario assumptions are most tied to user-specified measures for policy-relevant outcomes of interest, specifically, in our example, high or low mitigation costs. We show that the current approach for selecting reference scenarios can miss policy-relevant scenario narratives that often emerge as hybrids of optimistic and pessimistic scenario assumptions. We also show that the same scenario assumption can be associated with both high and low mitigation costs depending on the climate outcome of interest and the mitigation policy context. In the illustrative example, we show how agricultural productivity, population growth, and economic growth are most predictive of the level of mitigation costs. Formulating policy-relevant scenarios of deeply and broadly uncertain futures benefits from large ensemble-based exploration of quantitative measures of consequences. To this end, we have contributed a large database of climate change futures that can support "bottom-up" scenario generation techniques that capture a broader array of consequences than those that emerge from limited sampling of a few reference scenarios.

  15. Critical human health issues in connection with future human missions to Mars: the HUMEX study of ESA

    NASA Astrophysics Data System (ADS)

    Horneck, G.; Humex Team

    ESA has recently initiated a study of the human responses, limits and needs with regard to the stress environments of interplanetary and planetary missions. Emphasis was laid on human health and performance care as well as Advanced Life Support Developments including Bioregenerative Life Support Systems and environmental monitoring. The overall study goals were as follows: (i) to define reference scenarios for a European participation in human exploration and to estimate their influence on the Life Sciences and Life Support requirements; (ii) for selected mission scenarios, to critically assess the limiting factors for human health, wellbeing, and performance and to recommend relevant countermeasures; (iii) for selected mission scenarios, to critically assess the potential of Advanced Life Support Developments and to propose a European strategy including terrestrial applications; (iv) to critically assess the feasibility of existing facilities and technologies on ground and in space as testbeds in preparation for human exploratory missions and to develop a test plan for ground and ISS campaigns; (v) to develop a roadmap for a future European strategy towards human exploratory missions, including preparatory activities and terrestrial applications and benefits. Two scenarios for a Mars mission were selected: (i) with a 30-day stay on Mars, and (ii) with an approximately 500-day stay on Mars. The impact on human health, performance and well being has been investigated from the viewpoint of (i) the effects of microgravity (during space travel), reduced gravity (on Mars) and abrupt gravity changes (during launch and landing), (ii) the effects of cosmic radiation including solar particle events, and (iii) psychological issues as well as general health care. Countermeasures as well as necessary research using ground-based testbeds and/or the ISS have been defined. The need for highly intelligent autonomous diagnostic and therapy systems was emphasized. 
Advanced life support systems with a high degree of autonomy and regenerative capacity and synergy effects were considered, where bioregenerative life support systems and biodiagnostic systems become essential, especially for the long-term Mars scenario. The considerations have been incorporated into a roadmap for a future European strategy in human health issues for a potential European participation in a cooperative international exploration of our solar system by humans. Ref. Horneck et al., 2003, HUMEX, a Study on the Survivability and Adaptation of Humans to Long-Duration Exploratory Missions, ESA SP-1264

  16. HUMEX, a study on the survivability and adaptation of humans to long- duration exploratory missions

    NASA Astrophysics Data System (ADS)

    Horneck, G.

    ESA has recently initiated a study of the human responses, limits and needs with regard to the stress environments of interplanetary and planetary missions. Emphasis was laid on human health and performance care as well as Advanced Life Support Developments including Bioregenerative Life Support Systems and environmental monitoring. The overall study goals were as follows: (i) to define reference scenarios for a European participation in human exploration and to estimate their influence on the Life Sciences and Life Support requirements; (ii) for selected mission scenarios, to critically assess the limiting factors for human health, wellbeing, and performance and to recommend relevant countermeasures; (iii) for selected mission scenarios, to critically assess the potential of Advanced Life Support Developments and to propose a European strategy including terrestrial applications; (iv) to critically assess the feasibility of existing facilities and technologies on ground and in space as testbeds in preparation for human exploratory missions and to develop a test plan for ground and ISS campaigns; (v) to develop a roadmap for a future European strategy towards human exploratory missions, including preparatory activities and terrestrial applications and benefits. A lunar base at the south pole, where constant sunlight and potential water ice deposits could be assumed, was selected as the Moon scenario. The impact on human health, performance and well being has been investigated from the viewpoint of the effects of microgravity (during space travel), reduced gravity (on the Moon) and abrupt gravity changes (during launch and landing), of the effects of cosmic radiation including solar particle events, and of psychological issues as well as general health care. Countermeasures as well as necessary research using ground-based testbeds and/or the ISS have been defined. 
The need for highly intelligent autonomous diagnostic and therapy systems was considered as a driver also for terrestrial applications. Likewise, advanced life support systems with a high degree of autonomy and regenerative capacity and synergy effects were considered, where bioregenerative life support systems and biodiagnostic systems become essential, especially for the long-term Mars scenario. A roadmap for a future European strategy leading to a potential European participation in a cooperative human exploratory mission, either to the Moon or to Mars, was produced. Ref. Horneck et al., HUMEX, a Study on the Survivability and Adaptation of Humans to Long-Duration Exploratory Missions, ESA SP (in press)

  17. Temporal steering and security of quantum key distribution with mutually unbiased bases against individual attacks

    NASA Astrophysics Data System (ADS)

    Bartkiewicz, Karol; Černoch, Antonín; Lemr, Karel; Miranowicz, Adam; Nori, Franco

    2016-06-01

    Temporal steering, which is a temporal analog of Einstein-Podolsky-Rosen steering, refers to temporal quantum correlations between the initial and final state of a quantum system. Our analysis of temporal steering inequalities in relation to the average quantum bit error rates reveals the interplay between temporal steering and quantum cloning, which guarantees the security of quantum key distribution based on mutually unbiased bases against individual attacks. The key distributions analyzed here include the Bennett-Brassard 1984 protocol and the six-state 1998 protocol by Bruss. Moreover, we define a temporal steerable weight, which enables us to identify a kind of monogamy of temporal correlation that is essential to quantum cryptography and useful for analyzing various scenarios of quantum causality.

  18. Effects of sea level rise, land subsidence, bathymetric change and typhoon tracks on storm flooding in the coastal areas of Shanghai.

    PubMed

    Wang, Jun; Yi, Si; Li, Mengya; Wang, Lei; Song, Chengcheng

    2018-04-15

    We compared the effects of three key environmental factors of coastal flooding: sea level rise (SLR), land subsidence (LS) and bathymetric change (BC) in the coastal areas of Shanghai. We used the hydrological simulation model MIKE 21 to simulate flood magnitudes under multiple scenarios created from combinations of the key environmental factors projected to the years 2030 and 2050. Historical typhoons (TC9711, TC8114, TC0012, TC0205 and TC1109), which caused extremely high surges and considerable losses, were selected as reference tracks to generate potential typhoon events that would make landfall in Shanghai (SHLD), make landfall in the north of Zhejiang (ZNLD), or move northwards in the offshore area of Shanghai (MNS) under those scenarios. The model results provided an assessment of the single and compound effects of the three factors (SLR, LS and BC) on coastal flooding in Shanghai for the next few decades. Model simulation showed that by the year 2030, the magnitude of storm flooding will increase due to the environmental changes defined by SLR, LS, and BC. In particular, the compound scenario of the three factors will generate coastal floods that are 3.1, 2.7, and 1.9 times greater than the single-factor change scenarios for, respectively, SLR, LS, and BC. Even more drastically, in 2050, the compound impact of the three factors would be 8.5, 7.5, and 23.4 times those of the single factors. This indicates that the impact of environmental changes is not a simple addition of the effects of individual factors, but can be many times greater, increasingly so over longer projection horizons. We also found that for short-term scenarios bathymetric change is the most important factor for changes in coastal flooding, while for long-term scenarios sea level rise and land subsidence are the major factors that coastal flood prevention and management should address. Copyright © 2017 Elsevier B.V. All rights reserved.
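As a purely arithmetic illustration of the non-additivity reported for 2050, the single-factor increases implied by a given compound increase can be back-computed from the quoted ratios and summed; the absolute values below are hypothetical units, not outputs of the MIKE 21 model:

```python
# Back-compute single-factor flood increases implied by the 2050
# compound-to-single ratios quoted above (8.5, 7.5 and 23.4).
# The absolute scale is arbitrary; these are not MIKE 21 outputs.
compound = 100.0                      # flood increase with SLR, LS and BC combined
single = {
    "SLR": compound / 8.5,            # sea level rise alone
    "LS": compound / 7.5,             # land subsidence alone
    "BC": compound / 23.4,            # bathymetric change alone
}
additive = sum(single.values())       # what simple addition would predict
amplification = compound / additive   # compound effect relative to the sum
```

With these ratios the three single-factor effects sum to under a third of the compound effect, i.e. the combined scenario is roughly 3.4 times what simple addition would predict.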

  19. Error reduction in three-dimensional metrology combining optical and touch probe data

    NASA Astrophysics Data System (ADS)

    Gerde, Janice R.; Christens-Barry, William A.

    2010-08-01

    Analysis of footwear under the Harmonized Tariff Schedule of the United States (HTSUS) is partly based on identifying the boundary ("parting line") between the "external surface area upper" (ESAU) and the sample's sole. Often, that boundary is obscured. We establish the parting line as the curved intersection between the sample outer surface and its insole surface. The outer surface is determined by discrete point cloud coordinates obtained using a laser scanner. The insole surface is defined by point cloud data obtained using a touch probe device, a coordinate measuring machine (CMM). Because these point cloud data sets do not overlap spatially, a polynomial surface is fitted to the insole data and extended to intersect a mesh fitted to the outer surface point cloud. This line of intersection defines the ESAU boundary, permitting further fractional area calculations to proceed. The defined parting line location is sensitive to the polynomial used to fit the experimental data. Extrapolation to the intersection with the ESAU can heighten this sensitivity. We discuss a methodology for transforming these data into a common reference frame. Three error sources are considered: measurement error in the point cloud coordinates, error from fitting a polynomial surface to a point cloud and then extrapolating beyond the data set, and error from the reference frame transformation. These error sources can influence calculated surface areas. We describe experiments to assess error magnitude, the sensitivity of calculated results to these errors, and ways to minimize error impact on calculated quantities. Ultimately, we must ensure that statistical error from these procedures is minimized and within acceptance criteria.

  20. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  1. Scripting Scenarios for the Human Patient Simulator

    NASA Technical Reports Server (NTRS)

    Bacal, Kira; Miller, Robert; Doerr, Harold

    2004-01-01

    The Human Patient Simulator (HPS) is particularly useful in providing scenario-based learning which can be tailored to fit specific scenarios and which can be modified in real time to enhance the teaching environment. Scripting these scenarios so as to maximize learning requires certain skills, in order to ensure that a change in student performance, understanding, critical thinking, and/or communication skills results. Methods: A "good" scenario can be defined in terms of applicability, learning opportunities, student interest, and clearly associated metrics. Obstacles to such a scenario include a lack of understanding of the applicable environment by the scenario author(s), a desire (common among novices) to cover too many topics, failure to define learning objectives, mutually exclusive or confusing learning objectives, unskilled instructors, poor preparation, a disorganized approach, or an inappropriate teaching philosophy (such as "trial by fire" or education through humiliation). Results: Descriptions of several successful teaching programs, used in the military, civilian, and NASA medical environments, will be provided, along with sample scenarios. Discussion: Simulator-based lessons have proven to be a time- and cost-efficient manner by which to educate medical personnel. Particularly when training for medical care in austere environments (pre-hospital, aeromedical transport, International Space Station, military operations), the HPS can enhance the learning experience.

  2. Environmental consequences of future biogas technologies based on separated slurry.

    PubMed

    Hamelin, Lorie; Wesnæs, Marianne; Wenzel, Henrik; Petersen, Bjørn M

    2011-07-01

    This consequential life cycle assessment study highlights the key environmental aspects of producing biogas from separated pig and cow slurry, a relatively new but probable scenario for future biogas production, as it avoids the reliance on constrained carbon cosubstrates. Three scenarios involving different slurry separation technologies have been assessed and compared to a business-as-usual reference slurry management scenario. The results show that the environmental benefits of such biogas production are highly dependent upon the efficiency of the separation technology used to concentrate the volatile solids in the solid fraction. The biogas scenario involving the most efficient separation technology resulted in a dry matter separation efficiency of 87% and allowed a net reduction of the global warming potential of 40%, compared to the reference slurry management. This figure comprises the whole slurry life cycle, including the flows bypassing the biogas plant. This study includes soil carbon balances and a method for quantifying the changes in yield resulting from increased nitrogen availability as well as for quantifying mineral fertilizers displacement. Soil carbon balances showed that between 13 and 50% less carbon ends up in the soil pool with the different biogas alternatives, as opposed to the reference slurry management.

  3. Reference interval estimation: Methodological comparison using extensive simulations and empirical data.

    PubMed

    Daly, Caitlin H; Higgins, Victoria; Adeli, Khosrow; Grey, Vijay L; Hamid, Jemila S

    2017-12-01

    To statistically compare and evaluate commonly used methods of estimating reference intervals and to determine which method is best based on characteristics of the distribution of various data sets. Three approaches for estimating reference intervals, i.e. parametric, non-parametric, and robust, were compared with simulated Gaussian and non-Gaussian data. The hierarchy of the performances of each method was examined based on bias and measures of precision. The findings of the simulation study were illustrated through real data sets. In all Gaussian scenarios, the parametric approach provided the least biased and most precise estimates. In non-Gaussian scenarios, no single method provided the least biased and most precise estimates for both limits of a reference interval across all sample sizes, although the non-parametric approach performed the best for most scenarios. The hierarchy of the performances of the three methods was only impacted by sample size and skewness. Differences between reference interval estimates established by the three methods were inflated by variability. Whenever possible, laboratories should attempt to transform data to a Gaussian distribution and use the parametric approach to obtain optimal reference intervals. When this is not possible, laboratories should consider sample size and skewness as factors in their choice of reference interval estimation method. The consequences of false positives or false negatives may also serve as factors in this decision. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
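A minimal sketch of the parametric and non-parametric estimators compared in this study; the analyte values are made up, and the simple rank rule below is cruder than the interpolated-rank methods used in practice:

```python
import statistics

def reference_interval(values, method="parametric"):
    """Estimate a central 95% reference interval from healthy-subject data."""
    if method == "parametric":
        # Gaussian assumption: mean +/- 1.96 standard deviations
        mu = statistics.mean(values)
        sd = statistics.stdev(values)
        return (mu - 1.96 * sd, mu + 1.96 * sd)
    if method == "nonparametric":
        # empirical 2.5th/97.5th percentiles via a simple rank rule
        ordered = sorted(values)
        n = len(ordered)
        return (ordered[int(0.025 * (n - 1))], ordered[int(0.975 * (n - 1))])
    raise ValueError(f"unknown method: {method}")

# hypothetical analyte measurements (roughly Gaussian)
sample = [4.8, 5.1, 5.0, 4.9, 5.3, 5.2, 4.7, 5.0, 5.1, 4.9,
          5.4, 4.6, 5.0, 5.2, 4.8, 5.1, 4.9, 5.0, 5.3, 4.7]
lo_p, hi_p = reference_interval(sample, "parametric")
lo_n, hi_n = reference_interval(sample, "nonparametric")
```

On near-Gaussian data the two estimates largely agree; skewed data pulls them apart, which is the regime where the study found the non-parametric approach more reliable.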

  4. Cost-efficiency analyses for the US of biosimilar filgrastim-sndz, reference filgrastim, pegfilgrastim, and pegfilgrastim with on-body injector in the prophylaxis of chemotherapy-induced (febrile) neutropenia.

    PubMed

    McBride, Ali; Campbell, Kim; Bikkina, Mohan; MacDonald, Karen; Abraham, Ivo; Balu, Sanjeev

    2017-10-01

    Guidelines recommend prophylaxis with granulocyte colony-stimulating factor for chemotherapy-induced (febrile) neutropenia (CIN/FN) based on regimen myelotoxicity and patient-related risk factors. The aim was to conduct a cost-efficiency analysis for the US of the direct acquisition and administration costs of the recently approved biosimilar filgrastim-sndz (Zarxio EP2006) with reference to filgrastim (Neupogen), pegfilgrastim (Neulasta), and a pegfilgrastim injection device (Neulasta Onpro; hereafter pegfilgrastim-injector) for CIN/FN prophylaxis. A cost-efficiency analysis of the prophylaxis of one patient during one chemotherapy cycle under a 1-14 days' time horizon was conducted using the unit dose average selling price (ASP) and Current Procedural Terminology (CPT) codes for subcutaneous prophylactic injection under four scenarios: cost of medication only (COSTMED), patient self-administration (SELFADMIN), healthcare provider (HCP) initiating administration followed by self-administration (HCPSTART), and HCP providing full administration (HCPALL). Two case studies were created to illustrate real-world clinical implications. The analyses were replicated using wholesale acquisition cost (WAC). Using ASP + CPT, cost savings achieved with filgrastim-sndz relative to reference filgrastim ranged from $65 (1 day) to $916 (14 days) across all scenarios. Relative to pegfilgrastim, savings with filgrastim-sndz ranged from $834 (14 days) up to $3,666 (1 day) under the COSTMED, SELFADMIN, and HCPSTART scenarios; and from $284 (14 days) up to $3,666 (1 day) under the HCPALL scenario. Similar to the cost-savings compared to pegfilgrastim, filgrastim-sndz achieved savings relative to pegfilgrastim-injector: from $834 (14 days) to $3,666 (1 day) under the COSTMED scenario, from $859 (14 days) to $3,692 (1 day) under SELFADMIN, from $817 (14 days) to $3,649 (1 day) under HCPSTART, and from $267 (14 days) to $3,649 (1 day) under HCPALL. 
Cost savings of filgrastim-sndz using WAC + CPT were even greater under all scenarios. Prophylaxis with filgrastim-sndz, a biosimilar filgrastim, was associated consistently with significant cost-savings over prophylaxis with reference filgrastim, pegfilgrastim, and pegfilgrastim-injector, and this across various administration scenarios.
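The per-cycle arithmetic behind such comparisons can be sketched as drug acquisition cost plus per-injection administration fees; the prices and fees below are placeholders, not actual ASP or CPT amounts:

```python
def prophylaxis_cost(days, unit_price, admin_fee, hcp_admin_days):
    """Direct cost of one chemotherapy cycle of daily G-CSF prophylaxis:
    `days` doses at `unit_price` each, plus an administration fee for each
    of the `hcp_admin_days` injections given by a healthcare provider
    (0 = self-administration, 1 = HCP-start, days = full HCP administration)."""
    return days * unit_price + hcp_admin_days * admin_fee

# hypothetical 5-day course under an HCP-start scenario
biosimilar = prophylaxis_cost(5, unit_price=300.0, admin_fee=25.0, hcp_admin_days=1)
reference = prophylaxis_cost(5, unit_price=350.0, admin_fee=25.0, hcp_admin_days=1)
savings = reference - biosimilar  # 250.0 with these placeholder prices
```

The scenario only changes the administration term, which is why the paper's savings figures vary with the number of HCP-administered injections and the length of the course.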

  5. Reference-dependent preferences for maternity wards: an exploration of two reference points.

    PubMed

    Neuman, Einat

    2014-01-01

    It is now well established that a person's valuation of the benefit from an outcome of a decision is determined by the intrinsic "consumption utility" of the outcome itself and also by the relation of the outcome to some reference point. The most notable expression of such reference-dependent preferences is loss aversion. What precisely this reference point is, however, is less clear. This paper claims, and provides empirical evidence, that more than one reference point exists. Using a discrete choice experiment in the Israeli public health-care sector, within a sample of 219 women who had given birth, it is shown that respondents refer to two reference points: (i) a constant scenario used in the experiment; and (ii) the actual state of the quantitative attributes of the service (number of beds in the room of hospitalization, and travel time from residence to hospital). In line with loss aversion theory, it is also shown that losses (vis-à-vis the constant scenario and vis-à-vis the actual state) accumulate and have reinforced effects, while gains do not.

  6. Prediction of Change in Prescription Ingredient Costs and Co-payment Rates under a Reference Pricing System in South Korea.

    PubMed

    Heo, Ji Haeng; Rascati, Karen L; Lee, Eui-Kyung

    2017-05-01

    The reference pricing system (RPS) establishes reference prices within interchangeable reference groupings. For drugs priced higher than the reference point, patients pay the difference between the reference price and the total price. To predict potential changes in prescription ingredient costs and co-payment rates after implementation of an RPS in South Korea. Korean National Health Insurance claims data were used as a baseline to develop possible RPS models. Five components of a potential RPS policy were varied: reference groupings, reference pricing methods, co-pay reduction programs, manufacturer price reductions, and increased drug substitutions. The potential changes in prescription ingredient costs and co-payment rates were predicted for the various scenarios. It was predicted that transferring the difference (total price minus reference price) from the insurer to patients would reduce ingredient costs from 1.4% to 22.8% for the third-party payer (government), but patient co-payment rates would increase from a baseline of 20.4% to 22.0% using chemical groupings and to 25.0% using therapeutic groupings. Savings rates in prescription ingredient costs (government and patient combined) were predicted to range from 1.6% to 13.7% depending on the scenario. Although the co-payment rate would increase, a 15% price reduction by manufacturers coupled with a substitution rate of 30% would result in a decrease in the co-payment amount (a change in absolute dollars vs. a change in rates). Our models predicted that the implementation of an RPS in South Korea would lead to savings in ingredient costs for the third-party payer and in co-payments for patients under the potential scenarios. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
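The co-payment mechanics of an RPS described here reduce to a short calculation; the 30% co-pay rate and the prices below are illustrative assumptions, not the actual Korean figures:

```python
def patient_copay(total_price, reference_price, copay_rate=0.30):
    """Patient cost under a reference pricing system: the insurer reimburses
    only up to the reference price, so the patient pays the usual co-pay on
    the covered portion plus the entire excess above the reference price."""
    covered = min(total_price, reference_price)
    excess = max(0.0, total_price - reference_price)
    return copay_rate * covered + excess

above_ref = patient_copay(120.0, 100.0)  # co-pay on 100 plus the 20 excess
below_ref = patient_copay(90.0, 100.0)   # plain co-pay, no excess
```

This is the mechanism by which the modeled RPS shifts costs from the insurer to patients for drugs priced above the reference point, raising the co-payment rate even as total ingredient costs fall.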

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This document is a 21-page summary of the 200+ page analysis that explores one clearly defined scenario for providing 20% of our nation's electricity demand with wind energy by 2030 and contrasts it to a scenario of no new U.S. wind power capacity.

  8. A design space exploration for control of Critical Quality Attributes of mAb.

    PubMed

    Bhatia, Hemlata; Read, Erik; Agarabi, Cyrus; Brorson, Kurt; Lute, Scott; Yoon, Seongkyu

    2016-10-15

    A unique "design space (DSp) exploration strategy," defined as a function of four key scenarios, was successfully integrated and validated to enhance the DSp building exercise, by increasing the accuracy of analyses and interpretation of processed data. The four key scenarios, defining the strategy, were based on cumulative analyses of individual models developed for the Critical Quality Attributes (23 Glycan Profiles) considered for the study. The analyses of the CQA estimates and model performances were interpreted as (1) Inside Specification/Significant Model (2) Inside Specification/Non-significant Model (3) Outside Specification/Significant Model (4) Outside Specification/Non-significant Model. Each scenario was defined and illustrated through individual models of CQA aligning the description. The R(2), Q(2), Model Validity and Model Reproducibility estimates of G2, G2FaGbGN, G0 and G2FaG2, respectively, signified the four scenarios stated above. Through further optimizations, including the estimation of Edge of Failure and Set Point Analysis, wider and accurate DSps were created for each scenario, establishing critical functional relationship between Critical Process Parameters (CPPs) and Critical Quality Attributes (CQAs). A DSp provides the optimal region for systematic evaluation, mechanistic understanding and refining of a QbD approach. DSp exploration strategy will aid the critical process of consistently and reproducibly achieving predefined quality of a product throughout its lifecycle. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Component-Level Electronic-Assembly Repair (CLEAR) Operational Concept

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Bradish, Martin A.; Juergens, Jeffrey R.; Lewis, Michael J.; Vrnak, Daniel R.

    2011-01-01

    This Component-Level Electronic-Assembly Repair (CLEAR) Operational Concept document was developed as a first step in developing the Component-Level Electronic-Assembly Repair (CLEAR) System Architecture (NASA/TM-2011-216956). The CLEAR operational concept defines how the system will be used by the Constellation Program and what needs it meets. The document creates scenarios for major elements of the CLEAR architecture. These scenarios are generic enough to apply to near-Earth, Moon, and Mars missions. The CLEAR operational concept involves basic assumptions about the overall program architecture and interactions with the CLEAR system architecture. The assumptions include spacecraft and operational constraints for near-Earth orbit, Moon, and Mars missions. This document addresses an incremental development strategy where capabilities evolve over time, but it is structured to prevent obsolescence. The approach minimizes flight hardware by exploiting Internet-like telecommunications that enables CLEAR capabilities to remain on Earth and to be uplinked as needed. To minimize crew time and operational cost, CLEAR exploits offline development and validation to support online teleoperations. Operational concept scenarios are developed for diagnostics, repair, and functional test operations. Many of the supporting functions defined in these operational scenarios are further defined as technologies in NASA/TM-2011-216956.

  10. Cost Implications of Value-Based Pricing for Companion Diagnostic Tests in Precision Medicine.

    PubMed

    Zaric, Gregory S

    2016-07-01

    Many interpretations of personalized medicine, also referred to as precision medicine, include discussions of companion diagnostic tests that allow drugs to be targeted to those individuals who are most likely to benefit or that allow treatment to be designed in a way such that individuals who are unlikely to benefit do not receive treatment. Many authors have commented on the clinical and competitive implications of companion diagnostics, but there has been relatively little formal analysis of the cost implications of companion diagnostics, although cost reduction is often cited as a significant benefit of precision medicine. We investigate the potential impact on costs of precision medicine implemented through the use of companion diagnostics. We develop a framework in which the costs of companion diagnostic tests are determined by considerations of profit maximization and cost effectiveness. We analyze four scenarios that are defined by the incremental cost-effectiveness ratio of the new drug in the absence of a companion diagnostic test. We find that, in most scenarios, precision medicine strategies based on companion diagnostics should be expected to lead to increases in costs in the short term and that costs would fall only in a limited number of situations.

  11. Designing Robust and Reliable Timestamps for Remote Patient Monitoring.

    PubMed

    Clarke, Malcolm; Schluter, Paul; Reinhold, Barry; Reinhold, Brian

    2015-09-01

    Having timestamps that are robust and reliable is essential for remote patient monitoring in order for patient data to have context and to be correlated with other data. However, unlike hospital systems, for which guidelines on timestamps are currently provided by HL7 and IHE, remote patient monitoring platforms are operated in environments where it can be difficult to synchronize with reliable time sources, include devices with simple or no clocks, and may store data spanning significant periods before being able to upload. Existing guidelines therefore prove inadequate. This paper analyzes the requirements and operating scenarios of remote patient monitoring platforms and defines a framework to convey the conditions under which observations were made by the device and forwarded by the gateway, so that data can be managed appropriately and include both a reference to local time and an underlying continuous reference timeline. We define the timestamp formats of HL7 to denote the different conditions of operation and describe extensions to the existing definition of the HL7 timestamp to differentiate between time local to GMT (+0000) and Coordinated Universal Time or Network Time Protocol time where no geographic time zone is implied (-0000). We further describe how timestamps from devices having only simple or no clocks can be managed reliably by a gateway to provide timestamps referenced to local time and an underlying continuous reference timeline. We extend the HL7 message to include information that permits a subsequent receiver of the data to understand the quality of the timestamp and how it has been translated. We present an evaluation from deploying a platform for 12 months.
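    The +0000/-0000 convention described above can be sketched as follows; the helper function and example values are hypothetical and illustrate only the distinction the abstract proposes, not the authors' implementation:

```python
# Sketch of the proposed HL7-style timestamp convention (assumption: the
# distinction is encoded purely in the offset suffix):
#   "+0000" -> time genuinely local to GMT
#   "-0000" -> UTC/NTP time with no geographic time zone implied
from datetime import datetime, timezone

def hl7_timestamp(dt: datetime, zone_known: bool) -> str:
    """Format a UTC datetime as an HL7-style YYYYMMDDHHMMSS+offset string."""
    body = dt.strftime("%Y%m%d%H%M%S")
    return body + ("+0000" if zone_known else "-0000")

t = datetime(2015, 9, 1, 12, 30, 0, tzinfo=timezone.utc)
print(hl7_timestamp(t, zone_known=False))  # 20150901123000-0000
print(hl7_timestamp(t, zone_known=True))   # 20150901123000+0000
```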

  12. DNA analysis in Disaster Victim Identification.

    PubMed

    Montelius, Kerstin; Lindblom, Bertil

    2012-06-01

    DNA profiling and matching is one of the primary methods to identify missing persons in a disaster, as defined by the Interpol Disaster Victim Identification Guide. The process to identify a victim by DNA includes: the collection of the best possible ante-mortem (AM) samples, the choice of post-mortem (PM) samples, DNA-analysis, matching and statistical weighting of the genetic relationship or match. Each disaster has its own scenario, and each scenario defines its own methods for identification of the deceased.

  13. Relevance of workplace social mixing during influenza pandemics: an experimental modelling study of workplace cultures.

    PubMed

    Timpka, T; Eriksson, H; Holm, E; Strömgren, M; Ekberg, J; Spreco, A; Dahlström, Ö

    2016-07-01

    Workplaces are one of the most important regular meeting places in society. The aim of this study was to use simulation experiments to examine the impact of different workplace cultures on influenza dissemination during pandemics. The impact is investigated by experiments with defined social-mixing patterns at workplaces using semi-virtual models based on authentic sociodemographic and geographical data from a North European community (population 136 000). A simulated pandemic outbreak was found to affect 33% of the total population in the community with the reference academic-creative workplace culture; virus transmission at the workplace accounted for 10·6% of the cases. A model with a prevailing industrial-administrative workplace culture generated 11% lower incidence than the reference model, while the model with a self-employed workplace culture (also corresponding to a hypothetical scenario with all workplaces closed) produced 20% fewer cases. The model representing an academic-creative workplace culture with restricted workplace interaction generated 12% lower cumulative incidence compared to the reference model. The results display important theoretical associations between workplace social-mixing cultures and community-level incidence rates during influenza pandemics. Social interaction patterns at workplaces should be taken into consideration when analysing virus transmission patterns during influenza pandemics.

  14. Accuracy and Reliability of Emergency Department Triage Using the Emergency Severity Index: An International Multicenter Assessment.

    PubMed

    Mistry, Binoy; Stewart De Ramirez, Sarah; Kelen, Gabor; Schmitz, Paulo S K; Balhara, Kamna S; Levin, Scott; Martinez, Diego; Psoter, Kevin; Anton, Xavier; Hinson, Jeremiah S

    2018-05-01

    We assess accuracy and variability of triage score assignment by emergency department (ED) nurses using the Emergency Severity Index (ESI) in 3 countries. In accordance with previous reports and clinical observation, we hypothesize low accuracy and high variability across all sites. This cross-sectional multicenter study enrolled 87 ESI-trained nurses from EDs in Brazil, the United Arab Emirates, and the United States. Standardized triage scenarios published by the Agency for Healthcare Research and Quality (AHRQ) were used. Accuracy was defined by concordance with the AHRQ key and calculated as percentages. Accuracy comparisons were made with one-way ANOVA and paired t test. Interrater reliability was measured with Krippendorff's α. Subanalyses based on nursing experience and triage scenario type were also performed. Mean accuracy pooled across all sites and scenarios was 59.2% (95% confidence interval [CI] 56.4% to 62.0%) and interrater reliability was modest (α=.730; 95% CI .692 to .767). There was no difference in overall accuracy between sites or according to nurse experience. Medium-acuity scenarios were scored with greater accuracy (76.4%; 95% CI 72.6% to 80.3%) than high- or low-acuity cases (44.1%, 95% CI 39.3% to 49.0% and 54%, 95% CI 49.9% to 58.2%), and adult scenarios were scored with greater accuracy than pediatric ones (66.2%, 95% CI 62.9% to 69.7% versus 46.9%, 95% CI 43.4% to 50.3%). In this multinational study, concordance of nurse-assigned ESI score with reference standard was universally poor and variability was high. Although the ESI is the most popular ED triage tool in the United States and is increasingly used worldwide, our findings point to a need for more reliable ED triage tools. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
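    Accuracy in this study is simple concordance with the AHRQ answer key, expressed as a percentage; a minimal sketch on invented scores (not the study's data):

```python
# Hypothetical illustration of triage "accuracy" as percent concordance with
# a reference key; all values below are invented for demonstration.

def concordance(assigned, key):
    """Percentage of triage scores matching the reference key."""
    matches = sum(a == k for a, k in zip(assigned, key))
    return 100.0 * matches / len(key)

nurse_scores = [2, 3, 3, 1, 4, 2, 3, 5]   # ESI levels assigned by one nurse
answer_key   = [2, 3, 2, 1, 4, 3, 3, 5]   # reference key for the scenarios
print(concordance(nurse_scores, answer_key))  # 75.0
```

    The study's interrater reliability statistic, Krippendorff's α, additionally corrects such raw agreement for chance.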

  15. Evaluation of surface water budget and assessment the global water cycle for the IPCC AR4 A1B scenario simulations

    NASA Astrophysics Data System (ADS)

    Baek, H.; Park, E.; Kwon, W.

    2009-12-01

    Water balance calculations are becoming increasingly important for earth-system studies because humans depend on water for survival. In particular, the relationship between climate change and freshwater resources is of primary concern to human society and has implications for all living species. The goal of this study is to assess the closure and annual variations of the water cycle using a multi-model ensemble approach. The projection results of previous works focusing on the globe and six sub-regions are updated using sixteen atmosphere-ocean general circulation model (AOGCM) simulations based on the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) A1B scenario. Before projecting future climate, model performance is evaluated on simulations of the present-day climate. From this evaluation, we construct a multi-model ensemble (MME), referred to as MME9, from the nine best-performing AOGCMs. Analyzed variables include annual and seasonal precipitation, evaporation, and runoff. The overall projections from MME9 show that most regions will experience a warmer and wetter climate at the end of the 21st century. Evaporation shows a trend very similar to precipitation, but runoff does not. Internal and inter-model variabilities are larger in runoff than in either precipitation or evaporation. Moreover, runoff is notably reduced in Europe at the end of the 21st century.

  16. Analysis and simulation of water-level, specific conductance, and total phosphorus dynamics of the Loxahatchee National Wildlife Refuge, Florida, 1995-2006

    USGS Publications Warehouse

    Conrads, Paul; Roehl, Edwin A.

    2010-01-01

    Two scenarios were simulated with the LOXANN DSS. One scenario increased the historical flows at four control structures by 40 percent. The second scenario used a user-defined hydrograph to set the outflow from the Refuge to the weekly average inflow to the Refuge delayed by 2 days. Both scenarios decreased the potential of canal water intruding into the marsh by decreasing the slope of the water level between the canals and the marsh.

  17. A comparison between the example reference biosphere model ERB 2B and a process-based model: simulation of a natural release scenario.

    PubMed

    Almahayni, T

    2014-12-01

    The BIOMASS methodology was developed with the objective of constructing defensible assessment biospheres for assessing the potential radiological impacts of radioactive waste repositories. To this end, a set of Example Reference Biospheres was developed to demonstrate the use of the methodology and to provide an international point of reference. In this paper, the performance of the Example Reference Biosphere model ERB 2B associated with the natural release scenario, discharge of contaminated groundwater to the surface environment, was evaluated by comparing its long-term projections of radionuclide dynamics and distribution in a soil-plant system to those of a process-based, transient advection-dispersion (AD) model. The models were parametrised with data characteristic of a typical rainfed winter wheat crop grown on a sandy loam soil under temperate climate conditions. Three safety-relevant radionuclides, (99)Tc, (129)I and (237)Np, with different degrees of sorption, were selected for the study. Although the models were driven by the same hydraulic (soil moisture content and water fluxes) and radiological (Kd) input data, their projections were remarkably different. Both models captured short- and long-term variation in activity concentration in the subsoil compartment; however, the Reference Biosphere model did not project any radionuclide accumulation in the topsoil and crop compartments. This behaviour would underestimate the radiological exposure under natural release scenarios. The results highlight the potential role deep roots play in soil-to-plant transfer under a natural release scenario, where radionuclides are released into the subsoil. When considering the relative activity and root-depth profiles within the soil column, much of the radioactivity was taken up into the crop from the subsoil compartment. 
Further improvements were suggested to address the limitations of the Reference Biosphere model presented in this paper. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. ESTIMATION OF EXPOSURE DOSES FOR THE SAFE MANAGEMENT OF NORM WASTE DISPOSAL.

    PubMed

    Jeong, Jongtae; Ko, Nak Yul; Cho, Dong-Keun; Baik, Min Hoon; Yoon, Ki-Hoon

    2018-03-16

    Naturally occurring radioactive material (NORM) wastes with different radiological characteristics are generated in several industries. Appropriate options for NORM waste management, including disposal, should be discussed and established based on the relevant act and regulatory guidelines. Several studies have calculated the exposure dose and the mass of NORM waste that can be disposed of in a landfill site by considering the activity concentration level and exposure dose. In 2012, the Korean government promulgated an act on the safety control of NORM in living environments to protect human health and the environment. For the successful implementation of this act, we suggest a reference design for a landfill for the disposal of NORM waste. Based on this reference landfill, we estimate the maximum exposure doses and the relative contribution of each exposure pathway for three scenarios: a reference scenario, an ingestion pathway exclusion scenario, and a low leach rate scenario. We also estimate the possible quantity of NORM waste that can be disposed of in a landfill as a function of the activity concentration levels of the U series, the Th series and 40K, for two exposure dose levels, 1 and 0.3 mSv/y. The results of this study can be used to support the technical bases of a management strategy for the safe disposal of NORM waste.

  19. Atmospheric circulation and hydroclimate impacts of alternative warming scenarios for the Eocene

    NASA Astrophysics Data System (ADS)

    Carlson, Henrik; Caballero, Rodrigo

    2017-08-01

    Recent work in modelling the warm climates of the early Eocene shows that it is possible to obtain a reasonable global match between model surface temperature and proxy reconstructions, but only by using extremely high atmospheric CO2 concentrations or more modest CO2 levels complemented by a reduction in global cloud albedo. Understanding the mix of radiative forcing that gave rise to Eocene warmth has important implications for constraining Earth's climate sensitivity, but progress in this direction is hampered by the lack of direct proxy constraints on cloud properties. Here, we explore the potential for distinguishing among different radiative forcing scenarios via their impact on regional climate changes. We do this by comparing climate model simulations of two end-member scenarios: one in which the climate is warmed entirely by CO2 (which we refer to as the greenhouse gas (GHG) scenario) and another in which it is warmed entirely by reduced cloud albedo (which we refer to as the low CO2-thin clouds or LCTC scenario). The two simulations have an almost identical global-mean surface temperature and equator-to-pole temperature difference, but the LCTC scenario has ~11% greater global-mean precipitation than the GHG scenario. The LCTC scenario also has cooler midlatitude continents and warmer oceans than the GHG scenario and a tropical climate which is significantly more El Niño-like. Extremely high warm-season temperatures in the subtropics are mitigated in the LCTC scenario, while cool-season temperatures are lower at all latitudes. These changes appear large enough to motivate further, more detailed study using other climate models and a more realistic set of modelling assumptions.

  20. Prospective testing of neo-deterministic seismic hazard scenarios for the Italian territory

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Vaccari, Franco; Kossobokov, Vladimir; Panza, Giuliano F.

    2013-04-01

    A reliable and comprehensive characterization of expected seismic ground shaking, eventually including the related time information, is essential in order to develop effective mitigation strategies and increase earthquake preparedness. Moreover, any effective tool for seismic hazard assessment (SHA) must demonstrate its capability to anticipate the ground shaking associated with large earthquakes, a result that can be attained only through a rigorous verification and validation process. So far, the major problems of classical probabilistic seismic hazard assessment (PSHA) have been the adequate description of earthquake recurrence, particularly for the largest and most sporadic events, and the attenuation models, which may be unable to account for the complexity of the medium and of the seismic sources and are often weakly constrained by the available observations. Current computational resources and physical knowledge of seismic wave generation and propagation processes nowadays allow for viable numerical and analytical alternatives to the use of attenuation relations. Accordingly, a scenario-based neo-deterministic approach to seismic hazard assessment, NDSHA, has been proposed, which allows a wide range of possible seismic sources to be considered as the starting point for deriving scenarios by means of full waveform modeling. The method does not make use of attenuation relations and naturally supplies realistic time series of ground shaking, including reliable estimates of ground displacement readily applicable to seismic isolation techniques. Based on NDSHA, an operational integrated procedure for seismic hazard assessment has been developed that allows for the definition of time-dependent scenarios of ground shaking through the routine updating of formally defined earthquake predictions. 
The integrated NDSHA procedure for seismic input definition, which is currently applied to the Italian territory, combines different pattern recognition techniques, designed for the space-time identification of strong earthquakes, with algorithms for the realistic modeling of ground motion. Accordingly, a set of deterministic scenarios of ground motion at bedrock, which refers to the time interval when a strong event is likely to occur within the alerted area, can be defined by means of full waveform modeling, both at regional and local scale. CN and M8S predictions, as well as the related time-dependent ground motion scenarios associated with the alarmed areas, have been updated regularly every two months since 2006. The routine application of the time-dependent NDSHA approach provides information that can be useful in assigning priorities for timely mitigation actions and, at the same time, allows for rigorous prospective testing and validation of the proposed methodology. As an example, for sites where ground shaking values greater than 0.2 g are estimated at bedrock, further investigations can be performed taking into account the local soil conditions, to assess the performance of relevant structures, such as historical and strategic buildings. The issues related to prospective testing and validation of the time-dependent NDSHA scenarios are discussed, illustrating the results obtained for recent strong earthquakes in Italy, including the May 20, 2012 Emilia earthquake.

  1. AERIS - applications for the environment : real-time information synthesis : eco-signal operations modeling report.

    DOT National Transportation Integrated Search

    2014-12-01

    This report constitutes the detailed modeling and evaluation results of the Eco-Signal Operations Operational Scenario defined by the AERIS program. The Operational Scenario constitutes four applications that are designed to provide environmental ben...

  2. Design and Principles Enabling the Space Reference FOM

    NASA Technical Reports Server (NTRS)

    Moeller, Bjoern; Dexter, Dan; Madden, Michael; Crues, Edwin Z.; Garro, Alfredo; Skuratovskiy, Anton

    2017-01-01

    A first complete draft of the Simulation Interoperability Standards Organization (SISO) Space Reference Federation Object Model (FOM) has now been produced. This paper provides some insights into its capabilities and discusses the opportunity for reuse in other domains. The focus of this first version of the standard is execution control, time management and coordinate systems, well-known reference frames, as well as some basic support for physical entities. The biggest part of the execution control is the coordinated start-up process. This process contains a number of steps, including checking of required federates, handling of early versus late joiners, sharing of federation wide configuration data and multi-phase initialization. An additional part of Execution Control is the coordinated and synchronized transition between Run mode, Freeze mode and Shutdown. For time management, several time lines are defined, including real-time, scenario time, High Level Architecture (HLA) logical time and physical time. A strategy for mixing simulations that use different time steps is introduced, as well as an approach for finding common boundaries for fully synchronized freeze. For describing spatial information, a mechanism with a set of reference frames is specified. Each reference frame has a position and orientation related to a parent reference frame. This makes it possible for federates to perform calculations in reference frames that are convenient to them. An operation on the Moon can be performed using lunar coordinates whereas an operation on Earth can be performed using Earth coordinates. At the same time, coordinates in one reference frame have an unambiguous relationship to a coordinate in another reference frame. 
While the Space Reference FOM was originally developed for space operations, the authors believe that many parts of it can be reused in any simulation that focuses on physical processes with one or more coordinate systems and requires high fidelity and repeatability.

  3. EPA QUICK REFERENCE GUIDES

    EPA Science Inventory

    EPA Quick Reference Guides are compilations of information on chemical and biological terrorist agents. The information is presented in consistent format and includes agent characteristics, release scenarios, health and safety data, real-time field detection, effect levels, samp...

  4. Scenario driven data modelling: a method for integrating diverse sources of data and data streams

    DOEpatents

    Brettin, Thomas S.; Cottingham, Robert W.; Griffith, Shelton D.; Quest, Daniel J.

    2015-09-08

    A system and method for integrating diverse sources of data and data streams is presented. The method can include: selecting a scenario based on a topic; creating a multi-relational directed graph based on the scenario; identifying and converting resources in accordance with the scenario and updating the graph based on the resources; identifying data feeds in accordance with the scenario and updating the graph based on the data feeds; identifying analytical routines in accordance with the scenario and updating the graph using the analytical routines; and identifying data outputs in accordance with the scenario and defining queries to produce the data outputs from the graph.
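    The multi-relational directed graph at the heart of the patented method can be sketched, loosely and with entirely hypothetical node and relation names, as a labeled-edge adjacency structure:

```python
# Minimal sketch of a multi-relational directed graph: edges carry relation
# labels, successive steps add resources and data feeds, and queries read the
# graph back out. All node and relation names here are invented examples.
from collections import defaultdict

graph = defaultdict(list)  # node -> list of (relation, target) pairs

def add_edge(src, relation, dst):
    graph[src].append((relation, dst))

# Scenario selected for a topic seeds the graph
add_edge("outbreak-scenario", "concerns", "pathogen-X")
# Resources and data feeds identified for the scenario extend it
add_edge("pathogen-X", "reported-by", "hospital-feed")
add_edge("hospital-feed", "updates", "case-counts")

# A query over the graph produces a data output
def neighbors(node, relation):
    return [dst for rel, dst in graph[node] if rel == relation]

print(neighbors("pathogen-X", "reported-by"))  # ['hospital-feed']
```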

  5. Employment references: defamation law in the clinical laboratory.

    PubMed

    Parks, D G

    1993-01-01

    The law of defamation and the risks involved in issuing employment references are discussed. A hypothetical scenario is used to illustrate the legal standards governing the tort of defamation and to apply those standards to employment references. Practical suggestions for a "controlled reference" policy are provided, with the objective of allowing for responsible exchange of employment information and avoiding a defamation lawsuit.

  6. Comparing brain graphs in which nodes are regions of interest or independent components: A simulation study.

    PubMed

    Yu, Qingbao; Du, Yuhui; Chen, Jiayu; He, Hao; Sui, Jing; Pearlson, Godfrey; Calhoun, Vince D

    2017-11-01

    A key challenge in building a brain graph using fMRI data is how to define the nodes. Spatial brain components estimated by independent component analysis (ICA) and regions of interest (ROIs) determined by a brain atlas are two popular methods for defining nodes in brain graphs. It is difficult to evaluate which method is better in real fMRI data. Here we perform a simulation study and evaluate the accuracies of several graph metrics in graphs with nodes defined by ICA components, ROIs, or modified ROIs in four simulation scenarios. Graph measures with ICA nodes are more accurate than graphs with ROI nodes in all cases. Graph measures with modified ROI nodes are modulated by artifacts. The correlations of graph metrics across subjects between graphs with ICA nodes and ground truth are higher than the correlations between graphs with ROI nodes and ground truth in scenarios with largely overlapping spatial sources. Moreover, moving the location of ROIs largely decreases the correlations in all scenarios. Evaluating graphs with different node definitions is more promising in simulated data than in real data, because different scenarios can be simulated and measures of different graphs can be compared with a known ground truth. Since ROIs defined using a brain atlas may not correspond well to real functional boundaries, the overall findings of this work suggest that it is more appropriate to define nodes using data-driven ICA than ROI approaches in real fMRI data. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Flood hazard assessment for French NPPs

    NASA Astrophysics Data System (ADS)

    Rebour, Vincent; Duluc, Claire-Marie; Guimier, Laurent

    2015-04-01

    This paper presents the approach to flood hazard assessment for NPPs that is ongoing in France in the framework of post-Fukushima activities. These activities were initially defined considering both the European "stress tests" of NPPs pursuant to the request of the European Council and the French safety audit of civilian nuclear facilities in the light of the Fukushima Daiichi accident. The main actors in the process are the utility (EDF is, to date, the only NPP operator in France), the regulatory authority (ASN) and its technical support organization (IRSN). This paper was prepared by IRSN considering the official positions of the other main actors in the current review process; it was not officially endorsed by them. In France, the flood hazard to be considered for design basis definition (for new NPPs, and for existing NPPs in the periodic safety reviews conducted every 10 years) was revised before the Fukushima Daiichi accident, following the Le Blayais NPP experience of December 1999 (partial site flooding and loss of some safety-classified systems). The first part of the paper presents an overview of the revised guidance for the design basis flood. In order to address design extension conditions (conditions that could result from natural events exceeding the design basis events), a set of flooding scenarios has been defined by adding margins to the scenarios considered for the design. Owing to the diversity of phenomena to be considered for the flooding hazard, the margin assessment is specific to each flooding scenario, in terms of the parameter to be penalized and the degree of variation of this parameter. The general approach to addressing design extension conditions is presented in the second part of the paper. The following parts present the approach for five flooding scenarios, including the design basis scenario and the additional margins that define the design extension scenarios.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eckert-Gallup, Aubrey Celia; Lewis, John R.; Brooks, Dusty Marie

    This report describes the methods, results, and conclusions of the analysis of 11 scenarios defined to exercise various options available in the xLPR (Extremely Low Probability of Rupture) Version 2.0 code. The scope of the scenario analysis is three-fold: (i) exercise the various options and components comprising xLPR v2.0 and defining each scenario; (ii) develop and exercise methods for analyzing and interpreting xLPR v2.0 outputs; and (iii) exercise the various sampling options available in xLPR v2.0. The simulation workflow template developed during the course of this effort helps to form a basis for the application of the xLPR code to problems with similar inputs and probabilistic requirements and to address in a systematic manner the three points covered by the scope.

  9. Data supporting the comparative life cycle assessment of different municipal solid waste management scenarios

    PubMed Central

    Ali Rajaeifar, Mohammad; Tabatabaei, Meisam; Ghanavati, Hossein

    2015-01-01

    Environmental assessment of municipal solid waste (MSW) management scenarios would help to select eco-friendly scenarios. In this study, the inventory data in support of life cycle assessment of different MSW are presented. The scenarios were defined as: anaerobic digestion (AD, Sc-0), landfilling combined with composting (Sc-1), incineration (Sc-2), incineration combined with composting (Sc-3), and AD combined with incineration (Sc-4). The current article contains flowcharts of the different scenarios. Additionally, six supplementary files including inventory data on the different scenarios, data on the different damage assessment categories, normalization, and single scores are presented (Supplementary files 1–6). The analysis of the different scenarios revealed that the most eco-friendly scenario to be implemented in the future would be the combination of AD and incineration (Sc-4). PMID:26217743

  10. Future Education: Learning the Future. Scenarios and Strategies in Europe. CEDEFOP Reference Series.

    ERIC Educational Resources Information Center

    van Wieringen, Fons; Sellin, Burkart; Schmidt, Ghislaine

    Five research institutes covering five European Union (EU) member states and five Central and Eastern European countries participated in a scenario project designed to improve understanding of vocational education and training (VET) systems in their economic-technological, employment-labor, and training-knowledge environments. The participating…

  11. Climate change impacts on freshwater fish, coral reefs, and related ecosystem services in the United States

    EPA Science Inventory

    We analyzed the potential physical and economic impacts of climate change on freshwater fisheries and coral reefs in the United States, examining a reference scenario and two policy scenarios that limit global greenhouse gas (GHG) emissions. We modeled shifts in suitable habitat ...

  12. Incorporating quality and safety education for nurses competencies in simulation scenario design.

    PubMed

    Jarzemsky, Paula; McCarthy, Jane; Ellis, Nadege

    2010-01-01

    When planning a simulation scenario, even if adopting prepackaged simulation scenarios, faculty should first conduct a task analysis to guide development of learning objectives and cue critical events. The authors describe a strategy for systematic planning of simulation-based training that incorporates knowledge, skills, and attitudes as defined by the Quality and Safety Education for Nurses (QSEN) initiative. The strategy cues faculty to incorporate activities that target QSEN competencies (patient-centered care, teamwork and collaboration, evidence-based practice, quality improvement, informatics, and safety) before, during, and after simulation scenarios.

  13. Improving representation of canopy temperatures for modeling subcanopy incoming longwave radiation to the snow surface

    NASA Astrophysics Data System (ADS)

    Webster, Clare; Rutter, Nick; Jonas, Tobias

    2017-09-01

    A comprehensive analysis of canopy surface temperatures was conducted around a small and a large gap at a forested alpine site in the Swiss Alps during the 2015 and 2016 snowmelt seasons (March-April). Canopy surface temperatures within the small gap were within 2-3°C of measured reference air temperature. Vertical and horizontal variations in canopy surface temperatures were greatest around the large gap, reaching up to 18°C above measured reference air temperature during clear-sky days. Nighttime canopy surface temperatures around the study site were up to 3°C cooler than reference air temperature. These measurements were used to develop a simple parameterization for correcting reference air temperature for elevated canopy surface temperatures during (1) nighttime conditions (subcanopy shortwave radiation is 0 W m-2) and (2) periods of increased subcanopy shortwave radiation >400 W m-2, representing penetration of shortwave radiation through the canopy. Subcanopy shortwave and longwave radiation collected at a single point in the subcanopy over a 24 h clear-sky period was used to calculate a nighttime bulk offset of 3°C for scenario 1 and to develop a multiple linear regression model for scenario 2, using reference air temperature and subcanopy shortwave radiation to predict canopy surface temperature with a root-mean-square error (RMSE) of 0.7°C. Outside of these two scenarios, reference air temperature was used to predict subcanopy incoming longwave radiation. Modeling at 20 radiometer locations throughout two snowmelt seasons using these parameterizations reduced the mean bias and RMSE to below 10 W m-2 at all locations.
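    The two-scenario correction described above can be sketched as a piecewise function. The +3°C nighttime bulk offset comes from the abstract; the regression coefficients for the high-shortwave case are invented placeholders, since the abstract reports only the model form (air temperature plus subcanopy shortwave) and its RMSE:

```python
# Sketch of the piecewise canopy-temperature parameterization described above.
# Coefficients a, b, c are hypothetical placeholders, NOT the paper's fit.

def canopy_surface_temp(air_temp_c, sw_wm2, a=1.0, b=0.02, c=0.5):
    """Estimate canopy surface temperature (degC) from reference air
    temperature and subcanopy shortwave radiation (W m-2)."""
    if sw_wm2 == 0.0:          # scenario 1: nighttime -> bulk +3 degC offset
        return air_temp_c + 3.0
    if sw_wm2 > 400.0:         # scenario 2: strong shortwave penetration
        return c + a * air_temp_c + b * sw_wm2
    return air_temp_c          # otherwise, use air temperature directly

print(canopy_surface_temp(5.0, 0.0))    # 8.0  (nighttime offset)
print(canopy_surface_temp(5.0, 600.0))  # 17.5 (placeholder regression)
print(canopy_surface_temp(5.0, 200.0))  # 5.0  (air temperature used)
```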

  14. Micro-Logistics Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Cirillo, William; Stromgren, Chel; Galan, Ricardo

    2008-01-01

    Traditionally, logistics analysis for space missions has focused on the delivery of elements and goods to a destination. This type of logistics analysis can be referred to as "macro-logistics". While the delivery of goods is a critical component of mission analysis, it captures only a portion of the constraints that logistics planning may impose on a mission scenario. The other component of logistics analysis concerns the local handling of goods at the destination, including storage, usage, and disposal. This type of logistics analysis, referred to as "micro-logistics", may also be a primary driver in the viability of a human lunar exploration scenario. With the rigorous constraints that will be placed upon a human lunar outpost, it is necessary to accurately evaluate micro-logistics operations in order to develop exploration scenarios that will result in an acceptable level of system performance.

  15. Assessment of riverine load of contaminants to European seas under policy implementation scenarios: an example with 3 pilot substances.

    PubMed

    Marinov, Dimitar; Pistocchi, Alberto; Trombetti, Marco; Bidoglio, Giovanni

    2014-01-01

    An evaluation of conventional emission scenarios is carried out targeting a possible impact of European Union (EU) policies on riverine loads to the European seas for 3 pilot pollutants: lindane, trifluralin, and perfluorooctane sulfonate (PFOS). The policy scenarios are investigated to the time horizon of year 2020, starting from chemical-specific reference conditions and considering different types of regulatory measures, including business as usual (BAU), current trend (CT), partial implementation (PI), or a complete ban (PI ban) of emissions. The scenario analyses show that the model-estimated lindane load of 745 t to European seas in 1995, based on the official emission data, would be reduced by 98.3% to approximately 12.5 t in 2005 (BAU scenario), 10 years after the start of the EU regulation of this chemical. The CT and PI ban scenarios indicate a reduction of sea loads of lindane in 2020 by 74% and 95%, respectively, when compared to the BAU estimate. For trifluralin, an annual load of approximately 61.7 t is estimated for the baseline year 2003 (BAU scenario), albeit under conservative assumptions made to account for the limited availability of pesticide use data in Europe. Under the PI (ban) scenario, assuming only small residual emissions of trifluralin, we estimate a sea loading of approximately 0.07 t/y. For PFOS, the total sea load from all European countries is estimated at approximately 5.8 t/y for the reference year 2007 (BAU scenario). Reducing the total load of PFOS below 1 t/y requires emissions to be reduced by 84%. The analysis of conventional scenarios or scenario typologies for emissions of contaminants using simple spatially explicit GIS-based models is suggested as a viable, affordable exercise that may support the assessment of policy implementation and the identification or negotiation of emission reduction targets. © 2013 SETAC.

  16. Solar power satellite system definition study, phase 2. Volume 2: Reference system description

    NASA Technical Reports Server (NTRS)

    1979-01-01

    System descriptions and cost estimates for the reference system of the solar power satellite program are presented. The reference system is divided into five principal elements: the solar power satellites; space construction and support; space and ground transportation; ground receiving stations; and operations control. The program scenario and non-recurring costs are briefly described.

  17. The Volta Grande do Xingu: reconstruction of past environments and forecasting of future scenarios of a unique Amazonian fluvial landscape

    NASA Astrophysics Data System (ADS)

    Sawakuchi, A. O.; Hartmann, G. A.; Sawakuchi, H. O.; Pupim, F. N.; Bertassoli, D. J.; Parra, M.; Antinao, J. L.; Sousa, L. M.; Sabaj Pérez, M. H.; Oliveira, P. E.; Santos, R. A.; Savian, J. F.; Grohmann, C. H.; Medeiros, V. B.; McGlue, M. M.; Bicudo, D. C.; Faustino, S. B.

    2015-12-01

    The Xingu River is a large clearwater river in eastern Amazonia and its downstream sector, known as the Volta Grande do Xingu ("Xingu Great Bend"), is a unique fluvial landscape that plays an important role in the biodiversity, biogeochemistry and prehistoric and historic peopling of Amazonia. The sedimentary dynamics of the Xingu River in the Volta Grande and its downstream sector will be shifted in the next few years due to the construction of dams associated with the Belo Monte hydropower project. Impacts on river biodiversity and carbon cycling are anticipated, especially due to likely changes in sedimentation and riverbed characteristics. This research project aims to define the geological and climate factors responsible for the development of the Volta Grande landscape and to track its environmental changes during the Holocene, using the modern system as a reference. In this context, sediment cores, riverbed rock and sediment samples and greenhouse gas (GHG) samples were collected in the Volta Grande do Xingu and adjacent upstream and downstream sectors. The reconstruction of past conditions in the Volta Grande is necessary for forecasting future scenarios and defining biodiversity conservation strategies under the operation of Belo Monte dams. This paper describes the scientific questions of the project and the sampling surveys performed by an international team of Earth scientists and biologists during the dry seasons of 2013 and 2014. Preliminary results are presented and a future workshop is planned to integrate results, present data to the scientific community and discuss possibilities for deeper drilling in the Xingu ria to extend the sedimentary record of the Volta Grande do Xingu.

  18. Accuracy of estimation of genomic breeding values in pigs using low-density genotypes and imputation.

    PubMed

    Badke, Yvonne M; Bates, Ronald O; Ernst, Catherine W; Fix, Justin; Steibel, Juan P

    2014-04-16

    Genomic selection has the potential to increase genetic progress. Genotype imputation of high-density single-nucleotide polymorphism (SNP) genotypes can improve the cost efficiency of genomic breeding value (GEBV) prediction for pig breeding. Consequently, the objectives of this work were to: (1) estimate accuracy of genomic evaluation and GEBV for three traits in a Yorkshire population and (2) quantify the loss of accuracy of genomic evaluation and GEBV when genotypes were imputed under two scenarios: a high-cost, high-accuracy scenario in which only selection candidates were imputed from a low-density platform and a low-cost, low-accuracy scenario in which all animals were imputed using a small reference panel of haplotypes. Phenotypes and genotypes obtained with the PorcineSNP60 BeadChip were available for 983 Yorkshire boars. Genotypes of selection candidates were masked and imputed using tagSNP in the GeneSeek Genomic Profiler (10K). Imputation was performed with BEAGLE using 128 or 1800 haplotypes as reference panels. GEBV were obtained through an animal-centric ridge regression model using de-regressed breeding values as response variables. Accuracy of genomic evaluation was estimated as the correlation between estimated breeding values and GEBV in a 10-fold cross validation design. Accuracy of genomic evaluation using observed genotypes was high for all traits (0.65-0.68). Using genotypes imputed from a large reference panel (accuracy: R(2) = 0.95) for genomic evaluation did not significantly decrease accuracy, whereas a scenario with genotypes imputed from a small reference panel (R(2) = 0.88) did show a significant decrease in accuracy. Genomic evaluation based on imputed genotypes in selection candidates can be implemented at a fraction of the cost of a genomic evaluation using observed genotypes and still yield virtually the same accuracy. 
Conversely, using a very small reference panel of haplotypes to impute both training animals and selection candidates results in lower accuracy of genomic evaluation.
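
The evaluation pipeline described above (ridge-regression GEBV scored by 10-fold cross-validation, with accuracy taken as the correlation between observed breeding values and GEBV) can be sketched on synthetic data. The function `gebv_cv_accuracy` and its shrinkage parameter `lam` are illustrative assumptions, not the study's implementation.

```python
import numpy as np

def gebv_cv_accuracy(G, y, lam=10.0, n_folds=10, seed=0):
    """Toy genomic-evaluation accuracy: ridge-regression GEBV evaluated by
    n-fold cross-validation.  G is an (animals x SNPs) genotype matrix and
    y the (de-regressed) breeding values; lam is a hypothetical shrinkage
    parameter.  Returns the correlation between y and cross-validated GEBV."""
    rng = np.random.default_rng(seed)
    n = len(y)
    idx = rng.permutation(n)
    folds = np.array_split(idx, n_folds)
    preds = np.empty(n)
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        Gt, yt = G[train], y[train]
        # Ridge solution: beta = (G'G + lam*I)^-1 G'y
        beta = np.linalg.solve(Gt.T @ Gt + lam * np.eye(G.shape[1]), Gt.T @ yt)
        preds[fold] = G[fold] @ beta
    return float(np.corrcoef(y, preds)[0, 1])
```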

  19. Mobile, Collaborative Situated Knowledge Creation for Urban Planning

    PubMed Central

    Zurita, Gustavo; Baloian, Nelson

    2012-01-01

    Geo-collaboration is an emerging research area in computer science studying the way spatial, geographically referenced information and communication technologies can support collaborative activities. Scenarios in which information associated with a physical location is of paramount importance are often referred to as situated knowledge creation scenarios. To date, few computer systems supporting knowledge creation explicitly incorporate physical context as part of the knowledge being managed in mobile face-to-face scenarios. This work presents a collaborative software application that supports visually geo-referenced knowledge creation in mobile working scenarios while users interact face-to-face. The system manages data and information associated with specific physical locations for knowledge creation processes in the field (e.g., urban planning, identification of specific physical locations, territorial management), using Tablet-PCs and GPS to geo-reference data and information. It presents a model for developing mobile applications supporting situated knowledge creation in the field, introducing the requirements for such an application and the functionalities needed to fulfill them. The paper also presents the results of utility and usability evaluations. PMID:22778639

  1. [Effects of sampling plot number on tree species distribution prediction under climate change].

    PubMed

    Liang, Yu; He, Hong-Shi; Wu, Zhi-Wei; Li, Xiao-Na; Luo, Xu

    2013-05-01

    Based on neutral landscapes with different degrees of landscape fragmentation, this paper studied the effects of sampling plot number on the prediction of tree species distribution at landscape scale under climate change. The tree species distribution was predicted by a coupled modeling approach linking an ecosystem process model with a forest landscape model, and three contingent scenarios and one reference scenario of sampling plot numbers were assumed. The differences between the three scenarios and the reference scenario under different degrees of landscape fragmentation were tested. The results indicated that the effects of sampling plot number on the prediction of tree species distribution depended on the species' life-history attributes. For generalist species, predicting distribution at landscape scale required more plots. Except for extreme specialists, the degree of landscape fragmentation also influenced the effect of sampling plot number on the prediction. As the simulation period lengthened, the effects of sampling plot number on the prediction of tree species distribution at landscape scale could change. For generalist species, more plots are needed for long-term simulation.

  2. Automated Construction of Molecular Active Spaces from Atomic Valence Orbitals.

    PubMed

    Sayfutyarova, Elvira R; Sun, Qiming; Chan, Garnet Kin-Lic; Knizia, Gerald

    2017-09-12

    We introduce the atomic valence active space (AVAS), a simple and well-defined automated technique for constructing active orbital spaces for use in multiconfiguration and multireference (MR) electronic structure calculations. Concretely, the technique constructs active molecular orbitals capable of describing all relevant electronic configurations emerging from a targeted set of atomic valence orbitals (e.g., the metal d orbitals in a coordination complex). This is achieved via a linear transformation of the occupied and unoccupied orbital spaces from an easily obtainable single-reference wave function (such as a Hartree-Fock or Kohn-Sham calculation) based on projectors onto targeted atomic valence orbitals. We discuss the premises, theory, and implementation of the idea, and several of its variations are tested. To investigate the performance and accuracy, we calculate the excitation energies for various transition-metal complexes in typical application scenarios. Additionally, we follow the homolytic bond breaking process of a Fenton reaction along its reaction coordinate. While the described AVAS technique is not a universal solution to the active space problem, its premises are fulfilled in many application scenarios of transition-metal chemistry and bond dissociation processes. In these cases the technique makes MR calculations easier to execute, easier to reproduce by any user, and simplifies the determination of the appropriate size of the active space required for accurate results.
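
The projector-based construction can be illustrated with a minimal linear-algebra sketch. The function `avas_select` and its threshold value are hypothetical simplifications of the published procedure, which additionally treats occupied and unoccupied orbital spaces separately; here a single orbital block is rotated for brevity.

```python
import numpy as np

def avas_select(C, S, target_ao_idx, threshold=0.2):
    """Sketch of the AVAS idea: rotate molecular orbitals C so that overlap
    with a set of target atomic valence orbitals is concentrated in a few
    orbitals, then keep those above a (hypothetical) threshold.
    C: AO-coefficient matrix of the orbitals; S: AO overlap matrix;
    target_ao_idx: indices of the targeted atomic valence orbitals."""
    # Projector onto the span of the target AOs:
    # P = S[:, t] S_tt^{-1} S[t, :]
    St = S[np.ix_(target_ao_idx, target_ao_idx)]
    Sm = S[:, target_ao_idx]
    P = Sm @ np.linalg.solve(St, Sm.T)
    # Eigendecompose C' P C; eigenvalues measure each rotated orbital's
    # overlap with the target valence space.
    w, U = np.linalg.eigh(C.T @ P @ C)
    C_rot = C @ U
    active = w > threshold
    return C_rot[:, active], w[active]
```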

  3. Reference scenarios for deforestation and forest degradation in support of REDD: a review of data and methods

    NASA Astrophysics Data System (ADS)

    Olander, Lydia P.; Gibbs, Holly K.; Steininger, Marc; Swenson, Jennifer J.; Murray, Brian C.

    2008-04-01

    Global climate policy initiatives are now being proposed to compensate tropical forest nations for reducing carbon emissions from deforestation and forest degradation (REDD). These proposals have the potential to include developing countries more actively in international greenhouse gas mitigation and to address a substantial share of the world's emissions, which come from tropical deforestation. For such a policy to be viable it must have a credible benchmark against which emission reductions can be calculated. This benchmark, sometimes termed a baseline or reference emissions scenario, can be based directly on historical emissions or can use historical emissions as input for business-as-usual projections. Here, we review existing data and methods that could be used to measure historical deforestation and forest degradation for reference scenarios, including FAO (Food and Agricultural Organization of the United Nations) national statistics and various remote sensing sources. The freely available and corrected global Landsat imagery for 1990, 2000 and, soon, 2005 may be the best primary data source for most developing countries, with other coarser-resolution, high-frequency or radar data as a valuable complement for addressing problems with cloud cover and for distinguishing larger-scale degradation. While sampling of imagery has proved useful for pan-tropical and continental estimates of deforestation, wall-to-wall (full) coverage allows more detailed assessments for measuring national-level reference emissions. It is possible to measure historical deforestation with sufficient certainty for determining reference emissions, but there must be continued calls at the international level for making high-resolution imagery available, and for financial and technical assistance to help countries determine credible reference scenarios. 
The data available for past years may not be sufficient for assessing all forms of forest degradation, but new data sources will have greater potential in 2007 and after. This paper focuses only on the methods for measuring changes in forest area, but this information must be coupled with estimates of change in forest carbon stocks in order to quantify emissions from deforestation and forest degradation.
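
The two benchmark constructions mentioned above (a reference level based directly on historical emissions, or a business-as-usual projection that uses historical emissions as input) can be illustrated with a toy sketch assuming a simple linear trend; the function name and interface are hypothetical.

```python
import numpy as np

def reference_emissions(history, horizon):
    """Two simple reference-scenario constructions for emissions from
    deforestation (illustrative only): (1) the historical average, and
    (2) a business-as-usual projection from a linear fit to history.
    history: annual emissions for consecutive past years;
    horizon: number of future years to project."""
    years = np.arange(len(history))
    hist_avg = float(np.mean(history))
    # np.polyfit returns coefficients highest degree first: [slope, intercept]
    slope, intercept = np.polyfit(years, history, 1)
    bau = [float(intercept + slope * (len(history) + k)) for k in range(horizon)]
    return hist_avg, bau
```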

  4. Low carbon and clean energy scenarios for India: Analysis of targets approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shukla, Priyadarshi R.; Chaturvedi, Vaibhav

    2012-12-01

    Low carbon energy technologies are gaining increasing importance in India for reducing emissions as well as diversifying its energy supply mix. The present paper presents and analyses a targeted approach for pushing solar, wind and nuclear technologies in the Indian energy market. Targets for these technologies have been constructed on the basis of Indian government documents, policy announcements and expert opinion. Different targets have been set for the reference scenario and the carbon price scenario. In the reference scenario, it is found that in the long run solar, wind and nuclear will all achieve their targets without any subsidy push. In the short run, however, nuclear and solar energy require a significant subsidy push. Nuclear energy requires a much higher subsidy allocation than solar because the targets assumed for nuclear energy are also higher. Under a carbon price scenario, the carbon price drives the penetration of these technologies significantly; subsidy is still required, especially in the short run when the carbon price is low. It is also found that pushing solar, wind and nuclear technologies might decrease the share of CCS under the price scenario and of biomass under both the BAU and price scenarios, implying that one set of low carbon technologies is substituted for another. Thus the objective of emission mitigation might not be achieved due to this substitution. Moreover, a sensitivity analysis on nuclear energy cost, representing risk mitigation for this technology, found that higher cost can significantly decrease the share of this technology under both the BAU and carbon price scenarios.

  5. Dual Mission Scenarios for the Human Lunar Campaign - Performance, Cost and Risk Benefits

    NASA Technical Reports Server (NTRS)

    Saucillo, Rudolph J.; Reeves, David M.; Chrone, Jonathan D.; Stromgren, Chel; Reeves, John D.; North, David D.

    2008-01-01

    Scenarios for human lunar operations with capabilities significantly beyond Constellation Program baseline missions are potentially feasible based on the concept of dual, sequential missions utilizing a common crew and a single Ares I/CEV (Crew Exploration Vehicle). For example, scenarios possible within the scope of baseline technology planning include outpost-based sortie missions and dual sortie missions. Top-level cost benefits of these dual sortie scenarios may be estimated by comparison to the Constellation Program reference two-mission-per-year lunar campaign. The primary cost benefit is the accomplishment of Mission B with a "single launch solution," since no Ares I launch is required. Cumulative risk to the crew is lowered since crew exposure to launch and Earth return risks is reduced versus comparable Constellation Program reference two-mission-per-year scenarios. Payload-to-the-lunar-surface capability is substantially increased in the Mission B sortie as a result of additional propellant available for Lunar Lander #2 descent. This additional propellant is a result of EDS #2 transferring a smaller stack through trans-lunar injection and using the remaining propellant to perform a portion of the lunar orbit insertion (LOI) maneuver. This paper describes these dual mission concepts, including cost, risk and performance benefits per lunar sortie site, and provides an initial feasibility assessment.

  6. Deep water tsunami simulation at global scale using an elastoacoustic approach

    NASA Astrophysics Data System (ADS)

    Salazar Monroy, E. F.; Ramirez-Guzman, L.; Bielak, J.; Sanchez-Sesma, F. J.

    2017-12-01

    In this work, we present results from the first stage of a global tsunami simulation project using an elastoacoustic approach. The solid-fluid interaction, which is valid only at global scale and at large distances from the coast, is modelled using a finite element scheme for a 2D geometry. Comparing analytic and numerical solutions, we observe a good fit for a homogeneous domain, with an extension of 20 km, using 15 points per wavelength. Subsequently, we performed 2D realizations taking a section from a global 3D model and projecting the Tohoku-Oki source obtained by the USGS. The 3D global model uses ETOPO1 and the Preliminary Reference Earth Model (Dziewonski and Anderson, 1981). We analysed 3 cross sections, defined using DART buoys as references for each section (i.e., initial and final profile points). Surface water elevation obtained with this coupling strategy is constrained to low frequencies (0.2 Hz). We expect that this coupling strategy can be extended to higher frequencies and more realistic scenarios considering other geometries (i.e., 3D) and a complete domain (i.e., surface and deep).

  7. Assessment of Effectiveness of Geologic Isolation Systems: REFERENCE SITE INITIAL ASSESSMENT FOR A SALT DOME REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harwell, M. A.; Brandstetter, A.; Benson, G. L.

    1982-06-01

    As a methodology demonstration for the Office of Nuclear Waste Isolation (ONWI), the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program conducted an initial reference site analysis of the long-term effectiveness of a salt dome repository. The Hainesville Salt Dome in Texas was chosen to be representative of the Gulf Coast interior salt domes; however, the Hainesville Site has been eliminated as a possible nuclear waste repository site. The data used for this exercise are not adequate for an actual assessment, nor have all the parametric analyses been made that would adequately characterize the response of the geosystem surrounding the repository. Additionally, because this was the first exercise of the complete AEGIS and WASTE Rock Interaction Technology (WRIT) methodology, this report provides the initial opportunity for the methodology, specifically applied to a site, to be reviewed by the community outside the AEGIS. The scenario evaluation, as a part of the methodology demonstration, involved consideration of a large variety of potentially disruptive phenomena, which alone or in concert could lead to a breach in a salt dome repository and to a subsequent transport of the radionuclides to the environment. Without waste- and repository-induced effects, no plausible natural geologic events or processes which would compromise the repository integrity could be envisioned over the one-million-year time frame after closure. Near-field (waste- and repository-induced) effects were excluded from consideration in this analysis, but they can be added in future analyses when that methodology development is more complete. The potential for consequential human intrusion into salt domes within a million-year time frame led to the consideration of a solution mining intrusion scenario. The AEGIS staff developed a specific human intrusion scenario at 100 years and 1000 years post-closure, which is one of a whole suite of possible scenarios. 
This scenario resulted in the delivery of radionuclide-contaminated brine to the surface, where a portion was diverted to culinary salt for direct ingestion by the existing population. Consequence analyses indicated calculated human doses that would be highly deleterious. Additional analyses indicated that doses well above background would occur from such a scenario, even if it occurred a million years into the future. The way to preclude such an intrusion is continued control over the repository site, either through direct institutional control or through the effective passive transfer of information. A secondary aspect of the specific human intrusion scenario involved a breach through the side of the salt dome, through which radionuclides migrated via the ground-water system to the accessible environment. This provided a demonstration of the geotransport methodology that AEGIS can use in actual site evaluations, as well as the WRIT program's capabilities with respect to defining the source term and retardation rates of the radionuclides in the repository. This reference site analysis was initially published as a Working Document in December 1979. That version was distributed for a formal peer review by individuals and organizations not involved in its development. The present report represents a revision, based in part on the responses received from the external reviewers. Summaries of the comments from the reviewers and responses to these comments by the AEGIS staff are presented. The exercise of the AEGIS methodology was successful in demonstrating the methodology, and thus, in providing a basis for substantive peer review, in terms of further development of the AEGIS site-applications capability and in terms of providing insight into the potential for consequential human intrusion into a salt dome repository.

  9. Revisiting the taxonomical classification of Porcine Circovirus type 2 (PCV2): still a real challenge.

    PubMed

    Franzo, Giovanni; Cortey, Martí; Olvera, Alex; Novosel, Dinko; Castro, Alessandra Marnie Martins Gomes De; Biagini, Philippe; Segalés, Joaquim; Drigo, Michele

    2015-08-28

    PCV2 has emerged as one of the most devastating viral infections in swine farming, causing a relevant economic impact due to direct losses and the expense of control strategies. Epidemiological and experimental studies have shown that genetic diversity potentially affects the virulence of PCV2. The growing number of PCV2 complete genomes and partial sequences available in GenBank called the accepted PCV2 classification into question. Nine hundred seventy-five PCV2 complete genomes and 1,270 ORF2 sequences available from GenBank were subjected to recombination, PASC and phylogenetic analyses, and the results were compared with the previous classification scheme. The outcome of these analyses favors the recognition of four genotypes on the basis of ORF2 sequences, namely PCV2a, PCV2b, PCV2c and PCV2d-mPCV2b. To deal with the difficulty of establishing an unambiguous classification, and given the impossibility of defining a p-distance cut-off, a set of reference sequences was established that can be used in further phylogenetic studies for PCV2 genotyping. Because extensive phylogenetic analyses are time-consuming and often impracticable during routine diagnostic activity, ORF2 nucleotide positions adequately conserved in the reference sequences were identified and reported to allow quick genotype differentiation. Globally, the present work provides an updated scenario of PCV2 genotype distribution and, based on the limits of the previous classification criteria, proposes new rapid and effective schemes for differentiating the four defined PCV2 genotypes.
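
The quick genotype differentiation based on conserved ORF2 nucleotide positions can be sketched as a simple lookup. The positions and bases below are hypothetical placeholders, not the diagnostic positions tabulated in the paper.

```python
# Hypothetical marker table: {genotype: {0-based ORF2 position: expected base}}.
# The real diagnostic positions are those reported in the study.
MARKERS = {
    "PCV2a": {5: "A", 20: "C"},
    "PCV2b": {5: "G", 20: "C"},
    "PCV2c": {5: "A", 20: "T"},
    "PCV2d": {5: "G", 20: "T"},
}

def call_genotype(orf2_seq):
    """Return the first genotype whose marker bases all match, else None."""
    for genotype, marker in MARKERS.items():
        if all(len(orf2_seq) > pos and orf2_seq[pos] == base
               for pos, base in marker.items()):
            return genotype
    return None
```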

  10. 40 CFR 63.43 - Maximum achievable control technology (MACT) determinations for constructed and reconstructed...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... determinations for alternative operating scenarios. Approval of such determinations satisfies the requirements of section 112(g) of each such scenario. (4) Regardless of the review process, the MACT emission limitation... determined by the permitting authority. (2) Based upon available information, as defined in this subpart, the...

  11. Deep Borehole Disposal Remediation Costs for Off-Normal Outcomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finger, John T.; Cochran, John R.; Hardin, Ernest

    2015-08-17

    This memo describes rough-order-of-magnitude (ROM) cost estimates for a set of off-normal (accident) scenarios, as defined for two waste package emplacement method options for deep borehole disposal: drill-string and wireline. It summarizes the different scenarios and the assumptions made for each, with respect to fishing, decontamination, remediation, etc.

  12. Defining a Simulation Capability Hierarchy for the Modeling of a SeaBase Enabler (SBE)

    DTIC Science & Technology

    2010-09-01

    ability to maintain the sea lanes of communication. Relief efforts in crisis-stricken countries like India in 2007, Aceh Indonesia and Sri Lanka in...the number of entities that were built into the scenario run for each category. 104 Advanced Scenario Results Speed Cargo Rate Escorts SURF

  13. A medical digital library to support scenario and user-tailored information retrieval.

    PubMed

    Chu, W W; Johnson, D B; Kangarloo, H

    2000-06-01

    Current large-scale information sources are designed to support general queries and lack the ability to support scenario-specific information navigation, gathering, and presentation. As a result, users are often unable to obtain desired specific information within a well-defined subject area. Today's information systems do not provide efficient content navigation, incremental appropriate matching, or content correlation. We are developing the following innovative technologies to remedy these problems: 1) scenario-based proxies, enabling the gathering and filtering of information customized for users within a pre-defined domain; 2) context-sensitive navigation and matching, providing approximate matching and similarity links when an exact match to a user's request is unavailable; 3) content correlation of documents, creating semantic links between documents and information sources; and 4) user models for customizing retrieved information and result presentation. A digital medical library is currently being constructed using these technologies to provide customized information for the user. The technologies are general in nature and can provide custom and scenario-specific information in many other domains (e.g., crisis management).

  14. Perioperative fluid therapy: defining a clinical algorithm between insufficient and excessive.

    PubMed

    Strunden, Mike S; Tank, Sascha; Kerner, Thoralf

    2016-12-01

    In the perioperative scenario, adequate fluid and volume therapy is a challenging task. Despite improved knowledge on the physiology of the vascular barrier function and its respective pathophysiologic disturbances during the perioperative process, clear-cut therapeutic principles are difficult to implement. Neglecting the physiologic basis of the vascular barrier and the cardiovascular system, numerous studies proclaiming different approaches to fluid and volume therapy do not provide a rationale, as various surgical and patient risk groups, and different fluid regimens combined with varying hemodynamic measures and variable algorithms led to conflicting results. This review refers to the physiologic basis and answers questions inseparably conjoined to a rational approach to perioperative fluid and volume therapy: Why does fluid get lost from the vasculature perioperatively? Whereto does it get lost? Based on current findings and rationale considerations, which fluid replacement algorithm could be implemented into clinical routine? Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Ecological scenarios analyzed and evaluated by a shallow lake model.

    PubMed

    Kardaetz, Sascha; Strube, Torsten; Brüggemann, Rainer; Nützmann, Gunnar

    2008-07-01

We applied the complex ecosystem model EMMO, adapted to the shallow lake Müggelsee (Germany), in order to evaluate a large set of ecological scenarios. By means of EMMO, 33 scenarios and 17 indicators were defined to characterize their effects on the lake ecosystem. The indicators were based on model outputs of EMMO and can be separated into biological indicators, such as chlorophyll-a and cyanobacteria, and hydro-chemical indicators, such as phosphorus. The questions to be solved were: what is the ranking of the scenarios based on their characterization by these 17 indicators, and how can we handle large quantities of complex data within evaluation procedures? The scenario evaluation was performed by partial order theory, which, however, did not provide a clear result. By subsequently applying hierarchical cluster analysis (complete linkage) it was possible to reduce the data matrix to indicator and scenario representatives. Even though this step implies a loss of information, it simplifies the application of partial order theory and the post-processing by METEOR. METEOR is derived from partial order theory and allows the stepwise aggregation of indicators, which subsequently leads to a distinct and clear decision. In the final evaluation result, the best scenario was the one that defines a minimum nutrient input and no phosphorus release from the sediment, while the worst scenario is characterized by a maximum nutrient input and extensive phosphorus release from the sediment. The reasonable and comprehensive results show that the combination of partial order, cluster analysis and METEOR can handle large amounts of data in a very clear and transparent way, and is therefore well suited to the context of complex ecosystem models like the one we applied.
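    The pairwise dominance test at the heart of partial order theory can be sketched in a few lines; the indicator vectors below are hypothetical, with "lower is better" assumed for all indicators (e.g. chlorophyll-a, phosphorus):

```python
# Pairwise dominance for scenario indicator vectors (lower is better):
# A dominates B if A is no worse on every indicator and strictly better
# on at least one. Incomparable pairs are what keep the order "partial".
def dominates(a, b):
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

# Three hypothetical scenarios described by two indicators each.
low_input  = (0.2, 0.1)  # minimum nutrient input, no sediment P release
mixed      = (0.2, 0.5)
high_input = (0.9, 0.8)  # maximum nutrient input, extensive P release
```

    With 17 indicators, many scenario pairs end up incomparable (each better on some indicators), which is why the authors needed cluster analysis and METEOR's stepwise aggregation to reach a clear ranking.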

  16. Greenhouse gas emissions and land use change from Jatropha curcas-based jet fuel in Brazil.

    PubMed

    Bailis, Robert E; Baka, Jennifer E

    2010-11-15

This analysis presents a comparison of life-cycle GHG emissions from synthetic paraffinic kerosene (SPK) produced as a jet fuel substitute from Jatropha curcas feedstock cultivated in Brazil against a reference scenario of conventional jet fuel. Life cycle inventory data are derived from surveys of actual Jatropha growers and processors. Results indicate that a baseline scenario, which assumes a medium yield of 4 tons of dry fruit per hectare under drip irrigation with existing logistical conditions, uses energy-based coproduct allocation, and assumes a 20-year plantation lifetime with no direct land use change (dLUC), results in emissions of 40 kg CO₂e per GJ of fuel produced, a 55% reduction relative to conventional jet fuel. However, dLUC based on observations of land-use transitions leads to widely varying changes in carbon stocks, ranging from losses in excess of 50 tons of carbon per hectare when Jatropha is planted in native cerrado woodlands to gains of 10-15 tons of carbon per hectare when Jatropha is planted on former agro-pastoral land. Thus, aggregate emissions vary from a low of 13 kg CO₂e per GJ when Jatropha is planted on former agro-pastoral lands, an 85% decrease from the reference scenario, to 141 kg CO₂e per GJ when Jatropha is planted in cerrado woodlands, a 60% increase over the reference scenario. Additional sensitivities are also explored, including changes in yield, exclusion of irrigation, shortened supply chains, and alternative allocation methodologies.
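    As a quick consistency check on the figures quoted above, the percentage changes follow from an implied conventional-jet-fuel baseline of roughly 89 kg CO₂e per GJ, an assumption back-calculated from the abstract's own numbers rather than a sourced value:

```python
# Back-checking the reported savings against an assumed conventional
# jet fuel baseline of ~89 kg CO2e/GJ (inferred, not a sourced value).
BASELINE = 89.0  # kg CO2e per GJ, assumed reference scenario

def pct_change(scenario_kg_per_gj, baseline=BASELINE):
    """Percent change in emissions relative to the reference scenario."""
    return 100.0 * (scenario_kg_per_gj - baseline) / baseline

for label, kg in [("baseline (drip-irrigated, no dLUC)", 40.0),
                  ("former agro-pastoral land", 13.0),
                  ("cerrado woodland conversion", 141.0)]:
    print(f"{label}: {pct_change(kg):+.0f}%")
```

    Under this assumed baseline the three cases come out near -55%, -85%, and +58%, in line with the abstract's reported 55% and 85% reductions and roughly 60% increase.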

  17. Responding to cough presentations: an interview study with Cambodian pharmacies participating in a National Tuberculosis Referral Program.

    PubMed

    Bell, Carolyn A; Pichenda, Koeut; Ilomäki, Jenni; Duncan, Gregory J; Eang, Mao Tan; Saini, Bandana

    2016-04-01

Asia-Pacific carries a high burden of respiratory-related mortality. Timely referral and detection of tuberculosis cases optimizes patient and public health outcomes. Registered private pharmacies in Cambodia participate in a National Tuberculosis Referral Program to refer clients with cough suggestive of tuberculosis to public-sector clinics for diagnosis and care. The objective of this study was to investigate the clinical intentions of pharmacy staff when presented with a hypothetical case of a client with prolonged cough suggestive of tuberculosis. A random sample of 180 pharmacies was selected. Trained interviewers administered a hypothetical case scenario to trained pharmacy staff. Participants provided 'yes'/'no' responses to five clinical actions presented in the scenario. Actions were not mutually exclusive. Data were tabulated and compared using chi-square tests or Fisher's exact tests. Overall, 156 (92%) participants would have referred the symptomatic client in the case scenario. Participants who would have referred the client were less likely to sell a cough medicine (42% vs. 100%, P < 0.001) and less likely to sell an antibiotic (19% vs. 79%, P < 0.001) than those who would not have referred the client. Involving pharmacies in a Referral Program may have introduced concepts of appropriate clinical care when responding to clients presenting with cough suggestive of tuberculosis. However, the results showed that enhancing clinical competence among all Referral Program participants, particularly among non-referring pharmacies and those making concurrent sales of cough-related products, would optimize pharmacy-initiated referral. Further research into actual clinical practices at Referral Program pharmacies would be justified. © 2015 John Wiley & Sons, Ltd.
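    The 2x2 comparisons reported above (referral status versus concurrent sales) can be sketched with a Pearson chi-square statistic; the cell counts below are reconstructed from the abstract's percentages and are illustrative only, not the study data:

```python
# Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]].
def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Counts reconstructed from the abstract's percentages (illustrative):
# ~42% of the 156 referrers (~66) vs. 100% of the 24 non-referrers (24)
# would have sold a cough medicine.
stat = chi_square_2x2(66, 90, 24, 0)
print(f"chi-square = {stat:.1f}")  # well above the 10.83 cutoff for P < 0.001
```

    With a zero cell, Fisher's exact test would be preferred in practice, which is presumably why the authors report both tests.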

  18. Uncertainty in training image-based inversion of hydraulic head data constrained to ERT data: Workflow and case study

    NASA Astrophysics Data System (ADS)

    Hermans, Thomas; Nguyen, Frédéric; Caers, Jef

    2015-07-01

In inverse problems, investigating uncertainty in the posterior distribution of model parameters is as important as matching data. In recent years, most efforts have focused on techniques to sample the posterior distribution at reasonable computational cost. Within a Bayesian context, this posterior depends on the prior distribution. However, most studies neglect to model the prior with realistic geological uncertainty. In this paper, we propose a workflow inspired by a Popper-Bayes philosophy whereby data should first be used to falsify models, and only then be considered for matching. The workflow consists of three steps: (1) in defining the prior, we interpret multiple alternative geological scenarios from the literature (architecture of facies) and site-specific data (proportions of facies); prior spatial uncertainty is modeled using multiple-point geostatistics, where each scenario is defined using a training image. (2) We validate these prior geological scenarios by simulating electrical resistivity tomography (ERT) data on realizations of each scenario and comparing them to field ERT in a lower-dimensional space; the idea in this second step is to probabilistically falsify scenarios with ERT, meaning that incompatible scenarios receive an updated probability of zero while compatible scenarios receive a nonzero updated belief. (3) We constrain the hydrogeological model with hydraulic head and ERT using a stochastic search method. The workflow is applied to synthetic and field case studies in an alluvial aquifer. This study highlights the importance of considering and estimating prior uncertainty (without data) through a process of probabilistic falsification.

  19. Multiscale Resilience of Complex Systems

    NASA Astrophysics Data System (ADS)

    Tchiguirinskaia, I.; Schertzer, D. J. M.; Giangola-Murzyn, A.; Hoang Cong, T.

    2014-12-01

We first argue the need for well-defined resilience metrics to better evaluate the resilience of complex systems such as (peri-)urban flood management systems. We review both the successes and limitations of resilience metrics in the framework of dynamical systems and their generalization in the framework of viability theory. We then point out that the most important step is to define resilience across scales instead of at a single scale. Our preliminary, critical analysis of the series of attempts to define operational resilience metrics led us to consider a scale-invariant metric based on the scale-independent codimension of extreme singularities. Multifractal downscaling of climate scenarios can be considered as a first illustration. We focused on a flood scenario evaluation method based on two singularities, γ_s and γ_Max, corresponding respectively to an effective and a probable maximum singularity, which yield an innovative framework to address the issues of flood resilience systems in a scale-independent manner. Indeed, the stationarity of the universal multifractal parameters would result in a rather stable value of the probable maximum singularity γ_s. By fixing the limit of acceptability for a maximum flood water depth at a given scale, with a corresponding singularity, we effectively fix the threshold of the probable maximum singularity γ_s as a criterion of the flood resilience we accept. Then various scenarios of flood-resilient measures could be simulated with the help of Multi-Hydro under upcoming climate scenarios. The scenarios that result in estimates of either γ_Max or γ_s below the pre-selected γ_s value will assure the effective flood resilience of the whole modeled system across scales. The research for this work was supported, in part, by the EU FP7 SMARTesT and INTERREG IVB RainGain projects.

  20. A statistically robust EEG re-referencing procedure to mitigate reference effect

    PubMed Central

    Lepage, Kyle Q.; Kramer, Mark A.; Chu, Catherine J.

    2014-01-01

    Background The electroencephalogram (EEG) remains the primary tool for diagnosis of abnormal brain activity in clinical neurology and for in vivo recordings of human neurophysiology in neuroscience research. In EEG data acquisition, voltage is measured at positions on the scalp with respect to a reference electrode. When this reference electrode responds to electrical activity or artifact all electrodes are affected. Successful analysis of EEG data often involves re-referencing procedures that modify the recorded traces and seek to minimize the impact of reference electrode activity upon functions of the original EEG recordings. New method We provide a novel, statistically robust procedure that adapts a robust maximum-likelihood type estimator to the problem of reference estimation, reduces the influence of neural activity from the re-referencing operation, and maintains good performance in a wide variety of empirical scenarios. Results The performance of the proposed and existing re-referencing procedures are validated in simulation and with examples of EEG recordings. To facilitate this comparison, channel-to-channel correlations are investigated theoretically and in simulation. Comparison with existing methods The proposed procedure avoids using data contaminated by neural signal and remains unbiased in recording scenarios where physical references, the common average reference (CAR) and the reference estimation standardization technique (REST) are not optimal. Conclusion The proposed procedure is simple, fast, and avoids the potential for substantial bias when analyzing low-density EEG data. PMID:24975291
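    A minimal sketch of the idea behind robust re-referencing, using the per-sample median across channels as a robust location estimate in place of the channel mean used by the common average reference (CAR). This illustrates the concept only; it is not the paper's maximum-likelihood-type estimator:

```python
from statistics import median

def rereference(channels):
    """channels: equal-length voltage traces, one list per electrode.
    Subtracts a per-sample robust reference (the cross-channel median)."""
    n_samples = len(channels[0])
    ref = [median(ch[t] for ch in channels) for t in range(n_samples)]
    return [[ch[t] - ref[t] for t in range(n_samples)] for ch in channels]

# The fourth channel carries a large artifact; the median-based
# reference is barely moved by it, unlike the CAR's channel mean.
traces = [[1.0, 1.0], [1.2, 0.9], [0.8, 1.1], [50.0, -40.0]]
clean = rereference(traces)
```

    A CAR-style mean reference would be pulled far off by the artifact channel at both samples, contaminating every re-referenced trace; the median stays near the bulk of the channels.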

  1. A new framework for climate sensitivity and prediction: a modelling perspective

    NASA Astrophysics Data System (ADS)

    Ragone, Francesco; Lucarini, Valerio; Lunkeit, Frank

    2016-03-01

The sensitivity of climate models to increasing CO2 concentration and the climate response at decadal time scales are still major factors of uncertainty for the assessment of the long- and short-term effects of anthropogenic climate change. While the relatively slow progress on these issues is partly due to the inherent inaccuracies of numerical climate models, it also hints at the need for stronger theoretical foundations to the problem of studying climate sensitivity and performing climate change predictions with numerical models. Here we demonstrate that it is possible to use Ruelle's response theory to predict the impact of an arbitrary CO2 forcing scenario on the global surface temperature of a general circulation model. Response theory puts the concept of climate sensitivity on firm theoretical grounds, and rigorously addresses the problem of predictability at different time scales. Conceptually, these results show that performing climate change experiments with general circulation models is a well-defined problem from a physical and mathematical point of view. Practically, they show that considering a single CO2 forcing scenario is enough to construct operators able to predict the response of climatic observables to any other CO2 forcing scenario, without the need to perform additional numerical simulations. We also introduce a general relationship between climate sensitivity and climate response at different time scales, thus providing an explicit definition of the inertia of the system at different time scales. This technique also allows for studying systematically, for a large variety of forcing scenarios, the time horizon at which the climate change signal (in an ensemble sense) becomes statistically significant. While what we report here refers to the linear response, the general theory allows for treating nonlinear effects as well. These results pave the way for redesigning and interpreting climate change experiments from a radically new perspective.
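    The core linear-response idea, that a response operator estimated from one forcing scenario predicts the response to any other forcing, can be sketched as a discrete convolution with a Green's function; the kernel and forcing values below are illustrative assumptions, not model output:

```python
# Response of an observable as a convolution of a Green's function with
# a forcing: R[t] = sum_s G[s] * f[t - s]. Kernel and forcing are toy values.
def linear_response(green, forcing):
    return [sum(green[s] * forcing[t - s]
                for s in range(min(t + 1, len(green))))
            for t in range(len(forcing))]

green = [0.5 * 0.8 ** s for s in range(20)]  # decaying response kernel
step = [1.0] * 30                            # step-function CO2 forcing
resp = linear_response(green, step)
# resp rises toward the equilibrium response 0.5 / (1 - 0.8) = 2.5
```

    Once `green` is estimated from one experiment, swapping in any other `forcing` series (a ramp, an overshoot) predicts that scenario's response without rerunning the model, which is the practical claim of the abstract.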

  2. Bridging Scales: Developing a Framework to Build a City-Scale Environmental Scenario for Japanese Municipalities

    NASA Astrophysics Data System (ADS)

    Hashimoto, S.; Fujita, T.; Nakayama, T.; Xu, K.

    2007-12-01

There is an ongoing project on establishing environmental scenarios in Japan to evaluate middle- to long-term environmental policy and technology options toward a low-carbon society. In this project, the time horizon of the scenarios is set at 2050, on the grounds that a large part of the social infrastructure in Japan is likely to be renovated by that time, and cities are expected to play important roles in building a low-carbon society in Japan. This expectation rests on the fact that cities and local governments can implement various policies and programs, such as land-use planning and promotion of new low-GHG technologies, which take effect in a nonuniform manner that accounts for local socio-economic conditions, while higher governments, national or prefectural, can impose environmental taxes on electricity and gas to curb ongoing GHG emissions uniformly across their jurisdictions. For local governments to devise and implement concrete administrative actions equipped with rational policies and technologies, referring to the environmental scenarios developed for the entire nation, we need to localize the national scenarios in both spatial and temporal extent, so that they better reflect local socio-economic and institutional conditions. In localizing the national scenarios, the participation of stakeholders is significant because they play major roles in shaping future society. Stakeholder participation in the localization process would bring both creative and realistic inputs on how the future unfolds at the city scale.
In this research, 1) we reviewed recent efforts on international and domestic scenario development to set a practical time horizon for a city-scale environmental scenario that would lead to concrete environmental policies and programs, 2) designed a participatory scenario development/localization process drawing on the 'Story-and-Simulation' (SAS) framework proposed by Alcamo (2001), and 3) started implementing it in the city of Kawasaki, Kanagawa, Japan, in cooperation with municipal officials and stakeholders. The participatory process is intended to develop city-scale environmental scenarios toward a low-carbon society, referring to international and domestic environmental scenarios. Though the scenario development is still in progress, it has already yielded practical knowledge about, and experience of, how to bridge scenarios developed for different temporal and spatial scales.

  3. Communications payload concepts for geostationary facilities

    NASA Technical Reports Server (NTRS)

    Poley, William A.; Lekan, Jack

    1987-01-01

    Summarized and compared are the major results of two NASA sponsored studies that defined potential communication payload concepts to meet the satellite traffic forecast for the turn of the century for the continental US and Region 2 of the International Telecommunications Union. The studies were performed by the Ford Aerospace and Communications Corporation and RCA Astro-Electronics (now GE-RCA Astro-Space Division). Future scenarios of aggregations of communications services are presented. Payload concepts are developed and defined in detail for nine of the scenarios. Payload costs and critical technologies per payload are also presented. Finally the payload concepts are compared and the findings of the reports are discussed.

  4. Objective comparison of particle tracking methods.

    PubMed

    Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R; Godinez, William J; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E G; Jaldén, Joakim; Blau, Helen M; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P; Dan, Han-Wei; Tsai, Yuh-Show; Ortiz de Solórzano, Carlos; Olivo-Marin, Jean-Christophe; Meijering, Erik

    2014-03-01

    Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Because manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized an open competition in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to notable practical conclusions for users and developers.

  5. Long-term US energy outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friesen, G.

    Chase Econometrics summarizes the assumptions underlying long-term US energy forecasts. To illustrate the uncertainty involved in forecasting for the period to the year 2000, they compare Chase Econometrics forecasts with some recent projections prepared by the DOE Office of Policy, Planning and Analysis for the annual National Energy Policy Plan supplement. Scenario B, the mid-range reference case, is emphasized. The purpose of providing Scenario B as well as Scenarios A and C as alternate cases is to show the sensitivity of oil price projections to small swings in energy demand. 4 tables.

  6. Testbeds for Assessing Critical Scenarios in Power Control Systems

    NASA Astrophysics Data System (ADS)

    Dondossola, Giovanna; Deconinck, Geert; Garrone, Fabrizio; Beitollahi, Hakem

    The paper presents a set of control system scenarios implemented in two testbeds developed in the context of the European Project CRUTIAL - CRitical UTility InfrastructurAL Resilience. The selected scenarios refer to power control systems encompassing information and communication security of SCADA systems for grid teleoperation, impact of attacks on inter-operator communications in power emergency conditions, impact of intentional faults on the secondary and tertiary control in power grids with distributed generators. Two testbeds have been developed for assessing the effect of the attacks and prototyping resilient architectures.

  7. Effects of tree size and spatial distribution on growth of ponderosa pine forests under alternative management scenarios

    Treesearch

    C.W. Woodall; C.E. Fiedler; R.E. McRoberts

    2009-01-01

    Forest ecosystems may be actively managed toward heterogeneous stand structures to provide both economic (e.g., wood production and carbon credits) and environmental benefits (e.g., invasive pest resistance). In order to facilitate wider adoption of possibly more sustainable forest stand structures, defining growth expectations among alternative management scenarios is...

  8. Studying the Ability of 7th Grade Students to Define the Circle and Its Elements in the Context of Mathematical Language

    ERIC Educational Resources Information Center

    Akarsu, Esra; Yilmaz, Süha

    2015-01-01

    In this study, it was aimed to study the mathematical language skills that the 7th grade students use in defining the circle and its elements. In the study, the mathematical language skills of students that they use in defining the circle and its elements in a scenario were compared to the mathematical language skills they use in defining them…

  9. Orbital construction support equipment

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Approximately 200 separate construction steps were defined for the three solar power satellite (SPS) concepts. Detailed construction scenarios were developed which describe the specific tasks to be accomplished, and identify general equipment requirements. The scenarios were used to perform a functional analysis, which resulted in the definition of 100 distinct SPS elements. These elements are the components, parts, subsystems, or assemblies upon which construction activities take place. The major SPS elements for each configuration are shown. For those elements, 300 functional requirements were identified in seven generic processes. Cumulatively, these processes encompass all functions required during SPS construction/assembly. Individually each process is defined such that it includes a specific type of activity. Each SPS element may involve activities relating to any or all of the generic processes. The processes are listed, and examples of the requirements defined for a typical element are given.

  10. A Comparison of the Kernel Equating Method with Traditional Equating Methods Using SAT[R] Data

    ERIC Educational Resources Information Center

    Liu, Jinghua; Low, Albert C.

    2008-01-01

    This study applied kernel equating (KE) in two scenarios: equating to a very similar population and equating to a very different population, referred to as a distant population, using SAT[R] data. The KE results were compared to the results obtained from analogous traditional equating methods in both scenarios. The results indicate that KE results…

  11. The psychometrics of mental workload: multiple measures are sensitive but divergent.

    PubMed

    Matthews, Gerald; Reinerman-Jones, Lauren E; Barber, Daniel J; Abich, Julian

    2015-02-01

A study was run to test the sensitivity of multiple workload indices to the differing cognitive demands of four military monitoring task scenarios and to investigate relationships between indices. Various psychophysiological indices of mental workload exhibit sensitivity to task factors. However, the psychometric properties of multiple indices, including the extent to which they intercorrelate, have not been adequately investigated. One hundred fifty participants performed in four task scenarios based on a simulation of unmanned ground vehicle operation. Scenarios required threat detection and/or change detection. Both single- and dual-task scenarios were used. Workload metrics for each scenario were derived from the electroencephalogram (EEG), electrocardiogram, transcranial Doppler sonography, functional near-infrared sensing, and eye tracking. Subjective workload was also assessed. Several metrics showed sensitivity to the differing demands of the four scenarios. Eye fixation duration and the Task Load Index metric derived from EEG were diagnostic of single- versus dual-task performance. Several other metrics differentiated the two single tasks but were less effective in differentiating single- from dual-task performance. Psychometric analyses confirmed the reliability of individual metrics but failed to identify any general workload factor. An analysis of difference scores between low- and high-workload conditions suggested an effort factor defined by heart rate variability and frontal cortex oxygenation. General workload is not well defined psychometrically, although various individual metrics may satisfy conventional criteria for workload assessment. Practitioners should exercise caution in using multiple metrics that may not correspond well, especially at the level of the individual operator.

  12. A new scenario framework for climate change research: The concept of Shared Climate Policy Assumptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kriegler, Elmar; Edmonds, James A.; Hallegatte, Stephane

    2014-04-01

The paper presents the concept of shared climate policy assumptions as an important element of the new scenario framework. Shared climate policy assumptions capture key climate policy dimensions such as the type and scale of mitigation and adaptation measures. They are not specified in the socio-economic reference pathways, and therefore introduce an important third dimension to the scenario matrix architecture. Climate policy assumptions will have to be made in any climate policy scenario, and can have a significant impact on the scenario description. We conclude that a meaningful set of shared climate policy assumptions is useful for grouping individual climate policy analyses and facilitating their comparison. Shared climate policy assumptions should be designed to be policy relevant, and as a set to be broad enough to allow a comprehensive exploration of the climate change scenario space.

  13. Analysis of Pulsed Flow Modification Alternatives, Lower Missouri River, 2005

    USGS Publications Warehouse

    Jacobson, Robert B.

    2008-01-01

    The graphical, tabular, and statistical data presented in this report resulted from analysis of alternative flow regime designs considered by a group of Missouri River managers, stakeholders, and scientists during the summer of 2005. This plenary group was charged with designing a flow regime with increased spring flow pulses to support reproduction and survival of the endangered pallid sturgeon. Environmental flow components extracted from the reference natural flow regime were used to design and assess performance of alternative flow regimes. The analysis is based on modeled flow releases from Gavins Point Dam (near Yankton, South Dakota) for nine design alternatives and two reference scenarios; the reference scenarios are the run-of-the-river and the water-control plan implemented in 2004. The alternative designs were developed by the plenary group with the goal of providing pulsed spring flows, while retaining traditional social and economic uses of the river.

  14. Development of groundwater pesticide exposure modeling scenarios for vulnerable spring and winter wheat-growing areas.

    PubMed

    Padilla, Lauren; Winchell, Michael; Peranginangin, Natalia; Grant, Shanique

    2017-11-01

    Wheat crops and the major wheat-growing regions of the United States are not included in the 6 crop- and region-specific scenarios developed by the US Environmental Protection Agency (USEPA) for exposure modeling with the Pesticide Root Zone Model conceptualized for groundwater (PRZM-GW). The present work augments the current scenarios by defining suitably vulnerable PRZM-GW scenarios for high-producing spring and winter wheat-growing regions, appropriate for use in refined pesticide exposure assessments. Initial screening-level modeling was conducted for all wheat areas across the conterminous United States as defined by multiple years of the Cropland Data Layer land-use data set. Soil, weather, groundwater temperature, evaporation depth, and crop growth and management practices were characterized for each wheat area from publicly and nationally available data sets and converted to input parameters for PRZM. Approximately 150 000 unique combinations of weather, soil, and input parameters were simulated with PRZM for an herbicide applied for postemergence weed control in wheat. The resulting postbreakthrough average herbicide concentrations in a theoretical shallow aquifer were ranked to identify states with the largest regions of relatively vulnerable wheat areas. For these states, input parameters resulting in near 90th percentile postbreakthrough average concentrations corresponding to significant wheat areas with shallow depth to groundwater formed the basis for 4 new spring wheat scenarios and 4 new winter wheat scenarios to be used in PRZM-GW simulations. Spring wheat scenarios were identified in North Dakota, Montana, Washington, and Texas. Winter wheat scenarios were identified in Oklahoma, Texas, Kansas, and Colorado.
Compared to the USEPA's original 6 scenarios, postbreakthrough average herbicide concentrations in the new scenarios were lower than all but the Florida Potato and Georgia Coastal Peanuts scenarios, and better represented regions dominated by wheat crops. Integr Environ Assess Manag 2017;13:992-1006. © 2017 The Authors. Integrated Environmental Assessment and Management Published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
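
    The scenario-selection step described in this record — ranking postbreakthrough average concentrations across all simulated parameter combinations and taking those near the 90th percentile — can be sketched as follows. The scenario names and concentration values are hypothetical placeholders, not data from the study:

```python
# Sketch: rank simulated parameter combinations by postbreakthrough average
# concentration and take the one at (approximately) the 90th percentile,
# using the nearest-rank method. All values below are hypothetical.

def pick_90th_percentile_scenario(results):
    """results: list of (scenario_id, avg_concentration) pairs."""
    ranked = sorted(results, key=lambda r: r[1])
    idx = max(0, round(0.9 * len(ranked)) - 1)  # nearest-rank index
    return ranked[idx]

simulated = [("ND-soil-A", 1.2), ("ND-soil-B", 3.4), ("MT-soil-A", 0.8),
             ("WA-soil-C", 2.9), ("TX-soil-D", 5.1), ("ND-soil-C", 4.2),
             ("MT-soil-B", 2.1), ("WA-soil-A", 3.9), ("TX-soil-B", 1.7),
             ("OK-soil-A", 4.8)]
scenario, conc = pick_90th_percentile_scenario(simulated)  # ("OK-soil-A", 4.8)
```

    In the study itself the ranking was applied per state and filtered by wheat area and depth to groundwater; this sketch shows only the percentile-selection idea.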

  15. Linking multimetric and multivariate approaches to assess the ecological condition of streams.

    PubMed

    Collier, Kevin J

    2009-10-01

    Few attempts have been made to combine multimetric and multivariate analyses for bioassessment, despite recognition that an integrated method could yield powerful assessment tools. An approach is described that integrates eight macroinvertebrate community metrics into a Principal Components Analysis to develop a Multivariate Condition Score (MCS) from a calibration dataset of 511 samples. The MCS is compared to an Index of Biotic Integrity (IBI) derived using the same metrics based on the ratio to the reference site mean. Both approaches were highly correlated, although the MCS appeared to offer greater potential for discriminating a wider range of impaired conditions. Both the MCS and IBI displayed low temporal variability within reference sites, and were able to distinguish between reference conditions and low levels of catchment modification and local habitat degradation, although neither discriminated among three levels of low impact. Pseudosamples developed to test the response of the metric aggregation approaches to organic enrichment, urban, mining, pastoral and logging stressor scenarios ranked pressures in the same order, but the MCS provided a lower score for the urban scenario and a higher score for the pastoral scenario. The MCS was calculated for an independent test dataset of urban and reference sites, and yielded similar results to the IBI. Although both methods performed comparably, the MCS approach may have some advantages because it removes the subjectivity of assigning thresholds for scoring biological condition, and it appears to discriminate a wider range of degraded conditions.
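
    The core of the MCS construction — standardizing a site-by-metric matrix and collapsing it through Principal Components Analysis into a single score per site — can be sketched as below. This is a first-principal-component illustration of the general idea only; the metric values are hypothetical and the paper's exact aggregation may differ:

```python
import numpy as np

def condition_scores(metrics):
    """Project standardized site-by-metric data onto the first
    principal component, yielding one condition score per site."""
    X = np.asarray(metrics, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each metric
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[0]                           # first principal axis

# Hypothetical values for 4 sites x 3 metrics (e.g. richness, an EPT
# proportion, a tolerance index) -- illustration only.
sites = [[10.0, 0.8, 5.0],
         [ 9.0, 0.7, 4.0],
         [ 3.0, 0.2, 1.0],
         [ 2.0, 0.1, 0.5]]
scores = condition_scores(sites)
```

    Sites at opposite extremes of the metric gradient receive opposite-sign scores along the first component; the sign of the axis itself is arbitrary, so in practice it would be oriented so that higher scores mean better condition.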

  16. US National Climate Assessment (NCA) Scenarios for Assessing Our Climate Future: Issues and Methodological Perspectives Background Whitepaper for Participants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moss, Richard H.; Engle, Nathan L.; Hall, John

    This whitepaper is intended to provide a starting point for discussion at a workshop for the National Climate Assessment (NCA) that focuses on the use and development of scenarios. The paper will provide background needed by participants in the workshop in order to review options for developing and using scenarios in NCA. The paper briefly defines key terms and establishes a conceptual framework for developing consistent scenarios across different end uses and spatial scales. It reviews uses of scenarios in past U.S. national assessments and identifies potential users of and needs for scenarios, both for the report scheduled for release in June 2013 and for an ongoing distributed assessment process in sectors and regions around the country. Because scenarios prepared for the NCA will need to leverage existing research, the paper takes account of recent scientific advances and activities that could provide needed inputs. Finally, it considers potential approaches for providing methods, data, and other tools for assessment participants. We note that the term 'scenarios' has many meanings. An important goal of the whitepaper (and portions of the workshop agenda) is pedagogical (i.e., to compare different meanings and uses of the term and make assessment participants aware of the need to be explicit about types and uses of scenarios). In climate change research, scenarios have been used to establish bounds for future climate conditions and resulting effects on human and natural systems, given a defined level of greenhouse gas emissions. This quasi-predictive use contrasts with the way decision analysts typically use scenarios (i.e., to consider how robust alternative decisions or strategies may be to variation in key aspects of the future that are uncertain).
As will be discussed, in climate change research and assessment, scenarios describe a range of aspects of the future, including major driving forces (both human activities and natural processes), changes in climate and related environmental conditions (e.g., sea level), and evolution of societal capability to respond to climate change. This wide range of scenarios is needed because the implications of climate change for the environment and society depend not only on changes in climate themselves, but also on human responses. This degree of breadth introduces a number of challenges for communication and research.

  17. Flood risk assessment and robust management under deep uncertainty: Application to Dhaka City

    NASA Astrophysics Data System (ADS)

    Mojtahed, Vahid; Gain, Animesh Kumar; Giupponi, Carlo

    2014-05-01

    Socio-economic and climatic changes have been the main drivers of uncertainty in environmental risk assessment, and in flood risk assessment in particular. The uncertainty researchers face when dealing with future-oriented problems, especially climate change, is known as deep uncertainty (also Knightian uncertainty): nobody has experienced those changes before, our knowledge provides no notion of the relevant probabilities, and consolidated risk management approaches therefore have limited potential. Deep uncertainty refers to circumstances in which analysts and experts do not know, or parties to a decision cannot agree on: i) the appropriate models describing the interactions among system variables; ii) the probability distributions representing uncertainty about key parameters in the models; and iii) how to value the desirability of alternative outcomes. The need thus emerges to assist policy-makers by providing them not with a single, optimal solution to the problem at hand, such as crisp estimates of the damage costs of the natural hazards considered, but with ranges of possible future costs, based on the outcomes of ensembles of assessment models and sets of plausible scenarios. Accordingly, we need to substitute robustness for optimality as the decision criterion. Under conditions of deep uncertainty, decision-makers have no statistical or mathematical basis for identifying optimal solutions; instead they should prefer to implement "robust" decisions that perform relatively well over all conceivable outcomes of all unknown future scenarios. Under deep uncertainty, analysts cannot employ probability theory or other statistics usually derived from observed historical data, and we therefore turn to non-statistical tools such as scenario analysis.
We construct several plausible scenarios, each a full description of what may happen in the future, based on a meaningful synthesis of parameter values with control of their correlations to maintain internal consistency. This paper aims to incorporate a set of data mining and sampling tools to assess the uncertainty of model outputs under future climatic and socio-economic changes for Dhaka city, and to provide a decision support system for robust flood management and mitigation policies. After constructing an uncertainty matrix to identify the main sources of uncertainty for Dhaka City, we derive several hazard and vulnerability maps based on future climatic and socio-economic scenarios. The vulnerability of each flood management alternative under different sets of scenarios is determined, and finally the robustness of each plausible solution considered is defined based on the above assessment.
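
    The shift from optimality to robustness described in this record can be illustrated with minimax regret, one standard robustness criterion (not necessarily the one used in the study): for each management alternative, compute its regret (cost above the best achievable cost) in every scenario, then prefer the alternative whose worst-case regret is smallest. The alternatives and cost figures below are hypothetical, not the Dhaka case-study numbers:

```python
def minimax_regret(costs):
    """costs: dict alternative -> {scenario: cost}. Return the
    alternative whose worst-case regret across scenarios is smallest."""
    scenarios = next(iter(costs.values())).keys()
    best = {s: min(c[s] for c in costs.values()) for s in scenarios}
    regret = {a: max(c[s] - best[s] for s in scenarios)
              for a, c in costs.items()}
    return min(regret, key=regret.get)

# Hypothetical damage costs (arbitrary units) under three scenarios.
costs = {
    "embankment":    {"wet": 10, "dry": 4, "extreme": 30},
    "early-warning": {"wet": 12, "dry": 5, "extreme": 14},
    "do-nothing":    {"wet": 20, "dry": 2, "extreme": 50},
}
robust_choice = minimax_regret(costs)  # "early-warning"
```

    Note that "early-warning" is never the cheapest option in any single scenario, yet it is the robust choice because it avoids a catastrophic outcome in the extreme scenario.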

  18. A Physical Model for Three-Phase Compaction in Silicic Magma Reservoirs

    NASA Astrophysics Data System (ADS)

    Huber, Christian; Parmigiani, Andrea

    2018-04-01

    We develop a model for phase separation in magma reservoirs containing a mixture of silicate melt, crystals, and fluids (exsolved volatiles). The interplay between the three phases controls the dynamics of phase separation and consequently the chemical and physical evolution of magma reservoirs. The model we propose is based on the two-phase damage theory approach of Bercovici et al. (2001, https://doi.org/10.1029/2000JB900430) and Bercovici and Ricard (2003, https://doi.org/10.1046/j.1365-246X.2003.01854.x) because it offers the leverage of considering interfaces (in the macroscopic limit) between phases that can deform depending on the mechanical work and phase changes taking place locally in the magma. Damage models also offer the advantage that pressure is defined uniquely for each phase and does not need to be equal among phases, which will enable us to consider, in future studies, the large capillary pressure at which fluids are mobilized in mature, crystal-rich, magma bodies. In this first analysis of three-phase compaction, we solve the three-phase compaction equations numerically for a simple 1-D problem where we focus on the effect of fluids on the efficiency of melt-crystal separation considering the competition between viscous and buoyancy stresses only. We contrast three sets of simulations to explore the behavior of three-phase compaction: a melt-crystal reference compaction scenario (two-phase compaction), a three-phase scenario without phase changes, and finally a three-phase scenario with a parameterized second boiling (crystallization-induced exsolution). The simulations show a dramatic difference between two-phase (melt-crystals) and three-phase (melt-crystals-exsolved volatiles) compaction-driven phase separation. We find that the presence of a lighter, significantly less viscous fluid hinders melt-crystal separation.

  19. Information-seeking behaviors of medical students: a classification of questions asked of librarians and physicians.

    PubMed Central

    Wildemuth, B M; de Bliek, R; Friedman, C P; Miya, T S

    1994-01-01

    To solve a problem, a person often asks questions of someone with more expertise. This paper reports on a study of the types of questions asked and how the experts are chosen. In the study, sixty-three first-year medical students responded to clinical scenarios, each describing a patient affected by a toxin and asking questions concerning the identity of the toxin and its characteristics. After answering those questions, the students were asked to imagine that they had access to a medical reference librarian and an internist specializing in toxicology. The students then generated two questions for each expert about each clinical scenario. Each question was categorized according to the type of information requested, and the frequency of each type of question was calculated. The study found that students most often asked for the identification of the toxin(s), references about the scenario, or the effects of the toxin; an explanation of the patient's symptoms; or a description of the appropriate treatment. Students were more likely to address questions on the identity of the toxin and references to the hypothetical librarian; they were more likely to ask the internist for explanations of the symptoms and descriptions of the treatment. The implications of these results for the design of information and educational systems are discussed. PMID:7920340

  20. Software Architecture: Managing Design for Achieving Warfighter Capability

    DTIC Science & Technology

    2007-04-30

    The Government’s requirements and specifications for a new weapon...at the Preliminary Design Review (PDR) is likely to have a much higher probability of meeting the warfighters’ need for capability. Test-case...inventories of test cases are developed from the user-defined scenarios so that there are one or more test cases for every scenario. The test cases will

  1. Secondary School Education in Assam (India) with Special Reference to Mathematics

    ERIC Educational Resources Information Center

    Das, N. R.; Baruah, Karuna

    2010-01-01

    This paper describes the prevailing academic scenarios of a representative group of secondary schools in Assam (India), with special reference to students' performance in general and mathematics performance in particular. The state of Assam is one of the economically backward regions of India and is witnessing socio-political disturbances mainly…

  2. Implications of an Absolute Simultaneity Theory for Cosmology and Universe Acceleration

    PubMed Central

    Kipreos, Edward T.

    2014-01-01

    An alternate Lorentz transformation, Absolute Lorentz Transformation (ALT), has similar kinematics to special relativity yet maintains absolute simultaneity in the context of a preferred reference frame. In this study, it is shown that ALT is compatible with current experiments to test Lorentz invariance only if the proposed preferred reference frame is locally equivalent to the Earth-centered non-rotating inertial reference frame, with the inference that in an ALT framework, preferred reference frames are associated with centers of gravitational mass. Applying this theoretical framework to cosmological data produces a scenario of universal time contraction in the past. In this scenario, past time contraction would be associated with increased levels of blueshifted light emissions from cosmological objects when viewed from our current perspective. The observation that distant Type Ia supernovae are dimmer than predicted by linear Hubble expansion currently provides the most direct evidence for an accelerating universe. Adjusting for the effects of time contraction on a redshift–distance modulus diagram produces a linear distribution of supernovae over the full redshift spectrum that is consistent with a non-accelerating universe. PMID:25536116

  3. Implications of an absolute simultaneity theory for cosmology and universe acceleration.

    PubMed

    Kipreos, Edward T

    2014-01-01

    An alternate Lorentz transformation, Absolute Lorentz Transformation (ALT), has similar kinematics to special relativity yet maintains absolute simultaneity in the context of a preferred reference frame. In this study, it is shown that ALT is compatible with current experiments to test Lorentz invariance only if the proposed preferred reference frame is locally equivalent to the Earth-centered non-rotating inertial reference frame, with the inference that in an ALT framework, preferred reference frames are associated with centers of gravitational mass. Applying this theoretical framework to cosmological data produces a scenario of universal time contraction in the past. In this scenario, past time contraction would be associated with increased levels of blueshifted light emissions from cosmological objects when viewed from our current perspective. The observation that distant Type Ia supernovae are dimmer than predicted by linear Hubble expansion currently provides the most direct evidence for an accelerating universe. Adjusting for the effects of time contraction on a redshift-distance modulus diagram produces a linear distribution of supernovae over the full redshift spectrum that is consistent with a non-accelerating universe.

  4. Explicitly Accounting for Protected Lands within the GCAM 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dooley, James J.; Zhou, Yuyu

    2012-05-01

    The Global Change Assessment Model Version 3.0 defines three different levels of “Protected Lands” within the agricultural and land-use component. These three different scenarios effectively cordon off 3.5% (5.0 million km2) of the Earth’s terrestrial lands in the de minimis Protected Land Scenario, 5.0% (7.20 million km2) in the Core Protected Land Scenario, and 8.2% (11.8 million km2) in the Expanded Protected Land Scenario. None of these scenarios represents the “right” level of Protected Lands for the planet today or tomorrow. Rather, the goal is to create a range of scenarios that can be used in modeling human responses to climate change and the impact those would have on managed and unmanaged terrestrial lands. These scenarios harness the wealth of information in the United Nations Environment Programme World Conservation Monitoring Centre’s World Database on Protected Areas and its categories of explicit degrees of protection.

  5. Mise en Scene: Conversion of Scenarios to CSP Traces for the Requirements-to-Design-to-Code Project

    NASA Technical Reports Server (NTRS)

    Carter, John D.; Gardner, William B.; Rash, James L.; Hinchey, Michael G.

    2007-01-01

    The "Requirements-to-Design-to-Code" (R2D2C) project at NASA's Goddard Space Flight Center is based on deriving a formal specification expressed in Communicating Sequential Processes (CSP) notation from system requirements supplied in the form of CSP traces. The traces, in turn, are to be extracted from scenarios, a user-friendly medium often used to describe the required behavior of computer systems under development. This work, called Mise en Scene, defines a new scenario medium (Scenario Notation Language, SNL) suitable for control-dominated systems, coupled with a two-stage process for automatic translation of scenarios to a new trace medium (Trace Notation Language, TNL) that encompasses CSP traces. Mise en Scene is offered as an initial solution to the problem of the scenarios-to-traces "D2" phase of R2D2C. A survey of the "scenario" concept and some case studies are also provided.

  6. Towards a Psychological Construct of Being Moved

    PubMed Central

    Menninghaus, Winfried; Wagner, Valentin; Hanich, Julian; Wassiliwizky, Eugen; Kuehnast, Milena; Jacobsen, Thomas

    2015-01-01

    The emotional state of being moved, though frequently referred to in both classical rhetoric and current language use, is far from established as a well-defined psychological construct. In a series of three studies, we investigated eliciting scenarios, emotional ingredients, appraisal patterns, feeling qualities, and the affective signature of being moved and related emotional states. The great majority of the eliciting scenarios can be assigned to significant relationship and critical life events (especially death, birth, marriage, separation, and reunion). Sadness and joy turned out to be the two preeminent emotions involved in episodes of being moved. Both the sad and the joyful variants of being moved showed a coactivation of positive and negative affect and can thus be ranked among the mixed emotions. Moreover, being moved, while featuring only low-to-mid arousal levels, was experienced as an emotional state of high intensity; this applied to responses to fictional artworks no less than to own-life and other real, but media-represented, events. The most distinctive findings regarding cognitive appraisal dimensions were very low ratings for causation of the event by oneself and for having the power to change its outcome, along with very high ratings for appraisals of compatibility with social norms and self-ideals. Putting together the characteristics identified and discussed throughout the three studies, the paper ends with a sketch of a psychological construct of being moved. PMID:26042816

  7. Parametric Analysis for Aurora Mars Manned Mission Concept Definition

    NASA Astrophysics Data System (ADS)

    Augros, P.; Bonnefond, F.; Ranson, S.

    In the frame of the Aurora program (an ESA program), Europe plans to form its own vision of future manned Mars missions. Within this context, we have performed an end-to-end analysis of what these missions could be, focusing on transportation aspects and mobile in-situ infrastructure. This paper defines what is needed to land on Mars and what is needed to return from the Martian surface, explores the round-trip options and their consequences for mission design and feasibility, and analyzes the launcher issue and the in-orbit assembly scenarios. The main results lead to the rediscovery of a candidate mission based on a scenario close to the NASA reference mission (Ref [1]). The main interest, from a transportation point of view, is that the spacecraft are similar: same insertion stage, same descent vehicle. Such a design becomes possible with a deployable aeroshield for the Mars entry vehicle, in-situ water and propellant production, improved habitat technology, a conjunction-class round trip (minimum ΔV, avoiding science-fiction designs), and a launcher payload capability of 100 tons in LEO with a payload size of 30 m long and 7.5 m diameter. An alternative, also limiting the overall mass in LEO, could be to deploy no Mars infrastructure and use a single spacecraft going to Mars and returning to Earth. But it implies that the crew stay in Mars orbit several months, waiting for the next opportunity ensuring a minimum ΔV.

  8. An Anomalous Composition in Slow Solar Wind as a Signature of Magnetic Reconnection in its Source Region

    NASA Astrophysics Data System (ADS)

    Zhao, L.; Landi, E.; Lepri, S. T.; Kocher, M.; Zurbuchen, T. H.; Fisk, L. A.; Raines, J. M.

    2017-01-01

    In this paper, we study a subset of slow solar winds characterized by an anomalous charge state composition and ion temperatures compared to average solar wind distributions, and thus referred to as an “Outlier” wind. We find that although this wind is slower and denser than normal slow wind, it is accelerated from the same source regions (active regions and quiet-Sun regions) as the latter and its occurrence rate depends on the solar cycle. The defining property of the Outlier wind is that its charge state composition is the same as that of normal slow wind, with the only exception being a very large decrease in the abundance of fully charged species (He2+, C6+, N7+, O8+, Mg12+), resulting in a significant depletion of the He and C element abundances. Based on these observations, we suggest three possible scenarios for the origin of this wind: (1) local magnetic waves preferentially accelerating non-fully stripped ions over fully stripped ions from a loop opened by reconnection; (2) depleted fully stripped ions already contained in the corona magnetic loops before they are opened up by reconnection; or (3) fully stripped ions depleted by Coulomb collision after magnetic reconnection in the solar corona. If any one of these three scenarios is confirmed, the Outlier wind represents a direct signature of slow wind release through magnetic reconnection.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, L.; Landi, E.; Lepri, S. T.

    In this paper, we study a subset of slow solar winds characterized by an anomalous charge state composition and ion temperatures compared to average solar wind distributions, and thus referred to as an “Outlier” wind. We find that although this wind is slower and denser than normal slow wind, it is accelerated from the same source regions (active regions and quiet-Sun regions) as the latter and its occurrence rate depends on the solar cycle. The defining property of the Outlier wind is that its charge state composition is the same as that of normal slow wind, with the only exception being a very large decrease in the abundance of fully charged species (He2+, C6+, N7+, O8+, Mg12+), resulting in a significant depletion of the He and C element abundances. Based on these observations, we suggest three possible scenarios for the origin of this wind: (1) local magnetic waves preferentially accelerating non-fully stripped ions over fully stripped ions from a loop opened by reconnection; (2) depleted fully stripped ions already contained in the corona magnetic loops before they are opened up by reconnection; or (3) fully stripped ions depleted by Coulomb collision after magnetic reconnection in the solar corona. If any one of these three scenarios is confirmed, the Outlier wind represents a direct signature of slow wind release through magnetic reconnection.

  10. Towards a psychological construct of being moved.

    PubMed

    Menninghaus, Winfried; Wagner, Valentin; Hanich, Julian; Wassiliwizky, Eugen; Kuehnast, Milena; Jacobsen, Thomas

    2015-01-01

    The emotional state of being moved, though frequently referred to in both classical rhetoric and current language use, is far from established as a well-defined psychological construct. In a series of three studies, we investigated eliciting scenarios, emotional ingredients, appraisal patterns, feeling qualities, and the affective signature of being moved and related emotional states. The great majority of the eliciting scenarios can be assigned to significant relationship and critical life events (especially death, birth, marriage, separation, and reunion). Sadness and joy turned out to be the two preeminent emotions involved in episodes of being moved. Both the sad and the joyful variants of being moved showed a coactivation of positive and negative affect and can thus be ranked among the mixed emotions. Moreover, being moved, while featuring only low-to-mid arousal levels, was experienced as an emotional state of high intensity; this applied to responses to fictional artworks no less than to own-life and other real, but media-represented, events. The most distinctive findings regarding cognitive appraisal dimensions were very low ratings for causation of the event by oneself and for having the power to change its outcome, along with very high ratings for appraisals of compatibility with social norms and self-ideals. Putting together the characteristics identified and discussed throughout the three studies, the paper ends with a sketch of a psychological construct of being moved.

  11. Human exploration mission studies

    NASA Technical Reports Server (NTRS)

    Cataldo, Robert L.

    1989-01-01

    The Office of Exploration has established a process whereby all NASA field centers and other NASA Headquarters offices participate in the formulation and analysis of a wide range of mission strategies. These strategies were manifested into specific scenarios or candidate case studies. The case studies provided a systematic approach into analyzing each mission element. First, each case study must address several major themes and rationale including: national pride and international prestige, advancement of scientific knowledge, a catalyst for technology, economic benefits, space enterprise, international cooperation, and education and excellence. Second, the set of candidate case studies are formulated to encompass the technology requirement limits in the life sciences, launch capabilities, space transfer, automation, and robotics in space operations, power, and propulsion. The first set of reference case studies identify three major strategies: human expeditions, science outposts, and evolutionary expansion. During the past year, four case studies were examined to explore these strategies. The expeditionary missions include the Human Expedition to Phobos and Human Expedition to Mars case studies. The Lunar Observatory and Lunar Outpost to Early Mars Evolution case studies examined the latter two strategies. This set of case studies established the framework to perform detailed mission analysis and system engineering to define a host of concepts and requirements for various space systems and advanced technologies. The details of each mission are described and, specifically, the results affecting the advanced technologies required to accomplish each mission scenario are presented.

  12. A technological infrastructure to sustain Internetworked Enterprises

    NASA Astrophysics Data System (ADS)

    La Mattina, Ernesto; Savarino, Vincenzo; Vicari, Claudia; Storelli, Davide; Bianchini, Devis

    In the Web 3.0 scenario, where information and services are connected by means of their semantics, organizations can improve their competitive advantage by publishing their business and service descriptions. In this scenario, Semantic Peer-to-Peer (P2P) can play a key role in defining dynamic and highly reconfigurable infrastructures. Organizations can share knowledge and services, using this infrastructure to move towards value networks, an emerging organizational model characterized by fluid boundaries and complex relationships. This chapter collects and defines the technological requirements and architecture of a modular and multi-layer Peer-to-Peer infrastructure for SOA-based applications. This technological infrastructure, based on the combination of Semantic Web and P2P technologies, is intended to sustain Internetworked Enterprise configurations, defining a distributed registry and enabling more expressive queries and efficient routing mechanisms. The following sections focus on the overall architecture, while describing the layers that form it.

  13. Defining level A IVIVC dissolution specifications based on individual in vitro dissolution profiles of a controlled release formulation.

    PubMed

    González-García, I; García-Arieta, A; Merino-Sanjuan, M; Mangas-Sanjuan, V; Bermejo, M

    2018-07-01

    Regulatory guidelines recommend that, when a level A IVIVC is established, dissolution specifications should be set using averaged data, and that the maximum difference in AUC and Cmax between the reference and test formulations cannot be greater than 20%. However, averaging data entails a loss of information and may bias the results. The objective of the current work is to present a new approach to establishing dissolution specifications using a new methodology (individual approach) instead of average data (classical approach). Different scenarios were established based on the relationship between the in vitro and in vivo dissolution rate coefficients, using a level A IVIVC of a controlled release formulation. Then, in order to compare this new approach with the classical one, six additional batches were simulated. For each batch, 1000 simulations of a dissolution assay were run. Cmax ratios between the reference formulation and each batch were calculated, showing that the individual approach was more sensitive and able to detect differences between the reference and the batch formulation compared to the classical approach. Additionally, the new methodology displays wider dissolution specification limits than the classical approach while ensuring that no tablet from the new batch would generate in vivo profiles whose AUC or Cmax ratio falls outside the 0.80-1.25 range, taking into account the in vitro and in vivo variability of the new batches developed. Copyright © 2018 Elsevier B.V. All rights reserved.
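
    The batch-acceptance check at the heart of this record — computing each batch's Cmax ratio against the reference and testing it against the 0.80-1.25 acceptance range — can be sketched as follows. The Cmax values and batch names are hypothetical placeholders, not data from the study:

```python
def within_be_range(cmax_test, cmax_ref, lo=0.80, hi=1.25):
    """True if the test/reference Cmax ratio lies in the accepted range."""
    return lo <= cmax_test / cmax_ref <= hi

# Hypothetical Cmax values (ng/mL) for a reference product and three
# simulated batches -- illustrative numbers only.
cmax_ref = 100.0
batches = {"batch-A": 95.0, "batch-B": 78.0, "batch-C": 120.0}
passing = [name for name, c in batches.items() if within_be_range(c, cmax_ref)]
# passing == ["batch-A", "batch-C"]
```

    In the study this check is applied over 1000 simulated assays per batch, so the decision rests on the distribution of ratios rather than a single value; the sketch shows only the range test itself.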

  14. Objective comparison of particle tracking methods

    PubMed Central

    Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F.; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R.; Godinez, William J.; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E. G.; Jaldén, Joakim; Blau, Helen M.; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L.; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P.; Dan, Han-Wei; Tsai, Yuh-Show; de Solórzano, Carlos Ortiz; Olivo-Marin, Jean-Christophe; Meijering, Erik

    2014-01-01

    Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Since manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized, for the first time, an open competition, in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to important practical conclusions for users and developers. PMID:24441936

  15. Impact of forecasted changes in Polish economy (2015 and 2020) on nutrient emission into the river basins.

    PubMed

    Pastuszak, Marianna; Kowalkowski, Tomasz; Kopiński, Jerzy; Stalenga, Jarosław; Panasiuk, Damian

    2014-09-15

    Poland, whose large drainage area contributes 50% of the agricultural land and 45% of the population of the Baltic catchment, is the largest exporter of riverine nitrogen (N) and phosphorus (P) to the sea. The economic transition has resulted in a substantial, statistically significant decline in N and P export from Polish territory to the Baltic Sea. Following the obligations arising from the Helsinki Commission (HELCOM) declarations, in the coming years Poland is expected to reduce riverine N loads by ca. 25% and P loads by ca. 60% relative to the average flow-normalized loads recorded in 1997-2003. The aim of this paper is to estimate annual source-apportioned N and P emissions into these river basins in 2015 and 2020 with the application of modeling studies (MONERIS). Twelve scenarios, encompassing changes in anthropogenic (diffuse, point source) and natural pressures (precipitation, water outflow due to climate change), have been applied. Modeling outcomes for the period 2003-2008 served as our reference material. In the applied scenarios, N emission into the Oder basin in 2015 and 2020 shows an increase of 4.2% up to 9.1% compared with the reference period. N emission into the Vistula basin is more variable and shows an increase by max. 17.8% or a decrease by max. 4.7%, depending on the scenario. The difference between N emissions into the Oder and Vistula basins is related to catchment peculiarities and the handling of point-source emissions. P emission into both basins shows identical scenario patterns, with a maximum decrease of 17.8% in the Oder and 16.7% in the Vistula basin. Despite a declining tendency in P loads in both rivers in all the scenarios, the HELCOM targeted P load reduction is not feasible. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Modeling background radiation in Southern Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haber, Daniel A.; Burnley, Pamela C.; Adcock, Christopher T.

    Aerial gamma ray surveys are an important tool for national security, scientific, and industrial interests in determining locations of both anthropogenic and natural sources of radioactivity. There is a relationship between radioactivity and geology, and in the past this relationship has been used to predict geology from an aerial survey. The purpose of this project is to develop a method to predict the radiologic exposure rate of geologic materials by creating a high resolution background model. The intention is for this method to be used in emergency response scenarios where the background radiation environment is unknown. Two study areas in Southern Nevada have been modeled using geologic data, images from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), geochemical data, and pre-existing low resolution aerial surveys from the National Uranium Resource Evaluation (NURE) Survey. Using these data, geospatial areas that are homogeneous in terms of K, U, and Th, referred to as background radiation units, are defined and the gamma ray exposure rate is predicted. The prediction is compared to data collected via detailed aerial survey by the Department of Energy's Remote Sensing Lab - Nellis, allowing for refinement of the technique. By using geologic units to define radiation background units of exposed bedrock and ASTER visualizations to subdivide and define radiation background units within alluvium, successful models have been produced for Government Wash, north of Lake Mead, and for the western shore of Lake Mohave, east of Searchlight, NV.
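
    Predicting a dose rate from the K, U and Th concentrations of a background radiation unit can be sketched with widely cited UNSCEAR-style conversion coefficients (an assumption here; the abstract does not state which coefficients the project uses):

```python
def exposure_rate_nGy_per_h(k_percent, u_ppm, th_ppm):
    """Predict the terrestrial gamma dose rate in air (nGy/h) from average
    K (%), U (ppm) and Th (ppm) concentrations of a background radiation
    unit, using commonly cited UNSCEAR-style conversion coefficients."""
    return 13.08 * k_percent + 5.67 * u_ppm + 2.49 * th_ppm

# Hypothetical background radiation unit: 2.0% K, 3.0 ppm U, 10.0 ppm Th
rate = exposure_rate_nGy_per_h(2.0, 3.0, 10.0)
print(f"predicted dose rate: {rate:.2f} nGy/h")
```

    Each homogeneous unit gets one such prediction, which can then be compared against the detailed aerial survey data.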

  17. Modeling background radiation in Southern Nevada

    DOE PAGES

    Haber, Daniel A.; Burnley, Pamela C.; Adcock, Christopher T.; ...

    2017-02-06

    Aerial gamma ray surveys are an important tool for national security, scientific, and industrial interests in determining locations of both anthropogenic and natural sources of radioactivity. There is a relationship between radioactivity and geology, and in the past this relationship has been used to predict geology from an aerial survey. The purpose of this project is to develop a method to predict the radiologic exposure rate of geologic materials by creating a high resolution background model. The intention is for this method to be used in emergency response scenarios where the background radiation environment is unknown. Two study areas in Southern Nevada have been modeled using geologic data, images from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), geochemical data, and pre-existing low resolution aerial surveys from the National Uranium Resource Evaluation (NURE) Survey. Using these data, geospatial areas that are homogeneous in terms of K, U, and Th, referred to as background radiation units, are defined and the gamma ray exposure rate is predicted. The prediction is compared to data collected via detailed aerial survey by the Department of Energy's Remote Sensing Lab - Nellis, allowing for refinement of the technique. By using geologic units to define radiation background units of exposed bedrock and ASTER visualizations to subdivide and define radiation background units within alluvium, successful models have been produced for Government Wash, north of Lake Mead, and for the western shore of Lake Mohave, east of Searchlight, NV.

  18. 11 CFR 109.21 - What is a “coordinated communication”?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 100.29. (2) A public communication, as defined in 11 CFR 100.26, that disseminates, distributes, or... public communication, as defined in 11 CFR 100.26, that expressly advocates, as defined in 11 CFR 100.22... section: (i) References to House and Senate candidates. The public communication refers to a clearly...

  19. 11 CFR 109.21 - What is a “coordinated communication”?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 100.29. (2) A public communication, as defined in 11 CFR 100.26, that disseminates, distributes, or... public communication, as defined in 11 CFR 100.26, that expressly advocates, as defined in 11 CFR 100.22... section: (i) References to House and Senate candidates. The public communication refers to a clearly...

  20. 11 CFR 109.21 - What is a “coordinated communication”?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 100.29. (2) A public communication, as defined in 11 CFR 100.26, that disseminates, distributes, or... public communication, as defined in 11 CFR 100.26, that expressly advocates, as defined in 11 CFR 100.22... section: (i) References to House and Senate candidates. The public communication refers to a clearly...

  1. 11 CFR 109.21 - What is a “coordinated communication”?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 100.29. (2) A public communication, as defined in 11 CFR 100.26, that disseminates, distributes, or... public communication, as defined in 11 CFR 100.26, that expressly advocates, as defined in 11 CFR 100.22... section: (i) References to House and Senate candidates. The public communication refers to a clearly...

  2. Comparing probabilistic microbial risk assessments for drinking water against daily rather than annualised infection probability targets.

    PubMed

    Signor, R S; Ashbolt, N J

    2009-12-01

    Some national drinking water guidelines provide guidance on how to define 'safe' drinking water. Regarding microbial water quality, a common position is that the chance of an individual becoming infected by some reference waterborne pathogen (e.g. Cryptosporidium) present in the drinking water should be less than 10(-4) in any year. However, the instantaneous level of risk to a water consumer varies over the course of a year, and waterborne disease outbreaks have been associated with shorter-duration periods of heightened risk. Performing probabilistic microbial risk assessments to capture the impacts of temporal variability on overall infection risk is becoming commonplace. A case is presented here for the adoption of a shorter-duration reference period (i.e. daily) infection probability target over which to assess, report and benchmark such risks. A daily infection probability benchmark may provide added incentive and guidance for exercising control over short-term adverse risk fluctuation events and their causes. Management planning could involve outlining measures so that the daily target is met under a variety of pre-identified event scenarios. Other benefits of a daily target could include providing a platform for managers to design and assess management initiatives, as well as simplifying the technical components of the risk assessment process.
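
    The relationship between an annualised target and an equivalent daily target is a short calculation, assuming a constant daily infection probability and independence between days (a simplifying assumption, not the paper's full probabilistic model):

```python
annual_target = 1e-4  # benchmark: < 1 infection per 10,000 persons per year

# With a constant daily infection probability p and independent days,
# the annual probability is 1 - (1 - p)**365; inverting gives the
# equivalent daily target:
daily_target = 1 - (1 - annual_target) ** (1 / 365)
print(f"equivalent daily target: {daily_target:.3e}")

# Compounding the daily target back over a year recovers the annual one
annual_check = 1 - (1 - daily_target) ** 365
print(f"annual check: {annual_check:.3e}")
```

    Benchmarking against the daily figure makes a single high-risk day visible, whereas the same event can be averaged away under an annualised target.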

  3. Lunar base surface mission operations. Lunar Base Systems Study (LBSS) task 4.1

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The purpose was to perform an analysis of the surface operations associated with a human-tended lunar base. Specifically, the study defined surface elements and developed mission manifests for a selected base scenario, determined the nature of surface operations associated with this scenario, generated a preliminary crew extravehicular and intravehicular activity (EVA/IVA) time resource schedule for conducting the missions, and proposed concepts for utilizing remotely operated equipment to perform repetitious or hazardous surface tasks. The operations analysis covered a 6-year period of human-tended lunar base operation prior to permanent occupancy. The baseline scenario was derived from a modified version of the civil needs database (CNDB) scenario. This scenario emphasizes achievement of a limited set of science and exploration objectives while emplacing the minimum habitability elements required for a permanent base.

  4. KB3D Reference Manual. Version 1.a

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Siminiceanu, Radu; Carreno, Victor A.; Dowek, Gilles

    2005-01-01

    This paper is a reference manual describing the implementation of the KB3D conflict detection and resolution algorithm. The algorithm has been implemented in the Java and C++ programming languages. The reference manual gives a short overview of the detection and resolution functions, the structural implementation of the program, inputs and outputs to the program, and describes how the program is used. Inputs to the program can be rectangular coordinates or geodesic coordinates. The reference manual also gives examples of conflict scenarios and the resolution outputs the program produces.

  5. 76 FR 16712 - Participation by Religious Organizations in USAID Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-25

    ... are defined without reference to religion, (iii) has the effect of furthering a development objective... available to a wide range of organizations and beneficiaries which are defined without reference to religion...

  6. An Exploration of Kernel Equating Using SAT® Data: Equating to a Similar Population and to a Distant Population. Research Report. ETS RR-07-17

    ERIC Educational Resources Information Center

    Liu, Jinghua; Low, Albert C.

    2007-01-01

    This study applied kernel equating (KE) in two scenarios: equating to a very similar population and equating to a very different population, referred to as a distant population, using SAT® data. The KE results were compared to the results obtained from analogous classical equating methods in both scenarios. The results indicate that KE results are…

  7. Commercial Mobile Alert Service (CMAS) Scenarios

    DTIC Science & Technology

    2012-05-01

    Commercial Mobile Alert Service (CMAS) Scenarios The WEA Project Team May 2012 SPECIAL REPORT CMU/SEI-2012-SR-020 CERT® Division, Software ...Homeland Security under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally...DISTRIBUTES IT “AS IS.” References herein to any specific commercial product, process, or service by trade name, trade mark, manufacturer, or otherwise

  8. Assessing cost-effectiveness of bioretention on stormwater in response to climate change and urbanization for future scenarios

    NASA Astrophysics Data System (ADS)

    Wang, Mo; Zhang, Dongqing; Adhityan, Appan; Ng, Wun Jern; Dong, Jianwen; Tan, Soon Keat

    2016-12-01

    Bioretention, a popular low impact development (LID) practice, has become increasingly important for mitigating the adverse impacts of urban stormwater. However, there is very limited information on how to ensure the effectiveness of bioretention in response to uncertain future challenges, especially when taking climate change and urbanization into consideration. The main objective of this paper is to identify the cost-effectiveness of bioretention by assessing its hydrologic performance under future scenario modeling. First, a hydrology model was used to obtain peak runoff and TSS loads of bioretention at variable scales under different scenarios, i.e., different Representative Concentration Pathways (RCPs) and Shared Socio-economic Pathways (SSPs), for 2-year and 10-year design storms in Singapore. Then, life cycle costing (LCC) and life cycle assessment (LCA) were estimated for bioretention, and the cost-effectiveness was identified under the different scenarios. Our findings showed different degrees of response to the 2-year and 10-year design storms, but the general patterns and insights deduced were similar. The performance of bioretention was more sensitive to urbanization than to climate change in the urban catchment. In addition, the methodology used in this study is generic, and the findings could serve as a reference for other LID practices in response to climate change and urbanization.

  9. Wealth distribution across communities of adaptive financial agents

    NASA Astrophysics Data System (ADS)

    DeLellis, Pietro; Garofalo, Franco; Lo Iudice, Francesco; Napoletano, Elena

    2015-08-01

    This paper studies the trading volumes and wealth distribution of a novel agent-based model of an artificial financial market. In this model, heterogeneous agents, behaving according to the Von Neumann and Morgenstern utility theory, may mutually interact. A Tobin-like tax (TT) on successful investments and a flat tax are compared to assess the effects on the agents’ wealth distribution. We carry out extensive numerical simulations in two alternative scenarios: (i) a reference scenario, where the agents keep their utility function fixed, and (ii) a focal scenario, where the agents are adaptive and self-organize in communities, emulating their neighbours by updating their own utility function. Specifically, the interactions among the agents are modelled through a directed scale-free network to account for the presence of community leaders, and the herding-like effect is tested against the reference scenario. We observe that our model is capable of replicating the benefits and drawbacks of the two taxation systems and that the interactions among the agents strongly affect the wealth distribution across the communities. Remarkably, the communities benefit from the presence of leaders with successful trading strategies, and are more likely to increase their average wealth. Moreover, this emulation mechanism mitigates the decrease in trading volumes, which is a typical drawback of TTs.

  10. Solar hybrid power plants: Solar energy contribution in reaching full dispatchability and firmness

    NASA Astrophysics Data System (ADS)

    Servert, Jorge F.; López, Diego; Cerrajero, Eduardo; Rocha, Alberto R.; Pereira, Daniel; Gonzalez, Lucía

    2016-05-01

    Renewable energies for electricity generation have always been considered a risk to the electricity system because of their lack of dispatchability and firmness. Renewable energy penetration is constrained to strong grids, or else production must be limited to ensure grid stability, which is maintained through the use of hydropower or fossil-fueled power plants. CSP technology has an opportunity to emerge not only as a dispatchable and firm technology, but also as an alternative that improves grid stability. To achieve that objective, solar hybrid configurations are being developed, the three most representative solutions being SAPG, ISCC and HYSOL. A reference scenario in the Kingdom of Saudi Arabia (KSA) has been defined to compare these solutions, which have been modelled, simulated and evaluated in terms of dispatchability and firmness using ratios defined by the authors. The results show that: a) SAPG obtains the highest firmness KPI values, but no operation constraints have been considered for the coal boiler and the solar energy contribution is limited to 1.7%; b) ISCC provides dispatchable and firm electricity production but its solar energy contribution is limited to 6.4%; and c) HYSOL presents the highest solar energy contribution of all the technologies considered (66.0%) while providing dispatchable and firm generation in conditions similar to SAPG and ISCC.

  11. Juggling with Indianness in the Gestation of Translation with Special Reference to the English Translation of a Hindi Story

    ERIC Educational Resources Information Center

    Priya, K.

    2014-01-01

    This paper is an attempt to look closely at the process of translating dramas with special reference to the Hindi story Aadmi ka Baccha ("The Child of a Man") by Yashpal in India and the role and significance of prose transcreations in today's changing global scenario.

  12. Severe anaemia associated with Plasmodium falciparum infection in children: consequences for additional blood sampling for research.

    PubMed

    Kuijpers, Laura Maria Francisca; Maltha, Jessica; Guiraud, Issa; Kaboré, Bérenger; Lompo, Palpouguini; Devlieger, Hugo; Van Geet, Chris; Tinto, Halidou; Jacobs, Jan

    2016-06-02

    Plasmodium falciparum infection may cause severe anaemia, particularly in children. When planning a diagnostic study on children suspected of severe malaria in sub-Saharan Africa, the question arose of how much blood could be safely sampled; the intended blood volumes (blood cultures and EDTA blood) were 6 mL (children aged <6 years) and 10 mL (6-12 years). A previous review [Bull World Health Organ. 89: 46-53. 2011] recommended not exceeding 3.8% of total blood volume (TBV). In a simulation exercise using data from children previously enrolled in a study of severe malaria and bacteraemia in Burkina Faso, the impact of this 3.8% safety guideline was evaluated. For a total of 666 children aged >2 months to <12 years, data on age, weight and haemoglobin value (Hb) were available. For each child, the estimated TBV (TBVe, in mL) was calculated by multiplying the body weight (kg) by a factor of 80 mL/kg. Next, TBVe was corrected for the degree of anaemia to obtain the functional TBV (TBVf). The correction factor was the ratio of the child's Hb to the reference Hb; both the lowest ('best case') and highest ('worst case') reference Hb values were used. The volume corresponding to a 3.8% proportion of this TBVf was then calculated and compared with the blood volumes intended to be sampled. When applied to the Burkina Faso cohort, the simulation exercise showed that in 5.3% (best case) and 11.4% (worst case) of children, the blood volume intended to be sampled would exceed the volume defined by the 3.8% safety guideline. The highest proportions were in the age groups 2-6 months (19.0%; worst case scenario) and 6 months-2 years (15.7%; worst case scenario). A positive rapid diagnostic test for P. falciparum was associated with an increased risk of violating the safety guideline in the worst case scenario (p = 0.016). Blood sampling in children for research in P. falciparum endemic settings may easily violate the proposed safety guideline when applied to TBVf. Ethical committees and researchers should be wary of this and take appropriate precautions.
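
    The per-child calculation described in the abstract (80 mL/kg, Hb correction, 3.8% cap) can be sketched directly; the child's weight and Hb values below are hypothetical:

```python
def max_sample_volume_ml(weight_kg, hb_child, hb_ref, safety_fraction=0.038):
    """Sketch of the abstract's calculation: estimated total blood volume
    TBVe = 80 mL/kg x body weight, corrected for anaemia by the ratio of
    the child's Hb to the reference Hb (giving the functional TBVf), of
    which at most 3.8% should be sampled."""
    tbv_e = 80.0 * weight_kg             # estimated TBV (mL)
    tbv_f = tbv_e * (hb_child / hb_ref)  # functional TBV, anaemia-corrected
    return safety_fraction * tbv_f

# Hypothetical severely anaemic child: 10 kg, Hb 6 g/dL vs reference 12 g/dL
limit = max_sample_volume_ml(10.0, 6.0, 12.0)
print(f"maximum safe sample volume: {limit:.1f} mL")  # 15.2 mL
```

    Note how the anaemia correction halves the allowable volume for this child relative to the uncorrected TBVe, which is why anaemic children are more likely to breach the guideline.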

  13. How to simulate pedestrian behaviors in seismic evacuation for vulnerability reduction of existing buildings

    NASA Astrophysics Data System (ADS)

    Quagliarini, Enrico; Bernardini, Gabriele; D'Orazio, Marco

    2017-07-01

    Understanding and representing how individuals behave in earthquake emergencies is essential for assessing the impact of vulnerability reduction strategies on existing buildings in seismic areas. In fact, interactions between individuals and the scenario (modified by the earthquake occurrence) are very important for understanding the possible additional risks for people, especially during the evacuation phase. The current approach is based on "qualitative" aspects, aimed at defining best-practice guidelines for Civil Protection and populations. By contrast, a "quantitative" description of human response and evacuation motion in such conditions is urgently needed. Hence, this work defines the rules for pedestrians' earthquake evacuation in urban scenarios, taking advantage of previous results from real-world evacuation analyses. In particular, a motion law for pedestrians is defined by modifying the Social Force model equation. The proposed model could be used to evaluate individuals' evacuation processes and so to define operative strategies for reducing interferences in critical parts of the urban fabric (e.g., interventions on particular buildings, definition of evacuation strategies, design of city parts).

  14. Impacts of feeding less food-competing feedstuffs to livestock on global food system sustainability.

    PubMed

    Schader, Christian; Muller, Adrian; Scialabba, Nadia El-Hage; Hecht, Judith; Isensee, Anne; Erb, Karl-Heinz; Smith, Pete; Makkar, Harinder P S; Klocke, Peter; Leiber, Florian; Schwegler, Patrizia; Stolze, Matthias; Niggli, Urs

    2015-12-06

    Increasing efficiency in livestock production and reducing the share of animal products in human consumption are two strategies to curb the adverse environmental impacts of the livestock sector. Here, we explore the room for sustainable livestock production by modelling the impacts and constraints of a third strategy in which livestock feed components that compete with direct human food crop production are reduced. Thus, in the outmost scenario, animals are fed only from grassland and by-products from food production. We show that this strategy could provide sufficient food (equal amounts of human-digestible energy and a similar protein/calorie ratio as in the reference scenario for 2050) and reduce environmental impacts compared with the reference scenario (in the most extreme case of zero human-edible concentrate feed: greenhouse gas emissions -18%; arable land occupation -26%, N-surplus -46%; P-surplus -40%; non-renewable energy use -36%, pesticide use intensity -22%, freshwater use -21%, soil erosion potential -12%). These results occur despite the fact that environmental efficiency of livestock production is reduced compared with the reference scenario, which is the consequence of the grassland-based feed for ruminants and the less optimal feeding rations based on by-products for non-ruminants. This apparent contradiction results from considerable reductions of animal products in human diets (protein intake per capita from livestock products reduced by 71%). We show that such a strategy focusing on feed components which do not compete with direct human food consumption offers a viable complement to strategies focusing on increased efficiency in production or reduced shares of animal products in consumption. © 2015 The Authors.

  15. Prospective randomized comparison of standard didactic lecture versus high-fidelity simulation for radiology resident contrast reaction management training.

    PubMed

    Wang, Carolyn L; Schopp, Jennifer G; Petscavage, Jonelle M; Paladin, Angelisa M; Richardson, Michael L; Bush, William H

    2011-06-01

    The objective of our study was to assess whether high-fidelity simulation-based training is more effective than traditional didactic lecture to train radiology residents in the management of contrast reactions. This was a prospective study of 44 radiology residents randomized into a simulation group versus a lecture group. All residents attended a contrast reaction didactic lecture. Four months later, baseline knowledge was assessed with a written test, which we refer to as the "pretest." After the pretest, the 21 residents in the lecture group attended a repeat didactic lecture and the 23 residents in the simulation group underwent high-fidelity simulation-based training with five contrast reaction scenarios. Next, all residents took a second written test, which we refer to as the "posttest." Two months after the posttest, both groups took a third written test, which we refer to as the "delayed posttest," and underwent performance testing with a high-fidelity severe contrast reaction scenario graded on predefined critical actions. There was no statistically significant difference between the simulation and lecture group pretest, immediate posttest, or delayed posttest scores. The simulation group performed better than the lecture group on the severe contrast reaction simulation scenario (p = 0.001). The simulation group reported improved comfort in identifying and managing contrast reactions and administering medications after the simulation training (p ≤ 0.04) and was more comfortable than the control group (p = 0.03), which reported no change in comfort level after the repeat didactic lecture. When compared with didactic lecture, high-fidelity simulation-based training of contrast reaction management shows equal results on written test scores but improved performance during a high-fidelity severe contrast reaction simulation scenario.

  16. Impacts of feeding less food-competing feedstuffs to livestock on global food system sustainability

    PubMed Central

    Hecht, Judith; Isensee, Anne; Smith, Pete; Makkar, Harinder P. S.; Klocke, Peter; Leiber, Florian; Stolze, Matthias; Niggli, Urs

    2015-01-01

    Increasing efficiency in livestock production and reducing the share of animal products in human consumption are two strategies to curb the adverse environmental impacts of the livestock sector. Here, we explore the room for sustainable livestock production by modelling the impacts and constraints of a third strategy in which livestock feed components that compete with direct human food crop production are reduced. Thus, in the outmost scenario, animals are fed only from grassland and by-products from food production. We show that this strategy could provide sufficient food (equal amounts of human-digestible energy and a similar protein/calorie ratio as in the reference scenario for 2050) and reduce environmental impacts compared with the reference scenario (in the most extreme case of zero human-edible concentrate feed: greenhouse gas emissions −18%; arable land occupation −26%, N-surplus −46%; P-surplus −40%; non-renewable energy use −36%, pesticide use intensity −22%, freshwater use −21%, soil erosion potential −12%). These results occur despite the fact that environmental efficiency of livestock production is reduced compared with the reference scenario, which is the consequence of the grassland-based feed for ruminants and the less optimal feeding rations based on by-products for non-ruminants. This apparent contradiction results from considerable reductions of animal products in human diets (protein intake per capita from livestock products reduced by 71%). We show that such a strategy focusing on feed components which do not compete with direct human food consumption offers a viable complement to strategies focusing on increased efficiency in production or reduced shares of animal products in consumption. PMID:26674194

  17. System and method for calibrating inter-star-tracker misalignments in a stellar inertial attitude determination system

    NASA Technical Reports Server (NTRS)

    Li, Rongsheng (Inventor); Wu, Yeong-Wei Andy (Inventor); Hein, Douglas H. (Inventor)

    2004-01-01

    A method and apparatus for determining star tracker misalignments is disclosed. The method comprises the steps of defining a reference frame for the star tracker assembly according to a boresight of the primary star tracker and a boresight of a second star tracker, wherein the boresight of the primary star tracker and the plane spanned by the boresight of the primary star tracker and the boresight of the second star tracker at least partially define a datum for the reference frame of the star tracker assembly; and determining the misalignment of the at least one star tracker as a rotation of the defined reference frame.
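
    A triad-style construction is one common way to build a frame from two boresight vectors as described above; this sketch is an illustrative assumption, not the patent's exact procedure:

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors (tuples)."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    """Unit vector in the direction of v."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def tracker_frame(primary_boresight, second_boresight):
    """Triad-style frame: x along the primary boresight, z normal to the
    plane spanned by the two boresights, y completing a right-handed set."""
    x = normalize(primary_boresight)
    z = normalize(cross(primary_boresight, second_boresight))
    y = cross(z, x)
    return x, y, z

x, y, z = tracker_frame((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
print(x, y, z)
```

    The first axis and the plane normal are exactly the datum elements the claim describes: the primary boresight and the plane spanned by the two boresights.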

  18. Impact of one's own mobile phone in stand-by mode on personal radiofrequency electromagnetic field exposure.

    PubMed

    Urbinello, Damiano; Röösli, Martin

    2013-01-01

    When moving around, mobile phones in stand-by mode periodically send data about their positions. The aim of this paper is to evaluate how personal radiofrequency electromagnetic field (RF-EMF) measurements are affected by such location updates. Exposure from a mobile phone handset (uplink) was measured during commuting using a randomized cross-over study with three different scenarios: a disabled mobile phone (reference), an activated dual-band phone and a quad-band phone. In the reference scenario, uplink exposure was highest during train rides (1.19 mW/m(2)) and lowest during car rides in rural areas (0.001 mW/m(2)). In public transport, the impact of one's own mobile phone on personal RF-EMF measurements was not observable because of high background uplink radiation from other people's mobile phones. In a car, uplink exposure with an activated phone was orders of magnitude higher than in the reference scenario. This study demonstrates that personal RF-EMF exposure is affected by one's own mobile phone in stand-by mode because of its regular location updates. Further dosimetric studies should quantify the contribution of location updates to the total RF-EMF exposure in order to clarify whether the duration of mobile phone use, the most common exposure surrogate in epidemiological RF-EMF research, is actually an adequate exposure proxy.

  19. Definition of run-off-road crash clusters-For safety benefit estimation and driver assistance development.

    PubMed

    Nilsson, Daniel; Lindman, Magdalena; Victor, Trent; Dozza, Marco

    2018-04-01

    Single-vehicle run-off-road crashes are a major traffic safety concern, as they are associated with a high proportion of fatal outcomes. In addressing run-off-road crashes, the development and evaluation of advanced driver assistance systems requires test scenarios that are representative of the variability found in real-world crashes. We apply hierarchical agglomerative cluster analysis to define similarities in a set of crash data variables; these clusters can then be used as the basis for test scenario development. Out of 13 clusters, nine test scenarios are derived, corresponding to crashes characterised by: drivers drifting off the road in daytime and night-time, high speed departures, high-angle departures on narrow roads, highways, snowy roads, loss-of-control on wet roadways, sharp curves, and high speeds on roads with severe road surface conditions. In addition, each cluster was analysed with respect to crash variables related to the crash cause and reason for the unintended lane departure. The study shows that cluster analysis of representative data provides a statistically based method to identify relevant properties for run-off-road test scenarios. This was done to support development of vehicle-based run-off-road countermeasures and driver behaviour models used in virtual testing. Future studies should use driver behaviour from naturalistic driving data to further define how test scenarios and behavioural causation mechanisms should be included. Copyright © 2018 Elsevier Ltd. All rights reserved.
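
    The clustering step described above can be sketched in miniature. The crash variables, values, and cluster count below are hypothetical stand-ins, and the naive single-linkage implementation is for illustration only; the study's actual analysis runs on full crash databases with many more variables.

```python
from itertools import combinations

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def agglomerative(points, n_clusters):
    """Single-linkage hierarchical agglomerative clustering (naive O(n^3))."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > n_clusters:
        # Merge the pair of clusters with the smallest single-linkage distance.
        a, b = min(
            combinations(range(len(clusters)), 2),
            key=lambda p: min(
                euclidean(points[i], points[j])
                for i in clusters[p[0]] for j in clusters[p[1]]
            ),
        )
        clusters[a].extend(clusters[b])
        del clusters[b]
    return clusters

# Hypothetical crash records: (departure speed km/h, departure angle deg, friction)
crashes = [(110, 5, 0.8), (115, 4, 0.8), (60, 30, 0.3), (55, 35, 0.3)]
print(agglomerative(crashes, 2))  # two clusters of record indices
```

    In practice the number of clusters is chosen from the dendrogram rather than fixed in advance, and mixed categorical/continuous crash variables require a distance measure other than plain Euclidean.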

  20. Differences in case-mix can influence the comparison of standardised mortality ratios even with optimal risk adjustment: an analysis of data from paediatric intensive care.

    PubMed

    Manktelow, Bradley N; Evans, T Alun; Draper, Elizabeth S

    2014-09-01

    The publication of clinical outcomes for consultant surgeons in 10 specialties within the NHS has, along with national clinical audits, highlighted the importance of measuring and reporting outcomes with the aim of monitoring quality of care. Such information is vital to be able to identify good and poor practice and to inform patient choice. The need to adequately adjust outcomes for differences in case-mix has long been recognised as being necessary to provide 'like-for-like' comparisons between providers. However, directly comparing values of the standardised mortality ratio (SMR) between different healthcare providers can be misleading even when the risk-adjustment perfectly quantifies the risk of a poor outcome in the reference population. An example is shown from paediatric intensive care. Using observed case-mix differences for 33 paediatric intensive care units (PICUs) in the UK and Ireland for 2009-2011, SMRs were calculated under four different scenarios where, in each scenario, all of the PICUs were performing identically for each patient type. Each scenario represented a clinically plausible difference in outcome from the reference population. Despite the fact that the outcome for any patient was the same no matter which PICU they were to be admitted to, differences between the units were seen when compared using the SMR: scenario 1, 1.07-1.21; scenario 2, 1.00-1.14; scenario 3, 1.04-1.13; scenario 4, 1.00-1.09. Even if two healthcare providers are performing equally for each type of patient, if their patient populations differ in case-mix their SMRs will not necessarily take the same value. Clinical teams and commissioners must always keep in mind this weakness of the SMR when making decisions. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
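
    The arithmetic behind this finding is easy to reproduce. In the sketch below, both units perform identically for every patient type, yet their SMRs differ because their case-mixes differ; all risk values and patient counts are invented for illustration, not taken from the PICU data.

```python
# Risk-adjusted mortality per patient type in the reference population.
reference_risk = {"low": 0.01, "high": 0.20}
# Both units perform identically per patient type: at reference risk for
# "low" patients, and 1.1x reference risk for "high" patients.
observed_risk = {"low": 0.01, "high": 0.22}

def smr(case_mix):
    """SMR = observed deaths / expected deaths for a given case-mix."""
    observed = sum(n * observed_risk[t] for t, n in case_mix.items())
    expected = sum(n * reference_risk[t] for t, n in case_mix.items())
    return observed / expected

unit_a = {"low": 900, "high": 100}   # mostly low-risk admissions
unit_b = {"low": 100, "high": 900}   # mostly high-risk admissions

print(round(smr(unit_a), 3), round(smr(unit_b), 3))  # differ despite equal care
```

    Unit A's SMR is 31/29 ≈ 1.07 while unit B's is 199/181 ≈ 1.10, so ranking the units by SMR would wrongly suggest a performance difference.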

  1. Exploring Persona-Scenarios - Using Storytelling to Create Design Ideas

    NASA Astrophysics Data System (ADS)

    Madsen, Sabine; Nielsen, Lene

    This paper explores the persona-scenario method by investigating how the method can support project participants in generating shared understandings and design ideas. As persona-scenarios are stories, we draw on narrative theory to define what a persona-scenario is and which narrative elements it should consist of. Based on an empirical study, a key finding is that, despite our inherent human ability to construct, tell, and interpret stories, it is not easy to write and present a good, coherent, and design-oriented story without methodical support. The paper therefore contributes with guidelines that delineate a) what a design-oriented persona-scenario should consist of (product) and b) how to write it (procedure) in order to generate and validate as many, new, and shared understandings and design ideas as possible (purpose). The purpose of the guidelines is to facilitate the construction of persona-scenarios as good, coherent stories, which make sense to the storytellers and to the audience - and which therefore generate many, new, and shared understandings and design ideas.

  2. Alternative Geothermal Power Production Scenarios

    DOE Data Explorer

    Sullivan, John

    2014-03-14

    The information given in this file pertains to Argonne LCAs of the plant cycle stage for a set of ten new geothermal scenario pairs, each comprised of a reference and improved case. These analyses were conducted to compare environmental performances among the scenarios and cases. The types of plants evaluated are hydrothermal binary and flash and Enhanced Geothermal Systems (EGS) binary and flash plants. Each scenario pair was developed by the LCOE group using GETEM as a way to identify plant operational and resource combinations that could reduce geothermal power plant LCOE values. Based on the specified plant and well field characteristics (plant type, capacity, capacity factor and lifetime, and well numbers and depths) for each case of each pair, Argonne generated a corresponding set of material to power ratios (MPRs) and greenhouse gas and fossil energy ratios.

  3. Comparison: Mediation Solutions of WSMOLX and WebML/WebRatio

    NASA Astrophysics Data System (ADS)

    Zaremba, Maciej; Zaharia, Raluca; Turati, Andrea; Brambilla, Marco; Vitvar, Tomas; Ceri, Stefano

    In this chapter we compare the WSMO/WSML/WSMX and WebML/WebRatio approaches to the SWS-Challenge workshop mediation scenario in terms of the utilized underlying technologies and delivered solutions. In the mediation scenario one partner uses RosettaNet to define its B2B protocol while the other one operates on a proprietary solution. Both teams showed how these partners could be semantically integrated.

  4. Experimental Optimization of Exposure Index and Quality of Service in Wlan Networks.

    PubMed

    Plets, David; Vermeeren, Günter; De Poorter, Eli; Moerman, Ingrid; Goudos, Sotirios K; Martens, Luc; Joseph, Wout

    2017-07-01

    This paper presents the first real-life optimization of the Exposure Index (EI). A genetic optimization algorithm is developed and applied to three real-life Wireless Local Area Network scenarios in an experimental testbed. The optimization accounts for downlink, uplink and uplink of other users, for realistic duty cycles, and ensures a sufficient Quality of Service to all users. EI reductions up to 97.5% compared to a reference configuration can be achieved in a downlink-only scenario, in combination with an improved Quality of Service. Due to the dominance of uplink exposure and the lack of WiFi power control, no optimizations are possible in scenarios that also consider uplink traffic. However, future deployments that do implement WiFi power control can be successfully optimized, with EI reductions up to 86% compared to a reference configuration and an EI that is 278 times lower than optimized configurations under the absence of power control. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
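
    The shape of the genetic optimization described here can be sketched with a toy stand-in: transmit powers are the genes, a penalty enforces the Quality of Service constraint, and selection/crossover/mutation minimize a crude exposure proxy. The path gains, threshold, and fitness function below are all invented for illustration and bear no relation to the paper's testbed or to the actual Exposure Index.

```python
import random

random.seed(1)

# Toy model: choose transmit powers (0..1) for 3 access points.
# "Exposure" grows with total power; QoS requires each user's best
# received power to exceed a threshold. All numbers are illustrative.
THRESHOLD = 0.3
USERS = [(0.9, 0.1, 0.2), (0.1, 0.8, 0.3)]  # path gains, user x AP

def qos_ok(powers):
    return all(max(g * p for g, p in zip(gains, powers)) >= THRESHOLD
               for gains in USERS)

def exposure(powers):
    return sum(powers) if qos_ok(powers) else float("inf")  # penalise QoS loss

def genetic_minimise(generations=200, pop_size=30):
    pop = [[random.random() for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=exposure)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, 3)          # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(3)               # gaussian mutation, clipped
            child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.05)))
            children.append(child)
        pop = survivors + children
    return min(pop, key=exposure)

best = genetic_minimise()
print(qos_ok(best), round(exposure(best), 2))
```

    For this toy instance the constrained optimum is roughly 0.71 (serve each user from its strongest access point at the minimum feasible power); the GA should converge near it.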

  5. Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.

    PubMed

    Lo, Y C; Armbruster, David A

    2012-04-01

    Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers to participate and testing samples from a significant number of healthy reference individuals. Historical literature citation intervals are often suboptimal because they may be based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP Evaluator software program, which is based on the CLSI/IFCC C28-A guideline, and defines the reference interval as the 95% central range. Method-specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for genders combined. Gender-specific or combined-gender intervals were adopted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. Intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adopt these intervals if validated and deemed suitable for their patient population. Laboratories using different methodologies may be able to successfully adapt the intervals for their facilities using the reference interval transference technique based on a method comparison study.
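
    The 95% central range used as the reference interval can be computed nonparametrically as the 2.5th and 97.5th percentiles of the reference sample. The sketch below uses synthetic analyte values and a simple interpolated percentile; real guideline calculations (CLSI C28-A) add outlier screening and partitioning by gender or age.

```python
def reference_interval(values, central=0.95):
    """Nonparametric central range: interpolated lower/upper percentiles."""
    xs = sorted(values)
    n = len(xs)
    def percentile(p):
        # Linear interpolation between order statistics.
        k = p * (n - 1)
        lo = int(k)
        hi = min(lo + 1, n - 1)
        return xs[lo] + (k - lo) * (xs[hi] - xs[lo])
    tail = (1 - central) / 2
    return percentile(tail), percentile(1 - tail)

# Hypothetical analyte results from 201 reference individuals
values = [3.0 + 0.01 * i for i in range(201)]
low, high = reference_interval(values)
print(round(low, 3), round(high, 3))  # 95% central range
```

    With 201 evenly spaced values from 3.00 to 5.00 this yields an interval of about 3.05 to 4.95; the guideline also recommends a minimum of 120 reference individuals per partition for nonparametric estimation.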

  6. Alaska OCS socioeconomic studies program: St. George basin petroleum development scenarios, Anchorage impact analysis. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ender, R.L.; Gorski, S.

    1981-10-01

    The report consists of an update to the Anchorage socioeconomic and physical baseline and infrastructure standards used to forecast impacts with and without OCS oil and gas development in Alaska. This material is found in Technical Report 43, Volumes 1 and 2 entitled 'Gulf of Alaska and Lower Cook Inlet Petroleum Development Scenarios, Anchorage Socioeconomic and Physical Baseline and Anchorage Impact Analysis.' These updates should be read in conjunction with the above report. In addition, the Anchorage base case and petroleum development scenarios for the St. George Basin are given. These sections are written to stand alone without reference.

  7. Selection and Training of Field Artillery Forward Observers: Methodologies for Improving Target Acquisition Skills

    DTIC Science & Technology

    1979-07-01

    African scenario.) The training analysis revealed some discrepancies between the list of tasks taught in FAOBC and the list of tasks emerging from the... population density. (Refer to Figure 3-2). The African combat scenario, closely followed by the Middle Eastern scenario, was rated as being the most

  8. Optimisation study of a vehicle bumper subsystem with fuzzy parameters

    NASA Astrophysics Data System (ADS)

    Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.

    2012-10-01

    This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. The automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and regarding the environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, thereby facilitating automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system's level of failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for the uncertainty treatment in crash simulations. This process is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).

  9. The ARIEL mission reference sample

    NASA Astrophysics Data System (ADS)

    Zingales, Tiziano; Tinetti, Giovanna; Pillitteri, Ignazio; Leconte, Jérémy; Micela, Giuseppina; Sarkar, Subhajit

    2018-02-01

    The ARIEL (Atmospheric Remote-sensing Exoplanet Large-survey) mission concept is one of the three M4 mission candidates selected by the European Space Agency (ESA) for a Phase A study, competing for a launch in 2026. ARIEL has been designed to study the physical and chemical properties of a large and diverse sample of exoplanets and, through those, understand how planets form and evolve in our galaxy. Here we describe the assumptions made to estimate an optimal sample of exoplanets - including already known exoplanets and expected ones yet to be discovered - observable by ARIEL and define a realistic mission scenario. To achieve the mission objectives, the sample should include gaseous and rocky planets with a range of temperatures around stars of different spectral type and metallicity. The current ARIEL design enables the observation of ˜1000 planets, covering a broad range of planetary and stellar parameters, during its four year mission lifetime. This nominal list of planets is expected to evolve over the years depending on the new exoplanet discoveries.

  10. Dreaming of Atmospheres

    NASA Astrophysics Data System (ADS)

    Waldmann, I. P.

    2016-04-01

    Here, we introduce the RobERt (Robotic Exoplanet Recognition) algorithm for the classification of exoplanetary emission spectra. Spectral retrieval of exoplanetary atmospheres frequently requires the preselection of molecular/atomic opacities to be defined by the user. In the era of open-source, automated, and self-sufficient retrieval algorithms, manual input should be avoided. User dependent input could, in worst-case scenarios, lead to incomplete models and biases in the retrieval. The RobERt algorithm is based on deep-belief neural (DBN) networks trained to accurately recognize molecular signatures for a wide range of planets, atmospheric thermal profiles, and compositions. Reconstructions of the learned features, also referred to as the “dreams” of the network, indicate good convergence and an accurate representation of molecular features in the DBN. Using these deep neural networks, we work toward retrieval algorithms that themselves understand the nature of the observed spectra, are able to learn from current and past data, and make sensible qualitative preselections of atmospheric opacities to be used for the quantitative stage of the retrieval process.

  11. Orientation and metacognition in virtual space.

    PubMed

    Tenbrink, Thora; Salwiczek, Lucie H

    2016-05-01

    Cognitive scientists increasingly use virtual reality scenarios to address spatial perception, orientation, and navigation. If based on desktops rather than mobile immersive environments, this involves a discrepancy between the physically experienced static position and the visually perceived dynamic scene, leading to cognitive challenges that users of virtual worlds may or may not be aware of. The frequently reported loss of orientation and worse performance in point-to-origin tasks relate to the difficulty of establishing a consistent reference system on an allocentric or egocentric basis. We address the verbalizability of spatial concepts relevant in this regard, along with the conscious strategies reported by participants. Behavioral and verbal data were collected using a perceptually sparse virtual tunnel scenario that has frequently been used to differentiate between humans' preferred reference systems. Surprisingly, the linguistic data we collected relate to reference system verbalizations known from the earlier literature only to a limited extent, but instead reveal complex cognitive mechanisms and strategies. Orientation in desktop virtual reality appears to pose considerable challenges, which participants react to by conceptualizing the task in individual ways that do not systematically relate to the generic concepts of egocentric and allocentric reference frames. (c) 2016 APA, all rights reserved.

  12. Environmental life cycle assessment of different domestic wastewater streams: policy effectiveness in a tropical urban environment.

    PubMed

    Ng, Bernard J H; Zhou, Jin; Giannis, Apostolos; Chang, Victor W-C; Wang, Jing-Yuan

    2014-07-01

    To enhance local water security, the Singapore government promotes two water conservation policies: the use of eco-friendly toilets to reduce yellow water (YW) disposal and the installation of water efficient devices to minimize gray water (GW) discharge. The proposed water conservation policies have different impacts on the environmental performance of local wastewater management. The main purpose of this study is to examine and compare the impacts of different domestic wastewater streams and the effectiveness of two water conservation policies by means of life cycle assessment (LCA). LCA is used to compare three scenarios, including a baseline scenario (BL), YW-reduced scenario (YWR) and GW-reduced scenario (GWR). The BL is designed based on the current wastewater management system, whereas the latter two scenarios are constructed according to the two water conservation policies that are proposed by the Singapore government. The software SimaPro 7.3 with local data and the ecoinvent database is used to build up the model, and the functional unit is defined as the daily wastewater disposal of a Singapore resident. Due to local water supply characteristics, the system boundary is extended to include the sewage sludge management and tap water production processes. The characterization results indicate that the GWR has a significant impact reduction (22-25%) while the YWR has only a 2-4% impact reduction compared with the BL. The contribution analysis reveals that the GW dominates many impact categories except eutrophication potential. The tap water production is identified as the most influential process due to its high embodied energy demand in a local context. Life cycle costing analysis shows that both YWR and GWR are financially favorable. It is also revealed that the current water conservation policies could only achieve Singapore's short-term targets. Therefore, two additional strategies are recommended for achieving long-term goals. 
This study provides a comprehensive and reliable environmental profile of Singapore's wastewater management with the help of extended system boundary and local data. This work also fills the research gap of previous studies by identifying the contribution of different wastewater streams, which would serve as a good reference for source-separating sanitation system design. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Hazard Evaluation in Valparaíso: the MAR VASTO Project

    NASA Astrophysics Data System (ADS)

    Indirli, Maurizio; Razafindrakoto, Hoby; Romanelli, Fabio; Puglisi, Claudio; Lanzoni, Luca; Milani, Enrico; Munari, Marco; Apablaza, Sotero

    2011-03-01

    The Project "MAR VASTO" (Risk Management in Valparaíso/Manejo de Riesgos en Valparaíso), funded by BID/IADB (Banco InterAmericano de Desarrollo/InterAmerican Development Bank), has been managed by ENEA, with an Italian/Chilean joined partnership and the support of local institutions. Valparaíso tells the never-ending story of a tight interaction between society and environment and the city has been declared a Patrimony of Humanity by UNESCO since 2003. The main goals of the project have been to evaluate in the Valparaíso urban area the impact of main hazards (earthquake, tsunami, fire, and landslide), defining scenarios and maps on a geo-referenced GIS database. In particular, for earthquake hazard assessment the realistic modelling of ground motion is a very important base of knowledge for the preparation of groundshaking scenarios which serve as a valid and economic tool to be fruitfully used by civil engineers, supplying a particularly powerful tool for the prevention aspects of Civil Defense. When numerical modelling is successfully compared with records (as in the case of the Valparaíso, 1985 earthquake), the resulting synthetic seismograms permit the generation of groundshaking maps, based upon a set of possible scenario earthquakes. Where no recordings are available for the scenario event, synthetic signals can be used to estimate ground motion without having to wait for a strong earthquake to occur (pre-disaster microzonation). For the tsunami hazard, the available reports, [e.g., SHOA (1999) Carta de Inundacion por Tsunami para la bahia de Valparaíso, Chile, http://www.shoa.cl/servicios/citsu/citsu.php], have been used as the reference documents for the hazard assessment for the Valparaíso site. 
The deep and detailed studies already carried out by SHOA have been complemented with (a) sets of parametric studies of the tsunamigenic potential of the 1985 and 1906 scenario earthquakes; and (b) analytical modelling of tsunami waveforms for different scenarios, in order to provide a complementary dataset to be used for the tsunami hazard assessment at Valparaíso. In addition, other targeted activities have been carried out, such as architectonic/urban planning studies/vulnerability evaluation for a pilot building stock in a historic area and a vulnerability analysis for three monumental churches. In this paper, a general description of the work is given, taking into account the in situ work that drove the suggestion of guidelines for mitigation actions.

  14. Analyzing Uncertainty and Risk in the Management of Water Resources in the State Of Texas

    NASA Astrophysics Data System (ADS)

    Singh, A.; Hauffpauir, R.; Mishra, S.; Lavenue, M.

    2010-12-01

    The State of Texas updates its state water plan every five years to determine the water demand required to meet its growing population. The plan compiles forecasts of water deficits from state-wide regional water planning groups as well as the water supply strategies to address these deficits. To date, the plan has adopted a deterministic framework, where reference values (e.g., best estimates, worst-case scenario) are used for key factors such as population growth, demand for water, severity of drought, water availability, etc. These key factors can, however, be affected by multiple sources of uncertainties such as - the impact of climate on surface water and groundwater availability, uncertainty in population projections, changes in sectoral composition of the economy, variability in water usage, feasibility of the permitting process, cost of implementation, etc. The objective of this study was to develop a generalized and scalable methodology for addressing uncertainty and risk in water resources management both at the regional and the local water planning level. The study proposes a framework defining the elements of an end-to-end system model that captures the key components of demand, supply and planning modules along with their associated uncertainties. The framework preserves the fundamental elements of the well-established planning process in the State of Texas, promoting an incremental and stakeholder-driven approach to adding different levels of uncertainty (and risk) into the decision-making environment. The uncertainty in the water planning process is broken down into two primary categories: demand uncertainty and supply uncertainty. Uncertainty in Demand is related to the uncertainty in population projections and the per-capita usage rates. Uncertainty in Supply, in turn, is dominated by the uncertainty in future climate conditions. 
Climate is represented in terms of time series of precipitation, temperature and/or surface evaporation flux for some future time period of interest, which can be obtained as outputs of global climate models (GCMs). These are then linked with hydrologic and water-availability models (WAMs) to estimate water availability for the worst drought conditions under each future climate scenario. Combining the demand scenarios with the water availability scenarios yields multiple scenarios for water shortage (or surplus). Given multiple shortage/surplus scenarios, various water management strategies can be assessed to evaluate the reliability of meeting projected deficits. These reliabilities are then used within a multi-criteria decision-framework to assess trade-offs between various water management objectives, thus helping to make more robust decisions while planning for the water needs of the future.
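
    Combining demand and supply scenarios into shortage/surplus pairings, and then scoring a strategy's reliability across them, can be sketched as follows; every figure and scenario name below is hypothetical and only illustrates the scenario-combination step, not the Texas planning models.

```python
# Hypothetical annual demand and supply scenarios (acre-feet/yr).
demand = {"low_growth": 900, "mid_growth": 1100, "high_growth": 1300}
supply = {"wet": 1200, "median": 1000, "drought_of_record": 700}

# Every (demand, supply) pairing yields a shortage (negative) or surplus.
balances = {
    (d, s): supply_af - demand_af
    for d, demand_af in demand.items()
    for s, supply_af in supply.items()
}

# Reliability of a strategy adding a hypothetical 200 af/yr of new supply:
strategy_yield = 200
reliable = sum(1 for v in balances.values() if v + strategy_yield >= 0)
print(f"{reliable}/{len(balances)} scenario pairs met")
```

    Here the strategy covers 6 of the 9 pairings; a multi-criteria framework would weigh that reliability against cost and other management objectives.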

  15. Variation in the Gender Gap in Inactive and Active Life Expectancy by the Definition of Inactivity Among Older Adults.

    PubMed

    Malhotra, Rahul; Chan, Angelique; Ajay, Shweta; Ma, Stefan; Saito, Yasuhiko

    2016-10-01

    To assess variation in gender gap (female-male) in inactive life expectancy (IALE) and active life expectancy (ALE) by definition of inactivity. Inactivity, among older Singaporeans, was defined as follows: Scenario 1-health-related difficulty in activities of daily living (ADLs); Scenario 2-health-related difficulty in ADLs/instrumental ADLs (IADLs); Scenario 3-health-related difficulty in ADLs/IADLs or non-health-related non-performance of IADLs. Multistate life tables computed IALE and ALE at age 60, testing three hypotheses: In all scenarios, life expectancy, absolute and relative IALE, and absolute ALE are higher for females (Hypothesis 1 [H1]); gender gap in absolute and relative IALE expands, and in absolute ALE, it contracts in Scenario 2 versus 1 (Hypothesis 2 [H2]); gender gap in absolute and relative IALE decreases, and in absolute ALE, it increases in Scenario 3 versus 2 (Hypothesis 3 [H3]). H1 was supported in Scenarios 1 and 3 but not Scenario 2. Both H2 and H3 were supported. Definition of inactivity influences gender gap in IALE and ALE. © The Author(s) 2016.

  16. 75 FR 37864 - Self-Regulatory Organizations; The Options Clearing Corporation; Notice of Filing of Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-30

    ... and Rules To Establish a Clearing Fund Amount Intended To Support Losses Under a Defined Set of... intended to support losses under a defined set of default scenarios. II. Self-Regulatory Organization's... discussed any comments it received on the proposed rule change. The text of these statements may be examined...

  17. Scenario Development for the Southwestern United States

    NASA Astrophysics Data System (ADS)

    Mahmoud, M.; Gupta, H.; Stewart, S.; Liu, Y.; Hartmann, H.; Wagener, T.

    2006-12-01

    The primary goal of employing a scenario development approach for the U.S. southwest is to inform regional policy by examining future possibilities related to regional vegetation change, water-leasing, and riparian restoration. This approach is necessary due to a lack of existing explicit water resources application of scenarios to the entire southwest region. A formal approach for scenario development is adopted and applied towards water resources issues within the arid and semi-arid regions of the U.S. southwest following five progressive and reiterative phases: scenario definition, scenario construction, scenario analysis, scenario assessment, and risk management. In the scenario definition phase, the inputs of scientists, modelers, and stakeholders were collected in order to define and construct relevant scenarios to the southwest and its water sustainability needs. From stakeholder-driven scenario workshops and breakout sessions, the three main axes of principal change were identified to be climate change, population development patterns, and quality of information monitoring technology. Based on the extreme and varying conditions of these three main axes, eight scenario narratives were drafted to describe the state of each scenario's respective future and the events which led to it. Events and situations are described within each scenario narrative with respect to key variables; variables that are both important to regional water resources (as distinguished by scientists and modelers), and are good tracking and monitoring indicators of change. The current phase consists of scenario construction, where the drafted scenarios are re-presented to regional scientists and modelers to verify that proper key variables are included (or excluded) from the eight narratives. 
    The next step is to construct the data sets necessary to implement the eight scenarios on the respective computational models of modelers investigating vegetation change, water-leasing, and riparian restoration in the southwest.

  18. Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

    2014-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.

  19. Hazard map for volcanic ballistic impacts at Popocatépetl volcano (Mexico)

    NASA Astrophysics Data System (ADS)

    Alatorre-Ibargüengoitia, Miguel A.; Delgado-Granados, Hugo; Dingwell, Donald B.

    2012-11-01

    During volcanic explosions, volcanic ballistic projectiles (VBP) are frequently ejected. These projectiles represent a threat to people, infrastructure, vegetation, and aircraft due to their high temperatures and impact velocities. In order to protect people adequately, it is necessary to delimit the projectiles' maximum range within well-defined explosion scenarios likely to occur in a particular volcano. In this study, a general methodology to delimit the hazard zones for VBP during volcanic eruptions is applied to Popocatépetl volcano. Three explosion scenarios with different intensities have been defined based on the past activity of the volcano and parameterized by considering the maximum kinetic energy associated with VBP ejected during previous eruptions. A ballistic model is used to reconstruct the "launching" kinetic energy of VBP observed in the field. In the case of Vulcanian eruptions, the most common type of activity at Popocatépetl, the ballistic model was used in concert with an eruptive model to correlate ballistic range with initial pressure and gas content, parameters that can be estimated by monitoring techniques. The results are validated with field data and video observations of different Vulcanian eruptions at Popocatépetl. For each scenario, the ballistic model is used to calculate the maximum range of VBP under optimum "launching" conditions: ballistic diameter, ejection angle, topography, and wind velocity. Our results are presented in the form of a VBP hazard map with topographic profiles that depict the likely maximum ranges of VBP under explosion scenarios defined specifically for Popocatépetl volcano. The hazard zones shown on the map allow the responsible authorities to plan the definition and mitigation of restricted areas during volcanic crises.
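    The kind of "launching" kinetic-energy parameterization described above can be sketched numerically. The following is a minimal illustration, not the authors' model: it integrates a 2-D point-mass trajectory with quadratic drag on flat topography with no wind, and all parameter values (mass, diameter, drag coefficient, launch energy) are illustrative assumptions, not Popocatépetl-specific.

```python
import math

def ballistic_range(kinetic_energy_j, mass_kg, angle_deg, diameter_m,
                    drag_coeff=0.62, air_density=1.0, dt=0.01):
    """Integrate a 2-D point-mass trajectory with quadratic drag and return
    the horizontal range on flat topography with no wind (illustrative only)."""
    speed0 = math.sqrt(2.0 * kinetic_energy_j / mass_kg)  # "launching" speed from KE
    theta = math.radians(angle_deg)
    vx, vy = speed0 * math.cos(theta), speed0 * math.sin(theta)
    x, y = 0.0, 0.0
    area = math.pi * (diameter_m / 2.0) ** 2
    k = 0.5 * drag_coeff * air_density * area / mass_kg   # drag accel. per speed^2
    g = 9.81
    while y >= 0.0:                                       # stop at ground level
        v = math.hypot(vx, vy)
        vx += -k * v * vx * dt
        vy += (-g - k * v * vy) * dt
        x += vx * dt
        y += vy * dt
    return x

# A ~40 kg, 0.3 m block ejected at 45 degrees with 10 MJ of kinetic energy:
max_range_m = ballistic_range(1.0e7, 40.0, 45.0, 0.3)
```

    With the drag term set to zero, the result reduces to the textbook vacuum range v² sin(2θ)/g, which is a convenient sanity check for this style of model.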

  20. Evaluation of new alternatives in wastewater treatment plants based on dynamic modelling and life cycle assessment (DM-LCA).

    PubMed

    Bisinella de Faria, A B; Spérandio, M; Ahmadi, A; Tiruta-Barna, L

    2015-11-01

    With a view to quantifying the energy and environmental advantages of Urine Source-Separation (USS) combined with different treatment processes, five wastewater treatment plant (WWTP) scenarios were compared to a reference scenario using Dynamic Modelling (DM) and Life Cycle Assessment (LCA), and an integrated DM-LCA framework was thus developed. Dynamic simulations were carried out in BioWin(®) in order to obtain a realistic evaluation of the dynamic behaviour and performance of plants under perturbation. LCA calculations were performed within Umberto(®) using the Ecoinvent database. A Python™ interface was used to integrate and convert simulation data and to introduce them into Umberto(®) to achieve a complete LCA evaluation comprising foreground and background processes. Comparisons between steady-state and dynamic simulations revealed the importance of considering dynamic aspects such as nutrient and flow peaks. The results of the evaluation highlighted the potential of the USS scenario for nutrient recovery whereas the Enhanced Primary Clarification (EPC) scenario gave increased biogas production and also notably decreased aeration consumption, leading to a positive energy balance. Both USS and EPC scenarios also showed increased stability of plant operation, with smaller daily averages of total nitrogen and phosphorus. In this context, USS and EPC results demonstrated that the coupled USS + EPC scenario and its combinations with agricultural spreading of N-rich effluent and nitritation/anaerobic deammonification could present an energy-positive balance with respectively 27% and 33% lower energy requirements and an increase in biogas production of 23%, compared to the reference scenario. The coupled scenarios also presented lesser environmental impacts (reduction of 31% and 39% in total endpoint impacts) along with effluent quality well within the specified limits. 
The marked environmental performance (reduction of global warming) when nitrogen is used in agriculture shows the importance of future research on sustainable solutions for nitrogen recovery. The contribution analysis of midpoint impacts also showed hotspots that it will be important to optimize further, such as plant infrastructure and direct N2O emissions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Photobioreactor: Biotechnology for the Technology Education Classroom.

    ERIC Educational Resources Information Center

    Dunham, Trey; Wells, John; White, Karissa

    2002-01-01

    Describes a problem scenario involving photobioreactors and presents materials and resources, student project activities, and teaching and evaluation methods for use in the technology education classroom. (Contains 14 references.) (SK)

  2. Impact of Image Noise on Gamma Index Calculation

    NASA Astrophysics Data System (ADS)

    Chen, M.; Mo, X.; Parnell, D.; Olivera, G.; Galmarini, D.; Lu, W.

    2014-03-01

    Purpose: The Gamma Index defines an asymmetric metric between the evaluated image and the reference image. It provides a quantitative comparison that can be used to indicate sample-wise pass/fail on the agreement of the two images. The Gamma passing/failing rate has become an important clinical evaluation tool. However, the presence of noise in the evaluated and/or reference images may change the Gamma Index, hence the passing/failing rate, and further, clinical decisions. In this work, we systematically studied the impact of the image noise on the Gamma Index calculation. Methods: We used both analytic formulation and numerical calculations in our study. The numerical calculations included simulations and clinical images. Three different noise scenarios were studied in simulations: noise in reference images only, in evaluated images only, and in both. Both white and spatially correlated noises of various magnitudes were simulated. For clinical images of various noise levels, the Gamma Index of measurement against calculation, calculation against measurement, and measurement against measurement, were evaluated. Results: Numerical calculations for both the simulation and clinical data agreed with the analytic formulations, and the clinical data agreed with the simulations. For the Gamma Index of measurement against calculation, its distribution has an increased mean and an increased standard deviation as the noise increases. On the contrary, for the Gamma Index of calculation against measurement, its distribution has a decreased mean and stabilized standard deviation as the noise increases. White noise has greater impact on the Gamma Index than spatially correlated noise. Conclusions: The noise has significant impact on the Gamma Index calculation and the impact is asymmetric. The Gamma Index should be reported along with the noise levels in both reference and evaluated images. 
Reporting the Gamma Index with the roles of the reference and evaluated images switched, or reporting some composite metric, would be good practice.
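    As a concrete illustration of the metric discussed above, the sketch below computes a 1-D global Gamma Index. This is a generic implementation, not the authors' code: the 3%/3 mm criteria and the search convention (each reference point searched against all evaluated points) are common defaults assumed here, and swapping the two images changes the result, which is the asymmetry the abstract describes.

```python
import numpy as np

def gamma_index(reference, evaluated, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """1-D global Gamma Index of `evaluated` against `reference` (3%/3 mm by
    default). For each reference point, gamma is the minimum over all evaluated
    points of sqrt((dose diff / dose criterion)^2 + (distance / dist criterion)^2);
    a point passes when gamma <= 1."""
    ref = np.asarray(reference, dtype=float)
    ev = np.asarray(evaluated, dtype=float)
    x = np.arange(ref.size) * spacing_mm
    dose_crit = dose_tol * ref.max()          # global criterion: % of max dose
    gam = np.empty(ref.size)
    for i in range(ref.size):
        dose_term = (ev - ref[i]) / dose_crit
        dist_term = (x - x[i]) / dist_tol_mm
        gam[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gam

profile = np.array([0.0, 0.5, 1.0, 0.5, 0.0])
gam = gamma_index(profile, profile, spacing_mm=1.0)
passing_rate = float((gam <= 1.0).mean())     # identical images: all points pass
```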

  3. Novel pervasive scenarios for home management: the Butlers architecture.

    PubMed

    Denti, Enrico

    2014-01-01

    Many efforts today aim at energy saving, promoting users' awareness and virtuous behavior from a sustainability perspective. Our houses, appliances, energy meters and devices are becoming smarter and connected, domotics is increasing the possibilities for house automation and control, and ambient intelligence and assisted living are bringing attention to people's needs from different viewpoints. Our assumption is that considering these aspects together allows for novel, intriguing possibilities. To this end, in this paper we combine home energy management with domotics, coordination technologies, intelligent agents, ambient intelligence, ubiquitous technologies and gamification to devise novel scenarios, where energy monitoring and management is just the basic brick of a much wider and comprehensive home management system. The aim is to control home appliances well beyond energy consumption, combining home comfort, appliance scheduling, safety constraints, etc. with dynamically-changeable users' preferences, goals and priorities. At the same time, usability and attractiveness are seen as key success factors: so, the intriguing technologies available in most houses and smart devices are exploited to make the system configuration and use simpler, entertaining and attractive for users. These aspects are also integrated with ubiquitous and pervasive technologies, geo-localization, social networks and communities to provide enhanced functionalities and support smarter application scenarios, thereby further strengthening technology acceptance and diffusion. Accordingly, we first analyse the system requirements and define a reference multi-layer architectural model - the Butlers architecture - that specifies seven layers of functionalities, correlating the requirements, the corresponding technologies and the consequent value-added for users in each layer. 
Then, we outline a set of notable scenarios of increasing functionalities and complexity, discuss the structure of the corresponding system patterns in terms of the proposed architecture, and make this concrete by presenting some comprehensive interaction examples as comic strip stories. Next, we discuss the implementation requirements and how they can be met with the available technologies, discuss a possible architecture, refine it in the concrete case of the TuCSoN coordination technology, present a subsystem prototype and discuss its properties in the Butlers perspective.

  4. An Information-Centric Approach to Autonomous Trajectory Planning Utilizing Optimal Control Techniques

    DTIC Science & Technology

    2009-09-01

    ...to promote one way as the best, but to show there are several ways to define the problem. [Figure: Final Orientation/Obstacle Scenario] ...a comparison of the running cost vs. distance from an obstacle for varying values of p. Simulations have shown that for 4 ≤ p, the running cost... ...sliding door example. This scenario shows a major weakness when conducting trajectory planning using snapshots in a dynamic environment

  5. Climate balance of biogas upgrading systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pertl, A., E-mail: andreas.pertl@boku.ac.a; Mostbauer, P.; Obersteiner, G.

    2010-01-15

    One of the numerous applications of renewable energy is represented by the use of upgraded biogas where needed by feeding into the gas grid. The aim of the present study was to identify an upgrading scenario featuring minimum overall GHG emissions. The study was based on a life-cycle approach taking into account also GHG emissions resulting from plant cultivation to the process of energy conversion. For anaerobic digestion two substrates have been taken into account: (1) agricultural resources and (2) municipal organic waste. The study provides results for four different upgrading technologies including the BABIU (Bottom Ash for Biogas Upgrading) method. As the transport of bottom ash is a critical factor implicated in the BABIU-method, different transport distances and means of conveyance (lorry, train) have been considered. Furthermore, aspects including biogas compression and energy conversion in a combined heat and power plant were assessed. GHG emissions from a conventional energy supply system (natural gas) have been estimated as reference scenario. The main findings obtained underlined how the overall reduction of GHG emissions may be rather limited, for example for an agricultural context in which PSA-scenarios emit only 10% less greenhouse gases than the reference scenario. The BABIU-method constitutes an efficient upgrading method capable of attaining a high reduction of GHG emission by sequestration of CO{sub 2}.

  6. The International Experimental Thermal Hydraulic Systems database – TIETHYS: A new NEA validation tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohatgi, Upendra S.

    Nuclear reactor codes require validation with appropriate data representing the plant for specific scenarios. The thermal-hydraulic data is scattered in different locations and in different formats. Some of the data is in danger of being lost. A relational database is being developed to organize the international thermal hydraulic test data for various reactor concepts and different scenarios. At the reactor system level, that data is organized to include separate effect tests and integral effect tests for specific scenarios and corresponding phenomena. The database relies on the phenomena identification sections of expert developed PIRTs. The database will provide a summary of appropriate data, review of facility information, test description, instrumentation, references for the experimental data and some examples of application of the data for validation. The current database platform includes scenarios for PWR, BWR, VVER, and specific benchmarks for CFD modelling data and is to be expanded to include references for molten salt reactors. There are place holders for high temperature gas cooled reactors, CANDU and liquid metal reactors. This relational database is called The International Experimental Thermal Hydraulic Systems (TIETHYS) database and currently resides at Nuclear Energy Agency (NEA) of the OECD and is freely open to public access. Going forward the database will be extended to include additional links and data as they become available. https://www.oecd-nea.org/tiethysweb/

  7. Impacts of potential CO2-reduction policies on air quality in the United States.

    PubMed

    Trail, Marcus A; Tsimpidi, Alexandra P; Liu, Peng; Tsigaridis, Kostas; Hu, Yongtao; Rudokas, Jason R; Miller, Paul J; Nenes, Athanasios; Russell, Armistead G

    2015-04-21

    Impacts of emissions changes from four potential U.S. CO2 emission reduction policies on 2050 air quality are analyzed using the Community Multiscale Air Quality (CMAQ) model. Future meteorology was downscaled from the Goddard Institute for Space Studies (GISS) ModelE General Circulation Model (GCM) to the regional scale using the Weather Research Forecasting (WRF) model. We use emissions growth factors from the EPAUS9r MARKAL model to project emissions inventories for two climate tax scenarios, a combined transportation and energy scenario, a biomass energy scenario and a reference case. Implementation of a relatively aggressive carbon tax leads to improved PM2.5 air quality compared to the reference case as incentives increase for facilities to install flue-gas desulfurization (FGD) and carbon capture and sequestration (CCS) technologies. However, less capital is available to install NOX reduction technologies, resulting in an O3 increase. A policy aimed at reducing CO2 from the transportation sector and electricity production sectors leads to reduced emissions of mobile source NOX, thus reducing O3. Over most of the U.S., this scenario leads to reduced PM2.5 concentrations. However, increased primary PM2.5 emissions associated with fuel switching in the residential and industrial sectors leads to increased organic matter (OM) and PM2.5 in some cities.

  8. Big-bounce cosmology from quantum gravity: The case of a cyclical Bianchi I universe

    NASA Astrophysics Data System (ADS)

    Moriconi, Riccardo; Montani, Giovanni; Capozziello, Salvatore

    2016-07-01

    We analyze the classical and quantum dynamics of a Bianchi I model in the presence of a small negative cosmological constant characterizing its evolution in terms of the dust-time dualism. We demonstrate that in a canonical metric approach, the cosmological singularity is removed in correspondence to a positively defined value of the dust energy density. Furthermore, the quantum big bounce is connected to the Universe's turning point via a well-defined semiclassical limit. Then we can reliably infer that the proposed scenario is compatible with a cyclical universe picture. We also show how, when the contribution of the dust energy density is sufficiently high, the proposed scenario can be extended to the Bianchi IX cosmology and therefore how it can be regarded as a paradigm for the generic cosmological model. Finally, we investigate the origin of the observed cutoff on the cosmological dynamics, demonstrating how the big-bounce evolution can be mimicked by the same semiclassical scenario, where the negative cosmological constant is replaced via a polymer discretization of the Universe's volume. A direct proportionality law between these two parameters is then established.

  9. Judging the 'passability' of dynamic gaps in a virtual rugby environment.

    PubMed

    Watson, Gareth; Brault, Sebastien; Kulpa, Richard; Bideau, Benoit; Butterfield, Joe; Craig, Cathy

    2011-10-01

    Affordances have recently been proposed as a guiding principle in perception-action research in sport (Fajen, Riley, & Turvey, 2009). In the present study, perception of the 'passability' affordance of a gap between two approaching defenders in rugby is explored. A simplified rugby gap closure scenario was created using immersive, interactive virtual reality technology where 14 novice participants (attacker) judged the passability of the gap between two virtual defenders via a perceptual judgment (button press) task. The scenario was modeled according to tau theory (Lee, 1976) and a psychophysical function was fitted to the response data. Results revealed that a tau-based informational quantity could account for 82% of the variance in the data. Findings suggest that the passability affordance, in this case, is defined by this variable, and participants were able to use it to inform prospective judgments of passability. These findings contribute to our understanding of affordances and how they may be defined in this particular sporting scenario; however, some limitations regarding methodology, such as decoupling perception and action, are also acknowledged. Copyright © 2010 Elsevier B.V. All rights reserved.
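    The gap-closure geometry above lends itself to a simple first-order sketch. The rule and names below are illustrative assumptions only; the paper's fitted tau-based variable is more specific than this, and the numbers are invented for the example.

```python
def gap_tau(gap_width, closure_rate):
    """First-order time-to-closure (tau) of a gap: current width divided by
    the rate at which it is closing (Lee's tau applied to gap size)."""
    return gap_width / closure_rate

def passable(gap_width, closure_rate, attacker_time_to_gap, body_width, margin=0.0):
    """Hypothetical passability rule: the gap is judged passable when the
    width remaining at the attacker's arrival still exceeds body width."""
    width_at_arrival = gap_width - closure_rate * attacker_time_to_gap
    return width_at_arrival > body_width + margin

# Defenders 4 m apart, closing at 1.5 m/s; attacker reaches the gap in 2 s:
ok = passable(4.0, 1.5, 2.0, 0.5)   # 1 m of gap remains, body 0.5 m wide
```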

  10. Connecting the Links: Narratives, Simulations and Serious Games in Prehospital Training.

    PubMed

    Heldal, Ilona; Backlund, Per; Johannesson, Mikael; Lebram, Mikael; Lundberg, Lars

    2017-01-01

    Due to rapid and substantial changes in the health sector, collaboration and its supporting technologies are coming increasingly into focus, and changes in education and training are also required. Simulations and serious games (SSG) are often advocated as promising technologies for training many people in a consistent manner, or for increasing the skills necessary to deal with new, dangerous, complex or unexpected situations. The aim of this paper is to illustrate and discuss the resources needed for planning and performing collaborative contextual training scenarios. In a practical study, prehospital nurses used different simulator technologies to train the often-recurring activity chains of prehospital work. This paper exemplifies the benefit of using narratives and SSGs for contextual training that contributes to richer user experiences. The benefits of using simulation technologies aligned with care processes can be more easily defined through narratives from practitioners. While processes help to define more efficient and effective training, narratives and SSGs are beneficial for designing scenarios with cues for richer user experiences. By discussing illustrative examples, the paper contributes to a better understanding of how to plan simulation-technology-rich training scenarios.

  11. Exposure of the general public due to wireless LAN applications in public places.

    PubMed

    Schmid, G; Preiner, P; Lager, D; Uberbacher, R; Georg, R

    2007-01-01

    The typical exposure caused by wireless LAN applications in public areas has been investigated in a variety of scenarios. Small-sized (internet café) and large-scale (airport) indoor scenarios, as well as outdoor scenarios in the environment of access points (APs) serving residential areas and public places, were considered. The exposure assessment was carried out by numerical GTD/UTD computations based on optical wave propagation, as well as by verifying frequency-selective measurements in the considered scenarios under real-life conditions. In the small-sized indoor scenario, the maximum temporal peak values of power density, spatially averaged over body dimensions, were found to be lower than 20 mW/m(2), corresponding to 0.2% of the reference level according to the European Council Recommendation 1999/519/EC. Local peak values of power density might be 1-2 orders of magnitude higher, while spatially and time-averaged values for usual data traffic conditions might be 2-3 orders of magnitude lower, depending on the actual data traffic. In the considered outdoor scenarios, exposure was several orders of magnitude lower than in indoor scenarios due to the usually larger distances to the AP antennas.
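    For orientation, the comparison against the reference level of Recommendation 1999/519/EC can be sketched with a far-field free-space estimate. The study itself used GTD/UTD propagation computations, not this formula, and the EIRP and distance below are illustrative assumptions; the 10 W/m² figure is the Recommendation's power-density reference level for the 2-300 GHz range, which covers 2.4 GHz WLAN.

```python
import math

def power_density_w_m2(eirp_w, distance_m):
    """Far-field free-space power density: S = EIRP / (4 * pi * r**2)."""
    return eirp_w / (4.0 * math.pi * distance_m ** 2)

# Power-density reference level for 2-300 GHz in Recommendation 1999/519/EC:
REFERENCE_LEVEL_W_M2 = 10.0

s = power_density_w_m2(0.1, 2.0)          # a 100 mW EIRP access point at 2 m
fraction_of_limit = s / REFERENCE_LEVEL_W_M2
```

    With these illustrative numbers the exposure comes out around two hundredths of a percent of the reference level, consistent in order of magnitude with the small indoor scenario reported above.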

  12. Analysis of JT-60SA operational scenarios

    NASA Astrophysics Data System (ADS)

    Garzotti, L.; Barbato, E.; Garcia, J.; Hayashi, N.; Voitsekhovitch, I.; Giruzzi, G.; Maget, P.; Romanelli, M.; Saarelma, S.; Stankiewitz, R.; Yoshida, M.; Zagórski, R.

    2018-02-01

    Reference scenarios for the JT-60SA tokamak have been simulated with one-dimensional transport codes to assess the stationary state of the flat-top phase and provide a profile database for further physics studies (e.g. MHD stability, gyrokinetic analysis) and diagnostics design. The types of scenario considered vary from pulsed standard H-mode to advanced non-inductive steady-state plasmas. In this paper we present the results obtained with the ASTRA, CRONOS, JINTRAC and TOPICS codes equipped with the Bohm/gyro-Bohm, CDBM and GLF23 transport models. The scenarios analysed here are: a standard ELMy H-mode, a hybrid scenario and a non-inductive steady state plasma, with operational parameters from the JT-60SA research plan. Several simulations of the scenarios under consideration have been performed with the above mentioned codes and transport models. The results from the different codes are in broad agreement and the main plasma parameters generally agree well with the zero dimensional estimates reported previously. The sensitivity of the results to different transport models and, in some cases, to the ELM/pedestal model has been investigated.

  13. Aerial surveillance based on hierarchical object classification for ground target detection

    NASA Astrophysics Data System (ADS)

    Vázquez-Cervantes, Alberto; García-Huerta, Juan-Manuel; Hernández-Díaz, Teresa; Soto-Cajiga, J. A.; Jiménez-Hernández, Hugo

    2015-03-01

    Unmanned aerial vehicles have become important in surveillance applications due to their flexibility and their ability to inspect and move between different regions of interest. The instrumentation and autonomy of these vehicles have increased; in particular, the camera sensor is now integrated. Mounted cameras provide the flexibility to monitor several regions of interest by displacing and changing the camera view. A common task performed by this kind of vehicle is object localization and tracking. This work presents a novel hierarchical algorithm to detect and locate objects. The algorithm is based on a detection-by-example approach; that is, evidence of the target is provided at the beginning of the vehicle's route. Afterwards, the vehicle inspects the scenario, detecting all similar objects through UTM/GPS coordinate references. The detection process consists of sampling information from the target object. The sampling process is encoded in a hierarchical tree with different sampling densities, and the coding space corresponds to a huge binary space. Properties such as independence and associative operators are defined in this space to construct a relation between the target object and a set of selected features. Different sampling densities are used to discriminate from general to particular features of the target. The hierarchy is used as a way to adapt the complexity of the algorithm to the optimized battery duty cycle of the aerial device. Finally, this approach is tested in several outdoor scenarios, showing that the hierarchical algorithm works efficiently under several conditions.
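    The idea of matching candidate objects to an example target in a binary coding space can be sketched as a generic Hamming-distance matcher. The code length, threshold and helper names below are assumptions for illustration, not the paper's design, which uses a hierarchical tree over multiple sampling densities rather than a flat comparison.

```python
def hamming(a: int, b: int) -> int:
    """Hamming distance between two binary codes stored as integers."""
    return bin(a ^ b).count("1")

def match_target(target_code, candidate_codes, n_bits, threshold=0.25):
    """Indices of candidates within a relative Hamming distance of the
    example target's code (threshold is a fraction of the code length)."""
    limit = threshold * n_bits
    return [i for i, code in enumerate(candidate_codes)
            if hamming(target_code, code) <= limit]

codes = [0b10110010, 0b10110011, 0b01001101]          # candidate object codes
matches = match_target(0b10110010, codes, n_bits=8)   # keeps only close codes
```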

  14. Evaluating Daily Load Stimulus Formulas in Relating Bone Response to Exercise

    NASA Technical Reports Server (NTRS)

    Pennline, James A.; Mulugeta, Lealem

    2014-01-01

    Six formulas representing what is commonly referred to as "daily load stimulus" are identified, compared and tested in their ability to relate skeletal mechanical loading to bone maintenance and osteogenic response. Particular emphasis is placed on exercise-induced skeletal loading and whether or not the formulas can adequately capture the known experimental observations of saturation of continuous cyclic loading, rest insertion between repetitions (cycles), recovery of osteogenic potential following saturation, and multiple shorter bouts versus a single long bout of exercise. To evaluate the ability of the formulas to capture these characteristics, a set of exercise scenarios with type of exercise bout, specific duration, number of repetitions, and rest insertion between repetitions is defined. The daily load values obtained from the formulas for the loading conditions of the set of scenarios are illustrated. Not all of the formulas produce estimates of daily load in units of stress, or in terms of strain at a skeletal site, due to the loading force from a specific exercise prescription. The comparative results show that none of the formulas are able to capture all of the experimentally observed characteristics of cyclic loading. However, the enhanced formula presented by Genc et al. does capture several characteristics of cyclic loading that the others do not, namely recovery of osteogenic potential and saturation. This could be a basis for further development of mathematical formulas that more adequately approximate the amount of daily stress at a skeletal site that contributes to bone adaptation.
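    For reference, the classical daily load stimulus has the Carter/Beaupré textbook form S = (Σᵢ nᵢ σᵢᵐ)^(1/m). The sketch below is a generic instance of that form, not one of the six formulas evaluated in the paper, and illustrates why such a form cannot capture saturation: the stimulus keeps growing with cycle number, only slowly.

```python
def daily_load_stimulus(loads, m=4.0):
    """Carter/Beaupre-style daily load stimulus S = (sum n_i * s_i**m)**(1/m),
    where each (n_i, s_i) pair is a number of cycles at stress magnitude s_i.
    Generic textbook form, assumed here for illustration."""
    return sum(n * s ** m for n, s in loads) ** (1.0 / m)

# Ten times the cycles at the same stress raises the stimulus only by a
# factor of 10**(1/m) ~ 1.78 -- yet it rises without bound, so this form
# cannot reproduce the experimentally observed saturation of cyclic loading.
s_100 = daily_load_stimulus([(100, 20.0)])
s_1000 = daily_load_stimulus([(1000, 20.0)])
```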

  15. Approaches to defining reference regimes for river restoration planning

    NASA Astrophysics Data System (ADS)

    Beechie, T. J.

    2014-12-01

    Reference conditions or reference regimes can be defined using three general approaches: historical analysis, contemporary reference sites, and theoretical or empirical models. For large features (e.g., floodplain channels and ponds) historical data and maps are generally reliable. For smaller features (e.g., pools and riffles in small tributaries), field data from contemporary reference sites are a reasonable surrogate for historical data. Models are generally used for features that have no historical information or present day reference sites (e.g., beaver pond habitat). Each of these approaches contributes to a watershed-wide understanding of current biophysical conditions relative to potential conditions, which helps create not only a guiding vision for restoration, but also helps quantify and locate the largest or most important restoration opportunities. Common uses of geomorphic and biological reference conditions include identifying key areas for habitat protection or restoration, and informing the choice of restoration targets. Examples of use of each of these three approaches to define reference regimes in the western USA illustrate how historical information and current research highlight key restoration opportunities, focus restoration effort in areas that can produce the largest ecological benefit, and contribute to estimating restoration potential and assessing likelihood of achieving restoration goals.

  16. Survey of the clinical assessment and utility of near-infrared cerebral oximetry in cardiac surgery.

    PubMed

    Zacharias, David G; Lilly, Kevin; Shaw, Cynthia L; Pirundini, Paul; Rizzo, Robert J; Body, Simon C; Longford, Nicholas T

    2014-04-01

    Near-infrared cerebral oximetry increasingly is used for monitoring during cardiac surgery. Nonetheless, the scientific basis for incorporating this technology into clinical practice, the indications for when to do so, and standard diagnostic and treatment algorithms for defining abnormal values are yet to be rigorously defined. The authors hypothesized that there would be (1) variation in clinical use and practices for near-infrared spectroscopy (NIRS), and (2) variation in management of patients when clinicians are provided with NIRS information. In order to test this hypothesis, they sought to assess the nature and strength of response heterogeneity among anesthesiologists and cardiac perfusionists when provided with cardiac surgery patient scenarios and cerebral oximetry data. A prospectively collected survey. A hospital-based, multi-institutional, multinational study. By e-mail, the authors surveyed the membership of the Society of Cardiovascular Anesthesiologists and the online Cardiovascular Perfusion Forum. This survey was focused on ascertaining what actions clinicians would take in each scenario, given case information and cerebral oximetry tracings. Questions were based on 11 patient scenarios selected to represent small, large, symmetric, or asymmetric decreases in measured regional cerebral oxygen saturation (rScO2) encountered during cardiac surgery. Information on the respondents' (n = 796; 73% anesthesiologists) clinical practice, demography, and cerebral oximetry utilization was collected. An index of dispersion was used to assess response heterogeneity overall and within demographic subgroups. The majority of respondents indicated that cerebral oximetry monitoring was either useful or an essential monitor, especially perfusionists and clinicians who used cerebral oximetry most frequently. 
There were marked differences in responses between perfusionists and anesthesiologists for 4 of the 6 scenarios (p<0.005 for each of these 4 scenarios) occurring during cardiopulmonary bypass. Scenarios having the greatest rScO2 reduction or asymmetry in rScO2 were associated with the highest dispersion, indicating least agreement in management. Scenarios with mild or moderate rScO2 reduction were associated with the lowest dispersion, indicating greater agreement in management. Although experimental data gradually are accumulating to support the role for cerebral oximetry monitoring during cardiac surgery, the results of the present survey support the view that its role remains poorly defined, and consensus for its appropriate use is lacking. Importantly, the authors observed marked variation in the use, perceived utility, and management of patients for 4 of the 6 CPB scenarios between perfusionists and anesthesiologists who share the management of CPB. These findings support the need for well-designed, adequately powered clinical trials examining the value of this technology. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Software Defined Networking for Next Generation Converged Metro-Access Networks

    NASA Astrophysics Data System (ADS)

    Ruffini, M.; Slyne, F.; Bluemm, C.; Kitsuwan, N.; McGettrick, S.

    2015-12-01

    While the concept of Software Defined Networking (SDN) has seen a rapid deployment within the data center community, its adoption in telecommunications networks has progressed slowly, although the concept has been swiftly adopted by all major telecoms vendors. This paper presents a control plane architecture for SDN-driven converged metro-access networks, developed through the DISCUS European FP7 project. The SDN-based controller architecture was developed in a testbed implementation targeting two main scenarios: fast feeder fiber protection over dual-homed Passive Optical Networks (PONs) and dynamic service provisioning over a multi-wavelength PON. Implementation details and results of the experiment carried out over the second scenario are reported in the paper, showing the potential of SDN in providing assured on-demand services to end-users.

  18. A generic multibody simulation

    NASA Technical Reports Server (NTRS)

    Hopping, K. A.; Kohn, W.

    1986-01-01

    Described is a dynamic simulation package which can be configured for orbital test scenarios involving multiple bodies. The rotational and translational state integration methods are selectable for each individual body and may be changed during a run if necessary. Characteristics of the bodies are determined by assigning components consisting of mass properties, forces, and moments, which are the outputs of user-defined environmental models. Generic model implementation is facilitated by a transformation processor which performs coordinate frame inversions. Transformations are defined in the initialization file as part of the simulation configuration. The simulation package includes an initialization processor, which consists of a command line preprocessor, a general purpose grammar, and a syntax scanner. These permit specifications of the bodies, their interrelationships, and their initial states in a format that is not dependent on a particular test scenario.
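The abstract mentions a transformation processor that performs coordinate frame inversions but gives no implementation details. A minimal sketch of that operation for rigid 2D transforms, in Python with hypothetical names (a real multibody simulation would use 3D transforms, and likely quaternions, but the inversion identity is the same):

```python
import math

def make_transform(theta, tx, ty):
    """Homogeneous 2D transform: rotate by theta, then translate by (tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx],
            [s,  c, ty],
            [0.0, 0.0, 1.0]]

def invert_transform(T):
    """Invert a rigid transform analytically: R' = R^T, t' = -R^T t."""
    (r00, r01, tx), (r10, r11, ty), _ = T
    itx = -(r00 * tx + r10 * ty)   # x component of -R^T t
    ity = -(r01 * tx + r11 * ty)   # y component of -R^T t
    return [[r00, r10, itx],
            [r01, r11, ity],
            [0.0, 0.0, 1.0]]

def apply_transform(T, p):
    """Map a point (x, y) through the transform."""
    x, y = p
    return (T[0][0] * x + T[0][1] * y + T[0][2],
            T[1][0] * x + T[1][1] * y + T[1][2])
```

Round-tripping a point through a transform and its analytic inverse recovers the original point, which is the property a frame-inversion processor relies on when transformations are defined in only one direction in the initialization file.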

  19. Definition of information technology architectures for continuous data management and medical device integration in diabetes.

    PubMed

    Hernando, M Elena; Pascual, Mario; Salvador, Carlos H; García-Sáez, Gema; Rodríguez-Herrero, Agustín; Martínez-Sarriegui, Iñaki; Gómez, Enrique J

    2008-09-01

    The growing availability of continuous data from medical devices in diabetes management makes it crucial to define novel information technology architectures for efficient data storage, data transmission, and data visualization. The new paradigm of care demands the sharing of information in interoperable systems as the only way to support patient care in a continuum of care scenario. The technological platforms should support all the services required by the actors involved in the care process, located in different scenarios and managing diverse information for different purposes. This article presents basic criteria for defining flexible and adaptive architectures that are capable of interoperating with external systems, and integrating medical devices and decision support tools to extract all the relevant knowledge to support diabetes care.

  20. Integrated assessment of future potential global change scenarios and their hydrological impacts in coastal aquifers - a new tool to analyse management alternatives in the Plana Oropesa-Torreblanca aquifer

    NASA Astrophysics Data System (ADS)

    Pulido-Velazquez, David; Renau-Pruñonosa, Arianna; Llopis-Albert, Carlos; Morell, Ignacio; Collados-Lara, Antonio-Juan; Senent-Aparicio, Javier; Baena-Ruiz, Leticia

    2018-05-01

    Any change in the components of the water balance in a coastal aquifer, whether natural or anthropogenic, can alter the freshwater-salt water equilibrium. In this sense climate change (CC) and land use and land cover (LULC) change might significantly influence the availability of groundwater resources in the future. These coastal systems demand an integrated analysis of quantity and quality issues to obtain an appropriate assessment of hydrological impacts using density-dependent flow solutions. The aim of this work is to perform an integrated analysis of future potential global change (GC) scenarios and their hydrological impacts in a coastal aquifer, the Plana Oropesa-Torreblanca aquifer. It is a Mediterranean aquifer that extends over 75 km² in which important historical LULC changes have been produced and are planned for the future. Future CC scenarios will be defined by using an equi-feasible and non-feasible ensemble of projections based on the results of a multi-criteria analysis of the series generated from several regional climatic models with different downscaling approaches. The hydrological impacts of these CC scenarios combined with future LULC scenarios will be assessed with a chain of models defined by a sequential coupling of rainfall-recharge models, crop irrigation requirements and irrigation return models (for the aquifer and its neighbours that feed it), and a density-dependent aquifer approach. This chain of models, calibrated using the available historical data, allows testing of the conceptual approximation of the aquifer behaviour. It is also fed with series representative of potential global change scenarios in order to perform a sensitivity analysis regarding future scenarios of rainfall recharge, lateral flows coming from the hydraulically connected neighbouring aquifer, agricultural recharge (taking into account expected future LULC changes) and sea level rise (SLR). 
The proposed analysis is valuable for improving our knowledge about the aquifer, and so comprises a tool to design sustainable adaptation management strategies taking into account the uncertainty in future GC conditions and their impacts. The results show that GC scenarios produce significant increases in the variability of flow budget components and in the salinity.

  1. Space station needs, attributes and architectural options. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The uses alignment plan was implemented. The existing data bank was used to define a large number of station requirements. Ten to 20 valid mission scenarios were developed. Architectural options, as influenced by communications operations, subsystem evolvability, and required technology growth, are defined. Costing of evolutionary concepts, alternative approaches, and options was based on minimum design details.

  2. The NASA Hydrogen Energy Systems Technology study - A summary

    NASA Technical Reports Server (NTRS)

    Laumann, E. A.

    1976-01-01

    This study is concerned with: hydrogen use, alternatives and comparisons, hydrogen production, factors affecting application, and technology requirements. Two scenarios for future use are explained. One is called the reference hydrogen use scenario and assumes continued historic uses of hydrogen along with additional use for coal gasification and liquefaction, consistent with the Ford technical fix baseline (1974) projection. The expanded scenario relies on the nuclear electric economy (1973) energy projection and assumes the addition of limited new uses such as experimental hydrogen-fueled aircraft, some mixing with natural gas, and energy storage by utilities. Current uses and supply of hydrogen are described, and the technological requirements for developing new methods of hydrogen production are discussed.

  3. Inverse modeling of surface-water discharge to achieve restoration salinity performance measures in Florida Bay, Florida

    USGS Publications Warehouse

    Swain, E.D.; James, D.E.

    2008-01-01

    The use of numerical modeling to evaluate regional water-management practices involves the simulation of various alternative water-delivery scenarios, which typically are designed intuitively rather than analytically. These scenario simulations are used to analyze how specific water-management practices affect factors such as water levels, flows, and salinities. In lieu of testing a variety of scenario simulations in a trial-and-error manner, an optimization technique may be used to more precisely and directly define good water-management alternatives. A numerical model application in the coastal regions of Florida Bay and Everglades National Park (ENP), representing the surface- and ground-water hydrology for the region, is a good example of a tool used to evaluate restoration scenarios. The Southern Inland and Coastal System (SICS) model simulates this area with a two-dimensional hydrodynamic surface-water model and a three-dimensional ground-water model, linked to represent the interaction of the two systems with salinity transport. This coastal wetland environment is of great interest in restoration efforts, and the SICS model is used to analyze the effects of alternative water-management scenarios. The SICS model is run within an inverse modeling program called UCODE. In this application, UCODE adjusts the regulated inflows to ENP while SICS is run iteratively. UCODE creates parameters that define inflow within an allowable range for the SICS model based on SICS model output statistics, with the objective of matching user-defined target salinities that meet ecosystem restoration criteria. Preliminary results obtained using two different parameterization methods illustrate the ability of the model to achieve the goals of adjusting the range and reducing the variance of salinity values in the target area. 
The salinity variance in the primary zone of interest was reduced from an original value of 0.509 psu² to values of 0.418 psu² and 0.342 psu² using the two different methods. Simulations with one, two, and three target areas indicate that optimization is limited near model boundaries, and the target location nearest the tidal boundary may not be improved. These experiments indicate that this method can be useful for designing water-delivery schemes to achieve certain water-quality objectives. Additionally, this approach avoids much of the intuitive trial-and-error experimentation with different flow schemes that has often been used to develop restoration scenarios. © 2007 Elsevier B.V. All rights reserved.
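UCODE's actual parameter estimation is regression-based and works on model output statistics; the inner loop it automates, repeatedly adjusting a regulated inflow until a simulated salinity hits a restoration target, can be sketched in Python. The stand-in salinity model below is invented for illustration (a real run would invoke SICS), and bisection replaces UCODE's regression:

```python
def simulated_salinity(inflow):
    """Stand-in for a full SICS run: salinity (psu) falls as freshwater inflow rises."""
    return 35.0 / (1.0 + 0.002 * inflow)

def calibrate_inflow(target_salinity, lo=0.0, hi=10000.0, tol=1e-6):
    """Bisect on inflow until the simulated salinity matches the target.

    Relies on salinity decreasing monotonically with inflow, which holds
    for the toy model above.
    """
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        s = simulated_salinity(mid)
        if abs(s - target_salinity) < tol:
            return mid
        if s > target_salinity:
            lo = mid   # still too salty: deliver more freshwater
        else:
            hi = mid   # overshot: deliver less
    return 0.5 * (lo + hi)
```

For a target of 20 psu the loop converges on an inflow of 375 (in the toy model's units), illustrating how an optimizer can replace trial-and-error scenario design.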

  4. Fracture risk among older men: osteopenia and osteoporosis defined using cut-points derived from female versus male reference data.

    PubMed

    Pasco, J A; Lane, S E; Brennan, S L; Timney, E N; Bucki-Smith, G; Dobbins, A G; Nicholson, G C; Kotowicz, M A

    2014-03-01

    We explored the effect of using male and female reference data in a male sample to categorise areal bone mineral density (BMD). Using male reference data, a large proportion of fractures arose from osteopenia, whereas using female reference data shifted the fracture burden into normal BMD. The purpose of this study was to describe fracture risk associated with osteopenia and osteoporosis in older men, defined by areal BMD and using cut-points derived from male and female reference data. As part of the Geelong Osteoporosis Study, we followed 619 men aged 60-93 years after BMD assessments (performed 2001-2006) until 2010, fracture, death or emigration. Post-baseline fractures were radiologically confirmed, and proportions of fractures in each BMD category were age-standardised to national profiles. Based on World Health Organization criteria, and using male reference data, 207 men had normal BMD at the femoral neck, 357 were osteopenic and 55 were osteoporotic. Using female reference data, corresponding numbers were 361, 227 and 31. During the study, 130 men died, 15 emigrated and 63 sustained at least one fracture. Using male reference data, most (86.5 %) of the fractures occurred in men without osteoporosis on BMD criteria (18.4 % normal BMD, 68.1 % osteopenia). The pattern differed when female reference data were used; while most fractures arose from men without osteoporosis (88.2 %), the burden shifted from those with osteopenia (34.8 %) to those with normal BMD (53.4 %). Decreasing BMD categories defined increasing risk of fracture. Although men with osteoporotic BMD were at greatest risk, they made a relatively small contribution to the total burden of fractures. Using male reference data, two-thirds of the fractures arose from men with osteopenia. However, using female reference data, approximately half of the fractures arose from those with normal BMD. Using female reference data to define osteoporosis in men does not appear to be the optimal approach.

  5. Ensemble catchment hydrological modelling for climate change impact analysis

    NASA Astrophysics Data System (ADS)

    Vansteenkiste, Thomas; Ntegeka, Victor; Willems, Patrick

    2014-05-01

    It is vital to investigate how the hydrological model structure affects the climate change impact, given that future conditions are likely to fall outside the range for which the models were calibrated or validated. Thus an ensemble modelling approach which involves a diversity of models with different structures, such as spatial resolutions and process descriptions, is crucial. The ensemble modelling approach was applied to a set of models: from the lumped conceptual models NAM, PDM and VHM, through the intermediately detailed and distributed model WetSpa, to the highly detailed and fully distributed model MIKE-SHE. Explicit focus was given to the high and low flow extremes. All models were calibrated for sub flows and quick flows derived from rainfall and potential evapotranspiration (ETo) time series. In general, all models were able to produce reliable estimates of the flow regimes under the current climate for extreme peak and low flows. An intercomparison of the low and high flow changes under changed climatic conditions was made using climate scenarios tailored for extremes. Tailoring was important for two reasons. First, since the use of many scenarios was not feasible, it was necessary to construct a few scenarios that would reasonably represent the range of extreme impacts. Second, scenarios would be more informative as changes in high and low flows would be easily traced to changes of ETo and rainfall; the tailored scenarios are constructed using seasonal changes that are defined using different levels of magnitude (high, mean and low) for rainfall and ETo. After simulation of these climate scenarios in the five hydrological models, close agreement was found among the models. The different models predicted a similar range of peak flow changes. For the low flows, however, the differences in the projected impact range by different hydrological models were larger, particularly for the drier scenarios. 
This suggests that the hydrological model structure is critical in low flow predictions, more than in high flow conditions. Hence, the mechanism of the slow flow component simulation requires further attention. It is concluded that a multi-model ensemble approach, where different plausible model structures are applied, is extremely useful. It improves the reliability of climate change impact results and allows decision making to be based on uncertainty assessment that includes model structure related uncertainties.
References:
Ntegeka, V., Baguis, P., Roulin, E., Willems, P., 2014. Developing tailored climate change scenarios for hydrological impact assessments. Journal of Hydrology, 508C, 307-321.
Vansteenkiste, Th., Tavakoli, M., Ntegeka, V., Willems, P., De Smedt, F., Batelaan, O., 2013. Climate change impact on river flows and catchment hydrology: a comparison of two spatially distributed models. Hydrological Processes, 27(25), 3649-3662.
Vansteenkiste, Th., Tavakoli, M., Ntegeka, V., Van Steenbergen, N., De Smedt, F., Batelaan, O., Pereira, F., Willems, P., 2014. Intercomparison of five lumped and distributed models for catchment runoff and extreme flow simulation. Journal of Hydrology, in press.
Vansteenkiste, Th., Tavakoli, M., Ntegeka, V., De Smedt, F., Batelaan, O., Pereira, F., Willems, P., 2014. Intercomparison of climate scenario impact predictions by a lumped and distributed model ensemble. Journal of Hydrology, in revision.

  6. TCL2 Ocean Scenario Replay

    NASA Technical Reports Server (NTRS)

    Mohlenbrink, Christoph P.; Omar, Faisal Gamal; Homola, Jeffrey R.

    2017-01-01

    This is a video replay of system data that was generated from the UAS Traffic Management (UTM) Technical Capability Level (TCL) 2 flight demonstration in Nevada and rendered in Google Earth. The replay depicts a particular set of flights conducted as part of what was referred to as the Ocean scenario. The test range and surrounding area are presented, followed by an overview of operational volumes. System messaging is also displayed, as well as a replay of all five test flights as they occurred.

  7. Astrophysics of Reference Frame Tie Objects

    NASA Technical Reports Server (NTRS)

    Johnston, Kenneth J.; Boboltz, David; Fey, Alan Lee; Gaume, Ralph A.; Zacharias, Norbert

    2004-01-01

    The Astrophysics of Reference Frame Tie Objects Key Science program will investigate the underlying physics of SIM grid objects. Extragalactic objects in the SIM grid will be used to tie the SIM reference frame to the quasi-inertial reference frame defined by extragalactic objects and to remove any residual frame rotation with respect to the extragalactic frame. The current realization of the extragalactic frame is the International Celestial Reference Frame (ICRF). The ICRF is defined by the radio positions of 212 extragalactic objects and is the IAU sanctioned fundamental astronomical reference frame. This key project will advance our knowledge of the physics of the objects which will make up the SIM grid, such as quasars and chromospherically active stars, and relates directly to the stability of the SIM reference frame. The following questions concerning the physics of reference frame tie objects will be investigated.

  8. CanOpen on RASTA: The Integration of the CanOpen IP Core in the Avionics Testbed

    NASA Astrophysics Data System (ADS)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele; Ortega, Carlos Urbina; Valverde, Alberto

    2013-08-01

    This paper presents work done within the ESA ESTEC Data Systems Division, targeting the integration of the CANopen IP core with the existing Reference Architecture Test-bed for Avionics (RASTA). RASTA is the reference testbed system of the ESA Avionics Lab, designed to integrate the main elements of a typical data handling system. It simulates a scenario in which a Mission Control Center communicates with on-board computers and systems through a TM/TC link, providing data management through qualified processors and interfaces such as LEON2 core processors, CAN bus controllers, MIL-STD-1553, and SpaceWire. This activity extends RASTA with two boards equipped with the HurriCANe controller, acting as CANopen slaves. CANopen software modules have been ported to the RASTA system I/O boards equipped with the Gaisler GR-CAN controller, which acts as the master communicating with the CCIPC boards. CANopen serves as the upper application layer for systems based on CAN, is defined within the CAN-in-Automation standards, and can be regarded as the definitive standard for the implementation of CAN-based system solutions. The development and integration of CCIPC, performed by SITAEL S.p.A., is the first application that aims to bring the CANopen standard to space applications. The definition of CANopen within the European Cooperation for Space Standardization (ECSS) is under development.

  9. Deep mantle structure as a reference frame for movements in and on the Earth

    PubMed Central

    Torsvik, Trond H.; van der Voo, Rob; Doubrovine, Pavel V.; Burke, Kevin; Steinberger, Bernhard; Ashwal, Lewis D.; Trønnes, Reidar G.; Webb, Susan J.; Bull, Abigail L.

    2014-01-01

    Earth’s residual geoid is dominated by a degree-2 mode, with elevated regions above large low shear-wave velocity provinces on the core–mantle boundary beneath Africa and the Pacific. The edges of these deep mantle bodies, when projected radially to the Earth’s surface, correlate with the reconstructed positions of large igneous provinces and kimberlites since Pangea formed about 320 million years ago. Using this surface-to-core–mantle boundary correlation to locate continents in longitude and a novel iterative approach for defining a paleomagnetic reference frame corrected for true polar wander, we have developed a model for absolute plate motion back to earliest Paleozoic time (540 Ma). For the Paleozoic, we have identified six phases of slow, oscillatory true polar wander during which the Earth’s axis of minimum moment of inertia was similar to that of Mesozoic times. The rates of Paleozoic true polar wander (<1°/My) are compatible with those in the Mesozoic, but absolute plate velocities are, on average, twice as high. Our reconstructions generate geologically plausible scenarios, with large igneous provinces and kimberlites sourced from the margins of the large low shear-wave velocity provinces, as in Mesozoic and Cenozoic times. This absolute kinematic model suggests that a degree-2 convection mode within the Earth’s mantle may have operated throughout the entire Phanerozoic. PMID:24889632

  10. Deep mantle structure as a reference frame for movements in and on the Earth.

    PubMed

    Torsvik, Trond H; van der Voo, Rob; Doubrovine, Pavel V; Burke, Kevin; Steinberger, Bernhard; Ashwal, Lewis D; Trønnes, Reidar G; Webb, Susan J; Bull, Abigail L

    2014-06-17

    Earth's residual geoid is dominated by a degree-2 mode, with elevated regions above large low shear-wave velocity provinces on the core-mantle boundary beneath Africa and the Pacific. The edges of these deep mantle bodies, when projected radially to the Earth's surface, correlate with the reconstructed positions of large igneous provinces and kimberlites since Pangea formed about 320 million years ago. Using this surface-to-core-mantle boundary correlation to locate continents in longitude and a novel iterative approach for defining a paleomagnetic reference frame corrected for true polar wander, we have developed a model for absolute plate motion back to earliest Paleozoic time (540 Ma). For the Paleozoic, we have identified six phases of slow, oscillatory true polar wander during which the Earth's axis of minimum moment of inertia was similar to that of Mesozoic times. The rates of Paleozoic true polar wander (<1°/My) are compatible with those in the Mesozoic, but absolute plate velocities are, on average, twice as high. Our reconstructions generate geologically plausible scenarios, with large igneous provinces and kimberlites sourced from the margins of the large low shear-wave velocity provinces, as in Mesozoic and Cenozoic times. This absolute kinematic model suggests that a degree-2 convection mode within the Earth's mantle may have operated throughout the entire Phanerozoic.

  11. Inferring Admixture Histories of Human Populations Using Linkage Disequilibrium

    PubMed Central

    Loh, Po-Ru; Lipson, Mark; Patterson, Nick; Moorjani, Priya; Pickrell, Joseph K.; Reich, David; Berger, Bonnie

    2013-01-01

    Long-range migrations and the resulting admixtures between populations have been important forces shaping human genetic diversity. Most existing methods for detecting and reconstructing historical admixture events are based on allele frequency divergences or patterns of ancestry segments in chromosomes of admixed individuals. An emerging new approach harnesses the exponential decay of admixture-induced linkage disequilibrium (LD) as a function of genetic distance. Here, we comprehensively develop LD-based inference into a versatile tool for investigating admixture. We present a new weighted LD statistic that can be used to infer mixture proportions as well as dates with fewer constraints on reference populations than previous methods. We define an LD-based three-population test for admixture and identify scenarios in which it can detect admixture events that previous formal tests cannot. We further show that we can uncover phylogenetic relationships among populations by comparing weighted LD curves obtained using a suite of references. Finally, we describe several improvements to the computation and fitting of weighted LD curves that greatly increase the robustness and speed of the calculations. We implement all of these advances in a software package, ALDER, which we validate in simulations and apply to test for admixture among all populations from the Human Genome Diversity Project (HGDP), highlighting insights into the admixture history of Central African Pygmies, Sardinians, and Japanese. PMID:23410830
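ALDER's actual estimator jointly fits an affine decay term and corrects for sampling bias; the core idea, that admixture-induced weighted LD decays as exp(-nd) with genetic distance d (in Morgans) for an event n generations ago, can be sketched with a simple log-linear fit. The data below are synthetic and the names hypothetical:

```python
import math

def fit_admixture_date(dists_cM, weighted_ld):
    """Estimate the admixture date n (generations) from w(d) = A * exp(-n * d).

    Takes genetic distances in centimorgans and positive weighted-LD values;
    fits log w = log A - n * d by ordinary least squares and returns n.
    """
    xs = [d / 100.0 for d in dists_cM]        # centimorgans -> Morgans
    ys = [math.log(w) for w in weighted_ld]   # log-linearize the decay
    k = len(xs)
    mx = sum(xs) / k
    my = sum(ys) / k
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope                              # generations since admixture

# Synthetic weighted-LD curve for a 10-generation-old admixture event
dists = [0.5 * i for i in range(1, 40)]        # 0.5 .. 19.5 cM
ld = [0.02 * math.exp(-10 * d / 100.0) for d in dists]
```

On this noise-free curve the fit recovers n = 10 exactly; with real data the affine term, bin weighting, and jackknife standard errors that ALDER provides all matter.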

  12. Current and Future Urban Stormwater Flooding Scenarios in the Southeast Florida Coasts

    NASA Astrophysics Data System (ADS)

    Huq, E.; Abdul-Aziz, O. I.

    2016-12-01

    This study computed rainfall-fed stormwater flooding under the historical and future reference scenarios for the Southeast Coasts Basin of Florida. A large-scale, mechanistic rainfall-runoff model was developed using the U.S. E.P.A. Storm Water Management Model (SWMM 5.1). The model parameterized important processes of urban hydrology, groundwater, and sea level, while including hydroclimatological variables and land use features. The model was calibrated and validated with historical streamflow data. It was then used to estimate the sensitivity of stormwater runoff to the reference changes in hydroclimatological variables (rainfall and evapotranspiration) and different land use/land cover features (imperviousness, roughness). Furthermore, historical (1970-2000) and potential 2050s stormwater budgets were also estimated for the Florida Southeast Coasts Basin by incorporating climatic projections from different GCMs and RCMs, as well as by using relevant projections of sea level and land use/cover. Comparative synthesis of the historical and future scenarios along with the results of sensitivity analysis can aid in efficient management of stormwater flooding for the southeast Florida coasts and similar urban centers under a changing regime of climate, sea level, land use/cover and hydrology.

  13. Methodology for Generating Conflict Scenarios by Time Shifting Recorded Traffic Data

    NASA Technical Reports Server (NTRS)

    Paglione, Mike; Oaks, Robert; Bilimoria, Karl D.

    2003-01-01

    A methodology is presented for generating conflict scenarios that can be used as test cases to estimate the operational performance of a conflict probe. Recorded air traffic data is time-shifted to create traffic scenarios featuring conflicts with characteristic properties similar to those encountered in typical air traffic operations. First, a reference set of conflicts is obtained from trajectories that are computed using birth points and nominal flight plans extracted from recorded traffic data. Distributions are obtained for several primary properties (e.g., encounter angle) that are most likely to affect the performance of a conflict probe. A genetic algorithm is then utilized to determine the values of the time shifts for the recorded track data so that the primary properties of conflicts generated by the time-shifted data match those of the reference set. This methodology is successfully demonstrated using recorded traffic data for the Memphis Air Route Traffic Control Center; a key result is that the required time shifts are less than 5 min for 99% of the tracks. It is also observed that close matching of the primary properties used in this study additionally provides a good match for some other secondary properties.
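The paper's genetic algorithm matches whole distributions of several conflict properties; the toy sketch below, with hypothetical names and an invented stand-in for conflict detection, matches only a single mean property, to show the shape of the search (a realistic fitness would compare full histograms, e.g. with a chi-square statistic):

```python
import random

random.seed(42)

REFERENCE_MEAN = 90.0   # hypothetical target mean encounter angle (degrees)

def encounter_angles(shifts):
    """Stand-in for running conflict detection on time-shifted tracks:
    maps a vector of per-track time shifts to a list of encounter angles."""
    return [abs((s * 37) % 180) for s in shifts]

def fitness(shifts):
    """Higher is better: negative distance of the mean angle from the target."""
    angles = encounter_angles(shifts)
    return -abs(sum(angles) / len(angles) - REFERENCE_MEAN)

def evolve(n_tracks=20, pop_size=30, max_shift=300, generations=200):
    """Evolve per-track time shifts (seconds) toward the reference property."""
    pop = [[random.randint(-max_shift, max_shift) for _ in range(n_tracks)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n_tracks)    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:              # occasional mutation
                child[random.randrange(n_tracks)] = \
                    random.randint(-max_shift, max_shift)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)
```

The bound on shifts mirrors the paper's finding that small shifts suffice, though the 300-second cap here is an arbitrary choice for the sketch.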

  14. Defining scenarios of future vectors of change in marine life and associated economic sectors

    NASA Astrophysics Data System (ADS)

    Groeneveld, Rolf A.; Bosello, Francesco; Butenschön, Momme; Elliott, Mike; Peck, Myron A.; Pinnegar, John K.

    2018-02-01

    Addressing the multitude of challenges in marine policy requires an integrated approach that considers the many drivers, pressures, and interests from several disciplinary angles. Scenarios are needed to harmonise the analyses of different components of the marine system, and to deal with the uncertainty and complexity of the societal and biogeophysical dynamics in the system. This study considers a set of socio-economic scenarios to (1) explore possible futures in relation to marine invasive species, outbreak-forming species, and gradual changes in species distribution and productivity; and (2) harmonise the projection modelling performed within associated studies. The exercise demonstrates that developing interdisciplinary scenarios of this kind is particularly complicated due to (1) the wide variety in the endogeneity or exogeneity of variables in the different analyses involved; (2) the dual role of policy decisions, as variables in a scenario or as decisions to be evaluated and compared against other decisions; and (3) the substantial difference in time scale between societal and physical drivers.

  15. Geophysical investigation of the pressure field produced by water guns at a pond site in La Crosse, Wisconsin

    USGS Publications Warehouse

    Adams, Ryan F.; Morrow, William S.

    2015-09-03

    The July 2013 study consisted of three scenarios: fish behavior, single gun assessment, and experimental barrier evaluation. The fish behavior scenario simulated the pond conditions from previous studies. Two 80-in³ water guns were fired in the south end of the testing pond. Pressures essentially doubled from the testing of the single 80-in³ water gun. The single gun assessment scenario sought to replicate the setup of the 80-in³ scenario in September 2012, but with additional sensors to better define the pressure field. The 5-lb/in² target pressure field continued to show a radius ranging from 40 to 45 feet, dependent on the pressure of the input air. The final scenario, the experimental barrier evaluation, showed that a two-dimensional continuous plane of 5 lb/in² can be created between two 80-in³ water guns to a separation of 99 feet and a depth of 6.5 feet with 1,500 lb/in² of input air.

  16. Compliance monitoring in business processes: Functionalities, application, and tool-support.

    PubMed

    Ly, Linh Thao; Maggi, Fabrizio Maria; Montali, Marco; Rinderle-Ma, Stefanie; van der Aalst, Wil M P

    2015-12-01

    In recent years, monitoring the compliance of business processes with relevant regulations, constraints, and rules during runtime has evolved into a major concern in literature and practice. Monitoring not only refers to continuously observing possible compliance violations, but also includes the ability to provide fine-grained feedback and to predict possible compliance violations in the future. The body of literature on business process compliance is large, and approaches specifically addressing process monitoring are hard to identify. Moreover, proper means for the systematic comparison of these approaches are missing. Hence, it is unclear which approaches are suitable for particular scenarios. The goal of this paper is to define a framework for Compliance Monitoring Functionalities (CMF) that enables the systematic comparison of existing and new approaches for monitoring compliance rules over business processes during runtime. To define the scope of the framework, at first, related areas are identified and discussed. The CMFs are harvested based on a systematic literature review and five selected case studies. The appropriateness of the selection of CMFs is demonstrated in two ways: (a) a systematic comparison with pattern-based compliance approaches and (b) a classification of existing compliance monitoring approaches using the CMFs. Moreover, the application of the CMFs is showcased using three existing tools that are applied to two realistic data sets. Overall, the CMF framework provides powerful means to position existing and future compliance monitoring approaches.

  17. Compliance monitoring in business processes: Functionalities, application, and tool-support

    PubMed Central

    Ly, Linh Thao; Maggi, Fabrizio Maria; Montali, Marco; Rinderle-Ma, Stefanie; van der Aalst, Wil M.P.

    2015-01-01

    In recent years, monitoring the compliance of business processes with relevant regulations, constraints, and rules during runtime has evolved into a major concern in literature and practice. Monitoring not only refers to continuously observing possible compliance violations, but also includes the ability to provide fine-grained feedback and to predict possible compliance violations in the future. The body of literature on business process compliance is large, and approaches specifically addressing process monitoring are hard to identify. Moreover, proper means for the systematic comparison of these approaches are missing. Hence, it is unclear which approaches are suitable for particular scenarios. The goal of this paper is to define a framework for Compliance Monitoring Functionalities (CMF) that enables the systematic comparison of existing and new approaches for monitoring compliance rules over business processes during runtime. To define the scope of the framework, at first, related areas are identified and discussed. The CMFs are harvested based on a systematic literature review and five selected case studies. The appropriateness of the selection of CMFs is demonstrated in two ways: (a) a systematic comparison with pattern-based compliance approaches and (b) a classification of existing compliance monitoring approaches using the CMFs. Moreover, the application of the CMFs is showcased using three existing tools that are applied to two realistic data sets. Overall, the CMF framework provides powerful means to position existing and future compliance monitoring approaches. PMID:26635430

  18. HSI top-down requirements analysis for ship manpower reduction

    NASA Astrophysics Data System (ADS)

    Malone, Thomas B.; Bost, J. R.

    2000-11-01

    U.S. Navy ship acquisition programs such as DD 21 and CVNX are increasingly relying on top-down requirements analysis (TDRA) to define and assess design approaches for workload and manpower reduction, and for ensuring required levels of human performance, reliability, safety, and quality of life at sea. The human systems integration (HSI) approach to TDRA begins with a function analysis which identifies the functions derived from the requirements in the Operational Requirements Document (ORD). The function analysis serves as the function baseline for the ship, and also supports the definition of RDT&E and Total Ownership Cost requirements. A mission analysis is then conducted to identify mission scenarios, again based on requirements in the ORD and on the Design Reference Mission (DRM). This is followed by a mission/function analysis which establishes the function requirements to successfully perform the ship's missions. Function requirements of major importance for HSI are information, performance, decision, and support requirements associated with each function. An allocation of functions defines the roles of humans and automation in performing the functions associated with a mission. Alternate design concepts, based on function allocation strategies, are then described, and task networks associated with the concepts are developed. Task network simulations are conducted to assess workloads and human performance capabilities associated with alternate concepts. An assessment of the affordability and risk associated with alternate concepts is performed, and manning estimates are developed for feasible design concepts.

  19. 49 CFR 385.321 - What failures of safety management practices disclosed by the safety audit will result in a...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... occurrence. This violation refers to a driver operating a CMV as defined under § 383.5. 9. § 387.7(a... unqualified driver Single occurrence. This violation refers to a driver operating a CMV as defined under § 390...

  20. Modeling the Water-Quality Effects of Changes to the Klamath River Upstream of Keno Dam, Oregon

    USGS Publications Warehouse

    Sullivan, Annett B.; Sogutlugil, I. Ertugrul; Rounds, Stewart A.; Deas, Michael L.

    2013-01-01

    The Link River to Keno Dam (Link-Keno) reach of the Klamath River, Oregon, generally has periods of water-quality impairment during summer, including low dissolved oxygen, elevated concentrations of ammonia and algae, and high pH. Efforts are underway to improve water quality in this reach through a Total Maximum Daily Load (TMDL) program and other management and operational actions. To assist in planning, a hydrodynamic and water-quality model was used in this study to provide insight about how various actions could affect water quality in the reach. These model scenarios used a previously developed and calibrated CE-QUAL-W2 model of the Link-Keno reach developed by the U.S. Geological Survey (USGS), Watercourse Engineering Inc., and the Bureau of Reclamation for calendar years 2006-09 (referred to as the "USGS model" in this report). Another model of the same river reach was previously developed by Tetra Tech, Inc. and the Oregon Department of Environmental Quality for years 2000 and 2002 and was used in the TMDL process; that model is referred to as the "TMDL model" in this report. This report includes scenarios that (1) assess the effect of TMDL allocations on water quality, (2) provide insight on certain aspects of the TMDL model, (3) assess various methods to improve water quality in this reach, and (4) examine possible water-quality effects of a future warmer climate. Results presented in this report for the first 5 scenarios supersede or augment those that were previously published (scenarios 1 and 2 in Sullivan and others [2011], 3 through 5 in Sullivan and others [2012]); those previous results are still valid, but the results for those scenarios in this report are more current.

  1. Readings from Visibility Meters: Do They Really Mean the Maximum Distance of Observing A Black Object?

    NASA Astrophysics Data System (ADS)

    Li, M.; Zhang, S.; Garcia-Menendez, F.; Monier, E.; Selin, N. E.

    2016-12-01

    Climate change, favoring more heat waves and episodes of stagnant air, may deteriorate air quality by increasing ozone and fine particulate matter (PM2.5) concentrations and high-pollution episodes. This effect, termed the "climate penalty", has been quantified and explained by many earlier studies in the U.S. and Europe, but research efforts in Asian countries are limited. We evaluate the impact of climate change on air quality and human health in China and India using a modeling framework that links the Massachusetts Institute of Technology Integrated Global System Model to the Community Atmosphere Model (MIT IGSM-CAM). Future climate fields are projected under three climate scenarios: a no-policy reference scenario and two climate stabilization scenarios, with 2100 total radiative forcing targets of 9.7, 4.5, and 3.7 W m-2, respectively. Each climate scenario is run for five representations of climate variability to account for the role of natural variability. Thirty-year chemical transport simulations are conducted for 1981-2010 and 2086-2115 under the three climate scenarios with anthropogenic emissions fixed at year-2000 levels. We find that 2000-2100 climate change under the no-policy reference scenario would increase ozone concentrations in eastern China and northern India by up to 5 ppb by enhancing biogenic emissions and ozone production efficiency. Ozone extreme episodes also become more frequent in these regions, while climate policies can offset most of the increase in ozone episodes. Climate change between 2000 and 2100 would slightly increase anthropogenic PM2.5 concentrations in northern China and Sichuan province, but significantly reduce anthropogenic PM2.5 concentrations in southern China and northern India, primarily due to different chemical responses of sulfate-nitrate-ammonium aerosols to climate change in these regions. Our study also suggests that the mitigation costs of climate policies can be partially offset by health benefits from reduced climate-induced air pollution in China.

  2. Climate Penalty on Air Quality and Human Health in China and India

    NASA Astrophysics Data System (ADS)

    Li, M.; Zhang, S.; Garcia-Menendez, F.; Monier, E.; Selin, N. E.

    2017-12-01

    Climate change, favoring more heat waves and episodes of stagnant air, may deteriorate air quality by increasing ozone and fine particulate matter (PM2.5) concentrations and high-pollution episodes. This effect, termed the "climate penalty", has been quantified and explained by many earlier studies in the U.S. and Europe, but research efforts in Asian countries are limited. We evaluate the impact of climate change on air quality and human health in China and India using a modeling framework that links the Massachusetts Institute of Technology Integrated Global System Model to the Community Atmosphere Model (MIT IGSM-CAM). Future climate fields are projected under three climate scenarios: a no-policy reference scenario and two climate stabilization scenarios, with 2100 total radiative forcing targets of 9.7, 4.5, and 3.7 W m-2, respectively. Each climate scenario is run for five representations of climate variability to account for the role of natural variability. Thirty-year chemical transport simulations are conducted for 1981-2010 and 2086-2115 under the three climate scenarios with anthropogenic emissions fixed at year-2000 levels. We find that 2000-2100 climate change under the no-policy reference scenario would increase ozone concentrations in eastern China and northern India by up to 5 ppb by enhancing biogenic emissions and ozone production efficiency. Ozone extreme episodes also become more frequent in these regions, while climate policies can offset most of the increase in ozone episodes. Climate change between 2000 and 2100 would slightly increase anthropogenic PM2.5 concentrations in northern China and Sichuan province, but significantly reduce anthropogenic PM2.5 concentrations in southern China and northern India, primarily due to different chemical responses of sulfate-nitrate-ammonium aerosols to climate change in these regions. Our study also suggests that the mitigation costs of climate policies can be partially offset by health benefits from reduced climate-induced air pollution in China.

  3. Environment, Health and Climate: Impact of African aerosols

    NASA Astrophysics Data System (ADS)

    Liousse, C.; Doumbia, T.; Assamoi, E.; Galy-Lacaux, C.; Baeza, A.; Penner, J. E.; Val, S.; Cachier, H.; Xu, L.; Criqui, P.

    2012-12-01

    Fossil fuel and biofuel emissions of particles in Africa are expected to increase significantly in the near future, particularly due to the rapid growth of African cities. In addition to the biomass burning emissions prevailing in these areas, air quality degradation is therefore expected, with important consequences for population health and for climatic/radiative impact. In our group, we are constructing a new integrated methodology to study the relations between emissions, air quality, and their impacts. This approach includes: (1) African combustion emission characterizations; (2) joint experimental determination of aerosol chemistry from ultrafine to coarse fractions and health issues (toxicology and epidemiology); and (3) integrated environmental, health, and radiative modeling. In this work, we show some results illustrating our first estimates of African anthropogenic emission impacts. A new African anthropogenic emission inventory, adapted to regional specificities of traffic, biofuel, and industrial emissions, has been constructed for the years 2005 and 2030; biomass burning inventories were also improved within the AMMA (African Monsoon) program. The carbonaceous aerosol radiative impact in Africa has been modeled with the TM5 model and the Penner et al. (2011) radiative code for these inventories, for 2005 and 2030, and for two emission scenarios: a reference scenario with no further emission controls beyond those achieved in 2003, and a ccc* scenario including planned policies in the Kyoto protocol and regulations as applied to African emission specificities. In this study we show that enhanced heating is expected with the ccc* scenario emissions, in which the OC fraction is relatively lower than in the reference scenario. We also present results of the short-term POLCA intensive campaigns in Bamako and Dakar, in terms of aerosol chemical characterization linked to specific emission sources and of inflammatory impacts on the respiratory tract studied in vitro. In these studies, organic carbon particles appeared quite biologically active. Importantly, the air quality improvements obtained through regulations in the ccc* scenario are accompanied by a stronger heating impact.

  4. Assessing climate change over the Marche Region (central Italy) from 1951 to 2050: toward an integrated strategy for climate impacts reduction

    NASA Astrophysics Data System (ADS)

    Sangelantoni, Lorenzo; Russo, Aniello; Marincioni, Fausto; Appiotti, Federica

    2013-04-01

    This study investigates the consequences and future impacts of climate change on the social and natural systems of the Marche Region (one of the 20 administrative divisions of Italy). The Region is located in the central part of the peninsula, bordering the Adriatic Sea on the east and the Apennine mountains on the west. It extends about 60 km E-W, has a NW-SE coastline of about 170 km, and covers a total area of 9366 km2. Multimodel projections over the Marche Region of daily, monthly, and seasonal temperature and precipitation parameters have been extracted from the outputs of a set of Regional Climate Models (RCMs) over Europe run by several research institutes participating in the EU ENSEMBLE project. These climate simulations use the boundary conditions of the IPCC A1B emission scenario and have a horizontal resolution of 25 km × 25 km, covering the period from 1951 to 2050. Results show a significant increase in daily, monthly, and seasonal mean temperatures, especially in summer, with anomaly values reaching +3°C after the year 2025, relative to the model climatological normal (CliNo) for 1981-2010. Mountain areas show temperature anomalies approximately 0.5°C higher than coastal areas. Concurrently, a widespread decrease in seasonal precipitation appears to affect all seasons except autumn. Decreasing rainfall and increasing temperature could reduce the Region's aquifer recharge and the overall availability of hydro resources. These alterations could affect human health, agricultural productivity, forest fires, coastal erosion, algal blooms, and water quality. Ongoing analyses of extreme climatological indices (e.g., the frequency of maximum daily temperatures exceeding comfort thresholds) are expected to quantify such impacts. A first analysis linking climate change to the hydrologic cycle, studied through the computation of the hydro-climatic intensity index (as defined by Giorgi et al., 2012), suggests for the Marche Region an increase in the intensity of both wet and dry extremes. Such changes could alter the Region's hydro-geologic processes, leading to increased intensity and frequency of landslide and flood hazards. These trends, considering the geomorphologic, social, and economic characteristics of the Marche Region, suggest a severe physical-impact scenario over the mountain belt with subsequent socio-economic effects on hilly and coastal areas. Drier conditions are expected over the whole Region, causing soil degradation and reducing river solid transport. In turn, this will affect agricultural productivity and natural beach nourishment, likely causing a decline in beach tourism. On the other hand, increased flood frequency would impact the many urban and economic settlements located on floodplains. Once these scenarios are better defined, the next step could be mapping vulnerability conditions within the Marche Region, highlighting the exposure and resilience of infrastructure and population. Better knowledge of climate hazards and risks would support decision makers and legislators in implementing, in the short term, policies for the long-term reduction of climate impacts in the Marche Region.
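    The hydro-climatic intensity index mentioned in the abstract above combines precipitation intensity with dry-spell length. The sketch below is a hedged reconstruction in that spirit, not the exact Giorgi et al. (2012) formulation: it multiplies mean wet-day intensity by mean dry-spell length, each normalized by a reference-period value. The 1 mm wet-day threshold, the normalization, and the sample series are assumptions.

    ```python
    import numpy as np

    # Hedged sketch of a hydro-climatic intensity index: mean wet-day
    # precipitation intensity times mean dry-spell length, each normalized
    # by its reference-period value. Threshold and data are illustrative.

    def hy_int(daily_precip, ref_intensity, ref_dsl, wet_threshold=1.0):
        p = np.asarray(daily_precip, dtype=float)
        wet = p >= wet_threshold
        intensity = p[wet].mean() if wet.any() else 0.0  # mm per wet day
        # mean length of consecutive dry-day runs
        runs, current = [], 0
        for is_wet in wet:
            if not is_wet:
                current += 1
            elif current:
                runs.append(current)
                current = 0
        if current:
            runs.append(current)
        dsl = np.mean(runs) if runs else 0.0
        return (intensity / ref_intensity) * (dsl / ref_dsl)

    precip = [0, 0, 12.0, 0, 0, 0, 8.0, 0]  # mm/day, hypothetical series
    print(round(hy_int(precip, ref_intensity=10.0, ref_dsl=2.0), 2))  # 1.0
    ```

    Values above 1 would indicate hydro-climatic conditions more intense than the reference period, i.e. rarer but heavier rain separated by longer dry spells, which is the pattern the abstract associates with increased landslide and flood hazards.
    
    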

  5. Response and adaptation of grapevine cultivars to hydrological conditions forced by a changing climate in a complex landscape

    NASA Astrophysics Data System (ADS)

    De Lorenzi, Francesca; Bonfante, Antonello; Alfieri, Silvia Maria; Monaco, Eugenia; De Mascellis, Roberto; Manna, Piero; Menenti, Massimo

    2014-05-01

    Soil water availability is one of the main components of the terroir concept, influencing crop yield and fruit composition in grapes. The aim of this work is to analyze some elements of the "natural environment" of terroir (climate and soil) in combination with the intra-specific biodiversity of grapevine yield responses to water availability. From a reference (1961-90) to a future (2021-50) climate case, the effects of climate evolution on soil water availability are assessed and, with the soil water regime as a predictor variable, the potential spatial distribution of wine-producing cultivars is determined. In a region of Southern Italy (Valle Telesina, 20,000 ha), where a terroir classification has been produced (Bonfante et al., 2011), we applied an agro-hydrological model to determine water availability indicators. Simulations were performed in 60 soil typological units over the entire study area, and water availability (hydrological) indicators were determined. Two climate cases were considered: reference (1961-90) and future (2021-2050), the former from climatic statistics on observed variables and the latter from statistical downscaling of predictions by general circulation models (AOGCMs) under the A1B SRES scenario. Climatic data consist of daily time series of maximum and minimum temperature and daily rainfall on a grid with a spatial resolution of 35 km. The spatial and temporal variability of the hydrological indicators was addressed; with respect to temporal variability, both inter-annual and intra-annual (i.e., at different stages of the crop cycle) variability were analyzed. Some cultivar-specific relations between hydrological indicators and characteristics of must quality were established. Moreover, for several wine-producing cultivars, hydrological requirements were determined by means of yield response functions to soil water availability, through the re-analysis of experimental data from the scientific literature. The standard errors of the estimated requirements were determined. To assess cultivar adaptability, hydrological requirements were evaluated against the hydrological indicators. A probabilistic assessment of adaptability was performed, and the inaccuracy of the estimated hydrological requirements was accounted for through the error of estimate and its distribution. Maps of potential cultivar distribution, i.e., locations where each cultivar is expected to be compatible with the climate, were derived, and possible options for adaptation to climate change were defined. The 2021-2050 climate scenario was characterized by higher temperatures throughout the year and by a significant decrease in precipitation during spring and autumn. The results show the relevant variability of the soil water regime and its effects on cultivar adaptability. In the future climate scenario, a hydrological indicator (the relative evapotranspiration deficit, RETD), averaged over the growing season, showed an average increase of 5-8%, with more pronounced increases in the phenological phases of berry formation and ripening. At locations where soil hydrological conditions were favourable (such as the ancient terraces), hydrological indicators were quite similar in both climate cases and the adaptability of the cultivars was high in both the reference and the future climate case. The work was carried out within the Italian national project AGROSCENARI, funded by the Ministry for Agricultural, Food and Forest Policies (MIPAAF, D.M. 8608/7303/2008). Keywords: climate change, Vitis vinifera L., simulation model, yield response functions, potential cultivation area.

  6. Climate Change Effects of Forest Management and Substitution of Carbon-Intensive Materials and Fossil Fuels

    NASA Astrophysics Data System (ADS)

    Sathre, R.; Gustavsson, L.; Haus, S.; Lundblad, M.; Lundström, A.; Ortiz, C.; Truong, N.; Wikberg, P. E.

    2016-12-01

    Forests can play several roles in climate change mitigation strategies, for example as a reservoir for storing carbon and as a source of renewable materials and energy. To better understand the linkages and possible trade-offs between different forest management strategies, we conduct an integrated analysis where both sequestration of carbon in growing forests and the effects of substituting carbon intensive products within society are considered. We estimate the climate effects of directing forest management in Sweden towards increased carbon storage in forests, with more land set-aside for protection, or towards increased forest production for the substitution of carbon-intensive materials and fossil fuels, relative to a reference case of current forest management. We develop various scenarios of forest management and biomass use to estimate the carbon balances of the forest systems, including ecological and technological components, and their impacts on the climate in terms of cumulative radiative forcing over a 100-year period. For the reference case of current forest management, increasing the harvest of forest residues is found to give increased climate benefits. A scenario with increased set-aside area and the current level of forest residue harvest begins with climate benefits compared to the reference scenario, but the benefits cannot be sustained for 100 years because the rate of carbon storage in set-aside forests diminishes over time as the forests mature, but the demand for products and fuels remains. The most climatically beneficial scenario, expressed as reduced cumulative radiative forcing, in both the short and long terms is a strategy aimed at high forest production, high residue recovery rate, and high efficiency utilization of harvested biomass. Active forest management with high harvest level and efficient forest product utilization will provide more climate benefit, compared to reducing harvest and storing more carbon in the forest. 
Figure. Schematic diagram of complete modelled forest system including ecological and technological components, showing major flows of carbon.
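    The scenarios above are compared on cumulative radiative forcing over a 100-year period. A hedged numerical sketch of that comparison metric follows; the two avoided-forcing profiles are entirely hypothetical shapes that only echo the abstract's qualitative point (a set-aside benefit that diminishes as forests mature versus a smaller but sustained substitution benefit).

    ```python
    import numpy as np

    # Illustrative only: cumulative radiative forcing as the time-integrated
    # annual forcing series. Profiles below are hypothetical, not model output.

    def cumulative_forcing(annual_forcing_wm2):
        """Time-integrated forcing over the series, in W yr / m^2 (annual sum)."""
        return float(np.sum(annual_forcing_wm2))

    years = np.arange(100)
    set_aside = 0.5 * np.exp(-years / 30.0)  # benefit decays as forests mature
    production = np.full(100, 0.3)           # smaller but sustained benefit

    # Over a full century the sustained benefit integrates to a larger total.
    print(cumulative_forcing(set_aside) < cumulative_forcing(production))  # True
    ```

    The integral form is why the abstract stresses the time horizon: a scenario can lead in the short term yet lose over 100 years once its annual benefit decays.
    
    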

  7. Using Geo-Data Corporately on the Response Phase of Emergency Management

    NASA Astrophysics Data System (ADS)

    Demir Ozbek, E.; Ates, S.; Aydinoglu, A. C.

    2015-08-01

    The response phase of emergency management is the most complex phase in the entire cycle because it requires cooperation between the various actors in emergency sectors. A variety of geo-data is needed during emergency response, such as existing data provided by different institutions and dynamic data collected by different sectors at the time of the disaster. A disaster event is managed according to an elaborately defined activity-actor-task-geodata cycle. In this concept, every activity of emergency response is specified by a Standard Operating Procedure that enables users to understand their tasks and the data required for any activity. In this study, a general conceptual approach for a disaster and emergency management system is developed, based on the regulations, to serve applications in the Istanbul Governorship Provincial Disaster and Emergency Directorate. The approach is implemented for an industrial facility explosion example. In the preparation phase, optimum ambulance locations are determined according to the general response time of ambulances to all injury cases, in addition to areas with industrial fire risk. Management of the industrial fire case is organized according to defined actors, activities, and a working cycle that describes the required geo-data. A response scenario was prepared and performed for an industrial facility explosion event to exercise an effective working cycle of actors. This scenario provides for using geo-data corporately between different actors, while the data required for each task are defined to manage the industrial facility explosion event. With developing web technologies, this scenario-based approach can be effective for using geo-data corporately on the web.

  8. Identifying chemicals that are planetary boundary threats.

    PubMed

    MacLeod, Matthew; Breitholtz, Magnus; Cousins, Ian T; de Wit, Cynthia A; Persson, Linn M; Rudén, Christina; McLachlan, Michael S

    2014-10-07

    Rockström et al. proposed a set of planetary boundaries that delimit a "safe operating space for humanity". Many of the planetary boundaries that have so far been identified are determined by chemical agents. Other chemical pollution-related planetary boundaries likely exist, but are currently unknown. A chemical poses an unknown planetary boundary threat if it simultaneously fulfills three conditions: (1) it has an unknown disruptive effect on a vital Earth system process; (2) the disruptive effect is not discovered until it is a problem at the global scale, and (3) the effect is not readily reversible. In this paper, we outline scenarios in which chemicals could fulfill each of the three conditions, then use the scenarios as the basis to define chemical profiles that fit each scenario. The chemical profiles are defined in terms of the nature of the effect of the chemical and the nature of exposure of the environment to the chemical. Prioritization of chemicals in commerce against some of the profiles appears feasible, but there are considerable uncertainties and scientific challenges that must be addressed. Most challenging is prioritizing chemicals for their potential to have a currently unknown effect on a vital Earth system process. We conclude that the most effective strategy currently available to identify chemicals that are planetary boundary threats is prioritization against profiles defined in terms of environmental exposure combined with monitoring and study of the biogeochemical processes that underlie vital Earth system processes to identify currently unknown disruptive effects.

  9. Nonimaging optical illumination system

    DOEpatents

    Winston, R.; Ries, H.

    1996-12-17

    A nonimaging illumination optical device for producing a selected far field illuminance over an angular range. The optical device includes a light source, a light reflecting surface, and a family of light edge rays defined along a reference line with the reflecting surface defined in terms of the reference line as a parametric function R(t) where t is a scalar parameter position and R(t)=k(t)+Du(t) where k(t) is a parameterization of the reference line, and D is a distance from a point on the reference line to the reflection surface along the desired edge ray through the point. 35 figs.
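    The parametric construction R(t) = k(t) + D·u(t) above can be evaluated numerically. The sketch below is a minimal illustration, where the reference line k(t), the edge-ray direction u(t), and the distance D are hypothetical placeholders rather than the patent's actual design values.

    ```python
    import numpy as np

    # Sketch of the parametric reflector construction R(t) = k(t) + D(t)*u(t).
    # k(t), u(t), and D(t) below are hypothetical placeholders.

    def reflector_point(t, k, u, D):
        """Point on the reflecting surface at scalar parameter t."""
        return np.asarray(k(t)) + D(t) * np.asarray(u(t))

    # Example: straight reference line along x, unit edge rays at 45 degrees,
    # constant distance D = 2 from the reference line to the surface.
    k = lambda t: (t, 0.0)
    u = lambda t: (np.cos(np.pi / 4), np.sin(np.pi / 4))
    D = lambda t: 2.0

    p = reflector_point(1.0, k, u, D)
    print(p)  # approximately [2.414, 1.414]
    ```

    Sweeping t over a range and connecting the resulting points traces out the reflecting surface; in the patent D varies with t so that each edge ray meets the surface at the distance required to produce the selected far-field illuminance.
    
    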

  10. Nonimaging optical illumination system

    DOEpatents

    Winston, R.; Ries, H.

    1998-10-06

    A nonimaging illumination optical device for producing a selected far field illuminance over an angular range. The optical device includes a light source, a light reflecting surface, and a family of light edge rays defined along a reference line with the reflecting surface defined in terms of the reference line as a parametric function R(t) where t is a scalar parameter position and R(t)=k(t)+Du(t) where k(t) is a parameterization of the reference line, and D is a distance from a point on the reference line to the reflection surface along the desired edge ray through the point. 35 figs.

  11. Defining Top-of-Atmosphere Flux Reference Level for Earth Radiation Budget Studies

    NASA Technical Reports Server (NTRS)

    Loeb, N. G.; Kato, S.; Wielicki, B. A.

    2002-01-01

    To estimate the earth's radiation budget at the top of the atmosphere (TOA) from satellite-measured radiances, it is necessary to account for the finite geometry of the earth and recognize that the earth is a solid body surrounded by a translucent atmosphere of finite thickness that attenuates solar radiation differently at different heights. As a result, in order to account for all of the reflected solar and emitted thermal radiation from the planet by direct integration of satellite-measured radiances, the measurement viewing geometry must be defined at a reference level well above the earth's surface (e.g., 100 km). This ensures that all radiation contributions, including radiation escaping the planet along slant paths above the earth's tangent point, are accounted for. By using a field-of-view (FOV) reference level that is too low (such as the surface reference level), TOA fluxes for most scene types are systematically underestimated by 1-2 W/sq m. In addition, since TOA flux represents a flow of radiant energy per unit area, and varies with distance from the earth according to the inverse-square law, a reference level is also needed to define satellite-based TOA fluxes. From theoretical radiative transfer calculations using a model that accounts for spherical geometry, the optimal reference level for defining TOA fluxes in radiation budget studies for the earth is estimated to be approximately 20 km. At this reference level, there is no need to explicitly account for horizontal transmission of solar radiation through the atmosphere in the earth radiation budget calculation. In this context, therefore, the 20-km reference level corresponds to the effective radiative top of atmosphere for the planet. Although the optimal flux reference level depends slightly on scene type due to differences in effective transmission of solar radiation with cloud height, the difference in flux caused by neglecting the scene-type dependence is less than 0.1%.
If an inappropriate TOA flux reference level is used to define satellite TOA fluxes, and horizontal transmission of solar radiation through the planet is not accounted for in the radiation budget equation, systematic errors in net flux of up to 8 W/sq m can result. Since climate models generally use a plane-parallel model approximation to estimate TOA fluxes and the earth radiation budget, they implicitly assume zero horizontal transmission of solar radiation in the radiation budget equation, and do not need to specify a flux reference level. By defining satellite-based TOA flux estimates at a 20-km flux reference level, comparisons with plane-parallel climate model calculations are simplified since there is no need to explicitly correct plane-parallel climate model fluxes for horizontal transmission of solar radiation through a finite earth.
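    Because TOA flux varies with distance from the earth's center by the inverse-square law, as the abstract notes, the same radiant energy flow corresponds to different flux values at different reference levels. A minimal sketch of that rescaling follows; the 240 W/sq m input is a hypothetical value, not a number from the study.

    ```python
    # Sketch of inverse-square rescaling of a TOA flux between reference
    # altitudes. The 240 W/m^2 input value is hypothetical.

    R_EARTH = 6371.0  # mean Earth radius, km

    def rescale_flux(flux, h_from, h_to):
        """Re-express a flux defined at altitude h_from (km) at altitude h_to (km)."""
        return flux * ((R_EARTH + h_from) / (R_EARTH + h_to)) ** 2

    # A flux of 240 W/m^2 at the 20-km reference level, quoted at 100 km instead:
    print(round(rescale_flux(240.0, 20.0, 100.0), 1))  # 234.1
    ```

    The few-W/sq m difference between reference levels is the same order as the systematic errors the abstract warns about, which is why a single well-chosen reference level (about 20 km) matters for budget closure.
    
    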

  12. Base-Case 1% Yield Increase (BC1), All Energy Crops scenario of the 2016 Billion Ton Report

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinkel, Chad [University of Tennessee] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Myers, Aaron (ORCID:0000000320373827)

    2016-07-13

    Scientific reason for data generation: to serve as the base-case scenario for the BT16 volume 1 agricultural scenarios, comparing these projections of potential biomass supplies against a reference case (agricultural baseline 10.11578/1337885). The simulation runs from 2015 through 2040; a starting year of 2014 is used but not reported. How each parameter was produced (methods), format, and relationship to other data in the data set: This exogenous-price simulation (also referred to as a “specified-price” simulation) introduces a farmgate price, and POLYSYS solves for the biomass supplies that may be brought to market in response to that price. In specified-price scenarios, a specified farmgate price is offered constantly in all counties over all years of the simulation. This simulation begins in 2015 with an offered farmgate price for primary crop residues only between 2015 and 2018, and long-term contracts for dedicated crops beginning in 2019. Expected mature energy crop yield grows at a compounding rate of 1% beginning in 2016. The yield growth assumptions are fixed after crops are planted, such that yield gains do not apply to crops already planted, but new plantings do take advantage of the gains in expected yield growth. Instruments used: Policy Analysis System (POLYSYS, version POLYS2015_V10_alt_JAN22B), an agricultural policy modeling system of U.S. agriculture (crops and livestock), supplied by the University of Tennessee Institute of Agriculture, Agricultural Policy Analysis Center.
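    The compounding yield assumption above, where gains are locked in at planting so only new plantings benefit, can be sketched in a few lines. The baseline yield value is hypothetical; only the 1% rate and 2016 start year come from the record.

    ```python
    # Sketch of the BT16 compounding yield-growth assumption: expected mature
    # energy-crop yield grows 1%/yr from 2016 and is fixed at planting time.
    # The 5.0 base yield is a hypothetical placeholder.

    def expected_yield(base_yield, plant_year, rate=0.01, start_year=2016):
        """Expected mature yield for a crop planted in plant_year."""
        years = max(0, plant_year - start_year)
        return base_yield * (1 + rate) ** years

    # A crop planted in 2026 locks in ten years of compounded gains:
    print(round(expected_yield(5.0, 2026), 3))  # 5.523
    ```

    A stand planted in 2026 keeps this expected yield for its whole rotation; a stand planted later starts from a higher compounded value, which is the distinction the record draws between existing and new plantings.
    
    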

  13. 4% Yield Increase (HH4), All Energy Crops scenario of the 2016 Billion Ton Report

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinkel, Chad [University of Tennessee] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Langholtz, Matthew H [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Myers, Aaron [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000320373827)

    2016-07-13

    Scientific reason for data generation: to serve as an alternate high-yield scenario for the BT16 volume 1 agricultural scenarios, so that these projections of potential biomass supplies can be compared against a reference case (agricultural baseline 10.11578/1337885). The simulation runs from 2015 through 2040; a starting year of 2014 is used but not reported. Date the data set was last modified: 02/02/2016. How each parameter was produced (methods), format, and relationship to other data in the data set: This exogenous price simulation (also referred to as a “specified-price” simulation) introduces a farmgate price, and POLYSYS solves for the biomass supplies that may be brought to market in response to that price. In specified-price scenarios, a specified farmgate price is offered constantly in all counties over all years of the simulation. This simulation begins in 2015 with an offered farmgate price for primary crop residues only between 2015 and 2018, and long-term contracts for dedicated crops beginning in 2019. Expected mature energy crop yield grows at a compounding rate of 4% beginning in 2016. The yield growth assumptions are fixed after crops are planted, such that yield gains do not apply to crops already planted, but new plantings do take advantage of the gains in expected yield growth. Instruments used: Policy Analysis System (POLYSYS, version POLYS2015_V10_alt_JAN22B), an agricultural policy modeling system of U.S. agriculture (crops and livestock), supplied by the University of Tennessee Institute of Agriculture, Agricultural Policy Analysis Center.

  14. Developing spatially explicit footprints of plausible land-use scenarios in the Santa Cruz Watershed, Arizona and Sonora

    USGS Publications Warehouse

    Norman, Laura M.; Feller, Mark; Villarreal, Miguel L.

    2012-01-01

    The SLEUTH urban growth model is applied to a binational dryland watershed to envision and evaluate plausible future scenarios of land use change into the year 2050. Our objective was to create a suite of geospatial footprints portraying potential land use change that can be used to aid binational decision-makers in assessing the impacts relative to sustainability of natural resources and potential socio-ecological consequences of proposed land-use management. Three alternatives are designed to simulate different conditions: (i) a Current Trends Scenario of unmanaged exponential growth, (ii) a Conservation Scenario with managed growth to protect the environment, and (iii) a Megalopolis Scenario in which growth is accentuated around a defined international trade corridor. The model was calibrated with historical data extracted from a time series of satellite images. Model materials, methodology, and results are presented. Our Current Trends Scenario predicts the footprint of urban growth to approximately triple from 2009 to 2050, which is corroborated by local population estimates. The Conservation Scenario results in protecting 46% more of the Evergreen class (more than 150,000 acres) than the Current Trends Scenario and approximately 95,000 acres of Barren Land, Crops, Deciduous Forest (Mesquite Bosque), Grassland/Herbaceous, Urban/Recreational Grasses, and Wetlands classes combined. The Megalopolis Scenario results also depict the preservation of some of these land-use classes compared to the Current Trends Scenario, most notably in the environmentally important headwaters region. Connectivity and areal extent of land cover types that provide wildlife habitat were preserved under the alternative scenarios when compared to Current Trends.

  15. Conservation planning under uncertainty in urban development and vegetation dynamics

    PubMed Central

    Troupin, David; Carmel, Yohay

    2018-01-01

    Systematic conservation planning is a framework for optimally locating and prioritizing areas for conservation. An often-noted shortcoming of most conservation planning studies is that they do not address future uncertainty. The selection of protected areas that are intended to ensure the long-term persistence of biodiversity is often based on a snapshot of the current situation, ignoring processes such as climate change. Scenarios, in the sense of being accounts of plausible futures, can be utilized to identify conservation area portfolios that are robust to future uncertainty. We compared three approaches for utilizing scenarios in conservation area selection: considering a full set of scenarios (all-scenarios portfolio), assuming the realization of specific scenarios, and a reference strategy based on the current situation (current distributions portfolio). Our objective was to compare the robustness of these approaches in terms of their relative performance across future scenarios. We focused on breeding bird species in Israel’s Mediterranean region. We simulated urban development and vegetation dynamics scenarios 60 years into the future using DINAMICA-EGO, a cellular-automata simulation model. For each scenario, we mapped the target species’ available habitat distribution, identified conservation priority areas using the site-selection software MARXAN, and constructed conservation area portfolios using the three aforementioned strategies. We then assessed portfolio performance based on the number of species for which representation targets were met in each scenario. The all-scenarios portfolio consistently outperformed the other portfolios, and was more robust to ‘errors’ (e.g., when an assumed specific scenario did not occur). On average, the all-scenarios portfolio achieved representation targets for five additional species compared with the current distributions portfolio (approximately 33 versus 28 species). 
Our findings highlight the importance of considering a broad and meaningful set of scenarios, rather than relying on the current situation, the expected occurrence of specific scenarios, or the worst-case scenario. PMID:29621330
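    The portfolio comparison described above can be sketched in a few lines. The data structures below (scenario -> site -> species habitat area, plus per-species targets) are invented stand-ins for the DINAMICA-EGO/MARXAN outputs used in the study:

```python
def portfolio_performance(portfolio, scenarios, targets):
    """For each scenario, count the species whose representation target
    is met by the habitat contained in the portfolio's sites."""
    results = {}
    for name, sites in scenarios.items():
        met = 0
        for species, target in targets.items():
            area = sum(sites.get(site, {}).get(species, 0.0) for site in portfolio)
            if area >= target:
                met += 1
        results[name] = met
    return results

# Invented toy data: two scenarios, two sites, two species.
scenarios = {
    "urban_growth": {"A": {"sp1": 10.0}, "B": {"sp2": 5.0}},
    "veg_dynamics": {"A": {"sp1": 2.0},  "B": {"sp2": 5.0}},
}
targets = {"sp1": 5.0, "sp2": 5.0}
perf = portfolio_performance(["A", "B"], scenarios, targets)
# Robustness in the paper's sense: performance in the worst-performing scenario.
worst_case = min(perf.values())
```

    Scoring candidate portfolios by their worst-case (or average) scenario performance is what lets an all-scenarios portfolio be compared against one built from current distributions alone.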

  16. Conservation planning under uncertainty in urban development and vegetation dynamics.

    PubMed

    Troupin, David; Carmel, Yohay

    2018-01-01

    Systematic conservation planning is a framework for optimally locating and prioritizing areas for conservation. An often-noted shortcoming of most conservation planning studies is that they do not address future uncertainty. The selection of protected areas that are intended to ensure the long-term persistence of biodiversity is often based on a snapshot of the current situation, ignoring processes such as climate change. Scenarios, in the sense of being accounts of plausible futures, can be utilized to identify conservation area portfolios that are robust to future uncertainty. We compared three approaches for utilizing scenarios in conservation area selection: considering a full set of scenarios (all-scenarios portfolio), assuming the realization of specific scenarios, and a reference strategy based on the current situation (current distributions portfolio). Our objective was to compare the robustness of these approaches in terms of their relative performance across future scenarios. We focused on breeding bird species in Israel's Mediterranean region. We simulated urban development and vegetation dynamics scenarios 60 years into the future using DINAMICA-EGO, a cellular-automata simulation model. For each scenario, we mapped the target species' available habitat distribution, identified conservation priority areas using the site-selection software MARXAN, and constructed conservation area portfolios using the three aforementioned strategies. We then assessed portfolio performance based on the number of species for which representation targets were met in each scenario. The all-scenarios portfolio consistently outperformed the other portfolios, and was more robust to 'errors' (e.g., when an assumed specific scenario did not occur). On average, the all-scenarios portfolio achieved representation targets for five additional species compared with the current distributions portfolio (approximately 33 versus 28 species). 
Our findings highlight the importance of considering a broad and meaningful set of scenarios, rather than relying on the current situation, the expected occurrence of specific scenarios, or the worst-case scenario.

  17. San Pedro River Basin Data Browser Report

    EPA Science Inventory

    Acquisition of primary spatial data and database development are initial features of any type of landscape assessment project. They provide contemporary land cover and the ancillary datasets necessary to establish reference condition and develop alternative future scenarios that ...

  18. New mechanisms of disease and parasite-host interactions.

    PubMed

    de Souza, Tiago Alves Jorge; de Carli, Gabriel Jose; Pereira, Tiago Campos

    2016-09-01

    An unconventional interaction between a patient and parasites was recently reported, in which parasitic cells invaded host's tissues, establishing several tumors. This finding raises various intriguing hypotheses on unpredicted forms of interplay between a patient and infecting parasites. Here we present four unusual hypothetical host-parasite scenarios with intriguing medical consequences. Relatively simple experimental designs are described in order to evaluate such hypotheses. The first one refers to the possibility of metabolic disorders in parasites intoxicating the host. The second one is on possibility of patients with inborn errors of metabolism (IEM) being more resistant to parasites (due to accumulation of toxic compounds in the bloodstream). The third one refers to a mirrored scenario: development of tumors in parasites due to ingestion of host's circulating cancer cells. The last one describes a complex relationship between parasites accumulating a metabolite and supplying it to a patient with an IEM. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Rationale, Scenarios, and Profiles for the Application of the Internet Protocol Suite (IPS) in Space Operations

    NASA Technical Reports Server (NTRS)

    Benbenek, Daniel B.; Walsh, William

    2010-01-01

    This greenbook captures some of the current, planned and possible future uses of the Internet Protocol (IP) as part of Space Operations. It attempts to describe how the Internet Protocol is used in specific scenarios. Of primary focus is low-earth-orbit space operations, which is referred to here as the design reference mission (DRM). This is because most of the program experience drawn upon derives from this type of mission. Application profiles are provided. This includes parameter settings programs have proposed for sending IP datagrams over CCSDS links, the minimal subsets and features of the IP protocol suite and applications expected for interoperability between projects, and the configuration, operations and maintenance of these IP functions. Of special interest is capturing the lessons learned from the Constellation Program in this area, since that program included a fairly ambitious use of the Internet Protocol.

  20. The Integrated Medical Model: A Probabilistic Simulation Model for Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. 
To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting mass and volume constraints.
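    The Monte Carlo structure described above can be sketched as follows. The sketch is reduced to a single outcome (evacuation), and every condition parameter in it is an invented placeholder, not a value from the iMED database:

```python
import random

def simulate_mission(n_trials=5000, mission_days=180, kit_available=True, seed=0):
    """Toy Monte Carlo in the spirit of the IMM (illustrative only).

    Each trial walks through the mission day by day; each condition may
    occur with a small daily probability, and its evacuation probability
    depends on whether treatment resources are available.
    """
    conditions = [  # hypothetical incidence and outcome parameters
        {"name": "appendicitis", "daily_p": 1e-4,
         "evac_treated": 0.20, "evac_untreated": 0.90},
        {"name": "dental", "daily_p": 5e-4,
         "evac_treated": 0.00, "evac_untreated": 0.10},
    ]
    rng = random.Random(seed)
    evacuations = 0
    for _ in range(n_trials):
        evacuated = False
        for _day in range(mission_days):
            for c in conditions:
                if rng.random() < c["daily_p"]:
                    p_evac = (c["evac_treated"] if kit_available
                              else c["evac_untreated"])
                    if rng.random() < p_evac:
                        evacuated = True
                        break
            if evacuated:
                break
        evacuations += int(evacuated)
    return evacuations / n_trials
```

    Running the sketch with kit_available=False bounds outcomes from the untreated side, mirroring the fully treated/untreated dichotomy described above.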

  1. The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. 
To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting mass and volume constraints.

  2. Low-mass neutralino dark matter in supergravity scenarios: phenomenology and naturalness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peiró, M.; Robles, S., E-mail: mpeirogarcia@gmail.com, E-mail: sandra.robles@uam.es

    2017-05-01

    The latest experimental results from the LHC and dark matter (DM) searches suggest that the parameter space allowed in supersymmetric theories is subject to strong reductions. These bounds are especially constraining for scenarios entailing light DM particles. Previous studies have shown that light neutralino DM in the Minimal Supersymmetric Standard Model (MSSM), with parameters defined at the electroweak scale, is still viable when the low energy spectrum of the model features light sleptons, in which case the relic density constraint can be fulfilled. In view of this, we have investigated the viability of light neutralinos as DM candidates in the MSSM, with parameters defined at the grand unification scale. We have analysed the optimal choices of non-universalities in the soft supersymmetry-breaking parameters for both gauginos and scalars, in order to avoid the stringent experimental constraints. We show that light neutralinos, with a mass as low as 25 GeV, are viable in supergravity scenarios if the gaugino mass parameters at high energy are highly non-universal, while the scalar masses can remain of the same order. These scenarios typically predict a very small cross section of neutralinos off protons and neutrons, thereby being very challenging for direct detection experiments. However, a potential detection of smuons and selectrons at the LHC, together with a hypothetical discovery of a gamma-ray signal from neutralino annihilations in dwarf spheroidal galaxies, could shed light on this kind of solution. Finally, we have investigated the naturalness of these scenarios, taking into account all the potential sources of tuning. Besides the electroweak fine-tuning, we have found that the tuning to reproduce the correct DM relic abundance and that to match the measured Higgs mass can also be important when estimating the total degree of naturalness.

  3. An example of population-level risk assessments for small mammals using individual-based population models.

    PubMed

    Schmitt, Walter; Auteri, Domenica; Bastiansen, Finn; Ebeling, Markus; Liu, Chun; Luttik, Robert; Mastitsky, Sergey; Nacci, Diane; Topping, Chris; Wang, Magnus

    2016-01-01

    This article presents a case study demonstrating the application of 3 individual-based, spatially explicit population models (IBMs, also known as agent-based models) in ecological risk assessments to predict long-term effects of a pesticide on populations of small mammals. The 3 IBMs each used a hypothetical fungicide (FungicideX) in different scenarios: spraying in cereals (common vole, Microtus arvalis), spraying in orchards (field vole, Microtus agrestis), and cereal seed treatment (wood mouse, Apodemus sylvaticus). Each scenario used existing model landscapes, which differed greatly in size and structural complexity. The toxicological profile of FungicideX was defined so that the deterministic long-term first-tier risk assessment would result in high risk to small mammals, thus providing the opportunity to use the IBMs for risk assessment refinement (i.e., higher-tier risk assessment). Despite differing internal model designs and scenarios, results indicated low population sensitivity in all 3 cases unless FungicideX was applied at very high (×10) rates. Recovery from local population impacts was generally fast. Only when patch extinctions occurred in simulations of intentionally high acute toxic effects were recovery periods, then determined by recolonization, of any concern. Conclusions include recommendations for the most important input considerations, including the selection of exposure levels, duration of simulations, a statistically robust number of replicates, and endpoints to report. However, further investigation and agreement are needed to develop recommendations for landscape attributes such as size, structure, and crop rotation to define appropriate regulatory risk assessment scenarios. Overall, the application of IBMs provides multiple advantages to higher-tier ecological risk assessments for small mammals, including consistent and transparent direct links to specific protection goals, and the consideration of more realistic scenarios. 
© 2015 SETAC.
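    The general shape of such an individual-based simulation can be sketched as follows. This toy model is not one of the three IBMs used in the study, and every rate in it is invented for illustration:

```python
import random

def simulate_population(n0=200, days=365, daily_mortality=0.005,
                        exposure_mortality=0.05, spray_days=(120, 150),
                        birth_p=0.01, carrying_capacity=400, seed=42):
    """Toy individual-based population sketch: each individual faces
    background mortality, extra mortality on pesticide application days,
    and density-dependent reproduction. Returns the daily population."""
    rng = random.Random(seed)
    pop = n0
    history = []
    for day in range(days):
        # Per-individual survival draw, with added risk on spray days.
        p_die = daily_mortality + (exposure_mortality if day in spray_days else 0.0)
        survivors = sum(1 for _ in range(pop) if rng.random() >= p_die)
        # Density-dependent births: reproduction slows near carrying capacity.
        crowding = max(0.0, 1 - survivors / carrying_capacity)
        births = sum(1 for _ in range(survivors) if rng.random() < birth_p * crowding)
        pop = survivors + births
        history.append(pop)
    return history
```

    Running many replicates of such a model, with and without the exposure term, is what supports the population-level comparisons (impact magnitude, recovery time) described above.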

  4. Low-mass neutralino dark matter in supergravity scenarios: phenomenology and naturalness

    NASA Astrophysics Data System (ADS)

    Peiró, M.; Robles, S.

    2017-05-01

    The latest experimental results from the LHC and dark matter (DM) searches suggest that the parameter space allowed in supersymmetric theories is subject to strong reductions. These bounds are especially constraining for scenarios entailing light DM particles. Previous studies have shown that light neutralino DM in the Minimal Supersymmetric Standard Model (MSSM), with parameters defined at the electroweak scale, is still viable when the low energy spectrum of the model features light sleptons, in which case the relic density constraint can be fulfilled. In view of this, we have investigated the viability of light neutralinos as DM candidates in the MSSM, with parameters defined at the grand unification scale. We have analysed the optimal choices of non-universalities in the soft supersymmetry-breaking parameters for both gauginos and scalars, in order to avoid the stringent experimental constraints. We show that light neutralinos, with a mass as low as 25 GeV, are viable in supergravity scenarios if the gaugino mass parameters at high energy are highly non-universal, while the scalar masses can remain of the same order. These scenarios typically predict a very small cross section of neutralinos off protons and neutrons, thereby being very challenging for direct detection experiments. However, a potential detection of smuons and selectrons at the LHC, together with a hypothetical discovery of a gamma-ray signal from neutralino annihilations in dwarf spheroidal galaxies, could shed light on this kind of solution. Finally, we have investigated the naturalness of these scenarios, taking into account all the potential sources of tuning. Besides the electroweak fine-tuning, we have found that the tuning to reproduce the correct DM relic abundance and that to match the measured Higgs mass can also be important when estimating the total degree of naturalness.

  5. Techno-economic and environmental assessment of biogas production from banana peel (Musa paradisiaca) in a biorefinery concept.

    PubMed

    Martínez-Ruano, Jimmy Anderson; Caballero-Galván, Ashley Sthefanía; Restrepo-Serna, Daissy Lorena; Cardona, Carlos Ariel

    2018-04-07

    Two scenarios for biogas production using banana peel as the raw material were evaluated. The first scenario involves the stand-alone production of biogas, and the second includes biogas production together with other products under a biorefinery concept. In both scenarios, the influence of production scale on the process economy was assessed and feasibility limits were defined. For this purpose, mass and energy balances were established using the software Aspen Plus along with kinetic models reported in the literature. The economic and environmental analysis of the process was performed considering Colombian economic conditions. As a result, it was found that different process scales showed great potential for biogas production. Plants with greater capacity have a greater economic benefit than those with lower capacity; however, this benefit comes with high energy consumption and a greater environmental impact.

  6. Department of Defense Strategy to Support Multi-Agency Bat Conservation Initiative within the State of Utah

    DTIC Science & Technology

    2008-02-28

    Datum: Geometric reference surface. Original Site Location datum is defined by the user's map datum, e.g. NAD27 Conus or NAD83. Calculated and recorded automatically if the fields UTM_N and UTM_E or Township, Range, and Section are entered.

  7. Assessment of future impacts of potential climate change scenarios on aquifer recharge in continental Spain

    NASA Astrophysics Data System (ADS)

    Pulido-Velazquez, David; Collados-Lara, Antonio-Juan; Alcalá, Francisco J.

    2017-04-01

    This research proposes and applies a method to assess the potential impacts of future climate scenarios on aquifer rainfall recharge in wide and varied regions, with continental Spain selected to demonstrate the application. The method requires generating future series of climatic variables (precipitation, temperature) and simulating them within a hydrological model previously calibrated on historical data. In a previous work, Alcalá and Custodio (2014) used the atmospheric chloride mass balance (CMB) method for the spatial evaluation of average aquifer recharge by rainfall over the whole of continental Spain, assuming long-term steady conditions of the balance variables. The distributed average CMB variables necessary to calculate recharge were estimated from available variable-length data series of varying quality and spatial coverage. The CMB variables were regionalized by ordinary kriging at the same 4976 nodes of a 10 km x 10 km grid. Two main sources of uncertainty affecting recharge estimates (expressed as the coefficient of variation, CV) were segregated: the inherent natural variability of the variables and the mapping itself. Based on these stationary results we define a simple empirical rainfall-recharge model. We consider the spatiotemporal variability of rainfall and temperature to be the most important climatic variables influencing potential aquifer recharge in natural regime. Changes in these variables can be important in the assessment of potential future impacts of climatic scenarios on renewable groundwater resources. For instance, if temperature increases, actual evapotranspiration (EA) will increase, reducing the water available for other groundwater balance components, including recharge. 
    For this reason, instead of defining an infiltration rate coefficient that relates precipitation (P) to recharge, we propose to define a transformation function that allows estimating the spatial distribution of recharge (both its average value and its uncertainty) from the difference between P and EA in each area. A complete analysis of potential short-term (2016-2045) future climate scenarios in continental Spain has been performed considering different sources of uncertainty. It is based on historical climatic data for the period 1976-2005 and on the climate model simulations (for the control period [1976-2005] and future scenarios [2016-2045]) performed within the CORDEX EU project. The most pessimistic emission scenario (RCP8.5) has been considered. For the RCP8.5 scenario we have analyzed the time series generated by simulating with five regional climate models (CCLM4-8-17, RCA4, HIRHAM5, RACMO22E, and WRF331F) nested within four different general circulation models (GCMs). Two conceptual approaches (bias correction and delta change techniques) have been applied to generate potential future climate scenarios from these data. Different ensembles of the obtained time series have been proposed to obtain more representative scenarios, considering either all the simulations or only those providing better approximations to the historical statistics, based on a multicriteria analysis. This is a step toward analyzing potential future impacts on aquifer recharge by simulating these scenarios within the rainfall-recharge model. This research has been supported by the CGL2013-48424-C2-2-R (MINECO) and the PMAFI/06/14 (UCAM) projects.
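    A transformation function of the kind proposed above can be sketched as follows. This is a minimal illustration of the idea only: the linear form and the parameter values (alpha, cv) are assumptions for the sketch, not the function calibrated in the study:

```python
def recharge_estimate(p_mm, ea_mm, alpha=0.3, cv=0.25):
    """Hypothetical transformation function: mean recharge as a fraction
    alpha of the climatic water surplus P - EA (floored at zero), with a
    relative uncertainty cv carried alongside the mean.

    Returns (mean recharge, one-sigma uncertainty), both in mm/yr.
    """
    surplus = max(0.0, p_mm - ea_mm)
    mean = alpha * surplus
    return mean, cv * mean
```

    Evaluating such a function cell by cell over a climate-scenario grid of P and EA yields the spatial distribution of recharge and its uncertainty, which is the role the transformation function plays in the method above.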

  8. Storyboard for the Medical System Concept of Operations for Mars Exploration Missions

    NASA Technical Reports Server (NTRS)

    Antonsen, Eric; Hailey, Melinda; Reyes, David; Rubin, David; Urbina, Michelle

    2017-01-01

    This storyboard conceptualizes one scenario of an integrated medical system during a Mars exploration mission. All content is for illustrative purposes only and neither defines nor implies system design requirements.

  9. Virtual Exchange Services and Shared CROMERR Services

    EPA Pesticide Factsheets

    Define the objectives, leadership, and membership of an IPT that will guide the requirements definition for a cloud-based Node installation, and describe the anticipated architecture and example scenario implementations available to the EN community.

  10. Defining Support Requirements During Conceptual Design of Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Morris, W. D.; White, N. H.; Davis, W. T.; Ebeling, C. E.

    1995-01-01

    Current methods for defining the operational support requirements of new systems are data intensive and require significant design information. Methods are being developed to aid in defining support requirements for new launch vehicles during the conceptual design phase, working with the level of information available at that stage. These methods will provide support assessments based on the vehicle design and the operating scenarios. The results can be used both to define expected support requirements for new launch vehicle designs and to help evaluate the benefits of using new technologies. This paper describes the models and their current status, and provides examples of their use.

  11. Defining Human Failure Events for Petroleum Risk Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, barriers and human failure events (HFEs) for human reliability analysis (HRA) are identified and described. The barriers, called target systems, are identified from risk-significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  12. Application of Energy Integration Techniques to the Design of Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Levri, Julie; Finn, Cory

    2000-01-01

    Exchanging heat between hot and cold streams within an advanced life support system can save energy. This savings will reduce the equivalent system mass (ESM) of the system. Different system configurations are examined under steady-state conditions for various percentages of food growth and waste treatment. The scenarios investigated represent possible design options for a Mars reference mission. Reference mission definitions are drawn from the ALSS Modeling and Analysis Reference Missions Document, which includes definitions for space station evolution, Mars landers, and a Mars base. For each scenario, streams requiring heating or cooling are identified and characterized by mass flow, supply and target temperatures and heat capacities. The Pinch Technique is applied to identify good matches for energy exchange between the hot and cold streams and to calculate the minimum external heating and cooling requirements for the system. For each pair of hot and cold streams that are matched, there will be a reduction in the amount of external heating and cooling required, and the original heating and cooling equipment will be replaced with a heat exchanger. The net cost savings can be either positive or negative for each stream pairing, and the priority for implementing each pairing can be ranked according to its potential cost savings. Using the Pinch technique, a complete system heat exchange network is developed and heat exchangers are sized to allow for calculation of ESM. The energy-integrated design typically has a lower total ESM than the original design with no energy integration. A comparison of ESM savings in each of the scenarios is made to direct future Pinch Analysis efforts.
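    The minimum-utility targeting step of the Pinch Technique can be sketched with the standard problem-table algorithm. The streams in the test usage are a textbook-style illustration, not data from the Mars reference mission scenarios:

```python
def pinch_targets(streams, dt_min=10.0):
    """Problem-table algorithm: minimum external heating and cooling duties.

    streams: list of (t_supply, t_target, CP) with CP in kW/K; hot streams
    have t_supply > t_target. Temperatures are shifted by dt_min/2 (down
    for hot, up for cold) before building temperature intervals.
    """
    shifted = []
    for ts, tt, cp in streams:
        if ts > tt:  # hot stream: releases heat
            shifted.append((ts - dt_min / 2, tt - dt_min / 2, cp, "hot"))
        else:        # cold stream: absorbs heat
            shifted.append((ts + dt_min / 2, tt + dt_min / 2, cp, "cold"))
    bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)
    # Cascade the net heat surplus/deficit down the shifted intervals.
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0
        for t1, t2, cp, kind in shifted:
            top, bot = max(t1, t2), min(t1, t2)
            if bot <= lo and top >= hi:  # stream spans this interval
                net += cp * (hi - lo) if kind == "hot" else -cp * (hi - lo)
        heat += net
        cascade.append(heat)
    q_hot_min = max(0.0, -min(cascade))   # minimum external heating
    q_cold_min = cascade[-1] + q_hot_min  # minimum external cooling
    return q_hot_min, q_cold_min
```

    The difference between these targets and the stand-alone heating/cooling duties is the energy saving that the heat-exchange network, and hence the ESM comparison above, is built on.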

  13. Modeling Future Land Use Scenarios in South Korea: Applying the IPCC Special Report on Emissions Scenarios and the SLEUTH Model on a Local Scale

    NASA Astrophysics Data System (ADS)

    Han, Haejin; Hwang, YunSeop; Ha, Sung Ryong; Kim, Byung Sik

    2015-05-01

    This study developed three scenarios of future land use/land cover on a local level for the Kyung-An River Basin and its vicinity in South Korea at a 30-m resolution based on the two scenario families of the Intergovernmental Panel on Climate Change (IPCC) Special Report Emissions Scenarios (SRES): A2 and B1, as well as a business-as-usual scenario. The IPCC SRES A2 and B1 were used to define future local development patterns and associated land use change. We quantified the population-driven demand for urban land use for each qualitative storyline and allocated the urban demand in geographic space using the SLEUTH model. The model results demonstrate the possible land use/land cover change scenarios for the years from 2000 to 2070 by examining the broad narrative of each SRES within the context of a local setting, such as the Kyoungan River Basin, constructing narratives of local development shifts and modeling a set of `best guess' approximations of the future land use conditions in the study area. This study found substantial differences in demands and patterns of land use changes among the scenarios, indicating compact development patterns under the SRES B1 compared to the rapid and dispersed development under the SRES A2.

  14. Modeling future land use scenarios in South Korea: applying the IPCC special report on emissions scenarios and the SLEUTH model on a local scale.

    PubMed

    Han, Haejin; Hwang, YunSeop; Ha, Sung Ryong; Kim, Byung Sik

    2015-05-01

    This study developed three scenarios of future land use/land cover on a local level for the Kyung-An River Basin and its vicinity in South Korea at a 30-m resolution based on the two scenario families of the Intergovernmental Panel on Climate Change (IPCC) Special Report Emissions Scenarios (SRES): A2 and B1, as well as a business-as-usual scenario. The IPCC SRES A2 and B1 were used to define future local development patterns and associated land use change. We quantified the population-driven demand for urban land use for each qualitative storyline and allocated the urban demand in geographic space using the SLEUTH model. The model results demonstrate the possible land use/land cover change scenarios for the years from 2000 to 2070 by examining the broad narrative of each SRES within the context of a local setting, such as the Kyoungan River Basin, constructing narratives of local development shifts and modeling a set of 'best guess' approximations of the future land use conditions in the study area. This study found substantial differences in demands and patterns of land use changes among the scenarios, indicating compact development patterns under the SRES B1 compared to the rapid and dispersed development under the SRES A2.

  15. First steps of processing VLBI data of space probes with VieVS

    NASA Astrophysics Data System (ADS)

    Plank, L.; Böhm, J.; Schuh, H.

    2011-07-01

    Since 2008 the VLBI group at the Institute of Geodesy and Geophysics (IGG) of the Vienna University of Technology has been developing the Vienna VLBI Software VieVS, which is capable of processing geodetic VLBI data in NGS format. We are constantly upgrading the software, e.g. by developing a scheduling tool and by extending it from single-session solutions to a so-called global solution, allowing the joint analysis of many sessions covering several years. In this presentation we report on first steps to enable the processing of space VLBI data with the software. Driven by the recently increasing number of space VLBI applications, our goal is the geodetic use of such data, primarily concerning frame ties between various reference frames, e.g. by connecting the dynamic reference frame of a space probe with the kinematically defined International Celestial Reference Frame (ICRF). The main extensions with respect to the existing VieVS are the treatment of fast-moving targets, the implementation of a delay model for radio emitters at finite distances, and the adequate mathematical model and adjustment of the particular unknowns. Work has been done for two mission scenarios so far: differential VLBI (D-VLBI) data from the two sub-satellites of the Japanese lunar mission Selene were processed, and VLBI observations of GNSS satellites were modelled in VieVS. Besides some general aspects, we give details on the calculation of the theoretical delay (delay model for moving sources at finite distances) and its realization in VieVS. First results with real data and comparisons with best-fit mission orbit data are also presented.

  16. Colorful Twisted Top Partners and Partnerium at the LHC

    NASA Astrophysics Data System (ADS)

    Kats, Yevgeny; McCullough, Matthew; Perez, Gilad; Soreq, Yotam; Thaler, Jesse

    2017-06-01

    In scenarios that stabilize the electroweak scale, the top quark is typically accompanied by partner particles. In this work, we demonstrate how extended stabilizing symmetries can yield scalar or fermionic top partners that transform as ordinary color triplets but carry exotic electric charges. We refer to these scenarios as "hypertwisted" since they involve modifications to hypercharge in the top sector. As proofs of principle, we construct two hypertwisted scenarios: a supersymmetric construction with spin-0 top partners, and a composite Higgs construction with spin-1/2 top partners. In both cases, the top partners are still phenomenologically compatible with the mass range motivated by weak-scale naturalness. The phenomenology of hypertwisted scenarios is diverse, since the lifetimes and decay modes of the top partners are model dependent. The novel coupling structure opens up search channels that do not typically arise in top-partner scenarios, such as pair production of top-plus-jet resonances. Furthermore, hypertwisted top partners are typically sufficiently long lived to form "top-partnerium" bound states that decay predominantly via annihilation, motivating searches for rare narrow resonances with diboson decay modes.

  17. A new framework for evaluating the impacts of drought on net primary productivity of grassland.

    PubMed

    Lei, Tianjie; Wu, Jianjun; Li, Xiaohan; Geng, Guangpo; Shao, Changliang; Zhou, Hongkui; Wang, Qianfeng; Liu, Leizhen

    2015-12-01

    This paper presents a framework for evaluating the impacts of drought (as a single factor) on grassland ecosystems. The framework quantifies the magnitude of drought impact as the unacceptable short-term and long-term effects an ecosystem may experience relative to a reference standard. Net primary productivity (NPP) was selected as the response indicator to assess the quantitative impact of drought on Inner Mongolia grassland, based on the Standardized Precipitation Index (SPI) and the BIOME-BGC model. The framework consists of six main steps: 1) clearly defining drought scenarios, such as moderate, severe, and extreme drought; 2) selecting an appropriate indicator of drought impact; 3) selecting an appropriate ecosystem model, verifying its capabilities, calibrating the bias, and assessing the uncertainty; 4) assigning a level of unacceptable drought impact on the indicator; 5) determining the response of the indicator to drought and to a normal weather state under global change; and 6) investigating the unacceptable impact of drought at different spatial scales. We found that NPP losses assessed using the new framework were more sensitive to drought and had higher precision than those from the long-term average method. Moreover, the total and average losses of NPP differed among grassland types during the drought years from 1961-2009. NPP loss increased significantly along a gradient of increasing drought levels, and NPP loss under the same drought level varied among grassland types. The operational framework is particularly suited to integrated assessment of the effects of different drought events and of long-term droughts at multiple spatial scales, providing essential insights for the sciences and societies that must develop coping strategies for ecosystems facing such events. Copyright © 2015 Elsevier B.V. All rights reserved.
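    Steps 1, 4, and 6 of such a framework can be sketched as follows: classify drought years with SPI thresholds, then measure NPP loss against a reference (normal-weather) state. The thresholds and all numbers are common illustrative choices, not values from the paper.

```python
def drought_class(spi):
    """Commonly used SPI class boundaries (assumed here)."""
    if spi <= -2.0:
        return "extreme"
    if spi <= -1.5:
        return "severe"
    if spi <= -1.0:
        return "moderate"
    return "none"

def npp_losses(spi_by_year, npp_by_year, npp_reference):
    """NPP loss (reference minus simulated) for each drought year."""
    losses = {}
    for year, spi in spi_by_year.items():
        cls = drought_class(spi)
        if cls != "none":
            losses[year] = (cls, npp_reference - npp_by_year[year])
    return losses

spi = {1999: -0.3, 2000: -1.2, 2001: -2.4}           # illustrative SPI values
npp = {1999: 210.0, 2000: 180.0, 2001: 140.0}        # gC/m^2/yr, illustrative
print(npp_losses(spi, npp, npp_reference=205.0))
```

    In the paper's workflow the reference NPP would come from the calibrated ecosystem model run under normal weather, not from a fixed constant as here.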

  18. Coordinate references for the indoor/outdoor seamless positioning

    NASA Astrophysics Data System (ADS)

    Ruan, Ling; Zhang, Ling; Long, Yi; Cheng, Fei

    2018-05-01

    Indoor positioning technologies are developing rapidly, and seamless positioning that connects indoor and outdoor space is a new trend. Indoor and outdoor positioning do not use the same coordinate system, and different indoor positioning scenes use different local coordinate reference systems. A specific and unified coordinate reference frame is needed as the spatial basis and premise of seamless positioning applications. Trajectory analysis integrating indoor and outdoor data likewise requires a uniform coordinate reference. However, a coordinate reference frame for seamless positioning that can be applied to various complex scenarios has long been lacking. In this paper, we propose a universal coordinate reference frame for indoor/outdoor seamless positioning. The research focuses on analyzing and classifying indoor positioning scenes and puts forward methods for establishing the coordinate reference system and for coordinate transformation in each scene. Through experiments, the feasibility of the calibration method was verified.
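    The core operation such a unified frame requires is mapping each indoor local coordinate system into the common outdoor frame. A minimal sketch is a 2D similarity (Helmert) transform; the scale, rotation, and translation parameters below are illustrative, not from the paper.

```python
import math

def local_to_global(x, y, scale, theta, tx, ty):
    """Rotate by theta, scale, then translate into the global frame."""
    xg = scale * (x * math.cos(theta) - y * math.sin(theta)) + tx
    yg = scale * (x * math.sin(theta) + y * math.cos(theta)) + ty
    return xg, yg

# A point 10 m along the indoor x-axis, in a local frame rotated 90 degrees
# and offset (500, 200) m from the global origin:
print(local_to_global(10.0, 0.0, 1.0, math.pi / 2, 500.0, 200.0))
```

    In practice the four parameters per scene would be estimated from surveyed control points shared by both frames, which is what the calibration step verifies.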

  19. Tailoring Green Infrastructure Implementation Scenarios based on Stormwater Management Objectives

    EPA Science Inventory

    Green infrastructure (GI) refers to stormwater management practices that mimic nature by soaking up, storing, and controlling stormwater onsite. GI practices can contribute measurable benefits towards meeting stormwater management objectives, such as runoff peak shaving, volume reduction, f...

  20. International Multidisciplinary Artificial Gravity (IMAG) Project

    NASA Technical Reports Server (NTRS)

    Laurini, Kathy

    2007-01-01

    This viewgraph presentation reviews the efforts of the International Multidisciplinary Artificial Gravity Project. Specifically it reviews the NASA Exploration Planning Status, NASA Exploration Roadmap, Status of Planning for the Moon, Mars Planning, Reference health maintenance scenario, and The Human Research Program.

  1. Games for All Seasons.

    ERIC Educational Resources Information Center

    Jaques, David

    1981-01-01

    Argues that games with a simple communication structure and/or an abstract content have more virtues than games which introduce too many details into the roles and scenario. Four such "simple" games are described, one in detail, and four references are listed. (LLS)

  2. Joint Concept Development and Experimentation: A Force Development Perspective

    DTIC Science & Technology

    2012-02-01

    Refined scenario set; security environment; existing scenarios; KLE. Figure 7: Hierarchy of force... CORA TM 2012-036. References: [1] Palla, G., Barabasi, A.L., and Vicsek, T. Quantifying Social Group Evolution. Nature, Vol. 446, pp... Joint force development activities (capability-based planning, concept development and experimentation) are not well...

  3. Nonimaging optical illumination system

    DOEpatents

    Winston, Roland; Ries, Harald

    2000-01-01

    A nonimaging illumination optical device for producing a selected far field illuminance over an angular range. The optical device includes a light source 102, a light reflecting surface 108, and a family of light edge rays defined along a reference line 104 with the reflecting surface 108 defined in terms of the reference line 104 as a parametric function R(t) where t is a scalar parameter position and R(t)=k(t)+Du(t) where k(t) is a parameterization of the reference line 104, and D is a distance from a point on the reference line 104 to the reflection surface 108 along the desired edge ray through the point.
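    The patent's parametric form R(t) = k(t) + D·u(t) can be evaluated numerically: k(t) traces the reference line, u(t) is the unit direction of the desired edge ray through k(t), and D is the distance to the reflector along that ray. The specific k, u, and D below are illustrative choices, not the patent's design.

```python
import math

def reflector_point(t, D):
    """One point of R(t) = k(t) + D*u(t) for an assumed k, u, and D."""
    k = (t, 0.0)                      # reference line: the x-axis, parameterized by t
    ang = math.radians(60.0)          # assumed fixed edge-ray direction
    u = (math.cos(ang), math.sin(ang))
    return (k[0] + D * u[0], k[1] + D * u[1])

# Sample a short reflector profile along the reference line:
profile = [reflector_point(t, D=2.0) for t in (0.0, 0.5, 1.0)]
```

    In the actual device, D and the edge-ray direction would vary with t so that the reflected edge rays produce the selected far-field illuminance; this sketch only shows how the parameterization generates the surface.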

  4. Nonimaging optical illumination system

    DOEpatents

    Winston, Roland; Ries, Harald

    1998-01-01

    A nonimaging illumination optical device for producing a selected far field illuminance over an angular range. The optical device includes a light source 102, a light reflecting surface 108, and a family of light edge rays defined along a reference line 104 with the reflecting surface 108 defined in terms of the reference line 104 as a parametric function R(t) where t is a scalar parameter position and R(t)=k(t)+Du(t) where k(t) is a parameterization of the reference line 104, and D is a distance from a point on the reference line 104 to the reflection surface 108 along the desired edge ray through the point.

  5. Nonimaging optical illumination system

    DOEpatents

    Winston, Roland; Ries, Harald

    1996-01-01

    A nonimaging illumination optical device for producing a selected far field illuminance over an angular range. The optical device includes a light source 102, a light reflecting surface 108, and a family of light edge rays defined along a reference line 104 with the reflecting surface 108 defined in terms of the reference line 104 as a parametric function R(t) where t is a scalar parameter position and R(t)=k(t)+Du(t) where k(t) is a parameterization of the reference line 104, and D is a distance from a point on the reference line 104 to the reflection surface 108 along the desired edge ray through the point.

  6. Constellation Architecture Team-Lunar Scenario 12.0 Habitation Overview

    NASA Technical Reports Server (NTRS)

    Kennedy, Kriss J.; Toups, Larry D.; Rudisill, Marianne

    2010-01-01

    This paper will describe an overview of the Constellation Architecture Team Lunar Scenario 12.0 (LS-12) surface habitation approach and concept performed during the study definition. The Lunar Scenario 12 architecture study focused on two primary habitation approaches: a horizontally-oriented habitation module (LS-12.0) and a vertically-oriented habitation module (LS-12.1). This paper will provide an overview of the 12.0 lunar surface campaign, the associated outpost architecture, habitation functionality, concept description, system integration strategy, mass and power resource estimates. The Scenario 12 architecture resulted from combining three previous scenario attributes from Scenario 4 "Optimized Exploration", Scenario 5 "Fission Surface Power System" and Scenario 8 "Initial Extensive Mobility" into Scenario 12 along with an added emphasis on defining the excursion ConOps while the crew is away from the outpost location. This paper will describe an overview of the CxAT-Lunar Scenario 12.0 habitation concepts and their functionality. The Crew Operations area includes basic crew accommodations such as sleeping, eating, hygiene and stowage. The EVA Operations area includes additional EVA capability beyond the suitlock function such as suit maintenance, spares stowage, and suit stowage. The Logistics Operations area includes the enhanced accommodations for 180 days such as enhanced life support systems hardware, consumable stowage, spares stowage, interconnection to the other habitation elements, a common interface mechanism for future growth, and mating to a pressurized rover or Pressurized Logistics Module (PLM). The Mission & Science Operations area includes enhanced outpost autonomy such as an IVA glove box, life support, medical operations, and exercise equipment.

  7. Recording multiple spatially-heterodyned direct to digital holograms in one digital image

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN

    2008-03-25

    Systems and methods are described for recording multiple spatially-heterodyned direct to digital holograms in one digital image. A method includes digitally recording, at a first reference beam-object beam angle, a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram to sit on top of a first spatial-heterodyne carrier frequency defined by the first reference beam-object beam angle; digitally recording, at a second reference beam-object beam angle, a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram to sit on top of a second spatial-heterodyne carrier frequency defined by the second reference beam-object beam angle; applying a first digital filter to cut off signals around the first original origin and define a first result; performing a first inverse Fourier transform on the first result; applying a second digital filter to cut off signals around the second original origin and define a second result; and performing a second inverse Fourier transform on the second result, wherein the first reference beam-object beam angle is not equal to the second reference beam-object beam angle and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
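    The per-hologram processing chain described above (Fourier transform, shift the origin onto the spatial-heterodyne carrier, low-pass filter around the new origin, inverse transform) can be sketched compactly with NumPy. The carrier position and filter radius are illustrative parameters, not values from the patent.

```python
import numpy as np

def demodulate(hologram, carrier_px, radius):
    """Recover one complex image from a spatially-heterodyned hologram."""
    spectrum = np.fft.fftshift(np.fft.fft2(hologram))
    # Shift so the carrier frequency (set by the reference-beam/object-beam
    # angle) sits at the spectrum's center.
    spectrum = np.roll(spectrum, (-carrier_px[0], -carrier_px[1]), axis=(0, 1))
    ny, nx = spectrum.shape
    yy, xx = np.ogrid[:ny, :nx]
    # Digital filter: keep only signals near the (shifted) origin.
    mask = (yy - ny // 2) ** 2 + (xx - nx // 2) ** 2 <= radius ** 2
    return np.fft.ifft2(np.fft.ifftshift(spectrum * mask))

# For two holograms multiplexed in one digital image, call demodulate twice
# with the two different carrier offsets.
```

    Because the two reference-beam/object-beam angles differ, the two carriers land at separate spectral positions, which is what lets a single digital image hold both holograms.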

  8. Molecular disorder and translation/rotation coupling in the plastic crystal phase of hybrid perovskites.

    PubMed

    Even, J; Carignano, M; Katan, C

    2016-03-28

    The complexity of hybrid organic perovskites calls for an innovative theoretical view that combines usually disconnected concepts in order to achieve a comprehensive picture: (i) the intended applications of this class of materials are currently in the realm of conventional semiconductors, which reveal the key desired properties for the design of efficient devices. (ii) The reorientational dynamics of the organic component resembles that observed in plastic crystals, therefore requiring a stochastic treatment that can be done in terms of pseudospins and rotator functions. (iii) The overall structural similarity with all inorganic perovskites suggests the use of the high temperature pseudo cubic phase as the reference platform on which further refinements can be built. In this paper we combine the existing knowledge on these three fields to define a general scenario based on which we can continue the quest towards a fundamental understanding of hybrid organic perovskites. With the introduction of group theory as the main tool to rationalize the different ideas and with the help of molecular dynamics simulations, several experimentally observed properties are naturally explained with possible suggestions for future work.

  9. Molecular disorder and translation/rotation coupling in the plastic crystal phase of hybrid perovskites

    NASA Astrophysics Data System (ADS)

    Even, J.; Carignano, M.; Katan, C.

    2016-03-01

    The complexity of hybrid organic perovskites calls for an innovative theoretical view that combines usually disconnected concepts in order to achieve a comprehensive picture: (i) the intended applications of this class of materials are currently in the realm of conventional semiconductors, which reveal the key desired properties for the design of efficient devices. (ii) The reorientational dynamics of the organic component resembles that observed in plastic crystals, therefore requiring a stochastic treatment that can be done in terms of pseudospins and rotator functions. (iii) The overall structural similarity with all inorganic perovskites suggests the use of the high temperature pseudo cubic phase as the reference platform on which further refinements can be built. In this paper we combine the existing knowledge on these three fields to define a general scenario based on which we can continue the quest towards a fundamental understanding of hybrid organic perovskites. With the introduction of group theory as the main tool to rationalize the different ideas and with the help of molecular dynamics simulations, several experimentally observed properties are naturally explained with possible suggestions for future work.

  10. Visual search for conjunctions of physical and numerical size shows that they are processed independently.

    PubMed

    Sobel, Kenith V; Puri, Amrita M; Faulkenberry, Thomas J; Dague, Taylor D

    2017-03-01

    The size congruity effect refers to the interaction between numerical magnitude and physical digit size in a symbolic comparison task. Though this effect is well established in the typical 2-item scenario, the mechanisms at the root of the interference remain unclear. Two competing explanations have emerged in the literature: an early interaction model and a late interaction model. In the present study, we used visual conjunction search to test competing predictions from these 2 models. Participants searched for targets that were defined by a conjunction of physical and numerical size. Some distractors shared the target's physical size, and the remaining distractors shared the target's numerical size. We held the total number of search items fixed and manipulated the ratio of the 2 distractor set sizes. The results from 3 experiments converge on the conclusion that numerical magnitude is not a guiding feature for visual search, and that physical and numerical magnitude are processed independently, which supports a late interaction model of the size congruity effect. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Efroymson, Rebecca Ann; Dale, Virginia H; Kline, Keith L

    Indicators of the environmental sustainability of biofuel production, distribution, and use should be selected, measured, and interpreted with respect to the context in which they are used. These indicators include measures of soil quality, water quality and quantity, greenhouse-gas emissions, biodiversity, air quality, and vegetation productivity. Contextual considerations include the purpose for the sustainability analysis, the particular biofuel production and distribution system (including supply chain, management aspects, and system viability), policy conditions, stakeholder values, location, temporal influences, spatial scale, baselines, and reference scenarios. Recommendations presented in this paper include formulating the problem for particular analyses, selecting appropriate context-specific indicators of environmental sustainability, and developing indicators that can reflect multiple environmental properties at low cost within a defined context. In addition, contextual considerations such as technical objectives, varying values and perspectives of stakeholder groups, and availability and reliability of data need to be understood and considered. Sustainability indicators for biofuels are most useful if adequate historical data are available, information can be collected at appropriate spatial and temporal scales, organizations are committed to use indicator information in the decision-making process, and indicators can effectively guide behavior toward more sustainable practices.

  12. DREAMING OF ATMOSPHERES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waldmann, I. P., E-mail: ingo@star.ucl.ac.uk

    Here, we introduce the RobERt (Robotic Exoplanet Recognition) algorithm for the classification of exoplanetary emission spectra. Spectral retrieval of exoplanetary atmospheres frequently requires the preselection of molecular/atomic opacities to be defined by the user. In the era of open-source, automated, and self-sufficient retrieval algorithms, manual input should be avoided. User dependent input could, in worst-case scenarios, lead to incomplete models and biases in the retrieval. The RobERt algorithm is based on deep-belief neural (DBN) networks trained to accurately recognize molecular signatures for a wide range of planets, atmospheric thermal profiles, and compositions. Reconstructions of the learned features, also referred to as the “dreams” of the network, indicate good convergence and an accurate representation of molecular features in the DBN. Using these deep neural networks, we work toward retrieval algorithms that themselves understand the nature of the observed spectra, are able to learn from current and past data, and make sensible qualitative preselections of atmospheric opacities to be used for the quantitative stage of the retrieval process.

  13. The Potential of Different Concepts of Fast Breeder Reactor for the French Fleet Renewal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Massara, Simone; Tetart, Philippe; Lecarpentier, David

    2006-07-01

    The performances of different concepts of Fast Breeder Reactor (Na-cooled, He-cooled and Pb-cooled FBR) for the current French fleet renewal are analyzed in the framework of a transition scenario to a 100% FBR fleet at the end of the 21st century. Firstly, the modeling of these three FBR types by means of a semi-analytical approach in TIRELIRE - STRATEGIE, the EDF fuel cycle simulation code, is presented, together with some validation elements against ERANOS, the French reference code system for neutronic FBR analysis (CEA). Afterwards, performance comparisons are made in terms of maximum deployable power, natural uranium consumption and waste production. The results show that the FBR maximum deployable capacity, independently of the FBR technology, is highly sensitive to fuel cycle options, such as the spent nuclear fuel cooling time or the minor actinides management strategy. Thus, some of the key parameters defining the dynamics of FBR deployment are highlighted, to inform the orientation of R and D in the development and optimization of these systems. (authors)

  14. Preliminary study of the space adaptation of the MELiSSA life support system

    NASA Astrophysics Data System (ADS)

    Mas-Albaigès, Joan L.; Duatis, Jordi; Podhajsky, Sandra; Guirado, Víctor; Poughon, Laurent

    MELiSSA (Micro-Ecological Life Support System Alternative) is a European Space Agency (ESA) project focused on the development of a closed regenerative life support system, intended to mature technologies for future life support systems for long-term manned planetary missions, e.g. a lunar base or missions to Mars. To understand the potential evolution of the MELiSSA concept towards its use in such manned planetary missions, the MELiSSA Space Adaptation (MSA) activity has been undertaken. MSA's main objective is to model the different MELiSSA compartments using EcosimPro®, a specialized simulation tool for life support applications, in order to define a preliminary MELiSSA implementation for service in a man-tended lunar base scenario, with a four-member crew rotating in six-month increments, performing the basic LSS functions of air revitalization, food production, and waste and water recycling. The MELiSSA EcosimPro® model features a dedicated library for the different MELiSSA elements (bioreactors, greenhouse, crew, interconnecting elements, etc.). It is used to dimension the MELiSSA system in terms of major parameters such as mass, volume and energy needs, to evaluate the accuracy of the results, and to define the strategy for a progressive loop closure from the initial required performance (approx. 100…). The MELiSSA configuration(s) obtained through the EcosimPro® simulation are further analysed using the Advanced Life Support System Evaluation (ALISSE) metric, relying on mass, energy, efficiency, human risk, system reliability and crew time, for trade-off and optimization of results. The outcome of the MSA activity is thus a potential Life Support System architecture description, based on combining MELiSSA with other physico-chemical technologies, defining its expected performance, associated operational conditions and logistic needs.

  15. Increased Tidal Dissipation Using Advanced Rheological Models: Implications for Io and Tidally Active Exoplanets

    NASA Astrophysics Data System (ADS)

    Renaud, Joe P.; Henning, Wade G.

    2018-04-01

    The advanced rheological models of Andrade and Sundberg & Cooper are compared to the traditional Maxwell model to understand how each affects the tidal dissipation of heat within rocky bodies. We find both Andrade and Sundberg–Cooper rheologies can produce at least 10× the tidal heating compared to a traditional Maxwell model for a warm (1400–1600 K) Io-like satellite. Sundberg–Cooper can cause even larger dissipation around a critical temperature and frequency. These models allow cooler planets to stay tidally active in the face of orbital perturbations—a condition we term “tidal resilience.” This has implications for the time evolution of tidally active worlds and the long-term equilibria they fall into. For instance, if Io’s interior is better modeled by the Andrade or Sundberg–Cooper rheologies, the number of possible resonance-forming scenarios that still produce a hot, modern Io is expanded, and these scenarios do not require an early formation of the Laplace resonance. The two primary empirical parameters that define the Andrade anelasticity are examined in several phase spaces to provide guidance on how their uncertainties impact tidal outcomes, as laboratory studies continue to constrain their real values. We provide detailed reference tables on the fully general equations required for others to insert the models of Andrade and Sundberg–Cooper into standard tidal formulae. Lastly, we show that advanced rheologies can greatly impact the heating of short-period exoplanets and exomoons, while the properties of tidal resilience could mean a greater number of tidally active worlds among all extrasolar systems.
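    The Maxwell-versus-Andrade contrast above comes down to the complex compliance J(ω), whose negative imaginary part controls tidal heating. A minimal numerical sketch follows; the material values and Andrade parameters are round illustrative numbers, not the paper's fitted values.

```python
import math

def maxwell_J(w, mu, eta):
    """Complex compliance of a Maxwell element at angular frequency w."""
    return 1.0 / mu - 1j / (eta * w)

def andrade_J(w, mu, eta, alpha=0.3, beta=1e-12):
    """Maxwell compliance plus the Andrade transient-creep term
    beta * (i*w)**(-alpha) * Gamma(1 + alpha)."""
    return maxwell_J(w, mu, eta) + beta * (1j * w) ** (-alpha) * math.gamma(1.0 + alpha)

w = 2.0 * math.pi / (1.77 * 86400.0)  # Io's orbital angular frequency, rad/s
mu, eta = 60e9, 1e18                  # shear modulus (Pa), viscosity (Pa s): round values
# Dissipation scales with -Im(J); the Andrade term boosts it at this frequency.
boost = andrade_J(w, mu, eta).imag / maxwell_J(w, mu, eta).imag
print(boost)
```

    For a stiff, high-viscosity mantle the Maxwell term -1/(η·ω) is tiny at orbital frequencies, so the anelastic Andrade term dominates the dissipation, which is the mechanism behind the enhanced heating and "tidal resilience" described above.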

  16. System-level Analysis of Food Moisture Content Requirements for the Mars Dual Lander Transit Mission

    NASA Technical Reports Server (NTRS)

    Levri, Julie A.; Perchonok, Michele H.

    2004-01-01

    In order to ensure that adequate water resources are available during a mission, any net water loss from the habitat must be balanced with an equivalent amount of required makeup water. Makeup water may come from a variety of sources, including water in shipped tanks, water stored in prepackaged food, product water from fuel cells, and in-situ water resources. This paper specifically addresses the issue of storing required makeup water in prepackaged food versus storing the water in shipped tanks for the Mars Dual Lander Transit Mission, one of the Advanced Life Support Reference Missions. In this paper, water mass balances have been performed for the Dual Lander Transit Mission, to determine the necessary requirement of makeup water under nominal operation (i.e. no consideration of contingency needs), on a daily basis. Contingency issues are briefly discussed with respect to impacts on makeup water storage (shipped tanks versus storage in prepackaged food). The Dual Lander Transit Mission was selected for study because it has been considered by the Johnson Space Center Exploration Office in enough detail to define a reasonable set of scenario options for nominal system operation and contingencies. This study also illustrates the concept that there are multiple, reasonable life support system scenarios for any one particular mission. Thus, the need for a particular commodity can depend upon many variables in the system. In this study, we examine the need for makeup water as it depends upon the configuration of the rest of the life support system.
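    The nominal daily balance described above can be sketched as a toy calculation: makeup water is whatever daily losses exceed daily recovered water, with water stored in prepackaged food credited against the shipped-tank requirement. The per-crew-member quantities are illustrative, not the reference mission's numbers.

```python
def daily_makeup(crew, loss_per_cm, recovered_per_cm, food_water_per_cm):
    """Net makeup water (kg/day) to ship in tanks, after crediting
    water stored in prepackaged food. Inputs are per crew member per day."""
    net_loss = crew * (loss_per_cm - recovered_per_cm)
    return max(0.0, net_loss - crew * food_water_per_cm)

# 6 crew; 3.5 kg lost, 3.2 kg recovered, 0.2 kg stored in food, per person-day:
print(daily_makeup(6, 3.5, 3.2, 0.2))
```

    Contingency planning would add margins on top of this nominal figure, which is where the choice between tankage and food-borne water storage becomes a trade.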

  17. Computational Aerothermodynamic Assessment of Space Shuttle Orbiter Tile Damage: Open Cavities

    NASA Technical Reports Server (NTRS)

    Pulsonetti, Maria; Wood, William

    2005-01-01

    Computational aerothermodynamic simulations of Orbiter windside tile damage in flight were performed in support of the Space Shuttle Return-to-Flight effort. The simulations were performed for both hypervelocity flight and low-enthalpy wind tunnel conditions and contributed to the Return-to-Flight program by providing information to support a variety of damage scenario analyses. Computations at flight conditions were performed at or very near the peak heating trajectory point for multiple damage scenarios involving damaged windside acreage reaction-cured glass (RCG) coated silica tiles. The cavities formed by the missing tiles examined in this study were relatively short, leading to flow features indicative of open-cavity behavior. The computations predicted elevated heating bump factors for flight relative to the predictions for wind tunnel conditions. The peak heating bump factors, defined as the ratio of local heating to a reference value upstream of the cavity, on the cavity floor were 67% larger for the flight simulation than the peak wind tunnel simulation value. On the downstream face of the cavity, the flight simulation values were 60% larger than the wind tunnel simulation values. On the outer mold line (OML) downstream of the cavity, the flight values were about 20% larger than the wind tunnel simulation values. The higher heating bump factors observed in the flight simulations were due to the larger driving potential, in terms of energy entering the cavity, for the flight simulations. This is evidenced by the larger rate of increase in total enthalpy through the boundary layer ahead of the cavity for the flight simulation.
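    The bump factor quoted above is a simple normalization; a minimal sketch (the heating values below are illustrative, not data from the study):

```python
def bump_factor(q_local, q_ref):
    """Heating bump factor: local heat flux normalized by the
    undisturbed reference value upstream of the cavity."""
    return q_local / q_ref

# Illustrative numbers only (assumed, not from the simulations):
bf_tunnel = bump_factor(q_local=1.5, q_ref=1.0)
bf_flight = bump_factor(q_local=2.505, q_ref=1.0)
percent_increase = 100 * (bf_flight - bf_tunnel) / bf_tunnel
print(f"flight bump factor exceeds the tunnel value by {percent_increase:.0f}%")
```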

  18. An Adjoint Force-restore Model for Glacier Terminus Fluctuations

    NASA Astrophysics Data System (ADS)

    Ren, D.; Leslie, L.; Karoly, D.

    2006-12-01

    A linear inverse formula forms the basis for an individual treatment of 7 central Asian (25-55°N; 70-95°E) glaciers. The linear forward model is based on first-order glacier dynamics and requires knowledge of the reference states of forcing and glacier perturbation magnitude. In this study, the adjoint-based 4D-var method was applied to optimally determine the reference states, making it possible to start the integration at an arbitrarily chosen time and thus to take advantage of coupled general circulation model (CGCM) predictions of future temperature scenarios. Two sensitive yet uncertain glacier parameters and the reference states at year 1900 are inferred from observed glacier length records distributed irregularly over the 20th century and from the regional mean annual temperature anomaly (against the 1961-1990 reference) time series. We rotated the temperature forcing among the Hadley Centre-Climatic Research Unit of the University of East Anglia (HadCRUT2) data, the Global Historical Climatology Network (GHCN) observations, and the ensemble mean of multiple CGCM runs, and compared the retrieval results. Because of the close agreement among the three data sources after 1960, it was deemed practicable to use the observed temperature as forcing in retrieving the model parameters and initial states and then to run an extended period with forcing from the ensemble-mean CGCM temperature of the next century. The length fluctuation is estimated for the transient climate period with 9 CGCM simulations under SRES A2 (a strong emission scenario from the Special Report on Emissions Scenarios). For the 60-year period 2000-2060, all glaciers experience salient shrinkage, especially those with gentle slopes. Although some small glaciers will lose nearly one-third of their year-2000 length, the very existence of the glaciers studied here is not threatened by year 2060. The differences in individual glacier responses are very large; no straightforward relationship is found between glacier size and the fractional change of glacier length.
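    The kind of first-order forward model described here can be sketched as a force-restore relaxation of the length anomaly toward a temperature-driven equilibrium; the parameter values, time step, and forcing below are illustrative assumptions, not the authors' adjoint formulation.

```python
def simulate_length(L0, temps, tau, sens, dt=1.0):
    """First-order (force-restore) glacier length anomaly model,
    dL/dt = -(L - L_eq)/tau with L_eq = -sens*T, forward-Euler integrated.
    A warming anomaly T > 0 pulls the equilibrium length down."""
    L, out = L0, []
    for T in temps:
        L += dt * (-(L - (-sens * T)) / tau)
        out.append(L)
    return out

# Illustrative forcing: 1 K of steady warming, 50-year relaxation time.
temps = [1.0] * 200   # annual temperature anomalies (K)
lengths = simulate_length(L0=0.0, temps=temps, tau=50.0, sens=2.0)

# The response relaxes toward the new equilibrium length anomaly -sens*T = -2.0.
print(round(lengths[-1], 2))
```

    In the paper's inverse setting, the uncertain parameters (here tau and sens) and the initial state are the quantities retrieved by fitting such a forward model to the irregular length records.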

  19. Communications platform payload definition study, executive summary

    NASA Technical Reports Server (NTRS)

    Clopp, H. W.; Hawkes, T. A.; Bertles, C. R.; Pontano, B. A.; Kao, T.

    1986-01-01

    Large geostationary communications platforms have been investigated in a number of studies since 1974 as a possible means to more effectively utilize the geostationary orbital arc and electromagnetic spectrum and to reduce overall satellite communications system costs. This NASA Lewis sponsored study addresses the commercial feasibility of various communications platform payload concepts circa 1998. It defines promising payload concepts, estimates recurring costs and identifies critical technologies needed to permit eventual commercialization. Ten communications service aggregation scenarios describing potential groupings of services were developed for a range of conditions. Payload concepts were defined for four of these scenarios: (1) Land Mobile Satellite Service (LMSS), meet 100% of CONUS plus Canada demand with a single platform; (2) Fixed Satellite Service (FSS) (Trunking + Customer Premises Service (CPS), meet 20% of CONUS demands; (3) FSS (Trunking + video distribution), 10 to 13% of CONUS demand; and (4) FSS (20% of demand) + Inter Satellite Links (ISL) + TDRSS/TDAS Data Distribution.

  20. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    NASA Astrophysics Data System (ADS)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk from earthquakes is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of megacities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, undertaken with insufficient knowledge of the regional seismicity and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios are the first link in the prevention chain and the first step in the evaluation of seismic risk. Earthquake scenarios are intended as basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, which faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. 
    In the present study, a damaging earthquake with a higher probability of occurrence that could affect the city with intensity less than or equal to VIII is considered as the deterministic scenario event. Usable and realistic ground motion maps for urban areas are generated either from the assumption of a "reference earthquake" or directly, showing values of macroseismic intensity generated by a damaging, real earthquake. In the study, applying the deterministic approach, an earthquake scenario in macroseismic intensity (a "model" earthquake scenario) for the city of Sofia is generated. The deterministic "model" intensity scenario, based on the assumption of a "reference earthquake", is compared with a scenario based on the observed macroseismic effects caused by the damaging 2012 earthquake (MW5.6), and the difference between observed (Io) and predicted (Ip) intensity values is analyzed.
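    A "reference earthquake" intensity scenario of this kind is typically built from a macroseismic attenuation relation. The sketch below uses a generic Kövesligethy-type form with purely illustrative coefficients (a, h), not the values calibrated for the Sofia zone.

```python
import math

def predicted_intensity(I0, r_km, h_km=10.0, a=3.0):
    """Generic macroseismic attenuation: epicentral intensity I0 decays
    with hypocentral distance relative to focal depth h. The coefficients
    here are illustrative, not calibrated values."""
    R = math.sqrt(r_km**2 + h_km**2)     # hypocentral distance
    return I0 - a * math.log10(R / h_km)

# A damaging reference event with I0 = VIII, evaluated at increasing distances:
for r in (0, 10, 30):
    print(r, round(predicted_intensity(8.0, r), 1))
```

    Comparing such predicted values Ip against the intensities Io actually observed in 2012 is the kind of residual analysis the study describes.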

  1. Current Practices of Measuring and Reference Range Reporting of Free and Total Testosterone in the United States.

    PubMed

    Le, Margaret; Flores, David; May, Danica; Gourley, Eric; Nangia, Ajay K

    2016-05-01

    The evaluation and management of male hypogonadism should be based on symptoms and on serum testosterone levels. Diagnostically this relies on accurate testing and reference values. Our objective was to define the distribution of reference values and assays for free and total testosterone used by clinical laboratories in the United States. Upper and lower reference values, assay methodology and the source of published reference ranges were obtained from laboratories across the country. A standardized survey was reviewed with laboratory staff via telephone. Descriptive statistics were used to tabulate results. We surveyed a total of 120 laboratories in 47 states. Total testosterone was measured in-house at 73% of laboratories. At the remaining laboratories studies were sent to larger centralized reference facilities. The mean ± SD lower reference value of total testosterone was 231 ± 46 ng/dl (range 160 to 300) and the mean upper limit was 850 ± 141 ng/dl (range 726 to 1,130). Only 9% of laboratories performing in-house total testosterone testing created a reference range unique to their region. The others validated the instrument-recommended reference values on a small number of internal test samples. For free testosterone, 82% of laboratories sent testing to larger centralized reference laboratories where equilibrium dialysis and/or liquid chromatography with mass spectrometry was done. The remaining laboratories used published algorithms to calculate serum free testosterone. Reference ranges for testosterone assays vary significantly among laboratories. The ranges are predominantly defined by limited population studies of men with unknown medical and reproductive histories. These poorly defined and variable reference values, especially the lower limit, affect how clinicians determine treatment. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  2. Developing ecological scenarios for the prospective aquatic risk assessment of pesticides.

    PubMed

    Rico, Andreu; Van den Brink, Paul J; Gylstra, Ronald; Focks, Andreas; Brock, Theo Cm

    2016-07-01

    The prospective aquatic environmental risk assessment (ERA) of pesticides is generally based on the comparison of predicted environmental concentrations in edge-of-field surface waters with regulatory acceptable concentrations derived from laboratory and/or model ecosystem experiments with aquatic organisms. New improvements in mechanistic effect modeling have allowed a better characterization of the ecological risks of pesticides through the incorporation of biological trait information and landscape parameters to assess individual, population and/or community-level effects and recovery. Similarly to exposure models, ecological models require scenarios that describe the environmental context in which they are applied. In this article, we propose a conceptual framework for the development of ecological scenarios that, when merged with exposure scenarios, will constitute environmental scenarios for prospective aquatic ERA. These "unified" environmental scenarios are defined as the combination of the biotic and abiotic parameters that are required to characterize exposure, (direct and indirect) effects, and recovery of aquatic nontarget species under realistic worst-case conditions. Ideally, environmental scenarios aim to avoid a potential mismatch between the parameter values and the spatial-temporal scales currently used in aquatic exposure and effect modeling. This requires a deeper understanding of the ecological entities we intend to protect, which can be preliminarily addressed by the formulation of ecological scenarios. In this article we present a methodological approach for the development of ecological scenarios and illustrate this approach by a case-study for Dutch agricultural ditches and the example focal species Sialis lutaria. Finally, we discuss the applicability of ecological scenarios in ERA and propose research needs and recommendations for their development and integration with exposure scenarios. Integr Environ Assess Manag 2016;12:510-521. 
© 2015 SETAC.

  3. Application of automation and robotics to lunar surface human exploration operations

    NASA Technical Reports Server (NTRS)

    Woodcock, Gordon R.; Sherwood, Brent; Buddington, Patricia A.; Bares, Leona C.; Folsom, Rolfe; Mah, Robert; Lousma, Jack

    1990-01-01

    Major results of a study applying automation and robotics to lunar surface base buildup and operations concepts are reported. The study developed a reference base scenario with specific goals, equipment concepts, robot concepts, activity schedules and buildup manifests. It examined crew roles, contingency cases and system reliability, and proposed a set of technologies appropriate and necessary for effective lunar operations. This paper refers readers to four companion papers for quantitative details where appropriate.

  4. Scenario-Based Case Study Analysis of Asteroid Mitigation in the Short Response Time Regime

    NASA Astrophysics Data System (ADS)

    Seery, B.; Greenaugh, K. C.

    2017-12-01

    Asteroid impact on Earth is a rare but inevitable occurrence, with potentially cataclysmic consequences. If a pending impact is discovered, mitigation options include civil-defense preparations as well as missions to deflect the asteroid and/or robustly disrupt and disperse it to an extent that only a negligible fraction remains on a threatening path (National Research Council's "Defending Planet Earth," 2010). If discovered with sufficient warning time, a kinetic impactor can deflect smaller objects, but response delays can rule out the option. If a body is too large to deflect by kinetic impactor, or the time for response is insufficient, deflection or disruption can be achieved with a nuclear device. The use of nuclear ablation is considered within the context of current capabilities, requiring no need for nuclear testing. Existing, well-understood devices are sufficient for the largest known Potentially Hazardous Objects (PHOs). The National Aeronautics and Space Administration/Goddard Space Flight Center and the Department of Energy/National Nuclear Security Administration are collaborating to determine the critical characterization issues that define the boundaries for the asteroid-deflection options. Drawing from such work, we examine the timeline for a deflection mission, and how to provide the best opportunity for an impactor to suffice by minimizing the response time. This integrated problem considers the physical process of the deflection method (impact or ablation), along with the spacecraft, launch capability, risk analysis, and the available intercept flight trajectories. Our joint DOE/NASA team has conducted case study analyses of three distinctly different PHOs on a hypothetical Earth-impacting trajectory. The size of the design reference bodies ranges from 100 to 500 meters in diameter, with varying physical parameters such as composition, spin state, and metallicity, to name a few. We assemble the design reference of the small body in question using known values for key parameters and expert elicitation to make educated guesses on the unknown parameters, including an estimate of the overall uncertainties in those values. Our scenario-based systems approach includes 2-D and 3-D physics-based modeling and simulations.

  5. Cost-effectiveness analysis of timely dialysis referral after renal transplant failure in Spain.

    PubMed

    Villa, Guillermo; Sánchez-Álvarez, Emilio; Cuervo, Jesús; Fernández-Ortiz, Lucía; Rebollo, Pablo; Ortega, Francisco

    2012-08-16

    A cost-effectiveness analysis of timely dialysis referral after renal transplant failure was undertaken from the perspective of the Public Administration. The current Spanish situation, where all the patients undergoing graft function loss are referred back to dialysis in a late manner, was compared to an ideal scenario where all the patients are referred in a timely manner. A Markov model was developed in which six health states were defined: hemodialysis, peritoneal dialysis, kidney transplantation, late-referral hemodialysis, late-referral peritoneal dialysis and death. The model carried out a simulation of the progression of renal disease for a hypothetical cohort of 1,000 patients aged 40, who were observed over a lifetime temporal horizon of 45 years. In-depth sensitivity analyses were performed in order to ensure the robustness of the results obtained. Considering a discount rate of 3%, timely referral showed an incremental cost of 211 €, compared to late referral. This cost increase was however a consequence of the incremental survival observed. The incremental effectiveness was 0.0087 quality-adjusted life years (QALY). When comparing both scenarios, an incremental cost-effectiveness ratio of 24,390 €/QALY was obtained, meaning that timely dialysis referral might be an efficient alternative if a willingness-to-pay threshold of 45,000 €/QALY is considered. This result proved to be independent of the proportion of late-referral patients observed. The acceptance probability of timely referral was 61.90%, while late referral was acceptable in 38.10% of the simulations. If we restrict the analysis to those situations not involving any loss of effectiveness, however, the acceptance probability of timely referral was 70.10%, more than twice that of late referral (29.90%). Timely dialysis referral after graft function loss might be an efficient alternative in Spain, improving both patients' survival rates and health-related quality of life at an affordable cost. 
    Spanish Public Health authorities might therefore promote the inclusion of specific recommendations for this group of patients within the existing clinical guidelines.
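    The incremental cost-effectiveness ratio reported above is simply the incremental cost divided by the incremental effectiveness. A minimal sketch using the rounded figures quoted in the abstract (small rounding differences from the published 24,390 €/QALY are expected):

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio, in cost units per QALY gained."""
    return delta_cost / delta_qaly

# Rounded figures from the abstract: +211 EUR, +0.0087 QALY.
ratio = icer(delta_cost=211.0, delta_qaly=0.0087)
threshold = 45_000  # willingness-to-pay, EUR/QALY
print(f"ICER = {ratio:,.0f} EUR/QALY; cost-effective: {ratio <= threshold}")
```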

  6. Socioeconomic Drought in a Changing Climate: Modeling and Management

    NASA Astrophysics Data System (ADS)

    AghaKouchak, Amir; Mehran, Ali; Mazdiyasni, Omid

    2016-04-01

    Drought is typically defined based on meteorological, hydrological and land surface conditions. However, in many parts of the world, anthropogenic changes and water management practices have significantly altered local water availability. Socioeconomic drought refers to conditions whereby the available water supply cannot satisfy the human and environmental water needs. Surface water reservoirs provide resilience against local climate variability (e.g., droughts), and play a major role in regional water management. This presentation focuses on a framework for describing socioeconomic drought based on both water supply and demand information. We present a multivariate approach as a measure of socioeconomic drought, termed the Multivariate Standardized Reliability and Resilience Index (MSRRI; Mehran et al., 2015). This model links the information on inflow and surface reservoir storage to water demand. MSRRI integrates a "top-down" and a "bottom-up" approach for describing socioeconomic drought. The "top-down" component describes processes that cannot be simply controlled or altered by local decision-makers and managers (e.g., precipitation, climate variability, climate change), whereas the "bottom-up" component focuses on the local resilience and societal capacity to respond to droughts. The two components (termed the Inflow-Demand Reliability (IDR) indicator and the Water Storage Resilience (WSR) indicator) are integrated using a nonparametric multivariate approach. We use this framework to assess the socioeconomic drought during the Australian Millennium Drought (1998-2010) and the 2011-2014 California Drought. MSRRI provides additional information on socioeconomic drought onset, development and termination based on local resilience and human demand that cannot be obtained from the commonly used drought indicators. We show that MSRRI can be used for water management scenario analysis (e.g., local water availability based on different human water demand scenarios). 
Finally, we provide examples of using the proposed modeling framework for analyzing water availability in a changing climate considering local conditions. Reference: Mehran A., Mazdiyasni O., AghaKouchak A., 2015, A Hybrid Framework for Assessing Socioeconomic Drought: Linking Climate Variability, Local Resilience, and Demand, Journal of Geophysical Research, 120 (15), 7520-7533, doi: 10.1002/2015JD023147
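    The nonparametric combination step behind a bivariate standardized index of this kind can be sketched with empirical joint probabilities mapped through the inverse normal CDF. This is a toy illustration of the general approach, not the published MSRRI implementation, and the indicator values are invented.

```python
from statistics import NormalDist

def joint_prob(x, y):
    """Empirical joint nonexceedance probability P(X<=x_i, Y<=y_i),
    with a Gringorten-type plotting-position correction."""
    n = len(x)
    counts = [sum(1 for j in range(n) if x[j] <= x[i] and y[j] <= y[i])
              for i in range(n)]
    return [(c - 0.44) / (n + 0.12) for c in counts]

def standardized_index(x, y):
    """Map joint probabilities through the inverse normal CDF, so that
    strongly negative values flag joint supply/resilience deficits."""
    nd = NormalDist()
    return [nd.inv_cdf(p) for p in joint_prob(x, y)]

# Invented annual values for the two indicators (IDR, WSR):
idr = [0.9, 0.8, 0.4, 0.3, 0.7]
wsr = [0.8, 0.9, 0.5, 0.2, 0.6]
msrri_like = standardized_index(idr, wsr)
print([round(v, 2) for v in msrri_like])
```

    The most negative value falls in the year where both reliability and resilience are jointly lowest, which is how a bivariate index distinguishes compound deficits from a shortfall in a single indicator.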

  7. Validation of a Detailed Scoring Checklist for Use During Advanced Cardiac Life Support Certification

    PubMed Central

    McEvoy, Matthew D.; Smalley, Jeremy C.; Nietert, Paul J.; Field, Larry C.; Furse, Cory M.; Blenko, John W.; Cobb, Benjamin G.; Walters, Jenna L.; Pendarvis, Allen; Dalal, Nishita S.; Schaefer, John J.

    2012-01-01

    Introduction Defining valid, reliable, defensible, and generalizable standards for the evaluation of learner performance is a key issue in assessing both baseline competence and mastery in medical education. However, prior to setting these standards of performance, the reliability of the scores yielding from a grading tool must be assessed. Accordingly, the purpose of this study was to assess the reliability of scores generated from a set of grading checklists used by non-expert raters during simulations of American Heart Association (AHA) MegaCodes. Methods The reliability of scores generated from a detailed set of checklists, when used by four non-expert raters, was tested by grading team leader performance in eight MegaCode scenarios. Videos of the scenarios were reviewed and rated by trained faculty facilitators and by a group of non-expert raters. The videos were reviewed “continuously” and “with pauses.” Two content experts served as the reference standard for grading, and four non-expert raters were used to test the reliability of the checklists. Results Our results demonstrate that non-expert raters are able to produce reliable grades when using the checklists under consideration, demonstrating excellent intra-rater reliability and agreement with a reference standard. The results also demonstrate that non-expert raters can be trained in the proper use of the checklist in a short amount of time, with no discernible learning curve thereafter. Finally, our results show that a single trained rater can achieve reliable scores of team leader performance during AHA MegaCodes when using our checklist in continuous mode, as measures of agreement in total scoring were very strong (Lin’s Concordance Correlation Coefficient = 0.96; Intraclass Correlation Coefficient = 0.97). 
Discussion We have shown that our checklists can yield reliable scores, are appropriate for use by non-expert raters, and are able to be employed during continuous assessment of team leader performance during the review of a simulated MegaCode. This checklist may be more appropriate for use by Advanced Cardiac Life Support (ACLS) instructors during MegaCode assessments than current tools provided by the AHA. PMID:22863996
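    Lin's concordance correlation coefficient quoted above measures agreement with the 45° identity line, penalizing both poor correlation and systematic offset. A self-contained sketch (the rater scores below are illustrative, not the study's data):

```python
def lins_ccc(x, y):
    """Lin's concordance correlation coefficient:
    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2),
    using population (1/n) moments as in Lin (1989)."""
    n = len(x)
    mx, my = sum(x)/n, sum(y)/n
    vx = sum((a - mx)**2 for a in x) / n
    vy = sum((b - my)**2 for b in y) / n
    cov = sum((a - mx)*(b - my) for a, b in zip(x, y)) / n
    return 2*cov / (vx + vy + (mx - my)**2)

expert =  [42, 38, 45, 30, 36, 44, 40, 33]   # reference-standard scores (invented)
trainee = [41, 37, 46, 31, 35, 43, 41, 32]   # non-expert rater scores (invented)
print(round(lins_ccc(expert, trainee), 3))
```

    Perfect agreement gives CCC = 1; a rater who correlates well but is systematically high or low is penalized through the mean-difference term in the denominator.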

  8. [The body as the scenario for social vulnerability regarding health].

    PubMed

    Ortega-Bolaños, Jesús A; Bula-Escobar, Jorge I

    2012-10-01

    Theoretical reflection concerning the human body within a scenario of social healthcare practices recognizes its nature through ontological dimensions for defining our finiteness and facticity in a world where nobody experiences their own birth or death but rather experiences such events through others. Considering health as an overall process called existence, being conscious of one's own body in terms of dynamic states of health and disease, is related to scenarios where the body has played a central role in the existential experience of the culture which it has built and in which it has been living, as well as in the relationship of knowledge with how power is exercised. Relationships between knowledge, power, resistance and the action of bodies are scenarios concerning social vulnerability regarding health. We wish to raise awareness concerning the relationship between bodies and social vulnerability regarding health as an emergent scenario for promoting discussion dealing with claiming the right to health and the chances of it being positively affected by public health policy from a trans-sector response and a community-based response proposed by Colombian society and culture.

  9. Dynamic Optical Networks for Future Internet Environments

    NASA Astrophysics Data System (ADS)

    Matera, Francesco

    2014-05-01

    This article reports an overview of the evolution of the optical network scenario, taking into account the exponential growth of connected devices, big data, and cloud computing that is driving a concrete transformation of the information and communication technology world. This hyper-connected scenario is deeply affecting relationships between individuals, enterprises, citizens, and public administrations, fostering innovative use cases in practically any environment and market, and introducing new opportunities and new challenges. The successful realization of this hyper-connected scenario depends on different elements of the ecosystem. In particular, it builds on the connectivity and functionalities allowed by converged next-generation networks and their capacity to support and integrate with the Internet of Things, machine-to-machine communication, and cloud computing. This article aims to outline this scenario and to help analyze its impact on optical system and network issues and requirements. In particular, the role of the software-defined network is investigated by considering the data center, cloud computing, and machine-to-machine scenarios and illustrating the advantages that could be introduced by advanced optical communications.

  10. Reconciliation of Gene and Species Trees

    PubMed Central

    Rusin, L. Y.; Lyubetskaya, E. V.; Gorbunov, K. Y.; Lyubetsky, V. A.

    2014-01-01

    The first part of the paper briefly overviews the problem of gene and species trees reconciliation with the focus on defining and algorithmic construction of the evolutionary scenario. Basic ideas are discussed for the aspects of mapping definitions, costs of the mapping and evolutionary scenario, imposing time scales on a scenario, incorporating horizontal gene transfers, binarization and reconciliation of polytomous trees, and construction of species trees and scenarios. The review does not intend to cover the vast diversity of literature published on these subjects. Instead, the authors strived to overview the problem of the evolutionary scenario as a central concept in many areas of evolutionary research. The second part provides detailed mathematical proofs for the solutions of two problems: (i) inferring a gene evolution along a species tree accounting for various types of evolutionary events and (ii) trees reconciliation into a single species tree when only gene duplications and losses are allowed. All proposed algorithms have a cubic time complexity and are mathematically proved to find exact solutions. Solving algorithms for problem (ii) can be naturally extended to incorporate horizontal transfers, other evolutionary events, and time scales on the species tree. PMID:24800245
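    The duplication-inference step of gene/species tree reconciliation rests on the classical LCA (last-common-ancestor) mapping. The toy sketch below represents trees as nested tuples whose gene-tree leaves are labeled directly by species names; this is an illustrative representation and a duplications-only simplification, not the authors' algorithms.

```python
def postorder(tree):
    """Yield the subtrees of a nested-tuple tree, children before parents."""
    if isinstance(tree, tuple):
        for child in tree:
            yield from postorder(child)
    yield tree

def root_paths(tree, path=()):
    """Map every species-tree node to its path from the root, for LCA queries."""
    paths = {tree: path + (tree,)}
    if isinstance(tree, tuple):
        for child in tree:
            paths.update(root_paths(child, path + (tree,)))
    return paths

def lca(paths, a, b):
    """Last common ancestor: the deepest shared node on the two root paths."""
    common = None
    for x, y in zip(paths[a], paths[b]):
        if x == y:
            common = x
    return common

def count_duplications(gene_tree, species_tree):
    """LCA-mapping reconciliation: an internal gene node is a duplication
    when it maps to the same species node as one of its children."""
    paths = root_paths(species_tree)
    mapping, dups = {}, 0
    for node in postorder(gene_tree):
        if isinstance(node, tuple):
            m = mapping[node[0]]
            for child in node[1:]:
                m = lca(paths, m, mapping[child])
            mapping[node] = m
            dups += any(mapping[c] == m for c in node)
        else:
            mapping[node] = node          # leaf labeled by its species
    return dups

species = (("A", "B"), "C")
print(count_duplications((("A", "A"), "B"), species))  # paralogs in A -> 1
print(count_duplications((("A", "B"), "C"), species))  # congruent tree -> 0
```

    Full reconciliation additionally counts losses (and, in the extended problems discussed above, horizontal transfers) along the species-tree edges skipped by the mapping, but the LCA step shown here is the common core.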

  11. The Electronic Library Workstation--Today.

    ERIC Educational Resources Information Center

    Nolte, James

    1990-01-01

    Describes the components--hardware, software and applications, CD-ROM and online reference resources, and telecommunications links--of an electronic library workstation in use at Clarkson University (Potsdam, New York). Data manipulation, a hypothetical research scenario, and recommended workstation capabilities are also discussed. (MES)

  12. The Directed Case Method.

    ERIC Educational Resources Information Center

    Cliff, William H.; Curtin, Leslie Nesbitt

    2000-01-01

    Provides an example of a directed case on human anatomy and physiology. Uses brief real life newspaper articles and clinical descriptions of medical reference texts to describe an actual, fictitious, or composite event. Includes interrelated human anatomy and physiology topics in the scenario. (YDS)

  13. Knowledge Management and Reference Services

    ERIC Educational Resources Information Center

    Gandhi, Smiti

    2004-01-01

    Many corporations are embracing knowledge management (KM) to capture the intellectual capital of their employees. This article focuses on KM applications for reference work in libraries. It defines key concepts of KM, establishes a need for KM for reference services, and reviews various KM initiatives for reference services.

  14. Use of simulated patients to assess the clinical and communication skills of community pharmacists.

    PubMed

    Weiss, Marjorie C; Booth, Anneka; Jones, Bethan; Ramjeet, Sarah; Wong, Eva

    2010-06-01

    To investigate the quality and appropriateness of Emergency Hormonal Contraception (EHC) supply from community pharmacies, two simulated patient ('mystery shopper') scenarios were presented to each participating community pharmacy in the southwest of England during 2007: one in which the supply of EHC would be appropriate (scenario 1) and one in which there was a drug interaction between EHC and St John's Wort, making the supply inappropriate (scenario 2). Pharmacy consultations were rated using criteria developed from two focus groups: one with pharmacist academics and one with female university students. Feedback was provided to pharmacists to inform their continuing professional development. Outcome measures were scores on rating scales encompassing the clinical and communication skills of the participating community pharmacists, completed immediately after each mystery shopper visit. Forty pharmacist visits were completed: 21 for scenario 1 and 19 for scenario 2. Eighteen pharmacists were visited twice. Five pharmacists visited for scenario 2 supplied EHC against professional guidance, although other reference sources conflicted with this advice. Pharmacies that were part of the local PGD (patient group direction) scheme scored higher overall in scenario 1 (P = 0.005) than those not part of the scheme. Overall the communication skills of pharmacists were rated highly, although some pharmacists used jargon when explaining the interaction in scenario 2. Formatively assessing communication skills in an integrative manner alongside clinical skills has been identified as an important part of medical consultation skills training and can be incorporated into the routine assessment of, and feedback on, pharmacy over-the-counter medicines advice.

  15. Defining an Approach for Future Close Air Support Capability

    DTIC Science & Technology

    2017-01-01

    may take on the form of a force-mix study that considers multiple joint scenarios and missions. ... The authors would like to thank ... the Army and other services on an approach for defining future CAS capability. ... Colin Clark, "Air ... unit; one British soldier was killed, and five others were wounded. Only one A-10 was shot down during all of OIF and OEF. However, it should be

  16. Fossil-fueled development (SSP5): An energy and resource intensive scenario for the 21st century

    DOE PAGES

    Kriegler, Elmar; Bauer, Nico; Popp, Alexander; ...

    2016-08-18

    Here, this paper presents a set of energy and resource intensive scenarios based on the concept of Shared Socio-Economic Pathways (SSPs). The scenario family is characterized by rapid and fossil-fueled development with high socio-economic challenges to mitigation and low socio-economic challenges to adaptation (SSP5). A special focus is placed on the SSP5 marker scenario developed by the REMIND-MAgPIE integrated assessment modeling framework. The SSP5 scenarios exhibit very high levels of fossil fuel use, up to a doubling of global food demand, and up to a tripling of energy demand and greenhouse gas emissions over the course of the century, marking the upper end of the scenario literature in several dimensions. The SSP5 marker scenario results in a radiative forcing pathway close to the highest Representative Concentration Pathway (RCP8.5), and currently represents the only socio-economic scenario family that can be combined with climate model projections based on RCP8.5. This paper further investigates the direct impact of mitigation policies on the energy, land and emissions dynamics, confirming high socio-economic challenges to mitigation in SSP5. Nonetheless, mitigation policies reaching climate forcing levels as low as in the lowest Representative Concentration Pathway (RCP2.6) are accessible in SSP5. Finally, the SSP5 scenarios presented in this paper aim to provide useful reference points for future climate change, climate impact, adaptation and mitigation analysis, and broader questions of sustainable development.

  17. GSM base stations: short-term effects on well-being.

    PubMed

    Augner, Christoph; Florian, Matthias; Pauser, Gernot; Oberfeld, Gerd; Hacker, Gerhard W

    2009-01-01

    The purpose of this study was to examine the effects of short-term GSM (Global System for Mobile Communications) cellular phone base station RF-EMF (radiofrequency electromagnetic field) exposure on psychological symptoms (good mood, alertness, calmness) as measured by a standardized well-being questionnaire. Fifty-seven participants were selected and randomly assigned to one of three different exposure scenarios. Each scenario subjected participants to five 50-min exposure sessions, with only the first four relevant to the study of psychological symptoms. Three exposure levels were created by shielding devices in a field laboratory, which could be installed or removed during the breaks between sessions such that double-blinded conditions prevailed. The overall median power flux densities were 5.2 µW/m² during "low," 153.6 µW/m² during "medium," and 2126.8 µW/m² during "high" exposure sessions. For scenarios HM and MH, the first and third sessions were "low" exposure. The second session was "high" and the fourth was "medium" in scenario HM, and vice versa in scenario MH. Scenario LL had four successive "low" exposure sessions, constituting the reference condition. Participants in scenarios HM and MH (high and medium exposure) were significantly calmer during those sessions than participants in scenario LL (low exposure throughout) (P = 0.042). However, no significant differences between exposure scenarios in the "good mood" or "alertness" factors were obtained. We conclude that short-term exposure to GSM base station signals may have an impact on well-being by reducing psychological arousal. (c) 2008 Wiley-Liss, Inc.

  18. Fossil-fueled development (SSP5): An energy and resource intensive scenario for the 21st century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kriegler, Elmar; Bauer, Nico; Popp, Alexander

    Here, this paper presents a set of energy and resource intensive scenarios based on the concept of Shared Socio-Economic Pathways (SSPs). The scenario family is characterized by rapid and fossil-fueled development with high socio-economic challenges to mitigation and low socio-economic challenges to adaptation (SSP5). A special focus is placed on the SSP5 marker scenario developed by the REMIND-MAgPIE integrated assessment modeling framework. The SSP5 scenarios exhibit very high levels of fossil fuel use, up to a doubling of global food demand, and up to a tripling of energy demand and greenhouse gas emissions over the course of the century, marking the upper end of the scenario literature in several dimensions. The SSP5 marker scenario results in a radiative forcing pathway close to the highest Representative Concentration Pathway (RCP8.5), and currently represents the only socio-economic scenario family that can be combined with climate model projections based on RCP8.5. This paper further investigates the direct impact of mitigation policies on the energy, land and emissions dynamics, confirming high socio-economic challenges to mitigation in SSP5. Nonetheless, mitigation policies reaching climate forcing levels as low as in the lowest Representative Concentration Pathway (RCP2.6) are accessible in SSP5. Finally, the SSP5 scenarios presented in this paper aim to provide useful reference points for future climate change, climate impact, adaptation and mitigation analysis, and broader questions of sustainable development.

  19. Future electricity: The challenge of reducing both carbon and water footprint.

    PubMed

    Mekonnen, Mesfin M; Gerbens-Leenes, P W; Hoekstra, Arjen Y

    2016-11-01

    We estimate the consumptive water footprint (WF) of electricity and heat in 2035 for the four energy scenarios of the International Energy Agency (IEA) and a fifth scenario with a larger percentage of solar energy. Counter-intuitively, the 'greenest' IEA scenario (with the smallest carbon footprint) shows the largest WF increase over time: an increase by a factor of four over the period 2010-2035. In 2010, electricity from solar, wind, and geothermal contributed 1.8% to the total. The increase of this contribution to 19.6% in IEA's '450 scenario' contributes significantly to the decrease of the WF of the global electricity and heat sector, but is offset by the simultaneous increase in the use of firewood and hydropower. Only substantial growth in the fractions of energy sources with small WFs - solar, wind, and geothermal energy - can contribute to a lowering of the WF of the electricity and heat sector in the coming decades. The fifth energy scenario - adapted from the IEA 450 scenario but based on a quick transition to solar, wind, and geothermal energy and a minimum of bio-energy - is the only scenario that shows a strong decline in both carbon footprint (-66%) and consumptive WF (-12%) in 2035 compared to the reference year 2010. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  20. A Statistical Bias Correction Tool for Generating Climate Change Scenarios in Indonesia based on CMIP5 Datasets

    NASA Astrophysics Data System (ADS)

    Faqih, A.

    2017-03-01

    Providing information regarding future climate scenarios is very important in climate change studies. Climate scenarios can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios for a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to their coarse resolution, the data have to be downscaled and bias-corrected in order to obtain scenario data with a better spatial resolution that matches the characteristics of the observed data. Generating these downscaled data is difficult for scientists who do not have a specific background, experience, and skill in dealing with the complex data of GCM outputs. In this regard, it is necessary to develop a tool that simplifies the downscaling process in order to help scientists, especially in Indonesia, generate future climate scenario data for their climate change-related studies. In this paper, we introduce a tool called “Statistical Bias Correction for Climate Scenarios (SiBiaS)”. The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and to process their statistical bias corrections relative to reference data from observations. It was prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia 3rd National Communication (TNC) project activities.
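
    The abstract does not specify SiBiaS's correction algorithm; empirical quantile mapping is one common form of statistical bias correction against reference observations. A minimal nearest-rank sketch, with all names and numbers hypothetical rather than taken from the tool itself:

```python
from bisect import bisect_right

def quantile_map(model_hist, obs, x):
    """Empirical quantile mapping: find the quantile of a raw model
    value x within the historical model distribution, then return the
    observed value at that same quantile (nearest-rank lookup)."""
    mh, ob = sorted(model_hist), sorted(obs)
    q = bisect_right(mh, x) / len(mh)      # empirical CDF of the model
    idx = max(int(q * len(ob)) - 1, 0)     # same rank in the observations
    return ob[idx]

# A model series biased +2 units against observations: corrected
# values shift back onto the observed distribution.
obs = list(range(100))
model_hist = [v + 2 for v in obs]
corrected = quantile_map(model_hist, obs, 52.0)
```

    In practice, tools of this kind typically fit the mapping per month or per station and interpolate between empirical quantiles rather than using a nearest-rank lookup.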

  1. Scenarios for Ultrafast Gamma-Ray Variability in AGN

    NASA Astrophysics Data System (ADS)

    Aharonian, F. A.; Barkov, M. V.; Khangulyan, D.

    2017-05-01

    We analyze three scenarios to address the challenge of ultrafast gamma-ray variability reported from active galactic nuclei. We focus on the energy requirements imposed by these scenarios: (I) an external cloud in the jet, (II) a relativistic blob propagating through the jet material, and (III) production of high-energy gamma-rays in magnetospheric gaps. We show that while the first two scenarios are not constrained by the flare luminosity, there is a robust upper limit on the luminosity of flares generated in the black hole magnetosphere. This limit depends weakly on the mass of the central black hole and is determined by the accretion disk magnetization, viewing angle, and pair multiplicity. For the most favorable values of these parameters, the luminosity of 5-minute flares is limited to 2×10^43 erg s^-1, which excludes a black hole magnetosphere origin for the flare detected from IC 310. Within scenarios (I) and (II), the jet power required to explain the IC 310 flare exceeds the jet power estimated from the radio data. To resolve this discrepancy in the framework of scenario (II), it is sufficient to assume that the relativistic blobs are not distributed isotropically in the jet reference frame. A realization of scenario (I) demands that the jet power during the flare exceed by a factor of 10^2 the power of the radio jet relevant on a timescale of 10^8 years.

  2. Visual Communication and Cognition in Everyday Decision-Making.

    PubMed

    Jaenichen, Claudine

    2017-01-01

    Understanding cognition and the context of decision-making should be prioritized in the design process in order to accurately anticipate the outcome for intended audiences. A thorough understanding of cognition has been excluded from foundational design principles in visual communication. Defining leisure, direct, urgent, and emergency scenarios, and providing examples of work that deeply considers the viewer's relationship to the design solution in the context of these scenarios, allows us to affirm the relevancy of cognition as a design variable and the importance of projects that advocate public utility.

  3. Reference Structures: Stagnation, Progress, and Future Challenges.

    ERIC Educational Resources Information Center

    Greenberg, Jane

    1997-01-01

    Assesses the current state of reference structures in online public access catalogs (OPACs) in a framework defined by stagnation, progress, and future challenges. Outlines six areas for reference structure development. Twenty figures provide illustrations. (AEF)

  4. 3% Yield Increase (HH3), All Energy Crops scenario of the 2016 Billion Ton Report

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinkel, Chad [University of Tennessee] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Myers, Aaron (ORCID:0000000320373827)

    2016-07-13

    Scientific reason for data generation: to serve as an alternate high-yield scenario for the BT16 volume 1 agricultural scenarios, in order to compare these projections of potential biomass supplies against a reference case (agricultural baseline 10.11578/1337885). The simulation runs from 2015 through 2040; a starting year of 2014 is used but not reported. Date the data set was last modified: 02/02/2016. How each parameter was produced (methods), format, and relationship to other data in the data set: This exogenous-price simulation (also referred to as a “specified-price” simulation) introduces a farmgate price, and POLYSYS solves for biomass supplies that may be brought to market in response to these prices. In specified-price scenarios, a specified farmgate price is offered constantly in all counties over all years of the simulation. This simulation begins in 2015 with an offered farmgate price for primary crop residues only between 2015 and 2018, and long-term contracts for dedicated crops beginning in 2019. Expected mature energy crop yield grows at a compounding rate of 3% beginning in 2016. The yield growth assumptions are fixed after crops are planted, such that yield gains do not apply to crops already planted, but new plantings do take advantage of the gains in expected yield growth. Instruments used: Policy Analysis System (POLYSYS, version POLYS2015_V10_alt_JAN22B), an agricultural policy modeling system of U.S. agriculture (crops and livestock), supplied by the University of Tennessee Institute of Agriculture, Agricultural Policy Analysis Center.
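
    The planting-year yield rule described in this record (expected mature yield compounding at 3% per year from 2016, locked in at planting so that later gains do not apply retroactively) can be sketched as follows; the base yield is illustrative, not a BT16 value:

```python
def expected_yield(base_yield, plant_year, start_year=2016, rate=0.03):
    """Expected mature energy-crop yield for a stand planted in
    plant_year: growth compounds from start_year but is fixed at
    planting, so crops already in the ground keep their old yield."""
    years = max(plant_year - start_year, 0)
    return base_yield * (1.0 + rate) ** years

# A stand planted in 2018 captures two years of compounded gains;
# one planted in 2016 (or earlier) captures none.
y_2016 = expected_yield(10.0, 2016)
y_2018 = expected_yield(10.0, 2018)
```

    Under this rule, only new plantings benefit from yield growth, which is why the 2% (HH2) and 3% (HH3) scenarios diverge gradually rather than instantly.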

  5. 2% Yield Increase (HH2), All Energy Crops scenario of the 2016 Billion Ton Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Maggie R.; Hellwinkel, Chad; Eaton, Laurence

    Scientific reason for data generation: to serve as an alternate high-yield scenario for the BT16 volume 1 agricultural scenarios, in order to compare these projections of potential biomass supplies against a reference case (agricultural baseline 10.11578/1337885). The simulation runs from 2015 through 2040; a starting year of 2014 is used but not reported. Date the data set was last modified: 02/02/2016. How each parameter was produced (methods), format, and relationship to other data in the data set: This exogenous-price simulation (also referred to as a “specified-price” simulation) introduces a farmgate price, and POLYSYS solves for biomass supplies that may be brought to market in response to these prices. In specified-price scenarios, a specified farmgate price is offered constantly in all counties over all years of the simulation. This simulation begins in 2015 with an offered farmgate price for primary crop residues only between 2015 and 2018, and long-term contracts for dedicated crops beginning in 2019. Expected mature energy crop yield grows at a compounding rate of 2% beginning in 2016. The yield growth assumptions are fixed after crops are planted, such that yield gains do not apply to crops already planted, but new plantings do take advantage of the gains in expected yield growth. Instruments used: Policy Analysis System (POLYSYS, version POLYS2015_V10_alt_JAN22B), an agricultural policy modeling system of U.S. agriculture (crops and livestock), supplied by the University of Tennessee Institute of Agriculture, Agricultural Policy Analysis Center.

  6. Assessing Mechanisms of Climate Change Impact on the Upland Forest Water Balance of the Willamette River Basin

    NASA Astrophysics Data System (ADS)

    Turner, D. P.; Conklin, D. R.; Vache, K. B.; Schwartz, C.; Nolin, A. W.; Chang, H.; Watson, E.; John, B.

    2016-12-01

    Projected changes in air temperature, precipitation, and vapor pressure for the Willamette River Basin (Oregon, USA) over the next century will have significant impacts on the river basin water balance, notably on the amount of evapotranspiration (ET). Mechanisms of impact on ET will be both direct and indirect, but there is limited understanding of their absolute and relative magnitudes. Here we developed a spatially-explicit, daily time-step, modeling infrastructure to simulate the basin-wide water balance that accounts for meteorological influences, as well as effects mediated by changing vegetation cover type, leaf area, and ecophysiology. Three CMIP5 climate scenarios (LowClim, Reference, HighClim) were run for the 2010 to 2100 period. Besides warmer temperatures, the climate scenarios were characterized by wetter winters and increasing vapor pressure deficits. In the mid-range Reference scenario, our landscape simulation model (Envision) projected a continuation of forest cover on the uplands but a 3-fold increase in area burned per year. A decline (12-30%) in basin-wide mean leaf area index (LAI) in forests was projected in all scenarios. The lower LAIs drove a corresponding decline in ET. In a sensitivity test, the effect of increasing CO2 on stomatal conductance induced a further substantial decrease (11-18%) in basin-wide mean ET. The net effect of decreases in ET and increases in winter precipitation was an increase in annual streamflow. These results support the inclusion of changes in land cover, land use, LAI, and ecophysiology in efforts to anticipate impacts of climate change on basin-scale water balances.

  7. 78 FR 41132 - Self-Regulatory Organizations; EDGX Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Exchange's principal office, and at the Public Reference Room of the Commission. \\3\\ ``Member'' is defined... the Exchange as that term is defined in Section 3(a)(3) of the Act.'' EDGX Rule 1.5(n). \\4\\ References..., then the shares are posted on the EDGX book unless the Member instructs otherwise. See Exchange Rule 11...

  8. Cosmological singularity resolution from quantum gravity: The emergent-bouncing universe

    NASA Astrophysics Data System (ADS)

    Alesci, Emanuele; Botta, Gioele; Cianfrani, Francesco; Liberati, Stefano

    2017-08-01

    Alternative scenarios to the big bang singularity have been the subject of intense research for several decades by now. Most popular in this sense have been frameworks where such a singularity is replaced by a bounce around some minimal cosmological volume or by some early quantum phase. The latter scenario was devised a long time ago and is referred to as an "emergent universe" (in the sense that our universe emerged from a constant-volume quantum phase). We show here that within an improved framework of canonical quantum gravity (the so-called quantum reduced loop gravity) the Friedmann equations for cosmology are modified in such a way as to replace the big bang singularity with a short bounce preceded by a metastable quantum phase in which the volume of the universe oscillates between a series of local maxima and minima. We call this hybrid scenario an "emergent-bouncing universe" since, after a purely oscillating quantum phase, the classical Friedmann spacetime emerges. Prospective developments and possible tests of this scenario are discussed at the end.

  9. A reference model for space data system interconnection services

    NASA Astrophysics Data System (ADS)

    Pietras, John; Theis, Gerhard

    1993-03-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous by the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  10. A reference model for space data system interconnection services

    NASA Technical Reports Server (NTRS)

    Pietras, John; Theis, Gerhard

    1993-01-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous by the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  11. Framing 100-year overflowing and overtopping marine submersion hazard resulting from the propagation of 100-year joint hydrodynamic conditions

    NASA Astrophysics Data System (ADS)

    Nicolae Lerma, A.; Bulteau, T.; Elineau, S.; Paris, F.; Pedreros, R.

    2016-12-01

    Marine submersion is an increasing concern for coastal cities, as urban development reinforces their vulnerabilities while climate change is likely to increase the frequency and magnitude of submersions. Characterising the coastal flooding hazard is therefore of paramount importance to ensure the security of people living in such places and for coastal planning. A hazard is commonly defined as an adverse phenomenon, often represented by the magnitude of a variable of interest (e.g. flooded area), hereafter called the response variable, associated with a probability of exceedance or, alternatively, a return period. Characterising the coastal flooding hazard consists in finding the correspondence between the magnitude and the return period. The difficulty lies in the fact that the assessment is usually performed using physical numerical models taking as inputs scenarios composed of multiple forcing conditions that are most of the time interdependent. Indeed, a time series of the response variable is usually not available, so we have to deal instead with time series of forcing variables (e.g. water level, waves). Thus, the problem is twofold: on the one hand, the definition of scenarios is a multivariate matter; on the other hand, it is tricky and approximate to associate the resulting response, being the output of the physical numerical model, with the return period defined for the scenarios. In this study, we illustrate the problem on the district of Leucate, located on the French Mediterranean coast. A multivariate extreme value analysis of waves and water levels is performed offshore using a conditional extreme model, then two different methods are used to define and select 100-year scenarios of forcing variables: one based on joint exceedance probability contours, a method classically used in coastal risk studies, the other based on environmental contours, which are commonly used in the field of structural design engineering. We show that these two methods enable one to frame the true 100-year response variable. The selected scenarios are propagated to the shore through high-resolution flood modelling coupling overflowing and overtopping processes. Results in terms of inundated areas and inland water volumes are finally compared for the two methods, giving upper and lower bounds for the true response variables.
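
    The joint exceedance probability contours mentioned above can be illustrated with a purely empirical toy sketch that counts paired samples; the study itself fits a conditional extreme model rather than counting, and all numbers here are hypothetical:

```python
def joint_exceedance(samples, h_thr, w_thr):
    """Empirical probability that wave height and water level both
    exceed their thresholds in the same event."""
    hits = sum(1 for h, w in samples if h > h_thr and w > w_thr)
    return hits / len(samples)

def contour_point(samples, h_thr, p_target):
    """Smallest water-level threshold w such that the joint exceedance
    probability at (h_thr, w) drops to p_target: one point on a
    joint-exceedance contour."""
    lo = min(w for _, w in samples) - 1.0
    for w in [lo] + sorted({w for _, w in samples}):
        if joint_exceedance(samples, h_thr, w) <= p_target:
            return w
    return None
```

    Scanning h_thr over a range of values traces the full contour; scenarios selected on such a contour share the same joint exceedance probability while the flood response they produce can differ, which is why several of them are propagated to the shore.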

  12. Choosing relatives for DNA identification of missing persons.

    PubMed

    Ge, Jianye; Budowle, Bruce; Chakraborty, Ranajit

    2011-01-01

    DNA-based analysis is integral to missing person identification cases. When direct references are not available, indirect relative references can be used to identify missing persons by kinship analysis. Generally, more reference relatives render greater accuracy of identification. However, it is costly to type multiple references; thus, at times, decisions may need to be made on which relatives to type. In this study, pedigrees for 37 common reference scenarios with 13 CODIS STRs were simulated to rank the information content of different combinations of relatives. The results confirm that first-order relatives (parents and full siblings) are the preferred references for identifying missing persons, with full siblings also being informative. Less genetic dependence between references provides a higher likelihood ratio on average. Distant relatives may not be helpful with autosomal markers alone, but lineage-based Y-chromosome and mitochondrial DNA markers can increase the likelihood ratio or serve as filters to exclude putative relationships. © 2010 American Academy of Forensic Sciences.
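
    At a single locus, the likelihood ratios behind such rankings reduce to standard kinship-index algebra. A sketch for a parent-child comparison with the second parent untyped, using hypothetical allele frequencies (the study's 13-locus CODIS computation multiplies per-locus ratios like this one):

```python
def transmit_prob(genotype, allele):
    """Probability that a parent with this genotype passes `allele`."""
    return genotype.count(allele) / 2.0

def parentage_lr(child, alleged_parent, freq):
    """Single-locus likelihood ratio for 'alleged_parent is a parent
    of child' versus 'unrelated', with the other parent untyped.
    child, alleged_parent: 2-tuples of alleles; freq: allele -> frequency."""
    a, b = child
    # Numerator: alleged parent transmits one child allele,
    # a random parent supplies the other.
    x = transmit_prob(alleged_parent, a) * freq[b]
    if a != b:
        x += transmit_prob(alleged_parent, b) * freq[a]
    # Denominator: child genotype probability under Hardy-Weinberg.
    y = 2 * freq[a] * freq[b] if a != b else freq[a] ** 2
    return x / y
```

    For a child (P, Q) and an alleged parent (P, R) this gives the textbook 1/(4·f_P): rarer shared alleles give larger ratios, which is one reason less genetically dependent references raise the combined likelihood ratio on average.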

  13. Four dimensional studies in earth space

    NASA Technical Reports Server (NTRS)

    Mather, R. S.

    1972-01-01

    A system of reference that is directly related to observations is proposed for four-dimensional studies in earth space. The global control network and polar wandering are defined. The determination of variations in the earth's gravitational field with time also forms part of such a system. Techniques are outlined for the unique definition of the motion of the geocenter, and of the changes in the location of the axis of rotation of an instantaneous earth model, in relation to values at some epoch of reference. The instantaneous system referred to is directly related to a fundamental equation in geodynamics. The reference system defined would provide an unambiguous frame for long-period studies in earth space, provided the scale of the space is specified.

  14. Scoping Future Policy Dynamics in Raw Materials Through Scenarios Testing

    NASA Astrophysics Data System (ADS)

    Correia, Vitor; Keane, Christopher; Sturm, Flavius; Schimpf, Sven; Bodo, Balazs

    2017-04-01

    The International Raw Materials Observatory (INTRAW) project is working towards a sustainable future for the European Union in access to raw materials, within an availability, economic, and environmental framework. One of the major exercises of the INTRAW project is the evaluation of potential future scenarios for 2050 to frame economic, research, and environmental policy towards a sustainable raw materials supply. The INTRAW consortium developed three possible future scenarios that encompass defined regimes of political, economic, and technological norms. The first scenario, "Unlimited Trade," reflects a world in which free trade continues to dominate the global political and economic environment, with expectations of a growing demand for raw materials from widely distributed global growth. The "National Walls" scenario reflects a world where nationalism and economic protectionism begin to dominate, leading to stagnating economic growth and uneven dynamics in raw materials supply and demand. The final scenario, "Sustainability Alliance," examines the dynamics of a global political and economic climate that is focused on environmental and economic sustainability, leading increasingly towards a circular raw materials economy. These scenarios were reviewed, tested, and simulated with members of the Consortium and a panel of global experts on international raw materials issues, which led to expected end conditions for 2050. Given the current uncertainty in global politics, these scenarios are informative for identifying likely opportunities and crises. The details of these simulations and the expected responses in research demand, technology investment, and the economic components of the raw materials system will be discussed.

  15. Green roof adoption in atlanta, georgia: the effects of building characteristics and subsidies on net private, public, and social benefits.

    PubMed

    Mullen, Jeffrey D; Lamsal, Madhur; Colson, Greg

    2013-10-01

    This research draws on and expands previous studies that have quantified the costs and benefits associated with conventional roofs versus green roofs. Using parameters from those studies to define alternative scenarios, we estimate from a private, public, and social perspective the costs and benefits of installing and maintaining an extensive green roof in Atlanta, GA. Results indicate that net private benefits are a decreasing function of roof size and vary considerably across scenarios. In contrast, net public benefits are highly stable across scenarios, ranging from $32.49 to $32.90 m⁻². In addition, we evaluate two alternative subsidy regimes: (i) a general subsidy provided to every building that adopts a green roof and (ii) a targeted subsidy provided only to buildings for which net private benefits are negative but net public benefits are positive. In 6 of the 12 general subsidy scenarios the optimal public policy is not to offer a subsidy; in 5 scenarios the optimal subsidy rate is between $20 and $27 m⁻²; and in 1 scenario the optimal rate is $5 m⁻². The optimal rate with a targeted subsidy is between $20 and $27 m⁻² in 11 scenarios, and no subsidy is optimal in the twelfth. In most scenarios, a significant portion of net public benefits is generated by buildings for which net private benefits are positive. This suggests a policy focused on information dissemination and technical assistance may be more cost-effective than direct subsidy payments.
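
    The targeted-subsidy regime (ii) amounts to a simple per-building decision rule. A sketch with hypothetical per-square-metre figures, assuming (as a simplification the paper does not necessarily make) that the subsidy need only just offset the private shortfall:

```python
def adopts(net_private, subsidy):
    """A building adopts a green roof when its private net benefit
    plus any subsidy is at least zero (all values per m^2)."""
    return net_private + subsidy >= 0.0

def targeted_rate(net_private, net_public):
    """Pay only buildings where adoption is privately unprofitable but
    publicly beneficial, and pay just enough to flip the decision."""
    if net_private < 0.0 < net_public:
        return -net_private
    return 0.0
```

    Buildings with positive private benefits adopt without payment under this rule, which mirrors the paper's observation that much of the public benefit comes from buildings that need no subsidy at all.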

  16. Futures of elderly care in Iran: A protocol with scenario approach.

    PubMed

    Goharinezhad, Salime; Maleki, Mohammadreza; Baradaran, Hamid Reza; Ravaghi, Hamid

    2016-01-01

    Background: The number of people aged 60 and older is increasing faster than other age groups worldwide. Iran will experience a sharp increase in its aging population in the coming decades, and this will pose new challenges to the healthcare system. Since providing high-quality aged-care services is a major concern of policymakers, the question arises: what types of aged-care services should be organized in the coming 10 years? This protocol has been designed to develop a set of scenarios for the future of elderly care in Iran. Methods: In this study, the intuitive logics approach and the Global Business Network (GBN) model were used to develop scenarios for elderly care in Iran. In terms of perspective, the scenarios in this approach are normative, qualitative with respect to methodology, and deductive in the process of constructing scenarios. The three phases of the GBN model are as follows: 1) Orientation: identifying strategic levels, stakeholders, participants, and time horizon; 2) Exploration: identifying the driving forces and key uncertainties; 3) Synthesis: defining the scenario logics and constructing the scenario storylines. Results: Presently, two phases are completed and the results will be published in mid-2016. Conclusion: This study delivers a comprehensive framework for taking appropriate actions in providing care for the elderly in the future. Moreover, policymakers should specify and provide the full range of services for the elderly, and in doing so, the scenarios and key findings of this study could be of valuable help.

  17. Using the scenario method in the context of health and health care--a scoping review.

    PubMed

    Vollmar, Horst Christian; Ostermann, Thomas; Redaèlli, Marcus

    2015-10-16

    The scenario technique is a method for futures research and strategic planning. Today, it includes both qualitative and quantitative elements. The aims of this scoping review are to give an overview of the application of the scenario method in the field of health care and to make suggestions for better reporting in future scenario projects. Between January 2013 and October 2013 we conducted a systematic search, from inception, in the databases Medline, Embase, PsycInfo, Eric, The Cochrane Library, Scopus, Web of Science, and Cinahl for the term 'scenario(s)' in combination with other terms, e.g. method, model, and technique. Our search was not restricted by date or language. In addition, we screened the reference lists of the included articles. A total of 576 bibliographical records were screened. After removing duplicates and three rounds of screening, 41 articles covering 38 different scenario projects were included in the final analysis. Nine of the included articles addressed disease-related issues, led by mental health and dementia (n = 4) and followed by cancer (n = 3). Five scenario projects focused on public health issues at an organizational level and five focused on the labor market for different health care professionals. In addition, four projects dealt with health care 'in general', four with the field of biotechnology and personalized medicine, and another four with other technology developments. Some of the scenario projects suffered from poor reporting of methodological aspects. Despite its potential, the scenario method seems to be rarely published in comparison to other methods such as the Delphi technique, at least in the field of health care. This might be due to the complexity of the methodological approach. Individual project methods and activities vary widely and are poorly reported. Improved criteria are required for reporting the methods of scenario projects. With improved standards and greater transparency, the scenario method will be a good tool for scientific health care planning and strategic decision-making in public health.

  18. Downscaled climate change impacts on agricultural water resources in Puerto Rico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harmsen, E.W.; Miller, N.L.; Schlegel, N.J.

    2009-04-01

    The purpose of this study is to estimate reference evapotranspiration (ETo), rainfall deficit (rainfall - ETo) and relative crop yield reduction for a generic crop under climate change conditions for three locations in Puerto Rico: Adjuntas, Mayaguez, and Lajas. Reference evapotranspiration is estimated by the Penman-Monteith method. Rainfall and temperature data were statistically downscaled and evaluated using the DOE/NCAR PCM global circulation model projections for the B1 (low), A2 (mid-high) and A1fi (high) emission scenarios of the Intergovernmental Panel on Climate Change Special Report on Emission Scenarios. Relative crop yield reductions were estimated from a water stress factor, which is a function of soil moisture content. Average soil moisture content for the three locations was determined by means of a simple water balance approach. Results from the analysis indicate that the rainy season will become wetter and the dry season will become drier. The 20-year mean September rainfall excess (i.e., rainfall - ETo > 0) increased for all scenarios and locations, from 149.8 mm for 1990-2010 to 356.4 mm for 2080-2100. Similarly, the 20-year average February rainfall deficit (i.e., rainfall - ETo < 0) decreased from -26.1 mm for 1990-2010 to -72.1 mm for 2080-2100. The results suggest that additional water could be saved during the wet months to offset increased irrigation requirements during the dry months. Relative crop yield reduction did not change significantly under the B1 projected emissions scenario, but increased by approximately 20% during the summer months under the A1fi emissions scenario. Components of the annual water balance for the three climate change scenarios are rainfall, evapotranspiration (adjusted for soil moisture), surface runoff, aquifer recharge and change in soil moisture storage.
    Under the A1fi scenario, for all locations, annual evapotranspiration decreased owing to lower soil moisture, surface runoff decreased, and aquifer recharge increased. Aquifer recharge increased at all three locations because the majority of recharge occurs during the wet season and the wet season became wetter. This is good news from a groundwater production standpoint. Increasing aquifer recharge also suggests that groundwater levels may increase and this may help to minimize saltwater intrusion near the coasts as sea levels increase, provided that groundwater use is not over-subscribed.
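The rainfall-deficit bookkeeping described above (rainfall - ETo per month, with negative values indicating a deficit) can be sketched in a few lines; the function and the monthly values below are illustrative assumptions, not the study's data or its Penman-Monteith implementation.

```python
# Illustrative sketch: classify monthly rainfall excess/deficit
# from rainfall and reference evapotranspiration (both mm/month).
def monthly_balance(rainfall_mm, eto_mm):
    """Return (excess, deficit) lists of (month, rainfall - ETo) pairs:
    excess holds months where rainfall - ETo > 0, deficit the rest."""
    excess, deficit = [], []
    for month, (p, et) in enumerate(zip(rainfall_mm, eto_mm), start=1):
        diff = p - et
        (excess if diff > 0 else deficit).append((month, diff))
    return excess, deficit

# Hypothetical monthly values with a wet September and dry February
rain = [50, 40, 60, 90, 120, 150, 160, 170, 300, 200, 100, 60]
eto  = [110, 112, 130, 140, 145, 150, 155, 150, 140, 130, 115, 105]
excess, deficit = monthly_balance(rain, eto)
```

Extending this toy balance with runoff, recharge and a soil-moisture store would give the simple water-balance approach the abstract mentions.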

  19. Scenarios to prioritize observing activities on the North Slope, Alaska in the context of resource development, climate change and socio-economic uncertainties

    NASA Astrophysics Data System (ADS)

    Lee, O. A.; Eicken, H.; Payne, J. F.; Lassuy, D.

    2014-12-01

    The North Slope of Alaska is experiencing rapid changes in response to interacting climate and socioeconomic drivers. The North Slope Science Initiative (NSSI) is using scenarios as a tool to identify plausible, spatially explicit future states of resource extraction activities on the North Slope and adjacent seas through the year 2040. The objective of the scenarios process is to strategically assess research and monitoring needs on the North Slope. The participatory scenarios process involved stakeholder input (including Federal, State, local, academic, industry and non-profit representatives) to identify key drivers of change related to resource extraction activities on the North Slope. While climate change was identified as a key driver in the biophysical system, economic drivers related to oil and gas development were also important. Expert-reviewed informational materials were developed to help stakeholders obtain baseline knowledge and stimulate discussions about interactions between drivers, knowledge gaps and uncertainties. Map-based scenario products will allow mission-oriented agencies to jointly explore where to prioritize research investments and address risk in a complex, changing environment. Scenarios consider multidecadal timescales. However, tracking of indicator variables derived from scenarios can lead to important insights about the trajectory of the North Slope social-environmental system and inform management decisions to reduce risk on much shorter timescales. The inclusion of stakeholders helps provide a broad spectrum of expert viewpoints necessary for considering the range of plausible scenarios. A well-defined focal question, transparency in the participation process and continued outreach about the utility and limitations of scenarios are also important components of the scenarios process.

  20. Understanding Land System Change Through Scenario-Based Simulations: A Case Study from the Drylands in Northern China

    NASA Astrophysics Data System (ADS)

    Liu, Zhifeng; Verburg, Peter H.; Wu, Jianguo; He, Chunyang

    2017-03-01

    The drylands of northern China are expected to face dramatic land system change in the context of socioeconomic development and environmental conservation. Recent studies have addressed changes in land cover accompanying socioeconomic development in this region. However, the changes in land use intensity and the potential role of environmental conservation measures have yet to be adequately examined. Given the importance of land management intensity to ecological conditions and regional sustainability, our study projected land system change in Hohhot, a city in the drylands of northern China, from 2013 to 2030. Here, land systems are defined as combinations of land cover and land use intensity. Using the CLUMondo model, we simulated land system change in Hohhot under three scenarios: a scenario following historical trends, a scenario with strong socioeconomic and land use planning, and a scenario focused on achieving environmental conservation targets. Our results showed that Hohhot is likely to experience agricultural intensification and urban growth under all three scenarios. Agricultural intensity and the urban growth rate were much higher under the historical trend scenario than under the scenarios with more planning interventions. The dynamics of grasslands depend strongly on projections of livestock and other claims on land resources. In the historical trend scenario, intensively grazed grasslands increase, whereas a large amount of the current area of grasslands with livestock converts to forest under the scenario with strong planning. Strong conversion from grasslands with livestock and extensive cropland to semi-natural grasslands was estimated under the conservation scenario. The findings provide an input into discussions about environmental management, planning and sustainable land system design for Hohhot.

  1. Cost-effectiveness of Lung Cancer Screening in Canada.

    PubMed

    Goffin, John R; Flanagan, William M; Miller, Anthony B; Fitzgerald, Natalie R; Memon, Saima; Wolfson, Michael C; Evans, William K

    2015-09-01

    The US National Lung Screening Trial supports screening for lung cancer among smokers using low-dose computed tomographic (LDCT) scans. The cost-effectiveness of screening in a publically funded health care system remains a concern. To assess the cost-effectiveness of LDCT scan screening for lung cancer within the Canadian health care system. The Cancer Risk Management Model (CRMM) simulated individual lives within the Canadian population from 2014 to 2034, incorporating cancer risk, disease management, outcome, and cost data. Smokers and former smokers eligible for lung cancer screening (30 pack-year smoking history, ages 55-74 years, for the reference scenario) were modeled, and performance parameters were calibrated to the National Lung Screening Trial (NLST). The reference screening scenario assumes annual scans to age 75 years, 60% participation by 10 years, 70% adherence to screening, and unchanged smoking rates. The CRMM outputs are aggregated, and costs (2008 Canadian dollars) and life-years are discounted 3% annually. The incremental cost-effectiveness ratio. Compared with no screening, the reference scenario saved 51,000 quality-adjusted life-years (QALY) and had an incremental cost-effectiveness ratio of CaD $52,000/QALY. If smoking history is modeled for 20 or 40 pack-years, incremental cost-effectiveness ratios of CaD $62,000 and CaD $43,000/QALY, respectively, were generated. Changes in participation rates altered life years saved but not the incremental cost-effectiveness ratio, while the incremental cost-effectiveness ratio is sensitive to changes in adherence. An adjunct smoking cessation program improving the quit rate by 22.5% improves the incremental cost-effectiveness ratio to CaD $24,000/QALY. Lung cancer screening with LDCT appears cost-effective in the publicly funded Canadian health care system. An adjunct smoking cessation program has the potential to improve outcomes.
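The headline metric in this abstract, the incremental cost-effectiveness ratio, is simply discounted incremental cost divided by discounted incremental QALYs. A minimal sketch follows; the aggregate totals are hypothetical, chosen only so the quotient reproduces the reported CaD $52,000/QALY (the CRMM's actual cost totals are not given in the abstract).

```python
# ICER = (cost_new - cost_ref) / (QALY_new - QALY_ref),
# with costs and QALYs assumed already discounted (3% annually).
def icer(cost_new, cost_ref, qaly_new, qaly_ref):
    d_qaly = qaly_new - qaly_ref
    if d_qaly == 0:
        raise ValueError("no incremental effectiveness")
    return (cost_new - cost_ref) / d_qaly

# Hypothetical aggregates: screening adds CaD $2.652B in cost and
# 51,000 QALYs relative to no screening.
ratio = icer(cost_new=2_652_000_000, cost_ref=0,
             qaly_new=51_000, qaly_ref=0)  # → 52000.0 per QALY
```

The sensitivity results in the abstract (adherence matters, participation does not) follow from this form: participation scales numerator and denominator together, while adherence changes their ratio.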

  2. Experimental investigations of weak definite and weak indefinite noun phrases

    PubMed Central

    Klein, Natalie M.; Gegg-Harrison, Whitney M.; Carlson, Greg N.; Tanenhaus, Michael K.

    2013-01-01

    Definite noun phrases typically refer to entities that are uniquely identifiable in the speaker and addressee’s common ground. Some definite noun phrases (e.g. the hospital in Mary had to go to the hospital and John did too) seem to violate this uniqueness constraint. We report six experiments that were motivated by the hypothesis that these “weak definite” interpretations arise in “incorporated” constructions. Experiments 1-3 compared nouns that seem to allow for a weak definite interpretation (e.g. hospital, bank, bus, radio) with those that do not (e.g. farm, concert, car, book). Experiments 1 and 2 used an instruction-following task and picture-judgment task, respectively, to demonstrate that a weak definite need not uniquely refer. In Experiment 3 participants imagined scenarios described by sentences such as The Federal Express driver had to go to the hospital/farm. The imagined scenarios following weak definite noun phrases were more likely to include conventional activities associated with the object, whereas following regular nouns, participants were more likely to imagine scenarios that included typical activities associated with the subject; similar effects were observed with weak indefinites. Experiment 4 found that object-related activities were reduced when the same subject and object were used with a verb that does not license weak definite interpretations. In Experiment 5, a science fiction story introduced an artificial lexicon for novel concepts. Novel nouns that shared conceptual properties with English weak definite nouns were more likely to allow weak reference in a judgment task. Experiment 6 demonstrated that familiarity for definite articles and anti-familiarity for indefinite articles applies to the activity associated with the noun, consistent with predictions made by the incorporation analysis. PMID:23685208

  3. Basic Simulation Environment for Highly Customized Connected and Autonomous Vehicle Kinematic Scenarios.

    PubMed

    Chai, Linguo; Cai, Baigen; ShangGuan, Wei; Wang, Jian; Wang, Huashen

    2017-08-23

    To enhance the realism of Connected and Autonomous Vehicle (CAV) kinematic simulation scenarios and to guarantee the accuracy and reliability of verification, a four-layer CAV kinematic simulation framework, composed of a road network layer, a vehicle operating layer, an uncertainties modelling layer and a demonstrating layer, is proposed in this paper. Properties of the intersections are defined to describe the road network. A target-position-based vehicle position updating method is designed to simulate vehicle behaviors such as lane changing and turning. Vehicle kinematic models are implemented to maintain the status of the vehicles as they move towards the target position. Priorities for individual vehicle control are assigned to the different layers. Operating mechanisms for CAV uncertainties, defined in this paper as position error and communication delay, are implemented in the simulation to enhance its realism. A simulation platform is developed based on the proposed methodology. A comparison of simulated and theoretical vehicle delay is analyzed to establish the validity and credibility of the platform. A rear-end collision avoidance scenario is conducted to verify the uncertainty operating mechanisms, and a slot-based intersections (SIs) control strategy is realized and verified on the simulation platform to demonstrate the platform's support for CAV kinematic simulation and verification.

  4. Importance of the Pre-Industrial Baseline in Determining the Likelihood of Exceeding the Paris Limits

    NASA Astrophysics Data System (ADS)

    Mann, M. E.; Schurer, A. P.; Hawkins, E.; Tett, S. F.; Hegerl, G. C.

    2017-12-01

    During the Paris Conference in 2015, nations of the world strengthened the United Nations Framework Convention on Climate Change by agreeing to hold "the increase in the global average temperature to well below 2°C above pre-industrial levels" and to pursue efforts to limit the temperature increase to 1.5°C. However, "pre-industrial" was not defined. Here we investigate the implications of different choices of the pre-industrial baseline for the likelihood of exceeding these two temperature thresholds. We find that for the strongest mitigation scenario, RCP2.6, and a medium scenario, RCP4.5, the probability of exceeding the thresholds and the timing of exceedance are highly dependent on the pre-industrial baseline; for example, the probability of crossing 1.5°C by the end of the century under RCP2.6 varies from 61% to 88% depending on how the baseline is defined. In contrast, in the scenario with no mitigation, RCP8.5, both thresholds will almost certainly be exceeded by the middle of the century, with the definition of the pre-industrial baseline of less importance. Allowable carbon emissions for threshold stabilisation are similarly highly dependent on the pre-industrial baseline. For stabilisation at 2°C, allowable emissions decrease by as much as 40% when earlier than 19th-century climates are considered as a baseline.
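The baseline sensitivity described above can be illustrated with a toy calculation (not the paper's analysis): shifting to an earlier, cooler baseline adds an offset to every ensemble member's warming, which changes the fraction of members exceeding a threshold. The ensemble values and the 0.2°C offset below are assumptions for illustration only.

```python
# Toy illustration: the fraction of ensemble members exceeding
# 1.5 °C depends on the assumed pre-industrial baseline offset.
def exceedance_fraction(warming, baseline_offset, threshold=1.5):
    """warming: per-member end-of-century warming relative to
    1850-1900 (°C); baseline_offset: extra warming already realised
    between the chosen earlier baseline and 1850-1900 (°C)."""
    hits = sum(1 for w in warming if w + baseline_offset > threshold)
    return hits / len(warming)

warming = [1.2, 1.4, 1.5, 1.6, 1.8]          # hypothetical ensemble
p_late  = exceedance_fraction(warming, 0.0)  # 1850-1900 as baseline
p_early = exceedance_fraction(warming, 0.2)  # earlier, cooler baseline
```

Even a modest offset moves several members across the threshold, which is the mechanism behind the 61% to 88% spread the abstract reports.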

  5. Basic Simulation Environment for Highly Customized Connected and Autonomous Vehicle Kinematic Scenarios

    PubMed Central

    Chai, Linguo; Cai, Baigen; ShangGuan, Wei; Wang, Jian; Wang, Huashen

    2017-01-01

    To enhance the realism of Connected and Autonomous Vehicle (CAV) kinematic simulation scenarios and to guarantee the accuracy and reliability of verification, a four-layer CAV kinematic simulation framework, composed of a road network layer, a vehicle operating layer, an uncertainties modelling layer and a demonstrating layer, is proposed in this paper. Properties of the intersections are defined to describe the road network. A target-position-based vehicle position updating method is designed to simulate vehicle behaviors such as lane changing and turning. Vehicle kinematic models are implemented to maintain the status of the vehicles as they move towards the target position. Priorities for individual vehicle control are assigned to the different layers. Operating mechanisms for CAV uncertainties, defined in this paper as position error and communication delay, are implemented in the simulation to enhance its realism. A simulation platform is developed based on the proposed methodology. A comparison of simulated and theoretical vehicle delay is analyzed to establish the validity and credibility of the platform. A rear-end collision avoidance scenario is conducted to verify the uncertainty operating mechanisms, and a slot-based intersections (SIs) control strategy is realized and verified on the simulation platform to demonstrate the platform's support for CAV kinematic simulation and verification. PMID:28832518

  6. Performance evaluation of time-aware enhanced software defined networking (TeSDN) for elastic data center optical interconnection.

    PubMed

    Yang, Hui; Zhang, Jie; Zhao, Yongli; Ji, Yuefeng; Li, Hui; Lin, Yi; Li, Gang; Han, Jianrui; Lee, Young; Ma, Teng

    2014-07-28

    Data center interconnection with elastic optical networks is a promising scenario for meeting the high burstiness and high-bandwidth requirements of data center services. We previously implemented enhanced software defined networking over an elastic optical network for data center applications [Opt. Express 21, 26990 (2013)]. Building on that work, this study considers time-aware data center service scheduling with elastic service time and service bandwidth according to varying time sensitivity requirements. A novel time-aware enhanced software defined networking (TeSDN) architecture for elastic data center optical interconnection is proposed in this paper, introducing a time-aware resources scheduling (TaRS) scheme. TeSDN can accommodate data center services with the required QoS considering the time dimension, and enhances cross-stratum optimization of application-stratum and elastic-optical-network-stratum resources based on spectrum elasticity, application elasticity and time elasticity. The overall feasibility and efficiency of the proposed architecture is experimentally verified on our OpenFlow-based testbed. The performance of the TaRS scheme under a heavy traffic load scenario is also quantitatively evaluated on the TeSDN architecture in terms of blocking probability and resource occupation rate.

  7. A Preliminary Performance Assessment for Salt Disposal of High-Level Nuclear Waste - 12173

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Clayton, Daniel; Jove-Colon, Carlos

    2012-07-01

    A salt repository is one of the four geologic media currently under study by the U.S. DOE Office of Nuclear Energy to support the development of a long-term strategy for geologic disposal of commercial used nuclear fuel (UNF) and high-level radioactive waste (HLW). The immediate goal of the generic salt repository study is to develop the necessary modeling tools to evaluate and improve the understanding of the repository system response and processes relevant to long-term disposal of UNF and HLW in a salt formation. The current phase of this study considers representative geologic settings and features adopted from previous studies for salt repository sites. For the reference scenario, the brine flow rates in the repository and underlying interbeds are very low, and transport of radionuclides in the transport pathways is dominated by diffusion and greatly retarded by sorption on the interbed filling materials. I-129 is the dominant annual dose contributor at the hypothetical accessible environment, but the calculated mean annual dose is negligibly small. For the human intrusion (or disturbed) scenario, the mean mass release rate and mean annual dose histories are very different from those for the reference scenario. Actinides including Pu-239, Pu-242 and Np-237 are major annual dose contributors, and the calculated peak mean annual dose is acceptably low. A performance assessment model for a generic salt repository has been developed incorporating, where applicable, representative geologic settings and features adopted from literature data for salt repository sites. The conceptual model and scenario for radionuclide release and transport from a salt repository were developed utilizing literature data. The salt GDS model was developed in a probabilistic analysis framework. The preliminary performance analysis for demonstration of model capability is for an isothermal condition at the ambient temperature for the near field.
    The capability demonstration emphasizes key attributes of a salt repository that are potentially important to the long-term safe disposal of UNF and HLW. The analysis presents and discusses the results showing repository responses to different radionuclide release scenarios (undisturbed and human intrusion). For the reference (or nominal or undisturbed) scenario, the brine flow rates in the repository and underlying interbeds are very low, and transport of radionuclides in the transport pathways is dominated by diffusion and greatly retarded by sorption on the interbed filling materials. I-129 (non-sorbing, with unlimited solubility and a very long half-life) is the dominant annual dose contributor at the hypothetical accessible environment, but the calculated mean annual dose is so negligibly small that there is no meaningful consequence for the repository performance. For the human intrusion (or disturbed) scenario analysis, the mean mass release rate and mean annual dose histories are very different from those for the reference scenario analysis. Compared to the reference scenario, the relative annual dose contributions by soluble, non-sorbing fission products, particularly I-129, are much lower than those by actinides including Pu-239, Pu-242 and Np-237. The lower relative mean annual dose contributions by the fission product radionuclides are due to their lower total inventory available for release (i.e., up to five affected waste packages), and the higher mean annual doses by the actinides are the outcome of the direct release of the radionuclides into the overlying aquifer having high water flow rates, thereby resulting in an early arrival of higher concentrations of the radionuclides at the biosphere drinking water well prior to their significant decay. The salt GDS model analysis has also identified the following recommendations and knowledge gaps to improve and enhance the confidence of future repository performance analysis.
    - Repository thermal loading by UNF and HLW, and the effect on the engineered barrier and near-field performance.
    - Closure and consolidation of salt rocks by creep deformation under the influence of thermal perturbation, and the effect on the engineered barrier and near-field performance.
    - Brine migration and radionuclide transport under the influence of thermal perturbation in a generic salt repository environment, and the effect on the engineered barrier, near-field performance and far-field performance.
    - Near-field geochemistry and radionuclide mobility in a generic salt repository environment (high ionic strength brines, elevated temperatures and chemically reducing conditions).
    - Degradation of engineered barrier components (waste package, waste canister, waste forms, etc.) in a generic salt repository environment (high ionic strength brines, elevated temperatures and chemically reducing conditions).
    - Waste stream types and inventory estimates, particularly for reprocessing high-level waste. (authors)

  8. Plant distributions in the southwestern United States; a scenario assessment of the modern-day and future distribution ranges of 166 Species

    USGS Publications Warehouse

    Thomas, Kathryn A.; Guertin, Patricia P.; Gass, Leila

    2012-01-01

    The authors developed spatial models of the predicted modern-day suitable habitat (SH) of 166 dominant and indicator plant species of the southwestern United States (herein referred to as the Southwest) and then conducted a coarse assessment of potential future changes in the distribution of their suitable habitat under three climate-change scenarios for two time periods. We used Maxent-based spatial modeling to predict the modern-day and future scenarios of SH for each species in an over 342-million-acre area encompassing all or parts of six states in the Southwest--Arizona, California, Colorado, Nevada, New Mexico, and Utah. Modern-day SH models were predicted using 26 annual and monthly average temperature and precipitation variables, averaged for the years 1971-2000. Future SH models were predicted for each species using six climate models based on application of the average of 16 General Circulation Models to Intergovernmental Panel on Climate Change emission scenarios B1, A1B, and A2 for two time periods, 2040 to 2069 and 2070 to 2100, referred to respectively as the 2050 and 2100 time periods. The assessment examined each species' vulnerability to loss of modern-day SH under future climate scenarios, potential to gain SH under future climate scenarios, and each species' estimated risk as a function of both vulnerability and potential gains. All 166 species were predicted to lose modern-day SH under the future climate change scenarios. In the 2050 time period, nearly 30 percent of the species lost 75 percent or more of their modern-day suitable habitat, 21 species gained more new SH than their modern-day SH, and 30 species gained less new SH than 25 percent of their modern-day SH. In the 2100 time period, nearly half of the species lost 75 percent or more of their modern-day SH, 28 species gained more new SH than their modern-day SH, and 34 gained less new SH than 25 percent of their modern-day SH.
    Using nine risk categories, we found that only two species fell in the least-risk category, while 20 species fell in the highest-risk category. The assessment showed that species respond independently to predicted climate change, suggesting that current plant assemblages may disassemble under predicted climate change scenarios. This report presents the results for each species in tables (Appendix A) and maps (14 for each species) in Appendix B.
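The two-axis risk idea, vulnerability from lost modern-day SH crossed with opportunity from newly gained SH, can be illustrated as a 3x3 grid yielding nine categories. The cut-points below are hypothetical, since the report's exact category rules are not reproduced in this abstract.

```python
# Hypothetical 3x3 risk grid: one axis is vulnerability (share of
# modern-day suitable habitat lost), the other is potential gain
# (new SH as a fraction of modern-day SH). 0 = least, 8 = highest risk.
def risk_category(loss_frac, gain_ratio):
    v = 0 if loss_frac < 0.25 else (1 if loss_frac < 0.75 else 2)
    g = 2 if gain_ratio < 0.25 else (1 if gain_ratio < 1.0 else 0)
    return 3 * v + g

high = risk_category(loss_frac=0.80, gain_ratio=0.10)  # → 8
low = risk_category(loss_frac=0.10, gain_ratio=1.50)   # → 0
```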

  9. Analytic Studies of the Relationship between Track Geometry Variations and Derailment Potential at Low Speeds

    DOT National Transportation Integrated Search

    1983-09-01

    This report describes analytical studies carried out to define the relationship between track parameters and safety from derailment. Problematic track scenarios are identified reflecting known accident data. Vehicle response is investigated in the 10...

  10. Mapping the unknown: Modeling future scenarios of riverine fish communities

    EPA Science Inventory

    Riverscapes can be defined by spatial and temporal variation in a suite of environmental conditions that influence the distribution and persistence of riverine fish populations. Fish in riverscapes can exhibit extensive movements, require seasonally-distinct habitats for spawnin...

  11. Analytical Studies of the Relationship Between Track Geometry Variations and Derailment Potential at Low Speeds

    DOT National Transportation Integrated Search

    1983-09-01

    This report describes analytical studies carried out to define the relationship between track parameters and safety from derailment. Problematic track scenarios are identified reflecting known accident data. Vehicle response is investigated in the 10...

  12. [Study on strategies of pollution prevention in coastal city of Zhejiang Province based on scenario analysis].

    PubMed

    Tian, Jin-Ping; Chen, Lü-Jun; Du, Peng-Fei; Qian, Yi

    2013-01-01

    Scenario analysis was used to study the environmental burden in a coastal city of Zhejiang province under different patterns of economic development. The aim of this research is to offer advice for decision-making by illustrating how emissions can be reduced by transforming the pattern of economic development in a developed coastal area that has already reached a GDP of 70,000 yuan per capita. First, 18 heavy-pollution industries were screened out by reference to total emissions of chemical oxygen demand, ammonia-nitrogen, sulfur dioxide, and nitrogen oxide. Then, a scenario analysis model and its back-up calculation program were designed to study the sustainable development of the heavy-pollution industries. With 2008 and 2015 as the reference year and the target year, respectively, emissions of the four pollutants mentioned above from the 18 heavy-pollution industries in the city were analyzed under six scenarios. The total emissions of the four pollutants should be reduced to an expected degree, which is set as the constraint of the scenario analysis. Finally, some suggestions for decision-making are put forward, which include maintaining a moderate GDP growth rate of around 7%, strengthening the adjustment of the economic structure, controlling the growth rate of industrial added value in the heavy-pollution industries, optimizing the structure of the heavy-pollution industries, decreasing the intensity of waste emission by implementing cleaner production to reduce emissions at the source, and strengthening regulation of the operation of waste treatment plants to further improve the efficiency of waste treatment. Only by implementing such measures can the total emissions of chemical oxygen demand, ammonia-nitrogen, sulfur dioxide, and nitrogen oxide of the 18 heavy-pollution industries in the city be reduced by 10%, 10%, 5%, and 15%, respectively, relative to the reference year.
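The back-up calculation idea, holding emissions to a target while GDP grows, reduces to requiring the emission intensity of output (emissions per unit GDP) to fall faster than GDP rises. A sketch under assumed figures (7% annual GDP growth, 2008 to 2015, a 10% total emission cut); the study's actual calculation program is not reproduced here.

```python
# With GDP growing at rate g for t years, total emissions are
# E = E_base * (1+g)**t * (1-r)**t, where r is the annual rate of
# decline in emission intensity. Solve for the r that delivers a
# given total emission cut: E_target = E_base * (1 - cut).
def required_intensity_drop(gdp_growth, years, emission_cut):
    return 1 - ((1 - emission_cut) / (1 + gdp_growth) ** years) ** (1 / years)

# Assumed figures: 7% growth, 2008->2015 (7 years), 10% COD cut
r = required_intensity_drop(0.07, 7, 0.10)  # ≈ 0.079, i.e. ~8%/year
```

Under these assumptions the intensity of emissions per unit GDP must fall by roughly 8% per year, which is why the abstract couples the emission targets to structural adjustment and cleaner production rather than growth alone.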

  13. From theoretical fixed return period events to real flooding impacts: a new approach to set flooding scenarios, thresholds and alerts

    NASA Astrophysics Data System (ADS)

    Parravicini, Paola; Cislaghi, Matteo; Condemi, Leonardo

    2017-04-01

    ARPA Lombardia is the Environmental Protection Agency of Lombardy, a large region in the north of Italy. ARPA is in charge of river monitoring for both Civil Protection and water balance purposes. It cooperates with the Civil Protection Agency of Lombardy (RL-PC) in flood forecasting and early warning. The early warning system is based on rainfall and discharge thresholds: when a threshold exceedance is expected, RL-PC disseminates an alert from yellow to red. Conventional threshold evaluation is based on events with a fixed return period. However, the impacts of events with the same return period may differ along the river course due to the specific characteristics of the affected areas. A new approach is introduced. It defines different scenarios, corresponding to different flood impacts. A discharge threshold is then associated with each scenario, and the return period of the scenario is computed backwards. Flood scenarios are defined in accordance with National Civil Protection guidelines, which describe the expected flood impact and associate a colour with each scenario, from green (no relevant effects) to red (major floods). A range of discharges is associated with each scenario, since they cause the same flood impact; the threshold is set as the discharge corresponding to the transition between two scenarios. A wide range of event-based information is used to estimate the thresholds. As a first guess, the thresholds are estimated from hydraulic model outputs and the people or infrastructure flooded according to the simulations. The model estimates are then validated against real event knowledge: local Civil Protection Emergency Plans usually contain very detailed local impact descriptions at known river levels or discharges, RL-PC collects flooding information notified by the population, newspapers often report flood events on the web, and data from the river monitoring network provide evaluations of actually observed levels and discharges.
    The methodology assigns a return period to each scenario. The return period may vary along the river course according to the discharges associated with the scenario. The values of the return period can reveal the areas characterized by higher risk and can be an important basis for civil protection emergency planning and river monitoring. For example, considering the Lambro River, the red scenario (major flood) shows a return period of 50 years in the northern rural part of the catchment. When the river crosses the city of Milan, the return period drops to 4 years. Afterwards it rises to more than 100 years where the river flows through the agricultural areas in the southern part of the catchment. In addition, the knowledge gained through event-based analysis allows evaluation of the compliance of the monitoring network with early warning requirements and represents the starting point for further development of the network itself.
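Computing a return period "backwards" from a scenario's discharge threshold can be illustrated with a simple plotting-position estimate over a record of annual maximum discharges; the synthetic record below is an assumption for illustration, not ARPA Lombardia's data or its exact estimator.

```python
# Estimate the return period of a discharge threshold from annual
# maxima using a Weibull-style plotting position:
# T = (n + 1) / (number of annual maxima >= threshold).
def return_period(annual_maxima, threshold):
    exceed = sum(1 for q in annual_maxima if q >= threshold)
    if exceed == 0:
        return float("inf")  # never observed within the record
    return (len(annual_maxima) + 1) / exceed

# Hypothetical 49-year record (m^3/s) and a red-scenario threshold
maxima = list(range(100, 590, 10))  # 49 synthetic annual maxima
T_red = return_period(maxima, 580)  # → 50.0 years
```

Applying the same threshold-to-return-period mapping reach by reach is what lets the return period of a given impact scenario vary along the river course.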

  14. Budgetary Impact of Telotristat Ethyl, a Novel Treatment for Patients with Carcinoid Syndrome Diarrhea: A US Health Plan Perspective.

    PubMed

    Joish, Vijay N; Frech, Feride; Lapuerta, Pablo

    2017-12-01

    Telotristat ethyl (TE) was recently approved for carcinoid syndrome diarrhea (CSD) in patients not adequately controlled with somatostatin analog long-acting release (SSA LAR) therapy alone. A budget impact model was developed to determine the short-term affordability of reimbursing TE in a US health plan. A budget impact model compared health care costs when CSD is managed per current treatment patterns (SSA LAR, reference drug scenario) versus when TE is incorporated in the treatment algorithm (SSA LAR + TE, new drug scenario). Prevalence of CSD, proportion of patients not adequately controlled on SSA LAR, monthly treatment costs (pharmacy and medical), and treatment efficacy were derived from the literature. In the reference drug scenario, an escalated monthly dose of SSA LAR therapy of 40 mg was assumed to treat patients with CSD not adequately controlled on the labeled dose of SSA LAR. In the new drug scenario, TE was added to the maximum labeled monthly dose of SSA LAR therapy of 30 mg. The incremental budget impact was calculated based on an assumed TE market uptake of 28%, 42%, and 55% during Years 1, 2, and 3, respectively. One-way sensitivity analyses were conducted to test model assumptions. A hypothetical health plan of 1 million members was estimated to have 42 prevalent CSD patients of whom 17 would be inadequately controlled on SSA LAR therapy. The monthly medical cost per patient not adequately controlled on SSA LAR in addition to pharmacotherapy was estimated to be $3946 based on the literature. Based on the observed treatment response in a clinical trial of 20% and 44% for the base case reference and new drug scenarios, total per patient per month costs were estimated to be $7563 and $11,205, respectively. Total annual costs in the new drug scenario were estimated to be $2.3 to $2.5 million during the first 3 years. The overall incremental annual costs were estimated to be $154,000 in Year 1, $231,000 in Year 2, and $302,000 in Year 3. 
This translated to an incremental per patient per month cost of $0.013, $0.019, and $0.025 for Years 1, 2, and 3, respectively. These results remained robust in one-way sensitivity analyses. The availability of TE for patients not adequately controlled on SSA LAR therapy provides a novel treatment option for CSD. This model showed that providing access to this first-in-class oral agent would have a minimal budget impact on a US health plan. Copyright © 2017 Elsevier HS Journals, Inc. All rights reserved.
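The per-member-per-month figures follow directly from spreading the reported incremental annual costs across a 1-million-member plan over 12 months; a minimal sketch of that arithmetic:

```python
def per_member_per_month(incremental_annual_cost, members=1_000_000):
    """Spread an incremental annual budget impact across all plan
    members and all 12 months of the year."""
    return incremental_annual_cost / (members * 12)

# Incremental annual costs reported for Years 1-3 of the new drug scenario
for annual in (154_000, 231_000, 302_000):
    print(round(per_member_per_month(annual), 3))
```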

  15. Initial Development of C.A.T.E.S.: A Simulation-Based Competency Assessment Instrument for Neonatal Nurse Practitioners.

    PubMed

    Cates, Leigh Ann; Bishop, Sheryl; Armentrout, Debra; Verklan, Terese; Arnold, Jennifer; Doughty, Cara

    2015-01-01

    Objective: determine the content validity of global statements and operational definitions, and choose scenarios for Competency, Assessment, Technology, Education, and Simulation (C.A.T.E.S.), an instrument in development to evaluate the multidimensional competency of neonatal nurse practitioners (NNPs). Design: a real-time Delphi (RTD) method pursuing four specific aims (SAs): (1) identify which cognitive, technical, or behavioral dimension of NNP competency accurately reflects each global statement; (2) map the global statements to the National Association of Neonatal Nurse Practitioners (NANNP) core competency domains; (3) define operational definitions for the novice-to-expert performance subscales; and (4) determine the essential scenarios for assessing NNPs. Participants: 25 NNPs and nurses with competency and simulation experience. Main outcome variables: 100% of global statements were correct for competency dimension, and all but two were correct for NANNP domain; 100% of the novice-to-expert operational definitions and eight scenarios were chosen. Content validity was determined for the global statements and the novice-to-expert definitions, and the essential scenarios were chosen.

  16. Examination of a Method to Determine the Reference Region for Calculating the Specific Binding Ratio in Dopamine Transporter Imaging.

    PubMed

    Watanabe, Ayumi; Inoue, Yusuke; Asano, Yuji; Kikuchi, Kei; Miyatake, Hiroki; Tokushige, Takanobu

    2017-01-01

    The specific binding ratio (SBR), first reported by Tossici-Bolt et al., is a quantitative indicator for dopamine transporter (DAT) imaging. It is defined as the ratio of the specific binding concentration in the striatum to the non-specific binding concentration in the whole brain other than the striatum. The non-specific binding concentration is calculated from a region of interest (ROI) set 20 mm inside the outer brain contour, which is defined by a threshold technique. Tossici-Bolt et al. used a 50% threshold, but with a 50% threshold we could not always define the ROI of the non-specific binding concentration (the reference region) and calculate the SBR appropriately. We therefore sought a new method for determining the reference region when calculating the SBR. Using data from 20 patients who had undergone DAT imaging in our hospital, we calculated the non-specific binding concentration by two methods: fixing the threshold that defines the reference region at specific values (the fixing method), and visually optimizing the reference region at every examination (the visual optimization method). We first assessed the reference region of each method visually, and then quantitatively compared the SBR calculated by each method. In the visual assessment, the scores of the fixing method at 30% and of the visual optimization method were higher than those of the fixing method at other values, with or without scatter correction. For the quantitative assessment, the SBR obtained by visual optimization of the reference region, based on the consensus of three radiological technologists, was used as the baseline (the standard method). The SBR values showed good agreement between the standard method and both the fixing method at 30% and the visual optimization method, with or without scatter correction. The fixing method at 30% and the visual optimization method were therefore equally suitable for determining the reference region.
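A minimal sketch of the SBR definition given above, assuming the specific striatal concentration is the measured striatal concentration minus the non-specific (reference) concentration; the counts are made up, and the volume-based corrections of the full Tossici-Bolt method are omitted.

```python
def specific_binding_ratio(c_striatum, c_reference):
    """SBR = specific binding / non-specific binding, where specific
    binding is the striatal concentration minus the non-specific
    (reference-region) concentration.  Volume corrections of the full
    Tossici-Bolt method are omitted in this sketch."""
    return (c_striatum - c_reference) / c_reference

# Hypothetical concentrations (counts/voxel)
print(specific_binding_ratio(30.0, 6.0))
```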

  17. Rape shield laws and sexual behavior evidence: effects of consent level and women's sexual history on rape allegations.

    PubMed

    Flowe, Heather D; Ebbesen, Ebbe B; Putcha-Bhagavatula, Anila

    2007-04-01

    Rape shield laws, which limit the introduction of sexual history evidence in rape trials, challenge the view that women with extensive sexual histories more frequently fabricate charges of rape than other women. The present study examined the relationship between women's actual sexual history and their reporting rape in hypothetical scenarios. Female participants (college students and a community sample, which included women working as prostitutes and topless dancers, and women living in a drug and alcohol rehabilitation center) imagined themselves in dating scenarios that described either a legally definable act of rape or consensual sexual intercourse. Additionally, within the rape scenarios, level of consensual intimate contact (i.e., foreplay) preceding rape was examined to determine its influence on rape reporting. Women were less likely to say that they would take legal action in response to the rape scenarios if they had extensive sexual histories, or if they had consented to an extensive amount of intimate contact before the rape. In response to the consensual sexual intercourse scenarios, women with more extensive sexual histories were not more likely to say that they would report rape, even when the scenario provided them with a motive for seeking revenge against their dating partner.

  18. An inverse approach to perturb historical rainfall data for scenario-neutral climate impact studies

    NASA Astrophysics Data System (ADS)

    Guo, Danlu; Westra, Seth; Maier, Holger R.

    2018-01-01

    Scenario-neutral approaches are being used increasingly for climate impact assessments, as they allow water resource system performance to be evaluated independently of climate change projections. An important element of these approaches is the generation of perturbed series of hydrometeorological variables that form the inputs to hydrologic and water resource assessment models, with most scenario-neutral studies to-date considering only shifts in the average and a limited number of other statistics of each climate variable. In this study, a stochastic generation approach is used to perturb not only the average of the relevant hydrometeorological variables, but also attributes such as the intermittency and extremes. An optimization-based inverse approach is developed to obtain hydrometeorological time series with uniform coverage across the possible ranges of rainfall attributes (referred to as the 'exposure space'). The approach is demonstrated on a widely used rainfall generator, WGEN, for a case study at Adelaide, Australia, and is shown to be capable of producing evenly-distributed samples over the exposure space. The inverse approach expands the applicability of the scenario-neutral approach in evaluating a water resource system's sensitivity to a wider range of plausible climate change scenarios.
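The inverse idea, searching a generator's parameters until the simulated series hits a target attribute, can be illustrated with a toy generator. `toy_rain_generator` below is a hypothetical stand-in for a weather generator such as WGEN, and the bisection targets only the daily mean; the paper's approach covers many attributes (intermittency, extremes) simultaneously via optimization.

```python
import random

def toy_rain_generator(wet_prob, mean_wet_amount, n=10_000, seed=1):
    """Hypothetical stand-in for a stochastic rainfall generator:
    each day is wet with probability wet_prob, and wet-day depth is
    exponentially distributed with the given mean."""
    rng = random.Random(seed)
    return [rng.expovariate(1.0 / mean_wet_amount) if rng.random() < wet_prob
            else 0.0
            for _ in range(n)]

def invert_for_target_mean(target_mean, wet_prob=0.3, lo=0.1, hi=50.0, iters=40):
    """Bisection on the wet-day mean so that the simulated long-run
    daily mean matches target_mean -- the 'inverse' step of mapping a
    desired exposure-space point back to generator parameters."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        series = toy_rain_generator(wet_prob, mid)
        if sum(series) / len(series) < target_mean:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

param = invert_for_target_mean(3.0)          # target daily mean of 3.0 mm
series = toy_rain_generator(0.3, param)      # perturbed series at that point
```

Repeating this search over a grid of target attribute values would give the evenly distributed coverage of the exposure space that the paper aims for.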

  19. Potential for reducing air-pollutants while achieving 2 °C global temperature change limit target.

    PubMed

    Hanaoka, Tatsuya; Akashi, Osamu; Fujiwara, Kazuya; Motoki, Yuko; Hibino, Go

    2014-12-01

    This study analyzes the potential to reduce air pollutants while achieving the 2 °C global temperature change limit target above pre-industrial levels, using the bottom-up optimization model AIM/Enduse[Global]. The study focuses on: 1) estimating mitigation potentials and costs for achieving the 2 °C, 2.5 °C, and 3 °C target scenarios; 2) assessing the co-benefits of reducing air pollutants such as NOx, SO2, BC, and PM; and 3) analyzing the features of sectoral attributions in Annex I and Non-Annex I groups of countries. A carbon tax scenario at 50 US$/tCO2-eq in 2050 can reduce GHG emissions more than the 3 °C target scenario, but a higher carbon price of around 400 US$/tCO2-eq in 2050 is required to achieve the 2 °C target scenario. There is, however, also a co-benefit of large reduction potential for air pollutants, in the range of 60-80% reductions in 2050 from the reference scenario, while achieving the 2 °C target. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. NASA CEV Reference Entry GN&C System and Analysis

    NASA Technical Reports Server (NTRS)

    Munday, S.; Madsen, C.; Broome, J.; Gay, R.; Tigges, M.; Strahan, A.

    2007-01-01

    As part of its overall objectives, the Orion spacecraft will be required to perform entry and Earth landing functions for Low Earth Orbit (LEO) and Lunar missions. Both of these entry scenarios will begin with separation of the Service Module (SM), making them unique from other Orion mission phases in that only the Command Module (CM) portion of the Crew Exploration Vehicle (CEV) will be involved, requiring a CM specific Guidance, Navigation and Control (GN&C) system. Also common to these mission scenarios will be the need for GN&C to safely return crew (or cargo) to Earth within the dynamic thermal and structural constraints of entry and within acceptable accelerations on the crew, utilizing the limited aerodynamic performance of the CM capsule. The lunar return mission could additionally require an initial atmospheric entry designed to support a precision skip and second entry, all to maximize downrange performance and ensure landing in the United States. This paper describes the Entry GN&C reference design, developed by the NASA-led team, that supports these entry scenarios and that was used to validate the Orion System requirements. Description of the reference design will include an overview of the GN&C functions, avionics, and effectors and will relate these to the specific design drivers of the entry scenarios, as well as the desire for commonality in vehicle systems to support the different missions. The discussion will also include the requirement for an Emergency Entry capability beyond that of the nominal performance of the multi-string GN&C system, intended to return the crew to Earth in a survivable but unguided manner. Finally, various analyses will be discussed, including those completed to support validation efforts of the current CEV requirements, along with those on-going and planned with the intention to further refine the requirements and to support design development work in conjunction with the prime contractor. 
Some of these ongoing analyses will include work to size effectors (jets) and fuel budgets, to refine skip entry concepts, to characterize navigation performance and uncertainties, to provide for SM disposal offshore and to identify requirements to support target site selection.

  1. Influences of removing linear and nonlinear trends from climatic variables on temporal variations of annual reference crop evapotranspiration in Xinjiang, China.

    PubMed

    Li, Yi; Yao, Ning; Chau, Henry Wai

    2017-08-15

    Reference crop evapotranspiration (ETo) is a key parameter in field irrigation scheduling, drought assessment and climate change research. ETo uses prescribed (fixed, or reference) land surface parameters for crops. The linear and nonlinear trends in different climatic variables (CVs) affect the change in ETo. This research aims to reveal how ETo responds after the related CVs are linearly and nonlinearly detrended, over 1961-2013 in Xinjiang, China. The ETo-related CVs included minimum (Tmin), average (Tave) and maximum (Tmax) air temperatures, wind speed at 2 m (U2), relative humidity (RH) and sunshine hours (n). ETo was calculated using the Penman-Monteith equation. A total of 29 ETo scenarios were generated: the original scenario, 14 scenarios in Group I (ETo recalculated after removing linear trends from one or more CVs) and 14 scenarios in Group II (ETo recalculated after removing nonlinear trends from the CVs). The influence of U2 on ETo was stronger than that of the other CVs for both Groups I and II, whether in northern Xinjiang, southern Xinjiang or the region as a whole. The weak influences of increased Tmin, Tave and Tmax on increasing ETo were masked by the strong effects of decreased U2 and n and increased RH on decreasing ETo. The effects of the trends in the CVs, especially U2, on changing ETo were clearly shown: without the general decrease in U2, ETo would have increased over the past 53 years. Because of the non-monotone variations of the CVs and ETo, the results of nonlinearly detrending the CVs in Group II should be more plausible than those of linearly detrending them in Group I. The decreasing ETo led to a general relief in drought, as indicated by the recalculated aridity index. There would therefore be a slightly lower risk for water utilization in Xinjiang, China. Copyright © 2017 Elsevier B.V. All rights reserved.
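The Group I detrending step, removing a least-squares linear trend from a climatic variable while preserving its mean, can be sketched as follows; the 53-year wind-speed series is synthetic, not the Xinjiang data.

```python
def remove_linear_trend(values):
    """Subtract the least-squares linear trend from a series while
    preserving its mean (the trend is removed about the series
    midpoint, so the mean is unchanged)."""
    n = len(values)
    t_mean = (n - 1) / 2
    v_mean = sum(values) / n
    slope = (sum((i - t_mean) * (v - v_mean) for i, v in enumerate(values))
             / sum((i - t_mean) ** 2 for i in range(n)))
    return [v - slope * (i - t_mean) for i, v in enumerate(values)]

# Synthetic declining wind-speed series (m/s) with alternating noise
u2 = [4.0 - 0.02 * yr + 0.1 * ((-1) ** yr) for yr in range(53)]
detrended = remove_linear_trend(u2)
```

Recomputing ETo with `detrended` in place of the original series, while keeping the other CVs unchanged, corresponds to one of the 14 Group I scenarios.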

  2. Neutrino Factory Plans at CERN

    NASA Astrophysics Data System (ADS)

    Riche, J. A.

    2002-10-01

    The considerable interest raised by the discovery of neutrino oscillations and recent progress in studies of muon colliders have triggered interest in considering a neutrino factory at CERN. This paper explains the reference scenario, indicates the other possible choices and mentions the R&D that is foreseen.

  3. San Pedro River Basin Data Browser (http://fws-case-12.nmsu.edu/SanPedro/)

    EPA Science Inventory

    Acquisition of primary spatial data and database development are initial features of any type of landscape assessment project. They provide contemporary land cover and the ancillary datasets necessary to establish reference condition and develop alternative future scenarios that ...

  4. A view to the future of natural gas and electricity: An integrated modeling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Wesley J.; Medlock, Kenneth B.; Jani, Aditya

    This paper demonstrates the value of integrating two highly spatially resolved models: the Rice World Gas Trade Model (RWGTM) of the natural gas sector and the Regional Energy Deployment System (ReEDS) model of the U.S. electricity sector. The RWGTM passes electricity-sector natural gas prices to the ReEDS model, while the ReEDS model returns electricity-sector natural gas demand to the RWGTM. The two models successfully converge to a solution under reference scenario conditions. We present electricity-sector and natural gas sector evolution using the integrated models for this reference scenario. This paper demonstrates that the integrated models produced similar national-level results as when running in a stand-alone form, but that regional and state-level results can vary considerably. As we highlight, these regional differences have potentially significant implications for electric sector planners, especially in the wake of substantive policy changes for the sector (e.g., the Clean Power Plan).

  5. A view to the future of natural gas and electricity: An integrated modeling approach

    DOE PAGES

    Cole, Wesley J.; Medlock, Kenneth B.; Jani, Aditya

    2016-03-17

    This paper demonstrates the value of integrating two highly spatially resolved models: the Rice World Gas Trade Model (RWGTM) of the natural gas sector and the Regional Energy Deployment System (ReEDS) model of the U.S. electricity sector. The RWGTM passes electricity-sector natural gas prices to the ReEDS model, while the ReEDS model returns electricity-sector natural gas demand to the RWGTM. The two models successfully converge to a solution under reference scenario conditions. We present electricity-sector and natural gas sector evolution using the integrated models for this reference scenario. This paper demonstrates that the integrated models produced similar national-level results as when running in a stand-alone form, but that regional and state-level results can vary considerably. As we highlight, these regional differences have potentially significant implications for electric sector planners, especially in the wake of substantive policy changes for the sector (e.g., the Clean Power Plan).

  6. The potential of clustering methods to define intersection test scenarios: Assessing real-life performance of AEB.

    PubMed

    Sander, Ulrich; Lubbe, Nils

    2018-04-01

    Intersection accidents are frequent and harmful. The accident types 'straight crossing path' (SCP), 'left turn across path - oncoming direction' (LTAP/OD), and 'left turn across path - lateral direction' (LTAP/LD) represent around 95% of all intersection accidents and one-third of all police-reported car-to-car accidents in Germany. The European New Car Assessment Programme (Euro NCAP) has announced that intersection scenarios will be included in its rating from 2020; however, how these scenarios are to be tested has not been defined. This study investigates whether clustering methods can be used to identify a small number of test scenarios sufficiently representative of the accident dataset to evaluate Intersection Automated Emergency Braking (AEB). Data from the German In-Depth Accident Study (GIDAS) and the GIDAS-based Pre-Crash Matrix (PCM) from 1999 to 2016, containing 784 SCP and 453 LTAP/OD accidents, were analyzed with principal component methods to identify variables that account for the relevant total variance of the sample. Three different clustering methods were applied to each accident type: two similarity-based approaches, namely Hierarchical Clustering (HC) and Partitioning Around Medoids (PAM), and the probability-based Latent Class Clustering (LCC). The optimal number of clusters was derived for HC and PAM with the silhouette method. The PAM algorithm was initiated both with randomly selected start medoids and with medoids from HC. For LCC, the Bayesian Information Criterion (BIC) was used to determine the optimal number of clusters. Test scenarios were defined from the optimal cluster medoids, weighted by their real-life representation in GIDAS. The set of clustering variables was further varied to investigate the influence of variable type and character. We quantified how accurately each cluster variation represents real-life AEB performance using pre-crash simulations with PCM data and a generic algorithm for AEB intervention. 
The use of different sets of clustering variables resulted in substantially different numbers of clusters. The stability of the resulting clusters increased when categorical variables were prioritized over continuous ones. For each set of cluster variables, there was strong in-cluster variance of avoided versus non-avoided accidents for the specified Intersection AEB, and the medoids did not predict the most common Intersection AEB behavior in each cluster. Despite thorough analysis using various cluster methods and variable sets, it was impossible to reduce the diversity of intersection accidents to a set of test scenarios without compromising the ability to predict real-life performance of Intersection AEB. Although this does not imply that other methods cannot succeed, it was observed that small changes in the definition of a scenario resulted in a different avoidance outcome. We therefore suggest using limited physical testing to validate more extensive virtual simulations when evaluating vehicle safety. Copyright © 2018 Elsevier Ltd. All rights reserved.
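A minimal sketch of the PAM idea on synthetic 2D points: medoids are actual data points, refined by greedy swaps that reduce the total distance to the nearest medoid. The build phase of full PAM, the silhouette-based choice of k, and the GIDAS accident variables are all omitted; the points are made up.

```python
import math

def total_cost(points, medoids):
    """Sum of distances from each point to its nearest medoid."""
    return sum(min(math.dist(p, m) for m in medoids) for p in points)

def pam(points, k):
    """Minimal Partitioning Around Medoids: swap phase only.
    Start from the first k points, then greedily swap a medoid for a
    non-medoid whenever that lowers the total cost."""
    medoids = list(points[:k])
    improved = True
    while improved:
        improved = False
        for i in range(k):
            for p in points:
                if p in medoids:
                    continue
                candidate = medoids[:i] + [p] + medoids[i + 1:]
                if total_cost(points, candidate) < total_cost(points, medoids):
                    medoids, improved = candidate, True
    return medoids

# Two artificial accident 'clusters' in a 2D feature space
pts = [(0, 0), (1, 0), (0, 1), (1, 1), (10, 10), (11, 10), (10, 11), (11, 11)]
meds = pam(pts, 2)
```

Because medoids are real observations rather than averages, each cluster medoid corresponds to an actual accident configuration, which is what makes it usable as a physical test scenario.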

  7. Rational design of gold nanoparticle toxicology assays: a question of exposure scenario, dose and experimental setup.

    PubMed

    Taylor, Ulrike; Rehbock, Christoph; Streich, Carmen; Rath, Detlef; Barcikowski, Stephan

    2014-09-01

    Many studies have evaluated the toxicity of gold nanoparticles, although reliable predictions based on these results are rare. In order to overcome this problem, this article highlights strategies to improve comparability and standardization of nanotoxicological studies. To this end, it is proposed that we should adapt the nanomaterial to the addressed exposure scenario, using ligand-free nanoparticle references in order to differentiate ligand effects from size effects. Furthermore, surface-weighted particle dosing referenced to the biologically relevant parameter (e.g., cell number or organ mass) is proposed as the gold standard. In addition, it is recommended that we should shift the focus of toxicological experiments from 'live-dead' assays to the assessment of cell function, as this strategy allows observation of bioresponses at lower doses that are more relevant for in vivo scenarios.

  8. Device for monitoring cell voltage

    DOEpatents

    Doepke, Matthias [Garbsen, DE; Eisermann, Henning [Edermissen, DE

    2012-08-21

    A device for monitoring a rechargeable battery having a number of electrically connected cells includes at least one current interruption switch for interrupting current flowing through at least one associated cell and a plurality of monitoring units for detecting cell voltage. Each monitoring unit is associated with a single cell and includes a reference voltage unit for producing a defined reference threshold voltage and a voltage comparison unit for comparing the reference threshold voltage with a partial cell voltage of the associated cell. The reference voltage unit is electrically supplied from the cell voltage of the associated cell. The voltage comparison unit is coupled to the at least one current interruption switch for interrupting the current of at least the current flowing through the associated cell, with a defined minimum difference between the reference threshold voltage and the partial cell voltage.
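One plausible reading of the comparison logic described in the abstract, sketched in code: the switch trips once the partial cell voltage deviates from the reference threshold by at least the defined minimum difference. The voltage values are made up, and the patent's actual circuit behavior may differ.

```python
def should_interrupt(partial_cell_voltage, reference_threshold, min_difference):
    """Trip the current-interruption switch when the partial cell
    voltage deviates from the reference threshold voltage by at least
    the defined minimum difference (one reading of the abstract)."""
    return abs(partial_cell_voltage - reference_threshold) >= min_difference

# Hypothetical values: 3.6 V threshold, 0.5 V minimum difference
print(should_interrupt(3.0, 3.6, 0.5))   # large deviation: interrupt
print(should_interrupt(3.5, 3.6, 0.5))   # within tolerance: keep running
```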

  9. [Pediatric reference intervals : retrospective study on thyroid hormone levels].

    PubMed

    Ladang, A; Vranken, L; Luyckx, F; Lebrethon, M-C; Cavalier, E

    2017-01-01

    Defining reference ranges is an essential diagnostic tool. The influence of age and sex on thyroid hormone levels has already been discussed. In this study, we define new pediatric reference ranges for TSH, FT3 and FT4 on the Cobas C6000 analyzer. To do so, we included outpatients aged 0 to 18 years. During the first year of life, thyroid hormone levels change dramatically before stabilizing at around 3 years of age. We also compared our results with those obtained in a large-scale prospective Canadian study (the CALIPER initiative).
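A common non-parametric way to derive such a reference interval is to take the central 95% of a healthy reference sample (2.5th to 97.5th percentile); the abstract does not specify the paper's actual statistical method, and the TSH values below are synthetic.

```python
def reference_interval(values, lower_pct=2.5, upper_pct=97.5):
    """Non-parametric central 95% reference interval: the 2.5th and
    97.5th percentiles of a reference sample, with linear
    interpolation between order statistics."""
    s = sorted(values)
    def pct(p):
        rank = (len(s) - 1) * p / 100
        lo = int(rank)
        frac = rank - lo
        hi = min(lo + 1, len(s) - 1)
        return s[lo] + frac * (s[hi] - s[lo])
    return pct(lower_pct), pct(upper_pct)

# Synthetic TSH values (mIU/L) for one hypothetical age bracket
tsh = [round(0.5 + 0.045 * i, 3) for i in range(101)]  # 0.5 .. 5.0
low, high = reference_interval(tsh)
```

In a pediatric setting this computation would be repeated per age bracket (and possibly per sex), since the first-year dynamics described above make a single interval inappropriate.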

  10. Representative Agricultural Pathways and Scenarios for Regional Integrated Assessment of Climate Change Impacts, Vulnerability, and Adaptation. 5; Chapter

    NASA Technical Reports Server (NTRS)

    Valdivia, Roberto O.; Antle, John M.; Rosenzweig, Cynthia; Ruane, Alexander C.; Vervoort, Joost; Ashfaq, Muhammad; Hathie, Ibrahima; Tui, Sabine Homann-Kee; Mulwa, Richard; Nhemachena, Charles

    2015-01-01

    The global change research community has recognized that new pathway and scenario concepts are needed to implement impact and vulnerability assessment where precise prediction is not possible, and also that these scenarios need to be logically consistent across local, regional, and global scales. For global climate models, representative concentration pathways (RCPs) have been developed that provide a range of time-series of atmospheric greenhouse-gas concentrations into the future. For impact and vulnerability assessment, new socio-economic pathway and scenario concepts have also been developed, with leadership from the Integrated Assessment Modeling Consortium (IAMC). This chapter presents concepts and methods for the development of regional representative agricultural pathways (RAPs) and scenarios that can be used for agricultural model intercomparison, improvement, and impact assessment in a manner consistent with the new global pathways and scenarios. The development of agriculture-specific pathways and scenarios is motivated by the need for a protocol-based approach to climate impact, vulnerability, and adaptation assessment. Until now, the various global and regional models used for agricultural-impact assessment have been implemented with individualized scenarios using various data and model structures, often without transparent documentation, public availability, and consistency across disciplines. These practices have reduced the credibility of assessments and hampered the advancement of the science through model intercomparison, improvement, and synthesis of model results across studies. The recognition of the need for better coordination among the agricultural modeling community, including the development of standard reference scenarios with adequate agriculture-specific detail, led to the creation of the Agricultural Model Intercomparison and Improvement Project (AgMIP) in 2010. 
The development of RAPs is one of the cross-cutting themes in AgMIP's work plan, and has been the subject of ongoing work by AgMIP since its creation.

  11. Cost effectiveness of ovarian reserve testing in in vitro fertilization: a Markov decision-analytic model.

    PubMed

    Moolenaar, Lobke M; Broekmans, Frank J M; van Disseldorp, Jeroen; Fauser, Bart C J M; Eijkemans, Marinus J C; Hompes, Peter G A; van der Veen, Fulco; Mol, Ben Willem J

    2011-10-01

    To compare the cost effectiveness of ovarian reserve testing in in vitro fertilization (IVF). A Markov decision model based on data from the literature and original patient data. Decision analytic framework. Computer-simulated cohort of subfertile women aged 20 to 45 years who are eligible for IVF. [1] No treatment, [2] up to three cycles of IVF limited to women under 41 years and no ovarian reserve testing, [3] up to three cycles of IVF with dose individualization of gonadotropins according to ovarian reserve, and [4] up to three cycles of IVF with ovarian reserve testing and exclusion of expected poor responders after the first cycle, with no treatment scenario as the reference scenario. Cumulative live birth over 1 year, total costs, and incremental cost-effectiveness ratios. The cumulative live birth was 9.0% in the no treatment scenario, 54.8% for scenario 2, 70.6% for scenario 3 and 51.9% for scenario 4. Absolute costs per woman for these scenarios were €0, €6,917, €6,678, and €5,892 for scenarios 1, 2, 3, and 4, respectively. Incremental cost-effectiveness ratios (ICER) for scenarios 2, 3, and 4 were €15,166, €10,837, and €13,743 per additional live birth. Sensitivity analysis showed the model to be robust over a wide range of values. Individualization of the follicle-stimulating hormone dose according to ovarian reserve is likely to be cost effective in women who are eligible for IVF, but this effectiveness needs to be confirmed in randomized clinical trials. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
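The incremental cost-effectiveness ratios can be reproduced, to within the rounding of the figures quoted in the abstract, from the reported costs per woman and cumulative live-birth rates, each compared against the no-treatment reference (€0, 9.0%):

```python
def icer(cost_new, cost_ref, effect_new, effect_ref):
    """Incremental cost-effectiveness ratio: extra cost per
    additional unit of effect (here, per additional live birth)."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Cost per woman (EUR) and cumulative live-birth rate for scenarios 2-4,
# versus the no-treatment reference scenario (EUR 0, 9.0%)
for cost, births in ((6917, 0.548), (6678, 0.706), (5892, 0.519)):
    print(round(icer(cost, 0, births, 0.090)))
```

The small differences from the published ICERs (€15,166, €10,837, €13,743) presumably reflect rounding of the costs and birth rates as quoted in the abstract.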

  12. Archaeogeophysical tests in water saturated and under water scenarios at the Hydrogeosite Laboratory

    NASA Astrophysics Data System (ADS)

    Capozzoli, Luigi; De Martino, Gregory; Giampaolo, Valeria; Perciante, Felice; Rizzo, Enzo

    2016-04-01

    The growing interest in underwater archaeology, witnessed by the numerous archaeological campaigns carried out in marine and lacustrine environments of the Mediterranean region, poses a challenge of great importance for archaeogeophysics. Through careful use of geophysical techniques it is possible to support archaeological research in identifying and analysing undiscovered cultural heritage lying under water near rivers and the sea. Over the past decades, geophysical methods have been applied successfully in archaeology: an integrated approach based on electric, electromagnetic and magnetic techniques has shown the ability to identify and reconstruct archaeological remains in the subsoil, making it possible to define their spatial distribution while limiting excavation activities. The capability of geophysics can, however, be limited by the low geophysical contrast between archaeological structures and the surrounding environment; in particular, problems of resolution, depth of investigation and sensitivity related to each adopted technique can result in a distorted reading of the subsurface and prevent the identification of archaeological remains. This problem is amplified when the geophysical approach is applied in very humid environments, such as lacustrine and marine scenarios, or in soils with high clay content that hinder the propagation of geophysical signals. In order to improve our geophysical knowledge of lacustrine and coastal scenarios, a complex and innovative research project was carried out at the CNR Hydrogeosite laboratory, where an archaeogeophysical experiment could be performed under controlled conditions. The designed archaeological context was focused on the Roman age, and various elements with different shapes and materials were placed at different depths in the subsoil. 
The preliminary project activities, with some scenarios, were presented last year; here we present the final results of the project, in which different scenarios were set up for GPR and ERT investigations. Several phases were performed: buried objects were covered by different thicknesses of sediments, and different soil water contents were defined. Geophysical measurements were also acquired on an underwater scenario. The 2D and 3D acquisitions allowed us to identify the limits and capabilities of the GPR and resistivity measurements.

  13. The effectiveness of an intensive care quick reference checklist manual--a randomized simulation-based trial.

    PubMed

    Just, Katja S; Hubrich, Svenja; Schmidtke, Daniel; Scheifes, Andrea; Gerbershagen, Mark U; Wappler, Frank; Grensemann, Joern

    2015-04-01

    We aimed to test the effectiveness of checklists for emergency procedures on medical staff performance in intensive care crises. This is a prospective single-center randomized trial in a high-fidelity simulation center modeling an intensive care unit (ICU) in a tertiary care hospital in Germany. Teams consisted of 1 ICU resident and 2 ICU nurses (in total, n = 48). All completed 4 crisis scenarios, in which they were randomized to use checklists or to perform without any aid. In 2 of the scenarios, checklists could be used immediately (type 1 scenarios); for the remaining ones, some further steps, for example, confirming the diagnosis, were required first (type 2 scenarios). Outcome measurements were the number of predefined items and the time to completion of more than 50% and more than 75% of steps, respectively. When using checklists, participants initiated items faster and more completely according to appropriate treatment guidelines (9 vs 7 items with and without checklists, P < .05). The benefit of checklists was greater in type 2 scenarios than in type 1 scenarios (2 vs 1 additional item, P < .05). In type 2 scenarios, the time to complete 50% and 75% of items was shorter with the use of checklists (P < .005). Use of checklists in ICU crises benefits the completion of critical treatment steps. Within the type 2 scenarios, items were fulfilled faster with checklists. The implementation of checklists for intensive care crises is a promising approach that may improve patient care. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Human Factors Design Of Automated Highway Systems: Scenario Definition

    DOT National Transportation Integrated Search

    1995-09-01

    Attention to driver acceptance and performance issues during system design will be key to the success of the Automated Highway System (AHS). A first step in the process of defining driver roles and driver-system interface requirements of AHS is the d...

  15. Inexperience and risky decisions of young adolescents, as pedestrians and cyclists, in interactions with lorries, and the effects of competency versus awareness education.

    PubMed

    Twisk, Divera; Vlakveld, Willem; Mesken, Jolieke; Shope, Jean T; Kok, Gerjo

    2013-06-01

    Road injuries are a prime cause of death in early adolescence. Often road safety education (RSE) is used to target risky road behaviour in this age group. These RSE programmes are frequently based on the assumption that deliberate risk taking rather than lack of competency underlies risk behaviour. This study tested the competency of 10-13 year olds, by examining their decisions - as pedestrians and cyclists - in dealing with blind spot areas around lorries. Also, the effects of an awareness programme and a competency programme on these decisions were evaluated. Table-top models were used, representing seven scenarios that differed in complexity: one basic scenario to test the identification of blind spot areas, and 6 traffic scenarios to test behaviour in traffic situations of low or high task complexity. Using a quasi-experimental design (pre-test and post-test reference group design without randomization), the programme effects were assessed by requiring participants (n=62) to show, for each table-top traffic scenario, how they would act if they were in that traffic situation. On the basic scenario, at pre-test 42% of the youngsters identified all blind spots correctly, but only 27% showed safe behaviour in simple scenarios and 5% in complex scenarios. The competency programme yielded improved performance on the basic scenario but not on the traffic scenarios, whereas the awareness programme did not result in any improvements. The correlation between improvements on the basic scenarios and the traffic scenarios was not significant. Young adolescents have not yet mastered the necessary skills for safe performance in simple and complex traffic situations, thus underlining the need for effective prevention programmes. RSE may improve the understanding of blind spot areas but this does not 'automatically' transfer to performance in traffic situations. Implications for the design of RSE are discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. How much can we save? Impact of different emission scenarios on future snow cover in the Alps

    NASA Astrophysics Data System (ADS)

    Marty, Christoph; Schlögl, Sebastian; Bavay, Mathias; Lehning, Michael

    2017-02-01

    This study assesses the future snow depth for two larger Alpine catchments. Automatic weather station data from two diverse regions in the Swiss Alps have been used as input for the Alpine3D surface process model to compute the snow cover at a 200 m horizontal resolution for the reference period (1999-2012). Future temperature and precipitation changes have been computed from 20 downscaled GCM-RCM chains for three different emission scenarios, including one intervention scenario (2 °C target), and for three future time periods (2020-2049, 2045-2074, 2070-2099). By applying simple daily change values to measured time series of temperature and precipitation, small-scale climate scenarios have been calculated for the median estimate and for extreme changes. The projections reveal a decrease in snow depth for all elevations, time periods and emission scenarios. The non-intervention scenarios demonstrate a decrease of about 50 % even for elevations above 3000 m. The elevation zone most affected by climate change is located below 1200 m, where the simulations show almost no snow towards the end of the century. Depending on the emission scenario and elevation zone, the winter season starts half a month to 1 month later and ends 1 to 3 months earlier in this last scenario period. The resulting snow cover changes may be roughly equivalent to an elevation shift of 500-800 m or 700-1000 m for the two non-intervention emission scenarios. At the end of the century the number of snow days may be more than halved at an elevation of around 1500 m, and only 0-2 snow days are predicted in the lowlands. The results for the intervention scenario reveal no differences for the first scenario period but clearly demonstrate a stabilization thereafter, with much smaller snow cover reductions towards the end of the century (ca. 30 % instead of 70 %).
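    The "delta change" step described in the abstract — applying simple daily change values to a measured station series to build a small-scale climate scenario — can be sketched as follows. All numeric values here are illustrative assumptions, not data from the study.

```python
def apply_delta_change(temps_c, precips_mm, delta_t_c, precip_factor):
    """Perturb a measured daily series: temperature changes are applied
    additively, precipitation changes multiplicatively."""
    future_t = [t + delta_t_c for t in temps_c]
    future_p = [p * precip_factor for p in precips_mm]
    return future_t, future_p

# Measured reference-period days (hypothetical values):
temps = [-5.0, -2.5, 0.5]     # daily mean temperature, deg C
precs = [3.0, 0.0, 1.2]       # daily precipitation, mm

# Median-estimate change signals for one scenario period (hypothetical):
ft, fp = apply_delta_change(temps, precs, delta_t_c=2.1, precip_factor=0.95)
```

    The perturbed series would then drive a snow model such as Alpine3D in place of the measured one.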

  17. Relations between water physico-chemistry and benthic algal communities in a northern Canadian watershed: defining reference conditions using multiple descriptors of community structure.

    PubMed

    Thomas, Kathryn E; Hall, Roland I; Scrimgeour, Garry J

    2015-09-01

    Defining reference conditions is central to identifying environmental effects of anthropogenic activities. Using a watershed approach, we quantified reference conditions for benthic algal communities and their relations to physico-chemical conditions in rivers in the South Nahanni River watershed, NWT, Canada, in 2008 and 2009. We also compared the ability of three descriptors that vary in analytical cost to define algal community structure, based on relative abundances of (i) all algal taxa, (ii) only diatom taxa, and (iii) photosynthetic pigments. Ordination analyses showed that variance in algal community structure was strongly related to gradients in environmental variables describing water physico-chemistry, stream habitats, and sub-watershed structure. Water physico-chemistry and local watershed-scale descriptors differed significantly between algal communities from sites in the Selwyn Mountain ecoregion and sites in the Nahanni-Hyland ecoregions. Distinct differences in algal community types between ecoregions were apparent irrespective of whether algal community structure was defined using all algal taxa, diatom taxa, or photosynthetic pigments. Two algal community types were highly predictable using environmental variables, a core consideration in the development of Reference Condition Approach (RCA) models. These results suggest that assessments of environmental impacts could be completed using RCA models for each ecoregion. We suggest that the use of algal pigments, a high-throughput analysis, is a promising alternative to more labor-intensive and costly taxonomic approaches for defining algal community structure.

  18. The Scenario Model Intercomparison Project (ScenarioMIP) for CMIP6

    DOE PAGES

    O'Neill, Brian C.; Tebaldi, Claudia; van Vuuren, Detlef P.; ...

    2016-09-28

    Projections of future climate change play a fundamental role in improving understanding of the climate system as well as characterizing societal risks and response options. The Scenario Model Intercomparison Project (ScenarioMIP) is the primary activity within Phase 6 of the Coupled Model Intercomparison Project (CMIP6) that will provide multi-model climate projections based on alternative scenarios of future emissions and land use changes produced with integrated assessment models. Here, we describe ScenarioMIP's objectives, experimental design, and its relation to other activities within CMIP6. The ScenarioMIP design is one component of a larger scenario process that aims to facilitate a wide range of integrated studies across the climate science, integrated assessment modeling, and impacts, adaptation, and vulnerability communities, and will form an important part of the evidence base in the forthcoming Intergovernmental Panel on Climate Change (IPCC) assessments. Furthermore, it will provide the basis for investigating a number of targeted science and policy questions that are especially relevant to scenario-based analysis, including the role of specific forcings such as land use and aerosols, the effect of a peak and decline in forcing, the consequences of scenarios that limit warming to below 2°C, the relative contributions to uncertainty from scenarios, climate models, and internal variability, and long-term climate system outcomes beyond the 21st century. In order to serve this wide range of scientific communities and address these questions, a design has been identified consisting of eight alternative 21st century scenarios plus one large initial condition ensemble and a set of long-term extensions, divided into two tiers defined by relative priority. Some of these scenarios will also provide a basis for variants planned to be run in other CMIP6-Endorsed MIPs to investigate questions related to specific forcings. 
Harmonized, spatially explicit emissions and land use scenarios generated with integrated assessment models will be provided to participating climate modeling groups by late 2016, with the climate model simulations run within the 2017–2018 time frame, and output from the climate model projections made available and analyses performed over the 2018–2020 period.

  19. Proposal of global flood vulnerability scenarios for evaluating future potential flood losses

    NASA Astrophysics Data System (ADS)

    Kinoshita, Y.; Tanoue, M.; Watanabe, S.; Hirabayashi, Y.

    2015-12-01

    Flooding is one of the most hazardous and damaging natural disasters, causing serious economic loss and casualties across the world (Jongman et al., 2015). Previous studies showed that the global temperature increase affects regional weather patterns, and several general circulation model (GCM) simulations suggest an increase of flood events in both frequency and magnitude in many parts of the world (Hirabayashi et al., 2013). Effective adaptation to potential flood risks under the warming climate requires an in-depth understanding of both the physical and socioeconomic contributors to flood risk. To assess realistic future potential flood risk, sophisticated future vulnerability scenarios associated with the shared socioeconomic pathways (SSPs) are necessary. In this study we propose new future vulnerability scenarios for mortality. Our vulnerability scenarios are constructed from modeled flood exposure (the population potentially affected by flooding) and past records from 1980 to 2005. All flood fatality data were classified according to four income levels (high, mid-high, mid-low and low). Our proposed scenarios have three pathways corresponding to the SSPs: a High efficiency (HE) scenario (SSP1, SSP4 (rich countries) and SSP5), a Medium efficiency (ME) scenario (SSP2), and a Low efficiency (LE) scenario (SSP3 and SSP4 (poor countries)). The maximum mortality protection level in each category was detected by applying exponential curve fitting with an offset term. Slopes in the HE scenario are assumed to equal the slopes estimated by regression analysis in each category; the HE slope is defined by the mean value of all countries' slopes, approximately a -0.33 mortality decrease per year. The EM-DAT mortality data show a decreasing trend over time in almost all countries. Although mortalities in some countries show an increasing trend, this is because these countries were affected by once-in-a-hundred-years floods after the 1990s. 
The slope in the ME scenario is half of that in the HE scenario, and a quarter in the LE scenario. In addition, we set three categories depending on mortality level. Our proposed vulnerability scenarios would enable us to reasonably replicate self-sustained vulnerability change against flood hazards associated with the SSPs.
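    The curve fitting mentioned above, an exponential with an offset term m(t) = a*exp(-b*t) + c, can be sketched in closed form for noiseless, equally spaced samples; the offset c plays the role of a protection floor that mortality approaches. Parameter values below are illustrative assumptions, not numbers from the study.

```python
import math

def fit_exponential_with_offset(y0, y1, y2, dt):
    """Recover (a, b, c) from three equally spaced samples of
    y(t) = a*exp(-b*t) + c taken at t = 0, dt, 2*dt."""
    # The offset term drops out of this three-point combination:
    c = (y0 * y2 - y1 * y1) / (y0 + y2 - 2 * y1)
    r = (y1 - c) / (y0 - c)          # equals exp(-b*dt)
    b = -math.log(r) / dt
    a = y0 - c
    return a, b, c

a, b, c = 0.8, 0.12, 0.05            # hypothetical mortality parameters
samples = [a * math.exp(-b * t) + c for t in (0.0, 5.0, 10.0)]
a_est, b_est, c_est = fit_exponential_with_offset(*samples, dt=5.0)
```

    Real fatality data would of course be noisy, so in practice a least-squares fit over all years replaces this three-point identity.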

  20. The Scenario Model Intercomparison Project (ScenarioMIP) for CMIP6

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Neill, Brian C.; Tebaldi, Claudia; van Vuuren, Detlef P.

    2016-01-01

    Projections of future climate change play a fundamental role in improving understanding of the climate system as well as characterizing societal risks and response options. The Scenario Model Intercomparison Project (ScenarioMIP) is the primary activity within Phase 6 of the Coupled Model Intercomparison Project (CMIP6) that will provide multi-model climate projections based on alternative scenarios of future emissions and land use changes produced with integrated assessment models. In this paper, we describe ScenarioMIP's objectives, experimental design, and its relation to other activities within CMIP6. The ScenarioMIP design is one component of a larger scenario process that aims to facilitate a wide range of integrated studies across the climate science, integrated assessment modeling, and impacts, adaptation, and vulnerability communities, and will form an important part of the evidence base in the forthcoming Intergovernmental Panel on Climate Change (IPCC) assessments. At the same time, it will provide the basis for investigating a number of targeted science and policy questions that are especially relevant to scenario-based analysis, including the role of specific forcings such as land use and aerosols, the effect of a peak and decline in forcing, the consequences of scenarios that limit warming to below 2 °C, the relative contributions to uncertainty from scenarios, climate models, and internal variability, and long-term climate system outcomes beyond the 21st century. To serve this wide range of scientific communities and address these questions, a design has been identified consisting of eight alternative 21st century scenarios plus one large initial condition ensemble and a set of long-term extensions, divided into two tiers defined by relative priority. Some of these scenarios will also provide a basis for variants planned to be run in other CMIP6-Endorsed MIPs to investigate questions related to specific forcings. 
Harmonized, spatially explicit emissions and land use scenarios generated with integrated assessment models will be provided to participating climate modeling groups by late 2016, with the climate model simulations run within the 2017–2018 time frame, and output from the climate model projections made available and analyses performed over the 2018–2020 period.

  1. The Scenario Model Intercomparison Project (ScenarioMIP) for CMIP6

    NASA Astrophysics Data System (ADS)

    O'Neill, Brian C.; Tebaldi, Claudia; van Vuuren, Detlef P.; Eyring, Veronika; Friedlingstein, Pierre; Hurtt, George; Knutti, Reto; Kriegler, Elmar; Lamarque, Jean-Francois; Lowe, Jason; Meehl, Gerald A.; Moss, Richard; Riahi, Keywan; Sanderson, Benjamin M.

    2016-09-01

    Projections of future climate change play a fundamental role in improving understanding of the climate system as well as characterizing societal risks and response options. The Scenario Model Intercomparison Project (ScenarioMIP) is the primary activity within Phase 6 of the Coupled Model Intercomparison Project (CMIP6) that will provide multi-model climate projections based on alternative scenarios of future emissions and land use changes produced with integrated assessment models. In this paper, we describe ScenarioMIP's objectives, experimental design, and its relation to other activities within CMIP6. The ScenarioMIP design is one component of a larger scenario process that aims to facilitate a wide range of integrated studies across the climate science, integrated assessment modeling, and impacts, adaptation, and vulnerability communities, and will form an important part of the evidence base in the forthcoming Intergovernmental Panel on Climate Change (IPCC) assessments. At the same time, it will provide the basis for investigating a number of targeted science and policy questions that are especially relevant to scenario-based analysis, including the role of specific forcings such as land use and aerosols, the effect of a peak and decline in forcing, the consequences of scenarios that limit warming to below 2 °C, the relative contributions to uncertainty from scenarios, climate models, and internal variability, and long-term climate system outcomes beyond the 21st century. To serve this wide range of scientific communities and address these questions, a design has been identified consisting of eight alternative 21st century scenarios plus one large initial condition ensemble and a set of long-term extensions, divided into two tiers defined by relative priority. Some of these scenarios will also provide a basis for variants planned to be run in other CMIP6-Endorsed MIPs to investigate questions related to specific forcings. 
Harmonized, spatially explicit emissions and land use scenarios generated with integrated assessment models will be provided to participating climate modeling groups by late 2016, with the climate model simulations run within the 2017-2018 time frame, and output from the climate model projections made available and analyses performed over the 2018-2020 period.

  2. Tightness of correlation inequalities with no quantum violation

    NASA Astrophysics Data System (ADS)

    Ramanathan, Ravishankar; Quintino, Marco Túlio; Sainz, Ana Belén; Murta, Gláucia; Augusiak, Remigiusz

    2017-01-01

    We study the faces of the set of quantum correlations, i.e., the Bell and noncontextuality inequalities without any quantum violation. First, we investigate the question of whether every proper (facet-defining) Bell inequality for two parties, other than the trivial ones from positivity, normalization, and no-signaling, can be violated by quantum correlations, i.e., whether the classical Bell polytope or the smaller correlation polytope share any facets with their respective quantum sets. To do this, we develop a recently derived bound on the quantum value of linear games based on the norms of game matrices to give a simple sufficient condition to identify linear games with no quantum advantage. Additionally we show how this bound can be extended to the general class of unique games. We then show that the paradigmatic examples of correlation Bell inequalities with no quantum violation, namely the nonlocal computation games, do not constitute facet-defining Bell inequalities, not even for the correlation polytope. We also extend this to an arbitrary prime number of outcomes for a specific class of these games. We then study the faces in the simplest Clauser-Horne-Shimony-Holt Bell scenario of binary dichotomic measurements, and identify edges in the set of quantum correlations in this scenario. Finally, we relate the noncontextual polytope of single-party correlation inequalities with the cut polytope CUT(∇G), where G denotes the compatibility graph of observables in the contextuality scenario and ∇G denotes the suspension graph of G. We observe that there exist facet-defining noncontextuality inequalities with no quantum violation, and furthermore that this set of inequalities is beyond those implied by the consistent exclusivity principle.
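    As a numerical aside on the CHSH scenario mentioned above: with singlet-state correlations E(x, y) = -cos(x - y), the CHSH expression attains the Tsirelson bound 2*sqrt(2), strictly above the classical (local) bound of 2. The angle choices below are the standard optimal ones; this is textbook material, not a computation from the paper.

```python
import math

def E(x, y):
    # Correlation of dichotomic measurements at angles x, y
    # on the two-qubit singlet state.
    return -math.cos(x - y)

a, ap = 0.0, math.pi / 2                 # Alice's two settings
b, bp = math.pi / 4, 3 * math.pi / 4     # Bob's two settings

S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
chsh_value = abs(S)    # 2*sqrt(2), exceeding the classical bound of 2
```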

  3. What Happened, and Why: Toward an Understanding of Human Error Based on Automated Analyses of Incident Reports. Volume 1

    NASA Technical Reports Server (NTRS)

    Maille, Nicolas P.; Statler, Irving C.; Ferryman, Thomas A.; Rosenthal, Loren; Shafto, Michael G.

    2006-01-01

    The objective of the Aviation System Monitoring and Modeling (ASMM) project of NASA's Aviation Safety and Security Program was to develop technologies that will enable proactive management of safety risk, which entails identifying the precursor events and conditions that foreshadow most accidents. This presents a particular challenge in the aviation system, where people are key components and human error is frequently cited as a major contributing factor or cause of incidents and accidents. In the aviation "world", information about what happened can be extracted from quantitative data sources, but the experiential account of the incident reporter is the best available source of information about why an incident happened. This report describes a conceptual model and an approach to automated analyses of textual data sources for the subjective perspective of the reporter of the incident to aid in understanding why an incident occurred. It explores a first-generation process for routinely searching large databases of textual reports of aviation incidents or accidents, and reliably analyzing them for causal factors of human behavior (the why of an incident). We have defined a generic structure of information that is postulated to be a sound basis for defining similarities between aviation incidents. Based on this structure, we have introduced a simplifying structure, which we call the Scenario, as a pragmatic guide for identifying similarities of what happened based on the objective parameters that define the Context and the Outcome of a Scenario. We believe that it will be possible to design an automated analysis process, guided by the structure of the Scenario, that will aid aviation-safety experts in understanding the systemic issues that are conducive to human error.
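    The Scenario notion above — objective parameters defining a Context and an Outcome, used to judge similarity between incidents — could be sketched as a small data structure. The field names and the Jaccard similarity measure are our illustration; the report does not specify this schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Scenario:
    context: frozenset   # objective parameters, e.g. phase of flight, weather
    outcome: frozenset   # objective parameters, e.g. altitude deviation

    def similarity(self, other: "Scenario") -> float:
        """Jaccard similarity over the combined objective parameters."""
        a = self.context | self.outcome
        b = other.context | other.outcome
        return len(a & b) / len(a | b) if (a | b) else 1.0

s1 = Scenario(frozenset({"approach", "night"}), frozenset({"altitude_deviation"}))
s2 = Scenario(frozenset({"approach", "imc"}), frozenset({"altitude_deviation"}))
sim = s1.similarity(s2)   # 2 shared parameters out of 4 total -> 0.5
```

    A retrieval process over an incident database would rank stored Scenarios by such a similarity score before handing candidates to an analyst.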

  4. Calculation of greenhouse gas emissions of jatropha oil and jatropha biodiesel as alternative fuels for electricity production in Côte d'Ivoire

    NASA Astrophysics Data System (ADS)

    Atta, Pascal Atta; N'guessan, Yao; Morin, Celine; Voirol, Anne Jaecker; Descombes, Georges

    2017-02-01

    The electricity in Côte d'Ivoire is mainly produced from fossil energy sources. This causes damage to the environment due to greenhouse gas (GHG) emissions. The aim of this paper is to calculate the GHG emissions of jatropha oil and jatropha biodiesel as alternative fuels for electricity production in Côte d'Ivoire using Life Cycle Assessment (LCA) methodology. The functional unit in this LCA is defined as 1 kWh of electricity produced by the combustion of jatropha oil or jatropha biodiesel in the engine of a generator. Two scenarios, called short chain and long chain, were examined in this LCA. The results show that 0.132 kg CO2 equivalent is emitted in scenario 1, with jatropha oil as the alternative fuel, against 0.6376 kg CO2 equivalent in scenario 2, with jatropha biodiesel as the alternative fuel. Compared with diesel fuel, this corresponds to an 87 % reduction in kg CO2 equivalent in scenario 1 and a 37 % reduction in scenario 2.
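    As a consistency check on the figures above (our arithmetic, not part of the paper): the 87 % reduction for scenario 1 implies a diesel reference of roughly 1.01 kg CO2-eq per kWh, which indeed reproduces the reported 37 % reduction for scenario 2.

```python
jatropha_oil = 0.132          # kg CO2-eq / kWh, scenario 1
jatropha_biodiesel = 0.6376   # kg CO2-eq / kWh, scenario 2

# Diesel baseline implied by the 87 % reduction of scenario 1:
diesel = jatropha_oil / (1 - 0.87)

# Resulting reduction for scenario 2 (should be close to 37 %):
reduction_2 = 1 - jatropha_biodiesel / diesel
```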

  5. SUDOQU, a new dose-assessment methodology for radiological surface contamination.

    PubMed

    van Dillen, Teun; van Dijk, Arjan

    2018-06-12

    A new methodology has been developed for the assessment of the annual effective dose resulting from removable and fixed radiological surface contamination. It is entitled SUDOQU (SUrface DOse QUantification) and it can for instance be used to derive criteria for surface contamination related to the import of non-food consumer goods, containers and conveyances, e.g., limiting values and operational screening levels. SUDOQU imposes mass (activity)-balance equations based on radioactive decay, removal and deposition processes in indoor and outdoor environments. This leads to time-dependent contamination levels that may be of particular importance in exposure scenarios dealing with one or a few contaminated items only (usually public exposure scenarios, therefore referred to as the 'consumer' model). Exposure scenarios with a continuous flow of freshly contaminated goods also fall within the scope of the methodology (typically occupational exposure scenarios, thus referred to as the 'worker model'). In this paper we describe SUDOQU, its applications, and its current limitations. First, we delineate the contamination issue, present the assumptions and explain the concepts. We describe the relevant removal, transfer, and deposition processes, and derive equations for the time evolution of the radiological surface-, air- and skin-contamination levels. These are then input for the subsequent evaluation of the annual effective dose with possible contributions from external gamma radiation, inhalation, secondary ingestion (indirect, from hand to mouth), skin contamination, direct ingestion and skin-contact exposure. The limiting effective surface dose is introduced for issues involving the conservatism of dose calculations. SUDOQU can be used by radiation-protection scientists/experts and policy makers in the field of e.g. emergency preparedness, trade and transport, exemption and clearance, waste management, and nuclear facilities. 
Several practical examples are worked out, demonstrating the potential applications of the methodology.
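    The time-dependent contamination levels that SUDOQU derives from its activity-balance equations can be illustrated with a minimal sketch: when radioactive decay and a single removal process both deplete a surface, the level follows C(t) = C0 * exp(-(lambda_d + lambda_r) * t). This is an illustrative simplification, not the actual SUDOQU equations, and all parameter values are hypothetical.

```python
import math

def contamination(c0_bq_cm2, half_life_d, removal_rate_per_d, t_days):
    """Surface activity after t_days, depleted by decay and removal."""
    lambda_d = math.log(2) / half_life_d          # decay constant, 1/day
    total_rate = lambda_d + removal_rate_per_d    # combined depletion rate
    return c0_bq_cm2 * math.exp(-total_rate * t_days)

# A surface starting at 10 Bq/cm2, 30-day half-life, 1 %/day removal:
c_100 = contamination(10.0, 30.0, 0.01, 100.0)    # well below 1 Bq/cm2
```

    The full methodology adds deposition terms (so levels can also grow), distinguishes fixed from removable fractions, and feeds the resulting levels into the dose pathways listed above.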

  6. Fuel Cycle Analysis Framework Base Cases for the IAEA/INPRO GAINS Collaborative Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brent Dixon

    Thirteen countries participated in the Collaborative Project GAINS “Global Architecture of Innovative Nuclear Energy Systems Based on Thermal and Fast Reactors Including a Closed Fuel Cycle”, which was the primary activity within the IAEA/INPRO Program Area B: “Global Vision on Sustainable Nuclear Energy” for the last three years. The overall objective of GAINS was to develop a standard framework for assessing future nuclear energy systems taking into account sustainable development, and to validate results through sample analyses. This paper details the eight scenarios that constitute the GAINS framework base cases for analysis of the transition to future innovative nuclear energy systems. The framework base cases provide a reference for users of the framework to start from in developing and assessing their own alternate systems. Each base case is described along with performance results against the GAINS sustainability evaluation metrics. The eight cases include four using a moderate growth projection and four using a high growth projection for global nuclear electricity generation through 2100. The cases are divided into two sets, addressing homogeneous and heterogeneous scenarios developed by GAINS to model global fuel cycle strategies. The heterogeneous world scenario considers three separate nuclear groups based on their fuel cycle strategies, with non-synergistic and synergistic cases. The framework base case analysis results show the impact of these different fuel cycle strategies while providing references for future users of the GAINS framework. A large number of scenario alterations are possible and can be used to assess different strategies, different technologies, and different assumptions about possible futures of nuclear power. Results can be compared to the framework base cases to assess where these alternate cases perform differently versus the sustainability indicators.

  7. The ALMA CONOPS project: the impact of funding decisions on observatory performance

    NASA Astrophysics Data System (ADS)

    Ibsen, Jorge; Hibbard, John; Filippi, Giorgio

    2014-08-01

    In times when every penny counts, many organizations face the question of how much scientific impact a budget cut can have or, in more general terms, what the science impact of alternative (less costly) operational modes would be. In reply to such a question posed by its governing bodies, the ALMA project had to develop a methodology (ALMA Concepts for Operations, CONOPS) that attempts to measure the impact that alternative operational scenarios may have on the overall scientific production of the Observatory. Although the analysis and the results are ALMA specific, the developed approach is rather general and provides a methodology for a cost-performance analysis of alternatives before any radical alterations to the operations model are adopted. This paper describes the key aspects of the methodology: a) the definition of Figures of Merit (FoMs) for the assessment of quantitative science performance impacts as well as qualitative impacts, and a methodology using these FoMs to evaluate the cost and impact of the different operational scenarios; b) the definition of a REFERENCE operational baseline; c) the identification of Alternative Scenarios, each replacing one or more concepts in the REFERENCE by a different concept that has a lower cost and some level of scientific and/or operational impact; d) the use of a Cost-Performance plane to graphically combine the effects that the alternative scenarios can have in terms of cost reduction and affected performance. Although this is a first-order assessment, we believe the approach is useful for comparing different operational models and for understanding the cost-performance impact of these choices. It can be used to make decisions to meet budget cuts as well as to evaluate possible new emergent opportunities.

  8. Robotic Mars Sample Return: Risk Assessment and Analysis Report

    NASA Technical Reports Server (NTRS)

    Lalk, Thomas R.; Spence, Cliff A.

    2003-01-01

    A comparison of the risk associated with two alternative scenarios for a robotic Mars sample return mission was conducted. Two alternative mission scenarios were identified: the Jet Propulsion Lab (JPL) Reference Mission, and a mission proposed by Johnson Space Center (JSC). The JPL mission was characterized by two landers and an orbiter, and a Mars orbit rendezvous to retrieve the samples. The JSC mission (Direct/SEP) involves a solar electric propulsion (SEP) return to Earth followed by a rendezvous with the space shuttle in Earth orbit. A qualitative risk assessment to identify and characterize the risks, and a risk analysis to quantify the risks, were conducted on these missions. Technical descriptions of the competing scenarios were developed in conjunction with NASA engineers, and the sequence of events for each candidate mission was developed. Risk distributions associated with individual and combined events were consolidated using event tree analysis in conjunction with Monte Carlo techniques to develop probabilities of mission success for each of the various alternatives. The results were the probabilities of success of various end states for each candidate scenario. These end states ranged from complete success through various levels of partial success to complete failure. The overall probability of success for the Direct/SEP mission was determined to be 66% for the return of at least one sample, and 58% for the JPL mission for the return of at least one sample cache. Values were also determined for intermediate events and end states, as well as for the probability of violation of planetary protection. Overall mission planetary protection event probabilities of occurrence were determined to be 0.002% and 1.3% for the Direct/SEP and JPL Reference missions, respectively.
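    The event-tree Monte Carlo approach described above can be sketched minimally: a mission is a chain of events, each with a success probability, and repeated sampling estimates the end-to-end probability of complete success. Event names and probabilities are hypothetical illustrations, not figures from the report.

```python
import random

def simulate_mission(event_probs, rng):
    """One Monte Carlo trial: the mission succeeds only if every
    event in the chain succeeds."""
    return all(rng.random() < p for p in event_probs)

# Hypothetical per-event success probabilities:
events = {"launch": 0.98, "mars_landing": 0.90, "sample_capture": 0.92,
          "ascent_and_rendezvous": 0.88, "earth_return": 0.95}

rng = random.Random(42)          # seeded for reproducibility
n = 100_000
successes = sum(simulate_mission(events.values(), rng) for _ in range(n))
p_success = successes / n        # analytically about 0.68 for these inputs
```

    A fuller event tree would branch into partial-success end states (e.g. sample lost but orbiter returned) rather than a single pass/fail chain, which is how the study obtained probabilities for each end state.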

  9. Using a software-defined computer in teaching the basics of computer architecture and operation

    NASA Astrophysics Data System (ADS)

    Kosowska, Julia; Mazur, Grzegorz

    2017-08-01

    The paper describes the concept and implementation of SDC_One, a software-defined computer designed for experimental and didactic purposes. Equipped with extensive hardware monitoring mechanisms, the device enables students to monitor the computer's operation on a bus-transfer-cycle or instruction-cycle basis, providing a practical illustration of basic aspects of a computer's operation. In the paper, we describe the hardware monitoring capabilities of SDC_One and some scenarios for using it in teaching the basics of computer architecture and microprocessor operation.

  10. Catastrophe on the Horizon: A Scenario-Based Future Effect of Orbital Space Debris

    DTIC Science & Technology

    2010-04-01

    real. In fact, the preliminary results of a recent NASA risk assessment of the soon to be decommissioned Space Shuttle put the risk of a manned...Section 1 – Introduction Orbital Space Debris Defined Orbital space debris can be defined as dead satellites, discarded rocket parts, or simply flecks...of paint or other small objects orbiting the Earth. It is simply space "junk," but junk that can be extremely dangerous to space assets. Most of the

  11. Telling better stories: strengthening the story in story and simulation

    NASA Astrophysics Data System (ADS)

    Kemp-Benedict, Eric

    2012-12-01

    The scenarios of the IPCC Special Report on Emissions Scenarios (SRES) (Nakicenovic and Swart 2000) are both widely cited and widely criticized. This combination of censure and regard reflects their importance, as they provide both a point of reference and a point of departure for those wishing to understand the long-term implications of policies and human activities for the climate and adaptive capacity. The paper by Schweizer and Kriegler in this issue (Schweizer and Kriegler 2012) reports a unique and interesting critique of the SRES scenarios. The authors find several results, including that the path the world may now be on (labeled by them 'coal-powered growth') is under-represented in the SRES scenarios. While such post-hoc critiques are easy to dismiss, Schweizer and Kriegler were careful to use only the information available to the SRES authors, and they applied a technique that (if it had been available) could have been carried out at that time. In this way they demonstrate that not only was coal-powered growth a clearly discernible possible future at the time of the SRES, but variants on the theme dominate the handful of highly consistent and robust scenarios as identified by their method. Their paper is well-timed because a new round of climate scenarios is now under development (Kriegler et al 2012, van Vuuren et al 2012), and it could learn from evaluations of the SRES process and scenarios. Schweizer and Kriegler (2012) construct a consistent scenario logic using a relatively new foresight technique, cross-impact balances (CIB) (Weimer-Jehle 2006). As explained above, to sharpen their critique and properly evaluate the method, they apply CIB to the information that the authors of the SRES had at their disposal at the time they constructed their scenarios. 
Their study is therefore anachronistic, in that the CIB method was not published when the SRES was released, but historically faithful in that Schweizer and Kriegler limit themselves to the information available at that time, based on statements that appear in the SRES itself. The CIB method is a technique for constructing internally consistent qualitative scenarios. Global-scale scenario exercises, in particular climate scenarios, typically include both qualitative (narrative) and quantitative (model) elements. As noted by Schweizer and Kriegler, the dominant method for such studies, which Alcamo (2001, 2008) formalized and named the 'story and simulation' (SAS) approach, relies at least in part on quantitative modeling to ensure consistency. Schweizer and Kriegler rightly criticize the idea that models alone can ensure consistency of a scenario narrative. By itself, this critique is not new. Indeed, if asked, both Alcamo and Raskin et al (Raskin et al 2005), whom Schweizer and Kriegler (2012) cite, would probably agree with them; both sources emphasize the need for qualitative storylines that go beyond what models can provide. However, Schweizer and Kriegler correctly point out that these sources provide little or no guidance to those responsible for the narratives beyond a dialog with the model outputs. The CIB method addresses this problem, and Schweizer and Kriegler's application of the method shows that even the best narrative-writing teams can benefit from this guidance. While the paper of Schweizer and Kriegler makes a compelling argument for using CIB in global scenarios, it should be used in combination with other methods. A scenario exercise has several aims, of which consistency is one. 
Another important goal is diversity: given a set of internally consistent scenarios, a diverse set covers the space of possibilities, and thereby helps users of the scenarios avoid underestimating or overestimating the potential for change in one or another key factor (e.g., see (Carlsen 2009)). From this point of view, the SRES authors could legitimately respond to Schweizer and Kriegler's finding that the SRES scenarios excluded interesting variants on coal-fueled growth by arguing that they did include some variants, and to include more would have conflicted with a legitimate goal of breadth. In this imagined dialog, Schweizer and Kriegler could concede the point, but then point out that several of the SRES scenarios were revealed to be either marginally or very inconsistent by their exercise. Thus, CIB and a technique that helps ensure breadth can usefully complement one another. The CIB method is also liable to a form of specification error, in that the worldviews of the people filling in the cross-impact table influence the results. This is a problem with many foresight techniques, but it is masked by the formalism of CIB, and there is a danger it will go unnoticed. For example, Schweizer and Kriegler's paper suggests that the A1T2 scenario is (marginally) internally consistent. It has relatively low carbon emissions, low rates of population growth, very high GDP per capita growth rates, low primary energy intensity, very low carbon intensity, high fossil-fuel availability, global economic policy focus, and mixed global and regional energy policy focus. It has been argued by Jackson (2009) and Victor (2008), among others, that the evidence is slim that we ever will decouple carbon emissions from GDP to any meaningful extent. Thus, they would presumably argue that this is an inconsistent scenario, and might very well have done so at the time the SRES was written. 
    That is not by itself a reason to reject the scenario, but it suggests that a CIB exercise could be run assuming the qualitative models implied by different worldviews, and the results contrasted. Such an exercise would go beyond the sensitivity analysis that Schweizer and Kriegler report in their paper. The cross-impact balance method should be a useful tool for constructing the next round of climate scenarios. It will be even more useful if combined with techniques that ensure a diversity of scenarios. This could include formal techniques such as 'scenario diversity analysis', which maximizes a quantitative measure of the spread of a set of qualitative scenarios defined by states of driving forces (Carlsen 2009). It could also include a survey of different worldviews, and the qualitative models that they imply, such as that carried out by Sunderlin (Sunderlin 2003). Futures studies has moved forward from the time the SRES was published, and new techniques are now available that can help us to tell better stories of the future. References Alcamo J 2001 Scenarios as Tools for International Environmental Assessments (Copenhagen: European Environment Agency) Alcamo J 2008 The SAS approach: combining qualitative and quantitative knowledge in environmental scenarios Environmental Futures—The Practice of Environmental Scenario Analysis vol 2, ed J Alcamo (Amsterdam: Elsevier) pp 123-50 Carlsen H 2009 Climate change and the construction of scenario sets that span the range of societal uncertainties Paper for International Studies Association Annual Convention 2009 (New York City, February) Jackson T 2009 Prosperity Without Growth: Economics for a Finite Planet (London: Earthscan) Kriegler E, O'Neill B C, Hallegatte S, Kram T, Lempert R J, Moss R H and Wilbanks T 2012 The need for and use of socio-economic scenarios for climate change analysis: a new approach based on shared socio-economic pathways Glob. Environ. 
    Change 22 807-22 Nakicenovic N and Swart R (eds) 2000 Special Report on Emissions Scenarios (Cambridge: Cambridge University Press) Raskin P, Monks F, Ribeiro T, van Vuuren D and Zurek M 2005 Global scenarios in historical perspective Ecosystems and Human Well-Being: Scenarios: Findings of the Scenarios Working Group vol 2, ed S R Carpenter et al (Washington, DC: Island) pp 35-44 Schweizer V J and Kriegler E 2012 Improving environmental change research with systematic techniques for qualitative scenarios Environ. Res. Lett. 7 044011 Sunderlin W D 2003 Ideology, Social Theory, and the Environment (Lanham, MD: Rowman & Littlefield) van Vuuren D P et al 2012 A proposal for a new scenario framework to support research and assessment in different climate research communities Glob. Environ. Change 22 21-35 Victor P A 2008 Managing Without Growth: Slower by Design, Not Disaster (Advances in Ecological Economics Series) (Cheltenham: Edward Elgar) Weimer-Jehle W 2006 Cross-impact balances: a system-theoretical approach to cross-impact analysis Technol. Forecast. Social Change 73 334-61
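    The cross-impact balance idea discussed in this record can be shown with a toy example: a scenario (one state chosen per descriptor) is internally consistent when no descriptor would, given the impacts exerted on it by the other chosen states, score higher in a different state. The descriptors, states, and impact scores below are invented for illustration, not drawn from the SRES analysis.

```python
# Toy cross-impact balance (CIB) consistency check, in the spirit of
# Weimer-Jehle (2006). All descriptors, states, and scores are invented.

DESCRIPTORS = {
    "economy": ["low_growth", "high_growth"],
    "energy": ["fossil", "renewable"],
}

# IMPACT[(source_descr, source_state)][(target_descr, target_state)] =
# how strongly the source state promotes (+) or restricts (-) the target.
IMPACT = {
    ("economy", "high_growth"): {("energy", "fossil"): 2, ("energy", "renewable"): -1},
    ("economy", "low_growth"): {("energy", "fossil"): -1, ("energy", "renewable"): 1},
    ("energy", "fossil"): {("economy", "high_growth"): 1, ("economy", "low_growth"): -1},
    ("energy", "renewable"): {("economy", "high_growth"): 0, ("economy", "low_growth"): 1},
}

def balance(scenario, descriptor, state):
    """Sum of impacts that the other chosen states exert on this state."""
    return sum(
        IMPACT.get((d, s), {}).get((descriptor, state), 0)
        for d, s in scenario.items()
        if d != descriptor
    )

def is_consistent(scenario):
    """Consistent iff each chosen state's balance is maximal for its descriptor."""
    return all(
        balance(scenario, d, scenario[d])
        >= max(balance(scenario, d, s) for s in states)
        for d, states in DESCRIPTORS.items()
    )

print(is_consistent({"economy": "high_growth", "energy": "fossil"}))
```

    Enumerating all state combinations and keeping the consistent ones is how a CIB exercise surfaces scenario logics, such as the 'coal-powered growth' variants, that a narrative-writing team might overlook.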

  12. General dental practitioner's views on dental general anaesthesia services.

    PubMed

    Threlfall, A G; King, D; Milsom, K M; Blinkhorn, A S; Tickle, M

    2007-06-01

    Policy has recently changed on the provision of dental general anaesthetic services in England. The aim of this study was to investigate general dental practitioners' views about dental general anaesthesia, the reduction in its availability, and the impact on the care of children with toothache. Qualitative study using semi-structured interviews and clinical case scenarios. General dental practitioners providing NHS services in the North West of England. 93 general dental practitioners were interviewed and 91 answered a clinical case scenario about the care they would provide for a 7-year-old child with multiple decayed teeth presenting with toothache. Scenario responses showed variation: 8% would immediately refer for general anaesthesia, 25% would initially prescribe antibiotics, but the majority would attempt to either restore or extract the tooth causing pain. Interview responses also demonstrated variation in care; however, most dentists agreed that general anaesthesia has a role for nervous children but would only refer as a last resort. The responses indicated an increase in inequalities, and that access to services did not match population needs, leaving some children waiting in pain. Most general dental practitioners support moving dental general anaesthesia into hospitals, but some believe that this has widened health inequalities, and there is also a problem associated with variation in treatment provision. Additional general anaesthetic services are needed in some areas with high levels of tooth decay, and evidence-based guidelines about caring for children with toothache are required.

  13. Technical Feasibility Assessment of Lunar Base Mission Scenarios

    NASA Astrophysics Data System (ADS)

    Magelssen, Trygve "Spike"; Sadeh, Eligar

    2005-02-01

    Investigation of the literature pertaining to lunar base (LB) missions and the technologies required for LB development has revealed an information gap that hinders technical feasibility assessment. This information gap is the absence of technology readiness levels (TRL) (Mankins, 1995) and of information pertaining to the criticality of the critical enabling technologies (CETs) that enable mission success. TRL is a means of identifying the technical readiness stage of a technology. Criticality is defined as the level of influence a CET has on the mission scenario. The hypothesis of this research study is that technical feasibility is a function of technical readiness, and technical readiness is a function of criticality. A newly developed research analysis method is used to identify the technical feasibility of LB mission scenarios. A Delphi is used to ascertain technology readiness levels and CET criticality-to-mission. The research analysis method is applied to the Delphi results to determine the technical feasibility of the LB mission scenarios, which include: observatory, science research, lunar settlement, space exploration gateway, space resource utilization, and space tourism. The CETs identified encompass four major system-level technologies: transportation, life support, structures, and power systems. Results of the technical feasibility assessment show the observatory and science research LB mission scenarios to be the most technically ready of all the scenarios, but all mission scenarios are in very close proximity to each other in regard to criticality and TRL, and no one mission scenario stands out as being absolutely more technically ready than any of the others. What is significant and of value are the Delphi results concerning CET criticality-to-mission and the TRL values, evidenced in the tables, that can be used by anyone assessing the technical feasibility of LB missions.
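    The idea that feasibility is a function of readiness weighted by criticality can be sketched as a simple roll-up. The CET names follow the record; the TRL values (on the standard 1-9 scale) and criticality weights are hypothetical placeholders, not the Delphi results, and the scoring formula is one possible choice rather than the study's actual method.

```python
# Minimal sketch: feasibility as criticality-weighted technology readiness.
# TRLs and weights below are illustrative, not Delphi outcomes.

CETS = {
    # name: (hypothetical TRL 1-9, hypothetical criticality weight 0-1)
    "transportation": (6, 0.9),
    "life_support": (5, 0.8),
    "structures": (7, 0.6),
    "power_systems": (6, 0.7),
}

def feasibility_score(cets):
    """Criticality-weighted mean TRL, normalized so TRL 9 across the
    board gives a score of 1.0 (fully ready)."""
    weighted = sum(trl * w for trl, w in cets.values())
    total_w = sum(w for _, w in cets.values())
    return weighted / (9 * total_w)

print(f"feasibility: {feasibility_score(CETS):.2f}")
```

    Comparing such scores across the six mission scenarios would mirror, in miniature, the study's observation that the scenarios sit close together in readiness.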

  14. Defining And Employing Reference Conditions For Ecological Restoration Of The Lower Missouri River, USA

    NASA Astrophysics Data System (ADS)

    Jacobson, R. B.; Elliott, C. M.; Reuter, J. M.

    2008-12-01

    Ecological reference conditions are especially challenging for large, intensively managed rivers like the Lower Missouri. Historical information provides broad understanding of how the river has changed, but translating historical information into quantitative reference conditions remains a challenge. Historical information is less available for biological and chemical conditions than for physical conditions. For physical conditions, much of the early historical condition is documented in date-specific measurements or maps, and it is difficult to determine how representative these conditions are for a river system that was characterized historically by large floods and high channel migration rates. As an alternative to a historically defined least-disturbed condition, spatial variation within the Missouri River basin provides potential for defining a best-attainable reference condition. A possibility for the best-attainable condition for channel morphology is an unchannelized segment downstream of the lowermost dam (rkm 1298-1203). This segment retains multiple channels and abundant sandbars although it has a highly altered flow regime and a greatly diminished sediment supply. Conversely, downstream river segments have more natural flow regimes, but have been narrowed and simplified for navigation and bank stability. We use two computational tools to compensate for the lack of ideal reference conditions. The first is a hydrologic model that synthesizes natural and altered flow regimes based on 100 years of daily inputs to the river (daily routing model, DRM, US Army Corps of Engineers, 1998); the second tool is hydrodynamic modeling of habitat availability. The flow-regime and hydrodynamic outputs are integrated to define habitat-duration curves as the basis for reference conditions (least-disturbed flow regime and least-disturbed channel morphology). 
Lacking robust biological response models, we use mean residence time of water and a habitat diversity index as generic ecosystem indicators.

  15. Pesticide exposure assessment for surface waters in the EU. Part 2: Determination of statistically based run-off and drainage scenarios for Germany.

    PubMed

    Bach, Martin; Diesner, Mirjam; Großmann, Dietlinde; Guerniche, Djamal; Hommen, Udo; Klein, Michael; Kubiak, Roland; Müller, Alexandra; Preuss, Thomas G; Priegnitz, Jan; Reichenberger, Stefan; Thomas, Kai; Trapp, Matthias

    2017-05-01

    In order to assess surface water exposure to active substances of plant protection products (PPPs) in the European Union (EU), the FOCUS (FOrum for the Co-ordination of pesticide fate models and their USe) surface water workgroup introduced four run-off and six drainage scenarios for Step 3 of the tiered FOCUSsw approach. These scenarios may not necessarily represent realistic worst-case situations for the different Member States of the EU. Hence, the suitability of the scenarios for risk assessment in the national authorisation procedures is not known. Using Germany as an example, the paper illustrates how national soil-climate scenarios can be developed to model entries of active substances into surface waters from run-off and erosion (using the model PRZM) and from drainage (using the model MACRO). In the authorisation procedure for PPPs at Member State level, such soil-climate scenarios can be used to determine exposure endpoints with a defined overall percentile. The approach allows the development of nation-specific soil-climate scenarios and the calculation of percentile-based exposure endpoints. The scenarios have been integrated into a software tool analogous to FOCUS-SWASH, which can be used in the future to assess surface water exposure in authorisation procedures of PPPs in Germany. © 2017 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
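    A percentile-based exposure endpoint of the kind described here boils down to picking a value at a defined overall percentile from a distribution of simulated concentrations. The sketch below uses a standard linear-interpolation percentile; the predicted environmental concentration (PEC) values are made up, whereas in practice they would come from many PRZM/MACRO scenario-year runs.

```python
# Sketch of a percentile-based exposure endpoint. PEC values are
# invented placeholders, not model output.

def percentile_endpoint(pecs, pct=90.0):
    """Linear-interpolation percentile of simulated PECs (ug/L)."""
    values = sorted(pecs)
    if len(values) == 1:
        return values[0]
    rank = (pct / 100.0) * (len(values) - 1)
    lo = int(rank)
    frac = rank - lo
    hi = min(lo + 1, len(values) - 1)
    return values[lo] + frac * (values[hi] - values[lo])

# Hypothetical PECs from ten simulated scenario-years:
simulated_pecs = [0.02, 0.05, 0.11, 0.30, 0.07, 0.19, 0.04, 0.65, 0.09, 0.26]
print(f"90th percentile PEC: {percentile_endpoint(simulated_pecs):.3f} ug/L")
```

    The regulatory endpoint is then this percentile value, compared against ecotoxicological thresholds in the authorisation procedure.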

  16. Future Forest Cover Change Scenarios with Implications for Landslide Risk: An Example from Buzau Subcarpathians, Romania

    NASA Astrophysics Data System (ADS)

    Malek, Žiga; Boerboom, Luc; Glade, Thomas

    2015-11-01

    This study focuses on future forest cover change in the Buzau Subcarpathians, a landslide-prone region in Romania. Past and current trends suggest that the area might expect a future increase in deforestation. We developed spatially explicit scenarios until 2040 to analyze the spatial pattern of future forest cover change and potential changes to landslide risk. First, we generated transition probability maps using the weights of evidence method, followed by a cellular automata allocation model. We performed expert interviews to develop two future forest management scenarios. The Alternative scenario (ALT) was defined by 67 % more deforestation than the Business as Usual scenario (BAU). We integrated the simulated scenarios with a landslide susceptibility map. In both scenarios, most of the deforestation was projected in areas where landslides are less likely to occur. Still, 483 (ALT) and 276 (BAU) ha of deforestation were projected in areas with a high likelihood of landslide occurrence. Thus, deforestation could lead to a local-scale increase in landslide risk, in particular near or adjacent to forestry roads. The parallel process of nearly 10 % forest expansion until 2040 was projected to occur mostly in areas with high landslide susceptibility. On a regional scale, forest expansion could thus result in improved slope stability. We modeled two additional scenarios with an implemented landslide risk policy, excluding high-risk zones. The reduction of deforestation in high-risk areas was achieved without a drastic decrease in the accessibility of the areas. Together with forest expansion, it could therefore be used as a risk reduction strategy.
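    The weights of evidence step mentioned in this record can be shown in miniature: the positive weight of an evidence layer is the log ratio of how often the evidence co-occurs with the transition versus with its absence. The evidence layer ("near a forestry road") and all cell counts below are hypothetical illustrations, not the study's data.

```python
import math

# Toy weights-of-evidence calculation for a transition probability map.
# Evidence layer and cell counts are invented for illustration.

def weights_of_evidence(n_evt_ev, n_evt, n_noevt_ev, n_noevt):
    """Positive weight W+ = ln[P(evidence | event) / P(evidence | no event)].
    n_evt_ev: event cells showing the evidence; n_evt: all event cells;
    n_noevt_ev / n_noevt: the same for non-event cells."""
    p_ev_given_evt = n_evt_ev / n_evt
    p_ev_given_noevt = n_noevt_ev / n_noevt
    return math.log(p_ev_given_evt / p_ev_given_noevt)

# Suppose 400 of 1,000 deforested cells lie near a road, versus 900 of
# 9,000 cells that stayed forested:
w_plus = weights_of_evidence(400, 1000, 900, 9000)
print(f"W+ = {w_plus:.2f}")  # positive weight: roads favour deforestation
```

    Summing such weights over all evidence layers per cell, then applying the cellular automata allocation, is the general shape of the transition-mapping workflow the record describes.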

  17. Future Forest Cover Change Scenarios with Implications for Landslide Risk: An Example from Buzau Subcarpathians, Romania.

    PubMed

    Malek, Žiga; Boerboom, Luc; Glade, Thomas

    2015-11-01

    This study focuses on future forest cover change in the Buzau Subcarpathians, a landslide-prone region in Romania. Past and current trends suggest that the area might expect a future increase in deforestation. We developed spatially explicit scenarios until 2040 to analyze the spatial pattern of future forest cover change and potential changes to landslide risk. First, we generated transition probability maps using the weights of evidence method, followed by a cellular automata allocation model. We performed expert interviews to develop two future forest management scenarios. The Alternative scenario (ALT) was defined by 67% more deforestation than the Business as Usual scenario (BAU). We integrated the simulated scenarios with a landslide susceptibility map. In both scenarios, most of the deforestation was projected in areas where landslides are less likely to occur. Still, 483 (ALT) and 276 (BAU) ha of deforestation were projected in areas with a high likelihood of landslide occurrence. Thus, deforestation could lead to a local-scale increase in landslide risk, in particular near or adjacent to forestry roads. The parallel process of nearly 10% forest expansion until 2040 was projected to occur mostly in areas with high landslide susceptibility. On a regional scale, forest expansion could thus result in improved slope stability. We modeled two additional scenarios with an implemented landslide risk policy, excluding high-risk zones. The reduction of deforestation in high-risk areas was achieved without a drastic decrease in the accessibility of the areas. Together with forest expansion, it could therefore be used as a risk reduction strategy.

  18. Using Rapid-Response Scenario-Building Methodology for Climate Change Adaptation Planning

    NASA Astrophysics Data System (ADS)

    Ludwig, K. A.; Stoepler, T. M.; Schuster, R.

    2015-12-01

    Rapid-response scenario-building methodology can be modified to develop scenarios for slow-onset disasters associated with climate change such as drought. Results of a collaboration between the Department of the Interior (DOI) Strategic Sciences Group (SSG) and the Southwest Colorado Social-Ecological Climate Resilience Project are presented in which SSG scenario-building methods were revised and applied to climate change adaptation planning in Colorado's Gunnison Basin, United States. The SSG provides the DOI with the capacity to rapidly assemble multidisciplinary teams of experts to develop scenarios of the potential environmental, social, and economic cascading consequences of environmental crises, and to analyze these chains to determine actionable intervention points. By design, the SSG responds to acute events of a relatively defined duration. As a capacity-building exercise, the SSG explored how its scenario-building methodology could be applied to outlining the cascading consequences of slow-onset events related to climate change. SSG staff facilitated two workshops to analyze the impacts of drought, wildfire, and insect outbreak in the sagebrush and spruce-fir ecosystems. Participants included local land managers, natural and social scientists, ranchers, and other stakeholders. Key findings were: 1) scenario framing must be adjusted to accommodate the multiple, synergistic components and longer time frames of slow-onset events; 2) the development of slow-onset event scenarios is likely influenced by participants having had more time to consider potential consequences, relative to acute events; 3) participants who are from the affected area may have a more vested interest in the outcome and/or may be able to directly implement interventions.

  19. The economic impact of more sustainable water use in agriculture: A computable general equilibrium analysis

    NASA Astrophysics Data System (ADS)

    Calzadilla, Alvaro; Rehdanz, Katrin; Tol, Richard S. J.

    2010-04-01

    Agriculture is the largest consumer of freshwater resources - around 70 percent of all freshwater withdrawals are used for food production. These agricultural products are traded internationally. A full understanding of water use is, therefore, impossible without understanding the international market for food and related products, such as textiles. Based on the global general equilibrium model GTAP-W, we offer a method for investigating the role of green (rain) and blue (irrigation) water resources in agriculture within the context of international trade. We use future projections of allowable water withdrawals for surface water and groundwater to define two alternative water management scenarios. The first scenario explores a deterioration of current trends and policies in the water sector (water crisis scenario). The second scenario assumes an improvement in policies and trends in the water sector, eliminating groundwater overdraft world-wide and increasing water allocation for the environment (sustainable water use scenario). In both scenarios, welfare gains or losses are not only associated with changes in agricultural water consumption. Under the water crisis scenario, welfare rises not only for regions where water consumption increases (China, South East Asia and the USA); welfare gains are considerable for Japan and South Korea, Southeast Asia and Western Europe as well. These regions benefit from higher levels of irrigated production and lower food prices. Conversely, under the sustainable water use scenario, welfare losses affect not only regions where overdrafting is occurring; welfare decreases in other regions as well. These results indicate that, for water use, there is a clear trade-off between economic welfare and environmental sustainability.

  20. Instantaneous progression reference frame for calculating pelvis rotations: Reliable and anatomically-meaningful results independent of the direction of movement.

    PubMed

    Kainz, Hans; Lloyd, David G; Walsh, Henry P J; Carty, Christopher P

    2016-05-01

    In motion analysis, pelvis angles are conventionally calculated as the rotations between the pelvis and the laboratory reference frame. This approach assumes that the participant's motion is along the anterior-posterior axis of the laboratory reference frame. When this assumption is violated, interpretation of pelvis angles becomes problematic. In this paper a new approach for calculating pelvis angles, based on the rotations between the pelvis and an instantaneous progression reference frame, was introduced. At every time-point, the tangent to the trajectory of the midpoint of the pelvis, projected into the horizontal plane of the laboratory reference frame, was used to define the anterior-posterior axis of the instantaneous progression reference frame. This new approach, combined with the rotation-obliquity-tilt rotation sequence, was compared to the conventional approach using the rotation-obliquity-tilt and tilt-obliquity-rotation sequences. Four different movement tasks performed by eight healthy adults were analysed. The instantaneous progression reference frame approach was the only approach that showed reliable and anatomically meaningful results for all analysed movement tasks (mean root-mean-square differences below 5°, differences in pelvis angles at pre-defined gait events below 10°). Both rotation sequences combined with the conventional approach led to unreliable results as soon as the participant's motion was not along the anterior-posterior laboratory axis (mean root-mean-square differences up to 30°, differences in pelvis angles at pre-defined gait events up to 45°). The instantaneous progression reference frame approach enables the gait analysis community to analyse pelvis angles for movements that do not follow the anterior-posterior axis of the laboratory reference frame. Copyright © 2016 Elsevier B.V. All rights reserved.
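    The core construction in this record, the tangent to the pelvis-midpoint trajectory projected into the horizontal plane, can be sketched directly. The central-difference tangent used below is one simple choice; the paper does not prescribe this exact differencing scheme, and the sample trajectory is invented.

```python
import math

# Sketch of the instantaneous progression reference frame: at each
# time-point the anterior-posterior axis is the unit tangent to the
# pelvis-midpoint path projected into the horizontal (x-y) plane.

def progression_axes(midpoints):
    """midpoints: list of (x, y) pelvis-midpoint positions over time.
    Returns a unit anterior-posterior axis per interior sample, using a
    central difference as the tangent estimate."""
    axes = []
    for i in range(1, len(midpoints) - 1):
        dx = midpoints[i + 1][0] - midpoints[i - 1][0]
        dy = midpoints[i + 1][1] - midpoints[i - 1][1]
        norm = math.hypot(dx, dy)
        axes.append((dx / norm, dy / norm))
    return axes

# Walking at 45 degrees to the laboratory x-axis: a lab-fixed frame
# would misreport pelvis rotation, but the progression axis simply
# follows the path.
path = [(0.1 * i, 0.1 * i) for i in range(10)]
print(progression_axes(path)[0])
```

    Pelvis rotation would then be taken relative to this per-sample axis instead of the fixed laboratory anterior-posterior axis, which is what makes the method direction-independent.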

  1. Satellite Survivability Module

    NASA Astrophysics Data System (ADS)

    Buehler, P.; Smith, J.

    The Satellite Survivability Module (SSM) is an end-to-end, physics-based, performance prediction model for directed energy engagement of orbiting spacecraft. SSM was created as an add-on module for the Satellite Tool Kit (STK). Two engagement types are currently supported: laser engagement of the focal plane array of an imaging spacecraft; and Radio Frequency (RF) engagement of spacecraft components. This paper will focus on the laser engagement scenario, the process by which it is defined, and how we use this tool to support a future laser threat detection system experiment. For a laser engagement, the user creates a spacecraft, defines its optical system, adds any protection techniques used by the optical system, introduces a laser threat, and then defines the atmosphere through which the laser will pass. SSM models the laser engagement and its impact on the spacecraft's optical system using four impact levels: degradation, saturation, damage, and destruction. Protection techniques, if employed, will mitigate engagement effects. SSM currently supports two laser protection techniques. SSM allows the user to create and implement a variety of "what if" scenarios. Satellites can be placed in a variety of orbits. Threats can be placed anywhere on the Earth or, for version 2.0, on other satellites. Satellites and threats can be mixed and matched to examine possibilities. Protection techniques for a particular spacecraft can be turned on or off individually; and can be arranged in any order to simulate more complicated protection schemes. Results can be displayed as 2-D or 3-D visualizations, or as textual reports. A new report feature available in version 2.0 will allow laser effects data to be displayed dynamically during scenario execution. In order to test SSM capabilities, the Ball team used SSM to model several engagement scenarios for our future laser threat detection system experiment. 
    Actual test sites, along with actual laser, optics, and detector characteristics, were entered into SSM to determine what effects we can expect to see, and to what extent. We concluded that SSM results are accurate when compared to actual field test results. The work is currently funded by the Air Force Research Laboratory, Space Vehicles Directorate at Kirtland AFB, New Mexico, under contract number FA9453-06-C-0371.
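    The four impact levels named in this record (degradation, saturation, damage, destruction) can be illustrated as a simple threshold classification. The fluence thresholds below are hypothetical placeholders; the real model derives effects from the full engagement physics rather than fixed cutoffs.

```python
# Sketch of a four-level impact classification. Thresholds (J/cm^2)
# are invented placeholders, not values from SSM.

IMPACT_THRESHOLDS = [
    (1e2, "destruction"),
    (1e0, "damage"),
    (1e-2, "saturation"),
    (1e-4, "degradation"),
]

def classify_impact(fluence_on_fpa):
    """Map delivered fluence on the focal plane array to an impact level;
    thresholds are checked from most to least severe."""
    for threshold, level in IMPACT_THRESHOLDS:
        if fluence_on_fpa >= threshold:
            return level
    return "no effect"

print(classify_impact(0.5))  # -> "saturation"
```

    A protection technique would, in this picture, attenuate the delivered fluence before classification, shifting the outcome down the severity scale.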

  2. Ash fallout scenarios at Vesuvius: Numerical simulations and implications for hazard assessment

    NASA Astrophysics Data System (ADS)

    Macedonio, G.; Costa, A.; Folch, A.

    2008-12-01

    Volcanic ash fallout subsequent to a possible renewal of Vesuvius activity represents a serious threat to the highly urbanized area around the volcano. In order to assess the relative hazard, we consider three different possible scenarios, namely those following Plinian, Sub-Plinian, and violent Strombolian eruptions. Reference eruptions for each scenario are similar to the 79 AD (Pompeii), the 1631 AD (or 472 AD), and the 1944 AD Vesuvius events, respectively. Fallout deposits for the first two scenarios are modeled using HAZMAP, a model based on a semi-analytical solution of the 2D advection-diffusion-sedimentation equation. In contrast, fallout following a violent Strombolian event is modeled by means of FALL3D, a numerical model based on the solution of the full 3D advection-diffusion-sedimentation equation, which is also valid within the atmospheric boundary layer. Inputs for the models are total erupted mass, eruption column height, bulk grain size, bulk component distribution, and a statistical set of wind profiles obtained from the NCEP/NCAR re-analysis. We computed ground load probability maps for different ash loadings. In the case of the Sub-Plinian scenario, the most representative tephra loading maps in 16 cardinal directions were also calculated. The probability maps obtained for the different scenarios are aimed at supporting risk mitigation strategies.
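    The kind of semi-analytical fallout solution this record refers to can be caricatured in a few lines: a single grain-size class released at column height settles at a constant speed while being advected by a uniform wind and spread by horizontal diffusion, giving a Gaussian ground-load footprint centred at the downwind settling distance. This is a drastic simplification of HAZMAP (which sums over many grain classes, release heights, and wind profiles), and every parameter value below is illustrative.

```python
import math

# Single-class Gaussian fallout sketch: vent at the origin, wind along +x.
# H: release height (m); v_s: settling speed (m/s); u: wind speed (m/s);
# K: horizontal diffusion coefficient (m^2/s); mass: erupted mass (kg).
# All values are illustrative placeholders.

def ground_load(x, y, mass=1e9, H=10_000.0, v_s=1.0, u=10.0, K=5000.0):
    """Ash ground load (kg/m^2) at (x, y)."""
    t_fall = H / v_s              # time for a particle to reach the ground
    sigma2 = 2.0 * K * t_fall     # horizontal variance grown by diffusion
    x_c = u * t_fall              # downwind centre of the deposit
    r2 = (x - x_c) ** 2 + y ** 2
    return mass / (2.0 * math.pi * sigma2) * math.exp(-r2 / (2.0 * sigma2))

# The load peaks at the downwind centre (100 km for these parameters)
# and decays away from it.
print(ground_load(100_000.0, 0.0))
```

    Evaluating such a kernel over a statistical set of wind profiles, and counting how often the load exceeds a damage threshold at each point, is the general shape of the ground-load probability maps described above.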

  3. Dynamic Effects of Self-Relevance and Task on the Neural Processing of Emotional Words in Context

    PubMed Central

    Fields, Eric C.; Kuperberg, Gina R.

    2016-01-01

    We used event-related potentials (ERPs) to examine the interactions between task, emotion, and contextual self-relevance on processing words in social vignettes. Participants read scenarios that were in either third person (other-relevant) or second person (self-relevant) and we recorded ERPs to a neutral, pleasant, or unpleasant critical word. In a previously reported study (Fields and Kuperberg, 2012) with these stimuli, participants were tasked with producing a third sentence continuing the scenario. We observed a larger LPC to emotional words than neutral words in both the self-relevant and other-relevant scenarios, but this effect was smaller in the self-relevant scenarios because the LPC was larger on the neutral words (i.e., a larger LPC to self-relevant than other-relevant neutral words). In the present work, participants simply answered comprehension questions that did not refer to the emotional aspects of the scenario. Here we observed quite a different pattern of interaction between self-relevance and emotion: the LPC was larger to emotional vs. neutral words in the self-relevant scenarios only, and there was no effect of self-relevance on neutral words. Taken together, these findings suggest that the LPC reflects a dynamic interaction between specific task demands, the emotional properties of a stimulus, and contextual self-relevance. We conclude by discussing implications and future directions for a functional theory of the emotional LPC. PMID:26793138

  4. Enhancements to an Agriculture-land Modeling System - FEST-C and Its Applications

    EPA Science Inventory

    The Fertilizer Emission Scenario Tool for CMAQ (FEST-C) system was originally developed to simulate daily fertilizer application information using the Environmental Policy Integrated Climate (EPIC) model across any defined conterminous United States (U.S.) CMAQ domain and gr...

  5. Star formation history: Modeling of visual binaries

    NASA Astrophysics Data System (ADS)

    Gebrehiwot, Y. M.; Tessema, S. B.; Malkov, O. Yu.; Kovaleva, D. A.; Sytov, A. Yu.; Tutukov, A. V.

    2018-05-01

    Most stars form in binary or multiple systems. Their evolution is defined by masses of components, orbital separation and eccentricity. In order to understand star formation and evolutionary processes, it is vital to find distributions of physical parameters of binaries. We have carried out Monte Carlo simulations in which we simulate different pairing scenarios: random pairing, primary-constrained pairing, split-core pairing, and total and primary pairing in order to get distributions of binaries over physical parameters at birth. Next, for comparison with observations, we account for stellar evolution and selection effects. Brightness, radius, temperature, and other parameters of components are assigned or calculated according to approximate relations for stars in different evolutionary stages (main-sequence stars, red giants, white dwarfs, relativistic objects). Evolutionary stage is defined as a function of system age and component masses. We compare our results with the observed IMF, binarity rate, and binary mass-ratio distributions for field visual binaries to find initial distributions and pairing scenarios that produce observed distributions.
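    The kind of Monte Carlo pairing experiment described can be sketched as follows; the Salpeter-slope power-law IMF, the parameter values, and the random-pairing rule below are illustrative assumptions, not the paper's actual ingredients:

```python
import random

def sample_imf(n, alpha=2.35, m_min=0.5, m_max=10.0, seed=1):
    """Draw n stellar masses from a power-law IMF dN/dm ~ m**-alpha
    by inverse-transform sampling."""
    rng = random.Random(seed)
    k = 1.0 - alpha
    a, b = m_min ** k, m_max ** k
    return [(a + rng.random() * (b - a)) ** (1.0 / k) for _ in range(n)]

def random_pairing(masses, seed=2):
    """Random pairing: shuffle the population and pair consecutive stars;
    the mass ratio is q = m_secondary / m_primary <= 1."""
    ms = list(masses)
    random.Random(seed).shuffle(ms)
    pairs = zip(ms[0::2], ms[1::2])
    return [min(m1, m2) / max(m1, m2) for m1, m2 in pairs]

qs = random_pairing(sample_imf(10_000))
mean_q = sum(qs) / len(qs)
```

    Comparing the resulting mass-ratio distribution, and its analogues under primary-constrained or split-core pairing, against observed visual-binary statistics is the essence of the approach described above, once stellar evolution and selection effects are layered on top.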

  6. Communications platform payload definition study

    NASA Technical Reports Server (NTRS)

    Clopp, H. W.; Hawkes, T. A.; Bertles, C. R.; Pontano, B. A.; Kao, T.

    1986-01-01

    Large geostationary communications platforms have been investigated in a number of studies since 1974 as a possible means to more effectively utilize the geostationary arc and electromagnetic spectrum and to reduce overall satellite communications system costs. The commercial feasibility of various communications platform payload concepts circa 1998 was addressed. Promising payload concepts were defined, recurring costs were estimated, and critical technologies needed to enable eventual commercialization were identified. Ten communications service aggregation scenarios describing potential groupings of services were developed for a range of conditions. Payload concepts were defined for four of these scenarios: (1) Land Mobile Satellite Service (LMSS) meets 100% of Contiguous United States (CONUS) plus Canada demand with a single platform; (2) Fixed Satellite Service (FSS) (trunking + Customer Premises Service (CPS)) meets 20% of CONUS demand; (3) FSS (trunking + CPS + video distribution) meets 10 to 13% of CONUS demand; and (4) FSS (20% of demand) + Inter Satellite Links (ISL) + Tracking and Data Relay Satellite System (TDRSS)/Tracking and Data Acquisition System (TDAS) data distribution.

  7. Thermal-hydraulic analysis of N Reactor graphite and shield cooling system performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Low, J.O.; Schmitt, B.E.

    1988-02-01

    A series of bounding (worst-case) calculations were performed using a detailed hydrodynamic RELAP5 model of the N Reactor graphite and shield cooling system (GSCS). These calculations were specifically aimed at answering issues raised by the Westinghouse Independent Safety Review (WISR) committee. These questions address the operability of the GSCS during a worst-case degraded-core accident that requires the GSCS to mitigate the consequences of the accident. An accident scenario previously developed was designated as the hydrogen-mitigation design-basis accident (HMDBA). Previous HMDBA heat transfer analysis, using the TRUMP-BD code, was used to define the thermal boundary conditions that the GSCS may be exposed to. These TRUMP/HMDBA analysis results were used to define the bounding operating conditions of the GSCS during the course of an HMDBA transient. Nominal and degraded GSCS scenarios were investigated using RELAP5 within or at the bounds of the HMDBA transient. 10 refs., 42 figs., 10 tabs.

  8. Cretaceous subduction in the Pyrenees: Iberian plate-kinematics in a mantle reference frame

    NASA Astrophysics Data System (ADS)

    Vissers, Reinoud; van Hinsbergen, Douwe; van der Meer, Douwe; Spakman, Wim

    2016-04-01

    During the Cretaceous, Iberia was a microplate separated from Laurasia and Gondwana by ridges and transforms, and by a convergent margin to its northeast along which the Pyrenean fold-thrust belt developed. As a microplate, Iberia underwent a well-defined but ill-understood Albian-Aptian ~ 35° counterclockwise rotation relative to Eurasia. Three competing kinematic scenarios for Iberian motion in the late Mesozoic are all compatible with the Pyrenean geological record and comprise (1) transtensional eastward motion of Iberia versus Eurasia, (2) strike-slip motion followed by orthogonal extension and (3) scissor-style opening of the Bay of Biscay coupled with subduction in the Pyrenean realm. The last scenario is the only one consistent with paleomagnetic and ocean floor anomaly constraints showing Iberia's rotation, but is criticized because the upper mantle below the Pyrenees contains no evidence for a subducted slab. Here we show that when taking absolute plate motions into account, Aptian oceanic subduction in the Pyrenees followed by Albian slab break-off should leave a slab remnant in the present-day mid-mantle below NW Africa instead of below the Pyrenees. Mantle tomography shows a positive seismic velocity anomaly that matches the predicted position and dimension of such a slab remnant between 1900 and 1500 km depth below Reggane in Southern Algeria. Seismic tomographic imaging of the mantle structure therefore does not falsify the Pyrenean subduction hypothesis, and provides no basis to discard marine magnetic and paleomagnetic constraints on Iberia's kinematic history. Slab break-off explains the well-dated Albian-Cenomanian high-temperature metamorphism in the Pyrenees that hitherto has been interpreted as an expression of continental break-up and hyperextension. We suspect that subduction in the Pyrenees may have played a key role in driving the rapid Aptian rotation of the Iberian microplate.

  9. An algorithm for optimal fusion of atlases with different labeling protocols

    PubMed Central

    Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Aganj, Iman; Bhatt, Priyanka; Casillas, Christen; Salat, David; Boxer, Adam; Fischl, Bruce; Van Leemput, Koen

    2014-01-01

    In this paper we present a novel label fusion algorithm suited for scenarios in which different manual delineation protocols with potentially disparate structures have been used to annotate the training scans (hereafter referred to as “atlases”). Such scenarios arise when atlases have missing structures, when they have been labeled with different levels of detail, or when they have been taken from different heterogeneous databases. The proposed algorithm can be used to automatically label a novel scan with any of the protocols from the training data. Further, it enables us to generate new labels that are not present in any delineation protocol by defining intersections on the underlying labels. We first use probabilistic models of label fusion to generalize three popular label fusion techniques to the multi-protocol setting: majority voting, semi-locally weighted voting and STAPLE. Then, we identify some shortcomings of the generalized methods, namely the inability to produce meaningful posterior probabilities for the different labels (majority voting, semi-locally weighted voting) and to exploit the similarities between the atlases (all three methods). Finally, we propose a novel generative label fusion model that can overcome these drawbacks. We use the proposed method to combine four brain MRI datasets labeled with different protocols (with a total of 102 unique labeled structures) to produce segmentations of 148 brain regions. Using cross-validation, we show that the proposed algorithm outperforms the generalizations of majority voting, semi-locally weighted voting and STAPLE (mean Dice score 83%, vs. 77%, 80% and 79%, respectively). We also evaluated the proposed algorithm in an aging study, successfully reproducing some well-known results in cortical and subcortical structures. PMID:25463466
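    The first of the generalized baselines, majority voting, extends to the multi-protocol setting by letting an atlas abstain on labels its protocol does not define, rather than voting against them. A toy per-voxel sketch follows; the structure names are hypothetical, and this simplification omits the paper's probabilistic machinery:

```python
from collections import Counter

def majority_vote(atlas_labels, protocols):
    """Fuse the labels proposed for one voxel by several atlases.

    atlas_labels: one proposed label per atlas.
    protocols: the set of labels each atlas's delineation protocol can
    produce; an atlas only votes on labels its protocol defines.
    """
    votes = Counter()
    for label, protocol in zip(atlas_labels, protocols):
        if label in protocol:
            votes[label] += 1
    # Ties are broken arbitrarily here; the paper's generative model
    # instead yields posterior probabilities over labels.
    return votes.most_common(1)[0][0] if votes else None

fused = majority_vote(
    ["hippocampus", "hippocampus", "temporal_lobe"],
    [{"hippocampus", "amygdala"},
     {"hippocampus", "amygdala"},
     {"temporal_lobe"}],
)
```

    The third atlas, labeled under a coarser protocol, cannot contradict the fine-grained "hippocampus" vote; it simply abstains on structures outside its protocol.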

  10. A Process-Based Assessment for Watershed Restoration Planning, Chehalis River Basin, USA

    NASA Astrophysics Data System (ADS)

    Beechie, T. J.; Thompson, J.; Seixas, G.; Fogel, C.; Hall, J.; Chamberlin, J.; Kiffney, P.; Pollock, M. M.; Pess, G. R.

    2016-12-01

    Three key questions in identifying and prioritizing river restoration are: (1) How have habitats changed? (2) What are the causes of those habitat changes? (3) How have those changes affected the species of interest? To answer these questions and assist aquatic habitat restoration planning in the Chehalis River basin, USA, we quantified habitat changes across the river network from headwaters to the estuary. We estimated historical habitat capacity to support salmonids using a combination of historical assessments, reference sites, and models. We also estimated current capacity from recent or newly created data sets. We found that losses of floodplain habitats and beaver ponds were substantial, while the estuary was less modified. Both tributary and main channel habitats—while modified—did not show particularly large habitat changes. Assessments of key processes that form and sustain habitats indicate that riparian functions (shading and wood recruitment) have been significantly altered, although peak and low flows have also been altered in some locations. The next step is to link our habitat assessments to salmon life-cycle models to evaluate which life stages and habitat types currently constrain population sizes of spring and fall Chinook salmon, coho salmon, and steelhead. By comparing model runs that represent different components of habitat losses identified in the analysis above, life-cycle models help identify which habitat losses have most impacted each species and population. This assessment will indicate which habitat types provide the greatest restoration potential, and help define a guiding vision for restoration efforts. Future analyses may include development and evaluation of alternative restoration scenarios, including different climate change scenarios, to refine our understanding of which restoration actions provide the greatest benefit to a salmon population.

  11. Health System Resource Gaps and Associated Mortality from Pandemic Influenza across Six Asian Territories

    PubMed Central

    Rudge, James W.; Hanvoravongchai, Piya; Krumkamp, Ralf; Chavez, Irwin; Adisasmito, Wiku; Ngoc Chau, Pham; Phommasak, Bounlay; Putthasri, Weerasak; Shih, Chin-Shui; Stein, Mart; Timen, Aura; Touch, Sok; Reintjes, Ralf; Coker, Richard

    2012-01-01

    Background Southeast Asia has been the focus of considerable investment in pandemic influenza preparedness. Given the wide variation in socio-economic conditions, health system capacity across the region is likely to impact to varying degrees on pandemic mitigation operations. We aimed to estimate and compare the resource gaps, and potential mortalities associated with those gaps, for responding to pandemic influenza within and between six territories in Asia. Methods and Findings We collected health system resource data from Cambodia, Indonesia (Jakarta and Bali), Lao PDR, Taiwan, Thailand and Vietnam. We applied a mathematical transmission model to simulate a “mild-to-moderate” pandemic influenza scenario to estimate resource needs, gaps, and attributable mortalities at province level within each territory. The results show that wide variations exist in resource capacities between and within the six territories, with substantial mortalities predicted as a result of resource gaps (referred to here as “avoidable” mortalities), particularly in poorer areas. Severe nationwide shortages of mechanical ventilators were estimated to be a major cause of avoidable mortalities in all territories except Taiwan. Other resources (oseltamivir, hospital beds and human resources) are inequitably distributed within countries. Estimates of resource gaps and avoidable mortalities were highly sensitive to model parameters defining the transmissibility and clinical severity of the pandemic scenario. However, geographic patterns observed within and across territories remained similar for the range of parameter values explored. Conclusions The findings have important implications for where (both geographically and in terms of which resource types) investment is most needed, and the potential impact of resource mobilization for mitigating the disease burden of an influenza pandemic. 
Effective mobilization of resources across administrative boundaries could go some way towards minimizing avoidable deaths. PMID:22363739
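    The gap-estimation logic can be illustrated with a toy SIR transmission model whose peak concurrent severe-case load is compared against a ventilator stock. All parameter values below are invented for illustration and are not those of the study's calibrated pandemic scenario:

```python
def sir_peak_severe(pop, r0=1.5, infectious_days=3.0, severe_frac=0.005, days=365):
    """Discrete-time SIR model; returns the peak number of concurrent
    severe cases (those assumed to need mechanical ventilation)."""
    gamma = 1.0 / infectious_days
    beta = r0 * gamma
    s, i, r = pop - 1.0, 1.0, 0.0
    peak = 0.0
    for _ in range(days):
        new_inf = beta * s * i / pop
        s, i, r = s - new_inf, i + new_inf - gamma * i, r + gamma * i
        peak = max(peak, i * severe_frac)
    return peak

def ventilator_gap(pop, ventilators):
    """Shortfall at the epidemic peak; a positive gap implies
    potentially 'avoidable' mortalities in the sense used above."""
    return max(0.0, sir_peak_severe(pop) - ventilators)

gap = ventilator_gap(pop=1_000_000, ventilators=100)
```

    Run per province with local resource stocks, a calculation of this shape reproduces the study's qualitative point: the gap, and hence avoidable mortality, is highly sensitive to the transmissibility (r0) and severity (severe_frac) assumptions, while the geographic pattern is driven by the distribution of resources.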

  12. Health system resource gaps and associated mortality from pandemic influenza across six Asian territories.

    PubMed

    Rudge, James W; Hanvoravongchai, Piya; Krumkamp, Ralf; Chavez, Irwin; Adisasmito, Wiku; Chau, Pham Ngoc; Phommasak, Bounlay; Putthasri, Weerasak; Shih, Chin-Shui; Stein, Mart; Timen, Aura; Touch, Sok; Reintjes, Ralf; Coker, Richard

    2012-01-01

    Southeast Asia has been the focus of considerable investment in pandemic influenza preparedness. Given the wide variation in socio-economic conditions, health system capacity across the region is likely to impact to varying degrees on pandemic mitigation operations. We aimed to estimate and compare the resource gaps, and potential mortalities associated with those gaps, for responding to pandemic influenza within and between six territories in Asia. We collected health system resource data from Cambodia, Indonesia (Jakarta and Bali), Lao PDR, Taiwan, Thailand and Vietnam. We applied a mathematical transmission model to simulate a "mild-to-moderate" pandemic influenza scenario to estimate resource needs, gaps, and attributable mortalities at province level within each territory. The results show that wide variations exist in resource capacities between and within the six territories, with substantial mortalities predicted as a result of resource gaps (referred to here as "avoidable" mortalities), particularly in poorer areas. Severe nationwide shortages of mechanical ventilators were estimated to be a major cause of avoidable mortalities in all territories except Taiwan. Other resources (oseltamivir, hospital beds and human resources) are inequitably distributed within countries. Estimates of resource gaps and avoidable mortalities were highly sensitive to model parameters defining the transmissibility and clinical severity of the pandemic scenario. However, geographic patterns observed within and across territories remained similar for the range of parameter values explored. The findings have important implications for where (both geographically and in terms of which resource types) investment is most needed, and the potential impact of resource mobilization for mitigating the disease burden of an influenza pandemic. Effective mobilization of resources across administrative boundaries could go some way towards minimizing avoidable deaths.

  13. Time evolution of strategic and non-strategic 2-party competitions

    NASA Astrophysics Data System (ADS)

    Shanahan, Linda Lee

    The study of the nature of conflict and competition and its many manifestations---military, social, environmental, biological---has enjoyed a long history and garnered the attention of researchers in many disciplines. It will no doubt continue to do so. That the topic is of interest to some in the physics community has to do with the critical role physicists have shouldered in furthering knowledge in every sphere with reference to behavior observed in nature. The techniques, in the case of this research, are rooted in statistical physics and the science of probability. Our tools include the use of cellular automata and random number generators in an agent-based modeling approach. In this work, we first examine a type of "conflict" model where two parties vie for the same resources with no apparent strategy or intelligence, their interactions devolving to random encounters. Analytical results for the time evolution of the model are presented with multiple examples. What at first encounter seems a trivial formulation is found to be a model with rich possibilities for adaptation to far more interesting and potentially relevant scenarios. An example of one such possibility---random events punctuated by correlated non-random ones---is included. We then turn our attention to a different conflict scenario, one in which one party acts with no strategy and in a random manner while the other receives intelligence, makes decisions, and acts with a specific purpose. We develop a set of parameters and examine several examples for insight into the model behavior in different regions of the parameter space, finding both intuitive and non-intuitive results. Of particular interest is the role of the so-called "intelligence" in determining the outcome of a conflict. We consider two applications for which specific conditions are imposed on the parameters.
First, can an invader beginning in a single cell or site and utilizing a search and deploy strategy gain territory in an environment defined by constant exposure to random attacks? What magnitude of defense is sufficient to eliminate or contain such growth, and what role does the quantity and quality of available information play? Second, we build on the idea of a single intruder to include a look at a scenario where a single intruder or a small group of intruders invades or attacks a space which may have significant restrictions (such as walls or other inaccessible spaces). The importance of information and strategy emerges in keeping with intuitive expectations. Additional derivations are provided in the appendix, along with the MATLAB codes for the models. References are relegated to the end of the thesis.
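    The non-strategic case can be sketched as a minimal lattice model in which two parties claim cells through purely random encounters. The grid size and update rule here are illustrative, and the thesis's own models are implemented in MATLAB:

```python
import random

def random_conflict(width=20, height=20, steps=5000, seed=0):
    """Two parties (1 and 2) alternately claim a randomly chosen cell,
    overwriting the other's holdings; no strategy or intelligence is
    involved. Returns the final territory count for each party."""
    rng = random.Random(seed)
    grid = [[0] * width for _ in range(height)]
    for step in range(steps):
        party = 1 if step % 2 == 0 else 2   # parties alternate turns
        x, y = rng.randrange(width), rng.randrange(height)
        grid[y][x] = party
    return {p: sum(row.count(p) for row in grid) for p in (1, 2)}

counts = random_conflict()
```

    With symmetric rules the long-run territories fluctuate around an even split; the strategic variants discussed above replace one party's random cell choice with informed, purposeful moves, which is where the role of "intelligence" enters.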

  14. Factors influencing general practitioner referral of patients developing end-stage renal failure: a standardised case-analysis study.

    PubMed

    Montgomery, Anthony J; McGee, Hannah M; Shannon, William; Donohoe, John

    2006-09-13

    To understand why treatment referral rates for ESRF are lower in Ireland than in other European countries, an investigation of factors influencing general practitioner referral of patients developing ESRF was conducted. Randomly selected general practitioners (N = 51) were interviewed using 32 standardised written patient scenarios to elicit referral strategies. Outcome measures were general practitioner referral levels and thresholds for patients developing end-stage renal disease; referral routes (nephrologist vs other physicians); and the influence of patient age, marital status and co-morbidity on referral. Referral levels varied widely, with the full range of cases (0-32; median = 15) referred by different doctors after consideration of first laboratory results. Less than half (44%) of cases were referred to a nephrologist. Patient age (40 vs 70 years), marital status, co-morbidity (none vs rheumatoid arthritis) and general practitioner prior specialist renal training (yes or no) did not influence referral rates. Many patients were not referred to a specialist at creatinine levels of 129 micromol/l (47% not referred) or 250 micromol/l (45%). While all patients were referred at higher levels (350 and 480 micromol/l), referral to a nephrologist decreased in likelihood as scenarios became more complex: 28% at 129 micromol/l creatinine; 28% at 250 micromol/l; 18% at 350 micromol/l; and 14% at 480 micromol/l. Referral levels and routes were not influenced by general practitioner age, sex or practice location. Most general practitioners had little current contact with chronic renal patients (mean number in practice = 0.7, s.d. = 1.3). The very divergent management patterns identified highlight the need for guidance to general practitioners on appropriate management of this serious condition.

  15. Technostress and the Reference Librarian.

    ERIC Educational Resources Information Center

    Kupersmith, John

    1992-01-01

    Defines "technostress" as the stress experienced by reference librarians who must constantly deal with the demands of new information technology and the changes they produce in the work place. Discussion includes suggested ways in which both organizations and individuals can work to reduce stress. (27 references) (LAE)

  16. Changing Roles for Reference Librarians.

    ERIC Educational Resources Information Center

    Kelly, Julia; Robbins, Kathryn

    1996-01-01

    Discusses the future outlook for reference librarians, with topics including: "Technology as the Source of Change"; "Impact of the Internet"; "Defining the Virtual Library"; "Rethinking Reference"; "Out of the Library and into the Streets"; "Asking Users About Their Needs"; "Standardization and Artificial Intelligence"; "The Financial Future"; and…

  17. Using Optimal Land-Use Scenarios to Assess Trade-Offs between Conservation, Development, and Social Values.

    PubMed

    Adams, Vanessa M; Pressey, Robert L; Álvarez-Romero, Jorge G

    2016-01-01

    Development of land resources can contribute to increased economic productivity but can also negatively affect the extent and condition of native vegetation, jeopardize the persistence of native species, reduce water quality, and erode ecosystem services. Spatial planning must therefore balance outcomes for conservation, development, and social goals. One approach to evaluating these trade-offs is scenario planning. In this paper we demonstrate methods for incorporating stakeholder preferences into scenario planning through both defining scenario objectives and evaluating the scenarios that emerge. In this way, we aim to develop spatial plans capable of informing actual land-use decisions. We used a novel approach to scenario planning that couples optimal land-use design and social evaluation of environmental outcomes. Four land-use scenarios combined differences in total clearing levels (10% and 20%) in our study region, the Daly Catchment, Australia, with the presence or absence of spatial precincts to concentrate irrigated agriculture. We used the systematic conservation planning tool Marxan with Zones to optimally plan for multiple land-uses that met objectives for both conservation and development. We assessed the performance of the scenarios in terms of the number of objectives met and the degree to which existing land-use policies were compromised (e.g., whether clearing limits in existing guidelines were exceeded or not). We also assessed the land-use scenarios using expected stakeholder satisfaction with changes in the catchment to explore how the scenarios performed against social preferences. There was a small fraction of conservation objectives with high conservation targets (100%) that could not be met due to current land uses; all other conservation and development objectives were met in all scenarios. 
Most scenarios adhered to the existing clearing guidelines with only marginal exceedances of limits, indicating that the scenario objectives were compatible with existing policy. We found that two key stakeholder groups, agricultural and Indigenous residents, had divergent satisfaction levels with the amount of clearing and agricultural development. Based on the range of benefits and potential adverse impacts of each scenario, we suggest that the 10% clearing scenarios are most aligned with stakeholder preferences and best balance preferences across stakeholder groups. Our approach to scenario planning is applicable generally to exploring the potential conflicts between goals for conservation and development. Our case study is particularly relevant to current discussion about increased agricultural and pastoral development in northern Australia.

  18. Using Optimal Land-Use Scenarios to Assess Trade-Offs between Conservation, Development, and Social Values

    PubMed Central

    Adams, Vanessa M.; Pressey, Robert L.; Álvarez-Romero, Jorge G.

    2016-01-01

    Development of land resources can contribute to increased economic productivity but can also negatively affect the extent and condition of native vegetation, jeopardize the persistence of native species, reduce water quality, and erode ecosystem services. Spatial planning must therefore balance outcomes for conservation, development, and social goals. One approach to evaluating these trade-offs is scenario planning. In this paper we demonstrate methods for incorporating stakeholder preferences into scenario planning through both defining scenario objectives and evaluating the scenarios that emerge. In this way, we aim to develop spatial plans capable of informing actual land-use decisions. We used a novel approach to scenario planning that couples optimal land-use design and social evaluation of environmental outcomes. Four land-use scenarios combined differences in total clearing levels (10% and 20%) in our study region, the Daly Catchment, Australia, with the presence or absence of spatial precincts to concentrate irrigated agriculture. We used the systematic conservation planning tool Marxan with Zones to optimally plan for multiple land-uses that met objectives for both conservation and development. We assessed the performance of the scenarios in terms of the number of objectives met and the degree to which existing land-use policies were compromised (e.g., whether clearing limits in existing guidelines were exceeded or not). We also assessed the land-use scenarios using expected stakeholder satisfaction with changes in the catchment to explore how the scenarios performed against social preferences. There was a small fraction of conservation objectives with high conservation targets (100%) that could not be met due to current land uses; all other conservation and development objectives were met in all scenarios. 
Most scenarios adhered to the existing clearing guidelines with only marginal exceedances of limits, indicating that the scenario objectives were compatible with existing policy. We found that two key stakeholder groups, agricultural and Indigenous residents, had divergent satisfaction levels with the amount of clearing and agricultural development. Based on the range of benefits and potential adverse impacts of each scenario, we suggest that the 10% clearing scenarios are most aligned with stakeholder preferences and best balance preferences across stakeholder groups. Our approach to scenario planning is applicable generally to exploring the potential conflicts between goals for conservation and development. Our case study is particularly relevant to current discussion about increased agricultural and pastoral development in northern Australia. PMID:27362347

  19. Low energy stage study. Volume 2: Requirements and candidate propulsion modes. [orbital launching of shuttle payloads

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A payload mission model covering 129 launches was examined and compared against the space transportation system Shuttle standard orbit inclinations and a Shuttle launch site implementation schedule. Based on this examination and comparison, a set of six reference missions was defined in terms of spacecraft weight and velocity requirements to deliver the payload from a 296 km circular Shuttle standard orbit to the spacecraft's planned orbit. Payload characteristics and requirements representative of the model payloads included in the regime bounded by each of the six reference missions were determined. A set of launch cost envelopes was developed, based on the characteristics of existing/planned Shuttle upper stages and expendable launch systems, in terms of launch cost and velocity delivered. These six reference missions were used to define the requirements for the candidate propulsion modes, which were developed and screened to determine the propulsion approaches for conceptual design.

  20. Long-Term Variations of the EOP and ICRF2

    NASA Technical Reports Server (NTRS)

    Zharov, Vladimir; Sazhin, Mikhail; Sementsov, Valerian; Sazhina, Olga

    2010-01-01

    We analyzed the time series of the coordinates of the ICRF radio sources. We show that some of the radio sources, including defining sources, exhibit significant apparent motion. The stability of the celestial reference frame is provided by a no-net-rotation condition applied to the defining sources. In our case this condition leads to a rotation of the frame axes with time. We calculated the effect of this rotation on the Earth orientation parameters (EOP). In order to improve the stability of the celestial reference frame we suggest a new method for the selection of the defining sources. The method consists of two criteria: the first we call cosmological and the second kinematical. It is shown that a subset of the ICRF sources selected according to cosmological criteria provides the most stable reference frame for the next decade.
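    A no-net-rotation condition of the kind described amounts to solving for the global rotation that best explains the defining sources' apparent motions and removing it. A self-contained least-squares sketch follows; the source directions, drift values, and function names are all illustrative, not taken from the paper:

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def solve3(m, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    a = [row[:] + [bi] for row, bi in zip(m, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(3):
            if r != col:
                f = a[r][col] / a[col][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [a[i][3] / a[i][i] for i in range(3)]

def fit_net_rotation(positions, drifts):
    """Least-squares global rotation vector w such that drift_i ~ w x r_i,
    via the normal equations of the linear model d = A(r) w."""
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for r, d in zip(positions, drifts):
        A = [[0.0, r[2], -r[1]], [-r[2], 0.0, r[0]], [r[1], -r[0], 0.0]]
        for i in range(3):
            for j in range(3):
                M[i][j] += sum(A[k][i] * A[k][j] for k in range(3))
            v[i] += sum(A[k][i] * d[k] for k in range(3))
    return solve3(M, v)

# Synthetic check: drifts generated by a known rotation are recovered.
srcs = [(math.cos(t), math.sin(t), z) for t, z in
        [(0.1, 0.3), (1.2, -0.5), (2.5, 0.8), (4.0, 0.0), (5.5, -0.9)]]
true_w = (1e-9, -2e-9, 3e-9)          # rad/yr, illustrative magnitude
drifts = [cross(true_w, r) for r in srcs]
w = fit_net_rotation(srcs, drifts)
```

    Subtracting the fitted rotation (drift_i − w × r_i) enforces no net rotation over the chosen source subset; if the defining sources themselves drift, the condition rotates the frame axes with time, which is the effect on the EOP discussed above.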

  1. Mars rover/sample return mission requirements affecting space station

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The possible interfaces between the Space Station and the Mars Rover/Sample Return (MRSR) mission are defined. In order to constrain the scope of the report, a series of seven design reference missions, divided into three major types, were assumed. These missions were defined to span the probable range of Space Station-MRSR interactions. After the options were reduced, the MRSR sample handling requirements, baseline assumptions about the MRSR hardware, and the key design features and requirements of the Space Station are summarized. Only the aspects of the design reference missions necessary to define the interfaces, hooks and scars, and other provisions on the Space Station are considered. An analysis of each of the three major design reference mission types is reported, presenting conceptual designs of key hardware to be mounted on the Space Station, a definition of weights, interfaces, and required hooks and scars.

  2. Thermocouple, multiple junction reference oven

    NASA Technical Reports Server (NTRS)

    Leblanc, L. P. (Inventor)

    1981-01-01

    An improved oven for maintaining the junctions of a plurality of reference thermocouples at a common and constant temperature is described. The oven is characterized by a cylindrical body defining a heat sink with an axially extended cylindrical cavity, and a unitary cylindrical heating element consisting of a resistance heating coil wound about the surface of a metallic spool with an axial bore, seated in the cavity. Other features of the oven include an annular array of radially extended bores defined in the cylindrical body, a plurality of reference thermocouple junctions seated in these bores in uniformly spaced relation to the heating element, and a temperature sensing device seated in the axial bore that detects temperature changes in the spool, with a circuit that applies a voltage across the coil in response to detected drops in spool temperature.

  3. Review of Interactive Video--Romanian Project Proposal

    ERIC Educational Resources Information Center

    Onita, Mihai; Petan, Sorin; Vasiu, Radu

    2016-01-01

    In recent years, the globalization and massification of video education have involved more and more eLearning scenarios within universities. This article refers to interactive video and proposes an overview of it. We analyze the background information regarding the eLearning campuses used in virtual universities around the world, the MOOC…

  4. Data-Driven Decision-Making: It's a Catch-Up Game

    ERIC Educational Resources Information Center

    Briggs, Linda L.

    2006-01-01

    Having an abundance of data residing in individual silos across campus, but little decision-ready information, is a typical scenario at many institutions. One problem is that the terms "data warehousing" and "business intelligence" refer to very different things, although the two often go hand-in-hand. "Data…

  5. In-Situ Burning of Spilled Oil.

    ERIC Educational Resources Information Center

    Allen, Alan A.

    1991-01-01

    Reviews in-situ burning with particular emphasis on how it can be applied in water-related oil spill situations. Presents and discusses the use of nomograms and development of techniques cited for safe and effective ignition and controlled burning of spilled oil. Includes representative oil spill scenarios and possible responses. (15 references)…

  6. Scenario planning.

    PubMed

    Enzmann, Dieter R; Beauchamp, Norman J; Norbash, Alexander

    2011-03-01

    In facing future developments in health care, scenario planning offers a complementary approach to traditional strategic planning. Whereas traditional strategic planning typically consists of predicting the future at a single point on a chosen time horizon and mapping the preferred plans to address such a future, scenario planning creates stories about multiple likely potential futures on a given time horizon and maps the preferred plans to address the multiple described potential futures. Each scenario is purposefully different and specifically not a consensus worst-case, average, or best-case forecast; nor is scenario planning a process of probabilistic prediction. Scenario planning focuses on high-impact, uncertain driving forces that in the authors' example affect the field of radiology. Uncertainty is the key concept as these forces are mapped onto axes of uncertainty, the poles of which have opposed effects on radiology. One chosen axis was "market focus," with poles of centralized health care (government control) vs a decentralized private market. Another axis was "radiology's business model," with one pole being a unified, single specialty vs a splintered, disaggregated subspecialty. The third axis was "technology and science," with one pole representing technology enabling to radiology vs technology threatening to radiology. Selected poles of these axes were then combined to create 3 scenarios. One scenario, termed "entrepreneurialism," consisted of a decentralized private market, a disaggregated business model, and threatening technology and science. A second scenario, termed "socialized medicine," had a centralized market focus, a unified specialty business model, and enabling technology and science. A third scenario, termed "freefall," had a centralized market focus, a disaggregated business model, and threatening technology and science. 
These scenarios provide a range of futures that ultimately allow the identification of defined "signposts" that can suggest which basic features among the "possible futures" are playing out. Scenario planning provides for the implementation of appropriate constructed strategic responses. Scenarios allow for a pre-prepared game plan available for ready use as the future unfolds. They allow a deliberative response rather than a hastily constructed, urgent response. Copyright © 2011 American College of Radiology. Published by Elsevier Inc. All rights reserved.
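The axis-and-pole construction above can be sketched mechanically: every combination of poles is a candidate future, and the three named scenarios are particular picks among the eight combinations. The code below is only an illustration of that bookkeeping.

```python
from itertools import product

# The three uncertainty axes and their poles, as named in the abstract.
axes = {
    "market focus": ("centralized", "decentralized"),
    "business model": ("unified", "disaggregated"),
    "technology and science": ("enabling", "threatening"),
}

# Every combination of poles is a candidate future; the authors develop
# three of the eight into full scenario narratives.
futures = list(product(*axes.values()))
print(len(futures))  # 8

named = {
    "entrepreneurialism": ("decentralized", "disaggregated", "threatening"),
    "socialized medicine": ("centralized", "unified", "enabling"),
    "freefall": ("centralized", "disaggregated", "threatening"),
}
assert all(combo in futures for combo in named.values())
```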

  7. The value of cows in reference populations for genomic selection of new functional traits.

    PubMed

    Buch, L H; Kargo, M; Berg, P; Lassen, J; Sørensen, A C

    2012-06-01

    Today, almost all reference populations consist of progeny tested bulls. However, older progeny tested bulls do not have reliable estimated breeding values (EBV) for new traits. Thus, to be able to select for these new traits, it is necessary to build a reference population. We used a deterministic prediction model to test the hypothesis that the value of cows in reference populations depends on the availability of phenotypic records. To test the hypothesis, we investigated different strategies of building a reference population for a new functional trait over a 10-year period. The trait was either recorded on a large scale (30 000 cows per year) or on a small scale (2000 cows per year). For large-scale recording, we compared four scenarios where the reference population consisted of 30 sires; 30 sires and 170 test bulls; 30 sires and 2000 cows; or 30 sires, 2000 cows and 170 test bulls in the first year with measurements of the new functional trait. In addition to varying the make-up of the reference population, we also varied the heritability of the trait (h2 = 0.05 v. 0.15). The results showed that a reference population of test bulls, cows and sires results in the highest accuracy of the direct genomic values (DGV) for a new functional trait, regardless of its heritability. For small-scale recording, we compared two scenarios where the reference population consisted of the 2000 cows with phenotypic records or the 30 sires of these cows in the first year with measurements of the new functional trait. The results showed that a reference population of cows results in the highest accuracy of the DGV whether the heritability is 0.05 or 0.15, because variation is lost when phenotypic data on cows are summarized in EBV of their sires. 
    The main conclusions from this study are: (i) the fewer the phenotypic records, the larger the effect of including cows in the reference population; (ii) for small-scale recording, the accuracy of the DGV will continue to increase for several years, whereas the increases in the accuracy of the DGV quickly diminish with large-scale recording; (iii) it is possible to achieve accuracies of the DGV that enable selection for new functional traits recorded on a large scale within 3 years from commencement of recording; and (iv) a higher heritability benefits a reference population of cows more than a reference population of bulls.
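As a rough illustration of why record counts and heritability interact this way, the sketch below uses a widely cited Daetwyler-style approximation for the accuracy of genomic prediction from a phenotyped reference population. The effective number of chromosome segments (m_e = 1000) is an assumed parameter, and this is not necessarily the deterministic model used in the study.

```python
import math

def dgv_accuracy(n_records, h2, m_e):
    # Daetwyler-style approximation: expected accuracy of direct genomic
    # values from n_records phenotyped reference animals with trait
    # heritability h2; m_e is the assumed effective number of independent
    # chromosome segments.
    q = n_records * h2 / (n_records * h2 + m_e)
    return math.sqrt(q)

# Illustrative numbers only (m_e = 1000 is an assumption, not from the study):
print(round(dgv_accuracy(2000, 0.05, 1000), 3))   # small-scale cow recording
print(round(dgv_accuracy(2000, 0.15, 1000), 3))   # higher heritability helps more
print(round(dgv_accuracy(30000, 0.05, 1000), 3))  # large-scale recording
```

Under this approximation, raising h2 from 0.05 to 0.15 helps a small reference population proportionally more than a large one, consistent with conclusion (iv).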

  8. Water-sensitivity assessment of regional spatial plan based on the relation between watershed imperviousness and aquatic ecosystem health

    NASA Astrophysics Data System (ADS)

    Sutjiningsih, D.; Soeryantono, H.; Anggraheni, E.

    2018-04-01

    The Upper Ciliwung watershed in the JABODETABEKPUNJUR area is experiencing rapid population growth, which in turn promotes the pace of infrastructure development, especially increasing impervious land cover. This triggers various stressors on the abiotic and biotic elements of the aquatic ecosystem. This study examines whether the relationship between imperviousness in the subwatersheds of Upper Ciliwung and the abiotic/biotic elements of its aquatic ecosystems can be used to assess the degree of water-sensitivity of the related regional spatial plan. Two scenarios of impervious cover change were assessed: scenario 1 uses a constant growth rate of 7.56% per annum, while scenario 2 follows the regional spatial plan of Bogor Regency. Although there are inconsistencies in four (out of 13) subwatersheds, the tests showed that the procedure can be successfully applied in Upper Ciliwung.
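A scenario-1-style projection is simple compound growth of impervious cover, capped at the subwatershed area. The sketch below illustrates this; the starting cover value is assumed, not taken from the study.

```python
def project_impervious(cover0, rate, years):
    # Constant compound growth of impervious cover (percent of subwatershed
    # area), capped at 100%; rate is the annual growth fraction.
    out = []
    c = cover0
    for _ in range(years):
        c = min(c * (1 + rate), 100.0)
        out.append(c)
    return out

# e.g. a subwatershed starting at an assumed 10% impervious cover,
# growing at scenario 1's 7.56% per annum:
print(round(project_impervious(10.0, 0.0756, 10)[-1], 1))  # ~20.7% after 10 years
```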

  9. Bridging the sanitation gap between disaster relief and development.

    PubMed

    Lai, Ka-Man; Ramirez, Claudia; Liu, Weilong; Kirilova, Darina; Vick, David; Mari, Joe; Smith, Rachel; Lam, Ho-Yin; Ostovari, Afshin; Shibakawa, Akifumi; Liu, Yang; Samant, Sidharth; Osaro, Lucky

    2015-10-01

    By interpreting disasters as opportunities to initiate the fulfilment of development needs, realise the vulnerability of the affected community and environment, and extend the legacy of relief funds and effort, this paper builds upon the concept linking relief, rehabilitation and development (LRRD) in the sanitation sector. It aims to use a composite of case studies to devise a framework for a semi-hypothetical scenario to identify critical components and generic processes for a LRRD action plan. The scenario is based on a latrine wetland sanitation system in a Muslim community. Several sub-frameworks are developed: (i) latrine design; (ii) assessment of human waste treatment; (iii) connective sanitation promotion strategy; and (iv) ecological systems and environmental services for sanitation and development. This scenario illustrates the complex issues involved in LRRD in sanitation work and provides technical notes and references for a legacy plan for disaster relief and development. © 2015 The Author(s). Disasters © Overseas Development Institute, 2015.

  10. Problems in Defining the Field of Distance Education.

    ERIC Educational Resources Information Center

    Keegan, Desmond

    1988-01-01

    This discussion of definitions of distance education responds to previous articles attempting to define the field. Topics discussed include distance education versus conventional education; group-based distance education; differences between open learning and distance education; and criteria to define distance education. (13 references) (LRW)

  11. Setting conservation management thresholds using a novel participatory modeling approach.

    PubMed

    Addison, P F E; de Bie, K; Rumpff, L

    2015-10-01

    We devised a participatory modeling approach for setting management thresholds that show when management intervention is required to address undesirable ecosystem changes. This approach was designed to be used when management thresholds: must be set for environmental indicators in the face of multiple competing objectives; need to incorporate scientific understanding and value judgments; and will be set by participants with limited modeling experience. We applied our approach to a case study where management thresholds were set for a mat-forming brown alga, Hormosira banksii, in a protected area management context. Participants, including management staff and scientists, were involved in a workshop to test the approach, and set management thresholds to address the threat of trampling by visitors to an intertidal rocky reef. The approach involved trading off the environmental objective, to maintain the condition of intertidal reef communities, with social and economic objectives to ensure management intervention was cost-effective. Ecological scenarios, developed using scenario planning, were a key feature that provided the foundation for where to set management thresholds. The scenarios developed represented declines in percent cover of H. banksii that may occur under increased threatening processes. Participants defined 4 discrete management alternatives to address the threat of trampling and estimated the effect of these alternatives on the objectives under each ecological scenario. A weighted additive model was used to aggregate participants' consequence estimates. Model outputs (decision scores) clearly expressed uncertainty, which can be considered by decision makers and used to inform where to set management thresholds. This approach encourages a proactive form of conservation, where management thresholds and associated actions are defined a priori for ecological indicators, rather than reacting to unexpected ecosystem changes in the future. 
© 2015 The Authors Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
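A weighted additive model of the kind described can be sketched as follows; the objective weights and consequence scores below are illustrative placeholders, not the workshop's elicited values.

```python
def decision_score(weights, consequences):
    # Weighted additive aggregation: each management alternative gets
    # sum_j w_j * c_j, with consequences c_j normalized to [0, 1]
    # (0 = worst outcome on that objective, 1 = best) and weights
    # summing to 1 across objectives.
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(w * consequences[obj] for obj, w in weights.items())

# Placeholder weights over the environmental, social and economic objectives:
weights = {"ecological": 0.5, "social": 0.2, "economic": 0.3}
do_nothing = {"ecological": 0.1, "social": 1.0, "economic": 1.0}
close_access = {"ecological": 0.9, "social": 0.2, "economic": 0.4}
print(round(decision_score(weights, do_nothing), 3))   # 0.55
print(round(decision_score(weights, close_access), 3)) # 0.61
```

Repeating the aggregation across participants and ecological scenarios yields the distribution of decision scores whose spread expresses the uncertainty mentioned above.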

  12. Selecting surrogate endpoints for estimating pesticide effects on avian reproductive success

    EPA Science Inventory

    A Markov chain nest productivity model (MCnest) has been developed for projecting the effects of a specific pesticide-use scenario on the annual reproductive success of avian species of concern. A critical element in MCnest is the use of surrogate endpoints, defined as measured ...

  13. On the Meaning of Uniqueness

    ERIC Educational Resources Information Center

    Shipman, Barbara A.

    2013-01-01

    This article analyzes four questions on the meaning of uniqueness that have contrasting answers in common language versus mathematical language. The investigations stem from a scenario in which students interpreted uniqueness according to a definition from standard English, that is, different from the mathematical meaning, in defining an injective…

  14. The dynamic simulation model of soybean in Central Java to support food self sufficiency: A supply chain perspective

    NASA Astrophysics Data System (ADS)

    Oktyajati, Nancy; Hisjam, Muh.; Sutopo, Wahyudi

    2018-02-01

    Food is one of the basic human needs, so food sufficiency is very important. Sufficiency of the soybean commodity in Central Java still depends on imported soybean, because there is a large gap between local soybean production and demand; in 2016 the supply shortage of the soybean commodity was 68.79%. Soybean is an important and strategic commodity after rice and corn, and its consumption is increasing with population growth, rising incomes, and changing healthy lifestyles. The aims of this study are to build a dynamic model of soybean from a supply chain perspective, define the proper price of local soybean to trigger increased local production, and define alternative solutions to support food self-sufficiency. The study captures the real conditions in a dynamic model and then simulates a series of scenarios in a computer program to obtain the best results: a first scenario with a government intervention policy and a second without. The best alternative can be used as a consideration for government policy. The results of the proposed scenarios show that self-sufficiency in soybean can be achieved within the next 20 years by increasing the planting area by 4% and land productivity by 1% per year.
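As a toy check on the time horizon, the sketch below compounds the stated 4% planting-area and 1% productivity growth from the 2016 coverage level (100% - 68.79% = 31.21% of demand) while holding demand constant. Constant demand is a simplifying assumption that the study's dynamic model does not make, but the toy projection lands on a similar multi-decade horizon.

```python
def years_to_self_sufficiency(coverage0, area_growth, yield_growth):
    # Toy compound-growth projection: local production covers coverage0
    # (fraction of demand) initially and grows by
    # (1 + area_growth) * (1 + yield_growth) each year; demand is held
    # constant, which is a simplifying assumption.
    g = (1 + area_growth) * (1 + yield_growth)
    years = 0
    c = coverage0
    while c < 1.0:
        c *= g
        years += 1
    return years

# 2016 shortfall of 68.79% => local supply covers 31.21% of demand
print(years_to_self_sufficiency(0.3121, 0.04, 0.01))  # 24 under these assumptions
```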

  15. Importance of the pre-industrial baseline for likelihood of exceeding Paris goals

    NASA Astrophysics Data System (ADS)

    Schurer, Andrew P.; Mann, Michael E.; Hawkins, Ed; Tett, Simon F. B.; Hegerl, Gabriele C.

    2017-08-01

    During the Paris conference in 2015, nations of the world strengthened the United Nations Framework Convention on Climate Change by agreeing to holding 'the increase in the global average temperature to well below 2 °C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5 °C' (ref. ). However, 'pre-industrial' was not defined. Here we investigate the implications of different choices of the pre-industrial baseline on the likelihood of exceeding these two temperature thresholds. We find that for the strongest mitigation scenario RCP2.6 and a medium scenario RCP4.5, the probability of exceeding the thresholds and timing of exceedance is highly dependent on the pre-industrial baseline; for example, the probability of crossing 1.5 °C by the end of the century under RCP2.6 varies from 61% to 88% depending on how the baseline is defined. In contrast, in the scenario with no mitigation, RCP8.5, both thresholds will almost certainly be exceeded by the middle of the century with the definition of the pre-industrial baseline of less importance. Allowable carbon emissions for threshold stabilization are similarly highly dependent on the pre-industrial baseline. For stabilization at 2 °C, allowable emissions decrease by as much as 40% when earlier than nineteenth-century climates are considered as a baseline.
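The sensitivity to the baseline can be illustrated with a toy ensemble: shifting the reference period by an assumed warming offset shifts every member's anomaly, and hence the fraction of members exceeding a threshold. The numbers below are synthetic, not the paper's.

```python
import numpy as np

# Synthetic end-of-century warming relative to an 1850-1900-style
# reference period (mean and spread are illustrative only).
rng = np.random.default_rng(1)
warming = rng.normal(1.6, 0.3, size=10_000)

def p_exceed(anomalies, threshold, baseline_offset):
    # baseline_offset: assumed warming already accrued between the "true"
    # pre-industrial state and the reference period actually used; moving
    # the baseline earlier adds this offset to every anomaly.
    return float(np.mean(anomalies + baseline_offset > threshold))

for offset in (0.0, 0.05, 0.1, 0.2):
    print(offset, p_exceed(warming, 1.5, offset))
```

The exceedance probability rises monotonically with the baseline offset, which is the qualitative effect the study quantifies for real scenarios.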

  16. A comparative analysis of the density of the SNOMED CT conceptual content for semantic harmonization

    PubMed Central

    He, Zhe; Geller, James; Chen, Yan

    2015-01-01

    Objectives Medical terminologies vary in the amount of concept information (the “density”) represented, even in the same sub-domains. This causes problems in terminology mapping, semantic harmonization and terminology integration. Moreover, complex clinical scenarios need to be encoded by a medical terminology with comprehensive content. SNOMED Clinical Terms (SNOMED CT), a leading clinical terminology, was reported to lack concepts and synonyms, problems that cannot be fully alleviated by using post-coordination. Therefore, a scalable solution is needed to enrich the conceptual content of SNOMED CT. We are developing a structure-based, algorithmic method to identify potential concepts for enriching the conceptual content of SNOMED CT and to support semantic harmonization of SNOMED CT with selected other Unified Medical Language System (UMLS) terminologies. Methods We first identified a subset of English terminologies in the UMLS that have ‘PAR’ relationship labeled with ‘IS_A’ and over 10% overlap with one or more of the 19 hierarchies of SNOMED CT. We call these “reference terminologies” and we note that our use of this name is different from the standard use. Next, we defined a set of topological patterns across pairs of terminologies, with SNOMED CT being one terminology in each pair and the other being one of the reference terminologies. We then explored how often these topological patterns appear between SNOMED CT and each reference terminology, and how to interpret them. Results Four viable reference terminologies were identified. Large density differences between terminologies were found. Expected interpretations of these differences were indeed observed, as follows. A random sample of 299 instances of special topological patterns (“2:3 and 3:2 trapezoids”) showed that 39.1% and 59.5% of analyzed concepts in SNOMED CT and in a reference terminology, respectively, were deemed to be alternative classifications of the same conceptual content. 
In 30.5% and 17.6% of the cases, it was found that intermediate concepts could be imported into SNOMED CT or into the reference terminology, respectively, to enhance their conceptual content, if approved by a human curator. Other cases included synonymy and errors in one of the terminologies. Conclusion These results show that structure-based algorithmic methods can be used to identify potential concepts to enrich SNOMED CT and the four reference terminologies. The comparative analysis has the future potential of supporting terminology authoring by suggesting new content to improve content coverage and semantic harmonization between terminologies. PMID:25890688

  17. Performance evaluation of multi-stratum resources integration based on network function virtualization in software defined elastic data center optical interconnect.

    PubMed

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; Tian, Rui; Han, Jianrui; Lee, Young

    2015-11-30

    Data center interconnect with elastic optical network is a promising scenario to meet the high burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented multi-stratum resilience between IP and elastic optical networks to accommodate data center services. This study extends that work to consider resource integration that breaks the limits of individual network devices, which can enhance resource utilization. We propose a novel multi-stratum resources integration (MSRI) architecture based on network function virtualization in software defined elastic data center optical interconnect. A resource integrated mapping (RIM) scheme for MSRI is introduced in the proposed architecture. The MSRI can accommodate data center services through resource integration when a single function or resource is too scarce to provision the services, and it enhances globally integrated optimization of optical network and application resources. The overall feasibility and efficiency of the proposed architecture are experimentally verified on the control plane of an OpenFlow-based enhanced software defined networking (eSDN) testbed. The performance of the RIM scheme under a heavy traffic load scenario is also quantitatively evaluated based on the MSRI architecture in terms of path blocking probability, provisioning latency and resource utilization, compared with other provisioning schemes.

  18. Programming in a proposed 9X distributed Ada

    NASA Technical Reports Server (NTRS)

    Waldrop, Raymond S.; Volz, Richard A.; Goldsack, Stephen J.; Holzbach-Valero, A. A.

    1991-01-01

    The studies of the proposed Ada 9X constructs for distribution, now referred to as AdaPT, are reported. The goals for this period were to revise the chosen example scenario and to begin studying how the proposed constructs might be implemented. The example scenario chosen is the Submarine Combat Information Center (CIC) developed by IBM for the Navy. The specification provided by IBM was preliminary and had several deficiencies. To address these problems, some changes were made to the scenario specification. Some of the more important changes include: (1) addition of a system database management function; (2) addition of a fourth processing unit to the standard resources; (3) addition of an operator console interface function; and (4) removal of the time synchronization function. To implement the CIC scenario in AdaPT, the chosen strategy used publics, partitions, and nodes. The principal purpose of implementing the CIC scenario was to demonstrate how the AdaPT constructs interact with the program structure. While considering ways that the AdaPT constructs might be translated to Ada 83, it was observed that the partition construct could reasonably be modeled as an abstract data type. Although this gives a useful method of modeling partitions, it does not address the configuration aspects of the node construct.

  19. Regional air quality management aspects of climate change: impact of climate mitigation options on regional air emissions.

    PubMed

    Rudokas, Jason; Miller, Paul J; Trail, Marcus A; Russell, Armistead G

    2015-04-21

    We investigate the projected impact of six climate mitigation scenarios on U.S. emissions of carbon dioxide (CO2), sulfur dioxide (SO2), and nitrogen oxides (NOX) associated with energy use in major sectors of the U.S. economy (commercial, residential, industrial, electricity generation, and transportation). We use the EPA U.S. 9-region national database with the MARKet Allocation energy system model to project emissions changes over the 2005 to 2050 time frame. The modeled scenarios are two carbon tax, two low carbon transportation, and two biomass fuel choice scenarios. In the lower carbon tax and both biomass fuel choice scenarios, SO2 and NOX achieve reductions largely through pre-existing rules and policies, with only relatively modest additional changes occurring from the climate mitigation measures. The higher carbon tax scenario projects greater declines in CO2 and SO2 relative to the 2050 reference case, but electricity sector NOX increases. This is a result of reduced investments in power plant NOX controls in earlier years in anticipation of accelerated coal power plant retirements, energy penalties associated with carbon capture systems, and shifting of NOX emissions in later years from power plants subject to a regional NOX cap to those in regions not subject to the cap.

  20. Evaluating EDGARv4.tox2 speciated mercury emissions ex-post scenarios and their impacts on modelled global and regional wet deposition patterns

    NASA Astrophysics Data System (ADS)

    Muntean, Marilena; Janssens-Maenhout, Greet; Song, Shaojie; Giang, Amanda; Selin, Noelle E.; Zhong, Hui; Zhao, Yu; Olivier, Jos G. J.; Guizzardi, Diego; Crippa, Monica; Schaaf, Edwin; Dentener, Frank

    2018-07-01

    Speciated mercury gridded emissions inventories together with chemical transport models and concentration measurements are essential when investigating both the effectiveness of mitigation measures and the mercury cycle in the environment. Since different mercury species have contrasting behaviour in the atmosphere, their proportion in anthropogenic emissions could determine the spatial impacts. In this study, the time series from 1970 to 2012 of the EDGARv4.tox2 global mercury emissions inventory are described; the total global mercury emission in 2010 is 1772 tonnes. Global grid-maps with geospatial distribution of mercury emissions at a 0.1° × 0.1° resolution are provided for each year. Compared to the previous tox1 version, tox2 provides updates for more recent years and improved emissions in particular for agricultural waste burning, power generation and artisanal and small-scale gold mining (ASGM) sectors. We have also developed three retrospective emissions scenarios based on different hypotheses related to the proportion of mercury species in the total mercury emissions for each activity sector; improvements in emissions speciation are seen when using information primarily from field measurements. We evaluated them using the GEOS-Chem 3-D mercury model in order to explore the influence of speciation shifts, to reactive mercury forms in particular, on regional wet deposition patterns. The reference scenario S1 (EDGARv4.tox2_S1) uses speciation factors from the Arctic Monitoring and Assessment Programme (AMAP); scenario S2 ("EPA_power") uses factors from EPA's Information Collection Request (ICR); and scenario S3 ("Asia_filedM") factors from recent scientific publications. 
    In the reference scenario, the sum of reactive mercury emissions (Hg-P and Hg2+) accounted for 25.3% of the total global emissions; the regions/countries with shares of reactive mercury emissions higher than 6% of the total global reactive mercury are China+ (30.9%), India+ (12.5%) and the United States (9.9%). In 2010, the variations of reactive mercury emissions amongst the different scenarios are in the range of -19.3 t/yr (China+) to 4.4 t/yr (OECD_Europe). At the sector level, however, the variation can differ; for the iron and steel industry in China it reaches 15.4 t/yr. Model evaluation at the global level shows a variation of approximately ±10% in wet deposition for the three emissions scenarios. An evaluation of the impact of mercury speciation within nested-grid sensitivity simulations is performed for the United States, and modelled wet deposition fluxes are compared with measurements. These studies show that using the S2 and S3 emissions of reactive mercury can improve wet deposition estimates near sources.
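The scenario construction, splitting total mercury emissions by sector-specific speciation shares, can be sketched as below. The sector totals and factor values are placeholders for illustration, not the AMAP or EPA-ICR numbers.

```python
def speciate(total_by_sector, factors):
    # Apply scenario-specific speciation factors (shares of Hg0, Hg2+
    # and Hg-P summing to 1) to total mercury emissions by sector.
    out = {}
    for sector, total in total_by_sector.items():
        f = factors[sector]
        assert abs(sum(f.values()) - 1.0) < 1e-9
        out[sector] = {species: total * share for species, share in f.items()}
    return out

# Illustrative totals (t/yr) and placeholder speciation shares:
totals = {"power": 400.0, "asgm": 700.0}
s1 = {"power": {"Hg0": 0.5, "Hg2+": 0.4, "Hg-P": 0.1},
      "asgm":  {"Hg0": 1.0, "Hg2+": 0.0, "Hg-P": 0.0}}
em = speciate(totals, s1)
reactive = sum(v["Hg2+"] + v["Hg-P"] for v in em.values())
print(reactive)  # 200.0 t/yr of reactive mercury under these factors
```

Swapping in a different factor set for the same totals is exactly the shift between scenarios S1, S2 and S3: the total stays fixed while the reactive fraction, and hence the modelled wet deposition near sources, changes.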

  1. Valuation of biodiversity effects from reduced pesticide use.

    PubMed

    Schou, Jesper S; Hasler, Berit; Nahrstedt, Birgit

    2006-04-01

    This study deals with the effects on biodiversity of pesticide-free buffer zones along field margins. Using choice modeling, the majority of respondents to a survey on pesticide use in the environment are willing to accept an increase in the price of bread if the survival of partridge chicks and the number of wild plants increase. The study identifies the need for further empirical work with respect to methodological validation, price estimation, and the use of survey results in policy analysis. In particular, the environmental effects of pesticide use are complex and, therefore, present difficult challenges when presenting information to lay people. Forty-one percent of respondents changed their responses regarding willingness to pay more for bread when references to pesticide use were introduced in the questionnaire. This indicates that scenarios depicting changes in pesticide use can be difficult to present to lay people in an economically rational and well-defined context. Thus, in the study of valuation related to changes in pesticide use, much attention should be devoted to the design and definition of the context. Furthermore, the effects of providing different background information, e.g., with or without the mention of pesticides, should be tested.

  2. Formation and internal structure of superdense dark matter clumps and ultracompact minihaloes

    NASA Astrophysics Data System (ADS)

    Berezinsky, V. S.; Dokuchaev, V. I.; Eroshenko, Yu. N.

    2013-11-01

    We discuss the formation mechanisms and structure of the superdense dark matter clumps (SDMC) and ultracompact minihaloes (UCMH), outlining the differences between these types of DM objects. We define as SDMC the gravitationally bound DM objects which have come into virial equilibrium at the radiation-dominated (RD) stage of the universe evolution. Such objects can be formed from the isocurvature (entropy) density perturbations or from peaks in the spectrum of curvature (adiabatic) perturbations. The axion miniclusters (Kolb and Tkachev 1994) are an example of the former model. The system of a central compact mass (e.g. in the form of an SDMC or primordial black hole (PBH)) with an outer DM envelope formed in the process of secondary accretion we refer to as a UCMH. Therefore, the SDMC can serve as the seed for the UCMH in some scenarios. Recently, SDMCs and UCMHs have been considered in many works, and we try to systematize them here. We also consider the effect of asphericity of the initial density perturbation on the gravitational evolution, which decreases the number of SDMCs and, as a result, suppresses the gamma-ray signal from DM annihilation.

  3. Expression and Regulation of Attachment-Related Emotions in Children with Conduct Problems and Callous-Unemotional Traits.

    PubMed

    Dadds, Mark R; Gale, Nyree; Godbee, Megan; Moul, Caroline; Pasalich, Dave S; Fink, Elian; Hawes, David J

    2016-08-01

    Callous-unemotional (CU) traits are defined by low responsiveness to, and unfeeling disregard for, the emotions of others. There is controversial evidence, however, that children with high CU traits can demonstrate affective responsiveness under certain conditions, namely those associated with attachment threat. We tested this using 'fear + amusing' and 'attachment rich' stimuli from the Lion King film. Of N = 76 children aged 4-14 years, 56 were clinic-referred and divided into high and low CU traits groups, and 20 were drawn from the community. Participants watched film sequences of fearful, attachment-related and neutral stimuli, and their affective responses and emotion-regulation strategies were coded by independent observers. Children in the high CU traits group were able to disengage from the fear stimuli by showing more 'happiness' during a brief slapstick interlude. In the attachment scenario, high CU children expressed similar, or trends toward higher, emotional responses and emotion regulation strategies compared to low-CU children and control children. The results support the idea that high CU children may have the potential for emotional responsiveness to complex emotional stimuli in attachment contexts. Implications of these results for the development of interventions are discussed.

  4. Authorization basis supporting documentation for plutonium finishing plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, J.P., Fluor Daniel Hanford

    1997-03-05

    The identification and definition of the authorization basis for the Plutonium Finishing Plant (PFP) facility and operations are essential for compliance with DOE Order 5480.21, Unreviewed Safety Questions. The authorization basis, as defined in the Order, consists of those aspects of the facility design basis, i.e., the structures, systems and components (SSCs) and the operational requirements, that are considered important to the safety of operations and are relied upon by DOE to authorize operation of the facility. These facility design features and their function in various accident scenarios are described in WHC-SD-CP-SAR-021, Plutonium Finishing Plant Final Safety Analysis Report (FSAR), Chapter 9, 'Accident Analysis.' Figure 1 depicts the relationship of the Authorization Basis to its components and other information contained in safety documentation supporting the Authorization Basis. The PFP SSCs that are important to safety, collectively referred to as the 'Safety Envelope,' are discussed in various chapters of the FSAR and in WHC-SD-CP-OSR-010, Plutonium Finishing Plant Operational Safety Requirements. Other documents, such as Criticality Safety Evaluation Reports (CSERs), address and support some portions of the Authorization Basis and Safety Envelope.

  5. 49 CFR 385.321 - What failures of safety management practices disclosed by the safety audit will result in a...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... disqualified by a State, has lost the right to operate a CMV in a State or who is disqualified to operate a... violation refers to a driver operating a CMV as defined under § 383.5. 9. § 387.7(a)—Operating a motor... Single occurrence. This violation refers to a driver operating a CMV as defined under § 390.5. 13. § 395...

  6. 49 CFR 385.321 - What failures of safety management practices disclosed by the safety audit will result in a...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... disqualified by a State, has lost the right to operate a CMV in a State or who is disqualified to operate a... violation refers to a driver operating a CMV as defined under § 383.5. 9. § 387.7(a)—Operating a motor... Single occurrence. This violation refers to a driver operating a CMV as defined under § 390.5. 13. § 395...

  7. 49 CFR 385.321 - What failures of safety management practices disclosed by the safety audit will result in a...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... disqualified by a State, has lost the right to operate a CMV in a State or who is disqualified to operate a... violation refers to a driver operating a CMV as defined under § 383.5. 9. § 387.7(a)—Operating a motor... Single occurrence. This violation refers to a driver operating a CMV as defined under § 390.5. 13. § 395...

  8. Tailored scenarios for streamflow climate change impacts based on the perturbation of precipitation and evapotranspiration

    NASA Astrophysics Data System (ADS)

    Ntegeka, Victor; Willems, Patrick; Baguis, Pierre; Roulin, Emmanuel

    2015-04-01

    It is advisable to account for a wide range of uncertainty by including the maximum possible number of climate models and scenarios for future impacts. As this is not always feasible, impact assessments are inevitably performed with a limited set of scenarios. The development of tailored scenarios is a challenge that needs more attention as the number of available climate change simulations grows, and whether these scenarios are representative enough for climate change impacts is a question that needs addressing. This study presents a methodology for constructing tailored scenarios for assessing runoff flows, including extreme conditions (peak flows), from an ensemble of future climate change signals of precipitation and potential evapotranspiration (ETo) derived from climate model simulations. The aim of the tailoring process is to formulate scenarios that optimally represent the uncertainty spectrum of climate scenarios. These tailored scenarios have the advantage of being few in number as well as having a clear description of the seasonal variation of the climate signals, hence allowing easy interpretation of the implications of future changes. The tailoring process requires an analysis of the hydrological impacts of the likely future change signals from all available climate model simulations in a simplified (computationally less expensive) impact model. Historical precipitation and ETo time series are perturbed with the climate change signals using a quantile perturbation technique that accounts for changes in extremes. For precipitation, the change in wet-day frequency is taken into account using a Markov-chain approach. The resulting hydrological impacts from the perturbed time series are then subdivided into high, mean and low hydrological impacts using a quantile change analysis. From this classification, the corresponding precipitation and ETo change factors are back-tracked on a seasonal basis to determine the precipitation-ETo covariation.
The established precipitation-ETo covariations are used to inform the scenario construction process. Additionally, the back-tracking of extreme flows from driving scenarios allows for a diagnosis of the physical responses to climate change scenarios. The method is demonstrated through the application of scenarios from 10 Regional Climate Models and 21 Global Climate Models to selected catchments in central Belgium. Reference: Ntegeka, V., Baguis, P., Roulin, E., & Willems, P. (2014). Developing tailored climate change scenarios for hydrological impact assessments. Journal of Hydrology, 508, 307-321.
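The quantile perturbation step described above can be illustrated with a minimal sketch. This is not the authors' implementation: the change factors and the synthetic record are invented, and the Markov-chain wet-day-frequency adjustment is omitted.

```python
import numpy as np

def quantile_perturb(series, quantiles, change_factors):
    """Scale each wet-day amount by a change factor tied to its quantile.

    Illustrative only: the published technique additionally adjusts
    wet-day frequency with a Markov-chain approach, omitted here.
    """
    out = series.copy()
    wet = series > 0
    # empirical non-exceedance probability of each wet-day amount
    ranks = np.searchsorted(np.sort(series[wet]), series[wet], side="right")
    probs = ranks / wet.sum()
    # interpolate a change factor for each value from the given quantiles
    out[wet] = series[wet] * np.interp(probs, quantiles, change_factors)
    return out

rng = np.random.default_rng(0)
hist = rng.gamma(0.5, 8.0, 365) * (rng.random(365) < 0.4)  # synthetic daily record
# amplify the wettest 10% of days by up to 30%, leave the rest unchanged
pert = quantile_perturb(hist, quantiles=[0.0, 0.9, 1.0],
                        change_factors=[1.0, 1.0, 1.3])
```

Because the factor grows with the quantile, the perturbation changes the extremes more than the mean, which is the point of the technique.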

  10. An equilibrium analysis of the land use structure in the Yunnan Province, China

    NASA Astrophysics Data System (ADS)

    Luo, Jiao; Zhan, Jinyan; Lin, Yingzhi; Zhao, Chunhong

    2014-09-01

    Global land use structure is changing rapidly due to unceasing population growth and accelerated urbanization, which leads to fierce competition between the rigid demand for built-up area and the protection of cultivated land, forest, and grassland. It has been a great challenge to realize the sustainable development of land resources. Based on a computable general equilibrium model of land use change with a social accounting matrix dataset, this study implemented an equilibrium analysis of the land use structure in the Yunnan Province during the period 2008-2020 under three scenarios: a baseline scenario, a low TFP (total factor productivity) scenario, and a high TFP scenario. The results indicated that under all three scenarios, the area of cultivated land declined significantly along with a remarkable expansion of built-up area, while the areas of forest, grassland, and unused land increased slightly. As the growth rate of TFP increased, it had first negative and then positive effects on the expansion of built-up area and the decline of cultivated land. Moreover, the simulated changes of both cultivated land and built-up area were largest under the low TFP scenario, and far exceeded the limit set in the Overall Plan for Land Utilization in the Yunnan Province in 2020. The scenario-based simulation results provide an important reference for policy-makers in making land use decisions, balancing the fierce competition between the protection of cultivated land and the increasing demand for built-up area, and guaranteeing food security, ecological security, and the sustainable development of land resources.

  11. Co-benefits of air quality and climate change policies on air quality of the Mediterranean

    NASA Astrophysics Data System (ADS)

    Pozzoli, Luca; Mert Gokturk, Ozan; Unal, Alper; Kindap, Tayfun; Janssens-Maenhout, Greet

    2015-04-01

    The Mediterranean basin is one of the regions of the world where significant impacts due to climate change are predicted to occur in the future. Observations and model simulations are used to provide policy makers with scientifically based estimates of the need to adjust the national emission reductions required to achieve air quality objectives in the context of a changing climate, which is driven not only by GHGs but also by short-lived climate pollutants, such as tropospheric ozone and aerosols. There is increasing interest in, and need for, cost-benefit emission reduction strategies that could improve both regional air quality and global climate. In this study we used the WRF-CMAQ air quality modelling system to quantify the contribution of anthropogenic emissions to ozone and particulate matter concentrations in Europe and the Eastern Mediterranean and to understand how this contribution could change under different future scenarios. We investigated four future scenarios for the year 2050, defined during the European project CIRCE: a "business as usual" scenario (BAU), in which no measures beyond those already in place are taken into account; an "air quality" scenario (BAP), which implements the National Emission Ceilings Directive 2001/81/EC for the member states of the European Union (EU-27); a "climate change" scenario (CC), which implements global climate policies decoupled from air pollution policies; and an "integrated air quality and climate policy" scenario (CAP), which explores the co-benefits of global climate and EU-27 air pollution policies. The BAP scenario largely decreases summer ozone concentrations over almost the entire continent, while the CC and CAP scenarios produce smaller decreases in summer ozone that, however, extend over the whole Mediterranean, the Middle East countries and Russia.
Similar patterns are found for winter PM concentrations: the BAP scenario improves pollution levels only in the Western EU countries, while the CAP scenario produces the largest PM reductions over the entire continent and the Mediterranean basin.

  12. Human health risk assessment case study: an abandoned metal smelter site in Poland.

    PubMed

    Wcisło, Eleonora; Ioven, Dawn; Kucharski, Rafal; Szdzuj, Jerzy

    2002-05-01

    United States Environmental Protection Agency methodologies for human health risk assessment (HRA) were applied in a Brownfields Demonstration Project at the Warynski smelter site (WSS), an abandoned industrial site in the town of Piekary Slaskie, Upper Silesia, Poland. The HRA included a baseline risk assessment (BRA) and the development of risk-based preliminary remedial goals (RBPRGs). The HRA focused on the surface area covered with waste materials, which were evaluated with regard to the potential risks they may pose to humans. Cadmium, copper, iron, manganese, lead, and zinc were proposed as the contaminants of potential concern (COPCs) at WSS, based on archival data on the chemical composition of waste located at WSS. For the defined future land use patterns, industrial (Scenario I) and recreational (Scenario II) exposure scenarios were assumed and evaluated. The combined hazard index for all COPCs was 3.1E+00 for Scenario I and 3.2E+00 for Scenario II. Regarding potential carcinogenic risks associated with the inhalation route, only cadmium was a contributor, with risks of 1.6E-06 and 2.6E-07 for Scenario I and Scenario II, respectively. The results of the BRA indicated that the potential health risks at WSS were mainly associated with exposure to cadmium in waste material (industrial and recreational scenarios) and lead (industrial scenario). RBPRGs calculated under the industrial scenario were 1.17E+03 and 1.62E+03 mg/kg for cadmium and lead, respectively; the RBPRG for cadmium under the recreational scenario was 1.18E+03 mg/kg. The BRA results, as well as the RBPRGs, are comparable for both scenarios, so it is impossible to prioritize land use patterns for WSS based on these results alone. In choosing a future land use pattern or an appropriate redevelopment option, other factors would be decisive in the decision-making process, e.g., social factors, market needs, technical feasibility and costs of redevelopment actions, or acceptance by the local community.
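The hazard-index and cancer-risk figures quoted above follow the standard US EPA screening equations (HQ = CDI/RfD summed into a hazard index; risk = CDI x slope factor). A minimal sketch, with purely illustrative intake and toxicity values rather than the study's exposure parameters:

```python
def hazard_index(cdi, rfd):
    """Noncancer hazard index: HI = sum of hazard quotients CDI_i / RfD_i."""
    return sum(cdi[c] / rfd[c] for c in cdi)

def cancer_risk(cdi, slope_factor):
    """Incremental lifetime cancer risk under the linear low-dose model."""
    return cdi * slope_factor

# chronic daily intakes and reference doses in mg/kg-day -- invented numbers
cdi = {"Cd": 1.0e-4, "Pb": 2.0e-3}
rfd = {"Cd": 1.0e-3, "Pb": 3.5e-3}
hi = hazard_index(cdi, rfd)          # HI > 1 flags a potential noncancer hazard
risk = cancer_risk(cdi["Cd"], 6.3)   # slope factor likewise illustrative
```

An HI near 3, as in both scenarios above, indicates noncancer intakes a few times above the reference doses; risks around 1E-06 sit at the lower end of the EPA's typical 1E-06 to 1E-04 target range.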

  13. Climate Projections over Mediterranean Basin under RCP8.5 and RCP4.5 emission scenarios

    NASA Astrophysics Data System (ADS)

    Ilhan, Asli; Ünal, Yurdanur S.

    2017-04-01

    In this study, 50 km resolution downscaled results of two Earth System Models (ESMs), HadGEM2-ES and MPI-ESM, with the regional climate model RegCM are used to estimate present and future climate conditions over the Mediterranean Basin. The purpose of this study is to compare the projections of the two ESMs under Representative Concentration Pathways 4.5 (RCP4.5) and 8.5 (RCP8.5) over the region of interest, seasonally and annually, at 50 km resolution. Temperature and precipitation for the reference period (1971-2000) and the future (2015-2100) are analyzed. The average temperature and total precipitation distributions of each downscaled ESM simulation were compared with observation data (Climate Research Unit, CRU) to explore the capability of each model to represent the current climate. Relative to the reference-period values of CRU, both HadGEM2-ES and MPI-ESM are warmer and wetter than observations, with positive temperature biases only around the Caspian Sea and positive precipitation biases over Eastern and Central Europe. The future projections (2015-2100) of the HadGEM2-ES and MPI-ESM-MR simulations under the RCP4.5 and RCP8.5 emission scenarios are compared with the reference period (1971-2000) and analyzed for temperature and precipitation. The downscaled HadGEM2-ES forced by the RCP8.5 scenario produces higher temperatures than MPI-ESM-MR; possible reasons for this warming are the sensitivity of HadGEM2-ES to greenhouse gases and the high radiative forcing (+8.5 W/m2). On the other hand, MPI-ESM produces more precipitation than HadGEM2-ES. In order to analyze the regional responses of the climate model chains, five main regions are selected: Turkey, Central Europe, Western Europe, Eastern Europe and North Africa.
The average biases of the HadGEM2-ES+RegCM and MPI-ESM-MR+RegCM model chains are also calculated for temperature and precipitation, and future expectations in each region are discussed under the RCP4.5 and RCP8.5 scenarios. According to the regional analysis, North Africa is the warmest region for both HadGEM2-ES and MPI-ESM-MR, and Central Europe warms similarly to North Africa in the MPI-ESM-MR coupled simulations under both RCPs. In addition, Eastern Europe is expected to be the wettest region in both models and under both emission scenarios, while the driest conditions are expected over Western Europe for MPI-ESM-MR and over Turkey for HadGEM2-ES.

  14. Overview of the EUROfusion Medium Size Tokamak scientific program

    NASA Astrophysics Data System (ADS)

    Bernert, Matthias; Bolzonella, Tommaso; Coda, Stefano; Hakola, Antti; Meyer, Hendrik; Eurofusion Mst1 Team; Tcv Team; Mast-U Team; ASDEX Upgrade Team

    2017-10-01

    Under the EUROfusion MST1 program, coordinated experiments are conducted at three European medium-sized tokamaks (ASDEX Upgrade, TCV and MAST-U). The program complements the JET program in preparing safe and efficient operation for ITER and DEMO. Work under MST1 benefits from cross-machine comparisons but also makes use of the unique capabilities of each device. For the 2017/2018 campaign, 25 topic areas were defined targeting three main objectives: 1) development of an edge- and wall-compatible H-mode scenario with small or no ELMs; 2) investigation of disruptions, in order to achieve better predictions and improve avoidance or mitigation schemes; and 3) exploration of conventional and alternative divertor configurations for future high P/R scenarios. This contribution gives an overview of the work done under MST1, exemplified by highlight results for each main objective from the last campaigns, such as the evaluation of natural small-ELM scenarios, runaway mitigation and control, and the assessment of detachment in alternative divertor configurations and highly radiative scenarios. See author list of ``H. Meyer et al. 2017 Nucl. Fusion 57, 102014''.

  15. Structural technology challenges for evolutionary growth of Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Doiron, Harold H.

    1990-01-01

    A proposed evolutionary growth scenario for Space Station Freedom was defined recently by a NASA task force created to study requirements for a Human Exploration Initiative. The study was an initial response to President Bush's July 20, 1989 proposal to begin a long range program of human exploration of space including a permanently manned lunar base and a manned mission to Mars. This growth scenario evolves Freedom into a critical transportation node to support lunar and Mars missions. The growth scenario begins with the Assembly Complete configuration and adds structure, power, and facilities to support a Lunar Transfer Vehicle (LTV) verification flight. Evolutionary growth continues to support expendable, then reusable LTV operations, and finally, LTV and Mars Transfer Vehicle (MTV) operations. The significant structural growth and additional operations creating new loading conditions will present new technological and structural design challenges in addition to the considerable technology requirements of the baseline Space Station Freedom program. Several structural design and technology issues of the baseline program are reviewed and related technology development required by the growth scenario is identified.

  16. Comparing long-term projections of the space debris environment to real world data - Looking back to 1990

    NASA Astrophysics Data System (ADS)

    Radtke, Jonas; Stoll, Enrico

    2016-10-01

    Long-term projections of the space debris environment are commonly used to assess trends within different scenarios for the assumed future development of spacefaring. General scenarios investigated include business-as-usual cases, in which spaceflight is performed as it is today, and mitigation scenarios, which assume the implementation of Space Debris Mitigation Guidelines to varying degrees or the effectiveness of more drastic measures such as active debris removal. One problem that always accompanies projecting a system's behaviour into the future is that the driving parameters, such as the launch rate, are unpredictable. In other fields of research it is common to look backwards and re-model the past. For spaceflight this is a rather difficult task, as the field is still quite young and has been strongly shaped by drastic political changes, such as the break-up of the Soviet Union at the end of the 1980s. Furthermore, one major driver of the evolution of the number of on-orbit objects turns out to be collisions between objects. As of today, such collisions are, fortunately, very rare, which makes a purely real-world-data modelling approach difficult. Nevertheless, more than 20 years of comparatively stable spaceflight activity have passed since the end of the Cold War. In this study, this period is used to compare the real evolution of the space debris environment with the one projected using the Institute of Space Systems' in-house tool for long-term assessment, LUCA (Long-Term Utility for Collision Analysis). Four scenarios are investigated; all share the same starting point, an initial population for 1 May 1989. The first scenario, which serves as reference, is taken directly from MASTER-2009. All launch- and mission-related objects from the Two Line Elements (TLE) catalogue and other available sources are included.
All events, such as explosions and collisions, have been re-modelled as close to reality as possible, included in the corresponding population, and correlated with TLE catalogue objects. As the latest validated population snapshot available for MASTER is May 2009, this epoch is chosen as the endpoint of the simulations. The second scenario uses the knowledge of the past 25 years to perform a Monte-Carlo simulation of the evolution of the space debris environment; necessary input parameters such as explosions per year, launch rates, and the evolution of the solar cycle are taken from their real evolutions. The third scenario goes a step further by extracting only mean numbers and trends from inputs such as launch and explosion rates and applying them. The final, fourth scenario disregards all knowledge of the time frame under investigation, with inputs determined based only on data available in 1989. Results are compared to the reference scenario of the space debris environment.
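A toy yearly projection conveys the flavour of such Monte-Carlo simulations. All rates and counts below are invented, and real tools such as LUCA propagate individual objects and model collisions explicitly; this sketch only shows the stochastic bookkeeping idea.

```python
import numpy as np

def project_population(n0, years, launch_rate, decay_frac,
                       explosion_rate, frags_per_event, seed=0):
    """Toy projection of the on-orbit object count, one step per year."""
    rng = np.random.default_rng(seed)
    n, history = n0, [n0]
    for _ in range(years):
        n += launch_rate                                     # launch-related objects
        n -= int(n * decay_frac)                             # re-entry losses
        n += rng.poisson(explosion_rate) * frags_per_event   # fragmentation debris
        history.append(n)
    return history

# hypothetical parameters for a two-decade business-as-usual run
hist = project_population(8000, 20, launch_rate=120, decay_frac=0.01,
                          explosion_rate=4, frags_per_event=50)
```

Running many such trajectories with sampled inputs, rather than one fixed run, is what turns this into a Monte-Carlo assessment.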

  17. Sustainable Land Management's potential for climate change adaptation in Mediterranean environments: a regional scale assessment

    NASA Astrophysics Data System (ADS)

    Eekhout, Joris P. C.; de Vente, Joris

    2017-04-01

    Climate change has strong implications for many essential ecosystem services, such as the provision of drinking and irrigation water, soil erosion control and flood control. Especially large impacts are expected in the Mediterranean, which is already characterised by frequent floods and droughts. The projected higher frequency of extreme weather events under climate change will lead to an increase in plant water stress, reservoir inflow and sediment yield. Sustainable Land Management (SLM) practices are increasingly promoted as a climate change adaptation strategy and as a way to increase resilience against extreme events. However, surprisingly little is known about their impacts and trade-offs on ecosystem services at regional scales. The aim of this research is to provide insight into the potential of SLM for climate change adaptation, focusing on catchment-scale impacts on soil and water resources. We applied a spatially distributed hydrological model (SPHY), coupled with an erosion model (MUSLE), to the Segura River catchment (15,978 km2) in SE Spain. We ran the model for three periods: one reference (1981-2000) and two future (2031-2050 and 2081-2100). Climate input data for the future scenarios were based on output from 9 Regional Climate Models under two emission scenarios (RCP 4.5 and RCP 8.5). Realistic scenarios of SLM practices were developed through a local stakeholder consultation process. The evaluated SLM scenarios focused on reduced tillage and organic amendments under tree and cereal crops, covering 24% and 15% of the catchment, respectively. In the reference scenario, implementation of SLM at the field scale led to an increase in the infiltration capacity of the soil and a reduction of surface runoff of up to 29%, eventually reducing catchment-scale reservoir inflow by 6%. This led to a reduction in field-scale sediment yield of more than 50% and a 5% reduction in the catchment-scale sediment flux to reservoirs.
SLM was able to fully mitigate the effect of climate change at the field scale and partly at the catchment scale. We therefore conclude that large-scale adoption of SLM can effectively contribute to climate change adaptation by increasing the soil infiltration capacity, the soil water retention capacity and the soil moisture content in the root zone, leading to less crop stress. These findings on the regional-scale impacts of SLM are of high relevance for land owners, land managers and policy makers seeking to design effective climate change adaptation strategies.
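MUSLE itself (Williams' Modified Universal Soil Loss Equation) is compact enough to sketch. The event inputs below are invented, and the "SLM" run simply assumes lower runoff and cover-management values, in the direction the abstract's field-scale results suggest:

```python
def musle_sediment_yield(Q, qp, K, LS, C, P):
    """Event sediment yield (t): Y = 11.8 * (Q * qp)**0.56 * K * LS * C * P,
    with runoff volume Q (m3), peak flow qp (m3/s) and USLE-type factors
    K (erodibility), LS (slope), C (cover-management), P (support practice)."""
    return 11.8 * (Q * qp) ** 0.56 * K * LS * C * P

# hypothetical storm event on a cereal field, before and after SLM measures
base = musle_sediment_yield(Q=5000.0, qp=2.0, K=0.3, LS=1.2, C=0.25, P=1.0)
slm = musle_sediment_yield(Q=3550.0, qp=1.4, K=0.3, LS=1.2, C=0.15, P=1.0)
```

Because MUSLE is driven by runoff volume and peak flow rather than rainfall erosivity, practices that raise infiltration reduce simulated sediment yield through both the (Q * qp) term and the C factor.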

  18. It takes chutzpah: oncology nurse leaders.

    PubMed

    Green, E

    1999-01-01

    Chutzpah, according to the Oxford Dictionary of Current English (1996), is a slang term from the Yiddish language meaning shameless audacity. Chutzpah has been used to describe people with courage who take on situations that others avoid and somehow achieve the impossible. Tim Porter-O'Grady (1997) recently wrote that management is dead and has been replaced by process leadership. Health care organizations have shifted from hierarchical structures to process or program models in which people have dual or multiple reporting and communication relationships. In this new orientation, the management functions of controlling, directing, organizing and disciplining are replaced by the process leadership functions of coordinating, facilitating, linking and sustaining (Porter-O'Grady, 1997). Herein lies the challenge for oncology nurse leaders: "what lies behind us and what lies before us are tiny matters compared to what lies within us" (Ralph Waldo Emerson). Leadership is not a function of job title. The evidence for this is clear in current practice: there are no, or few, positions of nurse leaders. Titles have changed to eliminate the professional discipline and reflect a non-descript orientation; the new titles are process leader, program leader, professional practice leader. Nurse leaders need new points of reference to take on the challenges of influencing, facilitating and linking. Those points of reference are principle-centered leadership, integrity and chutzpah. This presentation will focus on examining current thinking, defining key characteristics and attributes, and using scenarios to illustrate the impact of leadership. We, as leaders in oncology nursing, must use chutzpah to make positive change and long-term gains for patient care and the profession of nursing.

  19. Thermo-mechanical Properties of Upper Jurassic (Malm) Carbonate Rock Under Drained Conditions

    NASA Astrophysics Data System (ADS)

    Pei, Liang; Blöcher, Guido; Milsch, Harald; Zimmermann, Günter; Sass, Ingo; Huenges, Ernst

    2018-01-01

    The present study aims to quantify the thermo-mechanical properties of Neuburger Bankkalk limestone, an outcrop analog of the Upper Jurassic carbonate formation (Germany), and to provide a reference for reservoir rock deformation within future enhanced geothermal systems located in the Southern German Molasse Basin. Experiments deriving the drained bulk compressibility C were performed by cycling the confining pressure p_c between 2 and 50 MPa at a constant pore pressure p_p of 0.5 MPa, after heating the samples to defined temperatures between 30 and 90 °C. Creep strain was then measured after each loading and unloading stage, and permeability k was obtained after each creep strain measurement. The drained bulk compressibility increased with increasing temperature and decreased with increasing differential pressure p_d = p_c - p_p, showing hysteresis between the loading and unloading stages above 30 °C. The apparent values of the indirectly calculated Biot coefficient α_ind, containing contributions from inelastic deformation, displayed the same temperature and pressure dependencies. The permeability k increased immediately after heating, and the creep rates were also temperature dependent. It is inferred that the alteration of the void space caused by temperature changes leads to the variation of rock properties measured under isothermal conditions, while the load cycles applied under isothermal conditions yield additional changes in pore-space microstructure. The experimental results were applied to a geothermal fluid production scenario to constrain drawdown and time-dependent effects on the reservoir and, overall, to provide a reference for the hydromechanical behavior of geothermal systems in carbonate, and more specifically in Upper Jurassic lithologies.
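The two derived quantities can be written out explicitly. The numbers below are illustrative placeholders, not measured values from the study, and the indirect Biot relation shown (alpha = 1 - C_s/C) is the standard poroelastic form, which may differ in detail from the authors' procedure:

```python
def drained_compressibility(dV_over_V, d_pd):
    """C: volumetric strain per unit change of differential pressure p_d = p_c - p_p."""
    return dV_over_V / d_pd

def biot_indirect(C_drained, C_grain):
    """alpha_ind = 1 - C_s / C, with C_s the mineral (grain) compressibility."""
    return 1.0 - C_grain / C_drained

# invented strain over an invented 20 MPa differential-pressure step
C = drained_compressibility(dV_over_V=2.0e-3, d_pd=20.0e6)  # 1/Pa
alpha = biot_indirect(C, C_grain=1.4e-11)                    # calcite-like grains
```

Since inelastic (creep) strain inflates the apparent dV_over_V, it propagates directly into such an indirectly calculated alpha_ind, which is why the abstract flags that contribution.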

  20. Formal Consistency Verification of Deliberative Agents with Respect to Communication Protocols

    NASA Technical Reports Server (NTRS)

    Ramirez, Jaime; deAntonio, Angelica

    2004-01-01

    The aim of this paper is to present a method able to detect inconsistencies in the reasoning carried out by a deliberative agent. The agent is assumed to be provided with a hybrid Knowledge Base expressed in a language called CCR-2, based on production rules and hierarchies of frames, which permits the representation of non-monotonic reasoning, uncertain reasoning and arithmetic constraints in the rules. The method can give a specification of the scenarios in which the agent would deduce an inconsistency. We define a scenario to be a description of the initial agent's state (in the agent life cycle), a deductive tree of rule firings, and a partially ordered set of messages and/or stimuli that the agent must receive from other agents and/or the environment. Moreover, the method will ensure that the scenarios are valid w.r.t. the communication protocols in which the agent is involved.

  1. SAFRR Tsunami Scenarios and USGS-NTHMP Collaboration

    NASA Astrophysics Data System (ADS)

    Ross, S.; Wood, N. J.; Cox, D. A.; Jones, L.; Cheung, K. F.; Chock, G.; Gately, K.; Jones, J. L.; Lynett, P. J.; Miller, K.; Nicolsky, D.; Richards, K.; Wein, A. M.; Wilson, R. I.

    2015-12-01

    Hazard scenarios provide emergency managers and others with information to help them prepare for future disasters. The SAFRR Tsunami Scenario, published in 2013, modeled a hypothetical but plausible tsunami, created by an Mw 9.1 earthquake occurring offshore of the Alaska Peninsula, and its impacts on the California coast. It presented the modeled inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California associated with the scenario tsunami. The intended users were those responsible for making mitigation decisions before future tsunamis and those who need to make rapid decisions during them. It provided the basis for many exercises involving, among others, NOAA, the State of Washington, several counties in California, and the National Institutes of Health. The scenario led to improvements in the warning protocol for southern California and highlighted issues that led to ongoing work on harbor and marina safety. Building on the lessons learned in the SAFRR Tsunami Scenario, another tsunami scenario is being developed with impacts to Hawaii and to the source region in Alaska, focusing on the evacuation issues of remote communities with primarily shore-parallel roads, and also on the effects of port closures. Community exposure studies in Hawaii (Ratliff et al., USGS-SIR, 2015) provided background for selecting these foci. One complicated and important aspect of any hazard scenario is defining the source event. The USGS is building collaborations with the National Tsunami Hazard Mitigation Program (NTHMP) to consider issues involved in developing a standardized set of tsunami sources to support hazard mitigation work. Other key USGS-NTHMP collaborations involve population vulnerability and evacuation modeling.

  2. The Attentional Demand of Automobile Driving Revisited: Occlusion Distance as a Function of Task-Relevant Event Density in Realistic Driving Scenarios.

    PubMed

    Kujala, Tuomo; Mäkelä, Jakke; Kotilainen, Ilkka; Tokkonen, Timo

    2016-02-01

    We studied the utility of occlusion distance as a function of task-relevant event density in realistic traffic scenarios with self-controlled speed. The visual occlusion technique is an established method for assessing visual demands of driving. However, occlusion time is not a highly informative measure of environmental task-relevant event density in self-paced driving scenarios because it partials out the effects of changes in driving speed. Self-determined occlusion times and distances of 97 drivers with varying backgrounds were analyzed in driving scenarios simulating real Finnish suburban and highway traffic environments with self-determined vehicle speed. Occlusion distances varied systematically with the expected environmental demands of the manipulated driving scenarios whereas the distributions of occlusion times remained more static across the scenarios. Systematic individual differences in the preferred occlusion distances were observed. More experienced drivers achieved better lane-keeping accuracy than inexperienced drivers with similar occlusion distances; however, driving experience was unexpectedly not a major factor for the preferred occlusion distances. Occlusion distance seems to be an informative measure for assessing task-relevant event density in realistic traffic scenarios with self-controlled speed. Occlusion time measures the visual demand of driving as the task-relevant event rate in time intervals, whereas occlusion distance measures the experienced task-relevant event density in distance intervals. The findings can be utilized in context-aware distraction mitigation systems, human-automated vehicle interaction, road speed prediction and design, as well as in the testing of visual in-vehicle tasks for inappropriate in-vehicle glancing behaviors in any dynamic traffic scenario for which appropriate individual occlusion distances can be defined. © 2015, Human Factors and Ergonomics Society.
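
    The relation between the two occlusion measures above can be sketched in a few lines. This is an illustrative calculation, not code from the study: it assumes only the kinematic relation that occlusion distance is speed multiplied by occlusion time, which is why occlusion time alone masks the effect of self-controlled speed.

    ```python
    def occlusion_distance(speed_kmh, occlusion_time_s):
        """Distance travelled while the driver's vision is occluded, in metres.

        Assumed relation: distance = speed x occlusion time. The same
        occlusion time at a higher speed therefore corresponds to a longer,
        more demanding occluded distance.
        """
        speed_ms = speed_kmh / 3.6  # convert km/h to m/s
        return speed_ms * occlusion_time_s

    # A 2 s occlusion covers twice the road at twice the speed
    # (suburban 36 km/h vs. highway 72 km/h), even though the
    # occlusion *time* is identical in both scenarios.
    d_suburban = occlusion_distance(36.0, 2.0)  # about 20 m
    d_highway = occlusion_distance(72.0, 2.0)   # about 40 m
    ```

    The example illustrates why occlusion distance tracks environmental event density per unit of road, while occlusion time only tracks event rate per unit of time.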

  3. Analysis of Future Streamflow Regimes under Global Change Scenarios in Central Chile for Ecosystem Sustainability

    NASA Astrophysics Data System (ADS)

    Henriquez Dole, L. E.; Gironas, J. A.; Vicuna, S.

    2015-12-01

    Given the critical role of the streamflow regime for ecosystem sustainability, modeling the long-term effects of climate change and land use change on streamflow is important for predicting possible impacts on stream ecosystems. Because flow duration curves are widely used to characterize the streamflow regime and to define indices of ecosystem health, they were used in this study to represent and analyze the stream regime in the Maipo River Basin in Central Chile. The Water Evaluation and Planning (WEAP) model and the Plant Growth Model (PGM) were used to simulate water distribution, consumption in rural areas, and stream flows on a weekly basis. Historical data (1990-2014), future land use scenarios (2030/2050), and climate change scenarios were included in the process. Historical data show a declining trend in flows, driven mainly by unprecedented climatic conditions, which has increased interest among users in future streamflow scenarios. In the future, under an expected decline in water availability coupled with changes in crop water demand, water users will be forced to adapt by changing water allocation rules. Such adaptation actions would in turn affect the streamflow regime. Future scenarios for the streamflow regime show dramatic changes in water availability and its temporal distribution. Annual weekly mean flows decrease by 19% in the worst scenario and increase by 3.3% in the best, and streamflow variability increases by nearly 90% in all scenarios under evaluation. The timing of maximum and minimum monthly flows also changes: June instead of July becomes the driest month, and December instead of January becomes the month with maximum flows. Overall, the results show that under future scenarios streamflow is altered by the water allocation rules used to satisfy water demands, and thus decisions will need to consider the streamflow regime (and habitat) in order to be sustainable.
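
    A flow duration curve of the kind used above can be built from a flow record in a few lines. This is a generic sketch of the standard technique, not the study's code; it assumes the common Weibull plotting position p = rank / (n + 1) for the exceedance probability.

    ```python
    def flow_duration_curve(flows):
        """Return (exceedance probability, flow) pairs for a flow record.

        Flows are sorted in descending order and each is assigned the
        Weibull plotting position p = rank / (n + 1): the estimated
        fraction of time that flow is equalled or exceeded.
        """
        q = sorted(flows, reverse=True)
        n = len(q)
        return [(rank / (n + 1), flow) for rank, flow in enumerate(q, start=1)]

    # Hypothetical weekly mean flows (m^3/s); a steeper curve indicates
    # a more variable (flashier) streamflow regime.
    fdc = flow_duration_curve([2.0, 8.0, 5.0, 3.0, 12.0, 4.0, 6.0])
    ```

    Low-exceedance flows on such a curve characterize floods, high-exceedance flows characterize droughts, which is why shifts in the curve under future scenarios translate directly into ecosystem-health indices.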

  4. A Visual Aid to Decision-Making for People with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Bailey, Rebecca; Willner, Paul; Dymond, Simon

    2011-01-01

    Previous studies have shown that people with mild intellectual disabilities have difficulty in "weighing-up" information, defined as integrating information from two different sources for the purpose of reaching a decision. This was demonstrated in two very different procedures, temporal discounting and a scenario-based financial…

  5. Uncertainty of climate change impacts on soil erosion from cropland in central Oklahoma

    USDA-ARS?s Scientific Manuscript database

    Impacts of climate change on soil erosion and the potential need for additional conservation actions are typically estimated by applying a hydrologic and soil erosion model under present and future climate conditions defined by an emission scenario. Projecting future climate conditions harbors sever...

  6. Discussing and Defining Sexual Assault: A Classroom Activity

    ERIC Educational Resources Information Center

    Franiuk, Renae

    2007-01-01

    The author devised a classroom activity that facilitates discussion and increases awareness about sexual assault. Students read scenarios involving sexual situations that varied in ambiguity, then labeled whether the situations involved a sexual assault. Students also gave their definitions of sexual assault and completed an evaluation of the…

  7. The "Kobayashi Maru" Meeting: High-Fidelity Experiential Learning

    ERIC Educational Resources Information Center

    Bruni-Bossio, Vincent; Willness, Chelsea

    2016-01-01

    The "Kobayashi Maru" is a training simulation with its roots in the Star Trek series, notable for its defining characteristic as a no-win scenario with no "correct" resolution, in which the solution actually involves redefining the problem. Drawing upon these characteristics, we designed a board meeting simulation for an…

  8. IT Professionals' Competences: High School Students' Views

    ERIC Educational Resources Information Center

    Garcia-Crespo, Angel; Colomo-Palacios, Ricardo; Gomez-Berbis, Juan Miguel; Tovar-Caro, Edmundo

    2009-01-01

    In recent years, the competency paradigm has become a standard for modern Human Resources Management. The importance and impact of this concept have led higher education institutions to adopt it in the definition of educational resources. In this scenario, knowing which competencies and characteristics define professionals in…

  9. Defining the subjective experience of workload

    NASA Technical Reports Server (NTRS)

    Hart, S. G.; Childress, M. E.; Bortolussi, M.

    1981-01-01

    Flight scenarios that represent different types and levels of pilot workload are needed in order to conduct research about, and develop measures of, pilot workload. To be useful, however, the workload associated with such scenarios and their component tasks must be determined independently. An initial study designed to provide such information was conducted by asking a panel of general aviation pilots to evaluate flight-related tasks for the overall, perceptual, physical, and cognitive workload they impose. These ratings will provide the nucleus for a database of flight-related primary tasks that have been independently rated for workload, for use in workload assessment research.

  10. How Researchers Define Vulnerable Populations in HIV/AIDS Clinical Trials

    PubMed Central

    Lo, Bernard; Strauss, Ronald P.; Eron, Joseph; Gifford, Allen L.

    2010-01-01

    In this study, we interviewed researchers, asking them to define vulnerable populations in HIV/AIDS clinical trials, and provide feedback on the federal regulations for three vulnerable populations. Interview data informed a conceptual framework, and were content analyzed to identify acceptability or disagreement with the regulations. Beginning with several characteristics of vulnerable enrollees identified by researchers, the conceptual framework illustrates possible scenarios of how enrollees could be considered vulnerable in clinical research. Content analysis identified barriers affecting HIV/AIDS researchers’ ability to conduct clinical trials with pregnant women, prisoners, and children, for which the regulations specify additional protections. This study challenges current thinking about federal regulations’ group-based approach to defining vulnerable populations. PMID:20721614

  11. First Simulations of Designing Stratospheric Sulfate Aerosol Geoengineering to Meet Multiple Simultaneous Climate Objectives

    NASA Astrophysics Data System (ADS)

    Kravitz, Ben; MacMartin, Douglas G.; Mills, Michael J.; Richter, Jadwiga H.; Tilmes, Simone; Lamarque, Jean-Francois; Tribbia, Joseph J.; Vitt, Francis

    2017-12-01

    We describe the first simulations of stratospheric sulfate aerosol geoengineering using multiple injection locations to meet multiple simultaneous surface temperature objectives. Simulations were performed using CESM1(WACCM), a coupled atmosphere-ocean general circulation model with fully interactive stratospheric chemistry, dynamics (including an internally generated quasi-biennial oscillation), and a sophisticated treatment of sulfate aerosol formation, microphysical growth, and deposition. The objectives are defined as maintaining three temperature features at their 2020 levels against a background of the RCP8.5 scenario over the period 2020-2099. These objectives are met using a feedback mechanism in which the rate of sulfur dioxide injection at each of the four locations is adjusted independently every year of simulation. Even in the presence of uncertainties, nonlinearities, and variability, the objectives are met, predominantly by SO2 injection at 30°N and 30°S. By the last year of simulation, the feedback algorithm calls for a total injection rate of 51 Tg SO2 per year. The injections are not in the tropics, which results in a greater degree of linearity of the surface climate response with injection amount than has been found in many previous studies using injection at the equator. Because the objectives are defined in terms of annual mean temperature, the required geoengineering results in "overcooling" during summer and "undercooling" during winter. The hydrological cycle is also suppressed as compared to the reference values corresponding to the year 2020. The demonstration we describe in this study is an important step toward understanding what geoengineering can do and what it cannot do.
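
    The annual feedback loop described above can be sketched as a simple integral controller. This is a schematic illustration only: the gain, the temperature values, and the function name are hypothetical and are not those used in the CESM1(WACCM) study, which adjusts four injection locations against three temperature objectives.

    ```python
    # Hypothetical integral feedback: once per simulated year, adjust the
    # SO2 injection rate toward a single temperature objective. Gains and
    # numbers are illustrative assumptions, not values from the study.
    def step_injection(rate, temp, target, k=2.0):
        """One annual controller update.

        rate:   current SO2 injection rate (Tg/yr)
        temp:   simulated annual-mean temperature (K)
        target: objective temperature (K)
        k:      integral gain (Tg/yr per K), assumed
        Injection cannot go negative: aerosol geoengineering can only cool.
        """
        error = temp - target  # positive when the model runs too warm
        return max(0.0, rate + k * error)

    # Toy trajectory: background warming partially offset each year.
    rate = 0.0
    for temp in [288.6, 288.4, 288.2, 288.1]:
        rate = step_injection(rate, temp, target=288.0)
    ```

    The appeal of such a feedback design, as the abstract notes, is that the objectives can be met despite model uncertainties and internal variability, because the controller corrects itself every year rather than relying on a precomputed injection schedule.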

  12. The AP1000® nuclear power plant innovative features for extended station blackout mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vereb, F.; Winters, J.; Schulz, T.

    2012-07-01

    Station Blackout (SBO) is defined as 'a condition wherein a nuclear power plant sustains a loss of all offsite electric power system concurrent with turbine trip and unavailability of all onsite emergency alternating current (AC) power system. Station blackout does not include the loss of available AC power to buses fed by station batteries through inverters or by alternate AC sources as defined in this section, nor does it assume a concurrent single failure or design basis accident...' in accordance with Reference 1. In this paper, the innovative features of the AP1000 plant design are described along with their operation in the scenario of an extended station blackout event. General operation of the passive safety systems is described, as well as the unique features that allow the AP1000 plant to cope for at least 7 days during station blackout. Points of emphasis include: - Passive safety system operation during SBO - 'Fail-safe' nature of key passive safety system valves, which automatically places each valve in a conservatively safe alignment even in the case of multiple failures in all power supply systems, including normal AC and battery backup - Passive Spent Fuel Pool cooling and makeup water supply during SBO - Robustness of the AP1000 plant due to the location of key systems, structures, and components required for safe shutdown - Diverse means of supplying makeup water to the Passive Containment Cooling System (PCS) and the Spent Fuel Pool (SFP) through use of an engineered, safety-related piping interface and portable equipment, as well as permanently installed onsite ancillary equipment. (authors)

  13. Options for national parks and reserves for adapting to climate change.

    PubMed

    Baron, Jill S; Gunderson, Lance; Allen, Craig D; Fleishman, Erica; McKenzie, Donald; Meyerson, Laura A; Oropeza, Jill; Stephenson, Nate

    2009-12-01

    Past and present climate has shaped the valued ecosystems currently protected in parks and reserves, but future climate change will redefine these conditions. Continued conservation as climate changes will require thinking differently about resource management than we have in the past; we present some logical steps and tools for doing so. Three critical tenets underpin future management plans and activities: (1) climate patterns of the past will not be the climate patterns of the future; (2) climate defines the environment and influences future trajectories of the distributions of species and their habitats; (3) specific management actions may help increase the resilience of some natural resources, but fundamental changes in species and their environment may be inevitable. Science-based management will be necessary because past experience may not serve as a guide for novel future conditions. Identifying resources and processes at risk, defining thresholds and reference conditions, and establishing monitoring and assessment programs are among the types of scientific practices needed to support a broadened portfolio of management activities. In addition to the control and hedging management strategies commonly in use today, we recommend adaptive management wherever possible. Adaptive management increases our ability to address the multiple scales at which species and processes function, and increases the speed of knowledge transfer among scientists and managers. Scenario planning provides a broad forward-thinking framework from which the most appropriate management tools can be chosen. The scope of climate change effects will require a shared vision among regional partners. Preparing for and adapting to climate change is as much a cultural and intellectual challenge as an ecological challenge.

  14. Options for national parks and reserves for adapting to climate change

    USGS Publications Warehouse

    Baron, Jill S.; Gunderson, Lance; Allen, Craig D.; Fleishman, Erica; McKenzie, Donald; Meyerson, Laura A.; Oropeza, Jill; Stephenson, Nathan L.

    2009-01-01

    Past and present climate has shaped the valued ecosystems currently protected in parks and reserves, but future climate change will redefine these conditions. Continued conservation as climate changes will require thinking differently about resource management than we have in the past; we present some logical steps and tools for doing so. Three critical tenets underpin future management plans and activities: (1) climate patterns of the past will not be the climate patterns of the future; (2) climate defines the environment and influences future trajectories of the distributions of species and their habitats; (3) specific management actions may help increase the resilience of some natural resources, but fundamental changes in species and their environment may be inevitable. Science-based management will be necessary because past experience may not serve as a guide for novel future conditions. Identifying resources and processes at risk, defining thresholds and reference conditions, and establishing monitoring and assessment programs are among the types of scientific practices needed to support a broadened portfolio of management activities. In addition to the control and hedging management strategies commonly in use today, we recommend adaptive management wherever possible. Adaptive management increases our ability to address the multiple scales at which species and processes function, and increases the speed of knowledge transfer among scientists and managers. Scenario planning provides a broad forward-thinking framework from which the most appropriate management tools can be chosen. The scope of climate change effects will require a shared vision among regional partners. Preparing for and adapting to climate change is as much a cultural and intellectual challenge as an ecological challenge.

  15. The Moral Imperative of Social Justice Leadership: A Critical Component of Effective Practice

    ERIC Educational Resources Information Center

    Rivera-McCutchen, Rosa L.

    2014-01-01

    This study examined how four principals in urban middle and senior high schools with a social justice orientation responded to hypothetical scenarios involving teacher prejudice. The principals in this study did not reference their leadership preparation programs in describing the evolution of their equity-focused leadership philosophies, nor did…

  16. Prospective Conversion: Data Transfer between Fossil and New Microcomputer Technologies in Libraries.

    ERIC Educational Resources Information Center

    Vratny-Watts, Janet; Valauskas, Edward J.

    1989-01-01

    Discusses the technological changes that will necessitate the prospective conversion of library data over the next decade and addresses the problems of converting data from obsolete personal computers to newer models that feature radically different operating systems. Three case studies are used to illustrate possible scenarios. (11 references)…

  17. Pedagogic Strategies Supporting the Use of Synchronous Audiographic Conferencing: A Review of the Literature

    ERIC Educational Resources Information Center

    de Freitas, Sara; Neumann, Tim

    2009-01-01

    Synchronous audiographic conferencing (SAC) refers to a combination of technologies for real-time communication and interaction using multiple media and modes. With an increasing institutional uptake of SAC, users require an understanding of the complex interrelations of multiple media in learning scenarios in order to support pedagogic-driven…

  18. Violent and Nonviolent Children's and Parents' Reasoning about Family and Peer Violence.

    ERIC Educational Resources Information Center

    Astor, Ron Avi; Behre, William J.

    1997-01-01

    A study compared moral reasoning patterns of 17 violent children (ages 10-13) with emotional and behavioral disorders and their aggressive parents to matched controls. When presented with family and peer violence scenarios, the violent children and parents referred more to rules prohibiting provocation rather than to rules prohibiting physical…

  19. Pre-Service Teachers' Mindset Beliefs about Student Ability

    ERIC Educational Resources Information Center

    Gutshall, C. Anne

    2014-01-01

    Introduction: We all have beliefs about our ability or intelligence. The extent to which we believe ability is malleable (growth) or stable (fixed) is commonly referred to as our mindset. This research is designed to explore pre-service teachers' mindset beliefs as well as their beliefs when applied to hypothetical student scenarios. Method:…

  20. Evidence-Based Practices to Reduce Challenging Behaviors of Young Children with Autism

    ERIC Educational Resources Information Center

    Rahn, Naomi L.; Coogle, Christan Grygas; Hanna, Alexajo; Lewellen, Traysha

    2017-01-01

    Challenging behaviors refer to those behaviors that decrease the child's ability to engage and participate in classroom routines (Dunlap, Wilson, Strain, & Lee, 2013), and therefore, the dilemma in the scenario above is common to early childhood and early childhood special education teachers due to an increase in children experiencing autism…
