Sample records for management process estimation

  1. Earth Sciences Data and Information System (ESDIS) program planning and evaluation methodology development

    NASA Technical Reports Server (NTRS)

    Dickinson, William B.

    1995-01-01

    An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements development, management, and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and the use of risk information in management decision-making are addressed.

  2. [Definition and stabilization of processes I. Management processes and support in a Urology Department].

    PubMed

    Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Chiva, Vicente; Gamarra, Manuela

    2015-01-01

    The implementation of total quality management models in clinical departments is best adapted to the ISO 9004:2009 model. An essential part of implementing these models is the establishment of processes and their stabilization. There are four types of processes: key, management, support and operative (clinical). Management processes have four components: a process stabilization form, a process procedures form, a medical activities cost estimation form and a process flow chart. In this paper we detail the creation of an essential process in a surgical department: the management of the surgical waiting list.

  3. Milestones of mathematical model for business process management related to cost estimate documentation in petroleum industry

    NASA Astrophysics Data System (ADS)

    Khamidullin, R. I.

    2018-05-01

    The paper is devoted to milestones of an optimal mathematical model for a business process related to the cost estimate documentation compiled during construction and reconstruction of oil and gas facilities. It describes the study and analysis of fundamental issues in the petroleum industry caused by economic instability and the deterioration of business strategy. Business process management is presented as business process modeling aimed at improving the studied business process, namely the main optimization criteria and recommendations for improving the business model.

  4. Analysis and Management of Animal Populations: Modeling, Estimation and Decision Making

    USGS Publications Warehouse

    Williams, B.K.; Nichols, J.D.; Conroy, M.J.

    2002-01-01

    This book deals with the processes involved in making informed decisions about the management of animal populations. It covers the modeling of population responses to management actions, the estimation of quantities needed in the modeling effort, and the application of these estimates and models to the development of sound management decisions. The book synthesizes and integrates in a single volume the methods associated with these themes, as they apply to ecological assessment and conservation of animal populations. Key features: integrates population modeling, parameter estimation and decision-theoretic approaches to management in a single, cohesive framework; provides authoritative, state-of-the-art descriptions of quantitative approaches to modeling, estimation and decision-making; emphasizes the role of mathematical modeling in the conduct of science and management; utilizes a unifying biological context, consistent mathematical notation, and numerous biological examples.

  5. Engineering the Business of Defense Acquisition: An Analysis of Program Office Processes

    DTIC Science & Technology

    2015-05-01

    [Abstract text fragmented in the source record. Recoverable content: the report cites "Information Technology and Business Process Redesign," MIT Sloan Management Review (retrieved from http://sloanreview.mit.edu); it links systems management to process execution; this phase of a three-phase, multi-year effort comprises a literature review and formal model development.]

  6. Assessing the accuracy of wildland fire situation analysis (WFSA) fire size and suppression cost estimates.

    Treesearch

    Geoffrey H. Donovan; Peter. Noordijk

    2005-01-01

    To determine the optimal suppression strategy for escaped wildfires, federal land managers are required to conduct a wildland fire situation analysis (WFSA). As part of the WFSA process, fire managers estimate final fire size and suppression costs. Estimates from 58 WFSAs conducted during the 2002 fire season are compared to actual outcomes. Results indicate that...

  7. Managing Uncertainty in Runoff Estimation with the U.S. Environmental Protection Agency National Stormwater Calculator.

    EPA Science Inventory

    The U.S. Environmental Protection Agency National Stormwater Calculator (NSWC) simplifies the task of estimating runoff through a straightforward simulation process based on the EPA Stormwater Management Model. The NSWC accesses localized climate and soil hydrology data, and opti...

  8. The association between effectiveness of the management processes and quality of health services from the perspective of the managers in the university hospitals of Ahvaz, Iran

    PubMed Central

    Faraji-Khiavi, F; Ghobadian, S; Moradi-Joo, E

    2015-01-01

    Background and Objective: Knowledge management is introduced as a key element of quality improvement in organizations. No such research had been conducted in the university hospitals of Ahvaz. This study aimed to determine the association between the effectiveness of knowledge management processes and health service quality from the managers' perspective in the educational hospitals of Ahvaz city. Materials and Methods: In this correlational study, the research population consisted of 120 managers from the hospitals of Ahvaz University of Medical Sciences. Because of the limited population size, a census was conducted. Three questionnaires were used for data collection: demographic characteristics, the effectiveness of knowledge management processes, and the quality of medical services. To analyze the data, Spearman correlation analysis, the Kruskal-Wallis test, and the Mann-Whitney U test were used in SPSS. Results: The average scores for the effectiveness of knowledge management processes and its components were relatively appropriate. The quality of medical services was also estimated as relatively appropriate. The quality of health services showed a medium positive correlation with the effectiveness of knowledge management processes (p < 0.001). Managers of different genders showed significant differences in knowledge development and transfer (p = 0.003). Conclusion: A significant positive association was observed between the effectiveness of knowledge management processes and health care quality. To improve health care quality in university hospitals, managers should pay more attention to developing a culture of innovation, encouraging teamwork, and improving communication and creative thinking in the knowledge management context. PMID:28316735
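
    A minimal sketch of the nonparametric workflow named in this abstract (Spearman correlation, Kruskal-Wallis, Mann-Whitney U) using SciPy; the data, group codings, and effect sizes below are synthetic illustrations, not the study's.

      import numpy as np
      from scipy.stats import spearmanr, kruskal, mannwhitneyu

      rng = np.random.default_rng(0)
      n = 120  # managers surveyed, per the abstract

      km = rng.normal(3.4, 0.5, n)                  # KM-process effectiveness (1-5 scale)
      quality = 0.5 * km + rng.normal(1.7, 0.4, n)  # health service quality score
      gender = rng.integers(0, 2, n)                # 0/1 coding, illustrative
      hospital = rng.integers(0, 3, n)              # three hospital groups, illustrative

      rho, p = spearmanr(km, quality)               # association of KM effectiveness and quality
      print(f"Spearman rho = {rho:.2f}, p = {p:.4g}")

      u, p_u = mannwhitneyu(km[gender == 0], km[gender == 1])  # difference by gender
      print(f"Mann-Whitney U = {u:.0f}, p = {p_u:.4g}")

      h, p_k = kruskal(*(quality[hospital == g] for g in range(3)))  # difference among hospitals
      print(f"Kruskal-Wallis H = {h:.2f}, p = {p_k:.4g}")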

  9. The relationship between quality management practices and organisational performance: A structural equation modelling approach

    NASA Astrophysics Data System (ADS)

    Jamaluddin, Z.; Razali, A. M.; Mustafa, Z.

    2015-02-01

    The purpose of this paper is to examine the relationship between quality management practices (QMPs) and organisational performance for the manufacturing industry in Malaysia. In this study, a QMPs and organisational performance framework is developed from a comprehensive literature review covering hard and soft quality factors in the manufacturing process environment. A total of 11 hypotheses are put forward to test the relationships among six constructs: management commitment, training, process management, quality tools, continuous improvement and organisational performance. The model is analysed using Structural Equation Modeling (SEM) with AMOS software version 18.0 and Maximum Likelihood (ML) estimation. A total of 480 questionnaires were distributed, of which 210 were valid for analysis. The results of the modeling analysis using ML estimation indicate that the fit statistics of the QMPs and organisational performance model for the manufacturing industry are admissible. The results show that management commitment has a significant impact on training and process management. Similarly, training has a significant effect on quality tools, process management and continuous improvement. Furthermore, quality tools have a significant influence on process management and continuous improvement. Likewise, process management has a significant impact on continuous improvement, and continuous improvement has a significant influence on organisational performance. However, the study found no significant relationship between management commitment and quality tools, or between management commitment and continuous improvement. The results can be used by managers to prioritize the implementation of QMPs. For instance, practices found to have a positive impact on organisational performance can be recommended to managers so that they can allocate resources to improve them and obtain better performance.

  10. Evaluations of alternative methods for monitoring and estimating responses of salmon productivity in the North Pacific to future climatic change and other processes: A simulation study

    EPA Science Inventory

    Estimation of the relative influence of climate change, compared to other human activities, on dynamics of Pacific salmon (Oncorhynchus spp.) populations can help management agencies take appropriate management actions. We used empirically based simulation modelling of 48 sockeye...

  11. Remote sensing for grassland management in the arid Southwest

    USGS Publications Warehouse

    Marsett, R.C.; Qi, J.; Heilman, P.; Biedenbender, S.H.; Watson, M.C.; Amer, S.; Weltz, M.; Goodrich, D.; Marsett, R.

    2006-01-01

    We surveyed a group of rangeland managers in the Southwest about vegetation monitoring needs on grassland. Based on their responses, the objective of the RANGES (Rangeland Analysis Utilizing Geospatial Information Science) project was defined to be the accurate conversion of remotely sensed data (satellite imagery) to quantitative estimates of total (green and senescent) standing cover and biomass on grasslands and semidesert grasslands. Although remote sensing has been used to estimate green vegetation cover, in arid grasslands herbaceous vegetation is senescent much of the year and is not detected by current remote sensing techniques. We developed a ground-truth protocol compatible with both range management requirements and Landsat's 30 m resolution imagery. The resulting ground-truth data were then used to develop image processing algorithms that quantified total herbaceous vegetation cover, height, and biomass. Cover was calculated based on a newly developed Soil Adjusted Total Vegetation Index (SATVI), and height and biomass were estimated based on reflectance in the near infrared (NIR) band. Comparison of the remotely sensed estimates with independent ground measurements produced r2 values of 0.80, 0.85, and 0.77 and Nash-Sutcliffe values of 0.78, 0.70, and 0.77 for the cover, plant height, and biomass, respectively. The approach for estimating plant height and biomass did not work for sites where forbs comprised more than 30% of total vegetative cover. The ground reconnaissance protocol and image processing techniques together offer land managers accurate and timely methods for monitoring extensive grasslands. The time-consuming requirement to collect concurrent data in the field for each image implies a need to share the high fixed costs of processing an image across multiple users to reduce the costs for individual rangeland managers.
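
    The two quantities central to these results are straightforward to compute: the Soil Adjusted Total Vegetation Index as published by Marsett et al. (2006) and the Nash-Sutcliffe efficiency used for validation. A minimal Python sketch, with illustrative reflectances and cover values rather than real Landsat or field data:

      import numpy as np

      def satvi(red, swir1, swir2, L=0.5):
          """SATVI = ((SWIR1 - red) / (SWIR1 + red + L)) * (1 + L) - SWIR2 / 2."""
          return (swir1 - red) / (swir1 + red + L) * (1.0 + L) - swir2 / 2.0

      def nash_sutcliffe(obs, sim):
          """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
          obs, sim = np.asarray(obs), np.asarray(sim)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      red = np.array([0.12, 0.15, 0.10])    # surface reflectances, illustrative
      swir1 = np.array([0.28, 0.31, 0.22])
      swir2 = np.array([0.18, 0.21, 0.14])
      print("SATVI:", satvi(red, swir1, swir2))

      ground = [32.0, 45.0, 18.0]           # e.g., percent herbaceous cover, invented
      remote = [30.5, 47.2, 20.1]
      print(f"NSE = {nash_sutcliffe(ground, remote):.2f}")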

  12. Lake Erie Yellow perch age estimation based on three structures: Precision, processing times, and management implications

    USGS Publications Warehouse

    Vandergoot, C.S.; Bur, M.T.; Powell, K.A.

    2008-01-01

    Yellow perch Perca flavescens support economically important recreational and commercial fisheries in Lake Erie and are intensively managed. Age estimation represents an integral component in the management of Lake Erie yellow perch stocks, as age-structured population models are used to set safe harvest levels on an annual basis. We compared the precision associated with yellow perch (N = 251) age estimates from scales, sagittal otoliths, and anal spine sections and evaluated the time required to process and estimate age from each structure. Three readers of varying experience estimated ages. The precision (mean coefficient of variation) of estimates among readers was 1% for sagittal otoliths, 5-6% for anal spines, and 11-13% for scales. Agreement rates among readers were 94-95% for otoliths, 71-76% for anal spines, and 45-50% for scales. Systematic age estimation differences were evident among scale and anal spine readers; less-experienced readers tended to underestimate ages of yellow perch older than age 4 relative to estimates made by an experienced reader. Mean scale age tended to underestimate ages of age-6 and older fish relative to otolith ages estimated by an experienced reader. Total annual mortality estimates based on scale ages were 20% higher than those based on otolith ages; mortality estimates based on anal spine ages were 4% higher than those based on otolith ages. Otoliths required more removal and preparation time than scales and anal spines, but age estimation time was substantially lower for otoliths than for the other two structures. We suggest the use of otoliths or anal spines for age estimation in yellow perch (regardless of length) from Lake Erie and other systems where precise age estimates are necessary, because age estimation errors resulting from the use of scales could generate incorrect management decisions. © Copyright by the American Fisheries Society 2008.
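
    The precision metrics reported here (mean coefficient of variation among readers and among-reader agreement) can be reproduced in a few lines; the age estimates below are invented for illustration.

      import numpy as np

      # rows = fish, columns = three readers' age estimates for one structure
      ages = np.array([[4, 4, 4],
                       [6, 6, 5],
                       [3, 3, 3],
                       [7, 8, 7]])

      cv_per_fish = ages.std(axis=1, ddof=1) / ages.mean(axis=1) * 100
      print(f"mean CV = {cv_per_fish.mean():.1f}%")

      # rate of complete agreement among all three readers
      agree = np.all(ages == ages[:, [0]], axis=1)
      print(f"complete agreement = {agree.mean() * 100:.0f}%")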

  13. Comparative evaluation of urban storm water quality models

    NASA Astrophysics Data System (ADS)

    Vaze, J.; Chiew, Francis H. S.

    2003-10-01

    The estimation of urban storm water pollutant loads is required for the development of mitigation and management strategies to minimize impacts to receiving environments. Event pollutant loads are typically estimated using either regression equations or "process-based" water quality models. The relative merit of using regression models compared to process-based models is not clear. A modeling study is carried out here to evaluate the comparative ability of the regression equations and process-based water quality models to estimate event diffuse pollutant loads from impervious surfaces. The results indicate that, once calibrated, both the regression equations and the process-based model can estimate event pollutant loads satisfactorily. In fact, the loads estimated using the regression equation as a function of rainfall intensity and runoff rate are better than the loads estimated using the process-based model. Therefore, if only estimates of event loads are required, regression models should be used because they are simpler and require less data compared to process-based models.
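
    The regression alternative evaluated here predicts event load from rainfall intensity and runoff rate. A minimal sketch of that idea, fitting a power-law form in log space with scikit-learn; the functional form, coefficients, and data are generic illustrations, not the paper's fitted model.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)
      intensity = rng.uniform(2.0, 50.0, 40)                # rainfall intensity, mm/h
      runoff = 0.8 * intensity * rng.uniform(0.6, 1.0, 40)  # runoff rate, mm/h
      load = 0.3 * intensity**0.7 * runoff**0.5 * rng.lognormal(0.0, 0.2, 40)  # kg, synthetic

      X = np.log(np.column_stack([intensity, runoff]))      # fit load = a * I^b * Q^c in log space
      reg = LinearRegression().fit(X, np.log(load))
      print("exponents:", reg.coef_, "log-intercept:", reg.intercept_)

      event = np.log([[25.0, 15.0]])                        # a hypothetical event
      print(f"predicted load = {np.exp(reg.predict(event))[0]:.1f} kg")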

  14. Reconstructing European forest management from 1600 to 2010

    NASA Astrophysics Data System (ADS)

    McGrath, M. J.; Luyssaert, S.; Meyfroidt, P.; Kaplan, J. O.; Buergi, M.; Chen, Y.; Erb, K.; Gimmi, U.; McInerney, D.; Naudts, K.; Otto, J.; Pasztor, F.; Ryder, J.; Schelhaas, M.-J.; Valade, A.

    2015-04-01

    European forest use for fuel, timber and food dates back to pre-Roman times. Century-scale ecological processes and their legacy effects require accounting for forest management when studying today's forest carbon sink. Forest management reconstructions that are used to drive land surface models are one way to quantify the impact of both historical and today's large scale application of forest management on today's forest-related carbon sink and surface climate. In this study we reconstruct European forest management from 1600 to 2010 making use of diverse approaches, data sources and assumptions. Between 1600 and 1828, a demand-supply approach was used in which wood supply was reconstructed based on estimates of historical annual wood increment and land cover reconstructions. For the same period demand estimates accounted for the fuelwood needed in households, wood used in food processing, charcoal used in metal smelting and salt production, timber for construction and population estimates. Comparing estimated demand and supply resulted in a spatially explicit reconstruction of the share of forests under coppice, high stand management and forest left unmanaged. For the reconstruction between 1829 and 2010 a supply-driven back-casting method was used. The method used age reconstructions from the years 1950 to 2010 as its starting point. Our reconstruction reproduces the most important changes in forest management between 1600 and 2010: (1) an increase of 593 000 km2 in conifers at the expense of deciduous forest (decreasing by 538 000 km2), (2) a 612 000 km2 decrease in unmanaged forest, (3) a 152 000 km2 decrease in coppice management, (4) an 818 000 km2 increase in high stand management, and (5) the rise and fall of litter raking, which at its peak in 1853 removed 50 Tg of dry litter per year.

  15. Gaussian process models for reference ET estimation from alternative meteorological data sources

    USDA-ARS?s Scientific Manuscript database

    Accurate estimates of daily crop evapotranspiration (ET) are needed for efficient irrigation management, especially in arid and semi-arid regions where crop water demand exceeds rainfall. Daily grass or alfalfa reference ET values and crop coefficients are widely used to estimate crop water demand. ...
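
    A minimal Gaussian process regression for daily reference ET from a reduced meteorological input set can be sketched with scikit-learn; the predictors (air temperature and solar radiation), kernel settings, and synthetic data below are assumptions for illustration, not the manuscript's configuration.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(2)
      tair = rng.uniform(10.0, 38.0, 100)    # air temperature, deg C
      rs = rng.uniform(10.0, 30.0, 100)      # solar radiation, MJ m-2 d-1
      et = 0.15 * tair + 0.12 * rs + rng.normal(0.0, 0.3, 100)  # mm/day, synthetic

      X = np.column_stack([tair, rs])
      gp = GaussianProcessRegressor(
          kernel=RBF(length_scale=[10.0, 10.0]) + WhiteKernel(),
          normalize_y=True).fit(X, et)

      mean, std = gp.predict(np.array([[30.0, 25.0]]), return_std=True)
      print(f"reference ET = {mean[0]:.2f} +/- {std[0]:.2f} mm/day")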

  16. An algorithm for management of deep brain stimulation battery replacements: devising a web-based battery estimator and clinical symptom approach.

    PubMed

    Montuno, Michael A; Kohner, Andrew B; Foote, Kelly D; Okun, Michael S

    2013-01-01

    Deep brain stimulation (DBS) is an effective technique that has been utilized to treat advanced and medication-refractory movement and psychiatric disorders. In order to avoid implanted pulse generator (IPG) failure and consequent adverse symptoms, a better understanding of IPG battery longevity and management is necessary. Existing methods for battery estimation lack the specificity required for clinical incorporation. Technical challenges prevent higher accuracy longevity estimations, and a better approach to managing end of DBS battery life is needed. The literature was reviewed and DBS battery estimators were constructed by the authors and made available on the web at http://mdc.mbi.ufl.edu/surgery/dbs-battery-estimator. A clinical algorithm for management of DBS battery life was constructed. The algorithm takes into account battery estimations and clinical symptoms. Existing methods of DBS battery life estimation utilize an interpolation of averaged current drains to calculate how long a battery will last. Unfortunately, this technique can only provide general approximations. There are inherent errors in this technique, and these errors compound with each iteration of the battery estimation. Some of these errors cannot be accounted for in the estimation process, and some of the errors stem from device variation, battery voltage dependence, battery usage, battery chemistry, impedance fluctuations, interpolation error, usage patterns, and self-discharge. We present web-based battery estimators along with an algorithm for clinical management. We discuss the perils of using a battery estimator without taking into account the clinical picture. Future work will be needed to provide more reliable management of implanted device batteries; however, implementation of a clinical algorithm that accounts for both estimated battery life and for patient symptoms should improve the care of DBS patients. © 2012 International Neuromodulation Society.
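
    For intuition, the interpolation-style arithmetic that such estimators build on reduces to a few lines; all values below are hypothetical, and as the abstract notes, a real estimator must also contend with impedance fluctuations, voltage dependence, usage patterns, and the other error sources listed above.

      def battery_life_years(capacity_ah, mean_current_ua, self_discharge_pct_yr=5.0):
          """Crude IPG longevity estimate from an averaged current drain."""
          drain_a = mean_current_ua * 1e-6          # microamps to amps
          hours = capacity_ah / drain_a             # capacity / averaged drain
          years = hours / (24 * 365)
          return years * (1.0 - self_discharge_pct_yr / 100.0)  # naive self-discharge penalty

      print(f"{battery_life_years(1.2, 25.0):.1f} years")  # hypothetical capacity and drain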

  17. Isohaline position as a habitat indicator for estuarine populations

    USGS Publications Warehouse

    Jassby, Alan D.; Kimmerer, W.J.; Monismith, Stephen G.; Armor, C.; Cloern, James E.; Powell, T.M.; Vedlinski, Timothy J.

    1995-01-01

    The striped bass survival data were also used to illustrate a related important point: incorporating additional explanatory variables may decrease the prediction error for a population or process, but it can increase the uncertainty in parameter estimates and management strategies based on these estimates. Even in cases where the uncertainty is currently too large to guide management decisions, an uncertainty analysis can identify the most practical direction for future data acquisition.

  18. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
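
    One simple way to convey the uncertainty in an initial estimate, in the spirit of the simulation approach described, is a Monte Carlo roll-up of task costs; the task breakdown and triangular (low, likely, high) ranges below are hypothetical.

      import numpy as np

      rng = np.random.default_rng(3)
      tasks = {                     # $M: (optimistic, most likely, pessimistic)
          "requirements": (1.0, 1.5, 3.0),
          "design":       (2.0, 3.0, 6.0),
          "code_test":    (4.0, 6.5, 14.0),
      }
      n = 100_000
      total = sum(rng.triangular(lo, mode, hi, n) for lo, mode, hi in tasks.values())
      p10, p50, p90 = np.percentile(total, [10, 50, 90])
      print(f"P10 = {p10:.1f}, P50 = {p50:.1f}, P90 = {p90:.1f} ($M)")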

  19. Gaussian processes-based predictive models to estimate reference ET from alternative meteorological data sources for irrigation scheduling

    USDA-ARS?s Scientific Manuscript database

    Accurate estimates of daily crop evapotranspiration (ET) are needed for efficient irrigation management, especially in arid and semi-arid irrigated regions where crop water demand exceeds rainfall. The impact of inaccurate ET estimates can be tremendous in both irrigation cost and the increased dema...

  20. Sire: An Automated Software Development Environment.

    DTIC Science & Technology

    1983-12-01

    ...understanding the fundamental nature of the software process" (Osterweil, 1981: 35). In fact, the optimal environment for most applications is found by extending... resource planning and other management concerns that cause these problems. Therefore, a complete ASDE should attempt to provide the... management with some type of control over the project without impeding the actual development process. Facilities that estimate current resource...

  1. Fertilization effects on forest carbon storage and exchange, and net primary production: A new hybrid process model for stand management

    Treesearch

    D. A. Sampson; R. H. Waring; C. A. Maier; C. M. Gough; M. J. Ducey; K. H. Johnsen

    2006-01-01

    A critical ecological question in plantation management is whether fertilization, which generally increases yield, results in enhanced C sequestration over short rotations. We present a rotation-length hybrid process model (SECRETS-3PG) that was calibrated (using control treatments; CW) and verified (using fertilized treatments; FW) using daily estimates of H

  2. Fertilization effects on forest carbon storage and exchange, and net primary production: a new hybrid process model for stand management

    Treesearch

    D.A. Sampson; R.H. Waring; C.A. Maier; C.M. Gough; M.J. Ducey; K.H. Kohnsen

    2006-01-01

    A critical ecological question in plantation management is whether fertilization, which generally increases yield, results in enhanced C sequestration over short rotations. We present a rotation-length hybrid process model (SECRETS-3PG) that was calibrated (using control treatments; CW) and verified (using fertilized treatments; FW) using daily estimates of H

  3. Estimate benefits of crowdsourced data from social media.

    DOT National Transportation Integrated Search

    2014-12-01

    Traffic Management Centers (TMCs) acquire, process, and integrate data in a variety of ways to support real-time operations. Crowdsourcing has been identified as one of the top trends and technologies that traffic management agencies can adapt and ta...

  4. Road Network State Estimation Using Random Forest Ensemble Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Yi; Edara, Praveen; Chang, Yohan

    Network-scale travel time prediction not only enables traffic management centers (TMC) to proactively implement traffic management strategies, but also allows travelers to make informed decisions about route choices between various origins and destinations. In this paper, a random forest estimator was proposed to predict travel time in a network. The estimator was trained using two years of historical travel time data for a case study network in St. Louis, Missouri. Both temporal and spatial effects were considered in the modeling process. The random forest models predicted travel times accurately during both congested and uncongested traffic conditions. The computational times for the models were low, and thus useful for real-time traffic management and traveler information applications.
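
    A minimal sketch of a random forest travel time estimator of this kind, using scikit-learn; the temporal and spatial features (hour, day of week, upstream link travel time) and the synthetic data are illustrative assumptions, not the study's feature set.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(4)
      n = 5000
      hour = rng.integers(0, 24, n)                 # temporal feature
      dow = rng.integers(0, 7, n)                   # day of week
      upstream_tt = rng.uniform(4.0, 20.0, n)       # spatial feature: upstream link time, min
      peak = ((hour >= 7) & (hour <= 9)) | ((hour >= 16) & (hour <= 18))
      y = 10 + 6 * peak + 0.4 * upstream_tt + rng.normal(0.0, 1.0, n)  # link travel time, min

      X = np.column_stack([hour, dow, upstream_tt])
      rf = RandomForestRegressor(n_estimators=200, n_jobs=-1).fit(X[:4000], y[:4000])
      print("R^2 on held-out data:", round(rf.score(X[4000:], y[4000:]), 3))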

  5. Cost Estimation and Control for Flight Systems

    NASA Technical Reports Server (NTRS)

    Hammond, Walter E.; Vanhook, Michael E. (Technical Monitor)

    2002-01-01

    Good program management practices, cost analysis, cost estimation, and cost control for aerospace flight systems are interrelated and depend upon each other. The best cost control process cannot overcome poor design or poor systems trades that lead to the wrong approach. The project needs robust Technical, Schedule, Cost, Risk, and Cost Risk practices before it can incorporate adequate Cost Control. Cost analysis both precedes and follows cost estimation -- the two are closely coupled with each other and with Risk analysis. Parametric cost estimating relationships and computerized models are most often used. NASA has learned some valuable lessons in controlling cost problems, and recommends use of a summary Project Manager's checklist as shown here.

  6. Auditing of suppliers as the requirement of quality management systems in construction

    NASA Astrophysics Data System (ADS)

    Harasymiuk, Jolanta; Barski, Janusz

    2017-07-01

    The choice of a supplier of construction materials can be an important factor in increasing or reducing the cost of building works. Construction materials represent from 40 to 70% of the cost of an investment task, depending on the kind of works being realized. Suppliers must be evaluated both from the point of view of the effectiveness of the construction undertaking and from the point of view of conformity with the quality management systems being implemented in contractors' organizations. The evaluation of suppliers of construction materials and subcontractors of specialist works is a formal requirement in quality management systems that meet the requirements of the ISO 9001 standard. The aim of this paper is to show the possibilities of using an audit to evaluate the credibility and reliability of a supplier of construction materials. The article describes the kinds of audits carried out in quality management systems, with particular attention to second-party audits. The evaluation criteria for qualitative capability and the method of choosing a supplier of construction materials are characterized. The paper also proposes exemplary questions to be evaluated in the audit process, the way this evaluation is conducted, and its conditioning factors.

  7. An "Ensemble Approach" to Modernizing Extreme Precipitation Estimation for Dam Safety Decision-Making

    NASA Astrophysics Data System (ADS)

    Cifelli, R.; Mahoney, K. M.; Webb, R. S.; McCormick, B.

    2017-12-01

    To ensure structural and operational safety of dams and other water management infrastructure, water resources managers and engineers require information about the potential for heavy precipitation. The methods and data used to estimate extreme rainfall amounts for managing risk are based on 40-year-old science and in need of improvement. The need to evaluate new approaches based on the best science available has led the states of Colorado and New Mexico to engage a body of scientists and engineers in an innovative "ensemble approach" to updating extreme precipitation estimates. NOAA is at the forefront of one of three technical approaches that make up the "ensemble study"; the three approaches are conducted concurrently and in collaboration with each other. One approach is the conventional deterministic, "storm-based" method, another is a risk-based regional precipitation frequency estimation tool, and the third is an experimental approach utilizing NOAA's state-of-the-art High Resolution Rapid Refresh (HRRR) physically-based dynamical weather prediction model. The goal of the overall project is to use the individual strengths of these different methods to define an updated and broadly acceptable state of the practice for evaluation and design of dam spillways. This talk will highlight the NOAA research and NOAA's role in the overarching goal of better understanding and characterizing extreme precipitation estimation uncertainty. The research led by NOAA explores a novel high-resolution dataset and post-processing techniques using a super-ensemble of hourly forecasts from the HRRR model. We also investigate how this rich dataset may be combined with statistical methods to optimally cast the data in probabilistic frameworks. NOAA expertise in the physical processes that drive extreme precipitation is also employed to develop careful testing and improved understanding of the limitations of older estimation methods and assumptions. The process of decision making in the midst of uncertainty is a major part of this study. We will speak to how the three approaches may be used in concert to manage risk and enhance resiliency in the midst of uncertainty. Finally, the presentation will also address the implications of including climate change in future extreme precipitation estimation studies.

  8. 3-PG simulations of young ponderosa pine plantations under varied management intensity: why do they grow so differently?

    Treesearch

    Liang Wei; Marshall John; Jianwei Zhang; Hang Zhou; Robert Powers

    2014-01-01

    Models can be powerful tools for estimating forest productivity and guiding forest management, but their credibility and complexity are often an issue for forest managers. We parameterized a process-based forest growth model, 3-PG (Physiological Principles Predicting Growth), to simulate growth of ponderosa pine (Pinus ponderosa) plantations in...

  9. Estimating spread rates of non-native species: the gypsy moth as a case study

    Treesearch

    Patrick Tobin; Andrew M. Liebhold; E. Anderson Roberts; Laura M. Blackburn

    2015-01-01

    Estimating rates of spread and generating projections of future range expansion for invasive alien species is a key process in the development of management guidelines and policy. Critical needs to estimate spread rates include the availability of surveys to characterize the spatial distribution of an invading species and the application of analytical methods to...

  10. Time-driven activity-based costing.

    PubMed

    Kaplan, Robert S; Anderson, Steven R

    2004-11-01

    In the classroom, activity-based costing (ABC) looks like a great way to manage a company's limited resources. But executives who have tried to implement ABC in their organizations on any significant scale have often abandoned the attempt in the face of rising costs and employee irritation. They should try again, because a new approach sidesteps the difficulties associated with large-scale ABC implementation. In the revised model, managers estimate the resource demands imposed by each transaction, product, or customer, rather than relying on time-consuming and costly employee surveys. This method is simpler since it requires, for each group of resources, estimates of only two parameters: how much it costs per time unit to supply resources to the business's activities (the total overhead expenditure of a department divided by the total number of minutes of employee time available) and how much time it takes to carry out one unit of each kind of activity (as estimated or observed by the manager). This approach also overcomes a serious technical problem associated with employee surveys: the fact that, when asked to estimate time spent on activities, employees invariably report percentages that add up to 100. Under the new system, managers take into account time that is idle or unused. Armed with the data, managers then construct time equations, a new feature that enables the model to reflect the complexity of real-world operations by showing how specific order, customer, and activity characteristics cause processing times to vary. This Tool Kit uses concrete examples to demonstrate how managers can obtain meaningful cost and profitability information, quickly and inexpensively. Rather than endlessly updating and maintaining ABC data, they can now spend their time addressing the deficiencies the model reveals: inefficient processes, unprofitable products and customers, and excess capacity.
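
    The two parameters and the time-equation idea can be made concrete in a few lines of Python; the department figures and activity times below are invented for illustration.

      dept_overhead = 560_000.0       # $ per quarter, hypothetical
      practical_minutes = 700_000.0   # employee minutes actually available per quarter
      rate = dept_overhead / practical_minutes      # capacity cost rate, $ per minute
      print(f"capacity cost rate = ${rate:.2f}/min")

      def order_cost(base_min=8.0, per_line_min=2.0, lines=1, special_handling=False):
          """Time equation: processing time varies with order characteristics."""
          minutes = base_min + per_line_min * lines + (15.0 if special_handling else 0.0)
          return minutes * rate

      print(f"standard 3-line order:  ${order_cost(lines=3):.2f}")
      print(f"special-handling order: ${order_cost(lines=3, special_handling=True):.2f}")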

  11. Ixcatec ethnoecology: plant management and biocultural heritage in Oaxaca, Mexico.

    PubMed

    Rangel-Landa, Selene; Casas, Alejandro; Rivera-Lozoya, Erandi; Torres-García, Ignacio; Vallejo-Ramos, Mariana

    2016-07-20

    Studying the motives of plant management allows understanding of the processes that gave rise to agriculture and of current forms of traditional technology innovation. Our work analyses the role of native plants in Ixcatec subsistence, management practices, the biocultural importance of native plants, and the motivations influencing management decisions. Cultural and ecological importance and management complexity may differ among species according to their use value and availability. We hypothesized that decreasing risk in the availability of resources underlies the main motives of management, but curiosity, aesthetics, and ethical values may also be determinant. The role of plants in subsistence strategies and their forms of use and management were documented through 130 semi-structured interviews and participant observation. Free-listing interviews with 38 people were used to estimate the cognitive importance of species used as food, medicine, fuel, fodder, ornament, and in ceremonies. Species' ecological importance was evaluated through vegetation sampling at 22 points. Principal Components Analyses were performed to explore the relation between management, cultural and ecological importance, and to estimate the biocultural importance of native species. We recorded 627 useful plant species, 589 of them native. Household livelihood strategies rely on agriculture, livestock and the multiple use of forest resources. At least 400 species are managed, some of them involving artificial selection. Management complexity is the main factor reflecting the biocultural importance of plant species, and the weight of ecological importance and cultural value varies among use types. Management strategies aim to ensure resource availability, to have resources closer, to embellish human spaces, or to satisfy ethical principles. Decisions about plant management are influenced by the perceived risk of failing to satisfy material needs, but immaterial principles are also important. Studying this relation is crucial for understanding past and present processes of technological innovation and the complex process of developing a biocultural legacy.

  12. Tree Canopy Light Interception Estimates in Almond and a Walnut Orchards Using Ground, Low Flying Aircraft, and Satellite Based Methods to Improve Irrigation Scheduling Programs

    NASA Technical Reports Server (NTRS)

    Rosecrance, Richard C.; Johnson, Lee; Soderstrom, Dominic

    2016-01-01

    Canopy light interception is a main driver of water use and crop yield in almond and walnut production. Fractional green canopy cover (Fc) is a good indicator of light interception and can be estimated remotely from satellite using the normalized difference vegetation index (NDVI) data. Satellite-based Fc estimates could be used to inform crop evapotranspiration models, and hence support improvements in irrigation evaluation and management capabilities. Satellite estimates of Fc in almond and walnut orchards, however, need to be verified before incorporating them into irrigation scheduling or other crop water management programs. In this study, Landsat-based NDVI and Fc from NASA's Satellite Irrigation Management Support (SIMS) were compared with four estimates of canopy cover: 1. light bar measurement, 2. in-situ and image-based dimensional tree-crown analyses, 3. high-resolution NDVI data from low flying aircraft, and 4. orchard photos obtained via Google Earth and processed by an ImageJ thresholding routine. Correlations between the various estimates are discussed.
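
    The NDVI computation is standard; mapping NDVI to fractional green canopy cover (Fc) is typically a fitted linear relation. In the sketch below the gain and offset are placeholders to be calibrated against ground truth (light bar, crown analysis), not the operational SIMS coefficients.

      import numpy as np

      def ndvi(nir, red):
          """Normalized difference vegetation index."""
          return (nir - red) / (nir + red)

      def fc_from_ndvi(v, gain=1.2, offset=-0.1):   # hypothetical calibration
          """Linear NDVI-to-cover mapping, clipped to the physical range."""
          return np.clip(gain * v + offset, 0.0, 1.0)

      nir = np.array([0.45, 0.52, 0.38])   # illustrative orchard reflectances
      red = np.array([0.08, 0.06, 0.12])
      print("Fc:", fc_from_ndvi(ndvi(nir, red)))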

  13. Tree canopy light interception estimates in almond and a walnut orchards using ground, low flying aircraft, and satellite based methods to improve irrigation scheduling programs.

    NASA Astrophysics Data System (ADS)

    Rosecrance, R. C.; Johnson, L.; Soderstrom, D.

    2016-12-01

    Canopy light interception is a main driver of water use and crop yield in almond and walnut production. Fractional green canopy cover (Fc) is a good indicator of light interception and can be estimated remotely from satellite using the normalized difference vegetation index (NDVI) data. Satellite-based Fc estimates could be used to inform crop evapotranspiration models, and hence support improvements in irrigation evaluation and management capabilities. Satellite estimates of Fc in almond and walnut orchards, however, need to be verified before incorporating them into irrigation scheduling or other crop water management programs. In this study, Landsat-based NDVI and Fc from NASA's Satellite Irrigation Management Support (SIMS) were compared with four estimates of canopy cover: 1. light bar measurement, 2. in-situ and image-based dimensional tree-crown analyses, 3. high-resolution NDVI data from low flying aircraft, and 4. orchard photos obtained via Google Earth and processed by an ImageJ thresholding routine. Correlations between the various estimates are discussed.

  14. PREDICTIVE UNCERTAINTY IN HYDROLOGIC AND WATER QUALITY MODELING: APPROACHES, APPLICATION TO ENVIRONMENTAL MANAGEMENT, AND FUTURE CHALLENGES (PRESENTATION)

    EPA Science Inventory

    Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...

  15. PREDICTIVE UNCERTAINTY IN HYDROLOGIC AND WATER QUALITY MODELING: APPROACHES, APPLICATION TO ENVIRONMENTAL MANAGEMENT, AND FUTURE CHALLENGES

    EPA Science Inventory

    Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...

  16. Estimating Lion Abundance using N-mixture Models for Social Species

    PubMed Central

    Belant, Jerrold L.; Bled, Florent; Wilton, Clay M.; Fyumagwa, Robert; Mwampeta, Stanslaus B.; Beyer, Dean E.

    2016-01-01

    Declining populations of large carnivores worldwide, and the complexities of managing human-carnivore conflicts, require accurate population estimates of large carnivores to promote their long-term persistence through well-informed management. We used N-mixture models to estimate lion (Panthera leo) abundance from call-in and track surveys in southeastern Serengeti National Park, Tanzania. Because of potential habituation to broadcasted calls and social behavior, we developed a hierarchical observation process within the N-mixture model conditioning lion detectability on their group response to call-ins and individual detection probabilities. We estimated 270 lions (95% credible interval = 170–551) using call-ins but were unable to estimate lion abundance from track data. We found a weak negative relationship between predicted track density and predicted lion abundance from the call-in surveys. Luminosity was negatively correlated with individual detection probability during call-in surveys. Lion abundance and track density were influenced by landcover, but direction of the corresponding effects were undetermined. N-mixture models allowed us to incorporate multiple parameters (e.g., landcover, luminosity, observer effect) influencing lion abundance and probability of detection directly into abundance estimates. We suggest that N-mixture models employing a hierarchical observation process can be used to estimate abundance of other social, herding, and grouping species. PMID:27786283
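
    For readers unfamiliar with the technique, the sketch below fits the basic binomial N-mixture model (latent site abundance N ~ Poisson(lambda), detection probability p) to simulated repeat counts by maximizing the marginal likelihood. It omits the hierarchical group-response layer the authors add for call-in surveys, and the data are simulated rather than the Serengeti counts.

      import numpy as np
      from scipy.stats import poisson, binom
      from scipy.optimize import minimize

      rng = np.random.default_rng(5)
      R, T, true_lam, true_p = 60, 3, 8.0, 0.4
      N = rng.poisson(true_lam, R)                   # latent abundance per site
      y = rng.binomial(N[:, None], true_p, (R, T))   # observed counts, R sites x T surveys

      K = 80  # upper bound for summing over the latent abundance

      def neg_log_lik(theta):
          lam, p = np.exp(theta[0]), 1.0 / (1.0 + np.exp(-theta[1]))
          n_vals = np.arange(K + 1)
          prior = poisson.pmf(n_vals, lam)           # P(N = n)
          ll = 0.0
          for i in range(R):
              like_n = prior * np.prod(binom.pmf(y[i][:, None], n_vals, p), axis=0)
              ll += np.log(like_n.sum())             # marginalize over N
          return -ll

      fit = minimize(neg_log_lik, x0=[np.log(5.0), 0.0], method="Nelder-Mead")
      lam_hat, p_hat = np.exp(fit.x[0]), 1.0 / (1.0 + np.exp(-fit.x[1]))
      print(f"lambda = {lam_hat:.2f} (true {true_lam}), p = {p_hat:.2f} (true {true_p})")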

  17. Estimating Lion Abundance using N-mixture Models for Social Species.

    PubMed

    Belant, Jerrold L; Bled, Florent; Wilton, Clay M; Fyumagwa, Robert; Mwampeta, Stanslaus B; Beyer, Dean E

    2016-10-27

    Declining populations of large carnivores worldwide, and the complexities of managing human-carnivore conflicts, require accurate population estimates of large carnivores to promote their long-term persistence through well-informed management. We used N-mixture models to estimate lion (Panthera leo) abundance from call-in and track surveys in southeastern Serengeti National Park, Tanzania. Because of potential habituation to broadcasted calls and social behavior, we developed a hierarchical observation process within the N-mixture model conditioning lion detectability on their group response to call-ins and individual detection probabilities. We estimated 270 lions (95% credible interval = 170-551) using call-ins but were unable to estimate lion abundance from track data. We found a weak negative relationship between predicted track density and predicted lion abundance from the call-in surveys. Luminosity was negatively correlated with individual detection probability during call-in surveys. Lion abundance and track density were influenced by landcover, but direction of the corresponding effects were undetermined. N-mixture models allowed us to incorporate multiple parameters (e.g., landcover, luminosity, observer effect) influencing lion abundance and probability of detection directly into abundance estimates. We suggest that N-mixture models employing a hierarchical observation process can be used to estimate abundance of other social, herding, and grouping species.

  18. Assessing quality of citizen scientists’ soil texture estimates to evaluate land potential

    USDA-ARS?s Scientific Manuscript database

    Texture influences nearly all soil processes and is often the most measured parameter in soil science. Estimating soil texture is a universal and fundamental practice applied by resource scientists to classify and understand the behavior and management of soil systems. While trained soil scientist c...

  19. Estimating Regional and National-Scale Greenhouse Gas Emissions in the Agriculture, Forestry, and Other Land Use (AFOLU) Sector using the `Agricultural and Land Use (ALU) Tool'

    NASA Astrophysics Data System (ADS)

    Spencer, S.; Ogle, S. M.; Wirth, T. C.; Sivakami, G.

    2016-12-01

    The Intergovernmental Panel on Climate Change (IPCC) provides methods and guidance for estimating anthropogenic greenhouse gas emissions for reporting to the United Nations Framework Convention on Climate Change. The methods are comprehensive and require extensive data compilation, management, aggregation, documentation and calculations for source and sink categories to achieve robust emissions estimates. IPCC Guidelines describe three estimation tiers that require increasing levels of country-specific data and method complexity. Use of higher tiers should improve overall accuracy and reduce uncertainty in estimates. The AFOLU sector represents a complex set of methods for estimating greenhouse gas emissions and carbon sinks. Major AFOLU emissions and sinks include carbon dioxide (CO2) from carbon stock change in biomass, dead organic matter and soils, urea or lime application to soils, and oxidation of carbon in drained organic soils; nitrous oxide (N2O) and methane (CH4) emissions from livestock management and biomass burning; N2O from organic amendments and fertilizer application to soils; and CH4 emissions from rice cultivation. To assist inventory compilers with calculating AFOLU-sector estimates, the Agriculture and Land Use Greenhouse Gas Inventory Tool (ALU) was designed to implement Tier 1 and 2 methods using IPCC Good Practice Guidance. It guides the compiler through activity data entry, emission factor assignment, and emissions calculations while carefully maintaining data integrity. ALU also provides IPCC defaults and can estimate uncertainty. ALU was designed to simplify the AFOLU inventory compilation process at regional or national scales; disaggregating the process into a series of steps reduces the potential for errors in the compilation process. An example application has been developed using ALU to estimate methane emissions from rice production in the United States.
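
    At its core, the Tier 1 pattern the ALU tool automates is activity data multiplied by an emission factor, converted to CO2-equivalent with a global warming potential. A minimal sketch for methane from rice cultivation; the area, season length, and emission factor are placeholders rather than IPCC defaults, and the GWP of 25 is the AR4 100-year value used in many inventories.

      GWP_CH4 = 25.0   # AR4 100-year global warming potential for methane

      def tier1_ch4_rice(area_ha, season_days, ef_kg_ch4_per_ha_day):
          """CH4 from rice cultivation, returned as tonnes CO2e."""
          ch4_kg = area_ha * season_days * ef_kg_ch4_per_ha_day   # activity data x factor
          return ch4_kg * GWP_CH4 / 1000.0                        # kg CH4 -> t CO2e

      # hypothetical region: 50,000 ha, 130-day season, EF = 1.3 kg CH4/ha/day
      print(f"{tier1_ch4_rice(50_000, 130, 1.3):,.0f} t CO2e")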

  20. Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, George; Doll, William E.; Beard, Les P.

    2009-01-01

    Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process, the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods that are needed to address the estimation problems are known but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.
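
    As one concrete flavor of point-process intensity estimation, the sketch below turns simulated anomaly locations into a contamination intensity surface with a kernel density estimate; this illustrates the general idea only and is not the authors' method.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(6)
      # simulated anomaly locations (x, y in meters), clustered around two aim points
      pts = np.vstack([rng.normal([200.0, 300.0], 40.0, (80, 2)),
                       rng.normal([600.0, 450.0], 60.0, (50, 2))]).T   # shape (2, n)

      kde = gaussian_kde(pts)
      xi, yi = np.mgrid[0:800:100j, 0:800:100j]
      density = kde(np.vstack([xi.ravel(), yi.ravel()])).reshape(xi.shape)
      intensity = density * pts.shape[1]   # rescale to expected anomalies per unit area
      print("peak-intensity grid cell:", np.unravel_index(intensity.argmax(), intensity.shape))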

  1. Pairing top-down and bottom-up approaches to analyze catchment scale management of water quality and quantity

    NASA Astrophysics Data System (ADS)

    Lovette, J. P.; Duncan, J. M.; Band, L. E.

    2016-12-01

    Watershed management requires information on the hydrologic impacts of local to regional land use, land cover and infrastructure conditions. Management of runoff volumes, storm flows, and water quality can benefit from large scale, "top-down" screening tools, using readily available information, as well as more detailed, "bottom-up" process-based models that explicitly track local runoff production and routing from sources to receiving water bodies. Regional scale data, available nationwide through the NHD+, and top-down models based on aggregated catchment information provide useful tools for estimating regional patterns of peak flows, volumes and nutrient loads at the catchment level. Management impacts can be estimated with these models, but have limited ability to resolve impacts beyond simple changes to land cover proportions. Alternatively, distributed process-based models provide more flexibility in modeling management impacts by resolving spatial patterns of nutrient source, runoff generation, and uptake. This bottom-up approach can incorporate explicit patterns of land cover, drainage connectivity, and vegetation extent, but are typically applied over smaller areas. Here, we first model peak flood flows and nitrogen loads across North Carolina's 70,000 NHD+ catchments using USGS regional streamflow regression equations and the SPARROW model. We also estimate management impact by altering aggregated sources in each of these models. To address the missing spatial implications of the top-down approach, we further explore the demand for riparian buffers as a management strategy, simulating the accumulation of nutrient sources along flow paths and the potential mitigation of these sources through forested buffers. We use the Regional Hydro-Ecological Simulation System (RHESSys) to model changes across several basins in North Carolina's Piedmont and Blue Ridge regions, ranging in size from 15 - 1,130 km2. The two approaches provide a complementary set of tools for large area screening, followed by smaller, more process based assessment and design tools.

  2. Anthropogenic Influences on Conservation Values of White Rhinoceros

    PubMed Central

    Ferreira, Sam M.; Botha, Judith M.; Emmett, Megan C.

    2012-01-01

    White rhinoceros (rhinos) is a keystone conservation species and also provides revenue for protection agencies. Restoring or mimicking the outcomes of impeded ecological processes allows reconciliation of biodiversity and financial objectives. We evaluate the consequences of white rhino management removal, and in recent times, poaching, on population persistence, regional conservation outcomes and opportunities for revenue generation. In Kruger National Park, white rhinos increased from 1998 to 2008. Since then the population may vary non-directionally. In 2010, we estimated 10,621 (95% CI: 8,767–12,682) white rhinos using three different population estimation methods. The desired management effect of a varying population was detectable after 2008. Age and sex structures in sink areas (focal rhino capture areas) were different from elsewhere. This comes from relatively more sub-adults being removed by managers than what the standing age distribution defined. Poachers in turn focused on more adults in 2011. Although the effect of poaching was not detectable at the population level given the confidence intervals of estimates, managers accommodated expected poaching annually and adapted management removals. The present poaching trend predicts that 432 white rhinos may be poached in Kruger during 2012. The white rhino management model mimicking outcomes of impeded ecological processes predicts 397 rhino management removals are required. At present poachers may be doing “management removals,” but conservationists have no opportunity left to contribute to regional rhino conservation strategies or generate revenue through white rhino sales. In addition, continued trends in poaching predict detectable white rhino declines in Kruger National Park by 2016. Our results suggest that conservationists need innovative approaches that reduce financial incentives to curb the threats that poaching poses to several conservation values of natural resources such as white rhinos. PMID:23029354

  3. A new device to estimate abundance of moist-soil plant seeds

    USGS Publications Warehouse

    Penny, E.J.; Kaminski, R.M.; Reinecke, K.J.

    2006-01-01

    Methods to sample the abundance of moist-soil seeds efficiently and accurately are critical for evaluating management practices and determining food availability. We adapted a portable, gasoline-powered vacuum to estimate abundance of seeds on the surface of a moist-soil wetland in east-central Mississippi and evaluated the sampler by simulating conditions that researchers and managers may experience when sampling moist-soil areas for seeds. We measured the percent recovery of known masses of seeds by the vacuum sampler in relation to 4 experimentally controlled factors (i.e., seed-size class, sample mass, soil moisture class, and vacuum time) with 2-4 levels per factor. We also measured processing time of samples in the laboratory. Across all experimental factors, seed recovery averaged 88.4% and varied little (CV = 0.68%, n = 474). Overall, mean time to process a sample was 30.3 ± 2.5 min (SE, n = 417). Our estimate of seed recovery rate (88%) may be used to adjust estimates for incomplete seed recovery, or project-specific correction factors may be developed by investigators. Our device was effective for estimating surface abundance of moist-soil plant seeds after dehiscence and before habitats were flooded.
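
    Applying the reported recovery rate as a correction factor is straightforward; a minimal sketch with a hypothetical sample mass:

      RECOVERY = 0.884   # mean proportion of seed mass recovered by the vacuum sampler

      def adjusted_seed_mass(raw_g):
          """Correct a vacuum-sampled seed mass for incomplete recovery."""
          return raw_g / RECOVERY

      print(f"{adjusted_seed_mass(12.5):.1f} g")   # the 12.5 g raw mass is illustrative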

  4. Setting numerical population objectives for priority landbird species

    Treesearch

    Kenneth V. Rosenberg; Peter J. Blancher

    2005-01-01

    Following the example of the North American Waterfowl Management Plan, deriving numerical population estimates and conservation targets for priority landbird species is considered a desirable, if not necessary, element of the Partners in Flight planning process. Methodology for deriving such estimates remains in its infancy, however, and the use of numerical population...

  5. KSC Construction Cost Index

    NASA Technical Reports Server (NTRS)

    Brown, J. A.

    1983-01-01

    The Kennedy Space Center (KSC) Construction Cost Index aids in conceptual design cost estimates. The report discusses the development of the KSC Cost Index since January 1974. The index provides management, design engineers, and estimators with an up-to-date reference for local labor and material prices, as well as the amount and rate of change in these costs, which are used to predict future construction costs.

  6. Estimating and controlling workplace risk: an approach for occupational hygiene and safety professionals.

    PubMed

    Toffel, Michael W; Birkner, Lawrence R

    2002-07-01

    The protection of people and physical assets is the objective of health and safety professionals and is accomplished through the paradigm of anticipation, recognition, evaluation, and control of risks in the occupational environment. Risk assessment concepts are not only used by health and safety professionals, but also by business and financial planners. Since meeting health and safety objectives requires financial resources provided by business and governmental managers, the hypothesis addressed here is that health and safety risk decisions should be made with the probabilistic processes used in financial decision-making, which are familiar and recognizable to business and government planners and managers. This article develops the processes and demonstrates the use of incident probabilities, historic outcome information, and incremental impact analysis to estimate the risk of multiple alternatives in the chemical process industry. It also analyzes how the ethical aspects of decision-making can be addressed in formulating health and safety risk management plans. It is concluded that certain, easily understood, and applied probabilistic risk assessment methods used by business and government to assess financial and outcome risk have applicability to improving workplace health and safety in three ways: 1) by linking the business and health and safety risk assessment processes to securing resources, 2) by providing an additional set of tools for health and safety risk assessment, and 3) by requiring the risk assessor to consider multiple risk management alternatives.
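
    The expected-loss logic behind such a probabilistic comparison of control alternatives can be sketched in a few lines; the incident rates, losses, and control costs below are hypothetical.

      alternatives = {
          # name: (incidents per year, mean loss per incident $, annual control cost $)
          "status quo":      (0.80, 250_000, 0),
          "engineering fix": (0.15, 250_000, 120_000),
          "admin controls":  (0.40, 250_000, 30_000),
      }
      for name, (rate, loss, control) in alternatives.items():
          expected = rate * loss + control   # expected annual cost of the alternative
          print(f"{name}: expected annual cost = ${expected:,.0f}")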

  7. 78 FR 21912 - Proposed Information Collection; Comment Request; Processed Products Family of Forms

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-12

    ... Collection; Comment Request; Processed Products Family of Forms AGENCY: National Oceanic and Atmospheric... NOAA in the economic and social analyses developed when proposing and evaluating fishery management... collection). Affected Public: Business or other for-profit organizations. Estimated Number of Respondents...

  8. Greenhouse gas emissions of waste management processes and options: A case study.

    PubMed

    de la Barrera, Belen; Hooda, Peter S

    2016-07-01

    Increasing concern about climate change is prompting organisations to mitigate their greenhouse gas emissions. Waste management activities also contribute to greenhouse gas emissions. In the waste management sector, there has been an increasing diversion of waste sent to landfill, with much emphasis on recycling and reuse to prevent emissions. This study evaluates the carbon footprint of the different processes involved in waste management systems, considering the entire waste management stream. Waste management data from the Royal Borough of Kingston upon Thames, London (UK), was used to estimate the carbon footprint of the borough's current source-segregation system. Modelled full and partial co-mingling scenarios were then used to estimate carbon emissions from these proposed waste management approaches. The greenhouse gas emissions from the entire waste management system at the Royal Borough of Kingston upon Thames were 12,347 t CO2e for the source-segregated scenario, and 11,907 t CO2e for the partial co-mingled model. These emissions amount to 203.26 kg CO2e t(-1) and 196.02 kg CO2e t(-1) municipal solid waste for source-segregated and partial co-mingled, respectively. The change from a source-segregation fleet to a partial co-mingling fleet reduced the emissions, at least partly owing to a change in the number and type of vehicles. © The Author(s) 2016.

  9. Estimating the costs of human space exploration

    NASA Technical Reports Server (NTRS)

    Mandell, Humboldt C., Jr.

    1994-01-01

    The plan for NASA's new exploration initiative has the following strategic themes: (1) incremental, logical evolutionary development; (2) economic viability; and (3) excellence in management. The cost estimation process is involved with all of these themes and they are completely dependent upon the engineering cost estimator for success. The purpose is to articulate the issues associated with beginning this major new government initiative, to show how NASA intends to resolve them, and finally to demonstrate the vital importance of a leadership role by the cost estimation community.

  10. Population viability as a measure of forest sustainability

    Treesearch

    Eric T. Linder; Nathan A. Klaus; David A. Buehler

    2004-01-01

    Many forest managers work to balance timber production with protection of ecological processes and other nontimber values. The preservation of biodiversity is an important nontimber value. When a suite of management options is being developed, it is difficult to estimate quantitatively the impact of the various scenarios on biodiversity. We suggest population viability...

  11. Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)

    NASA Astrophysics Data System (ADS)

    Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.

    2014-04-01

    A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal framework of analysis to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approaches as a comprehensive and practical management tool.

  12. Cost benefits of advanced software: A review of methodology used at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla N.

    1993-01-01

    To assist rational investments in advanced software, a formal, explicit, and multi-perspective cost-benefit analysis methodology is proposed. The methodology can be implemented through a six-stage process which is described and explained. The current practice of cost-benefit analysis at KSC is reviewed in the light of this methodology. The review finds that there is a vicious circle operating. Unsound methods lead to unreliable cost-benefit estimates. Unreliable estimates convince management that cost-benefit studies should not be taken seriously. Then, given external demands for cost-benefit estimates, management encourages software engineers to somehow come up with the numbers for their projects. Lacking the expertise needed to do a proper study, courageous software engineers with vested interests use ad hoc and unsound methods to generate some estimates. In turn, these estimates are unreliable, and the cycle continues. The proposed methodology should help KSC to break out of this cycle.

  13. Government conceptual estimating for contracting and management

    NASA Technical Reports Server (NTRS)

    Brown, J. A.

    1986-01-01

    The use of the Aerospace Price Book, a cost index, and conceptual cost estimating for cost-effective design and construction of space facilities is discussed. The price book consists of over 200 commonly used conceptual elements and 100 systems summaries of projects such as launch pads, processing facilities, and air locks. The cost index is composed of three divisions: (1) bid summaries of major Shuttle projects, (2) budget cost data sheets, and (3) cost management summaries; each of these divisions is described. Conceptual estimates of facilities and ground support equipment are required to provide the most probable project cost for budget, funding, and project approval purposes. Similar buildings, systems, and elements already designed are located in the cost index in order to make the best rough order of magnitude conceptual estimates for development of Space Shuttle facilities. An example displaying the applicability of the conceptual cost estimating procedure for the development of the KSC facilities is presented.

  14. Testing the Wisconsin Phosphorus Index with year-round, field-scale runoff monitoring.

    PubMed

    Good, Laura W; Vadas, Peter; Panuska, John C; Bonilla, Carlos A; Jokela, William E

    2012-01-01

    The Wisconsin Phosphorus Index (WPI) is one of several P indices in the United States that use equations to describe actual P loss processes. Although for nutrient management planning the WPI is reported as a dimensionless whole number, it is calculated as average annual dissolved P (DP) and particulate P (PP) mass delivered per unit area. The WPI calculations use soil P concentration, applied manure and fertilizer P, and estimates of average annual erosion and average annual runoff. We compared WPI estimated P losses to annual P loads measured in surface runoff from 86 field-years on crop fields and pastures. As the erosion and runoff generated by the weather in the monitoring years varied substantially from the average annual estimates used in the WPI, the WPI and measured loads were not well correlated. However, when measured runoff and erosion were used in the WPI field loss calculations, the WPI accurately estimated annual total P loads with a Nash-Sutcliffe Model Efficiency (NSE) of 0.87. The DP loss estimates were not as close to measured values (NSE = 0.40) as the PP loss estimates (NSE = 0.89). Some errors in estimating DP losses may be unavoidable due to uncertainties in estimating on-farm manure P application rates. The WPI is sensitive to field management that affects its erosion and runoff estimates. Provided that the WPI methods for estimating average annual erosion and runoff are accurately reflecting the effects of management, the WPI is an accurate field-level assessment tool for managing runoff P losses. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
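
    The Nash-Sutcliffe Model Efficiency used above to score the WPI compares squared prediction errors against the variance of the observations. A minimal Python sketch with hypothetical field-year loads:

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            """NSE: 1 is perfect; 0 means no better than the observed mean."""
            observed, simulated = np.asarray(observed), np.asarray(simulated)
            return 1 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

        # Hypothetical annual total-P loads (kg/ha) for five field-years.
        measured = [0.8, 2.1, 0.4, 3.3, 1.2]
        predicted = [0.9, 1.8, 0.5, 3.0, 1.4]
        print(round(nash_sutcliffe(measured, predicted), 2))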

  15. The Large Synoptic Survey Telescope project management control system

    NASA Astrophysics Data System (ADS)

    Kantor, Jeffrey P.

    2012-09-01

    The Large Synoptic Survey Telescope (LSST) program is jointly funded by the NSF, the DOE, and private institutions and donors. From an NSF funding standpoint, the LSST is a Major Research Equipment and Facilities (MREFC) project. The NSF funding process requires proposals and D&D reviews to include activity-based budgets and schedules; documented basis of estimates; risk-based contingency analysis; and cost escalation and categorization. "Out of the box," the commercial tool Primavera P6 contains approximately 90% of the planning and estimating capability needed to satisfy R&D phase requirements, and it is customizable/configurable for the remainder with relatively little effort. We describe the customization/configuration and use of Primavera for the LSST Project Management Control System (PMCS), assess our experience to date, and describe future directions. Examples in this paper are drawn from the LSST Data Management System (DMS), which is one of three main subsystems of the LSST and is funded by the NSF. By astronomy standards the LSST DMS is a large data management project, processing and archiving over 70 petabytes of image data, producing over 20 petabytes of catalogs annually, and generating 2 million transient alerts per night. Over the 6-year construction and commissioning phase, the DM project is estimated to require 600,000 hours of engineering effort. In total, the DMS cost is approximately 60% hardware/system software and 40% labor.

  16. Algorithmic decision rules for estimating growth, removals, and mortality within a national-scale forest inventory (USA)

    Treesearch

    William H. McWilliams; Carol L. Alerich; William A. Bechtold; Mark Hansen; Christopher M. Oswalt; Mike Thompson; Jeff Turner

    2012-01-01

    The U.S. Department of Agriculture, Forest Service, Forest Inventory and Analysis (FIA) program maintains the National Information Management System (NIMS) that provides the computational framework for the annual forest inventory of the United States. Questions regarding the impact of key elements of programming logic, processing criteria, and estimation procedures...

  17. Parallel task processing of very large datasets

    NASA Astrophysics Data System (ADS)

    Romig, Phillip Richardson, III

    This research concerns the use of distributed computer technologies for the analysis and management of very large datasets. Improvements in sensor technology, an emphasis on global change research, and greater access to data warehouses all increase the number of non-traditional users of remotely sensed data. We present a framework for distributed solutions to the challenges of datasets which exceed the online storage capacity of individual workstations. This framework, called parallel task processing (PTP), incorporates both the task- and data-level parallelism exemplified by many image processing operations. An implementation based on the principles of PTP, called Tricky, is also presented. Additionally, we describe the challenges and practical issues in modeling the performance of parallel task processing with large datasets. We present a mechanism for estimating the running time of each unit of work within a system and an algorithm that uses these estimates to simulate the execution environment and produce estimated runtimes. Finally, we describe and discuss experimental results which validate the design. Specifically, the system (a) is able to perform computation on datasets which exceed the capacity of any one disk, (b) provides reduction of overall computation time as a result of the task distribution even with the additional cost of data transfer and management, and (c) in the simulation mode accurately predicts the performance of the real execution environment.

  18. Validation and verification of lawful water use in South Africa: An overview of the process in the KwaZulu-Natal Province

    NASA Astrophysics Data System (ADS)

    Kapangaziwiri, E.; Mwenge Kahinda, J.; Dzikiti, S.; Ramoelo, A.; Cho, M.; Mathieu, R.; Naidoo, M.; Seetal, A.; Pienaar, H.

    2018-06-01

    South Africa is a water-stressed country which has, over the years, strived to adopt a rational, just and equitable way to manage this limited resource. The National Water Act (Act No. 36 of 1998) (NWA) provides the legal framework to achieve this objective. Since 2003, the government has pursued a national process to validate (confirm the quantum of) and verify (establish the lawfulness of) water uses that exceed domestic requirements. The objective of the process is to determine how much water is allocated for (1) existing lawful use in accordance with specific requirements of the NWA, and (2) current water uses. The process identified users with or without registered use entitlements, determined whether claims for registered uses were correct, under-estimated, over-estimated or false, and confirmed the lawfulness of each water use in accordance with water legislation that pre-dated the NWA. The process included identifying land-based and non-land-based water uses (industrial, mining and bulk potable water supplies, irrigation, crop types and impoundments) using remote sensing (RS) techniques for both a qualifying period (defined as the two years before the enactment of the NWA) and the current period. Using this as a basis, volumetric crop irrigation requirements were then estimated using the South African Procedure for estimating irrigation WATer requirements (SAPWAT), while the Gush curves were used to quantify Stream Flow Reduction Activities (SFRAs) for commercially afforested areas. The boundaries of farm reservoirs were delineated from RS and the volumes calculated using a regression approach. Estimates of the irrigation water requirements, SFRAs and reservoir volumes formed the basis for interaction between the Department of Water and Sanitation (DWS) and water users to confirm their uses, and subsequently to update the DWS Water Authorisation and Registration Management System (WARMS), a database of water users. While WARMS initially indicated a total of approximately 16 000 registered users in the KwaZulu-Natal Province, the RS analysis identified up to 6000 potential additional water users, mostly currently unregistered, who are expected to be registered in the updated database. Despite certain methodological challenges and limitations, the process forms a critical basis for all other aspects of water management and informs macro- and micro-water resource planning, water allocation reform, as well as water use compliance, monitoring and enforcement.

  19. Estimating daily Landsat-scale evapotranspiration over a managed pine plantation in North Carolina, USA using a data fusion method

    USDA-ARS?s Scientific Manuscript database

    As a primary flux in the global water cycle, evapotranspiration (ET) connects hydrologic and biological processes and is directly affected by water management, land use change and climate change. The two source energy balance (TSEB) model has been widely applied to quantify field scale ET using sate...

  20. Daily Landsat-scale evapotranspiration estimation over a managed pine plantation in North Carolina, USA using multi-satellite data fusion

    USDA-ARS?s Scientific Manuscript database

    As a primary flux in the global water cycle, evapotranspiration (ET) connects hydrologic and biological processes and is directly affected by water and land management, land use change and climate variability. The Two Source Energy Balance (TSEB) model has been widely applied to quantify field- to g...

  1. 76 FR 81851 - Fisheries Off West Coast States; West Coast Salmon Fisheries; Amendment 16 to the Salmon Fishery...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... currently the best available science. The MSA requires that management decisions be based on the best available science. The FMP as amended by Amendment 16 provides a process for changing estimates of MSY if..., the Council manages Klamath Basin on an aggregate basis using the best available science. The...

  2. Print Still Matters in an E-Learning World, and Training Companies Need to Properly Manage It

    ERIC Educational Resources Information Center

    Kriesen, Gretchen L.

    2011-01-01

    This report demonstrates how the application of Behavioral Systems Analysis (BSA) methods assisted in assessing a small training company's Print Production Management (PPM) system. PPM is the process by which printed materials are conceptualized, estimated, released to a commercial printer, proofed, and delivered to the client. The current PPM…

  3. The application of risk management in sport.

    PubMed

    Fuller, Colin; Drawer, Scott

    2004-01-01

    The process of risk management can be implemented as part of a best practice management system within the sport and leisure sector. The process enables risk factors that might lead to injuries to be identified and the levels of risk associated with activities to be estimated and evaluated. This information can be utilised proactively by sports governing bodies and participants to identify preventive and therapeutic interventions in order to reduce the frequency of occurrence and/or severity of injuries within their sports. The acceptability of risk within specific sports, however, is dependent on the perceptions of the participants involved. Copyright 2004 Adis Data Information BV

  4. VA Construction: Improved Processes Needed to Monitor Contract Modifications, Develop Schedules, and Estimate Costs

    DTIC Science & Technology

    2017-03-01

    address challenges in managing projects to build medical facilities. In response to statutory requirements and additional congressional direction, VA...is outsourcing management of certain such projects to the U.S. Army Corps of Engineers (USACE). As of October 2016, VA had 23 ongoing projects...costing $100 million or more. VA and USACE have entered into interagency agreements for 12 of these 23 projects. The agreements entail USACE’s managing

  5. Uncertainty Representation and Interpretation in Model-Based Prognostics Algorithms Based on Kalman Filter Estimation

    NASA Technical Reports Server (NTRS)

    Galvan, Jose Ramon; Saxena, Abhinav; Goebel, Kai Frank

    2012-01-01

    This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies based on our experience with Kalman filters when applied to prognostics for electronics components. In particular, it explores the implications of modeling remaining useful life prediction as a stochastic process, and how this relates to uncertainty representation, management, and the role of prognostics in decision-making. A distinction between the two interpretations of the estimated remaining useful life probability density function is explained, and a cautionary argument is provided against mixing the two interpretations when prognostics are used in making critical decisions.
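
    A minimal sketch of the general approach, assuming a scalar Kalman filter tracking a linearly degrading health indicator with a point RUL estimate at a failure threshold; the constants are hypothetical and this is not the authors' model:

        import numpy as np

        # Scalar Kalman filter tracking a linearly drifting health indicator;
        # RUL is the predicted time until the state estimate crosses a failure
        # threshold. Drift, noise levels, and threshold are hypothetical.
        dt, drift = 1.0, 0.05            # time step and degradation per step
        q, r = 1e-4, 0.04                # process and measurement noise variances
        threshold = 10.0

        rng = np.random.default_rng(0)
        x, P, true_x = 0.0, 1.0, 0.0     # state estimate, variance, true state
        for t in range(60):
            true_x += drift * dt + rng.normal(0, q ** 0.5)
            z = true_x + rng.normal(0, r ** 0.5)        # noisy measurement
            x_pred, P_pred = x + drift * dt, P + q      # predict
            K = P_pred / (P_pred + r)                   # Kalman gain
            x, P = x_pred + K * (z - x_pred), (1 - K) * P_pred  # update

        rul = max(threshold - x, 0.0) / drift           # point RUL in steps
        print(f"health {x:.2f} +/- {P ** 0.5:.2f}, RUL ~ {rul:.0f} steps")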

  6. Department of the Army Cost Analysis Manual

    DTIC Science & Technology

    2001-05-01

    SECTION I - AUTOMATED COST ESTIMATING INTEGRATED TOOLS (ACEIT) ... SECTION II - AUTOMATED... Management & Comptroller) endorsed the Automated Cost Estimating Integrated Tools (ACEIT) model and since it is widely used to prepare POEs, CCAs and... CRB IPT (in ACEIT) will be the basis for information contained in the CAB. Any remaining unresolved issues from the IPT process will be raised at the

  7. Does recyclable separation reduce the cost of municipal waste management in Japan?

    PubMed

    Chifari, Rosaria; Lo Piano, Samuele; Matsumoto, Shigeru; Tasaki, Tomohiro

    2017-02-01

    Municipal solid waste (MSW) management is a system involving multiple sub-systems that typically require demanding inputs, materials and resources to properly process generated waste throughput. For this reason, MSW management is generally one of the most expensive services provided by municipalities. In this paper, we analyze the Japanese MSW management system and estimate the cost elasticity with respect to the waste volumes at three treatment stages: collection, processing, and disposal. Although we observe economies of scale at all three stages, the collection cost is less elastic than the disposal cost. We also examine whether source separation at home affects the cost of MSW management. The empirical results show that the separate collection of the recyclable fraction leads to reduced processing costs at intermediate treatment facilities, but does not change the overall waste management cost. Our analysis also reveals that the cost of waste management systems decreases when the service is provided by private companies through a public tender. The cost decreases even more when the service is performed under the coordination of adjacent municipalities. Copyright © 2017 Elsevier Ltd. All rights reserved.
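
    Cost elasticity of the kind estimated here is commonly obtained as the slope of a log-log regression of cost on waste volume, with a slope below 1 indicating economies of scale. A minimal Python sketch with hypothetical municipal data:

        import numpy as np

        # Cost elasticity as the slope of log(cost) on log(volume); a slope
        # below 1 indicates economies of scale. Hypothetical municipal data.
        volume_t = np.array([5_000, 12_000, 30_000, 75_000, 160_000])   # waste handled (t/yr)
        cost = np.array([9.0e7, 1.8e8, 3.9e8, 8.0e8, 1.5e9])            # annual cost

        elasticity, _ = np.polyfit(np.log(volume_t), np.log(cost), 1)
        print(f"estimated cost elasticity: {elasticity:.2f}")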

  8. Role of mathematical models in assessment of risk and in attempts to define management strategy.

    PubMed

    Flamm, W G; Winbush, J S

    1984-06-01

    Risk assessment of food-borne carcinogens is becoming a common practice at FDA. Actual risk is not being estimated, only the upper limit of risk. The risk assessment process involves a large number of steps and assumptions, many of which affect the numerical value estimated. The mathematical model which is to be applied is only one of the factors which affect these numerical values. To fulfill the policy objective of using the "worst plausible case" in estimating the upper limit of risk, recognition needs to be given to a proper balancing of assumptions and decisions. Interaction between risk assessors and risk managers should avoid making or giving the appearance of making specific technical decisions such as the choice of the mathematical model. The importance of this emerging field is too great to jeopardize it by inappropriately mixing scientific judgments with policy judgments. The risk manager should understand fully the points and range of uncertainty involved in arriving at the estimates of risk which must necessarily affect the choice of the policy or regulatory options available.

  9. An estimation framework for building information modeling (BIM)-based demolition waste by type.

    PubMed

    Kim, Young-Chan; Hong, Won-Hwa; Park, Jae-Woo; Cha, Gi-Wook

    2017-12-01

    Most existing studies on demolition waste (DW) quantification do not have an official standard to estimate the amount and type of DW. Therefore, there are limitations in the existing literature for estimating DW with a consistent classification system. Building information modeling (BIM) is a technology that can generate and manage all the information required during the life cycle of a building, from design to demolition. Nevertheless, there has been a lack of research regarding its application to the demolition stage of a building. For an effective waste management plan, the estimation of the type and volume of DW should begin from the building design stage. However, the lack of tools hinders an early estimation. This study proposes a BIM-based framework that estimates DW in the early design stages, to achieve effective and streamlined planning, processing, and management. Specifically, the input of construction materials in the Korean construction classification system and those in the BIM library were matched. Based on this matching integration, the estimates of DW by type were calculated by applying the weight/unit volume factors and the rates of DW volume change. To verify the framework, its operation was demonstrated by means of an actual BIM model and by comparing its results with those available in the literature. This study is expected to contribute not only to the estimation of DW at the building level, but also to the automated estimation of DW at the district level.
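
    A minimal sketch of the framework's final calculation step as described above: modeled material volumes are converted to waste masses with weight/unit-volume factors and bulked with volume-change rates. All factors and quantities below are hypothetical:

        # Convert BIM material take-off volumes (m3) to demolition-waste
        # masses and bulked volumes using weight/unit-volume factors (t/m3)
        # and volume-change rates. All factors and quantities are hypothetical.
        takeoff_m3 = {"concrete": 420.0, "brick": 85.0, "timber": 30.0}
        unit_weight = {"concrete": 2.4, "brick": 1.9, "timber": 0.6}     # t/m3
        volume_change = {"concrete": 1.4, "brick": 1.3, "timber": 1.1}   # bulking rate

        for material, vol in takeoff_m3.items():
            mass_t = vol * unit_weight[material]
            bulked_m3 = vol * volume_change[material]
            print(f"{material}: {mass_t:.1f} t, {bulked_m3:.1f} m3 after demolition")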

  10. Evaluating changes to reservoir rule curves using historical water-level data

    USGS Publications Warehouse

    Mower, Ethan; Miranda, Leandro E.

    2013-01-01

    Flood control reservoirs are typically managed through rule curves (i.e. target water levels) which control the storage and release timing of flood waters. Changes to rule curves are often contemplated and requested by various user groups and management agencies with no information available about the actual flood risk of such requests. Methods of estimating flood risk in reservoirs are not easily available to those unfamiliar with hydrological models that track water movement through a river basin. We developed a quantile regression model that uses readily available daily water-level data to estimate risk of spilling. Our model provided a relatively simple process for estimating the maximum applicable water level under a specific flood risk for any day of the year. This water level represents an upper-limit umbrella under which water levels can be operated in a variety of ways. Our model allows the visualization of water-level management under a user-specified flood risk and provides a framework for incorporating the effect of a changing environment on water-level management in reservoirs, but is not designed to replace existing hydrological models. The model can improve communication and collaboration among agencies responsible for managing natural resources dependent on reservoir water levels.
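
    A quantile regression of the kind described can be fit directly from daily water-level records. A minimal Python sketch, assuming statsmodels and a seasonal (sine/cosine) day-of-year term; the data are synthetic:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Fit a 95th-percentile regression of daily water level on a seasonal
        # day-of-year term; the fitted curve approximates the highest level
        # reached on each date under a 5% risk of exceedance. Synthetic data.
        rng = np.random.default_rng(1)
        day = np.tile(np.arange(1, 366), 20)            # 20 years of daily records
        level = 100 + 5 * np.sin(2 * np.pi * day / 365) + rng.gamma(2.0, 1.5, day.size)
        df = pd.DataFrame({
            "level": level,
            "s": np.sin(2 * np.pi * day / 365),
            "c": np.cos(2 * np.pi * day / 365),
        })

        fit = smf.quantreg("level ~ s + c", df).fit(q=0.95)
        print(fit.params)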

  11. Self-Centered Management Skills and Knowledge Appropriation by Students in High Schools and Private Secondary Schools of the City of Maroua

    ERIC Educational Resources Information Center

    Oyono, Tadjuidje Michel

    2016-01-01

    Knowledge in its process of appropriation necessitates on the part of the learner, the mobilization of an efficient management strategy of adapted competencies. The present article in its problematic presents the theoretical perspective of Desaunay (1985) which estimates that three fundamental competences (relational, technical and affective) have…

  12. Mark-recapture estimation of snag standing rates in northern Arizona mixed-conifer and ponderosa pine forests

    Treesearch

    Joseph L. Ganey; Gary C. White; Jeffrey S. Jenness; Scott C. Vojta

    2015-01-01

    Snags (standing dead trees) are important components of forests that provide resources for numerous species of wildlife and contribute to decay dynamics and other ecological processes. Managers charged with managing populations of snags need information about standing rates of snags and factors influencing those rates, yet such data are limited for ponderosa pine (...

  13. The estimation of the measurement results with using statistical methods

    NASA Astrophysics Data System (ADS)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A number of international standards and guides describe various statistical methods that can be applied to the management, control, and improvement of processes in the analysis of technical measurement results. An analysis of international standards and guides on statistical methods for the estimation of measurement results, with recommendations for their application in laboratories, is described. To support this analysis, cause-and-effect (Ishikawa) diagrams concerning the application of statistical methods to the estimation of measurement results were constructed.

  14. Atmospheric dispersion prediction and source estimation of hazardous gas using artificial neural network, particle swarm optimization and expectation maximization

    NASA Astrophysics Data System (ADS)

    Qiu, Sihang; Chen, Bin; Wang, Rongxiao; Zhu, Zhengqiu; Wang, Yuan; Qiu, Xiaogang

    2018-04-01

    Hazardous gas leak accidents pose a potential threat to human beings. Predicting atmospheric dispersion and estimating its source are becoming increasingly important in emergency management. Current dispersion prediction and source estimation models cannot satisfy the requirements of emergency management because they do not provide both high efficiency and high accuracy at the same time. In this paper, we develop a fast and accurate dispersion prediction and source estimation method based on an artificial neural network (ANN), particle swarm optimization (PSO) and expectation maximization (EM). The novel method uses a large number of pre-determined scenarios to train the ANN for dispersion prediction, so that the ANN can predict concentration distributions accurately and efficiently. PSO and EM are applied for estimating the source parameters, which can effectively accelerate the process of convergence. The method is verified by the Indianapolis field study with an SF6 release source. The results demonstrate the effectiveness of the method.

  15. Software Transition Project Retrospectives and the Application of SEL Effort Estimation Model and Boehm's COCOMO to Complex Software Transition Projects

    NASA Technical Reports Server (NTRS)

    McNeill, Justin

    1995-01-01

    The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger than anticipated design work on software to be ported.
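
    For context, Boehm's basic COCOMO relates effort to size by a power law, effort = a * KLOC**b. The sketch below uses the published basic-model coefficients; it is an illustration, not the SEL-style model the paper applied:

        # Basic COCOMO: effort (person-months) = a * KLOC ** b, with Boehm's
        # published coefficients by project mode. The 50 KLOC size is hypothetical.
        COEFFICIENTS = {
            "organic":       (2.4, 1.05),
            "semi-detached": (3.0, 1.12),
            "embedded":      (3.6, 1.20),
        }

        def cocomo_effort(kloc, mode="semi-detached"):
            a, b = COEFFICIENTS[mode]
            return a * kloc ** b

        print(f"{cocomo_effort(50):.0f} person-months")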

  16. Fewer pallets reaching landfills, more are processed for recovery

    Treesearch

    Robert J. Bush; Daryl T. Corr; Philip A. Araman

    2001-01-01

    The Virginia Tech Center for Forest Products Marketing and Management estimates that the pallet industry received 171 million pallets in 1995 (containing 2.6 billion board feet of wood) for the purpose of reuse and recycling. We estimate that this figure grew to just short of 300 million pallets in 1999. This means that over one-third of the material used for pallets...

  17. Project risk management in the construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya

    2018-03-01

    This paper presents project risk management methods that allow risks in the construction of high-rise buildings to be identified more effectively and managed throughout the life cycle of the project. One of the project risk management processes is a quantitative analysis of risks. The quantitative analysis usually includes an assessment of the potential impact of project risks and of their probabilities. This paper reviews the most popular methods of risk probability assessment and indicates the advantages of the robust approach over the traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and extended to the tasks of regression analysis of project data. The suggested algorithms for assessing the parameters of statistical models make it possible to obtain reliable estimates. A review of the theoretical problems in the development of robust models built on the methodology of minimax estimates was carried out, and an algorithm for the situation of asymmetric "contamination" was developed.
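
    A common concrete realization of Huber's robust approach is regression with the Huber loss, which down-weights outlying ("contaminated") observations. A minimal Python sketch with synthetic data, assuming scikit-learn; this is an illustration, not the authors' algorithm:

        import numpy as np
        from sklearn.linear_model import HuberRegressor, LinearRegression

        # Ordinary least squares vs. Huber's robust regression on data with
        # one-sided "contamination" (a few inflated observations). Synthetic data.
        rng = np.random.default_rng(2)
        X = rng.uniform(1, 10, size=(80, 1))
        y = 3.0 * X.ravel() + rng.normal(0, 0.5, 80)
        y[:8] += rng.uniform(10, 20, 8)                # asymmetric outliers

        ols = LinearRegression().fit(X, y)
        huber = HuberRegressor().fit(X, y)
        print(f"OLS slope:   {ols.coef_[0]:.2f}")      # pulled away by the outliers
        print(f"Huber slope: {huber.coef_[0]:.2f}")    # closer to the true 3.0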

  18. The importance of parameterization when simulating the hydrologic response of vegetative land-cover change

    NASA Astrophysics Data System (ADS)

    White, Jeremy; Stengel, Victoria; Rendon, Samuel; Banta, John

    2017-08-01

    Computer models of hydrologic systems are frequently used to investigate the hydrologic response of land-cover change. If the modeling results are used to inform resource-management decisions, then providing robust estimates of uncertainty in the simulated response is an important consideration. Here we examine the importance of parameterization, a necessarily subjective process, on uncertainty estimates of the simulated hydrologic response of land-cover change. Specifically, we applied the soil water assessment tool (SWAT) model to a 1.4 km2 watershed in southern Texas to investigate the simulated hydrologic response of brush management (the mechanical removal of woody plants), a discrete land-cover change. The watershed was instrumented before and after brush-management activities were undertaken, and estimates of precipitation, streamflow, and evapotranspiration (ET) are available; these data were used to condition and verify the model. The role of parameterization in brush-management simulation was evaluated by constructing two models, one with 12 adjustable parameters (reduced parameterization) and one with 1305 adjustable parameters (full parameterization). Both models were subjected to global sensitivity analysis as well as Monte Carlo and generalized likelihood uncertainty estimation (GLUE) conditioning to identify important model inputs and to estimate uncertainty in several quantities of interest related to brush management. Many realizations from both parameterizations were identified as behavioral in that they reproduce daily mean streamflow acceptably well according to Nash-Sutcliffe model efficiency coefficient, percent bias, and coefficient of determination. However, the total volumetric ET difference resulting from simulated brush management remains highly uncertain after conditioning to daily mean streamflow, indicating that streamflow data alone are not sufficient to inform the model inputs that influence the simulated outcomes of brush management the most. Additionally, the reduced-parameterization model grossly underestimates uncertainty in the total volumetric ET difference compared to the full-parameterization model; total volumetric ET difference is a primary metric for evaluating the outcomes of brush management. The failure of the reduced-parameterization model to provide robust uncertainty estimates demonstrates the importance of parameterization when attempting to quantify uncertainty in land-cover change simulations.

  19. The importance of parameterization when simulating the hydrologic response of vegetative land-cover change

    USGS Publications Warehouse

    White, Jeremy; Stengel, Victoria G.; Rendon, Samuel H.; Banta, John

    2017-01-01

    Computer models of hydrologic systems are frequently used to investigate the hydrologic response of land-cover change. If the modeling results are used to inform resource-management decisions, then providing robust estimates of uncertainty in the simulated response is an important consideration. Here we examine the importance of parameterization, a necessarily subjective process, on uncertainty estimates of the simulated hydrologic response of land-cover change. Specifically, we applied the soil water assessment tool (SWAT) model to a 1.4 km2 watershed in southern Texas to investigate the simulated hydrologic response of brush management (the mechanical removal of woody plants), a discrete land-cover change. The watershed was instrumented before and after brush-management activities were undertaken, and estimates of precipitation, streamflow, and evapotranspiration (ET) are available; these data were used to condition and verify the model. The role of parameterization in brush-management simulation was evaluated by constructing two models, one with 12 adjustable parameters (reduced parameterization) and one with 1305 adjustable parameters (full parameterization). Both models were subjected to global sensitivity analysis as well as Monte Carlo and generalized likelihood uncertainty estimation (GLUE) conditioning to identify important model inputs and to estimate uncertainty in several quantities of interest related to brush management. Many realizations from both parameterizations were identified as behavioral in that they reproduce daily mean streamflow acceptably well according to Nash–Sutcliffe model efficiency coefficient, percent bias, and coefficient of determination. However, the total volumetric ET difference resulting from simulated brush management remains highly uncertain after conditioning to daily mean streamflow, indicating that streamflow data alone are not sufficient to inform the model inputs that influence the simulated outcomes of brush management the most. Additionally, the reduced-parameterization model grossly underestimates uncertainty in the total volumetric ET difference compared to the full-parameterization model; total volumetric ET difference is a primary metric for evaluating the outcomes of brush management. The failure of the reduced-parameterization model to provide robust uncertainty estimates demonstrates the importance of parameterization when attempting to quantify uncertainty in land-cover change simulations.

  20. Establishing crash modification factors and their use.

    DOT National Transportation Integrated Search

    2014-08-01

    A critical component in the American Association of State Highway and Transportation Officials (AASHTO) Highway Safety Manual (HSM) safety management process is the Crash Modification Factor (CMF). It is used to estimate the change in the expected (ave...
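
    In HSM-style practice, a CMF scales the expected crash frequency, and multiple treatments are commonly combined by multiplying their CMFs. A minimal Python sketch with hypothetical values:

        # A CMF scales the expected crash frequency; several treatments are
        # commonly combined by multiplying their CMFs. Values are hypothetical.
        expected_before = 6.4            # predicted crashes/yr without treatment
        cmfs = [0.85, 0.92]              # e.g. shoulder widening, rumble strips

        combined = 1.0
        for cmf in cmfs:
            combined *= cmf
        print(f"after treatment: {expected_before * combined:.2f} crashes/yr")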

  1. Guidance on Systematic Planning Using the Data Quality Objectives Process, EPA QA/G-4

    EPA Pesticide Factsheets

    Provides a standard working tool for project managers and planners to develop DQO for determining the type, quantity, and quality of data needed to reach defensible decisions or make credible estimates.

  2. Evaluation of standard methods for collecting and processing fuel moisture samples

    Treesearch

    Sally M. Haase; José Sánchez; David R. Weise

    2016-01-01

    A variety of techniques for collecting and processing samples to determine moisture content of wildland fuels in support of fire management activities were evaluated. The effects of using a chainsaw or handsaw to collect samples of largediameter wood, containers for storing and transporting collected samples, and quick-response ovens for estimating moisture content...

  3. Processing of next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data for the DuPage County streamflow simulation system

    USGS Publications Warehouse

    Bera, Maitreyee; Ortel, Terry W.

    2018-01-12

    The U.S. Geological Survey, in cooperation with DuPage County Stormwater Management Department, is testing a near real-time streamflow simulation system that assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek and West Branch DuPage River drainage basins in DuPage County, Illinois. As part of this effort, the U.S. Geological Survey maintains a database of hourly meteorological and hydrologic data for use in this near real-time streamflow simulation system. Among these data are next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data, which are retrieved from the North Central River Forecasting Center of the National Weather Service. The DuPage County streamflow simulation system uses these quantitative precipitation forecast data to create streamflow predictions for the two simulated drainage basins. This report discusses in detail how these data are processed for inclusion in the Watershed Data Management files used in the streamflow simulation system for the Salt Creek and West Branch DuPage River drainage basins.

  4. Application of a distributed process-based hydrologic model to estimate the effects of forest road density on stormflows in the Southern Appalachians

    Treesearch

    Salli F. Dymond; W. Michael Aust; Stephen P. Prisley; Mark H. Eisenbies; James M. Vose

    2014-01-01

    Managed forests have historically been linked to watershed protection and flood mitigation. Research indicates that forests can potentially minimize peak flows during storm events, yet the relationship between forests and flooding is complex. Forest roads, usually found in managed systems, can potentially magnify the effects of forest harvesting on water yields. The...

  5. Evaluating abundance and trends in a Hawaiian avian community using state-space analysis

    USGS Publications Warehouse

    Camp, Richard J.; Brinck, Kevin W.; Gorresen, P.M.; Paxton, Eben H.

    2016-01-01

    Estimating population abundances and patterns of change over time are important in both ecology and conservation. Trend assessment typically entails fitting a regression to a time series of abundances to estimate population trajectory. However, changes in abundance estimates from year to year are due to both true variation in population size (process variation) and variation due to imperfect sampling and model fit. State-space models are a relatively new method that can be used to partition the error components and quantify trends based only on process variation. We compare a state-space modelling approach with a more traditional linear regression approach to assess trends in uncorrected raw counts and detection-corrected abundance estimates of forest birds at Hakalau Forest National Wildlife Refuge, Hawai‘i. Most species demonstrated similar trends using either method. In general, evidence for trends using state-space models was less strong than for linear regression, as measured by estimates of precision. However, while the state-space models may sacrifice precision, the expectation is that these estimates provide a better representation of the real-world biological processes of interest because they partition process variation (environmental and demographic variation) and observation variation (sampling and model variation). The state-space approach also provides annual estimates of abundance, which can be used by managers to set conservation strategies, and can be linked to factors that vary by year, such as climate, to better understand processes that drive population trends.

  6. Estimation of pollutant loads considering dam operation in Han River Basin by BASINS/Hydrological Simulation Program-FORTRAN.

    PubMed

    Jung, Kwang-Wook; Yoon, Choon-G; Jang, Jae-Ho; Kong, Dong-Soo

    2008-01-01

    Effective watershed management often demands qualitative and quantitative predictions of the effects of future management activities as arguments for policy makers and administrators. The BASINS geographic information system was developed to compute total maximum daily loads, which are helpful for establishing hydrological process and water quality modeling systems. In this paper, the BASINS toolkit HSPF model is applied to the large (20,271 km2) watershed of the Han River Basin to assess the applicability of HSPF and of BMP scenarios. For proper evaluation of watershed and stream water quality, comprehensive estimation methods are necessary to assess large amounts of point-source and nonpoint-source (NPS) pollution based on the total watershed area. In this study, the Hydrological Simulation Program-FORTRAN (HSPF) was used to simulate watershed pollutant loads, accounting for dam operation, and BMP scenarios were applied to control NPS pollution. The 8-day monitoring data (about three years) were used in the calibration and verification processes. Model performance was in the range of "very good" and "good" based on percent difference. The water-quality simulation results were encouraging for this sizable watershed with dam operation practice and mixed land uses; HSPF proved adequate, and its application is recommended to simulate watershed processes and for BMP evaluation. IWA Publishing 2008.

  7. Making do with less: Must sparse data preclude informed harvest strategies for European waterbirds?

    USGS Publications Warehouse

    Johnson, Fred A.; Alhainen, Mikko; Fox, Anthony D.; Madsen, Jesper; Guillemain, Matthieu

    2018-01-01

    The demography of many European waterbirds is not well understood because most countries have conducted little monitoring and assessment, and coordination among countries on waterbird management has little precedent. Yet intergovernmental treaties now mandate the use of sustainable, adaptive harvest strategies, whose development is challenged by a paucity of demographic information. In this study, we explore how a combination of allometric relationships, fragmentary monitoring and research information, and expert judgment can be used to estimate the parameters of a theta-logistic population model, which in turn can be used in a Markov decision process to derive optimal harvesting strategies. We show how to account for considerable parametric uncertainty, as well as for different management objectives. We illustrate our methodology with a poorly understood population of taiga bean geese (Anser fabalis fabalis), which is a popular game bird in Fennoscandia. Our results for taiga bean geese suggest that they may have demographic rates similar to other, well-studied species of geese, and our model-based predictions of population size are consistent with the limited monitoring information available. Importantly, we found that by using a Markov decision process, a simple scalar population model may be sufficient to guide harvest management of this species, even if its demography is age-structured. Finally, we demonstrated how two different management objectives can lead to very different optimal harvesting strategies, and how conflicting objectives may be traded off with each other. This approach will have broad application for European waterbirds by providing preliminary estimates of key demographic parameters, by providing insights into the monitoring and research activities needed to corroborate those estimates, and by producing harvest management strategies that are optimal with respect to the managers’ objectives, options, and available demographic information.
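
    A minimal sketch of a theta-logistic projection with a fixed annual harvest, the kind of scalar model the study suggests may suffice; all parameter values are hypothetical, not estimates for taiga bean geese:

        # Theta-logistic projection with a fixed annual harvest:
        # N[t+1] = N[t] + r_max * N[t] * (1 - (N[t]/K)**theta) - harvest.
        # All parameter values are hypothetical.
        r_max, K, theta, harvest = 0.15, 80_000, 1.2, 5_000

        N = 60_000.0
        for year in range(1, 11):
            N = max(N + r_max * N * (1 - (N / K) ** theta) - harvest, 0.0)
            print(f"year {year}: {N:,.0f}")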

  8. Requirements Flowdown for Prognostics and Health Management

    NASA Technical Reports Server (NTRS)

    Goebel, Kai; Saxena, Abhinav; Roychoudhury, Indranil; Celaya, Jose R.; Saha, Bhaskar; Saha, Sankalita

    2012-01-01

    Prognostics and Health Management (PHM) principles have considerable promise to change the game of lifecycle cost of engineering systems at high safety levels by providing a reliable estimate of future system states. This estimate is a key for planning and decision making in an operational setting. While technology solutions have made considerable advances, the tie-in into the systems engineering process is lagging behind, which delays fielding of PHM-enabled systems. The derivation of specifications from high level requirements for algorithm performance to ensure quality predictions is not well developed. From an engineering perspective some key parameters driving the requirements for prognostics performance include: (1) maximum allowable Probability of Failure (PoF) of the prognostic system to bound the risk of losing an asset, (2) tolerable limits on proactive maintenance to minimize missed opportunity of asset usage, (3) lead time to specify the amount of advanced warning needed for actionable decisions, and (4) required confidence to specify when prognosis is sufficiently good to be used. This paper takes a systems engineering view towards the requirements specification process and presents a method for the flowdown process. A case study based on an electric Unmanned Aerial Vehicle (e-UAV) scenario demonstrates how top level requirements for performance, cost, and safety flow down to the health management level and specify quantitative requirements for prognostic algorithm performance.

  9. Environmental Cost Analysis System (ECAS) Status and Compliance Requirements for EM Consolidated Business Center Contracts - 13204

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanford, P.C.; Moe, M.A.; Hombach, W.G.

    2013-07-01

    The Department of Energy (DOE) Office of Environmental Management (EM) has developed a web-accessible database to collect actual cost data from completed EM projects to support cost estimating and analysis. This Environmental Cost Analysis System (ECAS) database was initially deployed in early 2009 containing the cost and parametric data from 77 decommissioning, restoration, and waste management projects completed under the Rocky Flats Closure Project. In subsequent years we have added many more projects to ECAS and now have a total of 280 projects from 8 major DOE sites. This data is now accessible to DOE users through a web-based reportingmore » tool that allows users to tailor report outputs to meet their specific needs. We are using it as a principal resource supporting the EM Consolidated Business Center (EMCBC) and the EM Applied Cost Engineering (ACE) team cost estimating and analysis efforts across the country. The database has received Government Accountability Office review as supporting its recommended improvements in DOE's cost estimating process, as well as review from the DOE Office of Acquisition and Project Management (APM). Moving forward, the EMCBC has developed a Special Contract Requirement clause or 'H-Clause' to be included in all current and future EMCBC procurements identifying the process that contractors will follow to provide DOE their historical project data in a format compatible with ECAS. Changes to DOE O 413.3B implementation are also in progress to capture historical costs as part of the Critical Decision project closeout process. (authors)« less

  10. Impact of biology knowledge on the conservation and management of large pelagic sharks.

    PubMed

    Yokoi, Hiroki; Ijima, Hirotaka; Ohshimo, Seiji; Yokawa, Kotaro

    2017-09-06

    Population growth rate, which depends on several biological parameters, is valuable information for the conservation and management of pelagic sharks, such as blue and shortfin mako sharks. However, reported biological parameters for estimating the population growth rates of these sharks differ by sex and display large variability. To estimate the appropriate population growth rate and clarify relationships between growth rate and relevant biological parameters, we developed a two-sex age-structured matrix population model and estimated the population growth rate using combinations of biological parameters. We performed an elasticity analysis to clarify the sensitivity of the population growth rate. For the blue shark, the estimated median population growth rate was 0.384 with a range of minimum and maximum values of 0.195-0.533, whereas those values for the shortfin mako shark were 0.102 and 0.007-0.318, respectively. The maturity age of male sharks had the largest impact for blue sharks, whereas that of female sharks had the largest impact for shortfin mako sharks. Hypotheses about the survival process of sharks also had a large impact on the population growth rate estimation. Both shark maturity age and survival rate were based on ageing validation data, indicating the importance of validating the quality of these data for the conservation and management of large pelagic sharks.
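
    In a matrix population model, the asymptotic population growth rate is the dominant eigenvalue of the projection matrix. A minimal Python sketch with a hypothetical female-only Leslie matrix (the study's model is two-sex and more detailed):

        import numpy as np

        # Asymptotic population growth rate from an age-structured (Leslie)
        # projection matrix: the dominant eigenvalue. Fecundity and survival
        # values below are hypothetical, not shark estimates.
        L = np.array([
            [0.0, 0.0, 8.0, 12.0],   # age-specific fecundities
            [0.6, 0.0, 0.0, 0.0],    # survival, age 0 -> 1
            [0.0, 0.7, 0.0, 0.0],    # survival, age 1 -> 2
            [0.0, 0.0, 0.8, 0.0],    # survival, age 2 -> 3
        ])

        lam = max(abs(np.linalg.eigvals(L)))
        print(f"lambda = {lam:.3f}, r = ln(lambda) = {np.log(lam):.3f}")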

  11. Sources and sinks of carbon in boreal ecosystems of interior Alaska: a review

    USGS Publications Warehouse

    Douglas, Thomas A.; Jones, Miriam C.; Hiemstra, Christopher A.

    2014-01-01

    Boreal regions store large quantities of carbon but are increasingly vulnerable to carbon loss due to disturbance and climate warming. The boreal region, underlain by discontinuous permafrost, presents a challenging landscape for itemizing current and potential carbon sources and sinks in the boreal soil and vegetation. The roles of fire, forest succession, and the presence (or absence) of permafrost on carbon cycle, vegetation, and hydrologic processes have been the focus of multidisciplinary research in this area for the past 20 years. However, projections of a warming future climate, an increase in fire severity and extent, and the potential degradation of permafrost could lead to major landscape process changes over the next 20 to 50 years. This provides a major challenge for predicting how the interplay between land management activities and impacts of climate warming will affect carbon sources and sinks in Interior Alaska. To assist land managers in adapting and managing for potential changes in the Interior Alaska carbon cycle we developed this review paper incorporating an overview of the climate, ecosystem processes, vegetation types, and soil regimes in Interior Alaska with a focus on ramifications for the carbon cycle. Our objective is to provide a synthesis of the most current carbon storage estimates and measurements to support policy and land management decisions on how to best manage carbon sources and sinks in Interior Alaska. To support this we have surveyed relevant peer reviewed estimates of carbon stocks in aboveground and belowground biomass for Interior Alaska boreal ecosystems. We have also summarized methane and carbon dioxide fluxes from the same ecosystems. These data have been converted into the same units to facilitate comparison across ecosystem compartments. We identify potential changes in the carbon cycle with climate change and human disturbance including how compounding disturbances can affect the boreal system. Finally, we provide recommendations to address the challenges facing land managers in efforts to manage carbon cycle processes. The results of this study can be used for carbon cycle management in other locations within the boreal biome which encompass a broad distribution from 45° to 83° north.

  12. An evaluation and comparison of conservation guidelines for an at-risk migratory songbird

    USGS Publications Warehouse

    McNeil, Darin J.; Aldinger, Kyle R.; Bakermans, Marja H.; Lehman, Justin A.; Tisdale, Anna C.; Jones, John A.; Wood, Petra B.; Buehler, David A.; Smalling, Curtis G.; Siefferman, Lynn; Larkin, Jeffrey L.

    2017-01-01

    For at-risk wildlife species, it is important to consider conservation within the process of adaptive management. Golden-winged Warblers (Vermivora chrysoptera) are Neotropical migratory songbirds that are experiencing long-term population declines due in part to the loss of early-successional nesting habitat. Recently developed Golden-winged Warbler habitat management guidelines are being implemented by the USDA Natural Resources Conservation Service (2014) and its partners through the Working Lands For Wildlife (WLFW) program. During 2012–2014, we studied the nesting ecology of Golden-winged Warblers in managed habitats of the eastern US that conformed to WLFW conservation practices. We evaluated five NRCS "management scenarios" with respect to nesting success and attainment of recommended nest-site vegetation conditions outlined in the Golden-winged Warbler breeding habitat guidelines. Using estimates of territory density, pairing rate, nest survival, and clutch size, we also estimated fledgling productivity (number of fledglings/ha) for each management scenario. In general, Golden-winged Warbler nest survival declined as each breeding season advanced, but nest survival was similar across management scenarios. Within each management scenario, vegetation variables had little influence on nest survival. Still, percent Rubus cover and density of >2 m tall shrubs were relevant in some management scenarios. All five management scenarios rarely attained recommended levels of nest-site vegetation conditions for Golden-winged Warblers, yet nest survival was high. Fledgling productivity estimates for each management scenario ranged from 2.1 to 8.6 fledglings/10 hectares. Our results indicate that targeted habitat management for Golden-winged Warblers using a variety of management techniques on private lands has the capability to yield high nest survival and fledgling productivity, and thus the potential to contribute to the species' recovery.

  13. Effects of model complexity and priors on estimation using sequential importance sampling/resampling for species conservation

    USGS Publications Warehouse

    Dunham, Kylee; Grand, James B.

    2016-01-01

    We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
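
    A minimal sketch of sequential importance sampling/resampling for a log-scale population state observed through noisy counts; parameters are treated as known and the kernel-smoothing step is omitted, so this is a simplified stand-in for the models compared above:

        import numpy as np

        # Bootstrap SISR filter for a log-scale population state observed
        # through noisy counts; parameters are treated as known, and the
        # kernel-smoothing step for parameter estimation is omitted.
        rng = np.random.default_rng(3)
        T, n_particles = 25, 2_000
        r, sigma_proc, sigma_obs = 0.05, 0.1, 0.2

        # Simulate count data with known parameters.
        log_n = np.empty(T)
        log_n[0] = np.log(500)
        for t in range(1, T):
            log_n[t] = log_n[t - 1] + r + rng.normal(0, sigma_proc)
        counts = np.exp(log_n + rng.normal(0, sigma_obs, T))

        # Propagate particles, weight by the observation density, resample.
        particles = rng.normal(np.log(500), 0.5, n_particles)
        for t in range(T):
            particles = particles + r + rng.normal(0, sigma_proc, n_particles)
            w = np.exp(-0.5 * ((np.log(counts[t]) - particles) / sigma_obs) ** 2)
            particles = particles[rng.choice(n_particles, n_particles, p=w / w.sum())]
        print(f"filtered abundance, final year: {np.exp(particles.mean()):,.0f}")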

  14. Estimating psychiatric manpower requirements based on patients' needs.

    PubMed

    Faulkner, L R; Goldman, C R

    1997-05-01

    To provide a better understanding of the complexities of estimating psychiatric manpower requirements, the authors describe several approaches to estimation and present a method based on patients' needs. A five-step method for psychiatric manpower estimation is used, with estimates of data pertinent to each step, to calculate the total psychiatric manpower requirements for the United States. The method is also used to estimate the hours of psychiatric service per patient per year that might be available under current psychiatric practice and under a managed care scenario. Depending on assumptions about data at each step in the method, the total psychiatric manpower requirements for the U.S. population range from 2,989 to 358,696 full-time-equivalent psychiatrists. The number of available hours of psychiatric service per patient per year is 14.1 hours under current psychiatric practice and 2.8 hours under the managed care scenario. The key to psychiatric manpower estimation lies in clarifying the assumptions that underlie the specific method used. Even small differences in assumptions mean large differences in estimates. Any credible manpower estimation process must include discussions and negotiations between psychiatrists, other clinicians, administrators, and patients and families to clarify the treatment needs of patients and the roles, responsibilities, and job description of psychiatrists.
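
    For illustration only, the hours-per-patient arithmetic that drives such estimates can be sketched as below; every input is a hypothetical assumption, not a figure from the study, which is exactly why the abstract stresses that small changes in assumptions move the result substantially.

```python
# Hypothetical illustration of the hours-per-patient arithmetic: none of
# these inputs come from the study; they only show how assumptions at each
# step of the method propagate into the final figure.
fte_psychiatrists = 40_000          # assumed national FTE supply
clinical_hours_per_fte = 1_500      # assumed direct-care hours per FTE per year
patients_in_treatment = 4_000_000   # assumed patients treated per year

available_hours = fte_psychiatrists * clinical_hours_per_fte
print(available_hours / patients_in_treatment)  # -> 15.0 hours/patient/year
```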

  15. [Administrative efficiency in the Mexican Fund for the Prevention of Catastrophic Expenditures in Health].

    PubMed

    Orozco-Núñez, Emanuel; Alcalde-Rabanal, Jaqueline; Navarro, Juan; Lozano, Rafael

    2016-01-01

    To show that the administrative regime of specialized hospitals influences the administrative processes used to operate the Mexican Fund for Catastrophic Expenditures in Health (FPGC, by its Spanish acronym), which finances health care for breast cancer, cervical cancer, and childhood leukemia. Administrative efficiency was estimated as the time from case notification to reimbursement. To estimate it, semistructured interviews were conducted with key actors involved in the management of cancer care financed by the FPGC. Additionally, a group of experts was convened to make recommendations for improving processes. Specialized hospitals with a decentralized scheme resolved the administrative process in less time than hospitals dependent on the State Health Services, where processing times and levels of intermediation were higher. The decentralized administrative scheme for specialized care is more efficient, as these hospitals tend to be more autonomous.

  16. Systematic review of epidemiological studies on health effects associated with management of solid waste

    PubMed Central

    2009-01-01

    Background Management of solid waste (mainly landfills and incineration) releases a number of toxic substances, most in small quantities and at extremely low levels. Because of the wide range of pollutants, the different pathways of exposure, long-term low-level exposure, and the potential for synergism among the pollutants, concerns remain about potential health effects, but many uncertainties are involved in the assessment. Our aim was to systematically review the available epidemiological literature on health effects in the vicinity of landfills and incinerators and among workers at waste processing plants, to derive usable excess risk estimates for health impact assessment. Methods We examined the published, peer-reviewed literature addressing health effects of waste management between 1983 and 2008. For each paper, we examined the study design and assessed potential biases in the effect estimates. We evaluated the overall evidence and graded the associated uncertainties. Results In most cases the overall evidence was inadequate to establish a relationship between a specific waste process and health effects; the evidence from occupational studies was not sufficient to make an overall assessment. For community studies, at least for some processes, there was limited evidence of a causal relationship, and a few studies were selected for a quantitative evaluation. In particular, for populations living within two kilometres of landfills there was limited evidence of increased risks of congenital anomalies and low birth weight, with excess risks of 2 percent and 6 percent, respectively. The excess risk tended to be higher when sites dealing with toxic wastes were considered. For populations living within three kilometres of old incinerators, there was limited evidence of an increased risk of cancer, with an estimated excess risk of 3.5 percent. The confidence in the evaluation and in the estimated excess risk tended to be higher for specific cancer forms, such as non-Hodgkin's lymphoma and soft tissue sarcoma, than for other cancers. Conclusions The studies we have reviewed suffer from many limitations due to poor exposure assessment, ecological level of analysis, and lack of information on relevant confounders. With a moderate level of confidence, however, we have derived some effect estimates that could be used for health impact assessment of old landfill and incineration plants. The uncertainties surrounding these numbers should be considered carefully when health effects are estimated. It is clear that future research into the health risks of waste management needs to overcome current limitations. PMID:20030820

  17. The role of population monitoring in the management of North American waterfowl

    USGS Publications Warehouse

    Nichols, J.D.; Williams, B.K.; Johnson, F.A.

    2000-01-01

    Despite the effort and expense devoted to large-scale monitoring programs, few existing programs have been designed with specific objectives in mind, and few permit strong inferences about the dynamics of monitored systems. The waterfowl population monitoring programs of the U.S. Fish and Wildlife Service, Canadian Wildlife Service, and state and provincial agencies provide a useful example with respect to program objectives, design, and implementation. The May Breeding Grounds Survey provides an estimate of system state (population size) that serves two primary purposes in the adaptive management process: identifying the appropriate time-specific management actions and updating the information state (model weights) by providing a basis for evaluating the predictions of competing models. Other waterfowl monitoring programs (e.g., banding program, hunter questionnaire survey, parts collection survey, winter survey) provide estimates of vital rates (rates of survival, reproduction, and movement) associated with system dynamics, and of variables associated with management objectives (e.g., harvest). The reliability of estimates resulting from monitoring programs depends strongly on whether considerations of spatial variation and detection probability have been adequately incorporated into program design and implementation. Several waterfowl surveys provide good examples of monitoring programs that incorporate these considerations.
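
    The model-weight update that the May survey supports is essentially Bayes' rule applied to competing model predictions. Below is a minimal hedged sketch of that step; the model names, predictions, and survey values are all hypothetical, not the actual adaptive harvest management models.

```python
import numpy as np
from scipy.stats import norm

# Minimal sketch of the model-weight update step in adaptive management.
# The two "models" and all numbers are hypothetical: each model predicts
# next year's breeding population (millions), and the survey estimate is
# used to update the weights via Bayes' rule.
predictions = {"additive": 7.2, "compensatory": 8.1}   # predicted population size
weights = {"additive": 0.5, "compensatory": 0.5}       # prior model weights
survey_estimate, survey_se = 7.5, 0.4                  # May survey result

likelihoods = {m: norm.pdf(survey_estimate, loc=p, scale=survey_se)
               for m, p in predictions.items()}
total = sum(weights[m] * likelihoods[m] for m in weights)
posterior = {m: weights[m] * likelihoods[m] / total for m in weights}
print(posterior)  # weight shifts toward the better-predicting model
```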

  18. Modern methods for the quality management of high-rate melt solidification

    NASA Astrophysics Data System (ADS)

    Vasiliev, V. A.; Odinokov, S. A.; Serov, M. M.

    2016-12-01

    The quality management of high-rate melt solidification requires a combined solution, obtained by methods and approaches adapted to the specific situation. A technological audit is recommended to assess the capabilities of the process. Statistical methods are proposed, with a choice of key parameters. Of particular importance are numerical methods, which can be used to perform simulation under multifactor technological conditions and to improve the quality of decisions.

  19. A Lessons Learned Knowledge Warehouse to Support the Army Knowledge Management Command-Centric

    DTIC Science & Technology

    2004-03-01

    Warehouse to Support the Army Knowledge Management Command-Centric ... increase the quality and availability of information in context (knowledge) to the ... information, geographical information, knowledge base, intelligence data (HUMINT, SIGINT, etc.); and Human Computer Interaction (HCI): allows ... the Data Fusion Process from the HCI point of view? Can the LL Knowledge Base provide any valuable information to achieve better estimates of the

  20. Municipal solid waste management in Tehran: Changes during the last 5 years.

    PubMed

    Malmir, Tahereh; Tojo, Yasumasa

    2016-05-01

    The situation of waste management in Tehran has been typical of that in developing countries: the amount of municipal solid waste has been increasing, and the city has depended on landfill for municipal solid waste management. In recent years, however, the city has taken various measures, such as collecting recyclables at the source and increasing the capacity of waste-processing facilities. As a result, significant changes in the waste stream are starting to occur. This study investigated the nature of, and reasons for, the marked changes in the waste stream from 2008 to 2012 by analysing the municipal solid waste statistics published by the Tehran Waste Management Organization in 2013 and survey data on the physical composition of the municipal solid waste. The following trends were identified: although the generation of municipal solid waste increased by 10% during the 5-year period, the amount of waste directly disposed of to landfill halved and resource recovery almost doubled. An increase in the capacity of a waste-processing facility contributed significantly to these changes. The biodegradable fraction going to landfill was estimated using the quantity and composition of each input to the landfill; the estimate for 2012 had decreased to 49% of its 2008 value. © The Author(s) 2016.

  1. Valorisation of fish by-products against waste management treatments--Comparison of environmental impacts.

    PubMed

    Lopes, Carla; Antelo, Luis T; Franco-Uría, Amaya; Alonso, Antonio A; Pérez-Martín, Ricardo

    2015-12-01

    Reuse and valorisation of fish by-products is a key process for marine resource conservation. Usually, fishmeal and oil processing factories collect the by-products generated by fishing ports and industrial processing activities, producing an economic benefit for both parties. In the same way, different added-value products can be recovered by the valorisation industries, while fishing companies save the costs associated with managing those wastes. However, it is important to evaluate the advantages of valorisation processes not only in terms of economic income but also in terms of environmental impacts. This helps to determine whether the valorisation of a residue causes a higher impact than other waste management options, in which case its advantages are probably not enough to guarantee sustainable waste reuse. To that purpose, several methodologies exist to evaluate the environmental impacts of processes, including those of waste management, providing different indicators that give information on relevant environmental aspects. In the current study, a comparative environmental assessment between a valorisation process (fishmeal and oil production) and different waste management scenarios (composting, incineration and landfilling) was developed. This comparison is a necessary step for the development and industrial implementation of these processes as the best alternative treatment for fish by-products. The results showed that the valorisation process and the waste management treatments presented similar impacts; however, a significant benefit can be achieved through valorisation of fish by-products. Additionally, the implications of the possible presence of pollutants are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. SPS Energy Conversion Power Management Workshop

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Energy technology concerning photovoltaic conversion, solar thermal conversion systems, and electrical power distribution and processing is discussed. The manufacturing processes involved in solar cell and solar array production are summarized. Resource issues concerning gallium arsenide and silicon alternatives are reported. Collector structures for solar construction are described, and estimates of their service life, failure rates, and capabilities are presented. Theories of advanced thermal power cycles are summarized. Power distribution system configurations and processing components are presented.

  3. Multisite evaluation of APEX for water quality: II. Regional parameterization

    USDA-ARS?s Scientific Manuscript database

    Phosphorus (P) index assessment requires independent estimates of long-term average annual P loss from multiple locations, management practices, soils, and landscape positions. Because currently available measured data are insufficient, calibrated and validated process-based models have been propos...

  4. Research management peer exchange hosted by the Ohio Department of Transportation, August 5-7, 2002.

    DOT National Transportation Integrated Search

    2002-08-01

    The expressed objectives of the Peer Exchange were to: : Enhance the overall research process : Enhance implementation and tracking of research results : Improve the quality and accuracy of preliminary research cost estimates prepared : internally pr...

  5. Rapid Estimation of Life Cycle Inventory

    EPA Science Inventory

    Many chemical manufacturers and regulators use life cycle assessment (LCA) to manage the sustainability of chemical manufacturing processes. A significant challenge to using LCA, however, is the sheer quantity of data related to energy and material flows that needs to be collecte...

  6. Realization of daily evapotranspiration in arid ecosystems based on remote sensing techniques

    NASA Astrophysics Data System (ADS)

    Elhag, Mohamed; Bahrawi, Jarbou A.

    2017-03-01

    Daily evapotranspiration is a major component of water resources management plans. In arid ecosystems, an efficient water budget is always hard to achieve because of insufficient irrigation water and high evapotranspiration rates. Monitoring of daily evapotranspiration is therefore a key practice for sustainable water resources management, especially in arid environments. Remote sensing techniques offer great help in estimating daily evapotranspiration on a regional scale, and existing open-source algorithms have proven able to estimate it comprehensively in arid environments. The main deficiency of these algorithms is the coarse scale of the remote sensing data used; consequently, an adequate downscaling algorithm is a compulsory step in rationalizing an effective water resources management plan. Daily evapotranspiration was estimated fairly well using Advanced Along-Track Scanning Radiometer (AATSR) data in conjunction with MEdium Resolution Imaging Spectrometer (MERIS) data acquired in July 2013, with 1 km spatial resolution and 3-day temporal resolution, under a surface energy balance system (SEBS) model. Results were validated against reference evapotranspiration ground-truth values obtained with the standardized Penman-Monteith method, with an R2 of 0.879. The findings show that turbulent heat fluxes can be monitored from AATSR and MERIS data with a temporal resolution of only 3 days in conjunction with reliable meteorological data. These findings are necessary inputs for well-informed decision-making on sustainable water resources management.
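
    For reference, one common daily form of the Penman-Monteith reference-evapotranspiration equation (the FAO-56 variant; the paper's exact standardization may differ) can be sketched as follows. The input values are hypothetical, and the psychrometric constant is fixed at a sea-level value for simplicity.

```python
import math

def reference_et0(t_mean, rn, g, u2, ea):
    """Daily FAO-56 Penman-Monteith reference evapotranspiration (mm/day).

    t_mean: mean air temperature (deg C); rn: net radiation (MJ/m2/day);
    g: soil heat flux (MJ/m2/day); u2: wind speed at 2 m (m/s);
    ea: actual vapour pressure (kPa). The psychrometric constant is taken
    at sea level; adjust for elevation in real use.
    """
    es = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))  # sat. vapour pressure
    delta = 4098.0 * es / (t_mean + 237.3) ** 2                # slope of the es curve
    gamma = 0.066                                              # kPa/deg C (~sea level)
    num = (0.408 * delta * (rn - g)
           + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea))
    return num / (delta + gamma * (1.0 + 0.34 * u2))

# Hypothetical hot, dry summer day in an arid environment
print(f"ET0 = {reference_et0(t_mean=32.0, rn=22.0, g=1.0, u2=3.0, ea=1.2):.1f} mm/day")
```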

  7. A Discussion on Uncertainty Representation and Interpretation in Model-Based Prognostics Algorithms based on Kalman Filter Estimation Applied to Prognostics of Electronics Components

    NASA Technical Reports Server (NTRS)

    Celaya, Jose R.; Saxena, Abhinav; Goebel, Kai

    2012-01-01

    This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies, based on our experience with Kalman filters applied to the prognostics of electronics components. In particular, it explores the implications of modeling remaining useful life prediction as a stochastic process, and how this relates to uncertainty representation, uncertainty management, and the role of prognostics in decision-making. A distinction between the interpretations of the estimated remaining useful life probability density function and the true remaining useful life probability density function is explained, and a cautionary argument is provided against mixing the two interpretations when prognostics are used to make critical decisions.
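
    To make the distinction concrete, here is a minimal sketch (not the article's model) of Kalman-filter-based remaining-useful-life (RUL) estimation for a linearly degrading health indicator. The resulting distribution is the *estimated* RUL pdf, which, as the authors caution, should not be conflated with the true RUL pdf. All values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# A scalar Kalman filter tracks a linearly degrading health indicator;
# RUL is the predicted time until the filtered state crosses a failure
# threshold. Degradation model and all numbers are hypothetical.
drift, q, r = -0.01, 1e-6, 4e-4   # per-step drift, process var, measurement var
x_hat, p = 1.0, 1e-4              # initial health estimate and variance
threshold = 0.5

for step in range(30):
    # predict
    x_hat += drift
    p += q
    # simulate a noisy measurement of the true degradation, then update
    z = 1.0 + drift * (step + 1) + rng.normal(0, np.sqrt(r))
    k = p / (p + r)
    x_hat += k * (z - x_hat)
    p *= (1 - k)

rul_mean = (x_hat - threshold) / -drift  # mean RUL under the linear model
# State uncertainty propagates into an *estimated* RUL spread; this is not
# the true RUL distribution the article warns against conflating it with.
rul_sd = np.sqrt(p) / -drift
print(f"mean RUL ~ {rul_mean:.0f} steps (sd ~ {rul_sd:.1f})")
```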

  8. Resource use and costs of type 2 diabetes patients receiving managed or protocolized primary care: a controlled clinical trial.

    PubMed

    van der Heijden, Amber A W A; de Bruijne, Martine C; Feenstra, Talitha L; Dekker, Jacqueline M; Baan, Caroline A; Bosmans, Judith E; Bot, Sandra D M; Donker, Gé A; Nijpels, Giel

    2014-06-25

    The increasing prevalence of diabetes is associated with increased health care use and costs. Innovations are needed to improve the quality of care, manage the increasing demand for health care, and control the growth of health care costs. The aim of this study is to evaluate the care process and costs of managed, protocolized and usual care for type 2 diabetes patients from a societal perspective. In two distinct regions of the Netherlands, both managed and protocolized diabetes care were implemented. Managed care was characterized by centralized organization, coordination, responsibility and centralized annual assessment. Protocolized care had a partly centralized organizational structure. Usual care was characterized by a decentralized organizational structure. Using a quasi-experimental control group pretest-posttest design, the care process (guideline adherence) and costs were compared between managed (n = 253), protocolized (n = 197), and usual care (n = 333). We made a distinction between direct health care costs, direct non-health care costs and indirect costs. Multivariate regression models were used to estimate differences in costs adjusted for confounding factors. Because of the skewed distribution of the costs, bootstrapping methods (5000 replications) with a bias-corrected and accelerated approach were used to estimate 95% confidence intervals (CI) around the differences in costs. Compared to usual and protocolized care, more patients in managed care were treated according to diabetes guidelines. Secondary health care use was higher in patients under usual care compared to managed and protocolized care. Compared to usual care, direct costs were significantly lower in managed care (€-1,181; 95% CI: -2,597 to -334), while indirect costs were higher (€758; 95% CI: -353 to 2,701), although not significantly. Direct, indirect and total costs were lower in protocolized care compared to usual care (though not significantly). Compared to usual care, managed care was significantly associated with a better diabetes care process, fewer secondary care consultations and lower health care costs. The same trends were seen for protocolized care; however, they were not statistically significant. Current Controlled trials: ISRCTN66124817.

  9. Resource use and costs of type 2 diabetes patients receiving managed or protocolized primary care: a controlled clinical trial

    PubMed Central

    2014-01-01

    Background The increasing prevalence of diabetes is associated with increased health care use and costs. Innovations are needed to improve the quality of care, manage the increasing demand for health care, and control the growth of health care costs. The aim of this study is to evaluate the care process and costs of managed, protocolized and usual care for type 2 diabetes patients from a societal perspective. Methods In two distinct regions of the Netherlands, both managed and protocolized diabetes care were implemented. Managed care was characterized by centralized organization, coordination, responsibility and centralized annual assessment. Protocolized care had a partly centralized organizational structure. Usual care was characterized by a decentralized organizational structure. Using a quasi-experimental control group pretest-posttest design, the care process (guideline adherence) and costs were compared between managed (n = 253), protocolized (n = 197), and usual care (n = 333). We made a distinction between direct health care costs, direct non-health care costs and indirect costs. Multivariate regression models were used to estimate differences in costs adjusted for confounding factors. Because of the skewed distribution of the costs, bootstrapping methods (5000 replications) with a bias-corrected and accelerated approach were used to estimate 95% confidence intervals (CI) around the differences in costs. Results Compared to usual and protocolized care, more patients in managed care were treated according to diabetes guidelines. Secondary health care use was higher in patients under usual care compared to managed and protocolized care. Compared to usual care, direct costs were significantly lower in managed care (€-1,181; 95% CI: -2,597 to -334), while indirect costs were higher (€758; 95% CI: -353 to 2,701), although not significantly. Direct, indirect and total costs were lower in protocolized care compared to usual care (though not significantly). Conclusions Compared to usual care, managed care was significantly associated with a better diabetes care process, fewer secondary care consultations and lower health care costs. The same trends were seen for protocolized care; however, they were not statistically significant. Trial registration Current Controlled trials: ISRCTN66124817. PMID:24966055

  10. Scope Complexity Options Risks Excursions (SCORE) Factor Mathematical Description.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Samberson, Jonell Nicole; Shettigar, Subhasini

    The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options, resulting in scores. SCORE factors extend this capability by providing estimates of complexity relative to a base system (i.e., all design options are normalized to one weapon system). First, a clearly defined set of scope elements for a warhead option is established. The complexity of each scope element is estimated by Subject Matter Experts (SMEs), including a level of uncertainty, relative to a specific reference system. When determining factors, complexity estimates for a scope element can be directly tied to the base system or chained together via comparable scope elements in a string of reference systems that ends with the base system. The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC). Historically, it has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).

  11. The Use of Radar to Improve Rainfall Estimation over the Tennessee and San Joaquin River Valleys

    NASA Technical Reports Server (NTRS)

    Petersen, Walter A.; Gatlin, Patrick N.; Felix, Mariana; Carey, Lawrence D.

    2010-01-01

    This slide presentation provides an overview of the collaborative radar rainfall project between the Tennessee Valley Authority (TVA), the Von Braun Center for Science & Innovation (VCSI), NASA MSFC, and UAHuntsville. Two systems were used in this project: the Advanced Radar for Meteorological & Operational Research (ARMOR) Rainfall Estimation Processing System (AREPS), a demonstration of real-time radar rainfall estimation using a research radar, and the NEXRAD Rainfall Estimation Processing System (NREPS). The objectives, methodology, selected results and validation, operational experience, and lessons learned are reviewed. A related project using radar to improve rainfall estimation is underway in California, in the San Joaquin River Valley, as part of an overall effort to develop an integrated tool to assist water management within the valley. This involves integrating several components: (1) radar precipitation estimates, (2) a distributed hydrologic model, and (3) snowfall and surface temperature/moisture measurements. NREPS was selected to provide the precipitation component.

  12. Quality metric for spherical panoramic video

    NASA Astrophysics Data System (ADS)

    Zakharchenko, Vladyslav; Choi, Kwang Pyo; Park, Jeong Hoon

    2016-09-01

    Virtual reality (VR) and augmented reality (AR) applications allow users to view artificial content of a surrounding space, simulating a presence effect with the help of special applications or devices. Synthetic content production is a well-known process from the computer graphics domain, and its pipeline is already established in the industry. However, emerging multimedia formats for immersive entertainment applications, such as free-viewpoint television (FTV) or spherical panoramic video, require different approaches to content management and quality assessment. International standardization of FTV has been promoted by MPEG. This paper discusses an immersive media distribution format and a quality estimation process. The accuracy and reliability of the proposed objective quality estimation method were verified on spherical panoramic images, demonstrating good correlation with subjective quality assessments made by a group of experts.

  13. Estimation of process capability indices from the results of limit gauge inspection of dimensional parameters in machining industry

    NASA Astrophysics Data System (ADS)

    Masterenko, Dmitry A.; Metel, Alexander S.

    2018-03-01

    The process capability indices Cp and Cpk are widely used in modern quality management as statistical measures of the ability of a process to produce output X within specification limits. The customer's requirement to ensure Cp ≥ 1.33 is often written into contracts. Capability index estimates may be calculated from estimates of the mean µ and the variability 6σ, which requires measuring the quality characteristic in a sample of pieces and, in turn, advanced measuring devices and well-qualified staff. On the other hand, quality inspection by attributes, performed with limit (go/no-go) gauges, is much simpler and faster, but it does not give numerical values of the quality characteristic. The described method allows the mean and variability of the process to be estimated from the results of limit gauge inspection with a lower limit LCL and an upper limit UCL, which separate the pieces into three groups: X < LCL (n1 pieces), LCL ≤ X < UCL (n2 pieces), and X ≥ UCL (n3 pieces). So-called Pittman-type estimates, developed by the author, are functions of n1, n2, and n3 and allow the estimated µ and σ to be calculated. Thus, Cp and Cpk may also be estimated without precise measurements. The estimates can be used in quality inspection of lots of pieces as well as in monitoring and control of the manufacturing process, which is very important for improving the quality of articles in the machining industry through their tolerances.
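
    The idea of recovering µ and σ from the three gauge counts can be illustrated with a maximum-likelihood stand-in (the author's closed-form Pittman-type estimator is not reproduced here). The limits and counts below are hypothetical, and the gauge limits are assumed to coincide with the specification limits, which need not hold in practice.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

# Maximum-likelihood recovery of mu and sigma from go/no-go counts
# (n1, n2, n3) in the three gauge classes, followed by a Cp estimate.
LCL, UCL = 9.97, 10.03
n1, n2, n3 = 7, 186, 7               # X < LCL, LCL <= X < UCL, X >= UCL

def neg_loglik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)        # optimize on log scale to keep sigma > 0
    p1 = norm.cdf(LCL, mu, sigma)
    p3 = 1.0 - norm.cdf(UCL, mu, sigma)
    p2 = max(1.0 - p1 - p3, 1e-12)
    return -(n1 * np.log(max(p1, 1e-12)) + n2 * np.log(p2)
             + n3 * np.log(max(p3, 1e-12)))

fit = minimize(neg_loglik, x0=[(LCL + UCL) / 2, np.log((UCL - LCL) / 6)],
               method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
cp_hat = (UCL - LCL) / (6 * sigma_hat)
print(f"mu = {mu_hat:.4f}, sigma = {sigma_hat:.5f}, Cp = {cp_hat:.2f}")
```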

  14. Robust Abundance Estimation in Animal Abundance Surveys with Imperfect Detection

    EPA Science Inventory

    Surveys of animal abundance are central to the conservation and management of living natural resources. However, detection uncertainty complicates the sampling process of many species. One sampling method employed to deal with this problem is depletion (or removal) surveys in whi...

  15. Robust Abundance Estimation in Animal Surveys with Imperfect Detection

    EPA Science Inventory

    Surveys of animal abundance are central to the conservation and management of living natural resources. However, detection uncertainty complicates the sampling process of many species. One sampling method employed to deal with this problem is depletion (or removal) surveys in whi...

  16. Nuclear Weapons. National Nuclear Security Administration’s Plans for Its Uranium Processing Facility Should Better Reflect Funding Estimates and Technology Readiness

    DTIC Science & Technology

    2010-11-01

    metal. Recovery extraction centrifugal contactors A process that uses solvent to extract uranium for purposes of purification. Agile machining A...extraction centrifugal contactors 5 6 Yes 6 No Agile machining 5 5 No 6 No Chip management 5 6 Yes 6 No Special casting 3 6 Yes 6 No Source: GAO

  17. Estimation of construction and demolition waste volume generation in new residential buildings in Spain.

    PubMed

    Villoria Sáez, Paola; del Río Merino, Mercedes; Porras-Amores, César

    2012-02-01

    The management planning of construction and demolition (C&D) waste uses a single indicator which does not provide enough detailed information. Therefore the determination and implementation of other innovative and precise indicators should be determined. The aim of this research work is to improve existing C&D waste quantification tools in the construction of new residential buildings in Spain. For this purpose, several housing projects were studied to determine an estimation of C&D waste generated during their construction process. This paper determines the values of three indicators to estimate the generation of C&D waste in new residential buildings in Spain, itemizing types of waste and construction stages. The inclusion of two more accurate indicators, in addition to the global one commonly in use, provides a significant improvement in C&D waste quantification tools and management planning.

  18. Estimating linear temporal trends from aggregated environmental monitoring data

    USGS Publications Warehouse

    Erickson, Richard A.; Gray, Brian R.; Eager, Eric A.

    2017-01-01

    Trend estimates are often used as part of environmental monitoring programs. These trends inform managers (e.g., are desired species increasing or undesired species decreasing?). Data collected from environmental monitoring programs are often aggregated (i.e., averaged), which confounds sampling and process variation. State-space models allow sampling and process variation to be separated. We used simulated time series to compare linear trend estimates from three state-space models, a simple linear regression model, and an autoregressive model. We also compared the performance of these five models in estimating trends from a long-term monitoring program, specifically for two species of fish and four species of aquatic vegetation from the Upper Mississippi River system. We found that the simple linear regression had the best performance of all the models considered, because it was best able to recover parameters and converged consistently; conversely, it did the worst job of estimating population size in a given year. The state-space models did not estimate trends well, but estimated population sizes best when they converged. Overall, a simple linear regression performed better than the more complex autoregressive and state-space models when used to analyze aggregated environmental monitoring data.
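
    A minimal sketch of the simulation-and-fit comparison, reduced to the simple-linear-regression case, is shown below; the aggregation step that confounds process and sampling variation is made explicit. All parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

years = np.arange(20)
true_trend = 0.05
# year-to-year process variation (random-walk year effects)
process_noise = rng.normal(0, 0.15, len(years)).cumsum()
latent = 2.0 + true_trend * years + process_noise
# 10 sites sampled per year, then averaged: aggregation mixes the two
# variance sources so the fitted model cannot separate them
site_obs = latent[:, None] + rng.normal(0, 0.3, (len(years), 10))
aggregated = site_obs.mean(axis=1)

slope, intercept = np.polyfit(years, aggregated, 1)
print(f"true trend {true_trend:.3f}, OLS slope {slope:.3f}")
```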

  19. Use of Smartphones to Estimate Carbohydrates in Foods for Diabetes Management.

    PubMed

    Huang, Jurong; Ding, Hang; McBride, Simon; Ireland, David; Karunanithi, Mohan

    2015-01-01

    Over 380 million adults worldwide are currently living with diabetes, and the number is projected to reach 590 million by 2035. Uncontrolled diabetes often leads to complications, disability, and early death. In the management of diabetes, dietary intervention to control carbohydrate intake is essential to help keep daily blood glucose levels within a recommended range. The intervention traditionally relies on self-reported carbohydrate intake recorded in a paper-based diary, an approach known to be inaccurate, inconvenient, and resource intensive. Additionally, patients often require a long period of learning or training to achieve a certain level of accuracy and reliability. To address these issues, we propose the design of a smartphone application that automatically estimates carbohydrate intake from food images. The application uses image processing techniques to classify the food type, estimate the food volume, and calculate the amount of carbohydrates accordingly. As a proof of concept, a small fruit database was created to train a classification algorithm implemented in the application, and a set of fruit photos (n=6) taken with a real smartphone was used to evaluate the accuracy of the carbohydrate estimation. This study demonstrates the potential of smartphones to improve dietary intervention, although further studies are needed to improve accuracy and to extend the application to broader food contents.
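
    The final estimation step can be sketched as a lookup of per-food density and carbohydrate fraction applied to the estimated volume. The table values below are rough illustrative numbers, and the paper's classifier and volume-estimation methods are not reproduced.

```python
# Hedged sketch of the carbohydrate-calculation step only: once the
# classifier has labelled the food and the volume has been estimated from
# the image, carbohydrates follow from per-food density and carb fraction.
FOOD_TABLE = {  # g/cm3 density, carbohydrate fraction by weight (illustrative)
    "apple":  {"density": 0.85, "carb_fraction": 0.14},
    "banana": {"density": 0.94, "carb_fraction": 0.23},
}

def estimate_carbs(food: str, volume_cm3: float) -> float:
    entry = FOOD_TABLE[food]
    grams = volume_cm3 * entry["density"]
    return grams * entry["carb_fraction"]

print(f"{estimate_carbs('banana', 120.0):.0f} g carbohydrate")  # ~26 g
```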

  20. Ecosystem services as a common language for coastal ecosystem-based management.

    PubMed

    Granek, Elise F; Polasky, Stephen; Kappel, Carrie V; Reed, Denise J; Stoms, David M; Koch, Evamaria W; Kennedy, Chris J; Cramer, Lori A; Hacker, Sally D; Barbier, Edward B; Aswani, Shankar; Ruckelshaus, Mary; Perillo, Gerardo M E; Silliman, Brian R; Muthiga, Nyawira; Bael, David; Wolanski, Eric

    2010-02-01

    Ecosystem-based management is logistically and politically challenging because ecosystems are inherently complex and management decisions affect a multitude of groups. Coastal ecosystems, which lie at the interface between marine and terrestrial ecosystems and provide an array of ecosystem services to different groups, aptly illustrate these challenges. Successful ecosystem-based management of coastal ecosystems requires incorporating scientific information and the knowledge and views of interested parties into the decision-making process. Estimating the provision of ecosystem services under alternative management schemes offers a systematic way to incorporate biogeophysical and socioeconomic information and the views of individuals and groups in the policy and management process. Employing ecosystem services as a common language to improve the process of ecosystem-based management presents both benefits and difficulties. Benefits include a transparent method for assessing trade-offs associated with management alternatives, a common set of facts and common currency on which to base negotiations, and improved communication among groups with competing interests or differing worldviews. Yet challenges to this approach remain, including predicting how human interventions will affect ecosystems, how such changes will affect the provision of ecosystem services, and how changes in service provision will affect the welfare of different groups in society. In a case study from Puget Sound, Washington, we illustrate the potential of applying ecosystem services as a common language for ecosystem-based management.

  1. NREPS Applications for Water Supply and Management in California and Tennessee

    NASA Technical Reports Server (NTRS)

    Gatlin, P.; Scott, M.; Carey, L. D.; Petersen, W. A.

    2011-01-01

    Management of water resources is a balancing act between temporally and spatially limited sources and competing needs that can often exceed supply. To manage water resources over a region such as the San Joaquin Valley or the Tennessee River Valley, it is essential to know how much water has fallen in the watershed and where that water is going within it. Because rain gauge networks are typically sparse, the majority of rainfall over a region may go unmeasured. To mitigate this under-sampling, weather radar has long been employed to provide areal rainfall estimates, and the Next-Generation Weather Radars (NEXRAD) make it possible to estimate rainfall over the majority of the conterminous United States. The NEXRAD Rainfall Estimation Processing System (NREPS) was developed specifically to use weather radar to estimate rainfall for water resources management. NREPS is tailored to meet customer needs on spatial and temporal scales relevant to the hydrologic or land-surface models of the end user. It employs several techniques to keep artifacts in the NEXRAD data from contaminating the rainfall field, including clutter filtering, correction for beam occultation by topography, and accounting for the vertical profile of reflectivity. This presentation focuses on improvements made to NREPS to map rainfall in the San Joaquin Valley for NASA's Water Supply and Management Project in California, as well as ongoing rainfall mapping in the Tennessee River watershed for the Tennessee Valley Authority and possible future applications in other areas of the continent.

  2. Environmental risk assessment of water quality in harbor areas: a new methodology applied to European ports.

    PubMed

    Gómez, Aina G; Ondiviela, Bárbara; Puente, Araceli; Juanes, José A

    2015-05-15

    This work presents a standard and unified procedure for assessment of environmental risks at the contaminant source level in port aquatic systems. Using this method, port managers and local authorities will be able to hierarchically classify environmental hazards and proceed with the most suitable management actions. This procedure combines rigorously selected parameters and indicators to estimate the environmental risk of each contaminant source based on its probability, consequences and vulnerability. The spatio-temporal variability of multiple stressors (agents) and receptors (endpoints) is taken into account to provide accurate estimations for application of precisely defined measures. The developed methodology is tested on a wide range of different scenarios via application in six European ports. The validation process confirms its usefulness, versatility and adaptability as a management tool for port water quality in Europe and worldwide. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. A tool for efficient, model-independent management optimization under uncertainty

    USGS Publications Warehouse

    White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.

    2018-01-01

    To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
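
    A chance constraint of this kind reduces to shifting the simulated constraint value by a risk-dependent multiple of its FOSM standard deviation. The sketch below is illustrative only; names and numbers are hypothetical and do not reflect PESTPP-OPT's actual interface.

```python
from scipy.stats import norm

# Minimal sketch of a FOSM-based chance constraint. A simulated constraint
# output (e.g., streamflow depletion) has a FOSM mean and standard
# deviation; the deterministic constraint "mean <= limit" is tightened by
# a risk-based offset so the optimal solution carries uncertainty with it.
risk = 0.95                               # required probability of satisfying the constraint
mean_depletion, sd_depletion = 0.80, 0.10 # model output +/- FOSM uncertainty
limit = 1.0

chance_value = mean_depletion + norm.ppf(risk) * sd_depletion
print(f"chance-constrained value: {chance_value:.3f} "
      f"(feasible: {chance_value <= limit})")
```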

  4. Panaceas and diversification of environmental policy

    PubMed Central

    Brock, William A.; Carpenter, Stephen R.

    2007-01-01

    We consider panacea formation in the framework of adaptive learning and decision for social–ecological systems (SESs). Institutions for managing such systems must address multiple timescales of ecological change, as well as features of the social community in which the ecosystem policy problem is embedded. Response of the SES to each candidate institution must be modeled and treated as a stochastic process with unknown parameters to be estimated. A fundamental challenge is to design institutions that are not vulnerable to capture by subsets of the community that self-organize to direct the institution against the overall social interest. In a world of episodic structural change, such as SESs, adaptive learning can lock in to a single institution, model, or parameter estimate. Policy diversification, leading to escape from panacea traps, can come from monitoring indicators of episodic change on slow timescales, minimax regret decision making, active experimentation to accelerate model identification, mechanisms for broadening the set of models or institutions under consideration, and processes for discovery of new institutions and technologies for ecosystem management. It is difficult to take all of these factors into account, but the discipline that comes with the attempt to model the coupled social–ecological dynamics forces policy makers to confront all conceivable responses. This process helps induce the modesty needed to avoid panacea traps while supporting systematic effort to improve resource management in the public interest. PMID:17881581

  5. Effects of scale of movement, detection probability, and true population density on common methods of estimating population density

    DOE PAGES

    Keiter, David A.; Davis, Amy J.; Rhodes, Olin E.; ...

    2017-08-25

    Knowledge of population density is necessary for effective management and conservation of wildlife, yet rarely are estimators compared in their robustness to effects of ecological and observational processes, which can greatly influence accuracy and precision of density estimates. For this study, we simulate biological and observational processes using empirical data to assess effects of animal scale of movement, true population density, and probability of detection on common density estimators. We also apply common data collection and analytical techniques in the field and evaluate their ability to estimate density of a globally widespread species. We find that animal scale of movement had the greatest impact on accuracy of estimators, although all estimators suffered reduced performance when detection probability was low, and we provide recommendations as to when each field and analytical technique is most appropriately employed. The large influence of scale of movement on estimator accuracy emphasizes the importance of effective post-hoc calculation of area sampled or use of methods that implicitly account for spatial variation. In particular, scale of movement impacted estimators substantially, such that area covered and spacing of detectors (e.g. cameras, traps, etc.) must reflect movement characteristics of the focal species to reduce bias in estimates of movement and thus density.

  6. Effects of scale of movement, detection probability, and true population density on common methods of estimating population density

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, David A.; Davis, Amy J.; Rhodes, Olin E.

    Knowledge of population density is necessary for effective management and conservation of wildlife, yet rarely are estimators compared in their robustness to effects of ecological and observational processes, which can greatly influence accuracy and precision of density estimates. For this study, we simulate biological and observational processes using empirical data to assess effects of animal scale of movement, true population density, and probability of detection on common density estimators. We also apply common data collection and analytical techniques in the field and evaluate their ability to estimate density of a globally widespread species. We find that animal scale of movement had the greatest impact on accuracy of estimators, although all estimators suffered reduced performance when detection probability was low, and we provide recommendations as to when each field and analytical technique is most appropriately employed. The large influence of scale of movement on estimator accuracy emphasizes the importance of effective post-hoc calculation of area sampled or use of methods that implicitly account for spatial variation. In particular, scale of movement impacted estimators substantially, such that area covered and spacing of detectors (e.g. cameras, traps, etc.) must reflect movement characteristics of the focal species to reduce bias in estimates of movement and thus density.

  7. Utility-Scale Photovoltaic Deployment Scenarios of the Western United States: Implications for Solar Energy Zones in Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frew, Bethany; Mai, Trieu; Krishnan, Venkat

    2016-12-01

    In this study, we use the National Renewable Energy Laboratory's (NREL's) Regional Energy Deployment System (ReEDS) capacity expansion model to estimate utility-scale photovoltaic (UPV) deployment trends from the present day through 2030. The analysis seeks to inform the U.S. Bureau of Land Management's (BLM's) planning activities related to UPV development on federal lands in Nevada as part of the Resource Management Plan (RMP) revision for the Las Vegas and Pahrump field offices. These planning activities include assessing the demand for new or expanded Solar Energy Zones (SEZs), per the process outlined in BLM's Western Solar Plan.

  8. Estimated benefits of connected vehicle applications : dynamic mobility applications, AERIS, V2I safety, and road weather management applications.

    DOT National Transportation Integrated Search

    2015-08-01

    Connected vehicles have the potential to transform travel as we know it by combining leading-edge technologies: advanced wireless communications, on-board computer processing, advanced vehicle sensors, Global Positioning System (GPS) navigation, sm...

  9. How Do Land-Use and Climate Change Affect Watershed Health? A Scenario-Based Analysis

    EPA Science Inventory

    With the growing emphasis on biofuel crops and potential impacts of climate variability and change, there is a need to quantify their effects on hydrological processes for developing watershed management plans. Environmental consequences are currently estimated by utilizing comp...

  10. 48 CFR 242.302 - Contract administration functions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Contractor estimating systems (see FAR 15.407-5); and (B) Contractor material management and accounting... report identifying significant accounting system or related internal control deficiencies. (9) For... solicitation or award. (S-70) Serve as the single point of contact for all Single Process Initiative (SPI...

  11. Process-based costing.

    PubMed

    Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee

    2003-01-01

    Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
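
    The four steps translate directly into a small calculation once the flowchart is in hand. The sketch below uses hypothetical steps, times, and rates rather than the article's data.

```python
# Minimal sketch of the four costing steps for one care-planning cycle:
# the flowchart is represented as a list of steps (step 1), each with
# estimated resource use (step 2) and a unit value (step 3); direct cost
# is then the sum of use times value (step 4). All figures hypothetical.
care_planning = [  # step, minutes of staff time, staff cost per hour (USD)
    ("assessment interview", 45, 38.0),
    ("interdisciplinary meeting", 30, 52.0),
    ("write care plan", 25, 38.0),
    ("family review", 15, 38.0),
]

direct_cost = sum(minutes / 60.0 * rate for _, minutes, rate in care_planning)
print(f"direct cost per care plan: ${direct_cost:.2f}")
```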

  12. Monitoring and Modeling Carbon Dynamics at a Network of Intensive Sites in the USA and Mexico

    NASA Astrophysics Data System (ADS)

    Birdsey, R.; Wayson, C.; Johnson, K. D.; Pan, Y.; Angeles, G.; De Jong, B. H.; Andrade, J. L.; Dai, Z.

    2013-05-01

    The Forest Services of the USA and Mexico, supported by NASA and USAID, have begun to establish a network of intensive forest carbon monitoring sites. These sites are used for research and teaching, for developing forest management practices, and for forging links to the needs of communities. Several of the sites have installed eddy flux towers that collect basic meteorological data and provide daily estimates of forest carbon uptake and release, the processes that determine forest growth. Field sampling locations at each site provide estimates of forest biomass and carbon stocks, and monitor forest dynamic processes such as growth and mortality rates. Remote sensing facilitates scaling up to the surrounding landscapes. The sites support the information requirements of programs such as Reducing Emissions from Deforestation and Forest Degradation (REDD+), enabling communities to receive payments for ecosystem services such as reduced carbon emissions or improved forest management. In addition to providing benchmark data for REDD+ projects, the sites are valuable for validating state and national estimates from satellite remote sensing and the national forest inventory. Data from the sites provide parameters for forest models that support strategic management analysis, and support student training and graduate projects. The intensive monitoring sites may serve as a model for other countries in Latin America. Coordination among sites in the USA, Mexico, and other Latin American countries can ensure harmonization of approaches and data, and allow countries to share experiences and knowledge as opportunities emerge for implementing REDD+ and other conservation programs.

  13. Exploring the life cycle management of industrial solid waste in the case of copper slag.

    PubMed

    Song, Xiaolong; Yang, Jianxin; Lu, Bin; Li, Bo

    2013-06-01

    Industrial solid waste has potential impacts on soil, water and air quality, as well as human health, during its whole life stages. A framework for the life cycle management of industrial solid waste, which integrates the source reduction process, is presented and applied to copper slag management. Three management scenarios of copper slag are developed: (i) production of cement after electric furnace treatment, (ii) production of cement after flotation, and (iii) source reduction before the recycling process. A life cycle assessment is carried out to estimate the environmental burdens of these three scenarios. Life cycle assessment results showed that the environmental burdens of the three scenarios are 2710.09, 2061.19 and 2145.02 Pt respectively. In consideration of the closed-loop recycling process, the environmental performance of the flotation approach excelled that of the electric furnace approach. Additionally, although flash smelting promotes the source reduction of copper slag compared with bath smelting, it did not reduce the overall environmental burdens resulting from the complete copper slag management process. Moreover, it led to the shifting of environmental burdens from ecosystem quality damage and resources depletion to human health damage. The case study shows that it is necessary to integrate the generation process into the whole life cycle of industrial solid waste, and to make an integrated assessment for quantifying the contribution of source reduction, rather than to simply follow the priority of source reduction and the hierarchy of waste management.

  14. ECOLOGICAL RISK ASSESSMENT IN THE CONTEXT OF GLOBAL CLIMATE CHANGE

    PubMed Central

    Landis, Wayne G; Durda, Judi L; Brooks, Marjorie L; Chapman, Peter M; Menzie, Charles A; Stahl, Ralph G; Stauber, Jennifer L

    2013-01-01

    Changes to sources, stressors, habitats, and geographic ranges; toxicological effects; end points; and uncertainty estimation require significant changes in the implementation of ecological risk assessment (ERA). Because of the lack of analog systems and circumstances in historically studied sites, there is a likelihood of type III error. As a first step, the authors propose a decision key to aid managers and risk assessors in determining when and to what extent climate change should be incorporated. Next, when global climate change is an important factor, the authors recommend seven critical changes to ERA. First, develop conceptual cause–effect diagrams that consider relevant management decisions as well as appropriate spatial and temporal scales to include both direct and indirect effects of climate change and the stressor of management interest. Second, develop assessment end points that are expressed as ecosystem services. Third, evaluate multiple stressors and nonlinear responses—include the chemicals and the stressors related to climate change. Fourth, estimate how climate change will affect or modify management options as the impacts become manifest. Fifth, consider the direction and rate of change relative to management objectives, recognizing that both positive and negative outcomes can occur. Sixth, determine the major drivers of uncertainty, estimating and bounding stochastic uncertainty spatially, temporally, and progressively. Seventh, plan for adaptive management to account for changing environmental conditions and consequent changes to ecosystem services. Good communication is essential for making risk-related information understandable and useful for managers and stakeholders to implement a successful risk-assessment and decision-making process. Environ. Toxicol. Chem. 2013;32:79–92. © 2012 SETAC PMID:23161373

  15. Ecological risk assessment in the context of global climate change.

    PubMed

    Landis, Wayne G; Durda, Judi L; Brooks, Marjorie L; Chapman, Peter M; Menzie, Charles A; Stahl, Ralph G; Stauber, Jennifer L

    2013-01-01

    Changes to sources, stressors, habitats, and geographic ranges; toxicological effects; end points; and uncertainty estimation require significant changes in the implementation of ecological risk assessment (ERA). Because of the lack of analog systems and circumstances in historically studied sites, there is a likelihood of type III error. As a first step, the authors propose a decision key to aid managers and risk assessors in determining when and to what extent climate change should be incorporated. Next, when global climate change is an important factor, the authors recommend seven critical changes to ERA. First, develop conceptual cause-effect diagrams that consider relevant management decisions as well as appropriate spatial and temporal scales to include both direct and indirect effects of climate change and the stressor of management interest. Second, develop assessment end points that are expressed as ecosystem services. Third, evaluate multiple stressors and nonlinear responses-include the chemicals and the stressors related to climate change. Fourth, estimate how climate change will affect or modify management options as the impacts become manifest. Fifth, consider the direction and rate of change relative to management objectives, recognizing that both positive and negative outcomes can occur. Sixth, determine the major drivers of uncertainty, estimating and bounding stochastic uncertainty spatially, temporally, and progressively. Seventh, plan for adaptive management to account for changing environmental conditions and consequent changes to ecosystem services. Good communication is essential for making risk-related information understandable and useful for managers and stakeholders to implement a successful risk-assessment and decision-making process. Copyright © 2012 SETAC.

  16. Dissimilarity of contemporary and historical gene flow in a wild carrot (Daucus carota) metapopulation under contrasting levels of human disturbance: implications for risk assessment and management of transgene introgression

    PubMed Central

    Rong, Jun; Xu, Shuhua; Meirmans, Patrick G.; Vrieling, Klaas

    2013-01-01

    Background and Aims Transgene introgression from crops into wild relatives may increase the resistance of wild plants to herbicides, insects, etc. The chance of transgene introgression depends not only on the rate of hybridization and the establishment of hybrids in local wild populations, but also on the metapopulation dynamics of the wild relative. The aim of the study was to estimate gene flow in a metapopulation for assessing and managing the risks of transgene introgression. Methods Wild carrots (Daucus carota) were sampled from 12 patches in a metapopulation. Eleven microsatellites were used to genotype wild carrots. Genetic structure was estimated based on the FST statistic. Contemporary (over the last several generations) and historical (over many generations) gene flow was estimated with assignment and coalescent methods, respectively. Key Results The genetic structure in the wild carrot metapopulation was moderate (FST = 0·082) and most of the genetic variation resided within patches. A pattern of isolation by distance was detected, suggesting that most of the gene flow occurred between neighbouring patches (≤1 km). The mean contemporary gene flow was 5 times higher than the historical estimate, and the correlation between them was very low. Moreover, the contemporary gene flow in roadsides was twice that in a nature reserve, and the correlation between contemporary and historical estimates was much higher in the nature reserve. Mowing of roadsides may contribute to the increase in contemporary gene flow. Simulations demonstrated that the higher contemporary gene flow could accelerate the process of transgene introgression in the metapopulation. Conclusions Human disturbance such as mowing may alter gene flow patterns in wild populations, affecting the metapopulation dynamics of wild plants and the processes of transgene introgression in the metapopulation. The risk assessment and management of transgene introgression and the control of weeds need to take metapopulation dynamics into consideration. PMID:24052560

  17. Utilization of accident databases and fuzzy sets to estimate frequency of HazMat transport accidents.

    PubMed

    Qiao, Yuanhua; Keren, Nir; Mannan, M Sam

    2009-08-15

    Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
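
    The DPS analysis itself is not reproduced here, but a hedged sketch of negative binomial regression for accident counts is straightforward with statsmodels; the covariates, coefficients, and dispersion value below are invented for illustration, and segment length enters as an exposure offset.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    # Hypothetical route-dependent variables: traffic volume and lane count
    aadt = rng.uniform(1e3, 5e4, n)           # annual average daily traffic
    lanes = rng.integers(2, 7, n)
    miles = rng.uniform(1, 50, n)             # segment length = exposure
    mu = np.exp(-9.0 + 0.8 * np.log(aadt) - 0.1 * lanes) * miles
    y = rng.negative_binomial(5, 5 / (5 + mu))    # overdispersed counts

    X = sm.add_constant(np.column_stack([np.log(aadt), lanes]))
    model = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.2),
                   offset=np.log(miles))
    res = model.fit()
    print(res.params)    # basic accident frequency as a function of route variables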

  18. Estimating hydrologic and erosion response for use in ecological site descriptions

    USDA-ARS?s Scientific Manuscript database

    Ecological resilience of rangeland landscapes is strongly related to eco-hydrologic pattern-process feedbacks that regulate the retention or loss of water and soil resources. However, key ecohydrologic information is often lacking in Ecological Site Descriptions (ESDs) used to guide management of ra...

  19. Costing for Policy Analysis.

    ERIC Educational Resources Information Center

    National Association of College and University Business Officers, Washington, DC.

    Cost behavior analysis, a costing process that can assist managers in estimating how certain institutional costs change in response to volume, policy, and environmental factors, is described. The five steps of this approach are examined, and the application of cost behavior analysis at four college-level settings is documented. The institutions…

  20. Estimation of potential loss of two pesticides in runoff in Fillmore County, Minnesota using a field-scale process-based model and a geographic information system

    USGS Publications Warehouse

    Capel, P.D.; Zhang, H.

    2000-01-01

    In assessing the occurrence, behavior, and effects of agricultural chemicals in surface water, the scales of study (i.e., watershed, county, state, and regional areas) are usually much larger than the scale of agricultural fields, where much of the understanding of processes has been developed. Field-scale areas are characterized by relatively homogeneous conditions. The combination of process-based simulation models and geographic information system technology can be used to help extend our understanding of field processes to water-quality concerns at larger scales. To demonstrate this, the model "Groundwater Loading Effects of Agricultural Management Systems" was used to estimate the potential loss of two pesticides (atrazine and permethrin) in runoff to surface water in Fillmore County in southeastern Minnesota. The county was divided into field-scale areas on the basis of a 100 m by 100 m grid, and the influences of soil type and surface topography on the potential losses of the two pesticides in runoff were evaluated for each individual grid cell. The results could be used as guidance for agricultural management and regulatory decisions, for planning environmental monitoring programs, and as an educational tool for the public.

  1. An overview of surface radiance and biology studies in FIFE

    NASA Astrophysics Data System (ADS)

    Blad, B. L.; Schimel, D. S.

    1992-11-01

    The use of satellite data to study and to understand energy and mass exchanges between the land surface and the atmosphere requires information about various biological processes and how various reflected or emitted spectral radiances are influenced by or manifested in these processes. To obtain such information, studies were conducted by the First ISLSCP Field Experiment (FIFE) surface radiances and biology (SRB) group using surface, near-surface, helicopter, and aircraft measurements. The two primary objectives of this group were to relate radiative fluxes to biophysical parameters and physiological processes and to assess how various management treatments affect important biological processes. This overview paper summarizes the results obtained by various SRB teams working in nine different areas: (1) measurement of bidirectional reflectance and estimation of hemispherical albedo; (2) evaluation of spatial and seasonal variability of spectral reflectance and vegetation indices; (3) determination of surface and radiational factors and their effects on vegetation indices and PAR relationships; (4) use of surface temperatures to estimate sensible heat flux; (5) controls over photosynthesis and respiration at small scales; (6) soil surface CO2 fluxes and grassland carbon budget; (7) landscape variations in controls over gas exchange and energy partitioning; (8) radiometric response of prairie to management and topography; and (9) determination of nitrogen gas exchanges in a tallgrass prairie.

  2. Launch and Landing Effects Ground Operations (LLEGO) Model

    NASA Technical Reports Server (NTRS)

    2008-01-01

    LLEGO is a model for understanding recurring launch and landing operations costs at Kennedy Space Center for human space flight. Launch and landing operations are often referred to as ground processing, or ground operations. Currently, this function is specific to the ground operations for the Space Shuttle Space Transportation System within the Space Shuttle Program. The Constellation system to follow the Space Shuttle consists of the crewed Orion spacecraft atop an Ares I launch vehicle and the uncrewed Ares V cargo launch vehicle. The Constellation flight and ground systems build upon many elements of the existing Shuttle flight and ground hardware, as well as upon existing organizations and processes. In turn, the LLEGO model builds upon past ground operations research, modeling, data, and experience in estimating for future programs. Rather than simply providing estimates, the LLEGO model's main purpose is to improve expenses by relating the complex relationships among functions (ground operations contractor, subcontractors, civil service technical, center management, operations, etc.) to tangible drivers. Drivers include flight system complexity and reliability, as well as operations and supply chain management processes and technology. Together these factors define the operability and potential improvements for any future system, from the most direct to the least direct expenses.

  3. Bayesian inference and assessment for rare-event bycatch in marine fisheries: a drift gillnet fishery case study.

    PubMed

    Martin, Summer L; Stohs, Stephen M; Moore, Jeffrey E

    2015-03-01

    Fisheries bycatch is a global threat to marine megafauna. Environmental laws require bycatch assessment for protected species, but this is difficult when bycatch is rare. Low bycatch rates, combined with low observer coverage, may lead to biased, imprecise estimates when using standard ratio estimators. Bayesian model-based approaches incorporate uncertainty, produce less volatile estimates, and enable probabilistic evaluation of estimates relative to management thresholds. Here, we demonstrate a pragmatic decision-making process that uses Bayesian model-based inferences to estimate the probability of exceeding management thresholds for bycatch in fisheries with < 100% observer coverage. Using the California drift gillnet fishery as a case study, we (1) model rates of rare-event bycatch and mortality using Bayesian Markov chain Monte Carlo estimation methods and 20 years of observer data; (2) predict unobserved counts of bycatch and mortality; (3) infer expected annual mortality; (4) determine probabilities of mortality exceeding regulatory thresholds; and (5) classify the fishery as having low, medium, or high bycatch impact using those probabilities. We focused on leatherback sea turtles (Dermochelys coriacea) and humpback whales (Megaptera novaeangliae). Candidate models included Poisson or zero-inflated Poisson likelihood, fishing effort, and a bycatch rate that varied with area, time, or regulatory regime. Regulatory regime had the strongest effect on leatherback bycatch, with the highest levels occurring prior to a regulatory change. Area had the strongest effect on humpback bycatch. Cumulative bycatch estimates for the 20-year period were 104-242 leatherbacks (52-153 deaths) and 6-50 humpbacks (0-21 deaths). The probability of exceeding a regulatory threshold under the U.S. Marine Mammal Protection Act (Potential Biological Removal, PBR) of 0.113 humpback deaths was 0.58, warranting a "medium bycatch impact" classification of the fishery. No PBR thresholds exist for leatherbacks, but the probability of exceeding an anticipated level of two deaths per year, stated as part of a U.S. Endangered Species Act assessment process, was 0.0007. The approach demonstrated here would allow managers to objectively and probabilistically classify fisheries with respect to bycatch impacts on species that have population-relevant mortality reference points, and declare with a stipulated level of certainty that bycatch did or did not exceed estimated upper bounds.
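
    The authors' full model (zero-inflated Poisson with covariates, fit by MCMC) is beyond a snippet, but the core logic, inferring a rare-event rate from partial observer coverage and computing the probability of exceeding a threshold, can be sketched with a conjugate Gamma-Poisson model; every number below is hypothetical, and the Jeffreys-type prior is an assumption.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical 20-year observer data (all numbers invented)
    observed_sets = 5000            # fishing sets carrying an observer
    observed_take = 6               # rare-event bycatch seen by observers
    total_sets = 25000              # whole-fishery effort (<100% coverage)
    threshold = 2 * 20              # e.g., two animals per year over the period

    # Jeffreys-type Gamma(0.5) prior on the per-set bycatch rate; with a
    # Poisson likelihood the posterior is Gamma(0.5 + k, scale = 1/effort)
    post_rate = rng.gamma(0.5 + observed_take, 1.0 / observed_sets, size=100_000)

    # Predict the unobserved take, add the observed count, compare to threshold
    unobserved = rng.poisson(post_rate * (total_sets - observed_sets))
    total_take = observed_take + unobserved
    print("P(total take > threshold) =", np.mean(total_take > threshold))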

  4. Frequency Domain Analysis of Sensor Data for Event Classification in Real-Time Robot Assisted Deburring

    PubMed Central

    Pappachan, Bobby K; Caesarendra, Wahyu; Tjahjowidodo, Tegoeh; Wijaya, Tomi

    2017-01-01

    Process monitoring using indirect methods relies on the usage of sensors. Using sensors to acquire vital process-related information also presents the problem of big data management and analysis. Due to uncertainty in the frequency of events occurring, a higher sampling rate is often used in real-time monitoring applications to increase the chances of capturing and understanding all possible events related to the process. Advanced signal processing methods are used to further decipher meaningful information from the acquired data. In this research work, the power spectral density (PSD) of sensor data acquired at sampling rates between 40 and 51.2 kHz was calculated, and the correlation between PSD and the completed number of cycles/passes is presented. Here, the progress in the number of cycles/passes is the event this research work intends to classify, and the algorithm used to compute PSD is Welch's estimate method. A comparison between Welch's estimate method and statistical methods is also discussed. A clear correlation was observed using Welch's estimate to classify the number of cycles/passes. The paper also succeeds in distinguishing the vibration signal generated by the spindle from the vibration signal acquired during the finishing process. PMID:28556809
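
    Welch's estimate is available directly in SciPy, so the PSD step described above can be sketched in a few lines; the test signal, tone frequency, and window settings below are illustrative, not the study's.

    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(0)
    fs = 51_200                        # sampling rate (Hz), the study's upper range
    t = np.arange(0, 1.0, 1 / fs)
    # Hypothetical accelerometer signal: spindle tone at 800 Hz plus noise
    x = 0.5 * np.sin(2 * np.pi * 800 * t) + 0.1 * rng.standard_normal(t.size)

    # Welch's estimate: average periodograms of overlapping windowed segments
    f, pxx = welch(x, fs=fs, window="hann", nperseg=4096, noverlap=2048)
    print(f[np.argmax(pxx)])           # dominant frequency, ~800 Hz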

  5. Scope Complexity Options Risks Excursions (SCORE) Version 3.0 Mathematical Description.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Samberson, Jonell Nicole; Shettigar, Subhasini

    The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options. The results of this model allow those considering these options to understand the complexity tradeoffs between proposed warhead options. The core idea of SCORE is to divide a warhead option into a well-defined set of scope elements and then estimate the complexity of each scope element against a well-understood reference system. The uncertainty associated with estimates can also be captured. A weighted summation of the relative complexity of each scope element is used to determine the total complexity of the proposed warhead option or portions of the warhead option (i.e., a National Work Breakdown Structure code). The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC), that has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
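
    The weighted summation at the core of SCORE reduces to a few lines; the scope elements, complexity scores, and weights below are purely illustrative placeholders, not actual SCORE inputs.

    # Illustrative only: scope elements scored against a reference system (1.0),
    # with weights reflecting each element's assumed share of total effort.
    scope = {
        # element (hypothetical): (relative complexity vs. reference, weight)
        "physics_package": (1.4, 0.35),
        "arming_and_fuzing": (1.1, 0.25),
        "qualification_testing": (0.9, 0.25),
        "production_tooling": (1.2, 0.15),
    }

    total = sum(c * w for c, w in scope.values())
    print(f"relative complexity of option: {total:.2f}")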

  6. Enabling Process Improvement and Control in Higher Education Management

    ERIC Educational Resources Information Center

    Bell, Gary; Warwick, Jon; Kennedy, Mike

    2009-01-01

    The emergence of "managerialism" in the governance and direction of UK higher education (HE) institutions has been led by government demands for greater accountability in the quality and cost of universities. There is emerging anecdotal evidence indicating that the estimation performance of HE spreadsheets and regression models is poor.…

  7. ESTIMATION OF INFILTRATION RATE IN THE VADOSE ZONE: APPLICATION OF SELECTED MATHEMATICAL MODELS - VOLUME II

    EPA Science Inventory

    Movement of water into and through the vadose zone is of great importance to the assessment of contaminant fate and transport, agricultural management, and natural resource protection. The process of water movement is very dynamic, changing dramatically over time and space. Inf...

  8. Applying principles from economics to improve the transfer of ecological production estimates in fisheries ecosystem services research

    EPA Science Inventory

    Ecosystem services (ES) represent a way to represent and quantify multiple uses, values as well as connectivity between ecosystem processes and human well-being. Ecosystem-based fisheries management approaches may seek to quantify expected trade-offs in ecosystem services due to ...

  9. Cash on Demand: A Framework for Managing a Cash Liquidity Position.

    ERIC Educational Resources Information Center

    Augustine, John H.

    1995-01-01

    A well-run college or university will seek to accumulate and maintain an appropriate cash reserve or liquidity position. A rigorous analytic process for estimating the size and cost of a liquidity position, based on judgments about the institution's operating risks and opportunities, is outlined. (MSE)

  10. 30 CFR 203.89 - What is in a cost report?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... estimates, or analogous projects. These costs cover: (1) Oil or gas tariffs from pipeline or tankerage; (2) Trunkline and tieback lines; and (3) Gas plant processing for natural gas liquids. (e) Abandonment costs... INTERIOR MINERALS REVENUE MANAGEMENT RELIEF OR REDUCTION IN ROYALTY RATES OCS Oil, Gas, and Sulfur General...

  11. Foliar Moisture Contents of North American Conifers

    Treesearch

    Christopher R. Keyes

    2006-01-01

    Foliar moisture content (FMC) is a primary factor in the canopy ignition process as surface fire transitions to crown fire. In combination with measured stand data and assumed environmental conditions, reasonable estimates of foliar moisture content are necessary to determine and justify silvicultural targets for canopy fuels management strategies. FMC values reported...

  12. The health economic burden that acute and chronic wounds impose on an average clinical commissioning group/health board in the UK.

    PubMed

    Guest, J F; Vowden, K; Vowden, P

    2017-06-02

    To estimate the patterns of care and related resource use attributable to managing acute and chronic wounds among a catchment population of a typical clinical commissioning group (CCG)/health board and corresponding National Health Service (NHS) costs in the UK. This was a sub-analysis of a retrospective cohort analysis of the records of 2000 patients in The Health Improvement Network (THIN) database. Patients' characteristics, wound-related health outcomes and health-care resource use were quantified for an average CCG/health board with a catchment population of 250,000 adults ≥18 years of age, and the corresponding NHS cost of patient management was estimated at 2013/2014 prices. An average CCG/health board was estimated to be managing 11,200 wounds in 2012/2013. Of these, 40% were considered to be acute wounds, 48% chronic and 12% lacking any specific diagnosis. The prevalence of acute, chronic and unspecified wounds was estimated to be growing at the rate of 9%, 12% and 13% per annum respectively. Our analysis indicated that the current rate of wound healing must increase by an average of at least 1% per annum across all wound types in order to slow down the increasing prevalence. Otherwise, an average CCG/health board is predicted to manage ~23,200 wounds per annum by 2019/2020 and is predicted to spend a discounted £50 million (discounting being the process of determining the present value of a payment that is to be received in the future) on managing these wounds and associated comorbidities. Real-world evidence highlights the substantial burden that acute and chronic wounds impose on an average CCG/health board. Strategies are required to improve the accuracy of diagnosis and healing rates.
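
    The discounting operation referred to in the abstract is simple to make concrete; the cash flows and the 3.5% discount rate below are assumptions for illustration, not the study's figures.

    # Present value of a stream of future wound-management costs at a 3.5%
    # annual discount rate (both the stream and the rate are hypothetical).
    def present_value(cashflows, rate=0.035):
        return sum(c / (1 + rate) ** t for t, c in enumerate(cashflows, start=1))

    projected = [7.0e6, 7.6e6, 8.3e6, 9.0e6, 9.8e6, 10.7e6]   # hypothetical GBP/year
    print(f"discounted total: GBP {present_value(projected):,.0f}")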

  13. Estimating the effects of potential climate and land use changes on hydrologic processes of a large agriculture dominated watershed

    NASA Astrophysics Data System (ADS)

    Neupane, Ram P.; Kumar, Sandeep

    2015-10-01

    Land use and climate are two major components that directly influence catchment hydrologic processes, and therefore a better understanding of their effects is crucial for future land use planning and water resources management. We applied the Soil and Water Assessment Tool (SWAT) to assess the effects of potential land use change and climate variability on hydrologic processes of the large, agriculture-dominated Big Sioux River (BSR) watershed located in the North Central region of the USA. Future climate change scenarios were simulated using average output of temperature and precipitation data derived from Special Report on Emissions Scenarios (SRES) (B1, A1B, and A2) for the end-21st century. Land use change was modeled spatially based on the historic long-term pattern of agricultural transformation in the basin, and included the expansion of corn (Zea mays L.) cultivation by 2, 5, and 10%. We estimated higher surface runoff in all land use scenarios, with a maximum increase of 4% when corn cultivation in the basin expanded by 10%. Annual stream discharge was estimated to be higher, with a maximum increase of 72% in SRES-B1, attributable to a 152% higher groundwater contribution in the same scenario. Precipitation increased during the spring season, but summer precipitation decreased substantially in all climate change scenarios. In line with the decreased summer precipitation, discharge of the BSR also decreased, potentially affecting agricultural production through reduced future water availability during the crop growing season in the basin. However, the combined effects of potential land use change and climate variability led to higher annual discharge of the BSR. These estimates can therefore inform future land use planning and water resources management in the basin.

  14. Measuring the efficiency of a healthcare waste management system in Serbia with data envelopment analysis.

    PubMed

    Ratkovic, Branislava; Andrejic, Milan; Vidovic, Milorad

    2012-06-01

    In 2007, the Serbian Ministry of Health initiated specific activities towards establishing a workable model based on the existing administrative framework, which corresponds to the needs of healthcare waste management throughout Serbia. The objective of this research was to identify the reforms carried out and their outcomes by estimating the efficiencies of a sample of 35 healthcare facilities engaged in the process of collection and treatment of healthcare waste, using data envelopment analysis. Twenty-one (60%) of the 35 healthcare facilities analysed were found to be technically inefficient, with an average level of inefficiency of 13%. This fact indicates deficiencies in the process of collection and treatment of healthcare waste and the information obtained and presented in this paper could be used for further improvement and development of healthcare waste management in Serbia.
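
    A minimal sketch of input-oriented CCR data envelopment analysis, the family of methods used here, can be written as one linear program per facility; the inputs, outputs, and facility data below are hypothetical, not the Serbian sample.

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical data: 5 facilities, 2 inputs (staff, budget), 1 output (waste treated)
    X = np.array([[8., 140], [6., 90], [10., 160], [5., 70], [9., 200]])   # inputs
    Y = np.array([[900.], [820], [1000], [650], [880]])                    # outputs
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]

    def ccr_efficiency(k):
        """Input-oriented CCR efficiency of facility k:
        min theta s.t. X'lam <= theta * x_k, Y'lam >= y_k, lam >= 0."""
        c = np.r_[1.0, np.zeros(n)]                 # minimise theta
        A_in = np.c_[-X[k][:, None], X.T]           # sum(lam*x) - theta*x_k <= 0
        A_out = np.c_[np.zeros((s, 1)), -Y.T]       # -sum(lam*y) <= -y_k
        A = np.vstack([A_in, A_out])
        b = np.r_[np.zeros(m), -Y[k]]
        res = linprog(c, A_ub=A, b_ub=b, bounds=[(None, None)] + [(0, None)] * n)
        return res.x[0]

    for k in range(n):
        print(f"facility {k}: efficiency = {ccr_efficiency(k):.3f}")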

  15. The role of remotely-sensed evapotranspiration data in watershed water resources management

    NASA Astrophysics Data System (ADS)

    Shuster, W.; Carroll, M.; Zhang, Y.

    2006-12-01

    Evapotranspiration (ET) is an important component of the watershed hydrologic cycle and a key factor to consider in water resource planning. Partly due to the loss of evaporation pans from the national network in the 1980s because of budget cuts, ET values are not available in many locations in the US, and practitioners often have to rely on climatically averaged regional estimates instead. Several new approaches have been developed for estimating ET via remote sensing. In this study we employ one established approach that allows us to derive ET estimates at 1-km2 resolution on the basis of AVHRR brightness temperature. By applying this method to southwestern Ohio we obtain ET estimates for a 2-km2 partially suburban watershed near Cincinnati, OH. Along with precipitation and surface discharge measurements, these remotely-sensed ET estimates form the basis for determining both long- and short-term water budgets for this watershed. These ET estimates are next compared with regional climatic values on a seasonal basis to examine the potential differences that can be introduced to our conceptualization of watershed processes by considering area-specific ET values. We then discuss implications of this work for more widespread application to watershed management imperatives (e.g., stream ecological health).

  16. The Internet's role in a biodosimetric response to a radiation mass casualty event.

    PubMed

    Sugarman, S L; Livingston, G K; Stricklin, D L; Abbott, M G; Wilkins, R C; Romm, H; Oestreicher, U; Yoshida, M A; Miura, T; Moquet, J E; Di Giorgio, M; Ferrarotto, C; Gross, G A; Christiansen, M E; Hart, C L; Christensen, D M

    2014-05-01

    Response to a large-scale radiological incident could require timely medical interventions to minimize radiation casualties. Proper medical care requires knowing the victim's radiation dose. When physical dosimetry is absent, radiation-specific chromosome aberration analysis can serve to estimate the absorbed dose in order to assist physicians in the medical management of radiation injuries. A mock exercise scenario was presented to six participating biodosimetry laboratories as one individual acutely exposed to 60Co under conditions suggesting whole-body exposure. The individual was not wearing a dosimeter and within 2-3 h of the incident began vomiting. The individual also had other medical symptoms indicating the likelihood of a significant dose. Physicians managing the patient requested a dose estimate in order to develop a treatment plan. Participating laboratories in North and South America, Europe, and Asia were asked to evaluate more than 800 electronic images of metaphase cells from the patient to determine the dicentric yield and calculate a dose estimate with 95% confidence limits. All participants were blind to the physical dose until after submitting their estimates based on the dicentric chromosome assay (DCA). The exercise was successful: the mean biological dose estimate was 1.89 Gy, whereas the actual physical dose was 2 Gy. This is well within the requirements for guidance of medical management. The exercise demonstrated that the most labor-intensive step in the entire process (visual evaluation of images) can be accelerated by taking advantage of worldwide expertise available on the Internet.
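
    Dose estimation from a dicentric yield typically inverts a linear-quadratic calibration curve Y = c + alpha*D + beta*D^2; a hedged sketch follows, with textbook-scale coefficients and made-up counts rather than the exercise's data.

    import math

    # Illustrative linear-quadratic calibration for acute gamma rays; the
    # coefficients are typical textbook-scale values, not the exercise's.
    c, alpha, beta = 0.001, 0.015, 0.060    # dicentrics/cell: Y = c + a*D + b*D^2

    def dose_from_yield(y):
        """Invert Y = c + alpha*D + beta*D^2 for the absorbed dose D (Gy)."""
        return (-alpha + math.sqrt(alpha**2 + 4 * beta * (y - c))) / (2 * beta)

    dicentrics, cells = 230, 800            # hypothetical counts from scored images
    y = dicentrics / cells
    print(f"dose estimate: {dose_from_yield(y):.2f} Gy")    # ~2 Gy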

  17. The Design of an Information Management Program for Headquarters, Department of the Army. Phase 2. Management Summary.

    DTIC Science & Technology

    1980-02-26

    months estimated to be required in some areas), and more direct involvement of information users in long range planning of information requirements (with...most people, there is a definite need to educate the members of the organization as to the implications of the IRM approach. Emphasis should be placed...from information sharing and a coordinated approach. Such an educational process has already begun with the execution of this study, but more must be

  18. Models and metrics for software management and engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1988-01-01

    This paper attempts to characterize and present a state-of-the-art view of several quantitative models and metrics of the software life cycle. These models and metrics can be used to aid in managing and engineering software projects. They deal with various aspects of the software process and product, including resource allocation and estimation, changes and errors, size, complexity and reliability. Some indication is given of the extent to which the various models have been used and the success they have achieved.

  19. Integrating land management into Earth system models: the importance of land use transitions at sub-grid-scale

    NASA Astrophysics Data System (ADS)

    Pongratz, Julia; Wilkenskjeld, Stiig; Kloster, Silvia; Reick, Christian

    2014-05-01

    Recent studies indicate that changes in surface climate and carbon fluxes caused by land management (i.e., modifications of vegetation structure without changing the type of land cover) can be as large as those caused by land cover change. Further, such effects may occur on substantial areas: while about one quarter of the land surface has undergone land cover change, another fifty percent is managed. This calls for integration of management processes in Earth system models (ESMs). This integration increases the importance of awareness and agreement on how to diagnose effects of land use in ESMs to avoid additional model spread and thus unnecessary uncertainties in carbon budget estimates. Process understanding of management effects, their model implementation, as well as data availability on management type and extent pose challenges. In this respect, a significant step forward has been made in the framework of the current IPCC's CMIP5 simulations (Coupled Model Intercomparison Project Phase 5): the climate simulations were driven with the same harmonized land use dataset that, different from most datasets commonly used before, included information on two important types of management: wood harvest and shifting cultivation. However, these new aspects were employed by only some of the CMIP5 models, while most models continued to use the associated land cover maps. Here, we explore the consequences for the carbon cycle of including subgrid-scale land transformations ("gross transitions"), such as shifting cultivation, as an example of the current state of implementation of land management in ESMs. Accounting for gross transitions is expected to increase land use emissions because it represents simultaneous clearing and regrowth of natural vegetation in different parts of the grid cell, reducing standing carbon stocks. This process cannot be captured by prescribing land cover maps ("net transitions"). Using the MPI-ESM we find that ignoring gross transitions underestimates emissions substantially, for historical times by about 40%. Implementation of land management such as gross transitions is a step forward in terms of comprehensiveness of simulated processes. However, it has increased model spread in carbon fluxes, because land management processes have been considered by only a subset of recent ESMs contributing to major projects such as IPCC or the Global Carbon Project. This model spread still causes the net land use flux to be the most uncertain component in the global carbon budget. Other causes have previously been identified as differences in land use datasets, differing types of vegetation model, accounting of nutrient limitation, the inclusion of land use feedbacks (increase in atmospheric CO2 due to land use emissions causing terrestrial carbon uptake), and confusion over whether the net land use flux in ESMs should be reported as instantaneous emissions or should also account for delayed carbon responses and regrowth. These differences explain a factor of 2-6 difference between model estimates and are expected to be further affected by interactions with land management. This highlights the importance of an accurate protocol for future model intercomparisons of carbon fluxes from land cover change and land management to ensure comparison of the same processes and fluxes.

  20. Carbon footprint analysis as a tool for energy and environmental management in small and medium-sized enterprises

    NASA Astrophysics Data System (ADS)

    Giama, E.; Papadopoulos, A. M.

    2018-01-01

    The reduction of carbon emissions has become a top priority in the decision-making process for governments and companies, the strict European legislative framework being a major driving force behind this effort. On the other hand, many companies face difficulties in estimating their footprint and in linking the results derived from environmental evaluation processes with an integrated energy management strategy, which would eventually lead to energy-efficient and cost-effective solutions. The paper highlights the need for companies to establish integrated environmental management practices, with tools such as carbon footprint analysis to monitor the energy performance of production processes. Concepts and methods are analysed, and selected indicators are presented by means of benchmarking, monitoring and reporting the results so that they can be used effectively by the companies. The study is based on data from more than 90 Greek small and medium enterprises, followed by a comprehensive discussion of cost-effective and realistic energy-saving measures.

  1. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e., input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including the estimation of uncertainty propagation, the transferability of methods, and the development of visualization tools, but they also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  2. Fast estimation of space-robots inertia parameters: A modular mathematical formulation

    NASA Astrophysics Data System (ADS)

    Nabavi Chashmi, Seyed Yaser; Malaek, Seyed Mohammad-Bagher

    2016-10-01

    This work aims to propose a new technique that considerably reduces the time and improves the precision needed to identify the inertia parameters (IPs) of a typical autonomous space-robot (ASR). Operations might include capturing an unknown target space-object (TSO), active space-debris removal, or automated in-orbit assemblies. In these operations, generating precise successive commands is essential to the success of the mission. We show how a generalized, repeatable estimation process could play an effective role in managing the operation. With the help of the well-known force-based approach, a new modular formulation has been developed to simultaneously identify the IPs of an ASR while it captures a TSO. The idea is to reorganize the equations with the associated IPs into a modular set of matrices instead of a single matrix representing the overall system dynamics. The devised modular matrix set then facilitates the estimation process. It provides a conjugate linear model in the mass and inertia terms. The new formulation is, therefore, well suited for simultaneous estimation processes using recursive algorithms like RLS. Further enhancements would be needed for cases where the effect of the center of mass location becomes important. Extensive case studies reveal that estimation time is drastically reduced, which in turn paves the way to acquiring better results.
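
    The RLS algorithm the abstract mentions is compact enough to sketch; the regressors and "true" parameters below are placeholders, not the paper's modular space-robot formulation, which organizes this same kind of update over a set of matrices.

    import numpy as np

    def rls_update(theta, P, phi, y, lam=1.0):
        """One recursive-least-squares step for a linear model y = phi @ theta."""
        k = P @ phi / (lam + phi @ P @ phi)        # gain vector
        theta = theta + k * (y - phi @ theta)      # correct estimate with innovation
        P = (P - np.outer(k, phi @ P)) / lam       # update covariance
        return theta, P

    rng = np.random.default_rng(2)
    true = np.array([120.0, 45.0])                 # e.g., a mass and one inertia term
    theta = np.zeros(2)
    P = 1e6 * np.eye(2)                            # large P = uninformative start

    for _ in range(200):
        phi = rng.normal(size=2)                   # regressor built from measured motion
        y = phi @ true + rng.normal(scale=0.1)     # noisy force/torque sample
        theta, P = rls_update(theta, P, phi, y)

    print(theta)    # converges to approximately [120, 45]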

  3. Application of analytical redundancy management to Shuttle crafts. [computerized simulation of microelectronic implementation

    NASA Technical Reports Server (NTRS)

    Montgomery, R. C.; Tabak, D.

    1979-01-01

    The study involves the bank of filters approach to analytical redundancy management since this is amenable to microelectronic implementation. Attention is given to a study of the UD factorized filter to determine if it gives more accurate estimates than the standard Kalman filter when data processing word size is reduced. It is reported that, as the word size is reduced, the effect of modeling error dominates the filter performance of the two filters. However, the UD filter is shown to maintain a slight advantage in tracking performance. It is concluded that because of the UD filter's stability in the serial processing mode, it remains the leading candidate for microelectronic implementation.

  4. Contributions to a thermodynamic model of Earth systems on rivers

    NASA Technical Reports Server (NTRS)

    Iberall, A. S.

    1981-01-01

    A model for the chemical (ground water) erosion and physical (bed load, including sedimentation) erosion of the land was developed. The rudiments of the relation between a regulated sea level (for the past 2500 million years) and the episodic rise and erosion of continents was examined to obtain some notion of the process scalings. Major process scales of about 200 years, 100,000 years, 3 My, 40 My, 300 My were estimated. It was suggested that a program targeted at ecological management would have to become familiar with processes at the first four scales (i.e., from glaciation to the horizontal movement of continents). The study returns to the initial premise. In order to understand and manage Earth biology (life, and modern man), it is necessary minimally to pursue systems' biogeology at a considerable number of process space and time scales via their irreversible thermodynamic couplings.

  5. Activity-based costing via an information system: an application created for a breast imaging center.

    PubMed

    Hawkins, H; Langer, J; Padua, E; Reaves, J

    2001-06-01

    Activity-based costing (ABC) is a process that enables the estimation of the cost of producing a product or service. More accurate than traditional charge-based approaches, it emphasizes analysis of processes, and more specific identification of both direct and indirect costs. This accuracy is essential in today's healthcare environment, in which managed care organizations necessitate responsible and accountable costing. However, to be successfully utilized, it requires time, effort, expertise, and support. Data collection can be tedious and expensive. By integrating ABC with information management (IM) and systems (IS), organizations can take advantage of the process orientation of both, extend and improve ABC, and decrease resource utilization for ABC projects. In our case study, we have examined the process of a multidisciplinary breast center. We have mapped the constituent activities and established cost drivers. This information has been structured and included in our information system database for subsequent analysis.

  6. PD_Manager: an mHealth platform for Parkinson's disease patient management.

    PubMed

    Tsiouris, Kostas M; Gatsios, Dimitrios; Rigas, George; Miljkovic, Dragana; Koroušić Seljak, Barbara; Bohanec, Marko; Arredondo, Maria T; Antonini, Angelo; Konitsiotis, Spyros; Koutsouris, Dimitrios D; Fotiadis, Dimitrios I

    2017-06-01

    PD_Manager is a mobile health platform designed to cover most of the aspects regarding the management of Parkinson's disease (PD) in a holistic approach. Patients are unobtrusively monitored using commercial wrist and insole sensors paired with a smartphone, to automatically estimate the severity of most of the PD motor symptoms. Besides motor symptoms monitoring, the patient's mobile application also provides various non-motor self-evaluation tests for assessing cognition, mood and nutrition to motivate them in becoming more active in managing their disease. All data from the mobile application and the sensors is transferred to a cloud infrastructure to allow easy access for clinicians and further processing. Clinicians can access this information using a separate mobile application that is specifically designed for their respective needs to provide faster and more accurate assessment of PD symptoms that facilitate patient evaluation. Machine learning techniques are used to estimate symptoms and disease progression trends to further enhance the provided information. The platform is also complemented with a decision support system (DSS) that notifies clinicians for the detection of new symptoms or the worsening of existing ones. As patient's symptoms are progressing, the DSS can also provide specific suggestions regarding appropriate medication changes.

  7. PD_Manager: an mHealth platform for Parkinson's disease patient management

    PubMed Central

    Gatsios, Dimitrios; Rigas, George; Miljkovic, Dragana; Koroušić Seljak, Barbara; Bohanec, Marko; Arredondo, Maria T.; Antonini, Angelo; Konitsiotis, Spyros; Koutsouris, Dimitrios D.

    2017-01-01

    PD_Manager is a mobile health platform designed to cover most of the aspects regarding the management of Parkinson's disease (PD) in a holistic approach. Patients are unobtrusively monitored using commercial wrist and insole sensors paired with a smartphone, to automatically estimate the severity of most of the PD motor symptoms. Besides motor symptoms monitoring, the patient's mobile application also provides various non-motor self-evaluation tests for assessing cognition, mood and nutrition to motivate them in becoming more active in managing their disease. All data from the mobile application and the sensors is transferred to a cloud infrastructure to allow easy access for clinicians and further processing. Clinicians can access this information using a separate mobile application that is specifically designed for their respective needs to provide faster and more accurate assessment of PD symptoms that facilitate patient evaluation. Machine learning techniques are used to estimate symptoms and disease progression trends to further enhance the provided information. The platform is also complemented with a decision support system (DSS) that notifies clinicians for the detection of new symptoms or the worsening of existing ones. As patient's symptoms are progressing, the DSS can also provide specific suggestions regarding appropriate medication changes. PMID:28706727

  8. Using expert opinion to prioritize impacts of climate change on sea turtles' nesting grounds.

    PubMed

    Fuentes, M M P B; Cinner, J E

    2010-12-01

    Managers and conservationists often need to prioritize which impacts from climate change to deal with from a long list of threats. However, data which allow comparison of the relative impact of climatic threats for decision-making are often unavailable. This is the case for the management of sea turtles in the face of climate change. The terrestrial life stages of sea turtles can be negatively impacted by various climatic processes, such as sea level rise, altered cyclonic activity, and increased sand temperatures. However, no study has systematically investigated the relative impact of each of these climatic processes, making it challenging for managers to prioritize their decisions and resources. To address this, we offer a systematic method for eliciting expert knowledge to estimate the relative impact of climatic processes on sea turtles' terrestrial reproductive phase. For this we used as an example the world's largest population of green sea turtles, in the northern Great Barrier Reef (nGBR), and asked 22 scientists and managers to answer a paper-based survey with a series of pair-wise comparison matrices that compared the anticipated impacts from each climatic process. Both scientists and managers agreed that increased sand temperature will likely cause the most threat to the reproductive output of the nGBR green turtle population, followed by sea level rise, then altered cyclonic activity. The methodology used proved useful to determine the relative impact of the selected climatic processes on sea turtles' reproductive output and provided valuable information for decision-making. Thus, the methodological approach can potentially be applied to other species and ecosystems of management concern. Copyright © 2009 Elsevier Ltd. All rights reserved.
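
    Deriving priority weights from a pair-wise comparison matrix is classically done with the principal eigenvector (as in the analytic hierarchy process); whether the authors used exactly this aggregation is not stated, and the judgments in the matrix below are invented.

    import numpy as np

    # Illustrative pair-wise comparison matrix (Saaty 1-9 scale) for three
    # climatic processes; A[i, j] > 1 means process i is judged more threatening.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 3.0],
                  [1/5, 1/3, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    principal = eigvecs[:, np.argmax(eigvals.real)].real
    weights = principal / principal.sum()           # relative impact weights

    for name, w in zip(["sand temperature", "sea level rise", "cyclones"], weights):
        print(f"{name}: {w:.2f}")

    # Consistency ratio CI/RI, with RI = 0.58 for a 3x3 matrix; < 0.1 is acceptable
    ci = (eigvals.real.max() - 3) / (3 - 1)
    print("consistency ratio:", ci / 0.58)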

  9. Inverse Modeling of Tropospheric Methane Constrained by 13C Isotope in Methane

    NASA Astrophysics Data System (ADS)

    Mikaloff Fletcher, S. E.; Tans, P. P.; Bruhwiler, L. M.

    2001-12-01

    Understanding the budget of methane is crucial to predicting climate change and managing earth's carbon reservoirs. Methane is responsible for approximately 15% of the anthropogenic greenhouse forcing and has a large impact on the oxidative capacity of Earth's atmosphere due to its reaction with hydroxyl radical. At present, many of the sources and sinks of methane are poorly understood, due in part to the large spatial and temporal variability of the methane flux. Model calculations of methane mixing ratios using most process-based source estimates typically over-predict the inter-hemispheric gradient of atmospheric methane. Inverse models, which estimate trace gas budgets by using observations of atmospheric mixing ratios and transport models to estimate sources and sinks, have been used to incorporate features of the atmospheric observations into methane budgets. While inverse models of methane generally tend to find a decrease in northern hemisphere sources and an increase in southern hemisphere sources relative to process-based estimates, no inverse study has definitively associated the inter-hemispheric gradient difference with a specific source process or group of processes. In this presentation, observations of isotopic ratios of 13C in methane and isotopic signatures of methane source processes are used in conjunction with an inverse model of methane to further constrain the source estimates of methane. In order to investigate the advantages of incorporating 13C, the TM3 three-dimensional transport model was used. The methane and carbon dioxide measurements used are from a cooperative international effort, the Cooperative Air Sampling Network, led by the Climate Monitoring and Diagnostics Laboratory (CMDL) at the National Oceanic and Atmospheric Administration (NOAA). Experiments using model calculations based on process-based source estimates show that the inter-hemispheric gradient of δ13CH4 is not reproduced by these source estimates, showing that the addition of observations of δ13CH4 should provide unique insight into the methane problem.

  10. Natural-technological risk assessment and management

    NASA Astrophysics Data System (ADS)

    Burova, Valentina; Frolova, Nina

    2016-04-01

    EM-DAT statistics on human impacts and economic damage in the first semester of 2015 are the highest since 2011: 41% of disasters were floods, responsible for 39% of economic damage, and 7% of events were earthquakes, responsible for 59% of the total death toll. This suggests that disaster risk assessment and management still need to be improved and remain the principal issue in related national and international programs. The paper investigates risk assessment and management practice in the Russian Federation at different levels. A method is proposed to identify territories characterized by integrated natural-technological hazard. Maps of the Russian Federation zoned according to the integrated natural-technological hazard level are presented, as well as the procedure for updating the integrated hazard level taking into account the activity of individual processes. Special attention is paid to databases on the consequences of past natural and technological processes, which are used for verification of current hazard estimates. Examples of natural-technological risk zoning for the country and for some regions are presented. Different output risk indexes, both social and economic, are estimated taking into account the requirements of end-users. In order to increase the safety of the population of the Russian Federation, trans-boundary hazards are also taken into account.

  11. Administrative Costs Associated With Physician Billing and Insurance-Related Activities at an Academic Health Care System.

    PubMed

    Tseng, Phillip; Kaplan, Robert S; Richman, Barak D; Shah, Mahek A; Schulman, Kevin A

    2018-02-20

    Administrative costs in the US health care system are an important component of total health care spending, and a substantial proportion of these costs are attributable to billing and insurance-related activities. To examine and estimate the administrative costs associated with physician billing activities in a large academic health care system with a certified electronic health record system. This study used time-driven activity-based costing. Interviews were conducted with 27 health system administrators and 34 physicians in 2016 and 2017 to construct a process map charting the path of an insurance claim through the revenue cycle management process. These data were used to calculate the cost for each major billing and insurance-related activity and were aggregated to estimate the health system's total cost of processing an insurance claim. Estimated time required to perform billing and insurance-related activities, based on interviews with management personnel and physicians. Estimated billing and insurance-related costs for 5 types of patient encounters: primary care visits, discharged emergency department visits, general medicine inpatient stays, ambulatory surgical procedures, and inpatient surgical procedures. Estimated processing time and total costs for billing and insurance-related activities were 13 minutes and $20.49 for a primary care visit, 32 minutes and $61.54 for a discharged emergency department visit, 73 minutes and $124.26 for a general inpatient stay, 75 minutes and $170.40 for an ambulatory surgical procedure, and 100 minutes and $215.10 for an inpatient surgical procedure. Of these totals, time and costs for activities carried out by physicians were estimated at a median of 3 minutes or $6.36 for a primary care visit, 3 minutes or $10.97 for an emergency department visit, 5 minutes or $13.29 for a general inpatient stay, 15 minutes or $51.20 for an ambulatory surgical procedure, and 15 minutes or $51.20 for an inpatient surgical procedure. Of professional revenue, professional billing costs were estimated to represent 14.5% for primary care visits, 25.2% for emergency department visits, 8.0% for general medicine inpatient stays, 13.4% for ambulatory surgical procedures, and 3.1% for inpatient surgical procedures. In a time-driven activity-based costing study in a large academic health care system with a certified electronic health record system, the estimated costs of billing and insurance-related activities ranged from $20 for a primary care visit to $215 for an inpatient surgical procedure. Knowledge of how specific billing and insurance-related activities contribute to administrative costs may help inform policy solutions to reduce these expenses.
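
    Time-driven activity-based costing boils down to multiplying a capacity cost rate by the minutes each resource spends on an activity; the rates and times below are illustrative stand-ins, not the study's interview-derived inputs.

    # Time-driven activity-based costing: cost = capacity cost rate x minutes.
    # Rates and times below are hypothetical, not the study's actual inputs.
    cost_rates = {"physician": 3.50, "coder": 0.55, "billing_clerk": 0.40}   # $/min

    def encounter_cost(activity_minutes):
        """activity_minutes: {resource: minutes spent on billing activities}."""
        return sum(cost_rates[r] * m for r, m in activity_minutes.items())

    primary_care_visit = {"physician": 3, "coder": 4, "billing_clerk": 6}
    print(f"billing cost per visit: ${encounter_cost(primary_care_visit):.2f}")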

  12. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

    The speed with which the processes used in the software development field have changed makes forecasting the overall costs of a software project very difficult. Many researchers have considered this task unachievable, but there is a group of scientists for whom it can be solved using already known mathematical methods (e.g., multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building a cost estimation model for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. In the first part of the paper, a summary of the major achievements in the research area of software cost estimation models is presented, together with a description of the existing software development process models. In the last part, a basic mathematical model based on genetic programming is proposed, including a description of the chosen fitness function and chromosome representation. The perspective of the described model is linked with the current reality of software development, taking the software product life cycle as a basis along with the current challenges and innovations in the software development area. Based on the author's experience and the analysis of existing models and product life cycles, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
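
    A hedged sketch of the calibration idea: the basic COCOMO 81 form E = a * KLOC^b can be fit to effort data with a simple evolutionary search (a stand-in for the paper's genetic programming); the dataset and hyperparameters below are invented.

    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical (KLOC, person-month) pairs standing in for a PROMISE-style dataset
    kloc = np.array([10., 25, 46, 113, 200])
    effort = np.array([24., 70, 130, 390, 720])

    def predict(params, x):
        a, b = params
        return a * x ** b                    # basic COCOMO form: E = a * KLOC^b

    def fitness(params):                     # mean magnitude of relative error (MMRE)
        return np.mean(np.abs(predict(params, kloc) - effort) / effort)

    # Toy evolutionary search: mutate a population, keep the fitter half
    pop = rng.uniform([1.0, 0.8], [4.0, 1.3], size=(40, 2))
    for _ in range(200):
        children = pop + rng.normal(scale=0.02, size=pop.shape)
        both = np.vstack([pop, children])
        pop = both[np.argsort([fitness(p) for p in both])[:40]]

    print("calibrated (a, b):", pop[0], "MMRE:", fitness(pop[0]))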

  13. System and method for motor speed estimation of an electric motor

    DOEpatents

    Lu, Bin [Kenosha, WI; Yan, Ting [Brookfield, WI; Luebke, Charles John [Sussex, WI; Sharma, Santosh Kumar [Viman Nagar, IN

    2012-06-19

    A system and method for a motor management system includes a computer-readable storage medium and a processing unit. The processing unit is configured to determine a voltage value of a voltage input to an alternating current (AC) motor, determine a frequency value of at least one of a voltage input and a current input to the AC motor, determine a load value from the AC motor, and access a set of motor nameplate data, where the set of motor nameplate data includes a rated power, a rated speed, a rated frequency, and a rated voltage of the AC motor. The processing unit is also configured to estimate a motor speed based on the voltage value, the frequency value, the load value, and the set of nameplate data, and to store the motor speed on the computer-readable storage medium.
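
    One common slip-based approximation, not necessarily the patented method, estimates speed from nameplate data by scaling rated slip with load and inversely with voltage squared; the nameplate values below are hypothetical.

    # Hedged sketch of a generic slip-based speed estimate (not the patent's claims).
    def estimate_speed(v, f, load_kw, nameplate):
        poles = nameplate["poles"]
        sync = 120.0 * f / poles                        # synchronous speed (rpm)
        sync_rated = 120.0 * nameplate["rated_hz"] / poles
        slip_rated = sync_rated - nameplate["rated_rpm"]
        # Slip assumed proportional to load and to 1/V^2 (a common approximation)
        slip = (slip_rated * (load_kw / nameplate["rated_kw"])
                * (v / nameplate["rated_v"]) ** -2)
        return sync - slip

    plate = {"rated_kw": 15.0, "rated_rpm": 1760, "rated_hz": 60.0,
             "rated_v": 460.0, "poles": 4}
    print(estimate_speed(v=450.0, f=60.0, load_kw=10.0, nameplate=plate))   # ~1772 rpm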

  14. Multinomial N-mixture models improve the applicability of electrofishing for developing population estimates of stream-dwelling Smallmouth Bass

    USGS Publications Warehouse

    Mollenhauer, Robert; Brewer, Shannon K.

    2017-01-01

    Failure to account for variable detection across survey conditions constrains progressive stream ecology and can lead to erroneous stream fish management and conservation decisions. In addition to variable detection’s confounding long-term stream fish population trends, reliable abundance estimates across a wide range of survey conditions are fundamental to establishing species–environment relationships. Despite major advancements in accounting for variable detection when surveying animal populations, these approaches remain largely ignored by stream fish scientists, and CPUE remains the most common metric used by researchers and managers. One notable advancement for addressing the challenges of variable detection is the multinomial N-mixture model. Multinomial N-mixture models use a flexible hierarchical framework to model the detection process across sites as a function of covariates; they also accommodate common fisheries survey methods, such as removal and capture–recapture. Effective monitoring of stream-dwelling Smallmouth Bass Micropterus dolomieu populations has long been challenging; therefore, our objective was to examine the use of multinomial N-mixture models to improve the applicability of electrofishing for estimating absolute abundance. We sampled Smallmouth Bass populations by using tow-barge electrofishing across a range of environmental conditions in streams of the Ozark Highlands ecoregion. Using an information-theoretic approach, we identified effort, water clarity, wetted channel width, and water depth as covariates that were related to variable Smallmouth Bass electrofishing detection. Smallmouth Bass abundance estimates derived from our top model consistently agreed with baseline estimates obtained via snorkel surveys. Additionally, confidence intervals from the multinomial N-mixture models were consistently more precise than those of unbiased Petersen capture–recapture estimates due to the dependency among data sets in the hierarchical framework. We demonstrate the application of this contemporary population estimation method to address a longstanding stream fish management issue. We also detail the advantages and trade-offs of hierarchical population estimation methods relative to CPUE and estimation methods that model each site separately.
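
    The single-site core of a removal-type multinomial model can be sketched as a profile likelihood over abundance N and per-pass capture probability p; the hierarchical, covariate-driven detection model the authors fit builds on this same likelihood. Counts below are hypothetical, and the constant multinomial term in the counts is dropped.

    import numpy as np
    from scipy.special import gammaln

    y = np.array([57, 29, 17])             # hypothetical removal counts over 3 passes
    n, K = y.sum(), y.size

    def nll(N, p):
        """Negative log-likelihood (up to a constant) of a K-pass removal model."""
        pi = p * (1 - p) ** np.arange(K)   # capture probability on each pass
        p0 = (1 - p) ** K                  # probability of never being captured
        return -(gammaln(N + 1) - gammaln(N - n + 1)
                 + np.sum(y * np.log(pi)) + (N - n) * np.log(p0))

    # Profile the likelihood over a grid of N and p
    Ns = np.arange(n, 4 * n)
    ps = np.linspace(0.05, 0.95, 181)
    grid = np.array([[nll(N, p) for p in ps] for N in Ns])
    i, j = np.unravel_index(grid.argmin(), grid.shape)
    print(f"N_hat = {Ns[i]}, p_hat = {ps[j]:.2f}")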

  15. Estimating costs in the economic evaluation of medical technologies.

    PubMed

    Luce, B R; Elixhauser, A

    1990-01-01

    The complexities and nuances of evaluating the costs associated with providing medical technologies are often underestimated by analysts engaged in economic evaluations. This article describes the theoretical underpinnings of cost estimation, emphasizing the importance of accounting for opportunity costs and marginal costs. The various types of costs that should be considered in an analysis are described; a listing of specific cost elements may provide a helpful guide to analysis. The process of identifying and estimating costs is detailed, and practical recommendations for handling the challenges of cost estimation are provided. The roles of sensitivity analysis and discounting are characterized, as are determinants of the types of costs to include in an analysis. Finally, common problems facing the analyst are enumerated with suggestions for managing these problems.

  16. Cost-to-Complete Estimates and Financial Reporting for the Management of the Iraq Relief and Reconstruction Fund

    DTIC Science & Technology

    2005-07-26

    Audit report: Cost-to-Complete Estimates and Financial Reporting for the Management of the Iraq Relief and Reconstruction Fund. Addressed to the Director, Iraq Reconstruction Management Office, and the Director, Project and Contracting Office.

  17. Biogeochemical modelling vs. tree-ring data - comparison of forest ecosystem productivity estimates

    NASA Astrophysics Data System (ADS)

    Zorana Ostrogović Sever, Maša; Barcza, Zoltán; Hidy, Dóra; Paladinić, Elvis; Kern, Anikó; Marjanović, Hrvoje

    2017-04-01

    Forest ecosystems are sensitive to environmental changes as well as human-induced disturbances; therefore, process-based models with integrated management modules represent a valuable tool for estimating and forecasting forest ecosystem productivity under changing conditions. The biogeochemical model Biome-BGC simulates carbon, nitrogen and water fluxes, and it is widely used for different terrestrial ecosystems. It was modified and parameterised by many researchers in the past to meet specific local conditions. In this research, we used the recently published improved version of the model, Biome-BGCMuSo (BBGCMuSo), with a multilayer soil module and an integrated management module. The aim of our research is to validate modelled estimates of forest ecosystem productivity (NPP) from the BBGCMuSo model against observed productivity estimated from an extensive dataset of tree-rings. The research was conducted in two distinct forest complexes of managed Pedunculate oak in SE Europe (Croatia), namely the Pokupsko basin and the Spačva basin. First, we parameterized the BBGCMuSo model at a local level using eddy-covariance (EC) data from the Jastrebarsko EC site. The parameterized model was then used for the assessment of productivity at a larger scale. Results of the NPP assessment with BBGCMuSo are compared with NPP estimated from tree-ring data taken from trees on over 100 plots in both forest complexes. Keywords: Biome-BGCMuSo, forest productivity, model parameterization, NPP, Pedunculate oak

  18. Modelling the energy costs of the wastewater treatment process: The influence of the aging factor.

    PubMed

    Castellet-Viciano, Lledó; Hernández-Chover, Vicent; Hernández-Sancho, Francesc

    2018-06-01

    Wastewater treatment plants (WWTPs) are aging, and the effects on the treatment process become more evident over time. Due to the deterioration of the facilities, the efficiency of the treatment process decreases gradually. Within this framework, this paper demonstrates that the energy consumption of WWTPs increases with time and that the effect differs with facility size. Accordingly, the paper aims to develop a dynamic energy cost function capable of predicting the future energy cost of the process. The time variable is used to introduce aging effects into the energy cost estimation in order to increase its accuracy. For this purpose, the evolution of energy costs is assessed and modelled for a group of WWTPs using the methodology of cost functions. The results will be useful to the managers of the facilities in the decision-making process. Copyright © 2017 Elsevier B.V. All rights reserved.
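
    A minimal sketch of what a dynamic energy cost function with an aging term could look like; the functional form and all parameter values are illustrative assumptions, not the paper's fitted model:

        # Energy cost as a scale economy in treated volume times an aging drift.
        def energy_cost(volume_m3, age_years, a=0.35, b=0.85, aging=0.02):
            return a * volume_m3 ** b * (1 + aging) ** age_years

        print(energy_cost(1_000_000, age_years=0))   # new plant
        print(energy_cost(1_000_000, age_years=15))  # same plant after 15 years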

  19. An Estimation of Construction and Demolition Debris in Seoul, Korea: Waste Amount, Type, and Estimating Model.

    PubMed

    Seo, Seongwon; Hwang, Yongwoo

    1999-08-01

    Construction and demolition (C&D) debris is generated at the site of various construction activities. However, the amount of the debris is usually so large that it is necessary to estimate the amount of C&D debris as accurately as possible for effective waste management and control in urban areas. In this paper, an effective estimation method using a statistical model was proposed. The estimation process was composed of five steps: estimation of the life span of buildings; estimation of the floor area of buildings to be constructed and demolished; calculation of individual intensity units of C&D debris; and estimation of the future C&D debris production. This method was also applied in the city of Seoul as an actual case, and the estimated amount of C&D debris in Seoul in 2021 was approximately 24 million tons. Of this total amount, 98% was generated by demolition, and the main components of debris were concrete and brick.
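
    The intensity-unit step described above amounts to multiplying floor area by a per-area debris generation factor; a worked sketch with hypothetical figures (not the Seoul data):

        # Debris = floor area to be demolished x per-area generation intensity.
        floor_area_m2 = {"residential": 120_000, "commercial": 80_000}
        intensity_t_per_m2 = {"residential": 1.0, "commercial": 1.3}   # tons per m2

        total_debris = sum(floor_area_m2[k] * intensity_t_per_m2[k]
                           for k in floor_area_m2)
        print(f"{total_debris:,.0f} tons")  # 224,000 tons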

  20. Why are You Late?: Investigating the Role of Time Management in Time-Based Prospective Memory

    PubMed Central

    Waldum, Emily R; McDaniel, Mark A.

    2016-01-01

    Time-based prospective memory (TBPM) tasks are those that are to be performed at a specific future time. Contrary to typical laboratory TBPM tasks (e.g., "hit the z key every 5 minutes"), many real-world TBPM tasks require more complex time-management processes. For instance, to attend an appointment on time, one must estimate the duration of the drive to the appointment and then utilize this estimate to create and execute a secondary TBPM intention (e.g., "I need to start driving by 1:30 to make my 2:00 appointment on time"). Under- and overestimates of drive time can lead to inefficient TBPM performance, with the former leading to missed appointments and the latter to long stints in the waiting room. Despite the common occurrence of complex TBPM tasks in everyday life, to date, no studies have investigated how components of time management, including time estimation, affect behavior in such complex TBPM tasks. Therefore, the current study aimed to investigate timing biases in both older and younger adults and, further, to determine how such biases, along with additional time-management components including planning and plan fidelity, influence complex TBPM performance. Results suggest for the first time that younger and older adults do not always utilize similar timing strategies and, as a result, can produce differential timing biases under the exact same environmental conditions. These timing biases, in turn, play a vital role in how efficiently both younger and older adults perform a later TBPM task that requires them to utilize their earlier time estimate. PMID:27336325

  1. Calculating cost savings in utilization management.

    PubMed

    MacMillan, Donna

    2014-01-01

    A major motivation for managing the utilization of laboratory testing is to reduce the cost of medical care. For this reason, it is important to understand the basic principles of cost accounting in the clinical laboratory. The process of laboratory testing includes three distinct components, termed the pre-analytic, analytic and post-analytic phases. Utilization management efforts may impact the cost structure of these three phases in different ways depending on the specific details of the initiative. Estimates of cost savings resulting from utilization management programs reported in the literature have often been fundamentally flawed due to a failure to understand basic concepts such as the difference between laboratory costs and charges and the impact of reducing laboratory test volumes on the average versus marginal cost structure in the laboratory. This article provides an overview of basic cost accounting principles in the clinical laboratory, including both job order and process cost accounting. Specific examples are presented to illustrate these concepts in various scenarios. © 2013.
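
    The average-versus-marginal distinction discussed above can be made concrete with a worked sketch; all cost figures are hypothetical:

        # Fixed costs (instrument leases, labor) do not fall when test volume
        # falls; only variable costs (reagents, consumables) do.
        fixed_cost = 100_000          # annual fixed laboratory costs
        variable_cost_per_test = 2.0  # reagent/consumable cost per test
        volume = 50_000

        average_cost = (fixed_cost + variable_cost_per_test * volume) / volume
        print(average_cost)           # 4.0 per test

        # Savings from eliminating 10,000 tests accrue only at the marginal cost:
        savings_naive = 10_000 * average_cost           # 40,000 (overstated)
        savings_real = 10_000 * variable_cost_per_test  # 20,000
        print(savings_naive, savings_real)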

  2. Applying the metro map to software development management

    NASA Astrophysics Data System (ADS)

    Aguirregoitia, Amaia; Dolado, J. Javier; Presedo, Concepción

    2010-01-01

    This paper presents MetroMap, a new graphical representation model for controlling and managing the software development process. MetroMap uses metaphors and visual representation techniques to explore several key indicators in order to support problem detection and resolution. The resulting visualization addresses diverse management tasks, such as tracking of deviations from the plan, analysis of patterns of failure detection and correction, overall assessment of change management policies, and estimation of product quality. The proposed visualization uses a metro map metaphor along with various interactive techniques to represent information concerning the software development process and to deal efficiently with multivariate visual queries. Finally, the paper shows the implementation of the tool in JavaFX with data from a real project, and the results of testing the tool with that data and with users attempting several information retrieval tasks. The conclusion presents the results of analyzing user response time and efficiency using the MetroMap visualization system. The utility of the tool was positively evaluated.

  3. Important considerations for feasibility studies in physical activity research involving persons with multiple sclerosis: a scoping systematic review and case study.

    PubMed

    Learmonth, Yvonne C; Motl, Robert W

    2018-01-01

    Much research has been undertaken to establish the important benefits of physical activity in persons with multiple sclerosis (MS). There is disagreement regarding the strength of this research, perhaps because the majority of studies on physical activity and its benefits have not undergone initial and systematic feasibility testing. We aim to address the feasibility processes that have been examined within the context of physical activity interventions in MS. A systematic scoping review was conducted based on a literature search of five databases to identify feasibility processes described in preliminary studies of physical activity in MS. We read and extracted methodology from each study based on the following feasibility metrics: process (e.g. recruitment), resource (e.g. monetary costs), management (e.g. personnel time requirements) and scientific outcomes (e.g. clinical/participant reported outcome measures). We illustrate the use of the four feasibility metrics within a randomised controlled trial of a home-based exercise intervention in persons with MS. Twenty-five studies were identified. Resource feasibility (e.g. time and resources) and scientific outcomes feasibility (e.g. clinical outcomes) methodologies were applied and described in many studies; however, these metrics have not been systematically addressed. Metrics related to process feasibility (e.g. recruitment) and management feasibility (e.g. human and data management) are not well described within the literature. Our case study successfully enabled us to address the four feasibility metrics, and we provide new information on management feasibility (i.e., estimating data completeness and data entry) and scientific outcomes feasibility (i.e., determining the appropriateness of data collection materials). Our review highlights the existing research and provides a case study which assesses important metrics of study feasibility. This review serves as a clarion call for feasibility trials that will substantially strengthen the foundation of research on exercise in MS.

  4. The indicator performance estimate approach to determining acceptable wilderness conditions

    NASA Astrophysics Data System (ADS)

    Hollenhorst, Steven; Gardner, Lisa

    1994-11-01

    Using data from a study conducted in the Cranberry Wilderness Area of West Virginia, United States, this paper describes how a modified importance—performance approach can be used to prioritize wilderness indicators and determine how much change from the pristine is acceptable. The approach uses two key types of information: (1) indicator importance, or visitor opinion as to which wilderness indicators have the greatest influence on their experience, and (2) management performance, or the extent to which actual indicator conditions exceed or are within visitor expectations. Performance was represented by calculating indicator performance estimates (IPEs), as defined by standardized differences between actual conditions and visitor preferences for each indicator. The results for each indicator are then presented graphically on a four-quadrant matrix for objective interpretation. Each quadrant represents a management response: keep up the good work, concentrate here, low priority, or possible overkill. The technique allows managers to more systematically and effectively utilize information routinely collected during the limits of acceptable change wilderness planning process.
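
    A minimal sketch of the IPE quadrant logic described above; indicator names, values, and cutoffs are hypothetical:

        def quadrant(importance, ipe, imp_cut=0.0, ipe_cut=0.0):
            """Place an indicator on the importance-performance matrix."""
            good = ipe >= ipe_cut          # conditions at or better than preferred
            if importance >= imp_cut:
                return "keep up the good work" if good else "concentrate here"
            return "possible overkill" if good else "low priority"

        # (standardized importance, IPE scored so negative = worse than preferred)
        indicators = {
            "encounters with other groups per day": (1.2, -0.8),
            "amount of litter at campsites": (-0.4, 0.6),
        }
        for name, (imp, ipe) in indicators.items():
            print(f"{name}: {quadrant(imp, ipe)}")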

  5. A study of the additional costs of dispensing workers' compensation prescriptions.

    PubMed

    Schafermeyer, Kenneth W

    2007-03-01

    Although there is a significant amount of additional work involved in dispensing workers' compensation prescriptions, these costs have not been quantified. A study of the additional costs to dispense a workers' compensation prescription is needed to measure actual costs and to help determine the reasonableness of reimbursement for prescriptions dispensed under workers' compensation programs. The purpose of this study was to determine the minimum additional time and costs required to dispense workers' compensation prescriptions in Texas. A convenience sample of 30 store-level pharmacy staff members involved in submitting and processing prescription claims for the Texas Mutual workers' compensation program were interviewed by telephone. Data collected to determine the additional costs of dispensing a workers' compensation prescription included (1) the amount of additional time and personnel costs required to dispense and process an average workers' compensation prescription claim, (2) the difference in time required for a new versus a refilled prescription, (3) overhead costs for processing workers' compensation prescription claims by experienced experts at a central processing facility, (4) carrying costs for workers' compensation accounts receivable, and (5) bad debts due to uncollectible workers' compensation claims. The median of the sample pharmacies' additional costs for dispensing a workers' compensation prescription was estimated to be at least $9.86 greater than for a cash prescription. This study shows that the estimated costs for workers' compensation prescriptions were significantly higher than for cash prescriptions. These costs are probably much more than most employers, workers' compensation payers, and pharmacy managers would expect. It is recommended that pharmacy managers should estimate their own costs and compare these costs to actual reimbursement when considering the reasonableness of workers' compensation prescriptions and whether to accept these prescriptions.

  6. Vehicle Health Management Communications Requirements for AeroMACS

    NASA Technical Reports Server (NTRS)

    Kerczewski, Robert J.; Clements, Donna J.; Apaza, Rafael D.

    2012-01-01

    As the development of standards for the aeronautical mobile airport communications system (AeroMACS) progresses, the process of identifying and quantifying appropriate uses for the system is progressing. In addition to defining important elements of AeroMACS standards, identifying the system's uses affects AeroMACS bandwidth requirements. Although an initial 59 MHz spectrum allocation for AeroMACS was established in 2007, the allocation may be inadequate; studies have indicated that 100 MHz or more of spectrum may be required to support airport surface communications. Hence, additional spectrum allocations have been proposed. Vehicle health management (VHM) systems, which can produce large volumes of vehicle health data, were not considered in the original bandwidth requirements analyses and are therefore of interest in supporting proposals for additional AeroMACS spectrum. VHM systems are an emerging development in air vehicle safety, and preliminary estimates of the amount of data that will be produced and transmitted off an aircraft, both in flight and on the ground, have been prepared based on estimates of data produced by on-board vehicle health sensors and initial concepts of data processing approaches. This allowed an initial estimate of VHM data transmission requirements for the airport surface. More recently, vehicle-level systems designed to process and analyze VHM data and draw conclusions on the current state of vehicle health have been undergoing testing and evaluation. These systems make use of vehicle system data that is mostly different from the VHM data considered previously for airport surface transmission, and they produce processed system outputs that will also need to be archived, thus generating additional data load for AeroMACS. This paper provides an analysis of airport surface data transmission requirements resulting from the vehicle-level reasoning systems, within the context of overall VHM data requirements.

  7. Floods and climate: emerging perspectives for flood risk assessment and management

    NASA Astrophysics Data System (ADS)

    Merz, B.; Aerts, J.; Arnbjerg-Nielsen, K.; Baldi, M.; Becker, A.; Bichet, A.; Blöschl, G.; Bouwer, L. M.; Brauer, A.; Cioffi, F.; Delgado, J. M.; Gocht, M.; Guzzetti, F.; Harrigan, S.; Hirschboeck, K.; Kilsby, C.; Kron, W.; Kwon, H.-H.; Lall, U.; Merz, R.; Nissen, K.; Salvatti, P.; Swierczynski, T.; Ulbrich, U.; Viglione, A.; Ward, P. J.; Weiler, M.; Wilhelm, B.; Nied, M.

    2014-07-01

    Flood estimation and flood management have traditionally been the domain of hydrologists, water resources engineers and statisticians, and disciplinary approaches abound. Dominant views have been shaped; one example is the catchment perspective: floods are formed and influenced by the interaction of local, catchment-specific characteristics, such as meteorology, topography and geology. These traditional views have been beneficial, but they have a narrow framing. In this paper we contrast traditional views with broader perspectives that are emerging from an improved understanding of the climatic context of floods. We come to the following conclusions: (1) extending the traditional system boundaries (local catchment, recent decades, hydrological/hydraulic processes) opens up exciting possibilities for better understanding and improved tools for flood risk assessment and management. (2) Statistical approaches in flood estimation need to be complemented by the search for the causal mechanisms and dominant processes in the atmosphere, catchment and river system that leave their fingerprints on flood characteristics. (3) Natural climate variability leads to time-varying flood characteristics, and this variation may be partially quantifiable and predictable, with the perspective of dynamic, climate-informed flood risk management. (4) Efforts are needed to fully account for factors that contribute to changes in all three risk components (hazard, exposure, vulnerability) and to better understand the interactions between society and floods. (5) Given the global scale and societal importance, we call for the organization of an international multidisciplinary collaboration and data-sharing initiative to further understand the links between climate and flooding and to advance flood research.

  8. 75 FR 72836 - Information Collection Sent to the Office of Management and Budget (OMB) for Approval; OMB...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ... Annual Nonhour Burden Cost: $102,550 associated with recovering the costs of processing applications... (NCR) Application for Public Gathering AGENCY: National Park Service, Interior. ACTION: Notice; request... collection and the estimated burden and cost. This ICR is scheduled to expire on November 30, 2010. We may...

  9. 75 FR 382 - Proposed Collection; Comment Request; Process Evaluation of the NIH's Roadmap Interdisciplinary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-05

    ... submitted to the Office of Management and Budget (OMB) for review and approval. Proposed Collection: The... Investigators, 1; Trainees, 1; Average burden hours per response: 30 minutes; and Estimated total annual burden hours requested: 250 hours. The total annualized cost to respondents (calculated as the number of...

  10. 78 FR 60885 - Proposed Collection; 60-Day Notice Request: Application Process for Clinical Research Training...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-02

    ... the Office of Management and Budget (OMB) for review and approval. Written comments and/or suggestions... methodology and assumptions used; (3) The quality, utility, and clarity of the information to be collected.... There are capital, operating, and/or maintenance costs of $98,022. The total estimated annualized burden...

  11. Performance of the SWEEP model affected by estimates of threshold friction velocity

    USDA-ARS?s Scientific Manuscript database

    The Wind Erosion Prediction System (WEPS) is a process-based model and needs to be verified under a broad range of climatic, soil, and management conditions. Occasional failure of the WEPS erosion submodel (Single-event Wind Erosion Evaluation Program or SWEEP) to simulate erosion in the Columbia Pl...

  12. Canopy gap dynamics of second-growth red spruce-northern hardwood stands in West Virginia

    Treesearch

    James S. Rentch; Thomas M. Schuler; Gregory J. Nowacki; Nathan R. Beane; W. Mark Ford

    2010-01-01

    Forest restoration requires an understanding of the natural disturbance regime of the target community and estimates of the historic range of variability of ecosystem components (composition, structure, and disturbance processes). Management prescriptions that support specific restoration activities should be consistent with these parameters. In this study, we describe...

  13. Project NFFL: The Niagara Fantasy Football League and Sport Marketing Education

    ERIC Educational Resources Information Center

    Davis, Dexter J.

    2012-01-01

    Estimates are that 32 million people currently play fantasy football every year. Project Based Learning (PBL) is one method of engaging students in the educational process. This paper outlines a semester long project undertaken by undergraduate sport management students that uses fantasy football as a vehicle to enhance student knowledge of basic…

  14. A screening procedure to evaluate air pollution effects on Class I wilderness areas

    Treesearch

    Douglas G. Fox; Ann M. Bartuska; James G. Byrne; Ellis Cowling; Richard Fisher; Gene E. Likens; Steven E. Lindberg; Rick A. Linthurst; Jay Messer; Dale S. Nichols

    1989-01-01

    This screening procedure is intended to help wilderness managers conduct "adverse impact determinations" as part of Prevention of Significant Deterioration (PSD) applications for sources that emit air pollutants that might impact Class I wildernesses. The process provides an initial estimate of susceptibility to critical loadings for sulfur, nitrogen, and...

  15. Computational provenance in hydrologic science: a snow mapping example.

    PubMed

    Dozier, Jeff; Frew, James

    2009-03-13

    Computational provenance--a record of the antecedents and processing history of digital information--is key to properly documenting computer-based scientific research. To support investigations in hydrologic science, we produce the daily fractional snow-covered area from NASA's moderate-resolution imaging spectroradiometer (MODIS). From the MODIS reflectance data in seven wavelengths, we estimate the fraction of each 500 m pixel that snow covers. The daily products have data gaps and errors because of cloud cover and sensor viewing geometry, so we interpolate and smooth to produce our best estimate of the daily snow cover. To manage the data, we have developed the Earth System Science Server (ES3), a software environment for data-intensive Earth science, with unique capabilities for automatically and transparently capturing and managing the provenance of arbitrary computations. Transparent acquisition avoids scientists having to express their computations in specific languages or schemas in order for provenance to be acquired and maintained. ES3 models provenance as relationships between processes and their input and output files. It is particularly suited to capturing the provenance of an evolving algorithm whose components span multiple languages and execution environments.
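
    A minimal sketch of the process-to-file provenance model described above. ES3 itself captures provenance automatically and transparently, so the explicit registration calls, process names, and file names here are purely illustrative:

        from collections import defaultdict

        edges = defaultdict(list)  # file -> (process, inputs) that produced it

        def record(process, inputs, outputs):
            for out in outputs:
                edges[out].append((process, tuple(inputs)))

        def lineage(f, depth=0):
            """Recursively print the antecedents of a file."""
            for process, inputs in edges.get(f, []):
                print("  " * depth + f"{f} <- {process}({', '.join(inputs)})")
                for i in inputs:
                    lineage(i, depth + 1)

        record("estimate_snow_fraction", ["modis_reflectance.hdf"], ["daily_fsca.tif"])
        record("interpolate_and_smooth", ["daily_fsca.tif"], ["best_estimate_fsca.tif"])
        lineage("best_estimate_fsca.tif")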

  16. Increased Use of Care Management Processes and Expanded Health Information Technology Functions by Practice Ownership and Medicaid Revenue.

    PubMed

    Rodriguez, Hector P; McClellan, Sean R; Bibi, Salma; Casalino, Lawrence P; Ramsay, Patricia P; Shortell, Stephen M

    2016-06-01

    Practice ownership and Medicaid revenue may affect the use of care management processes (CMPs) for chronic conditions and expansion of health information technology (HIT). Using a national cohort of medical practices, we compared the use of CMPs and HIT from 2006/2008 to 2013 by practice ownership and level of Medicaid revenue. Poisson regression models estimated changes in CMP use, and linear regression estimated changes in HIT, by practice ownership and Medicaid patient revenue, controlling for other practice characteristics. Compared with physician-owned practices, system-owned practices adopted a greater number of CMPs and HIT functions over time (p < .001). High Medicaid revenue (≥30.0%) was associated with less adoption of CMPs (p < .001) and HIT (p < .01). System-owned practices (p < .001) and community health centers (p < .001) with high Medicaid revenue were more likely than physician-owned practices with high Medicaid revenue to adopt CMPs over time. System and community health center ownership appear to help high Medicaid practices overcome CMP adoption constraints. © The Author(s) 2015.

  17. Constructing a Database from Multiple 2D Images for Camera Pose Estimation and Robot Localization

    NASA Technical Reports Server (NTRS)

    Wolf, Michael; Ansar, Adnan I.; Brennan, Shane; Clouse, Daniel S.; Padgett, Curtis W.

    2012-01-01

    The LMDB (Landmark Database) Builder software identifies persistent image features (landmarks) in a scene viewed multiple times and precisely estimates the landmarks 3D world positions. The software receives as input multiple 2D images of approximately the same scene, along with an initial guess of the camera poses for each image, and a table of features matched pair-wise in each frame. LMDB Builder aggregates landmarks across an arbitrarily large collection of frames with matched features. Range data from stereo vision processing can also be passed to improve the initial guess of the 3D point estimates. The LMDB Builder aggregates feature lists across all frames, manages the process to promote selected features to landmarks, and iteratively calculates the 3D landmark positions using the current camera pose estimations (via an optimal ray projection method), and then improves the camera pose estimates using the 3D landmark positions. Finally, it extracts image patches for each landmark from auto-selected key frames and constructs the landmark database. The landmark database can then be used to estimate future camera poses (and therefore localize a robotic vehicle that may be carrying the cameras) by matching current imagery to landmark database image patches and using the known 3D landmark positions to estimate the current pose.

  18. Spatial Distribution of Hydrologic Ecosystem Service Estimates: Comparing Two Models

    NASA Astrophysics Data System (ADS)

    Dennedy-Frank, P. J.; Ghile, Y.; Gorelick, S.; Logsdon, R. A.; Chaubey, I.; Ziv, G.

    2014-12-01

    We compare estimates of the spatial distribution of water quantity provided (annual water yield) from two ecohydrologic models: the widely-used Soil and Water Assessment Tool (SWAT) and the much simpler water models from the Integrated Valuation of Ecosystem Services and Tradeoffs (InVEST) toolbox. These two models differ significantly in terms of complexity, timescale of operation, effort, and data required for calibration, and so are often used in different management contexts. We compare two study sites in the US: the Wildcat Creek Watershed (2083 km2) in Indiana, a largely agricultural watershed in a cold aseasonal climate, and the Upper Upatoi Creek Watershed (876 km2) in Georgia, a mostly forested watershed in a temperate aseasonal climate. We evaluate (1) quantitative estimates of water yield to explore how well each model represents this process, and (2) ranked estimates of water yield to indicate how useful the models are for management purposes where other social and financial factors may play significant roles. The SWAT and InVEST models provide very similar estimates of the water yield of individual subbasins in the Wildcat Creek Watershed (Pearson r = 0.92, slope = 0.89), and a similar ranking of the relative water yield of those subbasins (Spearman r = 0.86). However, the two models provide relatively different estimates of the water yield of individual subbasins in the Upper Upatoi Watershed (Pearson r = 0.25, slope = 0.14), and very different ranking of the relative water yield of those subbasins (Spearman r = -0.10). The Upper Upatoi watershed has a significant baseflow contribution due to its sandy, well-drained soils. InVEST's simple seasonality terms, which assume no change in storage over the time of the model run, may not accurately estimate water yield processes when baseflow provides such a strong contribution. Our results suggest that InVEST users take care in situations where storage changes are significant.
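
    The agreement statistics quoted above are plain Pearson and Spearman correlations over subbasin water yields and can be reproduced for any pair of model outputs; the yield values below are hypothetical:

        import numpy as np
        from scipy.stats import pearsonr, spearmanr

        swat_yield = np.array([310.0, 255.0, 410.0, 198.0, 365.0])    # mm/yr per subbasin
        invest_yield = np.array([295.0, 240.0, 430.0, 210.0, 350.0])

        r, _ = pearsonr(swat_yield, invest_yield)     # quantitative agreement
        rho, _ = spearmanr(swat_yield, invest_yield)  # rank (management) agreement
        print(round(r, 2), round(rho, 2))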

  19. Power Consumption Optimization in Tooth Gears Processing

    NASA Astrophysics Data System (ADS)

    Kanatnikov, N.; Harlamov, G.; Kanatnikova, P.; Pashmentova, A.

    2018-01-01

    The paper reviews the optimization of the tooth gear production process against power consumption criteria. The authors dwell on the indices used to estimate the cutting process in terms of consumed energy and on their applicability in the analysis of the toothed wheel production process. The authors propose a method for optimizing power consumption based on spatial modeling of the cutting pattern. The article is aimed at solving the problem of effective resource management in order to achieve economic and ecological benefits during the mechanical processing of toothed gears. The research was supported by the Russian Science Foundation (project No. 17-79-10316).

  20. Feasibility study: Liquid hydrogen plant, 30 tons per day

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The design considerations of the plant are discussed in detail along with management planning, objective schedules, and cost estimates. The processing scheme is aimed at ultimate use of coal as the basic raw material. For back-up, and to provide assurance of a dependable and steady supply of hydrogen, a parallel and redundant facility for gasifying heavy residual oil will be installed. Both the coal and residual oil gasifiers will use the partial oxidation process.

  1. A Systems Analysis and Project Management Plan for the Petite Amateur Navy Satellite (PANSAT)

    DTIC Science & Technology

    1994-09-01

    Systems engineering is defined as a process by which a stated need (objective) is transformed into a life cycle balanced set of product and process descriptions...

  2. H-Coal Pilot Plant Phase II construction, Catlettsburg, Kentucky. Final construction management report, December 1976-February 1980. [February, 1977 to approximately January, 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-03-01

    This report covers the H-Coal Pilot Plant facility located in Catlettsburg, Kentucky. The authorization for this project was under DOE contract No. DE-AC05-78ET11052, formerly ET-78-C-01-3224. Badger Plants, Inc. carried out the construction management of this facility. The estimated total cost is $147,265,013. A brief process/technical description of the Pilot Plant covers subjects such as objectives, capacity, expected life, etc. A brief technical description of each processing unit, including its purpose in the overall operations of the plant, is given. A general description of the organizational history of the project is given. The current overall organization and a description of the responsibilities of each participant are included. Badger Plants' organization at the manager level is shown.

  3. Managing the Alert Process at NewYork-Presbyterian Hospital

    PubMed Central

    Kuperman, Gilad J; Diamente, Rosanna; Khatu, Vrinda; Chan-Kraushar, Terri; Stetson, Pete; Boyer, Aurelia; Cooper, Mary

    2005-01-01

    Clinical decision support can improve the quality of care but requires substantial knowledge management activities. At NewYork-Presbyterian Hospital in New York City, we have implemented a formal alert management process whereby only hospital committees and departments can request alerts. An explicit requestor, who will help resolve the details of the alert logic and the alert message, must be identified. Alerts must be requested in writing using a structured alert request form. Alert requests are reviewed by the Alert Committee and then forwarded to the Information Systems department for a software development estimate. The model required that clinical committees and departments become more actively involved in the development of alerts than had previously been necessary. In the 12 months following implementation, 10 alert requests were received. The model has been well received. Much of the knowledge engineering work has been distributed, and the burden on scarce medical informatics resources has been reduced. PMID:16779073

  4. Estimating direct fatality impacts at wind farms: how far we’ve come, where we have yet to go

    USGS Publications Warehouse

    Huso, Manuela M.; Schwartz, Susan Savitt

    2013-01-01

    Measuring the potential impacts of wind farms on wildlife can be difficult and may require development of new statistical tools and models to accurately reflect the measurement process. This presentation reviews the recent history of approaches to estimating wildlife fatality under the unique conditions encountered at wind farms, their unifying themes and their potential shortcomings. Avenues of future research are suggested to continue to address the needs of resource managers and industry in understanding direct impacts of wind turbine-caused wildlife fatality.

  5. A knowledge acquisition process to analyse operational problems in solid waste management facilities.

    PubMed

    Dokas, Ioannis M; Panagiotakopoulos, Demetrios C

    2006-08-01

    The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, while few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination, in SWM, is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how various events are related among themselves and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was used for the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the occurrence possibility of operational problems, provides advice and suggests solutions.
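
    A minimal sketch of evaluating a fault tree of the kind this knowledge acquisition process produces, assuming independent basic events; the gate structure, event names, and probabilities are hypothetical and not drawn from the LOMA system:

        def p_or(*ps):   # probability that at least one independent trigger occurs
            prob = 1.0
            for p in ps:
                prob *= (1 - p)
            return 1 - prob

        def p_and(*ps):  # probability that all independent conditions hold
            prob = 1.0
            for p in ps:
                prob *= p
            return prob

        p_heavy_rain, p_cover_failure, p_pump_down = 0.10, 0.05, 0.02
        p_leachate_overflow = p_and(p_heavy_rain, p_or(p_cover_failure, p_pump_down))
        print(round(p_leachate_overflow, 4))  # 0.0069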

  6. Lessons learned in deploying software estimation technology and tools

    NASA Technical Reports Server (NTRS)

    Panlilio-Yap, Nikki; Ho, Danny

    1994-01-01

    Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.

  7. Potential effects of forest management on surface albedo

    NASA Astrophysics Data System (ADS)

    Otto, J.; Bréon, F.-M.; Schelhaas, M.-J.; Pinty, B.; Luyssaert, S.

    2012-04-01

    Currently, 70% of the world's forests are managed, and this figure is likely to rise due to population growth and increasing demand for wood-based products. Forest management has been put forward by the Kyoto Protocol as one of the key instruments in mitigating climate change. For temperate and boreal forests, the effects of forest management on the stand-level carbon balance are reasonably well understood, but the biophysical effects, for example through changes in the albedo, remain elusive. Following a modelling approach, we aim to quantify the variability in albedo that can be attributed to forest management through changes in canopy structure and density. The modelling approach chains three separate models: (1) a forest gap model to describe stand dynamics, (2) a Monte-Carlo model to estimate the probability density function of the optical path length of photons through the canopy, and (3) a physically-based canopy transfer model to estimate the interaction between photons and leaves. The forest gap model provides, on a monthly time step, the position, height, diameter, crown size and leaf area index of individual trees. The Monte-Carlo model computes from this the probability density function of the distance a photon travels through crown volumes, to determine the direct light reaching the forest floor. This information is needed by the canopy transfer model to calculate the effective leaf area index - a quantity that allows it to correctly represent a 3D process with a 1D model. Outgoing radiation is calculated as the result of multiple processes involving scattering by the canopy layer and the forest floor. Finally, surface albedo is computed as the ratio of calculated outgoing radiation to incident solar radiation. The study used two time series representing thinning from below of a beech and a Scots pine forest. The results show a strong temporal evolution in albedo during stand establishment, followed by a relatively stable albedo once the canopy is closed. During this period, albedo is affected for a short time by forest operations. The modelling approach allowed us to estimate the importance of ground vegetation in the stand albedo. Given that ground vegetation depends on the light reaching the forest floor, ground vegetation could act as a natural buffer to dampen changes in albedo, allowing the stand to maintain optimal leaf temperature. Consequently, accounting for only the carbon balance component of forest management ignores albedo impacts and is thus likely to yield biased estimates of the climate benefits of forest ecosystems.

  8. Processing of intended and unintended strategic issues and integration into the strategic agenda.

    PubMed

    Ridder, Hans-Gerd; Schrader, Jan Simon

    2017-11-01

    Strategic change is needed in hospitals due to external and internal pressures. However, research on strategic change, as a combination of management and medical expertise in hospitals, remains scarce. We analyze how intended strategic issues are processed into deliberate strategies and how unintended strategic issues are processed into emergent strategies in the management of strategy formation in hospitals. This study empirically investigates the integration of medical and management expertise in strategy formation. The longitudinal character of the case study enabled us to track patterns of intended and unintended strategic issues over 2 years. We triangulated data from interviews, observations, and documents. In accordance with the quality standards of qualitative research procedures, we analyzed the data by pattern matching and provided analytical generalization regarding strategy formation in hospitals. Our findings suggest that strategic issues are particularly successful within the strategy formation process if interest groups are concerned with the strategic issue, prospective profits are estimated, and relevant decision makers are involved early on. Structure and interaction processes require clear criteria and transparent procedures for effective strategy formation. There is systematic neglect of medical expertise in processes of generating strategies. Our study reveals that the decentralized structure of medical centers is an adequate template for both the operationalization of intended strategic issues and the development of unintended strategic issues. However, tasks, roles, responsibility, resources, and administrative support are necessary for effective management of strategy formation. Similarly, criteria, procedures, and decision-making are prerequisites for effective strategy formation.

  9. Innovation and motivation in public health professionals.

    PubMed

    García-Goñi, Manuel; Maroto, Andrés; Rubalcaba, Luis

    2007-12-01

    Innovations in public health services promote increases in the health status of the population. Therefore, it is a major concern for health policy makers to understand the drivers of innovation processes. This paper focuses on the differences in behaviour of managers and front-line employees in the pro-innovative provision of public health services. We utilize a survey conducted on front-line employees and managers in public health institutions across six European countries. The survey covers topics related to satisfaction, or attitude towards innovation or their institution. We undertake principal components analysis and analysis of variance, and estimate a multinomial ordered probit model to analyse the existence of different behaviour in managers and front-line employees with respect to innovation. Perception of innovation is different for managers and front-line employees in public health institutions. While front-line employees' attitude depends mostly on the overall performance of the institution, managers feel more involved and motivated, and their behaviour depends more on individual and organisational innovative profiles. It becomes crucial to make both managers and front-line employees at public health institutions feel participative and motivated in order to maximise the benefits of technical or organisational innovative process in the health services provision.

  10. Estimating effectiveness of crop management for reduction of soil erosion and runoff

    NASA Astrophysics Data System (ADS)

    Hlavcova, K.; Studvova, Z.; Kohnova, S.; Szolgay, J.

    2017-10-01

    The paper focuses on erosion processes in the Svacenický Creek catchment, a small sub-catchment of the Myjava River basin. To simulate soil loss and sediment transport, the USLE/SDR and WaTEM/SEDEM models were applied. The models were validated by comparing the simulated results with the actual bathymetry of a polder at the catchment outlet. Methods of crop management based on rotation and strip cropping were applied to reduce soil loss and sediment transport. The comparison shows that the greatest soil loss intensities occurred for bare soil without vegetation and for maize planted for corn. The lowest values were obtained for winter wheat. Finally, the effectiveness of row crops and strip cropping for decreasing design floods from the catchment was estimated.
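
    For reference, the USLE soil loss equation underlying the USLE/SDR model above, in LaTeX notation (the SDR/WaTEM/SEDEM routing step is not shown; crop rotation and strip cropping act through the C and P factors):

        A = R \cdot K \cdot L \cdot S \cdot C \cdot P

    where A is the mean annual soil loss, R the rainfall erosivity factor, K the soil erodibility factor, L and S the slope length and steepness factors, C the cover-management factor, and P the support practice factor.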

  11. Real Time Data Management for Estimating Probabilities of Incidents and Near Misses

    NASA Astrophysics Data System (ADS)

    Stanitsas, P. D.; Stephanedes, Y. J.

    2011-08-01

    Advances in real-time data collection, data storage and computational systems have led to development of algorithms for transport administrators and engineers that improve traffic safety and reduce cost of road operations. Despite these advances, problems in effectively integrating real-time data acquisition, processing, modelling and road-use strategies at complex intersections and motorways remain. These are related to increasing system performance in identification, analysis, detection and prediction of traffic state in real time. This research develops dynamic models to estimate the probability of road incidents, such as crashes and conflicts, and incident-prone conditions based on real-time data. The models support integration of anticipatory information and fee-based road use strategies in traveller information and management. Development includes macroscopic/microscopic probabilistic models, neural networks, and vector autoregressions tested via machine vision at EU and US sites.

  12. Benefits of information technology-enabled diabetes management.

    PubMed

    Bu, Davis; Pan, Eric; Walker, Janice; Adler-Milstein, Julia; Kendrick, David; Hook, Julie M; Cusack, Caitlin M; Bates, David W; Middleton, Blackford

    2007-05-01

    To determine the financial and clinical benefits of implementing information technology (IT)-enabled disease management systems. A computer model was created to project the impact of IT-enabled disease management on care processes, clinical outcomes, and medical costs for patients with type 2 diabetes aged >25 years in the U.S. Several ITs were modeled (e.g., diabetes registries, computerized decision support, remote monitoring, patient self-management systems, and payer-based systems). Estimates of care process improvements were derived from published literature. Simulations projected outcomes for both payer and provider organizations, scaled to the national level. The primary outcome was medical cost savings, in 2004 U.S. dollars discounted at 5%. Secondary measures include reduction of cardiovascular, cerebrovascular, neuropathy, nephropathy, and retinopathy clinical outcomes. All forms of IT-enabled disease management improved the health of patients with diabetes and reduced health care expenditures. Over 10 years, diabetes registries saved $14.5 billion, computerized decision support saved $10.7 billion, payer-centered technologies saved $7.10 billion, remote monitoring saved $326 million, self-management saved $285 million, and integrated provider-patient systems saved $16.9 billion. IT-enabled diabetes management has the potential to improve care processes, delay diabetes complications, and save health care dollars. Of existing systems, provider-centered technologies such as diabetes registries currently show the most potential for benefit. Fully integrated provider-patient systems would have even greater potential for benefit. These benefits must be weighed against the implementation costs.

  13. Application of Bayesian techniques to model the burden of human salmonellosis attributable to U.S. food commodities at the point of processing: adaptation of a Danish model.

    PubMed

    Guo, Chuanfa; Hoekstra, Robert M; Schroeder, Carl M; Pires, Sara Monteiro; Ong, Kanyin Liane; Hartnett, Emma; Naugle, Alecia; Harman, Jane; Bennett, Patricia; Cieslak, Paul; Scallan, Elaine; Rose, Bonnie; Holt, Kristin G; Kissler, Bonnie; Mbandi, Evelyne; Roodsari, Reza; Angulo, Frederick J; Cole, Dana

    2011-04-01

    Mathematical models that estimate the proportion of foodborne illnesses attributable to food commodities at specific points in the food chain may be useful to risk managers and policy makers to formulate public health goals, prioritize interventions, and document the effectiveness of mitigations aimed at reducing illness. Using human surveillance data on laboratory-confirmed Salmonella infections from the Centers for Disease Control and Prevention and Salmonella testing data from U.S. Department of Agriculture Food Safety and Inspection Service's regulatory programs, we developed a point-of-processing foodborne illness attribution model by adapting the Hald Salmonella Bayesian source attribution model. Key model outputs include estimates of the relative proportions of domestically acquired sporadic human Salmonella infections resulting from contamination of raw meat, poultry, and egg products processed in the United States from 1998 through 2003. The current model estimates the relative contribution of chicken (48%), ground beef (28%), turkey (17%), egg products (6%), intact beef (1%), and pork (<1%) across 109 Salmonella serotypes found in food commodities at point of processing. While interpretation of the attribution estimates is constrained by data inputs, the adapted model shows promise and may serve as a basis for a common approach to attribution of human salmonellosis and food safety decision-making in more than one country. © Mary Ann Liebert, Inc.

  15. Setting conservation management thresholds using a novel participatory modeling approach.

    PubMed

    Addison, P F E; de Bie, K; Rumpff, L

    2015-10-01

    We devised a participatory modeling approach for setting management thresholds that show when management intervention is required to address undesirable ecosystem changes. This approach was designed to be used when management thresholds: must be set for environmental indicators in the face of multiple competing objectives; need to incorporate scientific understanding and value judgments; and will be set by participants with limited modeling experience. We applied our approach to a case study where management thresholds were set for a mat-forming brown alga, Hormosira banksii, in a protected area management context. Participants, including management staff and scientists, were involved in a workshop to test the approach, and set management thresholds to address the threat of trampling by visitors to an intertidal rocky reef. The approach involved trading off the environmental objective, to maintain the condition of intertidal reef communities, with social and economic objectives to ensure management intervention was cost-effective. Ecological scenarios, developed using scenario planning, were a key feature that provided the foundation for where to set management thresholds. The scenarios developed represented declines in percent cover of H. banksii that may occur under increased threatening processes. Participants defined 4 discrete management alternatives to address the threat of trampling and estimated the effect of these alternatives on the objectives under each ecological scenario. A weighted additive model was used to aggregate participants' consequence estimates. Model outputs (decision scores) clearly expressed uncertainty, which can be considered by decision makers and used to inform where to set management thresholds. This approach encourages a proactive form of conservation, where management thresholds and associated actions are defined a priori for ecological indicators, rather than reacting to unexpected ecosystem changes in the future. © 2015 The Authors Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
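
    A minimal sketch of the weighted additive aggregation described above; the objectives, weights, alternatives, and consequence scores are hypothetical, and the paper's four management alternatives are reduced to three for brevity:

        weights = {"ecological": 0.5, "social": 0.3, "economic": 0.2}
        # consequence estimates (0-1, higher is better) per management alternative
        consequences = {
            "do nothing":       {"ecological": 0.2, "social": 0.9, "economic": 1.0},
            "seasonal closure": {"ecological": 0.7, "social": 0.5, "economic": 0.6},
            "full closure":     {"ecological": 0.9, "social": 0.2, "economic": 0.3},
        }

        scores = {alt: sum(weights[obj] * v for obj, v in cons.items())
                  for alt, cons in consequences.items()}
        print(max(scores, key=scores.get), scores)  # 'seasonal closure' wins here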

  16. Natural Resources Research Program: Proceedings of a Workshop on Operational Management Plans: Improving the Process Held in Arlington, Texas on 5-7 December 1989

    DTIC Science & Technology

    1991-03-01

    The direction of the work unit was changed to focus on exchanging information in the form of a workshop... The "plan should be broad in scope and evolutionary in principle to permit subsequent revisions necessary to fit changing conditions." Cost estimates for... began to update guidance to the field offices in the area of recreation and natural resource management. Times had changed. New water resource...

  17. Reducing uncertainty for estimating forest carbon stocks and dynamics using integrated remote sensing, forest inventory and process-based modeling

    NASA Astrophysics Data System (ADS)

    Poulter, B.; Ciais, P.; Joetzjer, E.; Maignan, F.; Luyssaert, S.; Barichivich, J.

    2015-12-01

    Accurately estimating forest biomass and forest carbon dynamics requires new integrated remote sensing, forest inventory, and carbon cycle modeling approaches. Presently, there is an increasing and urgent need to reduce forest biomass uncertainty in order to meet the requirements of carbon mitigation treaties, such as Reducing Emissions from Deforestation and forest Degradation (REDD+). Here we describe a new parameterization and assimilation methodology used to estimate tropical forest biomass using the ORCHIDEE-CAN dynamic global vegetation model. ORCHIDEE-CAN simulates carbon uptake and allocation to individual trees using a mechanistic representation of photosynthesis, respiration and other first-order processes. The model is first parameterized using forest inventory data to constrain background mortality rates, i.e., self-thinning, and productivity. Satellite remote sensing data for forest structure, i.e., canopy height, is used to constrain simulated forest stand conditions using a look-up table approach to match canopy height distributions. The resulting forest biomass estimates are provided for spatial grids that match REDD+ project boundaries and aim to provide carbon estimates for the criteria described in the IPCC Good Practice Guidelines Tier 3 category. With the increasing availability of forest structure variables derived from high-resolution LIDAR, RADAR, and optical imagery, new methodologies and applications with process-based carbon cycle models are becoming more readily available to inform land management.

  18. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    PubMed

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. © 2010 Society for Risk Analysis.
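
    A minimal sketch of the nonparametric Kaplan-Meier estimator referenced above, applied to survival with mission distance; tie handling is omitted for brevity, and the fault distances and censoring indicators are hypothetical:

        def kaplan_meier(distances, observed):
            """Return (distance, S(distance)) steps; observed=False means censored."""
            s, steps = 1.0, []
            events = sorted(zip(distances, observed))
            at_risk = len(events)
            for d, obs in events:
                if obs:                      # loss-critical fault occurred at d km
                    s *= 1 - 1 / at_risk
                    steps.append((d, s))
                at_risk -= 1                 # censored tracks also leave the risk set
            return steps

        dist = [12, 40, 55, 90, 130, 200]   # km travelled when fault/mission ended
        event = [True, False, True, True, False, True]
        for d, s in kaplan_meier(dist, event):
            print(f"S({d} km) = {s:.3f}")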

  19. Biodiversity Hotspots, Climate Change, and Agricultural Development: Global Limits of Adaptation

    NASA Astrophysics Data System (ADS)

    Schneider, U. A.; Rasche, L.; Schmid, E.; Habel, J. C.

    2017-12-01

    Terrestrial ecosystems are threatened by climate and land management change. These changes result from complex and heterogeneous interactions of human activities and natural processes. Here, we study the potential change in pristine area in 33 global biodiversity hotspots within this century under four climate projections (representative concentration pathways) and associated population and income developments (shared socio-economic pathways). A coupled modelling framework computes the regional net expansion of crop and pasture lands as result of changes in food production and consumption. We use a biophysical crop simulation model to quantify climate change impacts on agricultural productivity, water, and nutrient emissions for alternative crop management systems in more than 100 thousand agricultural land polygons (homogeneous response units) and for each climate projection. The crop simulation model depicts detailed soil, weather, and management information and operates with a daily time step. We use time series of livestock statistics to link livestock production to feed and pasture requirements. On the food consumption side, we estimate national demand shifts in all countries by processing population and income growth projections through econometrically estimated Engel curves. Finally, we use a global agricultural sector optimization model to quantify the net change in pristine area in all biodiversity hotspots under different adaptation options. These options include full-scale global implementation of i) crop yield maximizing management without additional irrigation, ii) crop yield maximizing management with additional irrigation, iii) food yield maximizing crop mix adjustments, iv) food supply maximizing trade flow adjustments, v) healthy diets, and vi) combinations of the individual options above. Results quantify the regional potentials and limits of major agricultural producer and consumer adaptation options for the preservation of pristine areas in biodiversity hotspots. Results also quantify the conflicts between food and water security, biodiversity protection, and climate change mitigation.

  20. Stream Discharge and Evapotranspiration Responses to Climate Change and Their Associated Uncertainties in a Large Semi-Arid Basin

    NASA Astrophysics Data System (ADS)

    Bassam, S.; Ren, J.

    2017-12-01

Predicting future water availability in watersheds is very important for proper water resources management, especially in semi-arid regions with scarce water resources. Over the past two decades, hydrological models have been considered powerful tools for predicting future hydrological conditions in watershed systems. Streamflow and evapotranspiration are the two most important components in watershed water balance estimation: the former is the most commonly used indicator of the overall water budget, and the latter is the second largest component of the water budget (the largest outflow from the system). One of the main concerns in watershed-scale hydrological modeling is the uncertainty associated with model prediction, which can arise from errors in model parameters and input meteorological data, or from errors in the model's representation of the physics of hydrological processes. Understanding and quantifying these uncertainties is vital to water resources managers for proper decision making based on model predictions. In this study, we evaluated the impacts of different climate change scenarios on future stream discharge and evapotranspiration, and their associated uncertainties, throughout a large semi-arid basin using a stochastically calibrated, physically based, semi-distributed hydrological model. The results of this study could provide valuable insights into applying hydrological models in large-scale watersheds, understanding the associated sensitivities and uncertainties in model parameters, and estimating the corresponding impacts on hydrological process variables of interest under different climate change scenarios.

  1. An empirical model for estimating annual consumption by freshwater fish populations

    USGS Publications Warehouse

    Liao, H.; Pierce, C.L.; Larscheid, J.G.

    2005-01-01

Population consumption is an important process linking predator populations to their prey resources. Simple tools are needed to enable fisheries managers to estimate population consumption. We assembled 74 individual estimates of annual consumption by freshwater fish populations and their mean annual population size, 41 of which also included estimates of mean annual biomass. The data set included 14 freshwater fish species from 10 different bodies of water. From this data set we developed two simple linear regression models predicting annual population consumption. Log-transformed population size explained 94% of the variation in log-transformed annual population consumption. Log-transformed biomass explained 98% of the variation in log-transformed annual population consumption. We quantified the accuracy of our regressions and three alternative consumption models as the mean percent difference from observed (bioenergetics-derived) estimates in a test data set. Predictions from our population-size regression matched observed consumption estimates poorly (mean percent difference = 222%). Predictions from our biomass regression matched observed consumption reasonably well (mean percent difference = 24%). The biomass regression was superior to an alternative model of similar complexity, and comparable to two alternative models that were more complex and difficult to apply. Our biomass regression model, log10(consumption) = 0.5442 + 0.9962·log10(biomass), will be a useful tool for fishery managers, enabling them to make reasonably accurate annual population consumption predictions from mean annual biomass estimates. © Copyright by the American Fisheries Society 2005.
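
    Because the fitted equation is reported directly, applying it is a one-liner. The sketch below simply evaluates the published biomass regression; the example biomass value is hypothetical, and units follow whatever units the original calibration data used.

    ```python
    # Apply the paper's regression:
    # log10(consumption) = 0.5442 + 0.9962 * log10(biomass)
    import math

    def annual_consumption(biomass):
        """Predict annual population consumption from mean annual biomass
        (input and output in the same mass units as the calibration data)."""
        return 10 ** (0.5442 + 0.9962 * math.log10(biomass))

    print(annual_consumption(1500.0))   # hypothetical mean annual biomass
    ```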

  2. Eccentricity and fluting in young–growth western hemlock in Oregon.

    Treesearch

    Ryan Singleton; Dean S. DeBell; David D. Marshall; Barbara L. Gartner

    2004-01-01

    Stem irregularities can influence estimates of tree and stand attributes, efficiency of manufacturing processes, and quality of wood products. Eccentricity and fluting were characterized in young, managed western hemlock stands in the Oregon Coast Range. Sixty-one trees were selected from pure western hemlock stands across a range of age, site, and densities. The trees...

  3. Ecosystem evapotranspiration: challenges in measurements, estimates, and modeling

    Treesearch

    Devendra Amatya; S. Irmak; P. Gowda; Ge Sun; J.E. Nettles; K.R. Douglas-Mankin

    2016-01-01

    Evapotranspiration (ET) processes at the leaf to landscape scales in multiple land uses have important controls and feedbacks for local, regional, and global climate and water resource systems. Innovative methods, tools, and technologies for improved understanding and quantification of ET and crop water use are critical for adapting more effective management strategies...

  4. Assessing the suitability of American National Aeronautics and Space Administration (NASA) agro-climatology archive to predict daily meteorological variables and reference evapotranspiration in Sicily, Italy

    USDA-ARS?s Scientific Manuscript database

    For decades, the importance of evapotranspiration processes has been recognized in many disciplines, including hydrologic and drainage studies, irrigation systems design and management. A wide number of equations have been proposed to estimate crop reference evapotranspiration, ET0, based on the var...

  5. Estimates of Down Woody Materials in Eastern US Forests

    Treesearch

    David C. Chojnacky; Robert A. Mickler; Linda S. Heath; Christopher W. Woodall

    2004-01-01

Down woody materials (DWMs) are an important part of forest ecosystems for wildlife habitat, carbon storage, structural diversity, wildfire hazard, and other large-scale ecosystem processes. To better manage forests for DWMs, available and easily accessible data on DWM components are needed. We examined data on DWMs, collected in 2001 by the US Department of...

  6. Dynamic Fuzzy Logic-Based Quality of Interaction within Blended-Learning: The Rare and Contemporary Dance Cases

    ERIC Educational Resources Information Center

    Dias, Sofia B.; Diniz, José A.; Hadjileontiadis, Leontios J.

    2014-01-01

    The combination of the process of pedagogical planning within the Blended (b-) learning environment with the users' quality of interaction ("QoI") with the Learning Management System (LMS) is explored here. The required "QoI" (both for professors and students) is estimated by adopting a fuzzy logic-based modeling approach,…

  7. Characterizing forest succession with lidar data: An evaluation for the Inland Northwest, USA

    Treesearch

    Michael J. Falkowski; Jeffrey S. Evans; Sebastian Martinuzzi; Paul E. Gessler; Andrew T. Hudak

    2009-01-01

    Quantifying forest structure is important for sustainable forest management, as it relates to a wide variety of ecosystem processes and services. Lidar data have proven particularly useful for measuring or estimating a suite of forest structural attributes such as canopy height, basal area, and LAI. However, the potential of this technology to characterize forest...

  8. Software Measurement Guidebook. Version 02.00.02

    DTIC Science & Technology

    1992-12-01

[Scanned front-matter fragments; only partial content is recoverable: a figure list ("...Compatibility Testing Process ... 9-5"; "Figure 9-3. Development Effort Planning Curve ... 9-7"; "Figure 10-1..."), guidance on collecting and analyzing data for requirements, design, code, and test, a definition of the Proposal Manager as the person responsible for describing and supporting the estimate, and references to size growth, costs, completions, build/release ranges, variances, and comparisons with historical data.]

  9. Nurse-led diabetes management in remote locations.

    PubMed

    Kirby, Sue; Moore, Malcolm; McCarron, Trish; Perkins, David; Lyle, David

    2015-01-01

Nurse-led diabetes management has been shown to be effective in urban and regional general practice. We sought to test the feasibility of providing a nurse-led annual cycle of diabetes care in a remote location and to explore the factors that patients indicated were important in diabetes self-management. We conducted a pilot study in 3 locations: 1 town and 2 small townships in remote Australia. A chronic disease nurse (CDN) visited each patient over the course of a year. We examined patient clinical outcomes and interview data. We estimated the cost of the CDN's time in hours, including travel time, per 1% drop in glycated hemoglobin (HbA1C). A total of 21 patients participated in the pilot study. Clinical findings showed significant reductions in HbA1C levels after the nurse-led intervention. Patients reported that they trusted the nurse and thought her advice was pitched at their level. Patients were motivated through a process that included emotional response, identity change, and acceptance. The estimated cost in CDN hours per 1% drop in HbA1C level was A$242.95 (Can$237.60). Nurse-led diabetes care motivated patients to manage their diabetes and resulted in a significant improvement in diabetes management in this remote setting.

  10. Water resource sensitivity from a Mediterranean perspective

    NASA Astrophysics Data System (ADS)

    Lyon, S. W.; Klein, J.; Archibald, J. A.; Walter, T.

    2012-12-01

    The water cycle in semiarid environments is intimately connected to plant-water interactions making these regions sensitive to both future climatic changes and landuse alterations. This study explores the sensitivity of water resource availability from a Mediterranean perspective using the Navarino Environmental Observatory (NEO) in Costa Navarino, Greece as a large-scale laboratory for developing and testing the potential resource impacts of various landuse/climatic trajectories. Direct measurements of evapotranspiration were combined with Penman-Monteith estimates to compare water vapor flux variability across the gradient of current management conditions found within the NEO landscape. These range from native, non-managed vegetation to historic, traditionally managed agriculture to modern, actively managed recreational lands. These management conditions greatly impact the vertical flux of water vapor in this semiarid landscape. Our evapotranspiration estimates were placed into a process-based modeling framework to characterize the current state of regional water resource availability and simulate future trajectories (and the associated uncertainties) in response to landuse/climatic changes. This region is quite sensitive with regards to water cycle modifications due to the anthropogenic redistribution of water within and across the landscape. Such sensitivity typifies that expected for much of the Mediterranean region, highlighting the NEO as a potential key location for future observation and investigation.
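
    The abstract does not state which Penman-Monteith formulation was used; for orientation, the widely used FAO-56 reference form for daily reference evapotranspiration (mm/day) is:

    ```latex
    ET_0 = \frac{0.408\,\Delta\,(R_n - G) + \gamma\,\frac{900}{T + 273}\,u_2\,(e_s - e_a)}
                {\Delta + \gamma\,(1 + 0.34\,u_2)}
    ```

    where Δ is the slope of the saturation vapour pressure curve (kPa/°C), Rn is net radiation and G soil heat flux (MJ m⁻² day⁻¹), γ is the psychrometric constant (kPa/°C), T is mean daily air temperature at 2 m (°C), u2 is wind speed at 2 m (m/s), and es − ea is the vapour pressure deficit (kPa).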

  11. Risk-Based Approach for Microbiological Food Safety Management in the Dairy Industry: The Case of Listeria monocytogenes in Soft Cheese Made from Pasteurized Milk.

    PubMed

    Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez

    2014-01-01

According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach for linking new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show practically how quantitative risk assessment can be used to direct potential intervention strategies at different food processing steps. The model, which rests on many assumptions, estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event of the milk, the fresh cheese or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model will help prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of the major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures. © 2013 Society for Risk Analysis.
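
    The paper's own model is far richer than can be shown here, but its final step, converting a simulated dose at consumption into an illness risk, is commonly done with an exponential dose-response model. The sketch below assumes that form; the r parameter and the dose distribution are illustrative placeholders, not values from the paper.

    ```python
    # Hedged sketch of a QMRA endpoint: exponential dose-response,
    # P(ill) = 1 - exp(-r * dose), applied to a simulated dose distribution.
    import numpy as np

    rng = np.random.default_rng(1)
    log10_dose = rng.normal(2.0, 1.5, 100_000)   # simulated log10 CFU per serving
    dose = 10.0 ** log10_dose
    r = 1e-12                                    # illustrative dose-response slope
    risk_per_serving = 1.0 - np.exp(-r * dose)
    print("mean risk per serving:", risk_per_serving.mean())
    ```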

  12. Soil mapping and processes models to support climate change mitigation and adaptation strategies: a review

    NASA Astrophysics Data System (ADS)

    Muñoz-Rojas, Miriam; Pereira, Paulo; Brevik, Eric; Cerda, Artemi; Jordan, Antonio

    2017-04-01

As agreed in Paris in December 2015, global average temperature is to be limited to "well below 2 °C above pre-industrial levels" and efforts will be made to "limit the temperature increase to 1.5 °C above pre-industrial levels". Reducing greenhouse gas (GHG) emissions in all sectors therefore becomes critical, and appropriate sustainable land management practices need to be adopted (Pereira et al., 2017). Mitigation strategies focus on reducing the rate and magnitude of climate change by reducing its causes. Complementary to mitigation, adaptation strategies aim to minimise impacts and maximise the benefits of new opportunities. The adoption of both practices will require developing system models to integrate and extrapolate anticipated climate changes, such as global climate models (GCMs) and regional climate models (RCMs). Furthermore, integrating climate models driven by socio-economic scenarios into soil process models has allowed the investigation of potential changes and threats to soil characteristics and functions under future climate scenarios. One of the options with the largest potential for climate change mitigation is sequestering carbon in soils. The development of new methods and the use of existing tools for soil carbon monitoring and accounting have therefore become critical in a global change context. For example, soil C maps can help identify areas where management practices that promote C sequestration will be productive, and can guide the formulation of policies for climate change mitigation and adaptation. Despite extensive efforts to compile soil information and map soil C, many uncertainties remain in the determination of soil C stocks, and the reliability of these estimates depends upon the quality and resolution of the spatial datasets used for their calculation. Thus, better estimates of soil C pools and dynamics are needed to advance understanding of the C balance and the potential of soils for climate change mitigation. Here, we discuss the most recent advances in the application of soil mapping and process modeling to support climate change mitigation and adaptation strategies; these strategies are a key component of the implementation of sustainable land management policies and need to be integrated with them. The objective of this work is to present a review of the advantages of soil mapping and process modeling for sustainable land management. Muñoz-Rojas, M., Pereira, P., Brevik, E., Cerda, A., Jordan, A. (2017) Soil mapping and processes models for sustainable land management applied to modern challenges. In: Pereira, P., Brevik, E., Munoz-Rojas, M., Miller, B. (Eds.) Soil mapping and process modelling for sustainable land use management (Elsevier Publishing House) ISBN: 9780128052006

  13. Simulation of nitrous oxide emissions at field scale using the SPACSYS model

    PubMed Central

    Wu, L.; Rees, R.M.; Tarsitano, D.; Zhang, Xubo; Jones, S.K.; Whitmore, A.P.

    2015-01-01

    Nitrous oxide emitted to the atmosphere via the soil processes of nitrification and denitrification plays an important role in the greenhouse gas balance of the atmosphere and is involved in the destruction of stratospheric ozone. These processes are controlled by biological, physical and chemical factors such as growth and activity of microbes, nitrogen availability, soil temperature and water availability. A comprehensive understanding of these processes embodied in an appropriate model can help develop agricultural mitigation strategies to reduce greenhouse gas emissions, and help with estimating emissions at landscape and regional scales. A detailed module to describe the denitrification and nitrification processes and nitrogenous gas emissions was incorporated into the SPACSYS model to replace an earlier module that used a simplified first-order equation to estimate denitrification and was unable to distinguish the emissions of individual nitrogenous gases. A dataset derived from a Scottish grassland experiment in silage production was used to validate soil moisture in the top 10 cm soil, cut biomass, nitrogen offtake and N2O emissions. The comparison between the simulated and observed data suggested that the new module can provide a good representation of these processes and improve prediction of N2O emissions. The model provides an opportunity to estimate gaseous N emissions under a wide range of management scenarios in agriculture, and synthesises our understanding of the interaction and regulation of the processes. PMID:26026411

  14. An Exploratory Study of Cost Engineering in Axiomatic Design: Creation of the Cost Model Based on an FR-DP Map

    NASA Technical Reports Server (NTRS)

    Lee, Taesik; Jeziorek, Peter

    2004-01-01

Large complex projects cost large sums of money throughout their life cycle for a variety of reasons and causes. For such large programs, the credible estimation of the project cost, a quick assessment of the cost of making changes, and the management of the project budget with effective cost reduction determine the viability of the project. Cost engineering that deals with these issues requires a rigorous method and systematic processes. This paper introduces a logical framework to achieve effective cost engineering. The framework is built upon the Axiomatic Design process. The structure in the Axiomatic Design process provides a good foundation to closely tie engineering design and cost information together. The cost framework presented in this paper is a systematic link between the functional domain (FRs), physical domain (DPs), cost domain (CUs), and a task/process-based model. The FR-DP map relates a system's functional requirements to design solutions across all levels and branches of the decomposition hierarchy. DPs are mapped into CUs, which provides a means to estimate the cost of design solutions - DPs - from the cost of the physical entities in the system - CUs. The task/process model describes the iterative process of developing each of the CUs, and is used to estimate the cost of CUs. By linking the four domains, this framework provides superior traceability from requirements to cost information.
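
    The FR→DP→CU linkage can be pictured as nested lookup tables whose costs roll up the decomposition hierarchy. The sketch below is an illustrative data structure only; all names and figures are hypothetical and the paper's task/process cost model is not reproduced.

    ```python
    # Illustrative FR-DP-CU roll-up: the cost of a functional requirement is
    # the sum of the cost-unit costs of the design parameters that satisfy it.
    cu_cost = {"sensor_housing": 12_000, "controller_board": 8_500, "harness": 1_200}

    dp_to_cus = {
        "DP1.1 pressure vessel": ["sensor_housing"],
        "DP1.2 avionics unit":   ["controller_board", "harness"],
    }

    fr_to_dps = {"FR1 measure depth": ["DP1.1 pressure vessel", "DP1.2 avionics unit"]}

    def fr_cost(fr):
        """Roll CU costs up through the DP mapping to price one FR."""
        return sum(cu_cost[cu] for dp in fr_to_dps[fr] for cu in dp_to_cus[dp])

    print(fr_cost("FR1 measure depth"))   # 21700
    ```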

  15. Can we reliably estimate managed forest carbon dynamics using remotely sensed data?

    NASA Astrophysics Data System (ADS)

    Smallman, Thomas Luke; Exbrayat, Jean-Francois; Bloom, A. Anthony; Williams, Mathew

    2015-04-01

Forests are an important part of the global carbon cycle, serving as both a large store of carbon and currently as a net sink of CO2. Forest biomass varies significantly in time and space, linked to climate, soils, natural disturbance and human impacts. This variation means that the global distribution of forest biomass and its dynamics are poorly quantified. Terrestrial ecosystem models (TEMs) are rarely evaluated for their predictions of forest carbon stocks and dynamics, due to a lack of knowledge of site-specific factors such as disturbance dates and/or managed interventions. In this regard, managed forests present a valuable opportunity for model calibration and improvement. Spatially explicit datasets of planting dates, species and yield classification, in combination with remote sensing data and an appropriate data assimilation (DA) framework, can reduce prediction uncertainty and error. We use a Bayesian approach to calibrate the data assimilation linked ecosystem carbon (DALEC) model within a Metropolis-Hastings Markov chain Monte Carlo (MH-MCMC) framework. Forest management information is incorporated into the data assimilation framework as part of ecological and dynamic constraints (EDCs). The key advantage here is that DALEC simulates a full carbon balance, not just the living biomass, and that both parameter and prediction uncertainties are estimated as part of the DA analysis. DALEC has been calibrated at two managed forests, in the USA (Pinus taeda; Duke Forest) and the UK (Picea sitchensis; Griffin Forest). At each site DALEC is calibrated twice (exp1 and exp2). Both calibrations assimilated MODIS LAI and HWSD estimates of carbon stored in soil organic matter, in addition to common management information and prior knowledge included in parameter priors and the EDCs. Calibration exp1 also utilises multiple site-level estimates of carbon storage in multiple pools. By comparing the simulations we determine the impact of site-level observations on prediction uncertainty and error, and which observations are key to constraining ecosystem processes. Preliminary simulations indicate that DALEC calibration exp1 accurately simulated the assimilated observations for forest and soil carbon stock estimates including, critically for forestry, standing wood stocks (R2 = 0.92, bias = -4.46 MgC ha-1, RMSE = 5.80 MgC ha-1). The results from exp1 indicate that the model is able to find parameters that are consistent with both the EDCs and the observations. In the absence of site-level stock observations (exp2), DALEC accurately estimates the foliage and fine root pools, while the median estimates of above-ground litter and wood stocks (R2 = 0.92, bias = -48.30 MgC ha-1, RMSE = 50.30 MgC ha-1) are over- and underestimated respectively, although the site-level observations remain within model uncertainty. These results indicate that we can estimate managed forest dynamics using remotely sensed data, particularly as remotely sensed above-ground biomass maps become available to provide the constraint needed to correct biases in woody accumulation.
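
    The calibration machinery named here is standard Metropolis-Hastings MCMC: propose a parameter move, accept with probability given by the posterior ratio. The sketch below shows the algorithm on a one-parameter toy problem; DALEC itself has many parameters, pools, and EDC checks, none of which are reproduced here.

    ```python
    # Minimal Metropolis-Hastings sketch: random-walk proposals accepted by
    # log-posterior ratio. The one-parameter "model" is a stand-in for DALEC.
    import numpy as np

    rng = np.random.default_rng(0)
    obs, sigma = 5.0, 1.0                  # one observed stock and its error

    def log_post(theta):
        # flat prior on [0, 10], Gaussian likelihood around the observation
        if not 0.0 <= theta <= 10.0:
            return -np.inf
        return -0.5 * ((theta - obs) / sigma) ** 2

    theta, chain = 1.0, []
    for _ in range(20_000):
        prop = theta + rng.normal(0.0, 0.5)        # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop                           # accept the move
        chain.append(theta)

    print("posterior mean:", np.mean(chain[5000:]))   # ~5.0 after burn-in
    ```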

  16. Why are you late? Investigating the role of time management in time-based prospective memory.

    PubMed

    Waldum, Emily R; McDaniel, Mark A

    2016-08-01

Time-based prospective memory (TBPM) tasks are those that are to be performed at a specific future time. Contrary to typical laboratory TBPM tasks (e.g., hit the Z key every 5 min), many real-world TBPM tasks require more complex time-management processes. For instance, to attend an appointment on time, one must estimate the duration of the drive to the appointment and then use this estimate to create and execute a secondary TBPM intention (e.g., "I need to start driving by 1:30 to make my 2:00 appointment on time"). Under- and overestimates of drive time can lead to inefficient TBPM performance, with the former leading to missed appointments and the latter to long stints in the waiting room. Despite the common occurrence of complex TBPM tasks in everyday life, to date, no studies have investigated how components of time management, including time estimation, affect behavior in such complex TBPM tasks. Therefore, the current study aimed to investigate timing biases in both older and younger adults and, further, to determine how such biases, along with additional time management components including planning and plan fidelity, influence complex TBPM performance. Results suggest for the first time that younger and older adults do not always utilize similar timing strategies and, as a result, can produce differential timing biases under the exact same environmental conditions. These timing biases, in turn, play a vital role in how efficiently both younger and older adults perform a later TBPM task that requires them to utilize their earlier time estimate. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  17. Natural and management influences on freshwater inflows and salinity in the San Francisco Estuary at monthly to interannual scales

    USGS Publications Warehouse

    Knowles, Noah

    2002-01-01

    Understanding the processes controlling the physics, chemistry, and biology of the San Francisco Estuary and their relation to climate variability is complicated by the combined influence on freshwater inflows of natural variability and upstream management. To distinguish these influences, alterations of estuarine inflow due to major reservoirs and freshwater pumping in the watershed were inferred from available data. Effects on salinity were estimated by using reconstructed estuarine inflows corresponding to differing levels of impairment to drive a numerical salinity model. Both natural and management inflow and salinity signals show strong interannual variability. Management effects raise salinities during the wet season, with maximum influence in spring. While year‐to‐year variations in all signals are very large, natural interannual variability can greatly exceed the range of management effects on salinity in the estuary.

  18. Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.

    PubMed

    Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi

    2015-10-01

    In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. © 2015 Society for Risk Analysis.
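
    The paper derives explicit expressions for this first-passage probability; as a complementary check, the same quantity can be approximated by brute-force simulation. The sketch below estimates, by Monte Carlo, the probability that a drifting random-walk risk process crosses a fixed critical level within a finite horizon; the process, drift, and level are all invented, not the paper's dual models.

    ```python
    # Monte Carlo estimate of a finite-time failure probability: the chance
    # that a risk process reaches a critical level within the horizon.
    import numpy as np

    rng = np.random.default_rng(42)
    n_paths, n_steps, level = 50_000, 100, 10.0

    steps = rng.normal(0.05, 1.0, size=(n_paths, n_steps))   # drifting increments
    paths = steps.cumsum(axis=1)                             # risk process paths
    failed = (paths >= level).any(axis=1)                    # crossed within horizon

    print("estimated failure probability:", failed.mean())
    ```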

  19. Use of PRA in Shuttle Decision Making Process

    NASA Technical Reports Server (NTRS)

    Boyer, Roger L.; Hamlin, Teri L.

    2010-01-01

How do you use PRA to support an operating program? This presentation will explore how the Shuttle Program Management has used the Shuttle PRA in its decision making process. It will reveal how the PRA has evolved from a tool used to evaluate Shuttle upgrades like the Electric Auxiliary Power Unit (EAPU) to a tool that supports Flight Readiness Reviews (FRR) and real-time flight decisions. Specific examples of Shuttle Program decisions that have used the Shuttle PRA as input will be provided, including how it was used in the Hubble Space Telescope (HST) manifest decision. It will discuss the importance of providing management with a clear presentation of the analysis, applicable assumptions and limitations, along with estimates of the uncertainty. This presentation will show how the use of PRA by the Shuttle Program has evolved over time and how it has been used in the decision making process, providing specific examples.

  20. Comparison of known food weights with image-based portion-size automated estimation and adolescents' self-reported portion size.

    PubMed

    Lee, Christina D; Chae, Junghoon; Schap, TusaRebecca E; Kerr, Deborah A; Delp, Edward J; Ebert, David S; Boushey, Carol J

    2012-03-01

Diet is a critical element of diabetes self-management. An emerging area of research is the use of images for dietary records using mobile telephones with embedded cameras. These tools are being designed to reduce user burden and to improve accuracy of portion-size estimation through automation. The objectives of this study were (1) to assess the error of automatically determined portion weights compared to known portion weights of foods and (2) to compare the error of the automated method with that of human estimation. Adolescents (n = 15) captured images of their eating occasions over a 24 h period. All foods and beverages served were weighed. Adolescents self-reported portion sizes for one meal. Image analysis was used to estimate portion weights. Data analysis compared known weights, automated weights, and self-reported portions. For the 19 foods, the mean ratio of automated weight estimate to known weight ranged from 0.89 to 4.61, and 9 foods were within 0.80 to 1.20. The largest error was for lettuce and the most accurate was strawberry jam. The adolescents were fairly accurate with portion estimates for two foods (sausage links, toast) using one type of estimation aid and two foods (sausage links, scrambled eggs) using another aid. The automated method was fairly accurate for two foods (sausage links, jam); however, the 95% confidence intervals for the automated estimates were consistently narrower than those for the human estimates. The ability of humans to estimate portion sizes of foods remains a problem and a perceived burden. Errors in automated portion-size estimation can be systematically addressed while minimizing the burden on people. Future applications that take over the burden of these processes may translate to better diabetes self-management. © 2012 Diabetes Technology Society.

  1. Systems engineering and integration: Cost estimation and benefits analysis

    NASA Technical Reports Server (NTRS)

    Dean, ED; Fridge, Ernie; Hamaker, Joe

    1990-01-01

Space Transportation Avionics hardware and software cost has traditionally been estimated in Phases A and B using cost techniques which predict cost as a function of various cost predictive variables such as weight, lines of code, functions to be performed, quantities of test hardware, quantities of flight hardware, design and development heritage, complexity, etc. The output of such analyses has been life cycle costs, economic benefits and related data. The major objectives of Cost Estimation and Benefits analysis are twofold: (1) to play a role in the evaluation of potential new space transportation avionics technologies, and (2) to benefit from emerging technological innovations. Both aspects of cost estimation and technology are discussed here. The role of cost analysis in the evaluation of potential technologies should be one of offering additional quantitative and qualitative information to aid decision-making. The cost analysis process needs to be fully integrated into the design process in such a way that cost trades, optimizations and sensitivities are understood. Current hardware cost models tend to primarily use weights, functional specifications, quantities, design heritage and complexity as metrics to predict cost. Software models mostly use functionality, volume of code, heritage and complexity as cost descriptive variables. Basic research needs to be initiated to develop metrics more responsive to the trades which are required for future launch vehicle avionics systems. These would include cost estimating capabilities that are sensitive to technological innovations such as improved materials and fabrication processes, computer aided design and manufacturing, self checkout and many others. In addition to basic cost estimating improvements, the process must be sensitive to the fact that no cost estimate can be quoted without also quoting a confidence associated with the estimate. In order to achieve this, better cost risk evaluation techniques are needed as well as improved usage of risk data by decision-makers. More and better ways to display and communicate cost and cost risk to management are required.
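
    Weight-driven models of the kind described are usually cost estimating relationships (CERs) of power-law form, cost = a · weight^b, fitted in log space to heritage data. The sketch below assumes that form; the heritage data points are invented.

    ```python
    # Hedged sketch of a weight-based CER fitted by least squares in log space.
    import numpy as np

    weight = np.array([50.0, 120.0, 300.0, 750.0])   # kg, historical units
    cost = np.array([4.1, 8.2, 17.5, 38.0])          # $M, historical actuals

    # CER form: cost = a * weight^b  ->  ln(cost) = ln(a) + b * ln(weight)
    X = np.column_stack([np.ones_like(weight), np.log(weight)])
    ln_a, b = np.linalg.lstsq(X, np.log(cost), rcond=None)[0]

    new_weight = 200.0
    print("point estimate ($M):", np.exp(ln_a) * new_weight ** b)
    ```

    A point estimate like this is exactly what the passage warns against quoting alone; in practice the fit residuals would be used to attach a confidence interval to the prediction.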

  2. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
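
    For reference, the sigma-metric mentioned here is conventionally calculated from the allowable total error (TEa), the observed bias, and the imprecision (CV), all in percent at the medical decision concentration; the numbers below are illustrative only.

    ```python
    # Conventional sigma-metric: (allowable total error - |bias|) / CV.
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        return (tea_pct - abs(bias_pct)) / cv_pct

    print(sigma_metric(tea_pct=10.0, bias_pct=2.0, cv_pct=2.0))   # 4.0 sigma
    ```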

  3. Using multiple data types and integrated population models to improve our knowledge of apex predator population dynamics.

    PubMed

    Bled, Florent; Belant, Jerrold L; Van Daele, Lawrence J; Svoboda, Nathan; Gustine, David; Hilderbrand, Grant; Barnes, Victor G

    2017-11-01

Current management of large carnivores is informed using a variety of parameters, methods, and metrics; however, these data are typically considered independently. Sharing information among data types based on the underlying ecological processes, and recognizing observation biases, can improve estimation of individual and global parameters. We present a general integrated population model (IPM), specifically designed for brown bears (Ursus arctos), using three common data types for bear (U. spp.) populations: repeated counts, capture-mark-recapture, and litter size. We considered factors affecting ecological and observation processes for these data. We assessed the practicality of this approach on a simulated population and compared estimates from our model to values used for simulation and results from count data only. We then present a practical application of this general approach adapted to the constraints of a case study using historical data available for brown bears on Kodiak Island, Alaska, USA. The IPM provided more accurate and precise estimates than models accounting for repeated count data only, with credible intervals including the true population 94% and 5% of the time, respectively. For the Kodiak population, we estimated annual average litter size (within one year after birth) to vary between 0.45 [95% credible interval: 0.43; 0.55] and 1.59 [1.55; 1.82]. We detected a positive relationship between salmon availability and adult survival, with survival probabilities greater for females than males. Survival probabilities increased from cubs to yearlings to dependent young ≥2 years old and decreased with litter size. Linking multiple information sources based on ecological and observation mechanisms can provide more accurate and precise estimates, to better inform management. IPMs can also reduce data collection efforts by sharing information among agencies and management units. Our approach responds to an increasing need in bear populations' management and can be readily adapted to other large carnivores.

  4. Using multiple data types and integrated population models to improve our knowledge of apex predator population dynamics

    USGS Publications Warehouse

    Bled, Florent; Belant, Jerrold L.; Van Daele, Lawrence J.; Svoboda, Nathan; Gustine, David D.; Hilderbrand, Grant V.; Barnes, Victor G.

    2017-01-01

Current management of large carnivores is informed using a variety of parameters, methods, and metrics; however, these data are typically considered independently. Sharing information among data types based on the underlying ecological processes, and recognizing observation biases, can improve estimation of individual and global parameters. We present a general integrated population model (IPM), specifically designed for brown bears (Ursus arctos), using three common data types for bear (U. spp.) populations: repeated counts, capture–mark–recapture, and litter size. We considered factors affecting ecological and observation processes for these data. We assessed the practicality of this approach on a simulated population and compared estimates from our model to values used for simulation and results from count data only. We then present a practical application of this general approach adapted to the constraints of a case study using historical data available for brown bears on Kodiak Island, Alaska, USA. The IPM provided more accurate and precise estimates than models accounting for repeated count data only, with credible intervals including the true population 94% and 5% of the time, respectively. For the Kodiak population, we estimated annual average litter size (within one year after birth) to vary between 0.45 [95% credible interval: 0.43; 0.55] and 1.59 [1.55; 1.82]. We detected a positive relationship between salmon availability and adult survival, with survival probabilities greater for females than males. Survival probabilities increased from cubs to yearlings to dependent young ≥2 years old and decreased with litter size. Linking multiple information sources based on ecological and observation mechanisms can provide more accurate and precise estimates, to better inform management. IPMs can also reduce data collection efforts by sharing information among agencies and management units. Our approach responds to an increasing need in bear populations’ management and can be readily adapted to other large carnivores.

  5. An economic evaluation of contingency management for completion of hepatitis B vaccination in those on treatment for opiate dependence.

    PubMed

    Rafia, Rachid; Dodd, Peter J; Brennan, Alan; Meier, Petra S; Hope, Vivian D; Ncube, Fortune; Byford, Sarah; Tie, Hiong; Metrebian, Nicola; Hellier, Jennifer; Weaver, Tim; Strang, John

    2016-09-01

To determine whether the provision of contingency management using financial incentives to improve hepatitis B vaccine completion in people who inject drugs entering community treatment represents a cost-effective use of health-care resources. A probabilistic cost-effectiveness analysis was conducted, using a decision tree to estimate the short-term clinical and health-care cost impact of the vaccination strategies, followed by a Markov process to evaluate the long-term clinical consequences and costs associated with hepatitis B infection. Data on attendance for vaccination were taken from a UK cluster randomized trial. Two contingency management options were examined in the trial: fixed versus escalating schedule financial incentives. Outcomes were lifetime health-care costs and quality-adjusted life years (QALYs), discounted at 3.5% annually, and incremental cost-effectiveness ratios. The resulting estimate for the incremental lifetime health-care cost of the contingency management strategy versus usual care was £21.86 [95% confidence interval (CI) = -£12.20 to 39.86] per person offered the incentive. For 1000 people offered the incentive, the incremental reduction in the number of hepatitis B infections avoided over their lifetime was estimated at 19 (95% CI = 8-30). The probabilistic incremental cost per quality-adjusted life-year gained of the contingency management programme was estimated to be £6738 (95% CI = £6297-7172), with an 89% probability of being considered cost-effective at a threshold of £20 000 per quality-adjusted life year gained (97.60% at £30 000). Using financial incentives to increase hepatitis B vaccination completion in people who inject drugs could be a cost-effective use of health-care resources in the UK as long as the incidence remains above 1.2%. © 2016 Society for the Study of Addiction.
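
    The headline numbers tie together through the standard ICER definition, ICER = Δcost / ΔQALY. The per-person QALY gain is not reported directly, so the sketch below back-calculates it from the reported incremental cost and ICER point estimates; treat it as an approximate consistency check, not a figure from the paper.

    ```python
    # Back-of-envelope check of the abstract's headline numbers.
    def icer(delta_cost, delta_qaly):
        """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
        return delta_cost / delta_qaly

    reported_cost = 21.86     # GBP per person offered the incentive
    reported_icer = 6738.0    # GBP per QALY gained
    implied_qaly_gain = reported_cost / reported_icer
    print(f"implied QALY gain per person: {implied_qaly_gain:.5f}")
    print(f"round-trip ICER: {icer(reported_cost, implied_qaly_gain):.0f} GBP/QALY")
    ```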

  6. From mess to mass: a methodology for calculating storm event pollutant loads with their uncertainties, from continuous raw data time series.

    PubMed

    Métadier, M; Bertrand-Krajewski, J-L

    2011-01-01

With the increasing implementation of continuous monitoring of both discharge and water quality in sewer systems, large databases are now available. In order to manage large amounts of data and to calculate various variables and indicators of interest, it is necessary to apply automated methods for data processing. This paper deals with the processing of short time step turbidity time series to estimate TSS (Total Suspended Solids) and COD (Chemical Oxygen Demand) event loads in sewer systems during storm events, together with their associated uncertainties. The following steps are described: (i) sensor calibration, (ii) estimation of data uncertainties, (iii) correction of raw data, (iv) data pre-validation tests, (v) final validation, and (vi) calculation of TSS and COD event loads and estimation of their uncertainties. These steps have been implemented in an integrated software tool. Examples of results are given for a set of 33 storm events monitored in a stormwater separate sewer system.
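
    Steps (i) and (vi) can be sketched compactly: a linear turbidity-to-TSS sensor calibration, then an event load as the integral of concentration times discharge over the storm. The calibration pairs and the storm record below are invented, and the uncertainty propagation central to the paper is omitted.

    ```python
    # Hedged sketch of calibration (i) and event-load calculation (vi).
    import numpy as np

    # (i) calibration: lab TSS concentrations vs. field turbidity
    ntu = np.array([20.0, 60.0, 110.0, 180.0, 240.0])
    tss = np.array([25.0, 70.0, 130.0, 210.0, 275.0])         # mg/L
    slope, intercept = np.polyfit(ntu, tss, 1)

    # (vi) event load: sum of C(t) * Q(t) * dt over a 2-minute-step storm
    turb_series = np.array([30.0, 90.0, 150.0, 120.0, 60.0])  # NTU
    q_series = np.array([0.05, 0.20, 0.35, 0.25, 0.10])       # m3/s
    dt = 120.0                                                # s
    conc = slope * turb_series + intercept                    # mg/L == g/m3
    load_kg = np.sum(conc * q_series * dt) / 1000.0
    print(f"TSS event load: {load_kg:.1f} kg")
    ```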

  7. Evaluation of decision making and negotiation processes under uncertainties regarding the water management of Peiros-Parapeiros Dam, in Achaia Region (Greece).

    NASA Astrophysics Data System (ADS)

    Podimata, Marianthi V.; Yannopoulos, Panayotis C.

    2015-04-01

Water managers, decision-makers, water practitioners and others involved in Integrated Water Resources Management often encounter the problem of finding a joint agreement among stakeholders concerning the management of a common water body. Handling conflict situations and disputes over water issues and finding an acceptable joint solution remain thorny issues in water negotiation processes, since finding a formula for wise, fair and sustainable management of a water resource is a complex process that involves environmental, economic, technical and socio-political criteria and their uncertainties. Decision Support Systems and Adaptive Management are increasingly used in that direction. To assist decision makers in handling water disputes and executing negotiations, a conceptual tool is required. The Graph Model for Conflict Resolution is a flexible decision support tool for negotiation over water conflicts. It includes efficient algorithms for estimating the strategic moves of water stakeholders, even when detail is lacking about their real motives and prospects. It calculates the stability of their states and encourages what-if analyses. This paper presents a case study of water decision makers' evaluations concerning the management of the forthcoming Peiros-Parapeiros Dam in the Achaia Region (Greece). The continuous consultations between institutions and representatives revealed that the formation of a joint agreement between stakeholders is not easy, owing to conflicts and contradictions regarding the jurisdiction and legal status of the dam operator and the allocation of the costs of the dam's operation. This paper analyzes the positions of the parties involved in the consultation process and examines possible conflict resolution states using GMCR II. The methodology aims to reduce uncertainty about the possible moves and decisions of the involved parties regarding the operation and management of the dam by developing and simulating potential strategic interactions and multilateral negotiations, and by identifying confidence-building cooperation schemes (cooperative arrangements) over water use and management.

  8. Modelling hen harrier dynamics to inform human-wildlife conflict resolution: a spatially-realistic, individual-based approach.

    PubMed

    Heinonen, Johannes P M; Palmer, Stephen C F; Redpath, Steve M; Travis, Justin M J

    2014-01-01

    Individual-based models have gained popularity in ecology, and enable simultaneous incorporation of spatial explicitness and population dynamic processes to understand spatio-temporal patterns of populations. We introduce an individual-based model for understanding and predicting spatial hen harrier (Circus cyaneus) population dynamics in Great Britain. The model uses a landscape with habitat, prey and game management indices. The hen harrier population was initialised according to empirical census estimates for 1988/89 and simulated until 2030, and predictions for 1998, 2004 and 2010 were compared to empirical census estimates for respective years. The model produced a good qualitative match to overall trends between 1989 and 2010. Parameter explorations revealed relatively high elasticity in particular to demographic parameters such as juvenile male mortality. This highlights the need for robust parameter estimates from empirical research. There are clearly challenges for replication of real-world population trends, but this model provides a useful tool for increasing understanding of drivers of hen harrier dynamics and focusing research efforts in order to inform conflict management decisions.

  9. Modelling Hen Harrier Dynamics to Inform Human-Wildlife Conflict Resolution: A Spatially-Realistic, Individual-Based Approach

    PubMed Central

    Heinonen, Johannes P. M.; Palmer, Stephen C. F.; Redpath, Steve M.; Travis, Justin M. J.

    2014-01-01

    Individual-based models have gained popularity in ecology, and enable simultaneous incorporation of spatial explicitness and population dynamic processes to understand spatio-temporal patterns of populations. We introduce an individual-based model for understanding and predicting spatial hen harrier (Circus cyaneus) population dynamics in Great Britain. The model uses a landscape with habitat, prey and game management indices. The hen harrier population was initialised according to empirical census estimates for 1988/89 and simulated until 2030, and predictions for 1998, 2004 and 2010 were compared to empirical census estimates for respective years. The model produced a good qualitative match to overall trends between 1989 and 2010. Parameter explorations revealed relatively high elasticity in particular to demographic parameters such as juvenile male mortality. This highlights the need for robust parameter estimates from empirical research. There are clearly challenges for replication of real-world population trends, but this model provides a useful tool for increasing understanding of drivers of hen harrier dynamics and focusing research efforts in order to inform conflict management decisions. PMID:25405860

  10. Application of expert systems in project management decision aiding

    NASA Technical Reports Server (NTRS)

    Harris, Regina; Shaffer, Steven; Stokes, James; Goldstein, David

    1987-01-01

    The feasibility of developing an expert systems-based project management decision aid to enhance the performance of NASA project managers was assessed. The research effort included extensive literature reviews in the areas of project management, project management decision aiding, expert systems technology, and human-computer interface engineering. Literature reviews were augmented by focused interviews with NASA managers. Time estimation for project scheduling was identified as the target activity for decision augmentation, and a design was developed for an Integrated NASA System for Intelligent Time Estimation (INSITE). The proposed INSITE design was judged feasible with a low level of risk. A partial proof-of-concept experiment was performed and was successful. Specific conclusions drawn from the research and analyses are included. The INSITE concept is potentially applicable in any management sphere, commercial or government, where time estimation is required for project scheduling. As project scheduling is a nearly universal management activity, the range of possibilities is considerable. The INSITE concept also holds potential for enhancing other management tasks, especially in areas such as cost estimation, where estimation-by-analogy is already a proven method.

  11. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches to dimension components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process so that it is applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes, due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
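
    The core of the recommended process is simple to sketch: sample load and resistance from their distributions and count the fraction of draws where load exceeds resistance. The distribution choices and parameters below are illustrative, not values from the paper.

    ```python
    # Monte Carlo on a load-resistance model: failure whenever the sampled
    # load exceeds the sampled resistance.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 1_000_000
    load = rng.normal(400.0, 60.0, n)         # MPa, applied stress
    resistance = rng.normal(600.0, 50.0, n)   # MPa, material strength

    p_failure = np.mean(load > resistance)
    print(f"estimated failure probability: {p_failure:.2e}")
    ```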

  12. Evaluation and comparison of alternative designs for water/solid-waste processing systems for spacecraft

    NASA Technical Reports Server (NTRS)

    Spurlock, J. M.

    1975-01-01

    Promising candidate designs currently being considered for the management of spacecraft solid waste and waste-water materials were assessed. The candidate processes were: (1) the radioisotope thermal energy evaporation/incinerator process; (2) the dry incineration process; and (3) the wet oxidation process. The types of spacecraft waste materials that were included in the base-line computational input to the candidate systems were feces, urine residues, trash and waste-water concentrates. The performance characteristics and system requirements for each candidate process to handle this input and produce the specified acceptable output (i.e., potable water, a storable dry ash, and vapor phase products that can be handled by a spacecraft atmosphere control system) were estimated and compared. Recommendations are presented.

  13. Coupled carbon-nitrogen land surface modelling for UK agricultural landscapes using JULES and JULES-ECOSSE-FUN (JEF)

    NASA Astrophysics Data System (ADS)

    Comyn-Platt, Edward; Clark, Douglas; Blyth, Eleanor

    2016-04-01

    The UK is required to provide accurate estimates of the UK greenhouse gas (GHG; CO2, CH4 and N2O) emissions for the UNFCCC (United Nations Framework Convention on Climate Change). Process based land surface models (LSMs), such as the Joint UK Land Environment Simulator (JULES), attempt to provide such estimates based on environmental (e.g. land use and soil type) and meteorological conditions. The standard release of JULES focusses on the water and carbon cycles, however, it has long been suggested that a coupled carbon-nitrogen scheme could enhance simulations. This is of particular importance when estimating agricultural emission inventories where the carbon cycle is effectively managed via the human application of nitrogen based fertilizers. JULES-ECOSSE-FUN (JEF) links JULES with the Estimation of Carbon in Organic Soils - Sequestration and Emission (ECOSSE) model and the Fixation and Uptake of Nitrogen (FUN) model as a means of simulating C:N coupling. This work presents simulations from the standard release of JULES and the most recent incarnation of the JEF coupled system at the point and field scale. Various configurations of JULES and JEF were calibrated and fine-tuned based on comparisons with observations from three UK field campaigns (Crichton, Harwood Forest and Brattleby) specifically chosen to represent the managed vegetation types that cover the UK. The campaigns included flux tower and chamber measurements of CO2, CH4 and N2O amongst other meteorological parameters and records of land management such as application of fertilizer and harvest date at the agricultural sites. Based on the results of these comparisons, JULES and/or JEF will be used to provide simulations on the regional and national scales in order to provide improved estimates of the total UK emission inventory.

  14. Estimating ammonium and nitrate load from septic systems to surface water bodies within ArcGIS environments

    NASA Astrophysics Data System (ADS)

    Zhu, Yan; Ye, Ming; Roeder, Eberhard; Hicks, Richard W.; Shi, Liangsheng; Yang, Jinzhong

    2016-01-01

This paper presents recently developed software, the ArcGIS-based Nitrogen Load Estimation Toolkit (ArcNLET), for estimating nitrogen loading from septic systems to surface water bodies. The load estimation is important for managing nitrogen pollution, a worldwide challenge to water resources and environmental management. ArcNLET simulates coupled transport of ammonium and nitrate in both the vadose zone and groundwater, a unique feature that cannot be found in other ArcGIS-based software for nitrogen modeling. ArcNLET is designed to be flexible, supporting four simulation scenarios: (1) nitrate transport alone in groundwater; (2) ammonium and nitrate transport in groundwater; (3) ammonium and nitrate transport in the vadose zone; and (4) ammonium and nitrate transport in both the vadose zone and groundwater. With this flexibility, ArcNLET can be used as an efficient screening tool in a wide range of management projects related to nitrogen pollution. From the modeling perspective, this paper shows that in areas with a high water table (e.g. river and lake shores), it may not be correct to assume a completed nitrification process that converts all ammonium to nitrate in the vadose zone, because observation data can indicate that a substantial amount of ammonium enters groundwater. Therefore, in areas with a high water table, simulating ammonium transport and estimating ammonium loading, in addition to nitrate transport and loading, are important for avoiding underestimation of nitrogen loading. This is demonstrated in the Eggleston Heights neighborhood in the City of Jacksonville, FL, USA, where monitoring well observations included a well with predominant ammonium concentrations. The ammonium loading given by the calibrated ArcNLET model can be 10-18% of the total nitrogen load, depending on various factors discussed in the paper.

  15. Estimating the spatial distribution of wintering little brown bat populations in the eastern United States

    USGS Publications Warehouse

    Russell, Robin E.; Tinsley, Karl; Erickson, Richard A.; Thogmartin, Wayne E.; Jennifer A. Szymanski,

    2014-01-01

Depicting the spatial distribution of wildlife species is an important first step in developing management and conservation programs for particular species. Accurate representation of a species distribution is important for predicting the effects of climate change, land-use change, management activities, disease, and other landscape-level processes on wildlife populations. We developed models to estimate the spatial distribution of little brown bat (Myotis lucifugus) wintering populations in the United States east of the 100th meridian, based on known hibernacula locations. From these data, we developed several scenarios of wintering population counts per county that incorporated uncertainty in the spatial distribution of the hibernacula as well as uncertainty in the size of the current little brown bat population. We assessed the variability in our results arising from these sources of uncertainty. Despite considerable uncertainty in the known locations of overwintering little brown bats in the eastern United States, we believe that models accurately depicting the effects of this uncertainty are useful for making management decisions, as these models are a coherent organization of the best available information.

  16. Use of structured decision making to identify monitoring variables and management priorities for salt marsh ecosystems

    USGS Publications Warehouse

    Neckles, Hilary A.; Lyons, James E.; Guntenspergen, Glenn R.; Shriver, W. Gregory; Adamowicz, Susan C.

    2015-01-01

    Most salt marshes in the USA have been degraded by human activities, and coastal managers are faced with complex choices among possible actions to restore or enhance ecosystem integrity. We applied structured decision making (SDM) to guide selection of monitoring variables and management priorities for salt marshes within the National Wildlife Refuge System in the northeastern USA. In general, SDM is a systematic process for decomposing a decision into its essential elements. We first engaged stakeholders in clarifying regional salt marsh decision problems, defining objectives and attributes to evaluate whether objectives are achieved, and developing a pool of alternative management actions for achieving objectives. Through this process, we identified salt marsh attributes that were applicable to monitoring National Wildlife Refuges on a regional scale and that targeted management needs. We then analyzed management decisions within three salt marsh units at Prime Hook National Wildlife Refuge, coastal Delaware, as a case example of prioritizing management alternatives. Values for salt marsh attributes were estimated from 2 years of baseline monitoring data and expert opinion. We used linear value modeling to aggregate multiple attributes into a single performance score for each alternative, constrained optimization to identify alternatives that maximized total management benefits subject to refuge-wide cost constraints, and used graphical analysis to identify the optimal set of alternatives for the refuge. SDM offers an efficient, transparent approach for integrating monitoring into management practice and improving the quality of management decisions.
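
    The two quantitative steps named here, a linear value model and a budget-constrained optimization, can be illustrated together. The sketch below scores each alternative as a weighted sum of attribute values and then brute-forces a small knapsack under a cost cap; the weights, scores, and costs are invented, not the Prime Hook values.

    ```python
    # Hedged sketch: linear value model + budget-constrained selection.
    from itertools import combinations

    weights = {"marsh_integrity": 0.5, "bird_habitat": 0.3, "tidal_flow": 0.2}
    alternatives = {
        # attribute scores on a common 0-1 scale, plus cost in $k
        "restore_hydrology": ({"marsh_integrity": 0.9, "bird_habitat": 0.6, "tidal_flow": 0.9}, 300),
        "invasive_control":  ({"marsh_integrity": 0.6, "bird_habitat": 0.8, "tidal_flow": 0.2}, 120),
        "sediment_addition": ({"marsh_integrity": 0.7, "bird_habitat": 0.4, "tidal_flow": 0.5}, 250),
    }

    def value(scores):
        """Aggregate attribute scores into one performance score."""
        return sum(weights[a] * s for a, s in scores.items())

    budget = 400
    best = max(
        (set(c) for r in range(len(alternatives) + 1)
         for c in combinations(alternatives, r)
         if sum(alternatives[k][1] for k in c) <= budget),
        key=lambda c: sum(value(alternatives[k][0]) for k in c),
    )
    print("optimal set within budget:", best)
    ```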

  17. Variability of Hormonal Stress Markers Collected from a Managed Dolphin Population

    DTIC Science & Technology

    2011-09-30

    physiological indicators of stress in wild marine mammals and the interrelationships between different stress markers can be used to estimate the impact of...samples will be processed for adrenocorticosteroids (ACTH, cortisol, aldosterone), catecholamines (epinephrine, norepinephrine), and thyroid hormones...(T3 and T4) via radioimmunoassay (RIA). Radioimmunoassay methods have previously been validated for cortisol and aldosterone in this species (Houser

  18. Simulating double-peak hydrographs from single storms over mixed-use watersheds

    Treesearch

    Yang Yang; Theodore A. Endreny; David J. Nowak

    2015-01-01

    Two-peak hydrographs after a single rain event are observed in watersheds and storms where distinct volumes contribute as fast and slow runoff. The authors developed a hydrograph model able to quantify these separate runoff volumes to help estimate the runoff processes and residence times used by watershed managers. The model uses parallel application of two...

  19. Scenic beauty estimation model: predicting perceived beauty of forest landscapes

    Treesearch

    Terry C. Daniel; Herbert Schroeder

    1979-01-01

    An important activity in any land-use planning process is prediction of the consequences of alternative management approaches. Alternative plans must be compared in terms of their respective costs (economic and environmental) and benefits (in market and non-market values) if a rational choice among them is to be made. The purpose of this paper is to describe a model for...

  20. Estimating Nitrogen Loading in the Wabash River Subwatershed Using a GIS Schematic Processing Network in Support of Sustainable Watershed Management Planning

    EPA Science Inventory

    The Wabash River is a tributary of the Ohio River. This river system consists of headwaters and small streams, medium river reaches in the upper Wabash watershed, and large river reaches in the lower Wabash watershed. A large part of the river system is situated in agricultural a...

  1. Daily Landsat-scale evapotranspiration estimation over a forested landscape in North Carolina, USA, using multi-satellite data fusion

    Treesearch

    Yun Yang; Martha C. Anderson; Feng Gao; Christopher R. Hain; Kathryn A. Semmens; William P. Kustas; Asko Noormets; Randolph H. Wynne; Valerie A. Thomas; Ge Sun

    2017-01-01

    As a primary flux in the global water cycle, evapotranspiration (ET) connects hydrologic and biological processes and is directly affected by water and land management, land use change and climate variability. Satellite remote sensing provides an effective means for diagnosing ET patterns over heterogeneous landscapes; however, limitations on the spatial and temporal...

  2. Recharge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fayer, Michael J.

    2008-01-17

    This chapter describes briefly the nature and measurement of recharge in support of the CH2M HILL Tank Farm Vadose Zone Project. Appendix C (Recharge) and the Recharge Data Package (Fayer and Keller 2007) provide a more thorough and extensive review of the recharge process and the estimation of recharge rates for the forthcoming RCRA Facility Investigation report for Hanford single-shell tank (SST) Waste Management Areas (WMAs).

  3. Structured decision making as a conceptual framework to identify thresholds for conservation and management

    USGS Publications Warehouse

    Martin, J.; Runge, M.C.; Nichols, J.D.; Lubow, B.C.; Kendall, W.L.

    2009-01-01

    Thresholds and their relevance to conservation have become a major topic of discussion in the ecological literature. Unfortunately, in many cases the lack of a clear conceptual framework for thinking about thresholds may have led to confusion in attempts to apply the concept of thresholds to conservation decisions. Here, we advocate a framework for thinking about thresholds in terms of a structured decision making process. The purpose of this framework is to promote a logical and transparent process for making informed decisions for conservation. Specification of such a framework leads naturally to consideration of definitions and roles of different kinds of thresholds in the process. We distinguish among three categories of thresholds. Ecological thresholds are values of system state variables at which small changes bring about substantial changes in system dynamics. Utility thresholds are components of management objectives (determined by human values) and are values of state or performance variables at which small changes yield substantial changes in the value of the management outcome. Decision thresholds are values of system state variables at which small changes prompt changes in management actions in order to reach specified management objectives. The approach that we present focuses directly on the objectives of management, with the aim of providing decisions that are optimal with respect to those objectives. This approach clearly distinguishes the components of the decision process that are inherently subjective (management objectives, potential management actions) from those that are more objective (system models, estimates of system state). Optimization based on these components then leads to decision matrices specifying optimal actions to be taken at various values of system state variables. Values of state variables separating different actions in such matrices are viewed as decision thresholds. Utility thresholds are included in the objectives component, and ecological thresholds may be embedded in models projecting consequences of management actions. Decision thresholds are determined by the above-listed components of a structured decision process. These components may themselves vary over time, inducing variation in the decision thresholds inherited from them. These dynamic decision thresholds can then be determined using adaptive management. We provide numerical examples (based on patch occupancy models) of structured decision processes that include all three kinds of thresholds. © 2009 by the Ecological Society of America.
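
    A minimal numerical illustration of a decision threshold, in the spirit of the patch-occupancy examples the authors mention: with hypothetical one-step occupancy dynamics and an invented management cost, the optimal action flips from managing to waiting at a specific occupancy value. All dynamics and costs below are assumptions for illustration only.

        import numpy as np

        # Hypothetical one-step occupancy dynamics under two actions
        def next_occupancy(psi, manage):
            col = 0.30 if manage else 0.05              # colonization boost from management
            return psi * (1 - 0.10) + col * (1 - psi)   # 0.10 = local extinction probability

        def utility(psi, manage):
            cost = 0.08 if manage else 0.0              # cost of acting, in utility units
            return next_occupancy(psi, manage) - cost

        states = np.linspace(0.0, 1.0, 101)
        optimal = ["manage" if utility(p, True) > utility(p, False) else "wait"
                   for p in states]

        # The decision threshold is the occupancy value where the optimal action flips
        switch = next(p for p, a in zip(states, optimal) if a == "wait")
        print(f"decision threshold: manage while occupancy < {switch:.2f}")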

  4. Dietary trends and management of hyperphosphatemia among patients with chronic kidney disease: an international survey of renal care professionals.

    PubMed

    Fouque, Denis; Cruz Casal, Maria; Lindley, Elizabeth; Rogers, Susan; Pancířová, Jitka; Kernc, Jennifer; Copley, J Brian

    2014-03-01

    The objective of this study was to review the opinions and experiences of renal care professionals to examine dietary trends among patients with chronic kidney disease (CKD) and problems associated with the clinical management of hyperphosphatemia. This was an online survey comprising open and closed questions requesting information on patient dietary trends and the clinical management of hyperphosphatemia. The study was conducted in 4 European countries (the Netherlands, Spain, Sweden, and the United Kingdom). Participants were 84 renal care professionals. Responder-reported experiences and perceptions of patient dietary trends and hyperphosphatemia management were assessed. Most survey responders (56%) observed an increase in the consumption of processed convenience food, 48% noticed an increase in the consumption of foods rich in phosphorus-containing additives, and 60% believed that there has been a trend of increasing patient awareness of the phosphorus content of food. Patients undergoing hemodialysis (HD) were most likely to experience difficulties in following advice on dietary phosphorus restriction (38% of responders estimated that 25-50% of their patients experienced difficulties, and 29% estimated that 51-75% experienced difficulties). Maintaining protein intake and restricting dietary phosphorus were perceived as being equally important by at least half of responders for predialysis patients (56%) and for those undergoing peritoneal dialysis and HD (54% and 50%, respectively). There were international variations in dietary trends and hyperphosphatemia management. Although most responders have observed a trend of increasing awareness of the phosphorus content of food among patients with CKD, the survey results indicate that many patients continue to experience difficulties when attempting to restrict dietary phosphorus. The survey responses reflect the global trend of increasing consumption of processed convenience foods and phosphorus-containing additives, which has implications for the management of hyperphosphatemia in patients with CKD. Copyright © 2014 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  5. Improving Care Transitions Management: Examining the Role of Accountable Care Organization Participation and Expanded Electronic Health Record Functionality.

    PubMed

    Huber, Thomas P; Shortell, Stephen M; Rodriguez, Hector P

    2017-08-01

    Examine the extent to which physician organization participation in an accountable care organization (ACO) and electronic health record (EHR) functionality are associated with greater adoption of care transition management (CTM) processes. A total of 1,398 physician organizations from the third National Study of Physician Organization survey (NSPO3), a nationally representative sample of medical practices in the United States (January 2012-May 2013). We used data from the third National Study of Physician Organization survey (NSPO3) to assess medical practice characteristics, including CTM processes, ACO participation, EHR functionality, practice type, organization size, ownership, public reporting, and pay-for-performance participation. Multivariate linear regression models estimated the extent to which ACO participation and EHR functionality were associated with greater CTM capabilities, controlling for practice size, ownership, public reporting, and pay-for-performance participation. Approximately half (52.4 percent) of medical practices had a formal program for managing care transitions in place. In adjusted analyses, ACO participation (p < .001) and EHR functionality (p < .001) were independently associated with greater use of CTM processes among medical practices. The growth of ACOs and similar provider risk-bearing arrangements across the country may improve the management of care transitions by physician organizations. © Health Research and Educational Trust.

  6. Developing Daily Quantitative Damage Estimates From Geospatial Layers To Support Post Event Recovery

    NASA Astrophysics Data System (ADS)

    Woods, B. K.; Wei, L. H.; Connor, T. C.

    2014-12-01

    With the growth of natural hazard data available in near real-time, it is increasingly feasible to deliver damage estimates caused by natural disasters. These estimates can be used in a disaster management setting or by commercial entities to optimize the deployment of resources and/or the routing of goods and materials. This work outlines an end-to-end, modular process to generate estimates of damage caused by severe weather. The processing stream consists of five generic components: 1) Hazard modules that provide quantitative data layers for each peril. 2) Standardized methods to map the hazard data to an exposure layer based on atomic geospatial blocks. 3) Peril-specific damage functions that compute damage metrics at the atomic geospatial block level. 4) Standardized data aggregators, which map damage to user-specific geometries. 5) Data dissemination modules, which provide resulting damage estimates in a variety of output forms. This presentation provides a description of this generic tool set, and an illustrated example using HWRF-based hazard data for Hurricane Arthur (2014). In this example, the Python-based real-time processing ingests GRIB2 output from the HWRF numerical model and dynamically downscales it in conjunction with a land cover database using a multiprocessing pool and a just-in-time compiler (JIT). The resulting wind fields are contoured and ingested into a PostGIS database using OGR. Finally, the damage estimates are calculated at the atomic block level and aggregated to user-defined regions using PostgreSQL queries to construct application-specific tabular and graphics output.
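
    The five-component stream maps naturally onto a small pipeline skeleton, sketched below. The function bodies, block identifiers, and the fractional wind damage curve are toy placeholders, not the production HWRF/PostGIS code described in the abstract.

        # Skeleton of the five-component damage-estimation stream described above.
        # All functions and numbers are illustrative placeholders.

        def hazard_layer(event):
            # 1) Hazard module: quantitative peril layer per atomic block id
            return {"blk1": 42.0, "blk2": 61.0}           # wind gusts, m/s

        def map_to_exposure(hazard, exposure):
            # 2) Join hazard values onto exposed assets per atomic block
            return {b: (hazard[b], exposure.get(b, 0.0)) for b in hazard}

        def damage_fn(wind, value):
            # 3) Peril-specific damage function (toy fractional-loss curve)
            frac = max(0.0, min(1.0, (wind - 30.0) / 50.0))
            return frac * value

        def aggregate(block_damage, regions):
            # 4) Roll atomic blocks up to user-specific geometries
            return {r: sum(block_damage[b] for b in blks) for r, blks in regions.items()}

        exposure = {"blk1": 1.0e6, "blk2": 2.5e6}         # insured value, $
        joined = map_to_exposure(hazard_layer("arthur"), exposure)
        dmg = {b: damage_fn(w, v) for b, (w, v) in joined.items()}
        print(aggregate(dmg, {"county_A": ["blk1", "blk2"]}))   # 5) dissemination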

  7. Calculating salt loads to Great Salt Lake and the associated uncertainties for water year 2013; updating a 48 year old standard

    USGS Publications Warehouse

    Shope, Christopher L.; Angeroth, Cory E.

    2015-01-01

    Effective management of surface waters requires a robust understanding of spatiotemporal constituent loadings from upstream sources and the uncertainty associated with these estimates. We compared the total dissolved solids loading into the Great Salt Lake (GSL) for water year 2013 with estimates of previously sampled periods in the early 1960s. We also provide updated results on GSL loading, quantitatively bounded by sampling uncertainties, which are useful for current and future management efforts. Our statistical loading results were more accurate than those from simple regression models. Our results indicate that TDS loading to the GSL in water year 2013 was 14.6 million metric tons with uncertainty ranging from 2.8 to 46.3 million metric tons, which varies greatly from previous regression estimates for water year 1964 of 2.7 million metric tons. Results also indicate that locations with increased sampling frequency are correlated with decreasing confidence intervals. Because time is incorporated into the LOADEST models, discrepancies are largely expected to be a function of temporally lagged salt storage delivery to the GSL associated with terrestrial and in-stream processes. By incorporating temporally variable estimates and statistically derived uncertainty of these estimates, we have provided quantifiable variability in the annual estimates of dissolved solids loading into the GSL. Further, our results support the need for increased monitoring of dissolved solids loading into saline lakes like the GSL by demonstrating the uncertainty associated with different levels of sampling frequency.
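
    LOADEST fits regressions of this general family; the sketch below is a simplified stand-in (log load against log discharge plus one seasonal harmonic, ignoring retransformation bias), fitted to synthetic data rather than GSL records, to show the mechanics of estimating an annual load from sparse samples.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic daily discharge (arbitrary units) and a sparse sample set
        n = 365
        q = np.exp(rng.normal(3.0, 0.4, n))
        t = np.arange(n) / 365.0
        true_load = 10.0 * q**0.8 * (1 + 0.2 * np.sin(2 * np.pi * t))

        idx = rng.choice(n, 30, replace=False)          # 30 sampled days
        y = np.log(true_load[idx]) + rng.normal(0, 0.1, 30)

        # Simplified LOADEST-style regression:
        # ln(load) = b0 + b1 ln(Q) + b2 sin(2*pi*t) + b3 cos(2*pi*t)
        X = np.column_stack([np.ones(30), np.log(q[idx]),
                             np.sin(2 * np.pi * t[idx]), np.cos(2 * np.pi * t[idx])])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)

        Xall = np.column_stack([np.ones(n), np.log(q),
                                np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
        annual = np.exp(Xall @ beta).sum()
        print(f"estimated annual load (arbitrary units): {annual:,.0f}")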

  8. Climate Change and Water Resources Management: A Federal Perspective

    USGS Publications Warehouse

    Brekke, Levi D.; Kiang, Julie E.; Olsen, J. Rolf; Pulwarty, Roger S.; Raff, David A.; Turnipseed, D. Phil; Webb, Robert S.; White, Kathleen D.

    2009-01-01

    Many challenges, including climate change, face the Nation's water managers. The Intergovernmental Panel on Climate Change (IPCC) has provided estimates of how climate may change, but more understanding of the processes driving the changes, the sequences of the changes, and the manifestation of these global changes at different scales could be beneficial. Since the changes will likely affect fundamental drivers of the hydrological cycle, climate change may have a large impact on water resources and water resources managers. The purpose of this interagency report prepared by the U.S. Geological Survey (USGS), U.S. Army Corps of Engineers (USACE), Bureau of Reclamation (Reclamation), and National Oceanic and Atmospheric Administration (NOAA) is to explore strategies to improve water management by tracking, anticipating, and responding to climate change. This report describes the existing and still needed underpinning science crucial to addressing the many impacts of climate change on water resources management.

  9. Essays in financial economics and econometrics

    NASA Astrophysics Data System (ADS)

    La Spada, Gabriele

    Chapter 1 (my job market paper) asks the following question: Do asset managers reach for yield because of competitive pressures in a low rate environment? I propose a tournament model of money market funds (MMFs) to study this issue. I show that funds with different costs of default respond differently to changes in interest rates, and that it is important to distinguish the role of risk-free rates from that of risk premia. An increase in the risk premium leads funds with lower default costs to increase risk-taking, while funds with higher default costs reduce risk-taking. Without changes in the premium, low risk-free rates reduce risk-taking. My empirical analysis shows that these predictions are consistent with the risk-taking of MMFs during the 2006-2008 period. Chapter 2, co-authored with Fabrizio Lillo and published in Studies in Nonlinear Dynamics and Econometrics (2014), studies the effect of round-off error (or discretization) on stationary Gaussian long-memory processes. For large lags, the autocovariance is rescaled by a factor smaller than one, and we compute this factor exactly. Hence, the discretized process has the same Hurst exponent as the underlying one. We show that in the presence of round-off error, two common estimators of the Hurst exponent, the local Whittle (LW) estimator and the detrended fluctuation analysis (DFA), are severely negatively biased in finite samples. We derive conditions for consistency and asymptotic normality of the LW estimator applied to discretized processes and compute the asymptotic properties of the DFA for generic long-memory processes that encompass discretized processes. Chapter 3, co-authored with Fabrizio Lillo, studies the effect of round-off error on integrated Gaussian processes with possibly correlated increments. We derive the variance and kurtosis of the realized increment process in the limit of both "small" and "large" round-off errors, and its autocovariance for large lags. We propose novel estimators for the variance and lag-one autocorrelation of the underlying, unobserved increment process. We also show that for fractionally integrated processes, the realized increments have the same Hurst exponent as the underlying ones, but the LW estimator applied to the realized series is severely negatively biased in medium-sized samples.
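
    The rescaling claim of Chapter 2 is easy to probe numerically: simulate fractional Gaussian noise by circulant embedding, apply a round-off grid, and compare sample autocovariances at a large lag (sampling noise aside, the ratio falls below one while the decay rate, and hence the Hurst exponent, is unchanged). This is an illustrative reconstruction, not the authors' code; the grid size and lag are arbitrary choices.

        import numpy as np

        def fgn(n, hurst, rng):
            # Fractional Gaussian noise via circulant embedding (Davies-Harte)
            k = np.arange(n + 1)
            g = 0.5 * ((k + 1.0)**(2*hurst) - 2*k**(2.0*hurst) + np.abs(k - 1.0)**(2*hurst))
            lam = np.fft.fft(np.concatenate([g, g[-2:0:-1]])).real.clip(min=0)
            z = rng.standard_normal(2*n) + 1j * rng.standard_normal(2*n)
            return (np.fft.fft(np.sqrt(lam) * z) / np.sqrt(2*n))[:n].real

        def acov(x, lag):
            x = x - x.mean()
            return (x[:-lag] * x[lag:]).mean()

        rng = np.random.default_rng(1)
        x = fgn(2**16, hurst=0.8, rng=rng)
        y = np.round(x / 0.5) * 0.5        # round-off with grid size 0.5

        lag = 200
        print("autocovariance ratio (discretized/underlying):",
              acov(y, lag) / acov(x, lag))  # a factor below one, same decay rate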

  10. Obtaining mathematical models for assessing efficiency of dust collectors using integrated system of analysis and data management STATISTICA Design of Experiments

    NASA Astrophysics Data System (ADS)

    Azarov, A. V.; Zhukova, N. S.; Kozlovtseva, E. Yu; Dobrinsky, D. R.

    2018-05-01

    The article considers obtaining mathematical models to assess the efficiency of dust collectors using the integrated system for analysis and data management STATISTICA Design of Experiments. The procedure for obtaining mathematical models and processing the data is illustrated with laboratory studies on a test installation containing a dust collector with counter-swirling flows (CSF), using gypsum dust of various fractions. The experimental studies were planned so as to reduce the number of experiments and the cost of experimental research. A second-order Box-Behnken design was used, which reduced the number of experiments from 81 to 27. The workflow for statistical analysis of the Box-Behnken design data using the standard tools of the integrated system for analysis and data management STATISTICA Design of Experiments is described. Results of statistical data processing, with estimates of the significance of coefficients and the adequacy of the mathematical models, are presented.
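
    For readers outside the STATISTICA environment, the run-count arithmetic can be reproduced directly: a four-factor Box-Behnken design places each pair of factors at its +/-1 levels with the remaining factors at the center, giving 24 edge runs plus center replicates, versus 81 runs for a full three-level factorial. The sketch below constructs such a design; the choice of three center points is an assumption made to match the 27 runs mentioned.

        import numpy as np
        from itertools import combinations, product

        def box_behnken(n_factors, n_center=3):
            runs = []
            for i, j in combinations(range(n_factors), 2):
                for a, b in product((-1, 1), repeat=2):   # edge midpoints of the cube
                    row = [0] * n_factors
                    row[i], row[j] = a, b
                    runs.append(row)
            runs += [[0] * n_factors] * n_center          # replicated center points
            return np.array(runs)

        design = box_behnken(4)      # 6 factor pairs x 4 sign combos + 3 centers
        print(design.shape)          # (27, 4) runs, versus 3**4 = 81 full factorial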

  11. Mapping canopy gap fraction and leaf area index at continent-scale from satellite lidar

    NASA Astrophysics Data System (ADS)

    Mahoney, C.; Hopkinson, C.; Held, A. A.

    2015-12-01

    Information on canopy cover is essential for understanding spatial and temporal variability in vegetation biomass, local meteorological processes and hydrological transfers within vegetated environments. Gap fraction (GF), an index of canopy cover, is often derived over large areas (hundreds of km²) via airborne laser scanning (ALS), and such estimates are reasonably well understood. However, obtaining country-wide estimates is challenging due to the lack of spatially distributed point cloud data. The Geoscience Laser Altimeter System (GLAS) removes these spatial limitations; however, its large footprint and continuous waveform measurements make derivation of GF challenging. ALS data from 3 Australian sites are used as a basis to scale up GF estimates to GLAS footprint data by use of a physically based Weibull function. Spaceborne estimates of GF are employed in conjunction with supplementary predictor variables in the predictive Random Forest algorithm to yield country-wide estimates at a 250 m spatial resolution, accompanied by uncertainties at the pixel level. Preliminary estimates of effective Leaf Area Index (eLAI) are also presented by converting GF via the Beer-Lambert law, with an extinction coefficient of 0.5, deemed acceptable at such spatial scales. Such wide-scale quantification of GF and eLAI is key to assessing and modifying current forest management strategies across Australia. This work also assists Australia's Terrestrial Ecosystem Research Network (TERN), a key asset to policy makers with regard to the management of the national ecosystem, in fulfilling their government-issued mandates.
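
    The Beer-Lambert conversion mentioned at the end is a one-line inversion, sketched here with the stated extinction coefficient of 0.5; the clipping guard is an implementation detail added for safety, not part of the abstract.

        import numpy as np

        def elai_from_gap_fraction(gf, k=0.5):
            # Beer-Lambert inversion: GF = exp(-k * eLAI)  =>  eLAI = -ln(GF) / k
            gf = np.clip(gf, 1e-6, 1.0)    # guard against log(0) in closed canopies
            return -np.log(gf) / k

        print(elai_from_gap_fraction(np.array([0.8, 0.4, 0.1])))  # ~[0.45, 1.83, 4.61]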

  12. Horvitz-Thompson survey sample methods for estimating large-scale animal abundance

    USGS Publications Warehouse

    Samuel, M.D.; Garton, E.O.

    1994-01-01

    Large-scale surveys to estimate animal abundance can be useful for monitoring population status and trends, for measuring responses to management or environmental alterations, and for testing ecological hypotheses about abundance. However, large-scale surveys may be expensive and logistically complex. To ensure resources are not wasted on unattainable targets, the goals and uses of each survey should be specified carefully and alternative methods for addressing these objectives always should be considered. During survey design, the importance of each survey error component (spatial design, proportion of detected animals, precision in detection) should be considered carefully to produce a complete statistically based survey. Failure to address these three survey components may produce population estimates that are inaccurate (biased low), have unrealistic precision (too precise) and do not satisfactorily meet the survey objectives. Optimum survey design requires trade-offs in these sources of error relative to the costs of sampling plots and detecting animals on plots, considerations that are specific to the spatial logistics and survey methods. The Horvitz-Thompson estimators provide a comprehensive framework for considering all three survey components during the design and analysis of large-scale wildlife surveys. Problems of spatial and temporal (especially survey to survey) heterogeneity in detection probabilities have received little consideration, but failure to account for heterogeneity produces biased population estimates. The goal of producing unbiased population estimates is in conflict with the increased variation from heterogeneous detection in the population estimate. One solution to this conflict is to use an MSE-based approach to achieve a balance between bias reduction and increased variation. Further research is needed to develop methods that address spatial heterogeneity in detection, evaluate the effects of temporal heterogeneity on survey objectives and optimize decisions related to survey bias and variance. Finally, managers and researchers involved in the survey design process must realize that obtaining the best survey results requires an interactive and recursive process of survey design, execution, analysis and redesign. Survey refinements will be possible as further knowledge is gained on the actual abundance and distribution of the population and on the most efficient techniques for detecting animals.
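
    The core estimator is simple to state: each plot count is weighted by the inverse of its inclusion probability, with a further division by detection probability to correct for animals missed on surveyed plots. A minimal sketch with toy numbers (all values invented):

        import numpy as np

        # Toy survey: counts on sampled plots, plot inclusion probabilities (pi),
        # and an estimated detection probability on surveyed plots (p)
        counts = np.array([12, 5, 30, 8])        # animals detected per sampled plot
        pi = np.array([0.10, 0.10, 0.25, 0.25])  # spatial-design inclusion probs
        p_detect = 0.8                           # probability an animal is detected

        # Horvitz-Thompson estimate: sum of counts weighted by 1/(pi * p)
        n_hat = np.sum(counts / (pi * p_detect))
        print(f"estimated abundance: {n_hat:,.0f}")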

  13. Toward accurate and precise estimates of lion density.

    PubMed

    Elliot, Nicholas B; Gopalaswamy, Arjun M

    2017-08-01

    Reliable estimates of animal density are fundamental to understanding ecological processes and population dynamics. Furthermore, their accuracy is vital to conservation because wildlife authorities rely on estimates to make decisions. However, it is notoriously difficult to accurately estimate density for wide-ranging carnivores that occur at low densities. In recent years, significant progress has been made in density estimation of Asian carnivores, but the methods have not been widely adapted to African carnivores, such as lions (Panthera leo). Although abundance indices for lions may produce poor inferences, they continue to be used to estimate density and inform management and policy. We used sighting data from a 3-month survey and adapted a Bayesian spatially explicit capture-recapture (SECR) model to estimate spatial lion density in the Maasai Mara National Reserve and surrounding conservancies in Kenya. Our unstructured spatial capture-recapture sampling design incorporated search effort to explicitly estimate detection probability and density on a fine spatial scale, making our approach robust in the context of varying detection probabilities. Overall posterior mean lion density was estimated to be 17.08 (posterior SD 1.310) lions >1 year old per 100 km², and the sex ratio was estimated at 2.2 females to 1 male. Our modeling framework and narrow posterior SD demonstrate that SECR methods can produce statistically rigorous and precise estimates of population parameters, and we argue that they should be favored over less reliable abundance indices. Furthermore, our approach is flexible enough to incorporate different data types, which enables robust population estimates over relatively short survey periods in a variety of systems. Trend analyses are essential to guide conservation decisions but are frequently based on surveys of differing reliability. We therefore call for a unified framework to assess lion numbers in key populations to improve management and policy decisions. © 2016 Society for Conservation Biology.

  14. Prediction of the flooding of a mining reservoir in NW Spain.

    PubMed

    Álvarez, R; Ordóñez, A; De Miguel, E; Loredo, C

    2016-12-15

    Abandoned and flooded mines constitute underground reservoirs which must be managed. When pumping is stopped in a closed mine, the process of flooding should be anticipated in order to avoid environmentally undesirable or unexpected mine water discharges at the surface, particularly in populated areas. The Candín-Fondón mining reservoir in Asturias (NW Spain) has an estimated void volume of 8 million m³, and some urban areas are susceptible to flooding if the water is freely released from the lowest mine adit/pithead. A conceptual model of this reservoir was developed and the flooding process was numerically modelled in order to estimate the time that the flooding would take. Additionally, the maximum safe height for the filling of the reservoir is discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Simulation of nitrous oxide emissions at field scale using the SPACSYS model.

    PubMed

    Wu, L; Rees, R M; Tarsitano, D; Zhang, Xubo; Jones, S K; Whitmore, A P

    2015-10-15

    Nitrous oxide emitted to the atmosphere via the soil processes of nitrification and denitrification plays an important role in the greenhouse gas balance of the atmosphere and is involved in the destruction of stratospheric ozone. These processes are controlled by biological, physical and chemical factors such as growth and activity of microbes, nitrogen availability, soil temperature and water availability. A comprehensive understanding of these processes, embodied in an appropriate model, can help develop agricultural mitigation strategies to reduce greenhouse gas emissions, and help with estimating emissions at landscape and regional scales. A detailed module describing the denitrification and nitrification processes and nitrogenous gas emissions was incorporated into the SPACSYS model to replace an earlier module that used a simplified first-order equation to estimate denitrification and was unable to distinguish the emissions of individual nitrogenous gases. A dataset derived from a Scottish grassland experiment in silage production was used to validate soil moisture in the top 10 cm of soil, cut biomass, nitrogen offtake and N2O emissions. The comparison between the simulated and observed data suggested that the new module can provide a good representation of these processes and improve prediction of N2O emissions. The model provides an opportunity to estimate gaseous N emissions under a wide range of management scenarios in agriculture, and synthesises our understanding of the interaction and regulation of the processes. Copyright © 2015. Published by Elsevier B.V.

  16. Emission factors of air toxics from semiconductor manufacturing in Korea.

    PubMed

    Eom, Yun-Sung; Hong, Ji-Hyung; Lee, Suk-Jo; Lee, Eun-Jung; Cha, Jun-Seok; Lee, Dae-Gyun; Bang, Sun-Ae

    2006-11-01

    The development of local, accurate emission factors is very important for the estimation of reliable national emissions and for air quality management. To that end, this study measured pollutants released to the atmosphere using source-specific emission tests in the semiconductor manufacturing industry. The semiconductor manufacturing industry is one of the major sources of air toxics or hazardous air pollutants (HAPs); thus, understanding the emission characteristics of each emission source is very important for the development of a control strategy. However, in Korea, there is a general lack of information available on air emissions from the semiconductor industry. The major emission sources of air toxics examined in the semiconductor manufacturing industry were wet chemical stations, coating applications, gaseous operations, photolithography, and miscellaneous devices in the wafer fabrication and semiconductor packaging processes. In this study, emission characteristics were analyzed and emission data and factors were estimated for air toxics, such as acids, bases, heavy metals, and volatile organic compounds, from the semiconductor manufacturing process. The concentration of hydrogen chloride from the packaging process was the highest among all of the processes. In addition, the emission factor of total volatile organic compounds (TVOCs) for the packaging process was higher than that of the wafer fabrication process. Emission factors estimated in this study were compared with those of Taiwan for evaluation and were found to be of a similar level for TVOCs and fluorine compounds.

  17. Improving performance with knowledge management

    NASA Astrophysics Data System (ADS)

    Kim, Sangchul

    2018-06-01

    People and organizations are often unable to easily locate their experience and knowledge, so meaningful data is usually fragmented, unstructured, out of date and largely incomplete. Poor knowledge management (KM) leaves a company vulnerable to its knowledge base, or intellectual capital, walking out of the door each year; this loss is estimated at a minimum of 10%. Knowledge management can be defined as an emerging set of organizational design and operational principles, processes, organizational structures, applications and technologies that helps knowledge workers dramatically leverage their creativity and ability to deliver business value, and ultimately to reap a competitive advantage. This paper proposes various methods and software, starting from an understanding of the enterprise aspect, and offers inspiration to those who want to use KM.

  18. AvianBuffer: An interactive tool for characterising and managing wildlife fear responses.

    PubMed

    Guay, Patrick-Jean; van Dongen, Wouter F D; Robinson, Randall W; Blumstein, Daniel T; Weston, Michael A

    2016-11-01

    The characterisation and management of deleterious processes affecting wildlife are ideally based on sound scientific information. However, relevant information is often absent, or difficult to access or contextualise for specific management purposes. We describe 'AvianBuffer', an interactive online tool enabling the estimation of distances at which Australian birds respond fearfully to humans. Users can input species assemblages and determine a 'separation distance' above which the assemblage is predicted to not flee humans. They can also nominate the diversity they wish to minimise disturbance to, or a specific separation distance to obtain an estimate of the diversity that will remain undisturbed. The dataset is based upon flight-initiation distances (FIDs) from 251 Australian bird species (n = 9190 FIDs) and a range of human-associated stimuli. The tool will be of interest to a wide audience including conservation managers, pest managers, policy makers, land-use planners, education and public outreach officers, animal welfare proponents and wildlife ecologists. We discuss possible applications of the data, including the construction of buffers, development of codes of conduct, environmental impact assessments and public outreach. This tool will help balance the growing need for biodiversity conservation in areas where humans can experience nature. The online resource will be expanded in future iterations to include an international database of FIDs of both avian and non-avian species.

  19. [Quantitative estimation of vegetation cover and management factor in USLE and RUSLE models by using remote sensing data: a review].

    PubMed

    Wu, Chang-Guang; Li, Sheng; Ren, Hua-Dong; Yao, Xiao-Hua; Huang, Zi-Jie

    2012-06-01

    Soil loss prediction models such as the universal soil loss equation (USLE) and the revised universal soil loss equation (RUSLE) are useful tools for risk assessment of soil erosion and planning of soil conservation at regional scale. A rational estimate of the vegetation cover and management factor, one of the most important parameters in USLE or RUSLE, is particularly important for the accurate prediction of soil erosion. The traditional estimation based on field survey and measurement is time-consuming, laborious, and costly, and cannot rapidly extract the vegetation cover and management factor at macro-scale. In recent years, the development of remote sensing technology has provided both data and methods for the estimation of the vegetation cover and management factor over broad geographic areas. This paper summarizes the research findings on the quantitative estimation of the vegetation cover and management factor using remote sensing data, and analyzes the advantages and disadvantages of the various methods, with the aim of providing a reference for further research and quantitative estimation of the vegetation cover and management factor at large scales.
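
    As one example of the remote-sensing methods such reviews cover, an exponential NDVI scaling (after van der Knijff et al. 1999) is often used to approximate the C factor; the sketch below implements that form with its commonly cited parameter values, which should be checked against the review before reuse.

        import numpy as np

        def c_factor_from_ndvi(ndvi, alpha=2.0, beta=1.0):
            # Exponential NDVI scaling: C = exp(-alpha * NDVI / (beta - NDVI));
            # C is near 1 on bare soil and near 0 under dense canopy
            ndvi = np.clip(ndvi, 0.0, 0.99)
            return np.exp(-alpha * ndvi / (beta - ndvi))

        print(c_factor_from_ndvi(np.array([0.1, 0.4, 0.7])))  # ~[0.80, 0.26, 0.009]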

  20. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. Integr Environ Assess Manag 2014;10:224–236. © 2014 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC. PMID:24343931

  1. Estimates of natural salinity and hydrology in a subtropical estuarine ecosystem: implications for Greater Everglades restoration

    USGS Publications Warehouse

    Marshall, Frank E.; Wingard, G. Lynn; Pitts, Patrick A.

    2014-01-01

    Disruption of the natural patterns of freshwater flow into estuarine ecosystems occurred in many locations around the world beginning in the twentieth century. To effectively restore these systems, establishing a pre-alteration perspective allows managers to develop science-based restoration targets for salinity and hydrology. This paper describes a process to develop targets based on natural hydrologic functions by coupling paleoecology and regression models using the subtropical Greater Everglades Ecosystem as an example. Paleoecological investigations characterize the circa 1900 CE (pre-alteration) salinity regime in Florida Bay based on molluscan remains in sediment cores. These paleosalinity estimates are converted into time series estimates of paleo-based salinity, stage, and flow using numeric and statistical models. Model outputs are weighted using the mean square error statistic and then combined. Results indicate that, in the absence of water management, salinity in Florida Bay would be about 3 to 9 salinity units lower than current conditions. To achieve this target, upstream freshwater levels must be about 0.25 m higher than indicated by recent observed data, with increased flow inputs to Florida Bay between 2.1 and 3.7 times existing flows. This flow deficit is comparable to the average volume of water currently being diverted from the Everglades ecosystem by water management. The products (paleo-based Florida Bay salinity and upstream hydrology) provide estimates of pre-alteration hydrology and salinity that represent target restoration conditions. This method can be applied to any estuarine ecosystem with available paleoecologic data and empirical and/or model-based hydrologic data.
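
    One plausible reading of the weighting step is inverse-MSE model averaging, sketched below with invented numbers; the paper's exact weighting scheme may differ in detail.

        import numpy as np

        # Hypothetical salinity series from three models and their MSEs against
        # paleo-based estimates (all numbers invented for illustration)
        predictions = np.array([
            [22.0, 24.5, 21.0],    # model A
            [20.5, 23.0, 19.5],    # model B
            [25.0, 27.0, 24.0],    # model C
        ])
        mse = np.array([1.2, 0.8, 2.5])

        weights = (1.0 / mse) / np.sum(1.0 / mse)   # inverse-MSE weights, sum to 1
        combined = weights @ predictions
        print("weights:", weights.round(3))
        print("combined salinity estimate:", combined.round(2))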

  2. Soil carbon stocks in Sarawak, Malaysia.

    PubMed

    Padmanabhan, E; Eswaran, H; Reich, P F

    2013-11-01

    The relationship between greenhouse gas emission and climate change has led to research to identify and manage the natural sources and sinks of the gases. CO2, CH4, and N2O have anthropogenic sources, and of these CO2 is the least effective in trapping long wave radiation. Soil carbon sequestration can best be described as a process of removing carbon dioxide from the atmosphere and relocating it into soils in a form that is not readily released back into the atmosphere. The purpose of this study is to estimate carbon stocks available under current conditions in Sarawak, Malaysia. Estimates of soil organic carbon (SOC) are made for a standard depth of 100 cm unless the soil by definition is less than this depth, as in the case of lithic subgroups. Among the mineral soils, Inceptisols tend to have the highest carbon contents (about 25 kg m⁻² m⁻¹), while Oxisols and Ultisols rate second (about 10-15 kg m⁻² m⁻¹). The Oxisols store a good amount of carbon because of an appreciable time-frame to sequester carbon and possibly lower decomposition rates for the organic carbon found at 1 m depth. Wet soils such as peatlands tend to store significant amounts of carbon. The highest values estimated for such soils are about 114 kg m⁻² m⁻¹. In conclusion, it is pertinent to recognize that degradation of the carbon pool, just like desertification, is a real process and that this irreversible process must be addressed immediately. Therefore, appropriate soil management practices should be instituted to sequester large masses of soil carbon on an annual basis. This knowledge can be used effectively to formulate strategies to prevent forest fires and clearing: two processes that can quickly release sequestered carbon to the atmosphere in an almost irreversible manner. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Distinguishing values from science in decision making: Setting harvest quotas for mountain lions in Montana

    USGS Publications Warehouse

    Mitchell, Michael S.; Cooley, Hilary; Gude, Justin A.; Kolbe, Jay; Nowak, J. Joshua; Proffitt, Kelly M.; Sells, Sarah N.; Thompson, Mike

    2018-01-01

    The relative roles of science and human values can be difficult to distinguish when informal processes are used to make complex and contentious decisions in wildlife management. Structured Decision Making (SDM) offers a formal process for making such decisions, where scientific results and concepts can be disentangled from the values of differing stakeholders. We used SDM to formally integrate science and human values for a citizen working group of ungulate hunting advocates, lion hunting advocates, and outfitters convened to address the contentious allocation of harvest quotas for mountain lions (Puma concolor) in west‐central Montana, USA, during 2014. A science team consisting of mountain lion biologists and population ecologists convened to support the working group. The science team used integrated population models that incorporated 4 estimates of mountain lion density to estimate population trajectories for 5 alternative harvest quotas developed by the working group. Results of the modeling predicted that effects of each harvest quota were consistent across the 4 density estimates; harvest quotas affected predicted population trajectories for 5 years after implementation but differences were not strong. Based on these results, the focus of the working group changed to differences in values among stakeholders that were the true impediment to allocating harvest quotas. By distinguishing roles of science and human values in this process, the working group was able to collaboratively recommend a compromise solution. This solution differed little from the status quo that had been the focus of debate, but the SDM process produced understanding and buy‐in among stakeholders involved, reducing disagreements, misunderstanding, and unproductive arguments founded on informal application of scientific data and concepts. Whereas investments involved in conducting SDM may be unnecessary for many decisions in wildlife management, the investment may be beneficial for complex, contentious, and multiobjective decisions that integrate science and human values.

  4. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    PubMed

    Green, Adam W; Bailey, Larissa L

    2015-01-01

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies.

  5. Management of laser welding based on analysis informative signals

    NASA Astrophysics Data System (ADS)

    Zvezdin, V. V.; Rakhimov, R. R.; Saubanov, Ruz R.; Israfilov, I. H.; Akhtiamov, R. F.

    2017-09-01

    Features of the formation of precision welds in metal are presented. It has been shown that the quality of the welding process depends not only on the energy characteristics of the laser processing facility and the temperature of the surface layer, but also on the accuracy of positioning the laser focus relative to the seam and the workpiece surface. The laser focus positioning accuracy thus serves as an estimate of the quality of the welding process. This approach makes it possible to build an automated control system for the laser technological complex that stabilizes the positioning accuracy of the laser beam relative to the workpiece surface at its setpoint.

  6. Strategic Methodologies in Public Health Cost Analyses.

    PubMed

    Whittington, Melanie; Atherly, Adam; VanRaemdonck, Lisa; Lampe, Sarah

    The National Research Agenda for Public Health Services and Systems Research states the need for research to determine the cost of delivering public health services in order to assist the public health system in communicating financial needs to decision makers, partners, and health reform leaders. The objective of this analysis is to compare 2 cost estimation methodologies, public health manager estimates of employee time spent and activity logs completed by public health workers, to understand to what degree manager surveys could be used in lieu of more time-consuming and burdensome activity logs. Employees recorded their time spent on communicable disease surveillance for a 2-week period using an activity log. Managers then estimated time spent by each employee on a manager survey. Robust and ordinary least squares regression was used to measure the agreement between the time estimated by the manager and the time recorded by the employee. The 2 outcomes for this study included time recorded by the employee on the activity log and time estimated by the manager on the manager survey. This study was conducted in local health departments in Colorado. Forty-one Colorado local health departments (82%) agreed to participate. Seven of the 8 models showed that managers underestimate their employees' time, especially for activities on which an employee spent little time. Manager surveys can best estimate time for time-intensive activities, such as total time spent on a core service or broad public health activity, and yet are less precise when estimating discrete activities. When Public Health Services and Systems Research researchers and health departments are conducting studies to determine the cost of public health services, there are many situations in which managers can closely approximate the time required and produce a relatively precise approximation of cost without as much time investment by practitioners.

  7. Estimation of Methane Emissions from Slurry Pits below Pig and Cattle Confinements

    PubMed Central

    Petersen, Søren O.; Olsen, Anne B.; Elsgaard, Lars; Triolo, Jin Mi; Sommer, Sven G.

    2016-01-01

    Quantifying in-house emissions of methane (CH4) from liquid manure (slurry) is difficult due to high background emissions from enteric processes, yet of great importance for correct estimation of CH4 emissions from manure management and effects of treatment technologies such as anaerobic digestion. In this study CH4 production rates were determined in 20 pig slurry and 11 cattle slurry samples collected beneath slatted floors on six representative farms; rates were determined within 24 h at temperatures close to the temperature in slurry pits at the time of collection. Methane production rates in pig and cattle slurry differed significantly at 0.030 and 0.011 kg CH4 kg⁻¹ VS (volatile solids). Current estimates of CH4 emissions from pig and cattle manure management correspond to 0.032 and 0.015 kg CH4 kg⁻¹, respectively, indicating that slurry pits under animal confinements are a significant source. Fractions of degradable volatile solids (VSd, kg kg⁻¹ VS) were estimated using an aerobic biodegradability assay and total organic C analyses. The VSd in pig and cattle slurry averaged 0.51 and 0.33 kg kg⁻¹ VS, and it was estimated that on average 43 and 28% of VSd in fresh excreta from pigs and cattle, respectively, had been lost at the time of sampling. An empirical model of CH4 emissions from slurry was reparameterised based on experimental results. A sensitivity analysis indicated that predicted CH4 emissions were highly sensitive to uncertainties in the value of lnA of the Arrhenius equation, but much less sensitive to uncertainties in VSd or slurry temperature. A model application indicated that losses of carbon in VS as CO2 may be much greater than losses as CH4. Implications of these results for the correct estimation of CH4 emissions from manure management, and for the mitigation potential of treatments such as anaerobic digestion, are discussed. PMID:27529692
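
    The Arrhenius sensitivity noted in the abstract can be made concrete with a minimal sketch of an Arrhenius-type production rate; the activation energy and lnA values below are illustrative stand-ins, not the reparameterised values from the study.

        import numpy as np

        R = 8.314      # gas constant, J/(mol K)
        E = 81_000     # apparent activation energy, J/mol (illustrative)
        LN_A = 31.2    # Arrhenius pre-exponential on log scale (illustrative)

        def ch4_rate(vs_d, temp_c, ln_a=LN_A):
            # Arrhenius-type CH4 production: F = VSd * exp(lnA - E / (R T))
            T = temp_c + 273.15
            return vs_d * np.exp(ln_a - E / (R * T))

        base = ch4_rate(vs_d=0.5, temp_c=15.0)
        bumped = ch4_rate(vs_d=0.5, temp_c=15.0, ln_a=LN_A + 0.5)
        print(f"relative change from +0.5 in lnA: {bumped / base:.2f}x")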

  8. (How) do we learn from errors? A prospective study of the link between the ward's learning practices and medication administration errors.

    PubMed

    Drach-Zahavy, A; Somech, A; Admi, H; Peterfreund, I; Peker, H; Priente, O

    2014-03-01

    Attention in the ward should shift from preventing medication administration errors to managing them. Nevertheless, little is known with regard to the practices nursing wards apply to learn from medication administration errors as a means of limiting them. To test the effectiveness of four types of learning practices, namely, non-integrated, integrated, supervisory and patchy learning practices, in limiting medication administration errors. Data were collected from a convenient sample of 4 hospitals in Israel by multiple methods (observations and self-report questionnaires) at two time points. The sample included 76 wards (360 nurses). Medication administration error was defined as any deviation from prescribed medication processes and measured by a validated structured observation sheet. Wards' use of medication administration technologies, location of the medication station, and workload were observed; learning practices and demographics were measured by validated questionnaires. Results of the mixed linear model analysis indicated that the use of technology and a quiet location of the medication cabinet were significantly associated with reduced medication administration errors (estimate=.03, p<.05 and estimate=-.17, p<.01, respectively), while workload was significantly linked to inflated medication administration errors (estimate=.04, p<.05). Of the learning practices, supervisory learning was the only practice significantly linked to reduced medication administration errors (estimate=-.04, p<.05). Integrated and patchy learning were significantly linked to higher levels of medication administration errors (estimate=-.03, p<.05 and estimate=-.04, p<.01, respectively). Non-integrated learning was not associated with it (p>.05). How wards manage errors might have implications for medication administration errors beyond the effects of typical individual, organizational and technology risk factors. Head nurses can facilitate learning from errors by "management by walking around" and monitoring nurses' medication administration behaviors. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. A sustainable development of a city electrical grid via a non-contractual Demand-Side Management

    NASA Astrophysics Data System (ADS)

    Samoylenko, Vladislav O.; Pazderin, Andrew V.

    2017-06-01

    The increasing energy consumption of large cities, together with the extremely high density of city electrical loads, makes it necessary to search for alternative approaches to city grid development. The ongoing implementation of energy accounting tariffs with differentiated rates, which depend upon market conditions and change over a short-term horizon, makes it possible to use them as the financial incentive base of Demand-Side Management (DSM). Modern high-technology energy metering and accounting systems, with a large number of functions and consumer feedback, are a promising means of DSM. Existing Smart Metering (SM) billing systems usually provide general information about the consumption curve, bills and comparative data, but not advanced statistics on the correspondence between financial and electric parameters; consumer feedback is also usually not fully used. Efforts to combine the market principle, Smart Metering and consumer feedback for active non-contractual load control are therefore essential. The paper presents a rating-based multi-purpose system of mathematical statistics and algorithms for estimating DSM efficiency, useful for both consumers and energy companies. The estimation is performed by SM data processing systems. The system is aimed at load peak shaving and load curve smoothing, and is focused primarily on retail market support. The system contributes to energy efficiency and improvement of the distribution process through manual management or through the automated interaction of Smart Appliances.

  10. Groundwater Discharge of Legacy Nitrogen to River Networks: Linking Regional Groundwater Models to Streambed Groundwater-Surface Water Exchange and Nitrogen Processing

    NASA Astrophysics Data System (ADS)

    Barclay, J. R.; Helton, A. M.; Briggs, M. A.; Starn, J. J.; Hunt, A.

    2017-12-01

    Despite years of management, excess nitrogen (N) is a pervasive problem in many aquatic ecosystems. More than half of surface water in the United States is derived from groundwater, and widespread N contamination in aquifers from decades of watershed N inputs suggest legacy N discharging from groundwater may contribute to contemporary N pollution problems in surface waters. Legacy N loads to streams and rivers are controlled by both regional scale flow paths and fine-scale processes that drive N transformations, such as groundwater-surface water exchange across steep redox gradients that occur at stream bed interfaces. Adequately incorporating these disparate scales is a challenge, but it is essential to understanding legacy N transport and making informed management decisions. We developed a regional groundwater flow model for the Farmington River, a HUC-8 basin that drains to the Long Island Sound, a coastal estuary that suffers from elevated N loads despite decades of management, to understand broad patterns of regional transport. To evaluate and refine the regional model, we used thermal infrared imagery paired with vertical temperature profiling to estimate groundwater discharge at the streambed interface. We also analyzed discharging groundwater for multiple N species to quantify fine scale patterns of N loading and transformation via denitrification at the streambed interface. Integrating regional and local estimates of groundwater discharge of legacy N to river networks should improve our ability to predict spatiotemporal patterns of legacy N loading to and transformation within surface waters.

  11. Management of precancerous cervical lesions in iran: a cost minimizing study.

    PubMed

    Nahvijou, Azin; Sari, Ali Akbari; Zendehdel, Kazem; Marnani, Ahmad Barati

    2014-01-01

    Cervical cancer is a common, preventable and manageable disease in women worldwide. This study was conducted to determine the cost of follow-up for suspicious precancerous cervical lesions within a screening program using the Pap smear or the HPV DNA test, modeled with a decision tree. Patient follow-up processes were determined using standard guidelines and consultation with specialists to design a decision tree model. Costs of treatment in both public and private sectors were identified according to the national tariffs in 2010 and determined, based on the decision tree and the services provided (visits to specialists, colposcopy, and conization), for two modalities: Pap smear and HPV DNA test. The number of patients and the mean cost of treatment in each sector were calculated. The prevalence of lesions and HPV was obtained from the literature to estimate the cost of treatment for each woman in the population. Follow-up costs were determined using seven processes for the Pap smear and 11 processes for the HPV DNA test. The total cost of the Pap smear and HPV DNA processes for each woman in the population was $36.1 and $174, respectively. The follow-up process for patients with suspicious cervical lesions needs to be included in the existing screening program. Because the HPV DNA test is currently more expensive than the Pap smear, it is suggested that precancerous cervical lesions be managed with the latter test.
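
    A decision-tree cost-minimization of this kind reduces to weighting each follow-up pathway's cost by its probability. The sketch below illustrates only that computation; the branch probabilities and per-process costs are hypothetical placeholders, not the study's seven Pap-smear and 11 HPV-DNA processes.

        # Expected follow-up cost per screened woman from a simple decision tree.
        # Branch probabilities and per-process costs are hypothetical placeholders.
        pap_tree = [   # (probability of follow-up pathway, cost of that pathway in $)
            (0.90, 10.0),   # normal smear: screening visit only
            (0.07, 60.0),   # borderline result: repeat smear + specialist visit
            (0.03, 300.0),  # suspicious lesion: colposcopy (+ conization if needed)
        ]
        hpv_tree = [
            (0.85, 40.0),   # HPV negative: test cost only
            (0.10, 120.0),  # HPV positive, cytology normal: retest pathway
            (0.05, 500.0),  # HPV positive with lesion: full diagnostic work-up
        ]

        def expected_cost(tree):
            assert abs(sum(p for p, _ in tree) - 1.0) < 1e-9
            return sum(p * c for p, c in tree)

        print(f"Pap strategy:     ${expected_cost(pap_tree):.2f} per woman")
        print(f"HPV DNA strategy: ${expected_cost(hpv_tree):.2f} per woman")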

  12. Knowledge-based processing for aircraft flight control

    NASA Technical Reports Server (NTRS)

    Painter, John H.; Glass, Emily; Economides, Gregory; Russell, Paul

    1994-01-01

    This Contractor Report documents research in Intelligent Control using knowledge-based processing in a manner dual to methods found in the classic stochastic decision, estimation, and control discipline. Such knowledge-based control has also been called Declarative and Hybrid. Software architectures were sought, employing the parallelism inherent in modern object-oriented modeling and programming. The viewpoint adopted was that Intelligent Control employs a class of domain-specific software architectures having features common over a broad variety of implementations, such as management of aircraft flight, power distribution, etc. As much attention was paid to software engineering issues as to artificial intelligence and control issues. This research considered that particular processing methods from the stochastic and knowledge-based worlds are duals, that is, similar in a broad context. They provide architectural design concepts which serve as bridges between the disparate disciplines of decision, estimation, control, and artificial intelligence. This research was applied to the control of a subsonic transport aircraft in the airport terminal area.

  13. Variability of Hormonal Stress Markers Collected from a Managed Dolphin Population

    DTIC Science & Technology

    2013-09-30

    physiological indicators of stress in wild marine mammals and the interrelationships between different stress markers can be used to estimate the impact...Radioimmunoassay methods have previously been validated for cortisol and aldosterone in this species (Houser et al., 2011). Parallel processing of...for these hormones...Metabolites of cortisol, aldosterone and thyroid hormone will be extracted from fecal samples and measured via RIA using

  14. The effects of habitat, climate, and Barred Owls on long-term demography of Northern Spotted Owls

    Treesearch

    Katie M. Dugger; Eric D. Forsman; Alan B. Franklin; Raymond J. Davis; Gary C. White; Carl J. Schwarz; Kenneth P. Burnham; James D. Nichols; James E. Hines; Charles B. Yackulic; Paul F. Doherty; Larissa Bailey; Darren A. Clark; Steven H. Ackers; Lawrence S. Andrews; Benjamin Augustine; Brian L. Biswell; Jennifer Blakesley; Peter C. Carlson; Matthew J. Clement; Lowell V. Diller; Elizabeth M. Glenn; Adam Green; Scott A. Gremel; Dale R. Herter; J. Mark Higley; Jeremy Hobson; Rob B. Horn; Kathryn P. Huyvaert; Christopher McCafferty; Trent McDonald; Kevin McDonnell; Gail S. Olson; Janice A. Reid; Jeremy Rockweit; Viviana Ruiz; Jessica Saenz; Stan G. Sovern

    2016-01-01

    Estimates of species’ vital rates and an understanding of the factors affecting those parameters over time and space can provide crucial information for management and conservation. We used mark–recapture, reproductive output, and territory occupancy data collected during 1985–2013 to evaluate population processes of Northern Spotted Owls (Strix occidentalis...

  15. Regional carbon cycle responses to 25 years of variation in climate and disturbance in the US Pacific Northwest

    Treesearch

    David P. Turner; William D. Ritts; Robert E. Kennedy; Andrew N. Gray; Zhiqiang Yang

    2016-01-01

    Variation in climate, disturbance regime, and forest management strongly influence terrestrial carbon sources and sinks. Spatially distributed, process-based, carbon cycle simulation models provide a means to integrate information on these various influences to estimate carbon pools and flux over large domains. Here we apply the Biome-BGC model over the four-state...

  16. When relationships estimated in the past cannot be used to predict the future: using mechanistic models to predict landscape ecological dynamics in a changing world

    Treesearch

    Eric J. Gustafson

    2013-01-01

    Researchers and natural resource managers need predictions of how multiple global changes (e.g., climate change, rising levels of air pollutants, exotic invasions) will affect landscape composition and ecosystem function. Ecological predictive models used for this purpose are constructed using either a mechanistic (process-based) or a phenomenological (empirical)...

  17. Can Wireless Technology Enable New Diabetes Management Tools?

    PubMed Central

    Hedtke, Paul A.

    2008-01-01

    Mobile computing and communications technology embodied in the modern cell phone device can be employed to improve the lives of diabetes patients by giving them better tools for self-management. Several companies are working on the development of diabetes management tools that leverage the ubiquitous cell phone to bring self-management tools to the hand of the diabetes patient. Integration of blood glucose monitoring (BGM) technology with the cell phone platform adds a level of convenience for the person with diabetes, but, more importantly, allows BGM data to be automatically captured, logged, and processed in near real time in order to provide the diabetes patient with assistance in managing their blood glucose levels. Other automatic measurements can estimate physical activity, and information regarding medication events and food intake can be captured and analyzed in order to provide the diabetes patient with continual assistance in managing their therapy and behaviors in order to improve glycemic control. The path to realization of such solutions is not, however, without obstacles. PMID:19885187

  18. Can wireless technology enable new diabetes management tools?

    PubMed

    Hedtke, Paul A

    2008-01-01

    Mobile computing and communications technology embodied in the modern cell phone device can be employed to improve the lives of diabetes patients by giving them better tools for self-management. Several companies are working on the development of diabetes management tools that leverage the ubiquitous cell phone to bring self-management tools to the hand of the diabetes patient. Integration of blood glucose monitoring (BGM) technology with the cell phone platform adds a level of convenience for the person with diabetes, but, more importantly, allows BGM data to be automatically captured, logged, and processed in near real time in order to provide the diabetes patient with assistance in managing their blood glucose levels. Other automatic measurements can estimate physical activity, and information regarding medication events and food intake can be captured and analyzed in order to provide the diabetes patient with continual assistance in managing their therapy and behaviors in order to improve glycemic control. The path to realization of such solutions is not, however, without obstacles.

  19. Methodology to Estimate the Quantity, Composition, and ...

    EPA Pesticide Factsheets

    This report, Methodology to Estimate the Quantity, Composition and Management of Construction and Demolition Debris in the US, was developed to expand access to data on CDD in the US and to support research on CDD and sustainable materials management. Since past US EPA CDD estimates have been limited to building-related CDD, a goal in the development of this methodology was to use data originating from CDD facilities and contractors to better capture the current picture of total CDD management, including materials from roads, bridges and infrastructure.

  20. Valuing Insect Pollination Services with Cost of Replacement

    PubMed Central

    Allsopp, Mike H.; de Lange, Willem J.; Veldtman, Ruan

    2008-01-01

    Value estimates of ecosystem goods and services are useful to justify the allocation of resources towards conservation, but inconclusive estimates risk unsustainable resource allocations. Here we present replacement costs as a more accurate value estimate of insect pollination as an ecosystem service, although this method could also be applied to other services. The importance of insect pollination to agriculture is unequivocal. However, whether this service is largely provided by wild pollinators (genuine ecosystem service) or managed pollinators (commercial service), and which of these requires immediate action amidst reports of pollinator decline, remains contested. If crop pollination is used to argue for biodiversity conservation, clear distinction should be made between values of managed- and wild pollination services. Current methods either under-estimate or over-estimate the pollination service value, and make use of criticised general insect and managed pollinator dependence factors. We apply the theoretical concept of ascribing a value to a service by calculating the cost to replace it, as a novel way of valuing wild and managed pollination services. Adjusted insect and managed pollinator dependence factors were used to estimate the cost of replacing insect- and managed pollination services for the Western Cape deciduous fruit industry of South Africa. Using pollen dusting and hand pollination as suitable replacements, we value pollination services significantly higher than current market prices for commercial pollination, although lower than traditional proportional estimates. The complexity associated with inclusive value estimation of pollination services required several defendable assumptions, but made estimates more inclusive than previous attempts. Consequently this study provides the basis for continued improvement in context specific pollination service value estimates. PMID:18781196
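
    The replacement-cost logic can be made concrete with a short calculation: the service value equals the cost of replacing insect pollination (e.g., by pollen dusting or hand pollination), scaled by each crop's dependence on insect pollination and split between wild and managed pollinators. The crops, areas, dependence factors and per-hectare replacement costs below are invented for illustration and are not the study's data.

        # Replacement-cost valuation of insect pollination (illustrative sketch).
        crops = [
            # (name, area_ha, insect_dependence, managed_share, replacement_cost_per_ha)
            ("apple", 10000, 0.65, 0.9, 1200.0),
            ("pear",   4000, 0.50, 0.8, 1100.0),
        ]

        wild_value = sum(a * d * (1 - m) * c for _, a, d, m, c in crops)
        managed_value = sum(a * d * m * c for _, a, d, m, c in crops)
        print(f"wild pollination service:    {wild_value:,.0f}")
        print(f"managed pollination service: {managed_value:,.0f}")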

  1. Systems identification and the adaptive management of waterfowl in the United States

    USGS Publications Warehouse

    Williams, B.K.; Nichols, J.D.

    2001-01-01

    Waterfowl management in the United States is one of the nation's more visible conservation success stories. It is authorized and supported by appropriate legislative authorities, based on large-scale monitoring programs, and widely accepted by the public. The process is one of only a limited number of large-scale examples of effective collaboration between research and management, integrating scientific information with management in a coherent framework for regulatory decision-making. However, harvest management continues to face some serious technical problems, many of which focus on sequential identification of the resource system in a context of optimal decision-making. The objective of this paper is to provide a theoretical foundation for adaptive harvest management, the approach currently in use in the United States for regulatory decision-making. We lay out the legal and institutional framework for adaptive harvest management and provide a formal description of regulatory decision-making in terms of adaptive optimization. We discuss some technical and institutional challenges in applying adaptive harvest management and focus specifically on methods of estimating resource states for linear resource systems.

  2. Calibration of a COTS Integration Cost Model Using Local Project Data

    NASA Technical Reports Server (NTRS)

    Boland, Dillard; Coon, Richard; Byers, Kathryn; Levitt, David

    1997-01-01

    The software measures and estimation techniques appropriate to a Commercial Off the Shelf (COTS) integration project differ from those commonly used for custom software development. Labor and schedule estimation tools that model COTS integration are available. Like all estimation tools, they must be calibrated with the organization's local project data. This paper describes the calibration of a commercial model using data collected by the Flight Dynamics Division (FDD) of the NASA Goddard Spaceflight Center (GSFC). The model calibrated is SLIM Release 4.0 from Quantitative Software Management (QSM). By adopting the SLIM reuse model and by treating configuration parameters as lines of code, we were able to establish a consistent calibration for COTS integration projects. The paper summarizes the metrics, the calibration process and results, and the validation of the calibration.

  3. Towards sustainable management of huntable migratory waterbirds in Europe

    USGS Publications Warehouse

    Madsen, Jesper; Guillemain, Matthieu; Nagy, Szabolcs; Defos du Rau, Pierre; Mondain-Monval, Jean-Yves; Griffin, Cy; Williams, James Henty; Bunnefeld, Nils; Czajkowski, Alexandre; Hearn, Richard; Grauer, Andreas; Alhainen, Mikko; Middleton, Angus; Johnson, Fred A.

    2015-01-01

    The EU Birds Directive and the African-Eurasian Waterbird Agreement provide an adequate legal framework for sustainable management of migratory waterbird populations. The main shortcoming of both instruments is that they leave harvest decisions on a shared resource to individual Member States and Contracting Parties without providing a shared information base and a mechanism to assess the impact of harvest and coordinate actions in relation to mutually agreed objectives. A recent update of the conservation status of waterbirds in the EU shows that almost half of the populations of species listed on Annex II of the Birds Directive have a declining short-term trend, and over half of them are listed in Columns A and B of AEWA. This implies that their hunting could either continue only under the framework of an adaptive harvest management plan, or should be regulated with a view to restoring them to favourable conservation status. We argue that a structured approach to decision-making (such as adaptive management) is needed, supported by adequate organisational structures at the flyway scale. We review the experience with such an approach in North America and assess the applicability of a similar approach in the European context. We show that there is no technical reason why adaptive harvest management could not be applied in the EU or even the AEWA context. We demonstrate that an informed approach to setting allowable harvests does not require detailed demographic information. Essential to the process, however, are estimates of either the observed growth rate from a monitoring program or the growth rate expected under ideal conditions. In addition, periodic estimates of population size are needed, as well as either empirical information or reasonable assumptions about the form of density dependence. We show that such information exists for many populations, but improvements are needed in geographic coverage, reliability and timely data availability. We highlight the importance of the International Waterbird Census and of specialised goose and seaduck monitoring in estimating population sizes and the observed growth rate of the populations, and we encourage further investment in the development of these schemes. We also recognise the importance of migration studies to improve our understanding of the delineation of populations. We also highlight that, with a few exceptions, the available data do not allow the European Commission, competent authorities of the Member States or other AEWA Contracting Parties to assess levels of harvest and their sustainability and, therefore, to regulate hunting accordingly. We therefore recommend that annual reporting on harvest levels of waterbird populations be gradually introduced in the EU and the AEWA region. We propose that future AEWA and EU action plans and management plans for Annex II species apply the principles of an adaptive harvest management framework and make provisions for setting up adequate monitoring and information management systems and organisational structures to manage the decision-making process. We suggest that internationally coordinated management structures be established to facilitate dialogue, learning and communication between stakeholders with different interests and cultural backgrounds.
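
    As a concrete illustration of how the listed ingredients (a population estimate, a growth rate, and a density-dependence assumption) yield an allowable harvest, the sketch below applies a logistic MSY-type rule of the kind used in prescribed-take frameworks. The function name, the recovery factor and all numbers are assumptions for illustration, not AEWA prescriptions.

        # A minimal allowable-harvest rule built from a population estimate,
        # a growth rate, and an implicit logistic density-dependence assumption.
        def allowable_harvest(n_hat, r_max, f_recovery=0.5):
            """Harvest consistent with a stable-to-growing population.

            n_hat      -- current population size estimate
            r_max      -- maximum growth rate expected under ideal conditions
            f_recovery -- management factor in (0, 1]; lower values are more
                          precautionary (e.g., for populations to be restored)
            """
            return f_recovery * 0.5 * r_max * n_hat   # logistic MSY-type rule

        print(allowable_harvest(n_hat=120_000, r_max=0.15))                  # 4500.0
        print(allowable_harvest(n_hat=120_000, r_max=0.15, f_recovery=0.3))  # 2700.0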

  4. Batch Model for Batched Timestamps Data Analysis with Application to the SSA Disability Program

    PubMed Central

    Yue, Qingqi; Yuan, Ao; Che, Xuan; Huynh, Minh; Zhou, Chunxiao

    2016-01-01

    The Office of Disability Adjudication and Review (ODAR) is responsible for holding hearings, issuing decisions, and reviewing appeals as part of the Social Security Administration's disability determination process. In order to control and process cases, the ODAR established a Case Processing and Management System (CPMS) to record management information beginning in December 2003. The CPMS provides a detailed case status history for each case. Due to the large number of appeal requests and limited resources, the number of pending claims at ODAR was over one million cases by March 31, 2015. Our National Institutes of Health (NIH) team collaborated with SSA and developed a Case Status Change Model (CSCM) project to meet the ODAR's urgent need to reduce backlogs and improve the hearings and appeals process. One of the key issues in our CSCM project is to estimate the expected service time and its variation for each case status code. The challenge is that the system's recorded job departure times may not be the true job completion times. As the CPMS timestamp data for case status codes showed apparent batch patterns, we proposed a batch model and applied the constrained least squares method to estimate the mean service times and the variances. We also proposed a batch search algorithm to determine the optimal batch partition, as no batch partition was given in the real data. Simulation studies were conducted to evaluate the performance of the proposed methods. Finally, we applied the method to analyze real CPMS data from ODAR/SSA. PMID:27747132
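
    The constrained-least-squares step can be sketched as follows: each recorded batch contributes one equation in which the counts of jobs per status code multiply the unknown (nonnegative) mean service times. The data below are synthetic and the design matrix is an assumption; the paper's batch search algorithm for finding the optimal partition is not reproduced.

        import numpy as np
        from scipy.optimize import nnls

        # Each batch i has total elapsed time b[i]; A[i, j] counts the jobs of
        # status code j closed out in that batch. Mean service times x >= 0
        # solve min ||A x - b|| subject to the nonnegativity constraint.
        rng = np.random.default_rng(42)
        true_means = np.array([3.0, 7.0, 1.5])          # days per status code
        A = rng.integers(0, 5, size=(40, 3)).astype(float)
        b = A @ true_means + rng.normal(0, 1.0, size=40)

        x_hat, resid = nnls(A, b)
        print("estimated mean service times:", np.round(x_hat, 2))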

  5. Tools for Interdisciplinary Data Assimilation and Sharing in Support of Hydrologic Science

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Walker, J.; Suftin, I.; Warren, M.; Kunicki, T.

    2013-12-01

    Information consumed and produced in hydrologic analyses is interdisciplinary and massive. These factors put a heavy information management burden on the hydrologic science community. The U.S. Geological Survey (USGS) Office of Water Information Center for Integrated Data Analytics (CIDA) seeks to assist hydrologic science investigators with all components of their scientific data management life cycle. Ongoing data publication and software development projects will be presented demonstrating publicly available data access services and manipulation tools being developed with support from two Department of the Interior initiatives. The USGS-led National Water Census seeks to provide both data and tools in support of nationally consistent water availability estimates. Newly available data include national coverages of radar-indicated precipitation, actual evapotranspiration, water use estimates aggregated by county, and Southeast region estimates of streamflow for 12-digit hydrologic unit code watersheds. Web services making these data available and applications to access them will be demonstrated. Web-available processing services able to provide numerous streamflow statistics for any USGS daily flow record or model result time series, and other National Water Census processing tools, will also be demonstrated. The National Climate Change and Wildlife Science Center is a USGS center leading DOI-funded academic global change adaptation research. It has a mission goal to ensure that data used and produced by funded projects are available via web services and tools that streamline data management tasks in interdisciplinary science. For example, collections of downscaled climate projections, typically large collections of files that must be downloaded to be accessed, are being published using web services that allow access to the entire dataset via simple web-service requests and numerous processing tools. Recent progress on this front includes data web services for Climate Model Intercomparison Project Phase 5 based downscaled climate projections, EPA's Integrated Climate and Land Use Scenarios projections of population and land cover metrics, and MODIS-derived land cover parameters from NASA's Land Processes Distributed Active Archive Center. These new services, and ways to discover others, will be presented through demonstration of a recently open-sourced project from a web application or scripted workflow. Development and public deployment of server-based processing tools to subset and summarize these and other data is ongoing at the CIDA with partner groups such as 52 Degrees North and Unidata. The latest progress on subsetting, spatial summarization to areas of interest, and temporal summarization via common statistical methods will be presented.

  6. Pretest expectations strongly influence interpretation of abnormal laboratory results and further management

    PubMed Central

    2010-01-01

    Background Abnormal results of diagnostic laboratory tests can be difficult to interpret when disease probability is very low. Although most physicians generally do not use Bayesian calculations to interpret abnormal results, their estimates of pretest disease probability and reasons for ordering diagnostic tests may - in a more implicit manner - influence test interpretation and further management. A better understanding of this influence may help to improve test interpretation and management. Therefore, the objective of this study was to examine the influence of physicians' pretest disease probability estimates, and their reasons for ordering diagnostic tests, on test result interpretation, posttest probability estimates and further management. Methods Prospective study among 87 primary care physicians in the Netherlands who each ordered laboratory tests for 25 patients. They recorded their reasons for ordering the tests (to exclude or confirm disease or to reassure patients) and their pretest disease probability estimates. Upon receiving the results they recorded how they interpreted the tests, their posttest probability estimates and further management. Logistic regression was used to analyse whether the pretest probability and the reasons for ordering tests influenced the interpretation, the posttest probability estimates and the decisions on further management. Results The physicians ordered tests for diagnostic purposes for 1253 patients; 742 patients had an abnormal result (64%). Physicians' pretest probability estimates and their reasons for ordering diagnostic tests influenced test interpretation, posttest probability estimates and further management. Abnormal results of tests ordered for reasons of reassurance were significantly more likely to be interpreted as normal (65.8%) compared to tests ordered to confirm a diagnosis or exclude a disease (27.7% and 50.9%, respectively). The odds for abnormal results to be interpreted as normal were much lower when the physician estimated a high pretest disease probability, compared to a low pretest probability estimate (OR = 0.18, 95% CI = 0.07-0.52, p < 0.001). Conclusions Interpretation and management of abnormal test results were strongly influenced by physicians' estimation of pretest disease probability and by the reason for ordering the test. By relating abnormal laboratory results to their pretest expectations, physicians may seek a balance between over- and under-reacting to laboratory test results. PMID:20158908
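
    For reference, the Bayesian update that the physicians performed only implicitly is a one-line calculation: an abnormal result converts pretest odds to posttest odds through the test's positive likelihood ratio. The sensitivity, specificity and pretest probabilities below are illustrative, not values from the study.

        # Pretest-to-posttest probability via the positive likelihood ratio.
        def posttest_probability(pretest_p, sensitivity, specificity):
            lr_positive = sensitivity / (1.0 - specificity)
            pretest_odds = pretest_p / (1.0 - pretest_p)
            posttest_odds = pretest_odds * lr_positive
            return posttest_odds / (1.0 + posttest_odds)

        # The same abnormal result means little at low pretest probability...
        print(posttest_probability(0.02, sensitivity=0.9, specificity=0.8))  # ~0.084
        # ...but is close to conclusive when pretest suspicion was already high.
        print(posttest_probability(0.50, sensitivity=0.9, specificity=0.8))  # ~0.818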

  7. Pretest expectations strongly influence interpretation of abnormal laboratory results and further management.

    PubMed

    Houben, Paul H H; van der Weijden, Trudy; Winkens, Bjorn; Winkens, Ron A G; Grol, Richard P T M

    2010-02-16

    Abnormal results of diagnostic laboratory tests can be difficult to interpret when disease probability is very low. Although most physicians generally do not use Bayesian calculations to interpret abnormal results, their estimates of pretest disease probability and reasons for ordering diagnostic tests may--in a more implicit manner--influence test interpretation and further management. A better understanding of this influence may help to improve test interpretation and management. Therefore, the objective of this study was to examine the influence of physicians' pretest disease probability estimates, and their reasons for ordering diagnostic tests, on test result interpretation, posttest probability estimates and further management. Prospective study among 87 primary care physicians in the Netherlands who each ordered laboratory tests for 25 patients. They recorded their reasons for ordering the tests (to exclude or confirm disease or to reassure patients) and their pretest disease probability estimates. Upon receiving the results they recorded how they interpreted the tests, their posttest probability estimates and further management. Logistic regression was used to analyse whether the pretest probability and the reasons for ordering tests influenced the interpretation, the posttest probability estimates and the decisions on further management. The physicians ordered tests for diagnostic purposes for 1253 patients; 742 patients had an abnormal result (64%). Physicians' pretest probability estimates and their reasons for ordering diagnostic tests influenced test interpretation, posttest probability estimates and further management. Abnormal results of tests ordered for reasons of reassurance were significantly more likely to be interpreted as normal (65.8%) compared to tests ordered to confirm a diagnosis or exclude a disease (27.7% and 50.9%, respectively). The odds for abnormal results to be interpreted as normal were much lower when the physician estimated a high pretest disease probability, compared to a low pretest probability estimate (OR = 0.18, 95% CI = 0.07-0.52, p < 0.001). Interpretation and management of abnormal test results were strongly influenced by physicians' estimation of pretest disease probability and by the reason for ordering the test. By relating abnormal laboratory results to their pretest expectations, physicians may seek a balance between over- and under-reacting to laboratory test results.

  8. Social value and individual choice: The value of a choice-based decision-making process in a collectively funded health system.

    PubMed

    Espinoza, Manuel Antonio; Manca, Andrea; Claxton, Karl; Sculpher, Mark

    2018-02-01

    Evidence about cost-effectiveness is increasingly being used to inform decisions about the funding of new technologies that are usually implemented as guidelines from centralized decision-making bodies. However, there is also increasing recognition of the role of patients in determining their preferred treatment option. This paper presents a method to estimate the value of implementing a choice-based decision process using the cost-effectiveness analysis toolbox. This value is estimated for 3 alternative scenarios. First, it compares centralized decisions, based on population average cost-effectiveness, against a decision process based on patient choice. Second, it compares centralized decisions based on patients' subgroups with an individual choice-based decision process. Third, it compares a centralized process based on average cost-effectiveness against a choice-based process in which patients choose according to a different measure of outcome from that used by the centralized decision maker. The methods are applied to a case study of the management of acute coronary syndrome. It is concluded that implementing a choice-based process of treatment allocation may be an option in collectively funded health systems. However, its value will depend on the specific health problem and the social values considered relevant to the health system. Copyright © 2017 John Wiley & Sons, Ltd.
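
    The first scenario's comparison can be sketched in a few lines: a centralized rule assigns every patient the option with the best average net benefit, whereas a choice-based process lets each patient take his or her own best option, and the value of choice is the difference in mean net benefit. The simulated distributions below are assumptions, not the acute coronary syndrome case study, and the sketch further assumes patients choose the option that is truly best for them.

        import numpy as np

        rng = np.random.default_rng(1)
        n_patients = 10_000
        nb = np.column_stack([
            rng.normal(1.00, 0.50, n_patients),   # net benefit of treatment A
            rng.normal(0.95, 0.80, n_patients),   # net benefit of treatment B
        ])

        centralized = nb[:, nb.mean(axis=0).argmax()].mean()  # one option for all
        choice_based = nb.max(axis=1).mean()                  # each picks their best
        print(f"value of choice per patient: {choice_based - centralized:.3f}")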

  9. Estimating occupancy and abundance using aerial images with imperfect detection

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.; Womble, Jamie N.; Bower, Michael R.

    2017-01-01

    Species distribution and abundance are critical population characteristics for efficient management, conservation, and ecological insight. Point process models are a powerful tool for modelling distribution and abundance, and can incorporate many data types, including count data, presence-absence data, and presence-only data. Aerial photographic images are a natural tool for collecting data to fit point process models, but aerial images do not always capture all animals that are present at a site. Methods for estimating detection probability for aerial surveys usually include collecting auxiliary data to estimate the proportion of time animals are available to be detected. We developed an approach for fitting point process models using an N-mixture model framework to estimate detection probability for aerial occupancy and abundance surveys. Our method uses multiple aerial images taken of animals at the same spatial location to provide temporal replication of sample sites. The intersection of the images provides multiple counts of individuals at different times. We examined this approach using both simulated and real data of sea otters (Enhydra lutris kenyoni) in Glacier Bay National Park, southeastern Alaska. Using our proposed methods, we estimated detection probability of sea otters to be 0.76, the same as visual aerial surveys that have been used in the past. Further, simulations demonstrated that our approach is a promising tool for estimating occupancy, abundance, and detection probability from aerial photographic surveys. Our methods can be readily extended to data collected using unmanned aerial vehicles, as technology and regulations permit. The generality of our methods for other aerial surveys depends on how well surveys can be designed to meet the assumptions of N-mixture models.
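
    A minimal N-mixture sketch of the kind of estimation described: repeated counts at each site (here, overlapping images) are modeled as binomial draws from a Poisson-distributed true abundance, and the likelihood is maximized after marginalizing over the unobserved abundance. All values below are simulated; only the general N-mixture structure is shown, not the authors' exact model.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import poisson, binom

        rng = np.random.default_rng(7)
        n_sites, n_images, lam_true, p_true = 150, 4, 3.0, 0.76
        N = rng.poisson(lam_true, n_sites)                    # latent abundance
        y = rng.binomial(N[:, None], p_true, (n_sites, n_images))

        def neg_log_lik(theta, y, n_max=60):
            lam, p = np.exp(theta[0]), 1 / (1 + np.exp(-theta[1]))
            Ns = np.arange(n_max + 1)
            prior = poisson.pmf(Ns, lam)                      # P(N = k)
            ll = 0.0
            for yi in y:                                      # marginalize over N
                like_N = np.prod(binom.pmf(yi[:, None], Ns, p), axis=0)
                ll += np.log((like_N * prior).sum())
            return -ll

        fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
        lam_hat, p_hat = np.exp(fit.x[0]), 1 / (1 + np.exp(-fit.x[1]))
        print(f"lambda_hat={lam_hat:.2f}, p_hat={p_hat:.2f}")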

  10. The Biggest Loser Thinks Long-Term: Recency as a Predictor of Success in Weight Management.

    PubMed

    Koritzky, Gilly; Rice, Chantelle; Dieterle, Camille; Bechara, Antoine

    2015-01-01

    Only a minority of participants in behavioral weight management lose weight significantly. The ability to predict who is likely to benefit from weight management can improve the efficiency of obesity treatment. Identifying predictors of weight loss can also reveal potential ways to improve existing treatments. We propose a neuro-psychological model that is focused on recency: the reliance on recent information at the expense of time-distant information. Forty-four weight-management patients completed a decision-making task and their recency level was estimated by a mathematical model. Impulsivity and risk-taking were also measured for comparison. Weight loss was measured at the end of the 16-week intervention. Consistent with our hypothesis, successful dieters (n = 12) had lower recency scores than unsuccessful ones (n = 32; p = 0.006). Successful and unsuccessful dieters were similar in their demographics, intelligence, risk taking, impulsivity, and delay of gratification. We conclude that dieters who process time-distant information in their decision making are more likely to lose weight than those who are high in recency. We argue that having low recency facilitates future-oriented thinking, and thereby contributes to behavior change treatment adherence. Our findings underline the importance of choosing the right treatment for every individual, and outline a way to improve weight-management processes for more patients.
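
    The abstract does not specify the mathematical model used to score recency; one standard way to formalize recency is a delta-rule (exponentially decaying) update in which the learning rate sets how strongly the latest outcome overrides older ones. The sketch below is that generic formulation with illustrative numbers, and is not claimed to be the authors' model.

        def expected_value(outcomes, alpha):
            """alpha near 1 -> judgments track only recent outcomes (high recency);
            alpha near 0 -> long-term averaging (low recency)."""
            v = 0.0
            for x in outcomes:
                v += alpha * (x - v)   # delta-rule update
            return v

        history = [10, 10, 10, 10, -40]             # mostly good, bad most recently
        print(expected_value(history, alpha=0.9))   # ~ -35: dominated by the last loss
        print(expected_value(history, alpha=0.1))   # ~ -0.9: weighs the long run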

  11. Incorporating parametric uncertainty into population viability analysis models

    USGS Publications Warehouse

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
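
    The two-step simulation structure can be written directly as two nested loops: one draw from the sampling distribution of the estimated mean growth rate per replicate (parametric uncertainty), and one environmental draw per time step (temporal variance). The population model and every numeric value below are illustrative, not the piping plover analysis.

        import numpy as np

        rng = np.random.default_rng(0)
        n_reps, n_years, n0, quasi_ext = 5000, 50, 60, 10

        extinct = 0
        for _ in range(n_reps):
            # replication loop: one draw from the *sampling* distribution of the
            # estimated mean growth rate (parametric uncertainty)
            mean_r = rng.normal(loc=-0.01, scale=0.03)
            n = n0
            for _ in range(n_years):
                # time-step loop: environmental (temporal) variation around mean_r
                r_t = rng.normal(loc=mean_r, scale=0.10)
                n = n * np.exp(r_t)
                if n < quasi_ext:
                    extinct += 1
                    break

        print(f"quasi-extinction probability: {extinct / n_reps:.3f}")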

  12. Real-time yield estimation based on deep learning

    NASA Astrophysics Data System (ADS)

    Rahnemoonfar, Maryam; Sheppard, Clay

    2017-05-01

    Crop yield estimation is an important task in product management and marketing. Accurate yield prediction helps farmers make better decisions on cultivation practices, plant disease prevention, and the size of the harvest labor force. The current practice of yield estimation, based on manual counting of fruits, is a very time-consuming and expensive process, and it is not practical for big fields. Robotic systems, including Unmanned Aerial Vehicles (UAV) and Unmanned Ground Vehicles (UGV), provide an efficient, cost-effective, flexible, and scalable solution for product management and yield prediction. Recently, huge amounts of data have been gathered from agricultural fields; however, efficient analysis of those data remains a challenging task. Computer vision approaches currently face several challenges in the automatic counting of fruits or flowers, including occlusion caused by leaves, branches or other fruits, variance in natural illumination, and scale. In this paper, a novel deep convolutional network algorithm was developed to facilitate accurate yield prediction and automatic counting of fruits and vegetables in images. Our method is robust to occlusion, shadow, uneven illumination and scale. Experimental results, in comparison to the state of the art, show the effectiveness of our algorithm.
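
    To make the count-by-regression idea concrete, the sketch below maps an image directly to a predicted count with a small convolutional network, so individual occluded fruits need not be segmented. The architecture, layer sizes and dummy data are assumptions for illustration; the paper's network is not reproduced.

        import torch
        import torch.nn as nn

        class CountNet(nn.Module):
            """Toy regression CNN: RGB image in, scalar fruit count out."""
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
                    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.head = nn.Linear(64, 1)

            def forward(self, x):
                h = self.features(x).flatten(1)
                return self.head(h).squeeze(1)    # predicted count per image

        model = CountNet()
        images = torch.rand(8, 3, 128, 128)        # a dummy batch of field images
        counts = torch.randint(0, 40, (8,)).float()
        loss = nn.MSELoss()(model(images), counts)
        loss.backward()                            # gradients for one training step
        print(float(loss))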

  13. Evaluating mallard adaptive management models with time series

    USGS Publications Warehouse

    Conn, P.B.; Kendall, W.L.

    2004-01-01

    Wildlife practitioners concerned with midcontinent mallard (Anas platyrhynchos) management in the United States have instituted a system of adaptive harvest management (AHM) as an objective format for setting harvest regulations. Under the AHM paradigm, predictions from a set of models that reflect key uncertainties about processes underlying population dynamics are used in coordination with optimization software to determine an optimal set of harvest decisions. Managers use comparisons of the predictive abilities of these models to gauge the relative truth of different hypotheses about density-dependent recruitment and survival, with better-predicting models given more weight in the determination of harvest regulations. We tested the effectiveness of this strategy by examining convergence rates of 'predictor' models when the true model for population dynamics was known a priori. We generated time series for cases when the a priori model was one of the predictor models, as well as for several cases when the a priori model was not in the model set. We further examined the addition of different levels of uncertainty into the variance structure of predictor models, reflecting different levels of confidence about estimated parameters. We showed that in certain situations, the model-selection process favors a predictor model that incorporates the hypotheses of additive harvest mortality and weakly density-dependent recruitment, even when that model was not used to generate the data. Higher levels of predictor model variance led to decreased rates of convergence to the model that generated the data, but model weight trajectories were in general more stable. We suggest that predictive models should incorporate all sources of uncertainty about estimated parameters, that the variance structure should be similar for all predictor models, and that models with different functional forms for population dynamics should be considered for inclusion in predictor model sets. All of these suggestions should help lower the probability of erroneous learning in mallard AHM and in adaptive management in general.
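
    The learning step being evaluated is a Bayes-rule reweighting: each predictor model's weight is multiplied by its likelihood for the observed population size and the weights are renormalized. The predictions, predictor variances and observation below are illustrative; the second call shows how inflating the predictor variances flattens the likelihoods and slows the reweighting, consistent with the convergence behaviour reported above.

        import numpy as np
        from scipy.stats import norm

        weights = np.array([0.25, 0.25, 0.25, 0.25])   # four predictor models
        predictions = np.array([7.9, 8.4, 8.8, 9.3])   # predicted N (millions)
        observed = 8.6                                 # monitored N (millions)

        def update(weights, sigmas):
            like = norm.pdf(observed, loc=predictions, scale=sigmas)
            w = weights * like
            return w / w.sum()

        print(update(weights, sigmas=np.full(4, 0.5)))  # sharp reweighting
        print(update(weights, sigmas=np.full(4, 1.5)))  # flatter, more stable weights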

  14. Integrating the Clearance in NPP Residual Material Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia-Bermejo, R.; Lamela, B.

    Previous experience from decommissioning projects is being used to optimize residual material management in NPPs, usually metallic scrap. The approach is based on the availability of a materials clearance MARSSIM-based methodology developed and licensed in Spain. A typical project includes the integration of segregation, decontamination, clearance, quality control and quality assurance activities. The design is based on the clearance methodology features, translating them into standard operational procedures. In terms of ecological taxes and final disposal costs, significant amounts of money can be saved with this type of approach. The last clearance project managed a total amount of 405 tons of scrap metal and a similar amount of other residual materials occupying a volume of 1500 m³. After less than a year of field work, 251 tons were finally recycled in a non-licensed smelting facility; the balance was disposed of as LILW. In the planning phase the estimated cost savings were 4.5 Meuro. However, today a VLLW option is available in European countries, so the estimated cost savings are reduced to 1.2 Meuro. In conclusion: applying the lessons learnt from materials clearance in NPP decommissioning to NPP residual material management is an interesting management option. This practice is currently going on in Spanish NPPs and, in a preliminary view, is consistent with the new MARSAME draft. An interesting parameter is the cost of 1 m³ of recyclable scrap. The above estimates are very project-specific because other residual materials were involved in the segregation process. If the effect of these other materials is removed, the estimated unit cost in this project was around 1700 euro/m³, a figure clearly below the above-mentioned VLLW disposal cost of 2600 euro. In a future project it appears feasible to descend to 839 euro/m³, and if the practice becomes routine and is used in big decommissioning projects, around 600 euro/m³ or below could possibly be achieved. A rough economic analysis permits estimating savings of around 2000 US$ to 13000 US$ per cubic meter of steel scrap, depending on the variability of materials and disposal costs. Many lessons learnt from this practice were used as feedback in planning the characterization activities for decommissioning a Spanish NPP and are today considered a significant reference in our decommissioning engineering approaches.

  15. Evaluation of effectiveness of information systems implementation in organization (by example of ERP-systems)

    NASA Astrophysics Data System (ADS)

    Demyanova, O. V.; Andreeva, E. V.; Sibgatullina, D. R.; Kireeva-Karimova, A. M.; Gafurova, A. Y.; Zakirova, Ch S.

    2018-05-01

    ERP systems in a modern enterprise allow optimizing internal business processes, reducing production costs and increasing the attractiveness of enterprises for investors. They are an important component of success in competition and an important condition for attracting investment into key sectors of the state. A vivid example of such systems is enterprise information systems using the ERP (Enterprise Resource Planning) methodology. ERP is an integrated set of methods, processes, technologies and tools based on: supply chain management; advanced planning and scheduling; sales automation; product configuration; final resource planning; business intelligence; OLAP technology; e-commerce; and product data management. The main purpose of ERP systems is the automation of the interrelated processes of planning, accounting and management in key areas of the company. ERP systems are automated systems that effectively address complex problems, including optimal allocation of business resources and quick and efficient delivery of goods and services to the consumer. The knowledge embedded in ERP systems provides enterprise-wide automation, presenting the activities of all functional departments of the company as a single complex system. At the level of qualitative estimates, most managers understand that the implementation of ERP systems is a necessary and useful procedure; assessing the effectiveness of information systems implementation is therefore relevant.

  16. A quality risk management model approach for cell therapy manufacturing.

    PubMed

    Lopez, Fabio; Di Bartolo, Chiara; Piazza, Tommaso; Passannanti, Antonino; Gerlach, Jörg C; Gridelli, Bruno; Triolo, Fabio

    2010-12-01

    International regulatory authorities view risk management as an essential production need for the development of innovative, somatic cell-based therapies in regenerative medicine. The available risk management guidelines, however, provide little guidance on specific risk analysis approaches and procedures applicable in clinical cell therapy manufacturing. This raises a number of problems. Cell manufacturing is a poorly automated process, prone to operator-introduced variations, and affected by heterogeneity of the processed organs/tissues and lot-dependent variability of reagent (e.g., collagenase) efficiency. In this study, the principal challenges faced in a cell-based product manufacturing context (i.e., high dependence on human intervention and absence of reference standards for acceptable risk levels) are identified and addressed, and a risk management model approach applicable to manufacturing of cells for clinical use is described for the first time. The use of the heuristic and pseudo-quantitative failure mode and effect analysis/failure mode and critical effect analysis risk analysis technique associated with direct estimation of severity, occurrence, and detection is, in this specific context, as effective as, but more efficient than, the analytic hierarchy process. Moreover, a severity/occurrence matrix and Pareto analysis can be successfully adopted to identify priority failure modes on which to act to mitigate risks. The application of this approach to clinical cell therapy manufacturing in regenerative medicine is also discussed. © 2010 Society for Risk Analysis.
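
    The scoring scheme described (direct estimation of severity, occurrence and detection, followed by Pareto analysis) can be sketched compactly: each failure mode's risk priority number is RPN = S x O x D, and a Pareto cut flags the modes carrying roughly 80% of the total risk. The failure modes and scores below are illustrative of a cell-manufacturing context, not the study's actual analysis.

        # FMEA-style risk priority numbers with a Pareto cut (illustrative data).
        failure_modes = [
            # (failure mode, severity, occurrence, detection), each scored 1-10
            ("operator pipetting variation",   7, 6, 5),
            ("collagenase lot inefficiency",   8, 5, 6),
            ("tissue heterogeneity",           6, 7, 7),
            ("incubator temperature drift",    9, 2, 3),
            ("label mix-up",                  10, 1, 2),
        ]

        scored = sorted(((n, s * o * d) for n, s, o, d in failure_modes),
                        key=lambda t: -t[1])          # rank by RPN, descending
        total = sum(rpn for _, rpn in scored)
        cum = 0.0
        for name, rpn in scored:
            cum += rpn
            flag = "PRIORITY" if cum / total <= 0.8 else ""
            print(f"{name:32s} RPN={rpn:4d} {flag}")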

  17. Hindcast of water availability in regional aquifer systems using MODFLOW Farm Process

    USGS Publications Warehouse

    Schmid, Wolfgang; Hanson, Randall T.; Faunt, Claudia C.; Phillips, Steven P.

    2015-01-01

    Coupled groundwater and surface-water components of the hydrologic cycle can be simulated by the Farm Process for MODFLOW (MF-FMP) in both irrigated and non-irrigated areas and aquifer-storage and recovery systems. MF-FMP is being applied to three productive agricultural regions of different scale in the State of California, USA, to assess the availability of water and the impacts of alternative management decisions. Hindcast simulations are conducted for similar periods from the 1960s to near recent times. Historical groundwater pumpage is mostly unknown in one region (Central Valley) and is estimated by MF-FMP. In another region (Pajaro Valley), recorded pumpage is used to calibrate model-estimated pumpage. Multiple types of observations are used to estimate uncertain parameters, such as hydraulic, land-use, and farm properties. MF-FMP simulates how climate variability and water-import availability affect water demand and supply. MF-FMP can be used to predict water availability based on anticipated changes in anthropogenic or natural water demands. Keywords: groundwater; surface-water; irrigation; water availability; response to climate variability/change.

  18. Rapid assessment of rice seed availability for wildlife in harvested fields

    USGS Publications Warehouse

    Halstead, B.J.; Miller, M.R.; Casazza, Michael L.; Coates, P.S.; Farinha, M.A.; Benjamin, Gustafson K.; Yee, J.L.; Fleskes, J.P.

    2011-01-01

    Rice seed remaining in commercial fields after harvest (waste rice) is a critical food resource for wintering waterfowl in rice-growing regions of North America. Accurate and precise estimates of the seed mass density of waste rice are essential for planning waterfowl wintering habitat extents and management. In the Sacramento Valley of California, USA, the existing method for obtaining estimates of availability of waste rice in harvested fields produces relatively precise estimates, but the labor-, time-, and machinery-intensive process is not practical for routine assessments needed to examine long-term trends in waste rice availability. We tested several experimental methods designed to rapidly derive estimates that would not be burdened with disadvantages of the existing method. We first conducted a simulation study of the efficiency of each method and then conducted field tests. For each approach, methods did not vary in root mean squared error, although some methods did exhibit bias for both simulations and field tests. Methods also varied substantially in the time to conduct each sample and in the number of samples required to detect a standard trend. Overall, modified line-intercept methods performed well for estimating the density of rice seeds. Waste rice in the straw, although not measured directly, can be accounted for by a positive relationship with density of rice on the ground. Rapid assessment of food availability is a useful tool to help waterfowl managers establish and implement wetland restoration and agricultural habitat-enhancement goals for wintering waterfowl. © 2011 The Wildlife Society.

  19. Depression and chronic pain in the elderly: links and management challenges

    PubMed Central

    Zis, Panagiotis; Daskalaki, Argyro; Bountouni, Ilia; Sykioti, Panagiota; Varrassi, Giustino; Paladini, Antonella

    2017-01-01

    Aging is an inevitable process and represents the accumulation of bodily alterations over time. Depression and chronic pain are highly prevalent in elderly populations. It is estimated that 13% of the elderly population will suffer simultaneously from the two conditions. Accumulating evidence suggests that neuroinflammation plays a critical role in the pathogenesis of both depression and chronic pain. Apart from the common pathophysiological mechanisms, however, the two entities have several clinical links. Their management is challenging for the pain physician; however, both pharmacologic and nonpharmacologic approaches are available and can be used when the two conditions are comorbid in elderly patients. PMID:28461745

  20. WIS Implementation Study Report. Volume 2. Resumes.

    DTIC Science & Technology

    1983-10-01

    WIS modernization that major attention be paid to interface definition and design, system integration and test, and configuration management of the...Estimates -- Computer Corporation of America -- 155 Test Processing Systems -- Newburyport Computer Associates, Inc. -- 183 Cluster II Papers -- Standards...enhancements of the SPL/I compiler system, development of test systems for the verification of SDEX/M and the timing and architecture of the AN/UYK-20 and

  1. Man Portable Vector EMI Sensor for Full UXO Characterization

    DTIC Science & Technology

    2012-05-01

    with project management and coordination. Drs. Laurens Beran, Leonard Pasion, and Stephen Billings advised on technical aspects and Dr. Gregory Schultz...approximated as a point dipole (e.g., Bell et al., 2001; Pasion and Oldenburg, 2001; Gasperikova et al., 2009). The process of estimating the target...39, 1286–1293. Bell, T. 2005. Geo-location Requirements for UXO Discrimination. SERDP Geo-location Workshop. Billings, S., L. Pasion, N. Lhomme

  2. Design and Development of a User Interface for the Dynamic Model of Software Project Management.

    DTIC Science & Technology

    1988-03-01

    rectory of the user's choice for future...the last choice selected. Let us assume for the sake of this tour that the user has selected all eight choices. ESTIMATED ACTUAL PROJECT SIZE DEFINITION...manipulation of variables in the Dynamica model...The user interface for the Dynamica model was designed by an iterative process of prototyping

  3. Fuels planning: science synthesis and integration; forest structure and fire hazard fact sheet 01: forest structure and fire hazard overview

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    Many managers and policymakers guided by the National Environmental Policy Act process want to understand the scientific principles on which they can base fuel treatments for reducing the size and severity of wildfires. These Forest Structure and Fire Hazard fact sheets discuss how to estimate fire hazard, how to visualize fuel treatments, and how the role of...

  4. Preliminary design of a redundant strapped down inertial navigation unit using two-degree-of-freedom tuned-gimbal gyroscopes

    NASA Technical Reports Server (NTRS)

    1976-01-01

    This redundant strapdown INS preliminary design study demonstrates the practicality of a skewed sensor system configuration by means of: (1) devising a practical system mechanization utilizing proven strapdown instruments, (2) thoroughly analyzing the skewed sensor redundancy management concept to determine optimum geometry, data processing requirements, and realistic reliability estimates, and (3) implementing the redundant computers into a low-cost, maintainable configuration.

  5. End-to-end modeling as part of an integrated research program in the Bering Sea

    NASA Astrophysics Data System (ADS)

    Punt, André E.; Ortiz, Ivonne; Aydin, Kerim Y.; Hunt, George L.; Wiese, Francis K.

    2016-12-01

    Traditionally, the advice provided to fishery managers has focused on the trade-offs between short- and long-term yields, and between future resource size and expected future catches. The harvest control rules that are used to provide management advice consequently relate catches to stock biomass levels expressed relative to reference biomass levels. There are, however, additional trade-offs. Ecosystem-based fisheries management (EBFM) aims to consider fish and fisheries in their ecological context, taking into account physical, biological, economic, and social factors. However, making EBFM operational remains challenging. It is generally recognized that end-to-end modeling should be a key part of implementing EBFM, along with harvest control rules that use information in addition to estimates of stock biomass to provide recommendations for management actions. Here we outline the process for selecting among alternative management strategies in an ecosystem context and summarize a Field-integrated End-To-End modeling program, or FETE, intended to implement this process as part of the Bering Sea Project. A key aspect of this project was that, from the start, the FETE included a management strategy evaluation component to compare management strategies. Effective use of end-to-end modeling requires that the models developed for a system are indeed integrated across climate drivers, lower trophic levels, fish population dynamics, and fisheries and their management. We summarize the steps taken by the program managers to promote integration of modeling efforts by multiple investigators and highlight the lessons learned during the project that can be used to guide future use and design of end-to-end models.

  6. Modeling with uncertain science: estimating mitigation credits from abating lead poisoning in Golden Eagles.

    PubMed

    Fitts Cochrane, Jean; Lonsdorf, Eric; Allison, Taber D; Sanders-Reed, Carol A

    2015-09-01

    Challenges arise when renewable energy development triggers "no net loss" policies for protected species, such as where wind energy facilities affect Golden Eagles in the western United States. When established mitigation approaches are insufficient to fully avoid or offset losses, conservation goals may still be achievable through experimental implementation of unproven mitigation methods provided they are analyzed within a framework that deals transparently and rigorously with uncertainty. We developed an approach to quantify and analyze compensatory mitigation that (1) relies on expert opinion elicited in a thoughtful and structured process to design the analysis (models) and supplement available data, (2) builds computational models as hypotheses about cause-effect relationships, (3) represents scientific uncertainty in stochastic model simulations, (4) provides probabilistic predictions of "relative" mortality with and without mitigation, (5) presents results in clear formats useful to applying risk management preferences (regulatory standards) and selecting strategies and levels of mitigation for immediate action, and (6) defines predictive parameters in units that could be monitored effectively, to support experimental adaptive management and reduction in uncertainty. We illustrate the approach with a case study characterized by high uncertainty about underlying biological processes and high conservation interest: estimating the quantitative effects of voluntary strategies to abate lead poisoning in Golden Eagles in Wyoming due to ingestion of spent game hunting ammunition.
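
    Steps (3) and (4) amount to Monte Carlo propagation of expert-elicited parameter distributions through a cause-effect model, yielding probabilistic relative mortality with and without mitigation. The sketch below shows only that structure; the beta distributions, the single-pathway model and all parameters are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(3)
        n_sims = 20_000

        # Expert-elicited uncertainty represented as parameter distributions:
        exposure = rng.beta(2, 8, n_sims)            # P(eagle ingests lead fragments)
        death_given_exposure = rng.beta(3, 7, n_sims)
        uptake = rng.beta(4, 4, n_sims)              # hunters switching to non-lead ammo

        baseline = exposure * death_given_exposure
        mitigated = exposure * (1 - uptake) * death_given_exposure
        credits = baseline - mitigated               # relative mortality averted

        lo, med, hi = np.percentile(credits, [5, 50, 95])
        print(f"mortality averted per eagle-year: {med:.4f} (90% interval {lo:.4f}-{hi:.4f})")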

  7. The role of local populations within a landscape context: Defining and classifying sources and sinks

    USGS Publications Warehouse

    Runge, J.P.; Runge, M.C.; Nichols, J.D.

    2006-01-01

    The interaction of local populations has been the focus of an increasing number of studies in the past 30 years. The study of source-sink dynamics has especially generated much interest. Many of the criteria used to distinguish sources and sinks incorporate the process of apparent survival (i.e., the combined probability of true survival and site fidelity) but not emigration. These criteria implicitly treat emigration as mortality, thus biasing the classification of sources and sinks in a manner that could lead to flawed habitat management. Some of the same criteria require rather restrictive assumptions about population equilibrium that, when violated, can also generate misleading inference. Here, we expand on a criterion (denoted "contribution," or Cr) that incorporates successful emigration in differentiating sources and sinks and that makes no restrictive assumptions about dispersal or equilibrium processes in populations of interest. The metric Cr is rooted in the theory of matrix population models, yet it also contains clearly specified parameters that have been estimated in previous empirical research. We suggest that estimates of emigration are important for delineating sources and sinks and, more generally, for evaluating how local populations interact to generate overall system dynamics. This suggestion has direct implications for issues such as species conservation and habitat management.
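
    As a rough illustration of the idea (not the paper's exact formulation of Cr, which should be consulted directly), a contribution-style metric credits a local population with survivors that stay, survivors that successfully settle elsewhere, and locally produced recruits; a value above 1 marks a source. All rates below are hypothetical.

        # Contribution-style source/sink classification (illustrative sketch).
        def contribution(surv, fidelity, emigrant_settle, recruits):
            stay = surv * fidelity                          # survive and stay
            move = surv * (1 - fidelity) * emigrant_settle  # survive, settle elsewhere
            return stay + move + recruits

        cr = contribution(surv=0.6, fidelity=0.8, emigrant_settle=0.7, recruits=0.35)
        print(cr, "-> source" if cr > 1 else "-> sink")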

  8. Regression sampling: some results for resource managers and researchers

    Treesearch

William G. O'Regan; Robert W. Boyd

    1974-01-01

Regression sampling is widely used in natural resources management and research to estimate quantities of resources per unit area. This note brings together results found in the statistical literature on the application of this sampling technique. Conditional and unconditional estimators are listed, and for each estimator, exact variances and unbiased estimators for the...

  9. Effective sociodemographic population assessment of elusive species in ecology and conservation management.

    PubMed

    Head, Josephine S; Boesch, Christophe; Robbins, Martha M; Rabanal, Luisa I; Makaga, Loïc; Kühl, Hjalmar S

    2013-09-01

Wildlife managers are urgently searching for improved sociodemographic population assessment methods to evaluate the effectiveness of implemented conservation activities. These need to be inexpensive, appropriate for a wide spectrum of species and straightforward to apply by local staff members with minimal training. Furthermore, conservation management would benefit from single approaches which cover many aspects of population assessment beyond only density estimates, to include for instance social and demographic structure, movement patterns, or species interactions. Remote camera traps have traditionally been used to measure species richness. Currently, there is a rapid move toward using remote camera trapping in density estimation, community ecology, and conservation management. Here, we demonstrate such comprehensive population assessment by linking remote video trapping, spatially explicit capture-recapture (SECR) techniques, and other methods. We apply it to three species: chimpanzees Pan troglodytes troglodytes, gorillas Gorilla gorilla gorilla, and forest elephants Loxodonta cyclotis in Loango National Park, Gabon. All three species exhibited considerable heterogeneity in capture probability at the sex or group level and density was estimated at 1.72, 1.2, and 1.37 individuals per km2 and male to female sex ratios were 1:2.1, 1:3.2, and 1:2 for chimpanzees, gorillas, and elephants, respectively. Association patterns revealed four, eight, and 18 independent social groups of chimpanzees, gorillas, and elephants, respectively: key information for both conservation management and studies on the species' ecology. Additionally, there was evidence of resident and nonresident elephants within the study area and intersexual variation in home range size among elephants but not chimpanzees. Our study highlights the potential of combining camera trapping and SECR methods in conducting detailed population assessments that go far beyond documenting species diversity patterns or estimating single species population size. Our study design is widely applicable to other species and spatial scales, and moderately trained staff members can collect and process the required data. Furthermore, assessments using the same method can be extended to include several other ecological, behavioral, and demographic aspects: fission and fusion dynamics and intergroup transfers, birth and mortality rates, species interactions, and ranging patterns.

  10. Effective sociodemographic population assessment of elusive species in ecology and conservation management

    PubMed Central

    Head, Josephine S; Boesch, Christophe; Robbins, Martha M; Rabanal, Luisa I; Makaga, Loïc; Kühl, Hjalmar S

    2013-01-01

    Wildlife managers are urgently searching for improved sociodemographic population assessment methods to evaluate the effectiveness of implemented conservation activities. These need to be inexpensive, appropriate for a wide spectrum of species and straightforward to apply by local staff members with minimal training. Furthermore, conservation management would benefit from single approaches which cover many aspects of population assessment beyond only density estimates, to include for instance social and demographic structure, movement patterns, or species interactions. Remote camera traps have traditionally been used to measure species richness. Currently, there is a rapid move toward using remote camera trapping in density estimation, community ecology, and conservation management. Here, we demonstrate such comprehensive population assessment by linking remote video trapping, spatially explicit capture–recapture (SECR) techniques, and other methods. We apply it to three species: chimpanzees Pan troglodytes troglodytes, gorillas Gorilla gorilla gorilla, and forest elephants Loxodonta cyclotis in Loango National Park, Gabon. All three species exhibited considerable heterogeneity in capture probability at the sex or group level and density was estimated at 1.72, 1.2, and 1.37 individuals per km2 and male to female sex ratios were 1:2.1, 1:3.2, and 1:2 for chimpanzees, gorillas, and elephants, respectively. Association patterns revealed four, eight, and 18 independent social groups of chimpanzees, gorillas, and elephants, respectively: key information for both conservation management and studies on the species' ecology. Additionally, there was evidence of resident and nonresident elephants within the study area and intersexual variation in home range size among elephants but not chimpanzees. Our study highlights the potential of combining camera trapping and SECR methods in conducting detailed population assessments that go far beyond documenting species diversity patterns or estimating single species population size. Our study design is widely applicable to other species and spatial scales, and moderately trained staff members can collect and process the required data. Furthermore, assessments using the same method can be extended to include several other ecological, behavioral, and demographic aspects: fission and fusion dynamics and intergroup transfers, birth and mortality rates, species interactions, and ranging patterns. PMID:24101982

  11. SU-F-P-19: Fetal Dose Estimate for a High-Dose Fluoroscopy Guided Intervention Using Modern Data Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moirano, J

Purpose: An accurate dose estimate is necessary for effective patient management after a fetal exposure. In the case of a high-dose exposure, it is critical to use all resources available in order to make the most accurate assessment of the fetal dose. This work will demonstrate a methodology for accurate fetal dose estimation using tools that have recently become available in many clinics, and show examples of best practices for collecting data and performing the fetal dose calculation. Methods: A fetal dose estimate calculation was performed using modern data collection tools to determine parameters for the calculation. The reference point air kerma as displayed by the fluoroscopic system was checked for accuracy. A cumulative dose incidence map and DICOM header mining were used to determine the displayed reference point air kerma. Corrections for attenuation caused by the patient table and pad were measured and applied in order to determine the peak skin dose. The position and depth of the fetus were determined by ultrasound imaging and consultation with a radiologist. The data collected were used to determine a normalized uterus dose from Monte Carlo simulation data. Fetal dose values from this process were compared to other accepted calculation methods. Results: An accurate high-dose fetal dose estimate was made. Comparisons to accepted legacy methods were within 35% of estimated values. Conclusion: Modern data collection and reporting methods ease the process of estimating fetal dose from interventional fluoroscopy exposures. Many aspects of the calculation can now be quantified rather than estimated, which should allow for a more accurate estimation of fetal dose.

  12. Closed Fuel Cycle Waste Treatment Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vienna, J. D.; Collins, E. D.; Crum, J. V.

This study is aimed at evaluating the existing waste management approaches for nuclear fuel cycle facilities in comparison to the objectives of implementing an advanced fuel cycle in the U.S. under current legal, regulatory, and logistical constructs. The study begins with the Global Nuclear Energy Partnership (GNEP) Integrated Waste Management Strategy (IWMS) (Gombert et al. 2008) as a general strategy and associated Waste Treatment Baseline Study (WTBS) (Gombert et al. 2007). The tenets of the IWMS are equally valid for the current waste management study. However, the flowsheet details have changed significantly from those considered under GNEP. In addition, significant additional waste management technology development has occurred since the GNEP waste management studies were performed. This study updates the information found in the WTBS, summarizes the results of more recent technology development efforts, and describes waste management approaches as they apply to a representative full recycle reprocessing flowsheet. Many of the waste management technologies discussed also apply to other potential flowsheets that involve reprocessing. These applications are occasionally discussed where the data are more readily available. The report summarizes the waste arising from aqueous reprocessing of a typical light-water reactor (LWR) fuel to separate actinides for use in fabricating metal sodium fast reactor (SFR) fuel and from electrochemical reprocessing of the metal SFR fuel to separate actinides for recycle back into the SFR in the form of metal fuel. The primary streams considered and the recommended waste forms include: Tritium in low-water cement in high integrity containers (HICs); Iodine-129: As a reference case, a glass composite material (GCM) formed by the encapsulation of the silver Mordenite (AgZ) getter material in a low-temperature glass is assumed. A number of alternatives with distinct advantages are also considered including a fused silica waste form with encapsulated nano-sized AgI crystals; Carbon-14 immobilized as CaCO3 in a cement waste form; Krypton-85 stored as a compressed gas; An aqueous reprocessing high-level waste (HLW) raffinate waste immobilized by the vitrification process; An undissolved solids (UDS) fraction from aqueous reprocessing of LWR fuel either included in the borosilicate HLW glass or immobilized in the form of a metal alloy or titanate ceramics; Zirconium-based LWR fuel cladding hulls and stainless steel (SS) fuel assembly hardware super-compacted for disposal or purified for reuse (or disposal as low-level waste, LLW) of Zr by reactive gas separations; Electrochemical process salt HLW incorporated into a glass-bonded Sodalite waste form; and Electrochemical process UDS and SS cladding hulls melted into an iron-based alloy waste form. Mass and volume estimates for each of the recommended waste forms based on the source terms from a representative flowsheet are reported. In addition to the above listed primary waste streams, a range of secondary process wastes are generated by aqueous reprocessing of LWR fuel, metal SFR fuel fabrication, and electrochemical reprocessing of SFR fuel. These secondary wastes have been summarized and volumes estimated by type and classification. The important waste management data gaps and research needs have been summarized for each primary waste stream and selected waste process.

  13. Energy-balanced algorithm for RFID estimation

    NASA Astrophysics Data System (ADS)

    Zhao, Jumin; Wang, Fangyuan; Li, Dengao; Yan, Lijuan

    2016-10-01

RFID has been widely used in various commercial applications, ranging from inventory control and supply chain management to object tracking. It is often necessary to estimate the number of RFID tags deployed in a large area periodically and automatically. Most prior works use passive tags for estimation and focus on designing time-efficient algorithms that can estimate tens of thousands of tags in seconds. But for an RFID reader to access tags in a large area, active tags are likely to be used due to their longer operational range. These tags, however, draw on their own batteries for energy, so conserving energy for active tags becomes critical. Some prior works have studied how to reduce the energy expenditure of an RFID reader when it reads tag IDs. In this paper, we study how to reduce the amount of energy consumed by active tags during the process of estimating the number of tags in a system, and how to keep the energy consumed by each tag approximately balanced. We design an energy-balanced estimation algorithm that achieves both goals.
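    A common building block for such estimators is the framed slotted ALOHA "empty-slot" estimator: each active tag replies in one random slot of a frame with a small persistence probability, and the tag count is inferred from the fraction of empty slots. A low persistence probability keeps per-tag energy small and roughly equal. The sketch below is a generic illustration of this idea, not the authors' specific algorithm.

        import math, random

        def estimate_tags(frame_size, persistence, empty_slots):
            """Estimate tag count from the observed number of empty slots.

            Each tag independently replies in one uniformly chosen slot with
            probability `persistence`, so P(a given slot stays empty) is
            (1 - persistence/frame_size)**n.  Inverting gives the estimate.
            """
            p_empty = empty_slots / frame_size
            return math.log(p_empty) / math.log(1 - persistence / frame_size)

        # Simulate one frame: n tags, each answering with low probability so
        # energy use is spread evenly across tags and frames.
        n, f, p = 5000, 512, 0.05
        slots = [0] * f
        for _ in range(n):
            if random.random() < p:
                slots[random.randrange(f)] += 1
        print(round(estimate_tags(f, p, slots.count(0))))  # near 5000 on average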

  14. Estimation and evaluation of management options to control and/or reduce the risk of not complying with commercial sterility.

    PubMed

    Pujol, Laure; Albert, Isabelle; Magras, Catherine; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2015-11-20

In a previous study, a modular process risk model, from raw material reception to final product storage, was built to estimate the risk of a UHT-aseptic line not complying with commercial sterility (Pujol et al., 2015). The present study focused on demonstrating how the model (an updated version with uncertainty and variability separated, run with a second-order Monte Carlo procedure) could be used to assess quantitatively the influence of management options. This assessment was done in three steps: pinpoint which process step had the highest influence on the risk, identify which management option(s) could be the most effective to control and/or reduce the risk, and finally evaluate quantitatively the influence of changing process setting(s) on the risk. For Bacillus cereus, it was identified that during post-process storage in an aseptic tank, there was potentially an air re-contamination due to filter efficiency loss (efficiency loss due to successive in-place sterilizations after cleaning operations), followed by B. cereus growth. Two options were then evaluated: i) reducing by one fifth the number of filter sterilizations before renewing the filters; ii) designing new UHT-aseptic lines without an aseptic tank, i.e. without a storage period after the thermal process and before filling. Considering the uncertainty in the model, it was not possible to confirm whether these options had a significant influence on the risk associated with B. cereus. On the other hand, for Geobacillus stearothermophilus, combinations of heat-treatment time and temperature enabling the control or reduction of risk by a factor of ca. 100 were determined; for ease of operational implementation, they were presented graphically in the form of iso-risk curves. For instance, it was established that a heat treatment of 138°C for 31 s (instead of 138°C for 25 s) enabled a reduction in risk to 18×10^-8 (95% CI=[10; 34]×10^-8), instead of 578×10^-8 (95% CI=[429; 754]×10^-8) initially. In conclusion, a modular risk model, such as the one exemplified here with a UHT-aseptic line, is a valuable tool in process design and operation, bringing definitive quantitative elements into the decision-making process. Copyright © 2015 Elsevier B.V. All rights reserved.
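    The separation of uncertainty from variability mentioned above is typically implemented as a two-dimensional (second-order) Monte Carlo: an outer loop samples uncertain parameters, an inner loop samples unit-to-unit variability given those parameters, and the spread across the outer loop yields a credible interval on the risk. A generic sketch with made-up distributions, not the authors' model:

        import random, statistics

        def second_order_mc(n_outer=200, n_inner=2000):
            """Two-dimensional Monte Carlo separating uncertainty from
            variability.  Outer loop: sample an uncertain parameter (here a
            hypothetical mean log10 contamination level).  Inner loop: sample
            variability between units and count failures.  Returns the median
            risk and an approximate 95% credible interval.
            """
            risks = []
            for _ in range(n_outer):
                mu = random.gauss(-3.0, 0.5)       # uncertain mean (invented)
                failures = 0
                for _ in range(n_inner):
                    level = random.gauss(mu, 1.0)  # unit-to-unit variability
                    if level > 0.0:                # hypothetical failure limit
                        failures += 1
                risks.append(failures / n_inner)
            risks.sort()
            return (statistics.median(risks),
                    risks[int(0.025 * n_outer)], risks[int(0.975 * n_outer)])

        median, lo, hi = second_order_mc()
        print(f"risk ~ {median:.2e} (95% CI {lo:.2e} - {hi:.2e})")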

  15. Battery Power Management in Heavy-duty HEVs based on the Estimated Critical Surface Charge

    DTIC Science & Technology

    2011-03-01

[Report documentation page fragment; only part of the abstract, the keywords, and one citation are recoverable.] Abstract fragment: "...health prospects without any penalty on fuel efficiency." Keywords: lithium-ion battery; power management; critical surface charge; lithium-ion concentration; estimation; extended... Cited work: Di Domenico, D., Fiengo, G., and Stefanopoulou, A. (2008), "Lithium-ion battery state of charge estimation with a Kalman filter based on a..."

  16. The cost-effectiveness of syndromic management in pharmacies in Lima, Peru.

    PubMed

    Adams, Elisabeth J; Garcia, Patricia J; Garnett, Geoffrey P; Edmunds, W John; Holmes, King K

    2003-05-01

    Many people with sexually transmitted diseases (STDs) in Lima, Peru, seek treatment in pharmacies. The goal was to assess the cost-effectiveness of training pharmacy workers in syndromic management of STDs. Cost-effectiveness from both the program and societal perspectives was determined on the basis of study costs, societal costs (cost of medicine), and the number of cases adequately managed. The latter was calculated from estimated incidence, proportion of symptomatic patients, proportion seeking treatment in pharmacies, and proportion of cases adequately managed in both comparison and intervention districts. Univariate and multivariate sensitivity analyses were performed. Under base-case assumptions, from the societal perspective the intervention saved an estimated US$1.51 per case adequately managed; from the program perspective, it cost an estimated US$3.67 per case adequately managed. In the sensitivity analyses, the proportion of females with vaginal discharge or pelvic inflammatory disease who seek treatment in pharmacies had the greatest impact on the estimated cost-effectiveness, along with the medication costs under the societal perspective. Training pharmacists in syndromic management of STDs appears to be cost-effective when only program costs are used and cost-saving from the societal perspective. Our methods provide a template for assessing the cost-effectiveness of managing STD syndromes, on the basis of indirect estimates of effectiveness.

  17. Spatio-temporal population estimates for risk management

    NASA Astrophysics Data System (ADS)

    Cockings, Samantha; Martin, David; Smith, Alan; Martin, Rebecca

    2013-04-01

Accurate estimation of population at risk from hazards and effective emergency management of events require not just appropriate spatio-temporal modelling of hazards but also of population. While much recent effort has been focused on improving the modelling and prediction of hazards (both natural and anthropogenic), there has been little parallel advance in the measurement or modelling of population statistics. Different hazard types occur over diverse temporal cycles, are of varying duration and differ significantly in their spatial extent. Even events of the same hazard type, such as flood events, vary markedly in their spatial and temporal characteristics. Conceptually and pragmatically then, population estimates should also be available for similarly varying spatio-temporal scales. Routine population statistics derived from traditional censuses or surveys are usually static representations in both space and time, recording people at their place of usual residence on census/survey night and presenting data for administratively defined areas. Such representations effectively fix the scale of population estimates in both space and time, which is unhelpful for meaningful risk management. Over recent years, the Pop24/7 programme of research, based at the University of Southampton (UK), has developed a framework for spatio-temporal modelling of population, based on gridded population surfaces. Built on a data model which is fully flexible in terms of space and time, the framework allows population estimates to be produced for any time slice relevant to the data contained in the model. It is based around a set of origin and destination centroids, which have capacities, spatial extents and catchment areas, all of which can vary temporally, such as by time of day, day of week, or season. A background layer, containing information on features such as transport networks and land use, provides information on the likelihood of people being in certain places at specific times. Unusual patterns associated with special events can also be modelled, and the framework is fully volume-preserving. Outputs from the model are gridded population surfaces for the specified time slice, either for total population or by sub-groups (e.g. age). Software to implement the models (SurfaceBuilder247) has been developed, and pre-processed layers for typical time slices for England and Wales in 2001 and 2006 are available for UK academic purposes. The outputs and modelling framework from the Pop24/7 programme provide significant opportunities for risk management applications. For estimates of mid- to long-term cumulative population exposure to hazards, such as in flood risk mapping, populations can be produced for numerous time slices and integrated with flood models. For applications in emergency response/management, time-specific population models can be used as seeds for agent-based models or other response/behaviour models. Estimates for sub-groups of the population also permit exploration of vulnerability through space and time. This paper outlines the requirements for effective spatio-temporal population models for risk management. It then describes the Pop24/7 framework and illustrates its potential for risk management through presentation of examples from natural and anthropogenic hazard applications. The paper concludes by highlighting key challenges for future research in this area.
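    The core idea, redistributing a fixed total population between origin (residential) and destination (workplace, school) centroids according to the time slice, can be sketched in a few lines. This is an illustration of the volume-preserving principle only, not the SurfaceBuilder247 implementation, and all names and numbers are invented.

        def redistribute(residents, worker_fraction, workplace_share, hour):
            """Volume-preserving reallocation of population for one time slice.

            residents       -- {zone: residential population}
            worker_fraction -- share of residents away at work in work hours
            workplace_share -- {zone: share of all commuters it attracts};
                               shares must sum to 1
            hour            -- hour of day (0-23)
            """
            at_work = worker_fraction if 9 <= hour < 17 else 0.0
            commuters = sum(pop * at_work for pop in residents.values())
            estimate = {z: pop * (1 - at_work) for z, pop in residents.items()}
            for zone, share in workplace_share.items():
                estimate[zone] = estimate.get(zone, 0.0) + commuters * share
            return estimate

        res = {"suburb_a": 10000, "suburb_b": 8000, "centre": 2000}
        jobs = {"centre": 0.7, "suburb_a": 0.2, "suburb_b": 0.1}
        day = redistribute(res, 0.5, jobs, hour=11)
        assert abs(sum(day.values()) - sum(res.values())) < 1e-6  # total preserved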

  18. Effects of Climate Change on Extreme Streamflow Risks in the Olympic National Park

    NASA Astrophysics Data System (ADS)

    Tohver, I. M.; Lee, S.; Hamlet, A.

    2011-12-01

Conventionally, natural resource management practices are designed within the framework that past conditions serve as a baseline for future conditions. However, the warmer future climate projected for the Pacific Northwest will alter the region's flood and low flow risks, posing considerable challenges to resource managers in the Olympic National Forest (ONF) and Olympic National Park (ONP). Shifts in extreme streamflow will influence two key management objectives in the ONF and ONP: the protection of wildlife and the maintenance of road infrastructure. The ONF is charged with managing habitat for species listed under the Endangered Species Act (ESA), and with maintaining the network of forest roads and culverts. Climate-induced increases in flood severity will introduce additional challenges in road and culvert design. Furthermore, the aging road infrastructure and more extreme summer low flows will compromise aquatic habitats that are intrinsic to the health of threatened and endangered fish species listed under the ESA. Current practice uses estimates of Q100 (the peak flow with an estimated 100-year return frequency) as the standard metric for stream-crossing design. Simple regression models relating annual precipitation and basin area to Q100 are used in the design process. Low flow estimates are based on historical streamflow data to calculate the 7-day consecutive lowest flow with a 10-year return interval, or 7Q10. Under projections of a changing climate, these methods for estimating extreme flows are ill-equipped to capture the complex and spatially varying effects of seasonal changes in temperature, precipitation, and snowpack on extreme flow risk. As an alternative approach, this study applies a physically based hydrologic model to estimate historical and future flood risk at 1/16th degree (latitude/longitude) resolution (about 32 km2). We downscaled climate data derived from 10 global climate models to use as input for the Variable Infiltration Capacity (VIC) model, a macro-scale hydrologic model, which simulates various hydrologic variables at a daily time step. Using the VIC estimates for baseflow and run-off, we calculated Q100 and 7Q10 for the historical period and under two emission scenarios, A1B and B1, at three future time intervals: the 2020s, the 2040s and the 2080s. We also calculated Q100 and 7Q10 at the spatial scale of the 12-digit hydrologic unit codes (HUCs) as delineated by the United States Geological Survey. The results demonstrate the sensitivity of snowpack at mid-elevation basins to a warmer climate, resulting in more severe winter flooding and lower streamflows in the summertime. These ensemble estimates of extreme streamflows will serve as a tool for management practices by providing high-resolution maps of changing risk over the ONF and ONP.
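    For reference, both design statistics can be computed from a daily flow series. The Gumbel method-of-moments fit for Q100 and the empirical 7Q10 below are textbook approaches, shown for illustration; they are not necessarily the exact procedures used in the study.

        import math

        def gumbel_q(annual_maxima, return_period):
            """Method-of-moments Gumbel estimate of the T-year flood."""
            n = len(annual_maxima)
            mean = sum(annual_maxima) / n
            sd = math.sqrt(sum((x - mean) ** 2 for x in annual_maxima) / (n - 1))
            beta = sd * math.sqrt(6) / math.pi
            mu = mean - 0.5772 * beta          # Euler-Mascheroni constant
            return mu - beta * math.log(-math.log(1 - 1 / return_period))

        def seven_q_ten(daily_flows_by_year):
            """Empirical 7Q10: roughly the 10th percentile of the annual
            minimum 7-day mean flows (one list of daily flows per year)."""
            minima = []
            for flows in daily_flows_by_year:
                means7 = [sum(flows[i:i + 7]) / 7 for i in range(len(flows) - 6)]
                minima.append(min(means7))
            minima.sort()
            return minima[max(0, int(0.1 * len(minima)) - 1)]

        # q100 = gumbel_q(annual_peak_flows, 100)   # annual_peak_flows: list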

  19. Improving waterfowl production estimates: Results of a test in the prairie pothole region

    USGS Publications Warehouse

    Arnold, P.M.; Cowardin, L.M.

    1985-01-01

The U.S. Fish and Wildlife Service, in an effort to improve and standardize methods for estimating waterfowl production, tested a new technique in the four-county Arrowwood Wetland Management District (WMD) for three years (1982-1984). On 14 randomly selected 10.36 km2 plots, upland and wetland habitat was mapped, classified, and digitized. Waterfowl breeding pairs were counted twice each year and the proportion of wetland basins containing water was determined. Pair numbers and habitat conditions were entered into a computer model developed by Northern Prairie Wildlife Research Center. That model estimates production on small federally owned wildlife tracts, federal wetland easements, and private land. Results indicate that production estimates were most accurate for mallards (Anas platyrhynchos), the species for which the computer model and database were originally designed. Predictions for the pintail (Anas acuta), gadwall (A. strepera), blue-winged teal (A. discors), and northern shoveler (A. clypeata) were believed to be less accurate. Modeling breeding period dynamics of a waterfowl species and making credible production estimates for a geographic area are possible if the data used in the model are adequate. The process of modeling the breeding period of a species aids in locating areas of insufficient biological knowledge. This process will help direct future research efforts and permit more efficient gathering of field data.

  20. A Comprehensive Software and Database Management System for Glomerular Filtration Rate Estimation by Radionuclide Plasma Sampling and Serum Creatinine Methods.

    PubMed

    Jha, Ashish Kumar

    2015-01-01

Glomerular filtration rate (GFR) estimation by the plasma sampling method is considered the gold standard. However, this method is not widely used because of the complex technique and cumbersome calculations, coupled with the lack of user-friendly software. The routinely used serum creatinine method (SrCrM) of GFR estimation also requires online calculators, which cannot be used without internet access. We have developed user-friendly "GFR estimation software" which gives the options to estimate GFR by the plasma sampling method as well as SrCrM. We used Microsoft Windows® as the operating system, Visual Basic 6.0 as the front end, and Microsoft Access® as the database tool to develop this software. We used Russell's formula for GFR calculation by the plasma sampling method. GFR calculations using serum creatinine were done using the MIRD, Cockcroft-Gault, Schwartz, and Counahan-Barratt methods. The developed software performs the mathematical calculations correctly and is user-friendly. It also enables storage and easy retrieval of the raw data, patient information, and calculated GFR for further processing and comparison. This is user-friendly software to calculate GFR by various plasma sampling and serum creatinine methods, and a good system for storing raw and processed data for future analysis.
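    Two of the serum-creatinine equations named above are simple enough to show directly. The Cockcroft-Gault and bedside Schwartz formulas below are standard published forms (creatinine in mg/dL, weight in kg, height in cm), included as an illustration of the kind of calculation such software performs; they are not taken from the paper itself.

        def cockcroft_gault(age_yr, weight_kg, scr_mg_dl, female):
            """Creatinine clearance (mL/min) by the Cockcroft-Gault formula."""
            crcl = (140 - age_yr) * weight_kg / (72.0 * scr_mg_dl)
            return crcl * 0.85 if female else crcl

        def schwartz_bedside(height_cm, scr_mg_dl, k=0.413):
            """Pediatric GFR (mL/min/1.73 m2) by the bedside Schwartz formula."""
            return k * height_cm / scr_mg_dl

        print(round(cockcroft_gault(60, 72, 1.1, female=False)))  # ~73 mL/min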

  1. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    NASA Astrophysics Data System (ADS)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be available at regional to national extents and cover long time periods. As a result of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain to develop operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat imagery (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating never-imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
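    As a flavor of the workflow, the fragment below uses the Earth Engine Python API to assemble a Landsat collection of the kind a model such as SSEBop would be run over. The collection ID, filters, and band choice are illustrative assumptions, and the energy-balance computation itself is only stubbed out.

        import ee

        ee.Initialize()  # requires an authenticated Earth Engine account

        # Landsat 8 Collection 2 Level-2 imagery for one year over a region
        # of interest (region and cloud threshold are invented for the demo).
        roi = ee.Geometry.Rectangle([-120.0, 36.0, -119.0, 37.0])
        landsat = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
                   .filterDate('2017-01-01', '2018-01-01')
                   .filterBounds(roi)
                   .filter(ee.Filter.lt('CLOUD_COVER', 20)))

        def et_fraction(image):
            # Placeholder for the SSEBop energy-balance step, which combines
            # the thermal band with gridded weather data.
            return image.select('ST_B10').rename('et_fraction')

        annual_et = landsat.map(et_fraction).mean().clip(roi)  # annual composite
        print(landsat.size().getInfo(), 'scenes in the collection')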

  2. Use of IDEF modeling to develop an information management system for drug and alcohol outpatient treatment clinics

    NASA Astrophysics Data System (ADS)

    Hoffman, Kenneth J.

    1995-10-01

Few information systems create a standardized clinical patient record in which there are discrete and concise observations of patient problems and their resolution. Clinical notes are usually narratives that do not support aggregate, systematic outcome analysis. Many programs collect information on diagnoses and coded procedures but are not focused on patient problems. Integrated definition (IDEF) methodology has been accepted by the Department of Defense as part of the Corporate Information Management Initiative and serves as the foundation that establishes a need for automation. We used IDEF modeling to describe present and idealized patient care activities. A logical IDEF data model was created to support those activities. The modeling process allows for accurate cost estimates based upon performed activities, efficient collection of relevant information, and outputs that allow real-time assessments of process and outcomes. This model forms the foundation for a prototype automated clinical information system (ACIS).

  3. The evolution of a health hazard assessment database management system for military weapons, equipment, and materiel.

    PubMed

    Murnyak, George R; Spencer, Clark O; Chaney, Ann E; Roberts, Welford C

    2002-04-01

    During the 1970s, the Army health hazard assessment (HHA) process developed as a medical program to minimize hazards in military materiel during the development process. The HHA Program characterizes health hazards that soldiers and civilians may encounter as they interact with military weapons and equipment. Thus, it is a resource for medical planners and advisors to use that can identify and estimate potential hazards that soldiers may encounter as they train and conduct missions. The U.S. Army Center for Health Promotion and Preventive Medicine administers the program, which is integrated with the Army's Manpower and Personnel Integration program. As the HHA Program has matured, an electronic database has been developed to record and monitor the health hazards associated with military equipment and systems. The current database tracks the results of HHAs and provides reporting designed to assist the HHA Program manager in daily activities.

  4. Measurement of historical cliff-top changes and estimation of future trends using GIS data between Bridlington and Hornsea - Holderness Coast (UK)

    NASA Astrophysics Data System (ADS)

    Castedo, Ricardo; de la Vega-Panizo, Rogelio; Fernández-Hernández, Marta; Paredes, Carlos

    2015-02-01

A key requirement for effective coastal zone management is good knowledge of historical rates of change and the ability to predict future shoreline evolution, especially for rapidly eroding areas. Historical shoreline recession analysis was used for the prediction of future cliff shoreline positions along a 9 km section between Bridlington and Hornsea, on the northern part of the Holderness Coast, UK. The analysis was based on historical maps and aerial photographs dating from 1852 to 2011, using the Digital Shoreline Analysis System (DSAS) 4.3, an extension of ESRI's ArcInfo 10.x. The prediction of future shorelines was performed for the next 40 years using a variety of techniques, ranging from extrapolation of historical data and geometric approaches such as historical trend analysis to a process-response numerical model that incorporates physically based equations and geotechnical stability analysis. With climate change and sea-level rise implying that historical rates of change may not be a reliable guide for the future, enhanced visualization of the evolving coastline has the potential to improve awareness of these changing conditions. Following the IPCC (2013) report, two sea-level rise rates, 2 mm/yr and 6 mm/yr, were used to estimate future shoreline conditions. This study illustrates that good predictive models, once their limitations are estimated or at least defined, are available for use by managers, planners, engineers, scientists and the public to make better decisions regarding coastal management, development, and erosion-control strategies.
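    The simplest of the forecasting techniques mentioned, extrapolating a historical linear-regression rate, fits in a few lines. DSAS computes such rates from transect intersections with dated shorelines; the sketch below (with invented survey data) shows only the principle.

        def linear_regression_rate(years, positions):
            """Least-squares recession rate (m/yr) from dated shoreline
            positions along one transect (negative = landward retreat)."""
            n = len(years)
            my, mp = sum(years) / n, sum(positions) / n
            num = sum((y - my) * (p - mp) for y, p in zip(years, positions))
            den = sum((y - my) ** 2 for y in years)
            return num / den

        # Invented transect data: cliff-top position relative to a baseline (m).
        years = [1852, 1910, 1952, 1989, 2011]
        pos = [0.0, -70.0, -122.0, -168.0, -195.0]
        rate = linear_regression_rate(years, pos)
        print(f"rate {rate:.2f} m/yr; 2051 forecast {pos[-1] + rate * 40:.0f} m")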

  5. Greenhouse gas emissions from alternative futures of deforestation and agricultural management in the southern Amazon.

    PubMed

    Galford, Gillian L; Melillo, Jerry M; Kicklighter, David W; Cronin, Timothy W; Cerri, Carlos E P; Mustard, John F; Cerri, Carlos C

    2010-11-16

The Brazilian Amazon is one of the most rapidly developing agricultural areas in the world and represents a potentially large future source of greenhouse gases from land clearing and subsequent agricultural management. In an integrated approach, we estimate the greenhouse gas dynamics of natural ecosystems and agricultural ecosystems after clearing in the context of a future climate. We examine scenarios of deforestation and postclearing land use to estimate the future (2006-2050) impacts on carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) emissions from the agricultural frontier state of Mato Grosso, using a process-based biogeochemistry model, the Terrestrial Ecosystems Model (TEM). We estimate a net emission of greenhouse gases from Mato Grosso, ranging from 2.8 to 15.9 Pg CO2-equivalents (CO2-e) from 2006 to 2050. Deforestation is the largest source of greenhouse gas emissions over this period, but land uses following clearing account for a substantial portion (24-49%) of the net greenhouse gas budget. Due to land-cover and land-use change, there is a small foregone carbon sequestration of 0.2-0.4 Pg CO2-e by natural forests and cerrado between 2006 and 2050. Both deforestation and future land-use management play important roles in the net greenhouse gas emissions of this frontier, suggesting that both should be considered in emissions policies. We find that avoided deforestation remains the best strategy for minimizing future greenhouse gas emissions from Mato Grosso.

  6. Integrating forest growth and harvesting cost models to improve forest management planning

    Treesearch

    J.E. Baumgras; C.B. LeDoux

    1991-01-01

Two methods of estimating harvesting revenue--reported stumpage prices, and delivered prices minus estimated harvesting and haul costs--were compared by estimating entry cash flows and rotation net present value for three simulated even-aged forest management options that included 1 to 3 thinnings over a 90-year rotation. Revenue estimates derived from stumpage prices...
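    The comparison hinges on discounting: the net present value of a rotation is the sum of discounted cash flows from thinnings and the final harvest. A minimal sketch with hypothetical cash flows, not the study's data:

        def npv(cash_flows, rate):
            """Net present value of (year, amount) cash flows at a discount rate."""
            return sum(amount / (1 + rate) ** year for year, amount in cash_flows)

        # Hypothetical even-aged rotation: planting cost, three thinnings,
        # and a final harvest at year 90 (amounts in $/ha).
        rotation = [(0, -500), (30, 800), (50, 1500), (70, 2200), (90, 9000)]
        print(round(npv(rotation, 0.04)))   # rotation NPV at a 4% rate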

  7. How to deal with the Poisson-gamma model to forecast patients' recruitment in clinical trials when there are pauses in recruitment dynamic?

    PubMed

    Minois, Nathan; Savy, Stéphanie; Lauwers-Cances, Valérie; Andrieu, Sandrine; Savy, Nicolas

    2017-03-01

Recruiting patients is a crucial step of a clinical trial. Estimation of the trial duration is a question of paramount interest. Most techniques are based on deterministic models and various ad hoc methods that neglect the variability in the recruitment process. To overcome this difficulty, the so-called Poisson-gamma model has been introduced, involving, for each centre, a recruitment process modelled by a Poisson process whose rate is assumed constant in time and gamma-distributed. The relevance of this model has been widely investigated. In practice, rates are rarely constant in time; there are breaks in recruitment (for instance, week-ends or holidays). Such information can be collected and included in a model considering piecewise-constant rate functions, yielding an inhomogeneous Cox model, for which estimation of the trial duration is much more difficult. Three strategies for computing the expected trial duration are proposed: considering all the breaks, considering only large breaks, and not considering breaks at all. The bias of these estimation procedures is assessed by means of simulation studies considering three scenarios of break simulation. These strategies yield estimates with a very small bias. Moreover, the strategy with the best performance in terms of prediction and the smallest bias is the one that does not take breaks into account. This result is important because, in practice, collecting data on breaks is quite hard to manage.
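    In the Poisson-gamma model, each centre's recruitment rate is drawn from a gamma distribution and patients then arrive as a Poisson process, so the time to reach the target sample size can be simulated directly. The sketch below ignores breaks (the strategy the authors found least biased) and uses invented parameters.

        import random

        def simulate_duration(n_centres, n_target, shape, rate, n_sims=10000):
            """Expected time to recruit n_target patients under the
            Poisson-gamma model (constant rates, no recruitment breaks).

            Each centre's rate lambda_i ~ Gamma(shape, rate).  Given the
            rates, arrivals pool into a Poisson process of rate sum(lambda_i),
            so the n-th arrival time is Gamma(n_target, scale=1/sum(lambda_i)).
            """
            total = 0.0
            for _ in range(n_sims):
                lam = sum(random.gammavariate(shape, 1.0 / rate)
                          for _ in range(n_centres))
                total += random.gammavariate(n_target, 1.0 / lam)
            return total / n_sims

        # Invented example: 40 centres, 600 patients, mean rate 0.5/month.
        print(round(simulate_duration(40, 600, shape=2.0, rate=4.0), 1), "months")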

  8. Physician practice participation in accountable care organizations: the emergence of the unicorn.

    PubMed

    Shortell, Stephen M; McClellan, Sean R; Ramsay, Patricia P; Casalino, Lawrence P; Ryan, Andrew M; Copeland, Kennon R

    2014-10-01

    To provide the first nationally based information on physician practice involvement in ACOs. Primary data from the third National Survey of Physician Organizations (January 2012-May 2013). We conducted a 40-minute phone survey in a sample of physician practices. A nationally representative sample of practices was surveyed in order to provide estimates of organizational characteristics, care management processes, ACO participation, and related variables for four major chronic illnesses. We evaluated the associations between ACO participation, organizational characteristics, and a 25-point index of patient-centered medical home processes. We found that 23.7 percent of physician practices (n = 280) reported joining an ACO; 15.7 percent (n = 186) were planning to become involved within the next 12 months and 60.6 percent (n = 717) reported no involvement and no plans to become involved. Larger practices, those receiving patients from an IPA and/or PHO, those that were physician-owned versus hospital/health system-owned, those located in New England, and those with greater patient-centered medical home (PCMH) care management processes were more likely to have joined an ACO. Physician practices that are currently participating in ACOs appear to be relatively large, or to be members of an IPA or PHO, are less likely to be hospital-owned and are more likely to use more care management processes than nonparticipating practices. © Health Research and Educational Trust.

  9. Reconstructing European forest management from 1600 to 2010

    NASA Astrophysics Data System (ADS)

    McGrath, M. J.; Luyssaert, S.; Meyfroidt, P.; Kaplan, J. O.; Bürgi, M.; Chen, Y.; Erb, K.; Gimmi, U.; McInerney, D.; Naudts, K.; Otto, J.; Pasztor, F.; Ryder, J.; Schelhaas, M.-J.; Valade, A.

    2015-07-01

    Because of the slow accumulation and long residence time of carbon in biomass and soils, the present state and future dynamics of temperate forests are influenced by management that took place centuries to millennia ago. Humans have exploited the forests of Europe for fuel, construction materials and fodder for the entire Holocene. In recent centuries, economic and demographic trends led to increases in both forest area and management intensity across much of Europe. In order to quantify the effects of these changes in forests and to provide a baseline for studies on future land-cover-climate interactions and biogeochemical cycling, we created a temporally and spatially resolved reconstruction of European forest management from 1600 to 2010. For the period 1600-1828, we took a supply-demand approach, in which supply was estimated on the basis of historical annual wood increment and land cover reconstructions. We made demand estimates by multiplying population with consumption factors for construction materials, household fuelwood, industrial food processing and brewing, metallurgy, and salt production. For the period 1829-2010, we used a supply-driven backcasting method based on national and regional statistics of forest age structure from the second half of the 20th century. Our reconstruction reproduces the most important changes in forest management between 1600 and 2010: (1) an increase of 593 000 km2 in conifers at the expense of deciduous forest (decreasing by 538 000 km2); (2) a 612 000 km2 decrease in unmanaged forest; (3) a 152 000 km2 decrease in coppice management; (4) a 818 000 km2 increase in high-stand management; and (5) the rise and fall of litter raking, which at its peak in 1853 resulted in the removal of 50 Tg dry litter per year.

  10. Timber production assessment of a plantation forest: An integrated framework with field-based inventory, multi-source remote sensing data and forest management history

    NASA Astrophysics Data System (ADS)

    Gao, Tian; Zhu, Jiaojun; Deng, Songqiu; Zheng, Xiao; Zhang, Jinxin; Shang, Guiduo; Huang, Liyan

    2016-10-01

Timber production is the purpose of managing plantation forests, and its spatial and quantitative information is critical for advising management strategies. Previous studies have focused on growing stock volume (GSV), which represents the current potential for timber production, yet few have investigated the timber harvested over the historical management process. This has left a gap in synthetic ecosystem-service assessments of timber production. In this paper, we established a Management Process-based Timber production (MPT) framework to integrate the current GSV and the harvested timber derived from historical logging regimes, in order to assess timber production synthetically over a historical period. In the MPT framework, age-class and current GSV determine the times of historical thinning and the corresponding harvested timber, by using a "space-for-time" substitution. The total timber production can be estimated from the historical harvested timber in each thinning and the current GSV. To test this MPT framework, an empirical study of a larch plantation (LP) with an area of 43,946 ha was conducted in North China for the period from 1962 to 2010. Field-based inventory data were integrated with ALOS PALSAR (Advanced Land-Observing Satellite Phased Array L-band Synthetic Aperture Radar) and Landsat-8 OLI (Operational Land Imager) data for estimating the age-class and current GSV of the LP. The random forest model with PALSAR backscatter intensity channels and OLI bands as input predictive variables yielded an accuracy of 67.9% with a Kappa coefficient of 0.59 for age-class classification. The regression model using PALSAR data produced a root mean square error (RMSE) of 36.5 m3 ha-1. The total timber production of the LP was estimated to be 7.27 × 10^6 m3, with 4.87 × 10^6 m3 in current GSV and 2.40 × 10^6 m3 in timber harvested through historical thinning. The historically harvested timber accounts for 33.0% of the total timber production, a component that has been neglected in assessments of the current status of plantation forests. Considering both the RMSE of the predicted GSV and the misclassification of age-class, the error in timber production was estimated to range from -55.2 to 56.3 m3 ha-1. The MPT framework can be used to assess timber production of other tree species at larger spatial scales, providing crucial information for a better understanding of forest ecosystem services.

  11. Health Systems Readiness to Manage the Hypertension Epidemic in Primary Health Care Facilities in the Western Cape, South Africa: A Study Protocol.

    PubMed

    Deuboué Tchialeu, Rodrigue Innocent; Yaya, Sanni; Labonté, Ronald

    2016-02-29

Developing countries are undergoing a process of epidemiological transition from infectious to noncommunicable diseases, described by the United Nations Secretary General Ban Ki-Moon as "a public health emergency in slow motion." One of the most prevalent in sub-Saharan Africa is hypertension, a complex chronic condition often referred to as a "silent killer" and a key contributor to the development of cardiovascular and cerebrovascular diseases. The number of hypertensive patients in this setting is estimated to increase from 74.7 million in 2008 to 125.5 million in 2025, a 68% increase. However, there is an important gap between emerging high-level policies and recommendations, and the near-absence of practical guidance and experience delivering long-term medical care for noncommunicable diseases within resource-limited health systems. To address this gap, our study will consist of field investigations to determine the minimum health system requirements to ensure successful delivery of antihypertensive medications when scaling up interventions to control the hypertension epidemic. A cross-sectional analytic study will be conducted in the Western Cape using a mixed-method approach with two semistructured interview guides. The first will be for health professionals involved in the care of hypertensive patients within at least 6 community health centers (3 urban and 3 rural), to understand the challenges associated with their care. The second will be used to map and assess the current supply chain management system for antihypertensive medications by interviewing key informants at different levels of the processes. Finally, modeling and simulation tools will be used to understand how to estimate the minimum number of health workers required at each supply chain interval to ensure successful delivery of medications when scaling up interventions. Funding for the study was secured through a Doctoral Research Award in October 2014 from the International Development Research Centre (IDRC). The study is currently in the data analysis phase and results are expected during the first half of 2016. This investigation will highlight the detailed processes in place for the care of hypertensive patients in primary health care facilities, and thus also identify the challenges. It will also describe the drug supply chain management systems in place and identify their strengths and weaknesses. The findings, along with the estimates from modeling and simulation, will inform the health system minimum requirements to scale up interventions to manage and control the hypertension epidemic in the Western Cape province of South Africa.

  12. Recycling of glass: accounting of greenhouse gases and global warming contributions.

    PubMed

    Larsen, Anna W; Merrild, Hanna; Christensen, Thomas H

    2009-11-01

Greenhouse gas (GHG) emissions related to recycling of glass waste were assessed from a waste management perspective. Focus was on the material recovery facility (MRF) where the initial sorting of glass waste takes place. The MRF delivers products like cullet and whole bottles to other industries. Two possible uses of reprocessed glass waste were considered: (i) remelting of cullet added to glass production; and (ii) re-use of whole bottles. The GHG emission accounting included indirect upstream emissions (provision of energy, fuels and auxiliaries), direct activities at the MRF and bottle-wash facility (combustion of fuels) as well as indirect downstream activities in terms of using the recovered glass waste in other industries and, thereby, avoiding emissions from conventional production. The GHG accounting was presented as aggregated global warming factors (GWFs) for the direct and indirect upstream and downstream processes, respectively. The range of GWFs was estimated to 0-70 kg CO2-eq. tonne^-1 of glass waste for the upstream activities and the direct emissions from the waste management system. The GWF for the downstream effect showed some significant variation between the two cases. It was estimated to approximately -500 kg CO2-eq. tonne^-1 of glass waste for the remelting technology and -1500 to -600 kg CO2-eq. tonne^-1 of glass waste for bottle re-use. Including the downstream process, large savings of GHG emissions can be attributed to the waste management system. The results showed that, in GHG emission accounting, attention should be drawn to thorough analysis of energy sources, especially electricity, and the downstream savings caused by material substitution.

  13. Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.

    PubMed

    Yago, Martín; Alcover, Silvia

    2016-07-01

According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of the probability of rejecting an analytical run that contains critical-size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition, Max E(NUF), has been proposed as an alternative QC performance measure because it relates more directly to the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) to the capability of the analytical process, which allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected for their high PEDC value are also characterized by a low value of Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
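    For a simple Shewhart-type rule, the probability of rejecting a run carrying a systematic error, the kind of quantity PEDC summarizes, has a closed form. The sketch below evaluates a 1_3s rule (reject if any of N controls falls outside ±3 SD) under a systematic shift; this is a standard textbook calculation, not the authors' model.

        import math

        def phi(x):
            """Standard normal CDF."""
            return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

        def p_reject_13s(shift_sd, n_controls):
            """Probability that a 1_3s rule rejects a run with a systematic
            error of `shift_sd` (in SD units) using n_controls measurements."""
            p_in = phi(3.0 - shift_sd) - phi(-3.0 - shift_sd)  # one control in limits
            return 1.0 - p_in ** n_controls

        for se in (0.0, 2.0, 3.0, 4.0):
            print(f"shift {se} SD -> P(reject) {p_reject_13s(se, 2):.3f}")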

  14. Regional variability of nitrate fluxes in the unsaturated zone and groundwater, Wisconsin, USA

    USGS Publications Warehouse

    Green, Christopher T.; Liao, Lixia; Nolan, Bernard T.; Juckem, Paul F.; Shope, Christopher L.; Tesoriero, Anthony J.; Jurgens, Bryant

    2018-01-01

Process-based modeling of regional NO3− fluxes to groundwater is critical for understanding and managing water quality, but the complexity of NO3− reactive transport processes makes implementation a challenge. This study introduces a regional vertical flux method (VFM) for efficient estimation of reactive transport of NO3− in the vadose zone and groundwater. The regional VFM was applied to 443 well samples in central-eastern Wisconsin. Chemical measurements included O2, NO3−, N2 from denitrification, and atmospheric tracers of groundwater age including carbon-14, chlorofluorocarbons, tritium, and tritiogenic helium. VFM results were consistent with observed chemistry, and calibrated parameters were in line with estimates from previous studies. Results indicated that (1) unsaturated zone travel times were a substantial portion of the transit time to wells and streams, (2) since 1945 fractions of applied N leached to groundwater have increased for manure-N, possibly due to increased injection of liquid manure, and decreased for fertilizer-N, and (3) under current practices and conditions, approximately 60% of the shallow aquifer will eventually be affected by downward migration of NO3−, with denitrification protecting the remaining 40%. Recharge variability strongly affected the unsaturated zone lag times and the eventual depth of the NO3− front. Principal components regression demonstrated that VFM parameters and predictions were significantly correlated with hydrogeochemical landscape features. The diverse and sometimes conflicting aspects of N management (e.g., limiting N volatilization versus limiting N losses to groundwater) warrant continued development of large-scale holistic strategies to manage water quality and quantity.

  15. The role of reservoir storage in large-scale surface water availability analysis for Europe

    NASA Astrophysics Data System (ADS)

    Garrote, L. M.; Granados, A.; Martin-Carrasco, F.; Iglesias, A.

    2017-12-01

    A regional assessment of current and future water availability in Europe is presented in this study. The assessment was made using the Water Availability and Adaptation Policy Analysis (WAAPA) model. The model was built on the river network derived from the Hydro1K digital elevation maps, including all major river basins of Europe. Reservoir storage volume was taken from the World Register of Dams of ICOLD, including all dams with storage capacity over 5 hm3. Potential Water Availability is defined as the maximum amount of water that could be supplied at a certain point of the river network to satisfy a regular demand under pre-specified reliability requirements. Water availability is the combined result of hydrological processes, which determine streamflow in natural conditions, and human intervention, which determines the available hydraulic infrastructure to manage water and establishes water supply conditions through operating rules. The WAAPA algorithm estimates the maximum demand that can be supplied at every node of the river network accounting for the regulation capacity of reservoirs under different management scenarios. The model was run for a set of hydrologic scenarios taken from the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP), where the PCRGLOBWB hydrological model was forced with results from five global climate models. Model results allow the estimation of potential water stress by comparing water availability to projections of water abstractions along the river network under different management alternatives. The set of sensitivity analyses performed showed the effect of policy alternatives on water availability and highlighted the large uncertainties linked to hydrological and anthropological processes.
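    The notion of potential water availability, the largest constant demand a reservoir can supply reliably, can be illustrated with a storage simulation plus a bisection search on demand. This is a generic sketch of the concept with an invented inflow record, not the WAAPA algorithm itself.

        def supplies_demand(inflows, capacity, demand):
            """Simulate monthly storage; True if demand is always fully met."""
            storage = capacity                                 # start full
            for q in inflows:
                storage = min(storage + q - demand, capacity)  # spill at capacity
                if storage < 0.0:
                    return False                               # shortfall
            return True

        def max_reliable_demand(inflows, capacity, tol=1e-3):
            """Bisection search for the largest constant demand met with no
            shortfall over the inflow record (a 100%-reliability yield)."""
            lo, hi = 0.0, max(inflows)
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if supplies_demand(inflows, capacity, mid):
                    lo = mid
                else:
                    hi = mid
            return lo

        flows = [8, 2, 1, 0, 5, 12, 9, 3, 1, 0, 4, 15] * 10  # invented hm3/month
        print(round(max_reliable_demand(flows, capacity=20.0), 2))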

  16. Regional Variability of Nitrate Fluxes in the Unsaturated Zone and Groundwater, Wisconsin, USA

    NASA Astrophysics Data System (ADS)

    Green, Christopher T.; Liao, Lixia; Nolan, Bernard T.; Juckem, Paul F.; Shope, Christopher L.; Tesoriero, Anthony J.; Jurgens, Bryant C.

    2018-01-01

    Process-based modeling of regional NO3- fluxes to groundwater is critical for understanding and managing water quality, but the complexity of NO3- reactive transport processes makes implementation a challenge. This study introduces a regional vertical flux method (VFM) for efficient estimation of reactive transport of NO3- in the vadose zone and groundwater. The regional VFM was applied to 443 well samples in central-eastern Wisconsin. Chemical measurements included O2, NO3-, N2 from denitrification, and atmospheric tracers of groundwater age including carbon-14, chlorofluorocarbons, tritium, and tritiogenic helium. VFM results were consistent with observed chemistry, and calibrated parameters were in-line with estimates from previous studies. Results indicated that (1) unsaturated zone travel times were a substantial portion of the transit time to wells and streams, (2) since 1945 fractions of applied N leached to groundwater have increased for manure-N, possibly due to increased injection of liquid manure, and decreased for fertilizer-N, and (3) under current practices and conditions, approximately 60% of the shallow aquifer will eventually be affected by downward migration of NO3-, with denitrification protecting the remaining 40%. Recharge variability strongly affected the unsaturated zone lag times and the eventual depth of the NO3- front. Principal components regression demonstrated that VFM parameters and predictions were significantly correlated with hydrogeochemical landscape features. The diverse and sometimes conflicting aspects of N management (e.g., limiting N volatilization versus limiting N losses to groundwater) warrant continued development of large-scale holistic strategies to manage water quality and quantity.

  17. Estimating Limit Reference Points for Western Pacific Leatherback Turtles (Dermochelys coriacea) in the U.S. West Coast EEZ

    PubMed Central

    Curtis, K. Alexandra; Moore, Jeffrey E.; Benson, Scott R.

    2015-01-01

    Biological limit reference points (LRPs) for fisheries catch represent upper bounds that avoid undesirable population states. LRPs can support consistent management evaluation among species and regions, and can advance ecosystem-based fisheries management. For transboundary species, LRPs prorated by local abundance can inform local management decisions when international coordination is lacking. We estimated LRPs for western Pacific leatherbacks in the U.S. West Coast Exclusive Economic Zone (WCEEZ) using three approaches with different types of information on local abundance. For the current application, the best-informed LRP used a local abundance estimate derived from nest counts, vital rate information, satellite tag data, and fishery observer data, and was calculated with a Potential Biological Removal estimator. Management strategy evaluation was used to set tuning parameters of the LRP estimators to satisfy risk tolerances for falling below population thresholds, and to evaluate sensitivity of population outcomes to bias in key inputs. We estimated local LRPs consistent with three hypothetical management objectives: allowing the population to rebuild to its maximum net productivity level (4.7 turtles per five years), limiting delay of population rebuilding (0.8 turtles per five years), or only preventing further decline (7.7 turtles per five years). These LRPs pertain to all human-caused removals and represent the WCEEZ contribution to meeting population management objectives within a broader international cooperative framework. We present multi-year estimates, because at low LRP values, annual assessments are prone to substantial error that can lead to volatile and costly management without providing further conservation benefit. The novel approach and the performance criteria used here are not a direct expression of the “jeopardy” standard of the U.S. Endangered Species Act, but they provide useful assessment information and could help guide international management frameworks. Given the range of abundance data scenarios addressed, LRPs should be estimable for many other areas, populations, and taxa. PMID:26368557
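
    The Potential Biological Removal (PBR) estimator named above has a standard closed form (Wade 1998): PBR = N_min · ½R_max · F_r, where N_min is a 20th-percentile abundance estimate, R_max the maximum net productivity rate, and F_r a recovery factor tuned to the management objective. A minimal sketch with purely illustrative inputs, not the study's values:

```python
import math

def n_min(n_hat, cv, z=0.842):
    """20th-percentile abundance: Nmin = Nhat / exp(z * sqrt(ln(1 + CV^2)))."""
    return n_hat / math.exp(z * math.sqrt(math.log(1.0 + cv**2)))

def pbr(n_hat, cv, r_max, f_r):
    """Potential Biological Removal: PBR = Nmin * 0.5 * Rmax * Fr."""
    return n_min(n_hat, cv) * 0.5 * r_max * f_r

# Illustrative values only: local abundance 200 (CV 0.6), Rmax 4%, Fr 0.1
print(pbr(n_hat=200, cv=0.6, r_max=0.04, f_r=0.1))  # ~0.25 removals/yr
```

    The management strategy evaluation described in the abstract is, in effect, the procedure for tuning F_r (and related parameters) so that the stated risk tolerances are satisfied.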

  18. Estimating Limit Reference Points for Western Pacific Leatherback Turtles (Dermochelys coriacea) in the U.S. West Coast EEZ.

    PubMed

    Curtis, K Alexandra; Moore, Jeffrey E; Benson, Scott R

    2015-01-01

    Biological limit reference points (LRPs) for fisheries catch represent upper bounds that avoid undesirable population states. LRPs can support consistent management evaluation among species and regions, and can advance ecosystem-based fisheries management. For transboundary species, LRPs prorated by local abundance can inform local management decisions when international coordination is lacking. We estimated LRPs for western Pacific leatherbacks in the U.S. West Coast Exclusive Economic Zone (WCEEZ) using three approaches with different types of information on local abundance. For the current application, the best-informed LRP used a local abundance estimate derived from nest counts, vital rate information, satellite tag data, and fishery observer data, and was calculated with a Potential Biological Removal estimator. Management strategy evaluation was used to set tuning parameters of the LRP estimators to satisfy risk tolerances for falling below population thresholds, and to evaluate sensitivity of population outcomes to bias in key inputs. We estimated local LRPs consistent with three hypothetical management objectives: allowing the population to rebuild to its maximum net productivity level (4.7 turtles per five years), limiting delay of population rebuilding (0.8 turtles per five years), or only preventing further decline (7.7 turtles per five years). These LRPs pertain to all human-caused removals and represent the WCEEZ contribution to meeting population management objectives within a broader international cooperative framework. We present multi-year estimates, because at low LRP values, annual assessments are prone to substantial error that can lead to volatile and costly management without providing further conservation benefit. The novel approach and the performance criteria used here are not a direct expression of the "jeopardy" standard of the U.S. Endangered Species Act, but they provide useful assessment information and could help guide international management frameworks. Given the range of abundance data scenarios addressed, LRPs should be estimable for many other areas, populations, and taxa.

  19. Integrating resource selection into spatial capture-recapture models for large carnivores

    USGS Publications Warehouse

    Proffitt, Kelly M.; Goldberg, Joshua; Hebblewhite, Mark; Russell, Robin E.; Jimenez, Ben; Robinson, Hugh S.; Pilgrim, Kristine; Schwartz, Michael K.

    2015-01-01

    Wildlife managers need reliable methods to estimate large carnivore densities and population trends; yet large carnivores are elusive, difficult to detect, and occur at low densities, making traditional approaches intractable. Recent advances in spatial capture-recapture (SCR) models have provided new approaches for monitoring trends in wildlife abundance, and these methods are particularly applicable to large carnivores. We applied SCR models in a Bayesian framework to estimate mountain lion densities in the Bitterroot Mountains of west-central Montana. We incorporated an existing resource selection function (RSF) as a density covariate to account for heterogeneity in habitat use across the study area and included data collected from harvested lions. We identified individuals through DNA samples collected by (1) biopsy darting mountain lions detected in systematic surveys of the study area, (2) opportunistically collecting hair and scat samples, and (3) sampling all harvested mountain lions. We included 80 DNA samples collected from 62 individuals in the analysis. Including information on predicted habitat use as a covariate on the distribution of activity centers reduced the median estimated density by 44%, the standard deviation by 7%, and the width of 95% credible intervals by 10% as compared to standard SCR models. Within the two management units of interest, we estimated a median mountain lion density of 4.5 mountain lions/100 km² (95% CI = 2.9, 7.7) and 5.2 mountain lions/100 km² (95% CI = 3.4, 9.1). Including harvested individuals (dead recovery) did not create a significant bias in the detection process by introducing individuals that could not be detected after removal. However, the dead recovery component of the model did have a substantial effect on results by increasing sample size. The ability to account for heterogeneity in habitat use provides a useful extension to SCR models, and will enhance the ability of wildlife managers to reliably and economically estimate density of wildlife populations, particularly large carnivores.
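
    Two pieces of the model described above lend themselves to a compact sketch: the standard SCR encounter model, in which detection probability decays with distance from an animal's activity center, and the RSF-based density covariate, which weights where activity centers are likely to fall. The half-normal form and log-linear density weighting below are conventional SCR choices shown with made-up numbers; they are not necessarily the authors' exact specification.

```python
import numpy as np

def detection_prob(p0, sigma, trap_xy, center_xy):
    """Half-normal SCR encounter model: p = p0 * exp(-d^2 / (2 sigma^2)),
    where d is the trap-to-activity-center distance."""
    d2 = np.sum((np.asarray(trap_xy) - np.asarray(center_xy)) ** 2)
    return p0 * np.exp(-d2 / (2.0 * sigma**2))

def density_weights(rsf_values, beta):
    """RSF density covariate: Pr(activity center in pixel s) is proportional
    to exp(beta * RSF(s)), normalized over the state space."""
    w = np.exp(beta * np.asarray(rsf_values))
    return w / w.sum()

print(detection_prob(p0=0.1, sigma=2.0, trap_xy=(0, 0), center_xy=(1.5, 1.5)))
print(density_weights([0.2, 0.8, 0.5], beta=1.2))
```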

  20. Free-Roaming Dog Population Estimation and Status of the Dog Population Management and Rabies Control Program in Dhaka City, Bangladesh

    PubMed Central

    Tenzin, Tenzin; Ahmed, Rubaiya; Debnath, Nitish C.; Ahmed, Garba; Yamage, Mat

    2015-01-01

    Beginning January 2012, a humane method of dog population management using a Catch-Neuter-Vaccinate-Release (CNVR) program was implemented in Dhaka City, Bangladesh as part of the national rabies control program. To enable this program, the size and distribution of the free-roaming dog population needed to be estimated. We present the results of a dog population survey and a pilot assessment of the CNVR program coverage in Dhaka City. Free-roaming dog population surveys were undertaken in 18 wards of Dhaka City on consecutive days using mark-resight methods. Data were analyzed using Lincoln-Petersen index-Chapman correction methods. The CNVR program was assessed over two years (2012–2013), whilst the coverage of the CNVR program was assessed by estimating the proportion of dogs that were ear-notched (processed dogs) via dog population surveys. The free-roaming dog population was estimated to be 1,242 (95% CI: 1205–1278) in the 18 sampled wards and 18,585 dogs in Dhaka City (52 dogs/km²), with an estimated human-to-free-roaming dog ratio of 828:1. During the two-year CNVR program, a total of 6,665 dogs (3,357 male and 3,308 female) were neutered and vaccinated against rabies in 29 of the 92 city wards. A pilot population survey indicated a mean CNVR coverage of 60.6% (range 19.2–79.3%), with only eight wards achieving > 70% coverage. Given that the coverage in many neighborhoods was below the WHO-recommended threshold level of 70% for rabies eradication, and since the CNVR program takes considerable time to implement throughout the entire Dhaka City area, a mass dog vaccination program in the non-CNVR coverage area is recommended to create herd immunity. The findings from this study are expected to guide dog population management and the rabies control program in Dhaka City and elsewhere in Bangladesh. PMID:25978406
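
    The Lincoln-Petersen estimator with Chapman correction used above has a simple closed form: N̂ = (M+1)(C+1)/(R+1) − 1, where M animals are marked, C are counted on the resight occasion, and R are marked animals resighted. A minimal sketch with illustrative counts, not the survey's ward data:

```python
def chapman_estimate(marked, counted, resighted):
    """Lincoln-Petersen with Chapman correction:
    N = (M+1)(C+1)/(R+1) - 1, with its usual variance estimator."""
    m, c, r = marked, counted, resighted
    n_hat = (m + 1) * (c + 1) / (r + 1) - 1
    var = ((m + 1) * (c + 1) * (m - r) * (c - r)) / ((r + 1) ** 2 * (r + 2))
    return n_hat, var

# Illustrative ward-level counts: 60 marked, 55 counted, 40 resighted
n_hat, var = chapman_estimate(marked=60, counted=55, resighted=40)
print(round(n_hat), round(1.96 * var**0.5))  # estimate and ~95% half-width
```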

  1. Free-roaming dog population estimation and status of the dog population management and rabies control program in Dhaka City, Bangladesh.

    PubMed

    Tenzin, Tenzin; Ahmed, Rubaiya; Debnath, Nitish C; Ahmed, Garba; Yamage, Mat

    2015-05-01

    Beginning January 2012, a humane method of dog population management using a Catch-Neuter-Vaccinate-Release (CNVR) program was implemented in Dhaka City, Bangladesh as part of the national rabies control program. To enable this program, the size and distribution of the free-roaming dog population needed to be estimated. We present the results of a dog population survey and a pilot assessment of the CNVR program coverage in Dhaka City. Free-roaming dog population surveys were undertaken in 18 wards of Dhaka City on consecutive days using mark-resight methods. Data were analyzed using Lincoln-Petersen index-Chapman correction methods. The CNVR program was assessed over two years (2012-2013), whilst the coverage of the CNVR program was assessed by estimating the proportion of dogs that were ear-notched (processed dogs) via dog population surveys. The free-roaming dog population was estimated to be 1,242 (95% CI: 1205-1278) in the 18 sampled wards and 18,585 dogs in Dhaka City (52 dogs/km²), with an estimated human-to-free-roaming dog ratio of 828:1. During the two-year CNVR program, a total of 6,665 dogs (3,357 male and 3,308 female) were neutered and vaccinated against rabies in 29 of the 92 city wards. A pilot population survey indicated a mean CNVR coverage of 60.6% (range 19.2-79.3%), with only eight wards achieving > 70% coverage. Given that the coverage in many neighborhoods was below the WHO-recommended threshold level of 70% for rabies eradication, and since the CNVR program takes considerable time to implement throughout the entire Dhaka City area, a mass dog vaccination program in the non-CNVR coverage area is recommended to create herd immunity. The findings from this study are expected to guide dog population management and the rabies control program in Dhaka City and elsewhere in Bangladesh.

  2. Inferring invasive species abundance using removal data from management actions

    USGS Publications Warehouse

    Davis, Amy J.; Hooten, Mevin B.; Miller, Ryan S.; Farnsworth, Matthew L.; Lewis, Jesse S.; Moxcey, Michael; Pepin, Kim M.

    2016-01-01

    Evaluation of the progress of management programs for invasive species is crucial for demonstrating impacts to stakeholders and strategic planning of resource allocation. Estimates of abundance before and after management activities can serve as a useful metric of population management programs. However, many methods of estimating population size are too labor intensive and costly to implement, posing restrictive levels of burden on operational programs. Removal models are a reliable method for estimating abundance before and after management using data from the removal activities exclusively, thus requiring no work in addition to management. We developed a Bayesian hierarchical model to estimate abundance from removal data accounting for varying levels of effort, and used simulations to assess the conditions under which reliable population estimates are obtained. We applied this model to estimate site-specific abundance of an invasive species, feral swine (Sus scrofa), using removal data from aerial gunning in 59 site/time-frame combinations (480–19,600 acres) throughout Oklahoma and Texas, USA. Simulations showed that abundance estimates were generally accurate when effective removal rates (removal rate accounting for total effort) were above 0.40. However, when abundances were small (<50), the effective removal rate needed to estimate abundance accurately was considerably higher (0.70). Based on our post-validation method, 78% of our site/time-frame estimates were accurate. To use this modeling framework it is important to have multiple removals (more than three) within a time frame during which demographic changes are minimized (i.e., a closed population; ≤3 months for feral swine). Our results show that the probability of accurately estimating abundance from this model improves with increased sampling effort (8+ flight hours across the 3-month window is best) and increased removal rate. Based on the inverse relationship between inaccurate abundances and inaccurate removal rates, we suggest auxiliary information that could be collected and included in the model as covariates (e.g., habitat effects, differences between pilots) to improve accuracy of removal rates and hence abundance estimates.
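
    The authors' model is a Bayesian hierarchical formulation with effort-dependent removal rates; at its core, though, sits the classical removal likelihood, in which each pass removes a binomial draw from the animals remaining. A brute-force maximum-likelihood sketch of the simpler constant-rate (Zippin-type) variant, with made-up catch data:

```python
import numpy as np
from scipy.stats import binom

def removal_loglik(n_total, p, catches):
    """Log-likelihood of a constant-rate removal model:
    pass i removes Binomial(remaining, p) animals."""
    ll, remaining = 0.0, n_total
    for c in catches:
        if c > remaining:
            return -np.inf
        ll += binom.logpmf(c, remaining, p)
        remaining -= c
    return ll

def removal_mle(catches, n_max=2000):
    """Grid-search MLE for abundance N and per-pass removal rate p."""
    best = (None, None, -np.inf)
    for n in range(sum(catches), n_max):
        for p in np.linspace(0.05, 0.95, 19):
            ll = removal_loglik(n, p, catches)
            if ll > best[2]:
                best = (n, p, ll)
    return best

print(removal_mle([120, 60, 35]))  # three aerial removal passes
```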

  3. Application of the ReNuMa model in the Sha He river watershed: tools for watershed environmental management.

    PubMed

    Sha, Jian; Liu, Min; Wang, Dong; Swaney, Dennis P; Wang, Yuqiu

    2013-07-30

    Models and related analytical methods are critical tools for use in modern watershed management. A modeling approach for quantifying the source apportionment of dissolved nitrogen (DN) and associated tools for examining the sensitivity and uncertainty of the model estimates were assessed for the Sha He River (SHR) watershed in China. The Regional Nutrient Management model (ReNuMa) was used to infer the primary sources of DN in the SHR watershed. This model is based on the Generalized Watershed Loading Functions (GWLF) and the Net Anthropogenic Nutrient Input (NANI) framework, modified to improve the characterization of subsurface hydrology and septic system loads. Hydrochemical processes of the SHR watershed, including streamflow, DN load fluxes, and corresponding DN concentration responses, were simulated following calibrations against observations of streamflow and DN fluxes. Uncertainty analyses were conducted with a Monte Carlo analysis to vary model parameters for assessing the associated variations in model outputs. The model performed accurately at the watershed scale and provided estimates of monthly streamflows and nutrient loads as well as DN source apportionments. The simulations identified the dominant contribution of agricultural land use and significant monthly variations. These results provide valuable support for science-based watershed management decisions and indicate the utility of ReNuMa for such applications.

  4. 76 FR 75599 - 60-Day Notice of Proposed Information Collection: DS-71, Affidavit of Identifying Witness, 1405-0088

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-02

    ... of a passport applicant. Estimated Number of Respondents: 44,000 per year. Estimated Number of...): Passport Forms Management Officer, U.S. Department of State, Office of Program Management and Operational... documents, to Passport Forms Management Officer, U.S. Department of State, Office of Program Management...

  5. Age-specific survival of male golden-cheeked warblers on the Fort Hood Military Reservation, Texas

    USGS Publications Warehouse

    Duarte, Adam; Hines, James E.; Nichols, James D.; Hatfield, Jeffrey S.; Weckerly, Floyd W.

    2014-01-01

    Population models are essential components of large-scale conservation and management plans for the federally endangered Golden-cheeked Warbler (Setophaga chrysoparia; hereafter GCWA). However, existing models are based on vital rate estimates calculated using relatively small data sets that are now more than a decade old. We estimated more current, precise adult and juvenile apparent survival (Φ) probabilities and their associated variances for male GCWAs. In addition to providing estimates for use in population modeling, we tested hypotheses about spatial and temporal variation in Φ. We assessed whether a linear trend in Φ or a change in the overall mean Φ corresponded to an observed increase in GCWA abundance during 1992-2000 and if Φ varied among study plots. To accomplish these objectives, we analyzed long-term GCWA capture-resight data from 1992 through 2011, collected across seven study plots on the Fort Hood Military Reservation using a Cormack-Jolly-Seber model structure within program MARK. We also estimated Φ process and sampling variances using a variance-components approach. Our results did not provide evidence of site-specific variation in adult Φ on the installation. Because of a lack of data, we could not assess whether juvenile Φ varied spatially. We did not detect a strong temporal association between GCWA abundance and Φ. Mean estimates of Φ for adult and juvenile male GCWAs for all years analyzed were 0.47 with a process variance of 0.0120 and a sampling variance of 0.0113 and 0.28 with a process variance of 0.0076 and a sampling variance of 0.0149, respectively. Although juvenile Φ did not differ greatly from previous estimates, our adult Φ estimate suggests previous GCWA population models were overly optimistic with respect to adult survival. These updated Φ probabilities and their associated variances will be incorporated into new population models to assist with GCWA conservation decision making.
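
    The variance-components step reported above separates true year-to-year (process) variation in Φ from sampling noise. The authors used the variance-components machinery of program MARK; the sketch below shows the crude method-of-moments version of the same idea, with made-up annual estimates and standard errors rather than the study's data.

```python
import numpy as np

def process_variance(estimates, sampling_vars):
    """Method-of-moments decomposition: process variance is approximately
    the total variance of the annual estimates minus the mean of their
    sampling variances (floored at zero)."""
    total_var = np.var(estimates, ddof=1)
    return max(total_var - np.mean(sampling_vars), 0.0)

phi = np.array([0.42, 0.55, 0.47, 0.39, 0.52])      # annual survival estimates
se = np.array([0.04, 0.05, 0.035, 0.045, 0.04])     # their standard errors
print(round(process_variance(phi, se**2), 4))       # process variance
```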

  6. Application of maximum-likelihood estimation in optical coherence tomography for nanometer-class thickness estimation

    NASA Astrophysics Data System (ADS)

    Huang, Jinxin; Yuan, Qun; Tankam, Patrice; Clarkson, Eric; Kupinski, Matthew; Hindman, Holly B.; Aquavella, James V.; Rolland, Jannick P.

    2015-03-01

    In biophotonics imaging, one important and quantitative task is layer-thickness estimation. In this study, we investigate the approach of combining optical coherence tomography and a maximum-likelihood (ML) estimator for layer thickness estimation in the context of tear film imaging. The motivation of this study is to extend our understanding of tear film dynamics, which is the prerequisite to advance the management of Dry Eye Disease, through the simultaneous estimation of the thickness of the tear film lipid and aqueous layers. The estimator takes into account the different statistical processes associated with the imaging chain. We theoretically investigated the impact of key system parameters, such as the axial point spread function (PSF) and various sources of noise, on measurement uncertainty. Simulations show that an OCT system with a 1 μm axial PSF (FWHM) allows unbiased estimates down to the nanometer scale with nanometer precision. For implementation, we built a customized Fourier-domain OCT system that operates in the 600 to 1000 nm spectral window and achieves a 0.93 μm axial PSF in corneal epithelium. We then validated the theoretical framework with physical phantoms made of custom optical coatings, with layer thicknesses from tens of nanometers to microns. Results demonstrate unbiased nanometer-class thickness estimates in three different physical phantoms.
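
    Under additive Gaussian noise, the ML thickness estimate reduces to least-squares fitting of a parametric axial profile, which is the intuition behind sub-resolution estimates like those above. The sketch below is a deliberately simplified intensity-domain toy (two PSF-blurred interface reflections, grid search over thickness); the real estimator models the full statistics of the spectral interferogram, and all names and numbers here are illustrative.

```python
import numpy as np

def profile(z, thickness, psf_fwhm=1.0):
    """Axial intensity model: two interface reflections blurred by a
    Gaussian PSF, separated by the layer thickness (all in microns)."""
    sigma = psf_fwhm / 2.355
    return (np.exp(-(z - 0.0) ** 2 / (2 * sigma**2)) +
            0.6 * np.exp(-(z - thickness) ** 2 / (2 * sigma**2)))

def ml_thickness(z, data, grid):
    """With additive Gaussian noise, ML estimation = minimizing squared error."""
    errs = [np.sum((data - profile(z, t)) ** 2) for t in grid]
    return grid[int(np.argmin(errs))]

rng = np.random.default_rng(0)
z = np.linspace(-3, 6, 600)
truth = 0.8                                 # sub-resolution layer thickness
data = profile(z, truth) + rng.normal(0, 0.02, z.size)
grid = np.linspace(0.0, 3.0, 3001)          # 1 nm search grid
print(ml_thickness(z, data, grid))          # recovers ~0.8
```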

  7. Protecting Lake Ontario - Treating Wastewater from the Remediated Low-Level Radioactive Waste Management Facility - 13227

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freihammer, Till; Chaput, Barb; Vandergaast, Gary

    2013-07-01

    The Port Granby Project is part of the larger Port Hope Area Initiative, a community-based program for the development and implementation of a safe, local, long-term management solution for historic low-level radioactive waste (LLRW) and marginally contaminated soils (MCS). The Port Granby Project involves the relocation and remediation of up to 0.45 million cubic metres of such waste from the current Port Granby Waste Management Facility located in the Municipality of Clarington, Ontario, adjacent to the shoreline of Lake Ontario. The waste material will be transferred to a new suitably engineered Long-Term Waste Management Facility (LTWMF) to be located inland approximately 700 m from the existing site. The development of the LTWMF will include construction and commissioning of a new Wastewater Treatment Plant (WWTP) designed to treat wastewater consisting of contaminated surface runoff and leachate generated during the site remediation process at the Port Granby Waste Management Facility, as well as long-term leachate generated at the new LTWMF. Numerous factors will influence the variable wastewater flow rates and influent loads to the new WWTP during remediation. The treatment processes will comprise equalization to minimize impacts from hydraulic peaks, fine screening, membrane bioreactor technology, and reverse osmosis. The residuals treatment will comprise lime precipitation, thickening, dewatering, evaporation, and drying. The distribution of the concentration of uranium and radium-226 over the various process streams in the WWTP was estimated. This information was used to assess potential worker exposure to radioactivity in the various process areas. A mass balance approach was used to assess the distribution of uranium and radium-226, by applying individual contaminant removal rates for each process element of the WWTP, based on pilot-scale results and experience-based assumptions. The mass balance calculations were repeated for various flow and load scenarios. (authors)

  8. A Systems Engineering Framework for Implementing a Security and Critical Patch Management Process in Diverse Environments (Academic Departments' Workstations)

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hadi

    Use of the Patch Vulnerability Management (PVM) process should be seriously considered for any networked computing system. The PVM process prevents the operating system (OS) and software applications from being attacked due to security vulnerabilities, which lead to system failures and critical data leakage. The purpose of this research is to create and design a Security and Critical Patch Management Process (SCPMP) framework based on Systems Engineering (SE) principles. This framework will assist Information Technology Department Staff (ITDS) to reduce IT operating time and costs and mitigate the risk of security and vulnerability attacks. Further, this study evaluates implementation of the SCPMP in the networked computing systems of an academic environment in order to: 1. Meet patch management requirements by applying SE principles. 2. Reduce the cost of IT operations and PVM cycles. 3. Improve the current PVM methodologies to prevent networked computing systems from becoming the targets of security vulnerability attacks. 4. Embed a Maintenance Optimization Tool (MOT) in the proposed framework. The MOT allows IT managers to make the most practicable choice of methods for deploying and installing released patches and vulnerability remediation. In recent years, there has been a variety of frameworks for security practices in every networked computing system to protect computer workstations from becoming compromised or vulnerable to security attacks, which can expose important information and critical data. I have developed a new mechanism for implementing PVM for maximizing security-vulnerability maintenance, protecting OS and software packages, and minimizing SCPMP cost. To increase computing system security in any diverse environment, particularly in academia, one must apply SCPMP. I propose an optimal maintenance policy that will allow ITDS to measure and estimate the variation of PVM cycles based on their department's requirements. My results demonstrate that MOT optimizes the process of implementing SCPMP in academic workstations.

  9. The role of models in estimating consequences as part of the risk assessment process.

    PubMed

    Forde-Folle, K; Mitchell, D; Zepeda, C

    2011-08-01

    The degree of disease risk represented by the introduction, spread, or establishment of one or several diseases through the importation of animals and animal products is assessed by importing countries through an analysis of risk. The components of a risk analysis include hazard identification, risk assessment, risk management, and risk communication. A risk assessment starts with identification of the hazard(s) and then continues with four interrelated steps: release assessment, exposure assessment, consequence assessment, and risk estimation. Risk assessments may be either qualitative or quantitative. This paper describes how, through the integration of epidemiological and economic models, the potential adverse biological and economic consequences of exposure can be quantified.
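
    The quantitative integration of release, exposure, and consequence assessments described above is often implemented as a stochastic simulation: the risk estimate combines the probability of each step with the distribution of losses if all steps occur. A minimal Monte Carlo sketch with entirely illustrative probabilities and loss distribution, not values from any real assessment:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                   # simulated consignments
release = rng.random(n) < 0.02                # P(agent enters with import)
exposure = rng.random(n) < 0.10               # P(susceptible herd exposed | release)
loss = rng.lognormal(mean=13, sigma=1, size=n)  # economic consequence if both occur

outcome = np.where(release & exposure, loss, 0.0)
print("expected loss per consignment:", round(outcome.mean(), 2))
print("99th-percentile loss:", round(np.quantile(outcome, 0.99), 2))
```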

  10. Wastewater Management Alternatives for the Cleveland - Akron, Three Rivers Watershed Area. Technical Appendix - Phase II. System Design and Estimate of Cost.

    DTIC Science & Technology

    1973-02-01

    established. Secondly, the applicable process sequence to most economically meet these requirements under local environmental constraints must be...concentrations are highest for receiving waters containing cold water fisheries. Allowable fecal coliform bacteria counts vary seasonally and dictate...handling system has also been modified to include gravity waste activated sludge thickening and heat conditioning of the combined raw sludge after

  11. [Suicide in the elderly – risk factors and prevention].

    PubMed

    Linnemann, Christoph; Leyhe, Thomas

    2015-10-01

    Suicide rates are highest among the elderly in Switzerland. The estimated number of unreported cases is particularly high in this age group. The risk factors are multidimensional, including depression and social isolation. The detection and management of the controllable risk factors, foremost depression, is of particular importance for suicide prevention. Old age depression often shows an atypical presentation, is misinterpreted as a normal process of aging and is not adequately treated.

  12. Review of Methods and Algorithms for Dynamic Management of CBRNE Collection Assets

    DTIC Science & Technology

    2013-07-01

    where they should be looking. An example sensor is a satellite with a limited energy budget, which may have power to operate, say, only 10 percent of...calculations by incorporating sensor data with initial dispersion estimates.1 DTRA and the Joint Science and Technology Office for Chem-Bio Defense (JSTO-CBD...detection performance through remote processing and fusion of sensor data and modeling of the operational environment. DTRA is actively developing

  13. The contribution of space observations to water resources management; Proceedings of the Symposium, Bangalore, India, May 29-June 9, 1979

    NASA Technical Reports Server (NTRS)

    Salomonson, V. V. (Editor); Bhavsar, P. D.

    1980-01-01

    The symposium focused on hydrology, soil moisture estimation and ground water exploration, wetlands monitoring and water quality estimation, hydrometeorology, snow and ice monitoring, and evapotranspiration estimation. Other problems discussed include surface water and flood mapping, watershed runoff estimation and prediction, and new space systems contributing to water resources management.

  14. Estimating tag loss of the Atlantic Horseshoe crab, Limulus polyphemus, using a multi-state model

    USGS Publications Warehouse

    Butler, Catherine Alyssa; McGowan, Conor P.; Grand, James B.; Smith, David

    2012-01-01

    The Atlantic Horseshoe crab, Limulus polyphemus, is a valuable resource along the Mid-Atlantic coast which has, in recent years, experienced new management paradigms due to increased concern about this species' role in the environment. While current management actions are underway, many acknowledge the need for improved and updated parameter estimates to reduce the uncertainty within the management models. Specifically, updated and improved estimates of demographic parameters, such as adult crab survival in the regional population of interest, Delaware Bay, could greatly enhance these models and improve management decisions. There is, however, some concern that difficulties in tag resighting or complete loss of tags could be occurring. As is apparent from the assumptions of a Jolly-Seber model, loss of tags can bias survival estimates downward. Given that uncertainty, as a first step towards an unbiased estimate of adult survival, we estimated the rate of tag loss. Using data from a double-tag mark-resight study conducted in Delaware Bay and Program MARK, we designed a multi-state model to allow the loss of each tag to be estimated separately and simultaneously.

  15. Using ecological production functions to link ecological ...

    EPA Pesticide Factsheets

    Ecological production functions (EPFs) link ecosystems, stressors, and management actions to ecosystem services (ES) production. Although EPFs are acknowledged as being essential to improve environmental management, their use in ecological risk assessment has received relatively little attention. Ecological production functions may be defined as usable expressions (i.e., models) of the processes by which ecosystems produce ES, often including external influences on those processes. We identify key attributes of EPFs and discuss both actual and idealized examples of their use to inform decision making. Whenever possible, EPFs should estimate final, rather than intermediate, ES. Although various types of EPFs have been developed, we suggest that EPFs are more useful for decision making if they quantify ES outcomes, respond to ecosystem condition, respond to stressor levels or management scenarios, reflect ecological complexity, rely on data with broad coverage, have performed well previously, are practical to use, and are open and transparent. In an example using pesticides, we illustrate how EPFs with these attributes could enable the inclusion of ES in ecological risk assessment. The biggest challenges to ES inclusion are limited data sets that are easily adapted for use in modeling EPFs and generally poor understanding of linkages among ecological components and the processes that ultimately deliver the ES. We conclude by advocating for the incorporation of EPFs into ecological risk assessment.

  16. Applying total quality management concepts to public health organizations.

    PubMed Central

    Kaluzny, A D; McLaughlin, C P; Simpson, K

    1992-01-01

    Total quality management (TQM) is a participative, systematic approach to planning and implementing a continuous organizational improvement process. Its approach is focused on satisfying customers' expectations, identifying problems, building commitment, and promoting open decision-making among workers. TQM applies analytical tools, such as flow and statistical charts and check sheets, to gather data about activities within an organization. TQM uses process techniques, such as nominal groups, brainstorming, and consensus forming to facilitate communication and decision making. TQM applications in the public sector and particularly in public health agencies have been limited. The process of integrating TQM into public health agencies complements and enhances the Model Standards Program and assessment methodologies, such as the Assessment Protocol for Excellence in Public Health (APEX-PH), which are mechanisms for establishing strategic directions for public health. The authors examine the potential for using TQM as a method to achieve and exceed standards quickly and efficiently. They discuss the relationship of performance standards and assessment methodologies with TQM and provide guidelines for achieving the full potential of TQM in public health organizations. The guidelines include redefining the role of management, defining a common corporate culture, refining the role of citizen oversight functions, and setting realistic estimates of the time needed to complete a task or project. PMID:1594734

  17. A system to build distributed multivariate models and manage disparate data sharing policies: implementation in the scalable national network for effectiveness research.

    PubMed

    Meeker, Daniella; Jiang, Xiaoqian; Matheny, Michael E; Farcas, Claudiu; D'Arcy, Michel; Pearlman, Laura; Nookala, Lavanya; Day, Michele E; Kim, Katherine K; Kim, Hyeoneui; Boxwala, Aziz; El-Kareh, Robert; Kuo, Grace M; Resnic, Frederic S; Kesselman, Carl; Ohno-Machado, Lucila

    2015-11-01

    Centralized and federated models for sharing data in research networks currently exist. To build multivariate data analysis for centralized networks, transfer of patient-level data to a central computation resource is necessary. The authors implemented distributed multivariate models for federated networks in which patient-level data is kept at each site and data exchange policies are managed in a study-centric manner. The objective was to implement infrastructure that supports the functionality of some existing research networks (e.g., cohort discovery, workflow management, and estimation of multivariate analytic models on centralized data) while adding additional important new features, such as algorithms for distributed iterative multivariate models, a graphical interface for multivariate model specification, synchronous and asynchronous response to network queries, investigator-initiated studies, and study-based control of staff, protocols, and data sharing policies. Based on the requirements gathered from statisticians, administrators, and investigators from multiple institutions, the authors developed infrastructure and tools to support multisite comparative effectiveness studies using web services for multivariate statistical estimation in the SCANNER federated network. The authors implemented massively parallel (map-reduce) computation methods and a new policy management system to enable each study initiated by network participants to define the ways in which data may be processed, managed, queried, and shared. The authors illustrated the use of these systems among institutions with highly different policies and operating under different state laws. Federated research networks need not limit distributed query functionality to count queries, cohort discovery, or independently estimated analytic models. Multivariate analyses can be efficiently and securely conducted without patient-level data transport, allowing institutions with strict local data storage requirements to participate in sophisticated analyses based on federated research networks.
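
    The distributed iterative multivariate models described above rest on a simple pattern: each site computes a summary statistic (here, a gradient) on its local patient records, and only those summaries cross institutional boundaries. A minimal sketch of federated logistic regression in that style, on synthetic data; this illustrates the general pattern, not SCANNER's actual web-service protocol.

```python
import numpy as np

def local_gradient(w, X, y):
    """Each site computes the logistic-loss gradient on its own patients."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y)

def federated_fit(sites, dim, lr=0.5, iters=1000):
    """Coordinator sums per-site gradients; patient-level data never moves."""
    w = np.zeros(dim)
    n_total = sum(len(y) for _, y in sites)
    for _ in range(iters):
        g = sum(local_gradient(w, X, y) for X, y in sites) / n_total
        w -= lr * g
    return w

rng = np.random.default_rng(7)
w_true = np.array([1.0, -2.0, 0.5])
sites = []
for _ in range(3):  # three institutions, each with local data only
    X = rng.normal(size=(300, 3))
    y = (rng.random(300) < 1 / (1 + np.exp(-X @ w_true))).astype(float)
    sites.append((X, y))
print(federated_fit(sites, dim=3).round(2))  # approaches w_true
```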

  18. Managing design excellence tools during the development of new orthopaedic implants.

    PubMed

    Défossez, Henri J P; Serhan, Hassan

    2013-11-01

    Design excellence (DEX) tools have been widely used for years in some industries for their potential to facilitate new product development. The medical sector, targeted by cost pressures, has therefore started adopting them. Numerous tools are available; however, only appropriate deployment during the new product development stages can optimize the overall process. The primary study objectives were to describe generic tools, illustrate their implementation and management during the development of new orthopaedic implants, and compile a reference package. Secondary objectives were to present the DEX tool investment costs and savings, since the method can require significant resources for which companies must carefully plan. The publicly available DEX method "Define Measure Analyze Design Verify Validate" was adopted and implemented during the development of a new spinal implant. Several tools proved most successful at developing the correct product, addressing clinical needs, and increasing market penetration potential, while reducing design iterations and manufacturing validations. Cost analysis and a Pugh Matrix coupled with multi-generation planning enabled the development of a strong rationale to activate the project and set the vision and goals. Improved risk management and a product usage map established a robust technical verification-validation program. Design of experiments and process quantification facilitated design for manufacturing of critical features, as early as the concept phase. Biomechanical testing with analysis of variance provided a validation model with a recognized statistical performance baseline. Within those tools, only certain ones required minimum resources (i.e., business case, multi-generational plan, project value proposition, Pugh Matrix, critical-to-quality process validation techniques), while others required significant investments (i.e., voice of customer, product usage map, improved risk management, design of experiments, biomechanical testing techniques). All used techniques provided savings exceeding investment costs. Some other tools were considered and found less relevant. A matrix summarized the investment costs and generated estimated savings. Globally, all companies can benefit from using DEX by smartly selecting and estimating those tools with the best return on investment at the start of the project. For this, a good understanding of the available company resources, background, and development strategy is needed. In conclusion, it was possible to illustrate that appropriate management of design excellence tools can greatly facilitate the development of new orthopaedic implant systems.

  19. Estimating costs of pressure area management based on a survey of ulcer care in one Irish hospital.

    PubMed

    Gethin, G; Jordan-O'Brien, J; Moore, Z

    2005-04-01

    Pressure ulceration remains a significant cause of morbidity for patients and has a real economic impact on the health sector. Studies to date have estimated the cost of management but have not always given a breakdown of how these figures were calculated. There are no published studies that have estimated the cost of management of pressure ulcers in Ireland. A two-part study was therefore undertaken. Part one determined the prevalence of pressure ulcers in a 626-bed Irish acute hospital. Part two set out to derive a best estimate of the cost of managing pressure ulcers in Ireland. The European Pressure Ulcer Advisory Panel (EPUAP) minimum data set tool was used to complete the prevalence survey. Tissue viability nurses trained in the data-collection tool collected the data. A cost was obtained for all items of care for the management of one patient with three grade IV pressure ulcers over a five-month period. Of the patients, 2.5% had pressure ulcers. It cost €119,000 to successfully treat one patient. We estimate that it costs €250,000,000 per annum to manage pressure ulcers across all care settings in Ireland.

  20. Superstructure-based Design and Optimization of Batch Biodiesel Production Using Heterogeneous Catalysts

    NASA Astrophysics Data System (ADS)

    Nuh, M. Z.; Nasir, N. F.

    2017-08-01

    Biodiesel is a fuel comprised of mono-alkyl esters of long-chain fatty acids derived from renewable lipid feedstocks, such as vegetable oils and animal fats. Biodiesel production is a complex process that needs systematic design and optimization, yet no previous case study has applied the process system engineering (PSE) elements of superstructure optimization to the batch process, which involves complex problems formulated as mixed-integer nonlinear programming (MINLP). PSE offers a solution to complex engineering systems by enabling the use of viable tools and techniques to better manage and comprehend the complexity of the system. This study aims to apply PSE tools to the simulation and optimization of the biodiesel process and to develop mathematical models for the components of the plant for cases A, B, and C using published kinetic data; secondly, to determine the economic analysis of biodiesel production, focusing on heterogeneous catalysts; and finally, to develop the superstructure for biodiesel production using a heterogeneous catalyst. The mathematical models are developed from the superstructure, and the resulting mixed-integer nonlinear model is solved and the economic analysis estimated using MATLAB software. The optimization, with the objective function of minimizing the annual production cost of the batch process, yields 23.2587 million USD for case C. Overall, this implementation of PSE optimized the modelling, design, and cost estimation, resolving the complexity of batch biodiesel production and processing.

  1. Advancing Methods for Estimating Soil Nitrous Oxide Emissions by Incorporating Freeze-Thaw Cycles into a Tier 3 Model-Based Assessment

    NASA Astrophysics Data System (ADS)

    Ogle, S. M.; DelGrosso, S.; Parton, W. J.

    2017-12-01

    Soil nitrous oxide emissions from agricultural management are a key source of greenhouse gas emissions in many countries due to the widespread use of nitrogen fertilizers, manure amendments from livestock production, planting of legumes, and other practices that affect N dynamics in soils. In the United States, soil nitrous oxide emissions have ranged from 250 to 280 Tg CO2 equivalent from 1990 to 2015, with uncertainties around 20-30 percent. A Tier 3 method has been used to estimate the emissions with the DayCent ecosystem model. While the Tier 3 approach is considerably more accurate than IPCC Tier 1 methods, there is still the possibility of biases in emission estimates if there are processes and drivers that are not represented in the modeling framework. Furthermore, a key principle of IPCC guidance is that inventory compilers estimate emissions as accurately as possible. Freeze-thaw cycles and the associated hot moments of nitrous oxide emissions are a key driver influencing emissions in colder climates, such as the cold temperate climates of the upper Midwest and New England regions of the United States. Freeze-thaw activity interacts with management practices that increase N availability in the plant-soil system, leading to greater nitrous oxide emissions during transition periods from winter to spring. Given the importance of this driver, the DayCent model has been revised to incorporate freeze-thaw cycles, and the results suggest that including this driver can significantly modify the emission estimates in cold temperate climate regions. Consequently, future methodological development to improve estimation of nitrous oxide emissions from soils would benefit from incorporating freeze-thaw cycles into the modeling framework for national territories with a cold climate.

  2. AnimalFinder: A semi-automated system for animal detection in time-lapse camera trap images

    USGS Publications Warehouse

    Price Tack, Jennifer L.; West, Brian S.; McGowan, Conor P.; Ditchkoff, Stephen S.; Reeves, Stanley J.; Keever, Allison; Grand, James B.

    2017-01-01

    Although the use of camera traps in wildlife management is well established, technologies to automate image processing have been much slower in development, despite their potential to drastically reduce personnel time and cost required to review photos. We developed AnimalFinder in MATLAB® to identify animal presence in time-lapse camera trap images by comparing individual photos to all images contained within the subset of images (i.e. photos from the same survey and site), with some manual processing required to remove false positives and collect other relevant data (species, sex, etc.). We tested AnimalFinder on a set of camera trap images and compared the presence/absence results with manual-only review with white-tailed deer (Odocoileus virginianus), wild pigs (Sus scrofa), and raccoons (Procyon lotor). We compared abundance estimates, model rankings, and coefficient estimates of detection and abundance for white-tailed deer using N-mixture models. AnimalFinder performance varied depending on a threshold value that affects program sensitivity to frequently occurring pixels in a series of images. Higher threshold values led to fewer false negatives (missed deer images) but increased manual processing time, but even at the highest threshold value, the program reduced the images requiring manual review by ~40% and correctly identified >90% of deer, raccoon, and wild pig images. Estimates of white-tailed deer were similar between AnimalFinder and the manual-only method (~1–2 deer difference, depending on the model), as were model rankings and coefficient estimates. Our results show that the program significantly reduced data processing time and may increase efficiency of camera trapping surveys.
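
    AnimalFinder itself is a MATLAB tool, and its comparison of each photo against the other images from the same survey and site is more involved than simple differencing; still, the core idea of flagging frames that deviate from a per-site background, with a sensitivity threshold that trades missed animals against manual review time, can be sketched in a few lines. The threshold and minimum-pixel values below are arbitrary placeholders, not the program's settings.

```python
import numpy as np

def flag_animal_frames(stack, threshold=25, min_pixels=500):
    """Flag time-lapse frames that differ from the per-site median background.
    stack: (n_frames, height, width) grayscale array from one survey/site."""
    background = np.median(stack, axis=0)
    flags = []
    for frame in stack:
        changed = np.abs(frame.astype(float) - background) > threshold
        flags.append(int(changed.sum()) >= min_pixels)
    return flags  # True -> route the photo to manual review

rng = np.random.default_rng(3)
stack = rng.normal(120, 5, size=(50, 100, 100))  # 50 synthetic frames
stack[10, 40:70, 40:70] += 80                    # "animal" in frame 10
print([i for i, f in enumerate(flag_animal_frames(stack)) if f])  # -> [10]
```

    Raising the threshold here plays the same role as the program's sensitivity setting: fewer frames are flagged, reducing review time at the cost of more missed detections.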

  3. The Water Availability Tool for Environmental Resources (WATER): A Water-Budget Modeling Approach for Managing Water-Supply Resources in Kentucky - Phase I: Data Processing, Model Development, and Application to Non-Karst Areas

    USGS Publications Warehouse

    Williamson, Tanja N.; Odom, Kenneth R.; Newson, Jeremy K.; Downs, Aimee C.; Nelson, Hugh L.; Cinotto, Peter J.; Ayers, Mark A.

    2009-01-01

    The Water Availability Tool for Environmental Resources (WATER) was developed in cooperation with the Kentucky Division of Water to provide a consistent and defensible method of estimating streamflow and water availability in ungaged basins. WATER is process oriented; it is based on the TOPMODEL code and incorporates historical water-use data together with physiographic data that quantitatively describe topography and soil-water storage. The result is a user-friendly decision tool that can estimate water availability in non-karst areas of Kentucky without additional data or processing. The model runs on a daily time step, and critical source data include a historical record of daily temperature and precipitation, digital elevation models (DEMs), the Soil Survey Geographic Database (SSURGO), and historical records of water discharges and withdrawals. The model was calibrated and statistically evaluated for 12 basins by comparing the estimated discharge to that observed at U.S. Geological Survey streamflow-gaging stations. When statistically evaluated over a 2,119-day time period, the discharge estimates showed a bias of -0.29 to 0.42, a root mean square error of 1.66 to 5.06, a correlation of 0.54 to 0.85, and a Nash-Sutcliffe Efficiency of 0.26 to 0.72. The parameter and input modifications that most significantly improved the accuracy and precision of streamflow-discharge estimates were the addition of Next Generation radar (NEXRAD) precipitation data, a rooting depth of 30 centimeters, and a TOPMODEL scaling parameter (m) derived directly from SSURGO data that was multiplied by an adjustment factor of 0.10. No site-specific optimization was used.
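
    The statistical evaluation quoted above (bias, root mean square error, correlation, and Nash-Sutcliffe efficiency between observed and estimated discharge) is straightforward to reproduce for any gaged basin. A minimal sketch with illustrative discharge values, not data from the 12 calibration basins:

```python
import numpy as np

def eval_streamflow(obs, sim):
    """Goodness-of-fit metrics for daily discharge estimates."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    bias = np.mean(sim - obs)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    corr = np.corrcoef(obs, sim)[0, 1]
    nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    return {"bias": bias, "rmse": rmse, "correlation": corr, "NSE": nse}

obs = np.array([1.2, 3.4, 2.2, 5.1, 4.0, 2.8])  # observed discharge
sim = np.array([1.0, 3.0, 2.5, 4.6, 4.4, 2.6])  # model estimate
print(eval_streamflow(obs, sim))
```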

  4. Economical and Environmentally Benign Extraction of Rare Earth Elements (REES) from Coal & Coal Byproducts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, Gary

    This final report provides a complete summary of the activities, results, analytical discussion, and overall evaluation of the project titled “Economical and Environmentally Benign Extraction of Rare Earth Elements (REES) from Coal & Coal Byproducts” under DOE Award Number DE-FE-0027155, which started in March 2016 and ended December 2017. Fly ash was selected as the coal-byproduct source material because it is readily available with no need for extensive methods to obtain the material, it is produced in large quantities (>50 million tons per year), and it had REE concentrations similar to other coal byproducts. The selected fly ash used throughout this project was from the Mill Creek power generating facility operated by Louisville Gas and Electric, located in Louisville, KY, and was subjected to a variety of physical and chemical characterization tests. Results from fusion extractions showed that the selected fly ash had a TREE+Y concentration of 480 ppm with a critical REE concentration of 200 ppm. The fly ash had an outlook ratio of 1.25 and an estimated value of $16-$18 worth of salable REEs per 1 tonne of fly ash. Additional characterizations by optical evaluation, QEMSCAN, XRD, size fractionation, and SEM analysis showed the fly ash consisted of small glassy spherules with a size range between 1 and 110 µm (average diameter 13 µm), was heterogeneous in chemical composition (main crystalline phases: aluminum oxides and iron oxides), and was primarily an amorphous material (75 to 80%). A simple stepped approach was completed to estimate the total REE resource quantity. The approach included REE characterization of the representative samples, evaluation of fly-ash availability, and final determination of the estimated resource availability with regard to REE grade on a regional and national scale. This data represents the best available information and is based upon the assumptions that the power generating facility where the fly ash was obtained will use the same coal sources (actual mines were identified), the coal materials will have relatively consistent REE concentrations, and the REE extraction process developed during this project can achieve 42% REE recovery (validated and confirmed). Calculations indicated that the estimated REE resource is approximately 175,000 tonnes with a current estimated value of $3,330MM. The proposed REE extraction and production process developed during this project used four fundamental steps: 1) fly-ash pretreatment to enhance REE extraction, 2) REE extraction by acid digestion, 3) REE separation/concentration by carbon adsorption and column chromatography, and 4) REE oxide production. Secondary processing steps to manage process residuals and additional processing techniques to produce value-added products were incorporated into the process during the project. These secondary steps were not only necessary to manage residuals, but also provided additional revenue streams that offset operational and capital expenditures. The process produces one value product stream (production of zeolite Na-P1), a solids waste stream, and one liquid stream that met RCRA discharge requirements. Based upon final design criteria and operational parameters, the proposed system could produce approximately 200 grams of REOs from 1 tonne of fly ash, thereby representing a TREE+Y recovery of 42% (project target of >25%).
A detailed economic model was developed to evaluate both CAPEX and OPEX estimates for systems with capacities between 100 kg and 200 tonnes of fly ash processed per day. For a standard system capacity of 10 tonnes/day, capital costs were estimated at $88/kg fly ash while operating costs were estimated at approximately $450/kg fly ash. This operating cost estimate includes revenue of $495/tonne of fly ash processed from the value-added product produced by the system (zeolite Na-P1). Although operating cost savings due to zeolite production were significant, the combined capital and operating cost for a 10 tonne/day system exceeded the total dollar value of REEs present in the fly ash material. Specifically, the estimated cost per 1 tonne of fly ash treated is approximately $540, while the estimated value of REEs in the fly ash is $18-$20/tonne. This large difference shows that the proposed process is not economically feasible strictly on the basis of REE revenue compared to extraction costs. Although the current proposed system does not produce sufficient quantities of REEs or additional revenue sources to offset operational and capital costs, supplementary factors including US strategic concerns, commercial demands, and defense department requirements must be factored in. At this time, the process developed during this project provides foundational information for future development of simple processes that require low capital investment and that will extract a valuable quality and quantity of REE oxides from industrial waste.

  5. Benchmark requirements for the the Energy Emergency Management Information System (EEMIS). Phase 1: Work plan

    NASA Astrophysics Data System (ADS)

    1980-09-01

    The energy emergency management information system (EEMIS) has responsibility for providing special information and communication services to government officials at Federal and state levels who must deal with energy emergencies. Because of proprietary information residing in the database used for federal purposes, a special system (EEMIS-S) must be established for use by the states. It is planned to acquire teleprocessing services for EEMIS-S from a commercial time-sharing vendor, and the procurement process must meet guidelines for approval. The work plan and schedule for meeting these guidelines are discussed. Each task carries estimates of the time, cost, and resources required, all of which are briefly described.

  6. The Biggest Loser Thinks Long-Term: Recency as a Predictor of Success in Weight Management

    PubMed Central

    Koritzky, Gilly; Rice, Chantelle; Dieterle, Camille; Bechara, Antoine

    2015-01-01

    Only a minority of participants in behavioral weight management lose weight significantly. The ability to predict who is likely to benefit from weight management can improve the efficiency of obesity treatment. Identifying predictors of weight loss can also reveal potential ways to improve existing treatments. We propose a neuro-psychological model that is focused on recency: the reliance on recent information at the expense of time-distant information. Forty-four weight-management patients completed a decision-making task and their recency level was estimated by a mathematical model. Impulsivity and risk-taking were also measured for comparison. Weight loss was measured in the end of the 16-week intervention. Consistent with our hypothesis, successful dieters (n = 12) had lower recency scores than unsuccessful ones (n = 32; p = 0.006). Successful and unsuccessful dieters were similar in their demographics, intelligence, risk taking, impulsivity, and delay of gratification. We conclude that dieters who process time-distant information in their decision making are more likely to lose weight than those who are high in recency. We argue that having low recency facilitates future-oriented thinking, and thereby contributes to behavior change treatment adherence. Our findings underline the importance of choosing the right treatment for every individual, and outline a way to improve weight-management processes for more patients. PMID:26696930

  7. NASA Land Information System (LIS) Water Availability to Support Reclamation ET Estimation

    NASA Technical Reports Server (NTRS)

    Toll, David; Arsenault, Kristi; Pinheiro, Ana; Peters-Lidard, Christa; Houser, Paul; Kumar, Sujay; Engman, Ted; Nigro, Joe; Triggs, Jonathan

    2005-01-01

    The U.S. Bureau of Reclamation identified the remote sensing of evapotranspiration (ET) as an important water flux for study and designated a test site in the Lower Colorado River basin. A consortium of groups will work together with the goal of developing more accurate and cost-effective techniques using the enhanced spatial and temporal coverage afforded by remote sensing. ET is a critical water loss flux whose improved estimation should lead to better management of Reclamation responsibilities. There are several areas where NASA satellite and modeling data may be useful in meeting Reclamation's objectives for improved ET estimation. In this paper we outline one possible contribution: using the data integration capability of NASA's Land Information System (LIS) to merge observations (in situ and satellite) with physical process models and thereby provide estimates of ET and other water availability outputs (e.g., runoff, soil moisture) retrospectively, in near real-time, and as short-term predictions.

  8. Estimating TCP Packet Loss Ratio from Sampled ACK Packets

    NASA Astrophysics Data System (ADS)

    Yamasaki, Yasuhiro; Shimonishi, Hideyuki; Murase, Tutomu

    The advent of various quality-sensitive applications has greatly changed the requirements for IP network management and made the monitoring of individual traffic flows more important. Since the processing costs of per-flow quality monitoring are high, especially on high-speed backbone links, packet sampling techniques have been attracting considerable attention. Existing sampling techniques, such as those used in Sampled NetFlow and sFlow, however, focus on monitoring traffic volume, and there has been little discussion of monitoring quality indexes such as the packet loss ratio. In this paper we propose a method for estimating, from sampled packets, the packet loss ratio in individual TCP sessions. It detects packet loss events by monitoring the duplicate ACKs raised by each TCP receiver. Because sampling reveals only a portion of the packet losses, the actual loss ratio is estimated statistically. Simulation results show that the proposed method can estimate the TCP packet loss ratio accurately from a 10% sampling of packets.
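
    As a rough sketch of the scaling idea (not the authors' estimator), suppose each loss produces a burst of about k duplicate ACKs and each packet is sampled independently at rate s. A duplicate-ACK event is visible only if at least two of its ACKs are sampled, so observed events can be scaled up by that detection probability; the burst length and counts below are assumptions.

        def detect_prob(k, s):
            """Probability that a burst of k duplicate ACKs is recognized,
            i.e. at least two of its ACKs survive sampling at rate s."""
            return 1.0 - (1.0 - s)**k - k * s * (1.0 - s)**(k - 1)

        def estimate_loss_events(observed_dup_events, s, k=3):
            """Scale duplicate-ACK events seen in the sampled stream up to
            an estimate of the true number of loss events."""
            return observed_dup_events / detect_prob(k, s)

        # e.g. 12 duplicate-ACK events observed at a 10% sampling rate:
        print(round(estimate_loss_events(12, 0.10)))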

  9. Catchment-scale hydrologic implications of parcel-level stormwater management (Ohio USA)

    NASA Astrophysics Data System (ADS)

    Shuster, William; Rhea, Lee

    2013-04-01

    The effectiveness of stormwater management strategies is a key issue affecting decision making on urban water resources management, so proper monitoring and analysis of pilot studies must be addressed before drawing conclusions. We performed a pilot study in the suburban Shepherd Creek watershed in Cincinnati, Ohio, to evaluate the practicality of voluntary incentives for stormwater quantity reduction on privately owned suburban properties. Stream discharge and precipitation were monitored for 3 years before and after implementation of the stormwater management treatments. To implement stormwater control measures, we elicited the participation of citizen landowners with two successive reverse auctions. Auctions were held in spring 2007 and 2008, resulting in the installation of 85 rain gardens and 174 rain barrels. We demonstrated an analytic process of increasing model flexibility to determine the hydrologic effectiveness of stormwater management at the sub-catchment level. Significant proportions of total variance were explained by the effects of study period (~69%) and treatment vs. control (~7%), the latter small but significant. Precipitation-discharge relationships were synthesized in estimated unit hydrographs, which were decomposed and whose components were tested for the influence of treatments. Analysis of unit hydrograph parameters showed a weakened correlation between precipitation and discharge, supporting the output from the initial model that parcel-level green infrastructure added detention capacity to treatment basins. We conclude that retrofit management of stormwater runoff quantity with green infrastructure in a small suburban catchment can be successfully initiated with novel economic incentive programs, and that these measures can impart a small but statistically significant decrease in otherwise uncontrolled runoff volume. Given consistent monitoring data and analysis, water resource managers can use our approach to estimate the actual effectiveness of stormwater runoff volume management, with potential benefits for the management of both separated and combined sewer systems. We also discuss lessons learned with regard to monitoring design for catchment-scale hydrologic studies.

  10. Estimation of Energy Consumption and Greenhouse Gas Emissions considering Aging and Climate Change in Residential Sector

    NASA Astrophysics Data System (ADS)

    Lee, M.; Park, C.; Park, J. H.; Jung, T. Y.; Lee, D. K.

    2015-12-01

    The impacts of climate change, particularly that of rising temperatures, are being observed across the globe and are expected to increase further. To counter this phenomenon, numerous nations are focusing on the reduction of greenhouse gas (GHG) emissions. Because energy demand management is considered a key factor in emissions reduction, it is necessary to estimate energy consumption and GHG emissions in relation to climate change. Further, because South Korea is the world's fastest-aging nation, demographics have also become instrumental in the accurate estimation of energy demand and emissions. Therefore, the purpose of this study is to estimate energy consumption and GHG emissions in the residential sector of South Korea with regard to climate change and aging, to build more accurate strategies for energy demand management and emissions reduction goals. This study, with 2010 and 2050 as the base and target years, respectively, proceeded in two steps. The first step evaluated the effects of aging and climate change on energy demand, and the second estimated future energy use and GHG emissions through projected scenarios. First, aging characteristics and climate change factors were analyzed using logarithmic mean Divisia index (LMDI) decomposition analysis applied to historical data. In the analysis of changes in energy use, the effects of activity, structure, and intensity were considered; the degree of contribution was derived for each effect, along with its relation to energy demand. Second, two types of scenarios were established based on this analysis: business-as-usual and future-characteristics aging scenarios, used in combination with Representative Concentration Pathways (RCP) 2.6 and 8.5. Finally, energy consumption and GHG emissions were estimated using combinations of these scenarios. The results show an increase in energy consumption and GHG emissions from 2010 to 2050. This growth is caused by increases in heating energy, because the elderly generally spend more time at home, and in cooling energy, owing to rising temperatures. This study will be useful in the preparation of energy demand management policies and in the establishment and attainability of GHG emissions reduction goals.
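
    A minimal sketch of the additive LMDI decomposition named above, using a hypothetical two-sector example in which energy use E = activity x structure x intensity; the logarithmic-mean weights make the three effects sum exactly to the total change in E.

        import math

        def log_mean(a, b):
            """Logarithmic mean used by LMDI; handles the a == b limit."""
            return a if math.isclose(a, b) else (a - b) / (math.log(a) - math.log(b))

        def lmdi_effect(e0, eT, f0, fT):
            """Additive LMDI contribution of factor f to the change in
            energy use between base year 0 and target year T."""
            return sum(log_mean(eT[i], e0[i]) * math.log(fT[i] / f0[i])
                       for i in range(len(e0)))

        A0, AT = [100, 100], [120, 120]      # activity (hypothetical)
        S0, ST = [0.6, 0.4], [0.5, 0.5]      # structure shares
        I0, IT = [2.0, 1.0], [1.8, 1.1]      # energy intensity
        E0 = [A0[i] * S0[i] * I0[i] for i in range(2)]
        ET = [AT[i] * ST[i] * IT[i] for i in range(2)]
        for name, f0, fT in [("activity", A0, AT), ("structure", S0, ST),
                             ("intensity", I0, IT)]:
            print(name, round(lmdi_effect(E0, ET, f0, fT), 2))
        print("total change:", sum(ET) - sum(E0))   # effects sum to this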

  11. A BIM-based system for demolition and renovation waste estimation and planning.

    PubMed

    Cheng, Jack C P; Ma, Lauren Y H

    2013-06-01

    Due to rising worldwide environmental awareness, both government and contractors have to consider effective construction and demolition (C&D) waste management practices. The last two decades have witnessed the growing importance of demolition and renovation (D&R) works and the growing amount of D&R waste disposed to landfills every day, especially in developed cities like Hong Kong. Quantitative waste prediction is crucial for waste management. It can enable contractors to pinpoint critical waste generation processes and to plan waste control strategies. In addition, waste estimation could also facilitate some government waste management policies, such as the waste disposal charging scheme in Hong Kong. Currently, tools that can accurately and conveniently estimate the amount of waste from construction, renovation, and demolition projects are lacking. In light of this research gap, this paper presents a building information modeling (BIM) based system that we have developed for estimation and planning of D&R waste. BIM allows multi-disciplinary information to be superimposed within one digital building model. Our system can extract material and volume information from the BIM model and integrate the information for detailed waste estimation and planning. Waste recycling and reuse are also considered in our system. Extracted material information can be provided to recyclers before demolition or renovation to make the recycling stage more cooperative and efficient. Pick-up truck requirements and waste disposal charging fees for different waste facilities are also predicted by our system. The results could provide alerts to contractors ahead of time, at the project planning stage. This paper also presents an example scenario with a 47-floor residential building in Hong Kong to demonstrate our D&R waste estimation and planning system. As the BIM technology has been increasingly adopted in the architectural, engineering and construction industry and digital building information models will likely be available for most buildings (including historical buildings) in the future, our system can be used in various demolition and renovation projects and be extended to facilitate project control. Copyright © 2013 Elsevier Ltd. All rights reserved.
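
    A toy sketch of the volume-to-waste roll-up such a BIM-based system might perform; the materials, densities, truck capacity, and charging rate below are invented for illustration and are not the authors' values.

        from dataclasses import dataclass

        @dataclass
        class Element:
            material: str
            volume_m3: float            # extracted from the BIM model

        DENSITY_KG_M3 = {"concrete": 2400, "brick": 1900, "timber": 600}
        RECYCLABLE = {"concrete", "timber"}

        def estimate_waste(elements, truck_capacity_t=10.0, charge_per_t=71.0):
            """Aggregate waste mass, split recyclable vs. landfill, then
            derive truck trips and an illustrative disposal charge."""
            landfill_t = recycle_t = 0.0
            for e in elements:
                mass_t = e.volume_m3 * DENSITY_KG_M3[e.material] / 1000.0
                if e.material in RECYCLABLE:
                    recycle_t += mass_t
                else:
                    landfill_t += mass_t
            trips = -(-(landfill_t + recycle_t) // truck_capacity_t)  # ceiling
            return {"landfill_t": landfill_t, "recycle_t": recycle_t,
                    "truck_trips": int(trips),
                    "disposal_charge": landfill_t * charge_per_t}

        print(estimate_waste([Element("concrete", 12.0), Element("brick", 4.5)]))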

  12. Comparison of Irrigation Water Use Estimates Calculated from Remotely Sensed Irrigated Acres and State Reported Irrigated Acres in the Lake Altus Drainage Basin, Oklahoma and Texas, 2000 Growing Season

    USGS Publications Warehouse

    Masoner, J.R.; Mladinich, C.S.; Konduris, A.M.; Smith, S. Jerrod

    2003-01-01

    Increased demand for water in the Lake Altus drainage basin requires more accurate estimates of water use for irrigation. The U.S. Geological Survey, in cooperation with the U.S. Bureau of Reclamation, is investigating new techniques to improve water-use estimates for irrigation purposes in the Lake Altus drainage basin. Empirical estimates of reference evapotranspiration, crop evapotranspiration, and crop irrigation water requirements for nine major crops were calculated from September 1999 to October 2000 using a solar radiation-based evapotranspiration model. Estimates of irrigation water use were calculated using remotely sensed irrigated crop acres derived from Landsat 7 Enhanced Thematic Mapper Plus imagery and were compared with irrigation water-use estimates calculated from irrigated crop acres reported by the Oklahoma Water Resources Board and the Texas Water Development Board for the 2000 growing season. The techniques presented will help manage water resources in the Lake Altus drainage basin and may be transferable to other areas with similar water management needs. Irrigation water use calculated from the remotely sensed irrigated acres was estimated at 154,920 acre-feet, whereas irrigation water use calculated from state-reported irrigated crop acres was 196,026 acre-feet, a 23 percent difference. The greatest difference in irrigation water use was in Carson County, Texas. Irrigation water use for Carson County, Texas, calculated from the remotely sensed irrigated acres was 58,555 acre-feet, whereas irrigation water use calculated from state-reported irrigated acres was 138,180 acre-feet, an 81 percent difference. The second greatest difference in irrigation water use occurred in Beckham County, Oklahoma. Differences between the two irrigation water-use estimates are due to the differences between irrigated crop acres derived from the mapping process and those reported by the Oklahoma Water Resources Board and Texas Water Development Board.

  13. Estimates of Soil Moisture Using the Land Information System for Land Surface Water Storage: Case Study for the Western States Water Mission

    NASA Astrophysics Data System (ADS)

    Liu, P. W.; Famiglietti, J. S.; Levoe, S.; Reager, J. T., II; David, C. H.; Kumar, S.; Li, B.; Peters-Lidard, C. D.

    2017-12-01

    Soil moisture is one of the critical factors in terrestrial hydrology. Accurate soil moisture information improves the estimation of terrestrial water storage and fluxes, which is essential for water resource management, including sustainable groundwater pumping and agricultural irrigation practices. It is particularly important during dry periods when water stress is high. The Western States Water Mission (WSWM), a multiyear mission project of NASA's Jet Propulsion Laboratory, is operated to understand and estimate water availability in the western United States by integrating observations and measurements from in-situ and remote sensing sensors with hydrological models. WSWM data products have been used to assess and explore the adverse impacts of the California drought (2011-2016) and to provide decision-makers with information for water use planning. Although observations are often more accurate, simulations using land surface models can provide water availability estimates at desired spatio-temporal scales. The Land Information System (LIS), developed by NASA's Goddard Space Flight Center, integrates land surface models with data processing and management tools, enabling measurements and observations from various platforms to be used as forcings in a high-performance computing environment to forecast hydrologic conditions. The goal of this study is to implement the LIS in the western United States for estimates of soil moisture. We will implement the NOAH-MP model on the 12-km North America Land Data Assimilation System grid and compare it to other land surface models included in the LIS. Findings will provide insight into the differences between model estimates and model physics. Outputs from a multi-model LIS ensemble can also be used to enhance the reliability of estimates and provide quantification of uncertainty. We will compare the LIS-based soil moisture estimates to the SMAP enhanced 9-km soil moisture product to understand the mechanistic differences between model and observation. These outcomes will contribute to the WSWM by providing robust products.

  14. Product line cost estimation: a standard cost approach.

    PubMed

    Cooper, J C; Suver, J D

    1988-04-01

    Product line managers often must make decisions based on inaccurate cost information. A method is needed to determine costs more accurately. By using a standard costing model, product line managers can better estimate the cost of intermediate and end products, and hence better estimate the costs of the product line.
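
    A minimal sketch of the standard-cost roll-up this approach implies, with invented service names, unit costs, and volumes; a real model would take its standards from the institution's cost-accounting system.

        standard_unit_cost = {"lab_panel": 18.50, "nursing_hour": 42.00,
                              "imaging": 95.00}     # hypothetical standards

        def product_cost(bill_of_services):
            """Standard cost of one end product from standard quantities."""
            return sum(qty * standard_unit_cost[svc]
                       for svc, qty in bill_of_services.items())

        def product_line_cost(cases):
            """Product-line cost: case volume times standard cost per case."""
            return sum(volume * product_cost(bos) for bos, volume in cases)

        drg_hip = {"lab_panel": 3, "nursing_hour": 40, "imaging": 2}
        print(product_cost(drg_hip))                 # cost per case
        print(product_line_cost([(drg_hip, 120)]))   # line cost, 120 cases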

  15. 48 CFR 252.215-7002 - Cost estimating system requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Department of Defense to rely upon information produced by the system that is needed for management purposes... management systems; and (4) Is subject to applicable financial control systems. Estimating system means the... estimates of costs and other data included in proposals submitted to customers in the expectation of...

  16. Improving Life-Cycle Cost Management of Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Clardy, Dennon

    2010-01-01

    This presentation explores the results of a recent NASA life-cycle cost study and how project managers can use its findings and recommendations to improve planning and coordination early in the formulation cycle and avoid the common pitfalls that result in cost overruns. The typical NASA space science mission will exceed both the initially estimated and the confirmed life-cycle costs by the end of the mission. In a fixed-budget environment, these overruns translate to delays in starting or launching future missions, or in the worst case can lead to cancelled missions. Some of these overruns are due to issues outside the control of the project; others are due to unpredictable problems (unknown unknowns) that can affect any development project. However, a recent study of life-cycle cost growth by the Discovery and New Frontiers Program Office identified a number of areas that are within the scope of project management to address. The study also found that the majority of the underlying causes of cost overruns are embedded in the project approach during the formulation and early design phases, but their actual impacts typically are not experienced until late in the project life cycle. Thus, project management focus in key areas such as integrated schedule development, management structure and contractor communication processes, heritage and technology assumptions, and operations planning can validate initial cost assumptions and put in place management processes to avoid the common pitfalls that result in cost overruns.

  17. Estimating migratory game-bird productivity by integrating age ratio and banding data

    USGS Publications Warehouse

    Zimmerman, G.S.; Link, W.A.; Conroy, M.J.; Sauer, J.R.; Richkus, K.D.; Boomer, G. Scott

    2010-01-01

    Implications: Several national and international management strategies for migratory game birds in North America rely on measures of productivity from harvest survey parts collections, without a justification of the estimator or providing estimates of precision. We derive an estimator of productivity with realistic measures of uncertainty that can be directly incorporated into management plans or ecological studies across large spatial scales.

  18. Middleware Design for Swarm-Driving Robots Accompanying Humans.

    PubMed

    Kim, Min Su; Kim, Sang Hyuck; Kang, Soon Ju

    2017-02-17

    Robots that accompany humans are the subject of continuing research. The Pet-Bot provides walking-assistance and object-carrying services, without any explicit controls, through real-time interaction between the robot and the human. With the Pet-Bot, however, there is a limit to the number of robots a user can employ; if this limit were overcome, the Pet-Bot could provide services in more areas. Therefore, in this study we propose a swarm-driving middleware design that adopts the concept of a swarm, providing effective parallel movement so that multiple human-accompanying robots can accomplish a common purpose. The middleware's functions divide into three parts: a sequence manager for the swarm process, a messaging manager, and a relative-location identification manager. The middleware sequences the swarm process of the robots through message exchange over radio-frequency (RF) communication using the IEEE 802.15.4 MAC protocol, and manages an infrared (IR) communication module that identifies relative location from IR signal strength. The swarm in this study is composed of a master, which interacts with the user, and slaves, which have no interaction with the user; this composition is intended to keep the overall swarm synchronized with user activity, which is difficult to predict. We evaluate the accuracy of the relative-location estimation using IR communication, the response time of the slaves to a change in user activity, and the time to organize a network as a function of the number of slaves.

  20. Using population models to evaluate management alternatives for Gulf Striped Bass

    USGS Publications Warehouse

    Aspinwall, Alexander P.; Irwin, Elise R.; Lloyd, M. Clint

    2017-01-01

    Interstate management of Gulf Striped Bass Morone saxatilis has involved a thirty-year cooperative effort among Federal and State agencies in Georgia, Florida, and Alabama (the Apalachicola-Chattahoochee-Flint Gulf Striped Bass Technical Committee). The Committee has recently focused on developing an adaptive framework for conserving and restoring Gulf Striped Bass in the Apalachicola, Chattahoochee, and Flint River (ACF) system. To evaluate the consequences of and tradeoffs among management activities, population models were used to inform management decisions. Stochastic matrix models were constructed with varying recruitment and stocking rates to simulate the effects of management alternatives on Gulf Striped Bass population objectives. An age-classified matrix model incorporating fecundity and survival estimates was used to project population growth rate. In addition, combinations of management alternatives (stocking rates, Hydrilla control, harvest regulations) were evaluated with respect to how they influenced Gulf Striped Bass population growth. Annual survival and mortality rates were estimated from catch-curve analysis, while fecundity was estimated and predicted using linear least-squares regression of fish length versus egg number from hatchery brood-fish data. Stocking rates and stocked-fish survival rates were estimated from census data. Results indicated that management alternatives could be an effective approach to increasing the Gulf Striped Bass population. Population abundance was greatest under maximum stocking effort, maximum Hydrilla control, and a moratorium. Conversely, population abundance was lowest under no stocking, no Hydrilla control, and the current harvest regulation. Stocking proved to be an effective management strategy; however, low survival estimates of stocked fish (1%) limited the potential for population growth. Hydrilla control increased the survival rate of stocked fish and provided higher estimates of population abundance than maximizing the stocking rate. A change in the current harvest regulation (50% harvest) was not an effective alternative for increasing the Gulf Striped Bass population size. Applying a moratorium to the Gulf Striped Bass fishery increased survival rates from 50% to 74% and resulted in the largest population growth of the individual management alternatives. These results could be used by the Committee to inform management decisions for other populations of Striped Bass in the Gulf Region.
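
    A bare-bones sketch of an age-classified matrix projection with stocking, in the spirit of the models described above; apart from the 1% stocked-fish survival and the 74% moratorium-level adult survival quoted in the abstract, every rate and abundance below is hypothetical.

        import numpy as np

        # Top row: per-capita fecundities discounted by first-year survival;
        # sub-diagonal: annual survival; last entry: adult survival under a
        # moratorium (74%).
        A = np.array([[0.0, 0.0, 3.0, 8.0],
                      [0.5, 0.0, 0.0, 0.0],
                      [0.0, 0.6, 0.0, 0.0],
                      [0.0, 0.0, 0.7, 0.74]])

        stocked = np.array([200_000.0, 0.0, 0.0, 0.0])   # fingerlings per year
        n = np.array([5e4, 1e4, 4e3, 1.5e3])             # initial abundance

        for _ in range(20):                              # 20-year projection
            n = A @ n + 0.01 * stocked                   # 1% stocking survival

        lam = np.max(np.abs(np.linalg.eigvals(A)))       # asymptotic growth rate
        print(f"lambda = {lam:.3f}, abundance after 20 yr = {n.sum():,.0f}")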

  1. Sample Size for Tablet Compression and Capsule Filling Events During Process Validation.

    PubMed

    Charoo, Naseem Ahmad; Durivage, Mark; Rahman, Ziyaur; Ayad, Mohamad Haitham

    2017-12-01

    During solid dosage form manufacturing, the uniformity of dosage units (UDU) is ensured by testing samples at 2 stages, that is, the blend stage and the tablet compression or capsule/powder filling stage. The aim of this work is to propose a sample size selection approach based on quality risk management principles for the process performance qualification (PPQ) and continued process verification (CPV) stages by linking UDU to potential formulation and process risk factors. The Bayes success run theorem appeared to be the most appropriate approach among the various methods considered in this work for computing sample size for PPQ. The sample sizes for high-risk (reliability level of 99%), medium-risk (reliability level of 95%), and low-risk factors (reliability level of 90%) were estimated to be 299, 59, and 29, respectively. Risk-based assignment of reliability levels was supported by the fact that at a low defect rate, the confidence in detecting out-of-specification units decreases and must be supplemented with an increase in sample size to enhance confidence in the estimation. Based on the level of knowledge acquired during PPQ and the level of knowledge further required to comprehend the process, the sample size for CPV was calculated using Bayesian statistics to accomplish a reduced sampling design for CPV. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
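
    As a check on the quoted figures, the classical success-run relation n = ln(1 - C) / ln(R) reproduces all three sample sizes when the confidence level C is taken to be 95%; the paper's exact Bayesian formulation may differ slightly.

        import math

        def success_run_n(confidence, reliability):
            """Smallest n of consecutive conforming units such that
            1 - reliability**n >= confidence."""
            return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

        for risk, rel in [("high", 0.99), ("medium", 0.95), ("low", 0.90)]:
            print(risk, success_run_n(0.95, rel))
        # -> high 299, medium 59, low 29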

  2. Waste management in the meat processing industry: Conversion of paunch and DAF sludge into solid fuel.

    PubMed

    Hamawand, Ihsan; Pittaway, Pam; Lewis, Larry; Chakrabarty, Sayan; Caldwell, Justin; Eberhard, Jochen; Chakraborty, Arpita

    2017-02-01

    This article addresses a novel dewatering process, immersion-frying of paunch and dissolved air flotation (DAF) sludge, to produce high-energy pellets. The literature was analysed to assess the feasibility of replacing conventional boiler fuel at meat processing facilities with high-energy paunch-DAF sludge pellets (capsules). The value proposition of pelleting and frying this mixture into energy pellets is based on a Cost-Benefit Analysis (CBA). The CBA draws on information from the literature and consultation with the Australian Meat Processing Industry. The calorific properties of a mixture of paunch cake solids and DAF sludge were predicted from the literature and industry consultation to validate the product. This study shows that the concept of pelletizing and frying paunch is economically feasible. Complete frying and dewatering of the paunch and DAF sludge mixture produces pellets with an energy content per kilogram equivalent to coal. The estimated cost of this new product is half the price of coal, and the payback period is estimated to be between 1.8 and 3.2 years. Further research is required for proof of concept, and to identify the technical challenges associated with integrating this technology into existing meat processing plants. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  3. The Influence of Mark-Recapture Sampling Effort on Estimates of Rock Lobster Survival

    PubMed Central

    Kordjazi, Ziya; Frusher, Stewart; Buxton, Colin; Gardner, Caleb; Bird, Tomas

    2016-01-01

    Five annual capture-mark-recapture surveys on Jasus edwardsii were used to evaluate the effect of sample size and fishing effort on the precision of estimated survival probability. Datasets of different numbers of individual lobsters (ranging from 200 to 1,000) were created by random subsampling from each annual survey. The same random subsampling was used to create 12 datasets with different levels of effort, based on three levels of trap number (15, 30, and 50 traps per day) and four levels of sampling days (2, 4, 6, and 7 days). The most parsimonious Cormack-Jolly-Seber (CJS) model for estimating survival probability shifted from a constant model toward sex-dependent models with increasing sample size and effort. A sample of 500 lobsters, or 50 traps used on four consecutive sampling days, was required to obtain precise survival estimates for males and females separately. A reduced sampling effort of 30 traps over four sampling days was sufficient if a survival estimate for both sexes combined sufficed for management of the fishery. PMID:26990561

  4. Estimation and prediction under local volatility jump-diffusion model

    NASA Astrophysics Data System (ADS)

    Kim, Namhyoung; Lee, Younhee

    2018-02-01

    Volatility is an important factor in operating a company and managing risk. In portfolio optimization and option-based risk hedging, option values are evaluated using a volatility model, and various attempts have been made to predict option value. Recent studies have shown that stochastic volatility models and jump-diffusion models reflect stock price movements accurately; however, these models have practical limitations. Combining them with the local volatility model, which is widely used among practitioners, may lead to better performance. In this study, we propose a more effective and efficient method of estimating option prices by combining the local volatility model with the jump-diffusion model, and we apply it to both artificial and actual market data to evaluate its performance. The calibration process for estimating the jump parameters and local volatility surfaces is divided into three stages. We apply the local volatility model, the stochastic volatility model, and the local volatility jump-diffusion model estimated by the proposed method to KOSPI 200 index option pricing. The proposed method displays good estimation and prediction performance.
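
    For orientation only, a Monte Carlo valuation of a European call under a constant-volatility Merton-style jump-diffusion; the paper calibrates a local volatility surface rather than the constant sigma assumed here, and all parameter values are illustrative.

        import numpy as np

        def mc_call_jump_diffusion(S0, K, T, r, sigma, lam, mu_j, sig_j,
                                   n_paths=100_000, seed=0):
            """European call under dS/S = (r - lam*kappa) dt + sigma dW plus
            lognormal jumps arriving at Poisson rate lam."""
            rng = np.random.default_rng(seed)
            kappa = np.exp(mu_j + 0.5 * sig_j**2) - 1.0      # mean jump size
            N = rng.poisson(lam * T, n_paths)                # jumps per path
            Z = rng.standard_normal(n_paths)
            J = mu_j * N + sig_j * np.sqrt(N) * rng.standard_normal(n_paths)
            logS = (np.log(S0) + (r - lam * kappa - 0.5 * sigma**2) * T
                    + sigma * np.sqrt(T) * Z + J)
            payoff = np.maximum(np.exp(logS) - K, 0.0)
            return np.exp(-r * T) * payoff.mean()

        print(mc_call_jump_diffusion(S0=100, K=100, T=1.0, r=0.02,
                                     sigma=0.2, lam=0.5, mu_j=-0.1, sig_j=0.15))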

  5. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    PubMed

    Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago

    2016-01-01

    Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models), and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through growth, natural mortality, fishing mortality, survey catchability, and the stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process. Simple model averaging is used to integrate across the results and produce a single assessment that considers the multiple sources of uncertainty.
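
    A minimal sketch of the final integration step, using the simple model-averaging idea named above: the combined variance adds between-model spread to within-model variance. The estimates, standard errors, and equal weights are hypothetical.

        import numpy as np

        def model_average(estimates, ses, weights=None):
            """Combine per-model estimates (e.g., spawning stock biomass);
            variance follows the usual model-averaging form."""
            est = np.asarray(estimates, float)
            se = np.asarray(ses, float)
            w = (np.full(est.size, 1.0 / est.size) if weights is None
                 else np.asarray(weights, float) / np.sum(weights))
            mean = np.sum(w * est)
            var = np.sum(w * (se**2 + (est - mean)**2))
            return mean, np.sqrt(var)

        # Three candidate assessment models (hypothetical SSB, kt):
        print(model_average([42.0, 55.0, 48.0], [4.0, 6.0, 5.0]))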

  6. Introduction of the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Costing Tool: a user-friendly spreadsheet program to estimate costs of providing patient-centered interventions.

    PubMed

    Reed, Shelby D; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L; Bowers, Margaret T; Samsa, Gregory P; Paul, Sara; Schulman, Kevin A; Whellan, David J; Riegel, Barbara J

    2012-01-01

    Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers and health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions.
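
    A toy version of the roll-up such a costing tool performs, mirroring the cost categories listed above; every unit cost and quantity below is invented for illustration.

        cost_items = [
            # (category, unit cost in $, quantity)
            ("personnel: nurse hours",  45.0, 320),
            ("facilities: room-weeks", 150.0, 12),
            ("equipment: monitors",    600.0, 4),
            ("supplies per patient",    25.0, 60),
            ("patient incentives",      20.0, 60),
            ("start-up activities",   2500.0, 1),
        ]

        n_patients, n_weeks = 60, 12
        total = sum(unit * qty for _, unit, qty in cost_items)
        print(f"total ${total:,.0f}; per patient ${total / n_patients:,.0f}; "
              f"per week ${total / n_weeks:,.0f}")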

  7. Daily time management in children with spina bifida.

    PubMed

    Persson, Marika; Janeslätt, Gunnel; Peny-Dahlstrand, Marie

    2017-12-11

    Spina bifida (SB) often results in a complex disability and can also cause cognitive dysfunction. No previous study has investigated the ability to adapt to time in children with SB, an ability crucial to an individual's chances of developing autonomy in life. The purpose of this study was to investigate whether children aged 10-17 with SB have lower time-processing abilities than typically developing children, and to describe the time-processing profile of children with SB. Participants comprised a consecutive sample of 21 children (drawn from a geographical cohort of 45) aged 10-17 years (mean: 14 years, SD: 2 years); 13 were boys. The instruments used were KaTid-Y, Time-S, and Time-P. The children with SB had lower time-processing abilities than typically developing children (52.4% under -2 SD), with particular difficulties in orienting to and estimating objective time, understanding time perspectives, and planning time. They also self-rated low use of strategies to adapt to time. The parents rated their children as having extensive difficulties in daily time management. The low time-processing ability found in children with SB is likely to be an important contributing factor to low autonomy and independence.

  8. The complexity of earth observation valuation: Modeling the patterns and processes of agricultural production and groundwater quality to construct a production possibilities frontier

    NASA Astrophysics Data System (ADS)

    Forney, W.; Raunikar, R. P.; Bernknopf, R.; Mishra, S.

    2012-12-01

    A production possibilities frontier (PPF) is a graph comparing the production interdependencies for two commodities; in this case, the commodities are defined as the ecosystem services of agricultural production and groundwater quality. This presentation focuses on the refinement of techniques used in an application to estimate the value of remote sensing information. Value-of-information analysis focuses on the use of uncertain and varying qualities of information within a specific decision-making context for a certain application, which in this case included land use, biogeochemical, hydrogeologic, economic, and geospatial data and models. The refined techniques include deriving alternate patterns and processes of ecosystem functions, new estimates of ecosystem service values to construct a PPF, and the extension of this work into decision support systems. We have coupled earth observations of agricultural production with groundwater quality measurements to estimate the value of remote sensing information in northeastern Iowa at $857M ± $198M (at the 2010 price level) per year. We will present an improved method for modeling crop rotation patterns that includes multiple years of rotation, a reduction in the assumptions associated with optimal land use allocations, and prioritized improvement of the resolution of input data (for example, soil resources and topography). The prioritization focuses on watersheds that were identified, at a coarse scale of analysis, as having higher intensities of agricultural production and lower probabilities of groundwater survivability (in other words, of remaining below a regulatory threshold for nitrate pollution) over time, and that thus require finer-scaled modeling and analysis. These improved techniques, and the simulation of certain scale-dependent policy and management actions that trade off the objective of optimizing crop value against that of maintaining potable groundwater, provide new estimates for the empirical values of the PPF. Calculating a PPF in this way provides a decision maker with a tool to consider the ramifications of different policies, management practices, and regional objectives.

  9. Production of Copper as a Complex Mining and Metallurgical Processing System in Polish Copper Mines of the Legnica-Glogów Copper Belt

    NASA Astrophysics Data System (ADS)

    Malewski, Jerzy

    2017-12-01

    Geological and technological conditions of Cu production in the Polish copper mines of the Legnica-Glogów Copper Belt are presented. Cu production is treated as a technological fractal consisting of subsystems for mineral exploration, ore extraction and processing, and metallurgical treatment. Qualitative and quantitative models of these operations are proposed, including estimation of their production costs. Numerical calculations of such a system have been performed, which allow the system parameters to be optimized according to economic criteria under variable Cu mineralization in the ore deposit. The main objective of the study is to develop a forecasting tool for analysis of production efficiency in domestic copper mines based on available sources of information. Such analyses are primarily of social value, allowing for assessment of the efficiency of management of local mineral resources in the light of current technological and market constraints. At the same time, this presents a concept for a systems-analysis method to manage deposit exploitation at the operational and strategic levels.

  10. Ecohydrologic process modeling of mountain block groundwater recharge.

    PubMed

    Magruder, Ian A; Woessner, William W; Running, Steve W

    2009-01-01

    Regional mountain block recharge (MBR) is a key component of alluvial basin aquifer systems typical of the western United States. Yet neither water scientists nor resource managers have a commonly available and reasonably invoked quantitative method to constrain MBR rates. Recent advances in landscape-scale ecohydrologic process modeling offer the possibility that meteorological data and land surface physical and vegetative conditions can be used to generate estimates of MBR. A water balance was generated for a temperate 24,600-ha mountain watershed, elevation 1565 to 3207 m, using the ecosystem process model Biome-BGC (BioGeochemical Cycles) (Running and Hunt 1993). Input data included remotely sensed landscape information and climate data generated with the Mountain Climate Simulator (MT-CLIM) (Running et al. 1987). Estimated mean annual MBR flux into the crystalline bedrock terrain is 99,000 m³/d, or approximately 19% of annual precipitation for the 2003 water year. Controls on MBR predictions include evapotranspiration (radiation limited in wet years and moisture limited in dry years), soil properties, vegetative ecotones (significant at lower elevations), and snowmelt (dominant recharge process). The ecohydrologic model is also used to investigate how climatic and vegetative controls influence recharge dynamics within three elevation zones. The ecohydrologic model proves useful for investigating controls on recharge to mountain blocks as a function of climate and vegetation. Future efforts will need to investigate the uncertainty in the modeled water balance by incorporating an advanced understanding of mountain recharge processes, an ability to simulate those processes at varying scales, and independent approaches to calibrating MBR estimates. Copyright © 2009 The Author(s). Journal compilation © 2009 National Ground Water Association.

  11. A calibrated, high-resolution goes satellite solar insolation product for a climatology of Florida evapotranspiration

    USGS Publications Warehouse

    Paech, S.J.; Mecikalski, J.R.; Sumner, D.M.; Pathak, C.S.; Wu, Q.; Islam, S.; Sangoyomi, T.

    2009-01-01

    Estimates of incoming solar radiation (insolation) from Geostationary Operational Environmental Satellite observations have been produced for the state of Florida over a 10-year period (1995-2004). These insolation estimates were developed into well-calibrated half-hourly and daily integrated solar insolation fields over the state at 2-km resolution, in addition to a 2-week running minimum surface albedo product. Model results for the daily integrated insolation were compared with ground-based pyranometers, and on that basis the entire dataset was calibrated. This calibration was accomplished through a three-step process: (1) comparing model output with ground-based pyranometer measurements on clear (noncloudy) reference days, (2) correcting for a bias related to cloudiness, and (3) deriving a monthly bias correction factor. Precalibration results indicated good model performance, with a station-averaged model error of 2.2 MJ m⁻²/day (13%). Calibration reduced errors to 1.7 MJ m⁻²/day (10%) and also removed temporal, seasonal, and satellite-sensor-related biases. The calibrated insolation dataset will subsequently be used by the state of Florida Water Management Districts to produce statewide, 2-km resolution maps of estimated daily reference and potential evapotranspiration for water management-related activities. © 2009 American Water Resources Association.

  12. Stochastic investigation of precipitation process for climatic variability identification

    NASA Astrophysics Data System (ADS)

    Sotiriadou, Alexia; Petsiou, Amalia; Feloni, Elisavet; Kastis, Paris; Iliopoulou, Theano; Markonis, Yannis; Tyralis, Hristos; Dimitriadis, Panayiotis; Koutsoyiannis, Demetris

    2016-04-01

    The precipitation process is important not only to hydrometeorology but also to renewable energy resources management. We use a dataset consisting of daily and hourly records from around the globe to identify statistical variability, with emphasis on the most recent period. Specifically, we investigate the occurrence of mean, maximum, and minimum values, and we estimate statistical properties such as the marginal probability distribution function and the type of decay of the climacogram (i.e., mean process variance vs. scale). Acknowledgement: This research is conducted within the frame of the undergraduate course "Stochastic Methods in Water Resources" of the National Technical University of Athens (NTUA). The School of Civil Engineering of NTUA provided moral support for the participation of the students in the Assembly.

  13. Symposium on Machine Processing of Remotely Sensed Data, Purdue University, West Lafayette, Ind., June 29-July 1, 1976, Proceedings

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Papers are presented on the applicability of Landsat data to water management and control needs, IBIS, a geographic information system based on digital image processing and image raster datatype, and the Image Data Access Method (IDAM) for the Earth Resources Interactive Processing System. Attention is also given to the Prototype Classification and Mensuration System (PROCAMS) applied to agricultural data, the use of Landsat for water quality monitoring in North Carolina, and the analysis of geophysical remote sensing data using multivariate pattern recognition. The Illinois crop-acreage estimation experiment, the Pacific Northwest Resources Inventory Demonstration, and the effects of spatial misregistration on multispectral recognition are also considered. Individual items are announced in this issue.

  14. Decision aids for multiple-decision disease management as affected by weather input errors.

    PubMed

    Pfender, W F; Gent, D H; Mahaffee, W F; Coop, L B; Fox, A D

    2011-06-01

    Many disease management decision support systems (DSSs) rely, exclusively or in part, on weather inputs to calculate an indicator for disease hazard. Error in the weather inputs, typically due to forecasting, interpolation, or estimation from off-site sources, may affect model calculations and management decision recommendations. The extent to which errors in weather inputs affect the quality of the final management outcome depends on a number of aspects of the disease management context, including whether management consists of a single dichotomous decision, or of a multi-decision process extending over the cropping season(s). Decision aids for multi-decision disease management typically are based on simple or complex algorithms of weather data which may be accumulated over several days or weeks. It is difficult to quantify accuracy of multi-decision DSSs due to temporally overlapping disease events, existence of more than one solution to optimizing the outcome, opportunities to take later recourse to modify earlier decisions, and the ongoing, complex decision process in which the DSS is only one component. One approach to assessing importance of weather input errors is to conduct an error analysis in which the DSS outcome from high-quality weather data is compared with that from weather data with various levels of bias and/or variance from the original data. We illustrate this analytical approach for two types of DSS, an infection risk index for hop powdery mildew and a simulation model for grass stem rust. Further exploration of analysis methods is needed to address problems associated with assessing uncertainty in multi-decision DSSs.
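
    A skeletal version of the error analysis described above: perturb the weather inputs with bias and/or extra variance, recompute a risk indicator, and count how many daily management decisions flip. The risk index here is entirely hypothetical, a stand-in for a real DSS algorithm.

        import numpy as np

        rng = np.random.default_rng(1)

        def risk_index(temp, wet_hours):
            """Hypothetical infection-risk index: warm, wet days score high."""
            return (np.clip((temp - 10) / 15, 0, 1)
                    * np.clip(wet_hours / 12, 0, 1))

        temp = 18 + 4 * rng.standard_normal(60)          # "true" on-site weather
        wet = np.clip(8 + 3 * rng.standard_normal(60), 0, 24)
        spray_true = risk_index(temp, wet) > 0.5         # reference decisions

        for bias, sd in [(0.0, 1.0), (2.0, 0.0), (2.0, 2.0)]:
            temp_err = temp + bias + sd * rng.standard_normal(60)
            flipped = np.mean(spray_true != (risk_index(temp_err, wet) > 0.5))
            print(f"bias={bias}, sd={sd}: {flipped:.0%} of decisions flip")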

  15. An Approach to Modeling the Water Balance Sensitivity to Landscape Vegetation Changes

    NASA Astrophysics Data System (ADS)

    Mohammed, I. N.; Tarboton, D. G.

    2008-12-01

    Watershed development and management require an understanding of how hydrological processes affect water balance components. The study of water resources management, especially in the Western United States, is currently motivated by climate change, the impact of vegetation cover change on water production, and the need to manage water supplies. Vegetation management and its relation to runoff are well documented: reducing forest cover lowers evapotranspiration and increases water yield, whereas establishing forest cover on sparsely vegetated land raises evapotranspiration and decreases water yield. This paper presents a water balance model developed to quantify the sensitivity of runoff production to changes in vegetation, based on differences in evapotranspiration among land cover types. The model is intended to provide a simple framework for estimating long-term yield changes due to managed vegetation change. It assumes that relative potential evapotranspiration from a specific land cover can be quantified by a set of potential evapotranspiration coefficients, one per land cover type. The model uses the Budyko curve to partition precipitation into evapotranspiration and runoff over the long term. Potential evapotranspiration is estimated from the Budyko curve for present conditions, then adjusted for land cover changes using the relative potential evapotranspiration coefficients for each land cover type. The adjusted potential evapotranspiration is then partitioned using the Budyko curve to provide estimates of long-term runoff and evapotranspiration for the changed conditions. We found that the changes in runoff were in general close to linearly proportional to the changes in land cover. In Utah study watersheds, removing 50% of the present coniferous forest resulted in runoff increases that ranged from 0.5 to 38 mm/year, while converting 50% of the area presently in range/shrub/other to forest resulted in runoff decreases that ranged from 3.8 to 37 mm/year. The model helps evaluate long-term runoff production sensitivities to vegetation changes and answers, in a broad sense and without requiring detailed information or modeling, how much runoff production could potentially be changed through vegetation management. The theoretical approach taken in this study is simple and general and could be applied to a wide range of watersheds.
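
    A compact sketch of the partitioning logic, assuming the classical Budyko (1974) curve and illustrative relative-PET coefficients; the paper's coefficient values and curve form may differ.

        import math

        def budyko_et_ratio(phi):
            """Budyko curve: E/P as a function of aridity phi = PET/P."""
            return math.sqrt(phi * math.tanh(1.0 / phi) * (1.0 - math.exp(-phi)))

        def runoff_after_cover_change(P, PET, frac, k_old, k_new):
            """Long-term runoff before/after converting a fraction of the
            watershed between covers with relative PET coefficients k."""
            Q0 = P * (1.0 - budyko_et_ratio(PET / P))
            PET_adj = PET * (1.0 + frac * (k_new - k_old) / k_old)
            Q1 = P * (1.0 - budyko_et_ratio(PET_adj / P))
            return Q0, Q1

        # Hypothetical: P = 800 mm/yr, PET = 700 mm/yr; convert 50% of the
        # area from conifer forest (k = 1.0) to shrub (k = 0.8):
        print(runoff_after_cover_change(800.0, 700.0, 0.5, 1.0, 0.8))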

  16. Efficiently estimating salmon escapement uncertainty using systematically sampled data

    USGS Publications Warehouse

    Reynolds, Joel H.; Woody, Carol Ann; Gove, Nancy E.; Fair, Lowell F.

    2007-01-01

    Fish escapement is generally monitored using nonreplicated systematic sampling designs (e.g., via visual counts from towers or hydroacoustic counts). These sampling designs support a variety of methods for estimating the variance of the total escapement. Unfortunately, all the methods give biased results, with the magnitude of the bias determined by the underlying process patterns. Fish escapement commonly exhibits positive autocorrelation and nonlinear patterns, such as diurnal and seasonal patterns. For these patterns, a poor choice of variance estimator can needlessly increase the uncertainty managers have to deal with in sustaining fish populations. We illustrate the effect of sampling design and variance estimator choice on variance estimates of total escapement for anadromous salmonids from systematic samples of fish passage. Using simulated tower counts of sockeye salmon Oncorhynchus nerka escapement on the Kvichak River, Alaska, five variance estimators for nonreplicated systematic samples were compared to determine the least biased. Using the least biased variance estimator, four confidence interval estimators were compared for expected coverage and mean interval width. Finally, five systematic sampling designs were compared to determine the design giving the smallest average variance estimate for total annual escapement. For nonreplicated systematic samples of fish escapement, all variance estimators were positively biased. Compared to the other estimators, the least biased estimator reduced bias by 12% to 98% on average. All confidence intervals gave effectively identical results. Replicated systematic sampling designs consistently provided the smallest average estimated variance among those compared.
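
    Two generic variance-estimator forms such a comparison might include (not necessarily the five the authors tested): the naive simple-random-sampling form and a successive-difference form that is usually less biased for smooth, autocorrelated passage counts. The counts are hypothetical.

        import numpy as np

        def var_total_srs(y, N):
            """SRS-based variance of the estimated total from a 1-in-k
            systematic sample y (biased under autocorrelation)."""
            n = len(y)
            return N**2 * (1 - n / N) * np.var(y, ddof=1) / n

        def var_total_sucdiff(y, N):
            """Successive-difference variance estimator for the same total."""
            y = np.asarray(y, float)
            n = len(y)
            s2 = np.sum(np.diff(y)**2) / (2 * (n - 1))
            return N**2 * (1 - n / N) * s2 / n

        # Illustration: N = 24 hourly counts, sample every 4th hour.
        counts = [120, 180, 260, 310, 250, 160]
        for f in (var_total_srs, var_total_sucdiff):
            print(f.__name__, round(f(counts, 24)))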

  17. Implementing sustainable drainage systems for urban surface water management within the regulatory framework in England and Wales.

    PubMed

    Ellis, J Bryan; Lundy, Lian

    2016-12-01

    The UK 2007 floods resulted in damages estimated to exceed £4 billion. This triggered a national review of strategic flood risk management (Pitt, 2008), whose recommendations informed and were implemented by the Flood and Water Management Act (FWMA, 2010). Estimating that up to two-thirds of the properties flooded in the 2007 event were affected as a direct result of overloaded sewer systems, the FWMA set out an ambitious overhaul of flood risk management approaches, including identifying bodies responsible for the management of local flood risk (local municipalities) and the development of over-arching Lead Local Flood Authorities (LLFAs) at a regional level. LLFA duties include developing local flood risk management strategies and, aligned with this, many LLFAs and local municipalities have produced sustainable drainage system (SUDS) guidance notes. In parallel, changes to the national planning policy framework (NPPF) in England give priority to the use of SUDS in new major developments, as does the related Town and Country Planning Order (2015). However, whilst all three pieces of legislation refer to the preferential use of SUDS, these requirements remain "economically proportionate", and thus the inclusion of SUDS within development controls remains a desirable - but not mandatory - obligation. Within this dynamic policy context, reignited most recently by the December 2015 floods, this paper examines some of the challenges to the implementation of SUDS in England and Wales posed by the new regulatory frameworks. In particular, it examines how emerging organisational procedures and processes are likely to impact on future SUDS implementation, and highlights the need for further cross-sectoral working to ensure that opportunities for cross-sectoral benefits - such as those accrued by reducing stormwater flows within combined sewer systems for water companies, property developers, and environmental protection - are not lost. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Comparative risk analysis of technological hazards (a review).

    PubMed Central

    Kates, R W; Kasperson, J X

    1983-01-01

    Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625

  19. Uncertainty quantification in downscaling procedures for effective decisions in energy systems

    NASA Astrophysics Data System (ADS)

    Constantinescu, E. M.

    2010-12-01

    Weather is a major driver of both energy supply and demand, and with the massive adoption of renewable energy sources and changing economic and producer-consumer paradigms, the management of next-generation energy systems is becoming ever more challenging. Operational and planning decisions in energy systems are guided by efficiency and reliability, so a central role in these decisions will be played by the ability to obtain weather forecasts with accurate uncertainty estimates. The temporal and spatial resolutions needed for effective decision-making, be it operational or planning, are not clear; it is arguably certain, however, that temporal scales such as hourly variations of temperature or wind conditions and ramp events are essential in this process. Planning activities involve decade- or decades-long projections of weather. One sensible way to achieve this is to embed regional weather models in a global climate system; this strategy acts as a downscaling procedure. Uncertainty modeling techniques must be developed to quantify and minimize forecast errors and to target the variables that most affect the decision-making process. We discuss the challenges of obtaining realistic uncertainty quantification estimates using mathematical algorithms based on scalable matrix-free computations and physics-based statistical models. Making decisions for energy management systems based on future weather scenarios is a very complex problem. We focus on the challenges in generating wind power predictions based on regional weather predictions, and discuss the implications of making the common assumptions about the uncertainty models.

  20. Landsat Collection 1 Data: Assessing the Impacts of Surface Reflectance Product Changes on an Operational Rangeland Management Tool

    NASA Astrophysics Data System (ADS)

    Holifield Collins, C.; Skirvin, S. M.; Kautz, M. A.; Metz, L. J.

    2017-12-01

    The Landsat Surface Reflectance (SR) Product is valuable for applied research of the Earth's surface and has been used in the development of a number of operational products. Landsat SR data available as of April 2017 have been processed through a new system and are publicly available as part of a new product set known as Collection 1. The impact of these changes has not yet been described, but knowing their nature and magnitude is vital for continued confidence in operational products produced from these datasets. The Rangeland Brush Estimation Toolbox (RaBET), developed for USDA Natural Resources Conservation Service (NRCS) land managers, is based on relationships developed using the pre-April 2017 Landsat SR Product to derive estimates of woody cover in western rangelands. The maps produced from this tool will be used to aid in planning the implementation of brush removal treatments. Due to the vast expenditure of resources (millions of dollars per year) required to execute these treatments, it is imperative that the effects of the SR data processing changes are understood to allow for modifications to the tool if necessary. The objectives of this study are to: 1) determine where SR data processing changes have the greatest effect on the Landsat-based vegetation indices used within RaBET for Major Land Resource Areas (MLRAs) in Arizona and Texas, and 2) compare model outputs arising from Landsat SR data obtained pre- and post-April 2017 to assess the magnitude of the changes to the RaBET end product.

  1. Spatio-temporal interpolation of precipitation during monsoon periods in Pakistan

    NASA Astrophysics Data System (ADS)

    Hussain, Ijaz; Spöck, Gunter; Pilz, Jürgen; Yu, Hwa-Lung

    2010-08-01

    Spatio-temporal estimation of precipitation over a region is essential to the modeling of hydrologic processes for water resources management. Changes in magnitude and the space-time heterogeneity of rainfall observations make space-time estimation of precipitation a challenging task. In this paper we propose a Box-Cox transformed hierarchical Bayesian multivariate spatio-temporal interpolation method for the skewed response variable. The proposed method is applied to estimate space-time monthly precipitation for the monsoon periods of 1974-2000, using 27 years of monthly average precipitation data from 51 stations in Pakistan. The results of the transformed hierarchical Bayesian multivariate spatio-temporal interpolation are compared to those of non-transformed hierarchical Bayesian interpolation using cross-validation. The software developed by [11] is used for Bayesian non-stationary multivariate space-time interpolation. It is observed that the transformed hierarchical Bayesian method provides more accurate estimates than the non-transformed hierarchical Bayesian method.
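    The Box-Cox step can be illustrated compactly. A minimal sketch, assuming scipy's maximum-likelihood choice of the transform parameter and synthetic skewed data; the paper's hierarchical Bayesian interpolation itself is far more involved than anything shown here.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Hypothetical skewed monthly precipitation totals (mm); strictly positive.
    precip = rng.gamma(shape=2.0, scale=40.0, size=300)

    # Box-Cox: y = (x**lam - 1)/lam for lam != 0, log(x) for lam == 0.
    transformed, lam = stats.boxcox(precip)  # lambda chosen by maximum likelihood
    print(f"lambda = {lam:.3f}, skewness before = {stats.skew(precip):.2f}, "
          f"after = {stats.skew(transformed):.2f}")

    # Back-transform an interpolated value on the transformed scale.
    y = transformed.mean()
    x_back = (lam * y + 1) ** (1 / lam) if lam != 0 else np.exp(y)
    print(f"back-transformed mean: {x_back:.1f} mm")
    ```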

  2. Fine-granularity inference and estimations to network traffic for SDN.

    PubMed

    Jiang, Dingde; Huo, Liuwei; Li, Ya

    2018-01-01

    An end-to-end network traffic matrix is significantly helpful for network management and for Software Defined Networks (SDN). However, inferring and estimating the end-to-end network traffic matrix is a challenging problem, and attaining the traffic matrix in high-speed networks for SDN is a prohibitive challenge. This paper investigates how to estimate and recover the end-to-end network traffic matrix in fine time granularity from sampled traffic traces, which is a hard inverse problem. Different from previous methods, fractal interpolation is used to reconstruct the finer-granularity network traffic. Then, the cubic spline interpolation method is used to obtain smooth reconstruction values. To attain an accurate end-to-end network traffic estimate at fine time granularity, we perform a weighted-geometric-average process on the two interpolation results obtained. The simulation results show that our approaches are feasible and effective.
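    The smoothing-and-combining step lends itself to a short sketch. Below, plain linear upsampling stands in for the paper's fractal interpolation, a cubic spline provides the smooth reconstruction, and the two are blended with a weighted geometric average; the traffic values and the weight w are assumptions.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    # Hypothetical coarse traffic samples (Mb/s) at 5-minute intervals.
    t_coarse = np.arange(0, 30, 5.0)
    traffic = np.array([120.0, 95.0, 140.0, 160.0, 130.0, 110.0])
    t_fine = np.arange(0, 25.01, 1.0)  # target: 1-minute granularity

    # Stand-in for the paper's fractal interpolation: simple linear upsampling.
    linear = np.interp(t_fine, t_coarse, traffic)
    # Smooth reconstruction via cubic spline, as in the paper.
    spline = CubicSpline(t_coarse, traffic)(t_fine)

    # Weighted geometric average of the two reconstructions (weight w assumed).
    w = 0.5
    recovered = np.clip(linear, 1e-9, None)**w * np.clip(spline, 1e-9, None)**(1 - w)
    print(recovered.round(1))
    ```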

  3. Recharge estimation in semi-arid karst catchments: Central West Bank, Palestine

    NASA Astrophysics Data System (ADS)

    Jebreen, Hassan; Wohnlich, Stefan; Wisotzky, Frank; Banning, Andre; Niedermayr, Andrea; Ghanem, Marwan

    2018-03-01

    Knowledge of groundwater recharge constitutes a valuable tool for sustainable management in karst systems. In this respect, a quantitative evaluation of groundwater recharge can be considered a pre-requisite for the optimal operation of groundwater resources systems, particularly in semi-arid areas. This paper examines the processes affecting recharge in Palestinian aquifers. The Central Western Catchment is one of the main water supply sources in the West Bank. Potential recharge rates are estimated using the chloride mass balance (CMB) method and empirical recharge equations over the catchment. The results show a spatialized recharge rate ranging from 111 to 216 mm/year, representing 19-37% of the long-term mean annual rainfall. Actual evapotranspiration (AET) is estimated using water balance models and climatological data (e.g., solar radiation, monthly temperature, average monthly relative humidity and precipitation). The mean annual actual evapotranspiration was about 66-70% of precipitation.
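    The chloride mass balance reduces to a one-line calculation: recharge R = P x Cl_rain / Cl_groundwater. A worked example with assumed concentrations (not the paper's field measurements):

    ```python
    # Chloride mass balance (CMB): recharge R = P * Cl_p / Cl_gw, where P is
    # mean annual rainfall, Cl_p is chloride in rainfall and Cl_gw in groundwater.
    # All values below are illustrative, not the paper's measurements.
    P = 580.0       # mean annual precipitation, mm/yr
    cl_rain = 6.0   # chloride concentration in rainfall, mg/L
    cl_gw = 20.0    # chloride concentration in groundwater, mg/L

    recharge = P * cl_rain / cl_gw
    print(f"CMB recharge estimate: {recharge:.0f} mm/yr "
          f"({100 * recharge / P:.0f}% of rainfall)")
    ```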

  4. Fine-granularity inference and estimations to network traffic for SDN

    PubMed Central

    Huo, Liuwei; Li, Ya

    2018-01-01

    An end-to-end network traffic matrix is significantly helpful for network management and for Software Defined Networks (SDN). However, inferring and estimating the end-to-end network traffic matrix is a challenging problem, and attaining the traffic matrix in high-speed networks for SDN is a prohibitive challenge. This paper investigates how to estimate and recover the end-to-end network traffic matrix in fine time granularity from sampled traffic traces, which is a hard inverse problem. Different from previous methods, fractal interpolation is used to reconstruct the finer-granularity network traffic. Then, the cubic spline interpolation method is used to obtain smooth reconstruction values. To attain an accurate end-to-end network traffic estimate at fine time granularity, we perform a weighted-geometric-average process on the two interpolation results obtained. The simulation results show that our approaches are feasible and effective. PMID:29718913

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anastasia M. Gribik; Ronald E. Mizia; Harry Gatley

    This project addresses both the technical and economic feasibility of replacing industrial gas in lime kilns with synthesis gas from the gasification of hog fuel. The technical assessment includes a materials evaluation, processing equipment needs, and suitability of the heat content of the synthesis gas as a replacement for industrial gas. The economic assessment includes estimations for capital, construction, operating, maintenance, and management costs for the reference plant. To perform these assessments, detailed models of the gasification and lime kiln processes were developed using Aspen Plus. The material and energy balance outputs from the Aspen Plus model were used as inputs to both the material and economic evaluations.

  6. Attribution of Net Carbon Change by Disturbance Type across Forest Lands of the Continental United States

    NASA Astrophysics Data System (ADS)

    Hagen, S. C.; Harris, N.; Saatchi, S. S.; Domke, G. M.; Woodall, C. W.; Pearson, T.

    2016-12-01

    We generated spatially comprehensive maps of carbon stocks and net carbon changes from US forestlands between 2005 and 2010 and attributed the changes to natural and anthropogenic processes. The prototype system created to produce these maps is designed to assist with national GHG inventories and support decisions associated with land management. Here, we present the results and methodological framework of our analysis. In summary, combining estimates of net C losses and gains results in a net carbon change of 269±49 Tg C yr-1 (sink) in the coterminous US forest land, with carbon loss from harvest acting as the predominant source process.

  7. A national surveillance project on chronic kidney disease management in Canadian primary care: a study protocol.

    PubMed

    Bello, Aminu K; Ronksley, Paul E; Tangri, Navdeep; Singer, Alexander; Grill, Allan; Nitsch, Dorothea; Queenan, John A; Lindeman, Cliff; Soos, Boglarka; Freiheit, Elizabeth; Tuot, Delphine; Mangin, Dee; Drummond, Neil

    2017-08-04

    Effective chronic disease care is dependent on well-organised quality improvement (QI) strategies that monitor processes of care and outcomes for optimal care delivery. Although healthcare is provincially/territorially structured in Canada, national networks such as the Canadian Primary Care Sentinel Surveillance Network (CPCSSN) are important facilitators for national QI-based studies to improve chronic disease care. The goal of our study is to improve the understanding of how patients with chronic kidney disease (CKD) are managed in primary care, and of the variation across practices, provinces and territories, to drive improvements in care delivery. The CPCSSN database contains anonymised health information from the electronic medical records of patients of participating primary care practices (PCPs) across Canada (n=1200). The dataset includes information on patient sociodemographics, medications, laboratory results and comorbidities. Leveraging validated algorithms, case definitions and guidelines to define CKD and the related processes of care enables us to: (1) determine the prevalent CKD burden; (2) ascertain current practice patterns in risk identification and management of CKD and (3) study variation in care indicators (eg, achievement of blood pressure and proteinuria targets) and referral patterns for specialist kidney care. The process-of-care outcomes will be stratified across patient demographics as well as provider and regional (provincial/territorial) characteristics. The prevalence of CKD stages 3-5 will be presented as age-sex standardised prevalence estimates stratified by province and as weighted averages for population rates with 95% CIs using census data. For each PCP, age-sex standardised prevalence will be calculated and compared with expected standardised prevalence estimates. The process-based outcomes will be defined using established methods. The CPCSSN is committed to high ethical standards when dealing with the individual data collected, and this work is reviewed and approved by the Network Scientific Committee. The results will be published in peer-reviewed journals and presented at relevant national and international scientific meetings. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  8. Risk analysis for biological hazards: What we need to know about invasive species

    USGS Publications Warehouse

    Stohlgren, T.J.; Schnase, J.L.

    2006-01-01

    Risk analysis for biological invasions is similar to other types of natural and human hazards. For example, risk analysis for chemical spills requires the evaluation of basic information on where a spill occurs; exposure level and toxicity of the chemical agent; knowledge of the physical processes involved in its rate and direction of spread; and potential impacts to the environment, economy, and human health relative to containment costs. Unlike typical chemical spills, biological invasions can have long lag times from introduction and establishment to successful invasion, they reproduce, and they can spread rapidly by physical and biological processes. We use a risk analysis framework to suggest a general strategy for risk analysis for invasive species and invaded habitats. It requires: (1) problem formation (scoping the problem, defining assessment endpoints); (2) analysis (information on species traits, matching species traits to suitable habitats, estimating exposure, surveys of current distribution and abundance); (3) risk characterization (understanding of data completeness, estimates of the “potential” distribution and abundance; estimates of the potential rate of spread; and probable risks, impacts, and costs); and (4) risk management (containment potential, costs, and opportunity costs; legal mandates and social considerations and information science and technology needs).

  9. Enhanced monitoring of the temporal and spatial relationships between water demand and water availability

    NASA Astrophysics Data System (ADS)

    Schneider, C. A.; Aggett, G. R.; Hattendorf, M. J.

    2007-12-01

    Better information on evapotranspiration (ET) is essential to a better understanding of consumptive use of water by crops. RTi is using NASA Earth-sun System research results and METRIC (Mapping ET at high Resolution with Internalized Calibration) to increase the repeatability and accuracy of consumptive use estimates. METRIC, an image-processing model for calculating ET as a residual of the surface energy balance, utilizes the thermal band on various satellite remote sensors. Calculating actual ET from satellites can avoid many of the assumptions driving other methods of calculating ET over a large area. Because it is physically based and does not rely on explicit knowledge of crop type in the field, a large potential source of error should be eliminated. This paper assesses sources of error in current operational estimates of ET for an area of the South Platte irrigated lands of Colorado, and benchmarks both the potential improvements in the accuracy of ET estimates gained using METRIC and the efficiency of processing consumptive-use demand estimates for large irrigated areas. Examples highlighting how better water planning decisions and water management can be achieved via enhanced monitoring of the temporal and spatial relationships between water demand and water availability are provided.

  10. Data modeling and processing in deregulated power system

    NASA Astrophysics Data System (ADS)

    Xu, Lin

    The introduction of open electricity markets and the fast pace of changes brought by modern information technology bring both opportunities and challenges to the power industry. Vast quantities of data are generated by the underlying physical system and the business operations. Fast and low-cost communications allow the data to be more widely accessed. For electric utilities, it is becoming clear that data and information are vital assets. Proper management and modeling of these assets is as essential to the engineering of the power system as is the underlying physical system. This dissertation introduces several new methods to address information modeling and data processing concerns in the new utility environment. Presently, legacy information systems in the industry do not make adequate use of the data produced. Hence, a new information infrastructure using data warehousing---a data integration technology used for decision support---is proposed for novel management and utilization of data. Detailed examples and discussion are given on schema building and extract, transform and load (ETL) strategies for power-system-specific data. The benefits of this approach are shown through a new viewpoint of state estimation. Inaccurate grid information, especially topology information, can be a major detriment to energy market traders' ability to make appropriate bids. A two-stage DC state estimation algorithm is presented to provide them with a simpler data viewpoint from which to make knowledgeable trading decisions. Numerical results show how the results of a DC state estimator can be accurately made available to all concerned. Additionally, the proposed communication and information infrastructure allows for new formulations and solutions to traditional power problems. In this vein, a new distributed communication model of the power system using a publisher/subscriber paradigm is presented and simulated. The simulation results demonstrate its feasibility and show adequate performance under today's communication technology. Based on this model, a new state estimation algorithm, which decentralizes computations and minimizes communication overhead, is derived using a set of overlapping areas to cover the entire network. Numerical experiments show that it is efficient, robust, and has accuracy comparable to that of conventional full-network state estimation.
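    For readers unfamiliar with DC state estimation, a minimal weighted-least-squares sketch on a toy three-bus network is given below; it is the textbook single-stage estimator, not the dissertation's two-stage or distributed algorithms, and all reactances and measurements are invented.

    ```python
    import numpy as np

    # Toy 3-bus DC state estimation: bus 1 is the angle reference (theta_1 = 0),
    # unknowns are theta_2, theta_3 (radians). Line reactances are illustrative.
    # Flow measurements follow P_ij = (theta_i - theta_j) / x_ij (per unit).
    H = np.array([
        [-10.0,  0.0],   # P_1-2, x = 0.10
        [  0.0, -5.0],   # P_1-3, x = 0.20
        [  4.0, -4.0],   # P_2-3, x = 0.25
    ])
    z = np.array([0.52, 0.24, -0.10])   # noisy measured flows (p.u.)
    W = np.diag([1 / 0.01**2] * 3)      # weights = 1 / sigma^2

    # Weighted least squares: theta = (H^T W H)^{-1} H^T W z
    theta = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
    residuals = z - H @ theta
    print("estimated angles (rad):", theta.round(4))
    print("measurement residuals:", residuals.round(4))
    ```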

  11. Fitting statistical distributions to sea duck count data: implications for survey design and abundance estimation

    USGS Publications Warehouse

    Zipkin, Elise F.; Leirness, Jeffery B.; Kinlan, Brian P.; O'Connell, Allan F.; Silverman, Emily D.

    2014-01-01

    Determining appropriate statistical distributions for modeling animal count data is important for accurate estimation of abundance, distribution, and trends. In the case of sea ducks along the U.S. Atlantic coast, managers want to estimate local and regional abundance to detect and track population declines, to define areas of high and low use, and to predict the impact of future habitat change on populations. In this paper, we used a modified marked point process to model survey data that recorded flock sizes of Common eiders, Long-tailed ducks, and Black, Surf, and White-winged scoters. The data come from an experimental aerial survey, conducted by the United States Fish & Wildlife Service (USFWS) Division of Migratory Bird Management, during which east-west transects were flown along the Atlantic Coast from Maine to Florida during the winters of 2009–2011. To model the number of flocks per transect (the points), we compared the fit of four statistical distributions (zero-inflated Poisson, zero-inflated geometric, zero-inflated negative binomial and negative binomial) to data on the number of species-specific sea duck flocks that were recorded for each transect flown. To model the flock sizes (the marks), we compared the fit of flock size data for each species to seven statistical distributions: positive Poisson, positive negative binomial, positive geometric, logarithmic, discretized lognormal, zeta and Yule–Simon. Akaike’s Information Criterion and Vuong’s closeness tests indicated that the negative binomial and discretized lognormal were the best distributions for all species for the points and marks, respectively. These findings have important implications for estimating sea duck abundances as the discretized lognormal is a more skewed distribution than the Poisson and negative binomial, which are frequently used to model avian counts; the lognormal is also less heavy-tailed than the power law distributions (e.g., zeta and Yule–Simon), which are becoming increasingly popular for group size modeling. Choosing appropriate statistical distributions for modeling flock size data is fundamental to accurately estimating population summaries, determining required survey effort, and assessing and propagating uncertainty through decision-making processes.
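    The points-model comparison can be sketched with maximum-likelihood fits and AIC. The snippet below fits Poisson and negative binomial distributions to simulated overdispersed counts; the data, starting values and bounds are assumptions, and the zero-inflated variants and Vuong tests from the paper are omitted for brevity.

    ```python
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(2)
    # Hypothetical flock counts per transect (overdispersed, many zeros).
    counts = rng.negative_binomial(n=0.8, p=0.25, size=500)

    def aic(neg_loglik, k):
        return 2 * k + 2 * neg_loglik

    # Poisson MLE is just the sample mean.
    lam = counts.mean()
    nll_pois = -stats.poisson.logpmf(counts, lam).sum()

    # Negative binomial MLE via bounded numerical optimization over (n, p).
    def nb_nll(params):
        n, p = params
        return -stats.nbinom.logpmf(counts, n, p).sum()

    res = optimize.minimize(nb_nll, x0=[1.0, 0.5],
                            bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])
    print(f"AIC Poisson: {aic(nll_pois, 1):.1f}")
    print(f"AIC Neg. binomial: {aic(res.fun, 2):.1f}")  # lower AIC is preferred
    ```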

  12. On formally integrating science and policy: walking the walk

    USGS Publications Warehouse

    Nichols, James D.; Johnson, Fred A.; Williams, Byron K.; Boomer, G. Scott

    2015-01-01

    The contribution of science to the development and implementation of policy is typically neither direct nor transparent. In 1995, the U.S. Fish and Wildlife Service (FWS) made a decision that was unprecedented in natural resource management, turning to an unused and unproven decision process to carry out trust responsibilities mandated by an international treaty. The decision process was adopted for the establishment of annual sport hunting regulations for the most economically important duck population in North America, the 6 to 11 million mallards Anas platyrhynchos breeding in the mid-continent region of the north-central United States and central Canada. The key idea underlying the adopted decision process was to formally embed within it a scientific process designed to reduce uncertainty (learn) and thus make better decisions in the future. The scientific process entails the use of models to develop predictions from competing hypotheses about system response to the selected action at each decision point. These predictions are used not only to select the optimal management action, but are also compared with subsequent estimates of system state variables, providing evidence for modifying the degrees of confidence in, and hence the relative influence of, these models at the next decision point. Science and learning in one step are formally and directly incorporated into the next decision, contrasting with the usual ad hoc and indirect use of scientific results in policy development and decision-making. Application of this approach over the last 20 years has led to a substantial reduction in uncertainty, as well as to an increase in the transparency and defensibility of annual decisions and a decrease in the contentiousness of the decision process. As resource managers are faced with increased uncertainty associated with various components of global change, this approach provides a roadmap for the future scientific management of natural resources.
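    The model-weight updating described here is, at its core, Bayes' rule applied to model predictions. A minimal sketch with two hypothetical models; the predictions, observation and its standard error are invented, not the FWS values.

    ```python
    import numpy as np
    from scipy import stats

    # Two competing population models predict next year's breeding population
    # (millions); values are illustrative, not the FWS models.
    predictions = {"additive harvest": 7.4, "compensatory harvest": 8.6}
    weights = {m: 0.5 for m in predictions}     # equal prior confidence

    observed, obs_sd = 8.2, 0.5                 # post-season estimate and its SE

    # Bayes' rule: new weight proportional to prior weight times the likelihood
    # of the observed state under each model's prediction.
    likelihoods = {m: stats.norm.pdf(observed, loc=pred, scale=obs_sd)
                   for m, pred in predictions.items()}
    total = sum(weights[m] * likelihoods[m] for m in predictions)
    weights = {m: weights[m] * likelihoods[m] / total for m in predictions}
    print({m: round(w, 3) for m, w in weights.items()})
    ```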

  13. Cost Risk Analysis Based on Perception of the Engineering Process

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of the expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This admits errors from at least three sources. The historical cost data may be in error by some unknown amount. The manager's evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering process parameters are elicited from the engineer/expert on the project and are based on that expert's technical knowledge. These are converted by a parametric cost model into a cost estimate. The method discussed makes no assumptions about the distribution underlying the distribution of possible costs, and is not tied to the analysis of previous projects, except through the expert calibrations performed by the parametric cost analyst.
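    A minimal sketch of the general idea: elicited parameter ranges are propagated through a stand-in parametric cost model by Monte Carlo, and the risk curve is read off the empirical distribution. The parameter names, triangular ranges and cost-model coefficients are all hypothetical, and the paper's elicitation and calibration steps are not shown.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 100_000

    # Engineering-process parameters elicited from the expert, expressed as
    # (low, mode, high) triangular ranges. Names and values are hypothetical.
    mass_kg = rng.triangular(800, 1000, 1400, N)
    complexity = rng.triangular(0.9, 1.1, 1.6, N)

    # Stand-in parametric cost model: cost ($M) as a power law in mass, scaled
    # by a complexity factor. Coefficients are illustrative only.
    cost = 0.05 * mass_kg**0.8 * complexity

    # Empirical risk curve: no distributional assumption on cost itself.
    for pct in (10, 50, 70, 90):
        print(f"P{pct}: ${np.percentile(cost, pct):.1f}M")
    ```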

  14. The Air Force Processes for Approving Air Force Life Cycle Management Center Single-Award Indefinite-Delivery Indefinite-Quantity Contracts Need Improvement

    DTIC Science & Technology

    2016-04-29

    ...Air Force senior procurement executive (SPE) did not approve four D&Fs for three contracts because SAF/AQ officials incorrectly concluded that senior... by the senior procurement executive (SPE) when awarding single-award IDIQ contracts estimated to exceed the dollar threshold, then at $103 million... Procurement and Acquisition Policy (DPAP), as required. We will provide a copy of the report to the senior official responsible for internal...

  15. Using a decision support system to estimate departures of present forest landscape patterns from historical reference condition—an example from the inland Northwest region of the United States.

    Treesearch

    P.F. Hessburg; K.M. Reynolds; R.B. Salter; M.B. Richmond

    2004-01-01

    Human settlement and management activities have altered the patterns and processes of forest landscapes across the inland northwest region of the United States (Hessburg et al. 2000C; Hessburg and Agee in press). As a consequence, many attributes of current disturbance regimes (e.g., the frequency, duration, severity, and extent of fires) differ markedly from those of...

  16. Estimating Supply-Chain Burdens in Support of Acquisition Decisions

    DTIC Science & Technology

    2013-03-20

    Abstract: Acquisition decisions drive supply-chain requirements that incur financial costs and other critical impacts. To account ...their rates are fully burdened with respect to the cost of transport assets and any force protection, if necessary. However, DoD accounting systems and...Board (DSB) report (2008) highlighted the failure of DoD management processes to properly account for the enterprise-wide costs of fuel. Before 2009...

  17. Estimating the Term Structure With a Semiparametric Bayesian Hierarchical Model: An Application to Corporate Bonds.

    PubMed

    Cruz-Marcelo, Alejandro; Ensor, Katherine B; Rosner, Gary L

    2011-06-01

    The term structure of interest rates is used to price defaultable bonds and credit derivatives, as well as to infer the quality of bonds for risk management purposes. We introduce a model that jointly estimates term structures by means of a Bayesian hierarchical model with a prior probability model based on Dirichlet process mixtures. The modeling methodology borrows strength across term structures for purposes of estimation. The main advantage of our framework is its ability to produce reliable estimators at the company level even when there are only a few bonds per company. After describing the proposed model, we discuss an empirical application in which the term structure of 197 individual companies is estimated. The sample of 197 consists of 143 companies with only one or two bonds. In-sample and out-of-sample tests are used to quantify the improvement in accuracy that results from approximating the term structure of corporate bonds with estimators by company rather than by credit rating, the latter being a popular choice in the financial literature. A complete description of a Markov chain Monte Carlo (MCMC) scheme for the proposed model is available as Supplementary Material.

  18. Estimating the Term Structure With a Semiparametric Bayesian Hierarchical Model: An Application to Corporate Bonds

    PubMed Central

    Cruz-Marcelo, Alejandro; Ensor, Katherine B.; Rosner, Gary L.

    2011-01-01

    The term structure of interest rates is used to price defaultable bonds and credit derivatives, as well as to infer the quality of bonds for risk management purposes. We introduce a model that jointly estimates term structures by means of a Bayesian hierarchical model with a prior probability model based on Dirichlet process mixtures. The modeling methodology borrows strength across term structures for purposes of estimation. The main advantage of our framework is its ability to produce reliable estimators at the company level even when there are only a few bonds per company. After describing the proposed model, we discuss an empirical application in which the term structure of 197 individual companies is estimated. The sample of 197 consists of 143 companies with only one or two bonds. In-sample and out-of-sample tests are used to quantify the improvement in accuracy that results from approximating the term structure of corporate bonds with estimators by company rather than by credit rating, the latter being a popular choice in the financial literature. A complete description of a Markov chain Monte Carlo (MCMC) scheme for the proposed model is available as Supplementary Material. PMID:21765566

  19. An Expert System For Multispectral Threat Assessment And Response

    NASA Astrophysics Data System (ADS)

    Steinberg, Alan N.

    1987-05-01

    A concept has been defined for an automatic system to manage the self-defense of a combat aircraft. Distinctive new features of this concept include: a. the flexible prioritization of tasks and coordinated use of sensor, countermeasures, flight systems and weapons assets by means of an automated planning function; b. the integration of state-of-the-art data fusion algorithms with event prediction processing; c. the use of advanced Artificial Intelligence tools to emulate the decision processes of tactical EW experts. Threat Assessment functions (a) estimate threat identity, lethality and intent on the basis of multi-spectral sensor data, and (b) predict the time to critical events in threat engagements (e.g., target acquisition, tracking, weapon launch, impact). Response Management functions (a) select candidate responses to reported threat situations; (b) estimate the effects of candidate actions on survival; and (c) coordinate the assignment of sensors, weapons and countermeasures with the flight plan. The system employs Finite State Models to represent current engagements and to predict subsequent events. Each state in a model is associated with a set of observable features, allowing interpretation of sensor data and adaptive use of sensor assets. Defined conditions on state transitions allow prediction of times to critical future states and are used in planning self-defensive responses, which are designed either to impede a particular state transition or to force a transition to a lower threat state.
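    The Finite State Model idea can be sketched as a small data structure: each engagement state carries its observable signatures and an expected dwell time, and time-to-critical-event is a walk along the chain. Everything below (states, observables, dwell times) is an illustrative placeholder, not the system's actual tables.

    ```python
    # Minimal finite-state representation of a threat engagement, in the spirit
    # of the paper's Finite State Models; all entries are illustrative.
    engagement_fsm = {
        "search":      {"observables": ["scan-rate RF"],    "next": "acquisition"},
        "acquisition": {"observables": ["lock-on RF"],      "next": "tracking"},
        "tracking":    {"observables": ["CW illumination"], "next": "launch"},
        "launch":      {"observables": ["plume IR"],        "next": "impact"},
        "impact":      {"observables": [],                  "next": None},
    }

    # Expected dwell time per state (s), used to predict time-to-critical-event.
    dwell = {"search": 20, "acquisition": 8, "tracking": 12, "launch": 15}

    def time_to_state(current, target):
        """Sum expected dwell times along the chain from current to target."""
        t, s = 0, current
        while s is not None and s != target:
            t += dwell.get(s, 0)
            s = engagement_fsm[s]["next"]
        return t if s == target else None

    print(time_to_state("acquisition", "impact"), "s until predicted impact")
    ```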

  20. GIS-based probability assessment of natural hazards in forested landscapes of Central and South-Eastern Europe.

    PubMed

    Lorz, C; Fürst, C; Galic, Z; Matijasic, D; Podrazky, V; Potocic, N; Simoncic, P; Strauch, M; Vacik, H; Makeschin, F

    2010-12-01

    We assessed the probability of three major natural hazards--windthrow, drought, and forest fire--for Central and South-Eastern European forests; these hazards are major threats to the provision of forest goods and ecosystem services. In addition, we analyzed their spatial distribution and the implications for a future-oriented management of forested landscapes. For estimating the probability of windthrow, we used rooting depth and average wind speed. Probabilities of drought and fire were calculated from the climatic and total water balance during the growing season. As an approximation to climate change scenarios, we used a simplified approach with a general increase of potential evapotranspiration (pET) by 20%. Monitoring data from the pan-European forest crown condition program and observed burnt areas and hot spots from the European Forest Fire Information System were used to test the plausibility of the probability maps. Regions with high probabilities of natural hazards are identified, and management strategies to minimize those probabilities are discussed. We suggest future research should focus on (i) estimating probabilities using process-based models (including sensitivity analysis), (ii) defining probability in terms of economic loss, (iii) including biotic hazards, (iv) using more detailed data sets on natural hazards, forest inventories and climate change scenarios, and (v) developing a framework of adaptive risk management.

  1. Climate change: evaluating your local and regional water resources

    USGS Publications Warehouse

    Flint, Lorraine E.; Flint, Alan L.; Thorne, James H.

    2015-01-01

    The BCM is a fine-scale hydrologic model that uses detailed maps of soils, geology, topography, and transient monthly or daily maps of potential evapotranspiration, air temperature, and precipitation to generate maps of recharge, runoff, snow pack, actual evapotranspiration, and climatic water deficit. With these comprehensive environmental inputs and experienced scientific analysis, the BCM provides resource managers with important hydrologic and ecologic understanding of a landscape or basin at hillslope to regional scales. The model is calibrated using historical climate and streamflow data over the range of geologic materials specific to an area. Once calibrated, the model is used to translate climate-change data into hydrologic responses for a defined landscape, to provide managers an understanding of potential ecological risks and threats to water supplies and managed hydrologic systems. Although limited to estimates of unimpaired hydrologic conditions, estimates of impaired conditions, such as agricultural demand, diversions, or reservoir outflows can be incorporated into the calibration of the model to expand its utility. Additionally, the model can be linked to other models, such as groundwater-flow models (that is, MODFLOW) or the integrated hydrologic model (MF-FMP), to provide information about subsurface hydrologic processes. The model can be applied at a relatively small scale, but also can be applied to large-scale national and international river basins.

  2. Sediment management and renewability of floodplain clay for structural ceramics

    NASA Astrophysics Data System (ADS)

    van der Meulen, M. J.; Wiersma, A. P.; Middelkoop, H.; van der Perk, M.; Bakker, M.; Maljers, D.; Hobo, N.; Makaske, B.

    2009-04-01

    The Netherlands have vast resources of clay that are exploited for the fabrication of structural ceramic products such as bricks and roof tiles. The extraction of clay creates land surface lowerings of about 1.5 m, the majority of which are located in the embanked floodplains of the rivers Rhine and Meuse. At these surface lowerings, clay is replenished within several decades. This study explores to what extent the clay can be regarded as a renewable resource, with potential for sustainable use. For this purpose, the current and past clay consumption is first calculated. Subsequently, clay deposition in the floodplains is estimated from literature data on clay accumulation using sediment traps, heavy metal and radionuclide distribution in soil profiles, and from morphological modelling studies. These estimates of clay deposition and consumption are then compared following three approaches that consider various temporal and spatial scales of clay deposition. This allows us to establish the extent to which man determines sedimentary processes in the Dutch floodplains. Consequently, using the sediment response to the land surface lowering resulting from clay extraction, we explore sediment management options for the Dutch Rhine and Meuse. Altogether we argue that clay has been, probably is, and certainly can be managed as a renewable mineral resource.

  3. [Development of key indicators for nurses performance evaluation and estimation of their weights for management by objectives].

    PubMed

    Lee, Eun Hwa; Ahn, Sung Hee

    2010-02-01

    This methodological research was designed to develop performance evaluation key indicators (PEKIs) for management by objectives (MBO) and to estimate their weights for hospital nurses. The PEKIs were developed by selecting preliminary indicators from a literature review, examining content validity and identifying their level of importance. Data were collected from November 14, 2007 to February 18, 2008. The dataset for the importance of indicators was obtained from 464 nurses, and the weights of the PEKI domains were obtained from 453 nurses, who had worked for at least 2 yr in one of three hospitals. Data were analyzed using the χ²-test, factor analysis, and the Analytic Hierarchy Process. Based upon a Content Validity Index of .8 or above, 61 indicators were selected from the 100 preliminary indicators. Finally, 40 PEKIs were developed from the 61 indicators and categorized into 10 domains. The highest weight among the 10 domains was customer satisfaction, followed by patient education, direct nursing care, profit increase, safety management, improvement of nursing quality, completeness of nursing records, enhancing competence of nurses, indirect nursing care, and cost reduction, in that order. PEKIs and their weights can be utilized for impartial evaluation and MBO for hospital nurses. Further research to verify the PEKIs would lead to successful implementation of MBO.
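    The Analytic Hierarchy Process step can be illustrated briefly: domain weights are the principal eigenvector of a pairwise comparison matrix, followed by a consistency check. The 3x3 matrix below is hypothetical, not the study's survey data.

    ```python
    import numpy as np

    # Hypothetical 3x3 pairwise comparison matrix for three PEKI domains
    # (e.g., customer satisfaction, patient education, direct nursing care).
    # A[i, j] = relative importance of domain i over domain j (Saaty 1-9 scale).
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # AHP weights = principal eigenvector of A, normalized to sum to 1.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()

    # Consistency index CI = (lambda_max - n) / (n - 1); random index RI = 0.58
    # for n = 3. A consistency ratio CR = CI / RI below 0.1 is acceptable.
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)
    print("weights:", w.round(3), "CR:", round(ci / 0.58, 3))
    ```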

  4. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... identify and estimate safety and environmental management risks and appropriate risk reduction strategies... responsible for identifying/estimating risks and for appropriate risk reduction strategies? 102-80.50 Section... Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for...

  5. Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach

    ERIC Educational Resources Information Center

    Stevenson, Glenn A.

    2012-01-01

    For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…
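    For context, the formal methodologies the abstract mentions boil down to simple parametric equations. A basic-COCOMO example using Boehm's published organic-mode coefficients; the 32-KLOC project size is made up.

    ```python
    # Basic COCOMO effort equation: effort (person-months) = a * KLOC**b.
    # a, b, c, d are Boehm's published coefficients for an "organic" project;
    # the project size is a hypothetical example.
    a, b = 2.4, 1.05
    kloc = 32.0

    effort_pm = a * kloc ** b
    c, d = 2.5, 0.38                    # schedule coefficients, organic mode
    schedule_months = c * effort_pm ** d
    print(f"effort: {effort_pm:.0f} person-months, "
          f"schedule: {schedule_months:.1f} months")
    ```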

  6. Application of remote sensing in crop growth simulation and an ensembles approach to reduce model uncertainties

    NASA Astrophysics Data System (ADS)

    Setiyono, T. D.; Nelson, A.; Ravis, J.; Maunahan, A.; Villano, L.; Li, T.; Bouman, B.

    2012-12-01

    A semi-empirical model derived from the water-cloud model was used to convert synthetic-aperture radar (SAR) backscattering data into LAI. The SAR-based LAI at early rice growth stages was in close agreement (90%) with LAI derived from MODIS data for the same study location in Nueva Ecija, Philippines. ORYZA2000 simulated a rice yield of 4.5 Mg ha-1 for the 2008 wet season in Nueva Ecija, Philippines when using LAI inputs derived from SAR data, which is closer to the observed yield of 3.9 Mg ha-1; the simulated yield without SAR-derived LAI inputs was 5.4 Mg ha-1. The dynamic water and nitrogen balances were accounted for in these simulations based on site-specific soil properties and actual fertilizer N and water management. The use of remote sensing data was promising for model application to approximate actual growth conditions and to compensate for limitations in the model due to relevant underlying processes absent from the model formulation, such as detailed tillering and leaf shading effects, as well as limiting factors not accounted for in the model, such as biotic factors and abiotic factors other than water and N shortages. This study also demonstrated the use of an ensemble approach for provincial-level rice yield estimation in the Philippines, as sketched below. The ensemble approach involved statistical classification of agronomic management settings into 25th percentile, median, and 75th percentile levels, followed by generation of factorial combinations. For the irrigated lowland system, 4 factors were considered: transplanting date, plant density, fertilizer N rate, and amount of irrigation water. For the rainfed lowland system, there were 3 agronomic management factors (transplanting date, plant density, fertilizer N) and 1 soil parameter (depth of the ground water table). These 4 management/soil factors and 3 statistical levels resulted in 81 factorial combinations representing simulation scenarios for each area of interest (province in the Philippines) and water environment (irrigated vs. rainfed). Finally, a normal distribution was assumed and applied to the simulation outputs. This ensemble approach provided an efficient and yet effective method for aggregating point-based crop model results to a larger spatial level of interest. Lack of access to accurate model parameters (e.g., depth of the ground water table) could be addressed with this approach. The use of a process-based crop growth model was critical because the ultimate aim of this study was not just to establish a reliable rice yield estimation system but also to make the yield estimation outputs explainable by the underlying agronomic practices such as transplanting date, fertilizer N application, and water management.
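    A minimal sketch of the factorial scenario generation described above, using hypothetical stand-in values for the factor levels:

    ```python
    from itertools import product

    # Three statistical levels per factor (25th percentile, median, 75th) and
    # four management factors for the irrigated-lowland case; the numbers are
    # illustrative stand-ins for the survey-derived values.
    factors = {
        "transplant_doy": [152, 166, 181],   # day of year
        "plants_per_m2":  [20, 25, 33],
        "fert_N_kg_ha":   [60, 90, 120],
        "irrigation_mm":  [600, 800, 1000],
    }

    scenarios = [dict(zip(factors, combo)) for combo in product(*factors.values())]
    print(len(scenarios), "scenarios")   # 3**4 = 81 factorial combinations
    print(scenarios[0])                  # each one parameterizes a model run
    ```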

  7. Demographics of reintroduced populations: estimation, modeling, and decision analysis

    USGS Publications Warehouse

    Converse, Sarah J.; Moore, Clinton T.; Armstrong, Doug P.

    2013-01-01

    Reintroduction can be necessary for recovering populations of threatened species. However, the success of reintroduction efforts has been poorer than many biologists and managers would hope. To increase the benefits gained from reintroduction, management decision making should be couched within formal decision-analytic frameworks. Decision analysis is a structured process for informing decision making that recognizes that all decisions have a set of components—objectives, alternative management actions, predictive models, and optimization methods—that can be decomposed, analyzed, and recomposed to facilitate optimal, transparent decisions. Because the outcome of interest in reintroduction efforts is typically population viability or related metrics, models used in decision analysis efforts for reintroductions will need to include population models. In this special section of the Journal of Wildlife Management, we highlight examples of the construction and use of models for informing management decisions in reintroduced populations. In this introductory contribution, we review concepts in decision analysis, population modeling for analysis of decisions in reintroduction settings, and future directions. Increased use of formal decision analysis, including adaptive management, has great potential to inform reintroduction efforts. Adopting these practices will require close collaboration among managers, decision analysts, population modelers, and field biologists.

  8. Management of Esophageal Food Impaction Varies Among Gastroenterologists and Affects Identification of Eosinophilic Esophagitis.

    PubMed

    Hiremath, Girish; Vaezi, Michael F; Gupta, Sandeep K; Acra, Sari; Dellon, Evan S

    2018-06-01

    Esophageal food impaction (EFI) is a gastrointestinal emergency requiring immediate evaluation in the emergency room (ER) and an esophagogastroduodenoscopy (EGD) for disimpaction. EFI is also a distinct presenting feature of eosinophilic esophagitis (EoE). This study aimed to understand the management of EFI among gastroenterologists (GIs) and to estimate its impact on the identification of EoE in the USA. GIs associated with three major gastroenterology societies based in the USA were invited to participate in a web-based survey. Information on the resources available and utilized, and on the clinical decision-making process related to the management of EFI cases, was collected and analyzed. Of 428 responses, 49% were from pediatric GIs, 86% practiced in the USA, and 78% practiced in an academic setting. Compared to pediatric GIs, adult GIs were more likely to perform EGD in the emergency room [OR 87.96 (25.43-304.16)] and to advance the food bolus into the stomach [5.58 (3.08-10.12)]. Only 34% of respondents obtained esophageal biopsies during EGD, and pediatric GIs were more likely to obtain esophageal biopsies [3.49 (1.12-10.84)] than adult GIs. In the USA, by our conservative estimates, 10,494 patients presenting to the ER with EFI and at risk of EoE are likely being missed each year. EFI management varies substantially among GIs associated with three major gastroenterology societies in the USA. Based on their practice patterns, GIs in the USA are likely to miss numerous EoE patients presenting to the ER with EFI. Our findings highlight the need for developing and disseminating evidence-based EFI management practice guidelines.

  9. Hatchery Contributions to Emerging Naturally Produced Lake Huron Lake Trout.

    PubMed

    Scribner, Kim; Tsehaye, Iyob; Brenden, Travis; Stott, Wendylee; Kanefsky, Jeannette; Bence, James

    2018-06-19

    Recent assessments indicate the emergence of naturally produced lake trout (Salvelinus namaycush) recruitment throughout Lake Huron in the North American Laurentian Great Lakes (>50% of fish <7 yrs). Because naturally produced fish derived from different stocked hatchery strains are unmarked, managers cannot distinguish the strains contributing to natural recruitment. We used 15 microsatellite loci to identify strains of naturally produced lake trout (N=1567) collected in assessment fisheries during early (2002-2004) and late (2009-2012) sampling periods. Individuals from 13 American and Canadian hatchery strains (N=1143) were genotyped to develop standardized baseline information. Strain contributions were estimated using a Bayesian inferential approach. The deviance information criterion was used to compare models evaluating strain contributions at different spatial and temporal scales. The best-performing models were the most complex, suggesting that hatchery strain contributions to naturally produced lake trout varied spatially among management districts and temporally between time periods. Contributions of Seneca strain lake trout were consistently high across most management districts, with contributions increasing from the early to the late time period (estimates ranged from 52-94% for the late period across eight of nine districts). Strain contributions deviated from expectations based on historical stocking levels, indicating that strains differed with respect to survival, reproductive success, and/or dispersal. Knowledge of the recruitment levels of strains stocked in different management districts, and of how strain-specific recruitment varies temporally, spatially, and as a function of local or regional stocking, is important for prioritizing strains for future stocking and for managing the transition from primarily hatchery to naturally produced stocks.

  10. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, since the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure as compared to use of discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment and focusing on important sources of variability and uncertainty enable characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are impacted by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health and practice. PMID:26302336
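    The probabilistic characterization of exposure advocated here can be illustrated with a standard lognormal exceedance calculation; the measurements and the exposure limit below are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    # Hypothetical full-shift exposure measurements (mg/m^3); occupational
    # exposures are conventionally treated as lognormal.
    x = np.array([0.21, 0.35, 0.12, 0.58, 0.27, 0.44, 0.19, 0.31])
    oel = 1.0  # hypothetical occupational exposure limit, mg/m^3

    mu, sigma = np.log(x).mean(), np.log(x).std(ddof=1)
    # Probability that a random shift exceeds the OEL under the fitted lognormal,
    # via Monte Carlo (a closed form via the normal CDF would work equally well).
    sims = rng.lognormal(mu, sigma, 100_000)
    print(f"GM = {np.exp(mu):.2f} mg/m^3, GSD = {np.exp(sigma):.2f}, "
          f"P(exceed OEL) = {np.mean(sims > oel):.3%}")
    ```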

  11. Filtering data from the collaborative initial glaucoma treatment study for improved identification of glaucoma progression.

    PubMed

    Schell, Greggory J; Lavieri, Mariel S; Stein, Joshua D; Musch, David C

    2013-12-21

    Open-angle glaucoma (OAG) is a prevalent, degenerative ocular disease which can lead to blindness without proper clinical management. The tests used to assess disease progression are susceptible to process and measurement noise. The aim of this study was to develop a methodology which accounts for the inherent noise in the data and improves the identification of significant disease progression. Longitudinal observations from the Collaborative Initial Glaucoma Treatment Study (CIGTS) were used to parameterize and validate a Kalman filter model and logistic regression function. The Kalman filter estimates the true values of biomarkers associated with OAG and forecasts future values of these variables. We develop two logistic regression models via generalized estimating equations (GEE) for calculating the probability of experiencing significant OAG progression: one model based on the raw measurements from CIGTS and another model based on the Kalman filter estimates of the CIGTS data. Receiver operating characteristic (ROC) curves and associated area under the ROC curve (AUC) estimates are calculated using cross-validation. The logistic regression model developed using Kalman filter estimates as data input achieves higher sensitivity and specificity than the model developed using raw measurements. The mean AUC for the Kalman filter-based model is 0.961, while the mean AUC for the raw-measurements model is 0.889. Hence, using the probability function generated via Kalman filter estimates and GEE for logistic regression, we are able to more accurately classify patients and instances as experiencing significant OAG progression. A Kalman filter approach for estimating the true values of OAG biomarkers resulted in data input which improved the accuracy of a logistic regression classification model compared to a model using raw measurements as input. This methodology accounts for process and measurement noise to enable improved discrimination between progression and nonprogression in chronic diseases.
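    A scalar Kalman filter of the kind described can be sketched in a few lines; the biomarker, drift, noise variances and visit count below are illustrative assumptions, not CIGTS parameters, and the GEE logistic-regression stage is omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Scalar Kalman filter for one OAG biomarker (e.g., mean deviation, dB).
    # True decline rate, noise levels and visit count are illustrative.
    n_visits, true0, drift = 12, -2.0, -0.3
    truth = true0 + drift * np.arange(n_visits)
    measured = truth + rng.normal(0, 1.0, n_visits)   # noisy visual-field tests

    q, r = 0.05, 1.0          # process and measurement noise variances (assumed)
    x, p = measured[0], 1.0   # initial state estimate and its variance
    estimates = [x]
    for z in measured[1:]:
        x, p = x + drift, p + q              # predict (known drift assumed)
        k = p / (p + r)                      # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p  # update with the new measurement
        estimates.append(x)

    rmse = lambda a: np.sqrt(np.mean((np.asarray(a) - truth) ** 2))
    print(f"RMSE raw {rmse(measured):.2f} vs filtered {rmse(estimates):.2f} dB")
    ```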

  12. Use of structured decision-making to explicitly incorporate environmental process understanding in management of coastal restoration projects: Case study on barrier islands of the northern Gulf of Mexico.

    PubMed

    Dalyander, P Soupy; Meyers, Michelle; Mattsson, Brady; Steyer, Gregory; Godsey, Elizabeth; McDonald, Justin; Byrnes, Mark; Ford, Mark

    2016-12-01

    Coastal ecosystem management typically relies on subjective interpretation of scientific understanding, with limited methods for explicitly incorporating process knowledge into decisions that must meet multiple, potentially competing stakeholder objectives. Conversely, the scientific community lacks methods for identifying which advancements in system understanding would have the highest value to decision-makers. A case in point is barrier island restoration, where decision-makers lack tools to objectively use system understanding to determine how to optimally use limited contingency funds when project construction in this dynamic environment does not proceed as expected. In this study, collaborative structured decision-making (SDM) was evaluated as an approach to incorporate process understanding into mid-construction decisions and to identify priority gaps in knowledge from a management perspective. The focus was a barrier island restoration project at Ship Island, Mississippi, where sand will be used to close an extensive breach that currently divides the island. SDM was used to estimate damage that may occur during construction, and guide repair decisions within the confines of limited availability of sand and funding to minimize adverse impacts to project objectives. Sand was identified as more limiting than funds, and unrepaired major breaching would negatively impact objectives. Repairing minor damage immediately was determined to be generally more cost effective (depending on the longshore extent) than risking more damage to a weakened project. Key gaps in process-understanding relative to project management were identified as the relationship of island width to breach formation; the amounts of sand lost during breaching, lowering, or narrowing of the berm; the potential for minor breaches to self-heal versus developing into a major breach; and the relationship between upstream nourishment and resiliency of the berm to storms. This application is a prototype for using structured decision-making in support of engineering projects in dynamic environments where mid-construction decisions may arise; highlights uncertainty about barrier island physical processes that limit the ability to make robust decisions; and demonstrates the potential for direct incorporation of process-based models in a formal adaptive management decision framework. Published by Elsevier Ltd.

  13. Use of structured decision-making to explicitly incorporate environmental process understanding in management of coastal restoration projects: Case study on barrier islands of the northern Gulf of Mexico

    USGS Publications Warehouse

    Dalyander, P. Soupy; Meyers, Michelle B.; Mattsson, Brady; Steyer, Gregory; Godsey, Elizabeth; McDonald, Justin; Byrnes, Mark R.; Ford, Mark

    2016-01-01

    Coastal ecosystem management typically relies on subjective interpretation of scientific understanding, with limited methods for explicitly incorporating process knowledge into decisions that must meet multiple, potentially competing stakeholder objectives. Conversely, the scientific community lacks methods for identifying which advancements in system understanding would have the highest value to decision-makers. A case in point is barrier island restoration, where decision-makers lack tools to objectively use system understanding to determine how to optimally use limited contingency funds when project construction in this dynamic environment does not proceed as expected. In this study, collaborative structured decision-making (SDM) was evaluated as an approach to incorporate process understanding into mid-construction decisions and to identify priority gaps in knowledge from a management perspective. The focus was a barrier island restoration project at Ship Island, Mississippi, where sand will be used to close an extensive breach that currently divides the island. SDM was used to estimate damage that may occur during construction, and guide repair decisions within the confines of limited availability of sand and funding to minimize adverse impacts to project objectives. Sand was identified as more limiting than funds, and unrepaired major breaching would negatively impact objectives. Repairing minor damage immediately was determined to be generally more cost effective (depending on the longshore extent) than risking more damage to a weakened project. Key gaps in process-understanding relative to project management were identified as the relationship of island width to breach formation; the amounts of sand lost during breaching, lowering, or narrowing of the berm; the potential for minor breaches to self-heal versus developing into a major breach; and the relationship between upstream nourishment and resiliency of the berm to storms. This application is a prototype for using structured decision-making in support of engineering projects in dynamic environments where mid-construction decisions may arise; highlights uncertainty about barrier island physical processes that limit the ability to make robust decisions; and demonstrates the potential for direct incorporation of process-based models in a formal adaptive management decision framework.

  14. A model for estimating pathogen variability in shellfish and predicting minimum depuration times.

    PubMed

    McMenemy, Paul; Kleczkowski, Adam; Lees, David N; Lowther, James; Taylor, Nick

    2018-01-01

    Norovirus is a major cause of viral gastroenteritis, with shellfish consumption being identified as one potential norovirus entry point into the human population. Minimising shellfish norovirus levels is therefore important for both the consumer's protection and the shellfish industry's reputation. One method used to reduce microbiological risks in shellfish is depuration; however, this process also presents additional costs to industry. Providing a mechanism to estimate norovirus levels during depuration would therefore be useful to stakeholders. This paper presents a mathematical model of the depuration process and its impact on norovirus levels found in shellfish. Two fundamental stages of norovirus depuration are considered: (i) the initial distribution of norovirus loads within a shellfish population and (ii) the way in which the initial norovirus loads evolve during depuration. Realistic assumptions are made about the dynamics of norovirus during depuration, and mathematical descriptions of both stages are derived and combined into a single model. Parameters to describe the depuration effect and norovirus load values are derived from existing norovirus data obtained from U.K. harvest sites. However, obtaining population estimates of norovirus variability is time-consuming and expensive; this model addresses the issue by assuming a 'worst case scenario' for variability of pathogens, which is independent of mean pathogen levels. The model is then used to predict minimum depuration times required to achieve norovirus levels which fall within possible risk management levels, as well as predictions of minimum depuration times for other water-borne pathogens found in shellfish. Times for Escherichia coli predicted by the model all fall within the minimum 42 hours required for class B harvest sites, whereas minimum depuration times for norovirus and FRNA+ bacteriophage are substantially longer. Thus this study provides relevant information and tools to assist norovirus risk managers with future control strategies.
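
    A minimal sketch of the flavor of such a calculation, assuming simple first-order decay during depuration (the paper's full model also accounts for the initial distribution of norovirus loads across the shellfish population); the load values and rate constant below are hypothetical, not taken from the study:

        import math

        def min_depuration_time(c0, target, k):
            """Hours for a first-order decay process to bring an initial
            pathogen load c0 down to a target risk-management level,
            solving target = c0 * exp(-k * t) for t."""
            if c0 <= target:
                return 0.0
            return math.log(c0 / target) / k

        # Illustrative values only: 1,000 genome copies/g initially, a
        # management level of 200 copies/g, and a decay rate of 0.02/hour.
        print(round(min_depuration_time(1000.0, 200.0, 0.02), 1), "hours")

    With these toy numbers the rule yields roughly 80 hours; the study's point is that such minimum times differ sharply between fast-depurating pathogens like E. coli and more persistent ones like norovirus.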

  15. Inferring invasive species abundance using removal data from management actions.

    PubMed

    Davis, Amy J; Hooten, Mevin B; Miller, Ryan S; Farnsworth, Matthew L; Lewis, Jesse; Moxcey, Michael; Pepin, Kim M

    2016-10-01

    Evaluation of the progress of management programs for invasive species is crucial for demonstrating impacts to stakeholders and strategic planning of resource allocation. Estimates of abundance before and after management activities can serve as a useful metric for evaluating population management programs. However, many methods of estimating population size are too labor intensive and costly to implement, posing restrictive levels of burden on operational programs. Removal models are a reliable method for estimating abundance before and after management using data from the removal activities exclusively, thus requiring no work in addition to management. We developed a Bayesian hierarchical model to estimate abundance from removal data accounting for varying levels of effort, and used simulations to assess the conditions under which reliable population estimates are obtained. We applied this model to estimate site-specific abundance of an invasive species, feral swine (Sus scrofa), using removal data from aerial gunning in 59 site/time-frame combinations (480-19,600 acres) throughout Oklahoma and Texas, USA. Simulations showed that abundance estimates were generally accurate when effective removal rates (removal rate accounting for total effort) were above 0.40. However, when abundances were small (<50), the effective removal rate needed to accurately estimate abundance was considerably higher (0.70). Based on our post-validation method, 78% of our site/time-frame estimates were accurate. To use this modeling framework it is important to have multiple removals (more than three) within a time frame during which demographic changes are minimized (i.e., a closed population; ≤3 months for feral swine). Our results show that the probability of accurately estimating abundance from this model improves with increased sampling effort (8+ flight hours across the 3-month window is best) and increased removal rate. Based on the inverse relationship between inaccurate abundances and inaccurate removal rates, we suggest auxiliary information that could be collected and included in the model as covariates (e.g., habitat effects, differences between pilots) to improve accuracy of removal rates and hence abundance estimates. © 2016 by the Ecological Society of America.
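
    For intuition, here is a simplified maximum-likelihood version of a removal estimator, in which each pass removes a binomial draw with capture probability tied to effort. This is a stripped-down stand-in for the paper's Bayesian hierarchical model, and the removal counts and flight hours below are invented:

        import math

        def removal_loglik(N, q, removals, effort):
            """Log-likelihood of a simple removal model: pass i removes a
            Binomial(remaining, p_i) draw with p_i = 1 - exp(-q * effort_i)."""
            ll, remaining = 0.0, N
            for r, E in zip(removals, effort):
                if r > remaining:
                    return float("-inf")
                p = 1.0 - math.exp(-q * E)
                ll += (math.lgamma(remaining + 1) - math.lgamma(r + 1)
                       - math.lgamma(remaining - r + 1)
                       + r * math.log(p) + (remaining - r) * math.log(1.0 - p))
                remaining -= r
            return ll

        # Hypothetical data: three aerial-gunning passes, flight hours as effort.
        removals, effort = [34, 18, 9], [3.0, 2.5, 2.0]
        best = max(((N, q) for N in range(sum(removals), 300)
                    for q in [i / 100.0 for i in range(1, 100)]),
                   key=lambda nq: removal_loglik(nq[0], nq[1], removals, effort))
        print("MLE abundance:", best[0], "removal-rate coefficient:", best[1])

    The grid search is crude but makes the structure plain: abundance and the effort-scaled removal rate are estimated jointly from the sequence of declining removals.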

  16. Cost-effective sampling of (137)Cs-derived net soil redistribution: part 2 - estimating the spatial mean change over time.

    PubMed

    Chappell, A; Li, Y; Yu, H Q; Zhang, Y Z; Li, X Y

    2015-06-01

    The caesium-137 ((137)Cs) technique for estimating net, time-integrated soil redistribution by the processes of wind, water and tillage is increasingly being used with repeated sampling to form a baseline to evaluate change over small (years to decades) timeframes. This interest stems from knowledge that since the 1950s soil redistribution has responded dynamically to different phases of land use change and management. Currently, there is no standard approach to detect change in (137)Cs-derived net soil redistribution and thereby identify the driving forces responsible for change. We outline recent advances in space-time sampling in the soil monitoring literature which provide a rigorous statistical and pragmatic approach to estimating the change over time in the spatial mean of environmental properties. We apply the space-time sampling framework, estimate the minimum detectable change of net soil redistribution and consider the information content and cost implications of different sampling designs for a study area in the Chinese Loess Plateau. Three phases (1954-1996, 1954-2012 and 1996-2012) of net soil erosion were detectable and attributed to well-documented historical change in land use and management practices in the study area and across the region. We recommend that the design for space-time sampling is considered carefully alongside cost-effective use of the spatial mean to detect and correctly attribute cause of change over time particularly across spatial scales of variation. Copyright © 2015 Elsevier Ltd. All rights reserved.
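
    The core detectability arithmetic can be sketched as follows, assuming the change in the spatial mean is estimated from paired differences at n revisited sampling points; the sample size and standard deviation are hypothetical, not values from the Loess Plateau study:

        import math

        def minimum_detectable_change(sd_diff, n, z=1.96):
            """Minimum detectable change in a spatial mean between two
            sampling campaigns, assuming paired differences with standard
            deviation sd_diff across n points (two-sided, alpha = 0.05)."""
            return z * sd_diff / math.sqrt(n)

        # Hypothetical: 30 resampled cores, sd of per-point change 4 t/ha/yr.
        print(round(minimum_detectable_change(4.0, 30), 2), "t/ha/yr")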

  17. A conservation planning tool for Greater Sage-grouse using indices of species distribution, resilience, and resistance.

    PubMed

    Ricca, Mark A; Coates, Peter S; Gustafson, K Benjamin; Brussee, Brianne E; Chambers, Jeanne C; Espinosa, Shawn P; Gardner, Scott C; Lisius, Sherri; Ziegler, Pilar; Delehanty, David J; Casazza, Michael L

    2018-06-01

    Managers require quantitative yet tractable tools that identify areas for restoration yielding effective benefits for targeted wildlife species and the ecosystems they inhabit. As a contemporary example of high national significance for conservation, the persistence of Greater Sage-grouse (Centrocercus urophasianus) in the Great Basin is compromised by strongly interacting stressors of conifer expansion, annual grass invasion, and more frequent wildfires occurring in sagebrush ecosystems. Associated restoration treatments to a sagebrush-dominated state are often costly and may yield relatively little ecological benefit to sage-grouse if implemented without estimating how Sage-grouse may respond to treatments, or do not consider underlying processes influencing sagebrush ecosystem resilience to disturbance and resistance to invasive species. Here, we describe example applications of a spatially explicit conservation planning tool (CPT) to inform prioritization of: (1) removal of conifers (i.e., pinyon-juniper); and (2) wildfire restoration aimed at improving habitat conditions for the Bi-State Distinct Population Segment of Sage-grouse along the California-Nevada state line. The CPT measures ecological benefits to sage-grouse for a given management action through a composite index comprised of resource selection functions and estimates of abundance and space use. For pinyon-juniper removal, we simulated changes in land-cover composition following the removal of sparse trees with intact understories, and ranked treatments on the basis of changes in ecological benefits per dollar-unit of cost. For wildfire restoration, we formulated a conditional model to simulate scenarios for land cover changes (e.g., sagebrush to annual grass) given estimated fire severity and underlying ecosystem processes influencing resilience to disturbance and resistance to invasion by annual grasses. For both applications, we compared CPT rankings to land cover changes along with sagebrush resistance and resilience metrics. Model results demonstrated how the CPT can be an important step in identifying management projects that yield the highest quantifiable benefit to Sage-grouse while avoiding costly misallocation of resources, and highlight the importance of considering changes in sage-grouse ecological response and factors influencing sagebrush ecosystem resilience to disturbance and resistance to invasion. This unique framework can be adopted to help inform other management questions aimed at improving habitat for other species across sagebrush and other ecosystems. © 2018 The Authors. Ecological Applications published by Wiley Periodicals, Inc. on behalf of Ecological Society of America.

  18. The cost and management of different types of clinical mastitis in dairy cows estimated by dynamic programming.

    PubMed

    Cha, E; Bar, D; Hertl, J A; Tauer, L W; Bennett, G; González, R N; Schukken, Y H; Welcome, F L; Gröhn, Y T

    2011-09-01

    The objective of this study was to estimate the cost of 3 different types of clinical mastitis (CM) (caused by gram-positive bacteria, gram-negative bacteria, and other organisms) at the individual cow level and thereby identify the economically optimal management decision for each type of mastitis. We made modifications to an existing dynamic optimization and simulation model, studying the effects of various factors (incidence of CM, milk loss, pregnancy rate, and treatment cost) on the cost of different types of CM. The average costs per case (US$) of gram-positive, gram-negative, and other CM were $133.73, $211.03, and $95.31, respectively. This model provided a more informed decision-making process in CM management for optimal economic profitability and determined that 93.1% of gram-positive CM cases, 93.1% of gram-negative CM cases, and 94.6% of other CM cases should be treated. The main contributor to the total cost per case was treatment cost for gram-positive CM (51.5% of the total cost per case), milk loss for gram-negative CM (72.4%), and treatment cost for other CM (49.2%). The model affords versatility as it allows for parameters such as production costs, economic values, and disease frequencies to be altered. Therefore, cost estimates are the direct outcome of the farm-specific parameters entered into the model. Thus, this model can provide farmers economically optimal guidelines specific to their individual cows suffering from different types of CM. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
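
    A worked restatement of the abstract's cost arithmetic; the per-case totals and contributor shares are as quoted, while the breakdown computation itself is our illustration:

        # Average cost per case (US$) and the largest cost contributor,
        # as reported in the abstract, by clinical mastitis (CM) type.
        avg_cost = {"gram-positive": 133.73, "gram-negative": 211.03, "other": 95.31}
        top_share = {"gram-positive": ("treatment", 0.515),
                     "gram-negative": ("milk loss", 0.724),
                     "other": ("treatment", 0.492)}
        for cm_type, total in avg_cost.items():
            driver, share = top_share[cm_type]
            print(f"{cm_type}: ${total:.2f}/case, top driver {driver} ~ ${total * share:.2f}")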

  19. Development of an integrated utilities billing management system for the Navy Public Works Center San Diego, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monsabert, S. de; Lemmer, H.; Dinwiddie, D.

    1995-10-01

    In the past, most buildings, structures, and ship visits were not metered, and flat estimates were calculated based on various estimating techniques. The decomposition process was further complicated by the fact that many of the meters monitor consumption values only and do not provide demand or time-of-use data. This method of billing provides no incentives to the PWC customers to implement energy conservation programs, including load shedding, Energy Monitoring and Control Systems (EMCS), building shell improvements, low-flow toilets and shower heads, efficient lighting systems, or other energy savings alternatives. Similarly, the method had no means of adjustment for seasonal or climatic variations outside of the norm. As an alternative to flat estimates, the Customized Utility Billing Integrated Control (CUBIC) system and the Graphical Data Input System (GDIS) were developed to better allocate utility billings to the major claimant area users based on utilities usage factors, building size, weather data, and hours of operation. GDIS is a graphical database that assists PWC engineers in the development and maintenance of single-line utility diagrams of the facilities and meters. It functions as a drawing associate system and is written in AutoLISP for AutoCAD version 12. GDIS interprets the drawings and provides the facility-to-meter and meter-to-meter hierarchy data that are used by CUBIC to allocate the billings. This paper reviews the design, development and implementation aspects of CUBIC/GDIS and discusses the benefits of this improved utilities management system.

  20. Greenhouse gas emissions from alternative futures of deforestation and agricultural management in the southern Amazon

    PubMed Central

    Galford, Gillian L.; Melillo, Jerry M.; Kicklighter, David W.; Cronin, Timothy W.; Cerri, Carlos E. P.; Mustard, John F.; Cerri, Carlos C.

    2010-01-01

    The Brazilian Amazon is one of the most rapidly developing agricultural areas in the world and represents a potentially large future source of greenhouse gases from land clearing and subsequent agricultural management. In an integrated approach, we estimate the greenhouse gas dynamics of natural ecosystems and agricultural ecosystems after clearing in the context of a future climate. We examine scenarios of deforestation and postclearing land use to estimate the future (2006–2050) impacts on carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) emissions from the agricultural frontier state of Mato Grosso, using a process-based biogeochemistry model, the Terrestrial Ecosystems Model (TEM). We estimate a net emission of greenhouse gases from Mato Grosso, ranging from 2.8 to 15.9 Pg CO2-equivalents (CO2-e) from 2006 to 2050. Deforestation is the largest source of greenhouse gas emissions over this period, but land uses following clearing account for a substantial portion (24–49%) of the net greenhouse gas budget. Due to land-cover and land-use change, there is a small foregone carbon sequestration of 0.2–0.4 Pg CO2-e by natural forests and cerrado between 2006 and 2050. Both deforestation and future land-use management play important roles in the net greenhouse gas emissions of this frontier, suggesting that both should be considered in emissions policies. We find that avoided deforestation remains the best strategy for minimizing future greenhouse gas emissions from Mato Grosso. PMID:20651250

  1. Predicting foraging wading bird populations in Everglades National Park from seasonal hydrologic statistics under different management scenarios

    NASA Astrophysics Data System (ADS)

    Kwon, Hyun-Han; Lall, Upmanu; Engel, Vic

    2011-09-01

    The ability to map relationships between ecological outcomes and hydrologic conditions in the Everglades National Park (ENP) is a key building block for their restoration program, a primary goal of which is to improve conditions for wading birds. This paper presents a model linking wading bird foraging numbers to hydrologic conditions in the ENP. Seasonal hydrologic statistics derived from a single water level recorder are well correlated with water depths throughout most areas of the ENP, and are effective as predictors of wading bird numbers when using a nonlinear hierarchical Bayesian model to estimate the conditional distribution of bird populations. Model parameters are estimated using a Markov chain Monte Carlo (MCMC) procedure. Parameter and model uncertainty is assessed as a byproduct of the estimation process. Water depths at the beginning of the nesting season, the average dry season water level, and the numbers of reversals from the dry season recession are identified as significant predictors, consistent with the hydrologic conditions considered important in the production and concentration of prey organisms in this system. Long-term hydrologic records at the index location allow for a retrospective analysis (1952-2006) of foraging bird numbers showing low frequency oscillations in response to decadal fluctuations in hydroclimatic conditions. Simulations of water levels at the index location used in the Bayesian model under alternative water management scenarios allow the posterior probability distributions of the number of foraging birds to be compared, thus providing a mechanism for linking management schemes to seasonal rainfall forecasts.
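
    A compact stand-in for the kind of count model involved, assuming a Poisson likelihood with a log link on standardized seasonal hydrologic predictors; the paper fits a nonlinear hierarchical Bayesian model by MCMC, and the data below are fabricated for illustration:

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical seasonal predictors: start-of-season depth (cm), mean
        # dry-season depth (cm), and number of recession reversals.
        X = np.array([[20.0, 14.0, 1.0], [35.0, 25.0, 3.0], [15.0, 10.0, 0.0],
                      [28.0, 18.0, 2.0], [22.0, 15.0, 1.0]])
        y = np.array([5200.0, 1800.0, 7400.0, 3100.0, 4900.0])  # foraging counts
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize for stability

        def negloglik(beta):
            """Negative Poisson log-likelihood with log link (constant dropped)."""
            eta = beta[0] + Xs @ beta[1:]
            return np.sum(np.exp(eta) - y * eta)

        fit = minimize(negloglik, x0=np.zeros(4), method="BFGS")
        print("coefficients:", np.round(fit.x, 3))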

  2. Sources, distribution and export coefficient of phosphorus in lowland polders of Lake Taihu Basin, China.

    PubMed

    Huang, Jiacong; Gao, Junfeng; Jiang, Yong; Yin, Hongbin; Amiri, Bahman Jabbarian

    2017-12-01

    Identifying phosphorus (P) sources, distribution and export from lowland polders is important for P pollution management; it is challenging, however, due to the high complexity of hydrological and P transport processes in lowland areas. In this study, the spatial pattern and temporal dynamics of the P export coefficient (PEC) from all 2539 polders in Lake Taihu Basin, China were estimated using a coupled P model describing P dynamics in a polder system. The estimated amount of P export from polders in Lake Taihu Basin during 2013 was 1916.2 t/yr, with a spatially-averaged PEC of 1.8 kg/ha/yr. PEC peaked (at more than 4.0 kg/ha/yr) in the polders near or within the large cities, and was high during the rice-cropping season. Sensitivity analysis based on the coupled P model revealed that the sensitive factors controlling PEC varied spatially and changed through time. Precipitation and air temperature were the most sensitive factors controlling PEC; culvert control and fertilization were sensitive factors during some periods. This study demonstrated an estimation of PEC from 2539 polders in Lake Taihu Basin, and an identification of the sensitive environmental factors affecting it. Investigating polder P export at the watershed scale helps water managers learn the distribution of P sources, identify key P sources, and thus achieve best management practice in controlling P export from lowland areas. Copyright © 2017 Elsevier Ltd. All rights reserved.
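
    A quick back-of-the-envelope consistency check on the quoted figures, assuming the spatial average is simply the total load divided by total polder area:

        # 1,916.2 t/yr of P export at a spatially averaged coefficient of
        # 1.8 kg/ha/yr implies the 2,539 polders cover roughly 1.06 Mha.
        load_kg_per_yr = 1916.2 * 1000.0
        pec_kg_per_ha_yr = 1.8
        print(f"implied polder area: {load_kg_per_yr / pec_kg_per_ha_yr:,.0f} ha")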

  3. Assimilation of Remotely-Sensed Snow information to improve streamflow predictions in the Southwestern US

    NASA Astrophysics Data System (ADS)

    López-Burgos, V.; Rajagopal, S.; Martinez Baquero, G. F.; Gupta, H. V.

    2009-12-01

    Rapidly growing population in the southwestern US is leading to increasing demand and decreasing availability of water, requiring a detailed quantification of hydrological processes. The integration of detailed spatial information on water fluxes from remote sensing platforms and hydrological models coupled with ground-based data is an important step towards this goal. This project is exploring the use of Snow Water Equivalent (SWE) estimates to update the snow component of the Variable Infiltration Capacity model (VIC). SWE estimates are obtained by combining SNOTEL data with MODIS Snow Cover Area (SCA) information. Because cloud cover corrupts the estimates of SCA, a rule-based method is used to clean up the remotely sensed images. The rules include a time-interpolation method and an estimate of the probability that a pixel is snow-covered, based on the relationships between elevation, temperature, lapse rate, aspect, and topographic shading. The approach is used to improve streamflow predictions on two rivers managed by the Salt River Project, a water and energy supplier in central Arizona. This solution will help improve the management of reservoirs on the Salt and Verde Rivers (tributaries of the lower Colorado River basin) near Phoenix, Arizona, by incorporating physically based distributed models and remote sensing observations into their Decision Support Tools and planning tools. This research seeks to increase the knowledge base used to manage reservoirs and groundwater resources in a region affected by long-term drought. It will be applicable and relevant for other water utility companies facing the challenges of climate change and decreasing water resources.
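
    The rule-based gap filling described above might look like the following toy sketch, where the function name, the snowline threshold, and the decision order are our illustrative assumptions rather than the project's actual rules:

        def fill_cloudy_pixel(prev_obs, next_obs, elevation, snowline_m):
            """Rule-based gap fill for a cloud-obscured SCA pixel: trust the
            temporal neighbors when they agree, otherwise fall back to an
            elevation-based snow-probability rule."""
            if prev_obs is not None and prev_obs == next_obs:
                return prev_obs                       # time-interpolation rule
            return 1 if elevation >= snowline_m else 0  # elevation rule

        # Hypothetical pixel: neighbors disagree, 2350 m elevation, 2200 m snowline.
        print(fill_cloudy_pixel(1, 0, 2350.0, 2200.0))  # -> 1 (snow)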

  4. Methodology to Estimate the Quantity, Composition, and Management of Construction and Demolition Debris in the United States

    EPA Science Inventory

    This report, Methodology to Estimate the Quantity, Composition and Management of Construction and Demolition Debris in the US, was developed to expand access to data on CDD in the US and to support research on CDD and sustainable materials management. Since past US EPA CDD estima...

  5. Physician Practice Participation in Accountable Care Organizations: The Emergence of the Unicorn

    PubMed Central

    Shortell, Stephen M; McClellan, Sean R; Ramsay, Patricia P; Casalino, Lawrence P; Ryan, Andrew M; Copeland, Kennon R

    2014-01-01

    Objective To provide the first nationally based information on physician practice involvement in ACOs. Data Sources/Study Setting Primary data from the third National Survey of Physician Organizations (January 2012–May 2013). Study Design We conducted a 40-minute phone survey in a sample of physician practices. A nationally representative sample of practices was surveyed in order to provide estimates of organizational characteristics, care management processes, ACO participation, and related variables for four major chronic illnesses. Data Collection/Extraction Methods We evaluated the associations between ACO participation, organizational characteristics, and a 25-point index of patient-centered medical home processes. Principal Findings We found that 23.7 percent of physician practices (n = 280) reported joining an ACO; 15.7 percent (n = 186) were planning to become involved within the next 12 months and 60.6 percent (n = 717) reported no involvement and no plans to become involved. Larger practices, those receiving patients from an IPA and/or PHO, those that were physician-owned versus hospital/health system-owned, those located in New England, and those with greater patient-centered medical home (PCMH) care management processes were more likely to have joined an ACO. Conclusions Physician practices that are currently participating in ACOs appear to be relatively large, or to be members of an IPA or PHO, are less likely to be hospital-owned and are more likely to use more care management processes than nonparticipating practices. PMID:24628449

  6. Reliability of Fault Tolerant Control Systems. Part 2

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva

    2000-01-01

    This paper reports Part II of a two part effort that is intended to delineate the relationship between reliability and fault tolerant control in a quantitative manner. Reliability properties peculiar to fault-tolerant control systems are emphasized, such as the presence of analytic redundancy in high proportion, the dependence of failures on control performance, and high risks associated with decisions in redundancy management due to multiple sources of uncertainties and sometimes large processing requirements. As a consequence, coverage of failures through redundancy management can be severely limited. The paper proposes to formulate the fault tolerant control problem as an optimization problem that maximizes coverage of failures through redundancy management. Coverage modeling is attempted in a way that captures its dependence on the control performance and on the diagnostic resolution. Under the proposed redundancy management policy, it is shown that an enhanced overall system reliability can be achieved with a control law of a superior robustness, with an estimator of a higher resolution, and with a control performance requirement of a lesser stringency.

  7. A "total parameter estimation" method in the varification of distributed hydrological models

    NASA Astrophysics Data System (ADS)

    Wang, M.; Qin, D.; Wang, H.

    2011-12-01

    Conventionally, hydrological models are used for runoff or flood forecasting, so model parameters are commonly estimated from discharge measurements at the catchment outlets. With the advancement of hydrological sciences and computer technology, distributed hydrological models based on physical mechanisms, such as SWAT, MIKE SHE, and WEP, have gradually become the mainstream models in the hydrological sciences. However, the assessment of distributed hydrological models and the determination of model parameters still rely on runoff and, occasionally, groundwater-level measurements. It is essential in many countries, including China, to understand the local and regional water cycle: we need not only to simulate the runoff generation process for flood forecasting in wet areas, but also to grasp the water cycle pathways and the consumption and transformation processes in arid and semi-arid regions for conservation and integrated water resources management. Because a distributed hydrological model can simulate the physical processes within a catchment, it can provide a more realistic representation of the actual water cycle. Runoff is the combined result of various hydrological processes, so using runoff alone for parameter estimation is inherently problematic, and the accuracy of the resulting parameters is difficult to assess. In particular, in arid areas such as the Haihe River Basin in China, runoff accounts for only 17% of rainfall and is concentrated during the rainy season from June to August each year; during other months, many of the perennial rivers within the basin dry up. Thus runoff simulation alone does not fully exploit a distributed hydrological model in arid and semi-arid regions. This paper proposes a "total parameter estimation" method that verifies distributed hydrological models against multiple water cycle processes, including runoff, evapotranspiration, groundwater, and soil water, and applies it to the Haihe River Basin in China. The application results demonstrate that this comprehensive testing method is very useful in the development of a distributed hydrological model and provides a new way of thinking in the hydrological sciences.
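
    "Total parameter estimation" in this sense amounts to scoring a model against several water-cycle observations at once. A minimal sketch of such a combined objective, with invented observation and simulation series and an arbitrary range-based normalization (the paper does not prescribe this particular formula):

        def total_parameter_error(sim, obs, weights=None):
            """Combine normalized RMSE across several water-cycle processes
            (runoff, ET, groundwater level, soil water) into one score."""
            weights = weights or {k: 1.0 for k in obs}
            score, wsum = 0.0, 0.0
            for proc, series in obs.items():
                s, o = sim[proc], series
                rmse = (sum((a - b) ** 2 for a, b in zip(s, o)) / len(o)) ** 0.5
                nrmse = rmse / (max(o) - min(o))  # make processes comparable
                score += weights[proc] * nrmse
                wsum += weights[proc]
            return score / wsum

        obs = {"runoff": [1.2, 0.8, 0.1], "et": [3.1, 3.4, 2.9],
               "groundwater": [12.0, 11.8, 11.5], "soil_water": [0.25, 0.22, 0.20]}
        sim = {"runoff": [1.0, 0.9, 0.2], "et": [3.0, 3.5, 3.1],
               "groundwater": [12.1, 11.9, 11.7], "soil_water": [0.24, 0.23, 0.21]}
        print(round(total_parameter_error(sim, obs), 3))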

  8. Geospatial Investigation into Groundwater Pollution and Water Quality Supported by Satellite Data: A Case Study from the Evros River (Eastern Mediterranean)

    NASA Astrophysics Data System (ADS)

    Elias, Dimitriou; Angeliki, Mentzafou; Vasiliki, Markogianni; Maria, Tzortziou; Christina, Zeri

    2014-06-01

    Managing water resources, in terms of both quality and quantity, in transboundary rivers is a difficult and challenging task that requires efficient cross-border cooperation and transparency. Groundwater pollution risk assessment and mapping techniques over the full catchment area are important tools that could be used as part of these water resource management efforts, to estimate pollution pressures and optimize land planning processes. The Evros is the second largest river in Eastern Europe, and its catchment sustains a population of 3.6 million people in three different countries (Bulgaria, Turkey and Greece). This study provides detailed information on the main pollution sources and pressures in the Evros catchment and, for the first time, applies, assesses and evaluates a groundwater pollution risk mapping technique using satellite observations (Landsat NDVI) and an extensive dataset of field measurements covering different seasons and multiple years. We found that approximately 40 % of the Greek part of the Evros catchment is characterized as of high or very high pollution risk, while 14 % of the study area is classified as of moderate risk. Both the modeled and measured water quality status of the river showed large spatiotemporal variations consistent with the strong anthropogenic pressures in this system, especially on the northern and central segments of the catchment. The pollutants identified illustrate inputs of agrochemicals and urban wastes in the river. High correlation coefficients (R between 0.79 and 0.85) were found between estimated pollution risks and measured concentrations of those chemical parameters that are mainly attributed to anthropogenic activities rather than in situ biogeochemical processes. The pollution risk method described here could be used elsewhere as a decision support tool for mitigating the impact of hazardous human activities and improving the management of groundwater resources.

  9. Heritage and Advanced Technology Systems Engineering Lessons Learned from NASA Space Missions

    NASA Technical Reports Server (NTRS)

    Barley, Bryan; Newhouse, Marilyn; Bacskay, Allen

    2010-01-01

    Use of heritage and new technology is necessary to enable small, low-cost missions, yet overruns decrease the ability to sustain future mission flight rates. The majority of the cost growth drivers seen in the D&NF study were embedded early, during the formulation phase, and realized later, during the development and I&T phases. Cost drivers can be avoided or significantly decreased by project management and SE emphasis on early identification of risks and realistic analyses: SE processes that emphasize an assessment of technology within the mission system to identify technical issues in the design or operational use of the technology; realistic assessment of new and heritage spacecraft technology assumptions, with identification of risks and mitigation strategies; realistic estimates of the effort required to inherit existing or qualify new technology, with identification of risks to those estimates and development of mitigation strategies; allocation of project reserves for risk-based mitigation strategies in each individual area of heritage or new technology; and careful tailoring of inheritance processes to ensure due diligence.

  10. Leveraging Real-World Evidence in Disease-Management Decision-Making with a Total Cost of Care Estimator.

    PubMed

    Nguyen, Thanh-Nghia; Trocio, Jeffrey; Kowal, Stacey; Ferrufino, Cheryl P; Munakata, Julie; South, Dell

    2016-12-01

    Health management is becoming increasingly complex, given a range of care options and the need to balance costs and quality. The ability to measure and understand drivers of costs is critical for healthcare organizations to effectively manage their patient populations. Healthcare decision makers can leverage real-world evidence to explore the value of disease-management interventions in shifting total cost trends. To develop a real-world, evidence-based estimator that examines the impact of disease-management interventions on the total cost of care (TCoC) for a patient population with nonvalvular atrial fibrillation (NVAF). Data were collected from a patient-level real-world evidence data set that uses the IMS PharMetrics Health Plan Claims Database. Pharmacy and medical claims for patients meeting the inclusion or exclusion criteria were combined in longitudinal cohorts with a 180-day preindex and 360-day follow-up period. Descriptive statistics, such as mean and median patient costs and event rates, were derived from a real-world evidence analysis and were used to populate the base-case estimates within the TCoC estimator, an exploratory economic model that was designed to estimate the potential impact of several disease-management activities on the TCoC for a patient population with NVAF. Using Microsoft Excel, the estimator is designed to compare current direct costs of medical care to projected costs by varying assumptions on the impact of disease-management activities and applying the associated changes in cost trends to the affected populations. Disease-management levers are derived from literature-based concepts affecting costs along the NVAF disease continuum. The use of the estimator supports analyses across 4 US geographic regions, age, cost types, and care settings during 1 year. All patients included in the study were continuously enrolled in their health plan (within the IMS PharMetrics Health Plan Claims Database) between July 1, 2010, and June 30, 2012. Patients were included in the final analytic file and were indexed based on (1) the service date of the first claim within the selection window (December 28, 2010-July 11, 2011) with a diagnosis of NVAF, or (2) the service date of the second claim for an NVAF medication of interest during the same selection window. The model estimates the current trends in national benchmark data for a hypothetical health plan with 1 million covered lives. The annual total direct healthcare costs (allowable and patient out-of-pocket costs) of managing patients with NVAF in this hypothetical plan are estimated at $184,981,245 ($25,754 per patient, for 7183 patients). A potential 25% improvement from the base-case disease burden and disease management could translate into TCoC savings from reducing the excess costs related to hypertension (-5.3%) and supporting the use of an appropriate antithrombotic treatment that prevents ischemic stroke (-0.7%) and reduces bleeding events (-0.1%). The use of the TCoC estimator supports population health management by providing real-world evidence benchmark data on NVAF disease burden and by quantifying the potential value of disease-management activities in shifting cost trends.
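
    The headline numbers can be reproduced with simple arithmetic; the lever shares below are the abstract's quoted percentages, applied here as fractions of the total cost of care:

        # Hypothetical plan quoted in the abstract: 1 million covered lives,
        # 7,183 NVAF patients, $184,981,245 annual total direct cost.
        total_cost = 184_981_245.0
        per_patient = total_cost / 7183
        print(f"per patient: ${per_patient:,.0f}")  # ~ $25,754

        # Savings levers quoted for a 25% improvement in disease management.
        levers = {"hypertension excess cost": 0.053,
                  "ischemic stroke prevention": 0.007,
                  "bleeding-event reduction": 0.001}
        for name, share in levers.items():
            print(f"{name}: ${total_cost * share:,.0f}")
        print(f"combined: ${total_cost * sum(levers.values()):,.0f}")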

  11. Leveraging Real-World Evidence in Disease-Management Decision-Making with a Total Cost of Care Estimator

    PubMed Central

    Nguyen, Thanh-Nghia; Trocio, Jeffrey; Kowal, Stacey; Ferrufino, Cheryl P.; Munakata, Julie; South, Dell

    2016-01-01

    Background Health management is becoming increasingly complex, given a range of care options and the need to balance costs and quality. The ability to measure and understand drivers of costs is critical for healthcare organizations to effectively manage their patient populations. Healthcare decision makers can leverage real-world evidence to explore the value of disease-management interventions in shifting total cost trends. Objective To develop a real-world, evidence-based estimator that examines the impact of disease-management interventions on the total cost of care (TCoC) for a patient population with nonvalvular atrial fibrillation (NVAF). Methods Data were collected from a patient-level real-world evidence data set that uses the IMS PharMetrics Health Plan Claims Database. Pharmacy and medical claims for patients meeting the inclusion or exclusion criteria were combined in longitudinal cohorts with a 180-day preindex and 360-day follow-up period. Descriptive statistics, such as mean and median patient costs and event rates, were derived from a real-world evidence analysis and were used to populate the base-case estimates within the TCoC estimator, an exploratory economic model that was designed to estimate the potential impact of several disease-management activities on the TCoC for a patient population with NVAF. Using Microsoft Excel, the estimator is designed to compare current direct costs of medical care to projected costs by varying assumptions on the impact of disease-management activities and applying the associated changes in cost trends to the affected populations. Disease-management levers are derived from literature-based concepts affecting costs along the NVAF disease continuum. The use of the estimator supports analyses across 4 US geographic regions, age, cost types, and care settings during 1 year. Results All patients included in the study were continuously enrolled in their health plan (within the IMS PharMetrics Health Plan Claims Database) between July 1, 2010, and June 30, 2012. Patients were included in the final analytic file and were indexed based on (1) the service date of the first claim within the selection window (December 28, 2010-July 11, 2011) with a diagnosis of NVAF, or (2) the service date of the second claim for an NVAF medication of interest during the same selection window. The model estimates the current trends in national benchmark data for a hypothetical health plan with 1 million covered lives. The annual total direct healthcare costs (allowable and patient out-of-pocket costs) of managing patients with NVAF in this hypothetical plan are estimated at $184,981,245 ($25,754 per patient, for 7183 patients). A potential 25% improvement from the base-case disease burden and disease management could translate into TCoC savings from reducing the excess costs related to hypertension (−5.3%) and supporting the use of an appropriate antithrombotic treatment that prevents ischemic stroke (−0.7%) and reduces bleeding events (−0.1%). Conclusions The use of the TCoC estimator supports population health management by providing real-world evidence benchmark data on NVAF disease burden and by quantifying the potential value of disease-management activities in shifting cost trends. PMID:28465775

  12. Distributed visualization of gridded geophysical data: the Carbon Data Explorer, version 0.2.3

    NASA Astrophysics Data System (ADS)

    Endsley, K. A.; Billmire, M. G.

    2016-01-01

    Due to the proliferation of geophysical models, particularly climate models, the increasing resolution of their spatiotemporal estimates of Earth system processes, and the desire to easily share results with collaborators, there is a genuine need for tools to manage, aggregate, visualize, and share data sets. We present a new, web-based software tool - the Carbon Data Explorer - that provides these capabilities for gridded geophysical data sets. While originally developed for visualizing carbon flux, this tool can accommodate any time-varying, spatially explicit scientific data set, particularly NASA Earth system science level III products. In addition, the tool's open-source licensing and web presence facilitate distributed scientific visualization, comparison with other data sets and uncertainty estimates, and data publishing and distribution.

  13. Scalable Motion Estimation Processor Core for Multimedia System-on-Chip Applications

    NASA Astrophysics Data System (ADS)

    Lai, Yeong-Kang; Hsieh, Tian-En; Chen, Lien-Fei

    2007-04-01

    In this paper, we describe a high-throughput and scalable motion estimation processor architecture for multimedia system-on-chip applications. The number of processing elements (PEs) is scalable according to the variable algorithm parameters and the performance required for different applications. Using the PE rings efficiently and an intelligent memory-interleaving organization, the efficiency of the architecture can be increased. Moreover, using efficient on-chip memories and a data management technique can effectively decrease the power consumption and memory bandwidth. Techniques for reducing the number of interconnections and external memory accesses are also presented. Our results demonstrate that the proposed scalable PE-ringed architecture is a flexible and high-performance processor core in multimedia system-on-chip applications.
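
    Motion estimation of the kind this processor accelerates reduces, per block, to a sum-of-absolute-differences (SAD) search. A plain-software reference version (exhaustive full search, not the paper's PE-ring architecture) might look like the following, with all parameter values chosen only for illustration:

        import numpy as np

        def best_motion_vector(ref, cur, bx, by, bsize=8, srange=4):
            """Exhaustive block matching: find the displacement of the block
            at (by, bx) in `cur` that minimizes the SAD against `ref`
            inside a +/- srange search window."""
            block = cur[by:by + bsize, bx:bx + bsize].astype(int)
            best, best_mv = None, (0, 0)
            for dy in range(-srange, srange + 1):
                for dx in range(-srange, srange + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + bsize > ref.shape[0] or x + bsize > ref.shape[1]:
                        continue
                    sad = np.abs(ref[y:y + bsize, x:x + bsize].astype(int) - block).sum()
                    if best is None or sad < best:
                        best, best_mv = sad, (dy, dx)
            return best_mv, best

        rng = np.random.default_rng(0)
        ref = rng.integers(0, 256, (32, 32))
        cur = np.roll(ref, (1, 2), axis=(0, 1))  # synthetic 1-down, 2-right motion
        print(best_motion_vector(ref, cur, bx=12, by=12))

    The hardware contribution lies in how this inner loop is parallelized across PE rings and fed from interleaved on-chip memory, not in the arithmetic itself.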

  14. Ranking landscape development scenarios affecting natterjack toad (Bufo calamita) population dynamics in Central Poland.

    PubMed

    Franz, Kamila W; Romanowski, Jerzy; Johst, Karin; Grimm, Volker

    2013-01-01

    When data are limited it is difficult for conservation managers to assess alternative management scenarios and make decisions. The natterjack toad (Bufo calamita) is declining at the edges of its distribution range in Europe and little is known about its current distribution and abundance in Poland. Although different landscape management plans for central Poland exist, it is unclear to what extent they impact this species. Based on these plans, we investigated how four alternative landscape development scenarios would affect the total carrying capacity and population dynamics of the natterjack toad. To facilitate decision-making, we first ranked the scenarios according to their total carrying capacity. We used the software RAMAS GIS to determine the size and location of habitat patches in the landscape. The estimated carrying capacities were very similar for each scenario, and clear ranking was not possible. Only the reforestation scenario showed a marked loss in carrying capacity. We therefore simulated metapopulation dynamics with RAMAS taking into account dynamical processes such as reproduction and dispersal and ranked the scenarios according to the resulting species abundance. In this case, we could clearly rank the development scenarios. We identified road mortality of adults as a key process governing the dynamics and separating the different scenarios. The renaturalisation scenario clearly ranked highest due to its decreased road mortality. Taken together our results suggest that road infrastructure development might be much more important for natterjack toad conservation than changes in the amount of habitat in the semi-natural river valley. We gained these insights by considering both the resulting metapopulation structure and dynamics in the form of a PVA. We conclude that the consideration of dynamic processes in amphibian conservation management may be indispensable for ranking management scenarios.

  15. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    NASA Technical Reports Server (NTRS)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    The theory of System Health Management (SHM) and of its operational subset Fault Management (FM) states that FM is implemented as a "meta" control loop, known as an FM Control Loop (FMCL). The FMCL detects that all or part of a system is now failed, or in the future will fail (that is, cannot be controlled within acceptable limits to achieve its objectives), and takes a control action (a response) to return the system to a controllable state. In terms of control theory, the effectiveness of each FMCL is estimated based on its ability to correctly estimate the system state, and on the speed of its response to the current or impending failure effects. This paper describes how this theory has been successfully applied on the National Aeronautics and Space Administration's (NASA) Space Launch System (SLS) Program to quantitatively estimate the effectiveness of proposed abort triggers so as to select the most effective suite to protect the astronauts from catastrophic failure of the SLS. The premise behind this process is to quantitatively provide the value-versus-risk trade-off for any given abort trigger, allowing decision makers to make more informed decisions. All current and planned crewed launch vehicles have some form of vehicle health management system integrated with an emergency launch abort system to ensure crew safety. While the design can vary, the underlying principle is the same: detect imminent catastrophic vehicle failure, initiate launch abort, and extract the crew to safety. Abort triggers are the detection mechanisms that identify that a catastrophic launch vehicle failure is occurring or is imminent and initiate a notification to the crew vehicle that the escape system must be activated. While ensuring that the abort triggers provide this function, designers must also ensure that the abort triggers do not signal that a catastrophic failure is imminent when in fact the launch vehicle can successfully achieve orbit. That is, the abort triggers must have low false negative rates, to be sure that real crew-threatening failures are detected, and also low false positive rates, to ensure that the crew does not abort from non-crew-threatening launch vehicle behaviors. The analysis process described in this paper is a compilation of over six years of lessons learned and refinements from experiences developing abort triggers for NASA's Constellation Program (Ares I Project) and the SLS Program, as well as the simultaneous development of SHM/FM theory. The paper describes the abort analysis concepts and process, developed in conjunction with SLS Safety and Mission Assurance (S&MA), to define a common set of mission phase, failure scenario, and Loss of Mission Environment (LOME) combinations upon which the SLS Loss of Mission (LOM) Probabilistic Risk Assessment (PRA) models are built. This abort analysis also requires strong coordination with the Multi-Purpose Crew Vehicle (MPCV) and SLS Structures and Environments (STE) teams to formulate a series of abortability tables that encapsulate explosion dynamics over the ascent mission phase. The design and assessment of abort conditions and triggers to estimate their Loss of Crew (LOC) benefits also requires in-depth integration with other groups, including Avionics; Guidance, Navigation and Control (GN&C); the Crew Office; Mission Operations; and Ground Systems. The outputs of this analysis are a critical input to SLS S&MA's LOC PRA models. The process described here may well be the first full quantitative application of SHM/FM theory to the selection of a sensor suite for any aerospace system.

  16. Reengineering the Project Design Process

    NASA Technical Reports Server (NTRS)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  17. Variation in agricultural CO2 fluxes during the growing season, collected from more than ten eddy covariance towers in the Mississippi Delta Region

    NASA Astrophysics Data System (ADS)

    Runkle, B.; Suvocarev, K.; Reba, M. L.; Novick, K. A.; White, P.; Anapalli, S.; Locke, M. A.; Rigby, J.; Bhattacharjee, J.

    2016-12-01

    Agriculture is unique as an anthropogenic activity that plays both a large role in carbon and water cycling and whose management activities provide a key opportunity for responses to climate change. It is therefore especially crucial to bring field observations into the modeling community, test remote sensing products, encourage policy debate, and enable carbon offsets markets that generate revenue and fund climate-smart activities. The accurate measurement of agricultural CO2 exchange - both primary productivity and ecosystem respiration - in concert with evapotranspiration provides crucial information on agro-ecosystem functioning and improves our predictive capacity for estimating the impacts of climate change. In this study we report field measurements from more than 10 eddy covariance towers in the Lower Mississippi River Basin taken during the summer months of 2016. Many towers, some recently deployed, are being aggregated into a regional network known as Delta-Flux, which will ultimately include 15-20 towers by 2017. Set in and around the Mississippi Delta Region within Louisiana, Arkansas, and Mississippi, the network will collect flux, micrometeorological, and crop yield data in order to construct estimates of regional CO2 exchange. These time-series data are gap-filled using statistical and process-based models to generate estimates of summer CO2 flux. The tower network is comprised of sites representing widespread agriculture production, including rice, cotton, corn, soybean, and sugarcane; intensively managed pine forest; and bottomland hardwood forest. Unique experimental production practices are represented in the network and include restricted water use, bioenergy, and by-product utilization. Several towers compose multi-field sites testing innovative irrigation or management practices. Current mapping of agricultural carbon exchange - based on land cover layers and fixed crop emission factors - suggests an unconstrained carbon flux estimate in this region. The observations from the Delta-Flux network will significantly constrain the multi-state C budget and provide guidance for regional conservation efforts. We include implications for regional carbon modeling, sustainable agricultural management, crop and land use cover changes, and responses to a warming climate.

  18. Mapping cropland GPP in the north temperate region with space measurements of chlorophyll fluorescence

    NASA Astrophysics Data System (ADS)

    Guanter, L.; Zhang, Y.; Jung, M.; Joiner, J.; Voigt, M.; Huete, A. R.; Zarco-Tejada, P.; Frankenberg, C.; Lee, J.; Berry, J. A.; Moran, S. M.; Ponce-Campos, G.; Beer, C.; Camps-Valls, G.; Buchmann, N. C.; Gianelle, D.; Klumpp, K.; Cescatti, A.; Baker, J. M.; Griffis, T.

    2013-12-01

    Monitoring agricultural productivity is important for optimizing management practices in a world under a continuous increase of food and biofuel demand. We used new space measurements of sun-induced chlorophyll fluorescence (SIF), a vegetation parameter intrinsically linked to photosynthesis, to capture photosynthetic uptake of the crop belts in the north temperate region. The following data streams and procedures have been used in this analysis: (1) SIF retrievals have been derived from measurements of the MetOp-A / GOME-2 instrument in the 2007-2011 time period; (2) ensembles of process-based and data-driven biogeochemistry models have been analyzed in order to assess the capability of global models to represent crop gross primary production (GPP); (3) flux tower-based GPP estimates covering the 2007-2011 time period have been extracted over 18 cropland and grassland sites in the Midwest US and Western Europe from the Ameriflux and the European Fluxes Database networks; (4) large-scale NPP estimates have been derived by the agricultural inventory data sets developed by USDA-NASS and Monfreda et al. The strong linear correlation between the SIF space retrievals and the flux tower-based GPP, found to be significantly higher than that between reflectance-based vegetation indices (EVI, NDVI and MTCI) and GPP, has enabled the direct upscaling of SIF to cropland GPP maps at the synoptic scale. The new crop GPP estimates we derive from the scaling of SIF space retrievals are consistent with both flux tower GPP estimates and agricultural inventory data. These new GPP estimates show that crop productivity in the US Western Corn Belt, and most likely also in the rice production areas in the Indo-Gangetic plain and China, is up to 50-75% higher than estimates by state-of-the-art data-driven and process-oriented biogeochemistry models. From our analysis we conclude that current carbon models have difficulties in reproducing the special conditions of those highly productive crops subject to an intense management. Observational inputs closely linked to physiological condition and the photosynthetic dynamics of the vegetation, such as the fluorescence measurements presented in this study, can be an essential complement to existing models and remotely-sensed observations for the evaluation of global agricultural yields.
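
    The upscaling step rests on the reported strong linear SIF-GPP relationship; schematically it is an ordinary least-squares fit, with the paired values below fabricated purely for illustration:

        import numpy as np

        # Hypothetical tower-matched pairs: GOME-2 SIF (mW/m^2/sr/nm)
        # versus flux-tower GPP (gC/m^2/day).
        sif = np.array([0.8, 1.4, 2.1, 2.6, 3.2])
        gpp = np.array([4.0, 7.1, 10.4, 13.0, 15.8])
        slope, intercept = np.polyfit(sif, gpp, 1)
        print(f"GPP ~ {slope:.2f} * SIF + {intercept:.2f}")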

  19. Detecting Potential Water Quality Issues by Mapping Trophic Status Using Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Nguy-Robertson, A. L.; Harvey, K.; Huening, V.; Robinson, H.

    2017-12-01

    The identification, timing, and spatial distribution of recurrent algal blooms and aquatic vegetation can help water managers and policy makers make better water resource decisions. In many parts of the world there is little monitoring or reporting of water quality due to the costs and effort required to collect and process water samples. We propose to use Google Earth Engine to quickly identify the recurrence of trophic states in global inland water systems. Utilizing Landsat and Sentinel multispectral imagery, inland water quality parameters (i.e., chlorophyll a concentration) can be estimated and waters can be classified by trophic state: oligotrophic, mesotrophic, eutrophic, and hypereutrophic. The recurrence of eutrophic and hypereutrophic observations can highlight potentially problematic locations where algal blooms or aquatic vegetation occur routinely. Eutrophic and hypereutrophic waters commonly include many harmful algal blooms and waters prone to fish die-offs from hypoxia. While these maps may be limited by the accuracy of the algorithms used to estimate chlorophyll a, relative comparisons at a local scale can help water managers focus limited resources.
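
    A minimal classifier of the kind implied, assuming commonly cited chlorophyll-a boundaries; an operational Earth Engine product would use locally validated thresholds applied per pixel:

        def trophic_state(chl_a_ug_per_l):
            """Classify a water observation by chlorophyll-a concentration
            (ug/L). Thresholds are illustrative, not from this abstract."""
            if chl_a_ug_per_l < 2.6:
                return "oligotrophic"
            if chl_a_ug_per_l < 7.3:
                return "mesotrophic"
            if chl_a_ug_per_l < 56.0:
                return "eutrophic"
            return "hypereutrophic"

        print([trophic_state(c) for c in (1.0, 5.0, 20.0, 80.0)])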

  20. The Safe Yield and Climatic Variability: Implications for Groundwater Management.

    PubMed

    Loáiciga, Hugo A

    2017-05-01

    Methods for calculating the safe yield are evaluated in this paper using a high-quality and long historical data set of groundwater recharge, discharge, extraction, and precipitation in a karst aquifer. Consideration is given to the role that climatic variability has on the determination of a climatically representative period with which to evaluate the safe yield. The methods employed to estimate the safe yield are consistent with its definition as a long-term average extraction rate that avoids adverse impacts on groundwater. The safe yield is a useful baseline for groundwater planning; yet, it is herein shown that it is not an operational rule that works well under all climatic conditions. This paper shows that due to the nature of dynamic groundwater processes it may be most appropriate to use an adaptive groundwater management strategy that links groundwater extraction rates to groundwater discharge rates, thus achieving a safe yield that represents an estimated long-term sustainable yield. An example of the calculation of the safe yield of the Edwards Aquifer (Texas) demonstrates that it is about one-half of the average annual recharge. © 2016, National Ground Water Association.
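
    The adaptive strategy the paper argues for can be caricatured in a few lines, linking allowed pumping to observed groundwater discharge; the fraction and cap below are illustrative, not values from the Edwards Aquifer analysis:

        def adaptive_extraction(discharge, fraction=0.5, cap=None):
            """Adaptive pumping rule: tie the allowed extraction rate to
            observed groundwater discharge rather than to a fixed long-term
            safe yield, optionally capped at a planning ceiling."""
            allowed = fraction * discharge
            return min(allowed, cap) if cap is not None else allowed

        # Hypothetical wet vs. drought years (volumes per year, arbitrary units):
        for q in (800.0, 250.0):
            print(q, "->", adaptive_extraction(q, fraction=0.5, cap=350.0))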

  1. STV fueling options

    NASA Technical Reports Server (NTRS)

    Flemming, Ken

    1991-01-01

    Lunar vehicles that will be space based and reusable will require resupply of propellants in orbit. Approximately 75 pct. of the total mass delivered to low earth orbit will be propellants. Consequently, the propellant management techniques selected for Space Exploration Initiative (SEI) orbital operations will have a major influence on the overall SEI architecture. Five proposed propellant management facility (PMF) concepts were analyzed and compared in order to determine the best method of resupplying reusable, space-based Lunar Transfer Vehicles (LTVs). The processing time needed at the Space Station to prepare the LTV for its next lunar mission was estimated for each of the PMF concepts. The estimated times required to assemble and maintain the different PMF concepts were also compared. The results of the maintenance analysis were similar, with co-orbiting depots needing 100 to 350 pct. more annual maintenance. The first few external tank mating operations at KSC encountered many problems that could cause serious lunar mission schedule delays. The use of drop tanks on lunar vehicles increases the number of critical propellant interface disturbances by a factor of four.

  2. The climate change performance scorecard and carbon estimates for national forest

    Treesearch

    John W. Coulston; Kellen Nelson; Christopher W. Woodall; David Meriwether; Gregory A. Reams

    2012-01-01

    The U.S. Forest Service manages 20 percent of the forest land in the United States. Both the Climate Change Performance Scorecard and the revised National Forest Management Act require the assessment of carbon stocks on these lands. We present circa 2010 estimates of carbon stocks for each national forest and recommendations to improve these estimates.

  3. La Conchita Landslide Risk Assessment

    NASA Astrophysics Data System (ADS)

    Kropp, A.; Johnson, L.; Magnusen, W.; Hitchcock, C. S.

    2009-12-01

    Following the disastrous landslide in La Conchita in 2005 that resulted in ten deaths, the State of California selected our team to prepare a risk assessment for a committee of key stakeholders. The stakeholders represented the State of California, Ventura County, members of the La Conchita community, the railroad, and the upslope ranch owner (where the slide originated): a group with widely varying views and interests. Our team was charged with characterizing the major hazards, developing a series of mitigation concepts, evaluating the benefits and costs of mitigation, and gathering stakeholder input throughout the process. Two unique elements of the study were the methodologies used for the consequence assessment and for the decision-making framework. La Conchita is exposed to multiple slope hazards, each with differing geographical distributions, depths, and velocity characteristics. Three consequence matrices were developed so that potential financial losses, structural vulnerabilities, and human safety exposure could be evaluated. The matrices used semi-quantitative loss evaluations (both financial and life safety) based on a generalized understanding of likely vulnerability and hazard characteristics. The model provided a quantitative estimate of cumulative losses over a 50-year period, including loss of life based on FEMA evaluation criteria. Conceptual mitigation options and loss estimates were developed to provide a range of risk management solutions that were feasible from a cost-benefit standpoint. Because the committee did not have a consensus view on the preferred solution, a decision tree approach was adopted to focus on fundamental risk management questions rather than on specific outcomes. These questions included: (1) Over what time period can risks be tolerated before decisions are implemented? (2) Whose responsibility is it to identify a workable risk management solution? (3) Who will own the project? The decision tree can also be used in reverse to diagnose a project impasse or to identify the owners and time frames associated with a particular risk management outcome. Although the processes developed were specific to the La Conchita study, we believe they are applicable elsewhere to localized multi-hazard assessments and committee-led risk management efforts.
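
    For illustration only, a toy version of the cumulative-loss arithmetic behind such an assessment: per-hazard annual probabilities times consequences, accumulated over the 50-year horizon. The hazard names and dollar values are hypothetical, not the study's figures.

        hazards = {  # annual probability, loss per event (USD); both invented
            "debris_flow": (0.02, 5.0e6),
            "deep_seated_slide": (0.005, 2.0e7),
        }
        years = 50
        expected_loss = sum(p * loss * years for p, loss in hazards.values())
        print(f"expected 50-yr cumulative loss ~ ${expected_loss / 1e6:.1f}M")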

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xuesong; Izaurralde, Roberto C.; Manowitz, David H.

    Accurate quantification and clear understanding of regional scale cropland carbon (C) cycling is critical for designing effective policies and management practices that can contribute toward stabilizing atmospheric CO2 concentrations. However, extrapolating site-scale observations to regional scales represents a major challenge confronting the agricultural modeling community. This study introduces a novel geospatial agricultural modeling system (GAMS) exploring the integration of the mechanistic Environmental Policy Integrated Climate model, spatially-resolved data, surveyed management data, and supercomputing functions for cropland C budget estimates. This modeling system creates spatially-explicit modeling units at a spatial resolution consistent with remotely-sensed crop identification and assigns cropping systems to each of them by geo-referencing surveyed crop management information at the county or state level. A parallel computing algorithm was also developed to facilitate the computationally intensive model runs and output post-processing and visualization. We evaluated GAMS against National Agricultural Statistics Service (NASS) reported crop yields and inventory-estimated county-scale cropland C budgets averaged over 2000–2008. We observed good overall agreement, with spatial correlations of 0.89, 0.90, 0.41, and 0.87 for crop yields, Net Primary Production (NPP), Soil Organic C (SOC) change, and Net Ecosystem Exchange (NEE), respectively. However, we also detected notable differences in the magnitude of NPP and NEE, as well as in the spatial pattern of SOC change. By performing crop-specific annual comparisons, we discuss possible explanations for the discrepancies between GAMS and the inventory method, such as data requirements, representation of agroecosystem processes, completeness and accuracy of crop management data, and accuracy of crop area representation. Based on these analyses, we further discuss strategies to improve GAMS by updating input data and by designing more efficient parallel computing capability to quantitatively assess errors associated with the simulation of C budget components. The modularized design of GAMS makes it flexible to be updated and adapted for different agricultural models so long as they require similar input data, and to be linked with socio-economic models to understand the effectiveness and implications of diverse C management practices and policies.
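
    The evaluation above summarizes agreement as spatial correlations over counties; the computation itself is an ordinary Pearson correlation, sketched here on made-up county values.

        import numpy as np

        modeled = np.array([2.1, 3.4, 1.8, 4.0, 2.9])    # e.g., county-scale NPP from a model
        inventory = np.array([2.0, 3.6, 1.5, 4.2, 3.1])  # inventory-based estimates
        r = np.corrcoef(modeled, inventory)[0, 1]
        print(round(r, 2))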

  5. Multi-objective reverse logistics model for integrated computer waste management.

    PubMed

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2006-12-01

    This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrative example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.
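
    A minimal sketch of the model class described, using PuLP and a weighted sum of the cost and risk objectives. The paper's actual formulation, facility types, and uncertainty handling are not reproduced; every number below is invented.

        from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

        cost = {"f1": 100, "f2": 80, "f3": 120}   # fixed cost of opening a facility
        risk = {"f1": 0.2, "f2": 0.5, "f3": 0.1}  # environmental risk per unit handled
        supply = {"s1": 40, "s2": 60}             # waste quantity at each source
        cap = {f: 70 for f in cost}               # facility capacities
        w = 0.5                                   # cost-vs-risk weight

        prob = LpProblem("computer_waste", LpMinimize)
        y = {f: LpVariable(f"open_{f}", cat=LpBinary) for f in cost}
        x = {(s, f): LpVariable(f"x_{s}_{f}", lowBound=0)
             for s in supply for f in cost}

        # Weighted-sum objective over fixed costs and allocation risk.
        prob += lpSum(w * cost[f] * y[f] for f in cost) + lpSum(
            (1 - w) * risk[f] * x[s, f] for s in supply for f in cost)
        for s in supply:  # all waste must be allocated somewhere
            prob += lpSum(x[s, f] for f in cost) == supply[s]
        for f in cost:    # capacity is available only if the facility is open
            prob += lpSum(x[s, f] for s in supply) <= cap[f] * y[f]
        prob.solve()
        print({f: y[f].value() for f in cost})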

  6. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, F.; Kramer, S.; Koch, T.; Pfützner, B.

    2017-12-01

    Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, especially in urban areas near rivers, where it caused major problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool, and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget-groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with intensive stakeholder interaction, combined with high-resolution simulations, enabled the development of a management concept for extreme groundwater situations that takes into account sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  7. An urban runoff model designed to inform stormwater management decisions.

    PubMed

    Beck, Nicole G; Conley, Gary; Kanner, Lisa; Mathias, Margaret

    2017-05-15

    We present an urban runoff model designed for stormwater managers to quantify the runoff reduction benefits of mitigation actions, with lower input data and user expertise requirements than most commonly used models. The stormwater tool to estimate load reductions (TELR) employs a semi-distributed approach, where landscape characteristics and process representation are spatially lumped within urban catchments on the order of 100 acres (40 ha). Hydrologic computations use a set of metrics that describe a 30-year rainfall distribution, combined with well-tested algorithms for rainfall-runoff transformation and routing, to generate average annual runoff estimates for each catchment. User inputs include the locations and specifications for a range of structural best management practice (BMP) types. The model was tested in a set of urban catchments within the Lake Tahoe Basin of California, USA, where modeled annual flows matched observed flows within 18% relative error for five of the six catchments and performed well regionally across a suite of performance metrics. Comparisons with continuous simulation models showed an average difference of 3% from TELR-predicted runoff for a range of hypothetical urban catchments. The model usually identified the dominant BMP outflow components within 5% relative error of event-based measured flow data and simulated the correct proportionality between outflow components. TELR has been implemented as a web-based platform for use by municipal stormwater managers to inform prioritization, report program benefits and meet regulatory reporting requirements (www.swtelr.com). Copyright © 2017. Published by Elsevier Ltd.
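
    A lumped-catchment sketch in the spirit of the approach (not the actual TELR algorithm): annual runoff from a binned rainfall distribution, an assumed runoff coefficient, and a per-event BMP capture depth. All numbers are illustrative.

        import numpy as np

        event_depth_mm = np.array([5, 10, 20, 40])  # representative event depths
        events_per_year = np.array([20, 8, 3, 1])   # mean annual frequency per bin
        runoff_coeff = 0.45                         # assumed catchment coefficient
        bmp_capture_mm = 12.0                       # assumed BMP capture per event
        area_ha = 40.0

        runoff_mm = np.maximum(event_depth_mm * runoff_coeff - bmp_capture_mm, 0.0)
        annual_m3 = (runoff_mm * events_per_year).sum() / 1000.0 * area_ha * 10000.0
        print(f"average annual runoff ~ {annual_m3:.0f} m^3")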

  8. Assessing angler effort, catch, and harvest and the efficacy of a use-estimation system on a multi-lake fishery in middle Georgia

    USGS Publications Warehouse

    Roop, Hunter J.; Poudyal, Neelam C.; Jennings, Cecil A.

    2018-01-01

    Creel surveys are valuable tools in recreational fisheries management. However, multiple-impoundment fisheries of complex spatial structure can complicate survey designs and pose logistical challenges for management agencies. Marben Public Fishing Area (PFA) in Mansfield, GA is a multi-impoundment fishery with many access points, and these features prevent or complicate the use of traditional on-site contact methods such as standard roving or access-point designs because many anglers may be missed during the survey process. Therefore, adaptation of a traditional survey method is often required for sampling this special case of multi-lake fisheries to develop an accurate fishery profile. Accordingly, a modified non-uniform probability roving creel survey was conducted at the Marben PFA during 2013 to estimate fishery characteristics relating to fishing effort, catch, and fish harvest. Monthly fishing effort averaged 7,523 angler-hours (h) (SD = 5,956) and ranged from 1,301 h (SD = 562) in December to 21,856 h (SD = 5,909) in May. A generalized linear mixed model indicated that angler catch and harvest rates were significantly higher in spring and summer (all p < 0.05) than in the other seasons, but did not vary by fishing location. Our results demonstrate the utility of modifying existing creel methodology for monitoring small, spatially complex, intensively managed impoundments that support quality recreational fisheries and provide a template for the assessment and management of similar regional fisheries.
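
    The generic building block behind roving creel effort estimates is an instantaneous-count expansion, sketched below. The Marben survey's non-uniform selection probabilities and variance estimators are not reproduced here, and the numbers are invented.

        import numpy as np

        counts = np.array([12, 8, 15, 10])  # instantaneous angler counts on sampled days
        day_length_h = 12.0                 # assumed fishable hours per day
        days_in_month = 30

        effort = counts.mean() * day_length_h * days_in_month
        print(f"estimated monthly effort ~ {effort:.0f} angler-hours")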

  9. How well does the Post-fire Erosion Risk Management Tool (ERMiT) really work?

    NASA Astrophysics Data System (ADS)

    Robichaud, Peter; Elliot, William; Lewis, Sarah; Miller, Mary Ellen

    2016-04-01

    The decision of where, when, and how to apply the most effective postfire erosion mitigation treatments requires land managers to assess the risk of damaging runoff and erosion events occurring after a fire. The Erosion Risk Management Tool (ERMiT) was developed to help postfire assessment teams identify high erosion risk areas and evaluate the effectiveness of various mitigation treatments in reducing that risk. ERMiT is a web-based application that uses Water Erosion Prediction Project (WEPP) technology to estimate erosion, in probabilistic terms, on burned and recovering forest, range, and chaparral lands with and without the application of mitigation treatments. User inputs are processed by ERMiT to combine rain event variability with spatial and temporal variability in hillslope burn severity and soil properties, which are then used as WEPP inputs. Since 2007, the model has been used in making hundreds of land management decisions in the US and elsewhere. We used eight published field study sites in the Western US to compare ERMiT predictions to observed hillslope erosion rates. Most sites experienced only a few rainfall events that produced runoff and sediment, except for a California site with a Mediterranean climate. When hillslope erosion occurred, observed erosion correlated significantly with ERMiT predictions; significant correlations were also found for most mitigation treatments and for each of the five recovery years. These model validation results suggest that ERMiT provides reasonable estimates of probabilistic post-fire hillslope sediment delivery when compared to observations.
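
    ERMiT reports erosion in probabilistic terms; the sketch below shows the generic exceedance-probability computation on a Monte Carlo sample of sediment deliveries. The lognormal generator is a stand-in for ERMiT's event and severity sampling, not its actual model.

        import numpy as np

        rng = np.random.default_rng(0)
        deliveries = rng.lognormal(mean=0.0, sigma=1.2, size=10_000)  # t/ha, synthetic
        for p in (0.5, 0.2, 0.05):
            print(f"{int(p * 100)}% exceedance: {np.quantile(deliveries, 1 - p):.2f} t/ha")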

  10. Estimating Coastal Digital Elevation Model (DEM) Uncertainty

    NASA Astrophysics Data System (ADS)

    Amante, C.; Mesick, S.

    2017-12-01

    Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results and, in turn, better-informed coastal management decisions.
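
    One common way to build such a surface, sketched here on toy arrays, is to treat the three error sources named above as independent and combine their per-cell standard deviations in quadrature. This is an assumed approach for illustration, not necessarily NCEI's exact method.

        import numpy as np

        shape = (4, 4)
        source_sd = np.full(shape, 0.15)  # sonar/lidar measurement error (m), assumed
        interp_sd = 0.5 * np.random.default_rng(1).random(shape)  # interpolation error (m)
        datum_sd = np.full(shape, 0.05)   # vertical datum transformation error (m)

        dem_uncertainty = np.sqrt(source_sd**2 + interp_sd**2 + datum_sd**2)
        print(dem_uncertainty.round(2))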

  11. Assessment of atmospheric mercury emissions in Finland

    PubMed

    Mukherjee; Melanen; Ekqvist; Verta

    2000-10-02

    This paper is part of a study of atmospheric emissions of heavy metals conducted by the Finnish Environment Institute in collaboration with the Technical Research Centre of Finland (VTT) under the umbrella of the Finnish Ministry of the Environment. The scope of our study is limited solely to anthropogenic mercury that is emitted directly to the atmosphere. This article addresses emission factors and trends of atmospheric mercury emissions during the 1990s and is based mainly on the database of the Finnish Environmental Administration. In addition, data based on measurements taken by the VTT regarding emission factors have been used to estimate emissions of mercury from the incineration of waste. The study indicates that the total emission of mercury decreased from 1,140 kg in 1990 to 620 kg in 1997, while industrial and energy production increased over the same period. The 45% emission reduction is due to improved gas cleaning equipment, process changes, automation, the installation of flue gas desulfurization in coal-fired power plants, and strict pollution control laws. Earlier studies estimated higher mercury emissions for Finland. No major changes in the quality of raw materials were observed in this study. The estimated emission factors can help managers estimate mercury emissions and support the associated risk assessment.
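
    Emission-factor accounting reduces to activity times factor, summed over sources. The sketch below uses invented activity levels and factors, chosen only to land near the reported magnitude.

        activity = {"coal_power_TJ": 120_000, "waste_incineration_t": 50_000}
        ef_g_per_unit = {"coal_power_TJ": 3.0, "waste_incineration_t": 4.5}  # g Hg per unit

        total_kg = sum(activity[s] * ef_g_per_unit[s] for s in activity) / 1000.0
        print(f"total ~ {total_kg:.0f} kg Hg/yr")  # ~585 kg with these toy numbers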

  12. Kriging and local polynomial methods for blending satellite-derived and gauge precipitation estimates to support hydrologic early warning systems

    USGS Publications Warehouse

    Verdin, Andrew; Funk, Christopher C.; Rajagopalan, Balaji; Kleiber, William

    2016-01-01

    Robust estimates of precipitation in space and time are important for efficient natural resource management and for mitigating natural hazards. This is particularly true in regions with developing infrastructure and regions that are frequently exposed to extreme events. Gauge observations of rainfall are sparse but capture the precipitation process with high fidelity. Due to its high resolution and complete spatial coverage, satellite-derived rainfall data are an attractive alternative in data-sparse regions and are often used to support hydrometeorological early warning systems. Satellite-derived precipitation data, however, tend to underrepresent extreme precipitation events. Thus, it is often desirable to blend spatially extensive satellite-derived rainfall estimates with high-fidelity rain gauge observations to obtain more accurate precipitation estimates. In this research, we use two different methods, namely, ordinary kriging and k-nearest neighbor local polynomials, to blend rain gauge observations with the Climate Hazards Group Infrared Precipitation satellite-derived precipitation estimates in data-sparse Central America and Colombia. The utility of these methods in producing blended precipitation estimates at pentadal (five-day) and monthly time scales is demonstrated. We find that these blending methods significantly improve the satellite-derived estimates and are competitive in their ability to capture extreme precipitation.
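
    A simplified residual-blending sketch: interpolate gauge-minus-satellite residuals and add them back to the satellite field. Inverse-distance weighting stands in here for the ordinary kriging and local polynomial methods used in the study, and all coordinates and values are invented.

        import numpy as np

        gauge_xy = np.array([[0.2, 0.3], [0.7, 0.8], [0.5, 0.1]])  # gauge locations
        gauge_mm = np.array([120.0, 95.0, 140.0])                  # gauge totals
        sat_at_gauge = np.array([100.0, 90.0, 115.0])              # satellite at gauges
        residual = gauge_mm - sat_at_gauge

        def blend(xy, sat_value, power=2.0):
            # Add the inverse-distance-weighted residual to the satellite estimate.
            d = np.linalg.norm(gauge_xy - xy, axis=1)
            w = 1.0 / np.maximum(d, 1e-9) ** power
            return sat_value + np.sum(w * residual) / np.sum(w)

        print(round(blend(np.array([0.4, 0.4]), sat_value=105.0), 1))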

  13. A strategy to estimate the rate of recruitment of inflammatory cells during bovine intramammary infection under field management.

    PubMed

    Detilleux, J

    2017-06-08

    In most infectious diseases, including bovine mastitis, the promptness of the recruitment of inflammatory cells (mainly neutrophils) into inflamed tissues has been shown to be of prime importance in the resolution of the infection. Although this information should aid in designing efficient control strategies, it has never been quantified in field studies. Here, a system of ordinary differential equations is proposed that describes the dynamic process of the inflammatory response to mammary pathogens. The system was tested, by principal differential analysis, on 1947 test-day somatic cell counts collected on 756 infected cows, from 50 days before to 50 days after the diagnosis of clinical mastitis. Cell counts were log-transformed before estimating recruitment rates. The daily rate of cellular recruitment during health was estimated at 0.052 (st. err. = 0.005). During disease, an additional recruitment rate was estimated at 0.004 (st. err. = 0.001) per day and per bacterium. These estimates are in agreement with analogous in vitro measurements of neutrophil function. Results suggest the method is adequate for estimating one component of innate resistance to mammary pathogens at the individual level in field studies. Extension of the method to estimate components of innate tolerance, as well as the limits of the study, are discussed.
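
    A sketch of the kind of ODE system the abstract describes: cells recruited at a baseline rate plus a bacteria-dependent rate. The two recruitment rates are taken from the abstract; the clearance term and the bacterial time course are assumptions added to make the example self-contained.

        import numpy as np
        from scipy.integrate import solve_ivp

        r_h, r_b = 0.052, 0.004  # per-day recruitment rates from the abstract
        clearance = 0.05         # per-day cell clearance rate (assumed)
        bacteria = lambda t: 50.0 if 0 <= t <= 10 else 0.0  # transient infection (assumed)

        def rhs(t, y):
            # y[0]: log-scale somatic cell score, as in the abstract's analysis
            return [r_h + r_b * bacteria(t) - clearance * y[0]]

        sol = solve_ivp(rhs, (0, 50), [1.0], t_eval=np.linspace(0, 50, 11))
        print(sol.y[0].round(2))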

  14. Global estimation of long-term persistence in annual river runoff

    NASA Astrophysics Data System (ADS)

    Markonis, Y.; Moustakis, Y.; Nasika, C.; Sychova, P.; Dimitriadis, P.; Hanel, M.; Máca, P.; Papalexiou, S. M.

    2018-03-01

    Long-term persistence (LTP) of annual river runoff is a topic of ongoing hydrological research, due to its implications for water resources management. Here, we estimate its strength, measured by the Hurst coefficient H, in 696 annual, globally distributed streamflow records with at least 80 years of data. We use three estimation methods (maximum likelihood estimator, Whittle estimator, and least squares variance), resulting in similar mean values of H close to 0.65. Subsequently, we explore potential factors influencing H by two linear (Spearman's rank correlation, multiple linear regression) and two non-linear (self-organizing maps, random forests) techniques. Catchment area is found to be crucial for medium to larger watersheds, while climatic controls, such as the aridity index, have a higher impact on smaller ones. Our findings indicate that long-term persistence is weaker than reported in other studies, suggesting that enhanced LTP is encountered in large-catchment rivers, where the effect of spatial aggregation is more intense. However, we also show that the estimated values of H can be reproduced by a short-term persistence stochastic model such as an auto-regressive AR(1) process. A direct consequence is that some of the most common methods for estimating the H coefficient might not be suitable for discriminating short- and long-term persistence, even in long observational records.
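
    A minimal aggregated-variance estimator of H, a simple relative of the least-squares-variance method named above. For an LTP process, the variance of block means scales as m^(2H-2), so H is recovered from the slope of a log-log fit.

        import numpy as np

        def hurst_aggvar(x, scales=(1, 2, 4, 8, 16)):
            # Variance of block means at each aggregation scale m.
            v = []
            for m in scales:
                n = len(x) // m
                v.append(x[: n * m].reshape(n, m).mean(axis=1).var())
            slope = np.polyfit(np.log(scales), np.log(v), 1)[0]  # slope = 2H - 2
            return 1.0 + slope / 2.0

        rng = np.random.default_rng(2)
        print(round(hurst_aggvar(rng.normal(size=4096)), 2))  # white noise -> ~0.5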

  15. Computer multitasking with Desqview 386 in a family practice.

    PubMed Central

    Davis, A E

    1990-01-01

    Computers are now widely used in medical practice for accounting and secretarial tasks. However, it has been much more difficult to use computers in the more physician-related activities of daily practice. I investigated the Desqview multitasking system on a 386 computer as a solution to this problem. Physician-directed tasks of patient chart management, reference information retrieval, word processing, appointment scheduling, and office organization were each managed by separate programs. Desqview allowed instantaneous switching back and forth between the various programs. I compared the time and cost savings and the need for physician input among Desqview on a 386 computer, a 386 computer alone, and an older XT computer. Desqview significantly simplified the use of computer programs for medical information management and minimized the need for physician intervention. The time saved was 15 minutes per day; the cost savings were estimated at $5,000 annually. PMID:2383848

  16. Managing heteroscedasticity in general linear models.

    PubMed

    Rosopa, Patrick J; Schaffer, Meline M; Schroeder, Amber N

    2013-09-01

    Heteroscedasticity refers to the violation of the statistical assumption of homoscedasticity, that is, of constant error variance. When the homoscedasticity assumption is violated, Type I error rates can increase or statistical power can decrease. Because this can adversely affect substantive conclusions, the failure to detect and manage heteroscedasticity could have serious implications for theory, research, and practice. In addition, heteroscedasticity is not uncommon in the behavioral and social sciences. Thus, in the current article, we synthesize extant literature in applied psychology, econometrics, quantitative psychology, and statistics, and we offer recommendations for researchers and practitioners regarding available procedures for detecting heteroscedasticity and mitigating its effects. In addition to discussing the strengths and weaknesses of various procedures and comparing them in terms of existing simulation results, we describe a 3-step data-analytic process for detecting and managing heteroscedasticity: (a) fitting a model based on theory and saving the residuals, (b) analyzing the residuals, and (c) drawing statistical inferences (e.g., hypothesis tests and confidence intervals) involving parameter estimates. We also demonstrate this data-analytic process using an illustrative example. Overall, detecting violations of the homoscedasticity assumption and mitigating its biasing effects can strengthen the validity of inferences from behavioral and social science data.
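
    A compact illustration of the three steps on simulated data, using the Breusch-Pagan test and HC3 robust standard errors as example procedures; the article surveys several alternatives, so this is one instantiation, not a prescription.

        import numpy as np
        import statsmodels.api as sm
        from statsmodels.stats.diagnostic import het_breuschpagan

        rng = np.random.default_rng(3)
        x = rng.uniform(0, 10, 200)
        y = 2.0 + 0.5 * x + rng.normal(scale=0.3 * x)  # error variance grows with x
        X = sm.add_constant(x)

        fit = sm.OLS(y, X).fit()                                     # (a) fit, save residuals
        lm_stat, lm_p, f_stat, f_p = het_breuschpagan(fit.resid, X)  # (b) analyze residuals
        robust = fit.get_robustcov_results(cov_type="HC3")           # (c) robust inference
        print(f"Breusch-Pagan p = {lm_p:.4f}")
        print(robust.summary())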

  17. Psychiatrist Health Human Resource Planning - An Essential Component of a Hospital-Based Mental Healthcare System Transformation.

    PubMed

    Jarmain, Sarah

    2016-01-01

    The World Health Organization (WHO) defines health human resource planning as "the process of estimating the number of persons and the kinds of knowledge, skills, and attitudes they need to achieve predetermined health targets and ultimately health status objectives" (OHA 2015). Health human resource planning is a critical component of successful organizational and system transformation, and yet little has been written on how to do this for physicians at the local level. This paper will outline a framework for developing and managing key aspects of physician human resource planning related to both the quantity and quality of work within a hospital setting. Using the example of a complex multiphase hospital-based mental health transformation that involved both the reduction and divestment of beds and services, we will outline how we managed the physician human resource aspects to establish the number of psychiatrists needed and the desired attributes of those psychiatrists, and how we helped an existing workforce transition to meet the new expectations. The paper will describe a process for strategically aligning the selection and management of physicians to meet organizational vision and mandate.

  18. WHO WOULD EAT IN A WORLD WITHOUT PHOSPHORUS? A GLOBAL DYNAMIC MODEL

    NASA Astrophysics Data System (ADS)

    Dumas, M.

    2009-12-01

    Phosphorus is an indispensable and non-substitutable resource, as agriculture is impossible if soils do not hold adequate amounts of this nutrient. Phosphorus is also considered a non-renewable and increasingly scarce resource: phosphate rock reserves, as one measure of availability among others, are estimated to last for 50 to 100 years at current rates of consumption. How would food production decline in different parts of the world in the event of a sudden phosphorus shortage? To answer this question and explore management scenarios, I present a probabilistic model of the structure and dynamics of the global P cycle in the world's agro-ecosystems. The model proposes an original solution to the challenge of capturing the large-scale aggregate dynamics of multiple micro-scale soil cycling processes. Furthermore, it integrates the essential natural processes with a model of human-managed flows, thereby bringing together several decades of research and measurements from soil science, plant nutrition, and long-term agricultural experiments from around the globe. In this paper, I present the model, the first simulation results, and the implications for long-term sustainable management of phosphorus and soil fertility.
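
    The "50 to 100 years" figure is essentially a reserve-to-use ratio; the toy arithmetic below contrasts constant and growing consumption. The reserve and use figures are hypothetical, chosen only to reproduce that order of magnitude.

        reserves = 15_000.0  # Mt of phosphate rock (hypothetical)
        use = 170.0          # Mt consumed per year (hypothetical)
        print(round(reserves / use))  # constant use: ~88 years

        r, years = 0.02, 0            # assumed 2%/yr consumption growth
        while reserves > 0:
            reserves -= use
            use *= 1 + r
            years += 1
        print(years)                  # growth shortens the horizon to ~51 years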

  19. Simulating Heterogeneous Infiltration and Contaminant leaching Processes at Chalk River, Ontario

    NASA Astrophysics Data System (ADS)

    Ali, M. A.; Ireson, A. M.; Keim, D.

    2015-12-01

    A study was conducted at a waste management area in Chalk River, Ontario, to characterize flow and contaminant transport, with the aim of contributing to improved hydrogeological risk assessment in the context of waste management. Field monitoring was performed to gain insights into unsaturated zone characteristics, moisture dynamics, and contaminant transport rates. The objective is to provide quantitative estimates of surface fluxes (quantifying infiltration and evaporation) and to investigate the unsaturated zone processes controlling water infiltration and the spatial variability in head distributions and flow rates. One particular issue is to examine the effectiveness of the clayey soil cap installed to prevent infiltration of water into the waste repository, and of the sand soil cover above the clayey layer intended to divert infiltrated water laterally. The spatial variability in unsaturated zone properties, and its effects on water flow and contaminant transport observed at the site, have led to a concerted effort to develop an improved model of flow and transport based on stochastic concepts. Results from the unsaturated zone model investigations are combined with the hydrogeological and geochemical components to develop predictive tools for assessing the long-term fate of the contaminants at the waste management site.

  20. Genetic structure of coexisting wild and managed agave populations: implications for the evolution of plants under domestication

    PubMed Central

    Figueredo, Carmen Julia; Casas, Alejandro; González-Rodríguez, Antonio; Nassar, Jafet M.; Colunga-GarcíaMarín, Patricia; Rocha-Ramírez, Víctor

    2015-01-01

    Domestication is a continuous evolutionary process guided by humans. This process leads to divergence in characteristics such as behaviour, morphology or genetics between wild and managed populations. Agaves have been important resources for Mesoamerican peoples since prehistory. Some species are domesticated and others vary in their degree of domestication. Agave inaequidens Koch is used in central Mexico to produce mescal, and a management gradient from gathered wild and silvicultural populations to cultivated plantations has been documented. Significant morphological differences were reported among wild and managed populations, along with high phenotypic variation in cultivated populations composed of plants from different source populations. We evaluated levels of genetic diversity and structure associated with management, hypothesizing that high morphological variation would be accompanied by high genetic diversity in populations with high gene flow and low genetic structure among managed and unmanaged populations. Wild, silvicultural and cultivated populations were studied, collecting tissue from 19–30 plants per population. Using 10 nuclear microsatellite loci, we compared population genetic parameters and analysed the partitioning of variation among management categories to estimate gene flow among populations. Agave inaequidens exhibits high levels of genetic diversity (He = 0.707) and moderate genetic structure (FST = 0.112). No differences were found in levels of genetic diversity among wild (He = 0.704), silviculturally managed (He = 0.733) and cultivated (He = 0.698) populations. Bayesian analysis indicated that five genetic clusters best fit the data, with genetic groups corresponding to the habitats where populations grow rather than to management. Migration rates ranged from zero between two populations to markedly high among others (M = 0.73–35.25). Natural mechanisms of gene flow and the dynamic management of agave propagules among populations favour gene flow and the maintenance of high levels of variation within all populations. The slight differentiation associated with management indicates that domestication is at an incipient stage. PMID:26433707
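
    For reference, the expected heterozygosity reported above is computed from allele frequencies as He = 1 - sum(p_i^2) per locus, typically averaged across loci; a minimal sketch on made-up frequencies.

        import numpy as np

        def he(freqs):
            # Expected heterozygosity at one locus from allele frequencies.
            p = np.asarray(freqs, dtype=float)
            p = p / p.sum()
            return 1.0 - np.sum(p ** 2)

        loci = [[0.5, 0.3, 0.2], [0.6, 0.4], [0.25, 0.25, 0.25, 0.25]]
        print(round(np.mean([he(f) for f in loci]), 3))  # mean He over loci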
